Updates from December, 2016

  • maravindblog 2:07 pm on December 13, 2016

    Project: Personal website #2 

    Directives are the core of Angular 2. Even components are a kind of directive. There are various kinds of directives.

    Attribute directives

    ngClass and ngStyle are two examples. They are used to apply CSS classes and inline styles to DOM elements.

    Structural directives

    *ngIf and *ngFor are two examples. These directives change the structure of the DOM itself. The "*" prefix is the syntax for applying them.
    The general syntax to use *ngIf:

    <li *ngIf="condition">some-statement</li>

    But *ngIf can also be written differently, using property binding:

    <template [ngIf]="condition">
        <li>some-statement</li>
    </template>

    In the previous post I used just a single dummy object to display the data. After learning about the *ngFor directive, we can use it to display a collection of objects, as in the sketch below.
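
    For example, a minimal *ngFor sketch, assuming the component exposes a projects array (the names here are my illustration, not necessarily the ones in the actual site):

        <li *ngFor="let project of projects">{{ project.name }}</li>

    Angular stamps out one <li> per element of the collection, so the list grows and shrinks with the data.
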
    components:

    I have realised that I did not explain much about how the components are laid out and what event bindings are used.

    about-component

    This component is the same as before. There is not much to do other than change some UI. I thought I would make my UI fancier once the core Angular 2 part is done! As of now it looks like this (screenshot below). I have used a dummy image.

    projects-component

    I have used *ngFor to insert multiple projects as <li> items. The projects component is actually subdivided into a project-list and a project-detail component. A click event on any project item is registered, and that particular item's details are displayed on the right side. I have created a Project class (Project.ts) to store project details.
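
    A rough sketch of how that event binding might look (onSelect and the field names are my placeholders, not necessarily those in the project):

        <li *ngFor="let project of projects" (click)="onSelect(project)">
          {{ project.name }}
        </li>

    The (click) binding calls a method on the component, which can set a selectedProject field that the project-detail component then renders.
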

    skills-component

    Similar to the projects component, I have used *ngFor to display the list of items. I wanted to show professional and interpersonal skills separately, hence I have used two more components under the skills component. I have created a Skill class (Skill.ts) to store skill details.

    contact-component

    I'm keeping the contact component as a constant footer. I made the contact strip dynamic: however many contact objects I add in the future, the strip can adapt. The contact component might have gotten the biggest makeover by far!

    Nonetheless, I'm impressed by the progress I'm making. I'm also developing another project alongside this one for learning purposes. I'll upload it to Git so that you can look at it too. Right now, I'm having a bit of a problem using a dropdown with materialize.css.

    [screenshot: skills]

     

    See you in next!

     
  • maravindblog 4:03 pm on December 6, 2016
    Tags: Angular2, web development

    Project: Personal website 

    I went to a couple of events in the month of November. There I got a chance to talk to amazing people with good experience in the industry. As I'm about to start my "official" professional career, I wanted some insight into how companies work and what technologies are being used. To my surprise, 75% of the people talked about Angular and what an amazing technology it is. So naturally I wanted to learn it as well!

    Many people at the events talked about various online courses. I have learned a lot of things on the internet but never bought a course! I love free stuff! But people suggested that some paid courses are worth it. As Angular 2 is pretty new, there were no good free courses online. Around that time I saw an ad on Facebook for a Black Friday sale on Udemy. I took no time to grab the offer and bought an Angular 2 course for just $14.

    I must say the course is really good. All the topics are carefully chosen and well explained. So I'm learning the technology and at the same time building a personal website as a project. To actually understand Angular 2 fully, I will surely build at least a couple of projects.

    The whole of Angular 2 revolves around components, or more precisely directives. So initially I thought of components such as about, contact, projects, and skills.

    Project as of now: I have completed 4 sections in the Udemy course. I have learnt up to data binding, property binding, event binding, and components.
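
    As a quick illustration of those binding styles, a minimal sketch (the property and method names are my own examples, not from the actual site):

        <!-- property binding: component field -> DOM property -->
        <img [src]="profileImageUrl">
        <!-- event binding: DOM event -> component method -->
        <button (click)="onContact()">Contact me</button>
        <!-- string interpolation, the simplest form of data binding -->
        <p>{{ title }}</p>
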

    components:

    about-component

    projects-component

    skills-component

    contact-component

    So, clearly a long way to go! But I guarantee that learning and mastering this technology will be a great asset to my career. And by the way, I'm loving Angular 2.

    See you in next!

     
  • maravindblog 12:07 pm on November 26, 2016
    Tags: data analysis   

    Beginning ML: What next? 

    Machine learning is a branch of artificial intelligence and deals with lots and lots of data. In our neural network models we used the MNIST dataset, which comes preprocessed and can be used directly with TensorFlow. We also used a raw dataset that we had to preprocess ourselves to make it usable for our model. Clearly one needs to understand and analyse data (in fact, large amounts of data).

    Data analysis is the discipline that helps us understand data and use it to draw conclusions. Since data is a critical part of machine learning, we have to learn to analyse it.

    In the next posts we will look at how we can use Python to perform data analysis. I'm looking at a few online courses to learn it.

    Let’s catch up in next!

     
  • maravindblog 4:48 pm on November 23, 2016

    Beginning ML: Movie Review Sentiment Analyser cont. :KnowDev 

    In the last post we used pandas to extract raw data from .csv files and used the bag-of-words model to preprocess our data into feature sets.

    In this post we will train the model. It's the simplest part. We will use a random forest to predict; a random forest is a collection of decision trees.

    First we initialize the forest with 100 decision trees.

    from sklearn.ensemble import RandomForestClassifier

    forest = RandomForestClassifier(n_estimators=100)

    We use the fit function of the forest object to build a forest of trees from the training set.

    forest = forest.fit(train_cleaned_data, train['sentiment'])

    train_cleaned_data is the preprocessed data from our last post, and train['sentiment'] holds the label for each example. And we are done training our model.

    Now, we can test and predict using our model.

    To test, we first have to transform the raw test data into the required format. We use transform (not fit_transform) at test time so the vectorizer reuses the vocabulary learned from the training data instead of learning anything from the test set.

        test_data_features = vectorizer.transform(clean_test_reviews)

    Then we simply predict using the predict function of the forest object.

        result = forest.predict(test_data_features)

    We finish off by writing all the predictions to a file for permanent storage (see the sketch below). And that's it: we have used a new model and a new technique to build a sentiment analyzer. This model is not fit for commercial use because, one, we did not use a large dataset, and two, we did not use a more sophisticated model. In upcoming posts we will see what those "sophisticated" techniques and models are. I'm sure those concepts will be much more interesting. With that, I'll see you soon!
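
    A minimal sketch of that last step, assuming the test DataFrame has an 'id' column (the column and file names here are my assumptions):

        import pandas as pd

        # pair each review id with its predicted sentiment and save to CSV
        output = pd.DataFrame({'id': test['id'], 'sentiment': result})
        output.to_csv('sentiment_predictions.csv', index=False)
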

    Complete source code here

     
  • maravindblog 3:57 pm on November 22, 2016

    Beginning ML – Movie Review Analysis: KnowDev 

    Up to the last post we saw how to build a sentiment analyzer using a multi-layer feed-forward neural network. In this post we will again build a sentiment analyzer, one that can predict the positiveness or negativeness of a movie review. We can consider this one use-case of what we have learned so far.

    This is divided into two parts: one, preprocessing our data; two, using the random forest technique to predict.

    Pre-Processing :

    We will use the pandas module to extract data from a CSV file. As we did before, we will use the bag-of-words model to create feature sets. But first we have to clear away a little dirt: HTML tags (using the BeautifulSoup module), punctuation, and stopwords. Stopwords are words like "the", "and", "an", "is", etc. which do not add any specific emotion to a sentence. We remove punctuation as well just to reduce complexity; once we are familiar with what we are doing, we can add more complexity back into our model. We implement all this functionality in the function clean_text.
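
    A minimal sketch of what clean_text might look like (my own reconstruction, assuming NLTK's English stopword list):

        import re
        from bs4 import BeautifulSoup
        from nltk.corpus import stopwords

        def clean_text(raw_review):
            # 1. strip HTML tags
            text = BeautifulSoup(raw_review, 'html.parser').get_text()
            # 2. keep letters only, removing punctuation and numbers
            letters_only = re.sub('[^a-zA-Z]', ' ', text)
            # 3. lowercase and split into words
            words = letters_only.lower().split()
            # 4. drop stopwords
            stops = set(stopwords.words('english'))
            return ' '.join(w for w in words if w not in stops)
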

    Now we have to apply these modifications to all the reviews in our file; we call that function create_clean_train. It might take a couple of minutes, because there are almost 25,000 reviews altogether.

    We create the feature sets using CountVectorizer from scikit-learn.
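
    A hedged sketch of that step (max_features=5000 is a typical choice for this kind of tutorial, not necessarily the exact value used here; clean_train_reviews is my placeholder for the cleaned reviews):

        from sklearn.feature_extraction.text import CountVectorizer

        vectorizer = CountVectorizer(analyzer='word', max_features=5000)
        # fit learns the vocabulary from the cleaned training reviews;
        # transform turns each review into a vector of word counts
        train_cleaned_data = vectorizer.fit_transform(clean_train_reviews).toarray()
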

    In the next post, we will complete building our movie review sentiment analyser. See you next!

    Complete source code: here

     
  • maravindblog 4:39 pm on November 19, 2016

    Beginning ML – Sentiment Analysis Using Neural Network cont. : KnowDev 

    This post is a continuation of this one.

    I hope you have a good understanding of why we have to preprocess. In this post we shall train our model and also feed it our own sentences.

    First of all, we load the feature sets we created, either from the pickle or by calling the function and storing its result in variables.

    from create_sentiment_featuresets import create_feature_sets_and_labels
    train_x, train_y, test_x, test_y = create_feature_sets_and_labels('pos.txt', 'neg.txt')

    We will be using the same neural network model that we used here. First we have to define our placeholders for the features and labels.

    x = tf.placeholder('float', [None, len(train_x[0])])
    y = tf.placeholder('float')

    len(train_x[0]) returns the length of a feature vector.

    The neural network model is defined in the neural_network_model function. After the neural network is defined, it's time to train our model.

    First we'll capture the prediction (output) of the neural network using:

    prediction = neural_network_model(x)

    Then we have to find the cross entropy of the prediction made by our model. We are using softmax regression.

    #1
    cross_entropy = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(prediction, y))

    After finding the cross entropy, it's time to backpropagate and try to reduce the error.

    #2
    optimizer = tf.train.AdadeltaOptimizer(0.5).minimize(cross_entropy)

    Together, #1 and #2 make up the training step. We'll then start a session and train for 10 epochs.

    The accuracy we could achieve was 55.44%.

    [screenshot: sentiment-54-accuracy]

    The trained model is saved into 'sentiment_model.ckpt'; later we can restore our variables (i.e. weights and biases) from it.
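
    A minimal sketch of those save/restore calls using tf.train.Saver (the checkpoint path is from the post; the rest is my reconstruction):

        saver = tf.train.Saver()

        # after training, inside the session:
        saver.save(sess, 'sentiment_model.ckpt')

        # later, in a fresh session, restore the weights and biases before predicting:
        saver.restore(sess, 'sentiment_model.ckpt')
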

    Making predictions :

    To make predictions using the model we have just trained, we have to preprocess the input sentence so that it can be passed as features to our model. After preprocessing the input sentence, we predict.

    result = sess.run(tf.argmax(prediction.eval(
        feed_dict={x: [features[:423]]}), 1))

    We print out whether the output is positive or negative using:

    if result[0] == 0:
        print('Positive:', input_data)
    elif result[0] == 1:
        print('Negative:', input_data)

    [screenshot: sentiment_output]

    As you can see, our model makes pretty good predictions even though the accuracy is only about 54%.

    In this post we have seen how to train on our own data as well as use the result. In less than a week we are able to make a machine which can predict the sentiment of any sentence. Pretty interesting, right? In the next post I will introduce you to a more sophisticated version of sentiment analysis. See you in next!

    Link to complete source code: here

     
  • maravindblog 4:54 pm on November 17, 2016

    Beginning ML – First Neural Net : KnowDev 

    We have gone through some of the important topics in TensorFlow, and believe me, there are a ton of others! But no worries, we'll catch up! I have always believed in project-based learning, so we'll do the same this time. We shall build a feed-forward deep neural network which can classify handwritten digits. Sounds interesting, right? Let's get into it.

    Open up any text editor or IDE. I personally prefer coding in the PyCharm IDE; it's a wonderful piece of software for writing Python scripts.

    So, what is the most critical part of any neural network? Data! Right! Neural networks shine when there is lots and lots of data to train them. We will be using the MNIST dataset provided in the tensorflow.org tutorials. We can get the data by simply importing and loading it into a Python variable.

    import tensorflow as tf
    from tensorflow.examples.tutorials.mnist import input_data
    mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
    

    In MNIST, every data point has two parts: 1) an image and 2) its label. Every image is of size 28 x 28.

    Let’s start building our graph by creating a placeholder variable.

    x = tf.placeholder(tf.float32, [None, 784])
    y = tf.placeholder(tf.float32) # for labels
    

    x isn't a specific value; we'll supply one when we ask TensorFlow to run a computation. Here x represents a batch of MNIST images, each flattened into a 784-dimensional vector. We represent this as a 2-D tensor of floating-point numbers.

    Now we need variables for the weights and biases. We represent these with TensorFlow Variables, since they can be modified by computations in the graph.

    W = tf.Variable(tf.random_normal([784, 10]))
    b = tf.Variable(tf.random_normal([10]))

    Notice that W and b are initialized with random values; the exact initial values don't matter much. W is a tensor of shape [784, 10] because we want to produce a classification over 10 classes, i.e. the digits 0-9. As we are building a deep net, we also need one or more hidden layers.

    hidden_1_layer = {'weights': tf.Variable(tf.random_normal([784, n_nodes_hl1])),
                      'biases': tf.Variable(tf.random_normal([n_nodes_hl1]))}

    n_nodes_hl1 is declared as 500. It is the number of nodes in the single hidden layer. We can tweak this number to see how the accuracy changes. Moving on..

    Now we can complete the neural network model by implementing the layers.

    l1 = tf.add(tf.matmul(data, hidden_1_layer['weights']), hidden_1_layer['biases'])
    l1 = tf.nn.relu(l1)  # activation function
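
    To show where these pieces fit, here is a minimal sketch of the whole model function with one hidden layer (the output-layer names are mine; the actual project may use more hidden layers):

        def neural_network_model(data):
            # hidden layer: 784 inputs -> n_nodes_hl1 nodes
            hidden_1_layer = {'weights': tf.Variable(tf.random_normal([784, n_nodes_hl1])),
                              'biases': tf.Variable(tf.random_normal([n_nodes_hl1]))}
            # output layer: n_nodes_hl1 nodes -> 10 classes (digits 0-9)
            output_layer = {'weights': tf.Variable(tf.random_normal([n_nodes_hl1, 10])),
                            'biases': tf.Variable(tf.random_normal([10]))}

            l1 = tf.add(tf.matmul(data, hidden_1_layer['weights']), hidden_1_layer['biases'])
            l1 = tf.nn.relu(l1)  # activation function

            # raw logits; softmax is applied later inside the cost function
            return tf.add(tf.matmul(l1, output_layer['weights']), output_layer['biases'])
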

    There are a few more steps before we start training our model: actually defining the classification cost and employing an optimizer for backpropagation. We are using the softmax_cross_entropy_with_logits function.

    cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(prediction, y))

    # comparing the difference between predicted and original labels

    optimizer = tf.train.GradientDescentOptimizer(0.5).minimize(cost)

    There are many kinds of optimizers available in TensorFlow, each optimal for some particular use-case. The 0.5 is the learning rate of the neural network.

    Let's train our neural network. Each full cycle of feed-forward and backpropagation over the training data is called an epoch. I have tried setting the number of epochs to 5 and to 10. We have to start a session and initialize all variables; then we run both the optimizer and the cost, batch by batch, for the chosen number of epochs. This is the training step.

    _, c = sess.run( [optimizer, cost], feed_dict = { x:epoch_x, y:epoch_y } )

    We feed the values of x and y in batches; c holds the loss for each batch, which we can accumulate into an epoch loss. A sketch of the full loop follows.
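
    A minimal sketch of that loop (n_epochs and batch_size are my placeholder names; mnist.train.next_batch comes from the tutorial input pipeline):

        n_epochs, batch_size = 10, 100
        with tf.Session() as sess:
            sess.run(tf.initialize_all_variables())  # TF 0.x-era initializer
            for epoch in range(n_epochs):
                epoch_loss = 0
                for _ in range(int(mnist.train.num_examples / batch_size)):
                    epoch_x, epoch_y = mnist.train.next_batch(batch_size)
                    _, c = sess.run([optimizer, cost], feed_dict={x: epoch_x, y: epoch_y})
                    epoch_loss += c
                print('Epoch', epoch + 1, 'loss:', epoch_loss)
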

    Its time to test the neural network and check the accuracy of our model.

    correct = tf.equal(tf.argmax(prediction, 1), tf.argmax(y, 1))
    accuracy = tf.reduce_mean(tf.cast(correct, 'float'))

    print('Accuracy:', accuracy.eval({x: mnist.test.images, y: mnist.test.labels}))

    The accuracy I could achieve was 97.999%.

    This is an implementation of a simple deep neural network. The code is inspired by pythonprogramming.net and tensorflow.org.

    complete source code : https://github.com/makaravind/ImageClassifier

    We shall use this model for a couple more use-cases before we move on to another model. I'll catch you in the next post.

     
  • maravindblog 4:04 pm on November 16, 2016

    Beginning ML – basics : KnowDev 

    Let’s get our hands dirty !

    First, some basics.

    First things first: import the tensorflow library.

    import tensorflow as tf

    Let’s first understand how tensorflow works by taking 2 tensorflow constants.

    x1 = tf.constant(5)

    x2 = tf.constant(6)

    Multiply x1 and x2.

    result = tf.mul(x1, x2)

    Now, if you try to print result and run the program, you won't get 30 as output, because so far we have only constructed the computational graph. To actually multiply the constants and get the result of the multiplication, you must launch the graph in a session.

    with tf.Session() as sess:
        out = sess.run(result)
        print out

    The actual computation takes place when sess.run(result) is called! The output can be seen in the terminal as:

    Tensor("Mul:0", shape=(), dtype=int32)

    30

    As you can see, everything in TensorFlow is represented as a tensor. A tensor can be thought of as a multi-dimensional array. Each node in a TensorFlow computational graph is called an 'op'. An op takes zero or more tensors as input and may produce tensors as output, and tensors are what get passed between operations in the graph.

    I hope this post gives you a brief introduction to TensorFlow and, in a nutshell, how it works at its core. In coming posts we shall look into more complicated and yet interesting concepts.

     

     
  • maravindblog 12:23 pm on November 16, 2016
    Tags: tensorflow

    Beginning ML – Intro & Installation: KnowDev 

    It's been a long time since I've posted. But I'm back with the most interesting and trending concept: machine learning. I have just started learning ML and I want to share my whole experience, so I will be posting regularly about all the new things I'm learning, the sources I use, and so on. Tune in, because as I'm a noob to ML, the experience and resources I post here will be of great use to anyone who wants to start learning ML.

    I'm using TensorFlow on Python 2. TensorFlow is a library maintained by Google. It contains various utility functions which can perform complex calculations; after all, ML is the result of complex math models and computations. In TensorFlow we can model a neural network, train it, and test it with great ease. So let's get into it. But before we actually start working, we need to install the tensorflow library! TensorFlow installs very easily on Linux or Mac OS; on Windows we can work with it only through a virtual machine or Docker. I'm using Ubuntu, so tensorflow can be installed just like any other Python library, i.e. using pip.

    Following are the commands to install tensorflow. Note that I’m using python2.

    # Ubuntu/Linux 64-bit

    $ sudo apt-get install python-pip python-dev

    Above is just for installing pip if not already present.

    TensorFlow comes in both CPU and GPU versions. As of now I'm using the CPU version.

    # Ubuntu/Linux 64-bit, CPU only, Python 2.7

    $ export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.11.0-cp27-none-linux_x86_64.whl

    The TensorFlow installation is completed with the following command. You can then test it by importing tensorflow in a Python file and running it; if you don't get any errors, you are good to go.

    $ sudo pip install --upgrade $TF_BINARY_URL
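
    A quick sanity check, as a minimal sketch (the version printed should match the wheel above):

        # test.py -- run with: python test.py
        import tensorflow as tf
        print tf.__version__  # Python 2 print; expect 0.11.0 and no import errors
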

     