TensorFlow Introduction

I started learning TensorFlow last weekend, and this is my first blog post on it. I went through a YouTube video that gives a high-level overview of using TensorFlow and neural networks to classify images of handwritten digits. Earlier I had come across algorithms that Python and scikit-learn handle efficiently, turning a complex process into much easier steps, and I hoped TensorFlow's neural network support would likewise reduce complex classification problems to simpler ones.

The instructor began with an introduction to classifying images of the digits 0-9 from a dataset library. The very first term he used was Softmax:

Softmax(Ln) = e^Ln / ||e^L||

Ln – the output of neuron n (its logit) = weighted sum of all pixels + bias
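A minimal sketch of this formula in NumPy (the logit values below are made up for illustration):

```python
import numpy as np

def softmax(L):
    # e^Ln divided by the sum of all e^L -- the norm in the formula above.
    # Subtracting the max first is a standard numerical-stability trick;
    # it does not change the result.
    e = np.exp(L - np.max(L))
    return e / e.sum()

# Hypothetical neuron outputs (logits) for the 10 digits 0-9
L = np.array([1.0, 2.0, 0.5, 0.1, 3.0, 0.2, 0.3, 0.4, 0.6, 0.7])
Y = softmax(L)
print(Y.sum())      # the outputs are normalized into probabilities summing to 1
print(Y.argmax())   # the neuron with the strongest output wins
```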

I went through a link describing roughly 20 years of research on handwritten-digit classification. The instructor then said we would see how to implement it in 45 minutes. That statement opened my eyes: how could he solve such a complex problem in so little time?

The input images are mapped to 10 output neurons, one for each of the digits 0-9. If we choose the weights of the weighted sums and the biases correctly, one of these neurons will have a distinctly strong output for a given image.

Softmax is a good activation function for classification and is easy to use. The vector it normalizes is the vector of weighted sums,

L = X.W + b

If we try to add two tensors whose sizes don't match, replicating the smaller one until the sizes match (broadcasting) makes the addition feasible. Here is the basic formula for one layer of a neural network,

Y = Softmax(X.W + b)

The same can be implemented with the TensorFlow function,

Y = tf.nn.softmax(tf.matmul(X,W)+b)

where tf.nn is the neural network module in TensorFlow. In any machine learning application, an evaluation and validation step follows training. It measures the distance between what our system predicts and the actual result. This distance can be calculated with cross-entropy,

Cross-Entropy = − ∑ Y'i . log(Yi)

where Y'i are the actual (one-hot) labels and Yi are the predicted probabilities; the minus sign makes the result positive, since the log of a probability is negative.
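A sketch of this calculation in NumPy (the label and the predicted probabilities below are made up):

```python
import numpy as np

Y_actual = np.array([0, 0, 0, 0, 1, 0, 0, 0, 0, 0])    # one-hot label: the true digit is 4
Y_pred   = np.array([0.01, 0.02, 0.01, 0.01, 0.85,
                     0.02, 0.02, 0.02, 0.02, 0.02])     # predicted probabilities (sum to 1)

cross_entropy = -np.sum(Y_actual * np.log(Y_pred))
print(cross_entropy)   # small, because the prediction is confident and correct
```

Only the term where the label is 1 survives the sum, so a confident correct prediction gives a cross-entropy near zero, and a confident wrong one gives a large value.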

The images are fed to the system one hundred (100) at a time, through a one-layer network of 10 neurons, where each neuron computes a weighted sum of all the pixels of an image. The whole process involves the following technical terms,

(i) Activation functions (e.g. ReLU and the sigmoid function, which is a continuous function going from 0 to 1)

(ii) Softmax Functions

(iii) Dropouts

(iv) Broadcasting and biases,

Y = tf.nn.relu(tf.matmul(X, W) + b)
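The broadcasting mentioned above is exactly what happens in this line: the bias vector b (10 values) is replicated across every row of X.W before the addition. A NumPy sketch of the shapes for one mini-batch (the sizes follow the 28x28 = 784-pixel images described in the video; the values are made up):

```python
import numpy as np

def softmax(L):
    # Row-wise softmax: normalize each image's 10 logits into probabilities
    e = np.exp(L - L.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

X = np.random.rand(100, 784)   # 100 images, 784 pixels each
W = np.zeros((784, 10))        # one weight per (pixel, neuron) pair
b = np.zeros(10)               # one bias per neuron, broadcast over all 100 rows

Y = softmax(X.dot(W) + b)      # Y = softmax(X.W + b)
print(Y.shape)                 # one probability per digit, per image
```

With all-zero weights every digit comes out equally likely (0.1 each); training adjusts W and b so that the correct neuron dominates.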

Biological neurons in the human brain behave much like the ReLU: when they are not stimulated, the output is zero; when stimulated above a certain threshold, they start to output a signal proportional to the amount of stimulation.
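Both activations are one-liners; a NumPy comparison of their behavior (my own illustration, not from the video):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))   # continuous, squashes any input into (0, 1)

def relu(x):
    return np.maximum(0.0, x)         # zero when not stimulated, proportional above zero

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))   # all values strictly between 0 and 1
print(relu(x))      # [0. 0. 2.]
```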

Installing TensorFlow on an Ubuntu 16.04 Machine

There are two types of installations: (i) TF with CPU support and (ii) TF with GPU support. I chose TF with CPU support; when there is a need to train data in large batches, we may need the GPU-supported build. The TensorFlow library has some prerequisites that need to be satisfied before installing it.

Pip:

sudo apt install python-pip

Numpy:

sudo apt install python-dev
sudo pip install numpy

TensorFlow:

pip install tensorflow

