Back Propagation in TensorFlow

When commands are executed, TensorFlow builds a computational graph in memory without immediately returning any values. When the Session is initialized, it processes this graph and produces the data. Back propagation is the process of updating the strength of connections based on the training data we saw and the error we calculated. Continue reading “Back Propagation in TensorFlow”
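The weight update that back propagation performs can be sketched without TensorFlow at all. Here is a minimal NumPy illustration of the idea described above; the data, layer size, and learning rate are made up for the example and are not from the post itself:

```python
import numpy as np

# Minimal sketch of the back propagation weight update (illustrative
# data and learning rate, shown in NumPy rather than TensorFlow).
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))   # 4 training samples, 3 features
y = rng.standard_normal((4, 1))   # target values
w = rng.standard_normal((3, 1))   # connection strengths (weights)

initial_loss = float(np.mean((x @ w - y) ** 2))
for _ in range(100):
    error = x @ w - y             # error on the training data we saw
    grad = x.T @ error / len(x)   # gradient of the mean squared error
    w -= 0.1 * grad               # update the strength of connections
final_loss = float(np.mean((x @ w - y) ** 2))
print(initial_loss, final_loss)   # the loss shrinks as w is updated
```

In TensorFlow this loop is what the framework derives for you from the computational graph; here each step is written out by hand.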

The art of Deep Learning and Neural Networks

Neural networks, together with deep learning, provide solutions to image recognition, speech recognition, and natural language processing problems. The introductory video on TensorFlow and Deep Learning gave an overview of how a neural network and deep learning help with a classification problem (classifying hand-written images) and made clear how it is actually done with TensorFlow. Viewing the following video again helps to understand it well. Continue reading “The art of Deep Learning and Neural Networks”

Machine Learning Algorithms: Which, When and Where

After successfully completing the Introduction to Machine Learning course on Udacity, and having gained a basic understanding of machine learning algorithms, in this blog I am consolidating those algorithms along with their specializations. There are several algorithms that make machine learning easier, many with solid support in Python and SciKit-Learn (sklearn). My understanding from the course led me to the following classification of topics. There are four ideas behind every machine learning process:

(i) Dataset/Question

(ii) Features

(iii) Algorithms

(iv) Evaluation

(i) Dataset/Question

Before diving into the actual work of a machine learning application, collecting enough relevant data is important, as this is what gets the project started. The more we study the data, the better we can train our systems. So the collection of relevant data is the essential first step of any application.
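The four ideas above can be seen end to end in a few lines of sklearn. This is a hedged sketch, not from the course itself: the iris dataset and the Naive Bayes classifier are illustrative choices.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# (i) Dataset/Question: the bundled iris data; which species is each flower?
features, labels = load_iris(return_X_y=True)

# (ii) Features: hold out part of the data for honest evaluation later
f_train, f_test, l_train, l_test = train_test_split(
    features, labels, test_size=0.3, random_state=42)

# (iii) Algorithms: here a Naive Bayes classifier, one of many choices
clf = GaussianNB().fit(f_train, l_train)

# (iv) Evaluation: accuracy on the held-out data
print(accuracy_score(l_test, clf.predict(f_test)))
```

Swapping in a different dataset or algorithm changes steps (i) and (iii) while the overall four-step shape stays the same.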

TensorFlow Introduction

I started my TensorFlow learning last weekend, and this is my first blog post on TensorFlow. I went through a YouTube video that gave a high-level overview of using TensorFlow and neural networks to classify images of handwritten text. Having come across algorithms that Python and SciKit-Learn handle efficiently, turning complex processes into much easier steps, I hope that TensorFlow's support for neural networks will likewise help reduce complex classifications to simpler ones. Continue reading “TensorFlow Introduction”

Evaluation Metrics

The last part of the Machine Learning course is Validation and Evaluation. This is the section where we verify the quality of our results, so it is very important for making sure that our algorithm/machine learning process is doing what we planned. The validation step can also alarm us to, or showcase, critical errors found in the process. Continue reading “Evaluation Metrics”
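A few of the standard evaluation metrics can be computed directly with sklearn. This is an illustrative sketch with made-up predictions, not an example from the course:

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score

# Hypothetical true labels and model predictions, just to show the calls
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

print(accuracy_score(y_true, y_pred))   # 0.75: 6 of 8 labels correct
print(precision_score(y_true, y_pred))  # 0.75: 3 of 4 predicted positives real
print(recall_score(y_true, y_pred))     # 0.75: 3 of 4 real positives found
```

Accuracy alone can hide errors on imbalanced data, which is why precision and recall are checked alongside it.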

Principal Component Analysis in Machine Learning

Principal Component Analysis (PCA) is defined as a systematized way to transform input features into principal components, which are then used as the new features. Principal components are directions in the data that maximize variance (and so minimize information loss) when we project/compress the data down onto them. The more variance of the data along a PC, the higher that PC is ranked. Continue reading “Principal Component Analysis in Machine Learning”
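sklearn's PCA ranks components by explained variance exactly as described. The toy data below is made up for illustration, with most of its variance deliberately along one direction:

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy 2-D data stretched so most variance lies along one direction
rng = np.random.default_rng(0)
data = rng.standard_normal((100, 2)) @ np.array([[3.0, 1.0], [1.0, 0.5]])

pca = PCA(n_components=2).fit(data)
# Components come back ranked by the fraction of variance they explain
print(pca.explained_variance_ratio_)

# Projecting onto the top PC compresses the data to the new feature
projected = pca.transform(data)[:, :1]
```

Keeping only the top-ranked components is what makes PCA useful as a dimensionality-reduction step before another algorithm.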

Clustering and Feature Scaling

Lessons 9 and 10 in the course are Clustering and Feature Scaling.

Clustering:

Clustering comes under unsupervised learning methods. Unsupervised learning is important because most of the data we get in the real world doesn't have labels attached to it. When that is the case, we turn to unsupervised learning techniques. Continue reading “Clustering and Feature Scaling”
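Both lessons can be sketched together in sklearn: scale the features first, then cluster the unlabelled points. The data below is made up, and k-means with two clusters is one illustrative choice among many:

```python
from sklearn.cluster import KMeans
from sklearn.preprocessing import MinMaxScaler

# Unlabelled toy data: two visibly separated groups of points
points = [[1.0, 2.0], [1.5, 1.8], [1.2, 2.1],
          [8.0, 8.5], [8.3, 8.0], [7.9, 8.2]]

# Feature scaling: rescale each feature into [0, 1] before clustering
scaled = MinMaxScaler().fit_transform(points)

# Clustering: k-means finds the groups without any labels attached
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)
print(labels)  # first three points share one label, last three the other
```

Scaling matters because distance-based algorithms like k-means would otherwise let the feature with the largest numeric range dominate.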

Outliers in Regression

As we saw, regression is one of the popular machine learning algorithms, and I covered the errors and performance of regression in my previous blog. Outliers cause problems in regression, just as they do in Support Vector Machines or in the Naive Bayes classifier. An outlier is a data point that lies far away from the regression line. But I had a question: is it necessary to remove the outliers and do the fit again? The answer is here. Continue reading “Outliers in Regression”
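One common cleaning strategy for the question above is: fit, drop the points with the largest residuals, then refit. This is a hedged sketch on made-up data, not necessarily the answer the post gives:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Points on the line y = 2x + 1, with one injected outlier
x = np.arange(10, dtype=float).reshape(-1, 1)
y = 2 * x.ravel() + 1
y[7] = 60.0                      # this point sits far from the line

# First fit: the outlier drags the line toward itself
reg = LinearRegression().fit(x, y)
residuals = np.abs(y - reg.predict(x))

# Keep the ~90% of points closest to the first fit, then refit
keep = residuals <= np.quantile(residuals, 0.9)
cleaned = LinearRegression().fit(x[keep], y[keep])
print(cleaned.coef_[0], cleaned.intercept_)  # roughly 2.0 and 1.0
```

After removing the worst-fitting point, the second fit recovers the underlying slope and intercept almost exactly.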