Back Propagation in TensorFlow

When you write TensorFlow operations, TensorFlow builds a computational graph in memory without actually returning any values. When the Session runs, it processes this graph and produces the data. Back propagation is the process of updating the strength of the connections (the weights) based on the training data we have seen and the error we calculated.
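The two-phase behaviour described above can be sketched in plain Python (this is an illustration of the idea, not TensorFlow's actual API): building the graph computes nothing, and values only appear when the graph is run, much like `Session.run()`.

```python
# Minimal sketch of deferred graph execution: nodes are built first,
# and evaluation happens only when run() is called.
class Node:
    def __init__(self, op, inputs=(), value=None):
        self.op, self.inputs, self.value = op, list(inputs), value

def const(v):  return Node("const", value=v)
def add(a, b): return Node("add", [a, b])
def mul(a, b): return Node("mul", [a, b])

def run(node):
    """Walk the graph and produce a value -- analogous to Session.run()."""
    if node.op == "const":
        return node.value
    vals = [run(n) for n in node.inputs]
    return vals[0] + vals[1] if node.op == "add" else vals[0] * vals[1]

# Building these nodes computes nothing yet:
x = const(2.0)
y = const(3.0)
z = add(mul(x, x), y)   # z = x*x + y

print(run(z))           # 7.0 -- the value only exists after running the graph
```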

Because the model is represented as a graph of operations instead of code, we don't have to write any additional gradient code: TensorFlow can compute and apply those updates automatically.
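To make concrete what "compute and apply those updates" means, here is a hand-written version of the update for a one-parameter model (a hedged sketch in plain Python; in TensorFlow the gradient in the comment below is derived from the graph automatically rather than written by hand). The loss is L = (w*x - y)^2, so dL/dw = 2*(w*x - y)*x.

```python
# One gradient-descent step for the loss L = (w*x - y)^2.
def train_step(w, x, y, lr=0.1):
    pred = w * x
    grad = 2.0 * (pred - y) * x   # what back propagation computes for this graph
    return w - lr * grad          # applying the update

w = 0.0
for _ in range(50):
    w = train_step(w, x=1.0, y=3.0)

print(round(w, 3))  # w converges toward 3.0, the value that makes w*x == y
```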

Another advantage of a computational graph is that we can decide where each part of the graph executes. One part can run on one machine while another runs somewhere else, perhaps on GPUs, while the data-input code runs back on the CPU.

TensorFlow runs on CPUs, GPUs, iOS, Android, Arduino, and even the Raspberry Pi. In Google's data centers there is specialized hardware, the Tensor Processing Unit (TPU), designed specifically to execute these computational graphs faster.

(Figure: TensorFlow running on different machines)
