Posts

Loss functions in Deep Learning using Keras

In Deep Learning, a neural network requires an optimizer and a loss function to configure an efficient model. The purpose of a loss function is to compute the quantity that the model should seek to minimize during training; a loss function is also termed a cost function. Loss functions fall into two categories: probabilistic losses and regression losses.

Various algorithms are used to train a neural network. To achieve optimization, the weights are updated using backpropagation, and optimization algorithms are used to reduce the error in the next iteration with the changed weights. The score calculated after each evaluation is called the loss.

Probabilistic losses: these loss functions are used in classification models. The most widely used loss functions in this category are:

Binary Cross Entropy: this function calculates the loss of a classification model where the target variable is binary, e.g. 0 or 1.

Categorical Cross Entr...
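To make binary cross entropy concrete, here is a minimal NumPy sketch of the formula that Keras's `BinaryCrossentropy` loss computes; the targets and predicted probabilities below are made-up illustrative values:

```python
import numpy as np

# Hypothetical binary targets and predicted probabilities.
y_true = np.array([0.0, 1.0, 1.0, 0.0])
y_pred = np.array([0.1, 0.9, 0.8, 0.2])

# Binary cross entropy: -mean(y*log(p) + (1-y)*log(1-p)).
# keras.losses.BinaryCrossentropy() returns the same quantity
# (up to a small epsilon it adds for numerical stability).
manual = -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
print(round(manual, 4))  # → 0.1643
```

Confident predictions on the correct class (e.g. p = 0.9 for y = 1) contribute a small loss, while confident mistakes are penalized heavily.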

Optimization techniques in Deep Learning

In the deep learning world, the neural network's layers (the input layer, the hidden layers, and the output layer) are connected. In forward propagation we obtain the prediction Ŷ and calculate the error function; the error function is also called the loss function or cost function. To reduce the loss function, optimizers are used: they update the weights during backpropagation.

Gradient Descent: the foremost optimizer used was gradient descent. It works as follows:
1. Calculate what a small change in each individual weight would do to the loss function.
2. Adjust each individual weight based on its gradient.
3. Repeat steps 1 and 2 until the loss function gets as low as possible.

During optimization there is a risk of getting stuck in a local minimum. To help avoid this, we make use of the learning rate. The learning rate is a variable that multiplies the gradients to scale them, ensuring the weights change at the right pace, not m...
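The three steps above can be sketched as plain-Python gradient descent on a toy one-dimensional loss; the quadratic loss, starting weight, and learning rate are illustrative choices, not from the post:

```python
# Minimal gradient-descent sketch on the toy loss L(w) = (w - 3)^2,
# whose gradient is dL/dw = 2*(w - 3). The learning rate scales the
# gradient before each weight update.
def gradient_descent(w0=0.0, learning_rate=0.1, steps=100):
    w = w0
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)      # step 1: effect of a small change in w
        w -= learning_rate * grad   # step 2: adjust w against its gradient
    return w                        # steps repeated until the loss is small

w_final = gradient_descent()
print(round(w_final, 4))  # converges toward the minimum at w = 3
```

A learning rate that is too large makes the updates overshoot the minimum, while one that is too small makes convergence very slow; that is the pacing trade-off described above.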