Neural Network - Optimizers

We have seen earlier that the entire purpose of training is to optimize weights so as to reduce the loss. Let’s look in detail at the different optimization functions we can use with neural networks.

What are Optimizers?

We have already seen an optimizer, called “SGD”, in our Linear Regression blog. As described earlier, the purpose of an optimizer is to update the weights of the NN in order to reduce the loss function.
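To make that idea concrete, here is a minimal plain-Python sketch of what an optimizer does: nudge a weight a small step against the gradient of the loss. The loss function and all names below are illustrative, not from any library.

```python
def sgd_step(w, grad, lr=0.1):
    """Vanilla SGD: move the weight one small step down the gradient."""
    return w - lr * grad

# Toy loss: (w - 3)**2, minimized at w = 3; its gradient is 2 * (w - 3).
w = 0.0
for _ in range(100):
    w = sgd_step(w, 2 * (w - 3))
# w has now converged very close to the loss-minimizing value 3
```

Every optimizer in Keras is a variation on this loop; they differ in how the step size and direction are adapted over time.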

There are many different types of optimizers available in Keras. It’s important to know what they are and their pros and cons: https://keras.io/optimizers/

But in the current case we will use an optimizer called “Adam”. To use it in our NN, we simply do:

# import the Adam optimizer discussed in this blog
from keras.optimizers import Adam

model.compile(Adam(lr=.0001), .......)
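Under the hood, Adam keeps a running mean and a running variance of the gradients for each weight and scales every update by them. Here is a minimal plain-Python sketch of the standard Adam update rule, with the usual default hyperparameters; it is illustrative only, not the exact Keras implementation.

```python
def adam_step(w, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single weight; returns (new_w, new_m, new_v)."""
    m = beta1 * m + (1 - beta1) * grad       # 1st moment: running mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2  # 2nd moment: running mean of squared gradients
    m_hat = m / (1 - beta1 ** t)             # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (v_hat ** 0.5 + eps)  # scaled parameter update
    return w, m, v

# Usage: minimize the toy loss w**2 (gradient 2 * w), starting far from 0.
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t)
# w ends up close to the minimum at 0
```

Because each weight gets its own adaptive step size, Adam usually needs much less learning-rate tuning than plain SGD, which is why it is such a common default.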

To understand the different types of optimizers, I recommend reading these:

https://medium.com/datadriveninvestor/overview-of-different-optimizers-for-neural-networks-e0ed119440c3

https://hackernoon.com/some-state-of-the-art-optimizers-in-neural-networks-a3c2ba5a5643

https://blog.algorithmia.com/introduction-to-optimizers

Also, to understand optimizers in general, watch these:

https://deeplizard.com/learn/video/sZAlS3_dnk0

https://deeplizard.com/learn/video/_N5kpSMDf4o
