
Optimization

neonii 2024. 12. 7. 05:16

Based on Udemy's The Complete Neural Network Bootcamp.

Optimization

  • batch gradient descent
    • uses all training samples for a single update per iteration
    • number of iterations per epoch = 1
  • stochastic gradient descent
    • uses one sample per update
    • number of iterations per epoch = number of samples
  • mini-batch gradient descent
    • takes the best of both batch and stochastic gradient descent: updates on small batches of samples
    • number of iterations per epoch = number of samples / batch size
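The mini-batch case above can be sketched as follows. This is a minimal illustration of my own (toy data, a one-parameter linear model, and made-up hyperparameters, none of which come from the course notes), showing that iterations per epoch = number of samples / batch size:

```python
import numpy as np

# Toy setup: fit y = w*x by minimizing mean squared error with
# mini-batch gradient descent. Data and hyperparameters are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=100)
y = 3.0 * X  # true slope is 3.0

w = 0.0
lr = 0.1
batch_size = 20
n_epochs = 50
iters_per_epoch = len(X) // batch_size  # 100 samples / 20 per batch = 5

for _ in range(n_epochs):
    perm = rng.permutation(len(X))          # shuffle samples each epoch
    for i in range(iters_per_epoch):
        idx = perm[i * batch_size:(i + 1) * batch_size]
        # gradient of mean((w*x - y)^2) w.r.t. w, computed on the batch only
        grad = 2 * np.mean((w * X[idx] - y[idx]) * X[idx])
        w -= lr * grad

# w converges toward the true slope 3.0
```

Batch gradient descent is the special case batch_size = len(X) (one iteration per epoch); stochastic gradient descent is batch_size = 1 (one iteration per sample).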

Optimization Variants

Stochastic Gradient Descent with Momentum: accumulates an exponentially decaying average of past gradients (a "velocity") and steps along it, which damps oscillations and speeds up progress along consistent directions.
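A minimal sketch of the momentum update, on a toy objective f(w) = w² of my own choosing (the course notes only name the method):

```python
# SGD with momentum on f(w) = w^2, whose minimum is at w = 0.
w, v = 5.0, 0.0
lr, beta = 0.1, 0.9  # illustrative hyperparameters

for _ in range(200):
    grad = 2 * w         # gradient of w^2
    v = beta * v + grad  # velocity: decayed sum of past gradients
    w = w - lr * v       # step along the velocity, not the raw gradient

# w is driven to the minimum at 0
```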

RMSprop: an adaptive learning rate method that divides each parameter's step by a running average of its recent gradient magnitudes.
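A minimal sketch of the RMSprop update on the same kind of toy objective f(w) = w² (hyperparameter values are illustrative, not from the course):

```python
import numpy as np

# RMSprop on f(w) = w^2: the step size adapts per parameter.
w, s = 5.0, 0.0
lr, rho, eps = 0.01, 0.9, 1e-8

for _ in range(2000):
    grad = 2 * w
    s = rho * s + (1 - rho) * grad ** 2     # running average of squared gradients
    w = w - lr * grad / (np.sqrt(s) + eps)  # normalize the step by its RMS
```

Because the gradient is divided by its own RMS, steps are roughly of size lr regardless of the gradient's scale; w walks steadily toward 0 and then jitters within about one step size of it.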

Adam (Adaptive Moment Estimation): combines momentum (a first-moment estimate of the gradient) with RMSprop-style adaptive learning rates (a second-moment estimate), plus bias correction for both.
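A minimal sketch of the Adam update on the toy objective f(w) = w², using the commonly cited default hyperparameters (the course notes only name the method):

```python
import numpy as np

# Adam on f(w) = w^2: momentum + adaptive scaling + bias correction.
w = 5.0
m, v = 0.0, 0.0
lr, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8

for t in range(1, 2001):
    grad = 2 * w
    m = beta1 * m + (1 - beta1) * grad       # 1st moment (momentum term)
    v = beta2 * v + (1 - beta2) * grad ** 2  # 2nd moment (RMSprop term)
    m_hat = m / (1 - beta1 ** t)             # bias correction: both moments
    v_hat = v / (1 - beta2 ** t)             # start at 0 and would be biased low
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
```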

Weight Decay: shrinks the weights toward zero a little at every step; for plain SGD this is equivalent to adding an L2 penalty to the loss, and it acts as regularization.
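A minimal sketch of my own, isolating the decay term by pretending the data gradient is zero, so only the regularization effect is visible:

```python
# Weight decay adds wd * w to the gradient (for SGD this matches an
# L2 penalty of (wd/2) * w^2 in the loss). Values are illustrative.
w = 5.0
lr, wd = 0.1, 0.5

for _ in range(100):
    grad_loss = 0.0            # pretend the data gradient is zero here
    grad = grad_loss + wd * w  # the decay term pulls w toward zero
    w = w - lr * grad

# with no data gradient, each step multiplies w by (1 - lr*wd) = 0.95,
# so w shrinks geometrically toward zero
```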

AMSGrad: a variant of Adam that keeps the elementwise maximum of all past second-moment estimates, so the effective learning rate can only shrink over time; this fixes convergence failures of Adam on some problems.
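A minimal sketch of the AMSGrad change relative to Adam, again on a toy objective f(w) = w² of my own choosing; the only difference from the Adam sketch is the running maximum of the second moment:

```python
import numpy as np

# AMSGrad on f(w) = w^2: like Adam, but the denominator uses the
# maximum of all past second-moment estimates.
w = 5.0
m, v, v_max = 0.0, 0.0, 0.0
lr, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8

for t in range(1, 2001):
    grad = 2 * w
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    v_max = max(v_max, v)                    # the AMSGrad modification:
    w = w - lr * m / (np.sqrt(v_max) + eps)  # denominator never decreases
```

Because v_max is non-decreasing, the effective step size lr / sqrt(v_max) can only shrink, which is the guarantee Adam itself lacks.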
