AdaMax is a generalisation of Adam from the $l_{2}$ norm to the $l_{\infty}$ norm. Define:
$$ u_{t} = \beta^{\infty}_{2}v_{t-1} + \left(1-\beta^{\infty}_{2}\right)|g_{t}|^{\infty}$$
$$ = \max\left(\beta_{2}\cdot{v}_{t-1}, |g_{t}|\right)$$
We can plug into the Adam update equation by replacing $\sqrt{\hat{v}_{t}} + \epsilon$ with $u_{t}$ to obtain the AdaMax update rule:
$$ \theta_{t+1} = \theta_{t} - \frac{\eta}{u_{t}}\hat{m}_{t} $$
Common default values are $\eta = 0.002$, $\beta_{1} = 0.9$, and $\beta_{2} = 0.999$.
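The update rule above can be sketched directly in NumPy. This is a minimal illustration of the equations, not a reference implementation; the function name `adamax_update` and its in-place-free signature are chosen here for clarity.

```python
import numpy as np

def adamax_update(theta, g, m, u, t, eta=0.002, beta1=0.9, beta2=0.999):
    """One AdaMax step, following the equations above.

    theta: parameters, g: gradient, m: first-moment estimate,
    u: infinity-norm accumulator, t: 1-based timestep.
    """
    m = beta1 * m + (1 - beta1) * g        # first moment, as in Adam
    u = np.maximum(beta2 * u, np.abs(g))   # u_t = max(beta2 * u_{t-1}, |g_t|)
    m_hat = m / (1 - beta1 ** t)           # bias-corrected first moment
    theta = theta - eta * m_hat / u        # AdaMax update (no epsilon needed)
    return theta, m, u
```

Note that, unlike $\hat{v}_{t}$ in Adam, $u_{t}$ needs no bias correction, and the $\epsilon$ term is unnecessary because the max operation keeps $u_{t}$ away from zero once a nonzero gradient has been seen.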
Source: Adam: A Method for Stochastic Optimization
| Task | Papers | Share |
|---|---|---|
| Image Generation | 2 | 33.33% |
| Intrusion Detection | 1 | 16.67% |
| Speech Recognition | 1 | 16.67% |
| Time Series Analysis | 1 | 16.67% |
| Quantization | 1 | 16.67% |