no code implementations • 23 Apr 2023 • Inyoung Paik, Jaesik Choi
In this study, we analyze the occurrence and mitigation of gradient explosion both theoretically and empirically, and find that the correlation between activations plays a key role in preventing gradient explosion from persisting throughout training.
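The abstract's subject can be illustrated with a minimal numpy sketch (not from the paper; all shapes and scales are assumptions): backpropagating a gradient through many random linear layers shows how the gradient norm stays bounded in the stable weight regime but grows roughly exponentially once the per-layer weight scale exceeds it.

```python
import numpy as np

rng = np.random.default_rng(0)

def backprop_grad_norm(depth, width, scale):
    """Norm of a unit gradient backpropagated through `depth` random
    linear layers with weight std `scale / sqrt(width)` (illustrative)."""
    g = rng.standard_normal(width)
    g /= np.linalg.norm(g)
    for _ in range(depth):
        W = rng.standard_normal((width, width)) * scale / np.sqrt(width)
        g = W.T @ g  # backward pass of one linear layer
    return np.linalg.norm(g)

stable = backprop_grad_norm(depth=50, width=64, scale=1.0)
exploding = backprop_grad_norm(depth=50, width=64, scale=1.5)
```

With `scale=1.0` the gradient norm stays near 1, while `scale=1.5` compounds to an astronomically larger norm, which is the exploding-gradient regime the paper studies.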
no code implementations • 31 Jul 2019 • Inyoung Paik, Taeyeong Kwak, Injung Kim
The experimental results show that the routing algorithms do not behave as expected and often perform worse than simple baselines that assign the connection strengths uniformly or at random.
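The uniform and random baselines mentioned above can be sketched as follows (a hedged illustration, not the paper's code; the capsule shapes and `route` helper are assumptions): instead of iteratively updating routing logits by agreement, the coupling coefficients are fixed, either uniform (softmax of zeros) or random.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def route(u_hat, logits):
    """Combine prediction vectors u_hat (in_caps, out_caps, dim) using
    coupling coefficients softmax(logits) over output capsules."""
    c = softmax(logits, axis=1)            # (in_caps, out_caps)
    return (c[..., None] * u_hat).sum(0)   # (out_caps, dim)

in_caps, out_caps, dim = 8, 4, 16
u_hat = rng.standard_normal((in_caps, out_caps, dim))

# Uniform baseline: softmax of all-zero logits gives equal coefficients.
uniform_out = route(u_hat, np.zeros((in_caps, out_caps)))
# Random baseline: fixed random logits instead of routing-by-agreement.
random_out = route(u_hat, rng.standard_normal((in_caps, out_caps)))
```

Comparing a learned routing scheme against these fixed-coefficient baselines is one way to test whether the routing iterations themselves contribute anything.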
no code implementations • 31 Jul 2019 • Inyoung Paik, Sangjun Oh, Tae-Yeong Kwak, Injung Kim
To address the issue of catastrophic forgetting in neural networks, we propose a novel, simple, and effective solution called neuron-level plasticity control (NPC).
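One plausible reading of neuron-level plasticity control can be sketched as below; this is a hypothetical illustration, since the abstract does not give the method's details. The per-neuron importance score and the learning-rate scaling rule are assumptions, not NPC's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

W = rng.standard_normal((4, 6))              # weights into 4 neurons
grad = rng.standard_normal((4, 6))           # gradient from the new task
importance = np.abs(rng.standard_normal(4))  # assumed per-neuron importance

lr = 0.1
# Hypothetical rule: neurons important for earlier tasks get lower
# plasticity (a smaller effective learning rate), reducing forgetting.
neuron_lr = lr / (1.0 + importance)          # shape (4,), one rate per neuron
W_new = W - neuron_lr[:, None] * grad
```

The key idea this sketch captures is that plasticity is modulated per neuron rather than per weight, so an important neuron's entire incoming weight vector is protected at once.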