no code implementations • WMT (EMNLP) 2021 • Nicolas Ballier, Dahn Cho, Bilal Faye, Zong-You Ke, Hanna Martikainen, Mojca Pecman, Guillaume Wisniewski, Jean-Baptiste Yunès, Lichao Zhu, Maria Zimina-Poirot
Experiment 2 uses OpenNMT to fine-tune the model.
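A minimal sketch of what such a fine-tuning run could look like with OpenNMT-py, assuming a YAML configuration whose `train_from` key points at the pre-trained checkpoint; all paths, corpus names, and step counts below are placeholders, not the authors' actual settings.

```python
# Illustrative only: launch an OpenNMT-py fine-tuning run from Python.
# The config keys (data / train_from / save_model / train_steps) follow
# OpenNMT-py's documented YAML format; every path and value here is hypothetical.
import subprocess
import textwrap

config = textwrap.dedent("""\
    src_vocab: run/vocab.src
    tgt_vocab: run/vocab.tgt
    data:
        finetune:
            path_src: data/finetune.src   # in-domain source sentences
            path_tgt: data/finetune.tgt   # in-domain target sentences
    train_from: models/pretrained_step_100000.pt  # checkpoint to fine-tune
    save_model: models/finetuned
    train_steps: 120000
    """)

with open("finetune.yaml", "w") as f:
    f.write(config)

# onmt_train is OpenNMT-py's training entry point
subprocess.run(["onmt_train", "-config", "finetune.yaml"], check=True)
```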
no code implementations • 27 May 2024 • Bilal Faye, Mustapha Lebbah, Hanane Azzag
We expand normalization beyond traditional single mean and variance parameters, enabling the identification of data modes prior to training.
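As an illustration of the idea (not the authors' exact algorithm), one can estimate data modes with a Gaussian mixture before training and then standardize each sample with the statistics of its own mode rather than a single global mean and variance:

```python
# Illustrative sketch: mode-aware normalization instead of a single mean/variance.
# A Gaussian mixture identifies K data modes prior to training; each sample is then
# standardized with the mean and variance of the mode it is assigned to.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_modes(x: np.ndarray, n_modes: int = 3, seed: int = 0) -> GaussianMixture:
    """Identify data modes prior to training."""
    gmm = GaussianMixture(n_components=n_modes, covariance_type="diag",
                          random_state=seed)
    gmm.fit(x)
    return gmm

def mode_normalize(x: np.ndarray, gmm: GaussianMixture, eps: float = 1e-5) -> np.ndarray:
    """Standardize each sample with the statistics of its most likely mode."""
    modes = gmm.predict(x)                          # hard assignment to a mode
    means = gmm.means_[modes]                       # per-sample mode mean
    stds = np.sqrt(gmm.covariances_[modes] + eps)   # per-sample mode std (diag cov)
    return (x - means) / stds

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy bimodal data: two clusters with different means and scales
    x = np.vstack([rng.normal(-3.0, 0.5, size=(500, 4)),
                   rng.normal(+2.0, 2.0, size=(500, 4))])
    gmm = fit_modes(x, n_modes=2)
    x_norm = mode_normalize(x, gmm)
    print(x_norm.mean(axis=0), x_norm.std(axis=0))
```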
1 code implementation • 25 Mar 2024 • Bilal Faye, Hanane Azzag, Mustapha Lebbah
Similarly, mixture normalization (MN) encounters computational barriers in handling diverse Gaussian distributions.
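For context, mixture normalization replaces batch normalization's single-Gaussian assumption with a mixture of Gaussians, so several components must be estimated and soft-assigned for every mini-batch. The sketch below is one rough way to write that per-batch computation (shapes, component count, and the per-batch GMM refit are illustrative assumptions, not the exact formulation):

```python
# Rough sketch of mixture normalization (MN) on one mini-batch of activations.
# Each activation is normalized against every Gaussian component and the results are
# weighted by the soft assignments. Re-estimating the mixture for each batch is what
# makes MN costly compared with batch normalization's single mean and variance.
import numpy as np
from sklearn.mixture import GaussianMixture

def mixture_normalize(acts: np.ndarray, n_components: int = 3, eps: float = 1e-5) -> np.ndarray:
    """acts: (batch, features) activations of one layer."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag",
                          random_state=0, max_iter=20)
    gmm.fit(acts)                        # EM per batch: the expensive part
    resp = gmm.predict_proba(acts)       # (batch, K) soft assignments
    normed = np.zeros_like(acts)
    for k in range(n_components):
        normed += resp[:, k:k + 1] * (acts - gmm.means_[k]) / np.sqrt(gmm.covariances_[k] + eps)
    return normed

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    batch = rng.normal(size=(256, 64))
    print(mixture_normalize(batch).shape)  # (256, 64)
```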
no code implementations • 7 Mar 2024 • Bilal Faye, Hanane Azzag, Mustapha Lebbah, Djamel Bouchaffra
Additionally, the network learns to differentiate embeddings from different modalities by fusing them with context, and it aligns their data distributions with a contrastive, self-supervised objective.
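A hedged sketch of the contrastive part of such an objective, using an InfoNCE-style loss that aligns paired embeddings from two modalities; the modality names, dimensions, and temperature are placeholders rather than the paper's architecture.

```python
# Illustrative InfoNCE-style contrastive alignment between two modalities.
# Paired embeddings (the same sample seen in each modality) are pulled together,
# unpaired ones pushed apart. Dims and temperature are hypothetical.
import torch
import torch.nn.functional as F

def contrastive_alignment_loss(emb_a: torch.Tensor, emb_b: torch.Tensor,
                               temperature: float = 0.07) -> torch.Tensor:
    """emb_a, emb_b: (batch, dim) embeddings of the same samples in two modalities."""
    a = F.normalize(emb_a, dim=-1)
    b = F.normalize(emb_b, dim=-1)
    logits = a @ b.t() / temperature          # (batch, batch) similarity matrix
    targets = torch.arange(a.size(0), device=a.device)
    # Symmetric cross-entropy: each sample's positive is its own pair
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

if __name__ == "__main__":
    torch.manual_seed(0)
    img_emb = torch.randn(32, 128)   # placeholder image-modality embeddings
    txt_emb = torch.randn(32, 128)   # placeholder text-modality embeddings
    print(contrastive_alignment_loss(img_emb, txt_emb).item())
```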
no code implementations • 14 Mar 2023 • Bilal Faye, Mohamed-djallel Dilmi, Hanane Azzag, Mustapha Lebbah, Djamel Bouchaffra
Normalization is a pre-processing step that converts the data into a more usable representation.
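For instance, plain z-score standardization rescales each feature to zero mean and unit variance before it reaches the model; a minimal example:

```python
# Minimal example of normalization as a pre-processing step (z-score standardization).
import numpy as np

def standardize(x: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Rescale each feature to zero mean and unit variance."""
    return (x - x.mean(axis=0)) / (x.std(axis=0) + eps)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(loc=10.0, scale=3.0, size=(100, 5))  # raw features on an awkward scale
    x_norm = standardize(x)
    print(x_norm.mean(axis=0).round(3), x_norm.std(axis=0).round(3))
```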