no code implementations • 4 Mar 2024 • Olivier Gandouet, Mouloud Belbahri, Armelle Jezequel, Yuriy Bodjov
In this study, ChatGPT is utilized to create streamlined models that generate easily interpretable features.
no code implementations • 11 May 2021 • Mouloud Belbahri, Olivier Gandouet, Alejandro Murua, Vahid Partovi Nia
In this work, we bring a new vision to uplift modeling.
no code implementations • 28 Nov 2019 • Mouloud Belbahri, Alejandro Murua, Olivier Gandouet, Vahid Partovi Nia
We introduce a Qini-based uplift regression model to analyze a large insurance company's retention marketing campaign.
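The Qini criterion ranks customers by predicted uplift and measures the incremental responses gained at each cut-off, with the control group scaled to the treated group's size. A minimal sketch of that incremental-gain curve (my own illustrative implementation, not the paper's regression model; all names are assumptions):

```python
import numpy as np

def qini_points(scores, treated, outcome):
    """Qini-style incremental-gain curve (sketch).

    Rank units by predicted uplift score (descending); at each cut k,
    count treated responders minus control responders scaled by the
    treated/control size ratio within the top-k."""
    order = np.argsort(-scores)
    t, y = treated[order], outcome[order]
    gains = []
    for k in range(1, len(scores) + 1):
        tk, yk = t[:k], y[:k]
        n_t = tk.sum()
        n_c = k - n_t
        resp_t = yk[tk == 1].sum()
        resp_c = yk[tk == 0].sum()
        # incremental responses at cut k (guard against empty control)
        gains.append(resp_t - resp_c * (n_t / n_c if n_c else 0.0))
    return np.array(gains)

scores = np.array([0.9, 0.8, 0.2, 0.1])   # toy predicted uplift
treated = np.array([1, 0, 1, 0])           # 1 = received the campaign
outcome = np.array([1, 0, 0, 1])           # 1 = retained
print(qini_points(scores, treated, outcome))
```

Summing or integrating this curve (relative to a random-targeting baseline) yields the Qini coefficient that the model optimizes against.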
no code implementations • 18 Sep 2019 • Eyyüb Sari, Mouloud Belbahri, Vahid Partovi Nia
Binary Neural Networks (BNNs) are difficult to train and suffer from a drop in accuracy.
no code implementations • ICLR 2019 • Sajad Darabi, Mouloud Belbahri, Matthieu Courbariaux, Vahid Partovi Nia
Binary neural networks (BNNs) help alleviate the prohibitive resource requirements of DNNs by limiting both activations and weights to 1 bit.
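The 1-bit constraint is usually realized with a sign function, so a layer's dot product involves only ±1 values (reducible to XNOR/popcount in hardware). A minimal sketch of this binarization, assuming the standard sign quantizer rather than any paper-specific variant:

```python
import numpy as np

def binarize(x):
    """Map each value to -1 or +1 via the sign function (0 maps to +1)."""
    return np.where(x >= 0, 1.0, -1.0)

w = np.array([0.3, -1.2, 0.0, 0.7])   # full-precision weights
a = np.array([-0.5, 2.0, 0.1, -0.1])  # full-precision activations

# The binary layer's dot product uses only +/-1 operands.
print(binarize(w) @ binarize(a))  # -> -2.0
```

In training, sign has zero gradient almost everywhere, which is one reason BNNs are hard to optimize; the straight-through estimator is a common workaround.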
no code implementations • 5 Feb 2019 • Ali Vahdat, Mouloud Belbahri, Vahid Partovi Nia
An erbium-doped fiber amplifier (EDFA) is an optical amplifier/repeater device used to boost the intensity of optical signals carried through a fiber-optic communication system.
no code implementations • 18 Jan 2019 • Mouloud Belbahri, Eyyüb Sari, Sajad Darabi, Vahid Partovi Nia
Using a quasiconvex base function to construct a binary quantizer helps in training binary neural networks (BNNs); adding noise to the input data or using a concrete regularization function helps improve generalization error.
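One way to build a trainable binary quantizer from a smooth base function is to use a steepness-parameterized surrogate for sign, which stays differentiable during training and approaches hard binarization as the parameter grows. The sketch below uses tanh as the base function; this is an illustrative assumption, not the specific function proposed in the paper:

```python
import numpy as np

def soft_sign(x, beta=5.0):
    """Smooth surrogate for sign(x); steeper (closer to hard sign) as beta grows."""
    return np.tanh(beta * x)

x = np.linspace(-1.0, 1.0, 5)
print(soft_sign(x, beta=1.0))    # gently saturating
print(soft_sign(x, beta=20.0))   # nearly hard {-1, +1} away from zero
```

Annealing `beta` upward over training lets the network transition gradually from a smooth quantizer to the hard sign used at inference.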
1 code implementation • ICLR 2019 • Sajad Darabi, Mouloud Belbahri, Matthieu Courbariaux, Vahid Partovi Nia
We propose to improve the binary training method by introducing a new regularization function that encourages training weights around binary values.
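A regularizer of this kind penalizes weights for drifting away from ±α, so the full-precision shadow weights settle near the values they will be quantized to. The sketch below shows one simple such penalty (a quadratic well at |w| = α); the paper's exact regularization functions may differ:

```python
import numpy as np

def binary_reg(w, alpha=1.0):
    """Penalty that is zero when |w| == alpha and grows as weights drift
    away, pushing them toward +/-alpha during training (sketch)."""
    return np.sum((alpha - np.abs(w)) ** 2)

w = np.array([1.0, -1.0, 0.2, 0.0])
print(binary_reg(w))  # 0 + 0 + 0.8**2 + 1**2 = 1.64
```

Added to the task loss with a small coefficient, this term shrinks the quantization error |w - sign(w)·α| without forbidding sign flips early in training.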