Search Results for author: Christophe Tribes

Found 4 papers, 2 papers with code

Hyperparameter Optimization for Large Language Model Instruction-Tuning

no code implementations • 1 Dec 2023 • Christophe Tribes, Sacha Benarroch-Lelong, Peng Lu, Ivan Kobyzev

The downstream-task performance of models fine-tuned with LoRA depends heavily on a set of hyperparameters, including the rank of the decomposition.

Hyperparameter Optimization • Language Modelling • +1
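As a rough sketch of where the rank hyperparameter enters, the snippet below builds a LoRA-style low-rank weight update in plain PyTorch. The dimensions, scaling, and initialization are illustrative assumptions, not the paper's actual setup.

```python
import torch

def lora_delta(d_out, d_in, r, alpha, seed=0):
    """LoRA-style low-rank weight update: only r * (d_in + d_out)
    parameters are trained instead of d_in * d_out."""
    g = torch.Generator().manual_seed(seed)
    A = torch.randn(r, d_in, generator=g) * 0.01  # trainable, shape (r, d_in)
    B = torch.zeros(d_out, r)                     # trainable, starts at zero
    return (alpha / r) * (B @ A)                  # scaled low-rank update

# The rank r (and the scaling alpha) are the kind of hyperparameters the
# paper tunes: doubling r doubles the trainable parameters per adapted layer.
delta = lora_delta(d_out=768, d_in=768, r=8, alpha=16)
print(delta.shape)  # torch.Size([768, 768])
```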

Efficient Training Under Limited Resources

1 code implementation • 23 Jan 2023 • Mahdi Zolnouri, Dounia Lakhmiri, Christophe Tribes, Eyyüb Sari, Sébastien Le Digabel

The training time budget and the size of the dataset are among the factors that affect the performance of a deep neural network (DNN).

Data Augmentation • Neural Architecture Search
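A minimal sketch of what a hard training-time budget can look like, assuming a generic per-step callback; this only illustrates the constraint the abstract mentions, not the paper's training pipeline.

```python
import time

def train_with_budget(train_step, budget_seconds):
    # Stop when the wall-clock budget runs out rather than after a fixed
    # number of epochs; `train_step` is a hypothetical per-step callback.
    start, steps = time.monotonic(), 0
    while time.monotonic() - start < budget_seconds:
        train_step()
        steps += 1
    return steps

# Toy usage: a no-op step under a 0.1-second budget.
print(train_with_budget(lambda: None, budget_seconds=0.1))
```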

HyperNOMAD: Hyperparameter optimization of deep neural networks using mesh adaptive direct search

1 code implementation • 3 Jul 2019 • Dounia Lakhmiri, Sébastien Le Digabel, Christophe Tribes

The performance of deep neural networks is highly sensitive to the choice of the hyperparameters that define the structure of the network and the learning process.

Hyperparameter Optimization
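For intuition about mesh adaptive direct search, here is a heavily simplified derivative-free poll loop in Python. It captures only the poll-then-adapt-the-mesh idea; it is not HyperNOMAD's actual algorithm or API.

```python
def direct_search(blackbox, x0, mesh=1.0, min_mesh=1e-3):
    # Simplified MADS flavor: poll points around the incumbent along each
    # coordinate; expand the mesh after a success, shrink it after a failure.
    # `blackbox` maps hyperparameters to e.g. a validation loss and is
    # treated as a black box (no gradients available).
    x, fx = list(x0), blackbox(x0)
    while mesh > min_mesh:
        improved = False
        for i in range(len(x)):
            for step in (mesh, -mesh):
                y = x.copy()
                y[i] += step
                fy = blackbox(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        mesh = mesh * 2 if improved else mesh / 2
    return x, fx

# Toy blackbox standing in for "train a network and report validation loss".
best, loss = direct_search(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0])
print(best, loss)
```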
