Search Results for author: Herbert Woisetschläger

Found 4 papers, 0 papers with code

A Survey on Efficient Federated Learning Methods for Foundation Model Training

no code implementations · 9 Jan 2024 · Herbert Woisetschläger, Alexander Isenko, Shiqiang Wang, Ruben Mayer, Hans-Arno Jacobsen

We discuss the benefits and drawbacks of parameter-efficient fine-tuning (PEFT) for FL applications, elaborate on the readiness of FL frameworks to work with FMs, and outline future research opportunities on how to evaluate generative models in FL, as well as the interplay of privacy and PEFT.

Federated Learning · Privacy Preserving
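The abstract above contrasts PEFT with full-model federated fine-tuning. A minimal sketch of the idea, assuming a LoRA-style setup (all names, shapes, and the "local training" step here are illustrative placeholders, not the survey's method): each client keeps the base weights frozen, fits only small low-rank adapter factors, and the server averages just those adapters, cutting per-round communication.

```python
import numpy as np

# Sketch of parameter-efficient federated fine-tuning (LoRA-style).
# The pretrained weight W_base stays frozen on every client; only the
# low-rank factors A (d x r) and B (r x d) are trained and uploaded.
rng = np.random.default_rng(0)
d, r = 8, 2                            # model dim, adapter rank (r << d)
W_base = rng.standard_normal((d, d))   # frozen pretrained weight

def client_update(seed):
    """Stand-in for local training: return locally fitted adapter factors."""
    g = np.random.default_rng(seed)
    A = 0.01 * g.standard_normal((d, r))
    B = 0.01 * g.standard_normal((r, d))
    return A, B

clients = [client_update(s) for s in range(4)]

# FedAvg over adapters only: each client uploads 2*d*r values per round
# instead of the full d*d weight matrix.
A_avg = np.mean([A for A, _ in clients], axis=0)
B_avg = np.mean([B for _, B in clients], axis=0)
W_effective = W_base + A_avg @ B_avg   # merged fine-tuned weight

comm_full = W_base.size                # d*d values per round
comm_peft = A_avg.size + B_avg.size    # 2*d*r values per round
print(f"per-client upload: {comm_peft} vs {comm_full} parameters")
```

With d=8 and r=2 the adapter upload is 32 values against 64 for the full matrix; at realistic model sizes (d in the thousands, r in the tens) the gap is what makes PEFT attractive for bandwidth-constrained FL clients.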

Federated Fine-Tuning of LLMs on the Very Edge: The Good, the Bad, the Ugly

no code implementations · 4 Oct 2023 · Herbert Woisetschläger, Alexander Isenko, Shiqiang Wang, Ruben Mayer, Hans-Arno Jacobsen

Large Language Models (LLMs) and foundation models are popular because they offer new opportunities for individuals and businesses to improve natural language processing, interact with data, and retrieve information faster.

Computational Efficiency · Edge-computing +2
