1 code implementation • 8 Dec 2022 • Hamidreza Mahini, Hamid Mousavi, Masoud Daneshtalab
GTFLAT, as a game theory-based add-on, addresses an important research question: How can a federated learning algorithm achieve better performance and training efficiency by setting more effective adaptive weights for averaging in the model aggregation phase?
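The "adaptive weights for averaging" in the aggregation phase can be pictured with a generic weighted federated averaging step. This is only an illustrative sketch (GTFLAT derives its mixing weights via game theory, which is not reproduced here); the function name and uniform example weights are assumptions:

```python
import numpy as np

def aggregate(client_params, mixing):
    """Weighted average of client parameter vectors.

    client_params: list of 1-D numpy arrays, one per client
    mixing: non-negative per-client weights that sum to 1
    """
    mixing = np.asarray(mixing, dtype=float)
    assert np.isclose(mixing.sum(), 1.0)
    stacked = np.stack(client_params)            # (n_clients, n_params)
    return (mixing[:, None] * stacked).sum(axis=0)

# With uniform weights this reduces to plain FedAvg;
# an adaptive scheme would instead set `mixing` per round.
w = aggregate([np.array([0.0, 2.0]), np.array([2.0, 0.0])], [0.5, 0.5])
```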
1 code implementation • 14 Jul 2022 • Hamid Mousavi, Mohammad Loni, Mina Alibeigi, Masoud Daneshtalab
In this paper, we propose a new method to search for sparsity-friendly neural architectures.
1 code implementation • Design, Automation and Test in Europe Conference (DATE) 2022 • Mohammad Loni, Hamid Mousavi, Mohammad Riazati, Masoud Daneshtalab, and Mikael Sjodin
This paper proposes TAS, a framework that drastically reduces the accuracy gap between TNNs and their full-precision counterparts by integrating quantization into the network design.
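For context, the quantization that TNNs rely on projects full-precision weights onto three values. The sketch below uses the common threshold-and-scale heuristic from ternary weight networks; it is a generic illustration of the quantization TAS designs around, not the TAS method itself, and the `delta_scale` constant is an assumption:

```python
import numpy as np

def ternarize(w, delta_scale=0.7):
    """Project full-precision weights onto {-alpha, 0, +alpha}.

    delta is a magnitude threshold; alpha rescales the surviving
    weights to their mean absolute value.
    """
    delta = delta_scale * np.mean(np.abs(w))
    mask = np.abs(w) > delta            # which weights stay non-zero
    alpha = np.abs(w[mask]).mean() if mask.any() else 0.0
    return alpha * np.sign(w) * mask

q = ternarize(np.array([0.9, -0.05, 0.6, -0.8]))
```

The accuracy gap the paper targets comes precisely from this lossy projection, which TAS mitigates by making quantization part of the architecture search.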
1 code implementation • 4 Mar 2020 • Hamid Mousavi, Jakob Drefs, Florian Hirschberger, Jörg Lücke
Here, we consider LVMs that are defined for a range of different distributions, i.e., the observables can follow any (regular) distribution of the exponential family.
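For reference, the exponential family mentioned above is the standard class of densities of the form (natural parameters $\boldsymbol{\eta}$, sufficient statistics $T$, base measure $h$, log-partition function $A$):

```latex
p(\mathbf{x} \mid \boldsymbol{\eta})
  = h(\mathbf{x})\,
    \exp\!\big(\boldsymbol{\eta}^{\top} T(\mathbf{x}) - A(\boldsymbol{\eta})\big)
```

Gaussian, Bernoulli, Poisson, and Gamma observables are all special cases, which is what lets a single LVM derivation cover this range of noise models.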