no code implementations • 13 Jun 2023 • Ahmet Caner Yüzügüler, Nikolaos Dimitriadis, Pascal Frossard
Finding optimal channel dimensions (i.e., the number of filters in DNN layers) is essential to design DNNs that perform well under computational resource constraints.
1 code implementation • 21 Apr 2023 • Nikolaos Dimitriadis, François Fleuret, Pascal Frossard
Continual Learning is an important and challenging problem in machine learning, where models must adapt to a continuous stream of new data without forgetting previously acquired knowledge.
1 code implementation • 18 Oct 2022 • Nikolaos Dimitriadis, Pascal Frossard, François Fleuret
In Multi-Task Learning (MTL), tasks may compete and limit each other's performance, rather than guiding the optimization toward a solution superior to all of its single-task trained counterparts.
1 code implementation • 23 Mar 2022 • Ahmet Caner Yüzügüler, Nikolaos Dimitriadis, Pascal Frossard
Optimizing resource utilization in target platforms is key to achieving high performance during DNN inference.
no code implementations • 2 Mar 2022 • Kyle Matoba, Nikolaos Dimitriadis, François Fleuret
Over the decade since deep neural networks became state-of-the-art image classifiers, there has been a tendency towards less use of max pooling: the function that takes the largest value among nearby pixels in an image.
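The max pooling operation described above can be sketched in a few lines of NumPy; this minimal example assumes a single-channel image whose height and width are divisible by the window size (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def max_pool2d(x, k=2):
    """Non-overlapping k-by-k max pooling: keep the largest of nearby pixels.

    x: 2D array of shape (H, W) with H and W divisible by k.
    """
    H, W = x.shape
    # Split into k-by-k blocks, then take the max within each block.
    return x.reshape(H // k, k, W // k, k).max(axis=(1, 3))

img = np.array([[1, 2, 5, 6],
                [3, 4, 7, 8],
                [9, 8, 1, 2],
                [7, 6, 3, 4]], dtype=float)
print(max_pool2d(img))  # [[4. 8.]
                        #  [9. 4.]]
```

Each 2x2 neighborhood collapses to its maximum, halving the spatial resolution, which is exactly the downsampling behavior whose declining use the paper examines.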
no code implementations • 15 Nov 2020 • Nikolaos Dimitriadis, Petros Maragos
In this paper we study an emerging class of neural networks based on the morphological operators of dilation and erosion.
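A fully connected morphological layer replaces the usual multiply-accumulate with (max, +) or (min, -) operations. A minimal NumPy sketch of dilation and erosion layers, under the common definitions y_j = max_i(x_i + w_ij) and y_j = min_i(x_i - w_ij) (names and shapes are assumptions for illustration, not the paper's code):

```python
import numpy as np

def dilation_layer(x, w):
    # Morphological dilation: y_j = max_i (x_i + w_ij)
    # x: (n,) input vector, w: (n, m) weight matrix -> (m,) output.
    return np.max(x[:, None] + w, axis=0)

def erosion_layer(x, w):
    # Morphological erosion: y_j = min_i (x_i - w_ij)
    return np.min(x[:, None] - w, axis=0)

x = np.array([1.0, -2.0, 3.0])
w = np.zeros((3, 2))        # zero weights make the example easy to check
print(dilation_layer(x, w))  # [3. 3.]  (max of x per output unit)
print(erosion_layer(x, w))   # [-2. -2.] (min of x per output unit)
```

With zero weights, dilation reduces to a max over the input and erosion to a min, which makes the (max, +) / (min, -) algebra behind these networks easy to see.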