Search Results for author: Yannis Pantazis

Found 12 papers, 4 papers with code

Lipschitz-regularized gradient flows and generative particle algorithms for high-dimensional scarce data

1 code implementation • 31 Oct 2022 • Hyemin Gu, Panagiota Birmpa, Yannis Pantazis, Luc Rey-Bellet, Markos A. Katsoulakis

We build a new class of generative algorithms capable of efficiently learning an arbitrary target distribution from possibly scarce, high-dimensional data and subsequently generating new samples.

Data Integration • Representation Learning
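As a toy illustration of the particle idea (not the paper's algorithm): the sketch below transports 1-D particles along the gradient of a closed-form Gaussian-kernel (MMD) witness function, which stands in for the learned Lipschitz-regularized discriminator of the paper; all constants are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Scarce" target data and an initial cloud of generated particles.
target = rng.normal(3.0, 0.5, 400)
particles = rng.normal(0.0, 1.0, 200)

def witness_grad(x, xs_p, xs_q, h=2.0):
    # Gradient (w.r.t. x) of mean_q k(x, y) - mean_p k(x, y) for a
    # Gaussian kernel k with bandwidth h: attracts particles toward the
    # target sample and repels them from each other.
    def kgrad(x, ys):
        d = ys[None, :] - x[:, None]
        return (d / h**2 * np.exp(-d**2 / (2 * h**2))).mean(axis=1)
    return kgrad(x, xs_q) - kgrad(x, xs_p)

# Explicit Euler steps of the particle flow.
for _ in range(300):
    particles = particles + 0.5 * witness_grad(particles, particles, target)

print(particles.mean())  # drifts toward the target mean
```

In the paper the witness function is instead a neural discriminator trained under a Lipschitz regularization, which is what makes the flow well-behaved in high dimensions.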

Function-space regularized Rényi divergences

1 code implementation • 10 Oct 2022 • Jeremiah Birrell, Yannis Pantazis, Paul Dupuis, Markos A. Katsoulakis, Luc Rey-Bellet

We propose a new family of regularized Rényi divergences parametrized not only by the order $\alpha$ but also by a variational function space.

A Variance Reduction Method for Neural-based Divergence Estimation

no code implementations • 29 Sep 2021 • Jeremiah Birrell, Markos A. Katsoulakis, Yannis Pantazis, Dipjyoti Paul, Anastasios Tsourtis

Unfortunately, approximating the expectations inherent in variational formulas by statistical averages can be problematic due to high statistical variance, e.g., exponential variance for the Kullback-Leibler divergence and certain of its estimators.

Representation Learning
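The variance issue can be seen already in the Donsker-Varadhan representation of the KL divergence, $KL(P\|Q) = \sup_g \mathbb{E}_P[g] - \log \mathbb{E}_Q[e^g]$: the exponential moment under $Q$ is heavy-tailed, so the plug-in estimator's variance grows exponentially with the divergence itself. A minimal numerical illustration on a Gaussian toy example (not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def dv_kl_estimate(mu, n):
    # Donsker-Varadhan: KL(P||Q) = sup_g E_P[g] - log E_Q[e^g].
    # For P = N(mu, 1), Q = N(0, 1) the optimizer is the log-likelihood
    # ratio g*(x) = mu*x - mu**2/2, and KL(P||Q) = mu**2/2.
    xp = rng.normal(mu, 1.0, n)
    xq = rng.normal(0.0, 1.0, n)
    g = lambda x: mu * x - mu**2 / 2
    return g(xp).mean() - np.log(np.exp(g(xq)).mean())

def estimator_std(mu, n=1000, trials=200):
    # Empirical standard deviation of the plug-in estimator.
    return np.std([dv_kl_estimate(mu, n) for _ in range(trials)])

# Per-sample variance of exp(g*) under Q is exp(mu^2) - 1, i.e.
# exponential in the true divergence KL = mu^2 / 2.
print(estimator_std(0.5), estimator_std(3.0))
```

Even with the optimal function $g^*$ plugged in, the estimator for the larger divergence is dramatically noisier at the same sample size, which is the phenomenon the paper's variance-reduction method targets.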

Forward Looking Best-Response Multiplicative Weights Update Methods for Bilinear Zero-sum Games

no code implementations • 7 Jun 2021 • Michail Fasoulakis, Evangelos Markakis, Yannis Pantazis, Constantinos Varsos

Our work focuses on extra gradient learning algorithms for finding Nash equilibria in bilinear zero-sum games.
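For context, the plain extragradient method that such algorithms build on can be contrasted with simultaneous gradient descent-ascent on a toy bilinear game; the matrix and step size below are illustrative choices, not taken from the paper.

```python
import numpy as np

# Toy bilinear zero-sum game min_x max_y x^T A y with A = I in 2-D.
# Simultaneous gradient descent-ascent (GDA) cycles and slowly diverges,
# while the extragradient method (a look-ahead step before the real
# update) converges to the Nash equilibrium at the origin.
A = np.eye(2)
eta = 0.1

def gda_step(x, y):
    return x - eta * A @ y, y + eta * A.T @ x

def extragradient_step(x, y):
    # Extrapolation (look-ahead) point...
    x_half = x - eta * A @ y
    y_half = y + eta * A.T @ x
    # ...then update the original iterate with the look-ahead gradients.
    return x - eta * A @ y_half, y + eta * A.T @ x_half

x_g = x_e = np.array([1.0, 1.0])
y_g = y_e = np.array([1.0, -1.0])
for _ in range(200):
    x_g, y_g = gda_step(x_g, y_g)
    x_e, y_e = extragradient_step(x_e, y_e)

# GDA's iterate norm grows; extragradient's shrinks toward zero.
print(np.linalg.norm(x_g), np.linalg.norm(x_e))
```

The paper's forward-looking best-response multiplicative-weights updates refine this look-ahead idea in the multiplicative-weights setting.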

Inference of Stochastic Dynamical Systems from Cross-Sectional Population Data

no code implementations • 9 Dec 2020 • Anastasios Tsourtis, Yannis Pantazis, Ioannis Tsamardinos

Inferring the driving equations of a dynamical system from population or time-course data is important in several scientific fields such as biochemistry, epidemiology, financial mathematics and many others.

Epidemiology

$(f,\Gamma)$-Divergences: Interpolating between $f$-Divergences and Integral Probability Metrics

no code implementations • 11 Nov 2020 • Jeremiah Birrell, Paul Dupuis, Markos A. Katsoulakis, Yannis Pantazis, Luc Rey-Bellet

We develop a rigorous and general framework for constructing information-theoretic divergences that subsume both $f$-divergences and integral probability metrics (IPMs), such as the $1$-Wasserstein distance.

Image Generation • Uncertainty Quantification
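For reference, the $(f,\Gamma)$-divergence is defined through a variational objective of roughly the following form (reproduced from memory; consult the paper for the precise statement), where $f^*$ denotes the Legendre transform of $f$:

$$D_f^\Gamma(P \| Q) = \sup_{g \in \Gamma}\Big\{ \mathbb{E}_P[g] - \Lambda_f^Q[g] \Big\}, \qquad \Lambda_f^Q[g] = \inf_{\nu \in \mathbb{R}}\Big\{ \nu + \mathbb{E}_Q\big[f^*(g - \nu)\big] \Big\}.$$

Taking $\Gamma$ to be all bounded measurable functions recovers the classical $f$-divergence, while suitable choices of $f$ together with a $1$-Lipschitz function class $\Gamma$ recover IPMs such as the $1$-Wasserstein distance.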

Enhancing Speech Intelligibility in Text-To-Speech Synthesis using Speaking Style Conversion

1 code implementation • 13 Aug 2020 • Dipjyoti Paul, Muhammed PV Shifas, Yannis Pantazis, Yannis Stylianou

Intelligibility enhancement, as quantified by the Intelligibility in Bits (SIIB-Gauss) measure, shows that the proposed Lombard-SSDRC TTS system achieves significant relative improvements of 110% to 130% in speech-shaped noise (SSN) and 47% to 140% in competing-speaker noise (CSN) over the state-of-the-art TTS approach.

Speech Synthesis • Text-To-Speech Synthesis • +1

Speaker Conditional WaveRNN: Towards Universal Neural Vocoder for Unseen Speaker and Recording Conditions

1 code implementation • 9 Aug 2020 • Dipjyoti Paul, Yannis Pantazis, Yannis Stylianou

In terms of performance, our system has been preferred over the baseline TTS system by 60% over 15.5% and by 60.9% over 32.6%, for seen and unseen speakers, respectively.

Speech Synthesis

Optimizing Variational Representations of Divergences and Accelerating their Statistical Estimation

no code implementations • 15 Jun 2020 • Jeremiah Birrell, Markos A. Katsoulakis, Yannis Pantazis

Recently, they have gained popularity in machine learning as a tractable and scalable approach for training probabilistic models and for statistically differentiating between data distributions.

Predictive modeling approaches in laser-based material processing

no code implementations • 13 Jun 2020 • Maria Christina Velli, George D. Tsibidis, Alexandros Mimidis, Evangelos Skoulas, Yannis Pantazis, Emmanuel Stratakis

Predictive modelling represents an emerging field that combines existing and novel methodologies aimed at rapidly understanding physical mechanisms and concurrently developing new materials, processes and structures.

Cumulant GAN

no code implementations • 11 Jun 2020 • Yannis Pantazis, Dipjyoti Paul, Michail Fasoulakis, Yannis Stylianou, Markos Katsoulakis

In this paper, we propose a novel loss function for training Generative Adversarial Networks (GANs) aiming towards deeper theoretical understanding as well as improved stability and performance for the underlying optimization problem.

Image Generation

Training Generative Adversarial Networks with Weights

no code implementations • 6 Nov 2018 • Yannis Pantazis, Dipjyoti Paul, Michail Fasoulakis, Yannis Stylianou

The impressive success of Generative Adversarial Networks (GANs) is often overshadowed by the difficulties in their training.
