no code implementations • 13 Feb 2024 • Franz M. Rohrhofer, Stefan Posch, Clemens Gößnitzer, Bernhard C. Geiger
Furthermore, a specific network architecture is studied which is tailored for solutions in the form of traveling waves.
no code implementations • 3 Aug 2023 • Franz M. Rohrhofer, Stefan Posch, Clemens Gößnitzer, José M. García-Oliver, Bernhard C. Geiger
We assess a simple, yet effective loss weight adjustment that outperforms the standard mean-squared error optimization and enables accurate learning of all species mass fractions, even of minor species where the standard optimization completely fails.
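The abstract does not spell out the adjustment; a minimal sketch of a loss weighting in this spirit, with hypothetical inverse-magnitude weights per species (the function name `weighted_mse` and the toy data are illustrative, not the paper's exact scheme), could look like:

```python
import numpy as np

def weighted_mse(y_true, y_pred, eps=1e-30):
    """MSE with one weight per species (column): the inverse mean squared
    magnitude, so a 1% relative error in a minor species contributes as
    much as a 1% relative error in a major one. eps guards all-zero columns."""
    w = 1.0 / (np.mean(y_true**2, axis=0) + eps)
    return float(np.mean(w * (y_true - y_pred)**2))

rng = np.random.default_rng(0)
major = rng.uniform(0.1, 0.2, 100)    # major species mass fraction
minor = rng.uniform(1e-6, 2e-6, 100)  # minor species, five orders of magnitude smaller
y_true = np.stack([major, minor], axis=1)
y_pred = 1.01 * y_true                # 1% relative error in both species
loss = weighted_mse(y_true, y_pred)
```

Under a plain mean-squared error, the minor species would be invisible in the loss; here the same relative error in either column contributes equally.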
no code implementations • 3 Aug 2023 • Stefan Posch, Clemens Gößnitzer, Franz Rohrhofer, Bernhard C. Geiger, Andreas Wimmer
The turbulent jet ignition concept using prechambers is a promising solution to achieve stable combustion at lean conditions in large gas engines, leading to high efficiency at low emission levels.
no code implementations • 22 Feb 2023 • Maximilian B. Toller, Bernhard C. Geiger, Roman Kern
Rate-distortion theory-based outlier detection builds on the rationale that a good data compression scheme will encode outliers with unique symbols.
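This rationale can be made concrete with a general-purpose compressor. The leave-one-out zlib sketch below is a hypothetical illustration of the idea, not the paper's scoring scheme; the function `compression_outlier_scores` and the toy data are made up:

```python
import zlib
import numpy as np

def compression_outlier_scores(data, decimals=2):
    """Leave-one-out compression score: each point is scored by how many
    bytes it adds to the zlib-compressed text encoding of the dataset.
    Points that need 'unique symbols' compress poorly and score high."""
    def clen(rows):
        text = "\n".join(",".join(f"{v:.{decimals}f}" for v in row) for row in rows)
        return len(zlib.compress(text.encode()))
    base = clen(data)
    return np.array([base - clen(np.delete(data, i, axis=0))
                     for i in range(len(data))])

rng = np.random.default_rng(1)
inliers = np.round(rng.normal(0.0, 0.1, size=(50, 2)), 1)  # coarse, repetitive values
outlier = np.array([[7.77, -9.13]])                        # unique digit pattern
scores = compression_outlier_scores(np.vstack([inliers, outlier]))
```

The repetitive inlier rows are cheap for the compressor's back-references, so removing one saves little; removing the outlier with its unique byte pattern saves noticeably more.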
no code implementations • 11 Jan 2023 • Johannes G. Hoffer, Sascha Ranftl, Bernhard C. Geiger
We consider the problem of finding an input to a stochastic black-box function such that its scalar output is as close as possible to a target value in the sense of the expected squared error.
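The objective can be sketched as follows; the quadratic `black_box` and the crude grid search are illustrative stand-ins (the paper addresses this with more sophisticated machinery), and only the expected-squared-error criterion is taken from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

def black_box(x):
    """Hypothetical stochastic black box: quadratic mean response plus noise."""
    return (x - 1.5)**2 + rng.normal(0.0, 0.1)

def expected_sq_error(x, target, n_samples=200):
    """Monte Carlo estimate of E[(f(x) - target)^2]."""
    samples = np.array([black_box(x) for _ in range(n_samples)])
    return float(np.mean((samples - target)**2))

# crude grid search for the input whose output is closest to the target
target = 1.0
grid = np.linspace(-1.0, 4.0, 51)
best_x = min(grid, key=lambda x: expected_sq_error(x, target))
```

Note the objective is minimized where the *mean* response hits the target; the irreducible noise variance adds a constant floor to the expected squared error.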
no code implementations • 2 Nov 2022 • João Machado de Freitas, Bernhard C. Geiger
Learning invariant representations that remain useful for a downstream task is still a key challenge in machine learning.
1 code implementation • 31 May 2022 • João Machado de Freitas, Sebastian Berg, Bernhard C. Geiger, Manfred Mücke
In this paper, we frame homogeneous-feature multi-task learning (MTL) as a hierarchical representation learning problem, with one task-agnostic and multiple task-specific latent representations.
1 code implementation • 25 Mar 2022 • Franz M. Rohrhofer, Stefan Posch, Clemens Gößnitzer, Bernhard C. Geiger
This paper empirically studies commonly observed training difficulties of Physics-Informed Neural Networks (PINNs) on dynamical systems.
no code implementations • 18 Jan 2022 • Andreas B. Ofner, Achilles Kefalas, Stefan Posch, Bernhard C. Geiger
In addition, the model classified knocking cycles in unseen engines with an increased accuracy of 89% after adapting to their features via training on a small number of exclusively non-knocking cycles.
1 code implementation • 17 Dec 2021 • Sophie Steger, Bernhard C. Geiger, Marek Smieja
The introduced Constrained Markov Clustering (CoMaC) is an extension of a recent information-theoretic framework for (unsupervised) Markov aggregation to the semi-supervised case.
no code implementations • 3 May 2021 • Franz M. Rohrhofer, Stefan Posch, Bernhard C. Geiger
We use the diffusion equation and Navier-Stokes equations in various test environments to analyze the effects of system parameters on the shape of the Pareto front.
no code implementations • 30 Jan 2021 • Franz M. Rohrhofer, Santanu Saha, Simone Di Cataldo, Bernhard C. Geiger, Wolfgang von der Linden, Lilia Boeri
In this work, we seek to understand in depth how the choice of features and the properties of the database affect a machine learning application.
2 code implementations • 21 Jan 2021 • Christian Toth, Denis Helic, Bernhard C. Geiger
We thoroughly validate the effectiveness of our approach on synthetic and empirical networks, and compare Synwalk's performance with that of Infomap and Walktrap.
no code implementations • 18 Aug 2020 • Maximilian Toller, Bernhard C. Geiger, Roman Kern
Distance-based classification is among the most competitive classification methods for time series data.
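The classic baseline in this family is 1-nearest-neighbour classification under Euclidean distance; a minimal sketch (toy sine/cosine series, hypothetical function name):

```python
import numpy as np

def nn_classify(train_X, train_y, query):
    """1-nearest-neighbour classification of a time series under
    Euclidean distance, the standard distance-based baseline."""
    dists = np.linalg.norm(train_X - query, axis=1)
    return train_y[int(np.argmin(dists))]

t = np.linspace(0, 2 * np.pi, 64)
train_X = np.stack([np.sin(t), np.cos(t)])  # one training series per class
train_y = np.array(["sine", "cosine"])
label = nn_classify(train_X, train_y, np.sin(t) + 0.05)  # slightly shifted sine
```

Elastic distances such as DTW are often substituted for the Euclidean metric when series may be temporally misaligned.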
no code implementations • 21 Mar 2020 • Bernhard C. Geiger
Specifically, we argue that even in feed-forward neural networks the data processing inequality need not hold for estimates of mutual information.
no code implementations • 21 Jun 2019 • Marek Śmieja, Maciej Wołczyk, Jacek Tabor, Bernhard C. Geiger
We propose a semi-supervised generative model, SeGMA, which learns a joint probability distribution of data and their classes and which is implemented in a typical Wasserstein auto-encoder framework.
no code implementations • 6 Jun 2019 • Rana Ali Amjad, Bernhard C. Geiger
Furthermore, we suggest a neural network in which the decoder architecture is a parameterized naive Bayes decoder.
no code implementations • 18 Apr 2018 • Rana Ali Amjad, Kairen Liu, Bernhard C. Geiger
In this work, we investigate the use of three information-theoretic quantities -- entropy, mutual information with the class variable, and a class selectivity measure based on Kullback-Leibler divergence -- to understand and study the behavior of already trained fully-connected feed-forward neural networks.
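The first of these quantities is easy to estimate per neuron; the binning-based sketch below is an illustrative estimator (the function name and toy "neurons" are made up), not the exact procedure of the paper:

```python
import numpy as np

def activation_entropy(activations, bins=10):
    """Entropy (in bits) of a single neuron's output, estimated by
    histogramming its activations over a dataset."""
    hist, _ = np.histogram(activations, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]  # 0 log 0 := 0
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
dead_neuron = np.zeros(1000)             # constant output carries no information
active_neuron = rng.uniform(0, 1, 1000)  # spread-out output -> high entropy
h_dead = activation_entropy(dead_neuron)
h_active = activation_entropy(active_neuron)
```

A neuron with (near-)zero output entropy is uninformative regardless of its weights, which is what makes such quantities useful probes of trained networks.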
no code implementations • 27 Feb 2018 • Rana Ali Amjad, Bernhard C. Geiger
In this theory paper, we investigate training deep neural networks (DNNs) for classification via minimizing the information bottleneck (IB) functional.
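The IB functional referenced here is the standard Lagrangian trade-off between compression and prediction; with input $X$, internal representation $T$, and class variable $Y$, it reads

```latex
\min_{p(t \mid x)} \; I(X;T) \;-\; \beta\, I(T;Y)
```

where $\beta \ge 0$ balances compressing the input (small $I(X;T)$) against preserving class-relevant information (large $I(T;Y)$).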
no code implementations • 2 Jan 2018 • Clemens Bloechl, Rana Ali Amjad, Bernhard C. Geiger
We present an information-theoretic cost function for co-clustering, i.e., for simultaneous clustering of two sets based on similarities between their elements.
no code implementations • 3 May 2017 • Marek Śmieja, Bernhard C. Geiger
By combining the ideas from cross-entropy clustering (CEC) with those from the information bottleneck method (IB), our method trades between three conflicting goals: the accuracy with which the data set is modeled, the simplicity of the model, and the consistency of the clustering with side information.
no code implementations • 17 Aug 2016 • Bernhard C. Geiger, Rana Ali Amjad
In this paper, we investigate mutual information as a cost function for clustering, and show in which cases hard, i.e., deterministic, clusters are optimal.
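For a hard clustering, the cluster assignment $C$ is a deterministic function of $X$, so $I(X;C) = H(C)$. A minimal sketch (illustrative function name and joint table, not taken from the paper):

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits, computed from a joint probability table p_xy."""
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0  # 0 log 0 := 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask])))

# hard (deterministic) clustering of 4 symbols into 2 clusters:
# x0, x1 -> cluster 0 and x2, x3 -> cluster 1, with uniform p(x)
p_xc = np.array([[0.25, 0.0],
                 [0.25, 0.0],
                 [0.0, 0.25],
                 [0.0, 0.25]])
i_xc = mutual_information(p_xc)  # equals H(C) = 1 bit for this hard clustering
```

A soft (stochastic) assignment would place probability mass off the block-diagonal pattern and generally yield $I(X;C) < H(C)$.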