no code implementations • 13 Feb 2024 • Aryeh Kontorovich, Amichai Painsky
A variety of techniques are employed and extended, including Chernoff-type inequalities and empirical Bernstein bounds.
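Empirical Bernstein bounds have a standard form (due to Maurer and Pontil) that is easy to sketch: a confidence radius for the mean that combines a variance term with a lower-order range term. The function below is an illustrative baseline for samples bounded in [0, 1], not the bound derived in the paper:

```python
import math

def empirical_bernstein_radius(xs, delta):
    """Half-width of a (1 - delta) confidence interval for the mean of
    i.i.d. samples bounded in [0, 1], via the Maurer-Pontil empirical
    Bernstein inequality.  A minimal sketch, not the paper's bound."""
    n = len(xs)
    mean = sum(xs) / n
    # Unbiased sample variance drives the leading term of the bound.
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    log_term = math.log(2.0 / delta)
    return math.sqrt(2.0 * var * log_term / n) + 7.0 * log_term / (3.0 * (n - 1))
```

Unlike a Hoeffding-style bound, the radius adapts to the observed variance: low-variance samples yield a much tighter interval at the same sample size.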
no code implementations • 21 Jan 2024 • Simon Anuk, Tamir Bendory, Amichai Painsky
This paper studies the classical problem of detecting the locations of multiple occurrences of an image in a two-dimensional noisy measurement.
no code implementations • 6 Nov 2022 • Amichai Painsky
Unobserved events are alphabet symbols which do not appear in the sample.
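The total probability of such unobserved symbols (the "missing mass") is classically estimated by the Good-Turing rule: the fraction of the sample occupied by symbols seen exactly once. A minimal sketch of that baseline, not the estimator analyzed in the paper:

```python
from collections import Counter

def good_turing_missing_mass(sample):
    """Good-Turing estimate of the total probability of unobserved symbols:
    the fraction of the sample made up of symbols seen exactly once."""
    counts = Counter(sample)
    singletons = sum(1 for c in counts.values() if c == 1)
    return singletons / len(sample)
```

Intuitively, symbols seen once are a proxy for how much probability mass the sample has not yet covered.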
no code implementations • 23 Sep 2022 • Uriel Shiterburd, Tamir Bendory, Amichai Painsky
This paper studies the classical problem of estimating the locations of signal occurrences in a noisy measurement.
1 code implementation • 16 Aug 2022 • Mordechai Roth, Amichai Painsky, Tamir Bendory
This paper studies the classical problem of detecting the locations of signal occurrences in a one-dimensional noisy measurement.
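A classical baseline for this detection problem is the matched filter: score every shift of the known template against the measurement by cross-correlation, then report thresholded local maxima. The sketch below illustrates that baseline only, not the algorithm proposed in the paper:

```python
def matched_filter_peaks(y, template, threshold):
    """Score each shift of the template against measurement y by
    cross-correlation, then return shifts whose score exceeds the
    threshold and is a local maximum.  A toy matched-filter sketch."""
    L = len(template)
    scores = [sum(y[i + j] * template[j] for j in range(L))
              for i in range(len(y) - L + 1)]
    peaks = []
    for i, s in enumerate(scores):
        left = scores[i - 1] if i > 0 else float('-inf')
        right = scores[i + 1] if i + 1 < len(scores) else float('-inf')
        if s > threshold and s >= left and s >= right:
            peaks.append(i)
    return peaks
```

For example, a template `[1, 2, 1]` planted at positions 2 and 8 of an otherwise zero measurement is recovered exactly; in the noisy, closely spaced regime studied in these papers, the threshold and separation constraints become the interesting part.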
no code implementations • 12 Sep 2021 • Afek Ilay Adler, Amichai Painsky
The effect of this bias has been studied extensively over the years, mostly in terms of predictive performance.
1 code implementation • 21 Dec 2020 • Yuval Shalev, Amichai Painsky, Irad Ben-Gal
Estimating the entropy of a discrete random variable is a fundamental problem in information theory and related fields.
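The textbook baseline is the plug-in (maximum-likelihood) estimator, optionally with the Miller-Madow first-order bias correction. The sketch below shows that baseline, not the estimator proposed in the paper:

```python
import math
from collections import Counter

def plugin_entropy(sample, miller_madow=False):
    """Plug-in (maximum-likelihood) estimate of the Shannon entropy, in bits,
    from an i.i.d. sample of a discrete random variable.  Optionally adds the
    Miller-Madow bias correction (K - 1) / (2 n ln 2), where K is the number
    of observed symbols."""
    n = len(sample)
    counts = Counter(sample)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    if miller_madow:
        h += (len(counts) - 1) / (2 * n * math.log(2))
    return h
```

The plug-in estimator is negatively biased, which is why corrections and more refined estimators are an active topic, especially when the alphabet is large relative to the sample.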
no code implementations • ICLR 2019 • Ravid Shwartz-Ziv, Amichai Painsky, Naftali Tishby
Specifically, we show that the training of the network is characterized by a rapid increase in the mutual information (MI) between the layers and the target label, followed by a longer decrease in the MI between the layers and the input variable.
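In the discrete case, the MI quantities tracked here can be estimated with a simple plug-in computation over the empirical joint distribution. This is a toy sketch of the quantity itself, not the paper's binning procedure for network activations:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate, in bits, of the mutual information between two
    discrete sequences, computed from their empirical joint distribution."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    # I(X; Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) ).
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())
```

Independent sequences give MI near zero, while a deterministic relation gives MI equal to the entropy of the variable, which is the kind of compression-versus-fitting signal the paper tracks across layers.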
no code implementations • 31 Oct 2018 • Amichai Painsky, Meir Feder, Naftali Tishby
In this work we introduce an information-theoretic compressed representation framework for the non-linear CCA problem (CRCCA), which extends the classical ACE approach.
no code implementations • 26 Oct 2018 • Amichai Painsky, Saharon Rosset
In addition, we introduce a theoretically sound lossy compression scheme, which allows us to control the trade-off between the distortion and the coding rate.
no code implementations • 16 Sep 2018 • Amichai Painsky, Saharon Rosset, Meir Feder
Importantly, we show that the overhead of our suggested algorithm (compared with the lower bound) typically decreases, as the scale of the problem grows.
no code implementations • 13 Sep 2018 • Amichai Painsky
In this paper we focus on the exclusive row biclustering problem for gene expression data sets, in which each row can only be a member of a single bicluster while columns can participate in multiple ones.
no code implementations • 13 Sep 2018 • Amichai Painsky
Independent component analysis (ICA) is a statistical method for transforming an observable multi-dimensional random vector into components that are as statistically independent of each other as possible.
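A classical two-channel illustration of this idea: whiten the observed pair, then search over rotation angles for the one maximizing a non-Gaussianity contrast (here, the sum of squared excess kurtoses). This is a toy sketch of textbook kurtosis-based ICA under the usual at-most-one-Gaussian-source assumption, not the paper's method:

```python
import math

def two_source_ica(x1, x2, n_angles=180):
    """Toy two-channel ICA: whiten the observed pair, then grid-search the
    rotation angle maximizing the sum of squared excess kurtoses of the
    rotated components (a classical non-Gaussianity contrast)."""
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    x1 = [a - m1 for a in x1]
    x2 = [b - m2 for b in x2]
    c11 = sum(a * a for a in x1) / n
    c12 = sum(a * b for a, b in zip(x1, x2)) / n
    c22 = sum(b * b for b in x2) / n
    # Whiten (Gram-Schmidt style): unit variances, zero cross-correlation.
    z1 = [a / math.sqrt(c11) for a in x1]
    resid_var = c22 - c12 * c12 / c11
    z2 = [(b - (c12 / c11) * a) / math.sqrt(resid_var)
          for a, b in zip(x1, x2)]

    def excess_kurtosis(y):
        m2_ = sum(v * v for v in y) / n
        m4 = sum(v ** 4 for v in y) / n
        return m4 / (m2_ * m2_) - 3.0

    best, best_pair = -1.0, (z1, z2)
    for k in range(n_angles):
        t = math.pi * k / n_angles
        c, s = math.cos(t), math.sin(t)
        y1 = [c * a + s * b for a, b in zip(z1, z2)]
        y2 = [-s * a + c * b for a, b in zip(z1, z2)]
        contrast = excess_kurtosis(y1) ** 2 + excess_kurtosis(y2) ** 2
        if contrast > best:
            best, best_pair = contrast, (y1, y2)
    return best_pair
```

On a mixture of two empirically uncorrelated non-Gaussian sources (e.g. a sinusoid and a square wave), the recovered components align with the sources up to sign, permutation, and scale, which is the inherent ambiguity of ICA.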
no code implementations • 6 Jul 2018 • Amichai Painsky, Meir Feder
Estimating a large alphabet probability distribution from a limited number of samples is a fundamental problem in machine learning and statistics.
no code implementations • 7 Nov 2017 • Amichai Painsky, Naftali Tishby
In this work we introduce a Gaussian lower bound to the IB curve; we find an embedding of the data which maximizes its "Gaussian part", on which we apply the GIB.
no code implementations • 10 Dec 2015 • Amichai Painsky, Saharon Rosset
The most important consequence of our approach is that categorical variables with many categories can be safely used in tree building and are only chosen if they contribute to predictive power.