Search Results for author: Andrey Y. Lokhov

Found 20 papers, 9 papers with code

Learning of networked spreading models from noisy and incomplete data

no code implementations20 Dec 2023 Mateusz Wilinski, Andrey Y. Lokhov

Recent years have seen a lot of progress in algorithms for learning parameters of spreading dynamics from both full and partial data.

Model Selection

Forced oscillation source localization from generator measurements

no code implementations30 Sep 2023 Melvyn Tyloo, Marc Vuffray, Andrey Y. Lokhov

Malfunctioning equipment, erroneous operating conditions, or periodic load variations can cause periodic disturbances that persist over time, creating an undesirable transfer of energy across the system -- an effect referred to as forced oscillations.

Learning Energy-Based Representations of Quantum Many-Body States

1 code implementation8 Apr 2023 Abhijith Jayakumar, Marc Vuffray, Andrey Y. Lokhov

An ideal representation of a quantum state combines a succinct characterization informed by the system's structure and symmetries, along with the ability to predict the physical observables of interest.

High-quality Thermal Gibbs Sampling with Quantum Annealing Hardware

no code implementations3 Sep 2021 Jon Nelson, Marc Vuffray, Andrey Y. Lokhov, Tameem Albash, Carleton Coffrin

This work builds on those insights and identifies a class of small hardware-native Ising models that are robust to noise effects and proposes a procedure for executing these models on QA hardware to maximize Gibbs sampling performance.

Combinatorial Optimization

Single-Qubit Fidelity Assessment of Quantum Annealing Hardware

1 code implementation7 Apr 2021 Jon Nelson, Marc Vuffray, Andrey Y. Lokhov, Carleton Coffrin

Overall, the proposed QASA protocol provides a useful tool for assessing the performance of current and emerging quantum annealing devices.

Exponential Reduction in Sample Complexity with Learning of Ising Model Dynamics

1 code implementation2 Apr 2021 Arkopal Dutt, Andrey Y. Lokhov, Marc Vuffray, Sidhant Misra

We observe that for samples coming from a dynamical process far from equilibrium, the sample complexity reduces exponentially compared to a dynamical process that mixes quickly.
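The dynamical setting contrasted here with equilibrium sampling can be sketched as follows: a minimal Glauber-dynamics simulation of a small Ising model, where each consecutive pair of configurations along the trajectory is one dynamical sample. The couplings, system size, and trajectory length are illustrative placeholders, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

p = 5                               # number of spins (illustrative)
J = 0.3 * np.triu(rng.standard_normal((p, p)), k=1)
J = J + J.T                         # symmetric couplings, zero diagonal

def glauber_trajectory(J, steps, rng):
    """Run single-spin-flip Glauber dynamics from a random start."""
    p = J.shape[0]
    sigma = rng.choice([-1, 1], size=p)
    traj = [sigma.copy()]
    for _ in range(steps):
        i = rng.integers(p)                       # pick a spin uniformly
        h = J[i] @ sigma                          # local field on spin i (J[i, i] = 0)
        prob_up = 1.0 / (1.0 + np.exp(-2.0 * h))  # P(sigma_i = +1 | rest)
        sigma[i] = 1 if rng.random() < prob_up else -1
        traj.append(sigma.copy())
    return np.array(traj)

traj = glauber_trajectory(J, steps=1000, rng=rng)
# Each consecutive pair (traj[t], traj[t+1]) is one dynamical sample;
# learning J from such pairs is the setting the paper analyzes.
print(traj.shape)   # (1001, 5)
```

Far from equilibrium, early portions of such trajectories explore configurations that a rapidly mixing equilibrium sampler would rarely produce, which is the regime where the snippet above reports the exponential reduction in sample complexity.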

Learning Continuous Exponential Families Beyond Gaussian

1 code implementation18 Feb 2021 Christopher X. Ren, Sidhant Misra, Marc Vuffray, Andrey Y. Lokhov

We address the problem of learning of continuous exponential family distributions with unbounded support.

Programmable Quantum Annealers as Noisy Gibbs Samplers

no code implementations16 Dec 2020 Marc Vuffray, Carleton Coffrin, Yaroslav A. Kharkov, Andrey Y. Lokhov

Drawing independent samples from high-dimensional probability distributions represents the major computational bottleneck for modern algorithms, including powerful machine learning frameworks such as deep learning.

Prediction-Centric Learning of Independent Cascade Dynamics from Partial Observations

1 code implementation13 Jul 2020 Mateusz Wilinski, Andrey Y. Lokhov

Spreading processes play an increasingly important role in modeling diffusion in networks, information propagation, marketing, and opinion formation.

Marketing

Learning of Discrete Graphical Models with Neural Networks

1 code implementation NeurIPS 2020 Abhijith J., Andrey Y. Lokhov, Sidhant Misra, Marc Vuffray

In addition, we also show a variant of NeurISE that can be used to learn a neural net representation for the full energy function of the true model.

Scalable Influence Estimation Without Sampling

no code implementations29 Dec 2019 Andrey Y. Lokhov, David Saad

Compared to simulation-based techniques, the resulting saving of a potentially large sampling factor in running time makes it possible to address large-scale problem instances.

Efficient Learning of Discrete Graphical Models

1 code implementation NeurIPS 2020 Marc Vuffray, Sidhant Misra, Andrey Y. Lokhov

We identify a single condition related to model parametrization that leads to rigorous guarantees on the recovery of model structure and parameters in any error norm, and is readily verifiable for a large class of models.

Online Learning of Power Transmission Dynamics

no code implementations27 Oct 2017 Andrey Y. Lokhov, Marc Vuffray, Dmitry Shemetov, Deepjyoti Deka, Michael Chertkov

We consider the problem of reconstructing the dynamic state matrix of transmission power grids from time-stamped PMU measurements in the regime of ambient fluctuations.
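A generic baseline for this reconstruction task is least squares on the discretized linear model x_{t+1} ≈ M x_t with M = I + A·dt, fitted to an ambient-fluctuation time series. This is a hedged sketch of that standard approach; the matrix `A_true`, noise level, and time step are illustrative placeholders, not the paper's specific online estimator or PMU data.

```python
import numpy as np

rng = np.random.default_rng(2)
p, T, dt = 3, 20000, 0.01
A_true = np.array([[-1.0,  0.5,  0.0],
                   [ 0.3, -0.8,  0.2],
                   [ 0.0,  0.4, -1.2]])     # placeholder stable state matrix
M_true = np.eye(p) + A_true * dt            # first-order discretization of exp(A*dt)

X = np.zeros((T, p))
X[0] = rng.standard_normal(p)               # arbitrary initial state
for t in range(T - 1):
    noise = 0.05 * np.sqrt(dt) * rng.standard_normal(p)
    X[t + 1] = M_true @ X[t] + noise        # ambient stochastic fluctuations

# Least squares: solve X[:-1] @ B = X[1:] for B = M^T, then recover A
B, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)
A_hat = (B.T - np.eye(p)) / dt
print(np.round(A_hat, 2))
```

With enough ambient samples, `A_hat` approaches `A_true`; the paper's contribution concerns doing such estimation online from time-stamped PMU measurements rather than this batch regression.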

Information Theoretic Optimal Learning of Gaussian Graphical Models

no code implementations15 Mar 2017 Sidhant Misra, Marc Vuffray, Andrey Y. Lokhov

What is the optimal number of independent observations from which a sparse Gaussian Graphical Model can be correctly recovered?

Graph Reconstruction

Optimal structure and parameter learning of Ising models

1 code implementation15 Dec 2016 Andrey Y. Lokhov, Marc Vuffray, Sidhant Misra, Michael Chertkov

Reconstruction of structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning.

Reconstructing parameters of spreading models from partial observations

no code implementations NeurIPS 2016 Andrey Y. Lokhov

Spreading processes are often modelled as stochastic dynamics occurring on top of a given network, with edge weights corresponding to the transmission probabilities.

Interaction Screening: Efficient and Sample-Optimal Learning of Ising Models

no code implementations NeurIPS 2016 Marc Vuffray, Sidhant Misra, Andrey Y. Lokhov, Michael Chertkov

We prove that with appropriate regularization, the estimator recovers the underlying graph using a number of samples that is logarithmic in the system size p and exponential in the maximum coupling-intensity and maximum node-degree.
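The estimator behind this guarantee minimizes, for each node u, the Interaction Screening Objective: the empirical average of exp(-sigma_u * <theta, sigma_{\u}>) over the samples (plus an l1 penalty, omitted here). The sketch below evaluates that objective on synthetic placeholder data; the sample configurations are illustrative, not drawn from a fitted Ising model.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 2000, 4
samples = rng.choice([-1, 1], size=(n, p))   # placeholder spin configurations

def iso(theta, samples, u):
    """Empirical Interaction Screening Objective for node u."""
    others = np.delete(samples, u, axis=1)   # spins of all nodes except u
    local = others @ theta                   # <theta, sigma_{\u}> for each sample
    return np.mean(np.exp(-samples[:, u] * local))

theta = np.zeros(p - 1)
print(iso(theta, samples, u=0))   # exp(0) averaged over samples -> 1.0
```

The objective is convex in theta, so each node's couplings can be recovered by standard convex optimization; the quoted result bounds how many samples n this needs as a function of the system size p, maximum coupling, and maximum degree.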

Efficient reconstruction of transmission probabilities in a spreading process from partial observations

no code implementations23 Sep 2015 Andrey Y. Lokhov, Theodor Misiakiewicz

A number of recent papers introduced efficient algorithms for the estimation of spreading parameters, based on the maximization of the likelihood of observed cascades, assuming that the full information for all the nodes in the network is available.
