Distributed Computing
70 papers with code • 0 benchmarks • 1 dataset
Benchmarks
These leaderboards are used to track progress in Distributed Computing
Libraries
Use these libraries to find Distributed Computing models and implementations
Latest papers with no code
Consensus learning: A novel decentralised ensemble learning paradigm
This work introduces a novel distributed machine learning paradigm -- \emph{consensus learning} -- which combines classical ensemble methods with consensus protocols deployed in peer-to-peer systems.
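The snippet above describes combining ensemble predictions with a peer-to-peer consensus protocol. A minimal sketch of that idea, assuming a hypothetical setup where each peer has already produced a local prediction and the group settles on a label by simple majority vote (one of the simplest consensus rules, not necessarily the paper's protocol):

```python
from collections import Counter

def majority_consensus(votes):
    """Return the label a strict majority of peers agree on, else None."""
    label, count = Counter(votes).most_common(1)[0]
    return label if count > len(votes) / 2 else None

# Three peers classify the same input; two agree, so "cat" wins.
peer_predictions = ["cat", "cat", "dog"]
print(majority_consensus(peer_predictions))  # -> cat
```

In a real peer-to-peer deployment the vote exchange itself would run over a fault-tolerant consensus protocol rather than a single local function call.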
Scalable Volt-VAR Optimization using RLlib-IMPALA Framework: A Reinforcement Learning Approach
To address this challenge, our research presents a novel framework that harnesses the potential of Deep Reinforcement Learning (DRL), specifically utilizing the Importance Weighted Actor-Learner Architecture (IMPALA) algorithm, executed on the Ray platform.
Streaming IoT Data and the Quantum Edge: A Classic/Quantum Machine Learning Use Case
However, challenges such as (1) the encoding of data from the classical to the quantum domain, (2) hyperparameter tuning, and (3) the integration of quantum hardware into a distributed computing continuum limit the adoption of quantum machine learning for urgent analytics.
A Generalization of Arrow's Impossibility Theorem Through Combinatorial Topology
We present a generalization of Arrow's impossibility theorem and prove it using a combinatorial topology framework.
Programming Distributed Collective Processes in the eXchange Calculus
Recent trends like the Internet of Things (IoT) suggest a vision of dense and multi-scale deployments of computing devices in nearly all kinds of environments.
Distributed Solvers for Network Linear Equations with Scalarized Compression
We then employ such a compressed consensus flow as a fundamental consensus subroutine to develop distributed continuous-time and discrete-time solvers for network linear equations, and prove their exponential convergence properties under scalar node communications.
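The snippet above refers to consensus-based distributed solvers for network linear equations. As background, a sketch of the classic (uncompressed) projection-consensus scheme, where node i knows only its own equation a_i · x = b_i, averages neighbor estimates, then projects onto its equation's solution hyperplane; the node counts, matrix, and fully connected averaging here are illustrative assumptions, not the paper's compressed algorithm:

```python
import numpy as np

A = np.array([[1.0, 1.0], [1.0, -1.0]])  # one equation row per node
b = np.array([3.0, 1.0])                 # exact solution is x = (2, 1)
x = np.zeros((2, 2))                     # each node's current estimate

for _ in range(200):
    avg = x.mean(axis=0)                 # consensus step (fully connected)
    for i in range(2):
        a_i, b_i = A[i], b[i]
        # project the averaged estimate onto {x : a_i . x = b_i}
        x[i] = avg - a_i * (a_i @ avg - b_i) / (a_i @ a_i)

print(np.round(x.mean(axis=0), 3))       # converges toward [2, 1]
```

The compression studied in the paper replaces the full-vector exchange in the averaging step with scalar-valued messages while preserving exponential convergence.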
Federated Analytics for 6G Networks: Applications, Challenges, and Opportunities
Extensive research is underway to meet the hyper-connectivity demands of 6G networks, driven by applications like XR/VR and holographic communications, which generate substantial data requiring network-based processing, transmission, and analysis.
Balanced Multi-modal Federated Learning via Cross-Modal Infiltration
Federated learning (FL) underpins advancements in privacy-preserving distributed computing by collaboratively training neural networks without exposing clients' raw data.
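The snippet above summarizes the core federated learning loop: clients train on private data and only model updates reach the server. A minimal FedAvg-style sketch under assumed toy data (linear model, two synthetic clients, sizes and learning rate chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (30, 50):                      # two clients with different data sizes
    X = rng.normal(size=(n, 2))
    y = X @ true_w
    clients.append((X, y))

w = np.zeros(2)                         # global model held by the server
for _ in range(50):                     # communication rounds
    updates, sizes = [], []
    for X, y in clients:
        w_local = w.copy()
        for _ in range(5):              # local gradient steps on private data
            grad = 2 * X.T @ (X @ w_local - y) / len(y)
            w_local -= 0.1 * grad
        updates.append(w_local)
        sizes.append(len(y))
    # server: size-weighted average of client models; raw data never leaves clients
    w = np.average(updates, axis=0, weights=sizes)

print(np.round(w, 2))                   # close to true_w = [2, -1]
```

Multi-modal variants like the one in this paper additionally have to balance updates coming from clients whose modalities train at different speeds.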
Privacy-preserving quantum federated learning via gradient hiding
Distributed quantum computing, particularly distributed quantum machine learning, has gained substantial prominence for its capacity to harness the collective power of distributed quantum resources, transcending the limitations of individual quantum nodes.
The Landscape of Modern Machine Learning: A Review of Machine, Distributed and Federated Learning
With the advance of powerful heterogeneous, parallel, and distributed computing systems and the ever-increasing volume of data, machine learning has become an indispensable part of cutting-edge technology, scientific research, and consumer products.