no code implementations • 21 Jun 2023 • Kun Huang, Linli Zhou, Shi Pu
Notably, both results are comparable to the convergence rates of centralized RR methods (up to constant factors depending on the network topology) and outperform those of previous distributed random reshuffling algorithms.
1 code implementation • 7 May 2023 • Shengfang Zhai, Yinpeng Dong, Qingni Shen, Shi Pu, Yuejian Fang, Hang Su
To gain a better understanding of the training process and potential risks of text-to-image synthesis, we perform a systematic investigation of backdoor attacks on text-to-image diffusion models and propose BadT2I, a general multimodal backdoor attack framework that tampers with image synthesis at diverse semantic levels.
3 code implementations • 12 Mar 2023 • Fan Bao, Shen Nie, Kaiwen Xue, Chongxuan Li, Shi Pu, Yaole Wang, Gang Yue, Yue Cao, Hang Su, Jun Zhu
Inspired by the unified view, UniDiffuser learns all distributions simultaneously with a minimal modification to the original diffusion model -- perturbs data in all modalities instead of a single modality, inputs individual timesteps in different modalities, and predicts the noise of all modalities instead of a single modality.
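The mechanism described above can be sketched in a toy training step. This is a minimal illustration, not the paper's implementation: `eps_model` is a hypothetical joint noise predictor, and the noise schedule and shapes are illustrative. The key point is that each modality gets its own independent timestep, both modalities are perturbed, and a single model predicts the noise of all modalities at once.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000                                                    # number of diffusion steps
alphas_bar = np.cumprod(1.0 - np.linspace(1e-4, 2e-2, T))   # a standard DDPM-style schedule

def perturb(x, t):
    """Forward-diffuse x to timestep t; returns the noisy sample and the noise."""
    eps = rng.standard_normal(x.shape)
    x_t = np.sqrt(alphas_bar[t]) * x + np.sqrt(1.0 - alphas_bar[t]) * eps
    return x_t, eps

def unidiffuser_training_step(x_img, x_txt, eps_model):
    # Independent timesteps per modality: fixing t_txt = 0 recovers a
    # conditional model of images given text, and vice versa.
    t_img, t_txt = rng.integers(T), rng.integers(T)
    z_img, eps_img = perturb(x_img, t_img)
    z_txt, eps_txt = perturb(x_txt, t_txt)
    pred_img, pred_txt = eps_model(z_img, t_img, z_txt, t_txt)
    # One loss over the noise of all modalities instead of a single modality.
    return np.mean((pred_img - eps_img) ** 2) + np.mean((pred_txt - eps_txt) ** 2)
```

Sampling one timestep per modality is what lets a single network cover marginal, conditional, and joint generation without retraining.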
no code implementations • 30 Jan 2023 • Kun Huang, Xiao Li, Shi Pu
Distributed stochastic optimization has drawn great attention recently due to its effectiveness in solving large-scale machine learning problems.
no code implementations • 14 Jan 2023 • Kun Huang, Shi Pu
In this paper, we consider solving the distributed optimization problem over a multi-agent network under the communication restricted setting.
1 code implementation • CVPR 2022 • Shi Pu, Kaili Zhao, Mao Zheng
Further, we synthesize features of unseen classes by proposing a class generator that interpolates and extrapolates the features of seen classes.
Ranked #14 on Zero-Shot Action Recognition on UCF101
no code implementations • 31 Dec 2021 • Kun Huang, Xiao Li, Andre Milzarek, Shi Pu, Junwen Qiu
We show that D-RR inherits favorable characteristics of RR for both smooth strongly convex and smooth nonconvex objective functions.
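The idea behind distributed random reshuffling can be sketched as follows. This is an illustrative sketch, not the paper's D-RR algorithm verbatim: the step size, mixing matrix `W`, and function names are assumptions. Each agent draws a fresh permutation of its local samples at the start of every epoch, takes incremental gradient steps through that permutation, and averages its iterate with its neighbors.

```python
import numpy as np

def d_rr_epoch(x, local_grads, W, lr, rng):
    """One epoch of a distributed random-reshuffling scheme (sketch).

    x           : (n_agents, dim) array of current iterates, one row per agent
    local_grads : local_grads[i][j](x) -> gradient of agent i's j-th sample at x
    W           : (n_agents, n_agents) doubly stochastic mixing matrix
    """
    n_agents = x.shape[0]
    m = len(local_grads[0])                       # samples per agent
    perms = [rng.permutation(m) for _ in range(n_agents)]  # fresh shuffle each epoch
    for j in range(m):
        g = np.stack([local_grads[i][perms[i][j]](x[i]) for i in range(n_agents)])
        x = W @ (x - lr * g)                      # local sample step, then neighbor mixing
    return x
```

Reshuffling (sampling without replacement) is what distinguishes this from distributed SGD: every local sample is visited exactly once per epoch, which is the property behind RR's favorable rates.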
no code implementations • 26 Jul 2021 • Zhuoqing Song, Lei Shi, Shi Pu, Ming Yan
We consider the decentralized optimization problem, where a network of $n$ agents aims to collaboratively minimize the average of their individual smooth and convex objective functions through peer-to-peer communication in a directed graph.
no code implementations • 14 Jun 2021 • Zhuoqing Song, Lei Shi, Shi Pu, Ming Yan
The second algorithm is a broadcast-like version of CPP (B-CPP), and it also achieves linear convergence rate under the same conditions on the objective functions.
no code implementations • 11 May 2021 • Kun Huang, Shi Pu
To the best of our knowledge, EDAS achieves the shortest transient time when the average of the $n$ cost functions is strongly convex and each cost function is smooth.
no code implementations • 26 Oct 2020 • Shi Pu, Yijiang He, Zheng Li, Mao Zheng
Existing video recommendation systems directly feed features from different modalities (e.g., user personal data, user behavior data, video titles, video tags, and visual contents) into deep neural networks, expecting the networks to implicitly mine user-preferred topics online from these features.
no code implementations • 12 Sep 2020 • Ran Xin, Shi Pu, Angelia Nedić, Usman A. Khan
Decentralized optimization to minimize a finite sum of functions over a network of nodes has been a significant focus within control and signal processing research due to its natural relevance to optimal control and signal estimation problems.
no code implementations • 28 Jun 2019 • Shi Pu, Alex Olshevsky, Ioannis Ch. Paschalidis
We provide a discussion of several recent results which, in certain scenarios, are able to overcome a barrier in distributed stochastic optimization for machine learning.
no code implementations • 6 Jun 2019 • Shi Pu, Alex Olshevsky, Ioannis Ch. Paschalidis
This paper is concerned with minimizing the average of $n$ cost functions over a network, in which agents may communicate and exchange information with their peers in the network.
no code implementations • NeurIPS 2018 • Shi Pu, Yibing Song, Chao Ma, Honggang Zhang, Ming-Hsuan Yang
Visual attention, a concept derived from cognitive neuroscience, facilitates human perception of the most pertinent subset of the sensory data.
no code implementations • 11 Jun 2018 • Shi Pu, Alfredo Garcia
We study a distributed framework for stochastic optimization which is inspired by models of collective motion found in nature (e.g., swarming) with mild communication requirements.
no code implementations • 25 May 2018 • Shi Pu, Angelia Nedić
In this paper, we study the problem of distributed multi-agent optimization over a network, where each agent possesses a local cost function that is smooth and strongly convex.
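The setting above is the classic target of gradient-tracking methods, which this line of work builds on. Below is a minimal sketch under illustrative assumptions (the mixing matrix, step size, and function names are mine, not the paper's): alongside its iterate, each agent maintains an auxiliary variable `y` that tracks the network-wide average gradient, so every agent descends along an estimate of the global gradient rather than only its local one.

```python
import numpy as np

def gradient_tracking(grads, W, x0, lr, iters):
    """Decentralized gradient tracking (sketch).

    grads : grads[i](x) -> gradient of agent i's local cost at x
    W     : (n, n) doubly stochastic mixing matrix
    x0    : (n, dim) initial iterates, one row per agent
    """
    n = len(grads)
    x = x0.copy()
    g = np.stack([grads[i](x[i]) for i in range(n)])
    y = g.copy()                          # tracker initialized at the local gradients
    for _ in range(iters):
        x = W @ x - lr * y                # consensus step plus tracked-gradient step
        g_new = np.stack([grads[i](x[i]) for i in range(n)])
        y = W @ y + g_new - g             # tracking update: avg(y) stays equal to avg(g)
        g = g_new
    return x
```

The invariant in the last update, that the average of `y` always equals the average of the local gradients, is what allows a constant step size and convergence to the exact global minimizer instead of a neighborhood of it.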
no code implementations • 21 Mar 2018 • Shi Pu, Angelia Nedić
In this paper, we study the problem of distributed multi-agent optimization over a network, where each agent possesses a local cost function that is smooth and strongly convex.
Optimization and Control • Distributed, Parallel, and Cluster Computing • Multiagent Systems