1 code implementation • 31 Jan 2024 • Xiaopeng Li, Shasha Li, Shezheng Song, Huijun Liu, Bin Ji, Xi Wang, Jun Ma, Jie Yu, Xiaodong Liu, Jing Wang, Weimin Zhang
In particular, local editing methods, which directly update model parameters, are more suitable for updating a small amount of knowledge.
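A minimal sketch of what "directly update model parameters" can mean in practice: a rank-one edit to one linear layer's weights so that a chosen key vector maps to a new value vector. This is an illustrative toy (the specific update rule here is an assumption, not the paper's algorithm):

```python
import numpy as np

# Toy "local edit": modify one layer's weights W so that a key vector k
# now maps to a new value v_new, without retraining the whole model.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))     # original layer weights
k = rng.standard_normal(4)          # key representing the fact to edit
v_new = rng.standard_normal(4)      # desired new output for that key

# Rank-one update: W' = W + (v_new - W @ k) k^T / (k^T k),
# so W' @ k == v_new while inputs orthogonal to k are unchanged.
W_edited = W + np.outer(v_new - W @ k, k) / (k @ k)

print(np.allclose(W_edited @ k, v_new))  # the edited association now holds
```

Because the update is rank-one and localized to a single layer, only a small amount of knowledge is touched, which is why such methods suit small-scale updates.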
no code implementations • 10 Nov 2023 • Shezheng Song, Xiaopeng Li, Shasha Li, Shan Zhao, Jie Yu, Jun Ma, Xiaoguang Mao, Weimin Zhang
The study categorizes existing modal alignment methods in MLLMs into four groups: (1) Multimodal Converters, which transform data into representations LLMs can understand; (2) Multimodal Perceivers, which improve how LLMs perceive different types of data; (3) Tools Assistance, which converts modalities into one common format, usually text; and (4) Data-Driven methods, which teach LLMs to understand specific types of data from a dataset.
no code implementations • 21 Nov 2022 • Zixuan Xu, Penghui Wei, Shaoguo Liu, Weimin Zhang, Liang Wang, Bo Zheng
Conventional graph neural network-based methods usually handle each domain separately, or train a single shared model to serve all domains.
no code implementations • 15 May 2022 • Penghui Wei, Weimin Zhang, Ruijie Hou, Jinquan Liu, Shaoguo Liu, Liang Wang, Bo Zheng
Calibration techniques aim to post-process model predictions into posterior probabilities.
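One widely used post-hoc calibration technique is temperature scaling, sketched below on toy data (this names a standard method for illustration, not necessarily the one proposed in the paper): a single scalar T rescales logits so that the resulting softmax outputs better approximate posterior probabilities on held-out data.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def nll(T, logits, labels):
    # Negative log-likelihood of temperature-scaled predictions.
    p = softmax(logits / T)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

# Fit T by grid search on (toy) validation logits; real use would
# minimize NLL on a held-out set from the actual model.
rng = np.random.default_rng(0)
logits = rng.standard_normal((100, 3)) * 3.0   # deliberately overconfident
labels = rng.integers(0, 3, size=100)
Ts = np.linspace(0.5, 5.0, 50)
T_best = Ts[np.argmin([nll(T, logits, labels) for T in Ts])]
calibrated = softmax(logits / T_best)          # post-processed probabilities
```

Note that dividing all logits by the same T leaves the argmax unchanged, so calibration alters confidence estimates without changing the model's ranking of classes.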
no code implementations • 20 Jan 2022 • Zixuan Xu, Penghui Wei, Weimin Zhang, Shaoguo Liu, Liang Wang, Bo Zheng
Then a student model is trained on both clicked and unclicked ads with knowledge distillation, performing uncertainty modeling to alleviate the inherent noise in pseudo-labels.
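The distillation-with-uncertainty idea can be sketched as follows. All names and the entropy-based weighting below are illustrative assumptions, not the paper's exact design: the teacher produces soft pseudo-labels, and pseudo-labels the teacher is uncertain about (high entropy) are down-weighted in the student's loss to dampen label noise.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
teacher_logits = rng.standard_normal((8, 2))       # teacher on unclicked ads
student_probs = softmax(rng.standard_normal((8, 2)))  # student predictions

pseudo = softmax(teacher_logits)                   # soft pseudo-labels

# Uncertainty proxy: teacher entropy, normalized to [0, 1] for 2 classes.
# Noisy (high-entropy) pseudo-labels contribute less to the loss.
entropy = -(pseudo * np.log(pseudo + 1e-12)).sum(axis=1)
weight = 1.0 - entropy / np.log(2)

# Uncertainty-weighted cross-entropy between student and pseudo-labels.
ce = -(pseudo * np.log(student_probs + 1e-12)).sum(axis=1)
distill_loss = np.mean(weight * ce)
```

In a full pipeline this term would be combined with the ordinary supervised loss on clicked ads, so the student learns from both clicked and unclicked traffic.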