Search Results for author: Boshi Tang

Found 6 papers, 0 papers with code

Multi-view MidiVAE: Fusing Track- and Bar-view Representations for Long Multi-track Symbolic Music Generation

no code implementations • 15 Jan 2024 • Zhiwei Lin, Jun Chen, Boshi Tang, Binzhu Sha, Jing Yang, Yaolong Ju, Fan Fan, Shiyin Kang, Zhiyong Wu, Helen Meng

Variational Autoencoders (VAEs) are a crucial component of neural symbolic music generation, and several VAE-based works have yielded outstanding results and attracted considerable attention.

Music Generation
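
As background on the objective such VAE-based generators optimize, here is a minimal sketch of the standard ELBO loss over a symbolic-music token vocabulary. This is generic and illustrative only; it does not reproduce the Multi-view MidiVAE architecture or its track- and bar-view fusion:

    # Minimal VAE loss sketch (generic background, not the MidiVAE model).
    import torch
    import torch.nn.functional as F

    def vae_loss(recon_logits, target_tokens, mu, logvar, beta=1.0):
        # Reconstruction term: cross-entropy over the token vocabulary.
        recon = F.cross_entropy(
            recon_logits.view(-1, recon_logits.size(-1)),
            target_tokens.view(-1),
        )
        # KL term between the approximate posterior N(mu, sigma^2) and N(0, I).
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + beta * kl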

SimCalib: Graph Neural Network Calibration based on Similarity between Nodes

no code implementations • 19 Dec 2023 • Boshi Tang, Zhiyong Wu, Xixin Wu, Qiaochu Huang, Jun Chen, Shun Lei, Helen Meng

A novel calibration framework, SimCalib, is accordingly proposed to account for similarity between nodes at both the global and local levels.
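
For context, calibration quality in node classification is commonly measured with the expected calibration error (ECE). The sketch below shows that standard metric; it is generic background, not the SimCalib method itself:

    # Expected calibration error over node predictions (standard metric,
    # not the SimCalib framework).
    import numpy as np

    def expected_calibration_error(confidences, correct, n_bins=15):
        # confidences: max softmax probability per node; correct: 0/1 per node.
        bins = np.linspace(0.0, 1.0, n_bins + 1)
        ece = 0.0
        for lo, hi in zip(bins[:-1], bins[1:]):
            mask = (confidences > lo) & (confidences <= hi)
            if mask.any():
                # Gap between empirical accuracy and mean confidence in the bin,
                # weighted by the fraction of nodes falling in the bin.
                gap = abs(correct[mask].mean() - confidences[mask].mean())
                ece += mask.mean() * gap
        return ece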

Explore 3D Dance Generation via Reward Model from Automatically-Ranked Demonstrations

no code implementations • 18 Dec 2023 • Zilin Wang, Haolin Zhuang, Lu Li, Yinmin Zhang, Junjie Zhong, Jun Chen, Yu Yang, Boshi Tang, Zhiyong Wu

This paper presents an Exploratory 3D Dance generation framework, E3D2, designed to address the limited exploration capability of existing music-conditioned 3D dance generation models.
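
Reward models learned from ranked demonstrations are typically trained with a pairwise Bradley-Terry objective. A minimal sketch under that assumption follows; the exact training loss used in E3D2 may differ:

    # Pairwise ranking loss for a reward model, following the standard
    # Bradley-Terry formulation (an assumption; not necessarily E3D2's loss).
    import torch
    import torch.nn.functional as F

    def reward_ranking_loss(reward_preferred, reward_rejected):
        # Encourage r(preferred) > r(rejected) for each ranked demonstration pair.
        return -F.logsigmoid(reward_preferred - reward_rejected).mean()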

Stable Score Distillation for High-Quality 3D Generation

no code implementations • 14 Dec 2023 • Boshi Tang, Jianan Wang, Zhiyong Wu, Lei Zhang

Although Score Distillation Sampling (SDS) has exhibited remarkable performance in conditional 3D content generation, a comprehensive understanding of its formulation is still lacking, hindering further progress in 3D generation.

3D Generation
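
For reference, the SDS formulation in question is the score distillation gradient introduced in DreamFusion (Poole et al., 2022): for a differentiable rendering x = g(θ) of the 3D representation θ and a pretrained diffusion model with noise predictor ε_φ,

    \nabla_\theta \mathcal{L}_{\mathrm{SDS}}(\phi, x = g(\theta)) =
        \mathbb{E}_{t,\epsilon}\!\left[ w(t)\,\big(\epsilon_\phi(x_t;\, y,\, t) - \epsilon\big)\,\frac{\partial x}{\partial \theta} \right],

where x_t is the noised rendering at timestep t, y the conditioning prompt, ε the injected Gaussian noise, and w(t) a timestep-dependent weighting.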

AdaMesh: Personalized Facial Expressions and Head Poses for Adaptive Speech-Driven 3D Facial Animation

no code implementations • 11 Oct 2023 • Liyang Chen, Weihong Bao, Shun Lei, Boshi Tang, Zhiyong Wu, Shiyin Kang, HaoZhi Huang

Existing works mostly neglect person-specific talking styles in generation, including facial expression and head pose styles.
