Search Results for author: Tianchi Yang

Found 8 papers, 1 paper with code

Minimum Topology Attacks for Graph Neural Networks

no code implementations · 5 Mar 2024 · Mengmei Zhang, Xiao Wang, Chuan Shi, Lingjuan Lyu, Tianchi Yang, Junping Du

To break this dilemma, we propose a new type of topology attack, named minimum-budget topology attack, aiming to adaptively find the minimum perturbation sufficient for a successful attack on each node.

Text Diffusion with Reinforced Conditioning

no code implementations · 19 Feb 2024 · Yuxuan Liu, Tianchi Yang, Shaohan Huang, Zihan Zhang, Haizhen Huang, Furu Wei, Weiwei Deng, Feng Sun, Qi Zhang

Diffusion models have demonstrated exceptional capability in generating high-quality images, videos, and audio.

Auto Search Indexer for End-to-End Document Retrieval

no code implementations · 19 Oct 2023 · Tianchi Yang, Minghui Song, Zihan Zhang, Haizhen Huang, Weiwei Deng, Feng Sun, Qi Zhang

Generative retrieval, a new advanced paradigm for document retrieval, has recently attracted research interest, since it encodes all documents into the model and directly generates the retrieved documents.

Retrieval

Calibrating LLM-Based Evaluator

no code implementations · 23 Sep 2023 · Yuxuan Liu, Tianchi Yang, Shaohan Huang, Zihan Zhang, Haizhen Huang, Furu Wei, Weiwei Deng, Feng Sun, Qi Zhang

Recent advancements in large language models (LLMs) on language modeling and emergent capabilities make them a promising reference-free evaluator of natural language generation quality, and a competent alternative to human evaluation.

In-Context Learning · Language Modelling +1

Some exact results on $4$-cycles: stability and supersaturation

no code implementations · 2 Dec 2019 · Jialin He, Jie Ma, Tianchi Yang

A longstanding conjecture of Erd\H{o}s and Simonovits states that every $n$-vertex graph with $ex(n, C_4)+1$ edges contains at least $(1+o(1))\sqrt{n}$ 4-cycles.
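For readers unfamiliar with the notation, $ex(n, C_4)$ is the standard Turán (extremal) number: the maximum number of edges an $n$-vertex graph can have without containing a $4$-cycle. In that notation the conjecture reads:

```latex
% ex(n, C_4): the maximum edge count of an n-vertex C_4-free graph
\[
  \mathrm{ex}(n, C_4) \;=\; \max\bigl\{\, e(G) : |V(G)| = n,\ C_4 \not\subseteq G \,\bigr\}
\]
% The Erdős–Simonovits conjecture: one edge beyond the extremal
% threshold already forces many 4-cycles, not just one.
\[
  e(G) \ge \mathrm{ex}(n, C_4) + 1 \;\Longrightarrow\; \#\{C_4 \subseteq G\} \ge (1 + o(1))\sqrt{n}
\]
```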

Combinatorics

Heterogeneous Graph Attention Networks for Semi-supervised Short Text Classification

no code implementations · IJCNLP 2019 · Hu Linmei, Tianchi Yang, Chuan Shi, Houye Ji, Xiao-Li Li

We propose Heterogeneous Graph ATtention networks (HGAT) to embed the HIN for short text classification based on a dual-level attention mechanism, including node-level and type-level attentions.
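The dual-level idea can be illustrated with a minimal numpy sketch: a type-level attention first weighs how much each node *type* matters to the target node, and a node-level attention then weighs individual neighbors, modulated by their type's weight. The function name and the exact way type weights enter the node scores here are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dual_level_attention(h_v, h_neigh, neigh_types):
    """Aggregate neighbor embeddings for a target node h_v.

    h_v:         (d,)   target node embedding
    h_neigh:     (k, d) embeddings of the k neighbors
    neigh_types: (k,)   integer type id of each neighbor
    """
    types = np.unique(neigh_types)
    # Type-level attention: score each type by the similarity of
    # its mean neighbor embedding to the target node.
    type_scores = np.array([h_v @ h_neigh[neigh_types == t].mean(axis=0)
                            for t in types])
    alpha = softmax(type_scores)                 # one weight per type
    type_weight = dict(zip(types, alpha))
    # Node-level attention: score each neighbor individually,
    # modulated by the weight of its type (an assumed coupling).
    node_scores = np.array([(h_v @ h_n) * type_weight[t]
                            for h_n, t in zip(h_neigh, neigh_types)])
    beta = softmax(node_scores)                  # one weight per neighbor
    return beta @ h_neigh                        # attended aggregation

rng = np.random.default_rng(0)
h_v = rng.standard_normal(8)
h_neigh = rng.standard_normal((5, 8))
out = dual_level_attention(h_v, h_neigh, np.array([0, 0, 1, 1, 2]))
```

In HGAT proper the two attention levels are learned with trainable parameters; this sketch replaces them with dot-product similarities to keep the mechanism visible in a few lines.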

General Classification · Graph Attention +3
