no code implementations • CCL 2022 • Zhiqiang Xie, Jinzhu Liu, Genhui Liu
"This paper focuses on the understudied task of nested named entity recognition (NER) in classical Chinese, using the Records of the Grand Historian (Shiji) as the source corpus. To address the ambiguity in entity classification caused by the rich meanings of classical Chinese, we construct two nested NER datasets for classical Chinese based on two annotation standards: the literal meaning of characters and words, and their contextual meaning. We discuss the datasets' entity classification principles and annotation format, and run comparative experiments with a RoBERTa-classical-chinese + GlobalPointer model: the Standard-1 dataset achieves an F1 of 80.42% and the Standard-2 dataset 77.43%, which determines the annotation standard for the dataset. We then compare six pretrained models, each combined with GlobalPointer, on classical Chinese nested NER. In the final experiments, RoBERTa-classical-chinese performs best with an F1 of 84.71%."
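GlobalPointer handles nested entities by scoring every candidate span per entity type, so overlapping spans can be predicted independently. The following is a minimal sketch of that span-decoding idea only (not the authors' implementation): `scores` is a hypothetical `(num_types, seq_len, seq_len)` logit tensor, and any upper-triangle cell above the threshold is emitted as an entity.

```python
import numpy as np

def decode_global_pointer(scores, threshold=0.0):
    """Decode GlobalPointer-style span scores into (type, start, end) triples.

    scores: array of shape (num_types, seq_len, seq_len); cell [t, i, j]
    holds the logit that tokens i..j form an entity of type t.
    Spans may overlap or nest, which is what enables nested NER.
    """
    entities = []
    for t, start, end in zip(*np.where(scores > threshold)):
        if start <= end:  # only valid spans in the upper triangle
            entities.append((int(t), int(start), int(end)))
    return entities

# Toy example: a type-1 span [0, 3] that fully nests a type-0 span [0, 1].
scores = np.full((2, 4, 4), -1.0)
scores[0, 0, 1] = 2.5
scores[1, 0, 3] = 1.2
print(decode_global_pointer(scores))  # [(0, 0, 1), (1, 0, 3)]
```

Because each span is scored independently, both the inner and the outer entity survive decoding, unlike BIO-tagging schemes that force one label per token.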
no code implementations • 29 Jan 2024 • Yijing Lin, Zhipeng Gao, Hongyang Du, Jinke Ren, Zhiqiang Xie, Dusit Niyato
However, existing works require central servers to retain the historical model parameters of distributed clients, which allows the central server to keep using these parameters for further training even after the clients exit the training process.
1 code implementation • 12 Dec 2023 • Lianmin Zheng, Liangsheng Yin, Zhiqiang Xie, Jeff Huang, Chuyue Sun, Cody Hao Yu, Shiyi Cao, Christos Kozyrakis, Ion Stoica, Joseph E. Gonzalez, Clark Barrett, Ying Sheng
SGLang is designed for the efficient programming of LLMs and incorporates primitives for common LLM programming patterns.
1 code implementation • 13 Mar 2023 • Ying Sheng, Lianmin Zheng, Binhang Yuan, Zhuohan Li, Max Ryabinin, Daniel Y. Fu, Zhiqiang Xie, Beidi Chen, Clark Barrett, Joseph E. Gonzalez, Percy Liang, Christopher Ré, Ion Stoica, Ce Zhang
As a result, when running OPT-175B on a single 16GB GPU, FlexGen achieves significantly higher throughput compared to state-of-the-art offloading systems, reaching a generation throughput of 1 token/s for the first time with an effective batch size of 144.
no code implementations • 20 May 2021 • Yang Wang, Chen Zhang, Zhiqiang Xie, Cong Guo, Yunxin Liu, Jingwen Leng
We demonstrate the feasibility of our design with minimal changes to the existing production-scale inner-product-based Tensor Core.
no code implementations • 29 Nov 2019 • Liming Deng, Jie Wang, Hangming Liang, Hui Chen, Zhiqiang Xie, Bojin Zhuang, Shaojun Wang, Jing Xiao
In this paper, we propose a novel iterative polishing framework for high-quality Chinese poetry generation.