Search Results for author: Zhen Shao

Found 5 papers, 1 paper with code

基于多源知识融合的领域情感词典表示学习研究(Domain Sentiment Lexicon Representation Learning Based on Multi-source Knowledge Fusion)

no code implementations • CCL 2022 • Ruihua Qi, Jia Wei, Zhen Shao, Xu Guo, Heng Chen

“This paper aims to address the relative scarcity of annotated data resources and the insufficient representation of sentiment semantics in domain sentiment lexicon construction. Joint weights are computed from the domain differences across multi-source data, and prior sentiment knowledge is fused with Fasttext word-vector representation learning to map sentiment-semantic knowledge into a new word-vector space, so that domain sentiment lexicons suited to big-data, multi-domain, and multilingual settings can be built automatically from unlabeled data. Comparative experiments on public Chinese and English multi-domain datasets show that, compared with sentiment-lexicon methods and pre-trained word-vector methods, the proposed multi-source knowledge-fusion representation learning method for domain sentiment lexicons achieves clearly higher classification accuracy on the experimental datasets, and is robust across multiple algorithms, languages, domains, and datasets. Ablation experiments further verify the contribution of each module of the proposed model to the improvement in sentiment classification.”
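The abstract above describes fusing prior sentiment knowledge with Fasttext word vectors so that sentiment semantics are carried into a new vector space. Below is a minimal, purely illustrative Python sketch of one way such a fusion could be wired up; the `prior_polarity` seed scores, the fixed `domain_weight`, and the append-a-dimension fusion rule are assumptions made here for illustration, not the paper's actual formulation.

```python
import numpy as np

# Hypothetical inputs: Fasttext-style word vectors and prior sentiment polarities
# (+1 positive, -1 negative) taken from an existing seed lexicon.
word_vectors = {"excellent": np.random.randn(100), "terrible": np.random.randn(100)}
prior_polarity = {"excellent": +1.0, "terrible": -1.0}

def fuse(word, domain_weight=0.5):
    """Append a sentiment dimension, scaled by a (hypothetical) domain weight,
    to the distributional vector, mapping words into a sentiment-aware space."""
    vec = word_vectors[word]
    sentiment_dim = domain_weight * prior_polarity.get(word, 0.0)
    return np.concatenate([vec, [sentiment_dim]])

fused = {w: fuse(w) for w in word_vectors}
# Unlabeled domain words could then be scored against the nearest fused seed vectors.
```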

Representation Learning

A Randomised Subspace Gauss-Newton Method for Nonlinear Least-Squares

no code implementations • 10 Nov 2022 • Coralia Cartis, Jaroslav Fowkes, Zhen Shao

We propose a Randomised Subspace Gauss-Newton (R-SGN) algorithm for solving nonlinear least-squares optimization problems that uses a sketched Jacobian of the residual in the variable domain and solves a reduced linear least-squares problem on each iteration.
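As a rough picture of the kind of iteration this abstract describes, here is a small NumPy sketch of a subspace Gauss-Newton step: sketch the Jacobian in the variable domain, solve a reduced linear least-squares problem, and lift the step back. The toy residual, the coordinate-subsampling choice of sketch, and the absence of any globalisation safeguard (such as a trust region) are assumptions for illustration, not the paper's R-SGN specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def residual(x):
    # Toy nonlinear residual r: R^2 -> R^3 (an assumption for illustration only).
    return np.array([x[0] ** 2 + x[1] - 1.0,
                     x[0] + x[1] ** 2 - 1.0,
                     x[0] * x[1]])

def jacobian(x):
    # Jacobian of the toy residual above.
    return np.array([[2.0 * x[0], 1.0],
                     [1.0, 2.0 * x[1]],
                     [x[1], x[0]]])

x = np.array([2.0, 2.0])
d, sketch_dim = 2, 1  # work in a random 1-dimensional subspace of the 2 variables

for _ in range(50):
    r, J = residual(x), jacobian(x)
    # Coordinate-subsampling sketch of the variable domain (one simple choice).
    cols = rng.choice(d, size=sketch_dim, replace=False)
    J_reduced = J[:, cols]                                         # sketched Jacobian
    step_reduced, *_ = np.linalg.lstsq(J_reduced, -r, rcond=None)  # reduced least-squares
    step = np.zeros(d)
    step[cols] = step_reduced                                      # lift back to full space
    x = x + step

print(x, np.linalg.norm(residual(x)))
```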

regression

Johnson-Lindenstrauss embeddings for noisy vectors -- taking advantage of the noise

no code implementations • 1 Sep 2022 • Zhen Shao

This paper investigates theoretical properties of subsampling and hashing as tools for approximate Euclidean norm-preserving embeddings for vectors with (unknown) additive Gaussian noise.
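To make the setting concrete, here is a small numerical sketch comparing how a subsampling sketch and a 1-hashing (CountSketch-style) sketch preserve the norm of a vector contaminated with additive Gaussian noise; the dimensions, noise level, and the particular constructions below are illustrative assumptions rather than the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, sigma = 10_000, 500, 0.1

x_true = rng.standard_normal(n)
x_noisy = x_true + sigma * rng.standard_normal(n)   # unknown additive Gaussian noise

# Subsampling sketch: keep m random coordinates, rescaled so norms are preserved in expectation.
idx = rng.choice(n, size=m, replace=False)
subsampled = np.sqrt(n / m) * x_noisy[idx]

# 1-hashing sketch: each coordinate is added to one random row with a random sign.
rows = rng.integers(0, m, size=n)
signs = rng.choice([-1.0, 1.0], size=n)
hashed = np.zeros(m)
np.add.at(hashed, rows, signs * x_noisy)

for name, sketch in [("subsampling", subsampled), ("hashing", hashed)]:
    print(name, np.linalg.norm(sketch) / np.linalg.norm(x_noisy))  # ratio close to 1
```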

Dimensionality Reduction, LEMMA

Hashing embeddings of optimal dimension, with applications to linear least squares

no code implementations • 25 May 2021 • Coralia Cartis, Jan Fiala, Zhen Shao

The aim of this paper is two-fold: firstly, to present subspace embedding properties for $s$-hashing sketching matrices, with $s\geq 1$, that are optimal in the projection dimension $m$ of the sketch, namely, $m=\mathcal{O}(d)$, where $d$ is the dimension of the subspace.
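For context, here is a minimal sketch-and-solve illustration with an $s$-hashing matrix of projection dimension $m$ applied to a linear least-squares problem; the problem sizes, the choice $s=3$, and the use of a dense sketch matrix plus plain `lstsq` for the reduced problem are assumptions made for illustration, not the paper's algorithm or analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m, s = 5_000, 50, 200, 3   # tall least-squares problem; sketch size m = O(d)

A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

# s-hashing sketch: each column of S has exactly s nonzeros, +-1/sqrt(s),
# placed in s uniformly random rows.
S = np.zeros((m, n))
for j in range(n):
    rows = rng.choice(m, size=s, replace=False)
    S[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)

# Sketch-and-solve: solve the reduced problem min ||S A x - S b|| instead of min ||A x - b||.
x_sketched, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x_sketched - x_exact) / np.linalg.norm(x_exact))
```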
