Learning Word Embeddings
23 papers with code • 0 benchmarks • 0 datasets
Latest papers with no code
Challenges and Applications of Automated Extraction of Socio-political Events from Text (CASE 2021): Workshop and Shared Task Report
This workshop is the fourth edition of a series of workshops on the automatic extraction of socio-political events from news, organized by the Emerging Market Welfare Project, with the support of the Joint Research Centre of the European Commission and with contributions from many other prominent scholars in this field.
Group-Sparse Matrix Factorization for Transfer Learning of Word Embeddings
However, learning word embeddings from new domains with limited training data can be challenging, because a word's meaning or usage may differ in the new domain: e.g., the word "positive" typically carries positive sentiment, but often has negative sentiment in medical notes, since it may imply that a patient tested positive for a disease.
Points2Vec: Unsupervised Object-level Feature Learning from Point Clouds
This is despite the fact that physical 3D spaces have a semantic structure similar to bodies of text: words are surrounded by semantically related words, just as objects are surrounded by other objects that are similar in concept and usage.
TemporalTeller at SemEval-2020 Task 1: Unsupervised Lexical Semantic Change Detection with Temporal Referencing
This paper describes our TemporalTeller system for SemEval Task 1: Unsupervised Lexical Semantic Change Detection.
In Neural Machine Translation, What Does Transfer Learning Transfer?
Transfer learning improves quality for low-resource machine translation, but it is unclear what exactly it transfers.
Apprentissage de plongements de mots sur des corpus en langue de spécialité (Learning Word Embeddings on Domain-Specific Corpora: An Impact Study)
To answer this question, we consider two specialized-domain corpora: OHSUMED, from the medical domain, and a corpus of technical documentation owned by SNCF.
Learning Cross-Context Entity Representations from Text
Language modeling tasks, in which words, or word-pieces, are predicted on the basis of a local context, have been very effective for learning word embeddings and context-dependent representations of phrases.
DeepXML: Scalable & Accurate Deep Extreme Classification for Matching User Queries to Advertiser Bid Phrases
The objective in deep extreme multi-label learning is to jointly learn feature representations and classifiers to automatically tag data points with the most relevant subset of labels from an extremely large label set.
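The objective described above can be illustrated with a minimal sketch: a shared encoder maps raw features to a dense representation, and one classifier per label scores that representation, with the top-scoring labels returned as the predicted tag set. All names and the random parameters here are illustrative, not taken from DeepXML itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n_features, n_labels, dim = 20, 1000, 16

# Jointly learned parameters (random here, for illustration only):
# an encoder mapping raw features to a dense representation,
# and one classifier weight vector per label.
encoder = rng.normal(size=(n_features, dim))
label_classifiers = rng.normal(size=(n_labels, dim))

def tag(x, k=5):
    """Score all labels for one data point and return the top-k label ids."""
    z = np.tanh(x @ encoder)             # learned feature representation
    scores = label_classifiers @ z       # one score per label
    return np.argsort(scores)[::-1][:k]  # most relevant subset of labels

x = rng.normal(size=n_features)
print(tag(x))
```

The "extreme" setting refers to `n_labels` being in the millions; at that scale, scoring every label exhaustively as above becomes the bottleneck, which is what scalable methods in this area address.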
Learning Word Embeddings without Context Vectors
Most word embedding algorithms, such as word2vec or fastText, construct two sorts of vectors: one for words and one for contexts.
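The two-table setup can be seen in a minimal numpy sketch of skip-gram with negative sampling, the training scheme used by word2vec: one table holds word (target) vectors, the other holds context vectors, and each update touches both. The toy sizes and the sampled pair below are illustrative, not from any of the listed papers.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim = 10, 8

# Two separate tables, as in word2vec: one for target words, one for contexts.
W_words = rng.normal(scale=0.1, size=(vocab_size, dim))
W_ctx = rng.normal(scale=0.1, size=(vocab_size, dim))

def sgns_step(target, context, negatives, lr=0.05):
    """One skip-gram negative-sampling update for a (target, context) pair."""
    ids = np.array([context] + list(negatives))
    labels = np.array([1.0] + [0.0] * len(negatives))
    v = W_words[target]                    # target word vector
    U = W_ctx[ids]                         # context vectors (positive + negatives)
    scores = 1.0 / (1.0 + np.exp(-U @ v))  # sigmoid of dot products
    grad = scores - labels                 # d loss / d score
    W_words[target] -= lr * grad @ U       # update the word vector
    W_ctx[ids] -= lr * np.outer(grad, v)   # update the context vectors

# Toy update: target word 3, observed context 5, two sampled negatives.
sgns_step(3, 5, [1, 7])
```

After training, most applications keep only `W_words` and discard `W_ctx`; the paper above asks whether the separate context table is needed at all.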
Words are Vectors, Dependencies are Matrices: Learning Word Embeddings from Dependency Graphs
Distributional Semantic Models (DSMs) construct vector representations of word meanings based on their contexts.
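A classic count-based DSM of the kind described above can be sketched in a few lines: count co-occurrences within a sliding window over a corpus, and treat each word's row of counts as its vector. The three-sentence corpus and window size are toy assumptions for illustration.

```python
import numpy as np
from collections import Counter

# Toy corpus (illustrative); a real DSM would use a large text collection.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within a symmetric window of 2 words.
window = 2
counts = Counter()
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                counts[(idx[w], idx[sent[j]])] += 1

M = np.zeros((len(vocab), len(vocab)))
for (i, j), c in counts.items():
    M[i, j] = c

# Each row of M is a distributional vector; cosine similarity compares meanings.
def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine(M[idx["cat"]], M[idx["dog"]]))
```

The dependency-based variant proposed in the paper replaces the linear window with contexts drawn from a syntactic dependency graph, representing the dependencies themselves as matrices rather than vectors.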