Learning Word Embeddings

23 papers with code • 0 benchmarks • 0 datasets

Learning word embeddings is the task of mapping words to dense, low-dimensional vectors from unlabeled text, such that words with similar meanings or usage end up close together in the vector space.
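
As a quick illustration of the task, here is a minimal sketch using gensim's Word2Vec (the corpus and hyperparameters are toy assumptions; any skip-gram or CBOW implementation would do):

```python
from gensim.models import Word2Vec

# Toy corpus; in practice this would be millions of tokenized sentences.
sentences = [
    ["the", "patient", "tested", "positive", "for", "influenza"],
    ["the", "review", "was", "overwhelmingly", "positive"],
    ["word", "embeddings", "map", "words", "to", "dense", "vectors"],
]

model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the learned vectors
    window=5,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["positive"]                        # a 100-dim numpy array
print(model.wv.most_similar("positive", topn=3))  # nearest neighbours in vector space
```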

Latest papers with no code

Challenges and Applications of Automated Extraction of Socio-political Events from Text (CASE 2021): Workshop and Shared Task Report

no code yet • ACL (CASE) 2021

This workshop is the fourth edition in a series of workshops on automatic extraction of socio-political events from news, organized by the Emerging Market Welfare Project with the support of the Joint Research Centre of the European Commission and with contributions from many other prominent scholars in the field.

Group-Sparse Matrix Factorization for Transfer Learning of Word Embeddings

no code yet • 18 Apr 2021

However, learning word embeddings for new domains with limited training data can be challenging, because a word's meaning or usage may differ in the new domain; e.g., the word "positive" typically carries positive sentiment, but often has negative sentiment in medical notes, since it may imply that a patient tested positive for a disease.
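
One way to make the transfer idea concrete is the rough numpy sketch below: keep target-domain embeddings close to pretrained ones, and let only a group-sparse set of words (whole rows) deviate, via a proximal step on a row-wise group-lasso penalty. This is a generic sketch, not the paper's exact formulation; the matrix M, the planted meaning shift, and all sizes are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 50, 8                      # vocabulary size, embedding dimension
U0 = rng.normal(size=(V, d))      # pretrained (source-domain) embeddings
U_true = U0.copy()
U_true[7] = -U_true[7]            # pretend word 7 changed meaning in-domain
M = U_true @ U_true.T + rng.normal(scale=0.1, size=(V, V))
M = (M + M.T) / 2                 # toy symmetric domain co-occurrence/PMI matrix

Delta = np.zeros((V, d))          # per-word deviation from the pretrained vectors
lam, eta = 5.0, 2e-4              # group-lasso weight, step size

for _ in range(500):
    U = U0 + Delta
    grad = -4 * (M - U @ U.T) @ U            # gradient of ||M - U U^T||_F^2
    Delta -= eta * grad                      # gradient step on the deviation
    norms = np.linalg.norm(Delta, axis=1, keepdims=True)
    # proximal step: rows with small deviations are shrunk back to zero
    Delta *= np.maximum(0.0, 1 - eta * lam / np.maximum(norms, 1e-12))

moved = np.flatnonzero(np.linalg.norm(Delta, axis=1) > 1e-6)
print("words whose embeddings moved:", moved)  # ideally just [7]
```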

Points2Vec: Unsupervised Object-level Feature Learning from Point Clouds

no code yet • 8 Feb 2021

This is despite the fact that physical 3D spaces have a semantic structure similar to bodies of text: words are surrounded by semantically related words, just as objects are surrounded by other objects that are similar in concept and usage.
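
The analogy can be made concrete with a small sketch (assuming scipy and gensim; this illustrates the word/object parallel, not the paper's point-cloud feature-learning method): each object's nearest neighbours in 3D play the role of context words.

```python
import numpy as np
from scipy.spatial import cKDTree
from gensim.models import Word2Vec

# Toy scene: object labels with random 3D positions.
labels = ["chair", "table", "chair", "lamp", "sofa", "table", "lamp", "sofa"]
positions = np.random.default_rng(1).uniform(size=(len(labels), 3))

tree = cKDTree(positions)
_, idx = tree.query(positions, k=4)  # each object plus its 3 nearest neighbours

# One "sentence" per object: the object followed by its spatial context.
sentences = [[labels[i] for i in row] for row in idx]

model = Word2Vec(sentences, vector_size=16, window=3, min_count=1, epochs=100)
print(model.wv.most_similar("chair", topn=2))
```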

TemporalTeller at SemEval-2020 Task 1: Unsupervised Lexical Semantic Change Detection with Temporal Referencing

no code yet • SEMEVAL 2020

This paper describes our TemporalTeller system for SemEval Task 1: Unsupervised Lexical Semantic Change Detection.
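
For readers unfamiliar with the technique named in the title, here is a minimal sketch of temporal referencing (the full system has more steps): occurrences of a target word are tagged with their time period, so a single shared embedding space yields comparable period-specific vectors. The corpora and target word below are toy assumptions.

```python
from gensim.models import Word2Vec

def temporal_reference(sentences, targets, period):
    """Replace each target word with a period-tagged token, e.g. 'gay_1900'."""
    return [[f"{w}_{period}" if w in targets else w for w in s] for s in sentences]

targets = {"gay"}
corpus_1900 = [["a", "gay", "and", "cheerful", "song"]] * 50   # toy corpora
corpus_2000 = [["gay", "rights", "march", "in", "the", "city"]] * 50

tagged = temporal_reference(corpus_1900, targets, "1900") + \
         temporal_reference(corpus_2000, targets, "2000")

model = Word2Vec(tagged, vector_size=32, min_count=1, epochs=50)
change = model.wv.similarity("gay_1900", "gay_2000")  # low similarity => change
print(f"cosine(gay_1900, gay_2000) = {change:.2f}")
```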

In Neural Machine Translation, What Does Transfer Learning Transfer?

no code yet • ACL 2020

Transfer learning improves quality for low-resource machine translation, but it is unclear what exactly it transfers.

Apprentissage de plongements de mots sur des corpus en langue de spécialité : une étude d'impact (Learning word embeddings on domain-specific corpora: an impact study)

no code yet • JEPTALNRECITAL 2020

To answer this question, we consider two domain-specific corpora: OHSUMED, from the medical domain, and a corpus of technical documentation owned by SNCF.

Learning Cross-Context Entity Representations from Text

no code yet • 11 Jan 2020

Language modeling tasks, in which words, or word-pieces, are predicted on the basis of a local context, have been very effective for learning word embeddings and context dependent representations of phrases.
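
A minimal sketch of that idea (assuming PyTorch; the corpus, window, and model sizes are toy choices): predict each word from the average of its neighbours' embeddings, so the embedding table is learned as a side effect of the prediction task.

```python
import torch
import torch.nn as nn

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
stoi = {w: i for i, w in enumerate(vocab)}
ids = torch.tensor([stoi[w] for w in corpus])

# (context, center) pairs with a window of one word on each side
ctx = torch.stack([ids[:-2], ids[2:]], dim=1)    # left and right neighbours
tgt = ids[1:-1]                                  # the word in between

emb = nn.Embedding(len(vocab), 16)               # the embeddings being learned
out = nn.Linear(16, len(vocab))
opt = torch.optim.Adam(list(emb.parameters()) + list(out.parameters()), lr=0.05)

for _ in range(200):
    logits = out(emb(ctx).mean(dim=1))           # average the context embeddings
    loss = nn.functional.cross_entropy(logits, tgt)
    opt.zero_grad(); loss.backward(); opt.step()

print(loss.item())                               # emb.weight now holds the vectors
```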

DeepXML: Scalable & Accurate Deep Extreme Classification for Matching User Queries to Advertiser Bid Phrases

no code yet • 25 Sep 2019

The objective in deep extreme multi-label learning is to jointly learn feature representations and classifiers to automatically tag data points with the most relevant subset of labels from an extremely large label set.
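
In its simplest form, that joint objective looks like the sketch below (not DeepXML itself; real extreme classifiers add label shortlisting or negative sampling to scale to millions of labels, and all sizes here are toy assumptions): a shared encoder feeds per-label binary classifiers trained with binary cross-entropy.

```python
import torch
import torch.nn as nn

num_features, num_labels, hidden = 1000, 5000, 64

encoder = nn.Sequential(nn.Linear(num_features, hidden), nn.ReLU())
classifier = nn.Linear(hidden, num_labels)        # one binary score per label
opt = torch.optim.Adam([*encoder.parameters(), *classifier.parameters()], lr=1e-3)

x = torch.randn(32, num_features)                 # toy batch of query features
y = (torch.rand(32, num_labels) < 0.001).float()  # sparse relevant-label sets

for _ in range(10):
    scores = classifier(encoder(x))               # jointly learned features + classifiers
    loss = nn.functional.binary_cross_entropy_with_logits(scores, y)
    opt.zero_grad(); loss.backward(); opt.step()

topk = scores.detach().topk(5, dim=1).indices     # top-5 label ids per query
print(topk[0])
```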

Learning Word Embeddings without Context Vectors

no code yet • WS 2019

Most word embedding algorithms, such as word2vec or fastText, construct two sorts of vectors: one for words and one for contexts.
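
Concretely, skip-gram with negative sampling keeps a word matrix W and a separate context matrix C and updates both; usually only W is kept, or the two are averaged. A minimal numpy sketch with a toy token stream (the data and hyperparameters are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 10, 8
W = rng.normal(scale=0.1, size=(V, d))   # word vectors
C = rng.normal(scale=0.1, size=(V, d))   # context vectors

def sgns_step(w, c, lr=0.1, k=5):
    """One positive (word, context) pair plus k random negative contexts."""
    for ctx, label in [(c, 1.0)] + [(rng.integers(V), 0.0) for _ in range(k)]:
        score = 1.0 / (1.0 + np.exp(-W[w] @ C[ctx]))   # sigmoid(w . c)
        g = score - label                              # logistic-loss gradient
        W[w], C[ctx] = W[w] - lr * g * C[ctx], C[ctx] - lr * g * W[w]

corpus = [0, 1, 2, 3, 0, 1, 2, 3] * 25               # toy token-id stream
pairs = list(zip(corpus, corpus[1:]))                 # (word, next word) contexts
for w, c in pairs * 20:
    sgns_step(w, c)

# 0 and 1 co-occur, 0 and 2 never do: the two tables encode that asymmetry.
print("pos score:", W[0] @ C[1], " neg score:", W[0] @ C[2])
```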

Words are Vectors, Dependencies are Matrices: Learning Word Embeddings from Dependency Graphs

no code yet • WS 2019

Distributional Semantic Models (DSMs) construct vector representations of word meanings based on their contexts.
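
As background, a count-based DSM can be sketched in a few lines (here contexts are just neighbouring words; the paper's contribution is to additionally represent dependency relations as matrices): build a co-occurrence matrix, reweight it with PPMI, and reduce it with SVD. The corpus and dimensions are toy assumptions.

```python
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
stoi = {w: i for i, w in enumerate(vocab)}

counts = np.zeros((len(vocab), len(vocab)))
for i, w in enumerate(corpus):                 # window of one word on each side
    for j in (i - 1, i + 1):
        if 0 <= j < len(corpus):
            counts[stoi[w], stoi[corpus[j]]] += 1

total = counts.sum()
pw = counts.sum(axis=1, keepdims=True) / total
pc = counts.sum(axis=0, keepdims=True) / total
with np.errstate(divide="ignore"):
    pmi = np.log((counts / total) / (pw * pc))
ppmi = np.maximum(pmi, 0)                      # positive pointwise mutual information

U, S, _ = np.linalg.svd(ppmi)
vectors = U[:, :4] * S[:4]                     # 4-dimensional word vectors
print(dict(zip(vocab, np.round(vectors, 2))))
```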