ACL 2019 Analysis
Word Embedding
22 papers
Towards Unsupervised Text Classification Leveraging Experts and Word Embeddings
Zied Haj-Yahia, Adrien Sieg and Léa A. Deleris
A Resource-Free Evaluation Metric for Cross-Lingual Word Embeddings Based on Graph Modularity
Yoshinari Fujinuma, Jordan Boyd-Graber and Michael J. Paul
How to (Properly) Evaluate Cross-Lingual Word Embeddings: On Strong Baselines, Comparative Analyses, and Some Misconceptions
Goran Glavaš, Robert Litschko, Sebastian Ruder and Ivan Vulić
Diachronic Sense Modeling with Deep Contextualized Word Embeddings: An Ecological View
Renfen Hu, Shen Li and Shichen Liang
Understanding Undesirable Word Embedding Associations
Kawin Ethayarajh, David Duvenaud and Graeme Hirst
Shared-Private Bilingual Word Embeddings for Neural Machine Translation
Xuebo Liu, Derek F. Wong, Yang Liu, Lidia S. Chao, Tong Xiao and Jingbo Zhu
Unsupervised Bilingual Word Embedding Agreement for Unsupervised Neural Machine Translation
Haipeng Sun, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita and Tiejun Zhao
Gender-preserving Debiasing for Pre-trained Word Embeddings
Masahiro Kaneko and Danushka Bollegala
Relational Word Embeddings
Jose Camacho-Collados, Luis Espinosa Anke and Steven Schockaert
Classification and Clustering of Arguments with Contextualized Word Embeddings
Nils Reimers, Benjamin Schiller, Tilman Beck, Johannes Daxenberger, Christian Stab and Iryna Gurevych
Probing for Semantic Classes: Diagnosing the Meaning Content of Word Embeddings
Yadollah Yaghoobzadeh, Katharina Kann, T. J. Hazen, Eneko Agirre and Hinrich Schütze
Unsupervised Multilingual Word Embedding with Limited Resources using Neural Language Models
Takashi Wada, Tomoharu Iwata and Yuji Matsumoto
Neural Temporality Adaptation for Document Classification: Diachronic Word Embeddings and Domain Adaptation Models
Xiaolei Huang and Michael J. Paul
Incorporating Syntactic and Semantic Information in Word Embeddings using Graph Convolutional Networks
Shikhar Vashishth, Manik Bhandari, Prateek Yadav, Piyush Rai, Chiranjib Bhattacharyya and Partha Talukdar
Word2Sense: Sparse Interpretable Word Embeddings
Abhishek Panigrahi, Harsha Vardhan Simhadri and Chiranjib Bhattacharyya
Analyzing the limitations of cross-lingual word embedding mappings
Aitor Ormazabal, Mikel Artetxe, Gorka Labaka, Aitor Soroa and Eneko Agirre
A Transparent Framework for Evaluating Unintended Demographic Bias in Word Embeddings
Chris Sweeney and Maryam Najafian
Unsupervised Joint Training of Bilingual Word Embeddings
Benjamin Marie and Atsushi Fujita
Exploring Numeracy in Word Embeddings
Aakanksha Naik, Abhilasha Ravichander, Carolyn Rose and Eduard Hovy
Analyzing and Mitigating Gender Bias in Languages with Grammatical Gender and Bilingual Word Embeddings
Pei Zhou, Weijia Shi, Jieyu Zhao, Kuan-Hao Huang, Muhao Chen and Kai-Wei Chang
On Dimensional Linguistic Properties of the Word Embedding Space
Vikas Raunak, Vaibhav Kumar, Vivek Gupta and Florian Metze
Towards incremental learning of word embeddings using context informativeness
Alexandre Kabbach, Kristina Gulordava and Aurélie Herbelot
Word Representation
Sequence Tagging with Contextual and Non-Contextual Subword Representations: A Multilingual Evaluation
Benjamin Heinzerling and Michael Strube
Word Vector
3 papers
Unraveling Antonym's Word Vectors through a Siamese-like Network
Mathias Etcheverry and Dina Wonsever
Word and Document Embedding with vMF-Mixture Priors on Context Word Vectors
Shoaib Jameel and Steven Schockaert
Generalized Tuning of Distributional Word Vectors for Monolingual and Cross-Lingual Lexical Entailment
Goran Glavaš and Ivan Vulić
Word Sense and Segmentation
LSTMEmbed: Learning Word and Sense Representations from a Large Semantically Annotated Corpus with Long Short-Term Memories
Ignacio Iacobacci and Roberto Navigli
Few-Shot Representation Learning for Out-Of-Vocabulary Words
Ziniu Hu, Ting Chen, Kai-Wei Chang and Yizhou Sun
Zero-shot Word Sense Disambiguation using Sense Definition Embeddings
Sawan Kumar, Sharmistha Jat, Karan Saxena and Partha Talukdar
Text Categorization by Learning Predominant Sense of Words as Auxiliary Task
Kazuya Shimura, Jiyi Li and Fumiyo Fukumoto
Learning to Discover, Ground and Use Words with Segmental Neural Language Models
Kazuya Kawakami, Chris Dyer and Phil Blunsom
Multiple Character Embeddings for Chinese Word Segmentation
Jianing Zhou, Jingkang Wang and Gongshen Liu
一多线程的概念介绍 threading模块介绍 threading模块和multiprocessing模块在使用层面,有很大的相似性. 二.开启多线程的两种方式 1 1.创建线程的开销比创建进程的开销 ...