Learning Multilingual Meta-Embeddings for Code-Switching Named Entity Recognition

Published in Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019) (Best Paper Award) in conjunction with ACL, 2019

Recommended citation: Winata, G. I., Lin, Z., & Fung, P. (2019). Learning Multilingual Meta-Embeddings for Code-Switching Named Entity Recognition. ACL 2019, 181. https://www.aclweb.org/anthology/W19-4320

In this paper, we propose Multilingual Meta-Embeddings (MME), an effective method for learning multilingual representations by leveraging monolingual pre-trained embeddings. MME learns to combine information from these embeddings via a self-attention mechanism, without requiring explicit language identification. We evaluate the proposed embedding method on a code-switching English-Spanish Named Entity Recognition dataset in multilingual and cross-lingual settings. The experimental results show that our proposed method achieves state-of-the-art performance in the multilingual setting and generalizes to an unseen language.
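The core idea, combining several monolingual embeddings of the same token through a learned attention over a shared projection space, can be illustrated with a minimal NumPy sketch. This is an illustrative assumption of the general mechanism, not the paper's exact formulation: the function names, the scalar scoring vector `attn_vec`, and the dimensions are all hypothetical.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def multilingual_meta_embedding(word_embs, proj_mats, attn_vec):
    """Combine per-language embeddings of one token into a meta-embedding.

    word_embs: list of per-language embedding vectors (dims may differ)
    proj_mats: one projection matrix per language, mapping into a shared space
    attn_vec:  learned vector producing a scalar attention score per language
    """
    # project each monolingual embedding into the shared space
    projected = [W @ e for W, e in zip(proj_mats, word_embs)]
    # one scalar score per language; no language ID is given explicitly
    scores = np.array([attn_vec @ p for p in projected])
    weights = softmax(scores)
    # the meta-embedding is the attention-weighted sum of projections
    return sum(w * p for w, p in zip(weights, projected))

rng = np.random.default_rng(0)
# hypothetical setup: English (dim 4) and Spanish (dim 6) embeddings,
# shared space of dim 5
embs = [rng.normal(size=4), rng.normal(size=6)]
projs = [rng.normal(size=(5, 4)), rng.normal(size=(5, 6))]
attn = rng.normal(size=5)
meta = multilingual_meta_embedding(embs, projs, attn)
print(meta.shape)  # (5,)
```

The attention weights sum to one, so the model can softly select whichever language's embedding is most informative for the current token.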
