A combined syntactic-semantic embedding model based on lexicalized tree-adjoining grammar
Authors: Le Hong Phuong, Dang Hoang Vu
Abstract: This paper presents a joint syntactic-semantic embedding model that not only uses syntactic information to enrich word embeddings but also generates distributed representations for the syntactic structures themselves. The syntactic input to our model comes from a Lexicalized Tree-Adjoining Grammar parser. The word embeddings from our model outperform Skip-gram embeddings in several word similarity and sentiment classification experiments, and the syntactic structure embeddings improve a transition-based dependency parser by a clear margin.
Published in: July 2021, Computer Speech & Language