
A combined syntactic-semantic embedding model based on lexicalized tree-adjoining grammar

April 16, 2024

Authors: Le Hong Phuong, Dang Hoang Vu

Abstract: This paper presents a joint syntactic-semantic embedding model which not only uses syntactic information to enrich the word embeddings but also generates distributed representations for the syntactic structures themselves. The syntactic input to our model comes from a Lexicalized Tree-Adjoining Grammar parser. The word embeddings from our model outperform the Skip-gram embeddings in several word similarity and sentiment classification experiments. The syntactic structure embeddings help improve a transition-based dependency parser by a clear margin.
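To illustrate the idea of enriching word embeddings with syntactic information, here is a hedged sketch of how Skip-gram training pairs might be augmented with per-token syntactic labels (such as LTAG supertags). The pair-generation scheme, the `training_pairs` function, and the `supertags` input are illustrative assumptions, not the paper's actual model.

```python
# Hedged sketch: one common way to add syntactic context to Skip-gram
# training data. The "TAG=" contexts stand in for syntactic structure
# labels (e.g. LTAG supertags); this is an assumption for illustration,
# not the authors' method.

def training_pairs(tokens, supertags, window=2):
    """Yield (target, context) pairs combining linear neighbors
    with the target token's syntactic label."""
    pairs = []
    for i, w in enumerate(tokens):
        # Linear contexts, as in plain Skip-gram.
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((w, tokens[j]))
        # Syntactic context: the token's supertag becomes an extra context item.
        pairs.append((w, "TAG=" + supertags[i]))
    return pairs

tokens = ["the", "cat", "sleeps"]
tags = ["alpha_D", "alpha_N", "alpha_V"]
print(training_pairs(tokens, tags))
```

Training a standard Skip-gram model over such mixed (word, context) pairs would let syntactic labels shape the learned word vectors, which is one plausible reading of the joint setup the abstract describes.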

Published in: July 2021, Computer Speech & Language

