UC Davis at SemEval-2019 Task 1: DAG Semantic Parsing with Attention-based Decoder
Dian Yu and Kenji Sagae

Abstract: We present an encoder-decoder model for semantic parsing with UCCA for SemEval 2019 Task 1. The encoder is a Bi-LSTM and the decoder uses recursive self-attention. The resulting parser is simple and proved to be effective on the semantic parsing task, alleviating the challenges of feature engineering in traditional transition-based and graph-based parsers.

Cite (ACL): Dian Yu and Kenji Sagae. 2019. UC Davis at SemEval-2019 Task 1: DAG Semantic Parsing with Attention-based Decoder. In Proceedings of the 13th International Workshop on Semantic Evaluation, pages 119–124, Minneapolis, Minnesota, USA. Association for Computational Linguistics.

Anthology ID: S19-2017. Venue: SemEval. SIG: SIGLEX. Month: June. Year: 2019. DOI: 10.18653/v1/S19-2017. Bibkey: yu-sagae-2019-uc.
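The abstract names a Bi-LSTM encoder paired with a self-attention decoder. As a minimal illustration of the self-attention mechanism itself (a generic scaled dot-product sketch in NumPy, not the authors' actual implementation; all weight matrices here are randomly initialized assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of states H (T x d).

    Each position attends to every position in H, producing a new
    context-aware representation of the same shape.
    """
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (T, T) pairwise similarities
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V                        # (T, d) attended outputs

# Toy example: 5 encoder/decoder states of dimension 8.
rng = np.random.default_rng(0)
T, d = 5, 8
H = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(H, Wq, Wk, Wv)
print(out.shape)  # (5, 8)
```

In the paper's setting the attended states would feed the decoder's predictions over UCCA graph structure; the sketch above only shows the attention computation in isolation.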