Parsing the Uzbek language Universal Dependencies treebank using a bi-affine neural model
Keywords:
dependency parsing, Uzbek language, bi-affine attention, BiLSTM, neural networks, low-resource, NLP (Natural Language Processing), Universal Dependencies

Abstract
This study implements a deep bi-affine neural dependency parser for the Uzbek language using the available Universal Dependencies treebank. Key aspects such as the attention layers, network architecture, embedding dropout rates, and optimization parameters (e.g., the Adam optimizer) are discussed comprehensively. Experimental results demonstrate that the bi-affine model achieves an unlabeled attachment score (UAS) of 79.5% and a labeled attachment score (LAS) of 72.4%, significantly outperforming a transition-based baseline parser by 6.7% (UAS) and 7.4% (LAS).
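To make the scoring mechanism concrete, the sketch below shows a minimal biaffine arc scorer written in PyTorch. It is an illustration only, assuming a BiLSTM encoder that produces one vector per token; the hidden sizes, dropout rate, and names (BiaffineArcScorer, enc_dim, arc_dim) are hypothetical and do not reflect the exact configuration used in the paper.

    import torch
    import torch.nn as nn

    class BiaffineArcScorer(nn.Module):
        """Minimal biaffine arc scorer in the style of Dozat and Manning (2016).

        Hidden sizes, dropout rate, and layer names are illustrative assumptions,
        not the configuration reported in the paper.
        """
        def __init__(self, enc_dim: int = 400, arc_dim: int = 500, dropout: float = 0.33):
            super().__init__()
            # Separate MLPs project BiLSTM states into "head" and "dependent" spaces.
            self.head_mlp = nn.Sequential(nn.Linear(enc_dim, arc_dim), nn.ReLU(), nn.Dropout(dropout))
            self.dep_mlp = nn.Sequential(nn.Linear(enc_dim, arc_dim), nn.ReLU(), nn.Dropout(dropout))
            # Biaffine weight; the extra column acts as a bias term for head vectors.
            self.weight = nn.Parameter(torch.empty(arc_dim, arc_dim + 1))
            nn.init.xavier_uniform_(self.weight)

        def forward(self, enc: torch.Tensor) -> torch.Tensor:
            # enc: (batch, seq_len, enc_dim) token encodings from a BiLSTM.
            deps = self.dep_mlp(enc)                          # (batch, n, arc_dim)
            heads = self.head_mlp(enc)                        # (batch, n, arc_dim)
            heads = torch.cat([heads, torch.ones_like(heads[..., :1])], dim=-1)
            # scores[b, i, j] = plausibility of token j being the head of token i.
            return torch.einsum("bif,fg,bjg->bij", deps, self.weight, heads)

    # Toy usage: score head candidates for a single 8-token sentence.
    encodings = torch.randn(1, 8, 400)            # stand-in for BiLSTM outputs
    arc_scores = BiaffineArcScorer()(encodings)   # (1, 8, 8)
    heads = arc_scores.argmax(dim=-1)             # greedy head per token (no tree constraint)

In the full parser, a second biaffine classifier would score dependency labels for each selected arc, and the arc scores would be decoded into a well-formed tree, e.g. with a maximum spanning tree algorithm (McDonald et al., 2005), rather than the greedy argmax shown above.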
References
Ryan McDonald, Fernando Pereira, Kiril Ribarov, and Jan Hajič. 2005. Non-Projective Dependency Parsing using Spanning Tree Algorithms. In Proceedings of Human Language Technology Conference and Conference on Empirical Methods in Natural Language Processing, pages 523–530, Vancouver, British Columbia, Canada. Association for Computational Linguistics.
Joakim Nivre. 2008. Algorithms for Deterministic Incremental Dependency Parsing. Computational Linguistics, 34(4):513–553.
Dozat, T., & Manning, C. D. (2016, November 6). Deep biaffine attention for neural dependency parsing. arXiv.org. https://arxiv.org/abs/1611.01734
Eliyahu Kiperwasser and Yoav Goldberg. 2016. Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations. Transactions of the Association for Computational Linguistics, 4:313–327.
Arofat Akhundjanova, & Luigi Talamo (2025). Universal Dependencies Treebank for Uzbek. In Proceedings of the Third Workshop on Resources and Representations for Under-Resourced Languages and Domains (RESOURCEFUL-2025) (pp. 1–6). Association for Computational Linguistics.
Bahdanau, D., Cho, K., & Bengio, Y. (2014, September 1). Neural machine translation by jointly learning to align and translate. arXiv.org. https://arxiv.org/abs/1409.0473
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., & Polosukhin, I. (2017, June 12). Attention is all you need. arXiv.org. https://arxiv.org/abs/1706.03762
Diederik Kingma and Jimmy Ba. 2014. Adam: A method for stochastic optimization. International Conference on Learning Representations.
Chris Dyer, Miguel Ballesteros, Wang Ling, Austin Matthews, and Noah A. Smith. 2015. Transition-Based Dependency Parsing with Stack Long Short-Term Memory. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 334–343, Beijing, China. Association for Computational Linguistics.
License
Copyright (c) 2025 Sanatbek Matlatipov

This work is licensed under a Creative Commons Attribution 4.0 International License.