EFFECTIVENESS AND FUTURE OF DEEP LEARNING METHODS IN SYNTAX ANALYSIS APPLICATIONS IN NATURAL LANGUAGE PROCESSING

Authors

Ryspakova, M., & Tursunova, A.

Keywords: computational linguistics, syntactic analysis, natural language processing, deep learning, neural networks.

Abstract

The article presents a comprehensive review of innovative technologies for automated syntactic analysis of sentences. It traces the evolution of methods from traditional approaches to modern neural network architectures, including deep learning models (RNNs, BiLSTMs, Transformers). Special attention is given to attention-based models (BERT, XLNet, RoBERTa) and their adaptation to syntactic analysis. The study also covers graph-based methods (GNNs, graph transformers) as well as multilingual and cross-lingual approaches. The integration of semantic information, active learning techniques, and the application of pre-trained language models are discussed, along with new metrics for assessing parsing quality and the impact of the Universal Dependencies project on standardisation in this area. The study is based on a systematic literature review that includes chronological and comparative analysis of methods, their categorisation, and critical evaluation. Information from various sources is synthesised and future trends are predicted. The paper is of interest to researchers and practitioners in NLP, linguistics, and artificial intelligence, providing a comprehensive overview of the current state of the art and future prospects of automated parsing.
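To make the parsing task concrete, the sketch below illustrates the arc-standard transition system that underlies transition-based neural dependency parsers such as Chen and Manning (2014). This is a toy illustration, not the authors' implementation: in a real parser a trained neural network scores the next transition from features of the stack and buffer, whereas here a fixed gold-oracle sequence stands in for the classifier, and the example sentence and transition sequence are invented for demonstration.

```python
# Toy arc-standard transition system: the parser moves tokens from a
# buffer onto a stack and attaches dependents with LEFT/RIGHT arcs.
# In neural parsers, a network predicts each transition; here the
# sequence is hard-coded for a three-word example.

def parse(words, transitions):
    """Apply SHIFT / LEFT / RIGHT transitions; return head index per token."""
    stack = [0]                          # index 0 is the artificial ROOT
    buffer = list(range(1, len(words) + 1))
    heads = {}                           # dependent index -> head index
    for t in transitions:
        if t == "SHIFT":                 # move next buffer token onto stack
            stack.append(buffer.pop(0))
        elif t == "LEFT":                # second-top depends on top
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        elif t == "RIGHT":               # top depends on second-top
            dep = stack.pop()
            heads[dep] = stack[-1]
    return heads

words = ["She", "reads", "books"]
# Gold-oracle sequence for the tree: reads -> She, reads -> books, ROOT -> reads
seq = ["SHIFT", "SHIFT", "LEFT", "SHIFT", "RIGHT", "RIGHT"]
print(parse(words, seq))  # {1: 2, 3: 2, 2: 0}
```

The BiLSTM and Transformer parsers surveyed in the article differ mainly in how they represent the parser state when scoring such transitions (or, in graph-based parsers, when scoring candidate arcs directly).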

References

Ahmad, W. U., Zhang, Z., Ma, X., Hovy, E., Chang, K. W., & Peng, N. (2019). On difficulties of cross-lingual transfer with order differences: A case study on dependency parsing. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies.

Chen, D., & Manning, C. D. (2014). A fast and accurate dependency parser using neural networks. In Proceedings of the 2014 conference on empirical methods in natural language processing (EMNLP).

Conneau, A., Khandelwal, K., Goyal, N., Chaudhary, V., Wenzek, G., Guzmán, F., ... & Stoyanov, V. (2020). Unsupervised cross-lingual representation learning at scale. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics.

de Marneffe, M. C., Manning, C. D., Nivre, J., & Zeman, D. (2021). Universal dependencies. Computational Linguistics, 47(2), 255-308.

Hewitt, J., & Manning, C. D. (2019). A structural probe for finding syntax in word representations. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies.

Ji, T., Cao, Y., & Huang, L. (2019). Graph-based neural dependency parsing with latent structural predictions. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics.

Joshi, M., Chen, D., Liu, Y., Weld, D. S., Zettlemoyer, L., & Levy, O. (2020). SpanBERT: Improving pre-training by representing and predicting spans. Transactions of the Association for Computational Linguistics.

Kasai, J., Cahn, J., Tanaka, R., & Choi, Y. (2019). Syntax-aware neural semantic role labeling. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies.

Kiperwasser, E., & Goldberg, Y. (2016). Simple and accurate dependency parsing using bidirectional LSTM feature representations. Transactions of the Association for Computational Linguistics, 4, 313-327.

Kitaev, N., & Klein, D. (2018). Constituency parsing with a self-attentive encoder. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics.

Kondratyuk, D., & Straka, M. (2019). 75 languages, 1 model: Parsing universal dependencies universally. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing.

Li, Z., Li, J., Tang, D., Chen, B., & Li, J. (2020). Deep active learning for dependency parsing. In Proceedings of the 28th International Conference on Computational Linguistics.

Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., ... & Stoyanov, V. (2019). RoBERTa: A robustly optimized BERT pretraining approach. arXiv preprint arXiv:1907.11692.

Ma, X., Hu, Z., Liu, J., Peng, N., Neubig, G., & Hovy, E. (2018). Stack-pointer networks for dependency parsing. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics.

Mille, S., Belz, A., Bohnet, B., Graham, Y., Pitler, E., & Wanner, L. (2018). The first multilingual surface realisation shared task (SR'18): Overview and evaluation results. In Proceedings of the First Workshop on Multilingual Surface Realisation.

Slobodkin, A., Bhattacharyya, P., & Ungar, L. (2021). A unified feature attribution framework for interpreting dependency parsers. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing.

Swayamdipta, S., Thomson, S., Lee, K., Zettlemoyer, L., Eisenstein, J., & Smith, N. A. (2018). Syntactic scaffolds for semantic structures. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing.

Wu, Z., Pan, S., Chen, F., Long, G., Zhang, C., & Yu, P. S. (2020). A comprehensive survey on graph neural networks. IEEE Transactions on Neural Networks and Learning Systems.

Yang, Z., Dai, Z., Yang, Y., Carbonell, J., Salakhutdinov, R. R., & Le, Q. V. (2019). XLNet: Generalized autoregressive pretraining for language understanding. In Advances in Neural Information Processing Systems.

Zhou, J., Zhang, Z., Zhao, H., & Zhang, S. (2020). LIMIT-BERT: Linguistic informed multi-task BERT. In Findings of the Association for Computational Linguistics: EMNLP 2020.

Published

03/31/2025

How to Cite

Ryspakova, M., & Tursunova, A. (2025). EFFECTIVENESS AND FUTURE OF DEEP LEARNING METHODS IN SYNTAX ANALYSIS APPLICATIONS IN NATURAL LANGUAGE PROCESSING. Anatolian Social Sciences and Education Journal, 1(1). Retrieved from https://anatolianjournal.com/index.php/pub/article/view/37