Abstract: | Dependency parsing is a fundamental task in natural language processing that analyzes the grammatical structure of sentences. This research focuses on advancing dependency parsing for Amharic, a language rich in linguistic complexity, using a Transformer model. For the experiments, we utilized a treebank containing 1,574 sentences. Of these, 500 sentences were carefully crafted by the researcher in collaboration with linguistic experts; all of these sentences were drawn from works of fiction and various novel genres, chosen to ensure relative structural correctness. The remaining 1,074 sentences were adopted from the UD-Amharic Treebank. The research begins with careful data preprocessing to ensure the quality and consistency of the dataset, and we perform morphological analysis, POS tagging, and syntactic-relation annotation on the collected sentences. The Transformer model is well known for its success in various natural language processing tasks, and its ability to capture contextual information and long-range dependencies aligns with the linguistic complexities of Amharic. Comparative analyses are conducted to assess the effectiveness of the Transformer model against traditional parsing algorithms. Additionally, the Arc-Hybrid algorithm, known for its efficiency in parsing non-projective structures, is integrated to enhance parsing capabilities. This hybrid approach addresses Amharic's complex sentence structures and long-range dependencies, and the combination of the Transformer model with the Arc-Hybrid algorithm demonstrates their potential to advance the accuracy and robustness of dependency parsing for languages with complex linguistic structures. The proposed system is evaluated and achieves a 94.58% unlabeled attachment score (UAS) and an 84.2% labeled attachment score (LAS). |