Fusion: Practice and Applications

Journal DOI

https://doi.org/10.54216/FPA

ISSN (Online): 2692-4048 | ISSN (Print): 2770-0070

Volume 17, Issue 1, pp. 01-14, 2025 | Full Length Article

A Transformer-Enhanced System to Reverse Dictionary Technology

Ahmed Bahaaulddin A. Alwahhab 1 * , Vian Sabeeh 2 , Ali Sami Al-Itbi 3 , Ali Abdulmunim Ibrahim Al-kharaz 4

  • 1 Department of informatics, Technical College of Management, Middle Technical University, Baghdad, Iraq - (ahmedbahaaulddin@mtu.edu.iq)
  • 2 Department of informatics, Technical College of Management, Middle Technical University, Baghdad, Iraq - (viantalal@mtu.edu.iq)
  • 3 Department of informatics, Technical College of Management, Middle Technical University, Baghdad, Iraq - (ali.sami@mtu.edu.iq)
  • 4 Department of informatics, Technical College of Management, Middle Technical University, Baghdad, Iraq - (ali.al-kharaz@mtu.edu.iq)
  • DOI: https://doi.org/10.54216/FPA.170101

    Received: October 28, 2023 Revised: February 25, 2024 Accepted: June 25, 2024
    Abstract

    Retrieving a word that sits on the cusp of memory is often blocked by the well-documented tip-of-the-tongue (TOT) phenomenon, a cognitive barrier that can impede communication and learning. To address this, our study introduces a reverse dictionary framework built on neural network architectures that retrieves words from their definitions or descriptions. The research charts the development and efficiency of several natural language deep learning models, each formulated to capture the semantics of text. The work began with the collection of a new, linguistically rich dataset by web scraping. A careful pre-processing stage, including text normalization and contextual feature extraction, transformed the unstructured text into structured features suitable for model training. Dense vector representations of the text were extracted with the BERT embedding model. Three learners (LSTM, FNN, and GRU) were trained and compared on both the scraped data and a benchmark dataset. The proposed model, which combines BERT embeddings with an LSTM learner, was evaluated under cosine similarity and mean squared error metrics and showed notable performance. The LSTM model proved useful for real-world applications, exhibiting strong semantic coherence in its embeddings and accuracy in its predictions. The study also discusses how the pre-trained BERT model enhances vocabulary coverage, and it sheds light on the crucial future role of reverse dictionaries in many NLP applications. Subsequent research will focus on extending the multilingual capabilities of our methodology and investigating its suitability for other cognitive linguistic phenomena.
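    The retrieval step the abstract describes — scoring vocabulary words by the cosine similarity between a definition's embedding and pre-computed word embeddings — can be sketched as follows. This is an illustrative sketch, not the authors' code: the toy three-dimensional vectors stand in for real BERT embeddings, and `retrieve_word` is a hypothetical helper name.

    ```python
    import numpy as np

    def cosine_similarity(a, b):
        """Cosine of the angle between two embedding vectors."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def retrieve_word(definition_vec, vocab_embeddings):
        """Return the vocabulary word whose embedding is closest to the
        definition embedding under cosine similarity."""
        scores = {w: cosine_similarity(definition_vec, v)
                  for w, v in vocab_embeddings.items()}
        return max(scores, key=scores.get)

    # Toy embeddings standing in for BERT word vectors (hypothetical values).
    vocab = {
        "apple": np.array([0.9, 0.1, 0.0]),
        "run":   np.array([0.0, 0.8, 0.2]),
        "ocean": np.array([0.1, 0.2, 0.9]),
    }

    # A definition embedding that, in this toy space, lies near "ocean".
    query = np.array([0.15, 0.1, 0.95])
    print(retrieve_word(query, vocab))  # → ocean
    ```

    In the paper's pipeline, `query` would be produced by the LSTM learner from the BERT-encoded definition text; the same cosine measure then doubles as an evaluation metric.
    
    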

    Keywords:

    Bidirectional Encoder Representations from Transformers, Long short-term memory networks, Natural language processing, Reverse Dictionary


    Cite This Article As :
    Bahaaulddin, Ahmed, Sabeeh, Vian, Sami, Ali, Abdulmunim, Ali. A Transformer-Enhanced System to Reverse Dictionary Technology. Fusion: Practice and Applications, vol. 17, no. 1, 2025, pp. 01-14. DOI: https://doi.org/10.54216/FPA.170101