Journal of Cognitive Human-Computer Interaction

Journal DOI: https://doi.org/10.54216/JCHCI

ISSN (Online): 2771-1463 | ISSN (Print): 2771-1471

Volume 8, Issue 2, PP: 46-54, 2024 | Full Length Article

Mindspeak: Empowering Communication with Brain Keyboard

Vimala Imogen P. 1*, Jeevaa Katiravan 2, Nitish R. G. 3, Vishnudharshan R. 4

  • 1 Information Technology, Velammal Engineering College, Tamil Nadu, India. - (vimalaimogen@velammal.edu.in)
  • 2 Information Technology, Velammal Engineering College, Tamil Nadu, India. - (jeevaa.katiravan38@gmail.com)
  • 3 Student, Information Technology, Velammal Engineering College, Tamil Nadu, India. - (nitishrao2003@gmail.com)
  • 4 Student, Information Technology, Velammal Engineering College, Tamil Nadu, India. - (vishnuvishwa81@gmail.com)
  • DOI: https://doi.org/10.54216/JCHCI.080205

    Received: October 24, 2023 | Revised: January 27, 2024 | Accepted: April 28, 2024
    Abstract

    Brain-Computer Interface (BCI) technology is transforming the way individuals with severe motor disabilities interact with the world. The integration of electroencephalogram (EEG) sensors into applications such as the Brain Keyboard marks a pivotal step forward: by capturing and interpreting brain signals triggered by simple actions such as eye blinking, these sensors allow users to control a virtual keyboard, bypassing the limitations of conventional motor pathways. This direct channel between the human brain and external devices offers an unprecedented avenue for communication, particularly valuable for people living with paralysis or locked-in syndrome. The impact of BCIs extends far beyond textual communication; they act as a bridge toward autonomy and engagement for individuals facing profound physical challenges. Through these interfaces, users can articulate thoughts, express emotions, and actively participate in social interactions, fundamentally enhancing their quality of life. The technology also holds promise in broader applications: as BCIs evolve, they may enable control of robotic prosthetics, granting users the ability to accomplish tasks once deemed impossible. Moreover, BCIs reach into neuroscience itself, offering a unique window into cognitive processes and neurological disorders; the ability to decode and interpret brain activity not only facilitates communication but also paves the way for new research and potential therapies. Challenges persist, such as improving signal accuracy and streamlining usability, yet the benefits that BCIs offer to individuals with motor disabilities continue to fuel innovation in this field. Ultimately, the fusion of EEG sensors, processing units, and user interfaces in BCIs heralds a new era of inclusivity and empowerment, in which individuals previously marginalized by physical limitations find new avenues for expression, interaction, and independence. This technology not only unlocks communication but also promises to reshape our understanding of the human brain and its workings, pointing toward a future in which disability no longer confines one's ability to engage with the world.
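
    To make the blink-driven interaction model concrete, the sketch below illustrates one way a threshold-based eye-blink detector could drive a row/column scanning virtual keyboard. This is an illustration only, not the authors' published implementation: the simulated EEG source, the amplitude threshold, the dwell time, and the keyboard layout are all assumptions introduced for this example.

```python
# Illustrative sketch only: threshold-based blink detection driving a
# row/column scanning keyboard. Signal source, threshold, and timing
# values are assumptions, not the system described in the paper.
import random
import time

KEY_ROWS = ["ABCDEF", "GHIJKL", "MNOPQR", "STUVWX", "YZ_.,?"]

BLINK_THRESHOLD_UV = 120.0   # assumed amplitude threshold (microvolts)
REFRACTORY_S = 0.4           # ignore samples briefly after a detected blink


def read_eeg_sample():
    """Stand-in for an EEG sensor read; returns a frontal-channel value in uV."""
    # A real system would read from the headset's driver or streaming API here.
    baseline = random.gauss(0.0, 20.0)
    # Occasionally inject a large deflection to mimic an eye-blink artifact.
    return baseline + (200.0 if random.random() < 0.02 else 0.0)


def wait_for_blink(timeout_s):
    """Poll the (simulated) EEG stream until a blink-like deflection or timeout."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if abs(read_eeg_sample()) > BLINK_THRESHOLD_UV:
            time.sleep(REFRACTORY_S)  # debounce the blink artifact
            return True
        time.sleep(0.01)              # ~100 Hz polling of the stream
    return False


def scan_select(options, dwell_s=1.0):
    """Highlight each option in turn; a blink during the dwell selects it."""
    while True:
        for option in options:
            print(f"highlight: {option}")
            if wait_for_blink(dwell_s):
                return option


def spell_one_character():
    row = scan_select(KEY_ROWS)     # first blink picks a row
    return scan_select(list(row))   # second blink picks a letter in that row


if __name__ == "__main__":
    print("selected:", spell_one_character())
```

    In a deployed system the simulated read would be replaced by the headset's own acquisition interface, and the fixed threshold would typically give way to per-user calibration or a trained artifact-detection model.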

    Keywords:

    BCI sensor, processing unit, user interface, communication devices.

    Cite This Article As:
    Imogen, V., Katiravan, J., Nitish, R. G., & Vishnudharshan, R. (2024). Mindspeak: Empowering Communication with Brain Keyboard. Journal of Cognitive Human-Computer Interaction, 8(2), 46-54. DOI: https://doi.org/10.54216/JCHCI.080205