Journal of Intelligent Systems and Internet of Things

Journal DOI

https://doi.org/10.54216/JISIoT

ISSN (Online): 2690-6791 | ISSN (Print): 2769-786X

Blockchain-Enabled Multi-Head Attention Based Deep Learning Model for Intrusion Detection System in Smart Networks

Ehab Bahaudien Ashary

Intrusion Detection Systems (IDS) are increasingly being integrated into smart homes for effective pervasive sensing and resource management, thanks to advancements in sensor technologies and the development of Information and Communication Technology (ICT). Securing IDSs in smart homes is essential for safeguarding crucial data and ensuring the integrity of the related devices. Implementing strong cybersecurity measures, including regular software updates, encrypted communication protocols, and secure authentication mechanisms, is critical to guard against potential risks. As smart home networks continue to grow, developers, users, and manufacturers must work together to maintain and prioritize stringent security standards, mitigating the risks associated with connected devices and preserving the safety and privacy of consumers. Blockchain (BC) technology can increase the security of IDS in smart homes by providing a tamper-resistant and decentralized framework for managing data transactions and device interactions. By leveraging blockchain, smart home networks can establish a more secure and resilient infrastructure, giving consumers greater confidence in the security and privacy of their interconnected devices. This study introduces a Blockchain and Multi-Head Attention-Based Deep Learning for Intrusion Detection System in Smart Networks (BCMHDL-IDSSN) technique for smart home networks. The BCMHDL-IDSSN method aims to enhance security in smart home networks. In the BCMHDL-IDSSN technique, BC technology is used to achieve security. Besides, the BCMHDL-IDSSN technique involves the design of a multi-head attention bidirectional gated recurrent unit (MHA-BiGRU) method for the detection of malicious activities. Finally, an enhanced pigeon-inspired optimization (EPIO) model is applied for the optimal hyperparameter selection of the MHA-BiGRU model. A detailed investigation was conducted to validate the performance of the BCMHDL-IDSSN method. The simulation results confirm that the BCMHDL-IDSSN method achieves higher efficiency than other techniques.
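
As a rough illustration of the detection component, the sketch below stacks a multi-head attention layer on a bidirectional GRU for traffic-sequence classification, assuming PyTorch; the class name, layer sizes, and dummy input are assumptions for illustration only, not the authors' BCMHDL-IDSSN implementation (which also includes EPIO-driven hyperparameter tuning and the blockchain layer).

```python
# Illustrative sketch only: a multi-head attention layer over a bidirectional GRU
# for flow-level intrusion classification. Dimensions and names are assumptions,
# not the paper's BCMHDL-IDSSN implementation.
import torch
import torch.nn as nn

class MHABiGRU(nn.Module):
    def __init__(self, num_features=41, hidden=64, heads=4, num_classes=2):
        super().__init__()
        self.bigru = nn.GRU(num_features, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(embed_dim=2 * hidden, num_heads=heads, batch_first=True)
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):                      # x: (batch, seq_len, num_features)
        h, _ = self.bigru(x)                   # (batch, seq_len, 2*hidden)
        a, _ = self.attn(h, h, h)              # self-attention over time steps
        return self.fc(a.mean(dim=1))          # pool over time, then classify

model = MHABiGRU()
logits = model(torch.randn(8, 20, 41))         # dummy batch of 8 traffic sequences
print(logits.shape)                            # torch.Size([8, 2])
```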

Doi: https://doi.org/10.54216/JISIoT.150201

Vol. 15 Issue. 2 PP. 01-13, (2025)

Design and implementation of intelligent home data cloud storage system with large system and big data

Yangxia Shu , Hai Liu

The increasing maturity of 5G and Internet of Things technology lets people experience the convenience that high technology brings to daily life, and smart homes are gradually penetrating people's lives. To address the disadvantages of traditional data storage, such as low flexibility and slow speed, an effective cloud storage system for data storage and management is designed. Through the design of the cloud storage system structure and database, together with the hardware design of the smart home data cloud storage system, this paper provides users with various functions, verifies the practicability of the cloud storage system through system testing and analysis, and improves the functionality of the smart home data cloud storage system.

Doi: https://doi.org/10.54216/JISIoT.150202

Vol. 15 Issue. 2 PP. 14-28, (2025)

Modelling Software Development Effort Using Data-Driven Models

Zainab Rustum Mohsin , Firoj Khan

Software effort estimation is highly significant for project management, particularly in the bidding process, since underestimation leads to financial losses, while overestimation increases the chance of losing a competitive bid. Although numerous models have been designed to date, those built on machine learning, namely Adaptive Neuro-Fuzzy Inference Systems (ANFIS) and Artificial Neural Networks (ANN), have emerged as preeminent technologies. This research explores the effectiveness of the ANN and ANFIS approaches for effort estimation on NASA datasets, with 13 observations used for training and the rest for testing. Several measures are used to evaluate the accuracy of the developed models, including the correlation coefficient, RMSE, and MMRE. The findings demonstrate that ANN and ANFIS exhibit superior performance, yielding much higher prediction accuracy than conventional models, including Walston-Felix, Doty, Bailey-Basili, and Halstead. This establishes ANN and ANFIS as reliable and straightforward software effort estimation methodologies that yield significant enhancements in estimation precision and competitiveness. Their high performance underlines their usefulness to project managers who seek accurate predictions. This study strongly recommends the application of data-driven approaches like ANN and ANFIS to enhance overall estimation accuracy in software project bidding.
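
For readers unfamiliar with the evaluation measures named above, the following minimal sketch computes RMSE, MMRE, and the correlation coefficient with NumPy; the effort values are made-up numbers, not the NASA data used in the paper.

```python
# Minimal sketch of the accuracy measures mentioned above (RMSE, MMRE, correlation);
# the effort values below are invented, not the NASA dataset.
import numpy as np

actual    = np.array([24.0, 36.0, 62.0, 11.0])   # actual effort (person-months)
predicted = np.array([20.5, 40.2, 55.0, 12.3])   # model estimates

rmse = np.sqrt(np.mean((actual - predicted) ** 2))
mmre = np.mean(np.abs(actual - predicted) / actual)   # Mean Magnitude of Relative Error
corr = np.corrcoef(actual, predicted)[0, 1]

print(f"RMSE={rmse:.2f}  MMRE={mmre:.3f}  R={corr:.3f}")
```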

Doi: https://doi.org/10.54216/JISIoT.150203

Vol. 15 Issue. 2 PP. 29-40, (2025)

Deep Secure: An Integrated Approach to Anomaly Detection and Cryptographic Protection in Industrial Cyber-Physical Systems

Sameer Nooh

Industrial Cyber-Physical Systems (CPS) signify a noteworthy development in industrial automation and control, combining physical and digital components to improve the efficacy, trustworthiness, and functionality of numerous industrial procedures. Industrial CPS are useful in a wide range of industries, such as transportation, energy, manufacturing, and healthcare. Intrusion detection systems (IDS) act as vigilant protectors, constantly monitoring network and physical modules for any unauthorized access, anomalies, or suspicious actions. They deliver early threat recognition and prevent security breaches and operational disruptions. In addition, cryptographic protection guarantees the confidentiality, integrity, and authenticity of data transmitted across Industrial CPS. By utilizing innovative encryption and authentication mechanisms, cryptographic solutions defend sensitive data from interception or tampering, preserving the consistency and confidentiality of critical industrial procedures. The combination of these security measures creates a strong defense mechanism, boosting the resilience of Industrial CPS against evolving cyber threats and protecting the reliability of vital industrial processes. This article presents Deep Secure, an integrated approach to intrusion detection and cryptographic protection in Industrial CPS environments. The proposed model aims to integrate intrusion detection and a cryptography-based secure communication protocol for industrial CPS environments. The Deep Secure model comprises two major phases: intrusion detection and secure communication. Primarily, the intrusion detection process comprises a self-attention-based bidirectional long short-term memory (SA-BiLSTM) technique. Besides, the deer hunting optimization algorithm (DHOA) achieves hyperparameter tuning of the SA-BiLSTM technique. Moreover, a secure communication protocol is designed using the ElGamal cryptosystem. The experimental results of the Deep Secure method were evaluated in terms of different measures. A comprehensive result analysis highlighted the superior performance of the Deep Secure method compared with other current approaches.
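
To illustrate the cryptographic component, the snippet below walks through textbook ElGamal key generation, encryption, and decryption over a deliberately tiny prime; it is only a conceptual sketch, not the paper's secure communication protocol, and the parameters are far too small for real security.

```python
# Textbook ElGamal over a toy prime, purely to illustrate the key generation /
# encryption / decryption steps named above; not the paper's protocol, and the
# parameters are far too small for real security.
import random

p, g = 30803, 2                      # small public prime and base (toy values)
x = random.randrange(2, p - 1)       # private key
y = pow(g, x, p)                     # public key

def encrypt(m, y):
    k = random.randrange(2, p - 1)   # per-message ephemeral key
    return pow(g, k, p), (m * pow(y, k, p)) % p

def decrypt(c1, c2, x):
    s = pow(c1, x, p)                # shared secret g^(k*x) mod p
    return (c2 * pow(s, p - 2, p)) % p   # multiply by s^-1 (Fermat inverse)

c1, c2 = encrypt(1234, y)
print(decrypt(c1, c2, x))            # 1234
```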

Doi: https://doi.org/10.54216/JISIoT.150204

Vol. 15 Issue. 2 PP. 41-54, (2025)

Explainable AI-Driven Gait Analysis Using Wearable Internet of Things (Wiot) and Human Activity Recognition

Ponugoti Kalpana , Sarangam Kodati , L. Smitha , Dhasaratham , Nara Sreekanth , Aseel Smerat , Muhannad Akram Ahmad

Due to the rapid expansion of the Internet of Things (IoT), supportive systems for healthcare have made significant advancements in both diagnosis and treatment processes. To provide optimal support in clinical settings and daily activities, these systems must accurately detect human movements. Real-time gait analysis plays a crucial role in developing advanced supportive systems. While machine learning and deep learning algorithms have significantly improved gait detection accuracy, many existing models primarily focus on enhancing detection accuracy, often neglecting computational overhead, which can affect real-time applicability. This paper proposes a novel hybrid combination of Sparse Gate Recurrent Units (SGRUs) and Devil Feared Feed Forward Networks (DFFFN) to effectively recognize human activities based on gait data. These data are gathered through Wearable Internet of Things (WIoT) devices. The SGRU and DFFFN networks extract spatio-temporal features for classification, enabling accurate gait recognition. Moreover, Explainable Artificial Intelligence (EAI) is used to assess the interpretability, scalability, and reliability of the proposed hybrid deep learning framework. Extensive experiments were conducted on real-time datasets and benchmark datasets, including WHU-Gait and OU-ISIR, to validate the algorithm's efficacy against existing hybrid methods. SHAP models were also employed to evaluate feature importance and to gauge interpretability and robustness. The experimental results show that the method, combining Sparse GRUs and Tasmanian Devil Optimization (TDO)-inspired classifiers, achieves superior accuracy and computational efficiency compared to existing models. Tested on real-time and benchmark datasets, the model demonstrates significant potential for real-time healthcare applications, with an AUC of 0.988 on real-time data. These findings suggest that the approach offers practical benefits for improving gait recognition in clinical settings.

Doi: https://doi.org/10.54216/JISIoT.150205

Vol. 15 Issue. 2 PP. 55-75, (2025)

Ensemble of Machine Learning Model with Tuna Swarm Optimization-Driven Feature Selection for Cybersecurity Threat Detection and Classification Approach

K. Anitha , K. Rajiv Gandhi

The early identification of cybersecurity events such as attacks is challenging given the continuously growing threat environment. Despite state-of-the-art surveillance, advanced attackers can remain in a system for more than 100 days before being detected. Guaranteeing cybersecurity is a complex task that depends on the domain of interest and requires cognitive capabilities to manage possible threats arising from large quantities of network data. The most important task of a cybersecurity analyst is to safeguard a network from damage. Numerous technological developments in network and information security have enabled progressive monitoring and threat detection, but the responsibilities analysts carry out cannot be automated completely. Hence, in recent times, Artificial Intelligence (AI), mainly deep learning (DL) and machine learning (ML) algorithms, has been utilized to develop beneficial data-driven intrusion detection systems (IDS). Many standard ML classification methods provide intelligent facilities in the area of cybersecurity, mainly for intrusion detection. This study develops a Tuna Swarm Optimization-Driven Feature Selection with Ensemble of Machine Learning Models for Cybersecurity Threat Detection and Classification (TSOFSEML-CTDC) technique. The proposed TSOFSEML-CTDC model concentrates on detecting and classifying intrusions on the network. Initially, the TSOFSEML-CTDC algorithm performs data preprocessing using min-max normalization to convert the input data into a useful format. Then, the feature selection process is carried out using the tuna swarm optimization (TSO) algorithm. For the classification of intrusions, an ensemble of ML techniques is employed, comprising the support vector regression (SVR) approach, the least-squares support vector machine (LSSVM) method, and the modified extreme learning machine (MELM) technique. Finally, the hyperparameter optimization process is executed using the coati optimization algorithm (COA). The experimental evaluation of the TSOFSEML-CTDC model is conducted on a benchmark dataset. The simulation results emphasize the enhanced performance of the TSOFSEML-CTDC method compared to existing approaches.
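
As a small illustration of the preprocessing step, the sketch below applies min-max normalization with scikit-learn; the toy feature matrix is invented and is not the benchmark dataset used in the paper.

```python
# Sketch of the min-max preprocessing step described above, using scikit-learn;
# the toy feature matrix is illustrative, not the paper's benchmark IDS dataset.
import numpy as np
from sklearn.preprocessing import MinMaxScaler

X = np.array([[491.0, 2, 0.0],
              [146.0, 1, 0.5],
              [  0.0, 3, 1.0]])        # raw flow features on very different scales

scaler = MinMaxScaler()                # maps each column to the [0, 1] range
X_scaled = scaler.fit_transform(X)     # (x - min) / (max - min), column-wise
print(X_scaled)
```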

Doi: https://doi.org/10.54216/JISIoT.150206

Vol. 15 Issue. 2 PP. 76-90, (2025)

Leveraging Artificial Intelligence for Assessing Metering Faults in Electric Power Systems

Huda W. Ahmed , Asma Khazaal Abdulsahib , Massila Kamalrudin , Mustafa Musa

Operational sampling inspections, manual verification, user applications, and other labor-intensive methods are commonly used for energy meter error analysis in the power industry. Accurate fault detection in electric energy meters is crucial for achieving reliable measurements. The fundamental issue with on-site testing of electrical energy metering equipment is the behavior of electric power meters under dynamic conditions. This research develops a deep learning-assisted prediction model (DLPM) to address the problem of inaccurate energy power meters. Electricity is measured precisely, and the meters can pinpoint the most consequential deviations between the predicted and actual trajectories. The results of this research point to the widespread adoption of a consistent and autonomous method for analyzing discrepancies in energy meters. Compared to traditional approaches, this technology considerably improves the computation of smart-meter deviations, providing precise data support for analysis and diagnosis of the source of smart-meter abnormalities. The simulation results show that the suggested DLPM model achieves better prediction accuracy (99.2%), performance (97.8%), efficiency (98.9%), average consumption (10.3%), and root mean square error (11.2%) than the state-of-the-art.

Doi: https://doi.org/10.54216/JISIoT.150207

Vol. 15 Issue. 2 PP. 91-103, (2025)

Enhancing E-commerce Security through Fake News Detection Using Natural Language Processing and Advanced Feature Engineering Technique

Lama Sameer Khoshaim

E-commerce has simplified customers' lives and offered a wide range of items, but it has also made them vulnerable to fraud. Fake news on e-commerce platforms threatens trust, brand image, and economic stability. Researchers have shown that contemporary Natural Language Processing (NLP) and machine learning can help stop fake news, yet e-commerce companies still struggle to distinguish phony news from real information. Fast knowledge diffusion can cause financial loss, reputation damage, and customer distrust. Thus, fake news identification in e-commerce requires robust and trustworthy methods. This investigation aims to effectively recognize and discriminate fake news. The study involves four phases: data preprocessing, feature extraction, feature selection, and classification. Data preparation uses stemming, lemmatization, and stop-word removal to clean the raw text. Feature extraction uses Word2vec and Term Frequency-Inverse Document Frequency (TF-IDF), and the optimum feature subset is determined via feature selection using the least absolute shrinkage and selection operator (LASSO). The suggested method averages model outputs to reduce overfitting and improve prediction stability. DistilBERT with a multi-stacked LSTM is tested on WELFake and ranked by F1 score, sensitivity, accuracy, and specificity. The DistilBERT multi-stacked LSTM achieves 99.77% accuracy, far greater than the other models, and can therefore be used to detect fake news. By improving accuracy and consistency, it boosts customer confidence and the legitimacy of Internet commerce.
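
The following sketch shows, in scikit-learn, how TF-IDF feature extraction can be combined with LASSO-based feature selection as described above; the two headlines, labels, and the alpha value are illustrative assumptions, not the WELFake setup.

```python
# Rough sketch of a TF-IDF + LASSO feature-selection pipeline on two made-up
# headlines; not the paper's WELFake experiment.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Lasso

texts  = ["store offers huge discount on phones",
          "miracle gadget cures everything overnight"]
labels = np.array([0, 1])                      # 0 = genuine, 1 = fake (toy labels)

tfidf = TfidfVectorizer(stop_words="english")  # TF-IDF feature extraction
X = tfidf.fit_transform(texts).toarray()

lasso = Lasso(alpha=0.01).fit(X, labels)       # L1 penalty drives weak features to 0
selected = np.flatnonzero(lasso.coef_)         # indices of surviving features
print([tfidf.get_feature_names_out()[i] for i in selected])
```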

Doi: https://doi.org/10.54216/JISIoT.150208

Vol. 15 Issue. 2 PP. 104-120, (2025)

Grasshopper-Inspired Deep Neural Network for Enhanced Breast Cancer Classification

Bhawna Utreja , Reecha Sharma , Amit Wason

Early-stage disease diagnosis is critical for effective treatment, and software-aided design can analyze disease architecture for timely detection. Many cases are not identified as severe before they become chronic, contributing to global mortality rates. Breast cancer, a leading cause of death among women, can be treated if detected early. Computer-aided diagnosis aids practitioners in accurately assessing disease criticality. This paper introduces an automated diagnosis system utilizing an enhanced Grasshopper Optimization technique and a Deep Neural Network (DNN) classifier. The Grasshopper Algorithm optimally selects features from segmented images, extracted through hybrid SIFT and BRISK techniques. The DNN classifies breast cancer using a dataset partitioned for training and testing. Performance metrics, including accuracy, precision, F-measure, and recall, demonstrate that the proposed system significantly outperforms existing methods, with an F-measure improvement of 5.1% and an accuracy increase of 11.19%.
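
For orientation, the snippet below shows how SIFT and BRISK keypoints and descriptors can be extracted with OpenCV; the random stand-in image and the simple pairing are only illustrative and do not reproduce the authors' segmentation and feature pipeline.

```python
# Illustrative SIFT + BRISK keypoint extraction with OpenCV on a synthetic image;
# not the authors' breast-cancer pipeline.
import cv2
import numpy as np

img = (np.random.rand(256, 256) * 255).astype(np.uint8)   # stand-in grayscale image

sift  = cv2.SIFT_create()
brisk = cv2.BRISK_create()

kp_s, des_s = sift.detectAndCompute(img, None)    # SIFT keypoints + 128-d descriptors
kp_b, des_b = brisk.detectAndCompute(img, None)   # BRISK keypoints + binary descriptors
print(len(kp_s), len(kp_b))
```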

Doi: https://doi.org/10.54216/JISIoT.150209

Vol. 15 Issue. 2 PP. 121-137, (2025)

Behavior of SPEA2 Algorithm to Resolve Scheduling Problem for IoT Cloud

Syed Mutiullah Hussaini , T. Abdul Razak , Muhammad Abid Jamil

The Strength Pareto Evolutionary Algorithm 2 (SPEA2) is a capable technique for managing multi-objective optimization problems. In IoT-cloud systems, this is particularly true for task scheduling. Task scheduling and efficient resource allocation are necessary to improve performance and service quality as the Internet of Things (IoT) grows. SPEA2, which is especially helpful for cloud computing frameworks, excels at handling competing goals, such as minimizing execution time while increasing resource utilization. The capacity of SPEA2 to maintain a large collection of solutions allows the exploration of various scheduling approaches in IoT-cloud scenarios, where tasks generated by several devices need to be handled effectively. In dynamic contexts where resource availability varies, this IoT-CS (IoT-Cloud_Scheduling) adaptability is essential. With SPEA2, researchers are able to create algorithms that optimize task scheduling while enhancing overall system responsiveness and dependability. The application of SPEA2 to scheduling problems in IoT-cloud infrastructures exemplifies how resource distribution and task prioritization difficulties can be managed. Thus, by guaranteeing that computing resources are used efficiently while respecting performance limitations, SPEA2 makes a substantial contribution to the development of intelligent scheduling solutions that satisfy the changing requirements of IoT applications.

Doi: https://doi.org/10.54216/JISIoT.150210

Vol. 15 Issue. 2 PP. 138-150, (2025)

Optimizing Heart Attack Predictions Models using Innovative Machine Learning Methods

Yerraginnela Shravani , Ashesh K.

Cardiopathy is a critical health issue worldwide, accounting for a significant number of fatalities each year. Early and precise prediction of heart-related conditions can substantially reduce mortality rates and improve healthcare outcomes. Although traditional machine learning models have been employed in this domain, their performance often falls short due to challenges like overfitting, limited scalability, and difficulty in capturing intricate, non-linear data patterns. This paper introduces an improved methodology for heart disease prediction that employs advanced machine learning techniques, including deep learning networks and ensemble methods, such as CNN and VGG16. Key components of the proposed framework include advanced data pre-processing methods for addressing class imbalance, sophisticated feature engineering driven by domain-specific insights, and comprehensive hyperparameter tuning for enhanced model performance. The results of this study reveal significant improvements in predictive accuracy and reliability compared to conventional methods, paving the way for better integration of predictive analytics in cardiovascular healthcare. Future research will focus on integrating dynamic patient data from wearable devices and broadening dataset diversity to enhance the generalizability and fairness of these predictive models.

Doi: https://doi.org/10.54216/JISIoT.150211

Vol. 15 Issue. 2 PP. 151-163, (2025)

Energy Efficiency and Practical Implications of IoT-Based Static vs. Single-Axis Solar Tracking Systems: A Comparative Analysis

Indra Kishor , Udit Mamodiya , Bright Keswani

The objective of this research is to offer a comparative evaluation of IoT-based static and single-axis solar tracking systems with respect to energy efficiency, economic viability, and implementation impediments, in order to fill gaps in the current literature on their performance comparison. In this work, IoT technology was used to monitor both systems in real time over a period of 30 days under comparable environmental conditions for data collection and analysis. The research also implements a Fuzzy Logic Controller-based algorithm, developed for the single-axis solar tracking system, that provides a dynamic and flexible mechanism to optimize solar energy capture: it intelligently adjusts the solar panel's angle based on real-time sensor data, ensuring that the panel is always positioned to maximize sunlight exposure. Data characteristics such as solar radiation, temperature, and voltage were monitored to help determine the energy output and overall efficiency of each system. The findings confirm that the IoT-based single-axis tracking system improved average system efficiency by 7% compared to the static system. However, the higher installation and maintenance costs of IoT-based single-axis systems increase complexity, posing challenges for mass adoption, particularly in small-scale applications. This paper demonstrates how IoT-enabled monitoring helps single-axis trackers achieve higher energy efficiency. This work will support decision making for future solar energy projects, where performance benefits must be balanced against cost and operational considerations. Studies have shown that the application of IoT technology enhances the efficiency and energy operational parameters of solar photovoltaic (PV) systems.
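
As a rough sketch of the control idea, the snippet below nudges a single-axis panel toward the brighter of two light sensors; the paper uses a Fuzzy Logic Controller with IoT telemetry, so the sensor readings, deadband, and step size here are invented purely for illustration.

```python
# Highly simplified single-axis tracking logic: rotate the panel toward the
# brighter of two light sensors. The paper's system uses a Fuzzy Logic Controller
# and IoT telemetry; this sketch only conveys the basic control idea, with
# readings and thresholds invented for illustration.
def adjust_angle(angle, ldr_east, ldr_west, step=2.0, deadband=50):
    """Return a new panel angle (degrees) given two light-sensor readings."""
    error = ldr_east - ldr_west          # positive -> sun is further east
    if abs(error) < deadband:            # close enough: hold position
        return angle
    direction = 1 if error > 0 else -1
    return max(-60.0, min(60.0, angle + direction * step))   # clamp to travel limits

angle = 0.0
for east, west in [(820, 600), (780, 640), (700, 690)]:      # fake sensor readings
    angle = adjust_angle(angle, east, west)
    print(f"panel angle: {angle:+.1f} deg")
```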

Doi: https://doi.org/10.54216/JISIoT.150212

Vol. 15 Issue. 2 PP. 164-182, (2025)

Securing IoT through Intrusion Detection Systems: An Overview

Razan Abdulhammed , Shaima Miqdad Mohamed Najeeb , Rabei Raad Ali , Mohammed Ahmed Jubair

The Internet of Things (IoT) has emerged as a new paradigm for integrating internet resources and physical objects. It provides a better standard of living in different domains, such as industrial processes, home automation, and environmental monitoring. The growth of the IoT is driven by the need to connect ever more devices via the Internet. However, wherever internet connectivity is involved, security poses an enormous challenge. Intrusion Detection Systems (IDS) can protect IoT devices by applying rules related to their operation. This paper reviews some of the mechanisms of IoT-related IDS, which protect IoT devices against various attacks. The paper includes a summary of recent developments in IDS against many security threats. A review is presented of various IDS designs developed in the last decade with different methods, ideas, and approaches, toward a better understanding of suitable IDS platforms that provide security against the global growth of attacks and intruders. It also examines the basics, types, and components of previously proposed IDS, and discusses the advantages and disadvantages of each. We organize the taxonomy of the investigated IDS approaches according to their detection methods. This work aims to provide a thorough summary of existing IDS designs and issues to empower research and development of IDS for the IoT.

Doi: https://doi.org/10.54216/JISIoT.150213

Vol. 15 Issue. 2 PP. 183-190, (2025)