Mobile health (mHealth) applications have revolutionized the healthcare sector by providing innovative solutions for patient monitoring, health tracking, and medical consultation. These applications leverage the widespread use of smartphones to deliver health services that are accessible, affordable, and efficient. Research indicates that mHealth technologies significantly improve healthcare service delivery processes, enhancing patient outcomes and healthcare management. Furthermore, the functionality of mobile apps in health interventions has been systematically reviewed, showing positive impacts on user engagement and behavior change. This study explores the development and implementation of a medical screening application for incoming university students using the Android platform. The application is designed to perform basic health check-ups, including monitoring and assessing general health status, and to provide recommendations for further medical consultation if necessary. The application includes several modules: blood test analysis, vision test, hearing test, and speech test. By leveraging advancements in mHealth technologies and artificial intelligence, the application offers a cost-effective and scalable solution for university health services. This paper highlights the potential benefits, challenges, and future implications of deploying mobile health screening applications in educational institutions.
DOI: https://doi.org/10.54216/JISIoT.170201
Vol. 17 Issue. 2 PP. 01-14, (2025)
The current landscape of assistive robotics in digital healthcare faces significant challenges, particularly in ubiquitous environments. Existing systems lack the necessary infrastructure to monitor and process data, hindering their effectiveness. Moreover, the arrangement and management of IoMT (Internet of Medical Things) data across various nodes present a new challenge, further complicating the deployment of assistive digital healthcare solutions. We propose a novel Assistive Robotics-Based Digital Healthcare System within a Ubiquitous IoMT Cloud network to address these challenges. This system supports various medical care applications, including digital wheelchair location tracking, artificial limbs, and remote surgical operations across different hospitals. Our contributions are as follows: We introduce the ARDTS (Assistive Robot Digital Healthcare Task Scheduling) algorithm to efficiently process data across multiple nodes, ensuring secure data handling based on the system's security protocols. We implement a convolutional neural network for data standardization, converting non-linear data into a linear form to predict relevant features accurately. We develop a socket-enabled cross-platform system to enhance interoperability for seamless data sharing and processing.
DOI: https://doi.org/10.54216/JISIoT.170202
Vol. 17 Issue. 2 PP. 15-22, (2025)
During infrastructure downtime, mobile devices depend on wireless communication to establish basic network connectivity. Nodes within these networks use routing protocols to relay data packets between one another until the packets reach their destination. These protocols have security weaknesses that permit malicious nodes to stage attacks on the network. The Black Hole Attack disrupts the network by intercepting data packets in transit and preventing them from reaching their destinations. Intrusion detection systems that identify the nodes executing these attacks protect against this security challenge. A simulated wireless ad-hoc network scenario is used to assess how well response systems counter the Black Hole attack. In this paper, the Anti-Black Hole Ad hoc On-Demand Distance Vector (ABAODV) protocol is proposed to combat the effects of the Black Hole attack. During the experiments, ABAODV (a modified AODV) and the standard AODV protocol were measured by throughput, Packet Delivery Fraction (PDF), Average End-to-End Delay (AED), and Normalized Routing Load (NRL), both under Black Hole attack and without it. Through its NS-2 implementation, ABAODV achieved 99% effectiveness in combating the Black Hole attack. The entire simulation, including mobility generation, NS-2 simulation, analysis, and results presentation, was conducted on a Linux platform.
DOI: https://doi.org/10.54216/JISIoT.170203
Vol. 17 Issue. 2 PP. 23-35, (2025)
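The four metrics named in the abstract above have standard definitions that are easy to compute from simulation counters. The sketch below shows them on made-up packet counts; it is a generic post-processing helper, not the authors' NS-2 scripts.

```python
# Hypothetical post-processing of NS-2 trace counters; field names and
# formulas are the standard metric definitions, not the authors' scripts.

def pdf(received_data_pkts, sent_data_pkts):
    """Packet Delivery Fraction: data packets delivered / data packets sent."""
    return received_data_pkts / sent_data_pkts

def avg_end_to_end_delay(delays_s):
    """Average End-to-End Delay over all delivered data packets (seconds)."""
    return sum(delays_s) / len(delays_s)

def nrl(routing_pkts, received_data_pkts):
    """Normalized Routing Load: routing packets per delivered data packet."""
    return routing_pkts / received_data_pkts

def throughput_kbps(bytes_received, sim_time_s):
    """Throughput in kbit/s over the simulation interval."""
    return bytes_received * 8 / 1000 / sim_time_s

# Example with made-up counts:
print(pdf(980, 1000), nrl(150, 980), throughput_kbps(500_000, 200))
```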
This study investigates the application of AI-powered predictive analytics in chronic disease management, focusing on identifying the most effective machine learning models for predicting patient risk and optimizing healthcare interventions. Five models were evaluated using a dataset of 10,000 patient records: Random Forest, Linear Regression, Support Vector Machines (SVM), K-Nearest Neighbors (KNN), and Gradient Boosting. The models were assessed based on their accuracy, interpretability, and clinical relevance. Gradient Boosting attained the highest predictive accuracy, with an AUC of 0.89. Random Forest followed closely with an AUC of 0.85, offering a good balance of accuracy and interpretability. Linear Regression, with an AUC of 0.75, demonstrated the trade-offs between simplicity and model performance, while SVM and KNN achieved AUCs of 0.82 and 0.78, respectively, with SVM being robust but facing scalability challenges and KNN being less practical for large datasets. These AI models improve patient outcomes, decrease healthcare costs, and optimize healthcare delivery.
DOI: https://doi.org/10.54216/JISIoT.170204
Vol. 17 Issue. 2 PP. 36-49, (2025)
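The comparison protocol described above is straightforward to reproduce in scikit-learn. The sketch below runs the same five model families on synthetic data (the 10,000-record patient dataset is not public here), using logistic regression as the usual classification stand-in for the linear model.

```python
# Minimal reproduction of the AUC-comparison protocol on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "GradientBoosting": GradientBoostingClassifier(),
    "RandomForest": RandomForestClassifier(),
    "LinearModel": LogisticRegression(max_iter=1000),  # linear baseline
    "SVM": SVC(probability=True),
    "KNN": KNeighborsClassifier(),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```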
One of the major concerns when exchanging emails is the potential influx of unsolicited and unwanted spam emails. These unwanted emails can clog inboxes, causing recipients to overlook important messages and opportunities. To ensure security and avoid the destructive and dangerous effects of these spam emails, machine learning and deep learning methods have been employed to design spam detection models. In this work, a combination of embedding models and multi-layer artificial neural networks as deep learning classification models is utilized to introduce an approach to spam detection. The proposed classifier leverages the Bidirectional Encoder Representations from Transformers (BERT) model for word embedding, applied to the Enron-Spam dataset, offering a noteworthy technique for spam detection. Experimental results demonstrate that the proposed spam detection model achieved a 99% recall rate for detecting spam emails. Notably, this model is a step forward in generality and in improving the efficiency of spam detection. It presents a solid attempt at a solution for detecting spam emails and fake text within communication environments.
DOI: https://doi.org/10.54216/JISIoT.170205
Vol. 17 Issue. 2 PP. 50-63, (2025)
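A minimal version of the embedding-plus-classifier pipeline can be sketched with Hugging Face Transformers, assuming the public bert-base-uncased checkpoint as a stand-in for the paper's BERT setup and a toy two-email corpus in place of Enron-Spam.

```python
# Sketch of the BERT-embedding + neural-network classification pipeline.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.neural_network import MLPClassifier

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    """Mean-pooled BERT token embeddings, one 768-d vector per email."""
    batch = tok(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = bert(**batch).last_hidden_state          # (B, T, 768)
    mask = batch["attention_mask"].unsqueeze(-1)          # ignore padding
    return ((hidden * mask).sum(1) / mask.sum(1)).numpy()

emails = ["Win a free prize now!!!", "Meeting moved to 3pm, see agenda"]
labels = [1, 0]                                           # 1 = spam
clf = MLPClassifier(hidden_layer_sizes=(64, 32)).fit(embed(emails), labels)
print(clf.predict(embed(["Claim your reward today"])))
```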
This study presents a predictive modeling framework for forecasting the E-Government Development Index (EGDI) using two advanced time series approaches: the Seasonal Auto-Regressive Integrated Moving Average with Exogenous Variables (SARIMAX) model and a hybrid ARIMA-LSTM model. We focus on two case studies, Iraq and Tunisia, based on monthly EGDI data from the United Nations Survey Reports spanning the years 2003 to 2024. Several preprocessing steps were applied, including handling missing data, testing for stationarity with combined ADF and KPSS tests, and determining the optimal ARIMA parameters through ACF and PACF analysis and auto-arima. Each model was built and trained using 80% of the data, while 20% was retained for testing. The independence of the residuals was verified using the Ljung-Box test. Four types of visualization and error analysis were applied: ACF/PACF of residuals, prediction error plots, error distribution plots (histogram + KDE), and decomposition analysis to visually assess model fit. Evaluation was conducted using multiple error metrics, including RMSE, MAE, MAPE, MHE, AIC, BIC, and MAPA. After building the four models, we ensured that the results and reconstructions were evaluated against the 12 tests mentioned above and that the chosen results were consensus-acceptable. The SARIMAX model demonstrated superior performance, achieving a Mean Absolute Percentage Accuracy (MAPA) of 98.35% for Iraq and 97.93% for Tunisia. In comparison, the hybrid ARIMA-LSTM model, which combines linear ARIMA outputs with nonlinear corrections from an LSTM neural network, demonstrated competitive predictive ability with a MAPA of 95.68% for Iraq and 96.14% for Tunisia. SARIMAX slightly outperformed the hybrid model in overall accuracy; on the other hand, the ARIMA-LSTM model demonstrated robustness in capturing complex nonlinear dynamics, particularly in the more structurally diverse Tunisian dataset. These results confirm the potential of both models as effective tools for predicting the EGDI and support their application in digital governance planning and policymaking. We recommend adopting our "12-Test Approach" evaluation framework as a standard methodology in future studies addressing time series analysis and forecasting, given its suitability for different types of time series models. This approach provides comprehensiveness, accuracy, and flexibility in evaluation, regardless of model type or application area.
DOI: https://doi.org/10.54216/JISIoT.170206
Vol. 17 Issue. 2 PP. 64-87, (2025)
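The stated workflow (stationarity tests, an 80/20 split, a SARIMAX fit, and a Ljung-Box residual check) condenses to a few statsmodels calls. The sketch below uses a synthetic monthly series as a stand-in for the EGDI data; the order and seasonal settings are illustrative, not the paper's fitted values.

```python
# Condensed stationarity-test / SARIMAX-fit / residual-check workflow.
import numpy as np, pandas as pd
from statsmodels.tsa.stattools import adfuller, kpss
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)
y = pd.Series(np.cumsum(rng.normal(0.01, 0.05, 264)))   # monthly EGDI stand-in

print("ADF p =", adfuller(y)[1], "| KPSS p =", kpss(y, nlags="auto")[1])

n_train = int(len(y) * 0.8)                              # 80/20 split
fit = SARIMAX(y[:n_train], order=(1, 1, 1),
              seasonal_order=(0, 0, 0, 12)).fit(disp=False)

lb_p = acorr_ljungbox(fit.resid, lags=[12])["lb_pvalue"].iloc[0]
print("Ljung-Box p =", lb_p)                             # > 0.05 => independent

forecast = fit.forecast(steps=len(y) - n_train)
mape = np.mean(np.abs((y[n_train:] - forecast) / y[n_train:])) * 100
print("MAPA =", 100 - mape)                              # accuracy = 100 - MAPE
```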
Recent advancements in biomedical data analysis have significantly transformed clinical decision-making. However, the inherent complexity and heterogeneity of healthcare data continue to present major challenges. Traditional deep learning models, while powerful, often lack transparency, limiting their adoption in clinical settings due to their "black-box" nature. To address this critical gap, this study introduces a novel Explainable Deep Learning (XDL) framework that integrates high predictive accuracy with interpretability, enabling clinicians to trust and validate AI-driven insights. The proposed framework leverages advanced interpretability techniques—such as Grad-CAM for visual attribution and SHAP for feature importance analysis—to analyze multimodal biomedical data, including clinical imaging, genomic sequencing, and electronic health records. Experimental evaluations across three benchmark datasets demonstrated the model’s strong performance, achieving an accuracy of 91%, sensitivity of 95.4%, specificity of 98.6%, and an AUC of 99%, while maintaining an interpretability score of 92% as rated by domain experts. Compared to non-explainable models, the proposed approach showed a 12.3% increase in interpretability and a 5.8% improvement in accuracy. Importantly, attention map analysis revealed alignment with clinically relevant biomarkers in 93% of cases and uncovered previously overlooked prognostic patterns in 18% of patient cohorts. These findings underscore the model’s potential to enhance diagnostic precision and support more informed clinical decisions. Moreover, the algorithm reduced diagnostic time by 23% due to its provision of actionable insights. The hybrid approach—combining built-in attention mechanisms with external interpretability tools—ensures seamless integration into clinical workflows while supporting compliance with regulatory standards for transparency.
DOI: https://doi.org/10.54216/JISIoT.170207
Vol. 17 Issue. 2 PP. 88-100, (2025)
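Of the two interpretability techniques named above, SHAP is the easier to demonstrate on tabular data. The sketch below ranks features of a gradient-boosted model by mean absolute SHAP value; it illustrates the attribution step only, not the authors' full multimodal XDL pipeline.

```python
# SHAP feature-importance ranking on a public tabular dataset.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.datasets import load_breast_cancer

data = load_breast_cancer()
model = GradientBoostingClassifier(random_state=0).fit(data.data, data.target)

# TreeExplainer yields one SHAP value per feature per sample (margin space)
shap_values = shap.TreeExplainer(model).shap_values(data.data[:100])

importance = np.abs(shap_values).mean(axis=0)   # global importance ranking
for i in importance.argsort()[::-1][:5]:
    print(data.feature_names[i], round(float(importance[i]), 4))
```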
The advancement of the Internet of Things (IoT) has created new security holes, which require intrusion detection systems to defend networks effectively. The complex structure of IoT networks causes traditional security methods to fail because they produce high numbers of incorrect detections and have limited ability to accurately identify threats. The authors introduce ID-ELC, an Ensemble Learning and Classification framework for Intrusion Detection, which aims to strengthen IoT environment security. The new ID-ELC model uses CS optimization with composite variance to choose network features that boost its detection capabilities. The cybersecurity evaluation of the system used the Kyoto network records, comprising 91,000 intrusion-prone records and 59,000 benign logs out of 150,000 total records. Experiments revealed that ID-ELC surpasses Statistical Flow Features (SFF) and Two-layer Dimension Reduction and Two-tier Classification (TDRTC), with precision 0.98, accuracy 0.98, sensitivity 0.99, and specificity 0.97. These evaluations confirm that ID-ELC is a flexible and resilient tool for IoT intrusion protection, with practical value for citywide security systems, medical networks, and manufacturing operations. Future investigation will concentrate on enhancing feature selection and classification methods to address rising cyber threats.
DOI: https://doi.org/10.54216/JISIoT.170208
Vol. 17 Issue. 2 PP. 101-118, (2025)
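The exact ID-ELC components are not reproduced here; the sketch below shows the generic ensemble-classification shape evaluated with the same four metrics, on a synthetic 150,000-flow dataset with roughly the stated benign/intrusive balance.

```python
# Generic soft-voting ensemble evaluated with precision/accuracy/
# sensitivity/specificity; a stand-in for the ID-ELC pipeline shape.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# ~39% benign / 61% intrusive, mirroring 59k vs. 91k of 150k records
X, y = make_classification(n_samples=150_000, weights=[0.39], random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

ens = VotingClassifier([
    ("rf", RandomForestClassifier(n_estimators=50)),
    ("lr", LogisticRegression(max_iter=500)),
    ("nb", GaussianNB()),
], voting="soft").fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, ens.predict(X_te)).ravel()
print("precision   =", tp / (tp + fp))
print("accuracy    =", (tp + tn) / (tp + tn + fp + fn))
print("sensitivity =", tp / (tp + fn))
print("specificity =", tn / (tn + fp))
```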
The motive of the current investigation is to design a computational artificial neural network procedure for the numerical outputs of the fractional-order (FO) lumpy skin disease model (LSDM), referred to as FO-LSDM. Stochastic performances using scaled conjugate gradient (SCGD) optimization have been implemented to obtain the solutions of the FO-LSDM. Solving the fractional-order form is considered more reliable than the integer-order form. The mathematical form of the LSDM is divided into two populations, cattle and vector, each split into susceptible and infected classes. A numerical Adam scheme is applied to assemble the dataset, reducing the mean squared error by splitting the statistics into validation, testing, and training portions of 13%, 12%, and 75%. The proposed stochastic neural network approach has a single hidden layer, thirty neurons, a sigmoid activation function, and SCGD-based optimization. The exactitude of the SCGD neural network is authenticated through result comparisons, with absolute errors reduced to around 10^-6 to 10^-8. Additionally, the correctness of the stochastic process based on the SCGD neural network is evaluated through state transitions, correlation values, and best-training performance.
DOI: https://doi.org/10.54216/JISIoT.170209
Vol. 17 Issue. 2 PP. 119-132, (2025)
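The stated network shape (a single hidden layer of thirty neurons with a sigmoid activation and a 75/13/12 train/validation/test split) can be sketched in scikit-learn. Since scikit-learn offers no scaled-conjugate-gradient solver, 'lbfgs' stands in for SCGD here, and a simple damped oscillation stands in for the Adam-scheme dataset.

```python
# Sketch of the stated network shape on synthetic solver data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

t = np.linspace(0, 10, 500).reshape(-1, 1)        # time grid
y = (np.exp(-0.3 * t) * np.cos(t)).ravel()        # stand-in for scheme output

# 75% train; the remaining 25% is split into 12% test and 13% validation
t_tr, t_rest, y_tr, y_rest = train_test_split(t, y, train_size=0.75,
                                              random_state=0)
t_te, t_val, y_te, y_val = train_test_split(t_rest, y_rest, test_size=0.52,
                                            random_state=0)

net = MLPRegressor(hidden_layer_sizes=(30,), activation="logistic",
                   solver="lbfgs", max_iter=5000).fit(t_tr, y_tr)
print("test MSE:", np.mean((net.predict(t_te) - y_te) ** 2))
```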
Wireless sensor networks (WSNs) are made up of thousands of sensor nodes distributed in an area where their energy is limited. To overcome the issue of energy consumption, this paper studies different deployment configurations and evaluates two different clustering-based routing protocols. This work describes a hybrid distance, energy, and zonal SEP (HDEZ-SEP), which combines the strengths of the Distance and Energy-Aware Stable Election Routing Protocol (DE-SEP) and the Zone-Based Stable Election Protocol (Z-SEP) to improve WSN energy efficiency and longevity. The proposed HDEZ-SEP was implemented and compared to other protocols, including DE-SEP and Z-SEP. Using the MATLAB R2022b simulator, we assess the suggested protocol and contrast it with the others. According to the simulation results, the overall performance is improved. This study shows how hybrid techniques can effectively optimize data transmission and energy use in WSNs.
DOI: https://doi.org/10.54216/JISIoT.170210
Vol. 17 Issue. 2 PP. 133-151, (2025)
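All SEP-family protocols build on a probabilistic cluster-head election threshold. The sketch below shows that generic building block with an illustrative residual-energy weighting; the specific distance and zone terms of HDEZ-SEP are not reproduced here.

```python
# Generic SEP-style cluster-head election; energy weighting is illustrative.
import random

def election_threshold(p, r):
    """T(n) for a node that has not been CH in the last 1/p rounds."""
    return p / (1 - p * (r % int(1 / p)))

def elect_cluster_heads(nodes, p=0.1, r=0):
    heads = []
    for node in nodes:                       # node carries residual energy
        if random.random() < election_threshold(p, r) * node["energy_frac"]:
            heads.append(node["id"])         # energy-weighted election chance
    return heads

nodes = [{"id": i, "energy_frac": random.uniform(0.2, 1.0)} for i in range(100)]
print("round-0 cluster heads:", elect_cluster_heads(nodes))
```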
Chronic Kidney Disease (CKD) is a global health concern that necessitates accurate and timely detection to improve patient outcomes and reduce healthcare costs. This study focuses on enhancing CKD classification using machine learning techniques, leveraging 400 instances with 25 clinical features to predict binary outcomes of CKD or non-CKD. The main objective is to improve detection accuracy by applying feature selection and model optimization. Standard machine learning models, including Multilayer Perceptron (MLP), Random Forest (RF), Support Vector Classifier (SVC), and K-Nearest Neighbors (KNN), were employed, with optimization achieved through binary optimization algorithms such as Greylag Goose Optimization (GGO), Particle Swarm Optimization (PSO), Bat Algorithm (BA), and Whale Optimization Algorithm (WOA), along with hyperparameter tuning using genetic algorithms and other metaheuristics. Results indicate significant improvements in classification performance after feature selection and optimization, with the GGO-optimized MLP model achieving an accuracy of 97.06%. The contributions of this paper include (i) benchmarking baseline models for CKD detection, (ii) a comprehensive analysis of feature selection strategies, (iii) optimization of machine learning models for CKD classification, and (iv) visualization of model performance to aid future research in healthcare machine learning applications.
DOI: https://doi.org/10.54216/JISIoT.170211
Vol. 17 Issue. 2 PP. 152-190, (2025)
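GGO, PSO, BA, and WOA all act here as wrapper-style optimizers over binary feature masks. The sketch below shows that shared wrapper structure with a simple bit-flip hill climber standing in for the metaheuristic, on a synthetic dataset matching the stated 400-instance, 25-feature shape.

```python
# Wrapper-style binary feature selection; a hill climber stands in for the
# GGO/PSO/BA/WOA metaheuristics over the same search space.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=400, n_features=25, random_state=2)

def fitness(mask):
    """Cross-validated accuracy of a classifier on the selected features."""
    if not mask.any():
        return 0.0
    return cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=5).mean()

rng = np.random.default_rng(2)
mask = rng.random(25) < 0.5                      # random initial feature subset
best = fitness(mask)
for _ in range(100):                             # flip one bit per iteration
    cand = mask.copy()
    cand[rng.integers(25)] ^= True
    score = fitness(cand)
    if score > best:
        mask, best = cand, score
print("selected", mask.sum(), "features, CV accuracy =", round(best, 4))
```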
Cement production is a major contributor to global CO2 emissions, posing a challenge for climate mitigation efforts. Accurate forecasting of these emissions is vital for guiding policy and industrial decarbonization. This study addresses the need for improved predictive frameworks by developing an optimized ensemble-based machine learning model for CO2 emissions forecasting. The model is trained on a corrected global cement emissions dataset and enhanced through hyperparameter tuning using ten metaheuristic algorithms. Among them, the Improved Henry's Optimization Algorithm (iHOW) achieved superior performance. The iHOW-optimized model attained an MSE of 1.21×10^-6 and an R² of 0.9657, improving over the best baseline model (Gradient Boosting: MSE = 0.0164, R² = 0.8621) by more than 99%. These results confirm the effectiveness of iHOW in producing accurate and reliable forecasts. The proposed framework offers strong potential for integration into carbon tracking systems and policy support tools.
DOI: https://doi.org/10.54216/JISIoT.170212
Vol. 17 Issue. 2 PP. 191-213, (2025)
The integration of Nursing Informatics (NI) and Artificial Intelligence (AI) promises to transform healthcare by improving clinical decisions, optimizing workflows, and personalizing patient care. However, most current systems fail to incorporate contextual reasoning, real-time adaptation, or ethical sensitivity, leading to fragmented support and increased cognitive burden on clinicians. To address these limitations, we propose NI-AIH—a hybrid clinical-AI framework built on a Context-Enriched Hierarchical Attention Network (CE-HAN). This deep architecture employs dual-attention mechanisms to interpret structured and unstructured clinical data—including EHR entries, nursing notes, and real-time IoT sensor feeds—capturing temporal patterns and contextual cues essential to patient status. The NI-AIH framework consists of four core components: a Clinical Context Engine (CCE) that uses CE-HAN for semantic modeling; a Predictive Care Optimizer (PCO) that applies risk-stratified deep ensembles; an Adaptive Interaction Layer (AIL) that enables seamless nurse–AI collaboration; and an Ethical Decision Integrator (EDI) that uses fuzzy logic to ensure real-time ethical alignment. In a trial deployment within a smart geriatric care unit, NI-AIH demonstrated a 23% improvement in early sepsis detection (p<0.01), a 31% reduction in clinician cognitive load (measured via NASA-TLX survey), and a 19% increase in workflow efficiency compared to conventional rule-based systems. By uniting clinical precision with ethical and context-aware intelligence, NI-AIH establishes a new paradigm for compassionate and effective AI-assisted healthcare.
DOI: https://doi.org/10.54216/JISIoT.170213
Vol. 17 Issue. 2 PP. 214-227, (2025)
Mobile Ad hoc Networks (MANETs) are emerging technologies used to transfer data across locations within both infrastructure-less and infrastructure-based network models. To ensure quality communication among mobile devices in various applications, an efficient routing model and an optimal data transfer path are essential, helping to reduce delay and power consumption during transmission. This article focuses on 'A hybrid routing and efficient mobility model with ant optimization' (HEMAOM). HEMAOM introduces a novel hybrid routing approach combined with an energy-efficient optimization model to lower power consumption and improve data transmission. Using an energy model, power usage during data transfer is minimized, boosting overall efficiency. Additionally, an optimization model is developed to identify the best path for data transfer between areas. These processes collectively decrease delay and power consumption, enhancing the communication performance of mobile devices. Compared to state-of-the-art methods like EOMFM, OLSRM, and MPOUA, HEMAOM shows superior performance in energy efficiency and data delivery. The model is implemented using NS3 software, considering parameters such as packet delivery ratio, network throughput, average delay, energy efficiency, and routing overhead.
DOI: https://doi.org/10.54216/JISIoT.170214
Vol. 17 Issue. 2 PP. 228-237, (2025)
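Ant-colony route selection, the core of HEMAOM's optimization step, rests on a probabilistic next-hop rule and pheromone reinforcement. The sketch below shows that mechanism on a toy four-node graph; HEMAOM's hybrid energy terms are not public, so edge cost here is plain hop distance.

```python
# Illustrative ant-colony route selection; cost is hop distance only.
import random

graph = {"A": {"B": 2, "C": 5}, "B": {"D": 3}, "C": {"D": 1}, "D": {}}
pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}

def choose_next(node, alpha=1.0, beta=2.0):
    """Pick a neighbor with probability ~ pheromone^alpha * (1/cost)^beta."""
    nbrs = list(graph[node])
    weights = [pheromone[(node, v)] ** alpha * (1 / graph[node][v]) ** beta
               for v in nbrs]
    return random.choices(nbrs, weights=weights)[0]

for _ in range(50):                          # release 50 ants from A to D
    path, node = ["A"], "A"
    while node != "D":
        node = choose_next(node)
        path.append(node)
    cost = sum(graph[path[i]][path[i + 1]] for i in range(len(path) - 1))
    for i in range(len(path) - 1):           # deposit pheromone ~ 1/path cost
        pheromone[(path[i], path[i + 1])] += 1 / cost
    for edge in pheromone:                   # evaporation
        pheromone[edge] *= 0.95

print("learned pheromone:", {e: round(p, 2) for e, p in pheromone.items()})
```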
An accurate diagnosis of Endometrial Cancer (EC) is crucial for gynecologists, as different types may require specific treatments. Radiomics, a quantitative method, can help analyze and quantify image heterogeneity, aiding in lesion diagnosis. Previous research introduced a Transformer-based Semantic-Aware U-Net with Deep Endometrial Cancer Prediction (TSA-UNet-DeepECP) to segment and classify EC stages in Magnetic Resonance Imaging (MRI) scans. However, the heterogeneous properties of input scans can affect the DeepECP model's performance. Hence, this study presents the TSA-UNet with an Improved DeepECP model (TSA-UNet-IDeepECP) for EC stage classification. The IDeepECP model incorporates a multi-view learning approach, combining local information from 2D MRI images with global information from 3D MRI images. First, the endometrium MRI scans are collected, augmented, and segmented using the TSA-UNet model. Two Deep Learning (DL) models, one for 2D and one for 3D data, are fed the segmented images. In contrast to the 3D-view model, which collects global information from 3D MRI images, the 2D-view model primarily recovers local features from 2D MRI data. The multi-view DeepECP model is trained using these combined features. A Fully Connected (FC) layer and a softmax classifier are used for classifying EC stages from the combined features. When compared to traditional models, the TSA-UNet-IDeepECP model achieves better performance in EC detection from MRI images.
DOI: https://doi.org/10.54216/JISIoT.170215
Vol. 17 Issue. 2 PP. 238-249, (2025)
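The multi-view fusion step reduces to concatenating the 2D-view and 3D-view feature vectors and passing them through an FC layer with softmax. The PyTorch sketch below shows that head in isolation; feature dimensions and the number of stages are illustrative assumptions.

```python
# Minimal multi-view fusion head: concat local 2D + global 3D features,
# then FC + softmax over EC stages.
import torch
import torch.nn as nn

class MultiViewFusionHead(nn.Module):
    def __init__(self, dim2d=256, dim3d=512, n_stages=4):
        super().__init__()
        self.fc = nn.Linear(dim2d + dim3d, n_stages)   # fully connected layer

    def forward(self, feat2d, feat3d):
        fused = torch.cat([feat2d, feat3d], dim=1)     # combine both views
        return torch.softmax(self.fc(fused), dim=1)    # stage probabilities

head = MultiViewFusionHead()
f2d = torch.randn(8, 256)    # local features from the 2D-view backbone
f3d = torch.randn(8, 512)    # global features from the 3D-view backbone
print(head(f2d, f3d).shape)  # (8, 4): per-scan probabilities over EC stages
```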
Accurate energy forecasting is vital for strategic planning, particularly in developing economies with rapidly evolving demand patterns. This study proposes a hybrid Artificial Neural Network (ANN) model optimized using a modified JAYA algorithm to forecast energy consumption in Oman. The JAYA algorithm's parameter-free, metaheuristic search improves ANN training by enhancing convergence speed and reducing the risk of local minima. Historical data from 2017–2021, comprising GDP, population, and oil and gas production, were used as inputs. Model performance was benchmarked against an ANN trained with the Artificial Bee Colony (ABC) algorithm using mean square error (MSE), mean absolute error (MAE), relative error (RE), and root mean square error (RMSE) as evaluation metrics. Results show that ANN–JAYA consistently outperformed ANN–ABC, achieving lower error rates and greater robustness. The proposed approach offers a reliable decision-support tool for policymakers and energy authorities, enabling more effective resource allocation and long-term planning. Future research will extend the framework to integrate renewable energy indicators and real-time data for adaptive, sustainable forecasting.
DOI: https://doi.org/10.54216/JISIoT.170216
Vol. 17 Issue. 2 PP. 250-259, (2025)
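The JAYA update rule is short enough to state in full: each candidate moves toward the current best solution and away from the worst, with no algorithm-specific tuning parameters. The sketch below runs it on a toy sphere objective; in the paper's setting the decision vector would be the ANN weights and the objective the training error.

```python
# The canonical JAYA update on a toy objective.
import numpy as np

def jaya(objective, dim, bounds, pop=20, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (pop, dim))
    for _ in range(iters):
        f = np.array([objective(x) for x in X])
        best, worst = X[f.argmin()], X[f.argmax()]
        r1, r2 = rng.random((pop, dim)), rng.random((pop, dim))
        # Move toward the best solution and away from the worst one
        X_new = np.clip(X + r1 * (best - np.abs(X))
                          - r2 * (worst - np.abs(X)), lo, hi)
        f_new = np.array([objective(x) for x in X_new])
        improved = f_new < f
        X[improved] = X_new[improved]        # greedy acceptance
    f = np.array([objective(x) for x in X])
    return X[f.argmin()], f.min()

x_best, f_best = jaya(lambda x: np.sum(x**2), dim=5, bounds=(-10, 10))
print("minimum found:", f_best)              # ANN weights would replace x here
```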
The generation of cryptographic keys from biometric traits offers a secure alternative to password-based authentication, but is hindered by challenges related to entropy, reproducibility, and adversarial resistance. This work presents a dual-path framework in which a Continuous Thinking Machine Model (CTMM) extracts multimodal embeddings from iris and fingerprint data. Feature vectors undergo projection through principal component analysis and graph-based distance encoding, followed by chaotic sequence modeling with Lorenz-like dynamics and an error-correcting routine to stabilize bitstreams. A secure mixing function consolidates the outputs, while SHA3-512 ensures deterministic expansion. Final passkeys are generated using the Kyber512 post-quantum key encapsulation mechanism (KEM), with neuro-symbolic reasoning applied as a validation layer to enforce entropy, avalanche properties, and inter-user separation. Evaluation confirmed compliance with NIST statistical tests, including monobit, runs, and longest-run assessments, while the system maintained a near-zero false acceptance rate. The originality of this work lies in combining CTMM-driven multimodal feature extraction with a quantum-safe cryptographic pipeline, augmented by neuro-symbolic validation, to establish a reproducible and secure method for biometric passkey generation in high-assurance authentication contexts.
DOI: https://doi.org/10.54216/JISIoT.170217
Vol. 17 Issue. 2 PP. 260-277, (2025)
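The monobit (frequency) test from NIST SP 800-22, cited in the evaluation above, follows directly from its definition. A minimal sketch, using Python's secrets module as a stand-in keystream source:

```python
# NIST SP 800-22 monobit (frequency) test, from its definition.
import math
import secrets

def monobit_p_value(bits):
    """P-value of the frequency test; a pass typically requires p >= 0.01."""
    s = sum(1 if b else -1 for b in bits)          # map 0/1 -> +1/-1 and sum
    s_obs = abs(s) / math.sqrt(len(bits))
    return math.erfc(s_obs / math.sqrt(2))

key_bits = [secrets.randbits(1) for _ in range(4096)]   # stand-in keystream
print("monobit p-value:", monobit_p_value(key_bits))
```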
Solar energy systems play a crucial role in fulfilling global energy needs sustainably; however, their performance is often affected by dynamic environmental factors. This study investigates the use of Artificial Intelligence (AI) for real-time optimization and adaptive control to improve the operational efficiency of solar energy systems. The research specifically addresses output variability arising from fluctuations in solar irradiance, temperature, and panel soiling, limitations that conventional control approaches fail to manage effectively. The primary goal is to develop intelligent AI-based models capable of predicting and automatically adjusting critical system parameters in real time, thereby reducing manual intervention and enhancing operational reliability. Data from a solar photovoltaic (PV) and thermal hybrid testbed in Jodhpur, India were collected over a six-month period. The Indian Meteorological Department provided more than 10000 hourly data samples that included weather and seasonal variations. An NI DAQ system with high-precision sensors was used to measure important parameters such as solar irradiance panel, and ambient temperatures wind speed inclination angle and energy output. For predictive control, the suggested methodology uses a hybrid ensemble framework that combines Extreme Gradient Boosting (XGBoost), Adaptive Neuro-Fuzzy Inference Systems (ANFIS), and Deep Neural Networks (DNN). In this framework, XGBoost carries out variable importance ranking to determine the dominant influencing factors ANFIS enables adaptive operational control and DNNs forecast energy output. In contrast to previous research that concentrated on distinct AI methods this work presents a cohesive hybrid approach that integrates feature significance analysis adaptive optimization and forecasting accuracy into a single system. The hybrid ensemble model outperforms individual approaches in achieving stable and effective energy generation according to evaluation using RMSE, R2, and MEF metrics. Furthermore, its compatibility with IoT-enabled edge devices underscores its potential for large-scale, real-time, and automated solar energy management within future smart grid infrastructures, advancing global efforts toward sustainable energy transitions.
DOI: https://doi.org/10.54216/JISIoT.170218
Vol. 17 Issue. 2 PP. 278-294, (2025)
Indoor transportation systems are a key area of development, where Automated Guided Vehicles (AGVs) help to increase efficiency and reduce labor costs. However, high-precision positioning technologies such as LiDAR and GNSS are expensive, making them unsuitable for widespread use. This research developed a low-cost positioning system for indoor AGVs using multiple sensors, including CCTV, UWB, inertial measurement units (IMUs), and encoders. The experiment was carried out under both static and dynamic conditions. In static tests, trilateration distance measurements show a lower positioning error than the triangulation method; in dynamic tests, the maximum error was 1.4464 m (x-axis) and 1.0464 m (y-axis). The integrated encoder and IMU sensor data yielded the lowest error (RMSE = 0.0732 m at 0.4 m/s and 0.0678 m at 0.27 m/s), followed by CCTV, while UWB had the highest error rate. The application of a Parallel Sensor Fusion architecture, optimized using a Generalized Reduced Gradient (GRG) nonlinear algorithm, significantly reduced localization errors: the RMSE values decreased to 0.0623 m (0.4 m/s) and 0.0411 m (0.27 m/s). The results, obtained in a controlled laboratory environment, indicate that combining multiple sensors improves positioning accuracy. Combining the encoder and IMU effectively reduces accumulated errors and increases system stability. Although sensor weights are adjusted offline, the proposed system offers a cost-effective positioning solution for indoor AGVs, contributing to the development of affordable and accurate AGV navigation systems.
DOI: https://doi.org/10.54216/JISIoT.170219
Vol. 17 Issue. 2 PP. 295-310, (2025)
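Trilateration from the UWB anchor ranges reduces to a small linear least-squares problem once the first range equation is subtracted from the others. A minimal sketch with made-up anchor positions:

```python
# Linearized least-squares trilateration from three (or more) anchors.
import numpy as np

def trilaterate(anchors, dists):
    """Solve for (x, y) by subtracting the first range equation from the rest."""
    (x0, y0), d0 = anchors[0], dists[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]
true_pos = np.array([4.0, 3.0])
dists = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(trilaterate(anchors, dists))    # ~ [4. 3.]
```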
The rapid advancement of telecommunication infrastructures and endpoint technologies has led to a significant incorporation of Internet of Things devices in modern lifestyles. IoT involves a wide range of applications, such as connected video surveillance systems for security, wearable body sensors for health monitoring, and temperature sensors for environmental control in agricultural fields. These devices are essential for gathering and transmitting data in real-time. However, data acquisition and transmission processes are often exposed to serious security threats, particularly concerning data integrity, user privacy, and communication reliability. Conventional security mechanisms are typically unsuitable for resource-constrained IoT devices. Thus, to overcome these challenges, extensive research has been devoted to developing secure communication frameworks, with a particular focus on robust authentication and key agreement (AKA) protocols. Authentication is essential to guarantee the legitimacy of the information source, and many proposed AKA schemes rely on asymmetric cryptographic techniques. In this paper, we introduce an Enhanced Lightweight Cryptography-based Authentication Protocol for IoT devices, designed to meet their computational constraints by employing simple XOR and hashing operations. The protocol enables mutual authentication between IoT devices and routers without the need to share credentials directly. Prior to authentication, an offline registration phase is conducted through an Authentication Server (AS), which generates unique key parameters based on the identifiers of the devices and routers. These parameters are securely distributed to both parties. Authentication is then performed using these pre-shared parameters in a computationally efficient yet secure manner that safeguards against common security threats. Theoretical analysis demonstrates that the proposed protocol is resistant to several common attacks, including man-in-the-middle, impersonation, session key disclosure, replay, and eavesdropping attacks. Additionally, the protocol ensures device anonymity and data privacy while maintaining lightweight performance suitable for constrained IoT environments.
DOI: https://doi.org/10.54216/JISIoT.170220
Vol. 17 Issue. 2 PP. 311-324, (2025)
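The general shape of such an XOR-and-hash mutual authentication exchange can be sketched in a few lines. The message layout and parameter names below are illustrative assumptions, not the paper's exact protocol; only XOR and hashing are used, matching the stated design constraint.

```python
# Illustrative XOR-and-hash mutual authentication; not the paper's protocol.
import hashlib, secrets

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"".join(parts)).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

K = secrets.token_bytes(32)          # pre-shared key from the offline AS phase

# Device -> router: nonce masked with a key-derived pad, plus a proof hash
n_d = secrets.token_bytes(32)
m1 = xor(n_d, h(K, b"pad1")), h(K, n_d)

# Router unmasks the nonce, verifies the proof, answers with its own proof
n_d_r = xor(m1[0], h(K, b"pad1"))
assert h(K, n_d_r) == m1[1]          # device is authentic
n_r = secrets.token_bytes(32)
m2 = xor(n_r, h(K, n_d_r)), h(K, n_d_r, n_r)

# Device verifies the router; both now derive the same session key
n_r_d = xor(m2[0], h(K, n_d))
assert h(K, n_d, n_r_d) == m2[1]     # router is authentic
session_key = h(K, n_d, n_r_d)
print("mutual authentication OK, session key:", session_key.hex()[:16], "...")
```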
This research automates the segmentation of brain tumors to enhance diagnosis and treatment planning. Existing techniques suffer from scale variation, redundant features, and high dimensionality, which cause ambiguous findings. We propose a model named MFWX, which unites a Multi-Scale CNN, Multi-Frequency Channel Attention (MFCA), and Weighted Particle Swarm Optimization (WPSO) for feature extraction with XGBoost for classification. The Multi-Scale CNN captures the structure of the tumor at multiple resolutions, MFCA refines the features by focusing on significant frequency zones, and WPSO eliminates redundancy to strengthen the predictions of XGBoost. MFWX attained 94.2% accuracy and a 92.5% Dice score on the BraTS-2020 dataset, surpassing ResNet50, EfficientNet-B7, and U-Net. It achieved an accuracy of 96.7% and a Dice score of 95.1% on BraTS-2018, and performed well across tumor classes. Ablation experiments proved the necessity of every component. In general, MFWX presents an efficient, clinically meaningful, scalable solution that outperforms current segmentation techniques.
DOI: https://doi.org/10.54216/JISIoT.170221
Vol. 17 Issue. 2 PP. 325-339, (2025)
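The Dice score reported above is computed directly from its set-overlap definition on binary masks:

```python
# Dice = 2|P ∩ T| / (|P| + |T|) for binary segmentation masks.
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-7) -> float:
    inter = np.logical_and(pred, truth).sum()
    return (2.0 * inter + eps) / (pred.sum() + truth.sum() + eps)

pred  = np.zeros((64, 64), bool); pred[20:40, 20:40] = True
truth = np.zeros((64, 64), bool); truth[22:42, 22:42] = True
print(round(dice(pred, truth), 3))   # overlap of two shifted squares
```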
Real-time Internet of Things communications depend on a secure stream of data. For such communications, a stream cipher that is simple and fast is appropriate. This paper presents the development and testing of a novel cryptographic algorithm aimed at enhancing encryption performance. It introduces a novel matrix-based nonlinear pseudorandom keystream generation method inspired by the fundamental recursive relationship of Reinforcement Learning, aiming to enhance diffusion and randomness in stream ciphers. We also incorporate an encryption approach based on the Counter-Based Transformation of Keystream Generation (CBTKSG) method to enhance speed; it is particularly well-suited to efficiently handling large files since it delivers fast throughput. The technique was thoroughly benchmarked against other well-known encryption schemes. Performance has significantly improved without sacrificing security, according to the data. The keystream output was run through the NIST SP 800-22 statistical test suite to verify its cryptographic strength; it passed every test with high p-values, indicating high-quality randomness. The cipher has a strong avalanche effect, meets standard security criteria such as IND-CPA and IND-CCA, and resists common cryptanalysis methods including related-key, differential, and linear attacks.
DOI: https://doi.org/10.54216/JISIoT.170222
Vol. 17 Issue. 2 PP. 340-359, (2025)
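A counter-based keystream generator has the general shape below; a hash of key-plus-counter is shown as a generic stand-in for the paper's CBTKSG construction, together with a quick avalanche check on a one-bit key change.

```python
# Counter-style keystream expansion (generic stand-in) + avalanche check.
import hashlib

def keystream(key: bytes, n_bytes: int) -> bytes:
    out, ctr = b"", 0
    while len(out) < n_bytes:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n_bytes]

def encrypt(key: bytes, msg: bytes) -> bytes:
    ks = keystream(key, len(msg))
    return bytes(m ^ k for m, k in zip(msg, ks))        # XOR stream cipher

key = b"0" * 32
flipped = bytes([key[0] ^ 1]) + key[1:]                 # flip one key bit
a, b = keystream(key, 64), keystream(flipped, 64)
diff = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
print("avalanche:", diff / (64 * 8))                    # ~0.5 is ideal
```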
When using mammography to diagnose breast cancer, segmenting medical scans is a crucial step. Accurate segmentation facilitates early diagnosis, which in turn makes it possible to administer individualized treatment plans, ultimately improving patient outcomes. However, for Deep Learning (DL) models to be trained efficiently and perform optimally, they require access to large datasets. A common flaw of many publicly available datasets is that they lack sufficient images to adequately train deep learning models. Therefore, this work examines the effects of various affine data augmentations on the Dice Score of a U-NET model utilizing a recently released public dataset of Contrast-Enhanced Spectral Mammography (CESM) images. The collection consists of 1003 CESM images and matching segmentation masks made by a certified radiologist. The objectives of the study are to modify certain model parameters on the CESM dataset and to investigate the impact of single and combined data augmentations on the model's overall performance. Images that were shifted in the x-direction and sheared vertically were used to train the best-performing model. On the test set, the model's Dice Score was 56.6%, which was 9% better than the baseline result and shows how crucial data augmentation is when working with small datasets.
DOI: https://doi.org/10.54216/JISIoT.170223
Vol. 17 Issue. 2 PP. 360-368, (2025)
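The two augmentations credited with the best result, an x-direction shift and a vertical shear, can be applied with scipy.ndimage; the shift and shear amounts below are illustrative choices, and the same transforms would be applied to the segmentation masks.

```python
# Affine augmentations named in the abstract: x-translation and vertical shear.
import numpy as np
from scipy.ndimage import affine_transform, shift

img = np.random.rand(128, 128)            # stand-in for a CESM image

shifted = shift(img, (0, 10), order=1)    # translate 10 px along x (axis 1)

# Vertical shear: row index gains a term proportional to the column index
shear = 0.2
matrix = np.array([[1.0, shear],
                   [0.0, 1.0]])
sheared = affine_transform(img, matrix, order=1)

print(shifted.shape, sheared.shape)       # same transforms apply to the masks
```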