Journal of Intelligent Systems and Internet of Things

Journal DOI

https://doi.org/10.54216/JISIoT

ISSN (Online): 2690-6791 | ISSN (Print): 2769-786X

Improving Data Aggregation Performance in Wireless Sensor Networks using Software-Defined Networks

Marwa K. Hasan

In this paper, we present a novel methodology for improving the performance of data aggregation operations in wireless sensor networks by applying software-defined networking technology on the SDN-WISE platform. The criteria for selecting the aggregation nodes at the controller were determined by adjusting the weights of Dijkstra's algorithm, and the aggregation nodes were identified from the paths chosen by the algorithm. The SDN-WISE platform supports reading the payload of a packet rather than just its header, allows one packet to be handled in dependence on another, and offers the flexibility to modify the routing tables to implement the rules required by the proposed aggregation algorithm. The results show a significant reduction in energy consumption after applying the proposed algorithm.
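The abstract's key mechanism, weight-adjusted Dijkstra routing that steers paths through aggregation-capable nodes, can be illustrated with a minimal sketch; the topology, link weights, and the 0.5 discount rule below are hypothetical, not the paper's actual parameters:

```python
import heapq

def dijkstra(graph, src, dst):
    """Shortest path with Dijkstra's algorithm; graph[u] = {v: weight}."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1], dist[dst]

def bias_towards(graph, agg_nodes, factor=0.5):
    """Discount links entering aggregation-capable nodes so the controller's
    shortest paths prefer routes through them (hypothetical weighting rule)."""
    return {u: {v: w * factor if v in agg_nodes else w for v, w in nbrs.items()}
            for u, nbrs in graph.items()}

# Toy topology: sensors A-D report to sink S; B can aggregate.
g = {"A": {"B": 1.0, "C": 1.0}, "B": {"S": 1.0}, "C": {"S": 1.0},
     "D": {"B": 1.0, "C": 1.0}, "S": {}}
path, cost = dijkstra(bias_towards(g, {"B"}), "A", "S")
```

With the discount applied, A reaches S through the aggregation node B (cost 1.5) rather than through C (cost 2.0), so traffic from multiple sensors converges where it can be combined.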

DOI: https://doi.org/10.54216/JISIoT.120201

Vol. 12, Issue 2, pp. 08-18 (2024)

Enhanced Heart Disease Prediction Using Machine Learning Techniques

Jata Shanker Mishra , N. K.Gupta , Aditi Sharma

This study leverages sophisticated machine learning methodologies, particularly XGBoost, to analyze cardiovascular diseases through cardiac datasets. The methodology encompasses meticulous data pre-processing, training of the XGBoost algorithm, and performance evaluation using metrics such as accuracy, precision, and ROC curves. This technique represents a notable progression in medical research, potentially leading to enhanced diagnostic precision and a deeper comprehension of cardiovascular ailments, thereby improving patient care and treatment modalities in cardiology. Furthermore, the research delves into the utilization of deep learning methodologies for the automated delineation of cardiac structures in MRI and mammography images, aiming to boost diagnostic precision and patient management. [24][3][5][6] In assessing the efficacy of machine learning algorithms in diagnosing cardiovascular diseases, this analysis underscores the pivotal role of such algorithms and their possible data inputs, and investigates promising directions for future exploration, such as the application of reinforcement learning. A significant aspect of our investigation is the development and deployment of sophisticated deep learning models for segmenting right ventricular images from cardiac MRI scans. Through the utilization of advanced techniques such as the Fourier Convolutional Neural Network (FCNN) and improved versions of Vanilla Convolutional Neural Networks (Vanilla-CNN) and Residual Networks (ResNet), we achieved a substantial improvement in accuracy and reliability, allowing more precise and quicker identification and diagnosis of cardiovascular diseases, which is of utmost importance in clinical practice. We also highlight the critical factors affecting the effectiveness of machine learning-based predictive models: data heterogeneity, depth, and breadth; the nature of the modeling task; and the choice of algorithms and feature selection methods. Recognizing and addressing these factors is essential for building reliable models.
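The evaluation metrics the study names (accuracy, precision, and the area under the ROC curve) can be computed directly from predictions and scores; the sketch below is model-agnostic, and the toy labels are purely illustrative:

```python
def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def precision(y_true, y_pred):
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    return tp / (tp + fp) if tp + fp else 0.0

def roc_auc(y_true, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity,
    with average ranks assigned to tied scores."""
    pairs = sorted(zip(scores, y_true))
    ranks, i = {}, 0
    while i < len(pairs):
        j = i
        while j < len(pairs) and pairs[j][0] == pairs[i][0]:
            j += 1
        for k in range(i, j):
            ranks[k] = (i + j + 1) / 2  # 1-based average rank of the tie group
        i = j
    n_pos = sum(y_true)
    n_neg = len(y_true) - n_pos
    rank_sum = sum(r for k, r in ranks.items() if pairs[k][1] == 1)
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

auc = roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])  # 0.75
```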

DOI: https://doi.org/10.54216/JISIoT.120202

Vol. 12, Issue 2, pp. 19-33 (2024)

Quantitative Approach for Anemia Detection Using Regression Analysis

Vinit P. Kharkar , Ajay P. Thakare

Anemia, generally defined as a deficiency of hemoglobin or red blood cells in the blood, is a significant global health concern in underdeveloped and developing nations, especially for children and young women in rural areas. This paper proposes a quantitative approach to anemia detection based on a regression analysis technique that predicts the hemoglobin level in the blood. To achieve this, an image dataset of microscopic blood samples was collected from 70 individuals; data collection requires a proper procedure, as it plays a vital part in system implementation. Statistical features, namely the mean pixel intensity values of the red, green, and blue color planes of the images, are given as input to the regression model. The proposed system employs a multiple regression analysis model, trained using a machine learning approach with both three and four regression coefficients, to establish the relation between the features obtained from blood samples and the hemoglobin level in the blood. Performance analysis shows promising results, with the coefficient of determination (R²) and root mean square error (RMSE) found to be 0.923 and 1.682, respectively. Overall, this paper presents a valuable system for anemia detection based on hemoglobin estimation, which can be deployed in areas with limited medical resources and offers another supportive technological solution to current healthcare problems.
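A single-feature analogue of the paper's regression model (one mean-intensity predictor instead of the three or four coefficients used in the study) shows how R² and RMSE are obtained; the intensity and hemoglobin numbers below are illustrative, not the study's data:

```python
def fit_ols(x, y):
    """Least-squares fit y ≈ b0 + b1*x: a one-predictor analogue of the
    paper's three/four-coefficient multiple regression."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    return my - b1 * mx, b1

def r2_rmse(x, y, b0, b1):
    """Coefficient of determination and root mean square error of the fit."""
    pred = [b0 + b1 * xi for xi in x]
    my = sum(y) / len(y)
    ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot, (ss_res / len(y)) ** 0.5

# Illustrative mean red-channel intensities vs. hemoglobin (g/dL)
x = [140.0, 150.0, 160.0, 170.0]
y = [9.0, 10.5, 12.0, 13.5]
b0, b1 = fit_ols(x, y)
r2, rmse = r2_rmse(x, y, b0, b1)
```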

DOI: https://doi.org/10.54216/JISIoT.120203

Vol. 12, Issue 2, pp. 34-43 (2024)

Comparative Analysis of ML-Based Outlier Detection Techniques for IoT-Based Smart Energy Management Systems

Parh Yong Wong , Nayef A. M. Alduais , Nurul Aswa Omar , Salama A. Mostafa , Abdul-Malik H. Y. Saad , Antar Shaddad H. Abdul-Qawy , Abdullah B. Nasser , Waheed Ali H. M. Ghanem

With the development and advancement of ICST, data-driven technologies such as the Internet of Things (IoT) and smart technologies, including Smart Energy Management Systems (SEMS), have become a trend in many regions around the globe. There is no doubt that data quality and data quality problems are among the most vital topics to be addressed for a successful application of IoT-based SEMS. Poor data in such major yet delicate systems will affect the quality of life (QoL) of millions, and can even cause destruction and disruption to a country. This paper aims to tackle this problem by searching for suitable outlier detection techniques among the many developed ML-based outlier detection methods. Three methods are chosen and analyzed for their performance: the K-Nearest Neighbour (KNN) + Mahalanobis Distance (MD), Minimum Covariance Determinant (MCD), and Local Outlier Factor (LOF) models. Three sensor-collected datasets related to SEMS, with different data types, are used in this research; they are pre-processed and split into training and testing datasets with manually injected outliers. The training datasets are used to learn the patterns of the data, and the trained models are then tested on the testing datasets, using the learned patterns to identify and label the outliers. All the models can accurately identify the outliers, with average accuracies over 95%. However, the average execution time varies: the KNN+MD model has the longest average execution time at 12.99 seconds, MCD achieves 3.98 seconds, and the LOF model, at 0.60 seconds, is the shortest of the three.
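Of the three compared detectors, the KNN-based one is the simplest to sketch. Below is a plain distance-based score, a simplified stand-in for the paper's KNN + Mahalanobis model; the toy readings and the 3×-mean flagging threshold are hypothetical:

```python
def knn_outlier_scores(points, k=2):
    """Mean Euclidean distance to the k nearest neighbours; large scores
    suggest outliers (a simplified stand-in for KNN + Mahalanobis)."""
    scores = []
    for i, p in enumerate(points):
        dists = sorted(sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
                       for j, q in enumerate(points) if j != i)
        scores.append(sum(dists[:k]) / k)
    return scores

# Toy (temperature, humidity) readings with one injected outlier
readings = [(20.1, 50.0), (20.3, 50.2), (19.9, 49.8), (35.0, 80.0)]
scores = knn_outlier_scores(readings)
mean_score = sum(scores) / len(scores)
outliers = [i for i, s in enumerate(scores) if s > 3 * mean_score]
```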

DOI: https://doi.org/10.54216/JISIoT.120204

Vol. 12, Issue 2, pp. 44-64 (2024)

Sustainable Waste Management through ML-based Real-Time Trash Bin Prediction

Sandeep Kumar , Vikrant Shokeen , Amit Sharma , Prabhat K. Srivastava , Upasana Dugal , Aditi Sharma

Waste management has been an issue, owing to low public awareness in every country, leading to major environmental contamination, tragic accidents, and unfavorable working conditions for landfill workers. The lack of precise and efficient object detection is a barrier to the growth of computer-vision-based systems. As per the latest research articles, pre-trained models can be used for trash bin detection in real time and for recommending appropriate actions after detection. Using a unique validation dataset made up of predicted trash items, two established classes of object identification models, YOLO (You Only Look Once) and SSD (Single Shot Multibox Detector), are contrasted. Based on several performance metrics computed using multiple open-source research projects, it is concluded that SSD performs noticeably better than YOLO in identifying trash objects. The model, pre-trained on Microsoft's COCO (Common Objects in Context) dataset, is then built up to recognize several trash object types. Our initiative intends to enhance sustainable waste management, make trash sorting incredibly simple, and guard against serious illnesses and accidents at landfill and garbage disposal sites.

DOI: https://doi.org/10.54216/JISIoT.120205

Vol. 12, Issue 2, pp. 65-74 (2024)

Securing Pervasive Computing Networks: Enhancing Network Security via Network Virtualization in Wireless Communications Infrastructure

Ali Kadhim Nsaif

The seamless integration of computing technology into everyday items and environments is known as pervasive computing. To protect against cyber threats and vulnerabilities, robust security mechanisms are necessary. Conventional security measures, including gateways and encryption, remain vital but may not be sufficient to address the unique challenges encountered in ubiquitous computing systems: the variety of devices, resource limitations, mobility needs, and the possibility of large-scale distributed attacks. Network virtualization, which abstracts and isolates network resources and functions, is a promising way to increase security in pervasive computing deployments. Wireless communication plays a significant part in the development of a digital infrastructure that is both resilient and trustworthy. Virtualization enables dynamic resource allocation, isolation, and management of network bandwidth, which leads to the proposal of the Secure Wireless Virtual Resource Allocation and Authentication Algorithm (SWVRA3) to abstract the network's physical resources into virtualized entities. By using network virtualization, pervasive computing applications and services can be secured with logically segregated virtual networks, and this separation reduces cross-contamination and security breaches. Furthermore, network virtualization allows flexible configuration, dynamic allocation of resources, and centralized virtual control, which improves threat incident response, policy enforcement, and security surveillance.

DOI: https://doi.org/10.54216/JISIoT.120206

Vol. 12, Issue 2, pp. 75-88 (2024)

An IoT Device-Level Vulnerability Control Model Through Federated Detection

Umar Audi Isma’ila , Kamaluddeen Usman Danyaro , Mohd Fadzil Hassan , Aminu Aminu Muazu , M. S. Liew

In the rapidly expanding Internet of Things (IoT) landscape, the security of IoT devices is a major concern. The challenge lies in the lack of intrusion detection system (IDS) models for these devices, due to resource limitations that result in a single point of failure, delayed threat detection, and privacy issues when IDS processing is centralized. To address this, a LiteDLVC model is proposed in this paper, employing a multi-layer perceptron (MLP) in a federated learning (FL) approach to minimize the vulnerabilities in IoT systems. This model manages smaller datasets from individual devices, reducing processing time and optimizing computing resources. Importantly, in the event of an attack, the LiteDLVC model targets only the compromised device, protecting the FL aggregator and the other IoT devices. The model's evaluation using the BoT-IoT dataset on TensorFlow Federated (TFF) demonstrates high accuracy and strong performance: with the full feature subset it reached 99.99% accuracy on the test set and an average of 1.11 s in detecting bot attacks through federated detection, while the 10-best feature subset achieved 99.99% test accuracy with an average detection time of 1.14 s. Notably, this highlights that the LiteDLVC model can potentially secure IoT devices at the device level very efficiently. To improve global model convergence, we are currently exploring the use of a genetic algorithm, which could lead to better performance on diverse IoT data distributions and increased overall efficiency in FL scenarios with non-IID data.
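The federated detection setup can be sketched with the standard FedAvg aggregation step, in which the aggregator averages device-side model parameters weighted by local dataset size; this is a generic FL sketch, not the LiteDLVC implementation, and the parameter values are illustrative:

```python
def fed_avg(client_weights, client_sizes):
    """FedAvg-style aggregation: average per-device model parameters,
    weighted by each device's local dataset size."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [sum(w[i] * s for w, s in zip(client_weights, client_sizes)) / total
            for i in range(n_params)]

# Two devices train locally; the aggregator sees parameters, never raw traffic.
global_w = fed_avg([[1.0, 2.0], [3.0, 4.0]], client_sizes=[1, 3])  # [2.5, 3.5]
```

The privacy property the abstract relies on follows from this shape: only model parameters leave a device, so a compromised node can be excluded from the average without exposing any raw data.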

DOI: https://doi.org/10.54216/JISIoT.120207

Vol. 12, Issue 2, pp. 89-98 (2024)

HDRA: A Hybrid Data Reduction and Routing Algorithm

M. K. Hussein , Ion Marghescu , Nayef A. M. Alduais

Presently, wireless sensor networks (WSNs) are emerging as a vibrant field of research due to various challenging aspects such as energy consumption, routing strategies, effectiveness, among others. Despite unresolved issues within WSNs, a substantial array of applications has already been developed. For any application design, a primary objective is to optimize the WSN in terms of its lifecycle and functionality. Recent studies on data reduction methods have shown that sensor nodes often transmit data directly (single hop) to the base station (BS). However, a significant concern is that most existing multi-hop routing protocols do not address data reduction before forwarding data to the BS. Consequently, this study introduces a Hybrid Data Reduction and Routing Algorithm (HDRA). The principal aim of HDRA is to prolong the lifespan of cluster-based WSNs. It strives to decrease the packet transmission by sensor nodes, especially when there's minimal change in sensor readings. The findings indicate that HDRA outperforms the LEACH protocol in terms of energy efficiency in sensor networks, irrespective of network type (T, H, or TH) or deployment scenarios (200x200m or 400x400m). Overall, the proposed algorithm enhances network performance by conserving energy and extending network lifespan.
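The "minimal change" transmission rule at the heart of the data reduction stage can be sketched as follows; the 0.5-unit threshold and the toy readings are hypothetical:

```python
def reduce_readings(readings, threshold=0.5):
    """Transmit a reading only when it differs from the last transmitted value
    by more than `threshold`; otherwise the sink reuses the previous value.
    The 0.5-unit threshold is hypothetical, not the paper's setting."""
    sent, last = [readings[0]], readings[0]  # first reading always goes out
    for r in readings[1:]:
        if abs(r - last) > threshold:
            sent.append(r)
            last = r
    return sent

samples = [21.0, 21.1, 21.2, 23.0, 23.1, 21.0]
sent = reduce_readings(samples)  # 6 readings -> 3 transmissions
```

In a cluster-based topology this filter runs on each sensor node before packets are forwarded along the multi-hop route, which is where the energy saving over plain LEACH-style forwarding comes from.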

DOI: https://doi.org/10.54216/JISIoT.120208

Vol. 12, Issue 2, pp. 99-121 (2024)

Multi-Objective Evolutionary Algorithm to Optimize an IoT-Based Scheduling Problem Using the NSGA-II Algorithm

Syed Mutiullah Hussaini , T. Abdul Razak , Muhammad Abid Jamil

Due to the continual advancements in the Internet of Things (IoT), which generates enormous volumes of data, the cloud computing infrastructure has recently gained great significance in meeting the demands made by the network of IoT devices. The planned fog computing paradigm is anticipated to constitute the next development in cloud computing. The optimal distribution of computing capacity, to reduce processing times and operating costs, is one of the tasks that fog computing confronts. In the IoT, fog computing is a decentralized computing approach that moves data storage and processing closer to the network's edge. This research article discusses a unique technique for lowering operating expenses and improving work scheduling in a cloud-fog environment: the non-dominated sorting genetic algorithm II (NSGA-II), presented here to allocate service requests with the multi-objective of minimising finishing time and running cost. The fundamental objective of the Pareto NSGA-II is to determine the Pareto front associated with a group of ideal solutions, sometimes referred to as non-dominated solutions or Pareto sets. There is a trade-off between the two objectives, which is shown by the Pareto set of sub-optimal solutions that results from the bi-objective problem.
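NSGA-II's central notion, the non-dominated (Pareto) front over the two objectives of finishing time and running cost, can be sketched as follows; the candidate schedules are illustrative:

```python
def pareto_front(solutions):
    """Non-dominated subset for two minimisation objectives
    (finishing time, running cost) — the core test inside NSGA-II."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# (finishing time, running cost) of candidate request allocations
cands = [(10, 5), (8, 7), (12, 4), (9, 9), (8, 6)]
front = pareto_front(cands)  # (8, 7) and (9, 9) are dominated by (8, 6)
```

No point on the resulting front can improve one objective without worsening the other, which is exactly the trade-off the abstract describes.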

DOI: https://doi.org/10.54216/JISIoT.120209

Vol. 12, Issue 2, pp. 122-137 (2024)

A Hybrid Logistic Scroll Chaotic Encryption Algorithm for Ensuring Cloud Security by Countering Attacks

Madireddy Swetha , Kalaivani Kathirvelu

Cloud computing is meant for storing huge amounts of data with a third party that ensures confidential data cannot be accessed by other users. But with the rapid growth of technologies, the data in the cloud continually increases, which raises questions about the security of storing it there. Hence, protecting cloud data demands strong security levels to counter the different cloud attacks. In order to achieve the highest level of security for cloud data, this study suggests a powerful encryption technique that combines chaotic scrolls and logistic maps. The proposed model exhibits the following advantages over other algorithms: 1) highly dynamic key generation; 2) the ability to counter multiple attacks; 3) high randomness of the encrypted data, which provides more confusion against hacking from the intruder's side. To prove the strength of the proposed model, the National Institute of Standards and Technology (NIST) test suite is used for significant experiments, in which different statistical tests were carried out. The level of security of the suggested model is also evaluated and investigated through formal analysis using Burrows-Abadi-Needham (BAN) logic. The given model is thoroughly verified using both the ProVerif tool and AVISPA. In terms of communication costs and unpredictability, the suggested model's randomness has also been contrasted with that of other existing algorithms. The results demonstrate that the proposed model provides more potent protection for cloud data than the other existing encryption algorithms.
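The logistic-map half of the hybrid scheme can be illustrated with a minimal chaotic keystream cipher; the scroll component and the paper's actual key schedule are omitted, and the x0 and r values below are arbitrary stand-ins for a real key:

```python
def logistic_keystream(x0, r, n):
    """Byte keystream from the logistic map x -> r*x*(1-x). The paper's
    hybrid scheme additionally mixes in chaotic scrolls, omitted here."""
    x, ks = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        ks.append(int(x * 256) % 256)
    return ks

def xor_cipher(data, x0=0.3141, r=3.99):
    """XOR with the chaotic keystream; applying it twice decrypts.
    x0 and r stand in for a real key schedule."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

ct = xor_cipher(b"cloud record")
```

The sensitivity of the map to x0 and r is what gives such schemes their dynamic key generation: a tiny change in either parameter yields an entirely different keystream.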

DOI: https://doi.org/10.54216/JISIoT.120210

Vol. 12, Issue 2, pp. 138-149 (2024)

Filtering Big Data with Optimized Hybrid Algorithm for IoT-Based Data Selection

Sarvesh Kumar , Satyajee Srivastava , Surendra Kumar , Arun Kumar Saini , Neeraj Verma , Dhiraj Kapila

Data management across servers has grown problematic because of technological advancements in data processing and storage capacities. Data that is neither organized nor labelled adds an additional layer of difficulty to the storing and retrieving processes; such untagged data requires analytic techniques that are more powerful and time efficient. Clustering has long been regarded as one of the most effective methods for managing large amounts of data; nonetheless, larger volumes can lead to unexpectedly poor accuracy when using conventional clustering methodologies. In this study, we suggest a novel framework for the clustering of large amounts of data. The preprocessing stage is one of the most important parts of the data cleansing process; hence, a global stop-word list is used to filter the contents of the files before sending them on to the cluster distribution stage. A meta-heuristic-based Genetic Algorithm (GA) is utilized to eliminate the redundant information present in the datasets. In addition to the generalized fitness function, an innovative attribute-based fitness function (f) is developed. To determine how well the proposed method performs, it is compared to a variety of alternative clustering approaches. When comparing the cluster distributions for evaluation, the Standard Error (SE), root mean squared error (RMSE), and adjusted R-squared error are computed.
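The preprocessing stage described above, stop-word filtering followed by redundancy removal, can be sketched as follows; the stop-word list is illustrative, and exact-duplicate removal is a much simpler stand-in for the paper's GA-based approach:

```python
STOP_WORDS = {"the", "a", "an", "is", "of", "and", "to", "in"}  # illustrative list

def preprocess(document):
    """Filter a document against the global stop-word list before it moves
    on to the cluster-distribution stage."""
    return [t for t in document.lower().split() if t not in STOP_WORDS]

def dedupe(tokens):
    """Drop exact duplicates, preserving order — a simpler stand-in for
    the paper's GA-driven redundancy removal."""
    seen, out = set(), []
    for t in tokens:
        if t not in seen:
            seen.add(t)
            out.append(t)
    return out

tokens = preprocess("The sensor is a source of unlabelled data in the cloud")
```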

DOI: https://doi.org/10.54216/JISIoT.120211

Vol. 12, Issue 2, pp. 150-165 (2024)

Development of an Approach for Image Forgery Detection Using Machine Learning Algorithms

Ahmed K. Jawad Alataby

Digital image fraud detection is an increasing societal necessity, given the importance of verified images; it includes the detection of image copying, splicing, retouching, and re-sampling forgeries. In the absence of digital signatures or watermarks, passive image authentication may serve as an alternative to active authentication. Passive techniques, often recognized as blind techniques, can operate without prior knowledge of the image or a reference. Identifying counterfeit or tampered images has been a research field for a long period of time, driven by the Internet, online platforms, social messaging platforms, and extensive digital image usage. Among existing methods, the rate of failure is a key factor for examining image alteration or forgery. The research applies six common machine learning algorithms to features extracted from Lightweight, Spatial Exploitation, and Residual deep learning models on the benchmark datasets MICC-F220, Columbia, and CoMoFoD. The incorporated deep learning models consist of AlexNet, GoogleNet, VGG16, VGG19, SqueezeNet, MobileNetV2, ShuffleNet, ResNet-18, ResNet-50, and ResNet-101 for spatial exploitation. Fine-tuning is applied to the top three deep learning models, optimizing hyperparameters based on performance indicators for every benchmark dataset. Fine-tuned SqueezeNet, MobileNetV2, and ShuffleNet models with the SGDM optimizer and an SVM classifier yielded the best results for the MICC-F220 dataset. Fine-tuned VGG19, MobileNetV2, and ResNet-50 models with the SGDM optimizer and an SVM classifier yielded the best results for the Columbia dataset. For the CoMoFoD dataset, fine-tuned AlexNet, MobileNetV2, and ShuffleNet models with the SGDM optimizer and an SVM classifier yielded the best results. The proposed approach, utilizing machine learning algorithms and deep learning features, enhanced forgery detection and reduced false positives. The results were validated on benchmark image forgery datasets and compared to current methods.

DOI: https://doi.org/10.54216/JISIoT.120212

Vol. 12, Issue 2, pp. 166-177 (2024)

Evaluating the Potential of Mesh Networks in Enhancing Rural Connectivity based on the Internet of Things

Neelima Gurrapu , Akhil Nair R. , C. Laxmikanth Reddy , V. V. J. Rama Krishnaiah , S. Shiek Aalam , Kancharla Suresh

Rural communities struggle to connect to the internet, a phenomenon known as the "digital divide." Mesh networks, with improved access in rural regions, might help to tackle this problem. From a social, economic, and scientific standpoint, this study investigated whether mesh networks may improve rural connectivity. The project developed and implemented methodologies to assess community participation, cost, and network coverage, and compared them against five well-known methods. Locals carried out a mesh node placement project in a rural location with diverse topography. In terms of network coverage, the Network Coverage Assessment revealed that the proposed approach frequently outperformed the most recent approaches; finding the ideal locations for mesh nodes helped to tackle challenges in rural regions. After putting the strategy into effect, the Cost-Effectiveness Analysis revealed a positive ROI, whereas many alternative options appeared unprofitable. On the Community Engagement Index, the recommended method performed better than the others; involving individuals from the local community in network activities helps to foster ownership and shared accountability.

DOI: https://doi.org/10.54216/JISIoT.120213

Vol. 12, Issue 2, pp. 178-186 (2024)

Investigating the Impact of Compressed Sensing Techniques and IoT in Medical Imaging

Suresh Kumar Mandala , Shahnaz K. V. , Chopparapu Gowthami , S. Shiek Aalam , B. Laxmi Kantha , K. Chandran

This research paper examines the impact of compressed sensing on medical imaging. Compressed sensing, inspired by mathematics and signal processing, promises to make future image capture radically different. The paper focuses on adaptive random sampling (ARS), iterative shrinkage-thresholding algorithms (ISTA), and temporal compressed sensing (TCS). These approaches were rigorously tested using MRIs, X-rays, and dynamic imaging patterns, with scan time, image quality, and dynamic imaging capability as the main test criteria. The techniques considerably reduced scan time, demonstrating their potential to speed up imaging procedures. The reconstructed images had higher signal-to-noise ratios (SNRs) and structural similarity indices (SSIs) than those obtained using standard techniques, indicating greater accuracy. The TCS algorithm's dynamic imaging capability, especially evident in cardiac and musculoskeletal imaging, eliminated motion defects while exhibiting real-time physiological changes. The study was expanded to incorporate customized treatment, and the recommended procedures demonstrated remarkable adaptability to each patient's demands. This adaptability fits current medical treatments, making such imaging technologies viable.

DOI: https://doi.org/10.54216/JISIoT.120214

Vol. 12, Issue 2, pp. 187-197 (2024)

Strategizing IoT Network Layer Security Through Advanced Intrusion Detection Systems and AI-Driven Threat Analysis

Deepak Dasaratha Rao , Akhilesh A. Waoo , Murlidhar Prasad Singh , Piyush Kumar Pareek , Shoaib Kamal , Shraddha V. Pandit

This research introduces an algorithmic framework for enhancing the security of Internet of Things (IoT) networks. The Enhanced Anomaly Detection (EAD) algorithm initiates the process by detecting anomalies in real-time IoT data, serving as the foundational layer. The Behavior Analysis for Profiling (BAP) algorithm builds upon EAD, adding behavior analysis for profiling and adaptive identification of abnormal behavior. Signature-Based Detection (SBD) relies on pre-identified attack signatures, supporting the detection of known attacks and providing proactive defense against documented threats. The Machine Learning-Based Intrusion Detection (MLID) algorithm uses trained machine learning models to detect anomalies and adapt to changing security risks. The Real-Time Threat Intelligence Integration (RTI) algorithm integrates updated threat intelligence feeds, which improves the framework's responsiveness to emerging threats. Visual representations further illustrate the framework's accuracy, integration, applicability, and overall security effectiveness. The research delivers a solution that proves to be an intelligent and responsive way of guarding IoT networks, mitigating and even countering known and potential threats in real time.
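The EAD layer's real-time anomaly detection can be illustrated with a simple trailing-window z-score rule; this is a generic stand-in for the paper's algorithm, with hypothetical window and threshold parameters:

```python
def zscore_anomalies(stream, window=5, thresh=3.0):
    """Flag a reading when it deviates from the trailing-window mean by more
    than `thresh` standard deviations — a minimal stand-in for the EAD layer
    (window and threshold values are hypothetical)."""
    flags = []
    for i, x in enumerate(stream):
        hist = stream[max(0, i - window):i]
        if len(hist) < window:
            flags.append(False)  # not enough history yet
            continue
        mu = sum(hist) / len(hist)
        sd = (sum((h - mu) ** 2 for h in hist) / len(hist)) ** 0.5
        flags.append(sd > 0 and abs(x - mu) > thresh * sd)
    return flags

# Packets-per-second samples with one traffic spike
traffic = [10, 11, 9, 10, 11, 10, 95, 10]
flags = zscore_anomalies(traffic)
```

In the layered framework, readings flagged here would be handed to the BAP and SBD stages for profiling and signature matching rather than acted on directly.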

DOI: https://doi.org/10.54216/JISIoT.120215

Vol. 12, Issue 2, pp. 195-207 (2024)