Journal of Intelligent Systems and Internet of Things

Journal DOI: https://doi.org/10.54216/JISIoT

ISSN (Online): 2690-6791 | ISSN (Print): 2769-786X

Digital Forensic Based Object Recognition for Enhanced Crime Scene Interpretation

Vikash Kumar Singh , Durga Sivashankar , Siddharth Sriram , Manish Nagpal , Warish Patel , Shweta Loonkar

This research introduces a novel and comprehensive framework for digital forensics-based crime scene interpretation. The proposed framework comprises five algorithms, each serving a distinct purpose: enhancing image quality, extracting and matching features, constructing a database, recognizing and reconstructing objects in 3D, and conducting context-aware analysis. An ablation study validates the necessity of each algorithmic step. The framework consistently outperforms existing methods in terms of accuracy, precision, recall, and processing time. A detailed comparative analysis of parameters further highlights its cost-effectiveness, moderate complexity, superior data integration, and scalability. Visualizations underscore its dominance across multiple metrics and parameters, positioning it as an advanced solution for digital forensic-based object recognition in crime scene interpretation.

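The abstract names the pipeline stages but no implementation. As a purely illustrative sketch of the enhance-extract-match portion, here is how such a stage is commonly built with OpenCV; file names and parameters are hypothetical and do not reflect the authors' algorithms:

```python
# Minimal sketch of an enhance -> extract -> match stage using OpenCV.
# Illustrates the genre of pipeline the abstract describes, not the paper's method.
import cv2

def enhance(path):
    """Load a scene image in grayscale and boost contrast."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return cv2.equalizeHist(gray)  # simple quality-enhancement step

scene = enhance("crime_scene.png")      # hypothetical input image
reference = enhance("evidence_db.png")  # hypothetical database entry

orb = cv2.ORB_create(nfeatures=500)     # feature extraction
kp1, des1 = orb.detectAndCompute(scene, None)
kp2, des2 = orb.detectAndCompute(reference, None)

# Hamming-distance brute-force matching suits ORB's binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} candidate matches against the database entry")
```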
Doi: https://doi.org/10.54216/JISIoT.130201

Vol. 13 Issue. 2 PP. 08-24, (2024)

Comprehensive Analysis of Implementation and Evaluation IoT based Techniques in Networked Security Systems

Raenu Kolandaisamy , Suhas Gupta , Shashikant Patil , Jaymeel Shah , Abhinav Mishra , N. Gobi

This research introduces an advanced network security methodology based on IoT, combining five innovative algorithms: Dynamic Threat Detection (DTD), Adaptive Intrusion Prevention System (AIPS), Anomaly-Based Security Metrics (ABSM), Context-Aware Firewall (CAF), and Cognitive Security Assessment (CSA). Each algorithm contributes specific functionalities, ranging from real-time threat detection and adaptive policy adjustments to anomaly quantification, contextual rule modifications, and holistic security risk assessments. The ablation study conducted on each algorithm reveals critical components driving their performance, ensuring a deep understanding of their inner workings. The proposed method demonstrates superior performance in accuracy, scalability, usability, and adaptability compared to existing network security methods. Visual representations and a comprehensive evaluation further validate the proposed method's effectiveness, positioning it as an advanced and efficient solution for addressing evolving network security challenges.

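None of the five algorithms is specified beyond its name; as a toy illustration of the kind of anomaly quantification ABSM implies, a sliding-baseline z-score detector (entirely an assumption, not the paper's formula) might look like:

```python
# Toy anomaly quantification: flag traffic samples that deviate from a
# baseline window by more than `threshold` standard deviations.
import numpy as np

def anomaly_scores(traffic, window=50, threshold=3.0):
    traffic = np.asarray(traffic, dtype=float)
    baseline = traffic[:window]                     # assumed-normal period
    mu, sigma = baseline.mean(), baseline.std() + 1e-9
    z = np.abs((traffic - mu) / sigma)              # deviation in std units
    return z, z > threshold

rng = np.random.default_rng(0)
flow = np.concatenate([rng.normal(100, 5, 200), [180, 175]])  # two spikes
scores, flags = anomaly_scores(flow)
print("anomalous indices:", np.flatnonzero(flags))
```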
Doi: https://doi.org/10.54216/JISIoT.130203

Vol. 13 Issue. 2 PP. 35-51, (2024)

Machine Learning and Internet of Things Driven Energy Optimization in Wireless Sensor Networks through Crossbreed Clustering

Ahmed Saeed Alabed , Rajesh Kumar Samala , Asha KS , Sorabh Sharma , Amit barve , Deepak Minhas

Key challenges in Wireless Sensor Networks (WSNs) include latency reduction, energy efficiency, coverage concerns, and network lifetime. Solving the issues of energy efficiency and network longevity requires further study of cluster-based WSNs, and creative approaches are needed to address their challenges and constraints. WSNs adopt machine-learning techniques because of their unique characteristics: high communication costs, low energy reserves, high mobility, and frequent topological shifts. The current method picks cluster heads (CHs) at random at the beginning of each cycle without considering the remaining energy of these nodes, so the newly chosen CH nodes may have the lowest energy level in the network and die off quickly as a result. Energy is also wasted on long-distance communication between cluster heads and the base station (BS), which occurs frequently in large IoT-based networks and limits the lifespan of WSNs. To increase the network's longevity and efficiency, we therefore propose a machine-learning-based strategy called the energy-proficient crossbreed clustering methodology (ECCM). The experimental results reveal that the ECCM is superior to the LEACH approach, increasing residual energy by 35%, extending network lifetime by 37%, and increasing throughput by 15%.

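For orientation, the contrast the abstract draws can be made concrete: LEACH's threshold-based random election versus an election scaled by residual energy. The LEACH threshold is the standard published formula; the energy-weighted variant below only illustrates the idea and is not the published ECCM rule:

```python
# LEACH's random cluster-head (CH) election vs. an energy-aware variant.
import random

P = 0.05  # desired fraction of nodes acting as CH per round

def leach_is_ch(round_no, eligible):
    """Classic LEACH: elect CHs at random, ignoring residual energy.
    T(n) = P / (1 - P * (r mod 1/P)) for eligible nodes, else 0."""
    if not eligible:
        return False
    t = P / (1 - P * (round_no % int(1 / P)))
    return random.random() < t

def energy_aware_is_ch(round_no, eligible, residual, mean_residual):
    """Scale the election threshold by residual energy so depleted nodes
    rarely become CHs and do not die off quickly (the flaw LEACH has)."""
    if not eligible:
        return False
    t = P / (1 - P * (round_no % int(1 / P)))
    return random.random() < t * (residual / (mean_residual + 1e-9))
```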
Doi: https://doi.org/10.54216/JISIoT.130204

Vol. 13 Issue. 2 PP. 52-59, (2024)

Emerging Trends: Nano-Scale Wireless Sensor Networks and Applications

Julissa E. Reyna-Gonzalez DRA , N. K. Rayaguru , Gowrishankar J. , Bhargavi Gaurav Deshpande , Madhur Grover , Daxa Vekariya

The proposed Adaptive Nano-Scale Sensor Network (ANSN) can rapidly sense nanoscale surroundings. ANSN uses data from many scenarios to improve networking, consume less energy, and obtain more accurate data. The essentials of ANSN are covered in detail here. The framework has several components, including improving quality of service, collecting data with less energy, transmitting data with Q-learning, fusing sensor data to increase accuracy, controlling power dynamically, and protecting data with AES encryption. Energy harvesting and sensor utilization are central to this effort. Evaluation shows that ANSN outperforms other techniques in several areas, with improvements in speed, security, latency, sensor accuracy, and network stability. With these improvements, ANSN may be well suited to small-scale wireless sensor networks.

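The abstract says data transmission decisions use Q-learning; a minimal tabular update of the standard form, with hypothetical states and actions, looks like:

```python
# Minimal tabular Q-learning update of the kind ANSN applies to data
# transmission decisions; the state/action space here is hypothetical.
import numpy as np

n_states, n_actions = 8, 3         # e.g., buffer levels x {sleep, sense, send}
Q = np.zeros((n_states, n_actions))
alpha, gamma = 0.1, 0.9            # learning rate, discount factor

def q_update(s, a, reward, s_next):
    """Standard Bellman update: nudge Q(s, a) toward the bootstrapped target."""
    target = reward + gamma * Q[s_next].max()
    Q[s, a] += alpha * (target - Q[s, a])

# Example step: in state 2 the node chose "send" (action 2), paid an energy
# cost but delivered data, and moved to state 1.
q_update(s=2, a=2, reward=0.8, s_next=1)
```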
Doi: https://doi.org/10.54216/JISIoT.130202

Vol. 13 Issue. 2 PP. 25-34, (2024)

Intelligent Integration of Wearable Sensors and Artificial Intelligence for Real-time Athletic Performance Enhancement

Prabhat Kr. Srivastava , Ram Kinkar Pandey , Gaurav Kumar Srivastava , Nishant Anand , Kunchanapalli Rama Krishna , Prateek Singhal , Aditi Sharma

The amalgamation of wearable sensor technologies and artificial intelligence (AI) presents a transformative paradigm for optimising athletic performance in real time. This paper explores the integration of cutting-edge sensors - including bioimpedance sensors, accelerometers, and gyroscopes - with advanced AI algorithms such as machine learning and decision support systems. By capturing diverse physiological, biomechanical, and environmental data, the proposed framework aims to offer personalized, actionable insights for athletes. This research synthesizes the current landscape of wearable sensor technology in sports and highlights the evolving role of AI in interpreting data for enhancing athletic performance. It delineates an innovative framework designed for real-time analysis, personalized feedback, and training optimization. The seamless interaction between sensors and AI models empowers athletes and coaches to make informed decisions, optimizing training regimens and minimizing injury risks. The paper discusses the practical implications, challenges, and ethical considerations associated with this integration, emphasizing its potential benefits in diverse sports disciplines. Results from real-world trials underscore the efficacy of the proposed framework in providing dynamic guidance to athletes, thereby augmenting their performance through tailored interventions.

Doi: https://doi.org/10.54216/JISIoT.130205

Vol. 13 Issue. 2 PP. 60-77, (2024)

Optimized LoRaWAN Architectures: Enhancing Energy Efficiency and Long-Range Connectivity in IoT Networks for Sustainable, Low-Power Solutions and Future Integrations with Edge Computing and 5G

Nishant Anand , Pritee Parwekar , Aditi Sharma

The Internet of Things (IoT) has expanded rapidly, allowing networks of sensors and gadgets to collect and share information that makes people's lives easier and more convenient. As the IoT grows, however, energy efficiency becomes a major issue, especially for portable and wireless devices. Low-power, long-range communication capabilities are needed, and the Long-Range Wide Area Network (LoRaWAN) has emerged as a viable solution to meet this need. This study provides an in-depth analysis of a LoRaWAN-based, low-power Internet of Things. The suggested network architecture is optimized for low power consumption and high connectivity across numerous IoT use cases. This low-power IoT network relies on LoRaWAN gateways, end devices, and a server to function; LoRaWAN enables the low-power, long-range transmission of data packets. The results show that the optimized case achieves a delivery ratio of 0.85 versus 0.73 for the non-optimized case as the number of nodes grows from 100 to 500. LoRaWAN significantly reduces energy usage compared to conventional IoT connectivity alternatives, making it a strong option for battery-powered devices in remote or resource-limited locations. Finally, the adoption of LoRaWAN provides a viable solution to the energy efficiency concerns in IoT networks, allowing sustainable, long-lasting IoT installations and enabling a wide variety of new applications within the IoT ecosystem. The study also addresses potential future applications of this technology, including upgrades and integration with other technologies such as edge computing and 5G networks.

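The energy and connectivity trade-offs discussed here are governed by LoRa's time-on-air. The sketch below implements the standard Semtech airtime formula; parameter values are illustrative, not the paper's configuration:

```python
# Time-on-air for a LoRa uplink, per the Semtech SX127x datasheet formula.
from math import ceil

def lora_airtime(payload_bytes, sf=7, bw=125_000, cr=1,
                 preamble=8, explicit_header=True, low_dr_opt=False):
    """Return packet time-on-air in seconds."""
    t_sym = (2 ** sf) / bw                       # symbol duration
    t_preamble = (preamble + 4.25) * t_sym
    ih = 0 if explicit_header else 1             # implicit-header flag
    de = 1 if low_dr_opt else 0                  # low-data-rate optimization
    num = 8 * payload_bytes - 4 * sf + 28 + 16 - 20 * ih
    n_payload = 8 + max(ceil(num / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return t_preamble + n_payload * t_sym

# SF7 vs SF12 for a 20-byte reading: longer range costs far more airtime.
print(f"SF7:  {lora_airtime(20, sf=7) * 1e3:.1f} ms")
print(f"SF12: {lora_airtime(20, sf=12, low_dr_opt=True) * 1e3:.1f} ms")
```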
Doi: https://doi.org/10.54216/JISIoT.130206

Vol. 13 Issue. 2 PP. 78-90, (2024)

An Operative IoT Grounded AEEBLR (Ant-Founded Efficient Energy and Balanced Load Routing) Method for Path Conjunction in Mobile Ad Hoc Networks Approach

Safaa H. OBAIDI al-Khafaji , Julissa E. Reyna-Gonzalez DRA , Sukhman Ghumman , Hannah Jessie Rani R. , Raj Kumar , Shikhar Gupta

A mobile ad hoc network (MANET) is a constantly evolving, decentralized, multi-hop wireless network architecture. MANETs can function in many contexts where conventional networks cannot, which makes them well-suited to a wide variety of applications, including military and commercial use as well as disaster management, rescue operations, and defense. Energy conservation is a standard indicator of overall network lifetime in mobile ad hoc networks that run on rechargeable or replaceable batteries, because usage, battery power consumption in relation to transmission range, the type of application running on each device, location, and other influences all play a part in determining the network's lifetime. An earlier study used ant colony optimization, a form of swarm intelligence inspired by the foraging behavior of ant colonies, to find the best possible route. Current MANET routing systems face difficulties in load balancing and energy efficiency that must be overcome if optimal path convergence is to be achieved. The IoT-based AEEBLR method is recommended for deciding on the next-hop node: latency, energy consumption, congestion, and connection quality are all taken into account before making a final decision, and these metrics determine the likelihood of selecting a neighbor node as the next hop. This next-hop probability governs how forward and backward ant agents move, paving the way for the creation of many paths, from which the most effective can be chosen for transmission. The implementation results show that the suggested AEEBLR technique outperforms the existing AESR approach as the number of packets, the number of nodes, and the mobility of nodes are varied.

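As a sketch of the next-hop selection idea, the metrics listed (latency, energy, congestion, link quality) can be folded into a generic ACO choice rule; the exact AEEBLR weighting is not given in the abstract, so the formula below is an assumption:

```python
# Generic ACO-style next-hop selection from pheromone and link metrics:
# p_j is proportional to tau_j^alpha * eta_j^beta.
import numpy as np

def next_hop_probabilities(pheromone, latency, energy, congestion, lq,
                           alpha=1.0, beta=2.0):
    """Combine link metrics into a heuristic (higher = better) and mix it
    with pheromone the usual ACO way."""
    eta = (energy * lq) / (latency * (1.0 + congestion))
    score = pheromone ** alpha * eta ** beta
    return score / score.sum()

# Three candidate neighbours with per-link metrics (hypothetical units).
p = next_hop_probabilities(
    pheromone=np.array([0.5, 0.8, 0.3]),
    latency=np.array([10.0, 25.0, 8.0]),      # ms
    energy=np.array([0.9, 0.4, 0.7]),         # residual fraction
    congestion=np.array([0.1, 0.6, 0.2]),
    lq=np.array([0.95, 0.80, 0.90]))
print("forward-ant choice distribution:", p.round(3))
```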
Doi: https://doi.org/10.54216/JISIoT.130207

Vol. 13 Issue. 2 PP. 91-101, (2024)

Artificial Intelligence-Enabled Muscular Movement Analysis in Wireless Body Area Networks for IoT based Fitness Assessment

Jameela Ali Alkrimi , Sulabh Mahajan , A. Mohamed Jaffer , Sudhanshu Dev , Akshay Kumar V. , Jaymeel Shah

The game's physical and physiological stakes are equal for all players: both the chaser and the defender rely on physical and physiological capacity. Whether for a chaser or a defender, male or female, the physiological processes that occur during physical activity have a positive effect on the body and on the personality. A Wireless Body Area Network (WBAN) is a network that can carry real-time traffic such as data, speech, and video to monitor the state of essential organ functions while remaining external to the body. The present research provides a clear evaluation of how different bones and muscles function, along with metabolism, movement regulation, and energy generation under varying environmental conditions. There are physiological differences between a chaser and a defender. The primary goal is to gain an in-depth, IoT-based understanding of how several physiological variables, such as resting heart rate, maximum heart rate, aerobic capacity, and the regulation and maintenance of red blood cells and haemoglobin, are affected by skeletal muscle contraction. Analysis based on artificial intelligence showed that defenders with high speed, agility, and flexibility performed better in the pre-test. Physiological variables have a considerable impact on speed, strength, agility, and flexibility tests.

Doi: https://doi.org/10.54216/JISIoT.130208

Vol. 13 Issue. 2 PP. 102-112, (2024)

Advances in Wearable Sensors for Real-Time Internet of things based Biomechanical Analysis in High-Performance Sports

Vilas Alagdeve , Ranjan K. Pradhan , R. Manikandan , P. Sivaraman , Sarihaddu Kavitha , Shaeen Kalathil

Interest in wearable technology and the need for eco-friendly solutions have spurred new methodologies. This research examines how sophisticated deep learning and biomimetic designs benefit each other, with results that may reshape smart technology. The introduction highlights the global appeal of wearable technology and the importance of environmental considerations in design. Deep learning and biomimicry form a fresh and promising combination that can increase the accuracy, energy efficiency, and nature-inspired character of smart devices. This project integrates biomimetic design elements with deep learning methods: biomimicry shapes the design and functioning of the wearable, while deep learning techniques based on artificial neural networks improve user adaptability and predictive analytics. A controlled experiment allows a thorough examination of several datasets designed to cover a wide range of biomimetic settings and user behaviours. The data show that the proposed technique beats alternatives across several performance parameters. Integrating biomimetic principles with deep learning boosts accuracy, demonstrating the system's reliability, and the biomimetic method is eco-friendly because energy efficiency grows dramatically. Biological-mimicry indicators show that the suggested strategy resembles natural systems. This exploratory method advances sustainable technology: integrating biomimicry and deep learning enhances device performance while meeting environmental standards. The research emphasizes the transformational power of nature-friendly technology and helps ensure that upcoming wearable technologies are both cutting-edge and ecologically beneficial. The convergence of deep learning and biomimetic design marks a tipping point in sustainable technology and a move toward an eco-friendly future.

Doi: https://doi.org/10.54216/JISIoT.130209

Vol. 13 Issue. 2 PP. 113-128, (2024)

Energy saving of cluster computing by CPU frequency Tuning using genetic algorithm

Zainab A. Abdulazeez , Nihad Abduljalil , Ahmed B. M. Fanfakh , Ali Kadhum M. Al-Qurabat , Esraa H. Alwan

Dynamic voltage and frequency scaling (DVFS) is a technique used primarily to decrease a processor's energy consumption by lowering its operating frequency. Its main downside is that it degrades the performance of parallel applications running on parallel platforms. In this work, a genetic algorithm is implemented on a heterogeneous cluster architecture to model the best trade-off between energy saving and the performance degradation of parallel applications. The proposed algorithm selects the frequency vector that best achieves this compromise: its objective function simultaneously limits energy consumption and minimizes the decrease in performance. All experiments use the SimGrid simulator. The suggested algorithm saves 20% of energy on average while limiting the application performance degradation to 0.15%.

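A minimal sketch of the optimization idea, assuming a cubic dynamic-power model and an arbitrary penalty weight; the paper's exact objective and the SimGrid-based evaluation are not reproduced here:

```python
# Toy GA over per-node frequency vectors: reward energy saving, penalize
# the slowdown of the slowest node (parallel apps wait for the slowest).
import random

FREQS = [1.2, 1.6, 2.0, 2.4]           # available frequencies (GHz), f_max last

def fitness(freq_vector):
    f_max = FREQS[-1]
    saving = sum(1 - (f / f_max) ** 3 for f in freq_vector) / len(freq_vector)
    slowdown = max(f_max / f - 1 for f in freq_vector)
    return saving - 2.0 * slowdown     # weight 2.0 is an arbitrary choice

def evolve(n_nodes=8, pop=30, gens=50):
    population = [[random.choice(FREQS) for _ in range(n_nodes)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop // 2]          # elitist selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_nodes)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:            # mutation
                child[random.randrange(n_nodes)] = random.choice(FREQS)
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

print("best frequency vector:", evolve())
```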
Doi: https://doi.org/10.54216/JISIoT.130210

Vol. 13 Issue. 2 PP. 129-140, (2024)

An effective Web System for Weather monitoring using Artificial Neural Network Based on Internet of Things and Cloud Computing

Ebtehal Akeel Hamed , Nahla Ibraheem Jabbar , Zaid Th Hassan , Refed Adnan Jaleel

This research presents an effective web system for weather monitoring based on the Internet of Things (IoT) and cloud computing. The three primary parts of the system are a web application, a cloud platform, and IoT-based weather stations. The IoT-based weather stations periodically gather meteorological data and send it to the cloud platform, which stores the data in a database accessible to the ANN model. The ANN model uses the meteorological data to produce predictions for several weather factors, and the web application gives users access to real-time weather data and forecasts.

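As a stand-in for the ANN forecaster (its architecture is unspecified in the abstract), a small scikit-learn regressor over synthetic station readings shows the shape of the pipeline; the real system would pull rows from the cloud database:

```python
# Minimal MLP regressor mapping station readings to a next-hour temperature.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Features: [temperature, humidity, pressure] at time t (synthetic data).
X = rng.normal([25.0, 60.0, 1012.0], [5.0, 15.0, 6.0], size=(500, 3))
y = 0.9 * X[:, 0] - 0.02 * X[:, 1] + rng.normal(0, 0.5, 500)  # target temp

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                     random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out readings: {model.score(X_te, y_te):.3f}")
```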
Doi: https://doi.org/10.54216/JISIoT.130211

Vol. 13 Issue. 2 PP. 141-149, (2024)

Software Testing Using Cuckoo Search Algorithm with Machine Learning Techniques

Deepashree N , M. Sahina Parveen

Software defects are errors, flaws, bugs, mistakes, or failures in a piece of software that might cause the programme to produce incorrect or unexpected results. Defects almost always increase both the time and money needed to finish a project, and finding and fixing bugs is a laborious and expensive process in itself. While it is unrealistic to expect to eradicate all defects from a project, their severity may be mitigated. Software defect prediction (SDP) makes it possible to predict where bugs may appear in software, which is an important part of fixing it; the goal of each software development project should be a bug-free product, and high-calibre software should have few bugs. A software metric is a quantitative or qualitative evaluation of some aspect of the programme or its requirements. Cuckoo Search (CS), one of the more recent population-based algorithms, was inspired by the brood parasitism of some cuckoo species together with the Lévy flight patterns of birds and fruit flies, and CS meets the requirements for global convergence. KNN is a significant non-parametric supervised learning technique. This paper also presents an overview of Stochastic Diffusion Search (SDS) in the form of a social metaphor that illustrates how SDS allocates resources. SDS addresses best-fit pattern identification and matching problems using a novel probabilistic method: as a multi-agent, population-based global search and optimization method, it is a distributed model of computation that relies on interaction among simple agents. Unlike many nature-inspired search algorithms, the behaviour of SDS is described within a rigorous mathematical framework covering its resource allocation, convergence to the global optimum, resilience, minimum convergence criterion, and linear time complexity. This paper proposes a hybrid optimization strategy based on CS-SDS techniques: the hybridization uses the global search capability of the SDS algorithm to enhance the cuckoo's search for the optimum host nest, so the SDS method places the cuckoo egg in the most advantageous location. In the evaluation, PC2's improved performance over other classifiers is attributed to its higher recall values, and the KNN performs 7.64% and 2.20% better than the Naive Bayes and Radial Basis Neural Network classifiers, respectively.

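The cuckoo move at the heart of CS is a Lévy-flight step, commonly generated with Mantegna's algorithm. The sketch below shows that step, with the SDS-side agent interaction indicated only in a comment since the abstract does not specify it:

```python
# Lévy-flight step generation via Mantegna's algorithm (beta ~ 1.5).
import math
import numpy as np

def levy_step(dim, beta=1.5, rng=np.random.default_rng(0)):
    """Draw a heavy-tailed step: mostly small moves, occasional long jumps."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta *
              2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

# A cuckoo proposes a new nest (candidate solution vector); in the hybrid,
# SDS-style agent communication would then steer which nests get explored.
nest = np.array([0.4, 0.7, 0.1])
candidate = nest + 0.01 * levy_step(nest.size)
print("proposed nest:", candidate.round(3))
```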
Doi: https://doi.org/10.54216/JISIoT.130212

Vol. 13 Issue. 2 PP. 150-165, (2024)

Transfer Learning and Optimised Firefly Neural Network for Lung Cancer

A. Gopinath , P.Gowthaman

Clinical analysis and precise illness detection today mandate the development of intelligent expert systems. Since lung cancer affects both men and women equally and has a greater mortality rate than other illnesses, a more complete examination is needed to diagnose it. Images from a computed tomography (CT) scan can provide more helpful information for a lung cancer diagnosis, and various machine learning and deep learning algorithms have been created to enhance the medical treatment process using CT scan input images. Research still falls short, however, of a precise and intelligent system. To improve the detection of lung tumors from CT input images, this paper presents firefly-optimized, pre-trained transfer learning. The pre-trained VGG-16 model is used to extract features more effectively, and the features chosen via the firefly optimization approach increase classification accuracy while reducing complexity. Thorough testing on the LUNA-16 and LIDC lung image datasets is assessed and studied along with performance measures including accuracy, precision, recall, specificity, and F1-score. The results show that the suggested design outperformed the DenseNet, AlexNet, ResNet-50, ResNet-100, VGG-16, and Inception models, reaching the top results with 98.5% accuracy, 99.0% precision, 98.8% recall, and a 99.1% F1-score.

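A minimal Keras sketch of the pre-trained VGG-16 feature-extraction stage described here, with the firefly selection step marked as a placeholder (head size and other hyperparameters are assumptions):

```python
# Frozen VGG-16 base as a feature extractor plus a small classifier head.
import tensorflow as tf

base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False                       # reuse ImageNet features as-is

inputs = tf.keras.Input(shape=(224, 224, 3))
x = tf.keras.applications.vgg16.preprocess_input(inputs)
x = base(x)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
# ... firefly optimization would select a feature subset here ...
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # tumor vs normal

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```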
Doi: https://doi.org/10.54216/JISIoT.130213

Vol. 13 Issue. 2 PP. 166-177, (2024)

LoRa Architecture-Enabled Intelligent for Agriculture with Deep Learning Architecture

K M Monica , Anitha D , S.Prabu , B.Girirajan , Arun M

The agricultural industry faces significant challenges in improving efficiency and productivity, particularly in monitoring crop health and environmental conditions. Traditional methods are often labor-intensive, time-consuming, and lack real-time data, leading to suboptimal decision-making. Recent advancements in Internet of Things (IoT) and Artificial Intelligence (AI) technologies offer promising solutions. Long Range (LoRa) communication, a type of low-power wide-area network (LPWAN), enables long-distance data transmission with minimal power consumption, making it ideal for rural and expansive agricultural areas. When combined with deep learning, which can analyze large volumes of data to generate predictive insights, these technologies have the potential to revolutionize agricultural practices by providing farmers with timely and accurate information to optimize crop management and resource utilization. This study introduces an intelligent mote for agricultural applications, leveraging Long Range (LoRa) communication and deep learning techniques to improve precision farming. Traditional agricultural monitoring methods are labor-intensive and lack real-time insights. To address this, the mote is equipped with sensors to monitor temperature, humidity, soil moisture, and light intensity, transmitting real-time data over long distances with minimal power consumption using LoRaWAN. The collected data is processed by deep learning models to predict crop yield and identify potential issues. Field tests demonstrated a 15% improvement in yield prediction accuracy and a 20% reduction in water usage compared to traditional methods. These results highlight the effectiveness of integrating LoRa and deep learning in enhancing agricultural resource management and productivity.

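Since LoRa airtime and energy scale with frame size, a mote typically packs its four readings into a few bytes. The encoding below is hypothetical, not the paper's payload format:

```python
# Hypothetical compact uplink payload: scaled integers in 7 bytes.
import struct

def encode_reading(temp_c, humidity_pct, soil_pct, light_lux):
    """Pack as: temp*100 (int16), humidity*100 (uint16),
    soil*100 (uint16), light/100 (uint8), little-endian."""
    return struct.pack("<hHHB", int(temp_c * 100), int(humidity_pct * 100),
                       int(soil_pct * 100), int(light_lux // 100))

def decode_reading(payload):
    t, h, s, l = struct.unpack("<hHHB", payload)
    return t / 100, h / 100, s / 100, l * 100

frame = encode_reading(23.57, 61.2, 34.8, 12000)
print(len(frame), "bytes ->", decode_reading(frame))
```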
Doi: https://doi.org/10.54216/JISIoT.130214

Vol. 13 Issue. 2 PP. 178-191, (2024)

Deep Learning-Based model for Medical Image Compression

Saad H. Baiee , Tawfiq A. AL-Assadi

Efficient compression algorithms are required to handle the growing amount of medical image data, ensuring that storage and transmission requirements are met without compromising diagnostic quality. This research presents a hybrid image compression framework that integrates deep learning with standard lossless compression techniques. A convolutional autoencoder (CAE) learns a compact representation of medical images, which is subsequently compressed using the Brotli algorithm. According to an analysis of a brain MRI dataset, our technique beats conventional approaches such as JPEG, JPEG2000, and wavelet-based methods: it maintains competitive compression ratios while producing a higher peak signal-to-noise ratio (PSNR) and a lower mean squared error (MSE), indicating higher image integrity and low information loss. By striking a balance between the critical need for accurate diagnosis and the economical use of resources, this study offers a promising method for compressing medical images.

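A sketch of the latent-plus-Brotli step, using a random array as a stand-in for the CAE bottleneck and assuming 8-bit quantization before entropy coding; the CAE itself is omitted:

```python
# Quantize a CAE latent code to uint8, then pack losslessly with Brotli.
import brotli
import numpy as np

latent = np.random.default_rng(0).normal(
    size=(1, 16, 16, 64)).astype(np.float32)   # stand-in for the bottleneck

# Quantize to 8 bits before entropy coding (an assumed pipeline choice).
lo, hi = latent.min(), latent.max()
q = np.round((latent - lo) / (hi - lo) * 255).astype(np.uint8)

compressed = brotli.compress(q.tobytes(), quality=11)
print(f"ratio over raw uint8 latent: {q.nbytes / len(compressed):.2f}x")

# Lossless round trip back to the decoder's input.
restored = np.frombuffer(brotli.decompress(compressed),
                         dtype=np.uint8).reshape(latent.shape)
dequant = restored.astype(np.float32) / 255 * (hi - lo) + lo
```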
Doi: https://doi.org/10.54216/JISIoT.130215

Vol. 13 Issue. 2 PP. 192-201, (2024)

Collaborative Intelligence for IoT: Decentralized Net security and confidentiality

Kiran Sree Pokkuluri , Ajay Kumar , Krishan Kant Singh Gautam , Pratibha Deshmukh , Pavithra G , Laith Abualigah

This research compares federated and centralized learning paradigms to discover the best balance between privacy and model accuracy in machine learning. Federated learning, an innovative form of distributed machine learning, allows model training across devices or clients without data centralization. Keeping data on individual devices reduces the hazards of centralized data storage, improving user privacy and security, whereas centralized learning concentrates data on a server, which raises privacy and security problems. The study evaluates the two learning approaches using simulated data in a simple regression framework. Federated learning proves to be about as accurate as centralized learning while protecting privacy. The paper also shows how federated learning works in popular machine learning frameworks such as TensorFlow Federated. This research shows that federated learning protects privacy while producing accurate machine learning models, challenging the idea that machine learning must always choose between privacy and accuracy. The empirical evidence and theoretical ideas in this study advance discussions of machine learning methodology and promote privacy-conscious, distributed learning frameworks in the digital era.

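The core of the federated side is weight aggregation. A minimal FedAvg step, shown framework-independently below, averages client weights by local dataset size so raw data never leaves the device (the paper's TensorFlow Federated setup is not reproduced):

```python
# FedAvg: size-weighted average of client model weights.
import numpy as np

def fed_avg(client_weights, client_sizes):
    """client_weights: one list of numpy arrays per client (same shapes)."""
    total = sum(client_sizes)
    return [sum(w[k] * (n / total)
                for w, n in zip(client_weights, client_sizes))
            for k in range(len(client_weights[0]))]

# Two clients with a one-layer linear model (weight matrix + bias).
c1 = [np.array([[1.0, 2.0]]), np.array([0.5])]
c2 = [np.array([[3.0, 0.0]]), np.array([1.5])]
global_model = fed_avg([c1, c2], client_sizes=[100, 300])
print(global_model)  # the second client dominates: it holds 3x more data
```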
Doi: https://doi.org/10.54216/JISIoT.130216

Vol. 13 Issue. 2 PP. 202-211, (2024)

Bridging the Gap between Technology and Medicine through the Revolutionary Impact of the Healthcare Internet of Things on Remote Patient Monitoring

Kiran Sree Pokkuluri , Vibha Tiwari , Jyoti Uikey , Prerna Mehta , Chopparapu Srinivasa Rao , Annamaraju Thanuja

Healthcare Internet of Things (IoT) initiatives that aim to integrate technology and medicine are shaking the sector to its foundations. The revolutionary potential of the proposed strategy is shown here as we investigate the far-reaching consequences of the Healthcare IoT on remote patient monitoring. The beginning sets the stage by underlining the significance of bridging the gap between technology and medicine. Our multi-pronged approach comprises Internet of Things (IoT) remote monitoring, cloud-based analysis, artificial intelligence (AI) integrated diagnostics, real-time alerts, and predictive analytics. Our study's results demonstrate that the proposed approach is superior to the status quo. The area of remote patient monitoring has profited considerably from the employment of traditional approaches, such as the fusion of data from wearable sensors, analysis in the cloud, diagnostics that utilize artificial intelligence, real-time monitoring, predictive modeling, and smart alarm systems. The suggested strategy, however, performs very well across all of the most important measures of assessment. Comparatively, the accuracy rate of the conventional wearable sensor fusion approach was only 76%, whereas our suggested method reached 89%. Our strategy was also more precise than the standard approach (88% vs. 73%), and it significantly outperformed the recall rate of 68% produced by conventional methods. It is a great option for hospitals and clinics since it improves diagnostic precision and speed without breaking the bank.

Doi: https://doi.org/10.54216/JISIoT.130217

Vol. 13 Issue. 2 PP. 212-222, (2024)

Optimizing Sensor Localization and Cluster Performance in Wireless Sensor Networks through Internet of Thing (IoT) and Boosted Weight Centroid Algorithm

Krishna Kumar .N , Surya Kiran Chebrolu , R. Manikandan , Aby K Thomas , Peruri Venkata Anusha , Hari Prasad Bhupathi

Localization is an extremely important component of applications that make use of wireless sensor networks. It has a substantial impact on academic research as well as on real-time sensor deployment, with the aim of lowering the amount of energy used while locating unknown nodes. Localization is the process of obtaining the coordinates that represent the positions of the sensor nodes; the accuracy of locating the nodes varies with the environmental conditions, the type of nodes, the type of application, and the localization methods used. The standard distance vector hop (DV-hop) localization method can determine the positions of unknown nodes with typical accuracy with the assistance of IoT-based beacon nodes. This article addresses the DV-hop and improved weighted centroid localization algorithms in addition to the suggested boosted weight centroid-based localization approach. The suggested Boosted Weight Centroid Localization Algorithm (BWCLA) is utilized to find nodes in remote areas of the WSN while conserving energy, accomplished with the assistance of measurements involving both the nodes and the centroid; a modified weight metric is utilized in localizing an unknown node. The performance of BWCLA is evaluated on a number of metrics, including localization accuracy, average localization error, total packets utilized, and energy usage.

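In its basic form, weighted-centroid localization averages beacon positions with distance-dependent weights; the "boosted" metric itself is not given in the abstract, so the common 1/d² weighting is used below for illustration:

```python
# Weighted-centroid localization: estimate = sum(w_i * p_i) / sum(w_i).
import numpy as np

def weighted_centroid(beacons, distances, power=2.0):
    """beacons: (n, 2) coordinates; distances: estimated ranges to each.
    Closer beacons (smaller d) get larger weights w = 1 / d^power."""
    w = 1.0 / np.asarray(distances) ** power
    return (w[:, None] * np.asarray(beacons)).sum(axis=0) / w.sum()

beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
distances = [7.1, 7.1, 3.2]           # e.g., from RSSI or DV-hop estimates
print("estimated position:", weighted_centroid(beacons, distances).round(2))
```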
Doi: https://doi.org/10.54216/JISIoT.130218

Vol. 13 Issue. 2 PP. 223-230, (2024)

Security Implications of IoT-Enabled Mobile Net Facial Recognition System

Sumit Thakur , Nikhat Raza khan

Face recognition technology is gaining popularity for security, access management, and user identification. This study presents a novel facial recognition method that employs cutting-edge deep learning algorithms and attention mechanisms to reduce false positives, approaching facial recognition differently from prior work. Through extensive research and experimentation, we demonstrate statistically significant recognition gains over current approaches. The recommended solution combines an attention mechanism with a rich feature extraction module; the two work together to highlight distinctive characteristics and facial identifiers. To optimize performance and generalization across datasets, data augmentation and hyperparameter tuning are used to fine-tune the model, and ablation studies help explain which components drive the method's success. We also discuss the ethical difficulties of facial recognition technology, including fairness and user privacy, and emphasize cautious deployment. Our findings expand knowledge of facial recognition technology and pave the way for future studies. This study demonstrates that improved MobileNet models and Internet of Things technologies increase the accuracy of mobile facial recognition. The project overcomes the challenge of providing powerful AI tools in resource-constrained situations by utilizing IoT infrastructures and efficient, lightweight MobileNet architectures. Extensive testing demonstrates that the technique increases identification rates and outperforms existing models, showing its suitability for real-time operation. The Internet of Things enables data mobility and cross-device model usage, illustrating that the IoT ecosystem can support effective and scalable security solutions.

Doi: https://doi.org/10.54216/JISIoT.130219

Vol. 13 Issue. 2 PP. 231-244, (2024)

The Healthcare IoTs as a Paradigm Shift in Healthcare Management, Patient Treatment, and Healthcare Data Processing

Amit Kumar Chandanan , Prabha Rani Sikdar , C. Raja , Saiyed Faiayaz Waris , Manoj Kumar .T , Kiran Bhopate

When it comes to hospital administration, patient care, and medical data analysis, the Healthcare Internet of Things (HIoT) is nothing short of a paradigm revolution. We dive into this new paradigm to examine its far-reaching effects and revolutionary possibilities in the healthcare system. The context is established by introducing HIoT as a game-changing development in healthcare. Using the IoT to network many devices, this model paves the way for real-time patient monitoring, streamlined inventory management, and integrated telemedicine. The healthcare industry as we know it will be transformed by HIoT as it strives to improve resource allocation, simplify operations, and provide proactive patient care. Our investigation includes a thorough appraisal of how HIoT will affect many facets of medical treatment, using multiple research approaches and quality indicators. We evaluate the viability and scalability of HIoT solutions by testing them in experimental settings that mimic real-world healthcare environments, and the dataset environments use well-maintained medical data sources to depict the healthcare system precisely. The performance and efficacy of HIoT technologies are evaluated using measurable criteria such as sensitivity (0.94), specificity (0.89), F1-score (0.91), ROC-AUC (0.95), and cost savings ($150,000). To determine the relative importance of each part of the HIoT ecosystem, we undertake ablation studies. Our findings provide a clear picture of the disruptive potential of HIoT: the suggested HIoT technique for patient monitoring provides improved accuracy (0.92), efficiency (9.2), and satisfaction (9.2), so better patient outcomes may be ensured via early interventions. When healthcare and telemedicine are combined, the success rate of remote consultations increases to 95%, response times decrease to 15 minutes, and more people have access to medical treatment.

Doi: https://doi.org/10.54216/JISIoT.130220

Vol. 13 Issue. 2 PP. 245-255, (2024)

Innovative Approaches to Bank Security in India: Leveraging IoT, Blockchain, and Decentralized Systems against Loan Scams

Akhtar Hasan Jamal Khan , Syed Afzal Ahmad

This research paper explores the significant impact of multiple-loan fraud on Indian banks and financial institutions, emphasizing the resulting bad debts and financial losses. The issue is exacerbated in the real estate sector, where influential developers exploit system vulnerabilities to secure multiple loans using the same collateral, and consumers also face challenges in accessing credit due to these fraudulent practices. The study underscores the need for enhanced regulatory measures and internal controls within financial institutions. Additionally, it introduces IoTBlockFin, a decentralized system that integrates blockchain and IoT technologies to securely assess customer reliability and mitigate fraud. IoTBlockFin's Advanced Proof of Work (APOW) mechanism, combined with IoT data for real-time monitoring, offers superior security, latency, and cost-effectiveness compared to centralized systems, as demonstrated by experimental results.

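The abstract does not detail APOW; for orientation, the classic proof-of-work loop that such mechanisms extend is shown below, with a hypothetical loan record standing in for block data:

```python
# Classic proof-of-work: find a nonce whose hash meets a difficulty target.
import hashlib

def mine(block_data: bytes, difficulty: int = 4):
    """Return (nonce, digest) whose SHA-256 has `difficulty` leading hex zeros."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

# Hypothetical record tying a collateral ID to a lending bank.
nonce, digest = mine(b"loan:collateral-id-42;bank:A")
print(nonce, digest[:16])
```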
Doi: https://doi.org/10.54216/JISIoT.130221

Vol. 13 Issue. 2 PP. 256-271, (2024)

An Ensemble Boosting Algorithm based Intrusion Detection System for Smart Internet of Things Environment

Rami Baazeem

The influx of smart spaces connected to the IoT network has brought new forms of cyber threats, creating a need for intrusion detection systems (IDS) that can deal with these complex attacks. Traditional security measures cannot solve the modern problem of protecting IoT devices, which form a complex, heterogeneously distributed network. Advances in artificial intelligence (AI) and machine learning have provided new hope for more reliable IDS. Our study presents a Particle Swarm Optimization-integrated Light Gradient Boosting Machine, abbreviated LGBM-PSO, in which the PSO algorithm optimizes the hyperparameters during model training. Building on an ensemble methodology, a new model for network intrusion detection is proposed in this study to improve accuracy. The DS2OS dataset was employed for the suggested task; it incorporates traces from smart devices placed in a smart home environment. The IDS model comprises several stages, including data preprocessing (cleaning, normalization, and encoding of network traffic data), followed by feature selection and dimensionality reduction to optimize the dataset. The core of the model comprises four classifiers: Decision Tree (DT), LGBM-PSO, Light Gradient Boosting Machine (LGBM), and Extreme Gradient Boosting (XGB), combined through a majority-voting ensemble to increase the reliability of the predictions. The proposed LGBM-PSO model achieves the highest accuracy, 99.89%, with a corresponding figure of 99.79% on the training data; results on the testing data prove the efficiency and stability of the algorithm. The ensemble approach proves superior, especially with a model like LGBM-PSO, in the field of intrusion detection; its high accuracy, optimized runtime, and effective threat identification make it a useful tool for strengthening security across applications.

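A skeleton of the described ensemble: DT, LGBM, and XGB under hard majority voting, with PSO-found LGBM hyperparameters represented by fixed illustrative values (DS2OS loading and preprocessing are assumed done upstream):

```python
# Majority-voting ensemble over the three classifier families named above.
from sklearn.ensemble import VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from lightgbm import LGBMClassifier
from xgboost import XGBClassifier

# Values PSO might converge to (illustrative, not the paper's).
pso_params = {"num_leaves": 63, "learning_rate": 0.05, "n_estimators": 400}

ensemble = VotingClassifier(
    estimators=[
        ("dt", DecisionTreeClassifier(max_depth=12)),
        ("lgbm_pso", LGBMClassifier(**pso_params)),
        ("xgb", XGBClassifier(n_estimators=300, eval_metric="logloss")),
    ],
    voting="hard",               # majority vote over class predictions
)
# ensemble.fit(X_train, y_train); ensemble.score(X_test, y_test)
```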
Doi: https://doi.org/10.54216/JISIoT.130222

Vol. 13 Issue. 2 PP. 272-292, (2024)

A Predictive Analysis of IMDb Movie Reviews Using LSTM and ANN Models

Noor alhuda A. Salih , Osama A. Qasim , Mohammed S. Noori , Rabei Raad Ali , Khawla Ahmad Wali

The machine learning domain has made major progress with the advance of state-of-the-art technologies, yet current algorithms often do not deliver satisfactory learning performance, so they must be continually upgraded. This paper compares the Long Short-Term Memory (LSTM) model and the Artificial Neural Network (ANN) model for sentiment prediction on reviews from the Internet Movie Database (IMDb) website. The evaluations relate both models to sentiment assessment approaches to measure their predictive accuracy and performance. The results demonstrate that the ANN model outperforms the LSTM model in terms of the prediction accuracy and loss indicators for the IMDb movie review sentiment analysis task. The prediction accuracy on the test dataset is 83.5% for the ANN model and 83.5% for the LSTM model. It can therefore be concluded that the standard artificial neural network model utilized here is an appropriate technique for sentiment assessment tasks on IMDb review text data.

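The comparison can be reproduced in outline on the built-in Keras IMDb dataset; the sketch below builds both an averaged-embedding ANN and an LSTM, with illustrative hyperparameters rather than the paper's:

```python
# ANN (pooled embeddings) vs. LSTM on the Keras IMDb sentiment dataset.
import tensorflow as tf

num_words, maxlen = 10_000, 200
(x_tr, y_tr), (x_te, y_te) = tf.keras.datasets.imdb.load_data(
    num_words=num_words)
x_tr = tf.keras.preprocessing.sequence.pad_sequences(x_tr, maxlen=maxlen)
x_te = tf.keras.preprocessing.sequence.pad_sequences(x_te, maxlen=maxlen)

def build(recurrent: bool):
    m = tf.keras.Sequential([
        tf.keras.layers.Embedding(num_words, 32),
        tf.keras.layers.LSTM(32) if recurrent
        else tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    m.compile("adam", "binary_crossentropy", metrics=["accuracy"])
    return m

for name, model in [("ANN", build(False)), ("LSTM", build(True))]:
    model.fit(x_tr, y_tr, epochs=2, batch_size=128, verbose=0)
    print(name, "test accuracy:", model.evaluate(x_te, y_te, verbose=0)[1])
```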
Doi: https://doi.org/10.54216/JISIoT.130223

Vol. 13 Issue. 2 PP. 293-302, (2024)

A Comprehensive Review of Real-Time Vehicle Tracking for Smart Navigation Systems

Veena R S , Seema Rani , Ch Madhava Rao , Piyush Kumar Pareek , Sandeep Dalal , Shweta Bansal

Vehicle tracking is one of computer vision's most important applications, with applications ranging from robotics and traffic monitoring to autonomous vehicle navigation and many more. Even with the significant advancements in recent research, issues like occlusion, fluctuating illumination, and fast motion still need to be addressed, calling for more investigation and creativity in this field. This study performs a thorough examination of various vehicle-tracking approaches and suggests a thorough classification scheme that divides them into four main categories: strategies that rely on features, segmentation, estimate, or learning. Two well-known methods are highlighted specifically in the estimation-based category: particle filters and Kalman filters.

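As a concrete reference for the estimation-based category highlighted above, here is a constant-velocity Kalman filter over noisy position detections (noise levels are illustrative):

```python
# Constant-velocity Kalman filter: predict with the motion model, then
# correct with each measured vehicle position.
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
              [0, 0, 1, 0], [0, 0, 0, 1]], float)   # state: [x, y, vx, vy]
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)   # we observe position only
Q = np.eye(4) * 0.01                                # process noise
R = np.eye(2) * 1.0                                 # measurement noise

x, P = np.zeros(4), np.eye(4) * 10.0                # initial state/covariance

def step(z):
    """One predict/update cycle for a detection z = [x, y]."""
    global x, P
    x, P = F @ x, F @ P @ F.T + Q                   # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    x = x + K @ (np.asarray(z) - H @ x)             # update
    P = (np.eye(4) - K @ H) @ P
    return x

for z in [(1.0, 0.9), (2.1, 2.0), (2.9, 3.1)]:      # noisy detections
    print(step(z).round(2))
```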
Doi: https://doi.org/10.54216/JISIoT.130224

Vol. 13 Issue. 2 PP. 303-323, (2024)

Gazelle Optimized Visual Geometry Group Network with Resnet101 fostered Oral Squamous Cell Carcinoma Detection

Kumar R , S Pazhanirajan

Microscopic examination of tissues to detect oral cancer falls short, as traditional microscopes struggle to differentiate between cancerous and non-cancerous cells. Identifying cancerous cells in microscopic biopsy images has the potential to alleviate concerns and improve outcomes if precise biological approaches are employed; however, relying solely on physical examinations and microscopic biopsy images increases the likelihood of human error. Therefore, a new research technique has been developed to obtain accurate results. In this manuscript, a Gazelle Optimized Visual Geometry Group Network with ResNet101 for Oral Squamous Cell Carcinoma Detection (OCD-VGGNetCNN-GOA-Resnet101) is proposed. In this method, the images are first taken from a Kaggle repository benchmark dataset and preprocessed to improve image quality. The result is then given to the Visual Geometry Group Network-based CNN (VGGNetCNN) with ResNet101 for classification, which classifies each image as normal or OSCC. In simulation, the proposed OCD-VGGNetCNN-GOA-Resnet101 method attains 23.67%, 34.89%, 39.45%, and 45.31% higher accuracy when compared with existing methods such as OCD-CNN-Alexnet, OCD-CNN-VGG19, and HI-OCD-CNN-INet.

Doi: https://doi.org/10.54216/JISIoT.130225

Vol. 13 Issue. 2 PP. 324-333, (2024)

Brain Tumor Semantic Segmentation using U-Net and Moth Flame Optimization

B. Tapasvi , E. Gnanamanoharan , N. Udaya Kumar

A brain tumor is an abnormal development of brain cells that, if left untreated, can have severe consequences. Brain tumor semantic segmentation is the process of identifying and delineating the affected brain regions, which is essential for accurate diagnosis, treatment planning, and surveillance of the tumor's development over time. This paper presents a model for identifying and segmenting brain tumors using the Unet architecture, with its hyperparameters optimized by the Moth Flame Optimization (MFO) algorithm. Thanks to its capacity to capture spatial information, the Unet architecture is a common choice for image segmentation tasks, while the MFO algorithm is an optimization technique that draws inspiration from the behavior of moths; the two techniques are combined to improve efficiency. The MFO method improved the model's performance, leading to better segmentation results. Based on the comparative analysis, the proposed model shows an improvement of approximately 65.16% in MSE, 28.87% in PSNR, and 40.30% in Tversky index compared to the Unet and Unet++ models. The method has demonstrated good results in identifying and segmenting brain tumors, which can help in their early identification and treatment.

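The Tversky index used in the comparison generalizes the Dice score by weighting false positives and false negatives separately, which matters for small tumor regions; a binary numpy version:

```python
# Tversky index: TP / (TP + alpha*FP + beta*FN); alpha = beta = 0.5 is Dice.
import numpy as np

def tversky(pred, target, alpha=0.5, beta=0.5, eps=1e-7):
    """pred, target: binary segmentation masks of equal shape."""
    pred, target = pred.astype(bool), target.astype(bool)
    tp = np.logical_and(pred, target).sum()
    fp = np.logical_and(pred, ~target).sum()
    fn = np.logical_and(~pred, target).sum()
    return (tp + eps) / (tp + alpha * fp + beta * fn + eps)

pred = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]])
mask = np.array([[1, 0, 0], [0, 1, 1], [0, 0, 0]])
print(f"Tversky: {tversky(pred, mask):.3f}")  # raise beta to penalize misses
```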
Doi: https://doi.org/10.54216/JISIoT.130226

Vol. 13 Issue. 2 PP. 334-346, (2024)

A Hybrid Intelligence-based Deep Learning Model with Reptile Search Algorithm for Effective Channel Estimation in massive MIMO Communication Systems

Nallamothu Suneetha , Penke Satyanarayana

Channel estimation poses critical challenges in millimeter-wave (mmWave) massive Multiple Input, Multiple Output (MIMO) communication models, particularly when dealing with a substantial number of antennas. Deep learning techniques have shown remarkable advancements in improving channel estimation accuracy and minimizing computational difficulty in 5G and future generations of communications. The main intention of the suggested method is to use an optimal hybrid deep learning strategy to create a better channel estimation model. The proposed method, referred to as optimized D-LSTM, combines the power of a deep neural network (DNN) and long short-term memory (LSTM), and the optimization process integrates the Reptile Search Algorithm (RSA) to enhance the performance of the deep learning model. The suggested hybrid deep learning method considers the correlation between the measurement matrix and the received signal vectors as input to predict the amplitude of the beamspace channel. The newly proposed estimation model demonstrates remarkable superiority over traditional models in both Normalized Mean-Squared Error (NMSE) reduction and spectral efficiency. The spectral efficiency of the designed RSA-D-LSTM is 68.62%, 62.26%, 30.3%, and 19.77% higher than DOA, DHOA, HHO, and RSA, respectively. The suggested system therefore provides better channel estimation and improved efficiency.

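NMSE, the reported figure of merit, is the estimation error energy normalized by the true channel energy, usually expressed in dB; channel dimensions below are illustrative:

```python
# NMSE in dB between a true channel and its estimate.
import numpy as np

def nmse_db(h_true, h_est):
    """h_true, h_est: complex channel matrices/vectors of equal shape."""
    err = np.linalg.norm(h_est - h_true) ** 2
    ref = np.linalg.norm(h_true) ** 2
    return 10 * np.log10(err / ref)

rng = np.random.default_rng(0)
h = (rng.normal(size=(64, 16)) + 1j * rng.normal(size=(64, 16))) / np.sqrt(2)
h_hat = h + 0.1 * (rng.normal(size=h.shape) + 1j * rng.normal(size=h.shape))
print(f"NMSE: {nmse_db(h, h_hat):.2f} dB")  # about -17 dB at this noise level
```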
Doi: https://doi.org/10.54216/JISIoT.130227

Vol. 13 Issue. 2 PP. 347-360, (2024)