Brain-computer interfaces (BCIs) are one example of a cutting-edge technology that is opening new channels for human-machine connection. Communication interfaces have evolved over time, from keyboards and mice to touchscreens, voice commands, and gesture interactions. As computers become ever more ingrained in daily life, new methods of controlling computer systems and engaging with virtual worlds have gained appeal, with applications ranging from gaming to teaching. The ethical, privacy, and security issues raised by developing and applying BCI technology must be handled from a balanced standpoint: BCI devices must gather and interpret sensitive brain signals, and unauthorized access to this material risks compromising privacy by disclosing private thoughts, feelings, or other sensitive information. The initial BCI applications were EEG-based and created for medical use, aiming to help patients return to their regular lives. Beyond that original purpose, EEG-based BCI applications have become increasingly important in non-medical fields, helping healthy individuals live better lives by, for example, becoming more productive, collaborative, and self-developed.
DOI: https://doi.org/10.54216/JISIoT.140103
Vol. 14 Issue. 1 PP. 31-44, (2025)
Research on wireless body area networks (WBAN), also known as wireless body sensor networks (WBSN), has become increasingly important in medical applications and is now crucial for patient monitoring. To create a dependable body area network (BAN) system, several factors must be considered at both the software and hardware levels, one of which is the design and implementation of routing protocols in the network layer. Routing protocols detect and manage the routing paths in a network to facilitate efficient data transmission between nodes; the routing protocol is therefore crucial in wireless sensor networks (WSN) for providing dependable communication among sensor nodes. Different clustering methods can be used in WBAN systems, but these techniques often produce many cluster heads (CHs), which leads to higher energy consumption; increased energy consumption shortens the lifespan of WBANs and raises monitoring costs. This research proposes a recent metaheuristic algorithm to select optimal clusters and provide an energy-efficient protocol for healthcare monitoring, aiming to minimize the energy utilization of WBANs by choosing the most suitable CHs based on the BWO. The proposed BWO-based routing protocol demonstrates superior performance in WBANs in terms of energy consumption, packet loss, packet delivery ratio, network lifetime, end-to-end delay, and throughput. It optimizes energy consumption by effectively selecting CHs and routing paths, leading to balanced energy usage and prolonged network operation. The BWO model significantly reduces end-to-end delay by ensuring data packets follow the shortest and least congested routes, which is important for real-time health monitoring. It achieves a high packet delivery ratio, typically between 95% and 98%, indicating reliable data transmission, while maintaining a low packet loss rate, generally between 1% and 5%. Additionally, the BWO-based routing protocol extends network lifetime by preventing early node depletion and enhances network throughput by reducing congestion and packet collisions, thereby supporting continuous and robust health data monitoring.
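As a sketch of the cluster-head selection idea described above, the snippet below scores a candidate CH set with a fitness that rewards high residual energy and short member-to-CH and CH-to-sink distances; the weights, terms, and network values are illustrative assumptions, not the paper's BWO formulation, which the abstract does not spell out.

```python
import numpy as np

rng = np.random.default_rng(0)
nodes = rng.uniform(0.0, 2.0, size=(16, 2))  # body-sensor positions (metres)
energy = rng.uniform(0.5, 1.0, size=16)      # residual energy per node (J)
sink = np.array([1.0, 0.0])                  # sink/coordinator position

def fitness(ch_mask, w_dist=0.5, w_energy=0.5):
    """Lower is better: short member-to-CH and CH-to-sink paths, high-energy CHs."""
    chs = np.where(ch_mask)[0]
    if len(chs) == 0:
        return np.inf
    # every node joins its nearest cluster head
    d_members = np.linalg.norm(nodes[:, None, :] - nodes[chs][None, :, :], axis=2).min(axis=1)
    d_sink = np.linalg.norm(nodes[chs] - sink, axis=1)
    return w_dist * (d_members.mean() + d_sink.mean()) + w_energy / energy[chs].mean()

# score one random candidate (4 CHs out of 16 nodes); a metaheuristic such as
# BWO would iterate over many such masks and keep the lowest-fitness one
mask = np.zeros(16, dtype=bool)
mask[rng.choice(16, size=4, replace=False)] = True
print(fitness(mask))
```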
DOI: https://doi.org/10.54216/JISIoT.140104
Vol. 14 Issue. 1 PP. 45-58, (2025)
In the evolving landscape of the Internet of Things (IoT), effective intrusion detection is paramount for maintaining security and data integrity. This study introduces a hybrid heuristic technique utilizing artificial intelligence for enhancing intrusion detection systems (IDS) in IoT environments. By integrating various machine learning models, the research focuses on training, tuning, and validating a sequential neural network to predict intrusion occurrences based on extensive data analysis. The methodology involves modelling, which starts with training machine learning algorithms to predict labels from features, tuning the models to meet organizational requirements, and validating them using holdout data. Key machine learning techniques explored include logistic regression, k-nearest neighbors (KNN), naive Bayes, support vector machines (SVM), decision trees, random forests, and neural networks. Each technique's applicability to classification tasks, particularly binary and multivariate scenarios, is discussed in the context of enhancing IDS capabilities. A sequential neural network model, comprising multiple dense and dropout layers, was developed and trained with 148,033 parameters to achieve high accuracy and robustness. The architecture's effectiveness in learning intricate patterns associated with malicious activities while avoiding overfitting is emphasized. The study demonstrates the model's proficiency in binary classification tasks, which is critical for distinguishing between normal and anomalous behaviors in IoT systems. The results indicate that the neural network, optimized using the hybrid heuristic approach, shows a significant reduction in validation loss and a steady improvement in accuracy over multiple epochs. Despite initial overfitting signs, the model maintains high performance on unseen data, underscoring the importance of ongoing model assessment and tuning.
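The abstract describes a sequential network of dense and dropout layers for binary intrusion classification. Below is a minimal Keras sketch of that kind of architecture on synthetic data; the layer widths, dropout rate, and 40-feature input are assumptions, not the authors' 148,033-parameter configuration.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 40)).astype("float32")  # 40 traffic features (assumed)
y = (X[:, 0] + X[:, 1] > 0).astype("float32")      # toy normal/intrusion labels

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(40,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.3),                    # regularisation against overfitting
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # intrusion probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# holdout validation mirrors the train/tune/validate workflow in the abstract
model.fit(X, y, validation_split=0.2, epochs=5, verbose=0)
```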
DOI: https://doi.org/10.54216/JISIoT.140101
Vol. 14 Issue. 1 PP. 01-15, (2025)
Clothing design plays an important role in personal image expression and in social and cultural transmission. Traditional fashion design methods suffer from low efficiency and large design errors, and they struggle to give users a good wearing experience. To meet different users' design needs, reduce design errors, and improve user satisfaction with design results, this paper combines intelligent sensing technology with in-depth research on the digital automated analysis of clothing design CAD (Computer Aided Design). For the clothing design process, the paper first constructed a brand-new clothing design CAD system that uses a depth sensor to solve for the 3D information of the relevant feature points, achieving accurate acquisition of human body feature sizes. 3D human body modeling was carried out through registration of point data from adjacent frames. Then, according to the user's physical characteristics and related information collected by the sensors, the system compared the user's characteristic information to infer the user's preferences and used a recommendation algorithm to calculate the corresponding parameters, realizing intelligent selection of clothing styles. Finally, through sensor measurement of each index, garment size adjustment and the specific design of the garment were realized. To verify the effect of the clothing design CAD system based on intelligent sensing technology, system tests were conducted. The results showed that, in terms of clothing comfort, clothing quality, and clothing functionality, the proportion of users who were satisfied or very satisfied reached 50.4%, 47.9%, and 51.3%, respectively. Overall, the survey results indicate a high degree of user satisfaction. The research concludes that digital automated analysis of clothing design CAD based on intelligent sensing technology can effectively meet users' needs, improve their wearing experience, and promote the intelligent development of clothing design.
DOI: https://doi.org/10.54216/JISIoT.140102
Vol. 14 Issue. 1 PP. 16-30, (2025)
The recent progress in the Internet of Things (IoT), Artificial Intelligence (AI), and cloud computing has revolutionized the traditional healthcare system, upgrading it into a smart healthcare system. Medical services can be enhanced by integrating essential technologies such as IoT and AI, and their integration presents several prospects within the healthcare industry. This research proposes a novel hybrid Deep Learning (DL) model called Binary Butterfly Optimization Algorithm with Stacked Non-symmetric Deep Auto-Encoder (BBOA-SNDAE) for heart disease (HD) prediction based on Medical IoT technology. The key aim of the work is to categorize and predict HD from clinical data with the BBOA-SNDAE model. Initially, the model is trained using the Cleveland and Statlog datasets. The input data is preprocessed and standardized using Min-Max normalization. After preprocessing, feature selection is performed using the BBOA to choose the optimal features for improved classification. Based on the selected features, classification is performed using the SNDAE technique. The research model was assessed on accuracy, sensitivity, precision, specificity, NPV, and F-measure. The model attained 99.62% accuracy, 99.45% precision, 99.32% NPV, 99.56% sensitivity, 99.45% specificity, and 99.38% F-measure on the HD dataset, and 98.84% accuracy, 98.73% precision, 98.34% NPV, 98.62% sensitivity, 98.21% specificity, and 98.27% F-measure on the sensor data. The results were compared with current models for validation, and the research model outperformed all of them.
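Min-Max normalization, the preprocessing step named above, rescales each feature to [0, 1] via x' = (x - min) / (max - min). A minimal sketch with illustrative clinical-style values:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

X = np.array([[63.0, 145.0, 233.0],   # e.g. age, resting BP, cholesterol
              [37.0, 130.0, 250.0],
              [41.0, 130.0, 204.0]])
print(MinMaxScaler().fit_transform(X))  # each column rescaled to [0, 1]
```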
DOI: https://doi.org/10.54216/JISIoT.140105
Vol. 14 Issue. 1 PP. 59-76, (2025)
Multi-organ and tumor segmentation is a challenging task in medical imaging and surgical planning, given its diverse applications, including lesion and organ measurement and disease diagnosis. Collecting and examining labels for all classes, however, poses severe challenges, and Graphics Processing Unit (GPU) optimization emerges as another critical factor for multi-organ and tumor segmentation. To address these challenges, we designed a deep learning-based model named "Intelligent Segmentor" which performs automated segmentation in an end-to-end fashion with a novel semi-supervised training approach. Initially, the multi-organ CT images are pre-processed through geometric standardization, noise removal, and intensity normalization. The pre-processed images are then provided to dual-view training for effective pseudo-label generation, and the labelled data together with the generated pseudo-labels are used to train the model, amplifying its performance. The segmentation model then takes two inputs through dual encoders, GoogleNet and VGG-16, for contextual and spatial information extraction over five stages, followed by a Tweaked Feature Pyramidal Network (TFPN) for dimensionality reduction and side-feature extraction, and a Gated Fusion Module (GFM) for fusing the side features into a unified feature map. Finally, the unified feature map is passed through convolution layers to produce the multi-organ and tumor output. We adopted the FLARE 2023 dataset to validate the proposed work against existing works on 13 organ and tumor segmentation tasks. The results show that the proposed approach achieves better Dice Similarity Coefficient (DSC) and Normalized Surface Dice (NSD) in online validation and final testing than existing works.
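The abstract does not detail the Gated Fusion Module, so the sketch below shows one common gating formulation that could fuse contextual and spatial side features: a sigmoid gate computed from the concatenated inputs blends the two feature maps. Channel counts and shapes are assumptions.

```python
import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # 1x1 conv produces a per-pixel, per-channel gate from both inputs
        self.gate = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, f_context, f_spatial):
        g = torch.sigmoid(self.gate(torch.cat([f_context, f_spatial], dim=1)))
        return g * f_context + (1 - g) * f_spatial  # unified feature map

fuse = GatedFusion(64)
a, b = torch.randn(1, 64, 32, 32), torch.randn(1, 64, 32, 32)
print(fuse(a, b).shape)  # torch.Size([1, 64, 32, 32])
```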
DOI: https://doi.org/10.54216/JISIoT.140106
Vol. 14 Issue. 1 PP. 77-89, (2025)
Forecasting energy demand is essential for efficient grid management, as it promotes steady operations, efficient markets, and sustainable energy practices. In this study, previously observed, evenly spaced energy consumption data are analysed using recurrent neural networks based on Gated Recurrent Unit (GRU) and Long Short-Term Memory (LSTM) architectures to extract important insights, features, and remarkable patterns. First, the study examines the influence of meteorological features on energy consumption. The most significant meteorological features are determined by computing the maximal information coefficient (MIC) and Pearson's correlation coefficient, and the selected features are then combined with historical energy consumption data to feed the neural network. Second, to improve and optimise the performance of the proposed models, two technical indicators are considered: the daily energy usage average and the simple moving average. Some instances of the comparisons in terms of prediction accuracy: (1) the MAPE of the proposed model is 2.47, whereas that of the current model is 4.03; (2) the MAPE of the existing model is 25.83, whereas that of the proposed solution is 18.68; (3) the MAPE of the suggested model is 24.8, while that of the current model is 26.6; (4) the MAPE of the present model is 4.77, whereas that of the suggested approach is 4.42.
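As a sketch of the two scoring steps mentioned above, the snippet computes Pearson's correlation for screening a meteorological feature and MAPE for forecast accuracy; the series are synthetic stand-ins for real weather and consumption data.

```python
import numpy as np

temp = np.array([21.0, 25.0, 30.0, 18.0, 27.0])       # daily temperature
load = np.array([310.0, 348.0, 400.0, 282.0, 371.0])  # daily energy use (kWh)
print(f"Pearson r = {np.corrcoef(temp, load)[0, 1]:.3f}")  # strong -> keep feature

def mape(actual, predicted):
    """Mean absolute percentage error, the accuracy measure quoted above."""
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

pred = np.array([305.0, 355.0, 390.0, 290.0, 365.0])
print(f"MAPE = {mape(load, pred):.2f}%")
```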
DOI: https://doi.org/10.54216/JISIoT.140107
Vol. 14 Issue. 1 PP. 90-101, (2025)
Internet of Things (IoT) devices are attractive targets for various malicious activities, and their nodes are easily compromised by attackers. The complexity of insecure IoT node installation stems from device heterogeneity and resource constraints at the network edge compared with conventional endpoints. This work concentrates on modeling an efficient, lightweight IoT-based preservation model used for detecting anomalies and performing various analyses at the endpoints. The model integrates a linear Support Vector Machine for pattern analysis with an adaptive fuzzy rule model for data-pattern rule generation, to examine malicious network functionality and network traffic. Each node must fulfill the generated rules; when a node fails to do so, its behavior is considered malicious. The model then imposes network access restrictions on the compromised node and terminates further processing, preventing the nodes from further network attacks. Evaluation uses a publicly available network dataset, with the dataset samples assessed in a complex network scenario. The simulation is done in the MATLAB 2020a environment, and the accuracy attained with this model is higher than that of other approaches. Other metrics, such as the False Alarm Rate (FAR), are also evaluated for predicting malicious network functionality. The significance of the model is evaluated based on the prediction and mitigation of various network attacks: the anticipated model shows a prediction rate of 90.21% for DoS attacks, 89.13% for R2L, 91.65% for probe, and 93.56% for U2R attacks.
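A minimal sketch of the linear-SVM pattern-analysis stage on synthetic traffic features; the adaptive fuzzy rule layer and the MATLAB environment are outside this snippet's scope.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
normal = rng.normal(loc=0.0, scale=1.0, size=(200, 4))  # benign traffic features
attack = rng.normal(loc=3.0, scale=1.0, size=(200, 4))  # malicious traffic features
X = np.vstack([normal, attack])
y = np.array([0] * 200 + [1] * 200)

clf = LinearSVC(C=1.0).fit(X, y)
sample = rng.normal(loc=3.0, scale=1.0, size=(1, 4))
print("malicious" if clf.predict(sample)[0] == 1 else "benign")
```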
DOI: https://doi.org/10.54216/JISIoT.140108
Vol. 14 Issue. 1 PP. 102-113, (2025)
Task scheduling (TS) in fog computing (FC) involves efficiently allocating computing tasks to fog nodes, considering factors such as minimizing execution time, energy consumption, and latency to meet the quality-of-service (QoS) requirements of the Internet of Things (IoT) and edge devices. Efficient TS in FC is crucial for optimizing resource usage, minimizing latency, and ensuring that IoT and edge devices receive timely and high-quality services. The growing complexity of FC environments, along with the dynamic nature of IoT applications, necessitates innovative TS models using metaheuristic algorithms to allocate tasks and meet diverse quality-of-service requirements efficiently. This research introduces the GTO-SSSA (Gorilla Troops Optimization with Skip Salp Swarm Algorithm), a novel model for intelligent TS in FC environments. This model capitalizes on the collaborative nature of the GTO algorithm while incorporating enhanced exploration and exploitation capabilities via the SSSA algorithm's skipping mechanism. The primary objective of GTO-SSSA is to tackle the intricate challenges of TS in FC effectively. This includes the efficient allocation of tasks to fog nodes, considering multiple objectives such as minimizing makespan and execution time while improving throughput. The GTO-SSSA model in FC demonstrates improved efficiency, consistently surpassing the compared models across various task quantities with significantly reduced makespan values. Performance improvement rates for GTO-SSSA over other models show substantial gains in TS efficiency, ranging from 0.87% to 17.83%. The model exhibits scalability as it maintains its efficiency even with an increased number of tasks, aligning with the dynamic nature of IoT applications.
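The makespan objective that GTO-SSSA minimizes can be stated compactly: for a given task-to-node assignment, makespan is the largest node finish time. A sketch under the usual task-length-over-node-speed (MI / MIPS) model, with illustrative numbers:

```python
def makespan(assignment, task_len, node_speed):
    """Largest finish time across fog nodes for a task-to-node assignment."""
    finish = [0.0] * len(node_speed)
    for task, node in enumerate(assignment):
        finish[node] += task_len[task] / node_speed[node]
    return max(finish)

task_len = [400, 250, 900, 300, 650]   # task sizes in million instructions (MI)
node_speed = [500, 1000]               # node speeds in MIPS
print(makespan([0, 1, 1, 0, 1], task_len, node_speed))  # 1.8 time units
```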
DOI: https://doi.org/10.54216/JISIoT.140109
Vol. 14 Issue. 1 PP. 114-128, (2025)
Rapid urbanization requires major cities to change into smart cities, improving our lifestyle with respect to transportation, people, government, environmental sustainability, and more. In recent times, the Internet of Things (IoT) and healthcare wearables have played a vital role in the progress of smart cities by providing enhanced healthcare services and a higher overall standard of living. Wearables offer real-time health records to individuals and healthcare providers, permitting proactive management of chronic conditions and early recognition of health problems. Since sleep is of major importance for a healthy life, forecasting sleep quality is necessary: insufficient sleep affects mental, physical, and emotional health and contributes to many illnesses such as heart disease, insulin resistance, and stress. Recently, deep learning (DL) techniques have been deployed to forecast sleep quality from wearables data gathered during waking hours. Therefore, this paper presents an automated sleep quality recognition using hunger games search optimization with deep learning (ASQR-HGSODL) technique in the IoT-assisted smart healthcare system. The ASQR-HGSODL technique allows IoT devices to perform a data collection process that gathers data related to sleep activity. For feature selection, the ASQR-HGSODL technique applies an arithmetic optimization algorithm (AOA). For the prediction of sleep quality, it implements a convolutional long short-term memory (ConvLSTM) approach. Lastly, the HGSO technique is applied for optimal hyperparameter selection of the ConvLSTM approach. To exhibit the effective prediction results of the ASQR-HGSODL approach, a range of simulations was carried out. The experimental outputs highlight the improved outcomes of the ASQR-HGSODL technique over other DL methodologies.
DOI: https://doi.org/10.54216/JISIoT.140110
Vol. 14 Issue. 1 PP. 129-140, (2025)
Facial emotion recognition (FER) technology for autonomous vehicle drivers can considerably strengthen the efficiency and safety of the driving experience. The system can analyze facial expressions in real time by employing advanced computer vision (CV) techniques, identifying emotions such as stress, fatigue, or distraction. This enables the vehicle to adapt its behavior, triggering interventions or alerts where applicable to alleviate possible threats. Ensuring the emotional well-being of the driver promotes a safer road environment, improving overall road safety and diminishing the possibility of accidents in the era of autonomous vehicles. FER using Deep Learning (DL) is an advanced technique that leverages deep neural networks (DNNs) to automatically interpret and identify emotions from facial expressions. DL algorithms, especially Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs), have attained outstanding results in this field since they learn hierarchical features and temporal dependencies within the data. This research develops a novel Computer Vision with Optimal DL-based Emotion Recognition (CVODL-ER) model for autonomous vehicle drivers. The CVODL-ER method concentrates on the automated classification of various sorts of emotions of autonomous vehicle drivers. To accomplish this, the CVODL-ER technique makes use of the SE-ResNet model for learning intrinsic patterns from the driver's facial images. Besides, the hyperparameter tuning of the SE-ResNet model takes place via a quasi-oppositional Jaya (QO-Jaya) algorithm. For the recognition of driver emotions, the CVODL-ER system executes a deep belief network (DBN) algorithm. The performance analysis of the CVODL-ER technique takes place using a benchmark facial image database, and the obtained results underline its improved efficiency over other models.
DOI: https://doi.org/10.54216/JISIoT.140111
Vol. 14 Issue. 1 PP. 141-153, (2025)
Skin cancer, caused by damaged DNA, is among the three most critical kinds of cancer and can lead to death. The damaged DNA causes cells to grow uncontrollably, and the disease's incidence is currently increasing quickly. There has been considerable research on the computerized examination of malignancy from skin cancer images. However, studying these images is very difficult owing to several troublesome issues such as light reflections on the skin surface, differences in color illumination, lesion sizes, and distinct shapes. As a result, reliable automatic detection of skin cancer is valuable for improving the accuracy and efficiency of pathologists at the early stages. This manuscript develops a Stacked Ensemble Machine Learning based Skin Cancer Detection and Classification (SEML-SKCDC) approach. The presented SEML-SKCDC technique chiefly aims to offer an ensemble of three ML models for skin cancer classification. In the presented SEML-SKCDC technique, median filtering and contrast enhancement are performed at the pre-processing stage. To generate feature vectors, the honey badger algorithm (HBA) with the EfficientNet method is exploited. Finally, an ensemble of k-nearest neighbor (KNN), random forest (RF), and feed-forward neural network (FFNN) approaches is applied for skin cancer classification. The simulation evaluation of the SEML-SKCDC system on a skin cancer database demonstrates its improvements over recent methods.
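The three-model ensemble can be sketched with scikit-learn; note this uses soft voting as a simple stand-in for the paper's stacking scheme, with an MLP playing the FFNN role, synthetic features, and the HBA/EfficientNet extraction omitted.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=32, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("ffnn", MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)),
    ],
    voting="soft",  # average the three models' predicted probabilities
)
ensemble.fit(X_tr, y_tr)
print(f"accuracy = {ensemble.score(X_te, y_te):.3f}")
```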
DOI: https://doi.org/10.54216/JISIoT.140112
Vol. 14 Issue. 1 PP. 154-167, (2025)
The attainment of smart, sustainable energy production is a goal being pursued globally. The agricultural sector faces several challenges, compounded by the climate crisis. In general, renewable energy resources underpin energy production and consumption, so using such sources makes it possible to improve agriculture ecologically and socially. The expansion of renewable energy has given rise to the agrivoltaic system, which combines food production with energy generation. Much current research aims to increase both crop yield and energy production. In this article, we concentrate on intelligent farming in agrivoltaic systems with the help of the Internet of Agricultural Things (IoAT). The work focuses on newer preliminary methods such as fluid dynamic systems, improved photovoltaic (PV) modules, land equivalent ratio analysis, and shading ratio calculation. In the IoAT-based system, crop field analysis, an energy production model, sensor localization, cost optimization, and fault diagnosis processes are addressed, so that effective outcomes are attained in the cultivation of crops such as melon, bean, millet, and cucumber. The parameters calculated in the results analysis are shading ratio and temperature, crop-based analysis, and energy-based analysis. With the help of the IoAT system, both crop yield and electricity production are increased.
DOI: https://doi.org/10.54216/JISIoT.140113
Vol. 14 Issue. 1 PP. 168-184, (2025)
The current work focuses on establishing an enhanced Internet of Things (IoT) model intended to improve sunflower seed output in Uzbekistan. The presented framework involves examination of air quality, soil moisture, temperature, humidity, light intensity, GPS, and a weather station, anticipated to give complete control and monitoring of environmental probes in real time. The main goal is to establish that such an architecture will increase agricultural yields. This is done by simulating a correlation- and regression-based model on secondary data, which shows that the model addresses the problems associated with conventional farming, including conventional approaches to water provision and failure to internalize the conditions within which farming activities occur. Connecting the proposed sensors to an Arduino-based platform allows gathering and analyzing the data essential for farmers to make appropriate decisions. As a result, using the developed framework in selected sunflower fields will enhance output, with a potential yield increase of up to 25%. The results thus show that implementing such an innovative IoT architecture can greatly help farmers increase efficiency, make proper use of resources, and minimize negative environmental effects while contributing to the development of sustainable agriculture. Finally, the study recommends that further studies include more variables in the framework and test it on other crops and in other regions.
DOI: https://doi.org/10.54216/JISIoT.140114
Vol. 14 Issue. 1 PP. 185-195, (2025)
In the recent era of communication technology, flying ad hoc networks (FANETs) are gaining popularity because of their flexibility and broad range of applications for gathering data from environmental sources with limited infrastructure. FANET nodes, or unmanned aerial vehicles (UAVs), are heterogeneous devices, and coordination between the UAVs is an important part of communication given limited battery power sources. In ad hoc networks, devices have limited battery power, so proper battery utilization is critical to maintaining network connectivity. In order to establish a network without congestion, inter-UAV and IoT wireless communication is vital for cooperation and collaboration among many UAVs. UAV connections may experience frequent disconnections, and the limited distance allowed between stations is another obstacle. The routing algorithm selects only the nodes specifically requested by the source node based on its requirements, and maintains the route until the source node no longer needs it. IoT devices have limited processing capability and memory; a single mobile device controls the IoT devices, or users can apply automation to control the functioning of smart IoT devices. This research proposes a fuzzy-based congestion control scheme (MCPFB) to control the congestion between UAVs and IoT devices. UAVs are fast, and IoT devices can collect information from UAVs and forward it to other devices. The UAVs can store limited but sufficient types of information, yet during routing only a single path is available, which causes congestion in the FANET-IoT network. The fuzzy-based load prediction and balancing routing is able to handle this congestion problem. To overcome congestion combined with improper energy utilisation, this paper presents fuzzy rule-based congestion control techniques for a flying ad hoc network, focusing on reducing congestion and improving routing performance in the FANET-IoT network. Packet drops at the nodes indicate congestion in the network, and the possibility of lost connectivity with other nodes is then high. The proposed MCPFB routing shows better performance than the conventional BARS scheme in FANET-IoT.
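As an illustration of fuzzy rule-based congestion estimation in this spirit, the toy sketch below combines triangular memberships over buffer occupancy and residual energy with max-min rules; the actual MCPFB rule base and membership functions are not given in the abstract, so everything here is assumed.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def congestion_risk(buffer_pct, energy_pct):
    buf_high = tri(buffer_pct, 50, 100, 150)  # "buffer is high" degree
    eng_low = tri(energy_pct, -50, 0, 50)     # "energy is low" degree
    # Rule 1: buffer high AND energy low -> high risk (min = fuzzy AND)
    # Rule 2: buffer high alone          -> medium risk
    return max(min(buf_high, eng_low), 0.6 * buf_high)

print(congestion_risk(buffer_pct=85, energy_pct=20))  # high -> avoid this relay
```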
DOI: https://doi.org/10.54216/JISIoT.140115
Vol. 14 Issue. 1 PP. 196-208, (2025)
Loan frauds in India have grown more sophisticated by exploiting financial system vulnerabilities, and online purchasing has exacerbated them. Identity fraud, phoney paperwork, and unclear loan conditions are common. This article looks at how blockchain and IoT could make loans safer, more open, and more efficient, reducing loan fraud. The proposed IoTBlockFin system records all loan events on an independent blockchain network, making the system transparent and preventing dishonest alterations. IoT devices verify borrower identities and property, reducing false claims. An online loan application and smartphone app allow remote loan status checks, which speeds up and simplifies client service. Blockchain's digital safety measures protect sensitive user and transaction data from unauthorised parties, preventing data breaches and illegal access. This comprehensive approach reduces loan fraud and improves financial transactions. IoTBlockFin seeks to fix today's lending process, which could transform India's banking business.
DOI: https://doi.org/10.54216/JISIoT.140116
Vol. 14 Issue. 1 PP. 209-220, (2025)
To extend the lifespan of Wireless Sensor Networks (WSNs), effective routing protocols are required to provide communication channels between the sources and the sink. When nodes are arbitrarily distributed in a substantially unsafe environment, these routing protocols are susceptible to an extensive range of attacks. Trust-based routing protocols, which employ a trusted route rather than the quickest path, are created for WSNs to prevent these attacks. An artificial bee colony (ABC)-based clustering technique is utilized because it reduces the energy usage of nodes relative to conventional clustering algorithms, increasing the lifespan of the sensor network by evenly dividing energy use among all nodes. By integrating diverse sensors and devices, the Internet of Things (IoT) enhances WSN performance through efficient data collection, analysis, and communication. Traditional protocols do not guarantee the best global optimization for lengthening WSN life. Through simulation analysis, the suggested ABC-based Traffic-Aware Energy Efficient Routing (TEER) protocol's performance was evaluated and contrasted with the TEER protocol. Lifetime and active-node analyses of the ABC-based TEER protocol are performed and contrasted with those of other protocols. In terms of the number of rounds, the network performance of the ABC-based TEER scheme is better than that of the TEER scheme. Throughput analysis of the ABC-TEER method reveals a 9.5% performance increase compared with the TEER protocol.
DOI: https://doi.org/10.54216/JISIoT.140117
Vol. 14 Issue. 1 PP. 221-229, (2025)
The article "The Future of Personalized Medicine and How the Healthcare Internet of Things is Reshaping Treatment Plans and Patient Experiences" offers a comprehensive exploration of the transformative landscape of healthcare. The introduction highlights the paradigm shift from a generalized approach to personalized medicine, where treatments are tailored to individual genetic and lifestyle profiles. Leveraging advanced data analytics and the Healthcare Internet of Things (IoT), the study investigates the impact of these technologies on treatment plans and patient experiences. Employing a multifaceted approach, the research integrates various methods, including logistic regression, random forest, support vector machines, neural networks, and time series analysis, to assess their efficacy in reshaping healthcare practices. Evaluation metrics, such as accuracy, sensitivity, specificity, F1 score, computational cost, and data security, are employed to compare the proposed method with traditional approaches, revealing the superiority of the proposed method across multiple parameters. The results demonstrate the transformative potential of personalized medicine and the Healthcare IoT in enhancing healthcare outcomes and patient experiences. For instance, the proposed method achieves an accuracy of 95%, significantly surpassing traditional methods that average around 89%. Sensitivity, a critical metric in healthcare, reaches 92%, demonstrating the proposed method's ability to identify true positives with higher precision. Additionally, the computational cost of the proposed method, at 0.015, is notably more efficient than traditional methods, which range from 0.020 to 0.022. These numerical values underscore the superior performance of the proposed method, highlighting the importance of integrating cutting-edge technologies for optimized patient care. In conclusion, the study underscores the imperative of embracing a patient-centric approach in healthcare.
DOI: https://doi.org/10.54216/JISIoT.140118
Vol. 14 Issue. 1 PP. 230-240, (2025)
The Healthcare Internet of Things (HIoT) is driving a paradigm shift in the healthcare business by providing safe, fast, and networked healthcare solutions. We examined the advantages, disadvantages, and potential future of the Internet of Things (IoT) in the medical industry. Scalability, accuracy, real-time monitoring, data security, and interoperability were among the top priorities. The study employed strict assessment criteria to compare the proposed HIoT technology to existing approaches. This article begins with an overview of the IoT in healthcare and then compares and contrasts the proposed HIoT strategy with more conventional approaches, each with its own benefits and drawbacks. We evaluated the responses using the F1-score, recall, accuracy, and precision. The inquiry uncovered an interesting story: the proposed HIoT method outperformed traditional techniques in all assessment parameters. In terms of accuracy, the recommended solution outperformed "Blockchain Encryption" (8.4) and "Data Validation" (7.9). Additionally, it received an 8.9 for real-time monitoring and an 8.8 for interoperability. Another benefit of the strategy was a reduction in medical errors, as the high data accuracy score of 9.1 demonstrates. The findings illustrate the potential transformation of healthcare delivery through the Internet of Things. According to the study, the proposed strategy might increase healthcare's efficacy, efficiency, and patient-centeredness. The Internet of Things has opened up exciting new opportunities in healthcare, options that may transform medical care and patient outcomes.
DOI: https://doi.org/10.54216/JISIoT.140119
Vol. 14 Issue. 1 PP. 241-251, (2025)
The heading "Wireless Networks in Hospitals: Ensuring Seamless Communication in Critical Situations" examines hospital wireless network enhancement. When patient well-being is at stake, this strategy encourages honest conversation. Service quality, resource efficiency, and network security are crucial. These mathematical models increase hospital wireless network stability based on Internet of Thing (IoT). Service management effectiveness influences who gets vital medical information quickly. Information and crucial messages are delivered faster. A mathematical technique considers the relevance and transmission time of each data payload to estimate its priority factor (P(i)). Network performance determines QoS settings. Priority data is transmitted first to ensure quick delivery to the intended recipients. This technology is essential for updating hospital WiFi networks, especially in critical situations where it can transmit accurate and timely information and save lives. WiFi reliability is essential for building operations. Compare failure frequency and MTBF to assess each network point's reliability. An exponential reliability function determines network dependability. The mean time between failures is used. This method maintains network functionality despite its complexity. Determine which pieces are crucial and how they influences network health. This simplifies network backups and maintenance. Load balancing distributes network tasks among entry points. This strategy helps the network function smoothly and minimize congestion during peak demand. The weighted round-robin timing algorithm determines how busy each access point is to send fresh network traffic to the proper areas. By equally distributing load and prioritizing underutilized access points, this method maintains network stability and keeps critical lines available. These three approaches form a full healthcare WiFi network strengthening plan. Mission-critical data is prioritized, the network is more robust, and resources may be allocated quickly. Our solution often outperforms the existing standard in network stability, communication, and cost.
DOI: https://doi.org/10.54216/JISIoT.140120
Vol. 14 Issue. 1 PP. 252-264, (2025)
Urinary Tract Infections (UTIs) are a prevalent medical condition affecting male felines that can lead to severe discomfort, behavioural changes, and even fatality if not promptly diagnosed and treated. This paper aims to address the limitations of traditional diagnostic methods by integrating Internet of Things (IoT) sensors with machine learning algorithms to predict UTIs in male felines. The study utilizes a multi-modal sensor array to continuously monitor various physiological and behavioural parameters, such as urine pH (acidity), heart rate, territorial marking, and eating habits. Observations were categorized into several states, ranging from normal conditions to severe abnormalities, including death. A machine learning model trained on the collected data identifies early signs of UTIs and demonstrated high predictive accuracy in detecting urinary tract infections before the manifestation of severe symptoms, providing a promising avenue for early intervention. The integrated system offers a non-invasive, real-time monitoring solution that could significantly improve the management and treatment outcomes of UTIs in male felines.
DOI: https://doi.org/10.54216/JISIoT.140121
Vol. 14 Issue. 1 PP. 265-277, (2025)
The rapidly evolving landscape of cyber threats demands robust and adaptive Intrusion Detection Systems (IDS) capable of real-time operation. This paper presents a novel approach to harnessing Field-Programmable Gate Arrays (FPGAs) for the development of a high-performance IDS designed to enhance communication security by rapidly and accurately identifying threats. The proposed system integrates advanced techniques, including Meta Ensemble Learning (MEL), Extreme Gradient Boosting (XGBoost), and a Hybrid Deep Learning (HDL) model that combines Long Short-Term Memory (LSTM) networks for temporal analysis and Convolutional Neural Networks (CNN) for feature extraction. This synergistic approach significantly reduces detection latency and improves the accuracy of threat identification. The effectiveness of the FPGA-based IDS is evaluated using four widely recognized datasets (NSL-KDD, IoTID20, CICIDS2017, and UNSW-NB15), all of which focus on communication attacks, making them ideal for testing IDS performance in diverse IoT environments. The results demonstrate that the proposed IDS not only achieves a high detection rate with a low false positive rate but also operates efficiently in real-time settings, underscoring its viability as a critical security solution in data communication networks. Moreover, the system's exceptional performance in securing IoT devices, which are frequently targeted due to their ubiquity and vulnerabilities, highlights its potential as a reliable and scalable security measure. The FPGA-based IDS offers a significant contribution to the field by providing a rapid, accurate, and real-time security solution that addresses the pressing need for effective threat detection and prevention in modern communication networks.
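A hedged sketch of the CNN-for-features, LSTM-for-temporal-context hybrid on windows of flow features; shapes and layer sizes are assumptions, and FPGA deployment (the paper's focus) is outside the scope of this snippet.

```python
import numpy as np
import tensorflow as tf

X = np.random.rand(256, 20, 8).astype("float32")            # 20 timesteps x 8 features
y = np.random.randint(0, 2, size=(256,)).astype("float32")  # toy attack labels

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20, 8)),
    tf.keras.layers.Conv1D(32, kernel_size=3, activation="relu"),  # local feature extraction
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.LSTM(64),                                      # temporal analysis
    tf.keras.layers.Dense(1, activation="sigmoid"),                # attack probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, verbose=0)
```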
DOI: https://doi.org/10.54216/JISIoT.140122
Vol. 14 Issue. 1 PP. 278-292, (2025)
Landslides constitute a major geologic hazard of serious concern in many parts of the world. The force of soil, rock, or other debris moving down a slope can destroy whatever lies in its path. Landslides happen in an extensive variety of geological and structural settings and geomechanical contexts, and in response to numerous triggering and loading processes; they are frequently related to other major natural disasters such as floods, earthquakes, and volcanic activity, and occasionally strike without noticeable warning. While only some cases have been examined in the past, modern monitoring models are certain to deliver a wealth of novel quantitative observations based on synthetic aperture radar (SAR) and GPS technology for mapping the surface velocity field. This study emphasizes the potential of incorporating advanced machine learning (ML) models with geophysical data to improve landslide prediction and risk management strategies. It develops a Predicting Landslides with Frictional-based Deep Learning using Spider Wasp Optimizer (PLFFDL-SWO) method, whose major intention is robust landslide prediction based on frictional force data. In the presented PLFFDL-SWO model, Z-score normalization is performed to transform the raw data into a compatible format. Then, a long short-term memory (LSTM) model, a type of recurrent neural network (RNN), is utilized to predict landslides from frictional force data; traditional landslide prediction methods often struggle with the temporal dynamics and nonlinear relationships inherent in geophysical data. Finally, the spider wasp optimizer (SWO) algorithm is exploited for optimal hyperparameter adjustment of the LSTM model to improve prediction accuracy. The experimental investigation of the PLFFDL-SWO technique is carried out on a benchmark dataset, and the simulation outcomes report the supremacy of the PLFFDL-SWO technique under different measures.
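As a sketch of the preprocessing-plus-prediction pipeline described above: Z-score normalisation of a force-like series, sliding windows, and a small LSTM regressor. The data is synthetic and the SWO hyperparameter search is omitted.

```python
import numpy as np
import tensorflow as tf

series = np.cumsum(np.random.randn(500)).astype("float32")  # stand-in force signal
z = (series - series.mean()) / series.std()                 # Z-score normalisation

win = 20
X = np.stack([z[i:i + win] for i in range(len(z) - win)])[..., None]
y = z[win:]  # next-step targets

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(win, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),  # predicted next normalised value
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, verbose=0)
```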
DOI: https://doi.org/10.54216/JISIoT.140123
Vol. 14 Issue. 1 PP. 293-304, (2024)