A wireless body sensor network (BSN) comprises wearables with different sensing, processing, storage, and communication capabilities. Once several devices acquire data, multi-sensor fusion is needed to transform error-prone sensor readings into high-quality fused data. Deep learning (DL) approaches are utilized in different application domains, including e-health, for applications such as activity detection and disease prediction. Recent work has demonstrated that combining classification techniques with feature selection (FS) approaches enhances accuracy. This article develops a Multi-sensor Data Fusion based Medical Data Classification Model using Gorilla Troops Optimization with Deep Learning (MDFMDC-GTODL). The proposed MDFMDC-GTODL method collects daily activity data from different sensors, which are then fused to produce high-quality activity data. In addition, the MDFMDC-GTODL technique applies an optimized attention-based bidirectional long short-term memory (ABLSTM) model for heart disease prediction. A Gorilla Troops Optimization Algorithm based FS (GTOA-FS) technique is employed to improve classification performance. The simulation outcomes of the MDFMDC-GTODL technique are validated and examined from different perspectives. A wide-ranging simulation analysis demonstrated the superior performance of the MDFMDC-GTODL method over other compared approaches.
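As a rough illustration of the classification stage, the sketch below builds a minimal attention-based bidirectional LSTM in Keras. The abstract does not specify the ABLSTM architecture or hyperparameters, so the layer sizes and the simple additive attention used here are placeholder assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_ablstm(timesteps: int, n_features: int, n_classes: int) -> tf.keras.Model:
    """Attention-based bidirectional LSTM over fused sensor sequences."""
    inp = layers.Input(shape=(timesteps, n_features))      # fused activity data
    h = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(inp)
    # Additive attention: score each timestep, then pool a weighted sum.
    scores = layers.Dense(1, activation="tanh")(h)         # (batch, T, 1)
    weights = layers.Softmax(axis=1)(scores)               # attention over time
    context = layers.Lambda(
        lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([h, weights])
    out = layers.Dense(n_classes, activation="softmax")(context)
    model = tf.keras.Model(inp, out)
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```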
Doi: https://doi.org/10.54216/FPA.150101
Vol. 15 Issue. 1 PP. 08-18, (2024)
Electrical load forecasting is a key aspect of power system governance, operation, and scheduling. Accurate forecasts give energy suppliers the information needed to control running costs and optimize power system performance. This work presents an in-depth study of time series forecasting models for predicting electrical consumption. The primary goal is to assess the performance of three forecasting models: a deep LSTM variant, long short-term memory (LSTM) neural networks, and the seasonal autoregressive integrated moving average (SARIMA) model. The main task is to evaluate the models' precision in predicting daily energy consumption from historical demand data, holiday data, and other time-related features. Model performance is assessed using the Mean Absolute Percentage Error (MAPE; see the sketch below). The method covers feature engineering, data preparation, model selection, and assessment. The resulting MAPE values show that SARIMA performed relatively inaccurately, while LSTM and deep LSTM improved significantly, obtaining strong MAPEs of 7.5% and 7.45%, respectively. Notably, the deep LSTM variant outperformed the other models, particularly in capturing temporal relationships. This study contributes to the field of energy forecasting by showing the applicability of LSTM- and SARIMA-based models for accurate consumption forecasting, and it highlights how deeper LSTM networks can improve prediction accuracy when complex patterns and long-range dependencies are a concern. Such results are valuable to utility companies, grid operators, and policymakers seeking to harness energy resources, cut costs, and ensure a continuous supply of electricity.
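For reference, the evaluation metric named above can be computed as follows; this is a minimal sketch with illustrative variable names, not the paper's code:

```python
import numpy as np

def mape(y_true, y_pred) -> float:
    """Mean Absolute Percentage Error, in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mask = y_true != 0  # avoid division by zero on zero-demand points
    return 100.0 * np.mean(np.abs((y_true[mask] - y_pred[mask]) / y_true[mask]))

# Predictions that are each 5% off yield a MAPE of 5.0.
print(mape([100, 200, 400], [95, 210, 380]))  # 5.0
```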
Doi: https://doi.org/10.54216/FPA.150102
Vol. 15 Issue. 1 PP. 19-31, (2024)
Cryptography is a well-known technology for providing confidential data transfer via asymmetric or symmetric algorithms with public or private keys. Secure data transmission over networks using unreliable, untrusted channels is made achievable by cryptography. As a result of the quick digital transition, network traffic is rapidly rising, and consumers remain constantly connected and accessible online. Attacks such as tampering with, spoofing, and tracking data through unauthorized access are widespread on the internet. Many cryptographic algorithms already exist, but they need to be continually improved and optimized for better performance within the constraints imposed by new technology and a wide variety of application domains. To overcome these limitations, we propose a novel FishyCurve Cipher technique that combines an elliptic curve-based algorithm (ECA) with the Threefish cipher algorithm (TCA) to improve cipher security and performance: the data is encrypted using the TCA, and the key is secured by the ECA. To verify data integrity, a digital signature algorithm (DSA) is employed. To evaluate the effectiveness of the proposed FishyCurve Cipher technique, comprehensive experimental tests have been conducted. The results clearly demonstrate its superiority in cipher security when compared to traditional encryption algorithms. Its outstanding resilience against a wide range of attacks makes it a strong method of securing resilient infrastructure against malicious actors who seek to compromise data confidentiality and integrity.
Doi: https://doi.org/10.54216/FPA.150103
Vol. 15 Issue. 1 PP. 32-44, (2024)
Nowadays, intelligent information technology can implement high-level information processing and decision-making activities that can support the risk assessment of autonomous ships. Risk assessment is a critical process for deploying autonomous ships, ensuring these innovative vessels' safe and efficient operation. There is a need to identify, analyze, and mitigate potential risks associated with system reliability, collision avoidance, cybersecurity, environmental conditions, human interaction, regulatory compliance, sensor performance, data integrity, emergency response, and testing and validation. This work provides an overview of the essential considerations and objectives of risk assessment in autonomous ships. We used a multi-criteria decision-making model to deal with the various criteria. The Ranking of Alternatives through Functional mapping of Criterion Sub-Intervals into a Single Interval (RAFSI) method is applied to rank the alternatives. Ten criteria and twenty alternatives were used in this study. The results show that the proposed framework can provide a comprehensive risk assessment that enables stakeholders to gain insights into the potential hazards and vulnerabilities unique to autonomous ships.
Doi: https://doi.org/10.54216/FPA.150104
Vol. 15 Issue. 1 PP. 45-58, (2024)
The advent of the gig economy has triggered an unprecedented transformation in labor markets worldwide. Leveraging an intricate network analysis, this paper aims to delve into the multi-layered complexities of labor market metamorphosis within the context of a digital gig economy. We construct a bipartite labor-market network model that allows us to explore the nexus between gig workers and employment platforms using a robust set of parameters – connectivity, centrality, and clustering coefficient. Consequently, our empirical investigation elucidates how traditional labor market paradigms are being disrupted, engendering the emergence of new socio-economic stratifications. The results unveil a counterintuitive network structure where high centrality does not necessarily correlate with enhanced economic benefits for gig workers. Moreover, the findings underscore the potential pitfalls of a skewed clustering coefficient, manifesting as increased vulnerability to systemic shocks. The ubiquity of digital technology has engendered a seismic shift in economic frameworks, predominantly by initiating the concept of the gig economy. Although a plethora of research has been conducted on the gig economy from various disciplinary vantage points, limited endeavors have been undertaken to explore the intricacies of labor market changes via a network analysis paradigm. As a result, this study provides vital insights for policymakers, platform operators, and labor market participants, promoting a nuanced understanding of the gig economy’s implications for labor market architecture.
Doi: https://doi.org/10.54216/FPA.150105
Vol. 15 Issue. 1 PP. 59-65, (2024)
This paper analyzes how the fusion of digital technologies develops the activities of small business entities and supports the social and economic development of Uzbekistan, along with their significant role in the country's economy. The economic, social, and legal conditions under which small business entities in Uzbekistan organize their activities through digital technologies were determined. Five directions of economic and social support were analyzed in light of current policy, and the advantage that the digital economy gives small business entities over large enterprises was determined. The research employs a confluence of descriptive statistics, panel data regression models, and time-series analysis to unravel the intricate correlation matrix that binds various dimensions of investment outcomes within the country's distinct economic climate. Conclusions were drawn from the study of the main economic indicators of small business development through digital technologies. The effect of digitalization on the activities of small business entities was estimated using the Cobb-Douglas production function (see the specification below). Proposals and recommendations were developed according to the forecasting results.
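For reference, a standard two-input Cobb-Douglas specification, which such an assessment would typically log-linearize for estimation (the abstract does not state the exact functional form used), is:

$$ Y = A K^{\alpha} L^{\beta} \quad\Longrightarrow\quad \ln Y = \ln A + \alpha \ln K + \beta \ln L + \varepsilon, $$

where $Y$ is output, $K$ capital, $L$ labor, and $A$ total factor productivity; a digitalization index $D$ can then enter the log-linear regression as an additional term $\gamma D$ to measure the effect of digital technologies.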
Doi: https://doi.org/10.54216/FPA.150106
Vol. 15 Issue. 1 PP. 66-77, (2024)
Protecting Software-Defined Networking (SDN) environments from intrusions and unauthorized access requires a high level of security. The widespread adoption of SDN has raised security issues, especially intrusions that can disrupt network operations through unauthorized access. Intrusion is a danger to an SDN architecture's security, efficacy, and dependability because it involves manipulation or disruption of the network. To improve SDN security through Intrusion Detection Systems (IDS), this study proposes a novel approach that combines Graph Convolutional Networks (GCN) and Deep Reinforcement Learning (DRL). Evaluated on the NSL-KDD dataset, the approach shows enhanced intrusion detection performance: accuracy of 93.8%, recall of 93%, F1-score of 92%, and precision of 94.2%. This work establishes the groundwork for infrastructure that is resilient against threats and advances the security posture of SDN environments.
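For context, the standard GCN layer that such an approach would typically build on propagates node features as

$$ H^{(l+1)} = \sigma\left( \hat{D}^{-1/2} \hat{A} \hat{D}^{-1/2} H^{(l)} W^{(l)} \right), \qquad \hat{A} = A + I, $$

where $A$ is the adjacency matrix of the traffic graph, $\hat{D}$ its degree matrix, $H^{(l)}$ the node features at layer $l$, $W^{(l)}$ a learnable weight matrix, and $\sigma$ a nonlinearity. The abstract does not give the exact architecture; this is the common formulation.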
Doi: https://doi.org/10.54216/FPA.150107
Vol. 15 Issue. 1 PP. 78-87, (2024)
Accurate classification of malignant and benign skin lesions is crucial in dermatology. In this research, we propose a robust image analysis methodology for skin lesion classification that integrates color-based segmentation with luminosity analysis. Our approach is evaluated on a dataset of 400 skin images, with equal representation of malignant and benign samples. By computing mean color values for the Red Channel Color (RCC), Green Channel Color (GCC), and Blue Channel Color (BCC) in groups of 10 samples, we establish a classification range for precise diagnosis. The research also introduces a novel dimension by harnessing the CIE Lab color space as the most reliable scale for distinguishing between benign and malignant samples. The smaller and more consistent color ranges observed in the luminosity analysis improve contrast and visibility, thereby facilitating superior lesion discrimination. By highlighting the significance of the mean histograms for each color channel, this comprehensive research contributes to advancing the field of dermatology and presents an innovative methodology that holds promise for computer-aided diagnosis systems in skin cancer detection.
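A minimal sketch of the channel-mean feature extraction described above, assuming OpenCV-style BGR images; the classification ranges shown are placeholders, not the values derived in the paper:

```python
import cv2
import numpy as np

def channel_means(path: str):
    """Return mean red, green, and blue values (RCC, GCC, BCC) of an image."""
    img = cv2.imread(path).astype(np.float64)  # OpenCV loads images as BGR
    bcc, gcc, rcc = (img[..., c].mean() for c in range(3))
    return rcc, gcc, bcc

def classify(rcc, gcc, bcc,
             benign_range=((140, 200), (100, 170), (90, 160))):
    """Toy range-based rule; real ranges must be derived from labeled data."""
    in_range = all(lo <= v <= hi
                   for v, (lo, hi) in zip((rcc, gcc, bcc), benign_range))
    return "benign" if in_range else "malignant"
```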
Doi: https://doi.org/10.54216/FPA.150108
Vol. 15 Issue. 1 PP. 88-97, (2024)
The increasing adoption of cloud computing in healthcare presents immense opportunities for disease prediction while raising critical privacy concerns. This study proposes a novel privacy-preserving scheme that leverages advanced cryptographic techniques, blockchain technology, and a deep learning approach within a cloud platform to ensure secure data handling and accurate disease prediction. The proposed methodology encompasses authentication, encryption, blockchain-based transmission, and a deep learning-based heart disease prediction system (HDPS). Through rigorous authentication protocols and two-level security mechanisms, patient data is securely encrypted using RSA and Blowfish encryption before storage in the cloud. Blockchain technology facilitates secure data transmission, ensuring integrity and traceability. At the receiver end, data decryption precedes input into the HDPS, which comprises artificial neural networks (ANN), convolutional neural networks (CNN), and recurrent neural networks (RNN). The HDPS incorporates data preprocessing, feature extraction, feature selection, and a deep learning-based prediction model, achieving remarkable accuracy (0.9941) in heart disease prediction. Implemented in MATLAB, this approach offers a robust framework for privacy-preserving heart disease prediction in cloud-based healthcare systems.
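One common way to combine RSA and Blowfish as the abstract describes is a hybrid scheme: a random Blowfish session key encrypts the bulk data, and RSA protects that key. The sketch below uses the pycryptodome library to illustrate the pattern; the paper's MATLAB implementation and parameter choices will differ.

```python
from Crypto.Cipher import Blowfish, PKCS1_OAEP
from Crypto.PublicKey import RSA
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad

def hybrid_encrypt(record: bytes, rsa_pub):
    """Encrypt data with Blowfish; wrap the Blowfish key with RSA-OAEP."""
    session_key = get_random_bytes(16)                   # Blowfish session key
    cipher = Blowfish.new(session_key, Blowfish.MODE_CBC)
    ct = cipher.iv + cipher.encrypt(pad(record, Blowfish.block_size))
    wrapped_key = PKCS1_OAEP.new(rsa_pub).encrypt(session_key)
    return wrapped_key, ct                               # store both in the cloud

rsa_key = RSA.generate(2048)                             # receiver's key pair
wrapped, ciphertext = hybrid_encrypt(b"patient vitals ...", rsa_key.publickey())
```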
Doi: https://doi.org/10.54216/FPA.150109
Vol. 15 Issue. 1 PP. 98-119, (2024)
Autism spectrum disorder (ASD) is a neurological and developmental condition impacting individuals' interactions with others, communication, learning, and behavior. While autism can be identified at any point in life, it is characterized as a "developmental disorder" due to the typical onset of symptoms within the initial two years of life. As individuals with ASD transition from childhood to adolescence and young adulthood, they may face challenges in forming and maintaining friendships, communicating with both peers and adults, and understanding the behaviors expected in education or work. The current study introduces a novel approach for suggesting the right behavioral strategy to assist individuals with autism spectrum disorder, with the help of supervised BERT (Bidirectional Encoder Representations from Transformers). Using BERT to predict the right behavioral trait, our model achieved an accuracy of 88%. This research demonstrates cost-effectiveness and efficiency in offering recommendations for ASD, making it suitable for applications requiring near real-time outcomes.
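As a rough sketch of the supervised BERT classification step, the snippet below uses the Hugging Face transformers library. The base checkpoint, the number of strategy labels, and the sample text are all illustrative assumptions; the paper's fine-tuned model is not described in the abstract.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-uncased"      # placeholder; the paper's checkpoint differs
num_strategies = 5                    # hypothetical number of behavioral strategies
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=num_strategies)  # head must be fine-tuned on labeled data

text = "The child avoids eye contact and repeats phrases during group play."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
strategy_id = logits.argmax(dim=-1).item()   # index of the suggested strategy
```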
Doi: https://doi.org/10.54216/FPA.150110
Vol. 15 Issue. 1 PP. 120-127, (2024)
The analysis of sentiment in product reviews across diverse platforms such as e-commerce websites and social media is a challenging task due to the inherent differences in user behavior and review formats. This research introduces an innovative methodology for detecting positive and negative deviations in cross-domain product reviews using Adaptive Stochastic Deep Networks (ASDN) tailored for multi-platform sentiment analysis. ASDNs possess mechanisms that enable dynamic adaptation to changes in data distributions, domain shifts, or varying complexities within the input data. The proposed framework aims to capture subtle variations in sentiment expression across disparate platforms by incorporating adaptive stochasticity within deep neural networks. By adapting dynamically to the review styles, language use, and sentiment patterns unique to each platform, the ASDN architecture facilitates the identification of nuanced sentiment shifts. Through extensive experimentation on comprehensive datasets spanning Amazon, Facebook, and Instagram, the efficacy of the ASDN model in detecting positive and negative sentiment deviations across diverse platforms is demonstrated. This research contributes to advancing the understanding of sentiment dynamics across distinct social platforms and e-commerce sites, paving the way for more robust and adaptable models in cross-domain sentiment analysis.
Doi: https://doi.org/10.54216/FPA.150111
Vol. 15 Issue. 1 PP. 128-143, (2024)
The emergence of low-cost RGB-D cameras (color cameras paired with depth sensors) has significantly impacted various computer vision tasks. However, these cameras often produce depth maps with limited object detail, noise, and missing information. These limitations can adversely affect the quality of 3D reconstruction and the accuracy of camera trajectory estimation. Additionally, existing depth refinement methods struggle to distinguish shape from complex albedo, leading to visible artifacts in the refined depth maps. In this paper, we address these challenges by proposing two novel methods based on the theory of photometric stereo. The first method, the RGB ratio model, tackles the nonlinearity problem present in previous approaches and provides a closed-form solution. The second method, the robust multi-light model, overcomes the limitations of existing depth refinement methods by accurately estimating shape from imperfect depth data without relying on regularization. Furthermore, we demonstrate the effectiveness of combining these methods with image super-resolution to obtain high-quality, high-resolution depth maps. Through quantitative and qualitative experiments, we validate the robustness and effectiveness of our techniques in improving shape reconstruction for RGB-D cameras.
Doi: https://doi.org/10.54216/FPA.150112
Vol. 15 Issue. 1 PP. 144-156, (2024)
Optimizing system performance in dynamic and heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in some depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies finish time and deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across different workloads was carried out. Results from the experiments reveal distinct patterns in algorithmic behavior by workload. In the 15-task, 5-node scenario, the GA and PSO algorithms outclass all others, completing 100 percent of tasks before their deadlines, while Task 5 proved problematic for the ACO algorithm. Likewise, with 10 tasks and 5 nodes, GA and PSO completed 100 percent of tasks before their deadlines, while the ACO algorithm stumbled on certain tasks. Building on these findings, the study proposes a more extensive system that adapts its algorithmic approach to workload characteristics. The system offers an integrated approach to the ill-structured problem of task scheduling and resource allocation: an intelligent scheduling scheme that runs asynchronously when a large number of tasks is submitted and dynamically aborts tasks whenever system load and utilization rise excessively. The proposed design is a full-fledged solution to task scheduling and resource allocation issues, detailing how algorithms are chosen from workload features with flexibility in mind. The quantifiable statistical results from the experiments empirically demonstrate how each algorithm performed under the various settings.
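To make the comparison concrete, here is a minimal simulated annealing sketch for assigning tasks to nodes; it minimizes makespan rather than the paper's exact finish-time and deadline metrics, and all parameters are illustrative:

```python
import math
import random

def makespan(assign, task_len, node_speed):
    """Finish time of the busiest node under a task-to-node assignment."""
    load = [0.0] * len(node_speed)
    for task, node in enumerate(assign):
        load[node] += task_len[task] / node_speed[node]
    return max(load)

def simulated_annealing(task_len, node_speed, iters=5000, temp=1.0, cool=0.999):
    assign = [random.randrange(len(node_speed)) for _ in task_len]
    cost = makespan(assign, task_len, node_speed)
    best, best_cost = assign[:], cost
    for _ in range(iters):
        cand = assign[:]
        cand[random.randrange(len(cand))] = random.randrange(len(node_speed))
        c = makespan(cand, task_len, node_speed)
        # Accept improvements always; accept worse moves with falling probability.
        if c < cost or random.random() < math.exp((cost - c) / temp):
            assign, cost = cand, c
            if c < best_cost:
                best, best_cost = cand[:], c
        temp *= cool
    return best, best_cost

# Example: 15 tasks on 5 nodes (the paper's larger scenario).
tasks = [random.uniform(1, 10) for _ in range(15)]
nodes = [1.0, 1.0, 1.5, 2.0, 2.0]
print(simulated_annealing(tasks, nodes))
```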
Doi: https://doi.org/10.54216/FPA.150113
Vol. 15 Issue. 1 PP. 157-179, (2024)
Adversarial machine learning approaches are modelled to provide a defence mechanism for predicting cloning and jamming attacks launched against the wireless communication process. The transmitter is supplied with a pre-trained classifier that analyzes the status of the channel based on its sensing behaviour and determines the subsequent transmission process. The learning method gathers all acknowledgements, the fusion performed between nodes, and the channel's current state to build a learning model that can accurately identify the succeeding transmission constraint caused by network jamming. Compared with random jamming procedures, the proposed anti-clone detection strategy aims to minimize the number of clones and jamming instances found throughout the network model. The transmitter analyzes the power restrictions over the sensor networks using a learning-based Fisher score (FS). Here, an adversarial network model (ANM-FS) is fused in to reduce the computational time needed to collect the training dataset by examining the incoming samples. With this defence mechanism, the transmitter aims to estimate the false prediction rate (FPR) and design a better model for providing a reliable classifier. Systematically, the transmitter identifies attacks circulating over the network model and adopts the defence mechanism to mislead the injected clone, enhancing throughput and reducing the prediction error.
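For reference, the Fisher score used here for feature selection is commonly defined per feature $j$ over $K$ classes as (the abstract does not state the exact variant used):

$$ F(j) = \frac{\sum_{k=1}^{K} n_k \left( \mu_j^{k} - \mu_j \right)^{2}}{\sum_{k=1}^{K} n_k \left( \sigma_j^{k} \right)^{2}}, $$

where $n_k$ is the number of samples in class $k$, $\mu_j^k$ and $\sigma_j^k$ are the mean and standard deviation of feature $j$ within class $k$, and $\mu_j$ is its overall mean; features with larger $F(j)$ separate the classes better.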
Doi: https://doi.org/10.54216/FPA.150114
Vol. 15 Issue. 1 PP. 180-195, (2024)
Depression, or major depressive disorder, is a serious and common medical condition that affects people worldwide. It negatively affects a person's feelings, thoughts, and actions, and causes a loss of interest in activities the person once enjoyed. It can lead to physical and emotional problems that hamper daily activities at work and at home. In recent years, much research has been done to identify depression from image, speech, and text modalities using artificial intelligence. Social media is an important medium where many discussions and mentions of depression occur. The current study proposes a novel approach to understand how the depressed and non-depressed communicate differently, using topic modeling with Latent Dirichlet Allocation (LDA), and also to detect depression with the Robustly Optimized BERT Pretraining Approach (RoBERTa). The study achieved an accuracy of 66.4% for the depression detection model, which outperformed previous approaches with similar methodology. The current study can help with the self-diagnosis of signs of depression at very early stages.
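A minimal sketch of the LDA topic-modeling step using scikit-learn; the example posts and the topic count are illustrative placeholders, not the study's data or settings:

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

posts = [
    "can't sleep, everything feels pointless lately",
    "great run this morning, training for the marathon",
    # ... posts from both depressed and non-depressed users
]

vectorizer = CountVectorizer(stop_words="english", max_features=5000)
X = vectorizer.fit_transform(posts)

lda = LatentDirichletAllocation(n_components=10, random_state=0)
doc_topics = lda.fit_transform(X)    # per-post topic mixtures

# The top words per topic show how the two groups talk differently.
terms = vectorizer.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[-8:][::-1]]
    print(f"topic {k}:", ", ".join(top))
```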
Doi: https://doi.org/10.54216/FPA.150115
Vol. 15 Issue. 1 PP. 196-204, (2024)
The integration of Information and Communication Technologies (ICT) into notarial activities has revolutionized the way procedures are processed by significantly enhancing speed and legal security, which are key aspects for user satisfaction. This shift responds to the growing demand for fast and secure notarial services, where efficiency and legal protection are priorities. Through the analysis conducted with the neutrosophic RAFSI method, risks derived from digitalization have been identified and classified, proposing effective solutions for their mitigation. Among these, the need to update regulations to adapt them to the digital context and the importance of training notaries in digital competencies and cyber security stand out. These measures are focused not only on streamlining notarial procedures but also on reinforcing trust in notarial services, marking a significant advance toward the modernization of notarial practice in the digital era. In conclusion, the fusion of ICT with notarial activities, supported by risk control and supervision, has effectively balanced service speed with legal security, meeting the current expectations of users.
Doi: https://doi.org/10.54216/FPA.150116
Vol. 15 Issue. 1 PP. 205-213, (2024)
This work investigates the use of ensemble machine-learning algorithms to optimize loop tiling in computing systems, with the goal of improving performance by predicting optimal tile sizes. It compares two approaches: independent training with prediction averaging (soft voting) and an ensemble technique (hard voting), both employing models such as linear regression, ridge regression, and random forests. Experiments on an Intel Core i7-8565U CPU with several benchmark programs revealed that the hard-voting ensemble approach beat the soft-voting technique, providing more dependable and accurate predictions across a range of computing environments. The hard-voting technique reduced execution time by around 87.5% for dynamic features and 89.89% for static features, whereas the soft-voting approach showed an average reduction of 75.45% for dynamic features and 78.13% for static features. This work demonstrates the effectiveness of hard-voting ensemble machine learning approaches in improving cache efficiency and total execution time, opening the way for future advances in high-performance computing settings.
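A minimal sketch of the hard-voting idea with scikit-learn. Since sklearn's hard voting applies to classifiers, candidate tile sizes are treated here as discrete classes, with classifier analogues of the regression models named above; the features and data are synthetic placeholders:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression, RidgeClassifier

# Synthetic stand-in: features describing a loop nest (trip counts, strides,
# cache parameters, ...) with candidate tile sizes as labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = rng.choice([16, 32, 64, 128], size=200)

hard_vote = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),   # analogue of linear regression
        ("ridge", RidgeClassifier()),                # analogue of ridge regression
        ("rf", RandomForestClassifier(n_estimators=200)),
    ],
    voting="hard",                                   # majority vote on tile size
)
hard_vote.fit(X, y)
print(hard_vote.predict(X[:1]))                      # predicted tile size
```

For the averaging (soft-voting) variant with true regressors, sklearn's VotingRegressor averages the models' tile-size predictions instead.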
Doi: https://doi.org/10.54216/FPA.150117
Vol. 15 Issue. 1 PP. 214-226, (2024)
The surge of Venezuelan migration has left indelible marks on various regions, notably within Babahoyo Canton, presenting both challenges and opportunities for local communities. This study delves into the socio-economic impacts of Venezuelan migration on Babahoyo throughout 2023, employing a sophisticated blend of compensatory fuzzy logic and information fusion techniques. These methodologies offer a nuanced exploration of the migration's effects, capturing the complex interplay between local perceptions, labor market fluctuations, and broader economic dynamics. The findings underscore the critical need for comprehensive integration strategies that not only facilitate the socio-cultural adaptation of migrants but also leverage local public policies to mitigate adverse impacts while maximizing potential benefits. Ultimately, this research aims to illuminate pathways for informed decision-making and policy development, ensuring that responses to Venezuelan migration in Babahoyo are both effective and empathetic, thus fostering a more integrated and resilient community.
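The abstract does not give the compensatory operators used; a common formulation of compensatory fuzzy logic bases conjunction on the geometric mean, so that a weak criterion can be partially offset by strong ones. A minimal sketch under that assumption:

```python
import math

def cfl_and(*truth_values: float) -> float:
    """Compensatory conjunction: geometric mean of membership degrees."""
    n = len(truth_values)
    return math.prod(truth_values) ** (1.0 / n)

def cfl_or(*truth_values: float) -> float:
    """Dual disjunction: complement of the geometric mean of complements."""
    n = len(truth_values)
    return 1.0 - math.prod(1.0 - v for v in truth_values) ** (1.0 / n)

# Unlike min(0.9, 0.4) = 0.4, the compensatory AND yields 0.6:
print(cfl_and(0.9, 0.4))  # 0.6
```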
Doi: https://doi.org/10.54216/FPA.150118
Vol. 15 Issue. 1 PP. 227-237, (2024)
Data management is developing rapidly, and solutions are needed that can handle massive volumes of diverse data, especially for cloud-based data fusion and global network designs. Our research offers a fresh solution in which each formula applied improves the system. The fundamental strategy for data integration and management is standardizing, matching, translating, and merging data from several sources. We found that this alternative is superior to standard data management systems in scalability, speed, consistency, security, accuracy of data integration, and cost-effectiveness. The visual presentation of the data reinforces the method's advantages and shows its potential. This research proves that the technique works and illustrates how it may be used to advance the field. Supporting today's sophisticated data systems is a major advance: the result is a solid, scalable data management solution that can evolve.
Doi: https://doi.org/10.54216/FPA.150119
Vol. 15 Issue. 1 PP. 238-249, (2024)
In this work, we rethink the linguistic learning process in scene text recognition and abandon the widely adopted, complex language model. We present a Visual Language Modeling Network (VisionLAN), which treats visual and linguistic information as a union by directly endowing the vision model with language capability, rather than prior strategies that handle visual and semantic information in two separate structures. Specifically, we introduce character-wise occluded feature maps for text recognition in the training stage. When visual cues (such as occlusion, noise, and so on) are confused, this operation guides the vision model to use both the visual texture of the characters and the linguistic information in the visual context for recognition. To improve the performance of visual language models devoted to object identification and recognition in irregular scene images, this work investigates the critical role that context plays. Characterized by intricate and ever-changing visual components, irregular scenes pose distinct challenges for conventional computer vision systems.
Doi: https://doi.org/10.54216/FPA.150120
Vol. 15 Issue. 1 PP. 250-261, (2024)