Accurate forecasting of future electricity consumption is necessary for the satisfactory design of an electricity distribution system. To enhance forecasting accuracy, the autoregressive integrated moving average (ARIMA) model was compared with a hybrid of ensemble empirical mode decomposition and ARIMA, denoted EEMD+ARIMA, to determine which model performs better. Historical US monthly electricity consumption from December 2000 to September 2022 was used, divided into a training set (90%) and a testing set (10%) to ensure model accuracy. The root mean square error (RMSE), mean absolute error (MAE), mean percentage error (MPE), mean absolute percentage error (MAPE), and mean absolute scaled error (MASE) were used to assess the performance of ARIMA and the hybrid EEMD+ARIMA; the results show that the hybrid EEMD+ARIMA outperforms the ARIMA model, attaining the lowest RMSE, MAE, MPE, MAPE, and MASE. The Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) were also applied for model selection, and both were lower for EEMD+ARIMA than for ARIMA, indicating that EEMD+ARIMA is better than the single ARIMA at forecasting electricity consumption. The study concludes that the hybrid EEMD+ARIMA provides more accurate forecasts and performs significantly better than ARIMA in forecasting electricity consumption.
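As a rough illustration of this hybrid scheme, the sketch below decomposes a series with EEMD and fits one ARIMA per intrinsic mode function (IMF), summing the component forecasts. The PyEMD (EMD-signal) and statsmodels libraries and the fixed (1, 1, 1) order are illustrative assumptions, not the authors' exact configuration; in practice each IMF would get its own order, chosen by AIC/BIC as in the abstract's model-selection step.

```python
import numpy as np
from PyEMD import EEMD                      # pip install EMD-signal
from statsmodels.tsa.arima.model import ARIMA

def eemd_arima_forecast(y, steps=12, order=(1, 1, 1)):
    """Decompose y into IMFs with EEMD, fit one ARIMA per IMF,
    and sum the per-component forecasts (a hedged sketch)."""
    imfs = EEMD().eemd(np.asarray(y, dtype=float))
    forecast = np.zeros(steps)
    for imf in imfs:                        # each IMF is smoother and easier to model
        fit = ARIMA(imf, order=order).fit()
        forecast += fit.forecast(steps=steps)
    return forecast
```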
DOI: https://doi.org/10.54216/FPA.140101
Vol. 14 Issue. 1 PP. 08-18, (2024)
Due to the very high direct and indirect costs of fraud, banks and financial institutions seek to accelerate the recognition of fraudulent activity, since doing so directly affects customer service, reduces operating costs, and preserves their standing as reliable and valid financial service providers. In recent years, with the development of information and communication technology, electronic banking has become very popular, making fraud detection techniques indispensable for preventing fraudulent actions in banking systems, especially electronic banking systems. In this paper, a method is developed that improves fraud detection in information security and cyber defense systems. The main purpose of fraud detection systems is to predict and detect false financial transactions and to improve the intrusion detection system through information classification. To this end, the genetic algorithm, a well-known stochastic optimization method, is used. Finally, the results of the genetic algorithm are compared with those of decision tree classification and regression trees. The simulation results show the effectiveness and superiority of the proposed method.
DOI: https://doi.org/10.54216/FPA.140102
Vol. 14 Issue. 1 PP. 19-27, (2024)
Due to advances in technology, various fields have seen the development of systems that improve people's quality of life, contributing to community welfare by providing relevant and pertinent information for decision-making. On the Internet of Things (IoT), such systems must measure and monitor several environmental variables. The heterogeneity of the captured data and of the measuring instruments hinders interoperability among the different components of the IoT. These problems have raised interest in methods and tools that accommodate heterogeneous data from sensors, measurements, and measuring devices. Some existing tools have resolved parts of these interoperability problems; however, they force IoT developers to use sensors from specific brands, limiting their generalized use in the community. Furthermore, the challenge of integrating different protocols within the same IoT project remains, and such systems should also support daily decision-making by generating alerts from the data the sensors provide. To overcome these limitations, a framework is needed that builds a software-based sensor network able to communicate over multiple protocols in a specific environment, monitor air quality, and alert users. This paper presents a prototype of the proposed architecture, together with the hardware, software, and APIs used to gather data systematically so that users can visualize it in a semantic view. The data are then visualized using Matplotlib and Seaborn, tools common in Machine Learning (ML) and Deep Learning (DL) workflows, to plot temperature and humidity over a historical span. The results show that the accuracy obtained with a Machine Learning classifier is 87% in the context of weather prediction.
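A small, hypothetical sketch of the visualization step mentioned above: plotting a historical span of temperature and humidity readings with Matplotlib and Seaborn. The sensor_log.csv file and its column names are placeholders, not artifacts of the actual prototype.

```python
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

# hypothetical log exported by the sensor network
readings = pd.read_csv("sensor_log.csv", parse_dates=["timestamp"])
long_form = readings.melt(id_vars="timestamp",
                          value_vars=["temperature", "humidity"],
                          var_name="variable", value_name="reading")
sns.lineplot(data=long_form, x="timestamp", y="reading", hue="variable")
plt.title("Temperature and humidity over a historical span")
plt.show()
```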
DOI: https://doi.org/10.54216/FPA.140103
Vol. 14 Issue. 1 PP. 28-39, (2024)
Internet-of-Things (IoT)-based heart disease prediction is a complex task, and directly processing the raw data collected for remote patient monitoring suffers from limitations due to irrelevant data features, which reduce prediction accuracy and raise security concerns. Hence, an efficient Adaptive ensembled deep Convolutional Neural Network-Bidirectional Long Short-Term Memory (Adaptive ensembled deep CNN-BiLSTM) classifier model is proposed via the fusion of an interactive hunt-based CNN and a Whale on Marine optimization (WoM)-based deep BiLSTM. The Adaptive optimization, built from standard hybrid behaviours such as random searching, seeking, attack prohibition, following, and waiting, tunes the fusion parameters of the developed classifier to attain high detection accuracy. Additionally, a modified Elliptic Curve Cryptography (ECC)-based Diffie-Hellman encryption algorithm provides authentication and security for sensitive patient data in heart disease prediction. The developed model is evaluated against other competent methods in terms of accuracy, sensitivity, specificity, and F-measure, reported as 97.573%, 98.012%, 97.592%, and 97.705%, respectively.
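A minimal Keras sketch of a CNN-BiLSTM backbone of the kind named above; the layer sizes and the (timesteps, features) input shape are illustrative assumptions, and the paper's adaptive fusion and optimization steps are not reproduced here.

```python
from tensorflow.keras import layers, models

def build_cnn_bilstm(timesteps=187, features=1, n_classes=2):
    model = models.Sequential([
        layers.Input(shape=(timesteps, features)),
        layers.Conv1D(32, kernel_size=5, activation="relu"),  # local waveform patterns
        layers.MaxPooling1D(2),
        layers.Bidirectional(layers.LSTM(64)),                # long-range context in both directions
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```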
DOI: https://doi.org/10.54216/FPA.140104
Vol. 14 Issue. 1 PP. 40-55, (2024)
The fusion of computer technologies has had a remarkable impact on contemporary culture, as computers substantially affect practically all aspects of learning; nonetheless, some students report that they still feel uncomfortable when using computers. Test anxiety related to computer-assisted assessment (CAA) is a main factor expected to influence students' academic achievement. Learning math in a digital environment can be challenging for students and may increase their anxiety levels. The current quantitative study measures students' levels of anxiety arising from learning and assessment with computers and examines whether anxiety level is associated with students' academic achievement in tertiary institutions. Descriptive analysis and the correlation coefficient are the statistical techniques employed to achieve the study objectives. Findings demonstrate that more than 90% of the sample reported low anxiety levels and that there is a noteworthy negative correlation between anxiety levels and students' academic achievement in math. The findings have implications for practice in the higher education sector, in instructional design and university counselling services.
DOI: https://doi.org/10.54216/FPA.140105
Vol. 14 Issue. 1 PP. 56-65, (2024)
In video games, artificial intelligence is the effort of going beyond scripted interactions, however complex, into the arena of truly interactive systems. To make a game world appear more real, these games must be responsive, adaptive, and intelligent. For example, in a real-time strategy game, an enemy hunting the player will move along paths, turn around, and perhaps even jump in order to find the player. If the enemy acts and moves more like a human, the game becomes more attractive and exciting. This paper aims to develop a fast, intelligent, and realistic pathfinding approach that makes users feel they are playing against a human being instead of a machine. To achieve this, the paper presents a Heap Heuristic A* Algorithm as an enhancement of the A* algorithm, in which the Chebyshev distance controls the smoothness of the resulting path and heapsort orders the nodes efficiently without heavy memory consumption. Compared with previous improved A* algorithms, the proposed algorithm produces a smoother path while consuming less memory, yielding human-like movement. The experimental results show that the proposed algorithm reduced computing time by 66.6% on a 200×200 grid compared with the A*MOD algorithm. They also show that the proposed approach takes about 91 ms to find the path, compared with 363 ms and 116 ms for the native A* and A*MOD algorithms, respectively. Furthermore, the proposed algorithm's performance remains stable as the number of visited nodes increases, despite changing obstacle layouts.
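The sketch below illustrates the two mechanisms the abstract names: a binary heap for the open list (Python's heapq standing in for the heapsort component) and the Chebyshev distance as heuristic on an 8-connected grid. It is a minimal illustration of the idea, not the authors' exact Heap Heuristic A*.

```python
import heapq

def chebyshev(a, b):
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def astar(grid, start, goal):
    """grid: 2D list, 0 = free, 1 = obstacle; start/goal: (row, col) tuples."""
    rows, cols = len(grid), len(grid[0])
    open_heap = [(chebyshev(start, goal), 0, start)]   # (f, g, cell)
    came_from, g_best = {}, {start: 0}
    while open_heap:
        _, g, cur = heapq.heappop(open_heap)
        if cur == goal:                                # reconstruct the path
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((-1,-1), (-1,0), (-1,1), (0,-1), (0,1), (1,-1), (1,0), (1,1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and not grid[nxt[0]][nxt[1]]:
                ng = g + 1                             # uniform step cost keeps Chebyshev admissible
                if ng < g_best.get(nxt, float("inf")):
                    g_best[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_heap, (ng + chebyshev(nxt, goal), ng, nxt))
    return None                                        # no path exists
```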
DOI: https://doi.org/10.54216/FPA.140106
Vol. 14 Issue. 1 PP. 66-80, (2024)
In a cloud context, consolidating numerous complementary virtual machines (VMs) on existing physical machines (PMs) is the primary method for optimizing physical resources. One well-known line of research concentrates on making better use of VM migration while taking into account the dynamically changing resource demands of VMs. The problem is finding the ideal balance between the complexity and the performance of VM migration optimization: on the one hand, VM migration planning achieves effective resource reuse, and on the other, reducing VM migration frequency improves migration efficiency. Moreover, the enormous PM and VM populations of a cloud data centre typically make migration planning more challenging, which impedes VM migration decision-making. To simplify VM migration planning by reducing the number of migration options and thereby address these issues, this study recommends a hybrid Ant Colony and Genetic Algorithm (AGO) resource pool architecture. Building on this model, we develop a hybrid resource-reuse optimization method that maximizes resource utilization with a minimal number of VM migrations. Finally, we evaluate the hybrid AGO using simulation testing and real-world trials on a working cloud platform. Compared with similar methods, the findings show that hybrid AGO increases average resource utilization by 15%, reduces the use of PMs by 15%, and decreases the average number of migrations by 30%.
DOI: https://doi.org/10.54216/FPA.140107
Vol. 14 Issue. 1 PP. 81-92, (2024)
Glaucoma is a condition in which the human eye is affected by retinal damage that can result in loss of vision. It generally occurs due to prolonged pressure on the eye and damages the optic nerve if not treated at the earliest stage. However, it is hard even for experts to detect it early, so numerous image processing techniques have been applied to identify glaucoma in retinal images. The purpose of this work is to propose a pre-processing console that removes outliers in glaucoma retinal fundus images using denoising techniques, enhancing prediction through image pre-processing and computer vision. The model comprises three stages: applying a denoising model using median filtering for edge preservation and Contrast Limited Adaptive Histogram Equalization (CLAHE), optimizing by eliminating irrelevant features using the Black Widow Optimization model, and finally evaluating the denoising techniques using accuracy-based predictions. The results show that, after combining the denoising and optimization techniques, image quality was enhanced to 97%, outperforming existing models.
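A hedged OpenCV sketch of the two denoising and enhancement stages named above (median filtering for edge preservation, then CLAHE); the kernel size and clip limit are illustrative defaults, and the Black Widow Optimization stage is outside this snippet's scope.

```python
import cv2

def preprocess_fundus(path):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    denoised = cv2.medianBlur(gray, 5)                  # edge-preserving median filter
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(denoised)                        # contrast-limited adaptive equalization
```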
DOI: https://doi.org/10.54216/FPA.140108
Vol. 14 Issue. 1 PP. 93-104, (2024)
Suicide is a significant public health issue worldwide, since it does not happen randomly but is influenced by social and environmental variables; at the same time, effective early diagnosis and treatment may lead to several positive health and behavioural outcomes. Suicide risk persists undiagnosed and untreated for many reasons, including denial of illness and cultural and social stigma. Through the ubiquity of social media, millions of people share their online identity by expressing opinions, thoughts, and everyday struggles with mental health. In contrast to typical retrospective research that uses self-reported surveys and questionnaires, this study assesses the validity of identifying suicidal symptoms from Twitter tweets gathered over more than a year, using a variety of online web-blogging sites as points of reference. To recognize tweets expressing suicidal thoughts, three sets of features are used to train base and ensemble classifiers. The Rotation Forest (RF) approach is the preferred baseline, and a maximum-probability voting decision approach is applied over seven labelled classes relating to suicide communication, including a class indicating suicidal ideation. The revised model attained an F-measure of 0.76 for the suicidal ideation class and 0.82 for suicidal content across all seven classes. To increase awareness of the vocabulary used on Twitter to express suicidal thoughts, the findings are summarized by highlighting the predictive principal components of suicide communication across the classes.
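Rotation Forest and the exact maximum-probability voting setup are not available in standard scikit-learn, so the hedged sketch below shows the same base-plus-ensemble pattern with a soft-voting ensemble over TF-IDF tweet features; all estimator choices are illustrative stand-ins.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB

ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(n_estimators=200)),
                ("nb", MultinomialNB())],
    voting="soft",                          # average the predicted class probabilities
)
model = make_pipeline(TfidfVectorizer(min_df=2), ensemble)
# model.fit(train_tweets, train_labels)     # labels: the seven suicide-communication classes
```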
DOI: https://doi.org/10.54216/FPA.140109
Vol. 14 Issue. 1 PP. 105-119, (2024)
Even though the transmission and processing speeds of electronic documents have vastly improved, electronic document information may be revealed, counterfeited, tampered with, or otherwise compromised. To maintain corporate success in the marketplace, network security is essential to the protection of electronic documents. As a result, there is rising demand for authentication and verification procedures for a variety of important documents, including those used in banking, government, and other transactions, as well as certificates and other academic credentials. Recent years have seen rapid growth in digital watermarking technology, which embeds invisible or hidden digital signatures into data without compromising the data's authenticity. Hence, in this paper, we apply watermarking to the encrypted data using a dynamic wavelet transform algorithm to make documents better protected, and the protected data is then sent to a cloud database for storage. An integrated digital signature algorithm (SHA-256 + DSA) is proposed in this research to generate a digital signature for each document. When recipients download the data, its integrity is verified after extracting the digital signature and encrypted data. This strategy improves record security. We also compare the suggested technique with standard practices and assess its performance on a variety of indicators to demonstrate its effectiveness.
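A minimal sketch of the SHA-256 + DSA signing and verification step described above, using the widely available `cryptography` package; the wavelet watermarking and cloud-storage stages are outside this snippet's scope.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import dsa
from cryptography.exceptions import InvalidSignature

private_key = dsa.generate_private_key(key_size=2048)
document = b"contents of the (already watermarked and encrypted) document"

signature = private_key.sign(document, hashes.SHA256())   # sign the SHA-256 digest

try:
    private_key.public_key().verify(signature, document, hashes.SHA256())
    print("document integrity verified")
except InvalidSignature:
    print("document was tampered with")
```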
DOI: https://doi.org/10.54216/FPA.140111
Vol. 14 Issue. 1 PP. 120-128, (2024)
Groundwater recharge is essential for establishing reliable groundwater supplies in a region. Groundwater is a vital natural water resource, but its quantity and quality may vary significantly from one area to another, and growing urbanization and population increase have placed significant demand on groundwater supplies. Using Multi-Criteria Decision-Making (MCDM), several studies have identified suitable areas for recharging groundwater supplies. To help choose among several types of artificial recharge (AR) structures, we developed an MCDM approach for this research, using an MCDM fusion methodology to combine various AR criteria with the alternatives. This study considered eight criteria and eight alternatives. We used the average method to compute the criteria weights and then the CoCoSo method, as an MCDM fusion method, to rank the alternatives. The results show that hydrological conditions carry the highest weight and stakeholder engagement the lowest. A sensitivity analysis was performed to show the stability of the results of this study.
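For reference, a hedged numpy sketch of the standard CoCoSo ranking steps (min-max normalization, weighted-sum and weighted-power measures, three appraisal scores, and their aggregation); the matrix, weights, and benefit/cost flags below are placeholders, not the study's data.

```python
import numpy as np

def cocoso(X, w, benefit, lam=0.5):
    """X: alternatives x criteria; w: weights summing to 1; benefit: bool per criterion."""
    X = X.astype(float)
    r = np.where(benefit,
                 (X - X.min(0)) / (X.max(0) - X.min(0)),   # benefit criteria
                 (X.max(0) - X) / (X.max(0) - X.min(0)))   # cost criteria
    S = (r * w).sum(1)                       # weighted-sum measure
    P = (r ** w).sum(1)                      # weighted-power measure
    ka = (S + P) / (S + P).sum()
    kb = S / S.min() + P / P.min()
    kc = (lam * S + (1 - lam) * P) / (lam * S.max() + (1 - lam) * P.max())
    return (ka * kb * kc) ** (1 / 3) + (ka + kb + kc) / 3  # higher score = better rank

X = np.array([[7, 9, 5], [8, 6, 7], [6, 8, 8]])            # placeholder decision matrix
scores = cocoso(X, w=np.array([0.4, 0.35, 0.25]),
                benefit=np.array([True, True, False]))
print(scores.argsort()[::-1])                              # alternative indices, best first
```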
DOI: https://doi.org/10.54216/FPA.140110
Vol. 14 Issue. 1 PP. 129-137, (2024)
Energy policy implementation relies heavily on assessing the savings from energy-efficiency retrofitting. Because of their unique purpose, hospitals need energy-efficient renovations that improve indoor air quality and create a pleasant space for staff and visitors; because of this crucial distinction, investors' preferences must be considered when deciding on refurbishment plans. Considering elements including energy savings, financial viability, and thermal comfort, this research provides a multi-criteria decision-making (MCDM) approach to guide investors in choosing the most effective remodeling plan for hospital wards. We used the MABAC method as an MCDM fusion method to combine the various criteria and alternatives and select the best one, with ten criteria and ten alternatives in this study. We computed the criteria weights to rank the criteria and then used the MABAC fusion to rank the alternatives. The results show that financial viability has the least weight and the building envelope the highest. We conducted a sensitivity analysis to show the stability of the results of this study.
DOI: https://doi.org/10.54216/FPA.140112
Vol. 14 Issue. 1 PP. 138-148, (2024)
Effective procurement of clinical devices in healthcare demands a sophisticated decision-making approach integrating diverse data sources from multiple devices, brands, and suppliers, particularly within the context of information fusion. This study addresses this challenge by proposing an improved best-worst method harmonized with information fusion techniques and multi-criteria decision-making methodologies. The background emphasizes the dynamic nature of healthcare procurement, necessitating systematic strategies for navigating the complexities of device selection and integration. Recognizing the intricacies inherent in this challenge, the problem statement revolves around enhancing the best-worst method to amalgamate data from clinical devices while concurrently evaluating brands and suppliers. This aims to optimize performance and minimize costs within the information fusion paradigm. Our proposed methodology introduces an augmented best-worst approach, encompassing weighted criteria assessment for clinical devices, brands, and suppliers, providing a more adaptable and nuanced decision-making framework tailored to the information fusion landscape. The results showcase a structured evaluation matrix derived from refined weighted criteria, elucidating the relative performance and strengths across various entities within the healthcare procurement ecosystem. Emphasizing reliability, compatibility, innovation, and quality assurance, this process highlights pivotal factors influencing procurement decisions within the realm of information fusion.
DOI: https://doi.org/10.54216/FPA.140113
Vol. 14 Issue. 1 PP. 149-157, (2024)
This study addresses the burgeoning challenges in autonomous maritime navigation by employing information fusion methodologies to assess and manage multifaceted risks. The proliferation of autonomous maritime systems has led to a complex interplay among maritime-related, shore-based remote control, environmental, and emergency management factors, necessitating a comprehensive risk evaluation framework. Leveraging a multi-criteria decision-making approach and employing the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), this research presents a methodical analysis of the coupling coordination degree among these risk variables. Through a meticulous examination of historical accident data and information fusion techniques, our study reveals dynamic trends in the comprehensive risk evaluation index, showcasing the evolving nature of risks inherent in autonomous maritime navigation. The predictive insights gleaned from these analyses forecast an initial increase followed by a peak in accidents, underscoring the urgency for proactive risk mitigation strategies. This study's conclusions emphasize the pivotal role of information fusion methodologies in comprehensively assessing, understanding, and managing risks within autonomous maritime navigation.
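A compact numpy sketch of the standard TOPSIS procedure referenced above; the decision matrix, weights, and criterion directions are placeholder values, not the study's risk data.

```python
import numpy as np

def topsis(X, w, benefit):
    X = X.astype(float)
    r = X / np.sqrt((X ** 2).sum(0))             # vector normalization
    v = r * w                                    # weighted normalized matrix
    ideal = np.where(benefit, v.max(0), v.min(0))
    anti = np.where(benefit, v.min(0), v.max(0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(1))   # distance to the ideal solution
    d_neg = np.sqrt(((v - anti) ** 2).sum(1))    # distance to the anti-ideal solution
    return d_neg / (d_pos + d_neg)               # relative closeness, higher = better

X = np.array([[0.6, 0.8, 0.3], [0.7, 0.5, 0.6], [0.4, 0.9, 0.5]])
print(topsis(X, w=np.array([0.5, 0.3, 0.2]), benefit=np.array([True, False, True])))
```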
DOI: https://doi.org/10.54216/FPA.140114
Vol. 14 Issue. 1 PP. 158-168, (2024)
Constraints perceived in different socioeconomic situations reinforce land use and land cover (LULC) patterns at different levels. Statistical information regarding LULC variation is enormously significant for modelling environmental change and managing resources; with remotely sensed data available from diverse satellite images and advanced computing technologies, this information is generally retrieved through image classification approaches. A broad quantitative analysis of various classification approaches is therefore crucial for choosing an effective classifier model that delineates the appropriate land use regions. We concentrate on the Karavetti region and its related fields in this study, using a Non-Linear Recurrent Convolutional Neural Network (NLR-CNN) to analyze the data statistically. Well-known techniques such as Support Vector Machine (SVM), Random Forest (RF), and Decision Tree (DT), among others, are used to evaluate model performance, and high-resolution images with the supplied data points are used to assess classification and prediction accuracy. A confusion matrix is generated, in which the land cover regions show superior classification accuracy with the fusion model; NDVI data and additional metrics such as loss, error rate, and kappa coefficient are also analyzed. The outcomes show that the anticipated model is more robust and performs better, enhancing classification accuracy over the specific land cover regions.
DOI: https://doi.org/10.54216/FPA.140115
Vol. 14 Issue. 1 PP. 178-189, (2024)
The COVID-19 pandemic necessitated a swift shift to online learning, affecting students differently. We investigated the experiences of 62 students with disabilities in this new educational landscape. Online learning tools raise concerns about privacy and security, making it crucial to explore students' perceptions in these areas. Our findings reveal that while students with learning disabilities appreciate online learning's flexibility, they need more guidance and support. Neurodiverse students with learning disabilities are particularly aware of the need for a secure online learning environment. These insights underscore the unique educational needs of students with disabilities in online education. In Personal Records, authenticating individuals, especially those with visual impairments, is critical. Our research combines education with cutting-edge technologies, like blockchain and machine learning, to enhance biometric authentication for visually impaired individuals. The proposed work focuses on the Highly Secure Blockchain-Based Compressive Sensing (HSBCS) system, which uses blockchain for data integrity and machine learning for secure biometric authentication. Our research focuses on education and includes comprehensive testing and performance assessments. Results highlight the educational value of the HSBCS system for students, as it significantly improves the security and accessibility of Personal Records data. In conclusion, our research offers an innovative, secure solution for biometric authentication in Personal Records, with a strong emphasis on education. It empowers students to access their student information securely and independently while enhancing education on data security and integrity. This study underscores the importance of integrating emerging technologies into Personal Records to provide better experiences for students and address their unique educational needs.
DOI: https://doi.org/10.54216/FPA.140116
Vol. 14 Issue. 1 PP. 190-220, (2024)
The Internet of Medical Things (IoMT) revolutionizes healthcare, enhances patient care, and optimizes workflows. However, the integration of IoMT introduces concerns related to privacy and security. To address these issues and bolster privacy and data security, this study presents a novel cybersecurity framework based on blockchain (BC) technology. The primary goal is to ensure secure communication among IoMT devices, preventing unauthorized access and tampering with sensitive data. The proposed framework is implemented in a model designed for classifying electrocardiogram (ECG) signals, utilizing two datasets: a Medical Technology Database (MTDB) with a limited sample size and the Massachusetts Institute of Technology–Beth Israel Hospital (MITBIH) dataset with a more extensive sample size. The datasets are subsequently partitioned into training and testing data. Feature extraction and selection are performed using the Pan-Tompkins and genetic algorithms. To enhance security, BC technology is employed to encrypt the test data. Finally, signal classification is performed using the support vector machine (SVM) classifier. The model trained on the MITBIH dataset thus outperforms its small-data counterpart, achieving an impressive accuracy rate of 99.9%, with a true positive rate (TPR) and true negative rate (TNR) of 100%, an F-score of 100%, and a positive predictive value (PPV) of 100%.
DOI: https://doi.org/10.54216/FPA.140117
Vol. 14 Issue. 1 PP. 221-251, (2024)
This research focuses on the identification of passengers across dimensions using information fusion as a tool. We recognize the challenges involved in identifying individuals who have been transferred to alternate dimensions, and in this study we use CatBoost, an open-source machine learning algorithm, to address this problem. Our approach includes a preprocessing strategy that fills in missing values using techniques such as prior-distribution terms, which helps ensure the reliability of our dataset. By leveraging CatBoost's ability to handle categorical variables and prevent overfitting, we achieve accurate predictions of passenger movement across dimensions. Our analysis highlights CatBoost's effectiveness in identifying patterns within the data, leading to more precise predictions for interdimensional passenger transportation. Additionally, we incorporate techniques such as greedy target-statistics (Greedy TS) augmentation to enhance the adaptability of the algorithm and improve precision while reducing modeling bias. Proof-of-concept experiments demonstrate that the proposed fusion system not only advances predictive modeling in niche domains but also paves the way for broader applications of machine learning in deciphering complex phenomena beyond traditional realms, marking a significant stride in understanding and addressing unconventional challenges.
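A hedged CatBoost sketch of the pipeline described above: simple missing-value imputation followed by gradient-boosted classification with native categorical handling. The file name, column names, and target are hypothetical placeholders, and the median/constant filling is a crude stand-in for the prior-distribution imputation the abstract mentions.

```python
import pandas as pd
from catboost import CatBoostClassifier

df = pd.read_csv("passengers.csv")                      # hypothetical dataset
y = df["transported"].astype(int)                       # hypothetical binary target
X = df.drop(columns=["transported"])

num_cols = X.select_dtypes("number").columns
cat_cols = X.select_dtypes(exclude="number").columns
X[num_cols] = X[num_cols].fillna(X[num_cols].median())  # crude imputation stand-in
X[cat_cols] = X[cat_cols].fillna("missing")

model = CatBoostClassifier(iterations=500, verbose=False)
model.fit(X, y, cat_features=list(cat_cols))            # CatBoost's greedy TS encodes categoricals
```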
DOI: https://doi.org/10.54216/FPA.140118
Vol. 14 Issue. 1 PP. 252-262, (2024)
Survival analysis remains an important area of predictive modeling, especially where event-timing information is critical. This work investigates the application of LightGBM, a high-performance, high-throughput model, to improve the fusion of decisions from multiple trees for survival analysis. Our objective is to address the challenge of developing accurate predictive models while advancing computational efficiency. Based on a case study of real-life disaster scenarios, the proposed approach applies and compares LightGBM with traditional prediction methods, involving careful design engineering and model training with LightGBM tree-structure refinement. The results, obtained from fair experimentation and comprehensive evaluation of predictive performance, demonstrate the robustness of LightGBM in increasing the accuracy of classification tasks relevant to survival analysis. Furthermore, the findings highlight that combining a well-chosen tree depth for pruning with multi-thread optimization balances computational complexity and prediction accuracy.
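A minimal LightGBM sketch of the kind of tree-ensemble classifier discussed above, with a bounded tree depth and multi-threaded training; the synthetic data and all hyperparameters are illustrative, not the study's configuration.

```python
import lightgbm as lgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = lgb.LGBMClassifier(
    n_estimators=300,
    max_depth=6,          # bounded depth trades a little accuracy for speed
    num_leaves=31,
    n_jobs=-1,            # multi-thread optimization
)
model.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
```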
DOI: https://doi.org/10.54216/FPA.140119
Vol. 14 Issue. 1 PP. 263-272, (2024)
Diagnosing Parkinson's Disease (PD) can be quite challenging, as it presents with varied symptoms and lacks definitive biomarkers. Nevertheless, data fusion, which combines multiple types of data using machine learning techniques, holds promise for the timely detection of the disease. In this study, we explore data fusion by employing Principal Component Analysis (PCA) as a step to reduce complexity, then utilizing K-Nearest Neighbors (KNN) classification to improve the accuracy of PD diagnosis. By analyzing nonlinear features associated with PD, PCA extracts informative attributes from the dataset while maintaining its important variation. Subsequently, KNN identifies patterns in this reduced feature space and effectively distinguishes between individuals with PD and those who are healthy. Our results show improvements of the KNN classifier over state-of-the-art approaches, demonstrating its effectiveness in detecting PD and providing a framework for precise PD diagnosis.
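A hedged scikit-learn sketch of the PCA-then-KNN pipeline described above, shown on synthetic stand-in data (22 features echoes common PD voice-feature sets, but the values here carry no clinical meaning); the component and neighbor counts are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

pipe = make_pipeline(
    StandardScaler(),                   # PCA is scale-sensitive
    PCA(n_components=10),               # reduce the nonlinear feature space
    KNeighborsClassifier(n_neighbors=5),
)
X, y = make_classification(n_samples=300, n_features=22,
                           n_informative=10, random_state=0)
print("CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean())
```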
DOI: https://doi.org/10.54216/FPA.140120
Vol. 14 Issue. 1 PP. 273-282, (2024)
The aviation industry is constantly changing, and predictive models are needed to keep up with trends in air passenger traffic. In this paper, we explore the use of information fusion methodologies and classical time series techniques to forecast air passenger volumes. Predicting passenger demand is difficult due to the various factors that influence travel patterns, and existing models often struggle to capture the dynamics of this field, so developing accurate forecasting methods is crucial. By leveraging information fusion of techniques such as exponential smoothing and the Autoregressive Integrated Moving Average (ARIMA), our research builds models on historical data of air passenger volumes. These techniques combine machine learning algorithms and time series analysis to identify dependencies and patterns in the dataset. Through evaluations and comparative analyses, our proposed models demonstrate promising capabilities in forecasting future air passenger volumes. Proof-of-concept experiments based on 5-fold cross-validation demonstrate the efficacy of the proposed approach in capturing the underlying trends and seasonality of the dataset.
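A brief statsmodels sketch of the two classical forecasters named above (Holt-Winters exponential smoothing and seasonal ARIMA) on a synthetic monthly series, with a simple average standing in for the fusion step; the orders and seasonal settings are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# synthetic monthly series with trend and yearly seasonality (stand-in data)
idx = pd.date_range("2010-01", periods=144, freq="MS")
t = np.arange(144)
y = pd.Series(200 + 2 * t + 30 * np.sin(2 * np.pi * t / 12)
              + np.random.default_rng(0).normal(0, 5, 144), index=idx)

arima_fc = ARIMA(y, order=(1, 1, 1),
                 seasonal_order=(1, 1, 1, 12)).fit().forecast(12)
hw_fc = ExponentialSmoothing(y, trend="add", seasonal="add",
                             seasonal_periods=12).fit().forecast(12)
fused = (arima_fc + hw_fc) / 2            # naive average as the fusion step
print(fused.head())
```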
DOI: https://doi.org/10.54216/FPA.140121
Vol. 14 Issue. 1 PP. 283-292, (2024)
There has yet to be a comprehensive investigation of enhancing the diagnostic accuracy of oral disease using handheld smartphone photographs. To overcome the difficulties associated with the automatic detection of oral illnesses, we describe an approach based on smartphone image diagnosis powered by a deep learning algorithm. A centered-rule method of image capture is offered as a quick and easy way to obtain high-quality pictures of the mouth, and a resampling method is proposed to mitigate the image variability of handheld smartphone cameras; a medium-sized oral dataset with five types of disorders was developed on this basis. Finally, we introduce a recently developed deep-learning network for oral cancer diagnosis. On 455 test images, the proposed technique showed an impressive 83.0% sensitivity, 96.6% specificity, 84.3% accuracy, and 83.6% F1 score. The proposed "center positioning" method scored about 8% higher than a simulated "random positioning" method, and the resampling process brought an additional 6% performance improvement. The performance of a deep learning algorithm for detecting oral cancer can thus be enhanced by capturing oral photos centered on the lesion. Primary oral cancer diagnosis using smartphone-based images with deep learning offers promising potential.
DOI: https://doi.org/10.54216/FPA.140122
Vol. 14 Issue. 1 PP. 293-308, (2024)
The world has become more like a small community thanks to the internet, which connects millions of people, businesses, and pieces of technology for a variety of uses. Because of the significant influence these networks have on our lives, maintaining their efficiency is important, which necessitates addressing issues such as congestion. In this study, PI-controller gains are adjusted using a variety of optimization strategies to regulate the nonlinear TCP/AQM model; the controller shapes the congestion-signaling behaviour and manages computer network congestion. A manually tuned PI controller is used first; several optimization techniques (Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), and the Simulated Annealing (SA) algorithm) are then used to tune the PI-controller gains, and finally Linear Quadratic Regulator (LQR) theory is applied. To test the reliability and effectiveness of each suggested controller, several experiments with varied network parameter values, different queue sizes, and additional disturbances were conducted, all in MATLAB. The results show the superiority of the LQR controller over the PI controller under both manual and optimal tuning.
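As a hedged illustration of the PSO-based tuning step, the sketch below tunes PI gains on a simple first-order plant by minimizing the integral of squared error; the nonlinear TCP/AQM model itself is beyond this snippet, and the plant, gain bounds, and swarm settings are illustrative assumptions.

```python
import numpy as np

def ise(gains, T=10.0, dt=0.01):
    """Integral of squared error for a PI loop around the toy plant dy/dt = -y + u."""
    kp, ki = gains
    y = integ = cost = 0.0
    for _ in range(int(T / dt)):
        e = 1.0 - y                       # unit-step reference
        integ += e * dt
        u = kp * e + ki * integ           # PI control law
        y += (-y + u) * dt                # Euler step of the plant
        cost += e * e * dt
    return cost

rng = np.random.default_rng(0)
pos = rng.uniform(0, 10, (30, 2))         # 30 particles over (kp, ki)
vel = np.zeros_like(pos)
pbest, pcost = pos.copy(), np.array([ise(p) for p in pos])
gbest = pbest[pcost.argmin()]

for _ in range(50):                       # standard inertia + cognitive/social update
    r1, r2 = rng.random((2, 30, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 10)
    cost = np.array([ise(p) for p in pos])
    better = cost < pcost
    pbest[better], pcost[better] = pos[better], cost[better]
    gbest = pbest[pcost.argmin()]

print("tuned (kp, ki):", gbest)
```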
DOI: https://doi.org/10.54216/FPA.140123
Vol. 14 Issue. 1 PP. 309-319, (2024)