Volume 11, Issue 1, PP: 01–09, 2026 | Full Length Article
Kharchenko Raisa 1*, Rahul Chauhan 2, Andino Maseleno 3
DOI: https://doi.org/10.54216/JCHCI.110101
Interaction logs collected during information-architecture evaluation contain detailed evidence of how users diverge onto successful or unsuccessful navigation routes. This predictive signal appears early in task execution, yet navigation patterns vary with both the task at hand and the interface being used. This study develops an early navigation-failure prediction system that uses public interaction data to build task-specific prefix classification models. The analysis draws on an open dataset of 180 participants completing 1,800 tasks across six testing conditions covering tree testing and high-fidelity prototype navigation. A prefix-structural encoder is combined with a regularized, task-conditioned logistic model that predicts success from the first k navigation actions. Model performance was assessed through participant-level validation against three machine-learning baselines: random forest, extra trees, and gradient boosting. The best configuration achieved 0.7833 accuracy, 0.7513 balanced accuracy, 0.8350 F1-score, and 0.7949 ROC–AUC at k = 3. A horizon analysis shows that predictive signal becomes available after users complete their first three actions, and an ablation study confirms that task conditioning is an essential component. The results demonstrate that early trace analytics can identify navigation failures quickly in information-architecture research and provide a practical method for tailored assessment during usability testing.
Information-architecture evaluation, Navigation patterns, Task-specific prefix classification models