Journal of Intelligent Systems and Internet of Things (JISIoT)
ISSN: 2690-6791; 2769-786X
DOI prefix: 10.54216/JISIoT
https://www.americaspg.com/journals/show/3942
2019

Trustworthy and Interpretable AI in IoT-Based Medical Systems: A Review and Framework for CoT-XAI Integration

Faisal (Professional Engineer Program Department, Faculty of Engineering, Bina Nusantara University, Jakarta 11480, Indonesia; Faculty of Economics and Business, Muhammadiyah University Berau, 77311, Indonesia)
Sasmoko (Professional Engineer Program Department, Faculty of Engineering, Bina Nusantara University, Jakarta 11480, Indonesia; Research Interest Group in Education Technology, Bina Nusantara University, Jakarta 11480, Indonesia)

The use of Artificial Intelligence (AI) in medical diagnosis has rapidly evolved with the adoption of large language models and explainability techniques. This study investigates the intersection of Chain-of-Thought (CoT) reasoning and Explainable AI (XAI) in the development of trustworthy diagnostic systems, particularly within Internet of Things (IoT)-enabled healthcare environments. A systematic review of 106 Scopus-indexed publications (2016–2025) was conducted, supported by topic modeling (LDA) and keyword co-occurrence network analysis to identify dominant research themes and gaps. Findings reveal that while CoT and XAI are actively studied, their integration within real-time, distributed, and resource-constrained medical systems remains limited. Most research emphasizes either performance or interpretability in isolation, with minimal effort to embed step-wise reasoning into deployable clinical AI pipelines. Moreover, few studies address how CoT can function effectively in the edge computing and federated learning scenarios common to IoT infrastructures. To address this gap, we propose a multi-layered conceptual framework that integrates CoT reasoning, machine learning predictors, XAI methods, and IoT deployment models.
This framework reflects the shift toward user-centric, transparent, and adaptive AI solutions in smart healthcare. It provides a structured path from multimodal data ingestion to clinically interpretable, real-time decision support. This study contributes a novel perspective on reasoning-driven explainability and offers design guidance for the future development of interpretable, scalable, and deployable AI systems in medical applications.

2026, pp. 169-184. DOI: 10.54216/JISIoT.180112. https://www.americaspg.com/articleinfo/18/show/3942