Smartphone-Based Machine Learning for Automated Diagnosis Using Eye, Skin, and Voice Signals | IJCSE Volume 10 – Issue 1 | IJCSE-V10I1P10
International Journal of Computer Science Engineering Techniques
ISSN: 2455-135X
Volume 10, Issue 1
Author
Shaikh Abdul Hannan
Abstract
Applying machine learning techniques to data produced by commonplace devices such as smartphones offers an opportunity to improve the quality of medical diagnosis and treatment. Smartphones are well suited for collecting data, giving prompt diagnostic feedback, and suggesting measures to improve health. Cloud-based Internet of Things (IoT) applications give smart cities advanced ways to reduce traffic accidents caused by fatigued driving, and in this work we examine state-of-the-art methods for identifying risky driving behaviors using three popular IoT-based systems. Effective treatment also requires an early and precise diagnosis; Artificial Intelligence (AI) has recently shown encouraging potential in helping physicians diagnose pterygium, and this work gives an overview of AI-assisted pterygium diagnosis, including the AI methods employed, such as computer vision, deep learning, and machine learning. Dehydration is caused by the body losing fluids, which interferes with normal bodily processes and leads to health problems. Current methods for detecting dehydration in clinical and laboratory settings are costly, time-consuming, and require visits to medical facilities, which are frequently unavailable in impoverished areas. We develop a deep learning model based on Siamese networks to identify changes in facial features caused by dehydration that are invisible to the average person. Our model is light enough to run on a smartphone processor and achieves an overall accuracy of 76.1%. By integrating the model into the back end, we build a smartphone app called “Dehydration Scan” that simply takes pictures of people’s faces and estimates how hydrated they are. Knowing about dehydration early allows people to take oral rehydration solutions and prevent severe dehydration.
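To illustrate the kind of model described above, the following is a minimal sketch of a Siamese embedding network for comparing pairs of face crops, written in PyTorch. The backbone layers, input size, and contrastive-loss margin are illustrative assumptions, not the published “Dehydration Scan” architecture.

```python
# Minimal sketch of a Siamese network for comparing two facial images,
# assuming paired face crops; layer sizes are illustrative, not the authors' model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseNet(nn.Module):
    def __init__(self, embedding_dim: int = 64):
        super().__init__()
        # Small CNN backbone so the model stays light enough for a phone processor.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embedding_dim),
        )

    def embed(self, x: torch.Tensor) -> torch.Tensor:
        return F.normalize(self.backbone(x), dim=1)

    def forward(self, img_a: torch.Tensor, img_b: torch.Tensor) -> torch.Tensor:
        # Distance between the two embeddings drives the hydration decision.
        return F.pairwise_distance(self.embed(img_a), self.embed(img_b))

def contrastive_loss(distance, label, margin: float = 1.0):
    # label = 0 for "same hydration state" pairs, 1 for "changed" pairs.
    return torch.mean((1 - label) * distance.pow(2)
                      + label * torch.clamp(margin - distance, min=0).pow(2))

# Example: score a pair of 128x128 face crops (random tensors stand in for images).
model = SiameseNet()
a, b = torch.rand(1, 3, 128, 128), torch.rand(1, 3, 128, 128)
print(model(a, b).item())
```

In this sketch a larger embedding distance indicates a larger change in facial appearance between the two photographs, which can then be thresholded into a hydration estimate.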
Keywords
Rehydration Solutions, Human Body, Smartphones, Laboratory-Based, Facial Landmarks, Hydration Status, Deep Learning Model.
Conclusion
Using multi-sensor, mobile, and cloud-based computing architectures, a number of inexpensive driver fatigue detection (DFD) systems have been developed to assist drivers. In this work, we examined state-of-the-art methods for identifying risky driving behaviors using three popular IoT-based systems. Additionally, using both conventional and recent deep learning-based methods, we conducted comparisons with previous research under various parameter settings. To the best of our knowledge, no prior research has addressed this subject. The originality of this article lies in demonstrating the key distinctions among cloud-based, smartphone-based, and multi-sensor systems in multimodal feature processing. We also discussed the challenges that machine learning approaches, particularly deep neural networks, face in predicting driver hypovigilance states (2-class and 4-class) across these three designs.
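As an illustration of the 2-class and 4-class hypovigilance settings mentioned above, here is a minimal sketch of a classification head over fused multi-sensor features. The feature dimension, layer widths, and modality mix are assumptions for demonstration only, not any of the three compared systems.

```python
# Minimal sketch of a feed-forward classifier over fused multi-sensor features,
# switchable between 2-class and 4-class hypovigilance prediction.
import torch
import torch.nn as nn

def make_hypovigilance_classifier(feature_dim: int = 128, num_classes: int = 2) -> nn.Module:
    return nn.Sequential(
        nn.Linear(feature_dim, 64), nn.ReLU(), nn.Dropout(0.3),
        nn.Linear(64, num_classes),   # logits for 2 or 4 vigilance states
    )

# Example: the same fused feature vector scored under both settings.
features = torch.rand(1, 128)          # e.g. concatenated camera + IMU + cloud features
binary_head = make_hypovigilance_classifier(num_classes=2)
four_way_head = make_hypovigilance_classifier(num_classes=4)
print(binary_head(features).softmax(dim=1))
print(four_way_head(features).softmax(dim=1))
```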
Although AI-based automatic pterygium diagnosis systems are still in the early phases of development, we believe this technology will eventually become a crucial tool for pterygium diagnosis and treatment.
Our Siamese network-based dehydration detection model performs considerably better than the baseline models, achieving at least 10% higher accuracy and 20% higher specificity. According to the experimental data, the facial landmark most affected by dehydration is the eye region. Our smartphone-based dehydration screening tool may still be less accurate than a clinical diagnosis made by qualified medical professionals or clinical-grade equipment.
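Since the eye region emerges as the landmark most affected by dehydration, a plausible preprocessing step is to crop that region before feeding images to the Siamese model. The sketch below uses MediaPipe Face Mesh for this; the specific landmark indices, padding, and helper name are illustrative assumptions, not the authors' pipeline.

```python
# Minimal sketch of cropping the eye region from a face photo using MediaPipe
# Face Mesh; the landmark index subset below is an assumption for illustration.
import cv2
import mediapipe as mp
import numpy as np

# A small subset of Face Mesh indices around both eyes (illustrative, not exhaustive).
EYE_LANDMARKS = [33, 133, 145, 159, 263, 362, 374, 386]

def crop_eye_region(image_bgr: np.ndarray, pad: int = 10):
    h, w = image_bgr.shape[:2]
    with mp.solutions.face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as mesh:
        result = mesh.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_face_landmarks:
        return None  # no face detected
    pts = result.multi_face_landmarks[0].landmark
    xs = [int(pts[i].x * w) for i in EYE_LANDMARKS]
    ys = [int(pts[i].y * h) for i in EYE_LANDMARKS]
    x0, x1 = max(min(xs) - pad, 0), min(max(xs) + pad, w)
    y0, y1 = max(min(ys) - pad, 0), min(max(ys) + pad, h)
    return image_bgr[y0:y1, x0:x1]

# Example usage: crop = crop_eye_region(cv2.imread("face.jpg"))
```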