EmotiBot – Intelligent Robotic Receptionist with Health Monitoring | IJCSE Volume 10 – Issue 2 | IJCSE-V10I2P3
International Journal of Computer Science Engineering Techniques
ISSN: 2455-135X
Volume 10, Issue 2
Authors
Jinta Johnson, Syam Mohan
Abstract
The increasing demand for intelligent service systems in public environments has led to the development of interactive robotic assistants capable of providing automated information and user support. This paper presents the design and implementation of EmotiBot, an intelligent robotic receptionist that integrates emotion-aware interaction, voice communication, and basic health monitoring functionalities. The system is implemented using a Raspberry Pi 5 as the central processing unit and incorporates multiple sensors and modules to enable natural human–robot interaction. A camera-based computer vision module detects the presence of visitors and initiates communication, while an offline speech recognition system using Vosk enables voice-based interaction without requiring internet connectivity. The robot provides responses through a text-to-speech engine and answers common user queries using a chatbot-based knowledge system. In addition, a pulse sensor integrated with the ADS1115 analog-to-digital converter allows the robot to measure the user's heart rate, demonstrating basic health monitoring capabilities. An ultrasonic sensor ensures a safe interaction distance, and a servo-controlled robotic arm guides users during pulse measurement. Experimental evaluation demonstrates that the proposed system successfully performs visitor detection, voice interaction, and pulse monitoring, providing an efficient and cost-effective solution for intelligent reception systems in institutional environments.
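The abstract mentions an ultrasonic sensor used to maintain a safe interaction distance. The paper does not name the sensor model or give code; the sketch below assumes a common HC-SR04-style time-of-flight sensor, where the controller measures the echo round-trip time and converts it to distance. The function names and the 30 cm threshold are illustrative choices, not values from the paper.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 °C


def echo_time_to_distance_cm(round_trip_s: float) -> float:
    """Convert an ultrasonic echo round-trip time (seconds) to distance (cm).

    The pulse travels to the obstacle and back, so the one-way
    distance is half the round-trip path length.
    """
    if round_trip_s < 0:
        raise ValueError("round-trip time must be non-negative")
    return (SPEED_OF_SOUND_M_S * round_trip_s / 2.0) * 100.0


def is_safe_distance(round_trip_s: float, threshold_cm: float = 30.0) -> bool:
    """True when the measured obstacle is at least `threshold_cm` away."""
    return echo_time_to_distance_cm(round_trip_s) >= threshold_cm
```

On the actual robot, the round-trip time would come from timing the sensor's echo pin via the Raspberry Pi's GPIO; separating the timing from this pure conversion keeps the distance logic easy to verify.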
Keywords
Service robots, human–robot interaction, emotion detection, speech recognition, health monitoring, embedded systems.
Conclusion
This paper presented the design and implementation of EmotiBot, an intelligent robotic receptionist capable of interacting with visitors through voice communication, facial detection, and basic health monitoring. The system integrates computer vision, speech recognition, and embedded sensor technologies to create an interactive robotic assistant suitable for institutional environments.
The proposed system uses the Raspberry Pi 5 as the central processing unit to coordinate various hardware and software modules. The robot detects visitors using a camera-based face detection system and initiates interaction through a greeting sequence. Voice communication is enabled using an offline speech recognition system implemented with Vosk, allowing the robot to operate without internet connectivity.
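The greeting sequence described above must fire once when a visitor arrives, not on every camera frame in which a face is visible. The paper does not detail this logic; the controller below is one plausible sketch, assuming the vision module (e.g. an OpenCV face detector) reports a per-frame boolean. The class name and the re-arm window are our illustrative choices.

```python
class GreetingController:
    """Decide when to trigger the robot's greeting sequence.

    Greets once per visit: a greeting fires on the first frame in
    which a face is detected, then re-arms only after the face has
    been absent for `reset_frames` consecutive frames, so the robot
    does not re-greet a visitor who is still standing in view.
    """

    def __init__(self, reset_frames: int = 30):
        self.reset_frames = reset_frames
        self._absent = reset_frames  # start fully re-armed
        self._armed = True

    def update(self, face_detected: bool) -> bool:
        """Feed one frame's detection result; return True to greet now."""
        if face_detected:
            self._absent = 0
            if self._armed:
                self._armed = False
                return True
            return False
        self._absent += 1
        if self._absent >= self.reset_frames:
            self._armed = True
        return False
```

At a typical camera rate of around 30 frames per second, `reset_frames=30` corresponds to roughly one second of absence before the robot will greet the next visitor.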
In addition to answering user queries through a chatbot-based information system, the robot provides a basic health monitoring feature by measuring the user's pulse rate using a pulse sensor integrated with the ADS1115 analog-to-digital converter.
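The ADS1115 digitizes the analog pulse-sensor waveform (commonly read over I2C from Python); the paper does not state how the beats-per-minute value is derived from those samples. One simple approach is to detect rising threshold crossings (one per heartbeat) and average the intervals between them, as sketched below. The function name, threshold scheme, and parameters are our assumptions.

```python
def estimate_bpm(samples, sample_rate_hz, threshold):
    """Estimate heart rate (BPM) from raw pulse-sensor readings.

    `samples` is a sequence of digitized sensor values sampled at
    `sample_rate_hz`. Each rising crossing of `threshold` is counted
    as one heartbeat; the rate is 60 divided by the mean beat interval.
    """
    # Indices where the signal rises through the threshold.
    beats = [
        i for i in range(1, len(samples))
        if samples[i - 1] < threshold <= samples[i]
    ]
    if len(beats) < 2:
        return None  # not enough beats to estimate a rate
    intervals_s = [
        (b - a) / sample_rate_hz for a, b in zip(beats, beats[1:])
    ]
    mean_interval_s = sum(intervals_s) / len(intervals_s)
    return 60.0 / mean_interval_s
```

A fixed threshold is the simplest choice; a real deployment would likely use an adaptive threshold or band-pass filtering to cope with baseline drift and motion artifacts in the photoplethysmography signal.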
Experimental evaluation demonstrates that the system successfully performs visitor detection, voice interaction, and pulse monitoring with satisfactory accuracy. The proposed solution provides a cost-effective and practical approach for implementing intelligent reception systems in educational institutions and other service environments.

