Published August 3, 2023 | Version v1
Journal article (Open Access)

Home Security System Using IoT and Facial Emotion Detection & Alleviation Effects

  • 1. 2nd year Student, Department of ECE, SECAB.I.E.T, Vijayapura, Karnataka, India
  • 2. Assistant Professor, Department of ECE, SECAB.I.E.T, Vijayapura, Karnataka, India

Description

Everyone experiences good and bad moods, and hearing encouraging statements can help a person feel better. A facial recognition system is a computer application or technology that identifies or verifies individuals from their facial features in digital images or video frames. The system achieves this by comparing specific facial characteristics with those stored in a face database. Hence, a system is developed that captures footage of a person and, depending on the detected expression, takes the necessary actions. The proposed system uses the Viola-Jones algorithm, which has four stages: Haar feature selection, integral image creation, AdaBoost training, and cascading classifiers. The integral image is a technique used in computer vision and object detection to efficiently compute the sum of pixel intensities within rectangular regions, which helps in identifying specific patterns or objects in an image. Such integral-image-based features are commonly used in object detection algorithms like Haar cascades, which are widely applied to face detection, among other tasks. After the integral image is built from the Haar features, a trained classifier detects the region of interest in the facial image; the system then responds to the viewer's emotion, alleviates it through music therapy, and sends the identified person's information to the authorized user via an Internet of Things (IoT) module.
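The integral-image stage described above can be sketched in a few lines. This is a minimal illustration of the general technique, not the authors' implementation; the function names and the example rectangles are illustrative. Each entry of the integral image holds the sum of all pixels above and to the left, so the sum over any rectangle reduces to at most four lookups, which is what makes evaluating many Haar-like features per window cheap.

```python
import numpy as np

def integral_image(img):
    """Cumulative sum along both axes: ii[i, j] = sum of img[:i+1, :j+1]."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, top, left, bottom, right):
    """Sum of img[top:bottom+1, left:right+1] using four integral-image lookups."""
    total = ii[bottom, right]
    if top > 0:
        total -= ii[top - 1, right]
    if left > 0:
        total -= ii[bottom, left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1, left - 1]
    return total

# Toy 4x4 "image" with pixel values 0..15.
img = np.arange(16, dtype=np.int64).reshape(4, 4)
ii = integral_image(img)

# A two-rectangle Haar-like feature: left half minus right half of the window.
feature = rect_sum(ii, 0, 0, 3, 1) - rect_sum(ii, 0, 2, 3, 3)
```

In the full Viola-Jones pipeline, AdaBoost selects a small subset of such features as weak classifiers, and the cascade orders them so that most non-face windows are rejected after only a few cheap feature evaluations.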

