Please use this identifier to cite or link to this item: https://repository.iimb.ac.in/handle/2074/19556
DC Field: Value
dc.contributor.advisor: De, Rahul
dc.contributor.author: Pujari, Maneesha
dc.contributor.author: Bodanki, Neel Kumar
dc.date.accessioned: 2021-06-11T14:44:37Z
dc.date.available: 2021-06-11T14:44:37Z
dc.date.issued: 2020
dc.identifier.uri: https://repository.iimb.ac.in/handle/2074/19556
dc.description.abstract: Emotional AI, or Affective Computing, detects human emotions using facial analysis, voice patterns, and deep learning, processing these signals further to gauge the degree of emotion present in a given input. According to Gartner, by 2022 personal devices will know more about a person’s emotional state than their own friends or family. The Emotional AI market is estimated to grow to $41 billion by 2022, with existing players such as Affectiva, Beyond Verbal, and audEERING, as well as giants such as Amazon and Google, entering this space to understand their users’ emotions. As Emotional AI has evolved over the years, it has shifted from deep EQ-guided experiences to detecting complex cognitive states such as distraction and drowsiness. Affectiva works in this space, called Human Perception AI, using real-world data, speech science, and computer vision to support autonomous vehicles. Lyrebird, a Canadian AI company, has created a system that mimics a human voice by analysing speech recordings and their corresponding transcripts.
dc.publisher: Indian Institute of Management Bangalore
dc.relation.ispartofseries: PGP_CCS_P20_118
dc.subject: Artificial intelligence
dc.subject: Emotions
dc.subject: Emotional AI
dc.subject: Affective computing
dc.title: Artificial intelligence in detecting emotions
dc.type: CCS Project Report-PGP
dc.pages: 11p.
Appears in Collections: 2020
Files in This Item:
File: PGP_CCS_P20_118.pdf | Size: 414.19 kB | Format: Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.