The Official Trend Watching Site
powered by Daniel Levine and The Avant-Guide Institute
Added 2 December, 2018

iPhone sensors will soon track users’ emotions

A company called Buglife has found a way to use the infrared depth sensors found on some iPhones to analyze facial expressions. The company has designed a software development kit that works in any iOS app, and the product can help companies and marketers capture user reactions to a product or a piece of content. It offers two modes of operation: streaming emotional analysis data in real time, or taking emotion snapshots triggered by specific in-app events. It uses the camera's infrared depth sensor to map the user's face in high detail under almost any lighting conditions, and the company has also created deep learning algorithms that translate the facial data into emotions in real time.
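
To make the two modes concrete, here is a minimal sketch of how an iOS app could read the same TrueDepth facial data using Apple's ARKit face-tracking API. The article does not describe Buglife's actual SDK, so the `EmotionTracker` class, the `Emotion` labels, and the simple thresholds standing in for the deep learning step are all illustrative assumptions; only the ARKit calls themselves are real.

```swift
import ARKit

/// A coarse emotion label inferred from TrueDepth blend-shape data.
/// (Illustrative only; not part of Buglife's SDK.)
enum Emotion: String {
    case happy, surprised, frowning, neutral
}

final class EmotionTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private var lastEmotion: Emotion = .neutral

    /// Streaming mode: called with a fresh estimate each time the face anchor updates.
    var onEmotionUpdate: ((Emotion) -> Void)?

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("TrueDepth face tracking is not available on this device.")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    /// Snapshot mode: call this from an in-app event handler to capture
    /// the user's emotion at that specific moment.
    func snapshot() -> Emotion {
        return lastEmotion
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // blendShapes maps facial-feature locations to coefficients in 0.0...1.0.
        let shapes = face.blendShapes

        let smile = ((shapes[.mouthSmileLeft]?.floatValue ?? 0) +
                     (shapes[.mouthSmileRight]?.floatValue ?? 0)) / 2
        let browDown = ((shapes[.browDownLeft]?.floatValue ?? 0) +
                        (shapes[.browDownRight]?.floatValue ?? 0)) / 2
        let browUp = shapes[.browInnerUp]?.floatValue ?? 0
        let jawOpen = shapes[.jawOpen]?.floatValue ?? 0

        // Toy thresholds standing in for the deep learning model the article mentions.
        let emotion: Emotion
        if smile > 0.5 {
            emotion = .happy
        } else if jawOpen > 0.5 && browUp > 0.4 {
            emotion = .surprised
        } else if browDown > 0.5 {
            emotion = .frowning
        } else {
            emotion = .neutral
        }

        lastEmotion = emotion
        onEmotionUpdate?(emotion)
    }
}
```

In a real integration, the streaming callback would feed an analytics pipeline, while the snapshot method would be wired to in-app events such as viewing a piece of content or completing a purchase.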

First spotted by WindowSeat