Our Technology
Since 2015, our team has developed a range of smart-sensing devices (head-mounted and eyewear) and designed ground-breaking sensors and algorithms that shed light on how emotional responses are elicited. A wide range of high-resolution techniques, sampling at up to 6,000 times per second, are employed. These include optical and electrical measures of facial muscle activity, heart rate features, and head movement. The data streams are analysed collectively to identify emotional and behavioural responses to stimuli and environmental context. Our AI engine translates that information and provides on-demand insights into emotional expressivity, behavioural changes and stress responses.
OCOsense
Available for researchers
Download our Brochure for more information
Download
OCO™ and OCOsense™ are trademarks of Emteq Limited
This technology is protected by multiple granted patents, including but not limited to:
GB2561537, GB2604076, US11,003,899, US11,538,279
Our approach
Perceptions of positivity and negativity associated with a stimulus are fundamental mechanisms that underpin much of our emotional experience. Depending on context, positivity is indexed via relative activation of the zygomaticus major muscle (associated with smiling), whereas negativity is indexed by activation of the corrugator supercilii (associated with frowning). The relative activation of each muscle group is measured to understand the unconscious generation of emotional experience as an adaptive, nuanced and flexible system.
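As an illustrative sketch only (not Emteq's actual algorithm), the relative-activation idea can be expressed as a normalised index over two hypothetical muscle-activation channels, positive when smile-related activity dominates and negative when frown-related activity dominates:

```python
import numpy as np

def valence_index(zygomaticus: np.ndarray, corrugator: np.ndarray) -> float:
    """Crude valence estimate from two activation envelopes.

    Positive when zygomaticus (smiling) activity dominates,
    negative when corrugator (frowning) activity dominates.
    """
    z = float(np.mean(np.abs(zygomaticus)))  # smile-related activation
    c = float(np.mean(np.abs(corrugator)))   # frown-related activation
    return (z - c) / (z + c + 1e-9)          # normalised to roughly [-1, 1]

# Simulated activation envelopes for a "smiling" and a "frowning" window
rng = np.random.default_rng(0)
smile = valence_index(rng.uniform(0.5, 1.0, 6000), rng.uniform(0.0, 0.2, 6000))
frown = valence_index(rng.uniform(0.0, 0.2, 6000), rng.uniform(0.5, 1.0, 6000))
print(round(smile, 2), round(frown, 2))
```

The channel names, the normalisation and the simulated data are all assumptions for illustration; the production pipeline described above combines many more signals.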
Virtual Reality presents a uniquely controlled way of studying and measuring human reactions by monitoring behaviours and responses to known stimuli. As such, it offers a more cost-effective way to generate ecologically valid environments for research.
However, the real world is far more complex. Naturally, the next step was applying those techniques in everyday life, where endless possibilities arise for studying and monitoring human affect and behaviour across different settings, conditions and uncontrolled parameters. Until now, the application of positivity and negativity measures in the real world has been heavily limited. Head-mounted or eye-worn wearables offer a solution for tracking facial responses continuously in the wild, whilst maintaining unobtrusiveness, reliability and validity across varying conditions, e.g. lighting. The ability to monitor changes in negativity and positivity (valence) in real life establishes the fundamental framework needed for research into mental health and wellbeing applications: state monitoring, phenotype composition, performance assessment, and designing user-centred interventions.
Perceptions of positivity and negativity
Our approach to quantifying emotional responses is to use continuous interpretive systems such as the Dimensional Model, which considers both valence (positivity and negativity) as well as arousal (activation of the sympathetic nervous system).
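For illustration only, the two axes of the Dimensional Model can be combined into a coarse descriptive label. The zero thresholds and quadrant names below are our assumptions for the sketch, not part of Emteq's engine:

```python
def affect_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair onto an illustrative quadrant label.

    Valence: positivity/negativity of the response.
    Arousal: sympathetic nervous system activation.
    Thresholds at zero are purely illustrative.
    """
    if valence >= 0:
        return "excited / elated" if arousal >= 0 else "calm / content"
    return "tense / distressed" if arousal >= 0 else "bored / lethargic"

print(affect_quadrant(0.6, 0.7))    # high valence, high arousal
print(affect_quadrant(-0.4, -0.3))  # low valence, low arousal
```

In practice a continuous system would report positions on both axes rather than discrete labels; the quadrants are only a reading aid.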
Emteq lab’s activity tagging system captures context combined with individualised response measures, allowing application of both Evaluative Space and Constructed Emotion models.
The Science behind our technology
The scientific study of emotions was pioneered by Charles Darwin in the 19th century. Later, psychologists such as Paul Ekman focused research on the role of facial expressions in displaying emotions. Their work principally used images of faces and led to the categorisation of emotional facial expressions into discrete archetypes (happy, sad, fear, anger, surprise and disgust).
It is important to note that facial expressions are generated by the contractions of muscles, which in turn may (or may not) deform the overlying skin. Therefore, changes in the position of facial features occur after muscle activation. Researchers used a technique called electromyography [EMG] to measure the electrical activation of muscles beneath the skin, to determine whether early computer vision systems could detect subtle expressions (Cohn & Schmidt, 2004).
Electromyography uses electrodes, like little microphones, which "listen" for muscle activation 2,000 times per second (unlike cameras, which sample at 30-60 times per second). EMG is highly sensitive and can even pick up micro-expressions that are not visually observable. Unlike cameras, which rely on the indirect measurement of the skin overlying the muscle, EMG records electrical activity directly and can also detect changes in baseline muscle tone.
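The sampling-rate gap is easy to quantify. Assuming an illustrative 20 ms micro-expression burst (the duration is our assumption), a 2,000 Hz EMG stream captures dozens of samples inside the burst, whereas a 30 fps camera may capture none at all:

```python
FS_EMG = 2000     # EMG samples per second
FS_CAMERA = 30    # camera frames per second
BURST_MS = 20     # illustrative micro-expression duration, milliseconds

# Number of samples/frames that land inside the burst window
emg_samples = int(FS_EMG * BURST_MS / 1000)
camera_frames = int(FS_CAMERA * BURST_MS / 1000)

print(emg_samples, camera_frames)  # 40 EMG samples vs 0 guaranteed frames
```

With 40 samples, EMG can resolve the shape of the burst; at 30 fps the event falls between frames more often than not, which is why such micro-expressions are effectively invisible to camera-based tracking.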
Measuring facial expressions and emotional responses using EMG is a fundamental research method that, until recently, was confined to the laboratory. With a combination of multi-sensor arrays, active noise cancellation and advanced algorithms, emteq labs is liberating this powerful tool and making it available to researchers, content creators and developers. By combining this with non-invasive heart rate and heart rate variability sensing, the emteqPRO offers a “lab-in-a-box” solution for conducting remote studies. This offers significant potential for researchers in media, marketing, gaming and psychology.
Virtual reality provides a powerful paradigm for measuring behaviour and simulating controlled realistic environments. However, the most salient facial information is covered by the headset, hence a different approach is needed.
Research using facial EMG spans neuroscience, psychology, human-computer interaction, gaming, content analytics and training. You can read a selection of research articles using facial EMG here.
Researchers at Emteq Labs and our collaborators have validated the use of our technology to assess facial EMG in a range of contexts. To read more, you can find articles here.
Whilst EMG has its benefits, it is limited to use in virtual environments: it requires skin contact, which prevents the technology from being used in daily life. For this reason, Emteq Labs developed and patented a novel method combining the best features of EMG (high sensitivity, rapid sample rate) and camera-based facial tracking (non-contact, low noise).
Optomyography applies the same principles as facial EMG, sampling the tiny changes in muscle activation at up to 6,000 times per second. Incorporating the technology into glasses avoids the disadvantages of camera-based solutions (privacy concerns) and allows continuous data capture regardless of lighting and head orientation.
Our patents
The company has a growing patent portfolio covering core requirements for behavioural measurement and analysis for XR which includes the following:
Patent No: GB 130027 (1 & 2), 130027EP (1 & 2), US130027
Optical Muscle sensor & optical expression detection. This invention relates to a system comprising wearable apparatus for detecting facial muscle activity, facial skin movement and facial expressions using optical sensors.
Patent No: GB 2518113, US10398373
Wearable apparatus for providing muscular biofeedback. The apparatus comprises biosensors able to detect activity of a set of facial muscles; a pattern in the sensor data that is characteristic of a facial muscle imbalance can then be detected. When such a pattern is detected, feedback may be provided to the wearer, informing them that their facial expression is imbalanced and allowing them to attempt to correct for the imbalance.
Patent No: GB 2552124, US10398373
Wearable apparatus for providing muscular biofeedback. The apparatus comprises sensors arranged to detect activity of the posterior auricular muscles (behind the ear), and by identifying patterns in this activity, the apparatus can infer a zygomaticus muscle (a cheek muscle used when smiling) is also active. Thus, the system provides an indirect and unobtrusive means by which the activity of the zygomaticus can be detected.
Patent No: GB1703133
A wearable system for detecting facial muscle activity. The system comprises several optical flow sensors arranged to image an area of the skin. Each imaged area of skin is associated with one or more facial muscles. Thus, a processor can be configured to determine the activity of a facial muscle by analysing how the images vary over time.
The company has also filed several other applications for protection of its IP, especially with regard to the proprietary sensors used by OCOsense, and is in the process of preparing additional applications.
Groundbreaking eyewear that revolutionizes the way we understand human behaviour. By seamlessly integrating wireless sensors and advanced machine learning, OCOsense empowers you to uncover profound insights in real-world or augmented reality environments.
Watch our webinar to learn more
Watch video on Youtube
Unparalleled Behavioural Analytics
Gain superior data quality and meaningful insights by mapping behavioural responses to activities and contexts.
Wearable Expression and Behaviour Sensing
OCOsense
Get yours now!