We’re initially focused on the Apple Vision Pro, which restricts app access to facial expression and eye gaze data. Applications like ours could use that data to deliver real value to the user. In the future, Apple could enable such access through an app permission prompt, much as it already lets users grant apps access to their surroundings or the microphone.
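For illustration, the sketch below uses the authorization flow visionOS already exposes for world sensing and hand tracking through ARKitSession; a facial-expression or eye-gaze permission, if Apple ever added one, could plausibly follow the same pattern. The helper name requestSensorAccess is our own, and the exact authorization types available depend on the visionOS version.

```swift
import ARKit

/// Hypothetical helper (our own name) showing the existing visionOS permission
/// flow. Today only types such as .worldSensing and .handTracking are exposed;
/// facial-expression or eye-gaze data would require a new, currently
/// nonexistent authorization type following the same prompt mechanism.
func requestSensorAccess() async {
    let session = ARKitSession()

    // Each authorization type triggers its own system permission prompt
    // the first time the app requests it.
    let results = await session.requestAuthorization(for: [.worldSensing, .handTracking])

    for (type, status) in results {
        // status is .allowed, .denied, or .notDetermined
        print("Authorization for \(type): \(status)")
    }
}
```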
In the past, capturing this kind of data meant a research subject visiting an affective computing laboratory. XR (VR/AR/MR) headsets now ship with a substantial suite of built-in sensors, included to make the user experience intuitive, minimize traditional user interfaces, and enable direct, unimpeded interaction between the user and their environment. Those same sensors can enable our AI to understand users' emotional states, forming the basis for meaningful and empathetic digital interactions.
The first facet of our Affective AI strategy applies affective computing methods: sensors and machine learning algorithms that analyze user interactions through tone of voice, speech content, spatial interactions, head movements, and environmental context.
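To make that concrete, here is a deliberately simplified Swift sketch of how signals from those channels could be fused into a single affect estimate. All type and field names are illustrative assumptions on our part, and the rule-based fusion merely stands in for the machine learning models described above.

```swift
import Foundation

// Illustrative feature vector: each field corresponds to one of the signal
// channels mentioned above. Names and scales are assumptions, not a real SDK.
struct InteractionFeatures {
    var voiceArousal: Double       // pitch/energy derived from tone of voice, 0...1
    var speechSentiment: Double    // -1 (negative) ... +1 (positive) from speech content
    var headMovementEnergy: Double // variance of head pose over a short window, 0...1
    var gestureRate: Double        // spatial interactions per minute
    var sceneContextScore: Double  // how calm (1) vs. busy (0) the surroundings are
}

enum EstimatedAffect {
    case calm, engaged, frustrated, unknown
}

// A hand-written fusion step standing in for a trained model: it only
// illustrates how several channels combine into one affect estimate.
func estimateAffect(from f: InteractionFeatures) -> EstimatedAffect {
    if f.speechSentiment < -0.4 && f.voiceArousal > 0.6 {
        return .frustrated
    }
    if f.voiceArousal > 0.5 || f.gestureRate > 10 {
        return .engaged
    }
    if f.headMovementEnergy < 0.2 && f.sceneContextScore > 0.7 {
        return .calm
    }
    return .unknown
}
```

In practice, each field would be produced by its own model (prosody analysis, transcript sentiment, head-pose tracking, scene understanding), and the fusion step would itself be learned rather than hand-written.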
Interpreting Human Emotions