Diagnostics and calibration
Learn how Affective Computing (powered by Virtue) establishes users' psychological profiles and calibrates solutions based on both motivation and end-users' psychological states.
EDAA™, the underlying technology that powers Affective Computing, embodies the principles of the Theory of Mind (ToM). Therefore, it is capable of identifying and analyzing its users' psychological profiles, detecting variations in their emotional states, and calibrating the solution to provide personalization with a high degree of human-centricity.
EDAA™ understands its users' psychological profiles through two internal processes, diagnostics and calibration:
Diagnostics is the process of analyzing and establishing a preliminary (baseline) psychological profile for each user.
Calibration is the process that helps EDAA™ understand both the solution’s context and the current psychological state (mood) of its users. Calibration happens at two levels: product and user.
Advantages
Diagnostics and calibration enable you to build solutions that provide automated real-time personalization that is accurately tailored to users' psychological profiles and current emotional states.
Through these processes, Affective Computing (powered by Virtue) overcomes the common limitations of typical machine learning tools: EDAA™ doesn't require large volumes of training data, long periods of training, or manual resources for training and model management. Moreover, as EDAA™ analyzes users' motivations and emotional states in real time rather than relying on training data, it also avoids the biases inherent in such data sets.
As a result, your solutions can provide accurate personalization and more humanized interactions, improving user engagement and effectiveness. Your solutions can deliver relevant, adaptive experiences to users.
Use cases
Validation: You can build solutions to virtually validate a product, service, or experience before moving forward with production. The validation takes place in a simulated environment and uses Affective Computing-powered Virtual Humans (emotionally-driven non-playable characters) based on cloned or augmented emotional data of real human users. Affective Computing's diagnostics and calibration capabilities can accurately detect variations in the VHs' emotional states and help align the solution with the product's and users' motivations.
Personalized user experiences: Affective Computing-powered VHs can customize their interactions with real human users based on the users' current moods.
Human safety: You can leverage Affective Computing (powered by Virtue)'s capabilities to analyze users' psychological profiles and moods to predict and detect anomalous behavior that would endanger humans. You can build solutions that deliver preventative actions, such as mood enhancement.
How it works
Diagnostics
Diagnostics is the process of analyzing and establishing a preliminary (baseline) psychological profile for each user. This process takes place when a new user interacts with an Affective Computing-powered solution for the first time.
To provide accurate personalization, users must be "diagnosed" by EDAA™ before interacting with the solution. Diagnostics is the first step of this diagnosis, as it enables EDAA™ to determine each user's preliminary psychological profile.
During diagnostics, EDAA™ presents each new user with a concrete scenario (called the diagnostic interaction) in which it poses a set of questions (typically 5-7).
The questions aim to trigger unconscious emotional responses from users and activate their inefficient behaviors, which in turn gives EDAA™ enough information to determine and generate their initial psychological profile.
When users participate in the diagnostics process, EDAA™ analyzes the cognitive processes revealed by their physiological and behavioral responses.
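Conceptually, the diagnostic interaction is a short question loop whose responses are aggregated into a baseline profile. The sketch below is purely illustrative: the class, field, and function names are hypothetical and are not part of the EDAA™ API, and real diagnostics analyzes physiological and behavioral responses rather than simple labels.

```python
from dataclasses import dataclass, field


@dataclass
class BaselineProfile:
    """Hypothetical container for a user's preliminary psychological profile."""
    responses: list = field(default_factory=list)

    def dominant_trait(self) -> str:
        # Illustrative aggregation: the most frequent trait label
        # revealed across the diagnostic questions.
        return max(set(self.responses), key=self.responses.count)


def run_diagnostics(questions, answer_fn) -> BaselineProfile:
    """Pose the diagnostic questions (typically 5-7) and build a baseline profile."""
    profile = BaselineProfile()
    for question in questions:
        # Each answer is assumed to reveal a trait label; this stands in
        # for EDAA™'s analysis of unconscious emotional responses.
        profile.responses.append(answer_fn(question))
    return profile
```

For example, `run_diagnostics(["Q1", "Q2", "Q3"], answer_fn)` would return a profile whose `dominant_trait()` reflects the most common signal in the user's answers.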
In projects, diagnostics is set up through logics (for detailed information, see Understanding project components). When the diagnostic logic is invoked, a relevant question is delivered to the user. (You can group related questions under the same attribute when parameterizing projects so that any of them can be delivered when the respective logic is triggered.)
You can use separate attributes to group actions relevant to different personas so that you can set up logics to take personas into consideration during action delivery.
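As an illustration of grouping questions under attributes and taking personas into account, here is a minimal sketch. The data layout, attribute names, and function name are hypothetical and do not reflect the actual project-parameterization format.

```python
import random

# Hypothetical parameterization: related questions grouped under one
# attribute, with a separate attribute per persona.
DIAGNOSTIC_ATTRIBUTES = {
    "new_user": [
        "How do you usually start your day?",
        "What do you do when plans change suddenly?",
    ],
    "returning_user": [
        "What has changed since your last session?",
    ],
}


def deliver_question(persona: str) -> str:
    """When the diagnostic logic is triggered, deliver any question
    from the attribute matching the user's persona."""
    return random.choice(DIAGNOSTIC_ATTRIBUTES[persona])
```

Grouping by persona this way lets the same logic serve different audiences without duplicating the delivery rule.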
Product calibration
Product calibration is the operation mode that enables EDAA™ to establish the solution’s reality (context). In this mode, Affective Computing (powered by Virtue) validates and aligns the solution’s end goal with the ideal outcome of the experience.
Affective Computing (powered by Virtue) projects run in this mode during initial solution deployment. You can also manually activate or deactivate this mode for your projects at any time.
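The activation behavior described above can be pictured with a minimal sketch, assuming a hypothetical project object (none of these names come from the actual SDK):

```python
class Project:
    """Hypothetical project wrapper illustrating product-calibration mode."""

    def __init__(self):
        self.product_calibration = False

    def deploy(self):
        # Product calibration runs automatically during initial deployment,
        # while EDAA™ establishes the solution's reality (context).
        self.product_calibration = True

    def set_product_calibration(self, enabled: bool):
        # The mode can also be toggled manually at any time.
        self.product_calibration = enabled
```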
User calibration
User calibration is the process of validating and adjusting the previously established psychological profile of existing users based on their current psychological states.
This process occurs at the beginning of each interaction session between the solution and existing users, and ensures that Affective Computing (powered by Virtue) considers each user’s latest emotional state when delivering personalization. This process typically takes up to 3 minutes.
However, based on the solution's requirements, you can customize its duration and even automate it. For example, you can set it up to require a minimum number of interactions.
During user calibration, EDAA™ operates in passive mode, collecting inputs from the user to determine whether the previously established psychological profile needs to be updated.
EDAA™ does this by analyzing physiological inputs, posing questions and analyzing responses, or observing user behavior. EDAA™ then generates motivational checkpoints for users that you can verify manually or through an API endpoint.
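The session-start flow might be sketched as follows. The minimum-interaction rule, the profile layout, and the checkpoint structure are hypothetical illustrations of the behavior described above, not the real implementation.

```python
def calibrate_user(profile: dict, observations: list, min_interactions: int = 3):
    """Passively collect inputs at session start and decide whether the
    previously established profile needs updating. Returns the (possibly
    updated) profile and a list of motivational checkpoints to verify."""
    checkpoints = []
    if len(observations) < min_interactions:
        # Not enough inputs yet: keep the baseline profile as-is.
        return profile, checkpoints
    current_mood = observations[-1]  # illustrative: latest signal wins
    if current_mood != profile.get("mood"):
        # The profile no longer matches the user's current state,
        # so update it and emit a checkpoint for verification.
        profile = {**profile, "mood": current_mood}
        checkpoints.append({"type": "mood_update", "value": current_mood})
    return profile, checkpoints
```

In a real project, the checkpoints would be verified manually or through the relevant API endpoint; here they are plain dictionaries for illustration.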