Logics
Understand what logics are and the role they play in solution design.
Logics customize and guide the platform's interactions with users. They are the key to personalizing the experience your solution provides. They work hand in hand with the other core components of projects (entities, objects, actions, attributes, and interaction channels) to define when, for whom, and which actions are delivered over the course of running your solution.
For example, you can configure a logic that plays a personalized greeting (audio message) whenever a user enters a room in the experience environment.
Logic components
When creating logics for Affective Computing (powered by Virtue) projects, you must configure the following components:
Activator
The subject of the logic; the recipient for whom the logic is activated.
You can configure any of the following activators:
Profile
Activates the logic for all users who have a specific psychological profile.
Entity
Activates the logic for a specific VH (by entity ID).
Condition
Conditions or events that can trigger the logic.
You can configure any of the following conditions:
Environmental
Triggers the logic when the environmental condition is met.
Location
Speed
Time
External signal
Camera ID
Crowd counting
User motion
Triggers the logic when user motion (or presence) is detected.
User detected
User geotargeting
Triggers the logic based on specific conditions related to pinpointing the user's location.
Location
QR
Camera
Beacon
Action
Action(s) delivered when the logic is triggered. You can configure the logic to deliver one specific action, or any action grouped under one or more attributes.
Operators
Logical operators (AND and OR) that define the flow of the logic and enable you to introduce multiple conditions and rules of interdependence between conditions and actions. A sketch that combines conditions with AND appears at the end of the blueprint section below.
To set up the logic described in the previous example (that plays a personalized greeting when a user enters a room), you can configure the components as follows:
Activator: Profile > All
Condition: User motion > User detected
Action: Depending on how you have parameterized your solution, configure either the specific action, or the interaction or triggered attribute that contains the action of playing the personalized audio greeting message.
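As a concrete illustration, this greeting logic could be written out as a declarative configuration. The sketch below is illustrative only: the field names (activator, condition, action) mirror the components described in this section, not the platform's actual schema.

```python
# Illustrative sketch only: these keys mirror the logic components
# described above; they are not the platform's actual schema.
greeting_logic = {
    "name": "personalized-room-greeting",
    "activator": {"type": "profile", "value": "all"},
    "condition": {"type": "user_motion", "value": "user_detected"},
    "action": {
        # Either a specific action or an attribute grouping the greeting
        # actions, depending on how the solution is parameterized.
        "type": "interaction_attribute",
        "value": "greeting-messages",
    },
}
```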
Setting up logic blueprints
Setting up a logic blueprint means configuring the individual logic components to define the flow of the logic. For each component type, provide the input and expect the behavior described below.
Activators

Profile
Input: No input required; select a profile type (Self-centered, Average, Role model, Reserved, or All) from the drop-down menu.
Behavior: The logic is activated only for users who have the selected type of psychological profile.

Entity
Input: The entity ID of a VH.
Behavior: The logic is activated only for the specified Virtual Human (VH).

Safety > Heart rate
Input: A number between 50 and 200. Note: To set the values using operators, you must also include this type as a condition.
Behavior: The logic is activated based on end-users' heart rates. Note: You can use this activator for solutions in the safety domain.
Conditions

Environmental > Location
Input: No input required; select the coordinates of the required location (which you previously declared as an interaction input) from the drop-down menu.
Behavior: The logic is triggered when the defined location condition is met. For example, you can set up the logic to be triggered when a user arrives at a location.

Environmental > Speed
Input: A number in km/h.
Behavior: The logic is triggered when the defined speed condition is met. For example, you can set up the logic to be triggered when a user is travelling faster than the defined speed.

Environmental > Time
Input: A time in HHMMSS format.
Behavior: The logic is triggered when the defined time condition is met. For example, you can set up the logic to be triggered at a specific time.

Environmental > External signal
Input: No input required; select the name of the required object (which you previously declared as an interaction input) from the drop-down menu.
Behavior: The logic is triggered when the defined external signal condition is met. For example, you can set up the logic to be triggered when a traffic light turns red (the external signal).

Environmental > Camera ID
Input: No input required; select either the name of the required camera (which you previously declared as an interaction input) from the drop-down menu, or the (current) camera in the user's device.
Behavior: The logic is triggered when the defined camera ID condition is met. For example, you can set up the logic to be triggered if a user is detected through facial recognition by a specific camera.

Environmental > Crowd counting
Input: A number.
Behavior: The logic is triggered when the defined crowd counting condition is met. For example, you can set up the logic to be triggered if more than a specific number of people have gathered at a location.

Environmental > Object
Input: No input required; select the name of the required object (which you previously declared as an interaction input) from the drop-down menu.
Behavior: The logic is triggered when the defined object condition is met. For example, you can set up the logic to be triggered if a traffic light (object) turns red.
User motion > User is detected
Input: A binary value (True/False).
Behavior: The logic is triggered when the user is detected. For example, you can set up the logic to be triggered if facial recognition detects a user.
User geotargeting > Location
Input: The required location (interaction input).
Behavior: The logic is triggered when a user's location is identified based on a location interaction input. For example, you can set up the logic to be triggered if the user is present at a location.

User geotargeting > QR
Input: The required QR code (interaction input).
Behavior: The logic is triggered when a user's location is identified based on a QR code scan (interaction input). For example, you can set up the logic to be triggered if the user scans a specific QR code (which is at a specific location).

User geotargeting > Camera
Input: The required camera (interaction input).
Behavior: The logic is triggered when a user's location is identified through a camera (interaction input). For example, you can set up the logic to be triggered if the user is detected by a specific camera (which is at a specific location) through facial recognition.

User geotargeting > Beacon
Input: The required beacon (interaction input).
Behavior: The logic is triggered when a user's location is identified by a beacon (interaction input). For example, you can set up the logic to be triggered if the user's positioning beacon indicates that they are at a specific location.
Profile > Anomaly
Input: A percentage.
Behavior: The logic is triggered if the user deviates from the goal of the task by more than the defined threshold. Note: You can configure this condition as a baseline for EDAA™, the underlying technology that powers Affective Computing (powered by Virtue).
Actions

Specific action
Input: The required (previously created) action.
Behavior: The logic delivers the selected action.

Content attribute
Input: The required content attribute.
Behavior: The logic delivers a content action from the pool of (content) actions grouped under the selected content attribute.

Interaction attribute
Input: The required interaction attribute.
Behavior: The logic delivers an interaction action from the pool of (interaction) actions grouped under the selected interaction attribute.

Triggered attribute
Input: The required triggered attribute.
Behavior: The logic delivers a triggered action from the pool of (triggered) actions grouped under the selected triggered attribute.
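As noted in the Operators description earlier, the AND and OR operators combine several conditions in one blueprint. The hypothetical sketch below shows a logic that triggers only when both an environmental time condition and a crowd counting condition hold; the structure and field names are assumptions for illustration, not the platform's schema.

```python
# Hypothetical sketch: an AND operator requires both conditions to hold
# before the action is delivered. Field names are illustrative only.
evening_crowd_logic = {
    "name": "evening-lobby-welcome",
    "activator": {"type": "profile", "value": "all"},
    "conditions": {
        "operator": "AND",
        "items": [
            {"type": "environmental.time", "value": "180000"},      # at 18:00:00 (HHMMSS)
            {"type": "environmental.crowd_counting", "value": 10},  # more than 10 people gathered
        ],
    },
    "action": {"type": "content_attribute", "value": "welcome-announcements"},
}
```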
Logic types
In Affective Computing (powered by Virtue), you can work with logics of the following types:
User calibration
Logics triggered only when the project is running in user calibration mode.
Action
Logics that result in action delivery.
Logic anchoring
EDAA™, the underlying technology that powers Affective Computing (powered by Virtue), can generatively evolve and fine-tune your logics to generate personalized actions for end-users and deliver them in the most effective way. This enables you to incorporate a high degree of personalization in your solutions.
However, you can also prevent generative evolution of your logics by anchoring them.
The following combinations describe how your solution behaves depending on whether the logic and its actions are anchored:
Logic anchored, actions anchored
The solution always uses the logic exactly as configured and delivers an action from the same pool. Configure logic anchoring in this manner to have absolute control over the solution's behavior and performance. Note: This configuration is a good choice when initially designing your solution. Later, when you gain more expertise with Affective Computing (powered by Virtue), you can remove the anchors as required.

Logic anchored, actions non-anchored
The solution uses the logic exactly as configured but uses variations of the actions, generated according to their attribution. This improves the efficiency of the logic with respect to the actions' purpose. For example, if you enable generative evolution of the action that delivers a greeting message ("Hi!"), other similar statements ("Good morning", "Hello there!") relevant to the same conditions might be generated and delivered. Note: Generated actions are eligible for inclusion in the action pool only after they are validated.

Logic non-anchored, actions anchored
Affective Computing (powered by Virtue) generatively changes the logic but doesn't create variations of the anchored actions, so the solution might deliver the same actions under different conditions. For example, if an anchored action that delivers a greeting message is configured for a non-anchored logic triggered upon user detection, the same action might be used in a similar logic designed to greet users when they arrive at a specified location (such as a room in the experience environment).

Logic non-anchored, actions non-anchored
The solution fully utilizes Affective Computing (powered by Virtue)'s generative capabilities: it generates better iterations of the logic, delivering appropriate actions that yield better engagement with higher efficiency.
Anchoring reduces personalization; we therefore recommend anchoring logics only if your solution really requires it.
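A minimal sketch of how anchoring might be expressed in configuration; the anchored flags and field names are illustrative assumptions, not actual platform fields.

```python
# Illustrative sketch: independent anchoring flags for the logic and its
# actions produce the four behaviors described above.
logic_config = {
    "name": "personalized-room-greeting",
    "anchored": True,   # True: the logic is used exactly as configured;
                        # False: EDAA may generatively evolve the logic.
    "action": {
        "type": "interaction_attribute",
        "value": "greeting-messages",
        "anchored": False,  # False: validated variations of the greeting
                            # ("Good morning", "Hello there!") may be
                            # generated and delivered.
    },
}
```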
Logic hierarchy in projects
Logics have the following hierarchy, which determines how their parameterization is inherited over different levels:
Parent > Domain > Tenant > Project
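One way to picture this inheritance, assuming each level can override its parent's parameterization and the most specific level wins; the sketch below is conceptual, not platform code.

```python
# Conceptual sketch of parameter inheritance along the hierarchy:
# Parent > Domain > Tenant > Project. A level inherits every parameter
# it does not override itself.
def resolve_parameters(parent, domain, tenant, project):
    effective = {}
    for level in (parent, domain, tenant, project):
        effective.update(level)  # more specific levels override earlier ones
    return effective

# Example: the tenant overrides a value set at the parent level.
params = resolve_parameters(
    parent={"greeting_volume": 5, "language": "en"},
    domain={},
    tenant={"greeting_volume": 9},
    project={},
)
assert params == {"greeting_volume": 9, "language": "en"}
```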
Creating and managing logics in projects
You can create and manage logics using the Portal, our no-code SaaS GUI platform, or through a low-code integration with our APIs.
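For the low-code path, creating a logic would come down to a request against the APIs. The endpoint, payload shape, and authentication below are placeholders for illustration; consult the API reference for the actual contract.

```python
import requests  # third-party HTTP client; assumed available

# Placeholder endpoint and payload for illustration only; the actual
# API contract is documented in the API reference.
API_BASE = "https://api.example.com/v1"

response = requests.post(
    f"{API_BASE}/projects/my-project/logics",
    headers={"Authorization": "Bearer <your-api-token>"},
    json={
        "name": "personalized-room-greeting",
        "activator": {"type": "profile", "value": "all"},
        "condition": {"type": "user_motion", "value": "user_detected"},
        "action": {"type": "interaction_attribute", "value": "greeting-messages"},
    },
)
response.raise_for_status()
print(response.json())  # the created logic, including its ID
```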
Working with logics using the Portal
Creating logic blueprints
Working with logics using APIs
Related information
Logic blueprinting