Objects
Understand what objects are and how to work with them.
Objects are the "things" in the 3D environment that Virtual Humans (VHs) can interact with in simulations.
For example, in a solution that simulates a shopping experience in a mall to study the purchasing trends of different psychological profiles, you can declare the items that the VHs (each powered by one of those profiles) can buy as objects in the project.
During the simulation, if the state or properties of an object need to change as part of the solution experience, you can map the object to a triggered action. For example, the lights in a room turn on as a result of some activity during the simulation, such as a VH walking past a motion sensor or manually flipping a switch.
You can map each object to only one triggered action. Therefore, when declaring (creating and configuring) objects for a solution, we recommend a high degree of specificity. For example, consider a laptop declared as an object. The simulation solution depends on the laptop in the following ways:
- Inputs are sent to the solution through the laptop; the VHs interact with the UI of a pre-loaded app displayed on the screen.
- Outputs are received through the laptop; music is played through the laptop's speaker as part of the experience.
- The event of the laptop shutting down when its battery dies is part of the experience flow.
For this solution, rather than declaring only the laptop itself as an object, also declare the interactive app and the laptop's speaker as separate objects. You can then map a triggered action to each of them.
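The one-triggered-action-per-object rule above can be sketched as a small data model. This is an illustrative sketch only; the class and action names below are hypothetical and do not represent the Orchestra APIs.

```python
# Hypothetical sketch of the "one triggered action per object" rule.
# SimObject and the action names are illustrative, not the Orchestra API.

class SimObject:
    """A declared object that can carry at most one triggered action."""

    def __init__(self, name):
        self.name = name
        self.triggered_action = None

    def map_action(self, action):
        # Enforce the rule: each object maps to only one triggered action.
        if self.triggered_action is not None:
            raise ValueError(f"{self.name} already has a triggered action")
        self.triggered_action = action

# Declaring the laptop's interactive parts as separate objects lets each
# behavior carry its own triggered action.
laptop = SimObject("laptop")
app = SimObject("laptop_app")
speaker = SimObject("laptop_speaker")

laptop.map_action("shut_down_when_battery_dies")
app.map_action("receive_vh_input")
speaker.map_action("play_music")
```

If everything were declared as a single laptop object, the second `map_action` call would raise an error, which is why the recommendation is to declare the app and speaker as their own objects.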
Creating and managing objects in projects
You can create and manage objects through a low-code integration with our Orchestra APIs:
Creating and managing objects

Related information: Simulation