Fologram streams users' devices, gestures and scans to the Rhino viewport in real time.
You can display users in the Rhino viewport to easily see where they are and which parts of the Rhino model they are looking at in mixed reality. You can also stream the spatial mesh from HoloLens and supported iOS devices to understand how geometry in Rhino is positioned relative to a user's physical environment. Hands, controllers, cursors and gaze can also be streamed to the viewport to communicate how a user is interacting with objects.
Toggling User Visualization
You can toggle what data is visible in the Rhino viewport from the Visualization tab. User visualization displays a preview of all connected mobile devices and headsets, a text tag with the device name and any currently tracked hands, controllers, cursors and gaze.
Mesh visualization displays a preview of the spatial mesh for all users. Note that if connected users have positioned the model differently, the spatial mesh for each user will be in a different coordinate space even if those users are in the same physical location.
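The coordinate-space caveat above can be illustrated with a small sketch. This is not Fologram's API, just plain Python showing why the same physical point lands at different model-space coordinates for two users who placed the model differently: each user's placement defines its own transform, so their spatial meshes only line up after accounting for it. The transforms below are hypothetical examples.

```python
# Illustrative sketch (not Fologram's API): each user's model placement
# defines a transform, so a single physical point maps to different
# model-space coordinates per user until that transform is applied.

def apply_transform(matrix, point):
    """Apply a 4x4 row-major transform to a 3D point (x, y, z)."""
    x, y, z = point
    return tuple(row[0] * x + row[1] * y + row[2] * z + row[3]
                 for row in matrix[:3])

# Hypothetical placements: user A left the model at the origin,
# user B moved it 5 units along X.
identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
shifted  = [[1, 0, 0, 5], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

physical_point = (1.0, 2.0, 0.0)
point_for_a = apply_transform(identity, physical_point)  # (1.0, 2.0, 0.0)
point_for_b = apply_transform(shifted, physical_point)   # (6.0, 2.0, 0.0)
print(point_for_a, point_for_b)
```

Even though both users stand at the same physical spot, their spatial meshes report different coordinates, which is why the previews in the Rhino viewport do not overlap.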
Trackable visualization displays any currently detected QR codes as rectangles in the Rhino viewport.
Performance and Battery Life
Fologram redraws your Rhino viewport whenever new tracking data arrives from a connected user. This data is typically received at around 10fps in order to keep user information up to date. Redrawing at this rate can introduce a heavy CPU and GPU load if your viewport is already slow to render (for instance in Raytraced display mode with large amounts of geometry), so disable the user preview if you need to conserve battery life or free up resources for other applications.
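The redraw behaviour described above can be sketched as a simple rate limiter. This is a conceptual illustration in plain Python, not Fologram's implementation: incoming tracking updates only trigger a redraw if enough time has passed since the last one, capping the redraw rate (here at an assumed 10fps) regardless of how fast updates arrive.

```python
# Conceptual sketch (not Fologram source): cap viewport redraws at a
# target frame rate so frequent tracking updates don't overload the GPU.
import time

class RedrawThrottle:
    def __init__(self, max_fps=10):
        self.min_interval = 1.0 / max_fps
        self.last_redraw = float("-inf")  # allow the first redraw immediately

    def should_redraw(self, now=None):
        """Return True (and record the time) if a redraw is due."""
        now = time.monotonic() if now is None else now
        if now - self.last_redraw >= self.min_interval:
            self.last_redraw = now
            return True
        return False

throttle = RedrawThrottle(max_fps=10)
# Simulate tracking updates arriving at 60fps for one second:
# only every sixth update triggers a redraw.
redraws = sum(throttle.should_redraw(now=i / 60.0) for i in range(60))
print(redraws)  # 10
```

A throttle like this trades a small amount of preview latency for a predictable upper bound on rendering load, which is the same trade-off you make manually when you toggle the user preview off.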