Getting Started

Using the cognitiveVR SDK for Unity

Welcome! This SDK allows you to integrate your Unity projects with cognitiveVR, which provides analytics and insights into your users' experiences. In addition, cognitiveVR empowers you to take actions that will improve users' engagement with your experience.

Last Updated: Feb 24, 2017

Getting Started Video Guide


Prerequisites

Setup

Import the .unitypackage file (Assets > Import Package > Custom Package...)

This window will pop up if everything was imported correctly.

Settings Window

Log in using your cognitiveVR account. You can create an account online here.

Settings Window

Select your Organization and Product from the dropdown menus. If you only have one organization and one product, these will be automatically selected when you log in.

Select Test or Prod. This lets you keep the analytics data from your internal testing separate from that of your production build. Select Test for now.

Settings Window

Select SDK

Select SDK

If you are using a VR SDK such as SteamVR, Oculus Utilities or the Fove Plugin, select it here to enable additional functionality. If you are not using one of these SDKs, or you are using one we do not currently support, select Unity Default VR.

Track Player Actions

Component Window

You will be prompted to add the CognitiveVR_Manager. This prefab will persist between different scenes.
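As a rough illustration (not the SDK's actual implementation), a Unity object survives scene loads using the standard DontDestroyOnLoad pattern shown below; PersistentManagerExample is just an illustrative name.

```csharp
using UnityEngine;

// Illustrative sketch only: the standard Unity pattern a persistent manager
// like CognitiveVR_Manager relies on. The real prefab ships with the SDK.
public class PersistentManagerExample : MonoBehaviour
{
    static PersistentManagerExample instance;

    void Awake()
    {
        // Keep a single instance alive across scene changes.
        if (instance != null) { Destroy(gameObject); return; }
        instance = this;
        DontDestroyOnLoad(gameObject);
    }
}
```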

  • Interval for Player Snapshot - This is the delay between taking snapshots of the player. A snapshot records the player's position, rotation and gaze point (a sketch of this interval loop follows below).
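For illustration only, here is a minimal sketch of such an interval-driven snapshot loop; SnapshotLoopExample, snapshotInterval and RecordSnapshot are hypothetical names, and the SDK's manager handles all of this for you.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch of an interval-driven snapshot loop, only to illustrate
// what "Interval for Player Snapshot" controls. The SDK does this for you.
public class SnapshotLoopExample : MonoBehaviour
{
    public float snapshotInterval = 1.0f; // seconds between snapshots

    IEnumerator Start()
    {
        var wait = new WaitForSeconds(snapshotInterval);
        while (true)
        {
            RecordSnapshot(transform.position, transform.rotation);
            yield return wait;
        }
    }

    void RecordSnapshot(Vector3 position, Quaternion rotation)
    {
        // Placeholder: the real snapshot also includes the gaze point.
        Debug.Log("Snapshot at " + position);
    }
}
```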

Batching Data

  • Evaluate Gaze in Real Time - Calculating the gaze point during gameplay incurs a slight performance hit. If this is disabled, gaze points are instead calculated in a batch OnLevelLoaded, OnQuit or OnHMDRemove, which often causes a noticeable framerate hit, so it should happen during non-critical gameplay sections.
  • Gaze Snapshot Threshold - Once this many gaze points have been calculated, their positions are sent to SceneExplorer. If Evaluate Gaze in Real Time is disabled, the gaze points are calculated and sent once this threshold is reached.
  • Transactions Threshold - This many Transactions will be batched together before being sent to the server and SceneExplorer.
  • Dynamic Object Snapshot Threshold - This many snapshots of Dynamic Objects will be batched together before being sent to SceneExplorer.
  • Sensor Data Threshold - This many snapshots of Sensors will be batched together before being sent to SceneExplorer. (All of these thresholds use the same batching pattern, sketched after this list.)
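Below is a minimal sketch of that threshold-based batching pattern: snapshots accumulate locally and are flushed once the count reaches the threshold. SnapshotBatcherExample and its members are hypothetical names, not the SDK's internals.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical illustration of threshold-based batching, as described above.
// The SDK batches gaze, transaction, dynamic object and sensor data similarly.
public class SnapshotBatcherExample
{
    readonly int threshold;
    readonly List<string> pending = new List<string>();

    public SnapshotBatcherExample(int threshold)
    {
        this.threshold = threshold;
    }

    public void Add(string serializedSnapshot)
    {
        pending.Add(serializedSnapshot);
        if (pending.Count >= threshold)
        {
            Flush();
        }
    }

    public void Flush()
    {
        if (pending.Count == 0) { return; }
        // Placeholder for the network call that sends the batch to SceneExplorer.
        Debug.Log("Sending batch of " + pending.Count + " snapshots");
        pending.Clear();
    }
}
```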

Sending Data

  • Send Data on Level Load - Any outstanding gaze, transaction, sensor or dynamic object data will be sent to SceneExplorer. Note that because this happens after the new scene has loaded, the data is sent for the previous scene.
  • Send Data On Quit - Send any outstanding data using Unity's OnApplicationQuit. This may not work as expected on mobile platforms.
  • Send Data On HMD Remove - Send any outstanding data when the HMD is removed from the user's head. This currently requires SteamVR or Oculus Utilities. (A sketch of how these options map onto Unity lifecycle events follows below.)
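This is only an illustration of the Unity lifecycle hooks those options correspond to; SendHooksExample and SendOutstandingData() are hypothetical stand-ins for the SDK's own send logic.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Illustrative only: the Unity lifecycle events the sending options map onto.
// SendOutstandingData() is a hypothetical stand-in for the SDK's send call.
public class SendHooksExample : MonoBehaviour
{
    void OnEnable()  { SceneManager.sceneLoaded += OnSceneLoaded; }
    void OnDisable() { SceneManager.sceneLoaded -= OnSceneLoaded; }

    void OnSceneLoaded(Scene scene, LoadSceneMode mode)
    {
        // "Send Data on Level Load" - the data belongs to the previous scene.
        SendOutstandingData();
    }

    void OnApplicationQuit()
    {
        // "Send Data On Quit" - may not fire reliably on mobile platforms.
        SendOutstandingData();
    }

    void SendOutstandingData()
    {
        Debug.Log("Flushing outstanding gaze, transaction, sensor and dynamic object data");
    }
}
```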

The Tracker Options section allows you to quickly add code snippets to your project. These range from additional information (Room Size Tracker) to usability events (Occlusion Event). See the Sample Scripts page for a short description of each component.

Send Data

Navigate to our Dashboard, then to the Test branch of your product.

Integrations Tab

Then navigate to the Debugger page.

Integrations Tab

Now enable the debugger.

Switch back to your Unity project and run the scene in the Editor. If you return to the dashboard, you will see your data start streaming into our cloud!

Integrations Data

You are now tracking your users' basic device data, including GPU, CPU, OS and RAM.
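For reference, this is the kind of device data Unity exposes through its SystemInfo API; the SDK collects it for you, but you can log the same values yourself with a small sketch like this (DeviceInfoExample is just an illustrative name).

```csharp
using UnityEngine;

// For reference only: Unity SystemInfo properties that expose the same kind of
// device data (GPU, CPU, OS, RAM) that appears on the dashboard.
public class DeviceInfoExample : MonoBehaviour
{
    void Start()
    {
        Debug.Log("GPU: " + SystemInfo.graphicsDeviceName);
        Debug.Log("CPU: " + SystemInfo.processorType);
        Debug.Log("OS:  " + SystemInfo.operatingSystem);
        Debug.Log("RAM: " + SystemInfo.systemMemorySize + " MB");
    }
}
```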

Next