For the game “Channeler”, our main focus is on innovative gameplay mechanics built around eye-tracking technology. This unique peripheral presents challenges for both designers and programmers. The design side has unique challenges worthy of a post of its own; my focus here will be on the programming aspect.

To introduce readers to the direction of this post, let me present a scenario. In “Focal Distraction”, the codename for one of our earlier prototypes, the designers attempt to distract the player from spotting, and staying focused on, the unique entity in a crowd of similar-looking figures. The mechanic is simple in theory, but to get it right, the designer has to anticipate the player’s actions and work against them. It becomes something of a psychological game between the designer and the player. When designing a level like this, it is imperative to gather information about where the player is likely looking during a specific scripted event. Say, for example, we trigger an explosion of colors on the left side of the screen that we want to distract the player. As a designer, I would like to ask questions like “Does the average player get distracted by this event?” and “For how long does the average player stay distracted by it?”. This mechanic, and other gameplay features, pose questions that cannot be answered by simple observation, so a tool is needed to help the designers answer them. This is the reasoning that led to the development of our 3D Heat Map analytic tool.

What is a 3D Heat Map?

Before I talk about what a 3D Heat Map is, let’s first discuss what a heat map is for those not familiar. A heat map is a method of representing a collection of data on a coordinate plane through differences in color. Most people are familiar with geographic heat maps, where red typically indicates a higher concentration of whatever is being measured at a certain location on the map. This style of representing data is very effective for eye tracking: the screen becomes our coordinate plane, and the locations of the user’s gaze act as the collection of data we’re interested in analyzing. This 2D version of the heat map analytic tool is also in development; however, it will not be the focus of this article.

Example of an eye-tracking Heat Map analytic tool used for website development, pulled from a Google Image search.
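To make the 2D case concrete, here is a minimal, engine-free sketch of the binning behind a screen-space heat map (all names and grid dimensions are illustrative, not from the project): gaze samples are counted into coarse grid cells, and each cell's count is what later maps to a color.

```cpp
#include <array>
#include <cstddef>

// Coarse 16:9 grid over the screen; each cell accumulates gaze samples.
constexpr std::size_t kGridW = 16, kGridH = 9;

struct GazeGrid {
    std::array<int, kGridW * kGridH> counts{};  // zero-initialized bins

    // x, y are normalized screen coordinates in [0, 1).
    void AddSample(float x, float y) {
        const auto cx = static_cast<std::size_t>(x * kGridW);
        const auto cy = static_cast<std::size_t>(y * kGridH);
        if (cx < kGridW && cy < kGridH) counts[cy * kGridW + cx]++;
    }

    int CountAt(std::size_t cx, std::size_t cy) const {
        return counts[cy * kGridW + cx];
    }
};
```

Rendering is then just a matter of coloring each cell by its count relative to the busiest cell.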

Heat maps are most commonly associated with two-dimensional data; however, the data we’re interested in cannot be captured by a 2D coordinate. Let’s return to the scenario discussed earlier: an explosion appears for a short period of time, intended to distract the player. How do we know the user is staring at the explosion and not another object close by? What if the player is allowed to move? Knowing where the user is staring on screen tells us little about which object they are staring at when the camera’s perspective keeps changing. A simple heat map approach cannot answer these questions. The data we need to gather requires knowledge of the game’s 3D space: the location of the player and/or the specific object the player is staring at. This is where we have to extend how such data is represented, and so the 3D Heat Map was conceived.

What makes a heat map three-dimensional can differ by implementation. In our case, the idea of representing a collection of data through color remains the same, but since the data only concerns objects that are in focus, the color marks the object in-game rather than pixels on screen. This effectively communicates to the designer which object is receiving the most data points, where a data point here is a frame during which the object is being looked at.
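The color-marking idea above can be sketched as a simple ramp (the cold/hot colors and cap value below are illustrative assumptions, not the project's actual palette): an object's accumulated focus time, relative to some cap, picks a color between cold and hot, which in-engine would tint the actor's material.

```cpp
#include <algorithm>

struct Color { float r, g, b; };

// Maps accumulated focus time onto a cold-blue -> hot-red ramp.
// maxSeconds is the focus duration at which the ramp saturates.
Color HeatColor(float focusSeconds, float maxSeconds) {
    const float t = std::clamp(focusSeconds / maxSeconds, 0.0f, 1.0f);
    return Color{ t, 0.0f, 1.0f - t };  // lerp blue -> red
}
```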


As stated earlier, the 3D Heat Map is intended to record which objects the players are staring at during a gameplay session. This is achieved by exposing a Blueprint function that stores, in the analytic tool, what object is being stared at and for how long.


This function does not get called every frame on its own; it is meant to be placed in the section of code that can supply, each frame, the data it needs: the object in focus (Actor Hit) and the duration it has been in focus (Delta Time).
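The bookkeeping behind such a function might look like the following engine-free sketch (class and method names are hypothetical, and actors are keyed by name here rather than by pointer): each call is fed the currently focused object and the frame's delta time, and the tool accumulates a total focus duration per object.

```cpp
#include <string>
#include <unordered_map>

class FocusRecorder {
public:
    // Mirrors the Blueprint inputs: the hit actor (identified by name
    // in this sketch) and the frame's delta time in seconds.
    void RecordGazeFrame(const std::string& actorName, float deltaTime) {
        focusSeconds_[actorName] += deltaTime;
    }

    // Total time this object has been in focus so far.
    float TotalFocusSeconds(const std::string& actorName) const {
        auto it = focusSeconds_.find(actorName);
        return it == focusSeconds_.end() ? 0.0f : it->second;
    }

private:
    std::unordered_map<std::string, float> focusSeconds_;
};
```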

Although the Tobii EyeX provides functionality to detect whether objects are being focused on, I decided to implement my own gaze-focus detection to allow for additional features. Where Tobii provides the feature on the actor side (actors responding to focus gained/lost events), I needed it to come from the user side (access to whichever actor the user is currently staring at). This approach was chosen so that any object the user focuses on can be treated in the same manner: applying a depth-of-field effect to the focused object, for example, or, in this case, storing the object’s focus duration and other gaze information in the 3D Heat Map analytics.
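A much-simplified, engine-free sketch of this "user-side" detection (the data layout is an assumption for illustration): instead of actors subscribing to focus events, we ask, from the gaze point, which object currently lies under it. Objects are approximated here as circles in screen space; in-engine this would be a trace along the gaze direction into the scene.

```cpp
#include <cmath>
#include <string>
#include <vector>

struct ScreenObject {
    std::string name;
    float x, y, radius;  // screen-space center and pick radius
};

// Returns the name of the first object under the gaze point, or "" if none.
std::string FindFocusedObject(const std::vector<ScreenObject>& objects,
                              float gazeX, float gazeY) {
    for (const auto& obj : objects) {
        const float dx = gazeX - obj.x, dy = gazeY - obj.y;
        if (std::sqrt(dx * dx + dy * dy) <= obj.radius) return obj.name;
    }
    return "";
}
```

Because the query returns the focused actor itself, every consumer (depth-of-field, analytics, gameplay scripts) can share one detection path instead of each actor handling its own focus events.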

In the background (in the code implementation), the focused object is stored in a data structure that records any data related to it (mainly an accumulated time the object has been in focus), and a snapshot of the object on that frame is stored through the Visual Logging feature offered by Unreal. When a designer wants to review an average play session, he or she can look at the recording through an Excel spreadsheet or a 3D Heat Map visualization (the latter is in development). Designers can also look into specific play sessions and scrub through them frame by frame using Unreal’s Visual Logger window.
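The spreadsheet side of that review flow can be sketched as a plain CSV dump (a format Excel opens directly); the column names below are illustrative, not the tool's actual schema. A `std::map` keeps rows sorted by object name for stable output.

```cpp
#include <map>
#include <sstream>
#include <string>

// Serializes per-object focus totals as CSV for spreadsheet review.
std::string ToCsv(const std::map<std::string, float>& focusSeconds) {
    std::ostringstream out;
    out << "Object,FocusSeconds\n";
    for (const auto& [name, seconds] : focusSeconds)
        out << name << ',' << seconds << '\n';
    return out.str();
}
```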

To take further advantage of the Visual Logger, an extra Blueprint function was added to assist developers: it allows an object to be recorded in the Visual Logger either every frame or during a single frame. This lets level designers control what information they want to see from a playthrough session.



Below is a live demonstration of the Heat Map analytic tool in use. Notice the white orb trailing around the game screen: it marks the player’s gaze point on-screen. The heat map collects data on whichever object the player is staring at in the game and highlights it green.

Left: Gameplay session. Right: Unreal Editor’s 3D Heat Map visualization of that session.

Note that the Visual Logger only offers a simple scrubber, so this side-by-side demonstration will not be perfectly synchronized with the gameplay footage.



This has been a quick demonstration of the 3D Heat Map analytic tool developed in Unreal for the game “Channeler”. A follow-up post on this subject will take a more technical dive into the development of this in-editor 3D heat map tool and the issues that came with creating it.

