Tobii

Platforms:

  • Windows 7 / 10

  • Linux

  • macOS

Required Python Version:

  • Python 3.6+

Supported Models:

Tobii Pro eye tracker models that are supported by the tobii_research Python package. For a complete list, please visit Tobii support.

Additional Software Requirements

To use the ioHub interface for Tobii, the Tobii Pro SDK must be installed in your Python environment. If you are using a recent standalone installation of PsychoPy®, this package should already be included.

To install tobii-research type:

pip install tobii-research
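Before launching ioHub, it can be useful to confirm that the SDK is actually importable in the current environment. A minimal sketch (the helper name is illustrative, not part of the PsychoPy API):

```python
# Hedged sketch: check whether the tobii-research package is importable
# before starting ioHub. tobii_sdk_available() is an illustrative helper,
# not part of PsychoPy or tobii_research.
import importlib.util

def tobii_sdk_available():
    """Return True if the tobii_research module can be found."""
    return importlib.util.find_spec("tobii_research") is not None

if not tobii_sdk_available():
    print("tobii-research is missing; install it with: pip install tobii-research")
```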

EyeTracker Class

class psychopy.iohub.devices.eyetracker.hw.tobii.EyeTracker[source]

To start iohub with a Tobii eye tracker device, add the Tobii device to the dictionary passed to launchHubServer or the experiment’s iohub_config.yaml:

eyetracker.hw.tobii.EyeTracker

Examples

  1. Start ioHub with a Tobii device and run tracker calibration:

    from psychopy.iohub import launchHubServer
    from psychopy.core import getTime, wait
    
    iohub_config = {'eyetracker.hw.tobii.EyeTracker':
        {'name': 'tracker', 'runtime_settings': {'sampling_rate': 120}}}
    
    io = launchHubServer(**iohub_config)
    
    # Get the eye tracker device.
    tracker = io.devices.tracker
    
    # run eyetracker calibration
    r = tracker.runSetupProcedure()
    
  2. Print all eye tracker events received for 2 seconds:

    # Check for and print any eye tracker events received...
    tracker.setRecordingState(True)
    
    stime = getTime()
    while getTime()-stime < 2.0:
        for e in tracker.getEvents():
            print(e)
    
  3. Print current eye position for 5 seconds:

    # Check for and print current eye position every 100 msec.
    stime = getTime()
    while getTime()-stime < 5.0:
        print(tracker.getPosition())
        wait(0.1)
    
    tracker.setRecordingState(False)
    
    # Stop the ioHub Server
    io.quit()
    
clearEvents(event_type=None, filter_id=None, call_proc_events=True)

Clears any DeviceEvents that have occurred since the last call to the device’s getEvents(), or clearEvents() methods.

Note that calling clearEvents() at the device level only clears the given device’s event buffer. The ioHub Process’s Global Event Buffer is unchanged.

Parameters:

None

Returns:

None

enableEventReporting(enabled=True)[source]

enableEventReporting is functionally identical to the eye tracker device specific setRecordingState method.

getConfiguration()

Retrieve the configuration settings information used to create the device instance. This will be the default settings for the device, found in iohub.devices.<device_name>.default_<device_name>.yaml, updated with any device settings provided via launchHubServer(…).

Changing any values in the returned dictionary has no effect on the device state.

Parameters:

None

Returns:

The dictionary of the device configuration settings used to create the device.

Return type:

(dict)

getEvents(*args, **kwargs)

Retrieve any DeviceEvents that have occurred since the last call to the device’s getEvents() or clearEvents() methods.

Note that calling getEvents() at a device level does not change the Global Event Buffer’s contents.

Parameters:
  • event_type_id (int) – If specified, provides the ioHub DeviceEvent ID for which events should be returned. Events that have occurred but do not match the specified event ID are ignored. Event type IDs can be accessed via the EventConstants class; all available event types are class attributes of EventConstants.

  • clearEvents (int) – Can be used to indicate if the events being returned should also be removed from the device event buffer. True (the default) indicates to remove events being returned. False results in events being left in the device event buffer.

  • asType (str) – Optional kwarg giving the object type to return events as. Valid values are ‘namedtuple’ (the default), ‘dict’, ‘list’, or ‘object’.

Returns:

New events that the ioHub has received since the last getEvents() or clearEvents() call to the device. Events are ordered by the ioHub time of each event, oldest event at index 0. The event object type is determined by the asType parameter passed to the method. By default a namedtuple object is returned for each event.

Return type:

(list)
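The type-filtering behaviour described above can be sketched without hardware. The Sample namedtuple and the numeric id below are placeholders; real code would use EventConstants.BINOCULAR_EYE_SAMPLE and events returned by tracker.getEvents():

```python
# Illustrative sketch (runs without hardware): filtering a list of events
# by type id, similar to what getEvents(event_type_id=...) does on the
# ioHub side. Sample and the id value 52 are hypothetical placeholders.
from collections import namedtuple

Sample = namedtuple("Sample", ["type", "time", "gaze_x", "gaze_y"])
BINOCULAR_EYE_SAMPLE = 52  # placeholder; use EventConstants in real code

events = [
    Sample(BINOCULAR_EYE_SAMPLE, 0.010, 0.10, 0.20),
    Sample(99, 0.015, 0.0, 0.0),                      # some other event type
    Sample(BINOCULAR_EYE_SAMPLE, 0.018, 0.12, 0.21),
]

# Keep only binocular samples, ordered by time (oldest first).
samples = sorted((e for e in events if e.type == BINOCULAR_EYE_SAMPLE),
                 key=lambda e: e.time)
```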

getLastGazePosition()[source]

Returns the latest 2D eye gaze position retrieved from the Tobii device. This is the position on the calibrated 2D surface where the eye tracker is reporting each eye's gaze vector as intersecting. The units are the units in use by the Display device.

In general, the y, or vertical, component of each eye's gaze position should be the same value, since in typical user populations the two eyes are yoked vertically when they move. Any difference between the two eyes in the y dimension is therefore likely due to eye tracker error.

Differences between the x, or horizontal, components of the gaze position indicate that the participant is being reported as looking behind or in front of the calibrated plane. When a user is looking at the calibration surface, the x component of the two eyes' gaze positions should be the same. A difference in the x value for each eye indicates either that the user is not focusing at the calibrated depth, or that there is error in the eye data.

The above remarks hold for any eye tracker.

If binocular recording is being performed, the average position of both eyes is returned.

If no samples have been received from the eye tracker, or the eye tracker is not currently recording data, None is returned.

Parameters:

None

Returns:

None if the eye tracker is not currently recording data or no eye samples have been received; otherwise a tuple giving the latest (gaze_x, gaze_y) position of the eye(s).

Return type:

tuple or None
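The binocular averaging described above can be sketched as a small pure-Python helper. This is illustrative only, not the actual ioHub implementation:

```python
# Hedged sketch of the averaging behaviour described above: combine left
# and right (x, y) gaze into one position, tolerating a missing eye.
# average_gaze is a hypothetical helper, not part of the PsychoPy API.
def average_gaze(left, right):
    """Return the averaged (x, y) gaze, or None if neither eye has data."""
    if left is None and right is None:
        return None
    if left is None:
        return right
    if right is None:
        return left
    return ((left[0] + right[0]) / 2.0, (left[1] + right[1]) / 2.0)
```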

getLastSample()[source]

Returns the latest sample retrieved from the Tobii device. The Tobii system always uses the BinocularEyeSample event type.

Parameters:

None

Returns:

None: if the eye tracker is not currently recording data.

EyeSample: if the eye tracker is recording in a monocular tracking mode, the latest sample event of this event type is returned.

BinocularEyeSample: if the eye tracker is recording in a binocular tracking mode, the latest sample event of this event type is returned.

Return type:

EyeSample, BinocularEyeSample, or None

getPosition()

See getLastGazePosition().

isRecordingEnabled()[source]

isRecordingEnabled returns the recording state from the eye tracking device.

Parameters:

None

Returns:

True == the device is recording data; False == recording is not occurring.

Return type:

bool

runSetupProcedure(calibration_args={})[source]

runSetupProcedure performs a calibration routine for the Tobii eye tracking system.

setRecordingState(recording)[source]

setRecordingState is used to start or stop the recording of data from the eye tracking device.

Parameters:

recording (bool) – If True, the eye tracker will start recording available eye data and sending it to the experiment program if data streaming was enabled for the device. If False, the eye tracker stops recording eye data and streaming it to the experiment.

If the eye tracker is already recording and setRecordingState(True) is called, the eye tracker will simply continue recording and the method call is a no-op. Likewise if the system has already stopped recording and setRecordingState(False) is called again.

Returns:

the current recording state of the eye tracking device

Return type:

bool

Supported Event Types

tobii_research provides real-time access to binocular sample data.

The following fields of the ioHub BinocularEyeSample event are supported:

class psychopy.iohub.devices.eyetracker.BinocularEyeSampleEvent(*args, **kwargs)[source]

The BinocularEyeSampleEvent event represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording both eyes of a participant.

Event Type ID: EventConstants.BINOCULAR_EYE_SAMPLE

Event Type String: ‘BINOCULAR_EYE_SAMPLE’

time

time of event, in sec.msec format, using psychopy timebase.

left_gaze_x

The horizontal position of the left eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data. Uses tobii_research gaze data ‘left_gaze_point_on_display_area’[0] field.

left_gaze_y

The vertical position of the left eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data. Uses tobii_research gaze data ‘left_gaze_point_on_display_area’[1] field.

left_eye_cam_x

The left eye x position in the eye tracker's 3D coordinate space. Uses tobii_research gaze data ‘left_gaze_origin_in_trackbox_coordinate_system’[0] field.

left_eye_cam_y

The left eye y position in the eye tracker's 3D coordinate space. Uses tobii_research gaze data ‘left_gaze_origin_in_trackbox_coordinate_system’[1] field.

left_eye_cam_z

The left eye z position in the eye tracker's 3D coordinate space. Uses tobii_research gaze data ‘left_gaze_origin_in_trackbox_coordinate_system’[2] field.

left_pupil_measure_1

Left eye pupil diameter in mm. Uses tobii_research gaze data ‘left_pupil_diameter’ field.

right_gaze_x

The horizontal position of the right eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data. Uses tobii_research gaze data ‘right_gaze_point_on_display_area’[0] field.

right_gaze_y

The vertical position of the right eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data. Uses tobii_research gaze data ‘right_gaze_point_on_display_area’[1] field.

right_eye_cam_x

The right eye x position in the eye tracker's 3D coordinate space. Uses tobii_research gaze data ‘right_gaze_origin_in_trackbox_coordinate_system’[0] field.

right_eye_cam_y

The right eye y position in the eye tracker's 3D coordinate space. Uses tobii_research gaze data ‘right_gaze_origin_in_trackbox_coordinate_system’[1] field.

right_eye_cam_z

The right eye z position in the eye tracker's 3D coordinate space. Uses tobii_research gaze data ‘right_gaze_origin_in_trackbox_coordinate_system’[2] field.

right_pupil_measure_1

Right eye pupil diameter in mm. Uses tobii_research gaze data ‘right_pupil_diameter’ field.

status

Indicates if eye sample contains ‘valid’ data for left and right eyes. 0 = Eye sample is OK. 2 = Right eye data is likely invalid. 20 = Left eye data is likely invalid. 22 = Eye sample is likely invalid.
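The status codes listed above can be decoded into per-eye validity flags. A minimal sketch, using only the code values stated in this section (the helper name is illustrative):

```python
# Hedged sketch: interpret BinocularEyeSample.status using the code values
# documented above (0 = OK, 2 = right eye likely invalid, 20 = left eye
# likely invalid, 22 = both likely invalid). decode_sample_status is a
# hypothetical helper, not part of the PsychoPy API.
def decode_sample_status(status):
    """Return (left_eye_ok, right_eye_ok) for a sample status code."""
    left_ok = status not in (20, 22)
    right_ok = status not in (2, 22)
    return left_ok, right_ok
```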

Default Device Settings

eyetracker.hw.tobii.EyeTracker:
    # Indicates if the device should actually be loaded at experiment runtime.
    enable: True

    # The variable name of the device that will be used to access the ioHub Device class
    # during experiment run-time, via the devices.[name] attribute of the ioHub
    # connection or experiment runtime class.
    name: tracker

    # Should eye tracker events be saved to the ioHub DataStore file when the device
    # is recording data ?
    save_events: True

    # Should eye tracker events be sent to the Experiment process when the device
    # is recording data ?
    stream_events: True

    # How many eye events (including samples) should be saved in the ioHub event buffer before
    # old eye events start being replaced by new events. When the event buffer reaches
    # the maximum event length of the buffer defined here, older events will start to be dropped.
    event_buffer_length: 1024

    # The Tobii implementation of the common eye tracker interface supports the
    # BinocularEyeSampleEvent event type.
    monitor_event_types: [ BinocularEyeSampleEvent,]

    # The model name of the Tobii device that you wish to connect to can be specified here,
    # and only Tobii systems matching that model name will be considered as possible candidates for connection.
    # If you only have one Tobii system connected to the computer, this field can just be left empty.
    model_name:

    # The serial number of the Tobii device that you wish to connect to can be specified here,
    # and only the Tobii system matching that serial number will be connected to, if found.
    # If you only have one Tobii system connected to the computer, this field can just be left empty,
    # in which case the first Tobii device found will be connected to.
    serial_number:

    calibration:
        # The Tobii ioHub Common Eye Tracker Interface currently support
        # a 3, 5 and 9 point calibration mode.
        # THREE_POINTS,FIVE_POINTS,NINE_POINTS
        #
        type: NINE_POINTS

        # Should the target positions be randomized?
        #
        randomize: True

        # auto_pace can be True or False. If True, the eye tracker will 
        # automatically progress from one calibration point to the next.
        # If False, a manual key or button press is needed to progress to
        # the next point.
        #
        auto_pace: True
        
        # pacing_speed is the number of sec.msec that a calibration point should
        # be displayed before moving onto the next point when auto_pace is set to true.
        # If auto_pace is False, pacing_speed is ignored.
        #
        pacing_speed: 1.5
        
        # screen_background_color specifies the r,g,b background color to 
        # set the calibration, validation, etc, screens to. Each element of the color
        # should be a value between 0 and 255. 0 == black, 255 == white.
        #
        screen_background_color: [128,128,128]
        
        # Target type defines what form of calibration graphic should be used
        # during calibration, validation, etc. modes.
        # Currently the Tobii implementation supports the following
        # target type: CIRCLE_TARGET. 
        # To do: Add support for other types, etc.
        #
        target_type: CIRCLE_TARGET
        
        # The associated target attribute properties can be supplied
        # for the given target_type. 
        target_attributes:
             # CIRCLE_TARGET is drawn using two PsychoPy
             # Circle Stim. The _outer_ circle is drawn first, and should be
             # be larger than the _inner_ circle, which is drawn on top of the
             # outer circle. The target_attributes starting with 'outer_' define
             # how the outer circle of the calibration targets should be drawn.
             # The target_attributes starting with 'inner_' define
             # how the inner circle of the calibration targets should be drawn. 
             #
             # outer_diameter: The size of the outer circle of the calibration target
             #
             outer_diameter: 35
             # outer_stroke_width: The thickness of the outer circle edge. 
             #
             outer_stroke_width: 2
             # outer_fill_color: RGB255 color to use to fill the outer circle. 
             #
             outer_fill_color: [128,128,128]
             # outer_line_color: RGB255 color to used for the outer circle edge. 
             #
             outer_line_color: [255,255,255]
             # inner_diameter: The size of the inner circle calibration target
             #
             inner_diameter: 7
             # inner_stroke_width: The thickness of the inner circle edge. 
             #
             inner_stroke_width: 1
             # inner_fill_color: RGB255 color to use to fill the inner circle. 
             #
             inner_fill_color: [0,0,0]
             # inner_line_color: RGB255 color to used for the inner circle edge. 
             #
             inner_line_color: [0,0,0]
             # The Tobii Calibration routine supports using moving target graphics.
             # The following parameters control target movement (if any).
             #
             animate:
                 # enable: True if the calibration target should be animated.
                 # False specifies that the calibration targets should just jump
                 # from one calibration position to another.
                 #
                 enable: True
                 # movement_velocity: The velocity that a calibration target
                 # graphic should use when gliding from one calibration
                 # point to another. Always in pixels / second.
                 #
                 movement_velocity: 600.0 
                 # expansion_ratio: The outer circle of the calibration target
                 # can expand (and contract) when displayed at each position.
                 # expansion_ratio gives the largest size of the outer circle 
                 # as a ratio of the outer_diameter length. For example,
                 # if outer_diameter = 30, and expansion_ratio = 2.0, then
                 # the outer circle of each calibration point will expand out 
                 # to 60 pixels. Set expansion_ratio to 1.0 for no expansion.
                 # 
                 expansion_ratio: 3.0
                 # expansion_speed: The rate at which the outer circle
                 # graphic should expand. Always in pixels / second. 
                 # 
                 expansion_speed: 30.0
                 # contract_only: If the calibration target should expand from
                 # the outer circle initial diameter to the larger diameter
                 # and then contract back to the original diameter, set 
                 # contract_only to False. To only have the outer circle target
                 # go from an expanded state to the smaller size, set this to True.
                 #
                 contract_only: True
    
    runtime_settings:
        # The supported sampling rates for Tobii are model dependent. 
        # Using a default of 60 Hz.
        sampling_rate: 60

        # Tobii implementation supports BINOCULAR tracking mode only.
        track_eyes: BINOCULAR
            
    # manufacturer_name is used to store the name of the maker of the eye tracking
    # device. This is for informational purposes only.
    manufacturer_name: Tobii Technology
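Any of the defaults above can be overridden at runtime by passing a matching dictionary to launchHubServer instead of editing iohub_config.yaml. A sketch of building such a configuration (the dictionary itself needs no hardware; the commented-out launchHubServer call would require a connected Tobii tracker, and the specific setting values shown are illustrative):

```python
# Sketch: overriding a subset of the defaults above from Python. Keys mirror
# the YAML structure shown in this section; the chosen values (120 Hz,
# FIVE_POINTS, 1.0 s pacing) are example overrides, not recommendations.
eyetracker_config = {
    'name': 'tracker',
    'runtime_settings': {'sampling_rate': 120, 'track_eyes': 'BINOCULAR'},
    'calibration': {'type': 'FIVE_POINTS',
                    'auto_pace': True,
                    'pacing_speed': 1.0},
}
iohub_config = {'eyetracker.hw.tobii.EyeTracker': eyetracker_config}

# from psychopy.iohub import launchHubServer
# io = launchHubServer(**iohub_config)  # requires a connected Tobii tracker
```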

Last Updated: January, 2021

