SR Research

Platforms:

  • Windows 7 / 10

  • Linux

  • macOS

Required Python Version:

  • Python 3.6+

Supported Models:

  • EyeLink 1000

  • EyeLink 1000 Plus

Additional Software Requirements

The SR Research EyeLink implementation of the ioHub common eye tracker interface uses the pylink package written by SR Research. If using a PsychoPy standalone installation, this package should already be included.

If you are manually installing PsychoPy, please install the appropriate version of pylink. Downloads are available to SR Research customers from their support website.

On macOS and Linux, the EyeLink Developers Kit must also be installed for pylink to work. Please visit the SR Research support site for information about how to install the EyeLink Developers Kit on macOS or Linux.
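
Before running an EyeLink experiment it can be useful to confirm that pylink is importable at all. A minimal sketch (the flag name and message are our own; note that an unrelated project also named "pylink" exists on PyPI, so a successful import of the wrong package is possible):

```python
# Check whether pylink (the SR Research EyeLink Python bindings) can be
# imported. pylink ships with PsychoPy standalone installs; otherwise it
# must be obtained from the SR Research support site.
try:
    import pylink  # SR Research's package, not the unrelated PyPI "pylink"
    HAVE_PYLINK = True
except ImportError:
    HAVE_PYLINK = False

if not HAVE_PYLINK:
    print("pylink not found; install it from the SR Research support site "
          "(it is included with PsychoPy standalone installations).")
```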

EyeTracker Class

Supported Event Types

The EyeLink implementation of the ioHub eye tracker interface supports monocular and binocular eye samples as well as fixation, saccade, and blink events.

Eye Samples

class psychopy.iohub.devices.eyetracker.MonocularEyeSampleEvent(*args, **kwargs)[source]

A MonocularEyeSampleEvent represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording from only one eye, or is recording from both eyes and averaging the binocular data.

Event Type ID: EventConstants.MONOCULAR_EYE_SAMPLE

Event Type String: ‘MONOCULAR_EYE_SAMPLE’

time

Time of the event, in sec.msec format, using the PsychoPy timebase.

eye

Eye that generated the sample. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

gaze_x

The horizontal position of the eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

gaze_y

The vertical position of the eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

angle_x

Horizontal eye angle.

angle_y

Vertical eye angle.

raw_x

The uncalibrated x position of the eye in a device specific coordinate space.

raw_y

The uncalibrated y position of the eye in a device specific coordinate space.

pupil_measure_1

Pupil size. Use pupil_measure1_type to determine what type of pupil size data was being saved by the tracker.

pupil_measure1_type

The type of pupil size data stored in pupil_measure_1.

ppd_x

Horizontal pixels per visual degree for this eye position as reported by the eye tracker.

ppd_y

Vertical pixels per visual degree for this eye position as reported by the eye tracker.

velocity_x

Horizontal velocity of the eye at the time of the sample; as reported by the eye tracker.

velocity_y

Vertical velocity of the eye at the time of the sample; as reported by the eye tracker.

velocity_xy

2D Velocity of the eye at the time of the sample; as reported by the eye tracker.

status

Indicates if eye sample contains ‘valid’ data. 0 = Eye sample is OK. 2 = Eye sample is invalid.
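
As a concrete illustration of the ppd_x / ppd_y attributes above, an on-screen offset in pixels can be converted to degrees of visual angle by dividing by the pixels-per-degree values. This helper is our own sketch, not part of the ioHub API:

```python
# Sketch (not part of the ioHub API): convert an on-screen offset in
# pixels to degrees of visual angle using a sample's ppd_x / ppd_y values.

def pixels_to_degrees(dx_pix, dy_pix, ppd_x, ppd_y):
    """Return (x_deg, y_deg) for pixel offsets given pixels-per-degree."""
    return dx_pix / ppd_x, dy_pix / ppd_y

# e.g. a 70 pixel horizontal offset at 35 pixels/degree spans 2 degrees
x_deg, y_deg = pixels_to_degrees(70.0, 35.0, 35.0, 35.0)
```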

class psychopy.iohub.devices.eyetracker.BinocularEyeSampleEvent(*args, **kwargs)[source]

The BinocularEyeSampleEvent event represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording both eyes of a participant.

Event Type ID: EventConstants.BINOCULAR_EYE_SAMPLE

Event Type String: ‘BINOCULAR_EYE_SAMPLE’

time

Time of the event, in sec.msec format, using the PsychoPy timebase.

left_gaze_x

The horizontal position of the left eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

left_gaze_y

The vertical position of the left eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

left_angle_x

The horizontal angle of the left eye relative to the head.

left_angle_y

The vertical angle of the left eye relative to the head.

left_raw_x

The uncalibrated x position of the left eye in a device specific coordinate space.

left_raw_y

The uncalibrated y position of the left eye in a device specific coordinate space.

left_pupil_measure_1

Left eye pupil size. Use left_pupil_measure1_type to determine what type of pupil size data was saved by the tracker.

left_pupil_measure1_type

The type of pupil size data stored in left_pupil_measure_1.

left_ppd_x

Pixels per degree for the left eye horizontal position, as reported by the eye tracker. The display distance must be set correctly for this value to be accurate.

left_ppd_y

Pixels per degree for the left eye vertical position, as reported by the eye tracker. The display distance must be set correctly for this value to be accurate.

left_velocity_x

Horizontal velocity of the left eye at the time of the sample; as reported by the eye tracker.

left_velocity_y

Vertical velocity of the left eye at the time of the sample; as reported by the eye tracker.

left_velocity_xy

2D Velocity of the left eye at the time of the sample; as reported by the eye tracker.

right_gaze_x

The horizontal position of the right eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

right_gaze_y

The vertical position of the right eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

right_angle_x

The horizontal angle of the right eye relative to the head.

right_angle_y

The vertical angle of the right eye relative to the head.

right_raw_x

The uncalibrated x position of the right eye in a device specific coordinate space.

right_raw_y

The uncalibrated y position of the right eye in a device specific coordinate space.

right_pupil_measure_1

Right eye pupil size. Use right_pupil_measure1_type to determine what type of pupil size data was saved by the tracker.

right_pupil_measure1_type

The type of pupil size data stored in right_pupil_measure_1.

right_ppd_x

Pixels per degree for the right eye horizontal position, as reported by the eye tracker. The display distance must be set correctly for this value to be accurate.

right_ppd_y

Pixels per degree for the right eye vertical position, as reported by the eye tracker. The display distance must be set correctly for this value to be accurate.

right_velocity_x

Horizontal velocity of the right eye at the time of the sample; as reported by the eye tracker.

right_velocity_y

Vertical velocity of the right eye at the time of the sample; as reported by the eye tracker.

right_velocity_xy

2D Velocity of the right eye at the time of the sample; as reported by the eye tracker.

status

Indicates if eye sample contains ‘valid’ data for left and right eyes. 0 = Eye sample is OK. 2 = Right eye data is likely invalid. 20 = Left eye data is likely invalid. 22 = Eye sample is likely invalid.
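
The status codes above can be unpacked into per-eye validity flags when processing samples offline. A minimal sketch (the helper name is our own, not part of the ioHub API):

```python
# Sketch: decode the BinocularEyeSampleEvent.status codes documented above
# into per-eye validity flags.
# 0 = both eyes OK, 2 = right eye likely invalid,
# 20 = left eye likely invalid, 22 = both eyes likely invalid.

def decode_binocular_status(status):
    """Return (left_ok, right_ok) booleans for a binocular sample status."""
    left_ok = status not in (20, 22)
    right_ok = status not in (2, 22)
    return left_ok, right_ok
```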

Fixation Events

Successful eye tracker calibration must be performed prior to reading (meaningful) fixation event data.

class psychopy.iohub.devices.eyetracker.FixationStartEvent(*args, **kwargs)[source]

A FixationStartEvent is generated when the beginning of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker's sample parsing algorithms.

Event Type ID: EventConstants.FIXATION_START

Event Type String: ‘FIXATION_START’

time

Time of the event, in sec.msec format, using the PsychoPy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

angle_x

Horizontal eye angle at the start of the event.

angle_y

Vertical eye angle at the start of the event.

pupil_measure_1

Pupil size. Use pupil_measure1_type to determine what type of pupil size data was being saved by the tracker.

pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

ppd_x

Horizontal pixels per degree at start of event.

ppd_y

Vertical pixels per degree at start of event.

velocity_xy

2D eye velocity at the start of the event.

status

Event status as reported by the eye tracker.

class psychopy.iohub.devices.eyetracker.FixationEndEvent(*args, **kwargs)[source]

A FixationEndEvent is generated when the end of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker's sample parsing algorithms.

Event Type ID: EventConstants.FIXATION_END

Event Type String: ‘FIXATION_END’

time

Time of the event, in sec.msec format, using the PsychoPy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

duration

Duration of the event in sec.msec format.

start_gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

start_gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

start_angle_x

Horizontal eye angle at the start of the event.

start_angle_y

Vertical eye angle at the start of the event.

start_pupil_measure_1

Pupil size at the start of the event.

start_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

start_ppd_x

Horizontal pixels per degree at start of event.

start_ppd_y

Vertical pixels per degree at start of event.

start_velocity_xy

2D eye velocity at the start of the event.

end_gaze_x

Horizontal gaze position at the end of the event, in Display Coordinate Type Units.

end_gaze_y

Vertical gaze position at the end of the event, in Display Coordinate Type Units.

end_angle_x

Horizontal eye angle at the end of the event.

end_angle_y

Vertical eye angle at the end of the event.

end_pupil_measure_1

Pupil size at the end of the event.

end_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

end_ppd_x

Horizontal pixels per degree at end of event.

end_ppd_y

Vertical pixels per degree at end of event.

end_velocity_xy

2D eye velocity at the end of the event.

average_gaze_x

Average horizontal gaze position during the event, in Display Coordinate Type Units.

average_gaze_y

Average vertical gaze position during the event, in Display Coordinate Type Units.

average_angle_x

Average horizontal eye angle during the event.

average_angle_y

Average vertical eye angle during the event.

average_pupil_measure_1

Average pupil size during the event.

average_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

average_velocity_xy

Average 2D velocity of the eye during the event.

peak_velocity_xy

Peak 2D velocity of the eye during the event.

status

Event status as reported by the eye tracker.
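
The start/end/average fields above are enough to compute simple fixation metrics offline. A sketch (the helper name is our own and it operates on plain values rather than actual event objects, but the argument names mirror the FixationEndEvent attributes documented above):

```python
import math

# Sketch (not part of the ioHub API): compute the drift of a fixation,
# i.e. the distance between its start and end gaze positions, from
# FixationEndEvent-style fields.

def fixation_drift(start_gaze_x, start_gaze_y, end_gaze_x, end_gaze_y):
    """Return the distance between start and end gaze positions."""
    return math.hypot(end_gaze_x - start_gaze_x, end_gaze_y - start_gaze_y)

drift = fixation_drift(0.0, 0.0, 3.0, 4.0)
```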

Saccade Events

Successful eye tracker calibration must be performed prior to reading (meaningful) saccade event data.

class psychopy.iohub.devices.eyetracker.SaccadeStartEvent(*args, **kwargs)[source]

A SaccadeStartEvent is generated when the start of a saccade (in very general terms, a rapid movement of the eye from one fixation location to another) is detected by the eye tracker's sample parsing algorithms.

Event Type ID: EventConstants.SACCADE_START

Event Type String: ‘SACCADE_START’

time

Time of the event, in sec.msec format, using the PsychoPy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

angle_x

Horizontal eye angle at the start of the event.

angle_y

Vertical eye angle at the start of the event.

pupil_measure_1

Pupil size. Use pupil_measure1_type to determine what type of pupil size data was being saved by the tracker.

pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

ppd_x

Horizontal pixels per degree at start of event.

ppd_y

Vertical pixels per degree at start of event.

velocity_xy

2D eye velocity at the start of the event.

status

Event status as reported by the eye tracker.

class psychopy.iohub.devices.eyetracker.SaccadeEndEvent(*args, **kwargs)[source]

A SaccadeEndEvent is generated when the end of a saccade is detected by the eye tracker's sample parsing algorithms.

Event Type ID: EventConstants.SACCADE_END

Event Type String: ‘SACCADE_END’

time

Time of the event, in sec.msec format, using the PsychoPy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

duration

Duration of the event in sec.msec format.

start_gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

start_gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

start_angle_x

Horizontal eye angle at the start of the event.

start_angle_y

Vertical eye angle at the start of the event.

start_pupil_measure_1

Pupil size at the start of the event.

start_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

start_ppd_x

Horizontal pixels per degree at start of event.

start_ppd_y

Vertical pixels per degree at start of event.

start_velocity_xy

2D eye velocity at the start of the event.

end_gaze_x

Horizontal gaze position at the end of the event, in Display Coordinate Type Units.

end_gaze_y

Vertical gaze position at the end of the event, in Display Coordinate Type Units.

end_angle_x

Horizontal eye angle at the end of the event.

end_angle_y

Vertical eye angle at the end of the event.

end_pupil_measure_1

Pupil size at the end of the event.

end_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

end_ppd_x

Horizontal pixels per degree at end of event.

end_ppd_y

Vertical pixels per degree at end of event.

end_velocity_xy

2D eye velocity at the end of the event.

average_gaze_x

Average horizontal gaze position during the event, in Display Coordinate Type Units.

average_gaze_y

Average vertical gaze position during the event, in Display Coordinate Type Units.

average_angle_x

Average horizontal eye angle during the event.

average_angle_y

Average vertical eye angle during the event.

average_pupil_measure_1

Average pupil size during the event.

average_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

average_velocity_xy

Average 2D velocity of the eye during the event.

peak_velocity_xy

Peak 2D velocity of the eye during the event.

status

Event status as reported by the eye tracker.

Default Device Settings

# This section includes all valid sr_research.eyelink.EyeTracker Device
# settings that can be specified in an iohub_config.yaml
# or in a Python dictionary form and passed to the launchHubServer
# method. Any device parameters not specified when the device class is
# created by the ioHub Process will be assigned the default value
# indicated here.
#
eyetracker.hw.sr_research.eyelink.EyeTracker:
    # name: The unique name to assign to the device instance created.
    #   The device is accessed from within the PsychoPy script 
    #   using the name's value; therefore it must be a valid Python
    #   variable name as well.
    #
    name: tracker

    # enable: Specifies if the device should be enabled by ioHub and monitored
    #   for events.
    #   True = Enable the device on the ioHub Server Process
    #   False = Disable the device on the ioHub Server Process. No events for
    #   this device will be reported by the ioHub Server.
    #    
    enable: True

    # saveEvents: *If* the ioHubDataStore is enabled for the experiment, then
    #   indicate if events for this device should be saved to the
    #   eye tracker event groups in the hdf5 event file.
    #   True = Save events for this device to the ioDataStore.
    #   False = Do not save events for this device in the ioDataStore.
    #    
    saveEvents: True

    # streamEvents: Indicate if events from this device should be made available
    #   during experiment runtime to the PsychoPy Process.
    #   True = Send events for this device to the PsychoPy Process in real-time.
    #   False = Do *not* send events for this device to the PsychoPy Process in real-time.
    #    
    streamEvents: True

    # auto_report_events: Indicate if events from this device should start being
    #   processed by the ioHub as soon as the device is loaded at the start of an experiment,
    #   or if events should only start to be monitored on the device when a call to the
    #   device's enableEventReporting method is made with a parameter value of True.
    #   True = Automatically start reporting events for this device when the experiment starts.
    #   False = Do not start reporting events for this device until enableEventReporting(True)
    #   is set for the device during experiment runtime.
    #
    auto_report_events: False

    # event_buffer_length: Specify the maximum number of events (for each
    #   event type the device produces) that can be stored by the ioHub Server
    #   before each new event results in the oldest event of the same type being
    #   discarded from the ioHub device event buffer.
    #
    event_buffer_length: 1024

    # device_timer: The EyeLink EyeTracker class uses the polling method to
    #   check for new events received from the EyeTracker device. 
    #   device_timer.interval specifies the sec.msec time between device polls.
    #   0.001 = 1 msec, so the device will be polled at a rate of 1000 Hz.   
    device_timer:
        interval: 0.001

    # monitor_event_types: The eyelink implementation of the common eye tracker 
    #   interface supports the following event types. If you would like to 
    #   exclude certain events from being saved or streamed during runtime, 
    #   remove them from the list below.
    #    
    monitor_event_types: [ MonocularEyeSampleEvent, BinocularEyeSampleEvent, FixationStartEvent, FixationEndEvent, SaccadeStartEvent, SaccadeEndEvent, BlinkStartEvent, BlinkEndEvent]
    
    calibration:
        # IMPORTANT: Note that while the gaze position data provided by ioHub
        # will be in the Display's coordinate system, the EyeLink internally
        # always uses a (0, 0) to (pixel_width, pixel_height) coordinate system,
        # since calibration point positions are given internally as integers.
        # If the actual display coordinate system were passed to EyeLink,
        # coordinate types like deg and norm would allow only very coarse
        # target locations during calibration.
        
        # type: sr_research.eyelink.EyeTracker supports the following
        #   calibration types:
        #   THREE_POINTS, FIVE_POINTS, NINE_POINTS, THIRTEEN_POINTS
        type: NINE_POINTS

        # auto_pace: If True, the eye tracker will automatically progress from
        # one calibration point to the next. If False, a manual key or button press
        # is needed to progress to the next point.
        # 
        auto_pace: True

        # pacing_speed: The number of sec.msec that a calibration point should
        # be displayed before moving onto the next point when auto_pace is set to true.
        # If auto_pace is False, pacing_speed is ignored.
        #
        pacing_speed: 1.5
        
        # screen_background_color: Specifies the r,g,b,a background color to 
        #   set the calibration, validation, etc, screens to. Each element of the color
        #   should be a value between 0 and 255. 0 == black, 255 == white. In general
        #   the last value of the color list (alpha) can be left at 255, indicating
        #   the color is not mixed with the background color at all.
        screen_background_color: [128,128,128,255]
        
        # target_type: Defines what form of calibration graphic should be used
        #   during calibration, validation, etc. modes. sr_research.eyelink.EyeTracker
        #   supports the CIRCLE_TARGET type.
        #   
        target_type: CIRCLE_TARGET

        # target_attributes: The associated target attributes must be supplied
        #   for the given target_type. If target type attribute sections are provided
        #   for target types other than the entry associated with the specified
        #   target_type value, they will simply be ignored.
        #
        target_attributes:
            # outer_diameter and inner_diameter are specified in pixels
            outer_diameter: 33
            inner_diameter: 6
            outer_color: [255,255,255,255]
            inner_color: [0,0,0,255]

    # network_settings: Specify the Host computer IP address. Normally
    #   leaving it set to the default value is fine.
    #
    network_settings: 100.1.1.1

    # default_native_data_file_name: The sr_research.eyelink.EyeTracker supports
    #   saving a native eye tracker edf data file, the
    #   default_native_data_file_name value is used to set the default name for
    #   the file that will be saved, not including the .edf file type extension.
    #
    default_native_data_file_name: et_data

    # simulation_mode: Indicate if the eye tracker should provide mouse simulated 
    #   eye data instead of sending eye data based on a participants actual 
    #   eye movements. 
    #
    simulation_mode: False
    
    # enable_interface_without_connection: Specifies whether the ioHub Device
    #   should be enabled without truly connecting to the underlying eye tracking
    #   hardware. If True, ioHub EyeTracker methods can be called but will
    #   provide no-op results and no eye data will be received by the ioHub Server.
    #   This mode can be useful for working on aspects of an eye tracking experiment when the
    #   actual eye tracking device is not available, for example stimulus presentation
    #   or other non eye tracker dependent experiment functionality.
    #    
    enable_interface_without_connection: False

    runtime_settings:
        # sampling_rate: Specify the desired sampling rate to use. Actual
        #   sample rates depend on the model being used. 
        #   Overall, possible rates are 250, 500, 1000, and 2000 Hz.
        #
        sampling_rate: 250

        # track_eyes: Which eye(s) should be tracked? 
        #   Supported Values:  LEFT_EYE, RIGHT_EYE, BINOCULAR
        #        
        track_eyes: RIGHT_EYE

        # sample_filtering: Defines the native eye tracker filtering level to be 
        #   applied to the sample event data before it is sent to the specified data stream.
        #   The sample filter section can contain multiple key : value entries if 
        #   the tracker implementation supports it, where each key is a sample stream type,
        #   and each value is the associated filter level for that sample data stream.
        #   sr_research.eyelink.EyeTracker supported stream types are: 
        #       FILTER_ALL, FILTER_FILE, FILTER_ONLINE 
        #   Supported sr_research.eyelink.EyeTracker filter levels are:
        #       FILTER_LEVEL_OFF, FILTER_LEVEL_1, FILTER_LEVEL_2
        #   Note that if FILTER_ALL is specified, any other sample data stream
        #   entries are ignored. If FILTER_ALL is not provided, be sure to specify
        #   settings for both FILTER_FILE and FILTER_ONLINE; if either is missing,
        #   that stream's filter level will be set to FILTER_LEVEL_OFF.
        #        
        sample_filtering:
            FILTER_ALL: FILTER_LEVEL_OFF
        
        vog_settings:
            # pupil_measure_types: sr_research.eyelink.EyeTracker supports one
            #   pupil_measure_type parameter that is used for all eyes being tracked. 
            #   Valid options are:
            #       PUPIL_AREA, PUPIL_DIAMETER
            #            
            pupil_measure_types: PUPIL_AREA

            # tracking_mode: Define whether the eye tracker should run in a pupil only
            #   mode or run in a pupil-cr mode. Valid options are: 
            #       PUPIL_CR_TRACKING, PUPIL_ONLY_TRACKING
            #   Depending on other settings on the eyelink Host and the model and mode of
            #   eye tracker being used, this parameter may not be able to set the
            #   specified tracking mode. Check the mode listed on the camera setup
            #   screen of the Host PC after the experiment has started to confirm that
            #   the requested tracking mode was enabled. IMPORTANT: only use
            #   PUPIL_ONLY_TRACKING mode if using an EyeLink II system, or using
            #   the EyeLink 1000 in a head **fixed** setup. Any head movement
            #   when using PUPIL_ONLY_TRACKING will result in eye position signal drift.
            #            
            tracking_mode: PUPIL_CR_TRACKING

            # pupil_center_algorithm: The pupil_center_algorithm defines what 
            #   type of image processing approach should
            #   be used to determine the pupil center during image processing. 
            #   Valid values for eyetracker.hw.sr_research.eyelink.EyeTracker are:
            #   ELLIPSE_FIT, or CENTROID_FIT
            #            
            pupil_center_algorithm: ELLIPSE_FIT

    # model_name: The model_name setting allows the definition of the eye tracker model being used.
    #   For the eyelink implementation, valid values are:
    #       'EYELINK 1000 DESKTOP', 'EYELINK 1000 TOWER', 'EYELINK 1000 REMOTE', 
    #       'EYELINK 1000 LONG RANGE', 'EYELINK 2'
    model_name: EYELINK 1000 DESKTOP

    # manufacturer_name:    manufacturer_name is used to store the name of the
    #   maker of the eye tracking device. This is for informational purposes only.
    #
    manufacturer_name: SR Research Ltd.

    # The parameters below are not used by the EyeLink implementation,
    #   so they can be left as is, or filled out for FYI only.
    #

    # serial_number: The serial number for the specific instance of the device
    #   used can be specified here. It is not used by the ioHub, so is FYI only.
    #
    serial_number: N/A

    # manufacture_date: The date of manufacture of the device
    # can be specified here. It is not used by the ioHub,
    # so is FYI only.
    #   
    manufacture_date: DD-MM-YYYY

    # hardware_version: The device's hardware version can be specified here.
    #   It is not used by the ioHub, so is FYI only.
    #
    hardware_version: N/A
    
    # firmware_version: If the device has firmware, its revision number
    #   can be indicated here. It is not used by the ioHub, so is FYI only.
    #
    firmware_version: N/A

    # model_number: The device model number can be specified here.
    #   It is not used by the ioHub, so is FYI only.
    #
    model_number: N/A
    
    # software_version: The device driver and / or SDK software version number.
    #   This field is not used by ioHub, so is FYI only. 
    software_version: N/A

    # device_number: The device number to assign to the device.
    #   device_number is not used by this device type.
    #
    device_number: 0
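
As the comment at the top of this section notes, the same settings can also be given in Python dictionary form and passed to launchHubServer. A minimal sketch (only a subset of keys is shown; the defaults documented above fill in the rest, and the exact accepted keys depend on your PsychoPy version):

```python
# Sketch: dictionary form of the EyeLink device settings shown above.
# Only a subset of keys is included; unspecified parameters take the
# defaults documented in this section.
eyelink_config = {
    'eyetracker.hw.sr_research.eyelink.EyeTracker': {
        'name': 'tracker',
        'model_name': 'EYELINK 1000 DESKTOP',
        'simulation_mode': False,
        'runtime_settings': {
            'sampling_rate': 500,
            'track_eyes': 'RIGHT_EYE',
        },
        'calibration': {
            'type': 'NINE_POINTS',
            'auto_pace': True,
        },
    },
}

# With PsychoPy installed and an EyeLink connected (or simulation_mode True):
# from psychopy.iohub import launchHubServer
# io = launchHubServer(**eyelink_config)
# tracker = io.getDevice('tracker')
```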

Last Updated: January, 2021

