> While it may not replace the mouse in the near future, it's certainly an interesting device
Eye-tracking won’t have the precision of a mouse for some time.
However, you can combine eye-tracking with other inputs.
Eye-tracking companies ship features that combine mouse control and eye control. For example, eye-tracking first teleports your cursor near your target, and you then use the mouse to place the cursor precisely.
*Mouse-cursor-teleport user setting: how long the mouse-controlled cursor must rest before eye control is involved again (mouse precision still in use)*
Tobii has a time setting that determines how quickly a teleport-to-point-of-gaze will occur after the mouse moves: you set how long the mouse-controlled cursor has to sit still before eye control is involved again. (The return of eye control can mean either that gaze controls the cursor directly, or that the next movement of the mouse warps/teleports the cursor to the point-of-gaze.) It’s for, “wait, I’m still using the mouse for stability and precision. The mouse-controlled cursor is still working in this area”.
*Mouse-cursor-teleport user setting: the point-of-gaze must be a certain distance from the mouse-controlled cursor before eye control is involved again (eye-tracking only handles the larger cursor jumps)*
Another setting is the distance the point-of-gaze must be from the mouse-controlled cursor before a gaze teleport fires. It’s for, “some of the targets are close enough that I can just use the mouse. I’ll save eye teleporting for when the distance is large”.
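The two settings above can be sketched as a single re-arm check. This is only an illustration: the function name, threshold names, and default values are assumptions, not Tobii's actual API or defaults.

```python
# Illustrative sketch of the two teleport settings described above.
# REST_TIME_S and MIN_GAZE_DISTANCE_PX are made-up defaults, not
# values from any real eye-tracking software.

REST_TIME_S = 0.3           # mouse must rest this long before eye control re-arms
MIN_GAZE_DISTANCE_PX = 200  # gaze must be this far from the cursor to teleport

def should_teleport(seconds_mouse_at_rest, gaze_pos, cursor_pos):
    """Return True when the next mouse movement should warp the
    cursor to the point of gaze."""
    rearmed = seconds_mouse_at_rest >= REST_TIME_S
    dx = gaze_pos[0] - cursor_pos[0]
    dy = gaze_pos[1] - cursor_pos[1]
    far_enough = (dx * dx + dy * dy) ** 0.5 >= MIN_GAZE_DISTANCE_PX
    return rearmed and far_enough
```

With these values, a gaze point far across the screen triggers a teleport once the mouse has rested, while nearby targets are left to the mouse alone.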
Eye-tracking + keyboard: eye-tracking doesn’t have the precision of a mouse, but if an interface element’s hit state is large enough, a “click-where-I’m-looking” keyboard button will work.
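A rough way to think about "large enough": gaze estimates carry an error of very roughly a degree of visual angle, which maps to some on-screen radius. The accuracy figure, viewing distance, and pixel density below are illustrative assumptions, not specs of any particular tracker.

```python
import math

# Back-of-the-envelope check for whether a target is large enough
# for a gaze click. All numeric defaults are assumptions for
# illustration (typical desktop viewing distance, ~96 dpi screen).

def gaze_error_radius_px(viewing_distance_cm=60, accuracy_deg=1.0, px_per_cm=38):
    """Approximate on-screen error radius of the gaze estimate."""
    error_cm = viewing_distance_cm * math.tan(math.radians(accuracy_deg))
    return error_cm * px_per_cm

def clickable_by_gaze(target_width_px, target_height_px):
    """A target is comfortably gaze-clickable when its smaller side
    exceeds twice the expected gaze error."""
    return min(target_width_px, target_height_px) >= 2 * gaze_error_radius_px()
```

Under these assumptions the error radius works out to roughly 40 px, so a 120 px button is comfortable while a 40 px icon is not.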
Eye-tracking + keyboard two-step process: some eye-tracking features let an eye-controlled cursor snap or zoom onto a smaller target element, or project smaller elements into larger ones.
Sometimes it’s a two-step process, so even if you have the ability to instantly teleport the cursor, “both-hands-on-keyboard + eye-tracking two-step process” may not be suitable in certain situations.
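The "snap" idea above can be sketched as picking the nearest known target center within some radius of the gaze point. The target list and snap radius here are made-up illustration values, not any tracker's real behavior.

```python
# Hypothetical snap step: move the gaze point to the nearest target
# center within a snap radius, falling back to the raw gaze point.

def snap_to_target(gaze, targets, snap_radius_px=80):
    """Return the nearest target center within snap_radius_px,
    or the raw gaze point when no target is close enough."""
    best, best_dist = None, snap_radius_px
    for cx, cy in targets:
        dist = ((gaze[0] - cx) ** 2 + (gaze[1] - cy) ** 2) ** 0.5
        if dist <= best_dist:
            best, best_dist = (cx, cy), dist
    return best if best is not None else gaze
```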
Eye-tracking teleport + mouse and keyboard: however, whenever you need the mouse, eye-tracking will still be there to provide the initial cursor teleport.
Without eye-tracking: If you have both hands on the keyboard, you lose time switching one hand to the mouse, and bringing the hand back to the keyboard.
You’re usually choosing between both hands on the keyboard, or one hand on the mouse.
With eye-tracking: eye-tracking can be used either with both hands on the keyboard (a click-what-I’m-looking-at keyboard button) or with one hand on the mouse (an initial cursor teleport, then precise mouse placement).
You never have to forgo something to use eye-tracking; it’s always ready to make normal computer interaction faster.
*Eye-tracking can make on-screen buttons, and thus macros, more prevalent*
Eye-tracking can make macros more popular because it makes custom widgets and on-screen buttons easier to activate, and therefore more widely used.
A collection of custom on-screen macro buttons with recognizable, self-documenting text labels is easier to maintain than a collection of Control + Alt + Shift + <whatever> keyboard shortcuts for activating macros.
For example, Tasker macros on mobile have a better chance of adoption than AutoHotkey or AutoIt shortcuts on the desktop.
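As a toy illustration of the maintenance argument: with labeled buttons, the key in the table *is* the documentation, while a shortcut string says nothing about its action. The macro names and bodies below are placeholders, not a real AutoHotkey or Tasker integration.

```python
# Placeholder macro actions; a real setup would run AutoHotkey/Tasker
# scripts or OS automation here.

def archive_email():
    return "email archived"

def start_timer():
    return "timer started"

# On-screen buttons: the key is a self-documenting label.
buttons = {
    "Archive email": archive_email,
    "Start 25-min timer": start_timer,
}

# Keyboard shortcuts: the key says nothing about the action.
shortcuts = {
    "Ctrl+Alt+Shift+E": archive_email,
    "Ctrl+Alt+Shift+T": start_timer,
}

def press(label, registry):
    """Dispatch a macro by its button label or shortcut string."""
    return registry[label]()
```

Scanning the `buttons` table six months later tells you exactly what each macro does; scanning `shortcuts` does not.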