My guess is that eye-controlled AF wouldn't be running the whole time, just when you press something equivalent to the AF-On button, so you could still scan your whole scene without confusing it. But to me it's just yet another thing to go wrong. Though I must admit, the iPhone's system of simply touching the screen where you want it to take all its exposure measurements etc. is pretty fantastic for quickly and easily telling the camera what you want. If you could do something similar by LOOKING at the points you want for exposure and focus with your DSLR, that would certainly be another step toward the incredible ease of use that made iPhones so popular in the early days.
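For the curious, here's roughly what that tap does under the hood on iOS: a minimal sketch using AVFoundation's point-of-interest API. The `AVCaptureDevice` and preview-layer calls are the real ones; the function name and the wiring around them are just my own illustration.

```swift
import AVFoundation
import UIKit

// Hypothetical tap handler: converts a tap on the camera preview into the
// normalized coordinates AVFoundation expects, then asks the device to focus
// and meter exposure at that spot — the gesture described above.
func setPointOfInterest(tap: CGPoint,
                        previewLayer: AVCaptureVideoPreviewLayer,
                        device: AVCaptureDevice) {
    // Map the tap from screen/layer coordinates into the device's
    // normalized (0,0)-(1,1) point-of-interest space.
    let point = previewLayer.captureDevicePointConverted(fromLayerPoint: tap)
    do {
        try device.lockForConfiguration()
        if device.isFocusPointOfInterestSupported {
            device.focusPointOfInterest = point
            device.focusMode = .autoFocus          // one-shot focus at the tapped point
        }
        if device.isExposurePointOfInterestSupported {
            device.exposurePointOfInterest = point
            device.exposureMode = .continuousAutoExposure  // meter from that point
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```

An eye-controlled DSLR would presumably just swap the source of that point: instead of a touch location, the gaze coordinates from the eye sensor would feed the same focus/metering logic.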