Input Devices Science

Let Your Pupils Do the Typing 49

New submitter s.mathot writes: Researchers from France and the Netherlands have developed a way to—literally—write text by thinking of letters. (Academic paper [open access], non-technical blog, YouTube video.) This technique relies on small changes in pupil size that occur when you covertly (from the corner of your eye; without moving your eyes or body) attend to bright or dark objects. By presenting a virtual keyboard on which the 'keys' alternate in brightness, and simultaneously measuring the size of the eye's pupil, the technique automatically determines which letter you want to write; as a result, you can write letters by merely attending to them, without moving any part of your body, including the eyes.
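Roughly speaking, the decoding comes down to checking which brightness schedule the pupil trace follows: the pupil constricts slightly whenever the attended key is bright and dilates when it is dark. Here is a minimal Python sketch of that idea (my own illustration, not the authors' code; it assumes you already have a pupil-size trace and a ±1 bright/dark schedule for each group of keys, and the published method is certainly more elaborate):

    import numpy as np

    def attended_group(pupil_size, group_brightness):
        """Pick the group of keys whose brightness schedule best explains the
        pupil trace. pupil_size: 1-D array of samples; group_brightness: list
        of +/-1 arrays (bright/dark), one per group, over the same samples."""
        pupil = np.asarray(pupil_size, dtype=float)
        pupil -= pupil.mean()  # remove the slow baseline drift
        # For the attended group, bright frames should coincide with
        # constriction, so pupil size and brightness are anti-correlated;
        # the most negative correlation wins.
        scores = [np.corrcoef(pupil, np.asarray(b, dtype=float))[0, 1]
                  for b in group_brightness]
        return int(np.argmin(scores))

With two groups flashing in opposite phase, repeating this over smaller and smaller subsets (as the demo video appears to show) narrows the selection down to a single letter.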
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Is this really new? (Score:1, Interesting)

    by Z00L00K ( 682162 )

    I think that eye movement tracking has been studied for a long time as an input method, mostly for handicapped people who lack movement in a major part of their body.

    What about the system Stephen Hawking uses?

    • Re: (Score:3, Interesting)

      This one is probably slightly different: rather than trying to track where you are looking, it examines the pupil change to try to figure out what letter you are looking at. Sounds terribly unreliable and expensive.
      • Well it's actually very reliable and cheap: The eye tracker used for the video is €100, and selection accuracy (even for naive participants) is over 90%. It's slow though.
      • It would seem that the main problem is that it probably works great in a dark room on a monitor specially designed for it, but not in the real world, not on a monitor with a giant glowing power button, and not if you want to show anything else on the screen at the same time.

      • > when you covertly (from the corner of your eye; without moving your eyes or body) attend to bright or dark objects.

        If this is accurate, you don't have to look at the letter. It sounds like, as the summary says, you're literally just thinking about a letter in your peripheral vision without looking directly at it and your pupil responds to its brightness.

        Which, even if unreliable, is an interesting discovery.
    • by Martin Blank ( 154261 ) on Saturday February 06, 2016 @11:42AM (#51452747) Homepage Journal

      Hawking still uses a system activated by a muscle in his cheek, one of the few over which he still has some level of control, which is then detected by an IR sensor in his glasses. Earlier versions used a small joystick while he still had some control over a few fingers (or maybe it was just one), but the system has been adapted as he's lost more and more control.

      This system might allow him to continue working even if he loses the last vestiges of control over his facial muscles.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      I understand the confusion, but this technique does not rely on eye tracking in the sense of measuring eye position. It relies on measuring the size of the pupil, while the eyes don't move. So our technique is more comparable to so-called brain-computer interfaces than to conventional eye tracking. (Of course, eye tracking and similar devices (such as the cheek system used by Stephen Hawking) have been around for years, and if (eye) movement is possible, these are more efficient.)

      • Nice hype, but you are measuring the pupil size, there is no brain-computer interface, and you aren't "thinking" of letters. The system would work without letters. Ridiculous.
        • The system may not need to show letters to get the letters right, but somewhere in the brain there need to be letters, and somewhere in the machine there also need to be letters.

          As an aside, is anyone else having to relog in a lot on the mobile side of Slashdot?
        • It's not a direct read, but the idea is the same. You're thinking of a letter, and your attention goes to the letter on the screen even if your eyes don't move at all (they mention this for use in locked-in syndrome, where there's no voluntary movement at all). The iris responds to a lesser degree than it would if it were to center on the letter, but it still responds to the brightness, an involuntary movement based on a thought.

          It's not a direct brain interface, but it makes for an indirect one, though.

      • Call it as I see it.
    • by mcgrew ( 92797 ) *

      Indeed, this would only be helpful to someone who could neither type nor speak. It seems that writing this way would be very time consuming.

      • Indeed, this would only be helpful to someone who could neither type nor speak.

        Nor move any part of their body including their eyes.

        It seems that writing this way would be very time consuming.

        If the alternative is "not communicating at all" I don't think it matters.

    • by sconeu ( 64226 )

      My late wife had an eyegaze computer made by Tobii [tobiiati.com].

      My first thought is, "How does this differ from a traditional eyegaze computer?"

      • by KGIII ( 973947 )

        You made me curious, so I went looking. The prices aren't that bad, though I don't see any options for open source. I wonder if there are any projects doing this with open-source software (I looked and didn't find any). It seems like you could patch something like that together for less money, and the software would then be the only really hard part (unless I'm missing something). An above poster, seemingly at a university and working with this sort of thing, indicates that you can do that with a standard web cam.

        I've n

    • by mikael ( 484 )

      Some systems detect the direction the eyes are looking by measuring the shape of the pupil. Advertising and human-computer interaction people use eye-gaze detection systems to measure how long, and where, a person is looking.

  • This is perfect. We now don't even need to move our arms to watch netflix. The peak of civilization has now been achieved.

    Oh, by the way, this type of system has already been deployed for paraplegics for many years. And no, you aren't "literally thinking" about letters, you are "looking" at letters. Learn what "literally" means.
    • Actually the new and improved part of this invention is exactly that people do NOT look at the letters.

      • Not really. You are "looking", but you don't have to move the eye to look. In fact there doesn't need to be "letters" at all. You are just "looking at" light/dark patterns and measuring the pupil response. No thinking required, which makes it perfect for people like me who don't like to think.
    • by Anonymous Coward

      Except that the point of this tech is that you aren't in fact looking at the letters, you're mentally focusing on them outside of the focus of your vision. This is what they mean by "attending" to them, which frankly is a poor wording that doesn't really convey what they're trying to convey.

    • by Anonymous Coward

      Not quite so! Eye tracking (which does involve looking at the letters) has been used for many years; this technique instead measures pupil size while participants keep their eyes on a single location. They attend to the letter they want to type rather than actually looking at it. The pupil responds to the bright/dark background, and that is what's picked up by the algorithm. So the study is rather novel, and it's a promising system for those who cannot move their eyes.

      You could've known this if you had actually read the announcement and/or the paper.

    • by OzPeter ( 195038 )

      This is perfect. We now don't even need to move our arms to watch netflix. The peak of civilization has now been achieved.

      Netflix. Yeah, right.

  • Did anyone else think this was going to be advice to teachers to let their charges do the typing instead of the teachers?

  • by Applehu Akbar ( 2968043 ) on Saturday February 06, 2016 @11:54AM (#51452797)

    So this isn't about exploitation of students, then?

  • by PopeRatzo ( 965947 ) on Saturday February 06, 2016 @12:42PM (#51452949) Journal

    Right after they clean the blackboard and erasers.

  • by 140Mandak262Jamuna ( 970587 ) on Saturday February 06, 2016 @01:18PM (#51453177) Journal
    Till about the 1990s, all over India, even in very small towns, there were "Typewriting and Shorthand Institutes" [thehindu.com]. In those institutes, pupils have been typing since the 1900s.

    They started morphing into programmer mills churning out dBase III and COBOL coders, and now they teach everything from Java to Ansys Fluid Mechanics R17.1 (register for two courses and AutoCAD is free!).

  • Richard Stallman Does This.

    He has his graduate students do his typing for him; too many EMACS related Vulcan Nerve Pinches have left him with severe carpal-tunnel syndrome.

    Oh. Not *those* kind of "pupils". Never mind.

    • by hawk ( 1151 )

      EMACS is indeed the only software that ever caused me physical injury.

      After a multi-day editing binge on a CKIE keyboard, I went to the campus medical center. Muscle strain in my left pinky from the rotate/stretch/curl of my large hands to hit control...

      Now, I would never tamper with University property, but a couple of days later there was a little piece of plastic next to my keyboard, and the shift-lock no longer toggled, allowing me to remap the control key to where God meant it to be...

      hawk

      • by djinn6 ( 1868030 )
        I've been using two $10 USB foot pedals for control and meta. It's completely gotten rid of the pinky and thumb strains for me, at least while I'm coding.
  • It sounds to me like they are implying that when the eye focuses on the brighter lights, its pupil contracts, and in that way the device can know which letter you want. Is that not what is happening? Because that would definitely require eye movement. Are they implying that you will just somehow know the differing brightness of the keys without even looking at them?

  • Do you mean "look at"?

  • They have developed a way to—literally—write text by looking at letters. Great advance!!

  • by marciot ( 598356 ) on Saturday February 06, 2016 @07:58PM (#51454855)

    It appears as if the letters are separated into two groups, and within each group the brightness alternation is perfectly synchronized across the letters and opposite in phase to the letters in the opposing group. Smaller sets of letters are presented over time until one letter is chosen. So this appears to be a binary search that reads one bit at a time, based strictly on the phase of the brightness signal.

    What makes me wonder is why this is so constrained. Could the brightness of each letter be controlled independently to encode the letter directly? Perhaps the user could be presented with a full keyboard, with each "key's" brightness modulated according to a different binary code. Presumably the code of the character the user was fixating on could then be read from the pupil-diameter variation directly?
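    A quick sketch of what that direct-coding idea might look like (purely hypothetical, and it assumes the pupil response is fast and linear enough to keep the codes separable, which is a big if): give every key its own ±1 brightness code, record the pupil while the user attends to one key, and pick the key whose code is most strongly anti-correlated with the trace.

        import numpy as np

        def decode_key(pupil_size, key_codes):
            """key_codes maps each key to its +/-1 brightness sequence, ideally
            mutually orthogonal (CDMA-style spreading codes). Returns the key
            whose code best matches the measured pupil response."""
            pupil = np.asarray(pupil_size, dtype=float)
            pupil -= pupil.mean()  # strip the slow baseline
            # The attended key's bright frames line up with constriction, so
            # it should give the most negative correlation with pupil size.
            return min(key_codes,
                       key=lambda k: np.corrcoef(pupil, key_codes[k])[0, 1])

    In practice the pupillary light response is sluggish and low-pass, so fast independent codes would likely smear together, which is presumably why the demo sticks to slow, synchronized bright/dark alternation.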
