I apologize if this has been shared here before, but there is a relevant Kickstarter campaign that is rather close to what I am working on, including the branding (EyeSpeak vs my EyeSpeech). A Portuguese company is trying to use the Epson Moverio and an eye-tracking sensor to enable ALS patients to look at a keyboard and author messages by eye gaze. Sadly, I believe this approach will fail: saccades and other factors make the gaps between many adjacent letters, projected at a small size in close proximity, too small for eye-gaze technology to resolve reliably. That is why I invented my EyeSpeech interface, along with the fact that most people using eyewearable devices will not want an entire ASCII keyboard floating in front of them. It's fine on a desktop or even a handheld, but the interface must be re-imagined for smart glasses. Still, I hope the Kickstarter campaign is a success!
https://www.kickstarter.com/projects/886924859/eyespeak-beyond-communication

Comments

  1. It's limited in a lot of ways. Firstly, it only runs Android, and lots of Windows-based and Linux setups exist which are far more capable. It's a brilliant idea, but I don't think it's quite ready for mass consumption.

  2. Most decent software for ALS, etc. requires a decent processor. Look at Stephen Hawking: he's got his setup with two Lenovo tablets running Windows 7 that control his typing, etc.

    Eye movement is certainly the future, but it requires some horsepower and usability I'm not sure Android can deliver.

  3. The biggest market is in cheap, reliable eye-tracking systems that make a complex OS like Windows simple to use for ALS patients and sufferers of related conditions.

    I can see it happening, but I'm doubtful many people are willing to invest in it. I mean, Glass has an eye-tracking camera, but because Glass lacks the processing power and the general usability of a setup like Stephen Hawking's, it's pointless to contemplate using it.

  4. The system can get simpler, requiring less processing, with targets that are larger and easier to calibrate for eye-gaze commands... using my patents, three of which have been granted, so no need to spend a dime, Kris. https://www.google.com/patents/US20120019645

  5. All we need is a device with a fast processor, i.e. an i5 or i7 and 6-8 GB of memory, and we should be set. From there you build software hooks into Windows that turn the cursor into a semi-large blob and tell Windows it's a touchscreen, then make the eye tracker your mouse (see the sketch at the end of this comment).

    Done. Windows 8 is ready for this technology RIGHT NOW, but unfortunately having Windows 8 in your eye like Glass (yes, I have and can do this right now if I like) doesn't work, and hence setups where a screen is attached to a pole on a wheelchair are the ABSOLUTE best option.
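
    As an illustration of that eye-tracker-as-mouse idea, here is a minimal Python sketch for Windows. Only SetCursorPos and GetSystemMetrics are real Win32 calls; get_gaze() is a hypothetical stand-in for whatever API a given eye tracker exposes, and the update rate is arbitrary.

```python
# Minimal sketch: drive the Windows cursor from gaze coordinates.
# Windows-only (uses ctypes.windll); get_gaze() is a placeholder for
# the eye tracker's real API, which this sketch does not assume.
import ctypes
import time

user32 = ctypes.windll.user32
SCREEN_W = user32.GetSystemMetrics(0)  # SM_CXSCREEN
SCREEN_H = user32.GetSystemMetrics(1)  # SM_CYSCREEN

def get_gaze():
    """Hypothetical tracker call; returns gaze (x, y) normalized to 0..1."""
    return 0.5, 0.5

while True:
    gx, gy = get_gaze()
    # Map normalized gaze to screen pixels and move the cursor there.
    user32.SetCursorPos(int(gx * SCREEN_W), int(gy * SCREEN_H))
    time.sleep(1 / 30)  # ~30 Hz update loop
```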

  6. Wouldn't work. Imagine having flashing letters pop up randomly while you have to RELY on a substandard sensor to detect that you've blinked.

    No matter what UI concept you might dream up to type, write, etc. with Glass, it's going to end in headaches and, worst case, worse vision.

  7. Maybe with Glass XE2, but this platform is hardly static, and many other eyewearables are in development. Look at what Brandyn White has already done by fixing a cam onto Glass, calibrating it to track pupil position, and playing a Mario game with eye gaze. http://blog.brandynwhite.com/new-glass-input-methods_eye-tracking_touch-sensitive-clothing Soon this capability will be on every smart glass product.

  8. I see it happening, but then we have the concern around eye strain: the display needs to be directly in front of the eye, not to the side, for it to function accurately.

    I could talk for hours about the feasibility of certain platforms for specific needs, and I usually do. Unfortunately, I haven't got access to the people I would like to work with to actually develop this stuff and get it moving forward.

  9. Samantha, the EyeSpeech interface is based on the concept that only the target directly in front of the eye is eligible for selection. The eye is either seeking or selecting. When the eye looks at the targets to the right of center, the virtual flywheel spins left, so that the targets right of center move toward the center. And vice versa: when the eye looks at targets left of center, it spins the virtual flywheel toward the right.
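
    Purely to illustrate that seek-or-select behavior, here is a rough Python sketch. This is not the actual EyeSpeech code: the target set, center-zone threshold, and dwell time are all invented for the example.

```python
# Illustrative "seek or select" flywheel: gaze off-center spins the wheel
# so side targets rotate toward the center; steady gaze on the center
# target accumulates dwell time and eventually selects it.
TARGETS = list("ABCDEFGH")   # items arranged around the virtual flywheel
CENTER_ZONE = 0.15           # |gaze_x| below this counts as "looking at center"
DWELL_FRAMES = 30            # centered frames needed to confirm a selection

position = 0                 # index of the target currently at center
dwell = 0

def step(gaze_x):
    """gaze_x in -1..1 (left..right). Returns the selected target or None."""
    global position, dwell
    if gaze_x > CENTER_ZONE:         # looking right: spin left, right targets move in
        position = (position + 1) % len(TARGETS)
        dwell = 0
    elif gaze_x < -CENTER_ZONE:      # looking left: spin right
        position = (position - 1) % len(TARGETS)
        dwell = 0
    else:                            # gaze on the center target: accumulate dwell
        dwell += 1
        if dwell >= DWELL_FRAMES:
            dwell = 0
            return TARGETS[position]
    return None
```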

  10. Anyone with Glass wishing to alpha test this concept is welcome to sign up at www.eyespeech.com for the touch version. The eye-gaze version will take some time... hopefully by the end of the summer. And anyone interested in helping develop this interface, please let me know!

  11. When I get my Glass back I'm going to give this a shot. Definitely interested in testing it heavily.

  12. Thanks... it needs a lot more work, but the basic concept is developed.


  13. Samantha Myers: "Eye movement is certainly the future but it requires some horsepower and usability I'm not sure Android can do."

    *Eye-scrolling on Android, instead of Amazon’s tilt-to-scroll*
     
    Here’s a video demonstration of eye-scrolling on Android: http://youtu.be/PL9cCi5zTzE?t=1m42s.
     
    You wouldn’t have to keep tilting your device in order to scroll.
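
    A hedged sketch of how that gaze-to-scroll mapping might work, assuming only a normalized vertical gaze coordinate from the tracker; the dead zone and speed are illustrative values, not anything from the video.

```python
# Gaze-driven scrolling sketch: gaze near the top or bottom of the
# viewport scrolls in that direction; a central dead zone keeps normal
# reading from triggering movement. Platform-agnostic; values invented.
def scroll_velocity(gaze_y, dead_zone=0.3, max_speed=600.0):
    """gaze_y in 0..1 (top..bottom); returns pixels/sec, positive = down."""
    offset = gaze_y - 0.5                # signed distance from screen center
    if abs(offset) < dead_zone / 2:      # reading zone: no scrolling
        return 0.0
    edge = 0.5 - dead_zone / 2           # from zone edge to screen edge
    strength = (abs(offset) - dead_zone / 2) / edge   # 0..1 ramp
    return max_speed * strength * (1.0 if offset > 0 else -1.0)
```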
     
    *Cheap eye tracking in devices with front facing cameras first*
     
    Eye Tribe, the eye-tracking company with the prototype, says that it will cost only $5, and that only slight modifications are needed to integrate an eye tracker into smartphones, tablets, notebooks, and laptops, because they all already have front-facing cameras. (The developer eye tracker that I have from them is still only $99.)
     
    I think Google has the most to gain with eye tracking (Google, Apple, and Nokia all have eye-tracking patents, but Google has a valuable pay-per-gaze patent).
    I wish they would just partner with Eye Tribe, or any other eye-tracking company, and get an eye tracker into a Chromebook.
    It would instantly remove the "gorilla arm" ergonomic problem of holding the arm out horizontally at a vertical touchscreen, and start allowing people to do heavy work with touch UIs (an easy-to-reach "touch-what-I'm-looking-at" keyboard button).

    Once eye tracking hits the mass market, the costs should come down, and it should be easier to get eye tracking into head-mounted devices.
    Given the limited real estate, eye tracking would be most useful on wearables.

    *Eye-tracking in Glass*
     
    However, some things need to happen first, like a better battery.
    Someone posted this on Reddit a while back:
     
    “Secondly, the issue with image processing: what you don't read about when seeing eye-tracking glasses is the backpack with a Mac mini, wires, and battery pack that is needed in order to do all the image processing.
    Do you think the PrimeSense 3D sensor has an embedded image-processing chip? No.
    To make a long story short, what's really holding back the world you described above is the following:

    Image-processing chips (hardware especially designed for image processing): the technology is close, with a few companies doing amazing work in this field.

    Power (actual battery power in a small physical space): we just don't have the required power to do this kind of image processing without carrying a full onboard Mac-mini-like PC.

    Eye-tracking methods (pupil center corneal reflection) just plain suck when you factor in ambient light.
    Someone needs to do two things: firstly, figure out how to do eye tracking with another method (some research is being done on this); secondly, start making cameras particularly for eye tracking (right now all eye trackers use off-the-shelf cameras).
    The issue with using off-the-shelf (or premade) cameras is that they are moving towards more visible light and away from picking up IR light.
    Eye trackers need to be able to see IR light to do image processing.”
     
    Yeah... one step at a time.
    Five dollar eye trackers on smart phones, laptops, etc. first ;).

  14. Just to clarify, Stephen Hawking uses a scanning system with his tablets, which is controlled by a facial-recognition system that detects cheek, mouth, and eyebrow movement. It's actually a three-input system, which drives a word-scanning program. The new system (which Intel is constantly tweaking) replaces his old one-input (cheek movement) system, which was at times painfully slow.

    He has tried eye tracking, but is not fond of it. From all accounts, he's a pretty particular guy. He recently upgraded his equipment due to further loss of function in his body.

    Switch scanning is very simple: it involves a grid of letters organized in rows and columns. When the highlight reaches a row that contains the letter he wants, he activates the switch. The scanner then automatically advances through each letter in that row. When it reaches the letter, he activates the switch again. It is a slow process, but with word prediction it can increase words per minute.
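
    For anyone unfamiliar with the technique, here is a rough sketch of that row/column scanning loop in Python. It is a simplified illustration, not Hawking's actual software: the letter grid, the step timing, and the pressed() switch callback are all assumptions.

```python
# Row/column switch scanning: rows are highlighted in turn; one switch
# activation picks the row, then letters in that row are highlighted in
# turn and a second activation picks the letter.
import time

GRID = ["ETAOI", "NSRHL", "DCUMF", "GPWYB", "VKXJQ"]  # illustrative layout
STEP = 0.8  # seconds each row/letter stays highlighted (illustrative)

def scan_once(pressed):
    """pressed() is a hypothetical callback: True if the switch was
    activated during the current highlight interval. Returns the chosen
    letter, or None if the cycle ends without a selection."""
    for row in GRID:                     # auto-advance through the rows
        print("row:", row)
        time.sleep(STEP)
        if pressed():                    # first activation: choose this row
            for letter in row:           # auto-advance through the letters
                print("letter:", letter)
                time.sleep(STEP)
                if pressed():            # second activation: choose the letter
                    return letter
            return None                  # ran off the row without a pick
    return None
```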

    ALS is a horrible disease. I've worked with maybe 50-60 patients who've had ALS. Near the end stages, it's sometimes difficult even to use state-of-the-art eye-tracking systems (Tobii). I'm seeing two clients now who have almost no voluntary movement, except for a little jaw movement. Even eye movement can be affected, as well as the ability to close your eyelids.

    The future is definitely BCI (brain computer interface). My hope is that within a few years more research will determine ways to treat ALS and other debilitating diseases. 

    Working with ALS patients is often the most difficult part of my job. I've become friends with many of them... It's tough to see the progression, and then finally receive news that they have passed away.

  15. Kris Kitchen Thanks for posting this. As with a virtual QWERTY keyboard, the targets (characters) here will be too close together for eye tracking to reliably resolve, IMO. Due to saccades, and because the user also gazes at real objects in their immediate (not virtual) environment, I would caution against displaying many small, adjacent targets. That approach puts enormous processing and calibration requirements on the system, which must distinguish eye gaze/movement intended to operate the UI from eye movement that is involuntary or unrelated to operating the system.

    I would also caution against thinking of the eye as a mouse or fingertip. The eye is not a way to swipe, for example. Google has a patent for unlocking screens/systems with an eye swipe. While it's fine on an iPhone to swipe with your fingertip to unlock the screen, the finger is not a visual input device like the eye is. So why use vestigial elements of handhelds to operate eyewearables? We have the opportunity to totally rethink the UX. Here, the simpler the better, and we can develop a new way to communicate by taking the simplest approach and building on it.

    To make a long story short: avoid the temptation to use targets like a keyboard or the eye like a mouse/fingertip. Simplify. Again, as stated in my top post, I think displaying all characters simultaneously is a nonstarter. Lastly, you suggest the method could be used with blinks, winks, facial movement, etc. There is no way to outline a workable method without defining exactly how it will be used. If you like facial movements, think through a serious method for that. If you like eye-gaze commands, as I do, think through a serious method for that. A one-size-fits-all approach is not realistic, IMO.

  16. Kris Kitchen Yes, it's possible, but it's not very intuitive for the user, and different colors seem to further complicate the UI. Asking people to memorize eye movements as a way to control the infinite possible letter/word combinations is not realistic, IMO.

  17. How will assigning different colors to alphanumeric characters make the method more intuitive?

