Tag Images With Your Mind
blee37 writes "Researchers at Microsoft have invented a system for tagging images by reading brain scans from an electroencephalograph (EEG). Tagging images is an important task because many images on the web are unlabeled and have no semantic information. This new method allows an appropriate tag to be generated by an AI algorithm interpreting the EEG scan of a person's brain while they view an image. The person need only view the image for as little as 500 ms. Other current methods for generating tags include flat out paying people to do it manually, putting the task on Amazon Mechanical Turk, or using Google Image Labeler."
Looks Good on Paper... (Score:5, Funny)
Honestly this is nice, but seriously, if my mind were tagging my images, I'd end up with something like the following list of tags:
idhitit, awesome, ohgod, thehelliswrongwithme, whydidisavethisagain, ohthatswhy, shit, wallpaper, and photoshop
and that's just keeping it within PG-13.
Re: (Score:3, Funny)
porn
Re: (Score:2, Funny)
So you frequent 4chan's /b/, eh?
MICROSOFT BRAIN HELMET (Score:2)
It itches when you scroll.
It refuses to tag images of money as "evil".
I hear Apple has a multitouch brain helmet in the works! They say you'll never take it off!
Re: (Score:2)
I wonder what Steve Ballmer would think seeing a picture of himself throwing a chair.
Re: (Score:3, Funny)
My Steve Ballmer pictures keep coming up as "best used as cat food"
Re: (Score:2)
I don't know about that, I'm sure it'd be better product testing than them automated machines you see in IKEA.
Re: (Score:1, Funny)
More importantly, why are they staring at goats?
What can go wrong? (Score:1, Funny)
No, really, what can go wrong with using your unconscious animal nature to tag every photo of a (decent) girl in a bikini as "To Do"?
Re: (Score:2)
Using an EEG scan of a person's brain while they view an image could yield very different results for an image of a naked woman depending on the viewer's sex or sexual persuasion. Also, for images of objects and images of people in general - each viewer would have a different set of associations for a given image. For example, imagine the EEG of a person with arachnophobia when presented with a picture of a spider, etc.
Interesting, but needs a lot of work (Score:2)
This new method allows an appropriate tag to be generated by an AI algorithm interpreting the EEG scan of a person's brain while they view an image.
That's true, as long as "appropriate" means it was either X or Y, as the system really only works on discriminating between things like "a face" and "not a face". It's an interesting piece of research, sure, but it sure as hell won't replace good old-fashioned tagging using a keyboard.
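(For the curious, the "X or Y" setup described above is essentially a binary classifier run over per-trial EEG epochs. Below is a minimal sketch in Python with entirely synthetic data and made-up dimensions; it is not the pipeline from the actual paper.)

```python
# Toy face / not-face discrimination from single-trial "EEG" epochs.
# All shapes and data here are invented for illustration.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 200, 32, 128   # ~500 ms at 256 Hz (assumed)

# Fake epochs: "face" trials get a small added deflection mid-epoch.
epochs = rng.normal(size=(n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, size=n_trials)        # 1 = face, 0 = not a face
epochs[labels == 1, :, 40:60] += 0.3              # toy evoked response

X = epochs.reshape(n_trials, -1)                  # flatten each epoch to a feature vector
clf = LinearDiscriminantAnalysis()
print(cross_val_score(clf, X, labels, cv=5).mean())   # chance level would be ~0.5
```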
Re:X or Y (Score:2)
You're not fast enough.
Get it to play 20 questions for you.
"Person/NotPerson ... Place/Thing ... StaplesButton/DolphinPanicButton"
I don't see it as innovative (Score:2)
But in the future more advanced versions might.
Then you could use "thought macros" to control wearable computers.
The measurements of thought patterns are likely to be specific to each person. So devices that use thought input would have to be trained.
But after that, you could be thinking of stuff like "purple green striped elephant" as the escape sequence to tell the computer to start listening in and doing stuff based on the thought patterns it recognizes…
Re: (Score:2)
Thinking of something could cause your wearable computer to perform an action. Which could include visiting a "favorite" URL (which could result in turning the lights on or off, or setting the temperature of the air conditioning, etc.). Or running a script/program.
FWIW, I'm not sure how many would bother to take the trouble to train their wearable (and their minds[1]) to "virtually type" stuff quickly…
Re: (Score:2)
Then you could use "thought macros" to control wearable computers.
What, like this [ocztechnology.com] product?
Re: (Score:2)
But in the near future, I think the legal crap will be what slows progress.
Have to be picky about your subjects (Score:1)
Wouldn't work with teenage males for example....
Bwahahahahaa (Score:1)
MSFT: "Is his Windows and Office license legit? Let's read his mind and find out."
"Does he also run Linux?"
"Yep. Crank up up the juice and reprogram him."
The other thing I'd like to mention, being only on my second cup this morning: those guys in the graphic, when looked at quickly, looked like they were wearing thigh-high stockings.
Re: (Score:1)
The other thing I'd like to mention, being only on my second cup this morning: those guys in the graphic, when looked at quickly, looked like they were wearing thigh-high stockings.
I thought the same and yet the image isn't tagged as such. Clearly vaporware then.
Re: (Score:2)
They're not gettin' their mind probes through my freakin' tin foil barrier.
Cool! (Score:3, Funny)
Just don't let Rorschach tag any images of ink blot tests.
Boobs boobs boobs (Score:3, Funny)
Microsoft, be warned. Some people have a limited scope in terms of what they are thinking about at any given time.
Re: (Score:2)
I think they know that already. "Developers developers developers!"
Connecting an EEG reader to the Internet... (Score:2)
...what could possibly go wrong? ^^
Re: (Score:2)
We could make registering minimal brain activity a requirement to access the internet; that should cut the net population by about 90%.
Oh Microsoft... (Score:1)
It doesn't work very well, and it very probably never will. The variance in electrical activity in the brain between two people receiving the same sensory input is, on average, too great.
Re: (Score:3, Interesting)
Actually, the variance isn't the problem. That comes out statistically in the wash - you can see that with a large enough N, patterns emerge across the different stimuli types, which allows them to do the tagging. The real problem is interpreting the complex interactions between the different regions of the brain. However, that doesn't really matter for an experiment like this, as the patterns don't actually need to be interpreted, just recognized by the algorithm. It's a similar concept to the way the MMP
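(A quick numerical illustration of the "comes out in the wash" point: activity that isn't locked to the stimulus is roughly zero-mean across trials, so averaging N trials shrinks it by about 1/sqrt(N) while any stimulus-locked pattern survives. Toy numbers only, nothing from the study.)

```python
# Averaging across trials: background noise shrinks roughly as 1/sqrt(N),
# while the stimulus-locked pattern does not.
import numpy as np

rng = np.random.default_rng(1)
n_samples = 128
evoked = np.zeros(n_samples)
evoked[40:60] = 0.3                     # pattern tied to one stimulus type (made up)

for n_trials in (10, 100, 1000):
    trials = evoked + rng.normal(scale=1.0, size=(n_trials, n_samples))
    grand_avg = trials.mean(axis=0)
    residual_noise = np.std(grand_avg - evoked)
    print(n_trials, round(residual_noise, 3))   # roughly 0.32, 0.10, 0.03
```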
Re: (Score:2)
You shouldn't directly use the "thought pattern" of a person to tag the data.
As you said, the thought patterns are likely to be different from person to person.
What you do instead is, for each tagging participant, get the person's thought patterns for a whole bunch of tags.
Then they can tag stuff really quickly just by looking at it. The advanced people…
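(That per-participant calibration step could look something like the sketch below: record each person's average response for each tag once, then assign tags to new images by nearest template. The tag names, feature size, and data are all hypothetical.)

```python
# Per-user calibration, then tagging by nearest "thought pattern" template.
# Everything here is made up for illustration.
import numpy as np

rng = np.random.default_rng(2)
n_features = 64
tags = ["face", "animal", "scene"]      # hypothetical tag vocabulary

def calibrate(person_rng):
    """One averaged response pattern per tag for this participant."""
    return {tag: person_rng.normal(size=n_features) for tag in tags}

def tag_image(templates, epoch):
    """Pick the tag whose template correlates best with the new epoch."""
    scores = {tag: np.corrcoef(tmpl, epoch)[0, 1] for tag, tmpl in templates.items()}
    return max(scores, key=scores.get)

templates = calibrate(rng)
new_epoch = templates["animal"] + rng.normal(scale=0.5, size=n_features)
print(tag_image(templates, new_epoch))  # usually prints "animal"
```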
I was tagging that image, honest! (Score:2, Interesting)
Fun and Easy to Use (Score:4, Interesting)
This is typical of MS - thinking that something like this would be easy for the average user. FTA: "However, the mind reading approach has the advantage that it does not require any work at all from the user."
So, in order to use this system, we should all strap on EEG caps while we're surfing the web. Sounds real practical to me - I used to work in an EEG lab, and I can tell you that those caps are pretty uncomfortable to wear. After they put them on, you stick these little needles into the leads and squirt conductive goop on your scalp. It takes a few cycles to rinse that stuff out too.
Way to go MS for making productivity so much easier.
Re: (Score:2)
Re: (Score:2)
ORLY?? So tell me, which technology are they going to use out of all the EEG reading options out there?
I'm certainly not discounting the research. I think the real contribution is that they found an interesting real world application for these patterns. But really, it's not as groundbreaking as the article lets on. I was studying the patterns of evoked potentials way back in '89, my friend.
Re: (Score:2)
You used to work in a lab, so you ought to be familiar with how research works, and how often it produces actual products.
Forget the practicalities of people doing this in their homes; in principle, I think it's pretty damn cool.
Re: (Score:2)
And as we all know, no technology that was slightly inconvenient in a lab has ever had any value or practical use.
Re:Fun and Easy to Use (Score:4, Informative)
I used to work in an EEG lab, and I can tell you that those caps are pretty uncomfortable to wear. After they put them on, you stick these little needles into the leads and squirt conductive goop on your scalp. It takes a few cycles to rinse that stuff out too.
Smitty, we've come a long way from those caps. There are now "caps" that are essentially nets of elastic cord with plastic cups containing pieces of sponge, with the electrodes embedded in the sponge. Dip it in mild salt water for conduction, shake it out so there are no drips running together and bridging the electrode sites, and pull it on. I could get a good signal on 128 channels in less than 10 minutes from the time the subject walked in to the start of data collection.
There is also a European company selling a similar getup, but the preamps are built into the cups on the net, making impedance matching irrelevant and signal balancing automatic on the fly. These are so stable that they can be used while ambulatory.
And nobody ever has to get goop or glue stuck on/into them any more.
Re: (Score:2)
Hell, they've got EEG game controllers [technologyreview.com] now.
Oh, that's what the internet needs more of! (Score:3, Insightful)
no semantic information? Ahem *cough* (Score:3, Insightful)
unlabeled and have no computer-readable semantic information.
There, fixed that for you.
Seriously, the old saying "an image is worth 1,000 words" implies that images frequently have semantic information, at least in the sense that anything on paper can have semantic information. It's just that computers can't parse it, catalog it, search on it, etc. Not well, anyways, not yet. They are getting there though.
Re: (Score:2)
Terminology Police (Score:2)
Just one minor nitnoid: the title of this article should be "Tagging Images With Your Brain", not Mind. Electrical impulses are used - using the word mind implies that some conscious effort is involved. This is strictly identifying patterns using machine algorithms independent of the user's thought process.
3-class (Score:3, Interesting)
Focus (Score:2, Interesting)
What happens when I'm tagging a photo but listening to music at the same time?
Or I run the photo tagging software in a small window and watch a movie (or some porn) instead?
So they can create tags from brain waves, but there's no way to tell what a user is actually focussing on.
Re: (Score:2)
What happens when I'm tagging a photo but listening to music at the same time?
Or I run the photo tagging software in a small window and watch a movie (or some porn) instead?
So they can create tags from brain waves, but there's no way to tell what a user is actually focussing on.
If you were in a sensory deprivation tank with the target image as the only perceptual cue, you'd still have hundreds to thousands of competing cognitions boiling away, fighting for processing space on their way to awareness, few of them making it but all taking up some resources and generating some signal. But these, as well as any ongoing stimulus like music, are not locked in time with the stimulus presentation and so are random relative to the stimuli. When processed, the signal is cut at the same point…
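(The "cut at the same point" step is stimulus-locked epoching: slice the continuous recording at a fixed offset relative to each image onset, so anything not time-locked to the image, such as the music, averages toward zero. A toy single-channel sketch with an invented sampling rate and onset times:)

```python
# Stimulus-locked epoching and averaging; all numbers here are made up.
import numpy as np

rng = np.random.default_rng(3)
fs = 256                                     # assumed sampling rate, Hz
recording = rng.normal(size=fs * 60)         # 60 s of continuous "background" EEG
onsets = np.arange(fs, fs * 55, fs // 2)     # hypothetical image-onset sample indices

# Inject a small response 100-200 ms after every onset.
for onset in onsets:
    recording[onset + fs // 10 : onset + fs // 5] += 0.3

epoch_len = fs // 2                          # 500 ms epochs
epochs = np.stack([recording[o : o + epoch_len] for o in onsets])
erp = epochs.mean(axis=0)                    # background cancels, the response remains
print(erp[: fs // 10].std())                 # small residual noise before the response
print(erp[fs // 10 : fs // 5].mean())        # ~0.3, the injected response
```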
Deep brain? (Score:2)
Cost Effective? (Score:2)
Are brain scans really so cheap that it's cheaper to set up an EEG than to pay someone in a third-world country to do it?
Am I EEG Or Not (Score:2)
So we'll now have automatic 'Like', 'Dislike' and 'Eeeeeeagh my visual cortex where is the brain soap' responses?
Soon we can implement a lar! (Score:2)
I for one welcome our mask overlords [cheeseburgerbrown.com]
Integrate it with an intelligent vocabulary (Score:1)
So what? (Score:2)
And yet somehow we've managed to survive. I've never really seen the point behind "tagging" much of anything. In every implementation, it just amounts to a mostly random bunch of words that a mostly random person or group thought vaguely described the item at that time. It's never been useful for finding more of the same because tags are so absurdly broad, and it's never been useful for…
recaptcha-like (Score:1)