
Will Neural Sensors Lead to Workplace Brain Scanning? (ieee.org) 68

"Get ready: Neurotechnology is coming to the workplace," claims IEEE Spectrum: Neural sensors are now reliable and affordable enough to support commercial pilot projects that extract productivity-enhancing data from workers' brains.

These projects aren't confined to specialized workplaces; they're also happening in offices, factories, farms, and airports. The companies and people behind these neurotech devices are certain that they will improve our lives. But there are serious questions about whether work should be organized around certain functions of the brain, rather than the person as a whole.

To be clear, the kind of neurotech that's currently available is nowhere close to reading minds. Sensors detect electrical activity across different areas of the brain, and the patterns in that activity can be broadly correlated with different feelings or physiological responses, such as stress, focus, or a reaction to external stimuli. These data can be exploited to make workers more efficient — and, proponents of the technology say, to make them happier. Two of the most interesting innovators in this field are the Israel-based startup InnerEye, which aims to give workers superhuman abilities, and Emotiv, a Silicon Valley neurotech company that's bringing a brain-tracking wearable to office workers, including those working remotely....

EEG has recently broken out of clinics and labs and has entered the consumer marketplace. This move has been driven by a new class of "dry" electrodes that can operate without conductive gel, a substantial reduction in the number of electrodes necessary to collect useful data, and advances in artificial intelligence that make it far easier to interpret the data. Some EEG headsets are even available directly to consumers for a few hundred dollars.
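The kind of interpretation described above usually starts from band power: how much signal energy falls into the classic theta, alpha, and beta frequency bands. Here is a minimal illustrative sketch of that idea in Python. The sampling rate, band edges, and the beta/(alpha+theta) "engagement" ratio are common conventions from the EEG literature, assumed here for demonstration; this is not any vendor's actual pipeline.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz; consumer headsets vary

def band_power(freqs, psd, lo, hi):
    """Integrate the power spectral density over one frequency band."""
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])

def engagement_index(signal, fs=FS):
    """Crude beta/(alpha+theta) ratio from a single EEG channel.

    An illustrative heuristic only: higher values are conventionally
    read as more 'engaged', lower values as more relaxed."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    theta = band_power(freqs, psd, 4, 8)
    alpha = band_power(freqs, psd, 8, 13)
    beta = band_power(freqs, psd, 13, 30)
    return beta / (alpha + theta)

# Synthetic one-channel "EEG": a strong 10 Hz alpha rhythm plus noise,
# roughly what a relaxed, eyes-closed subject might produce.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
relaxed = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
print(round(engagement_index(relaxed), 3))  # low ratio: dominated by alpha
```

Real systems add artifact rejection, multiple channels, and a trained classifier on top, but the core signal they consume is this kind of band-power feature.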

This discussion has been archived. No new comments can be posted.


  • by Mr. Dollar Ton ( 5495648 ) on Sunday November 20, 2022 @03:53AM (#63065409)

    Why should I pay my minions if their brains aren't constantly working on the job they are hired for?

    • Re:They will (Score:4, Interesting)

      by dvice ( 6309704 ) on Sunday November 20, 2022 @09:04AM (#63065799)

      This is quite interesting, actually. I have seen situations where one person is unable to do some task (in programming) within two weeks, some have even claimed it to be impossible, and then someone else does it within hours. So even if those doers were brain-dead for 90% of their work time, they would still outperform the others. How would these people measure in the scan?

      Another interesting case is autistic people, who have been documented to have different brain activity than neurotypical people. I can easily imagine that they would show "coffee break" waves while working and "work" waves while on coffee break, as the latter is usually much more work for them because of the social aspect.

      • I'm a dev and I work in this way. 90% of the time I just wander around and think about a problem. Then, when I have the entire plan ironed out, I code a solution within a few hours. I also very commonly have epiphanies and a-ha! moments when performing other tasks outside of work, e.g. while running or driving, when my brain is on "idle". My best work ever never gets done at actual work. It gets done when I'm alone, locked in a small room, usually after everyone goes to sleep.

      • I've heard these stories. As a particularly rich magnate, I don't believe them. It is the tortoise and the hare all over again. You have to slave all day to earn your keep, just as I slave all day to provide your salary. Go back to work, don't waste time here. You're only productive for a few years anyway.

    • I hope you forgot the /s, but to answer that question for the sarcasm-challenged....

      Because human beings are not machines, and the need to constantly monitor their thoughts for "productivity" shows you want work product that can only be produced by machines. Hence you should just buy the machines and fire the humans, as it's a far better return on your investment. Hell, even putting in the research and development to produce said machines, if they don't exist, would yield a better ROI than constantly expecti
  • by Anonymous Coward on Sunday November 20, 2022 @04:12AM (#63065419)

    Implanted electrodes have too much noise from adjacent, physically growing individual neurons. Skin electrodes sample over much, much too large an area. It's like trying to read Braille by smushing a poodle against it and feeling up the poodle's ears. It's useful for telling if the poodle is standing next to a wall, but that's about the resolution we get. And yes, I used to design neural sensors and stimulators.

    • by larwe ( 858929 )

      So it doesn't work well (or at all?). Will that even matter? Companies buy tracking tech on hype, not results - look at metrics-tracking "wellness programs" for instance, which are greatly hyped, and certainly do suck up a lot of juicy actuarial data into health insurance costing - but don't actually seem to have any great effect on the overall wellness of the employee population.

      Like a lot of other opaque AI-driven algorithms, this stuff will be tuned specifically to locate "unproductive" workers, with a b

  • by metrix007 ( 200091 ) on Sunday November 20, 2022 @04:17AM (#63065423)

    This is the kind of stuff that can slide into a dystopia, and really should just be legislated outright.

    AI ending humanity isn't the risk, this type of shit is.

    • by lsllll ( 830002 )
      This is just another clickbait article. No court is going to allow corporations to routinely have wires running to their employees' skins, or worse yet, electrodes in their heads. Not in the U.S., at least not until technology exists to pick up brain signals over the air. Even then, I doubt the courts will allow that. Being an employee doesn't equate to 100% of your thoughts revolving around the employer and the project. Take personal phone calls [upcounsel.com], for instance. There are limits (thankfully) to what a co
      • by test321 ( 8891681 ) on Sunday November 20, 2022 @07:45AM (#63065635)

        No court is going to allow corporations to routinely have wires running to their employees skins

        The risk is that this technology will enter the market for good purposes. An EEG could be used as a thought controller to command actions in VR world, which could be a legitimate use in the future that a court will allow.

        We have seen this with MS Teams, which is presented as software to help you get some work done, but is then a keylogger that enables all sorts of analysis of what you have done during every second of using it.

      • But of course not. I will not require you to have electrodes implanted, that would technically imply that I'd have to pay for it.

        No, but if you don't have those electrodes and that other applicant has... hey, it's a free market, you could have had that job, ya know...

    • This is the kind of stuff that can slide further into a dystopia, and really should just be legislated outright.

      AI ending humanity isn't the risk, this type of shit is.

      There, I fixed that for you. It's just another way for employers to coerce & gaslight employees. Add it to the pile that they already use.

    • An interesting application of EEG devices is in porn..

      You could show people the porn their brains are most likely to like, so they become more easily addicted..

      So it's a kinda scary hacking of people's brains and their pleasure centers.
  • by NotEmmanuelGoldstein ( 6423622 ) on Sunday November 20, 2022 @04:25AM (#63065435)

    ... correlated with different feelings or physiological responses ...

    "The Computer is your friend. The Computer wants you to be happy. Happiness is mandatory. Failure to be happy is treason."

  • by cstacy ( 534252 )

    What the fuck is this nightmare shit? Seriously?

    Yeah I don't think so.

  • Workers are not machines; most people like to do what they are good at, but also to challenge themselves to do new things.

    To be fulfilled in work, there has to be the right balance between simply doing the things you can already do and achieving something new.

    • I think a large part of this equation is choice. You start 'scanning people's brains' to ensure they're 'focused on the job' and you're infringing on their personal agency. "Oh, but you can always quit!" employers would say, with shit-eating grins on their faces, I'm sure.
      If there is one place, real or virtual, that should be totally and completely inviolate, it should be people's thoughts/feelings. Other humans demanding what amounts to JTAG access to your brain is just plain evil and shouldn't be allowed.
  • Will Neural Sensors Lead to Workplace Brain Scanning?

    Definitely, especially if you work for Elon Musk. There will be an AI that assigns you a 'hard core score' and fires you by SMS if you don't score high enough.

    • Will Neural Sensors Lead to Workplace Brain Scanning?

      Definitely, especially if you work for Elon Musk. There will be an AI that assigns you a 'hard core score' and fires you by SMS if you don't score high enough.

      Or who corrects him [arstechnica.com] when he spews bullshit [imgur.com].

  • by JaredOfEuropa ( 526365 ) on Sunday November 20, 2022 @06:04AM (#63065529) Journal
    I support this IF the first people to start working while wearing these devices are our elected (or appointed) representatives, to monitor whether they are being forthright or duplicitous. If it's OK by law for employers to monitor employees, then it should certainly be OK for us to monitor the people whom we elected on the strength of their promises and personalities.
  • If I can get those brain probes to drench my cranium with endorphins at will... drill baby drill!
    • Many years ago, when it was first discovered that you could stimulate the pleasure center with electrodes, there was a sci-fi story about a man who had electrodes implanted. He died, probably of thirst, because he couldn't unplug from the pleasure.

  • Formatting margins is how I taste God [smbc-comics.com].

    As they say, from a revenue standpoint, it's not unethical.

  • by Misagon ( 1135 ) on Sunday November 20, 2022 @07:17AM (#63065593)

    This reminds me of the debate around high-performing workers doing "Mindfulness" exercises such as meditation to be able to better cope with stress and be more focused on their job. The issue is that workers are doing this instead of getting the working conditions that caused the stress and fatigue changed in the first place.
    One point of criticism is that Mindfulness has taken the methods from Buddhism but without the ethical content.
    Read more: The Mindfulness Conspiracy [theguardian.com]

    And ... it is very easy with EEG to measure whether someone is doing their daily work-mandated meditation exercise properly.
    It is by practicing meditation that I have been able to win every time at "Brain Ball" and similar games that use EEG sensors as input.

    • Good point. If you have to change yourself (either by this 'mindfulness' you speak of, or by taking drugs, including drinking alcohol) just to cope with a job, then it's the job that needs to change. I suppose there are some necessary exceptions (emergency worker/first responder, for instance) but in most cases the job is what needs to change, not pounding the square pegs that are humans into artificially round holes of shitty jobs.
    • Kinda like how yoga and eastern martial arts have been modified and monetized in the West.

      I practice techniques from eastern traditions, breathing & what would be recognizable as mindfulness, *not* including a full package of original philosophy, modified by myself to fit my needs, preferences, and objectives. It's good for stress reduction, focus, health.

      I have to disagree with the guardian article however. Sure, some of the "spiritual" aspects are lost in the western translation. You literally cannot
  • sure will [github.com]

  • If this gear can tell whether you're alert and focused, I could see it coming to heavy equipment & crane operators in cities, watch standers in nuclear plants, etc.
    • Next step: Congress mandates all automobiles must employ this tech to ensure drivers are alert and focused. No more road head!
  • ST:TNG crew had such a neural sensor in the form of a Betazoid on the bridge. The show quickly turned into a soap opera in space.
    • A very buxom Betazoid
    • Deanna wasn't used by the Captain and First Officer as a weapon against the rest of the crew to control them, she was there to help them if they needed help -- and occasionally to give some valuable insights into interactions with other species. Or are you trying to be funny?
      • Help whom? Imagine this scenario: Riker reprimands a crewmember, who is quite upset but wants to be manly and hide it. Deanna senses that upset and tells Riker. I would think it happened time and again. Control is not the only issue, and there may be many good (for everyone involved) reasons to not disclose - or have disclosed - all of your feelings. I was trying to point out how focusing on feelings may not be conducive to a solid work environment... In TOS, they did not have counselors (they did have
  • It's called "introspection".

    With this technology you don't even need any other device but the one you already have within: your brain.

    Of course, it takes time to master but once you do it will improve not only your work but your whole life too, beyond the workplace.
  • And welcome to everyday real-time ones.

    I know, I know. They say it's just for giving us superhuman abilities, or to check how we are doing. The fact is, you're being monitored in real time.

    And trust them when they say that only aggregate data is handed over, because they are very ethical and responsible people that don't do it for the money but for the love of the human race.
    • Know what this reminds me of? How truckers now have cameras and microphones all over inside the cab and are literally being recorded and monitored every second of every workday, in addition to GPS tracking -- and how they can be fired for just a slightly negative facial expression or mumbling something (oh, well, you're clearly road-raging, against company policy, so we have to terminate you); something that's supposedly to enhance driver safety, but of course being used as a weapon against drivers to contr
  • I'm making a gun but I'm concerned it can be used to kill someone.
  • In the context of cognitive neuroscience EEG is a great tool, particularly because of the people using it: neuroscientists.

    It's neuroscientists who conduct the work and give meaning to the data extracted from EEG, which in turn has led to more insights about human behavior.

    In the context described by Emotiv, it's just plain wrong, particularly because of the people using it: managers.
  • If brain scanners show everyone at a TPS metrics status meeting is totally zoned out, maybe the robotic overlords can cancel it and fire the organizer?

  • Stay out of my head, what goes on inside my head is not your gods-be-damned business, never has been, never will be. Shit like this is the ultimate invasion of privacy and must not be allowed.
  • It is a fundamental philosophical view that I hold that every being possesses an inalienable right to self-agency: the right to think for themselves without the intrusion of others. I am not opposed to the core idea that technology might someday be used to breach this frontier, but I do maintain that under no circumstances should such technology ever be used without the express, uncoerced consent of the individual. There is no way that use of this kind of technology in a workplace environment would someh
  • Need to unionize to stop this!

  • They can try to make this mandatory in the workplace (under threat of termination) and some will go along with it. Others will find ways to game the system and/or screw with the results. The idea will be to make the data so unreliable that it's a bigger pain to work with than not.

  • ... without human rights, sure.
