Science

Why Do We Live at 10bits/s? (betanews.com) 97

BrianFagioli shares a report from BetaNews: It might sound unbelievable, but the human brain processes information at just 10 bits per second! Yes, folks, that's slower than the internet speeds many of us endured during the early days of dial-up. While our senses take in billions of bits of data every second, our brain intelligently sifts through the chaos, letting through only what's important.

This is no accident. Researchers Jieyu Zheng and Markus Meister explain in their study, The Unbearable Slowness of Being, that the brain is built this way for survival. Instead of getting overwhelmed by a flood of details, the brain has a system to focus on what matters most. It ensures we act quickly and effectively without being bogged down by unnecessary information. [...] The slow pace of the human brain might seem like a drawback in today's fast-paced world, but it has been sufficient for survival throughout human history. Evolution prioritized efficiency over speed, enabling the brain to focus on critical tasks without wasting energy. While machines continue to outpace us in raw processing power, the human brain remains unmatched in its ability to prioritize and adapt.
The study raises an important question: Why does a brain capable of such complexity operate at such a slow rate?


Comments Filter:
  • 10 bits/s: that's Comcast's new speed when you hit your cap.

    • Re: (Score:1, Insightful)

      by Anonymous Coward

      It's also the processing speed of your average Democrat voter.

      • Re: (Score:1, Insightful)

        by Anonymous Coward

        It's also the processing speed of your average Democrat voter.

        Go fuck yourself.....

  • But frankly as soon as I saw the stupid (and extremely forced) title they gave their "study", I pretty much decided I should dismiss it out of hand.

    • It's incredibly stupid because there's a lot more to our brains than the parts we use for conscious thought. That may not be as fast as other parts of the brain that evolved millions of years ago as opposed to much more recently, but it's taken immense processing power in modern computer systems to do what our (or other animal) brains do efficiently. We haven't been able to build a machine capable of consciousness yet, so there's no way to know if it's just incredibly difficult compared to other things we h
      • by Roger W Moore ( 538166 ) on Saturday December 28, 2024 @01:43AM (#65044723) Journal

        It's incredibly stupid because there's a lot more to our brains than the parts we use for conscious thought.

        Even when we consider this, though, we clearly process data faster than 10 bits/second. Consider that the average person reads at 238 words/minute, which is ~4 words per second, and that the average vocabulary is around 30,000 words for English speakers. If we map the very non-binary processing of our brains onto the binary world, we would need 15 bits to encode a word. Reading 4 words/second, even with this highly inappropriate mapping (our brains do not work this way), the average brain processes 60 bits/second while reading - and that's a conscious thought process.
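
A quick Python sketch of that arithmetic, using the comment's own figures and the same admittedly inappropriate uniform-choice encoding:

```python
import math

words_per_minute = 238    # average reading speed cited above
vocabulary_size = 30_000  # rough English vocabulary estimate

words_per_second = words_per_minute / 60               # ~4 words/s
bits_per_word = math.ceil(math.log2(vocabulary_size))  # 15 bits to index ~30k words

print(f"{words_per_second:.1f} words/s * {bits_per_word} bits/word "
      f"= {words_per_second * bits_per_word:.0f} bits/s")
# ~60 bits/s, under the assumption that each word is an independent, uniform choice
```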

        • by MindPrison ( 864299 ) on Saturday December 28, 2024 @03:06AM (#65044757) Journal

          Not to forget the image processing we do.

          We literally recognize characters (letters) in various shapes and sizes, in real time, while putting together words, filling in missing gaps, visualizing images from words and words from images.

          Some of us are even capable of looking at something and, in a split second, imagining an entire film scene, fully rendered with special effects worthy of a Hollywood production (visual thinkers); some of us are capable of creating music with complete synthesized sounds developed in-house (in-brain) and of playing 1000 variants of those.

          No, the brain is insanely fast; it's capable of seeing the bigger picture of a massive amount of historical and present data.
          Is it as accurate as a computer's memory? No! Is it faster than anything we know? Absolutely! We're barely capable of emulating an insect as it is.

          • Not to forget the image processing we do.

            We literally recognize characters (letters) in various shapes and sizes, in real time, while putting together words, filling in missing gaps, visualizing images from words and words from images.

            Some of us are even capable of looking at something and, in a split second, imagining an entire film scene, fully rendered with special effects worthy of a Hollywood production (visual thinkers); some of us are capable of creating music with complete synthesized sounds developed in-house (in-brain) and of playing 1000 variants of those.

            No, the brain is insanely fast; it's capable of seeing the bigger picture of a massive amount of historical and present data. Is it as accurate as a computer's memory? No! Is it faster than anything we know? Absolutely! We're barely capable of emulating an insect as it is.

            A personal example: I talk relatively slowly. But behind that, I am mentally parsing every word multiple times. That lends itself to greater verbal accuracy. On the flip side, until people are used to me, they tend to interrupt. Still, I'm doing a whole lot more than 10 bits per second of processing.

            And yes, trying to assign some bits/sec figure to a machine like the human brain is going to fail. Our noggins can switch between linear and non-linear processing, and how to assign a meaningful number to that is

        • "the average human vocabulary is around 30,000 words for English speakers. Now if we map the very non-binary processing of our brains into the binary world we would need 15 bits to encode a word"

          You would need 15 bits to encode each word *if words were chosen at random*. But they're not; they're part of a rational statement in a natural language, and as the notable limitations of spicy autocomplete show, given a few words of a sentence there are few choices for what the next word could be without being nonsense.

    • by Okian Warrior ( 537106 ) on Friday December 27, 2024 @10:05PM (#65044501) Homepage Journal

      But frankly as soon as I saw the stupid (and extremely forced) title they gave their "study", I pretty much decided I should dismiss it out of hand.

      To achieve understanding, one only needs to read and understand 3 papers.

      Firstly, there's A Mathematical Theory of Communication [harvard.edu], by Claude Shannon. This paper is an easy read; you only need the first third (-ish?), and you can skip over the mathematics and still get the gist. Just assume the mathematics are correct, believe his conclusions, and you're good. The first section of that paper defines information as a measure of data.

      Shannon's Prediction and Entropy of Printed English [upenn.edu] is the next paper to read. In it, he uses the definitions and methods from the first paper to derive the information we get from printed English. The "channel capacity" of written words is about 1 bit per character of text; the paper goes through how he arrives at this figure, and it's an interesting read.
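
To get a feel for where such estimates start, here is a minimal sketch of the zeroth-order calculation; the letter-frequency table is illustrative rather than Shannon's exact data, and it ignores the context effects that pull his estimate down to ~1 bit/character:

```python
import math

# Illustrative relative frequencies for the twelve most common English
# letters (approximate values, not Shannon's exact table).
freq = {'e': .127, 't': .091, 'a': .082, 'o': .075, 'i': .070, 'n': .067,
        's': .063, 'h': .061, 'r': .060, 'd': .043, 'l': .040, 'u': .028}
remainder = 1 - sum(freq.values())  # lump the 14 rarer letters together
probs = list(freq.values()) + [remainder / 14] * 14

h0 = -sum(p * math.log2(p) for p in probs)
print(f"zeroth-order entropy: ~{h0:.1f} bits/letter")
# Roughly 4 bits/letter when context is ignored; conditioning on preceding
# letters is what pulls Shannon's estimate down toward ~1 bit/character.
```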

      The third paper to read is The Magical Number Seven, Plus or Minus Two [utexas.edu], where George Miller describes the channel capacity of human sensory inputs, from experiments. The experiments are described, and listed in the reference section if you want more info.

      So for human hearing, you can play notes on a piano and ask the human to choose among 3 pitch possibilities (low, medium, high) and the human will always get the right answer. Similarly with four possibilities, five possibilities, all the way up to about seven. Once you hit seven possibilities, the human begins to make errors about which pitch was presented.

      The interesting bit about hearing is that it doesn't matter how you format the choices: you can have 4 low pitches, which the human will recognize perfectly, 4 high pitches, which the human will also recognize perfectly, but put them together (8 choices total) and the human begins to make mistakes.

      The channel capacity of human hearing is about 3 bits of information; no matter how you segment the choices, you can't exceed that amount of information.

      The 3rd paper (Magical Number 7) goes through several human perceptual modes, such as identifying the segment of the X axis where an "X" occurs: more than 7 options and the human begins to make mistakes.
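
The arithmetic behind these channel-capacity figures is just log2 of the number of distinguishable alternatives; a small illustrative sketch:

```python
import math

# Error-free identification of one of n equally likely alternatives
# conveys log2(n) bits of information per judgment.
for n in (2, 4, 7, 8, 16):
    print(f"{n:2d} alternatives -> {math.log2(n):.2f} bits")
# Miller's ~7 alternatives is log2(7) ~ 2.8 bits: the point where
# absolute identification starts producing errors in those experiments.
```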

      The paper mentioned in the OP is simply restating the various background and conclusions from the 3 papers cited.

      Apropos of nothing, anyone doing AI research - not engineering, such as learning the API and hooking up an AI to perform various tasks, but actual research on making AIs that work better or more like the human brain - can use the information-theoretic description of our brain mechanisms to guide their thinking and direction of research. Knowing about information theory can eliminate a lot of proposed models and mechanisms you want to try.

      I like to think of it as an analogue to modern physics using abstract algebra to inform their theories on quantum gravity. You know that any correct solution needs to transform in a certain way (Lorentz invariance, for example), and this eliminates all solutions that do not have a specific form.

      I think AI will eventually be that way: correct solutions will learn quickly from small amounts of data (10 bits per second), or have a realistic explanation for why they deviate from this rule.

      But until then, expect to spend $150 million to train ever larger models on Sagans and Sagans of data.

      (Oh, and if you're interested, a fourth paper "Most Efficient Chunk Sizes" by David Dirlam can be quite an eye opener. The implications of this for AI are...)

      • by postbigbang ( 761081 ) on Friday December 27, 2024 @10:27PM (#65044523)

        I believe the basic premise is incorrect.

        I can read at about 400wpm, and can memorize at about half that for different bursts; many can do the same-- I'm not unique.

        I can do complex math in my head, sadly from habit. This is also not unique.

        I'm not special. Processing at 10bit/sec is plainly bullshit. The denominator of a bit is even worse. The mind processes many things concurrently. Reaction time (motor skill) is only one measure of many.

        Consider the second violinist in a typical orchestra, who's listening, playing, reading a chart, wary of the maestro, and has an itch. The parallelisms processed don't measure in any way to a descriptor of "10 bits/sec". It's plainly inane and facile to consider the brain in this way.

        Mapping this to AI training: the brain does calculus, taking the direct path to an answer in a way similar to AI, but on a highly parallel scale. The brain doesn't gain inference in the same way; rather, it has core adaptation to its environment (which AI is not blind to, yet AI references inputs from that same environment). AI, like corporations, has no soul, whatever soul is. We measure soul by understanding compassion, shame, even guilt. There are those who walk among us with no compassion, shame, guilt, or sense of love; we call them sociopaths, narcissists, and psychopaths. Yet they function among us (until they don't), just like AI.

        • Are you an AI? (Score:3, Insightful)

          I believe the basic premise is incorrect. I can read at about 400wpm, and can memorize at about half that for different bursts; many can do the same-- I'm not unique.

          Average word length is 5 letters at one bit per letter, and 5*400 = 2000 bits/minute; divide by 60 seconds in a minute => 33 bits/sec. And you mention that you need to go at half that rate to memorize, which is about 16 bits/sec.
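
That calculation, spelled out (the 5 letters/word, 1 bit/letter, and 400 wpm figures are the ones assumed above):

```python
letters_per_word = 5    # average English word length assumed above
bits_per_letter = 1     # Shannon's printed-English estimate
words_per_minute = 400  # the parent poster's claimed reading speed

reading_rate = letters_per_word * bits_per_letter * words_per_minute / 60
print(f"reading:    {reading_rate:.1f} bits/s")      # ~33 bits/s
print(f"memorizing: {reading_rate / 2:.1f} bits/s")  # ~16.7 bits/s at half speed
```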

          And you think that's sufficiently distant from 10 bits/sec to invalidate the premise?

          Also, you're not memorizing. Unless you can recite back what you read word-for-word, which I guarantee would be almost unique among humans.

          I can do complex math in my head, sadly from habit. This is also not unique.

          So to draw an analogy, you're saying that because the com

          • 1 bit per letter? I know there is some redundancy but that seems a little optimistic.

              You haven't defined what a bit is. On a computer it's a one or a zero because of technological limitations. What is a bit in the human mind? Can that simply map to an instinct, where one bit equals a slithering snake out of the corner of one's eye? Then we jump back out of the way, for instance.

              Also, doing calculus in one's head, for instance, simply comes from having largely memorized many things. It appears that while the human mind only processes at a fairly "low bit rate

              • by Junta ( 36770 )

                if one "bit" is redefined to mean more than 2 states, then it is ridiculous to use the word bit.

                The word was chosen to try to establish a metric of comparison with more obviously quantized digital computing.

                10 bit/s implies only being able to listen at a pace of one word per second if you had a vocabulary of about a thousand words, ignoring the non-word stimuli you are constantly also processing. However, that would be a terribly slow pace, and even a poor vocabulary is 20 times larger. Also, people can r

                • by dfghjk ( 711126 )

                  The term isn't being "redefined", it simply has two different meanings in different contexts. Once you understand that, you realize the entire purpose of the post is a troll, it is trying to suggest that the brain cannot process information except at an absurdly low rate. It accomplishes that troll by relying on a misunderstanding of the measure of information being used, then it is defended here by Okian Warrior using a bunch of bad faith tactics that ignore the real problem and offer more confusion for

          • by Bongo ( 13261 )

            This might be a Zeno's Arrow situation where the logic and data are absolutely correct yet the answer is nonsense anyway.

          • by q_e_t ( 5104099 )
            The idea of considering words to carry about 1 bit per character misses so much of the subtlety of human language that it's not even funny.
          • by dfghjk ( 711126 )

            "And you think that's sufficiently distant from 10 bits/sec to invalidate the premise?"
            Yes, once you remove all your myriad errors in calculation. See above.

            "You should write this up as a series of papers. Disproving the studies mentioned in the "Magical Number 7" paper cited above would be highly valuable to science."
            So should you. I would bet your paper gets more laughs than his.

            "Are you an AI? Who's talking about reaction time?"
            At least there's a concept of time there, unlike your claim of "channel capa

          • Um... what's the denominator of a bit again?

            I chuckled.

            Saying "the brain does calculus? Citation needed.

            Oh, I gotta agree with the "I can do complex math in my head" humblebraggart's assertion that brains do calculus. Catching a ball, judging when another vehicle in traffic is likely to hit you or miss you on its current trajectory (and current acceleration/deceleration trend, of course), etc, etc -- there are countless examples I can think of that demonstrate the human brain intuitively doing at least second-order derivatives of motion/translation. I mean, our brains are clearly not implementing n

        • by Junta ( 36770 )

          I broadly agree with your sentiment that the comparison is flawed, but I will point out that your reading speed is highly dependent on the context.

          conscious lesson dominant recording oppose allocation highway grandfather ministry remunerate drama normal interrupt as earthwax censorship position article decade dead shareholder workshop camera rule first

          The first sentence was something you could probably read at 400 wpm easily, at least I could read a sentence like that. The sequence of the same number, but rand

          • Extrapolate what you wrote randomly to parallelism. The words you wrote connect by their contrast and can be rapidly related to each other by contrast. So is that a bit?

            Shortcuts are like stored procedures; they're "CPU strokes" often running in parallel. The relationships among random inputs are data points themselves. Chaos constitutes aggregated sets of data points. Making sense (whatever that is) of chaos is normal, as almost each moment is essentially smoothed chaos. 10"bits"/sec?

            The descriptor and den

      • To achieve understanding, one only needs to read and understand 3 papers.

        The paper mentioned in the OP is simply restating the various background and conclusions from the 3 papers cited.

        Only having access to an abstract that says very little, it seems the estimates of the information content of inputs via the "outer" brain are separate from the 10 bit/s "inner" brain.

        The channel capacity of human hearing is about 3 bits of information; no matter how you segment the choices, you can't exceed that amount of information.

        All of the codecs for audio and video processing worth using have baked-in perceptual models that take advantage of the illusory nature of human perception to reduce the digital bandwidth required to reproduce the original analog signal without people noticing the difference. If what you say is true, where is my transparent 3-bit audio codec?

        I think AI will eventually be that way: correct solutions will learn quickly from small amounts of data (10 bits per second), or have a realistic explanation for why they deviate from this rule.

        Is t

          All of the codecs for audio and video processing worth using have baked-in perceptual models that take advantage of the illusory nature of human perception to reduce the digital bandwidth required to reproduce the original analog signal without people noticing the difference. If what you say is true, where is my transparent 3-bit audio codec?

          I do think Samuel Johnson said the same sort of thing, more eloquently: "I refute it thus" *punt* Heheh.

      • You do have to be a bit (no pun intended) careful about associating 10 bits per second with the human sensory input capacity ideas. A bit as a unit of information doesn't make sense on its own, it has to be relative to some prior agreed model. Technically, you always measure entropy and Kullback Leibler divergence against a reference distribution, so when you say 10 bits you are really talking about classifying the outcomes of the reference distribution 2^10 ways.

        As a concrete example, it's difficult to

      • by q_e_t ( 5104099 )
        Clearly I can save space in my mp3 collection by reripping at 3b/s
      • by Junta ( 36770 )

        The issue being that the attempt to correlate to computing standards doesn't quite map well onto these works.

        A mapping of bits to printed characters is a bit weird, since we'd be thinking about words rather than letters in terms of the "information" conveyed in natural use. However, a lot of the principles extend to the written word: our ability to process a stream of words is contingent on the previous words effectively defining the likely scope of the next word. If words break those promises, then we

      • I am pretty sure that humans can distinguish more than seven colors. And even if we can only distinguish about seven pitches, we can do that faster than one note per second.
      • by dfghjk ( 711126 ) on Saturday December 28, 2024 @10:12AM (#65045199)

        "Firstly, there's A Mathematical Theory of Communication [harvard.edu], by Claude Shannon. ... Shannon's Prediction and Entropy of Printed English [upenn.edu] ... uses the definitions and methods from the 1st paper to derive the information we get from printed English. The "channel capacity" of written words is about 1 bit per character of text, the paper goes through how he arrives at this figure, and it's an interesting read."

        The first paper defines a "bit" exactly as we expect: one boolean of information. It also defines "channel capacity" as "units of information per unit of time", and a channel as something that transmits information.

        The second paper discusses information conveyed in written language. There are a bunch of issues with your comments though:
        1) 1 bit per character of text is not a "channel capacity"; there is no time in that metric.
        2) The information you claim defines channel capacity is the result after processing; the information received is far greater, and that would also be processed by the brain.
        3) The experiments involved only 26/27 symbols; written language uses multiples of that. The data is preprocessed to reduce the information further.
        4) The conclusion was indeed 1 bit/letter with "redundancy of roughly 75%".

        Now if we take your example provided below of 5 letters per word and 400 wpm, the bit rate would in fact be 6*400 = 2400 bits/minute, because you ignored the separators, so the reading rate would be 40 bits/sec. But that's post-processed information; the input rate would be 160 bits/sec according to the article you cite and, in fact, far greater if a simplified character set were not used. Also, the "memorizing" points you make are merely moving the goalposts. Finally, what you are reading matters immensely; text processing rate is largely irrelevant. Reading a pre-K picture book is gonna be faster than Nicomachean Ethics.
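
The corrected arithmetic as a sketch, using this comment's assumptions (6 characters per word counting the separator, and the cited 75% redundancy figure):

```python
chars_per_word = 6      # 5 letters plus the separator the parent ignored
words_per_minute = 400
bits_per_char = 1       # post-processing rate (Shannon's ~1 bit/letter)
redundancy = 0.75       # "redundancy of roughly 75%" from the cited paper

post_rate = chars_per_word * bits_per_char * words_per_minute / 60
input_rate = post_rate / (1 - redundancy)  # raw rate before redundancy removal
print(f"post-processed: {post_rate:.0f} bits/s")   # ~40 bits/s
print(f"raw input:      {input_rate:.0f} bits/s")  # ~160 bits/s
```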

        So by your own citations, 10 bits/sec is off by an order of magnitude (perhaps even two orders in some written languages), and that ignores the OCR processing that's done before any of this is even measured. OCR would dwarf all the other processing BTW. The articles you cited are relevant, but they don't paint the picture you claim they do.

        Didn't bother looking at the third citation; there were already so many errors starting with the first one, and this post is long enough. I expect it to be interesting, but I suspect you don't really understand that one either. Appealing to authority you don't understand is not a good look. Thanks for the citations, but always make sure you understand them yourself.

    • Seems like they're using a very narrow definition of "processing", since we already have a pretty good idea how much data is required to reproduce convincing visual and auditory inputs. After doing quite a bit of tweaking of my own video encoding preferences for my Kodi movie library, I'd say anything less than about 1.5 gigabytes per hour of 1080p video starts to look noticeably bad. That's about 3.33 Mbps with data compression, so even after doing all the various tricks to eliminate redundancies, it's
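
For reference, the bitrate conversion quoted there:

```python
# 1.5 GB per hour of 1080p video, converted to an average bitrate.
gigabytes_per_hour = 1.5
bits_per_hour = gigabytes_per_hour * 1e9 * 8  # decimal GB -> bits
mbps = bits_per_hour / 3600 / 1e6
print(f"{mbps:.2f} Mbps")  # ~3.33 Mbps, the compressed rate quoted above
```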

      • hour of 1080p video starts to look noticeably bad.

        Your fovea only picks up a tiny portion of that at a time. If you could selectively downscale everything outside of that region, you probably wouldn't notice. Even within that region, your visual cortex basically tunes out details in objects that you aren't consciously paying attention to. Classic example:

        https://www.youtube.com/watch?... [youtube.com]

        Unless you're actively seeing through that test, you're not even conscious of the crap production values of the gorilla suit, let alone the picture quality of it. Chances are you can't even consciously make yourself see the veins in your retina without shining a bright light nearby, even though they're always "seen" by your visual cortex, regardless of lighting.

        • hour of 1080p video starts to look noticeably bad.

          Your fovea only picks up a tiny portion of that at a time. If you could selectively downscale everything outside of that region, you probably wouldn't notice. Even within that region, your visual cortex basically tunes out details in objects that you aren't consciously paying attention to. Classic example:

          https://www.youtube.com/watch?... [youtube.com]

          Unless you're actively seeing through that test, you're not even conscious of the crap production values of the gorilla suit, let alone the picture quality of it. Chances are you can't even consciously make yourself see the veins in your retina without shining a bright light nearby, even though they're always "seen" by your visual cortex, regardless of lighting.

          The human eye has a single element lens, and it is a pretty crappy lens as lenses go. Only a very small area of "sharp" vision

          Our brain does an incredible job of stitching together the info it gets. What seems to be largely in focus is the result of moving our eyes and letting our brains process, and commit to short-term memory, what we are looking at. If I don't move my eye, it focuses on only one word

  • The study raises an important question: Why does a brain capable of such complexity operate at such a slow rate?

    Because otherwise we'd be overwhelmed and either go insane or be paralyzed. If our brain let us see, hear, smell, taste, and feel everything at the same time it would be sensory overload. Which happens to be a real condition [medicalnewstoday.com]. It has to limit the data stream to keep the body functioning. It's a self-preservation mechanism. And by "It has to", I don't mean it's thinking about this process, only th

    • by dfghjk ( 711126 )

      Doesn't the brain know that CISC was destroyed by RISC?

    • by piojo ( 995934 )

      No. Aren't you begging the question? Consciousness (the type of consciousness that humans have) can be aware of a certain amount of information. More is overwhelming, so either the species as a whole or the individual growing brain developed so as to not present more than that amount of information for long periods of time. But why does this consciousness hold that particular amount of information?

  • Fast enough to destroy a planet. It just takes a while.
  • From the original article at https://www.sciencedirect.com/... [sciencedirect.com]:

    "The game “Twenty Questions” has been popular for centuries1 as a thinking challenge. If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about million possible items in the few seconds allotted. Therefore, the speed of thinking—with no constraints imposed—corresponds to 20 bits of information over a

    • I saw the headline and read part of TFS/TFA and then stopped. It took just a few (say 10) seconds and yet I "processed" much more than 100 bits of information in that time. Enough information to know that I didn't want to bother reading further.

      It's a slow holiday on slashdot.

    • by gweihir ( 88907 )

      "Bit" has a clear definition: https://en.wikipedia.org/wiki/... [wikipedia.org]
      What they seem to be talking about are "chunks", which is something very different: https://en.wikipedia.org/wiki/... [wikipedia.org]

      When the authors do not even get _basic_ definitions right, the paper is probably not worth reading.

      • by dfghjk ( 711126 )

        What they are talking about are "output bits", not "input bits".

        Let's say you posed a question that requires a yes/no answer. And let's say it takes a system one second to produce that answer. The processing rate of that system is NOT 1 bit/sec, it could be billions of bits/sec. It is unknown how complex the processing would need to be. That's the lie here.

      • by piojo ( 995934 )

        Where do you get that idea? I clicked through to the article and the (mostly paywalled) paper, and the summary says:

        This article is about the neural conundrum behind the slowness of human behavior. The information throughput of a human being is about 10 bits/s. In comparison, our sensory systems gather data at ~10^9 bits/s.

        The fact that they are analyzing it in the same space as "10^9 units/s" implies they are not talking about chunks.

  • What does the following mean from the abstract? It sounds like a wacky non-sequitur.

    "If the guesser wins routinely, this suggests that the thinker can access about million possible items in the few seconds allotted. Therefore, the speed of thinking - with no constraints imposed - corresponds to 20 bits of information over a few seconds: a rate of 10 bits/s or less."

    By 2^20 I assume they are referring to a search space of a million possible items if each item were addressable via a 20 bit unique identifier.
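
The abstract's arithmetic, spelled out under the illustrative assumption that "a few seconds" means about two:

```python
questions = 20                 # ideal yes/no questions, 1 bit each
search_space = 2 ** questions  # ~1 million distinguishable items
seconds_allotted = 2           # "a few seconds"; illustrative value

print(f"2^{questions} = {search_space:,} items")
print(f"{questions} bits / {seconds_allotted} s = "
      f"{questions / seconds_allotted:.0f} bits/s")
# 20 bits resolved over ~2 s gives the paper's ~10 bits/s figure.
```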

    • I'll be honest, I'm not sure I can personally reference a million unique items. Maybe.
    • ... a search space of a million possible items ...

      It seems to conflate processing with quantity of information. Yet the terminology suggests the brain can identify the 'address' of the required record. This, in fact, goes to the heart of modern knowledge about the brain. We know it does very little thinking; there seem to be millions of heuristics the brain uses to find an answer: it doesn't have to look at a million records until it finds a match. That might be why stupidity is so common. A stupid person either doesn't have the brain 'power' to comp

  • Imagine if we had to process everything we take in, in real time and thoroughly, instead of filtering out the nonsense. Life would be a nightmare.
  • The "article" is total BS. This didn't deserve a post.

  • by Midnight_Falcon ( 2432802 ) on Friday December 27, 2024 @09:48PM (#65044477)
    Interestingly, this theory is similar to that of Aldous Huxley in the classic "The Doors of Perception" (if you've ever listened to The Doors, the famous American band, that's their namesake). Essentially, he posits that the brain's function is largely reductive: it ignores or reduces most of its sensory input, filtering out elements unnecessary for survival. Huxley believed psychedelics disrupt the efficiency of this filter, leading to states analogous to religious rapture, artistic inspiration, or, on the darker side, schizophrenia.

    The results of this particular study seem to agree with much of Huxley's old argument.

  • Why do humans run so slow? Why are we so weak? Why do we have no fur, no claws, no fangs?

    Same reason we supposedly "only process 10 bits/s". Because we don't need to do more.

  • A bit per second, 86400 seconds per day, 12.5 cents per bit. Works out to $10K a day. I wish I ran at 10 bps.
  • Maybe some top-level, executive function of our brain gets fed information at 10bps, but we offload a lot of processing to other parts of our brain and nervous system to simplify things down.

    For example, the bandwidth of the optic nerve is estimated to be about 10Mb/s [uh.edu] per eye. But clearly we don't "perceive" at 10Mb/s; different parts of our brain process the info and send us much lower-bandwidth info like "That's a stop sign" or "That's an apple."

    • by dskoll ( 99328 )

      LOL, that's what the paper said. I guess I should have read it before commenting.

  • A bit of analog information is way more than 1 bit of binary; stop with the apples and oranges.

    The human body has more than just raw inputs to our brain, muscle memory is a thing. A car analogy: modern cars have self contained subsystems scattered all over the vehicle and they only send basic info to the central control unit.

    We are also highly parallel, and our database lookup for memory is near instant, um, well, until we get older and memory is so full that the lookup takes a bit of time to
  • If you understand something quickly but it takes you hours or days to explain it to me, or vice versa, clearly one of us is going faster than the other.

  • A "brain bit" is not the same as a "computer bit". This comparison is stupid. When a human identifies a car it takes maybe 3 bits, "car", "red", model. A computer would use billion of bits for the same task.

    • by gweihir ( 88907 )

      So they are actually talking about chunks ( https://en.wikipedia.org/wiki/... [wikipedia.org] ). That makes a lot more sense. Of course, a chunk generally has much, much higher entropy than 1 bit. Seems somebody wanted to talk about information processing, but did not even know or understand the basic definitions. Color me very unimpressed.

  • It takes me much more than a second to do 8b/10b encoding in my head. Makes for very slow Infiniband transactions when I have to toggle it in by hand.

  • When I look around, it seems like two bits/s is all I see
  • No idea how they ended up with this "estimate", but there is no way 10 bits/s is enough to encode that data stream. Hell, even a slow reader can read faster than that. Complete nonsense.

  • Try high-dose krill oil sometime.

    You'll sometimes see your information processing if you wake up / come on line gently.

    I'd estimate closer to 60 symbols per second, heavy on the rotational semantics and perhaps ten thousand rudimentary symbols, maybe 20 bits each all-in with degrees of freedom, or about 1200 bps of entropy encoding.

    That's the abstract; a different system than what someone with eidetic memory stores.

  • Reading their justification of the 10 bits per second metric makes me realize that the entire premise of the paper is totally wrong.

    Their rationale for the 10 bits per second metric is based on a person's ability to play the game "20 Questions". Seriously?

    These people are complete idiots and this paper is some F-level grade school shit.

  • The word "bullshit" itself is more than 10 bits. Did it take you one second to process it when I said "provable bullshit"? Unless your vocabulary is 1024 words (it isn't), understanding an English word in under 1 second conveys more than 10 bits of information. So if you understood what I meant by the claim "provable bullshit" (not the proof, just what I meant by those words), then your brain processed a lot more than 10 bits/s. Same thing with object identification. If you see a snake and get scared wit

  • Δt·ΔE ~ h/2π

    Many years ago I attended a course on systems theory, and they showed that if you want to save energy, you have to slow the processing speed of the system. One of the real-life examples discussed was exactly the human brain. To keep power consumption low, its processing speed isn't that fast. The big difference is that the brain is a massively parallel system, so slow processing speed isn't that important.

    • Doesn't human vision process hundreds of megabits per second? Is this not considered part of the brain? I suppose our brain has a couple of GPUs (or more) for seeing, and something similar for hearing,
    • by kackle ( 910159 )
      That's what I was thinking (at 10 b/s, apparently). No matter what the actual speed may be (and I'll bet differs from person to person), we can't be eating food 8 hours per day. That would mean an elephant-like existence.
  • Fuck my 15 moderator points, you should be ashamed that this was allowed at all. What a load of shite.

    I'm really disappointed.

  • Say, that's the most we will ever need?
  • The 10 bits/sec number is obviously unsupported bullshit pulled from the authors' asses, and I assert, equally unsupported, that each of them has two asses: a functional one between their legs and a rotten one between their ears.
  • My wife can read an entire page of text of a novel in maybe five seconds and repeat parts of it. I can't. On a sustained basis she slows down significantly, but that's still 200 to 600b/s.

    I can look at a diagram plus text and, provided I choose to commit it to memory, I can bring it up in my memory and "read" from the mind's-eye image (very handy in school exams, to the extent it felt like cheating). Ditto phone numbers and passwords. I'd have a hard time specifying the information flow rate for that, and it degr

  • Make up a ridiculous number so people will visit the website.

  • So perhaps our brain can only process 10 bits per second of sensory input, but it's a massively parallel processor with hundreds or even thousands of "cores". Like complaining that an Intel® Xeon® Platinum 8490H Processor only runs at 1.9GHz. But it has 60 cores.
  • I’m skeptical. Especially so, since the author of the article is the one who submitted it to Slashdot. That always raises a red flag for me. After digging into the actual paper referenced in the article, "The Unbearable Slowness of Being" by Zheng and Meister (preprint [arxiv.org] here) the article's author is, and I'm being diplomatic, taking some truly serious liberties with the paper’s findings.

    The paper explores the paradox of the brain’s slow behavioral throughput—10 bits per second—compared to the enormous input processed by our sensory systems. It offers an information-theoretic lens on why evolution may have shaped the brain this way. But the article doesn’t really stick to the science. Instead, it simplifies the results into clickbait, like comparing the brain to an old dial-up modem. That kind of analogy is misleading because it ignores the paper’s nuanced explanations of how the brain prioritizes and processes information.

    For example, the article claims that Neuralink’s brain-machine interfaces will always hit a biological bottleneck at 10 bits per second. But the paper doesn’t actually say that. It points out that Neuralink-type devices need to summarize inputs into actionable cues, not because the brain is inherently “too slow,” but because raw data streams would overwhelm it. That’s a subtle but crucial difference.

    If you want a better explanation of how the brain processes inputs based on cognitive context, I recommend Daniel Kahneman’s "Thinking, Fast and Slow." His work offers a grounded and accessible way to understand these dynamics—far more so than this article’s distorted interpretations of the science.

    It’s fine to write a provocative piece, but when the author misrepresents the science, it does a disservice to readers who want to learn something new.

  • The counting already discounts I/O processing. I guess they also pick some single point where the processing seems slow, and then they assume that the brain is a sequential machine where anything that is processed in parallel is insignificant.

    Sort of like running linpack for a week and then counting only the output at the very end as "bits."

  • Have you seen the posts here? Some are slower.
  • A human can immediately distinguish between the letter "A" and the scent of pineapple, and then pop off to drive a car at 75 mph on a moderately busy highway while listening to Benedict Cumberbatch read "The Order of Time."

    It might well be that we can only cover about 10 meaningful (to ourselves) thoughts per second, and if you want to define a "thought" as a "bit" for conversational purposes... whatever. Have fun with semantics I guess. But you can't design a computational encoding system that covers both

  • Neurons ALSO process information. The connections between nerve bodies are not magically the only place where anything useful happens. They're just easier for us to notice.

    Imagine each neuron is a person. All we can do is watch them talk. How would you try to measure the processing of the system? You can't... not by only measuring how often a person in the system talks (on average, 10 times a second).

    • Attention is not where or how your brain processes things, at least not solely. That's just where we notice it working, where our consciousness is part of the system/process.

      How does attention differ from memory? Why would a larger/better memory need more attention? That's essentially what the paper's authors said, right? That we can't possibly pay attention to so much, so 10 bits/s is the magical function limit.

      Not to mention... anything that impacted us greatly, like thinking has, impacted our a

  • Why do we only have 2 hands? It would be nice to have more of them.
