Why Do We Live at 10bits/s? (betanews.com) 97
BrianFagioli shares a report from BetaNews: It might sound unbelievable, but the human brain processes information at just 10 bits per second! Yes, folks, that's slower than the internet speeds many of us endured during the early days of dial-up. While our senses take in billions of bits of data every second, our brain intelligently sifts through the chaos, letting through only what's important.
This is no accident. Researchers Jieyu Zheng and Markus Meister explain in their study, The Unbearable Slowness of Being, that the brain is built this way for survival. Instead of getting overwhelmed by a flood of details, the brain has a system to focus on what matters most. It ensures we act quickly and effectively without being bogged down by unnecessary information. [...] The slow pace of the human brain might seem like a drawback in today's fast-paced world, but it has been sufficient for survival throughout human history. Evolution prioritized efficiency over speed, enabling the brain to focus on critical tasks without wasting energy. While machines continue to outpace us in raw processing power, the human brain remains unmatched in its ability to prioritize and adapt. The study raises an important question: Why does a brain capable of such complexity operate at such a slow rate?
10bit, that is Comcast's new speed (Score:2)
10bit, that is Comcast's new speed when you hit your cap.
Re: (Score:1, Insightful)
It's also the processing speed of your average Democrat voter.
Re: (Score:1, Insightful)
It's also the processing speed of your average Democrat voter.
Go fuck yourself.....
Re: (Score:2)
It's also the processing speed of your average Democrat voter.
Sounds like it's the average processing speed of EVERY human. Nice try but FAIL.
Good point. Would upvote if so empowered.
Re: (Score:2)
A dramatically higher percentage of Democrats have master's degrees or PhDs and work in high-paying professions.
And those people don't even know how to use a screwdriver or change a tire.
Re: 10bit, that is Comcast's new speed (Score:2, Troll)
Re: (Score:2)
Oh yes, because the world clearly is run by uneducated retards with high school degrees working at gas stations and coal mines.
Okay, maybe this is unfair (Score:2)
But frankly as soon as I saw the stupid (and extremely forced) title they gave their "study", I pretty much decided I should dismiss it out of hand.
Re: (Score:3)
It's also provably wrong (Score:5, Insightful)
It's incredibly stupid because there's a lot more to our brains than the parts we use for conscious thought.
Even when we consider this, though, we clearly process data faster than 10 bits/second. Consider that the average person reads at 238 words/minute, which is ~4 per second. The average human vocabulary is around 30,000 words for English speakers. Now if we map the very non-binary processing of our brains into the binary world we would need 15 bits to encode a word, and we read 4/second, so even with this highly inappropriate mapping (our brains do not work this way) the average brain processes 60 bits/second while reading - and that's a conscious thought process.
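For what it's worth, here's a quick sanity check of that arithmetic in Python (a sketch using the figures quoted above, and treating every word as equally likely, which, as noted, is not how brains or language actually work):

    import math

    words_per_minute = 238        # average reading speed cited above
    vocabulary_size = 30_000      # rough English vocabulary cited above

    words_per_second = words_per_minute / 60      # ~3.97 words/s
    bits_per_word = math.log2(vocabulary_size)    # ~14.9 bits if every word were equally likely

    print(words_per_second * bits_per_word)       # ~59 bits/s, in line with the ~60 bits/s above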
Re:It's also provably wrong (Score:5, Insightful)
Not to forget the image processing we do.
We literally recognize characters (letters) in various shapes and sizes, in real time, while putting together words, filling in missing gaps, and visualizing from words and from images to words.
Some of us are even capable of looking at something and, in a split second, imagining an entire film scene, fully rendered with special effects worthy of a Hollywood production (visual thinkers); some of us are capable of creating music with complete synthesized sounds developed in-house (in-brain) and of playing 1000 variants of those.
No, the brain is insanely fast; it's capable of seeing the bigger picture of a massive amount of historical and present data.
Is it as accurate as a computer's memory? No! Is it faster than anything we know? Absolutely! We're barely capable of emulating an insect as it is.
Re: (Score:3)
Not to forget the image processing we do.
We literally recognize characters (letters) in various shapes and sizes, in real time, while putting together words, filling in missing gaps, and visualizing from words and from images to words.
Some of us are even capable of looking at something and, in a split second, imagining an entire film scene, fully rendered with special effects worthy of a Hollywood production (visual thinkers); some of us are capable of creating music with complete synthesized sounds developed in-house (in-brain) and of playing 1000 variants of those.
No, the brain is insanely fast; it's capable of seeing the bigger picture of a massive amount of historical and present data. Is it as accurate as a computer's memory? No! Is it faster than anything we know? Absolutely! We're barely capable of emulating an insect as it is.
A personal example. I talk relatively slowly. But behind that, I am mentally parsing every word multiple times. That lends itself to greater verbal accuracy. On the flip side, until people are used to me, they tend to interrupt. However, I'm doing a whole lot more than 10 bits per second of processing.
And yes, trying to assign some bits/sec to a machine like the human brain is going to fail. Our noggins can switch between linear and non-linear processing, and how to assign a meaningful number to that is
Re: (Score:2)
"the average human vocabulary is around 30,000 words for English speakers. Now if we map the very non-binary processing of our brains into the binary world we would need 15 bits to encode a word"
You would need 15 bits to encode each word *if the words were chosen at random*. But they're not: they're part of a rational statement in a natural language, and as the notable limitations of spicy autocomplete show, given a few words of a sentence there are only a few choices for what the next word could be without being nonsense.
Understanding the paper (Score:5, Interesting)
But frankly as soon as I saw the stupid (and extremely forced) title they gave their "study", I pretty much decided I should dismiss it out of hand.
To achieve understanding, one only needs to read and understand 3 papers.
Firstly, there's A Mathematical Theory of Communication [harvard.edu], by Claude Shannon. This paper is an easy read, you only need to read the first third (-ish?), and you can skip over the mathematics and get the gist. Just assume the mathematics are correct and believe his conclusions and you're good. The first section of that paper defines information as a measure of data.
Shannon's Prediction and Entropy of Printed English [upenn.edu] is the next paper to read. In it, he uses the definitions and methods from the 1st paper to derive the information we get from printed English. The "channel capacity" of written words is about 1 bit per character of text, the paper goes through how he arrives at this figure, and it's an interesting read.
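To get a feel for where that ~1 bit per character figure sits, here's a toy calculation (my own sketch, not Shannon's method; his estimate comes from human prediction experiments): the zero-order entropy computed from approximate published English letter frequencies is about 4.2 bits per letter, and exploiting context is what pulls it down toward 1.

    import math

    # Approximate English letter frequencies in percent (standard published values, rounded).
    freq = {
        'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7, 's': 6.3,
        'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8, 'u': 2.8, 'm': 2.4,
        'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0, 'p': 1.9, 'b': 1.5, 'v': 1.0,
        'k': 0.8, 'j': 0.15, 'x': 0.15, 'q': 0.1, 'z': 0.07,
    }
    total = sum(freq.values())
    h0 = -sum((v / total) * math.log2(v / total) for v in freq.values())
    print(h0)   # ~4.2 bits/letter ignoring all context; Shannon's prediction
                # experiments bring the effective rate down to ~1 bit/letter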
The third paper to read is The Magical Number Seven, Plus or Minus Two [utexas.edu], where George Miller describes the channel capacity of human sensory inputs, from experiments. The experiments are described, and listed in the reference section if you want more info.
So for human hearing, you can play notes on a piano and ask the human to choose among 3 pitch possibilities (low, medium, high pitch), and the human will always get the right answer. Similarly with four possibilities, five possibilities, all the way up to about seven possibilities. Once you hit seven possibilities, the human begins to make errors about which pitch was presented.
The interesting bit about hearing is that it doesn't matter how you format the choices: you can have 4 low pitches, which the human will recognize perfectly, 4 high pitches, which the human will also recognize perfectly, but put them together (8 choices total) and the human begins to make mistakes.
The channel capacity of human hearing is about 3 bits of information; it doesn't matter if you segment the choices, you can't exceed that amount of information.
The 3rd paper (Magical Number 7) goes through several human perceptual modes, such as identifying the segment of an X axis where an "X" occurs - more than 7 options and the human begins to make mistakes.
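For anyone who wants to see what "channel capacity" means concretely in those experiments: it's the mutual information between the stimulus presented and the response given, estimated from a confusion matrix. A minimal sketch with an invented confusion matrix (the counts are purely illustrative; Miller's paper has the real data):

    import math

    # Rows = stimulus category, columns = subject's response; invented counts.
    confusion = [
        [18,  2,  0,  0],
        [ 3, 14,  3,  0],
        [ 0,  4, 13,  3],
        [ 0,  0,  2, 18],
    ]

    n = len(confusion)
    total = sum(sum(row) for row in confusion)
    p_stim = [sum(row) / total for row in confusion]
    p_resp = [sum(confusion[i][j] for i in range(n)) / total for j in range(n)]

    mi = 0.0
    for i in range(n):
        for j in range(n):
            p = confusion[i][j] / total
            if p > 0:
                mi += p * math.log2(p / (p_stim[i] * p_resp[j]))

    print(mi)   # bits transmitted per judgment; for pitch, Miller found this
                # plateaus around 2.5 bits no matter how many alternatives you add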
The paper mentioned in the OP is simply restating the various background and conclusions from the 3 papers cited.
Apropos of nothing, anyone doing AI research - not engineering, such as learning the API and hooking up an AI to perform various tasks, but actual research on making AIs that work better or more like the human brain - can use the information-theoretic description of our brain mechanisms to guide their thinking and direction of research. Knowing about information theory can eliminate a lot of proposed models and mechanisms you want to try.
I like to think of it as an analogue to modern physics using abstract algebra to inform their theories on quantum gravity. You know that any correct solution needs to transform in a certain way (Lorentz invariance, for example), and this eliminates all solutions that do not have a specific form.
I think AI will eventually be that way: correct solutions will learn quickly from small amounts of data (10 bits per second), or have a realistic explanation for why they deviate from this rule.
But until then, expect to spend $150 million to train ever larger models on Sagans and Sagans of data.
(Oh, and if you're interested, a fourth paper "Most Efficient Chunk Sizes" by David Dirlam can be quite an eye opener. The implications of this for AI are...)
Re:Understanding the paper (Score:5, Insightful)
I believe the basic premise is incorrect.
I can read at about 400wpm, and can memorize at about half that for different bursts; many can do the same-- I'm not unique.
I can do complex math in my head, sadly from habit. This is also not unique.
I'm not special. Processing at 10bit/sec is plainly bullshit. The denominator of a bit is even worse. The mind processes many things concurrently. Reaction time (motor skill) is only one measure of many.
Consider the second violinist in a typical orchestra, who's listening, playing, reading a chart, wary of the maestro, and has an itch. The parallelisms processed don't measure in any way to a descriptor of "10 bits/sec". It's plainly inane and facile to consider the brain in this way.
Mapping this to AI training, the brain does calculus, the direct path to an answer in a way similar to AI, but on a highly parallel scale. The brain doesn't gain inference in the same way, rather, core adaptation to environment (which AI is not blind to, yet references inputs from that same environment). AI, like corporations, has no soul, whatever soul is. We measure soul by understanding compassion, shame, even guilt. There are those that walk among us with no compassion, shame, guilt, or sense of love; we call them sociopaths, narcissists, and psychopaths. Yet they function among us (until they don't), just like AI.
Are you an AI? (Score:3, Insightful)
I believe the basic premise is incorrect. I can read at about 400wpm, and can memorize at about half that for different bursts; many can do the same-- I'm not unique.
Average word length is 5 letters, one bit per letter, 5*400 = 2000, then divide by 60 seconds in a minute => 33 bits/sec, and you mention that you need to go half that rate to get a memorization level, which is about 16 bits/sec.
And you think that's sufficiently distant from 10 bits/sec to invalidate the premise?
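Spelling that arithmetic out (a quick sketch using the parent's 400 wpm figure, a rule-of-thumb 5 letters per word, and Shannon's ~1 bit per letter of contextual English):

    wpm = 400
    letters_per_word = 5      # common rule-of-thumb average
    bits_per_letter = 1       # Shannon's estimate for English text in context

    reading = wpm * letters_per_word * bits_per_letter / 60
    print(reading, reading / 2)   # ~33 bits/s reading, ~17 bits/s at the "memorization" rate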
Also, you're not memorizing. Unless you can recite back what you read word-for-word, which I guarantee would be almost unique among humans.
I can do complex math in my head, sadly from habit. This is also not unique.
So to draw an analogy, you're saying that because the com
Ahem (Score:2)
1 bit per letter? I know there is some redundancy but that seems a little optimistic.
Re: Ahem (Score:2)
You haven't defined what a bit is. On a computer it's a one or zero because of technological limitations. What is a bit in the human mind? Can that simply map to an instinct, where one bit equals a slithering snake out of the corner of one's eye? Then we jump back out of the way, for instance.
Also, doing calculus in one's head, for instance, simply comes from having largely memorized many things. It appears that while the human mind only processes at a fairly "low bit rate
Re: (Score:3)
if one "bit" is redefined to mean more than 2 states, then it is ridiculous to use the word bit.
The word was chosen to try to establish a metric of comparison with more obviously quantized digital computing.
10 bit/s implies only being able to listen at a pace of one word per second if you had a vocabulary of about a thousand words, ignoring the non-word stimuli you are constantly also processing. However, that would be a terribly slow pace, and even a poor vocabulary is 20 times larger. Also, people can r
Re: (Score:2)
The term isn't being "redefined"; it simply has two different meanings in different contexts. Once you understand that, you realize the entire purpose of the post is a troll: it is trying to suggest that the brain cannot process information except at an absurdly low rate. It accomplishes that troll by relying on a misunderstanding of the measure of information being used, and then it is defended here by Okian Warrior using a bunch of bad faith tactics that ignore the real problem and offer more confusion for
Re: Ahem (Score:2)
Providing a novel definition, especially one contrary to its etymology, in a new context is literally redefinition
Re: (Score:2)
This might be a Zeno's Arrow situation where the logic and data are absolutely correct yet the answer is nonsense anyway.
Re: (Score:2)
Re: (Score:2)
"And you think that's sufficiently distant from 10 bits/sec to invalidate the premise?"
Yes, once you remove all your myriad errors in calculation. See above.
"You should write this up as a series of papers. Disproving the studies mentioned in the "Magical Number 7" paper cited above would be highly valuable to science."
So should you. I would bet your paper gets more laughs than his.
"Are you an AI? Who's talking about reaction time?"
At least there's a concept of time there, unlike your claim of "channel capa
Re: (Score:3)
Um... what's the denominator of a bit again?
I chuckled.
Saying "the brain does calculus? Citation needed.
Oh, I gotta agree with the "I can do complex math in my head" humblebraggart's assertion that brains do calculus. Catching a ball, judging when another vehicle in traffic is likely to hit you or miss you on its current trajectory (and current acceleration/deceleration trend, of course), etc, etc -- there are countless examples I can think of that demonstrate the human brain intuitively doing at least second-order derivatives of motion/translation. I mean, our brains are clearly not implementing n
Re: (Score:2)
I broadly agree with your sentiment that the comparison is flawed, but I will point out that your reading speed is highly dependent on the context.
conscious lesson dominant recording oppose allocation highway grandfather ministry remunerate drama normal interrupt as earthwax censorship position article decade dead shareholder workshop camera rule first
The first sentence was something you could probably read at 400 wpm easily, at least I could read a sentence like that. The sequence of the same number, but rand
Re: (Score:2)
Extrapolate what you wrote randomly to parallelism. The words you wrote connect by their contrast and can be rapidly related to each other by contrast. So is that a bit?
Shortcuts are like stored procedures; they're "CPU strokes" often running in parallel. The relationships among random inputs are data points themselves. Chaos constitutes aggregated sets of data points. Making sense (whatever that is) of chaos is normal, as almost each moment is essentially smoothed chaos. 10"bits"/sec?
The descriptor and den
Re: (Score:2)
To achieve understanding, one only needs to read and understand 3 papers.
The paper mentioned in the OP is simply restating the various background and conclusions from the 3 papers cited.
Only having access to an abstract that says very little, it seems the estimates of the information content of inputs via the "outer" brain are separate from the 10 bit/s "inner" brain.
The channel capacity of human hearing is about 3 bits of information; it doesn't matter if you segment the choices, you can't exceed that amount of information.
All of the codecs for audio and video processing worth using have baked-in perceptual models that take advantage of the illusory nature of human perception to reduce the digital bandwidth required to reproduce the original analog signal without people noticing the difference. If what you say is true, where is my transparent 3-bit audio codec?
I think AI will eventually be that way: correct solutions will learn quickly from small amounts of data (10 bits per second), or have a realistic explanation for why they deviate from this rule.
Is t
Re: (Score:2)
All of the codecs for audio and video processing worth using have baked-in perceptual models that take advantage of the illusory nature of human perception to reduce the digital bandwidth required to reproduce the original analog signal without people noticing the difference. If what you say is true, where is my transparent 3-bit audio codec?
I do think Samuel Johnson said the same sort of thing, more eloquently: "I refute it thus" *punt* Heheh.
Re: (Score:2)
As a concrete example, it's difficult to
Re: (Score:2)
Re: (Score:2)
The issue being that the attempt to correlate to computation standards doesn't quite map well to the works.
A mapping of bits to printed characters is a bit weird, since we'd be thinking about words rather than letters in terms of the "information" conveyed in natural use. However, a lot of the principles extend to the written word: our ability to process a stream of words is contingent on the previous words effectively defining the likely scope of the next word. If words break those promises, then we
Re: (Score:2)
Re:Understanding the paper (Score:4)
"Firstly, there's A Mathematical Theory of Communication [harvard.edu], by Claude Shannon. ... Shannon's Prediction and Entropy of Printed English [upenn.edu] ... uses the definitions and methods from the 1st paper to derive the information we get from printed English. The "channel capacity" of written words is about 1 bit per character of text, the paper goes through how he arrives at this figure, and it's an interesting read."
The first paper defines a "bit" exactly as we expect: one boolean of information. It also defines "channel capacity" as "units of information per unit of time", and a channel as something that transmits information.
The second paper discusses information conveyed in written language. There are a bunch of issues with your comments though:
1) 1 bit per character of text is not a "channel capacity"; there is no time in that metric.
2) The information you are claiming as defining channel capacity is the result after processing; the information received is far greater, and that would also be processed by the brain.
3) The experiments involved only 26/27 symbols; written language uses multiples of that. The data is preprocessed to reduce the information further.
4) The conclusion was indeed 1 bit/letter with "redundancy of roughly 75%".
Now if we take your example provided below of 5 letters per word and 400 wpm, the bit rate would in fact be 6*400 per minute because you ignored the separators, so the reading rate would be 40 bits/sec. But that's post-processed information; the input rate would be 160 bits/sec according to the article you cite and, in fact, far greater if a simplified character set were not used. Also, the "memorizing" points you make are merely moving the goalposts. Finally, what you are reading matters immensely; text processing rate is largely irrelevant. Reading a pre-K picture book is gonna be faster than Nicomachean Ethics.
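Written out (same assumptions as above: 400 wpm, 5 letters plus a separator per word, ~1 bit per character after context, and Shannon's ~75% redundancy figure):

    wpm = 400
    chars_per_word = 6                              # 5 letters plus the space that was ignored

    post_processed = wpm * chars_per_word / 60      # ~40 bits/s at 1 bit per character
    raw_input = post_processed / (1 - 0.75)         # undo ~75% redundancy -> ~160 bits/s
    print(post_processed, raw_input)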
So by your own citations, 10 bits/sec is off by an order of magnitude (perhaps even two orders in some written languages), and that ignores the OCR processing that's done before any of this is even measured. OCR would dwarf all the other processing BTW. The articles you cited are relevant, but they don't paint the picture you claim they do.
Didn't bother looking at the third citation; there were already so many errors starting with the first one, and this post is long enough. I expect it to be interesting, but I expect you don't really understand that one either. Appeals to authority you don't understand are not a good look. Thanks for the citations, but always make sure you understand them yourself.
Re: (Score:2)
Seems like they're using a very narrow definition of "processing", since we already have a pretty good idea how much data is required to reproduce convincing visual and auditory inputs. After doing quite a bit of tweaking of my own video encoding preferences for my Kodi movie library, I'd say anything less than about 1.5 gigabytes per hour of 1080p video starts to look noticeably bad. That's about 3.33 Mbps with data compression, so even after doing all the various tricks to eliminate redundancies, it's
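(The bitrate conversion itself checks out, for what it's worth:

    gb_per_hour = 1.5
    mbps = gb_per_hour * 1e9 * 8 / 3600 / 1e6
    print(mbps)   # ~3.33 Mbps of already-compressed 1080p video
)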
Re: (Score:2)
hour of 1080p video starts to look noticeably bad.
Your fovea only picks up a tiny portion of that at a time. If you could selectively downscale everything outside of that region, you probably wouldn't notice. Even within that region, your visual cortex basically tunes out details in objects that you aren't consciously paying attention to. Classic example:
https://www.youtube.com/watch?... [youtube.com]
Unless you're actively seeing through that test, you're not even conscious of the crap production values of the gorilla suit, let alone the picture quality of it. Chances a
Re: (Score:2)
hour of 1080p video starts to look noticeably bad.
Your fovea only picks up a tiny portion of that at a time. If you could selectively downscale everything outside of that region, you probably wouldn't notice. Even within that region, your visual cortex basically tunes out details in objects that you aren't consciously paying attention to. Classic example:
https://www.youtube.com/watch?... [youtube.com]
Unless you're actively seeing through that test, you're not even conscious of the crap production values of the gorilla suit, let alone the picture quality of it. Chances are you can't even consciously make yourself see the veins in your retina without shining a bright light nearby, even though they're always "seen" by your visual cortex, regardless of lighting.
The human eye has a single-element lens, and it is a pretty crappy lens as lenses go. Only a very small area of "sharp" vision.
Our brain does an incredible job of stitching together the info it gets. What seems to be largely in focus is the result of moving our eyes and letting our brains process and commit to short-term memory what we are looking at. If I don't move my eye, it focuses only on one word.
Why? (Score:2)
The study raises an important question: Why does a brain capable of such complexity operate at such a slow rate?
Because otherwise we'd be overwhelmed and either go insane or be paralyzed. If our brain let us see, hear, smell, taste, and feel everything at the same time it would be sensory overload. Which happens to be a real condition [medicalnewstoday.com]. It has to limit the data stream to keep the body functioning. It's a self-preservation mechanism. And by "It has to", I don't mean it's thinking about this process, only th
Re: (Score:2)
Doesn't the brain know that CISC was destroyed by RISC?
Re: (Score:2)
No. Aren't you begging the question? Consciousness (the type of consciousness that humans have) can be aware of a certain amount of information. More is overwhelming, so either the species as a whole or the individual growing brain developed so as to not present more than that amount of information for long periods of time. But why does this consciousness hold that particular amount of information?
It's fast enough (Score:2)
Re: It's fast enough (Score:3)
Re: (Score:1)
1 bit per second for driving a car?
Re: (Score:2)
Define "bit" (Score:1)
From the original article at https://www.sciencedirect.com/... [sciencedirect.com]:
"The game “Twenty Questions” has been popular for centuries1 as a thinking challenge. If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about million possible items in the few seconds allotted. Therefore, the speed of thinking—with no constraints imposed—corresponds to 20 bits of information over a
Better: define "process" (Score:1)
I saw the headline and read part of TFS/TFA and then stopped. It took just a few (say 10) seconds and yet I "processed" much more than 100 bits of information in that time. Enough information to know that I didn't want to bother reading further.
It's a slow holiday on slashdot.
Re: (Score:1)
"Bit" has a clear definition: https://en.wikipedia.org/wiki/... [wikipedia.org]
What they seem to be talking about are "chunks", which is something very different: https://en.wikipedia.org/wiki/... [wikipedia.org]
When the authors do not even get _basic_ definitions right, the paper is probably not worth reading.
Re: (Score:2)
What they are talking about are "output bits", not "input bits".
Let's say you posed a question that requires a yes/no answer. And let's say it takes a system one second to produce that answer. The processing rate of that system is NOT 1 bit/sec, it could be billions of bits/sec. It is unknown how complex the processing would need to be. That's the lie here.
Re: (Score:2)
It is not 10 "output bits" either. Clearly.
Re: (Score:2)
Where do you get that idea? I clicked through to the article and the (mostly paywalled) paper, and the summary says:
This article is about the neural conundrum behind the slowness of human behavior. The information throughput of a human being is about 10 bits/s. In comparison, our sensory systems gather data at ~10^9 bits/s.
The fact that they are analyzing it in the same space as "10^9 units/s" implies they are not talking about chunks.
I'm too slow to understand (Score:2)
What does the following mean from the abstract? It sounds like a wacky non-sequitur.
"If the guesser wins routinely, this suggests that the thinker can access about million possible items in the few seconds allotted. Therefore, the speed of thinking - with no constraints imposed - corresponds to 20 bits of information over a few seconds: a rate of 10 bits/s or less."
By 2^20 I assume they are referring to a search space of a million possible items if each item were addressable via a 20 bit unique identifier.
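The arithmetic behind that reading, as I understand it (the exact "few seconds" value below is my assumption, not the paper's):

    questions = 20
    print(2 ** questions)                 # 1,048,576 -- "about a million" distinguishable items

    seconds_allotted = 2                  # "a few seconds"
    print(questions / seconds_allotted)   # 10 bits/s, the paper's headline number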
Re: (Score:2)
Re: (Score:2)
It seems to conflate processing with quantity of information. Yet, the terminology suggests, the brain can identify the 'address' of the required record. This, in fact, goes to the heart of modern knowledge about the brain. We know it does very little thinking; there seem to be millions of heuristics for the brain to find an answer: it doesn't have to look at a million records until it finds a match. That might be why stupidity is so common. A stupid person either doesn't have the brain 'power' to comp
Imagine if.. (Score:2)
Total BS (Score:1)
The "article" is total BS. This didn't deserve a post.
The Doors of Perception (Score:3)
The results of this particular study seem to agree with much of Huxley's old argument.
Dumb question (Score:2)
Why do humans run so slow? Why are we so weak? Why do we have no fur, no claws, no fangs?
Same reason we supposedly "only process 10 bits/s". Because we don't need to do more.
Decent earnings (Score:2)
How is it measured? (Score:2)
Maybe some top-level, executive function of our brain gets fed information at 10bps, but we offload a lot of processing to other parts of our brain and nervous system to simplify things down.
For example, the bandwidth of the optic nerve is estimated to be about 10Mb/s [uh.edu] per eye. But clearly we don't "perceive" at 10Mb/s; different parts of our brain process the info and send us much lower-bandwidth info like "That's a stop sign" or "That's an apple."
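That gap is basically the paper's whole point; the ratio between sensory input and the claimed behavioral throughput is about six orders of magnitude:

    optic_nerve_bps = 10e6 * 2     # ~10 Mb/s per eye, two eyes (estimate cited above)
    behavioral_bps = 10            # the paper's claimed throughput
    print(optic_nerve_bps / behavioral_bps)   # ~2,000,000-fold reduction from input to "output"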
Re: (Score:2)
LOL, that's what the paper said. I guess I should have read it before commenting.
Click-Bait response. (Score:2)
The human body has more than just raw inputs to our brain; muscle memory is a thing. A car analogy: modern cars have self-contained subsystems scattered all over the vehicle, and they only send basic info to the central control unit.
We are also highly parallel and our database lookup for memory is near instant, um well, until we get older and now that memory is so full that the lookup takes a bit of time to
Oh bullshit (Score:1)
If you understand something quickly but it takes you hours or days to explain it to me, or vice versa, clearly one of us is going faster than the other.
Bits != bits (Score:2)
A "brain bit" is not the same as a "computer bit". This comparison is stupid. When a human identifies a car it takes maybe 3 bits, "car", "red", model. A computer would use billion of bits for the same task.
Re: (Score:2)
So they are actually talking about chunks ( https://en.wikipedia.org/wiki/... [wikipedia.org] ). That makes a lot more sense. Of course, a chunk generally has much, much higher entropy than 1 bit. Seems somebody wanted to talk about information processing, but did not even know or understand the basic definitions. Color me very unimpressed.
Seems high (Score:2)
It takes me much more than a second to do 8b/10b encoding in my head. Makes for very slow Infiniband transactions when I have to toggle it in by hand.
Really? That seems high (Score:2)
That is some fine bullshit (Score:2)
No idea how they ended up with this "estimate", but there is no way 10 bit/s is enough to encode that data stream. Hell, even a slow reader can read faster than that. Complete nonsense.
Psychedelics (Score:2)
Try high-dose krill oil sometime.
You'll sometimes see your information processing if you wake up / come on line gently.
I'd estimate closer to 60 symbols per second, heavy on the rotational semantics and perhaps ten thousand rudimentary symbols, maybe 20 bits each all-in with degrees of freedom, or about 1200 bps of entropy encoding.
That's the abstract; a different system than what someone with eidetic memory stores.
Stupid paper and flawed premise (Score:2)
Reading their justification of the 10 bits per second metric makes me realize that the entire premise of the paper is totally wrong.
Their rationale for the 10 bits per second metric is based on a person's ability to play the game "20 Questions". Seriously?
These people are complete idiots and this paper is some F-level grade school shit.
Provably bullshit (Score:2)
The word bullshit itself is more than 10 bits ... did it take you one second to process it when I said provably bullshit? Unless your vocabulary is 1024 words (it isn't), understanding an English word (in under 1 second) conveys more than 10 bits of information. So if you understood what I meant by the claim provably bullshit (not the proof, just what I meant by those words) ... then your brain processed a lot more than 10 bits/s. Same thing with object identification. If you see a snake and get scared wit
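The vocabulary arithmetic behind that claim (log2 of the vocabulary size gives bits per word, assuming every word is equally likely):

    import math

    print(math.log2(1024))     # 10.0 bits/word: the vocabulary a 10 bit/s budget allows at one word per second
    print(math.log2(30000))    # ~14.9 bits/word for a more realistic adult vocabulary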
Heisenberg.... (Score:2)
Many years ago I attended a course about systems theory, and they showed that if you want to save energy, you have to lower the processing speed of the system. One of the real-life examples discussed was exactly the human brain. To keep power consumption low, its processing speed isn't that fast. The big difference is that the brain is a massively parallel system, so slow processing speed isn't that important.
Re: (Score:2)
Re: (Score:2)
Really, slashdot? (Score:2)
Fuck my 15 moderator points, you should be ashamed that this was allowed at all. What a load of shite.
I'm really disappointed.
Didn't Bill Gates (Score:2)
10bits/second? Nonsense (Score:1)
Variability (Score:2)
I can look at a diagram plus text and, provided I choose to commit it to memory, I can bring it up in my memory and "read" from the mind's-eye image (very handy in school exams, to the extent it felt like cheating). Ditto phone numbers and passwords. I'd have a hard time specifying the information flow rate for that, and it degr
Clickbait (Score:2)
Make up a ridiculous number so people will visit the website.
Multi-core and multi-threaded (Score:1)
We don't live at 10 bits/second. (Score:3)
I’m skeptical. Especially so, since the author of the article is the one who submitted it to Slashdot. That always raises a red flag for me. After digging into the actual paper referenced in the article, "The Unbearable Slowness of Being" by Zheng and Meister (preprint [arxiv.org] here), I found that the article's author is, and I'm being diplomatic, taking some truly serious liberties with the paper’s findings.
The paper explores the paradox of the brain’s slow behavioral throughput—10 bits per second—compared to the enormous input processed by our sensory systems. It offers an information-theoretic lens on why evolution may have shaped the brain this way. But the article doesn’t really stick to the science. Instead, it simplifies the results into clickbait, like comparing the brain to an old dial-up modem. That kind of analogy is misleading because it ignores the paper’s nuanced explanations of how the brain prioritizes and processes information.
For example, the article claims that Neuralink’s brain-machine interfaces will always hit a biological bottleneck at 10 bits per second. But the paper doesn’t actually say that. It points out that Neuralink-type devices need to summarize inputs into actionable cues, not because the brain is inherently “too slow,” but because raw data streams would overwhelm it. That’s a subtle but crucial difference.
If you want a better explanation of how the brain processes inputs based on cognitive context, I recommend Daniel Kahneman’s "Thinking, Fast and Slow." His work offers a grounded and accessible way to understand these dynamics—far more so than this article’s distorted interpretations of the science.
It’s fine to write a provocative piece, but when the author misrepresents the science, it does a disservice to readers who want to learn something new.
Depends on how you count bits (Score:2)
The counting already discounts I/O processing. I guess they also pick some single point where the processing seems slow, and then they assume that the brain is a sequential machine where anything that is processed in parallel is insignificant.
Sort of like running linpack for a week and then counting only the output at the very end as "bits."
Reality Bites (Score:1)
Abusing the term bits (Score:2)
A human can immediately distinguish between the letter "A" and the scent of pineapple, and then pop off to drive a car at 75 mph on a moderately busy highway while listening to Benedict Cumberbatch read "The Order of Time."
It might well be that we can only cover about 10 meaningful (to ourselves) thoughts per second, and if you want to define a "thought" as a "bit" for conversational purposes... whatever. Have fun with semantics I guess. But you can't design a computational encoding system that covers both
Poor understanding of what happens. (Score:2)
Neurons ALSO process information. The connections between nerve cell bodies are not magically the only place where anything useful happens. It's just easier for us to notice.
Imagine each neuron is a person. All we can do is see them talk. How would you try to measure the processing of the system? You can't... Not by only measuring how often a person in the system talks (on average, 10 times a second).
And the paper is poor too. (Score:2)
Attention is not where or how your brain processes things. At least not solely. That's just where we notice it is working. Where our consciousness is part of the system/process.
How does attention differ from memory? Why would a larger/better memory need more attention? That's essentially what the paper's authors said, right? That we can't possibly pay attention to so much, so 10 bits/s is the magical function limit.
Not to mention... anything that impacted us greatly, like thinking has, impacted our a
Why do we live with only 2 hands? (Score:1)