
Can You Raed Tihs? 997

An aoynmnuos raeedr sumbtis: "An interesting tidbit from Bisso's blog site: Scrambled words are legible as long as first and last letters are in place. Word of mouth has spread to other blogs, and articles as well. From the languagehat site: 'Aoccdrnig to a rscheearch at an Elingsh uinervtisy, it deosn't mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht frist and lsat ltteer is at the rghit pclae. The rset can be a toatl mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae we do not raed ervey lteter by it slef but the wrod as a wlohe. ceehiro.' Jamie Zawinski has also written a perl script to convert normal text into text where letters excluding the first and last are scrambled."
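jwz's Perl script isn't reproduced here, but the transformation it describes is easy to sketch. A minimal Python equivalent (not jwz's actual code; the whitespace-only tokenization and the function names are my own, and punctuation handling is ignored):

```python
import random

def scramble_word(word):
    """Shuffle a word's interior letters, keeping the first and last in place."""
    if len(word) <= 3:
        return word  # too short to have a shuffleable interior
    middle = list(word[1:-1])
    random.shuffle(middle)
    return word[0] + "".join(middle) + word[-1]

def scramble_text(text):
    """Scramble every whitespace-separated token."""
    return " ".join(scramble_word(w) for w in text.split())

print(scramble_text("According to a researcher at an English university"))
```

Output varies per run; only words of four or more letters ever change.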
This discussion has been archived. No new comments can be posted.

  • Here you go (Score:5, Informative)

    by JM Apocalypse ( 630055 ) * on Monday September 15, 2003 @07:13PM (#6969205)
    No need to open the terminal ... Jeff comes to the rescue! []
  • by Anonymous Coward on Monday September 15, 2003 @07:18PM (#6969280)
Actually, does this work well with letter pairs like "th ch wh sh qu"? I forget what those are called.

    Digraphs? []
  • Re:Hmmm (Score:2, Informative)

The problem comes from words that have the same ending letters but different middle letters, like "car" and "cur", or (more confusingly) "from", "form", "firm", "film", "farm", etc. Context would give us some cues, but it would definitely require more thought to process.
  • by adamsan ( 606899 ) on Monday September 15, 2003 @07:46PM (#6969594)
    "They're called dipthongs (sic)"

No they ain't; diphthongs are pairs of vowels that merge together. Pairs of consonants are called, err... consonant pairs.
  • by gdchinacat ( 186298 ) on Monday September 15, 2003 @07:53PM (#6969656)
    I think it is actually cheerio.

    WordNet (r) 1.7 [wn]

    n : a farewell remark; "they said their good-byes" [syn: adieu,
    adios, arrivederci, auf wiedersehen, au revoir,
    bye, bye-bye, good-by, goodby, good-bye, goodbye,
    good day, sayonara, so long]
  • by edwdig ( 47888 ) on Monday September 15, 2003 @08:01PM (#6969724)
    By randomly scrambling the letters, you're eliminating a lot of the redundancy.

    Huffman compression would be unaffected though, as it works on a per character basis.
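That per-character claim can be checked directly: scrambling within a word only permutes its letters, so the character histogram, which is all a plain Huffman coder is built from, is unchanged. A quick sketch (the sample strings are my own):

```python
from collections import Counter

original  = "the only important thing is that first and last letter"
scrambled = "the olny iprmoatnt tihng is taht frist and lsat lteter"

# Per-word scrambling only permutes characters, so the frequency
# table a Huffman coder would be built from is identical.
print(Counter(original) == Counter(scrambled))  # -> True
```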
  • Compression worse... (Score:5, Informative)

    by douglips ( 513461 ) on Monday September 15, 2003 @08:07PM (#6969776) Homepage Journal
    That's easy. Let's say you have a text file that consists of 14,000 instances of the word "begat". This compresses to a file that simply indicates "repeat 14,000 'begat '".

    Now, after you scrmable it, it's got equal quantities of begat, beagt, baget, baegt, bgeat, and bgaet. It's not so easy to compress any more.

Essentially, you're increasing the entropy of the file by a fair amount. Truly random data is not as easy to compress as English, because English has lots of order. Added disorder or entropy means compression is just not as easy.
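A rough illustration of the parent's "begat" example with zlib (an LZ77-plus-Huffman coder rather than a pure dictionary coder, but the effect is the same; the seed and variable names are my own):

```python
import random
import zlib

random.seed(0)  # make the run repeatable

def scramble(word):
    """Shuffle the interior letters; "begat" has 3! = 6 possible spellings."""
    middle = list(word[1:-1])
    random.shuffle(middle)
    return word[0] + "".join(middle) + word[-1]

plain = ("begat " * 14000).encode()
mixed = "".join(scramble("begat") + " " for _ in range(14000)).encode()

# Same length before compression, very different after:
print(len(plain), len(mixed))      # both 84000 bytes
print(len(zlib.compress(plain)))   # tiny -- the input is one string repeated
print(len(zlib.compress(mixed)))   # much larger -- six spellings in random order
```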
  • by T-Ranger ( 10520 ) <`jeffw' `at' `'> on Monday September 15, 2003 @08:08PM (#6969790) Homepage
Because English words are made up of some common components. 'i' always comes before 'e' in 'ie' pairs, for example. Compression is about rewriting common strings (of bits, not just strings of characters) into shorter strings - uncommon strings may end up being longer post compression. If you're effectively randomizing most of the text, then there won't be any common strings. Or at least fewer than what occurs in natural, ordered prose. And there won't ever be whole words you can compress down.
  • by stienman ( 51024 ) <[moc.scisabu] [ta] [sivada]> on Monday September 15, 2003 @08:10PM (#6969805) Homepage Journal
Yes, and it's quite simple. The script you used scrambles words randomly - again agian aagin aaign aigan aiagn - which all become separate words to the compressor. Instead of changing every occurrence of the word again into a short binary string, it has to treat each variant separately with its own binary string (simplified - compression is more complex, but the basic idea is the same).

In other words, the script adds machine randomness to a rather organized and non-random set of data. Humans can still parse it (meaning that the data is very redundant) but the machine cannot compress this 'more random' data as well.

  • by Demodian ( 658895 ) on Monday September 15, 2003 @08:43PM (#6970057)
    diphthongs [] and triphthongs [] are the vowel-only subsets of digraphs [] and trigraphs [].
  • by InadequateCamel ( 515839 ) on Monday September 15, 2003 @09:10PM (#6970259)
    >Essentially, you're increasing the entropy of the file by a fair amount.

    Pardon me for being picky and off-topic, but this is a little peeve of mine...

    Definition: Entropy
    n 1: (thermodynamics) a measure of the amount of energy in a system that is available for doing work; entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity [ant: ectropy]

    "Disorder" is a terrible way of describing entropy, and to use the word entropy to describe disorder is even worse. Having said that, in computing the word has long since been hijacked to mean disorder (Shannon's formula?), so I must admit that your use is a little more valid than "My bedroom has a high degree of entropy".

    Just my 2 cents! (sorry)
  • Re:Hmmm (Score:2, Informative)

    by ikkyikkyikkypikang ( 214791 ) on Monday September 15, 2003 @09:20PM (#6970346)
    Hree's a cool ltitle scprit taht I use to sned emial to my mboil phnoe: email2sms []
  • This might help (Score:3, Informative)

    by pbox ( 146337 ) on Monday September 15, 2003 @09:23PM (#6970365) Homepage Journal
Because English words are made up of some common components. 'i' always comes before 'e' in 'ie' pairs, for example.

    My neighbor weighed your argument. He used a beige scale, and decided it was probably the heinous act of a foreigner to make such a statement. And you're weird. So rein in yourself, and remove the veil of ignorance, ye feisty cad!

    Thou should forfeit karma, but that is neither here nor there.
  • by HalB ( 127906 ) on Monday September 15, 2003 @10:28PM (#6971090)
    Actually, entropy is the energy NOT available to do work...

    Even though the original poster did misuse entropy, even in the information theory context... From

2. A measure of the disorder or randomness in a closed system.

1 : a measure of the unavailable energy in a closed thermodynamic system that is also usually considered to be a measure of the system's disorder

    Get over it. 8')

  • by Raffaello ( 230287 ) on Monday September 15, 2003 @10:49PM (#6971273)
    No, that would be lisp.
  • by CarlDenny ( 415322 ) on Tuesday September 16, 2003 @02:35AM (#6972699)
The first half dozen occurrences of the definition you quoted also included:
    2: (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information"

If it's a pet peeve of yours, perhaps you should make a study of statistical mechanics and information theory, where the concept and term are more clearly and quantitatively defined. With a slightly deeper understanding of statistical mechanics, you will find that the term is more fundamental than you thought, and that the two are mathematically identical, applied to two separate fields. With this understanding, your objection is similar to saying that length is defined by the distance between two ends of an object, and that talking about the length of a file, or a length of time, is completely wrong.

While the term originated in thermodynamics, it was given a formal definition (even within the realm of physics) by Boltzmann with the development of statistical mechanics. Statistical mechanics allowed Boltzmann to formulate and discuss entropy well in advance of energy or temperature. When they do enter the picture, thermodynamic entropy (dS = dQ/T) is identical to the statistical definition, with temperature defined by 1/T = d(entropy)/d(energy), where those d's are partial derivatives. It's actually a fascinating topic, and a beautiful mathematical insight.

    The description and definition used by Boltzmann for statistical mechanics are exactly the same as those used in information theory:
    Entropy = Sum (-p(state)*ln(p(state)))
    (over all possible states)
    Or, with all states equally likely (the equipartition principle):
    Entropy = ln( # of possible states)

    Which is, of course, why Shannon used the term and the definition.

    Sorry to contradict you, but misunderstandings and misuse of the term entropy are also pet peeves of mine, and this is not one of them. ;)
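The formula quoted above is straightforward to compute over, say, character frequencies. A small sketch (the function name is my own; entropy is in nats, matching the ln in the parent's formula, and p*ln(1/p) is used as an algebraically identical form of -p*ln(p)):

```python
from collections import Counter
from math import log

def shannon_entropy(text):
    """Sum of -p(c) * ln p(c) over the character frequencies of text."""
    counts = Counter(text)
    n = len(text)
    return sum((c / n) * log(n / c) for c in counts.values())

# Equipartition case: k equally likely symbols give ln(k).
print(shannon_entropy("abcd"))  # -> ln(4), about 1.386
print(shannon_entropy("aaaa"))  # -> 0.0; a certain outcome carries no information
```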
  • by Anonymous Coward on Tuesday September 16, 2003 @06:34AM (#6973526)
    Some people have mentioned that they saw this years ago. Actually, it is usually said that Mark Twain originally wrote this!

    (Too lazy for HTML)
by Anonymous Coward on Tuesday September 16, 2003 @05:42PM (#6979823)

    Actually, it looks like there's more to it than ONLY getting the first and last letter. The first two are easily decipherable, but the last is insanity. It's easily the hardest to make out, which is bizarre considering where we're reading it...

"slahsodt" is much easier, while "ssdhalot" is next to impossible for non-anagram-lovers.
  • by Anonymous Coward on Wednesday September 24, 2003 @07:49AM (#7042290)
    There have been various forms of this email doing the rounds - including one that mentioned Cmabrigde Uinervtisy (which is where I work doing research on how the brain processes written and spoken language).

    Since I thought I ought to know about this, I've written a page of notes on the science behind this meme, including a list of the factors that my colleagues and I think might be relevant for reading this kind of transposed text. You can read more here: de /

