Babbage, A Look Back

A reader writes: "System Toolbox just started a new computer history section in an effort to get us geeks in touch with our 'roots.' The current article in the monthly column focuses on Charles Babbage. The editor and author hope to raise awareness of our past so that scenes like this won't continue to take place. A big hill to climb, but worth the effort."
This discussion has been archived. No new comments can be posted.

  • they didn't mention that rod-based mechanical computers are likely to return with nano-tech, with carbon chains as the basic rods
    • What about organic computers ??

      I remember reading long ago about organic molecules being able to "switch" between two polarized states under the influence of an external electric field. This was supposed to be a future for nano registers...
      • What about organic computers ??

        What about growing brain matter??? I remember reading about Japanese researchers in the early 90s who were trying to grow brain tissue that could be used for parallel processing type projects.

      • What about organic computers ?? [As an alternative to "mechanical rod" based nanocomputers]

        I remember reading long ago about organic molecules being able to "switch" between two polarized states under the influence of an external electric field. This was supposed to be a future for nano registers...

        It's a matter of size.

        Electrons are big, light, fuzzy things, and the electric field goes out a long way. When you want to make circuitry REALLY small you don't want to use that for anything but holding your pieces together, because your gates and wiring would have to be very large to avoid crosstalk.

        But atoms are effectively smaller than electrons. The nuclei are MUCH smaller, and they suck their electrons down, making most of the fields cancel out after a very short distance. And they're heavier, so they don't shake and flit around as much.

        You can send a bit, or several bits, by sliding or rotating a rod just as you can by shoving electrons into and out of a wire (even a single-molecule wire), and the signal stays IN the rod except where you WANT it to "come out". At atomic scales moving a rod is nearly as fast as moving an electron (and comparable to moving a hole), so speed isn't a problem.

        As for amplification there's not much that can beat an "interposer". That's a piece of matter that gets between two moving parts and transfers a shove from the one supplying motion "power" to the one carrying a signal away as motion. Expend a little power to put the interposer into position and it will transfer shoves (with no further input) until you pull it away again.
        • Atoms, which by definition include one or more electrons, are effectively smaller than electrons? The whole is less than the sum of the parts? I could use a little more explanation of that.
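The interposer mechanism described in that thread can be captured in a toy truth-table model. This is only a sketch of the idea under the assumption that "amplification" here means a weak positioning motion gating a strong drive stroke; the function names and structure are illustrative, not an actual nanomechanical design:

```python
# Toy model of the "interposer" from the rod-logic discussion above:
# a small positioning motion decides whether a powerful drive stroke
# is transferred to the signal rod. All names are illustrative.

def interposer_gate(power_stroke: bool, interposer_engaged: bool) -> bool:
    """The signal rod moves only when the drive rod strokes AND the
    interposer has been slid into the gap between the two rods."""
    return power_stroke and interposer_engaged

# Little energy is spent positioning the interposer; the drive stroke
# supplies the force that actually moves the signal rod - the sense in
# which the mechanism acts as a mechanical relay/amplifier.
for engaged in (True, False):
    moved = interposer_gate(True, engaged)
    print(f"interposer engaged={engaged} -> signal rod moved={moved}")
```

Logically it is just an AND of the drive stroke and the interposer position, which is why rod-logic designs can build complete gate sets from such blocking/unblocking parts.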
  • ...that Webster's changed the definition of a computer from a person to a machine?
  • "Analytical Society taking on the very way math was done in England."

    Actually they do maths in Britain. ;-)
  • For more on Babbage I suggest reading Gibson and Sterling's excellent novel The Difference Machine (which Babbage invented).
    • The novel is actually titled The Difference Engine.

      And I wouldn't read it for informative purposes (especially the historical sort), but it is a pretty good book.

    • Reading List (Score:2, Informative)

      by luckykaa ( 134517 )
      I'd suggest "The Cogwheel Brain" by Doron Swade (ISBN: 0 316 64847 7) for a very good history of the Difference Engine, as well as an account of the Science Museum (London) building a replica.

      For some nice hacker (i.e. cracker and phreaker) history, I'd suggest Approaching Zero by Bryan Clough and Paul Mungo.

  • I plead ignorance (Score:2, Interesting)

    by Kargan ( 250092 )
    So, um, I guess I should feel dumb for not knowing who 90% of those people were either?

    I mean, I'm not l33t or anything, I'm just someone who knows how to fix computers...and would it help me or affect my everyday life if I knew?
    • That's right, you should feel dumb. I'm not l33t either, but you should know the history of your art for no other reason than to avoid repeating mistakes (and so as not to have to reinvent the wheel).
    • by driftingwalrus ( 203255 ) on Wednesday October 17, 2001 @06:54AM (#2440503) Homepage
      You should feel dumb. This is your TRADE. You should know at least a little about its history. If you don't recognize names like Ken Thompson and Charles Babbage, you are in a sorry state indeed.

      Do you want to know how it helps? It helps you to appreciate where it came from, the work involved in creating these machines and the passion others have had for them. It would help you to understand where YOU fit in the grand scheme of things, and it'll help you to have a little pride in your work. It's all about respect. It's about respecting the genius that made your trade possible, respecting the machine they have built and respecting yourself enough to do the best job you can. As a man who works with computers, you have to live up to the promise of your forebears. No one expects you to be another Babbage or Thompson, but you have a duty to yourself to understand the commitment they had and reflect at least some of it.

      You may think of yourself as just someone who fixes computers, but you aren't. You are a steward of the legacy of those that came before; all of us are. All of us have a duty to maintain the tradition and memory of these men. Without their contributions and endless hours of work and passion for the machine, we wouldn't even have computers.

      So, pick up a book. Read. The history of our trade is a glorious thing, full of great men and brilliant engineering. Only through its study can we hope to go as far as they did.

      • That sums up my feelings exactly. I too "just fix" them when they break...Well to be fair I'm doing more and more sysadmining but I'm still not above changing the toner cartridge for the secretaries. Nonetheless, I'm awed by people like Turing, Zuse and Hopper. You're +5 insightful in my book.
    • by unitron ( 5733 )
      "Dumb" means mute, incapable of speech. The word you seek is stupid.

      You should not feel stupid for not knowing who these people are (or were). You should just consider your education in that area incomplete. Your intellectual curiosity should be troubled by that incompleteness - the same intellectual curiosity that led you to learn "how to fix computers" in the first place. And since it was, in part, the pioneering of these people that made possible the very existence of the computers you found interesting and challenging enough to learn how to fix, I'd say that they're due from you some modicum of respect and admiration.

      • It would be better to claim he is "ignorant", rather than "stupid" - the word "stupid" implies something fundamentally wrong with his mind, while "ignorant" implies a lack of knowledge rather than a lack of intelligence.

        You are right to claim that he should look into and be curious about these people - they are a part of what has led him to where he is today.
  • by mikey_boy ( 125590 ) on Wednesday October 17, 2001 @05:03AM (#2440325)
    I found Computer: A History of the Information Machine, by Martin Campbell-Kelly and William Aspray, gave a good overview of the history of computing ... not too detailed, but enough to let you know what you want to find out more about ...
  • "...[Babbage] became the hit of London's social circle and it was often the mark of a party's success or failure as to whether Babbage had accepted an invitation to attend...."

    Ahh, geek as social success.... good to see that some things never change ;-)


    • I believe that Charles Babbage was one of the people who held the position of Lucasian Professor of Mathematics at Cambridge.

      Perhaps it was that they had one of the leading minds in their midst that excited them...sort of like having Hawking drop in on an episode of Star Trek.

  • *LOL* (Score:1, Redundant)

    by Troed ( 102527 )
    I wonder if this Anonymous Coward realised just now what a fool he made of himself yesterday.
      I wonder if this Anonymous Coward realised just now what a fool he made of himself yesterday.

      No, you just labeled yourself as humor impaired. His Charles Babbage comment was obviously a joke. Note that it was modded +1 funny.

      • Re:*LOL* (Score:1, Troll)

        by Troed ( 102527 )
        Anything ignorant Americans write is considered funny by the rest of the world :) Slashdot is my main laughter of the day.

        Read the whole thread; the Americans in there weren't joking - and that's plain scary.

        (i.e. one of them said that although yes, Europeans have cellular SMS, no one he knows sees the point of having it. I guess the 3/4 of a billion (!) SMS sent _each day_ by the rest of the world just shows that we're .. uh .. lagging behind the US? *lol*)

      • (oh, forgot, it wasn't +1 funny when I linked to it in this thread - I'm quite sure it was my comment about it that caused that moderation just now ;)

  • Graduates (Score:5, Insightful)

    by Gumshoe ( 191490 ) on Wednesday October 17, 2001 @05:07AM (#2440334) Journal
    The article posted on binaryfreedom is both fascinating and disturbing but also, I think, misleading, as it suggests that only the educational misfits are ignorant of computer history. This is emphatically untrue.

    I've recently "graduated" from a University in England and I'm ashamed. I would estimate that 90% of my class are ignorant not only of computer history but also of trans-Windows computing in general. Their goal in life seems to be to make as much money as possible, and the computer industry is the vehicle for that.

    I wish systemtoolbox all the best in their endeavour, but I fear that the only people who will read these articles will be those who are already interested in (and hence familiar with) this material.

    • Re:Graduates (Score:4, Interesting)

      by rm-r ( 115254 ) on Wednesday October 17, 2001 @05:26AM (#2440366) Homepage
      I also just graduated from an English university reading comp. sci. I think you're right on the history aspect; as far as 'trans-Windows' goes, I think that depends on where you go - having said that, in the six years I was at uni (I took an, uh, 'scenic' route through my education ;-) I did see the uni become more and more Windows-centric. The uni I joined was focused on first principles (we had to program in Pascal for a year, and not Turbo Pascal or Delphi or such, just basic Pascal) to give us the tools to equip ourselves for any computing career - the basic learn-to-learn thing, I guess. The uni I left was a lot more into 'tools that employers want to see', i.e. Windows NT/2000, SQL Server, and so on - great for the first couple of years of your career while these tools are still being used, but once they're superseded you're stuck without first principles and the ability to figure out which tool is best for the job, why, and how to use it, IMHO.

      As far as the history goes though, you are supposed to be interested in computing, and are supposed to do a lot of background reading, so I suppose it could be argued that you should have built up a fair amount by yourself...

    • Re:Graduates (Score:4, Interesting)

      by pubjames ( 468013 ) on Wednesday October 17, 2001 @05:58AM (#2440423)
      I also graduated from a university in England, although it was ten years ago now.

      I studied a combined degree of biology and computer science, and so I used to take courses from both the biology and the CS schools.

      What struck me then was what a bunch of dunderheads the computer scientists were. Sure, they knew the finer points of Unix better than I ever will, but if you asked them to write an essay on the importance of computers to society, for instance, they could hardly string two words together - an average biology student could have done a far better job of it. Frankly, most of their essays were embarrassing in their childish views, ignorance, poor grammar and spelling.

      And my point is? Well, at least in my experience, I think that many people who are hardcore computer enthusiasts generally have a far more myopic view of the world than people from other subject areas. They are socially inept and interested in very little else but computers, and even then in very narrow fields of computing rather than the bigger picture. I don't know if it has changed, but when I graduated many big employers complained that computer graduates often lacked the most basic skills. Is it like this in the USA too?
      • Sure, the average CS student might not be particularly bright outside their area of expertise, but in my experience at university (in Australia) the average biol student was even worse. The strong students in both areas had interests beyond their subject areas, though, and I'd back the CS student's understanding of biology well ahead of the average biol student's understanding of CS.

        As far as decent writing skills go, CS students weren't great. Biol students were appalling. I know - I read their lab reports, and it was a struggle.

        • Actually, Computer Science teaching seems to be different in Spain (where I currently live) too. For instance, in the UK Computer Science is predominantly a male thing, in Spain it is much more mixed.

          So I guess in different countries it is different. Perhaps it is only in the UK where CS students tend to be lacking in other skills.
      • by Martin S. ( 98249 )
        What struck me then was what a <snipped>flamebait</snipped>

        What strikes me is your generalisations; more characteristic of a liberal arts student than scientist of any subject.

        Frankly most of their essays were embarrassing in their childish views, ignorance, poor grammar and spelling.

        Taking of which: Your own grammer is not so hot is it. Even this illiterate CS grad has spotted eleven mistakes in your post.
        • What strikes me is your generalisations; more characteristic of a liberal arts student than scientist of any subject.

          Funny! I didn't see your amusing irony the first time I read this. Well done!
        • RE: What strikes me is your generalisations; more characteristic of a liberal arts student

          Was this unintentional? Or a play on "that's a typical generalisation?"

          "Taking of which: Your own grammer is not so hot is it."

          Not as bad as your spelling. :)

          Liberal Arts is overhyped. When I went to uni I did the IBM thing and took Liberal Arts. What a hotbed of empty-headed left wing slogan-mongering and prejudicial doggerel masquerading as "progressive thinking". I actually heard some shaven-headed "womyn" student (and I use the term loosely) explain to me that given that they could put a man on the moon, the reason why they couldn't make a male birth control pill was phallocratic misogyny. When I started explaining the research (as a contributor (financially) to the MRC male contraception research team) she claimed I was trying to "blind her with science", a typical MALE trick. The discussion degenerated further when she actually asserted that the reason there's no cure for AIDS is that the oppressive patriarchy is homophobic and racist. I walked out of this "student"'s lecture with the full permission of the professor.

    • Re:Graduates (Score:3, Insightful)

      by BluesMoon ( 100100 )

      Well, we were taught comp sci in school. We started in the 8th grade (in 1988). The first three months were complete history, starting with the abacus, then the slide rule, Napier's bones, Babbage, Gottfried Leibniz, Lady Ada Lovelace, and the rest. We ended up at the ENIAC, EDSAC and UNIVAC, and then moved on to the binary number system for another two weeks - conversion, addition, subtraction, multiplication, floating point, etc. Finally, after all that, we started programming in GW-BASIC.

      All that's changed now. After I left school, they changed the syllabus. CompSci was changed to Computers, and moved down to the primary section. Students started with Paintbrush.

      Jumping forward many years, in the last year of my Master's, I took part in an intercollegiate computer quiz. The finalists were from the best engineering colleges in Mumbai. They were all stumped on one question - "Who wrote The Art of Computer Programming?" Some thought it was a movie!! Suffice it to say, my team of two won that quiz through the sheer ineptness of the competition.

      These were all good students, from good colleges, studying computer engineering. I'd think that they'd have read Knuth sometime during those four years, but most hadn't even heard of him.

      I now teach several courses, and also give lectures for the ACM. I always make it a point to throw in a bit of history into all my lectures. While talking about grep and sed, I mention how they grew out of ed, and why parens have to be escaped in regexes.

      The problem seems to be that the people who set the course don't care about history, and the students who study only care about getting out, so what's past is lost.
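The paren-escaping history mentioned above is worth a concrete illustration. ed, and the classic grep and sed that grew out of it, use Basic Regular Expressions, where a bare "(" is a literal character and grouping must be written \( \); later ERE-descended engines flipped the convention. A minimal sketch, using Python's re module as a stand-in for an ERE-style engine (the patterns are my own examples, not from the comment):

```python
import re

# In ERE-descended engines (egrep, Python's re, Perl, ...), bare
# parentheses mean grouping, and escaped parentheses mean literals:

grouped = re.search(r"(ab)+", "ababab")            # (...) groups
assert grouped is not None
assert grouped.group(0) == "ababab"

literal = re.search(r"\(note\)", "a (note) here")  # \( \) is literal
assert literal is not None
assert literal.group(0) == "(note)"

# Classic grep/sed BRE syntax is exactly reversed: there,
#   grep 'ab\(c\)'  groups, while  grep '(abc)'  matches literal parens.
```

So the backslashed parens you see in sed scripts are a direct inheritance from ed's BRE syntax, not an arbitrary quirk.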


    • My background is a BS at the University of Wyoming, and more recently a MS at the University of Colorado.

      I'm proud to say that all CS majors at UW had to take a senior level programming languages course, which taught some history of computing in addition to the concepts of programming languages. For example, we talked about Babbage and Lovelace, Grace Hopper, Wirth, etc, in addition to Pascal, Prolog, Ada, C, LISP, etc. I don't know if the undergraduate program at CU also covers the history, but I hope it does.

      I also remember students complaining that they never got to do any Windows programming at UW. Now I'm glad I didn't. (There were a few topics classes that used VB for GUI stuff, but they were few and far between). For the most part all programming was done on unix boxes. Or, for introductory courses, on PCs using the Borland compilers. Almost all of the graduate courses at CU were unix based, or tended to be platform independent, but all the CS labs are unix or linux.

      As a graduate student, I met very few people who had the "I want to program to make me rich" attitude, but that's not surprising at the graduate level. There weren't too many of those in my undergrad CS classes either... but that may have been a bit before the big software boom.

      Right before I got my BS, I ended up doing work-study in the computer labs, and running several lab sessions for the intro courses. I saw a lot of wannabes drop out after they found out that programming wasn't the "easy" path to riches they thought it would be.

      Maybe these differences are produced by being from a different part of the country. Or maybe they are just an artifact of getting my undergrad degree a few years before you did. Either way, I'm glad I got the broad, trans-Windows experience I did. And I hope that the academic community returns to a quality education that isn't just about churning out Windows apps for a quick buck.
  • SkR1pT K1dd13Z (Score:5, Insightful)

    by StaticEngine ( 135635 ) on Wednesday October 17, 2001 @05:07AM (#2440336) Homepage
    I can't say I'm surprised that the "hacker youth" is disconnected from the past. Who doesn't know teens like this? In this consumer-oriented society, the focus is on having and bragging about it, not on doing or knowing.

    Hell, when I was that age, I used to read computer magazines in class, and a girl who sat next to me once asked "why I read those things?" Since she was hot and I was shocked that she was actually speaking to me, I answered the not quite accurate "it tells me how to fix them," to which she replied, "why don't you just take it to the shop?" Likewise, several months ago, I was talking with a younger cousin about the video game industry (where I'm currently working), and we were discussing what makes games good. His entire list of quality games was less than a year old, and when I mentioned Pac-Man and the Infocom games, he had only the vaguest clue that such things once existed. Furthermore, his interests were more in how to get rich writing games than in how a programmer actually writes good AI routines, or an artist animates characters realistically.

    The point is, there will always be a large element of society, at any age, which is both ignorant and uninterested in the history of anything. Most of these people will remain in the realm of Average Consumer, while the inquisitive will go forth, research the past, and build the future. The danger comes from the past-less few who simply abuse the tools that are available to them, or arguably worse, become the leaders who direct the doers of society, with little grip on why the wheels of progress turn a certain way, and no concern for how they're powered to enable the future. Because when the perceived joy is in reaching the destination, rather than within the journey itself, it tends to be one hell of a bumpy ride that doesn't exactly pave a smooth road for those who follow.

    • I used to read computer magazines in class, and a girl who sat next to me once asked "why I read those things?"

      Note to self: reading Wired and 2600 does NOT impress the babes (or really anyone else). Maybe I should take the Tao of Steve approach.

    • The point is, there will always be a large element of society, at any age, which is both ignorant and uninterested in the history of anything.

      Yeah... imagine the shock on my face when an older, degreed, co-worker who had programmed in C for 8 years at a previous position had to ask what "K&R" was when I mentioned it in a discussion. Giving the names wasn't enough, I had to actually tell him what they did. It was even more amazing because I'm 15-16 years younger than he. It's not just a "problem" with younger people... some people just don't care and don't pick up information like they perhaps should be doing.
  • by Anonymous Coward on Wednesday October 17, 2001 @05:16AM (#2440350)
    "He used a Captain Crunch whistle to generate a 2600 kilohertz tone to get free phone calls..."

    2,600,000 Hz, that's a pretty high pitch!

  • Looks like a rogue spell checker got at the System Toolbox article:

    While still a young boy, Baggage was concerned with questions of "how" over those of "why."
  • Quote for article:

    The funny thing is that the expression of this "disorder" can be fingered early in life. One can watch for the early warning signs. Children that take apart watches or have a penchant for building elaborate structures from blocks may just be engineers in their pupae stage. By all accounts, Babbage definitely was afflicted by the time of his boyhood. His tinkering with things, his dismantling of gadgets, and his inquisitiveness as to how things worked are all sure signs. While the draw of engineering can be sublimated if caught early and treated with care, ...

    Maybe, it just might be that the hackers and crackers are just not as "evil" as they are made out. Instead of opening watches and playing with blocks, they toy around with computers. I say this because recently there was news about a kid being imprisoned, and I cannot help but wonder at the wasted potential.

    • Re:Think about it (Score:5, Insightful)

      by dangermouse ( 2242 ) on Wednesday October 17, 2001 @05:33AM (#2440385) Homepage
      Maybe, it just might be that the hackers and crackers are just not as "evil" as they are made out. Instead of opening watches and playing with blocks, they toy around with computers.

      This rather doofy rationale has been expounded before. The counterargument, of course, is that if kids tinker with locks it's one thing... when they tinker with the locks on other people's buildings and go walking around inside, it's another entirely.

      You don't get to "tinker" with other people's stuff. How anyone could think one should be granted that right because one is "curious", I'll never understand.

    • ...the hackers and crackers are just not as "evil" as they are made out.

      Or at least, they aren't beyond redemption. The infamous Captain Crunch seems to have turned his life around and is now a productive member of society.

      But evil is not the issue. The law punishes people for what they do, not who they are. Just as they should not be punished for being evil, they should not be spared punishment because they are fundamentally decent.

      Many of us have more sympathy for hackers than other types of juvenile delinquents because we recognize some of the same impulses in ourselves. To the extent we advocate mercy for hackers we are also asking for mercy for ourselves. We probably shouldn't let ourselves off the hook so easily either.

  • by Futurepower(tm) ( 228467 ) <M_Jennings @ not ...> on Wednesday October 17, 2001 @05:23AM (#2440360) Homepage

    The quote by Ken Thompson at the bottom of the article referenced in the Slashdot story is from a very interesting speech, Reflections on Trusting Trust.

    Here is the quote:

    "I have watched kids testifying before Congress. It is clear that they are completely unaware of the seriousness of their acts. There is obviously a cultural gap. The act of breaking into a computer system has to have the same social stigma as breaking into a neighbor's house. It should not matter that the neighbor's door is unlocked. The press must learn that misguided use of a computer is no more amazing than drunk driving of an automobile."

    What should be the Response to Violence?
  • Someone important in British Literature once said, "If I appear so tall, it is because I stand on the shoulders of Giants." (If you can remember who that was, you've got mad skills).
    I always think it's important to learn about one's roots, but I don't think it's as important as understanding our contemporaries.
    Sure, Babbage was revolutionary and laid a big foundation for where we are today. But so did all of the people who laid foundations for him; and the people who laid foundations for those people. Without Faraday computers wouldn't exist. Without Newton computers wouldn't exist. Without Aristotle, etc. etc.
    Does scrutinizing Aristotle (or Babbage, for that matter) propel our computer knowledge farther than if we spent more time studying Kevin Mitnick or Bill Gates [even those who despise him must agree he changed the computing world; for better or worse is not the question]? Does knowing about the history of the punch card help us as much as understanding the status of quantum computing?
    The whole premise of computer science is to abstract layers upon layers so the guy who takes over can do more without having to fully understand the layers below him. Knowing about those layers is good, but do you need to know how capacitors charge in order to write solid C code?
    Where does one draw the line between useful information and cool things to talk about at a party?
    • I'm pretty certain that quote was Newton himself, after being referred to as a maths great or something.
      • Actually... Newton had a very similar quote. I believe he said that if he "sees further" it is because he is standing on the shoulders of giants. Coleridge had a very similar quote as well. I believe it is someone from British Literature (maybe John Donne or Jonathan Swift) - but is remembering who said the quote really all that important? (Which comes back to my original point.)
        • I think the context of that quote was that Hooke objected to Newton not giving him proper credit for "Hooke's Law" --that the restoring force of a spring is proportional to its displacement from equilibrium. Newton then did some research and found about 20 other guys who also "discovered" this rather obvious observation, and cited all of them, placing Hooke's name last on the list. Then he fired off this quote. So the real message is more of a flame of Hooke, yet most people consider it some great admission of humility.

          • So the real message is more of a flame of Hooke, yet most people consider it some great admission of humility.

            In case you missed it, I must refer you to another post by pmc in this thread that points to a very interesting article that refutes your conclusion fairly decisively.

      • Newton said "If I have seen further it is because I stand on the shoulders of giants". Taken out of context it seems like a noble thing to say, but it was actually intended as an insult to Robert Hooke his contemporary and hated rival, who was very short and by all accounts sensitive about the fact.
        • by pmc ( 40532 )
          Newton said "If I have seen further it is because I stand on the shoulders of giants". Taken out of context it seems like a noble thing to say, but it was actually intended as an insult to Robert Hooke his contemporary and hated rival, who was very short and by all accounts sensitive about the fact.

          Nope - this is (probably) a fallacy. See this article for the details.

    • Where does one draw the line between useful information and cool things to talk about at a party?

      Knowledge does not have to be either useful or cool in order to be valuable.

      One common approach is that of Cardinal Newman in The Idea of a University: "Knowledge is capable of being its own end. Such is the constitution of the human mind, that any kind of knowledge, if it be really such, is its own reward."

      The other common approach is to follow Socrates' dictum that "the unexamined life is not worth living", which he derived from his belief that ignorance causes evil.

      These approaches still leave unanswered the question of where you draw the line between learning and other activities, knowledge being infinite and time being short.

      I submit that there is no line. Learning includes close observation of things around you. In this way you integrate a love of learning with everyday life and test the ideas acquired in solitary study.

      Despite the fact that most great scientists have been more motivated by the love of learning than anything else, I've found that people who insist that knowledge must have a utilitarian purpose cannot be convinced otherwise.

  • jeez (Score:2, Interesting)

    by TheMMaster ( 527904 )
    What I find most disturbing is not that these kids don't know everything about computer history, but that they also don't seem to care... I must admit that I didn't know all the people mentioned in the story on binaryfreedom, but I looked up the bios of those people...
    This is happening all around us, and not only in computer history. How many kids care about history at all??? How many kids know anything about the First World War, Newton and the old philosophers like Aristotle???

    I must say this does worry me...
  • And another thing... (Score:4, Interesting)

    by Pogue Mahone ( 265053 ) on Wednesday October 17, 2001 @06:24AM (#2440459) Homepage
    ... that's not so widely known about Charles Babbage is his cryptanalysis expertise. It was he who first cracked the Vigenère polyalphabetic substitution cipher (previously considered to be uncrackable).

    For some reason he didn't publish his results. Some believe that he was told not to by the British government, so that they could use his discovery during the Crimean war. Babbage's work on this subject was discovered in his notebooks after his death.
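For the curious, here is a minimal sketch of the Vigenère cipher and the repeating-key weakness that Babbage (and, independently, Kasiski) exploited. The plaintext and key below are my own illustrative choices, not from Babbage's notebooks:

```python
# Minimal Vigenère cipher (A-Z only). The repeating key is the flaw:
# identical plaintext fragments that land at the same key offset
# encrypt to identical ciphertext, and the distance between such
# repeats tends to be a multiple of the key length - which exposes the
# key length and reduces the cipher to a few simple Caesar shifts.

def vigenere(text: str, key: str) -> str:
    out = []
    for i, c in enumerate(text):
        shift = ord(key[i % len(key)]) - ord('A')
        out.append(chr((ord(c) - ord('A') + shift) % 26 + ord('A')))
    return ''.join(out)

ct = vigenere("THECATANDTHEDOG", "KEY")
print(ct)                   # both "THE"s fall at the same key offset...
print(ct[0:3] == ct[9:12])  # ...so the ciphertext trigram repeats,
                            # 9 positions apart (a multiple of len("KEY"))
```

Tabulating the distances between such repeated fragments is exactly the Kasiski examination; once the key length is known, each key position can be attacked with ordinary letter-frequency analysis.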

  • by mindpixel ( 154865 ) on Wednesday October 17, 2001 @06:44AM (#2440482) Homepage Journal
    My advice is to make the effort and go to H2K2 and get a real sample. I think you will find like I did when I spoke at H2K, that the majority are well informed about our history.

    Like any culture, our culture needs to be taught! Only so many can have had first hand experience and there are less of us each day. Yet, each day, there are more just coming into interest who need to be taught. If you find such a teacherless group of people interested in computers, you should take it upon yourself to teach who we are.

    Show people the first computer you ever programmed []. Show them the games you played and wrote []. Show them how to say "Hello World!" directly with a Turing Machine or in Java and everything between [].

    Tell them about Norbert Wiener and André-Marie Ampère. Warren McCulloch, J.C.R. Licklider, John von Neumann and Vannevar Bush. Alan Turing, Claude Shannon and David Levy (yes, Ken Thompson too, and Belle). Scott Adams (all three) and Stanislaw Lem. Joseph Weizenbaum and Eliza, Alexander Bain and Donald Hebb. Nolan Bushnell, Ted Dabney and Larry Bryan. Alan Kay and Steve Russell. David Gottlieb, Joel Hochberg and Al Alcorn. Thomas Hobbes and Leviathan. Charles Darwin, Erasmus Darwin and Thomas Huxley. Aristotle and Lucretius. Gottfried Wilhelm Leibniz, Charles Babbage and Blaise Pascal. B. F. Skinner and Wilhelm Wundt. Robert Tinney and Peter Max. J. Presper Eckert and John Mauchly. John McCarthy and Marvin Minsky. Doug Lenat, Push Singh and myself.

    We will always need more teachers who know how to both show and to tell!
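    For teachers who want to show the Turing Machine end of that "Hello World" spectrum, here is a toy Python sketch of a Turing machine that does nothing but write "Hello World!" on its tape and halt. The state encoding and names are purely illustrative, not from any particular textbook.

    ```python
    # A toy Turing machine that writes "Hello World!" on its tape, one
    # symbol per step, then halts -- about the simplest program the
    # model admits.

    def run_turing_machine(transitions, state, head=0):
        """transitions: {(state, read_symbol): (write, move, next_state)}"""
        tape = {}
        while state != "HALT":
            read = tape.get(head, " ")  # the blank symbol is a space
            write, move, state = transitions[(state, read)]
            tape[head] = write
            head += {"R": 1, "L": -1, "N": 0}[move]
        return "".join(tape[i] for i in sorted(tape))

    message = "Hello World!"
    # One state per character: state i writes message[i] and moves right;
    # the last state writes its character and halts in place.
    rules = {(i, " "): (ch, "R", i + 1) for i, ch in enumerate(message)}
    rules[(len(message) - 1, " ")] = (message[-1], "N", "HALT")

    print(run_turing_machine(rules, 0))  # Hello World!
    ```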
    • Like any culture, our culture needs to be taught!

      Hear, hear! But a meme [] needs a route to propagate. Who's going to do it?

      Universities aren't interested just in "educating" future academics; they've got a vested interest in crafting a workforce. Their funding derives in large part from donations from business leaders, and those leaders want employees who can program, not employees who have an appreciation for Babbage. An analogy would be to business schools: Graduates are expected to solve "real-world" (as academia sees it) problems, not be able to discourse on the history of efficiency experts. B-Schools aren't so much interested in giving their students a full and complete history of business methods as they are in providing a little bit of context to their graduates, who move on to become employees, who move on to become "leaders", who--they hope--move on to become future corporate alumni donors.

      So who educates the next generation (or the current generation; I'm painfully aware of my own ignorance in these matters)? Well, where did you learn about Babbage? Some of us probably learned about him in an academic setting, but I'm guessing that informal channels played a much more important role for most technophiles--if for no other reason than the aggressively informal culture of high-tech in general. I wasn't attracted to computers because there was such a rich history of thought and intellectual culture behind them. I was jazzed by bells, whistles and blinkenlights. Once I got involved, it was an informal network of peers, books, net sites, and conversations that led to expanding my knowledge into historical considerations.

      • We need roving teachers... HackMasters... who set HackerDojos in every city.

        We have one here in Antofagasta, Chile where I live... actually it's just my apartment, but I can say from direct experience that the new generation of hackers love to hang out and learn with the older generation... they just love to see my KIM-1 and TRS-80 Model I... my Cray 1 S/1000 supercomputer memory card... my collection of BYTE, 80 Micro, & Kilobaud... my books (especially my books, because there is no such thing as a free public library in Chile)... to hear my hacking war stories and, most interestingly, to work on new development projects with me.

        Make yourself into a HackMaster.


  • CS and History... (Score:5, Interesting)

    by glebite ( 206150 ) on Wednesday October 17, 2001 @06:44AM (#2440484)

    I interview a lot of co-op students for job placements in the company that I work for now, and did for a large company in the past. Sometimes, I get some really cocky student who comes in with a smug attitude that he knows it all.

    Sure enough, he can answer the technical questions flawlessly, just as if he had read them from a textbook. He can show ingenuity in coming up with solutions on the fly as well... And usually, when he gets that look in his eye--"I know you want to hire me - make me a REALLY good offer, and I might consider working for you"--I then ask the killer question:

    "Who is Charles Babbage?"

    The blank look on their face is priceless. It's a simple curveball. I've received answers ranging from: "I'm not sure - wasn't he featured in PC Magazine last month?" to "Oh - he's the founder of IBM." and "I... I... Don't know..."

    I then answer the question with a short history lesson. Often they do recall him then - yes, but they didn't think it was important.

    I'm amazed at how much computing history has been left out of introductory courses in High School. There was an incredible amount of effort and ingenuity required to get us to the place we are today: information available within seconds, toys to entertain, tools to teach and make life easier (mine is easier now because of them), communication barriers broken down, etc... It's caused other problems too, but man - what doesn't. I'll take the benefits over the problems any day.

    Hanging up in my office is a picture of Charles Babbage, and one of Ada.

    "Who is Grace Hopper?" is my backup question.


    • Actually, I'm proud to say that Babbage, Lovelace, and Hopper all showed up as important names in the Programming Languages class I took as an undergraduate at the University of Wyoming (of all places!). The text ("Foundations of Programming Languages" by Sebesta, I think) actually went into a lot of history of each of the languages it touched on (including FLOW-MATIC), and the history of computing in general.

      I think it's a bit ironic that the only 4 year university in the entire state of Wyoming (considered by many to be the armpit of the US) actually taught something of value that many other (possibly even more prestigious) schools have neglected.

      Besides my bit of history, and the fact that I never touched a MS compiler (almost all unix based!) during my years at UW, I wonder what other benefits I've unknowingly collected over an education elsewhere?
  • Seeing as there have been many posts claiming ignorance of who some of those people are (congrats for being mature enough to admit it), I have to say I'm damn proud that I know who all of them are, and I've only been using computers since I was 14 (which by /. standards is still considered a newbie in some circles; oh, I'm 24 now).
  • History is bunk (Score:2, Flamebait)

    by dgroskind ( 198819 )

    I'm dubious about the idea that knowing the history of computer science helps you be a better programmer. I've known several excellent programmers whose knowledge of computer science was limited to the tools of their trade and the underlying theory. My own knowledge of the history of my profession hasn't made learning OOP any easier.

    One should have a broader interest in the world than simply making a living, but there are many places to go besides the history of computer science. One could argue that, given limited time, one should look outside one's profession rather than inside it for a broader perspective.

    Having said that, some of life's lessons can seem more acute when seen in the context of familiar problems. For instance, this example from Babbage's life []:

    Babbage's private income perhaps deprived him of the drive that would have whipped his work into shape. Every time he came up against a problem with the design of his various engines, his impulse was to turn away and start again. Instead of breaking through the pain barrier, he finished his 80-year life with a lot of drawings and not a prototype in sight.

    Many of us who've found a comfortable life in programming struggle with that problem every day.

    • Re:History is bunk (Score:3, Interesting)

      by mav[LAG] ( 31387 )
      I'm dubious about the idea that knowing the history of computer science helps you be a better programmer. I've known several excellent programmers whose knowledge of computer science was limited to the tools of their trade and the underlying theory. My own knowledge of the history of my profession hasn't made learning OOP any easier.

      I am of the opposite opinion - that you can't know too much about the history of computing and computer science if you want to be a better programmer. Want to know why adding programmers to a late software project makes it later? Read Brooks. Are there some problems which are not solvable by computer? Study Turing. How do entropy and communication work? Shannon tells all. How and why does LISP do what it does so well? Check out the history of AI research. Does Babbage's engine really work? Hell yes - there's a working model at the Science Museum in London. Seeing it in action was practically a religious experience for me. Why does Unix use pipes? Check out the history of it at Bell Labs online.

      You could say that all of these examples could just as well be studied theoretically. But then you'd miss the fun parts - like the story behind IBM's development of OS/360 and how some companies still haven't learned those lessons.
      History is full of these amazing guys (and gals - hello Grace) that met and solved all kinds of problems, often in surprising and non-intuitive ways. Many of the anecdotes and broader perspectives do help you with programming if only to teach you something from history. Your Babbage example is a good one.

  • Yeah, it's sad (Score:4, Interesting)

    by renehollan ( 138013 ) <rhollan AT clearwire DOT net> on Wednesday October 17, 2001 @10:18AM (#2441071) Homepage Journal
    My father was an electrical engineer. Well, not really, since WWII prevented him from ever getting an actual degree (which would have been in medicine, not engineering, but that's a whole 'nother digression). Anyway, he ended up working for RCA Canada, and later SPAR Aerospace, on passive filters for all the satellites from Alouette through Anik C (whadda mean, "satellite TV" is new? That was the only way many Inuit (native northern Canadians) could get TV in the 1970s) and Brazilsat. The point is, I picked up an interest in electronics as a kid, which soon evolved to an interest in computers. See, my father "did neat shit".

    Of course, that was way back in the 60's, er 1960's. I actually got to work with a computer in 1973 (HP2000 timesharing monstrosity running BASIC connected to dialup teletypes -- we got a DECwriter the next year, whee, 300 bps!) and experienced punched cards when I started my Comp. Sc. Degree program in 1979 -- we had DECwriters there too, and a few CRT terminals (at the end of current loop lines at, you guessed it, 300 bps), but course work had to be done via punched cards, submitted to the "express" window. You only had to wait 15-30 minutes for your job to get turned around.

    I remember those days quite well. Today, I sometimes interview recent grads for software design jobs. One standard problem I pose is "Write a routine, in any programming language of your choice (I've probably seen it), that sorts an array of things according to some ordering scheme. I don't care about efficiency, but I expect the following: (a) that it is correct, (b) that you can explain the complexity of the algorithm in "O(n)" notation." Of course I expect a bubble sort of integers. One smartass did a Shell sort and got it right. But over 90% of the candidates fail this basic test. That's sad.
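    For anyone wondering what the expected answer looks like, here's a sketch of that baseline bubble sort in Python (the early-exit flag is a common refinement, not part of the question):

    ```python
    # Bubble sort: repeatedly swap adjacent out-of-order pairs until a
    # full pass makes no swaps. Correct, and easy to analyze:
    # O(n^2) comparisons worst and average case, O(n) best case
    # (already sorted, thanks to the early exit below).

    def bubble_sort(items):
        a = list(items)  # work on a copy; don't mutate the caller's array
        n = len(a)
        for i in range(n - 1):
            swapped = False
            for j in range(n - 1 - i):  # the tail is already in place
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]
                    swapped = True
            if not swapped:  # no swaps means the array is sorted
                break
        return a

    print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
    ```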

    The scary part is that peer reviewers think this is being "too hard" on a candidate. Or worse, I remember one kid who added "servlets" to an FTP server -- basically he provided an API for command plug-ins that executed server-side and could provide massaged data back to the client. So, for example, you could do back-end SQL queries via FTP. Obviously an exercise in program extensibility. Trouble is, he didn't even get an offer -- a coworker insisted that he must have been "bullshitting" because "everyone knows that servlets are a Java thing and not an FTP thing". My protests got voted down. So, technical ignorance has permeated even the supposed technical interview -- on the part of the interviewers! Shortly after losing that battle, I left the employ of that company.

    I remember building memory boards for an old SWTPC computer (none of us geek kids could afford to buy them assembled). I remember lamenting when the IBM PC booted into ROM Basic and displayed "Ready" -- who's gonna know how the computer gets to that point? that was what, almost 20 years ago?

    There is hope. I often see young (say, under 20 years old) posters here who do have a clue. I envy that they have far better tools than I did as a kid, but note that these same "better tools" make it no big deal to be a script kiddie techno-vandal. Compare the history of mass computing to the history of the car. Anyone can drive a car today. Few know how to fix one or what makes it run. But, even with the ease of "turn the key, push on the gas, and remember to steer" driving, some still hack their cars. I take that as a sign that hacking doesn't die -- the computer hacker was a rare breed in the 1970s and still is, despite the fact that there are a lot of 'leet wannabees around without a clue.

    My advice to the young hacker is to seek out other hackers, young and old (say 40 and up), and avoid the wannabees. Of course, this implies a willingness and responsibility on the part of us "old guys" to mentor -- sure, you don't need a fast sort, or balanced tree structure, when you've got a 1 GHz processor, but imagine how much faster and scalable your code will be if you use one. In my day, RAM was fast, and disk was S L O W, so you carefully designed your algorithms to minimize disk access. These days, you want stuff to stay locked in processor cache 'cause RAM is slow, by comparison. Look at other "hobbies", like ham radio, and see how "the torch gets passed on" there. We should strive for a similar effect.

  • I bet everyone here has actually had real experience with a mechanical computer (of sorts). While not a 'computer' in the mathematical description, it's pretty close: It's the automobile automatic transmission. They are probably the most complex mechanical device that people contact every day, except for the very newest computer-regulated ones.

    It actually does computational tasks in a strict sense -- it takes input, does "intelligent" operations on it based on data, and outputs it, except in this case it's motion, not math. It uses a series of planetary gears, pumps and pulleys to the extent that it makes my brain hurt thinking about it.

    Don't believe that they are so amazing? See for yourself []. They even have a cool video showing you how the whole package works.

  • bl ets/babbage/start.asp
  • As I understand it, Babbage absolutely despised street musicians. He tried to enact legislation banning them in London. Many knew of his efforts and would 'crank it up' when he passed by, adding to his torment.
  • by sv0f ( 197289 ) on Wednesday October 17, 2001 @12:40PM (#2441851)
    I wrote a report for a Philosophy of Mathematics class in college that may be relevant for this discussion. Babbage is commonly thought of as more of an engineer than a scientist. And his efforts were largely directed to building a mechanical device that exceeded the manufacturing capabilities of the day. As well, the immediate precursors to the Analytical Engine -- the Jacquard Loom and the Difference Engine -- were specialized for narrow, practical purposes.

    However, if you examine his writings, you'll realize that he had lofty mathematical goals for the Analytical Engine. Specifically, he understood it to be an exercise in defining what moderns might call 'effective computability'. There's a striking similarity between his pursuit of this goal and the same explorations by Hilbert, Turing, Church, etc., in the twentieth century. At least that's what I argued in my paper.

    I recommend his notebooks and the few existing biographies to those with a taste for the history of science and the biography of scientists.

    PS: One other conclusion that I drew from my research was that the role of Ada Lovelace in developing the abstract principles of computer science has been highly overstated. She was probably mathematically adept. But she also led an outrageous (for the time) life. The illegitimate, unmarried, and independently wealthy daughter of a Romantic poet. I suspect Mr. Babbage (Sir Charles?) entertained and amused her, both with his glum disposition and his willingness to engage her intellectually. And we owe her a debt for her record of and commentary on his work. But I seriously doubt she originated the grand ideas commonly attributed to her.
  • by cascadefx ( 174894 ) <> on Wednesday October 17, 2001 @01:25PM (#2442124) Journal

    The Difference Engine number two was ahead of its time, indeed. In a head-to-head polynomial calculation test with a Windows-based Canon BJ Notebook BN22 (with built-in ink-jet printer), the mechanical Difference Engine initially beat the pants off the laptop, but was then overtaken. Not bad for a technology that was conceived around 160 years before its competitor.

    From this account [], we find the following description from a witness:

    "...With the Windows overhead slugging the Canon, the additional time taken for 31-digit extended precision arithmetic, and the printer buffer soaking up results before making them visible to the race referees, the Babbage engine produced nine 31-digit results before the Canon bubble jet printer blasted off the starting blocks. It then spewed out the first 50 results in little more than the time for the Babbage machine to produce one more result. The hare finally overtook the tortoise."

    Wow! Not bad for a version 2.0 product. Consider the advancements it would have made had Babbage been successful all those years before.
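    For reference, the trick the Difference Engine mechanizes is the method of finite differences: the n-th differences of a degree-n polynomial are constant, so once you seed a column of differences, every further value needs only additions -- no multiplication at all. A rough Python sketch follows; the function and variable names are my own, and the real engine of course worked with decimal number wheels, not Python integers.

    ```python
    # Tabulate a polynomial (coefficients in ascending order) at
    # x = start, start+1, ... using only additions after the setup,
    # the way a difference engine does.

    def tabulate(poly, start, count):
        n = len(poly) - 1  # polynomial degree
        f = lambda x: sum(c * x**k for k, c in enumerate(poly))

        # Seed the difference column: f(start), delta f, delta^2 f, ...
        row = [f(start + i) for i in range(n + 1)]
        col = []
        for _ in range(n + 1):
            col.append(row[0])
            row = [b - a for a, b in zip(row, row[1:])]

        # Now crank the engine: each step is just n additions.
        results = []
        for _ in range(count):
            results.append(col[0])
            for i in range(n):
                col[i] += col[i + 1]
        return results

    # Babbage's favorite demo polynomial, x^2 + x + 41:
    print(tabulate([41, 1, 1], 0, 5))  # [41, 43, 47, 53, 61]
    ```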

  • There are some very competent young people out there in computing. Probably about the same number there were 20 years ago. But they're lost in a sea of bozos who think a 20-line CGI program in Perl, or a macro for Word, is serious programming.

    We have a title problem here. The script hackers and the serious developers both call themselves "programmers". But that's just a title problem, no more.

    Go to the Game Developer's Conference. Or just read Game Developer magazine. Some very, very smart people are busily cracking tough problems in physical simulation and AI. Those people have broader knowledge than the hackers of 20 years ago.

    One big difference today is that today's good programmers know a lot more about business than the programmers of 20 years ago. That's a big plus. Game programmers also tend to know something about art, music, entertainment, and how to deal with Hollywood.

In English, every word can be verbed. Would that it were so in our programming languages.