Science

Babbage, A Look Back (261 comments)

A reader writes: "System Toolbox just started a new computer history section in an effort to get us geeks in touch with our 'roots.' The current article in the monthly column focuses on Charles Babbage. The editor and author hope to raise awareness of our past so that scenes like this won't continue to take place. A big hill to climb, but worth the effort."
This discussion has been archived. No new comments can be posted.

  • I plead ignorance (Score:2, Interesting)

    by Kargan ( 250092 ) on Wednesday October 17, 2001 @05:00AM (#2440322) Homepage
    So, um, I guess I should feel dumb for not knowing who 90% of those people were either?

    I mean, I'm not l33t or anything, I'm just someone who knows how to fix computers...and would it help me or affect my everyday life if I knew?
  • jeez (Score:2, Interesting)

    by TheMMaster ( 527904 ) <hpNO@SPAMtmm.cx> on Wednesday October 17, 2001 @05:24AM (#2440363)
    What I find most disturbing is not that these kids don't know everything about computer history, but that they don't seem to care... I must admit that I didn't know all the people mentioned in the story on binary, but I looked up the bios of those people...
    This is happening all around us, and not only in computer history. How many kids care about history at all? How many kids know anything about the First World War, Newton, or the old philosophers like Aristotle?

    I must say this does worry me...
  • Re:Graduates (Score:4, Interesting)

    by rm-r ( 115254 ) on Wednesday October 17, 2001 @05:26AM (#2440366) Homepage
    I also just graduated from an English university, reading comp. sci. I think you're right on the history aspect. As far as 'trans-Windows' goes, I think that depends on where you go; having said that, in the six years I was at uni (I took an, uh, 'scenic' route through my education ;-) I did see the uni become more and more Windows-centric. The uni I joined was focused on first principles (we had to program in Pascal for a year, and not Turbo Pascal or Delphi or such, just basic Pascal) to give us the tools to equip ourselves for any computing career: the basic learn-to-learn thing, I guess. The uni I left was a lot more into 'tools that employers want to see', i.e. Windows NT/2000, SQL Server, and so on. That's great for the first couple of years of your career while those tools are still being used, but once they're superseded you're stuck without first principles and the ability to figure out which tool is best for the job, why, and how to use it, IMHO.


    As far as the history goes, though, you are supposed to be interested in computing and to do a lot of background reading, so I suppose it could be argued that you should have built up a fair amount of it by yourself...

  • by shredds ( 241412 ) on Wednesday October 17, 2001 @05:36AM (#2440389)
    Actually... Newton had a very similar quote. I believe he said that if he "sees further" it is because he is "standing on the shoulders of giants." Coleridge had a very similar quote as well. I believe it was someone from British literature (maybe John Donne or Jonathan Swift)... is remembering who said the quote really all that important? (Which comes back to my original point.)
  • Re:Graduates (Score:4, Interesting)

    by pubjames ( 468013 ) on Wednesday October 17, 2001 @05:58AM (#2440423)
    I also graduated from a university in England, although it was ten years ago now.

    I studied a combined degree of biology and computer science, and so I used to take courses from both the biology and the CS schools.

    What struck me then was what a bunch of dunderheads the computer scientists were. Sure, they knew the finer points of Unix better than I ever will, but if you asked them to write an essay on the importance of computers to society, for instance, they could hardly string two words together; an average biology student could have done a far better job of it. Frankly, most of their essays were embarrassing in their childish views, ignorance, poor grammar and spelling.

    And my point is? Well, at least in my experience, many people who are hardcore computer enthusiasts have a far more myopic view of the world than people from other subject areas. They are socially inept and interested in very little else but computers, and even then in very narrow fields of computing rather than the bigger picture. I don't know if it has changed, but when I graduated many big employers complained that computer graduates often lacked the most basic skills. Is it like this in the USA too?
  • And another thing... (Score:4, Interesting)

    by Pogue Mahone ( 265053 ) on Wednesday October 17, 2001 @06:24AM (#2440459) Homepage
    ... that's not so widely known about Charles Babbage is his cryptanalysis expertise. It was he who first cracked the Vigenère polyalphabetic substitution cipher, previously considered uncrackable (a toy version is sketched below).

    For some reason he didn't publish his results. Some believe that he was told not to by the British government, so that they could use his discovery during the Crimean war. Babbage's work on this subject was discovered in his notebooks after his death.
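    For readers who haven't met it, here is a minimal sketch of the cipher in Python (the function name and example are illustrative, not from the original post). Each plaintext letter is shifted by the corresponding letter of a repeating key, so the same letter encrypts differently at different positions; the repetition of the key is exactly the regularity that Babbage's attack exploited.

        def vigenere_encrypt(plaintext: str, key: str) -> str:
            """Shift each letter of plaintext (A-Z only) by the matching
            letter of the repeating key, where A=0 ... Z=25."""
            result = []
            for i, ch in enumerate(plaintext):
                shift = ord(key[i % len(key)]) - ord('A')
                result.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
            return ''.join(result)

        # The classic textbook example: note how the four As in the
        # plaintext encrypt to four different ciphertext letters.
        print(vigenere_encrypt("ATTACKATDAWN", "LEMON"))  # LXFOPVEFRNHR
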

  • by mindpixel ( 154865 ) on Wednesday October 17, 2001 @06:44AM (#2440482) Homepage Journal
    My advice is to make the effort to go to H2K2 and get a real sample. I think you will find, as I did when I spoke at H2K, that the majority are well informed about our history.

    Like any culture, our culture needs to be taught! Only so many can have had first-hand experience, and there are fewer of us each day. Yet each day there are more people just coming into the field who need to be taught. If you find such a teacherless group of people interested in computers, you should take it upon yourself to teach them who we are.

    Show people the first computer you ever programmed [kim-1.com]. Show them the games you played and wrote [dadgum.com]. Show them how to say "Hello World!" directly with a Turing Machine or in Java and everything between [roesler-ac.de].

    Tell them about Norbert Wiener and André-Marie Ampère. Warren McCulloch, J.C.R. Licklider, John von Neumann and Vannevar Bush. Alan Turing, Claude Shannon and David Levy (yes, Ken Thompson too, and Belle). Scott Adams (all three) and Stanislaw Lem. Joseph Weizenbaum and Eliza, Alexander Bain and Donald Hebb. Nolan Bushnell, Ted Dabney and Larry Bryan. Alan Kay and Steve Russell. David Gottlieb, Joel Hochberg and Al Alcorn. Thomas Hobbes and Leviathan. Charles Darwin, Erasmus Darwin and Thomas Huxley. Aristotle and Lucretius. Gottfried Wilhelm Leibniz, Charles Babbage and Blaise Pascal. B. F. Skinner and Wilhelm Wundt. Robert Tinney and Peter Max. J. Presper Eckert and John Mauchly. John McCarthy and Marvin Minsky. Doug Lenat, Push Singh and myself.

    We will always need more teachers who know how to both show and to tell!
  • CS and History... (Score:5, Interesting)

    by glebite ( 206150 ) on Wednesday October 17, 2001 @06:44AM (#2440484)

    I interview a lot of co-op students for job placements in the company I work for now, and did the same for a large company in the past. Sometimes I get a really cocky student who comes in with a smug attitude, as if he knows it all.

    Sure enough, he can answer the technical questions flawlessly, as if he had read them from a textbook. He can show ingenuity in coming up with solutions on the fly as well... And usually, when he gets that look in his eye that says "I know you want to hire me - make me a REALLY good offer, and I might consider working for you," I ask the killer question:

    "Who is Charles Babbage?"

    The blank look on their faces is priceless. It's a simple curveball. I've received answers ranging from "I'm not sure - wasn't he featured in PC Magazine last month?" to "Oh - he's the founder of IBM" and "I... I... don't know..."

    I then answer the question with a short history lesson. They often recall it then, of course - yes, they'd heard of him - but didn't think it was important.

    I'm amazed at how much computing history has been dropped from introductory courses in high school. An incredible amount of effort and ingenuity was required to get us where we are today: information available within seconds, toys to entertain, tools to teach and make life easier (mine is easier now because of them), communication barriers broken down, etc... It's caused other problems too, but man - what doesn't? I'll take the benefits over the problems any day.

    Hanging up in my office is a picture of Charles Babbage, and one of Ada.

    "Who is Grace Hopper?" is my backup question.

    Hehehehe...

  • by Ungrounded Lightning ( 62228 ) on Wednesday October 17, 2001 @09:23AM (#2440886) Journal
    What about organic computers? [As an alternative to "mechanical rod" based nanocomputers]

    I remember reading long ago about organic molecules being able to "switch" between two polarized states under the influence of an outer electric field. This was supposed to be a future for nano registers...


    It's a matter of size.

    Electrons are big, light, fuzzy things, and the electric field goes out a long way. When you want to make circuitry REALLY small you don't want to use that for anything but holding your pieces together, because your gates and wiring would have to be very large to avoid crosstalk.

    But atoms are effectively smaller than electrons. The nuclei are MUCH smaller, and they suck their electrons down, making most of the fields cancel out after a very short distance. And they're heavier, so they don't shake and flit around as much.

    You can send a bit, or several bits, by sliding or rotating a rod just as you can by shoving electrons into and out of a wire (even a single-molecule wire), and the signal stays IN the rod except where you WANT it to come out. At atomic scales, moving a rod is nearly as fast as moving an electron (and comparable to moving a hole), so speed isn't a problem.

    As for amplification, there's not much that can beat an "interposer". That's a piece of matter that gets between two moving parts and transfers a shove from the one supplying motion "power" to the one carrying a signal away as motion. Expend a little power to put the interposer into position, and it will transfer shoves (with no further input) until you pull it away again.
  • by dgroskind ( 198819 ) on Wednesday October 17, 2001 @10:14AM (#2441059)

    Where does one draw the line between useful information and cool things to talk about at a party?

    Knowledge does not have to be either useful or cool in order to be valuable.

    One common approach is that of Cardinal Newman in The Idea of a University [newmanreader.org]: "Knowledge is capable of being its own end. Such is the constitution of the human mind, that any kind of knowledge, if it be really such, is its own reward."

    The other common approach is to follow Socrates' [ship.edu] dictum that "the unexamined life is not worth living", which he derived from his belief that ignorance causes evil.

    These approaches still leave unanswered the question of where you draw the line between learning and other activities, knowledge being infinite and time being short.

    I submit that there is no line. Learning includes close observation of the things around you. In this way you integrate a love of learning with everyday life and test the ideas acquired in solitary study.

    Despite the fact that most great scientists have been motivated more by the love of learning than by anything else, I've found that people who insist that knowledge must have a utilitarian purpose cannot be convinced otherwise.

  • Yeah, it's sad (Score:4, Interesting)

    by renehollan ( 138013 ) <rhollan@@@clearwire...net> on Wednesday October 17, 2001 @10:18AM (#2441071) Homepage Journal
    My father was an electrical engineer. Well, not really, since WWII prevented him from ever getting an actual degree (which would have been in medicine, not engineering, but that's a whole 'nother digression). Anyway, he ended up working for RCA Canada, and later SPAR Aerospace, on passive filters for all the satellites from Alouette through Anik C (whadda mean, "satellite TV" is new? That was the only way many Inuit (native northern Canadians) could get TV in the 1970s) and Brazilsat. The point is, I picked up an interest in electronics as a kid, which soon evolved into an interest in computers. See, my father "did neat shit".

    Of course, that was way back in the '60s, er, 1960s. I actually got to work with a computer in 1973 (an HP 2000 timesharing monstrosity running BASIC, connected to dial-up teletypes -- we got a DECwriter the next year, whee, 300 bps!) and experienced punched cards when I started my Comp. Sci. degree program in 1979 -- we had DECwriters there too, and a few CRT terminals (at the end of current-loop lines at, you guessed it, 300 bps), but course work had to be done via punched cards, submitted to the "express" window. You only had to wait 15-30 minutes for your job to get turned around.

    I remember those days quite well. Today, I sometimes interview recent grads for software design jobs. One standard problem I pose is: "Write a routine, in any programming language of your choice (I've probably seen it), that sorts an array of things according to some ordering scheme. I don't care about efficiency, but I expect the following: (a) that it is correct, and (b) that you can explain the complexity of the algorithm in big-O notation." Of course I expect a bubble sort of integers (a sketch of one appears after this comment). One smartass did a Shell sort and got it right. But over 90% of the candidates fail this basic test. That's sad.

    The scary part is that peer reviewers think this is being "too hard" on a candidate. Or worse: I remember one kid who added "servlets" to an FTP server -- basically, he provided an API for command plug-ins that executed server-side and could provide massaged data back to the client. So, for example, you could do back-end SQL queries via FTP. Obviously an exercise in program extensibility. Trouble is, he didn't even get an offer -- a coworker insisted that he must have been "bullshitting" because "everyone knows that servlets are a Java thing and not an FTP thing". My protests got voted down. So technical ignorance has permeated even the supposed technical interview -- on the part of the interviewers! Shortly after losing that battle, I left the employ of that company.

    I remember building memory boards for an old SWTPC computer (none of us geek kids could afford to buy them assembled). I remember lamenting when the IBM PC booted into ROM BASIC and displayed "Ready" -- who's gonna know how the computer gets to that point? That was, what, almost 20 years ago?

    There is hope. I often see young (say, under 20 years old) posters here who do have a clue. I envy that they have far better tools than I did as a kid, but note that these same "better tools" make it no big deal to be a script-kiddie techno-vandal. Compare the history of mass computing to the history of the car. Anyone can drive a car today. Few know how to fix one or what makes it run. But even with the ease of "turn the key, push on the gas, and remember to steer" driving, some still hack their cars. I take that as a sign that hacking doesn't die -- the computer hacker was a rare breed in the 1970s and still is, despite the fact that there are a lot of 'leet wannabes around without a clue.

    My advice to the young hacker is to seek out other hackers, young and old (say, 40 and up), and avoid the wannabes. Of course, this implies a willingness and responsibility on the part of us "old guys" to mentor -- sure, you don't need a fast sort or a balanced tree structure when you've got a 1 GHz processor, but imagine how much faster and more scalable your code will be if you use one (see the timing sketch below). In my day, RAM was fast and disk was S L O W, so you carefully designed your algorithms to minimize disk access. These days, you want stuff to stay locked in processor cache, 'cause RAM is slow by comparison. Look at other hobbies, like ham radio, and see how "the torch gets passed on" there. We should strive for a similar effect.
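    To make the interview question above concrete, here is a minimal Python sketch of the expected answer -- a bubble sort -- followed by a quick timing comparison against the library sort. The function name and the array size are illustrative, not from the original post, and exact timings will vary by machine.

        import random
        import time

        def bubble_sort(a):
            """Sort the list a in place, ascending.
            Worst case O(n^2): each pass bubbles the largest remaining
            element to the end. Best case O(n) if already sorted."""
            n = len(a)
            for i in range(n - 1):
                swapped = False
                for j in range(n - 1 - i):
                    if a[j] > a[j + 1]:
                        a[j], a[j + 1] = a[j + 1], a[j]
                        swapped = True
                if not swapped:
                    break
            return a

        print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]

        # The "faster and more scalable" point in numbers: for n = 5000,
        # bubble sort does ~n^2/2 = 12.5 million comparisons, while an
        # O(n log n) sort needs on the order of 60 thousand.
        data = [random.random() for _ in range(5000)]
        t0 = time.perf_counter()
        bubble_sort(data[:])
        t1 = time.perf_counter()
        sorted(data)
        t2 = time.perf_counter()
        print(f"bubble: {t1 - t0:.3f}s  built-in: {t2 - t1:.3f}s")
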

  • Re:History is bunk (Score:3, Interesting)

    by mav[LAG] ( 31387 ) on Wednesday October 17, 2001 @10:19AM (#2441077)
    I'm dubious about the idea that knowing the history of computer science helps you be a better programmer. I've known several excellent programmers whose knowledge of computer science was limited to the tools of their trade and the underlying theory. My own knowledge of the history of my profession hasn't made learning OOP any easier.


    I am of the opposite opinion: you can't know too much about the history of computing and computer science if you want to be a better programmer. Want to know why adding programmers to a late software project makes it later? Read Brooks. Are there problems which are not solvable by computer? Study Turing. How do entropy and communication work? Shannon tells all (a toy entropy calculation follows this comment). How and why does LISP do what it does so well? Check out the history of AI research. Does Babbage's engine really work? Hell yes - there's a working model at the Science Museum in London. Seeing it in action was practically a religious experience for me. Why does Unix use pipes? Check out the history of it at Bell Labs online.

    You could say that all of these examples could just as well be studied theoretically. But then you'd miss the fun parts - like the story behind IBM's development of OS/360, and how some companies still haven't learned those lessons.
    History is full of these amazing guys (and gals - hello, Grace) who met and solved all kinds of problems, often in surprising and non-intuitive ways. Many of the anecdotes and broader perspectives do help you with programming, if only by teaching you something from history. Your Babbage example is a good one.
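    Since Shannon came up above, here is a minimal sketch of his entropy formula in Python (the function name and examples are illustrative, not from the original post). H, the sum of p * log2(1/p) over the symbol frequencies, is the average information of a message in bits per symbol.

        import math
        from collections import Counter

        def shannon_entropy(message: str) -> float:
            """Average bits per symbol: H = sum(p * log2(1/p))."""
            counts = Counter(message)
            n = len(message)
            return sum((c / n) * math.log2(n / c) for c in counts.values())

        print(shannon_entropy("aaaa"))  # 0.0 - one symbol, no surprise
        print(shannon_entropy("abcd"))  # 2.0 - four equally likely symbols
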

  • by mindpixel ( 154865 ) on Wednesday October 17, 2001 @12:27PM (#2441773) Homepage Journal
    We need roving teachers... HackMasters... who set up HackerDojos in every city.

    We have one here in Antofagasta, Chile, where I live... actually it's just my apartment, but I can say from direct experience that the new generation of hackers loves to hang out and learn with the older generation... they just love to see my KIM-1 and TRS-80 Model I... my Cray 1 S/1000 supercomputer memory card... my collection of BYTE, 80 Micro, & Kilobaud... my books (especially my books, because there is no such thing as a free public library in Chile)... to hear my hacking war stories and, most interestingly, to work on new development projects with me.

    Make yourself into a HackMaster.

    Teach!

  • by sv0f ( 197289 ) on Wednesday October 17, 2001 @12:40PM (#2441851)
    I wrote a report for a philosophy of mathematics class in college that may be relevant to this discussion. Babbage is commonly thought of as more of an engineer than a scientist, and his efforts were largely directed at building a mechanical device that exceeded the manufacturing capabilities of the day. As well, the immediate precursors to the Analytical Engine -- the Jacquard loom and the Difference Engine -- were specialized for narrow, practical purposes.

    However, if you examine his writings, you'll realize that he had lofty mathematical goals for the Analytical Engine. Specifically, he understood it to be an exercise in defining what moderns might call 'effective computability'. There's a striking similarity between his pursuit of this goal and the same explorations by Hilbert, Turing, Church, etc., in the twentieth century. At least that's what I argued in my paper.

    I recommend his notebooks and the few existing biographies to those with a taste for the history of science and the biography of scientists.

    PS: One other conclusion that I drew from my research was that the role of Ada Lovelace in developing the abstract principles of computer science has been highly overstated. She was probably mathematically adept, but she also led an outrageous (for the time) life as the independently wealthy daughter of a Romantic poet. I suspect Mr. Babbage entertained and amused her, both with his glum disposition and with his willingness to engage her intellectually. And we owe her a debt for her record of and commentary on his work. But I seriously doubt she originated the grand ideas commonly attributed to her.
  • by cascadefx ( 174894 ) <morlockhq@@@gmail...com> on Wednesday October 17, 2001 @01:25PM (#2442124) Journal

    The Difference Engine No. 2 was ahead of its time, indeed. In a head-to-head polynomial calculation test with a Windows-based Canon BJ Notebook BN22 (with built-in ink-jet printer), the mechanical Difference Engine initially beat the pants off the laptop, but was then overtaken. Not bad for a technology that was conceived around 160 years before its competitor.

    From this account [sciencemuseum.org.uk], we find the following description from a witness:

    "...With the windows overhead slugging the Canon, the additional time taken for 31-digit extended precision arithmetic, and the printer buffer soaking up results before making them visible to the race referees, the Babbage engine produced nine 31-digit results before the Canon bubble jet printer blasted off the starting blocks. It then spewed out the first 50 results in little more than the time for the Babbage machine to produce one more result. The hare finally overtook the tortoise."

    Wow! Not bad for a version 2.0 product (the finite-difference trick that drives it is sketched below). Consider the advances it would have made had Babbage been successful all those years before.
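    For anyone wondering how a pile of gears races a laptop at polynomials, here is a minimal Python sketch of the method of finite differences that the engine mechanizes (the function name and example are illustrative, not from the original post). An nth-degree polynomial has a constant nth finite difference, so after the initial setup every new value costs only n additions -- no multiplication at all.

        def difference_engine(initial_differences, steps):
            """initial_differences = [p(0), d1, d2, ..., dn], where dk is
            the kth finite difference at 0. Yields p(0), p(1), p(2), ..."""
            diffs = list(initial_differences)
            for _ in range(steps):
                yield diffs[0]
                # Add each difference into the one above it --
                # one turn of the crank per output value.
                for k in range(len(diffs) - 1):
                    diffs[k] += diffs[k + 1]

        # Example: p(x) = x^2 + x + 41, a polynomial Babbage is said to
        # have used in demonstrations. p(0) = 41, and the first and
        # second differences at 0 are both 2.
        print(list(difference_engine([41, 2, 2], 6)))
        # [41, 43, 47, 53, 61, 71]
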

  • Re:I plead ignorance (Score:2, Interesting)

    by CmdrPinkTaco ( 63423 ) <<emericle> <at> <chubberware.com>> on Wednesday October 17, 2001 @03:00PM (#2442543) Homepage
    I guess part of this argument goes back to the programmer-versus-computer-scientist debate. Programmers are concerned with the task at hand, while computer scientists are more concerned with the underlying theory, the how and the why.

    Personally, I find it difficult not to be exposed to some of the history of computer science in my studies, especially in the areas of mathematics. I can't imagine things like crypto existing without Fermat's Little Theorem (a quick sketch follows this comment), or geometry without Euclid's identities and his five postulates. You get the idea.

    All of these have applications to computer science, but not programming. It all depends on what your area of focus is.
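    As promised above, a minimal sketch of Fermat's Little Theorem at work in Python (the function name and numbers are illustrative, not from the original post). The theorem says that if p is prime and p does not divide a, then a^(p-1) ≡ 1 (mod p); running the check in reverse gives the classic Fermat primality test that underpins a lot of practical crypto.

        def fermat_test(p: int, a: int = 2) -> bool:
            """Fermat test to base a: primes p that don't divide a always
            pass; most composites fail, though rare Carmichael numbers
            (e.g. 561) fool every base coprime to them."""
            return pow(a, p - 1, p) == 1  # fast modular exponentiation

        print(fermat_test(101))  # True: 101 is prime
        print(fermat_test(100))  # False: 100 is composite
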
  • Re:Yeah, it's sad (Score:2, Interesting)

    by renehollan ( 138013 ) <rhollan@@@clearwire...net> on Wednesday October 17, 2001 @03:05PM (#2442577) Homepage Journal
    Well, let's see...

    First, rely on yourself. That is, get some good theoretical as well as practical books. Knuth's The Art of Computer Programming comes to mind. Yes, it is dated, but if you have a math bent you will enjoy Volume I. O'Reilly offers lots of practical books on running this and administering that.

    But I suggested finding mentors, so let's get to that. I suppose that people like me are the "first generation" of modern personal computer "old farts" -- we didn't really have "old hand" mentors ourselves who were computer people: they were chemists, engineers, physicists, etc., who used computers and learned (i.e., hacked) out of sheer necessity. What we had in common with them was either a need or a desire to hack. The point is that "our" mentors were a lot different than "your" mentors would be.

    That said, you should probably start with your peer group -- other young hackers. Find out where they congregate and whom they use for mentors. Sometimes us "old guys" will show up at a Linux Users Group meeting, or even a 2600 meeting. But, for me at least, non-hacker life (I have a family, own a home, etc.) makes that a bit difficult schedule-wise. So, sadly, we're probably hard to find -- you'll probably have to pool your networking resources (which is why I suggested checking with your peer group first). Of course, don't overlook places like Slashdot: you'll probably find an acceptable concentration of us here.

    Which brings me to the flip side of the problem... do we, as "old fart" hackers, mentor enough? Perhaps we should make an effort to show up at local meetings, or offer to help with computer science classes in schools -- that is, make ourselves more accessible.
