Babbage, A Look Back 261
A reader writes: "System Toolbox just started a new computer history section in an effort to get us geeks in touch with our 'roots.' The current article in the monthly column focuses on Charles Babbage. The editor and author hope to raise awareness of our past so that scenes like this won't continue to take place. A big hill to climb, but worth the effort."
I plead ignorance (Score:2, Interesting)
I mean, I'm not l33t or anything; I'm just someone who knows how to fix computers... and would it help me or affect my everyday life if I knew?
jeez (Score:2, Interesting)
This is happening all around us, and not only in computer history. How many kids care about history at all? How many kids know anything about the First World War, Newton, or the old philosophers like Aristotle?
I must say this does worry me...
Re:Graduates (Score:4, Interesting)
As far as the history goes, though, you are supposed to be interested in computing and to do a lot of background reading, so it could be argued that you should have built up some knowledge by yourself...
Re:Isaac Newton or Cave Man (Score:2, Interesting)
Re:Graduates (Score:4, Interesting)
I studied a combined degree of biology and computer science, and so I used to take courses from both the biology and the CS schools.
What struck me then was what a bunch of dunderheads the computer scientists were. Sure, they knew the finer points of Unix better than I ever will, but if you asked them to write an essay on the importance of computers to society, for instance, they could hardly string two words together - an average biology student could have done a far better job of it. Frankly, most of their essays were embarrassing in their childish views, ignorance, poor grammar and spelling.
And my point is? Well, at least in my experience, many people who are hardcore computer enthusiasts have a far more myopic view of the world than people from other subject areas. They are socially inept and interested in very little else but computers, and even then in very narrow fields of computing rather than the bigger picture. I don't know if it has changed, but when I graduated, many big employers complained that computer graduates often lacked the most basic skills. Is it like this in the USA too?
And another thing... (Score:4, Interesting)
For some reason he didn't publish his results. Some believe that he was told not to by the British government, so that it could use his discovery during the Crimean War. Babbage's work on this subject was discovered in his notebooks after his death.
One 2600 meeting does not a sample make. (Score:4, Interesting)
Like any culture, our culture needs to be taught! Only so many can have had first-hand experience, and there are fewer of us each day. Yet, each day, there are more people just coming into the field who need to be taught. If you find such a teacherless group of people interested in computers, you should take it upon yourself to teach them who we are.
Show people the first computer you ever programmed [kim-1.com]. Show them the games you played and wrote [dadgum.com]. Show them how to say "Hello World!" directly with a Turing Machine or in Java and everything between [roesler-ac.de].
Tell them about Norbert Wiener and André-Marie Ampère. Warren McCulloch, J.C.R. Licklider, John von Neumann and Vannevar Bush. Alan Turing, Claude Shannon and David Levy (yes, Ken Thompson too, and Belle). Scott Adams (all three) and Stanislaw Lem. Joseph Weizenbaum and Eliza, Alexander Bain and Donald Hebb. Nolan Bushnell, Ted Dabney and Larry Bryan. Alan Kay and Steve Russell. David Gottlieb, Joel Hochberg and Al Alcorn. Thomas Hobbes and Leviathan. Charles Darwin, Erasmus Darwin and Thomas Huxley. Aristotle and Lucretius. Gottfried Wilhelm Leibniz, Charles Babbage and Blaise Pascal. B. F. Skinner and Wilhelm Wundt. Robert Tinney and Peter Max. J. Presper Eckert and John Mauchly. John McCarthy and Marvin Minsky. Doug Lenat, Push Singh and myself.
We will always need more teachers who know how to both show and to tell!
CS and History... (Score:5, Interesting)
I interview a lot of co-op students for job placements at the company I work for now, and did the same for a large company in the past. Sometimes I get a really cocky student who comes in with a smug attitude that he knows it all.
Sure enough, he can answer the technical questions flawlessly, just as if he had read them from a textbook. He can show ingenuity in coming up with solutions on the fly as well... And usually, when he gets that look in his eye - "I know you want to hire me. Make me a REALLY good offer, and I might consider working for you." - I ask the killer question:
"Who is Charles Babbage?"
The blank look on their faces is priceless. It's a simple curveball. I've received answers ranging from "I'm not sure - wasn't he featured in PC Magazine last month?" to "Oh - he's the founder of IBM" and "I... I... don't know..."
I then answer the question with a short history lesson. Of course, they often recall it then - but they hadn't thought it was important.
I'm amazed at how much computing history has been left out of introductory courses in high school. There was an incredible amount of effort and ingenuity required to get us to where we are today: information available within seconds, toys to entertain, tools to teach and make life easier (mine is easier now because of them), communication barriers broken down, etc... It's caused other problems too, but man - what doesn't? I'll take the benefits over the problems any day.
Hanging up in my office is a picture of Charles Babbage, and one of Ada.
"Who is Grace Hopper?" is my backup question.
Hehehehe...
It's a matter of size. (Score:3, Interesting)
I remember reading, long ago, about organic molecules being able to "switch" between two polarized states under the influence of an external electric field. This was supposed to be the future of nanoscale memory registers...
It's a matter of size.
Electrons are big, light, fuzzy things, and their electric field extends a long way. When you want to make circuitry REALLY small, you don't want to use that for anything but holding your pieces together, because your gates and wiring would have to be very large to avoid crosstalk.
But atoms are effectively smaller than electrons. The nuclei are MUCH smaller, and they suck their electrons down, making most of the fields cancel out after a very short distance. And they're heavier, so they don't shake and flit around as much.
You can send a bit, or several bits, by sliding or rotating a rod just as you can by shoving electrons into and out of a wire (even a single-molecule wire), and the signal stays IN the rod except where you WANT it to come out. At atomic scales, moving a rod is nearly as fast as moving an electron (and comparable to moving a hole), so speed isn't a problem.
As for amplification, there's not much that can beat an "interposer". That's a piece of matter that gets between two moving parts and transfers a shove from the one supplying motion "power" to the one carrying a signal away as motion. Expend a little power to put the interposer into position, and it will transfer shoves (with no further input) until you pull it away again.
Re:Isaac Newton or Cave Man (Score:3, Interesting)
Where does one draw the line between useful information and cool things to talk about at a party?
Knowledge does not have to be either useful or cool in order to be valuable.
One common approach is that of Cardinal Newman in the Idea of a University [newmanreader.org]: Knowledge is capable of being its own end. Such is the constitution of the human mind, that any kind of knowledge, if it be really such, is its own reward.
The other common approach is to follow Socrates' [ship.edu] dictum that "the unexamined life is not worth living", which he derived from his belief that ignorance causes evil.
These approaches still leave unanswered the question of where you draw the line between learning and other activities, knowledge being infinite and time being short.
I submit that there is no line. Learning includes close observation of the things around you. In this way you integrate a love of learning with everyday life and test the ideas acquired in solitary study.
Despite the fact that most great scientists have been more motivated by the love of learning than anything else, I've found that people who insist that knowledge must have a utilitarian purpose cannot be convinced otherwise.
Yeah, it's sad (Score:4, Interesting)
Of course, that was way back in the 60's, er, 1960's. I actually got to work with a computer in 1973 (an HP2000 timesharing monstrosity running BASIC, connected to dial-up teletypes -- we got a DECwriter the next year; whee, 300 bps!) and experienced punched cards when I started my Comp. Sci. degree program in 1979 -- we had DECwriters there too, and a few CRT terminals (at the end of current-loop lines at, you guessed it, 300 bps), but coursework had to be done via punched cards, submitted to the "express" window. You only had to wait 15-30 minutes for your job to get turned around.
I remember those days quite well. Today, I sometimes interview recent grads for software design jobs. One standard problem I pose is: "Write a routine, in any programming language of your choice (I've probably seen it), that sorts an array of things according to some ordering scheme. I don't care about efficiency, but I expect the following: (a) that it is correct, and (b) that you can explain the complexity of the algorithm in big-O notation." Of course, I expect a bubble sort of integers. One smartass did a Shell sort and got it right. But over 90% of the candidates fail this basic test. That's sad.
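For reference, the sort of answer that interview question is fishing for - a bubble sort over integers, with its complexity stated - might look like this minimal sketch in Python (language and names are my choice, not the interviewer's):

```python
def bubble_sort(items):
    """Sort a list of comparable items in place and return it.

    Worst- and average-case time is O(n^2) comparisons; the best case
    (an already-sorted list) is O(n) thanks to the early-exit flag.
    """
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # After pass i, the largest i+1 elements have bubbled to the end,
        # so the inner loop can stop earlier each time.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:
            break  # no swaps in a full pass means the list is sorted
    return items
```

A candidate who can write this and then say "quadratic in the worst case, linear when the input is already sorted" passes the test.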
The scary part is that peer reviewers think this is being "too hard" on a candidate. Or worse, I remember one kid who added "servlets" to an FTP server -- basically he provided an API for command plug-ins that executed server-side and could provide massaged data back to the client. So, for example, you could do back-end SQL queries via FTP. Obviously an exercise in program extensibility. Trouble is, he didn't even get an offer -- a coworker insisted that he must have been "bullshitting" because "everyone knows that servlets are a Java thing and not an FTP thing". My protests got voted down. So, technical ignorance has permeated even the supposed technical interview -- on the part of the interviewers! Shortly after losing that battle, I left the employ of that company.
I remember building memory boards for an old SWTPC computer (none of us geek kids could afford to buy them assembled). I remember lamenting when the IBM PC booted into ROM Basic and displayed "Ready" -- who's gonna know how the computer gets to that point? that was what, almost 20 years ago?
There is hope. I often see young (say, under 20 years old) posters here who do have a clue. I envy that they have far better tools than I did as a kid, but note that these same "better tools" make it no big deal to be a script kiddie techno-vandal. Compare the history of mass computing to the history of the car. Anyone can drive a car today. Few know how to fix one or what makes it run. But even with the ease of "turn the key, push on the gas, and remember to steer" driving, some still hack their cars. I take that as a sign that hacking doesn't die -- the computer hacker was a rare breed in the 1970s and still is, despite the fact that there are a lot of 'leet wannabees around without a clue.
My advice to the young hacker is to seek out other hackers, young and old (say, 40 and up), and avoid the wannabees. Of course, this implies a willingness and responsibility on the part of us "old guys" to mentor -- sure, you don't need a fast sort or a balanced tree structure when you've got a 1 GHz processor, but imagine how much faster and more scalable your code will be if you use one. In my day, RAM was fast and disk was S L O W, so you carefully designed your algorithms to minimize disk access. These days, you want stuff to stay locked in processor cache, 'cause RAM is slow by comparison. Look at other hobbies, like ham radio, and see how "the torch gets passed on" there. We should strive for a similar effect.
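The point about better algorithms paying off even on fast hardware can be made concrete with a toy comparison (my own illustrative sketch, not from the post): membership testing in a sorted array by linear scan versus binary search - O(n) versus O(log n) comparisons, a gap that only grows as the data does.

```python
import bisect

def linear_contains(sorted_items, target):
    # O(n): may examine every element before answering.
    for x in sorted_items:
        if x == target:
            return True
        if x > target:  # list is sorted, so we can stop early
            return False
    return False

def binary_contains(sorted_items, target):
    # O(log n): halve the search range each step, using the
    # standard-library bisect module.
    i = bisect.bisect_left(sorted_items, target)
    return i < len(sorted_items) and sorted_items[i] == target
```

On a million-element list, the linear version does up to a million comparisons where the binary version does about twenty - the kind of difference a mentor can demonstrate in five minutes.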
Re:History is bunk (Score:3, Interesting)
I am of the opposite opinion - that you can't know too much about the history of computing and computer science if you want to be a better programmer. Want to know why adding programmers to a late software project makes it later? Read Brooks. Are there some problems that are not solvable by computer? Study Turing. How do entropy and communication work? Shannon tells all. How and why does LISP do what it does so well? Check out the history of AI research. Does Babbage's engine really work? Hell yes - there's a working model at the Science Museum in London. Seeing it in action was practically a religious experience for me. Why does Unix use pipes? Check out the history of it at Bell Labs online.
You could say that all of these examples could just as well be studied theoretically. But then you'd miss the fun parts - like the story behind IBM's development of OS/360 and how some companies still haven't learned those lessons.
History is full of these amazing guys (and gals - hello Grace) that met and solved all kinds of problems, often in surprising and non-intuitive ways. Many of the anecdotes and broader perspectives do help you with programming if only to teach you something from history. Your Babbage example is a good one.
Re:One 2600 meeting does not a sample make. (Score:3, Interesting)
We have one here in Antofagasta, Chile, where I live... actually, it's just my apartment, but I can say from direct experience that the new generation of hackers loves to hang out and learn with the older generation... they just love to see my KIM-1 and TRS-80 Model I... my Cray 1 S/1000 supercomputer memory card... my collection of BYTE, 80 Micro, & Kilobaud... my books (especially my books, because there is no such thing as a free public library in Chile)... to hear my hacking war stories and, most interestingly, to work on new development projects with me.
Make yourself into a HackMaster.
Teach!
Babbage and the theory of computation (Score:3, Interesting)
However, if you examine his writings, you'll realize that he had lofty mathematical goals for the Analytical Engine. Specifically, he understood it to be an exercise in defining what moderns might call 'effective computability'. There's a striking similarity between his pursuit of this goal and the same explorations by Hilbert, Turing, Church, etc., in the twentieth century. At least, that's what I argued in my paper.
I recommend his notebooks and the few existing biographies to those with a taste for the history of science and the biography of scientists.
PS: One other conclusion I drew from my research was that the role of Ada Lovelace in developing the abstract principles of computer science has been highly overstated. She was probably mathematically adept. But she also led an outrageous (for the time) life. The illegitimate, unmarried, and independently wealthy daughter of a Romantic poet. I suspect Mr. Babbage (Sir Charles?) entertained and amused her, both with his glum disposition and his willingness to engage her intellectually. And we owe her a debt for her record of and commentary on his work. But I seriously doubt she originated the grand ideas commonly attributed to her.
Ahead of his time indeed... (Score:3, Interesting)
The Difference Engine number two was ahead of its time, indeed. In a head-to-head polynomial calculation test with a Windows-based Canon BJ Notebook BN22 (with built-in ink-jet printer), the mechanical Difference Engine initially beat the pants off the laptop, but was then overtaken. Not bad for a technology that was conceived around 160 years before its competitor.
From this account [sciencemuseum.org.uk], we find the following description from a witness:
"...With the Windows overhead slugging the Canon, the additional time taken for 31-digit extended precision arithmetic, and the printer buffer soaking up results before making them visible to the race referees, the Babbage engine produced nine 31-digit results before the Canon bubble jet printer blasted off the starting blocks. It then spewed out the first 50 results in little more than the time for the Babbage machine to produce one more result. The hare finally overtook the tortoise."
Wow! Not bad for a version 2.0 product. Consider the advancements it would have made had Babbage been successful all those years before.
Re:I plead ignorance (Score:2, Interesting)
Personally, I find it difficult not to be exposed to some of the history of computer science in my studies, especially in the areas of mathematics. I can't imagine things like crypto existing without Fermat's Little Theorem, or geometry without Euclidean identities and Euclid's five postulates. You get the idea.
All of these have applications to computer science, though not necessarily to programming. It all depends on what your area of focus is.
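The crypto connection is easy to show concretely. Fermat's Little Theorem says that if p is prime and a is not divisible by p, then a^(p-1) ≡ 1 (mod p). That theorem underpins, among other things, the classic Fermat probabilistic primality test, sketched here in Python (a minimal illustration of the idea, not a production-grade test - Carmichael numbers can fool it, which is why real libraries use Miller-Rabin):

```python
import random

def fermat_probably_prime(n, rounds=20):
    """Fermat primality test.

    If n is prime, Fermat's Little Theorem guarantees
    a^(n-1) ≡ 1 (mod n) for every a in 2..n-2. So if we find an a
    where that fails, n is definitely composite; if n survives many
    random a's, it is probably prime (Carmichael numbers are the
    rare composites that can slip through).
    """
    if n < 4:
        return n in (2, 3)
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        # pow with three arguments does fast modular exponentiation.
        if pow(a, n - 1, n) != 1:
            return False  # a is a Fermat witness: n is composite
    return True  # probably prime
```

Tests like this (and their stronger descendants) are exactly how the huge primes behind RSA key generation are found.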
Re:Yeah, it's sad (Score:2, Interesting)
First, rely on yourself. That is, get some good theoretical as well as practical books. Knuth's The Art of Computer Programming comes to mind. Yes, it is dated, but if you have a math bent, you will enjoy Volume I. O'Reilly offers lots of practical books on running this and administering that.
But I suggested finding mentors, so let's get to that. I suppose people like me are the "first generation" of modern personal-computer "old farts" -- we didn't really have "old hand" mentors who were computer people themselves: they were chemists, engineers, physicists, etc. who used computers and learned (i.e., hacked) out of sheer necessity. What we had in common with them was either a need or a desire to hack. The point is that "our" mentors were a lot different than "your" mentors would be.
That said, you should probably start with your peer group -- other young hackers. Find out where they congregate and whom they use for mentors. Sometimes us "old guys" will show up at a Linux Users Group meeting, or even a 2600 meeting. But, for me at least, non-hacker life (I have a family, own a home, etc.) makes that a bit difficult schedule-wise. So, sadly, we're probably hard to find -- you'll probably have to pool your networking resources (which is why I suggested checking with your peer group first). Of course, don't overlook places like Slashdot: you'll probably find an acceptable concentration of us here.
Which brings me to the flip side of the problem... do we, as "old fart" hackers, mentor enough? Perhaps we should make an effort to show up at local meetings, or offer to help with computer science classes in schools -- that is, make ourselves more accessible.