
Postmodern Computer Science

gnat writes "Two New Zealand computer scientists have a paper accepted for OOPSLA called Notes on Postmodern Programming, which identifies shortcomings in traditional views of computer science. With a section on the difference between "The Matrix" and the net, a bulleted list of new approaches called "We're All Devo", and a section called "Messy is Good" consisting of nothing but a scan of a hand-drawn diagram, this is not your father's computer science paper. It's thought-provoking stuff, though. And you know they did their homework--they cite Larry Wall's Postmodern Perl talk."
  • by Zipster ( 555990 ) on Tuesday October 22, 2002 @09:55PM (#4509962)
    Personally, I'm waiting for the cubist computer science movement...

  • ... and while we're at it, postmodern project management. And marketing. And clients.

    Great topic, and an important one as the field evolves. But much of commercial programming has become the equivalent of building carburetors on an auto assembly line (or, perhaps in the case of OOP, putting carburetors in engines).

    Any thoughts on how a nascent postmodern programmer can spark revolution up the management chain?

  • what? (Score:4, Funny)

    by 3.2.3 ( 541843 ) on Tuesday October 22, 2002 @09:56PM (#4509971)
    no chapter on the death of the programmer?
    • by JonTurner ( 178845 ) on Tuesday October 22, 2002 @11:36PM (#4510435) Journal
      "no chapter on the death of the programmer?"
      It'll be written right after they write the chapter about the death of their web server!

      It's midnight on the east coast US, so I suppose that's mid-day in New Zealand. And right now, spring is dawning and the sun is shining down on the beaches. Yet thanks to us, some poor NZ slob is stuck in the mic.vuw.ac data center trying to get his poor underpowered web server back online. You can bet your life he's cursing the day CmdrTaco was born.
      This moment brought to you by Slashdot.

  • by ActiveSX ( 301342 ) on Tuesday October 22, 2002 @09:57PM (#4509978) Homepage
    I started reading the first page, then realized I still had to read 2 more pages to get to page 1. Damn funky Postscript.
  • by nugneant ( 553683 ) on Tuesday October 22, 2002 @10:01PM (#4510002) Journal
    If anyone is interested in an extension of this theory (which begins by stating that humans are destined to give birth to computers as the next sentient race, and segues into an attack on the baby boomer culture), I do encourage them to check out Boomeritis [amazon.com]. The theories within it are rather intriguing, though the layout / writing style is nowhere near as 'hyperactive' as this article.
  • by 3.2.3 ( 541843 ) on Tuesday October 22, 2002 @10:02PM (#4510005)
    ...the postmodernism generator [elsewhere.org]...
  • huh (Score:2, Funny)

    From the pdf: The ultimate goal of all computer science is the program... Let us desire, conceive, and create the program of the future together... it will ... one day rise towards the heavens from the hands of a million workers as the crystalline symbol of a new and coming faith.

    Whoa. Wrong book.
  • by Billly Gates ( 198444 ) on Tuesday October 22, 2002 @10:05PM (#4510020) Journal
    a.) We are all Devo

  • by lewko ( 195646 ) on Tuesday October 22, 2002 @10:06PM (#4510026) Homepage
    Post modernism? Computing?
    Is it just me, or does this sound like an Arts Faculty which is tired of seeing all the university funding go to those pesky IT faculties and wants to bring itself forward into the nineteenth century?

    I think therefore I... [General Protection Fault reading philosophy]
    • by scott1853 ( 194884 ) on Tuesday October 22, 2002 @11:45PM (#4510466)
      Yeah, same thing here. For those that don't want to bother reading it, I'll offer my own interpretation of what I could stand to read.

      Page 1. The cult style writing of the first page was a little over the top in trying to rally programmers to unite with computer scientists that are already programmers, but are isolated from computer science and vice versa (I didn't understand it either).

      Page 2. Then the second page starts out by saying that nobody seems to know what postmodern computer science actually is, but the authors do, but it takes too much room to explain it, so they won't; instead they'll just reference a bunch of other works that might explain it, because they don't really know either, they're just trying to make the paper look good enough for a decent grade. In the third paragraph they also imply they are programming gurus and that you may get some recognition by simply noticing similarities between what you, as a real programmer, already know, and what they are copying from other people's books.

      Page 3. The third page was some obvious examples showing that programming is not the root of all evil and that CEOs are. Then there was a confusing paragraph at the end stating that we should respect the limitations of software so we can be happy little zombies.

      Page 4. Somehow the term "program" has become a paradox in several ways: it's big and small, and it has had a dozen processes used to create it, yet it was still somehow ignored, by someone. Then they define a couple of very common words like component and system which are obvious even if you're a 90 year old WebTV user. But they still don't define what they consider postmodern CS to be, nor do they state what their perception of computer science and programming is.

      Page 5. You should be able to unplug your computer, but then you'll miss the important messages from your IM buddies. And when different systems communicate, they don't have any common protocol between them, so apparently they have found a way to magically turn TCP/IP packets into NetBEUI packets at some magical location in the CAT5.

      Page 6. Some crap about a cow, and then they cite some terms that are completely irrelevant, such as an implementation of a cow.

      Page 7. My head is starting to hurt at this point. They're discussing not being able to have complete requirements and have them also be consistent. I don't know what they're supposed to be consistent with; maybe they just grabbed a random 8+ character word to toss in there. Postmodern computer science also involves lying (yes they specifically said that word). I don't think I'd consider anything a science if one of the major aspects of it is telling lies. There's a couple more paragraphs of 10+ character words randomly selected from the dictionary and strung together.

      Page 8. Different websites are... different. Visual Basic is a low culture language ;).

      Page 9. They start to define postmodernism as being pretty and having nothing to do with the actual functionality. Think of it as a Flash intro to a website I guess.

      I can't read anymore, it's too painful. Does anyone know the grade they got on this? If they got a good grade, was it because the professor based the grades on the average number of letters per word? Or did he just say "I don't feel like reading this shit" and give them a C?
    • Re:Arts funding (Score:5, Insightful)

      by Tablizer ( 95088 ) on Wednesday October 23, 2002 @03:02AM (#4511030) Journal
      Is it just me, or does this sound like an Arts Faculty which is tired of seeing all the university funding go to those pesky IT faculties and wants to bring itself forward into the nineteenth century?

      Actually, software engineering does need more *psychology* related input IMO. People can fight over languages and paradigms and design methodologies for years without ever agreeing. It is more like politics than, say, math.

      In my observation, different software engineering approaches keep trying to model the practitioner's head rather than absolute external principles or the external world. (I am not saying that the latter is necessarily better.) For example, nobody has shown that OOP is objectively better than procedural/relational. Debates over which is better often expose different *fundamental* perceptions of reality and the change-patterns of reality.

      In some of my debates with OO fans, if we ever figure out exactly where and why we differ, it tends to be things such as the likelihood of certain events (change requests) happening. All the "training" in the world is not going to convince somebody that change X is more likely than change Y if their personal experience and observations tell them otherwise. OOP seems optimized for handling modification patterns that don't fit reality as I perceive it.

      Paradigms and languages are a lot like ink-blot tests: they reveal a lot about our internal cognitive thinking patterns and world view assumptions.

      Thus, software engineering is more related to psychology than math (barring some breakthrough mathematical proof that X is objectively and practically better than Y).

      So, let the psychology department play with computer science for a while, not just the art department. The math-heads have had their turn for long enough.
  • So.... (Score:2, Insightful)

    by mindstrm ( 20013 )
    Programming is not computer science....
    what am I missing?
    • The head of the APCS program at my high school (one of the best APCS departments in the nation) used to say exactly that. Saying CS is programming is like saying physics is learning to throw a ball and hit a target. Theory and practice are not the same, last I checked this was not a new concept.
  • Slashdotted Already (Score:4, Informative)

    by Anarchos ( 122228 ) on Tuesday October 22, 2002 @10:09PM (#4510039) Homepage
    Here's [216.239.37.100] google's html version.
  • highly appropriate (Score:3, Interesting)

    by theBrownfury ( 570265 ) on Tuesday October 22, 2002 @10:15PM (#4510068)
    This paper just seems very timely. As someone who is just about finished undergoing the quintessential undergrad experience in CS I think this paper hits a lot of nails square on their heads. Too many schools are hung up on the formal side of things without ever tying it back to the actual root of everything which is programming and this cannot be denied. And the rest of the schools are too busy teaching just programming to stop and discuss the formality of the process.

    Anyone out there find a school which strikes this balance in the undergrad??
    • what the...? (Score:5, Insightful)

      by devphil ( 51341 ) on Tuesday October 22, 2002 @11:30PM (#4510413) Homepage
      the actual root of everything which is programming and this cannot be denied

      ...fuck are you smoking?

      Programming is not the goal, nor the root, of computer science. Programming is the means, not the ends. Or, as Dijkstra (RIP) put it, "Computer Science is no more about computers than astronomy is about telescopes."

      Programming is fun, and it's certainly the part of computer science which I tend to look forward to the most when starting a project, but your statement is like saying, "the actual root of architecture is troweling cement onto bricks."

      • by SuperKendall ( 25149 ) on Wednesday October 23, 2002 @01:25AM (#4510740)
        It's true enough that "P"rogramming is not the root of things. Instead I think it is the heart, which is not quite what either of you are saying.

        You say that "P"rogramming is the means, but then give a quote about "C"omputers which is not the same thing.

        "P"rogramming is obviosuly much more than just the means. The actual running "P"rogram of just about any design can have so many facets of care and life put into things - the ease with which the "P"rogram might be built. The configurability of the "P"rogram. The API which one might access the "P"rogram through other "P"rograms. The interface that leans the user to interact with the "P"rogram are all entireley different than the abstract thoughts that gave birth to the "P"rogram, and breathe soul, if you will, into what once was abstract and souless, and are all aspects of how successful we consider the program regardless of how strict it adheres to original design, or even intent.

        To argue this point further, I'll use as a basis the section of the paper where they speak of the many approaches that have been taken to working with computers. Software Engineering. Software Architecture. Computer Science.

        All of these are similar in that they may produce "P"rograms, but the commonality is that all of them require "P"rograms in order to further themselves. Any of these approaches to software alone, without "P"rograms, leads to the approach becoming "dead", in the way that Latin is a "dead" language.

        I think what the original poster is really saying (and what I agree with) is that Computer Science in some places is striving to separate itself from the "P"rogram, and in doing so also harms the student's ability to study or engage in Architecture or Engineering or whatever other approaches can be taken with software. To lean on the paper once more, good programming education is like bad art - you know it when you see it. I'm sure there are computer science programs doing a great job even now (I know Rice did an excellent job with me years ago), but we (and here I speak of any means of learning, college, self-taught, or otherwise) need to be careful to provide both the heart and the brain when bringing life to an education in software development.
        • by Dun Malg ( 230075 ) on Wednesday October 23, 2002 @01:41AM (#4510809) Homepage
          "P"rogramming is obviosuly much more than just the means. The actual running "P"rogram of just about any design can have so many facets of care and life put into things - the ease with which the "P"rogram might be built. The configurability of the "P"rogram. The API which one might access the "P"rogram through other "P"rograms. The interface that leans the user to interact with the "P"rogram are all entireley different than the abstract thoughts that gave birth to the "P"rogram, and breathe soul, if you will, into what once was abstract and souless, and are all aspects of how successful we consider the program regardless of how strict it adheres to original design, or even intent.

          I can't "P"ut my finger on it, but something about your "P"ersistent "P"enchant for "P"utting the letter "P" in quotes "P"ractically "P"uts my "P"oor eyeballs into a state of "P"ermanent "P"erplexment.
      • Re:what the...? (Score:2, Insightful)

        by bshanks ( 520250 )
        >> the actual root of everything which is programming and this cannot be denied

        > Programming is not the goal, nor the root, of computer science. Programming is the means, not the ends.

        I think there is a legitimate disagreement here (the authors of the papers know that there is another point of view here). I don't know on which side I stand. I tend to think in terms of programs myself, but I can't decide if that is a bad thing or a good one.

        As for "the actual root of architecture is trowling cement onto bricks", in some sense it is. But more so for programming; you have to look at context. Some reasons why "architecture" and "cement laying" are less distinct in programming:

        1) the "bricklayers" and the "architects" of programming must have similar training and are often considered to be in the same profession

        2) programs are somewhat malleable and so it is possible for the blueprints to change in the middle or to become muddled with the program itself
  • Postmodernism (Score:4, Interesting)

    by Find love Online ( 619756 ) on Tuesday October 22, 2002 @10:15PM (#4510072) Homepage
    I think postmodernism is by far one of the most interesting ideas, and in a lot of ways it's like things computer geeks like - you know, recursion and all that (read Godel, Escher, Bach).

    You could say that the basis of post-modernism is "self-reference and irreverence". Basically looking inward, and realizing the absurdity of it. Obviously it has a lot of appeal to a cynical bastard such as myself :P

    I mean, the idea on its face is absurd. How can something be "post-modern"? Wouldn't the newly post-modern become modern, and the old modern simply old? (It's a bit more complex than this, as Modernism was an attempt to break from "classicism" in the middle of the century. To build great new things. Post-modernism basically gives up on the great new things and says "fuck it".)

    Also the site seems to be slashdotted.
    • by Black Parrot ( 19622 ) on Tuesday October 22, 2002 @10:53PM (#4510256)


      > I mean, the idea on its face is absurd. How can something be "post-modern"? Wouldn't the newly post-modern become modern, and the old modern simply old?

      Postmodernism is already déclassé. (I'm neo-futuristic, myself.)

      > (It's a bit more complex than this, as Modernism was an attempt to break from "classicism" in the middle of the century. To build great new things. Post-modernism basically gives up on the great new things and says "fuck it".)

      I think Postmodernism was basically a result of the fact that everyone was out of ideas for interpreting Homer and Hemingway, and shortly after running out of new ideas they got tired of writing their (n+1)th essay interpreting them as "man's inhumanity to man" or whatever, so they decided to kick down the whole edifice of bullshit that they had built up over the centuries.

      But don't let my cynicism fool you: though I called it an "edifice of bullshit", I don't exactly find Postmodernism more edifying. It's more like a three year old throwing his blocks around the room because he got frustrated with his failed attempts to stack them higher.

  • by PissingInTheWind ( 573929 ) on Tuesday October 22, 2002 @10:16PM (#4510079)
    And you know they did their homework--they cite Larry Wall's Postmodern Perl talk.

    Ugh... that was far from being the best thing (or even one of the best things) Larry ever wrote.

    The ideas are interesting by themselves; linking to others' work isn't much of a validation in itself.
    • Larry Wall writes:

      When I started designing Perl, I explicitly set out to deconstruct all the computer languages I knew and recombine or reconstruct them in a different way...

      Wall does not use the word deconstruct correctly. To deconstruct does not mean to take apart, as Wall's usage suggests. Rather, deconstruction is about finding the assumed hierarchy of an opposition and showing that the component that seems to be higher/superior can't be defined without reference to the lower/inferior component.

      Of course, you could always say that Wall has simply recontextualized deconstruction, but then you'd be one of those intellectually feeble postmodernists.

      • by EvlG ( 24576 )
        I think Wall's usage of the word deconstruct is consistent with the definition you provide, given his explicit mention of languages such as BASIC (viewed as the 'inferior' languages) and other 'superior' languages (like C and Python).

        Even saying Perl (a 'superior' language) is based upon the 'inferior' ones is evidence of this.
  • From the article:
    By the grace of Heaven and in rare moments of inspiration which transcend the will, computer science may unconsciously blossom from the labour of the hand, but a base in programming is essential to every computer scientist. It is there that the original source of creativity lies.

    Hmm, I would say yes. BUT: I would say that inspiration/creativity is paramount in computer science, in such a way that it allows us to forge novel solutions for hard/near-impossible problems.

    Plus, who here can say that they solved a tricky logic problem in their sleep rather than from consciously fleshing it out?
    -Cyc

  • by abhinavnath ( 157483 ) on Tuesday October 22, 2002 @10:32PM (#4510151)
    Different computer programs are different.

    We cannot use one system of development to write all the different types of program.

    Therefore we need to use a flexible language that does not have a rigid structural or developmental style.

    That's it, we're done. We're just going to sit here twiddling our thumbs.

    Oh cool! I can scan in a page of doodling and pass it off as a valuable insight into post-modernism. Only 15 more pages to go...

    That paper was a waste of time and bandwidth. Be grateful that it is slashdotted.
    • by jtdubs ( 61885 ) on Wednesday October 23, 2002 @01:09AM (#4510679)
      A flexible language? Without rigid structural or developmental style?

      It's a shame we don't have any languages like that right now.

      Someone, quick, go invent LISP...

      Justin Dubs
  • by Animats ( 122034 ) on Tuesday October 22, 2002 @10:50PM (#4510240) Homepage

    Near the end of this polemic comes the good part:

    The task is to instruct a computer to print a table of the first thousand prime numbers.

    To write this program, we first connected our computer to the Internet, downloaded some music from Napster, and then read our email. (You have to receive email to perform a workday [11]). We received 25 pieces of email of which 16 were advertisements for Internet pornography, administrivia, or invitations to invest in Nigerian currency trades. After dealing with this email, we typed "calculate prime numbers" into Google. This found several web sites regarding prime numbers, and some more pornography. After a while, we were interrupted, and so moved on to the prime number web sites. In particular, http://www.2357.a-tu.net [a-tu.net] includes the "ALGOMATH" C library for calculating prime numbers; another site included an EXCEL macro which was too complex to understand. Although we had not programmed in C for years, after downloading and compiling the library (by typing "make"), we noticed the documentation included the following program:

    • int *pointer , c=0;
      if((pointer = am_primes_array(4, 3)) == NULL)
          printf("not enough memory\n");
      while( *(pointer+c)){
          printf("%d\n",*(pointer+c));
          c++;
      }
      return;
    We cut and pasted this program into a file and compiled it several times, having to add a few extra lines (e.g. main () { ). Eventually we ran it, and indeed it appeared to generate three prime numbers larger than four. We edited the parameters to am_primes_array to (2,1000), and then ran the output through "wc -l" to check that it had printed 1000 numbers.

    Here we have completed what we announced at the beginning of this section, viz. "to describe in very great detail the composition process of such a [postmodern] program".

    Now that's what postmodern programming really is. (A non-postmodern, self-contained version of the same exercise is sketched below.)
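
    For comparison, here is a minimal, self-contained C sketch of the same task (print the first thousand primes) with no downloaded library involved. This is my own toy code, not anything from the paper, and the am_primes_array() call from the quoted fragment is nowhere in sight.

      #include <stdio.h>

      /* Trial-division primality test; plenty fast for the first 1000 primes. */
      static int is_prime(int n)
      {
          if (n < 2)
              return 0;
          for (int d = 2; d * d <= n; d++)
              if (n % d == 0)
                  return 0;
          return 1;
      }

      int main(void)
      {
          int found = 0;
          for (int n = 2; found < 1000; n++) {
              if (is_prime(n)) {
                  printf("%d\n", n);
                  found++;    /* pipe the output through "wc -l" and you get 1000 */
              }
          }
          return 0;
      }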

    • And that is why I am so damned scared of postmodern programming. Someone one day is going to be programming nuclear missile guidance systems like that, and then we'll be sorry....
  • And you know they did their homework--they cite Larry Wall's Postmodern Perl talk.

    And when you look at the list of 74 references...

  • by coupland ( 160334 ) <dchase@@@hotmail...com> on Tuesday October 22, 2002 @11:03PM (#4510307) Journal
    These guys may think they're clever and have published a paper that discredits all coders today. But have they weighed the consequences of their lack of faith? When they die they will go to Coder Heaven and be questioned by St. Carmack at the PERL-y Gates. Do they really think he'll be impressed by their rhetoric? Really, I'd like to be there when they're blinded by a lightmap on the road to Bumpmapicus...
  • Christ.... (Score:2, Insightful)

    this is dumb. It's going along the same lines as music: postpunk, postpostnupostpostpunk. Before you know it, we'll have post-post-nu-post-avant-garde-post-programming
  • by TekkonKinkreet ( 237518 ) on Tuesday October 22, 2002 @11:15PM (#4510363) Homepage
    The authors decline to define postmodernism, for reasons of space. While I respect their decision, here's some insight from Fredric Jameson, William A. Lane Professor of Comparative Literature and Director of the Graduate Program in Literature and the Center for Critical Theory at Duke University, perilously near to where I live:

    "Any sophisticated theory of the postmodern ought to bear something of the same relationship to Horkheimer and Adorno's old 'Culture Industry' concept as MTV or fractal ads bear to fifties television series."

    If you don't know what this means, it's because your brain evolved to reject drivel. To be perfectly honest, I hope this is a hoax. Wouldn't be the first time. [skepdic.com]

    But then, with postmodernism, you can't really tell the hoaxes from the honest nonsense. [miami.edu]

    Adam Gopnik of the New Yorker noted some time ago that the message of postmodern work is almost always trivial (like "violence is bad"), but couched in the most inscrutable and/or eye-catching terms (like "search for an interpretive skein within that overburdened word 'violence'" or "violence as style"). How about this one, from the paper: "Without a grand narrative, there will not be one common way to program, or even one common kind of interface between programs." More than one way to program? Sign me up for a grand narrative, post-haste!

    I thought Slashdot was immune to this kind of idiocy. (Well...no, I didn't, but I can dream, can't I?)

    • On "drivel." (Score:2, Interesting)

      Not that I can argue that any of the phrases or sentences in the link that you provided are clear and concise...

      But neither can you or the creators of the page in question honestly argue that the phrases or sentences are "drivel" when they have clearly been taken out of context in this fashion. Supply some context or be content to look like fools.

      Like it or not, 'postmodern' is the widely accepted name for the cold-war and media-essential era which falls after the 'modern' era of the World Wars. Simply tossing words like 'drivel' about and quoting long sentences out of context does not automatically render moot any argument that you disagree with, postmodern or otherwise.
      • Re:On "drivel." (Score:3, Interesting)

        by j_w_d ( 114171 )
        Actually, if you do a little research, it has been experimentally proven that "post-modern" language, if that is what you want to call it, and nonsense are indistinguishable to proponents of post-modernism. I kid you not. Some years ago a "paper" was generated using a computer programmed to plug in p-m catch phrases. The piece was published, and IIRC, was critically approved of among the sacrosanct priesthood of P-M. The last I heard there were still some proponents who were certain the "computer generated" aspect was a hoax.

        However, this being the case, it is certainly reasonable to argue that drivel and post-modernist language are indistinguishable to those of us who do not pretend to understand WTH post-modernists are purveying. Following that line further along, since neither advocates nor critics of P-M can distinguish between drivel and P-M, it appears that P-M and drivel are synonymous.

        Enjoy,
    • by Fnkmaster ( 89084 ) on Tuesday October 22, 2002 @11:55PM (#4510496)

      But then, with postmodernism, you can't really tell the hoaxes from the honest nonsense.


      Hmm, I actually think this is part of the point of postmodernism. Postmodernism goes beyond just recognizing that truth is inscrutable and rejecting absolutism of ethics, aesthetics, and knowledge, and embraces the subjectivity of everything. In fact, some postmodernists seem to think that knowledge and reality are DEFINED by language games, i.e. who spins the best bullshit (apparently this derives from Wittgenstein).


      So you see, the nonsense and the hoaxes aren't truly discernible to the postmodernist, and a true postmodernist would likely reject the very idea that a hoax is a meaningful concept. Anyway, I find it all to reek of bullshit after spending 4 years in college debating with my fellow students who majored in subjects like Social Studies (which included a heavy dose of postmodern theory) about whether these concepts were meaningful.


      My general conclusion is that concepts that do not lead us any closer to understanding or interacting with the world in a productive manner and that lead to liberal arts students becoming unshaven, unshowered nihilists are just as bad as things that lead computer science students to become unshaven, unshowered Counter Strike addicts or code monkeys.

    • by MisterSquid ( 231834 ) on Wednesday October 23, 2002 @12:01AM (#4510516)

      It's late, but I wanted to quickly challenge your cynical dismissal of postmodernism as a school of thought. But before doing so, I want to note that your skepticism is obviously well-informed. You probably deserve a reply more thoughtful than the one I can muster right now, but here I go anyway.

      You quote Jameson's line, Any sophisticated theory of the postmodern ought to bear something of the same relationship to Horkheimer and Adorno's old 'Culture Industry' concept as MTV or fractal ads bear to fifties television series.

      This is easy to understand for students of cultural theory. Basically, Adorno's criticism of the "Culture Industry" (also known as the Frankfurt school) was a Marxist critique of Hollywood (an oversimplification to be sure). That critique by today's standards is old-fashioned, but still holds truth for dyed-in-the-wool Marxists. (As a side note, Adorno and Horkheimer escaped/fled Nazi Germany and their entire view is largely shaped by interpreting American capitalism as a kind of fascism.)

      Jameson's own postmodern theory also has Marxist stripes. But in Jameson's view, our contemporary culture is infinitely more complex than the 1920's-era Hollywood that Adorno was writing about. As a result, a more complex form of critique is necessary.

      The whole thing can be symbolized thus:

      postmodernism : Frankfurt School :: MTV : 50's television

      In English, "postmodernism is to the Frankfurt school of cultural theory as MTV is to 50's television."

      (I'm too tired and lazy to hunt down the links that'll make this more than another rant, but you get the idea.)

      Postmodernism has its roots in art and cultural criticism. Expropriations of postmodernism by science, technology, and history end up overlooking the origins of this material. No, it's not science, though science sometimes makes reference to it. Postmodernism is a mode of understanding and it is a specialized discourse, one that's as difficult for non-specialists to understand as assembly language is for the average end-user.

      With all due respect

    • John Leo, in US News and World Report, wrote [earthlink.net] in an article about Postmodernism, "A professor once wrote this about Tonya Harding's attack on Nancy Kerrigan: 'This melodrama parsed the transgressive hybridity of unnarrativized representative bodies back into recognizable heterovisual codes.' Possible English translation: Maybe Tonya had Nancy's leg smashed because she was attracted to her. If so the media wouldn't tell.
      The professor was writing in 'pomobabble' - the jargon of postmodernism..."


      The Postmodernism Generator [monash.edu.au] Leo cites in the article (create your own Postmodern article!) has been moved [elsewhere.org].
  • !(Truth) == truth (Score:3, Insightful)

    by Ted_Green ( 205549 ) on Tuesday October 22, 2002 @11:20PM (#4510379)

    Frankly, the absence of a value system and a "Grand Narrative" (see "No Big Picture" 7) in approaching programming is a rather dangerous mindset and can seriously lead to sloppy programming. While I'll agree that it's nice to say in principle that C++ isn't better than C# which isn't better than Java which isn't better than Qbasic, and that there's no "wrong" way to write code, in practice I'd say it's far easier and more efficient to act as though there was a Grand Narrative, and that ASM is far better for writing faster base level routines than Pascal is.

    While I admit I have yet to read the whole article (I'll get to it) my first impression is that it succeeds at failing where so many other "Postmodern" calls have done before. Which is to say it inadvertently deconstructs itself (one contradicts oneself when saying "there is no right way to do things", as this in itself prescribes a "right" action [to treat all ways as equal])

    At a more fundamental level I have a hard time accepting "Postmodernism" next to "Programming" as the former is a system stating there is no such thing as a Truth statement and the latter, at its very core, is based on truth statements. (Yes, yes, I know there is a rather big difference between Truth and a truth, esp. when truth is meant as an on/off switch, but it still irks me regardless =)

    Either way, it's still a pretty good read.
    • Although I'd like to agree with you, it's interesting that the mindset the paper puts forward is what you often find in the industry... I mean, it's quite clear to many CSers that roll out of college that in the Real World, software seems to be built in a slightly less "optimal" way than they were taught. It _does_ come down to, everything is as good as anything else, no principles or higher ideals matter, the only thing that matters at the end of the day is how well the problem has been solved, not how Truthful you did it.


      You can clearly see this in terms of the way managers push for the cheapest solution, ignoring all the protest or advice from their more wised-up seniors... But also in eg the way that Win32 programming has devolved into using google/codeguru/codeproject to look up a piece of source that comes close to the sort of thing you're trying to do, then fix it up... in that respect, the prime number example isn't all that nutty..


      I like it :) This postmodern idea looks like a good summary of what programming is _really_ about out there (unfortunately, I might add). It's probably also why so many people do things The Right Way after hours in their hobby/opensource project, because the postmodern thing just clashes too heavily with their idea of Truth (ie clean programming).

  • Layers (Score:5, Insightful)

    by Veteran ( 203989 ) on Tuesday October 22, 2002 @11:40PM (#4510449)
    Here are the two fundamental problems of computer science:
    • Bad programmers write bad code.
    • There are many more bad programmers than good programmers.


    Programming is a bit like chess; you can't point to anything specific that a bad chess player
    does wrong. It is not that a bad chess player moves his pieces incorrectly - bad players are constrained by the same rules of the game that good players are; a bishop stays on its color for both the good and bad players. The only difference between good and bad players is that poor players make poor choices of moves.

    In a similar fashion poor programmers use the same tools as good programmers - they both get their programs to compile and run - but poor programmers just make poor programming choices.

    Here is an example of something which poor programmers don't seem to get. When you put a nice shiny new paint job on a layer of crap - it might look ok - but it is still a layer of crap.

    That simple observation explains why Microsoft's operating systems stink.
    • Re:Layers (Score:4, Insightful)

      by captaineo ( 87164 ) on Wednesday October 23, 2002 @12:24AM (#4510564)
      One aspect of poorly-written code (volumes of which I produce myself) is that it does not extract as much redundancy from the problem as it could, e.g. big if() or switch() statements instead of a table of results or function pointers (a small contrast is sketched below). I find that the best program is almost always the shortest. (Within reason; removing all the newlines doesn't count =)

      Non-orthogonal, inconsistent APIs are another big source of trouble. (stdio comes to mind... quick, which of the arguments to fread() is the FILE* pointer? What's the difference between fputs(foo, stdout) and puts(foo)?)
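
      To make the switch-vs-table point concrete, here's a small sketch of my own (not captaineo's code; the operations are made up for illustration). The stdio trivia is answered in the comment at the bottom.

        #include <stdio.h>

        /* Switch version: every new operation means editing this function. */
        static int apply_switch(char op, int a, int b)
        {
            switch (op) {
            case '+': return a + b;
            case '-': return a - b;
            case '*': return a * b;
            default:  return 0;
            }
        }

        /* Table version: the redundancy is factored out into data. */
        static int add(int a, int b) { return a + b; }
        static int sub(int a, int b) { return a - b; }
        static int mul(int a, int b) { return a * b; }

        static struct { char op; int (*fn)(int, int); } ops[] = {
            { '+', add }, { '-', sub }, { '*', mul },
        };

        static int apply_table(char op, int a, int b)
        {
            for (size_t i = 0; i < sizeof ops / sizeof ops[0]; i++)
                if (ops[i].op == op)
                    return ops[i].fn(a, b);
            return 0;
        }

        int main(void)
        {
            printf("%d %d\n", apply_switch('*', 6, 7), apply_table('*', 6, 7));
            /* stdio trivia: size_t fread(void *ptr, size_t size, size_t nmemb, FILE *stream),
             * so the FILE* comes last; and puts(s) appends a newline while
             * fputs(s, stdout) writes s exactly as given. */
            return 0;
        }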
    • Re:Layers (Score:3, Funny)

      by Alsee ( 515537 )
      Programming is a bit like chess; you can't point to anything specific that a bad chess player does wrong.

      Sure I can.

      Move 1: P-KR4

      Unfortunately I'm only half joking. I can't tell you how many times I've seen that. Then there's the nearly as common, nearly as bad P-QR4.

      -
      • As somebody who knows no more about chess than (I assume all of) the rules, and can't read the moves, what are those moves, and what's bad about them?
        • [guestimation] These two moves are bringing the pawns in front of the king and queen, respectively, out two squares. This I would assume is a bad choice, as it leaves gaping holes at the most vulnerable/valuable pieces. [/guestimation]

          Oh, btw, I am definitely a bad chess player. But a good computer scientist [I think]...
          • Re:Layers (Score:4, Insightful)

            by jazmataz23 ( 20734 ) <[moc.oohay] [ta] [naicitamzaj]> on Wednesday October 23, 2002 @02:32AM (#4510958)
            Augh, this is why the "modern" (algebraic) notation should always be used. It's much more readable to denote the files (columns) a-h and the ranks (rows) 1-8, instead of naming squares by the pieces that originally resided there. Those P-KR4 (Pawn to h4 or P-h4 in modern) and/or P-QR4 (P-a4) mean moving the pawns in front of the rooks out two spaces. (Often pawn moves are written simply as the square they move to, but I added P for clarity.)

            The duffer is trying to activate his rooks immediately. This is an awful move because the rooks will be quickly destroyed by the far more agile enemy knights. Rooks should not (as a rule of thumb) be activated until the middle game once the board has been cleared up a little. Being up a rook in the endgame (typically defined as the game after the queens have left the board) is a HUGE advantage.

            In fact, if you're prepared to support them (and your opponent allows it), pushing the d and e pawns two spaces each can be a large advantage for white; you control the center, and have lots of space behind them to develop your minor pieces.

            Now, Mr. Modernist Moderator, go ahead and mod me -1, offtopic. I will simply PostModerate you Unfair!

            like it or not, (and reading the discussion above, I see a lot of not-liking, not-understanding in the discussion) slashdot is intensely postmodern in its character.

            heh, here's a simple example from my own post. [B][/B] is modern HTML, [STRONG][/STRONG] is postmodern.

            OK it's *really* late, that seemed too funny...
            jaz

    • Re:Layers (Score:4, Insightful)

      by Dr. Bent ( 533421 ) <`ben' `at' `int.com'> on Wednesday October 23, 2002 @01:06AM (#4510671) Homepage
      I completely agree. I think that these problems are compounded by the fact that there is a disproportionate number of bad programmers in the market today. Every discipline has its 3 standard deviations, tip of the curve, top-tier professionals and...everyone else...but software development has many, many more in the "everyone else" category because:

      1) Dotcoms gave jobs to people who had no business being programmers and encouraged students to drop out of school to take high paying jobs that are non-existent now.

      and

      2) Most people do not have a clear understanding of what software development is all about. They equate computer use with computer science, and then are surprised to find out (after 4 years of college) that it's not at all what they expected.

      This leads to more crappy software, less general understanding of effective software development techniques, and a whole hell of a lot of people who have no clue what they're talking about.

  • Source: [US mirror 1] Adobe PDF [perl.org] (1797kb) ; GZipped PostScript [perl.org] (1700kb)
    Source: [US mirror 2] Adobe PDF [riehle.org] (1797kb) ; GZipped PostScript [riehle.org] (1700kb)
    Source: [US mirror 3] Adobe PDF [clearfield.com] (1797kb) ; GZipped PostScript [clearfield.com] (1700kb)
    Source: [NZ mirror 1] Adobe PDF [paradise.gen.nz] (1797kb) ; GZipped PostScript [paradise.gen.nz] (1700kb)
  • To suggest that we've already reached and breached the modern age of computation is awfully self-congratulatory. We've had computers for what, 65 years? When we've had computers for 1000 years, then I'd be comfortable suggesting that we had reached the age of "modern" programming. People say "postmodern" way way too easily.

    Postmodern programming will begin *after* the first self aware computer chooses to program its own destruction. Then we can begin to discuss postmodernism and programming at the same time.
  • "By the grace of Heaven and in rare moments of inspiration which transcend the will, computer science may unconsciously blossom from the labour of the hand..."

    "The key reason these languages [Java, C#, Smalltalk, etc.] are postmodern is that they cannot be considered against technical criteria."

    Teehee, just look at p. 15! These guys must be laughing harder than Don Woods and James Lyon after Intercal (ohh, they even mentioned it - "Intercal must be considered as a post-modern language (mostly for non-technical reasons)").

    Thanks for the laugh, you crazy Kiwis =].

  • thought provoking (Score:2, Informative)

    by akuzi ( 583164 )
    The world of computer programming seems to be getting more 'pluralistic' by the day. In certain areas there is convergence but in general the number of technologies and methodologies seems to be increasing at an alarming rate - almost impossible to keep up with.

    Most experienced programmers realize there is no 'silver bullet' to the problem of engineering software, in most cases many sets of different methodologies and programming technologies could be combined to produce a working system, each with their own advantages and disadvantages.

    The paper argues that this shouldn't be seen as a 'failure' of software engineering (and more generally computer science) but rather as something that, once realized, can result in more pragmatic approaches to building software, such as using methodologies and tools which support multiple approaches (like XP and Perl). Mix and match the styles that suit best, the way people mix and match their beliefs in post-modern society.
  • by kma ( 2898 ) on Wednesday October 23, 2002 @12:34AM (#4510593) Homepage Journal
    Reading the paper, I get the impression that this is mostly typical undergraduate hand-wringing about the gulf between academia and industry. That's fine, as far as it goes, and I've certainly indulged in my fair share of it. However, as an occasional student of stuff other than computer science, I'm a bit worried by their choice of terminology.

    To sum up: Post-freaking-modernism??? Do these people have any idea what a plague [drizzle.com] on the humanities the loose collection of intellectual conceits known as "postmodernism" has been?

    I've tried my hand at reading Foucault/Derrida/Barthes/etc., and their secondary sources. It's exceptionally difficult, but not in the way that, say, a complex algorithm is difficult. It's difficult in the way religious texts, or David Lynch movies are difficult; i.e., the difficulty is a smokescreen to keep the reader from catching on that this is all a bunch of bullshit.

    This sort of deal typically begins with, "I will argue that {truth,reason,science,gender} is {non-existent,socially constructed,a masculinist plot}." Several hundred extraordinarily poorly written pages follow in which the author, in varying degrees of good faith, actually tries to argue these points. Of course, if truth is socially constructed, we all have no basis upon which to discuss anything. Rather than calling one another on it, the postmodernists collectively wink at one another, and promise to take one another seriously, and quote one another every chance they get. It's academics by pyramid scheme.

    I understand why humanities people, even bright ones, fall for this routine, since they might go through all of their undergraduate and graduate education without encountering a single academic who hasn't drunk from postmodernism's poisoned cup; but why on earth would computer scientists be visiting this curse on a journal I subscribe to?

    To those posters above tempted to give in to the siren song of self-referentiality, who might be thinking, "Hey, some of my CS classes are boring, maybe we need some of this radical 'postmodern' stuff to kick boring old CS in the pants," remember: computer science is very, very young. New ideas and techniques are thick on the ground in fields as diverse as graphics, systems, theory, AI, and software engineering. Literary critics eventually turned to postmodernism in part because it seemed like there was nothing left to say, and this postmodernism stuff, bullshit or not, was at least different. In computer science, we are still learning how to write a well-structured novel.
    • Undergraduate hand-wringing and so much Bailey's you can't tell whether the alcohol or the sugar is having the worse effect. It seems to me that writing great programs has a lot to do with creating good namespaces and choosing your names well.

      class this {
      pre this (); // constructor
      post this (); // destructor
      };

      At least that partially makes sense, unlike anything else named "modernism" by people who are already dead.
    • It's not that CS class is boring, it's that it is teaching values which have been declared by the industry as irrelevant. Remember "MATURE" software? Maintainable, Adaptable, Transparent, User Friendly, Reliable, Efficient. Say that to a manager and you get "BWAHAHA".

      Programming has been deconstructed de facto to a bunch of hacking that only needs to get the job done; anything CS has to say about the process has been declared by the industry as irrelevant. Sounds very postmodern to me.

  • by Brian_Ellenberger ( 308720 ) on Wednesday October 23, 2002 @01:12AM (#4510686)
    Postmodernism in a non-liberal-arts field like Computer Science?

    Post-modern math: The rule that the derivative of x^3 is 3x^2 is too narrow a definition. We need to somehow break free of such rigid rules that prevent expression. Let's try d/dx x^3 = 18x on Mondays and d/dx x^3 = 5x on Tuesdays.

    Post-modern engineering: The concept of the modern suspension bridge is patriarchal in design and form. Instead of being tied down by cables in a seemingly unending pattern, let's have the cables lifted into the air by giant balloons! I have the math right here to prove it will work (see post-modern math).

    Post-modern Biology: Sure the lungs are commonly thought to simply process Oxygen and CO2. However, that was simplistic modernistic thinking. Today we will demonstrate neo-objectivism by removing the lungs from this patient and observing their meaninglessness.

    Come on, Computer Science is a Science! It has rigid and unavoidable laws, a concept which postmodernism rejects. Fundamentally, when you get down to the heart of it, Computer Science is math and is governed by a ton of mathematical rules.
    We have Shannon's laws on Information Theory, the Church-Turing Thesis and the Turing Machine describing the limits of computers (see the Halting Problem; a short sketch of the standard argument appears at the end of this post), NP-Completeness, the wide variety of research on various algorithms, etc.

    Guess what, fundamentally there is no difference between Perl, C, C++, Ada, LISP, or whatever other language you come up with because at the end of the day they are all Turing complete.

    At the end of the day the Turing Machine *IS* the "Grand Narrative". It is the fundamental basis that all computers and all languages must obey. To use the authors' words, it is the "12-note row", the thing that couples everything else together in the sea of chaos.

    Of course, a writer may use a Word Processor to write a post-modern play or an animator may use a graphics tool to draw a post-modern animation. But these aren't examples of Computer Science.

    Brian Ellenberger
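
    For anyone who hasn't seen it, here is the standard diagonalization behind the Halting Problem mentioned above (a textbook sketch added for reference, not anything from the paper):

      Assume a total computable $H$ with $H(p, x) = 1$ iff program $p$ halts on input $x$.
      Define $D(p)$: if $H(p, p) = 1$, loop forever; otherwise halt.
      Then $D(D)$ halts iff $H(D, D) = 0$ iff $D(D)$ does not halt, a contradiction,
      so no such $H$ exists and the Halting Problem is undecidable.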
  • Summary:

    The essential paradigm of cyberspace is creating partially situated identities out of actual or potential social reality in terms of canonical forms of human contact, thus renormalizing the phenomenology of narrative space and requiring the naturalization of the intersubjective cognitive strategy, and thereby resolving the dialectics of metaphorical thoughts, each problematic to the other, collectively redefining and reifying the paradigm of the parable of the model of the metaphor.

    The source [catalog.com] of the above paragraph should serve as an adequate introduction to postmodernism.

  • Sydney Opera House (Score:4, Informative)

    by Dun Malg ( 230075 ) on Wednesday October 23, 2002 @01:30AM (#4510759) Homepage
    His example of the Sydney Opera House employing modular/modernist components despite its postmodern design fails to mention the real lesson Jorn Utzon learned. Utzon's initial design for the shell roofs didn't include "ribs supporting them." His original thought was that they'd be self-supporting, but he never had the proper engineering studies done. Subsequently, they had the first 20' of the shells built up before he realized that his napkin-based engineering tests weren't good enough. At that point there was a mad scramble to find off-the-shelf materials that could be added to hold up the roof. Basically "modernist components" saved this guy's ass because he was too engaged in the "art" of architectural design and didn't pay enough attention to the "science" needed to make things work. The projected $10 million cost ballooned up to $150 million because of Utzon's failure to take into account the laws of physics, so in 1966 he (resigned/was fired from) the job. The guy who took his place as design architect found out what a further loser Utzon was as an engineer when he looked at the plans and saw that elevation drawings of the glass walls that enclose the ends of the "shells" contained no design or engineering specs for their construction whatsoever: basically Utzon had written "glass wall" with an arrow pointing to the empty space. Nice, eh?

    I think the important lesson the Sydney Opera House debacle teaches us is that postmodernism is pretty, but if you're using it in creating something functional, make sure it'll at least function. That, and "don't send an artist to do an engineer's job".
  • PoStmOderN (Score:5, Interesting)

    by joab_son_of_zeruiah ( 580903 ) on Wednesday October 23, 2002 @01:58AM (#4510875)

    Postmodernism is a license to criticize without being held to the rigorous requirements of critical thought. Self-consistency is not one of its strong points, at least by usual standards. Just to illustrate this point, the authors cite Wittgenstein's Tractatus Logico-Philosophicus, and its famous initial assertion (Die Welt ist alles was der Fall ist. - "The world is all that is the case.") Postmodern critics like to avoid the use of abstraction, tending to rely on facts to establish contradictions. The authors' literal reading of the first assertion is fully consistent with postmodern criticism. Of course the rest of the Tractatus has a lot of abstractions in it, which puts it about as far from postmodern as you can get. And as "everybody" knows the Tractatus is the philosophical manifesto for databases, logic programming, UML, .... (which are about abstractions too.) Hmmmm.

    Postmodernism tends to irk those who attempt to read it and apply purely "modernist" notions of criticism. At least it irks me. There is a rather well developed theory of postmodern criticism, which the authors of this paper try to explicate (terms like "antitotalizing" etc), exemplified, e.g., in the writings of Jacques Derrida and many others. This is usually where the academic starts - by aligning their field of study with the concepts of postmodern criticism. This is a small industry and this paper is of that ilk. The best that can be said about postmodernism, IMHO, is that it's like brainstorming written down on paper. It's usually thought provoking. Postmodernist thinking is like a written form of a stream of consciousness -- only less well organized! ;)

    As an aside, if anyone asks whatever became of all the nominal Marxists in this world: they all became postmodernists! They had to become something - given that their theory and all of its incarnations are failures. Marxism was the great 19th century critique of capitalism; it was successful so long as you didn't mind some nastiness on the road to Utopia. (Turns out, people *did* mind.) For a large portion of the political landscape both here in the US and around the world, the felt need to criticize the capitalist and capitalism has *not* diminished. Postmodern literary criticism fulfills that role nicely.

    But these authors do make a point. Why do you need to learn programming if the reality is that you can purchase the answer? Or look it up for free? Or even have the software given to you? I think programming is good for the soul, but some might dispute that motive. What would be the point of learning to program? Best to leave it to the highly productive few who are best able to do it. With the Internet, the answers are all there for the taking. You don't need nearly as much in the way of university faculty as you might have thought.

    I sympathize with the authors' point of view because in my day job I profess computer science for a living. After 34 years of programming (hardly any of it in teaching, but with teaching experience separated by over two decades) I can see a pretty substantive material change in attitude.

    However, to claim that all of computer science is only about programming -- this is not quite a postmodernist sentiment!

  • by Bowie J. Poag ( 16898 ) on Wednesday October 23, 2002 @03:04AM (#4511034) Homepage


    This paper is about 3 microns away from justifying plagiarism and copyright theft under the guise of postmodernism. I love it. :)

    Here's what I came away with in this paper; I'll annotate the good parts with a +, and the bad parts with a -.

    This paper proposes that there really isn't any point to enforcing a rigid set of rules that forces each of us to reinvent the wheel (-) whenever we want to do something constructive (+) . However, that idea has a few caveats, namely that by allowing (or encouraging) people to simply 'glom off' the work of others, we deprive them of the experience and perspective that can only BE gained by reinventing the wheel (-)... Here's a cute example. About 7 years ago, I took a class in x86 assembly. Our instructor was pretty hardcore -- Was around even before punch-cards. The manner in which he taught the class was to introduce us to the most minimal set of tools possible, and force us to combine these tools in a way which allowed us to do more things (+) -- For example, the MUL instruction in x86 (simple multiply) wasn't revealed to us until Week 4 -- Before then, we had to write our own routine to perform multiplication (a C sketch of that exercise appears at the end of this post). To me, this is how it should be. In order to appreciate the car, at SOME point you must first reinvent the wheel and learn what that's like.

    This paper puts forth the notion that it's simply embracing the evolution of our science to take pre-existing forms, and adapt them for our own uses. In a nutshell, the whole concept of open source (+) . That having vast libraries of code to draw from, and then NOT doing so, is a terrible misuse of resources. After all, if we were to build an automobile, we wouldn't start off by cracking open a book on Chemistry to learn about electron exchange between atoms. We don't crack open a book on Newtonian physics, either, to learn why F=ma. Chemistry and Newtonian physics can be thought of as the "legacy code" of manufacturing and construction, similar to all the standard tenets of programming. Why write new code when there's something 99% similar to it out there already, that you can simply adopt, modify, and re-release? (+) ..We incorporate the ideas and functionality they provide into our own work, simply because it's convenient to do so. (+) It just makes sense. Anything less would be a waste of time. (-)

    I feel better about writing code now, after reading this paper. I had always felt a wee bit guilty about pilfering around in other people's code for a solution to a particular task, feeling that somehow I sucked that much more since I couldn't come up with my own solution, from scratch. This paper allowed me to realize that chances are, the person who I'm "cheating off of" probably did the same thing to someone else, to prepare his own. :) What I used to refer to as "cheating" is now simply a manifestation of right and proper progress. I take and use, so that others may take and use from me. (+) It feels better to code without guilt. (?)

    Cheers,
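
    The multiplication-before-MUL exercise mentioned above, sketched in C rather than x86 assembly. This is my own illustration of the idea, not the instructor's actual assignment:

      #include <stdio.h>

      /* Shift-and-add multiplication: the software version of what MUL does. */
      static unsigned mul_by_hand(unsigned a, unsigned b)
      {
          unsigned product = 0;
          while (b != 0) {
              if (b & 1)        /* lowest bit of b is set: add the shifted a */
                  product += a;
              a <<= 1;          /* a times 2 */
              b >>= 1;          /* b divided by 2 */
          }
          return product;
      }

      int main(void)
      {
          printf("%u\n", mul_by_hand(6, 7));  /* prints 42 */
          return 0;
      }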
  • by AlecC ( 512609 ) <aleccawley@gmail.com> on Wednesday October 23, 2002 @06:15AM (#4511412)
    So now I have got a name for the way I have been programming all my life - use the best tool that comes to hand without arguing whether it is theoretically perfect. Use mixed tools if that is what the problem at hand demands. Don't reinvent if you can possibly beg/borrow/steal.

    The paper strikes me as completely tautologous anywhere outside a Computer Science department (and probably to the more practical half of those inside). If you're involved in shipping code, either for money or for the good of the community, you are interested in what works, not what is theoretically best. Of course, if a nice theoretically clean tool does the job - use it. But if a steaming heap of old code does the job (where reliability and efficiency may form part of the spec), use that.

    Welcome to the real world, guys.
  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Wednesday October 23, 2002 @09:05AM (#4512228)
    I'm a trained Artist. I know these people that shit in the corner and call it [fill in random art-style bullshit].
    Perl is cool, Perl is geeky, and it gives a humorous look at the way things were back then with *nix admins. It's an anachronism with a cool and powerful interpreter, so people still like to use and learn it, even though its syntax sometimes looks like "ActionScript on crack" or something.
    But calling this (crappy software design and/with/or Perl) 'Postmodern' is like calling Lingo an 'interesting approach to PLs'. Just because Perl is the tool of choice for a certain set of problems, there's no reason whatsoever for calling this 'postmodern'.
    Gawd, what people can crap about in more than 2 sentences amazes me ever so often.
  • by Jagasian ( 129329 ) on Wednesday October 23, 2002 @09:51AM (#4512597)
    The goal of Computer Science is the program? I guess computability theory and complexity theory aren't goals of Computer Science? What about programming language design? Such a narrow definition of Computer Science, such a wrong definition. A better, yet still incomplete definition of Computer Science is that its goal is to understand what can/cannot be computed, and how the computable can be computed.

    First off, this paper seems to confuse "Computer Science" with "Software Engineering". "Computer Science" is about theory while "Software Engineering" is about making products and services using software. This makes all of the knocking of the traditional theories of computer science nothing more than apples vs. oranges. If you read "Software Engineering" in place of "Computer Science", then saying that the goal of computer science (i.e. Software Engineering) is the program becomes more nearly correct.

    I especially liked the part where they say that elements of a program are not abstractions but symbols. Maybe someone should tell the writer that Computer Science started as an off-shoot of a branch of Constructive Formal Mathematics known as "Metamathematics". Metamathematics concerned itself with symbolic representations of abstractions. Mathematicians 100 years ago spent a lot of effort studying various aspects of Metamathematics. Read the original works of Brouwer, Hilbert, Kleene, Church, Turing, and Godel to name a few. Kleene has a good classic textbook on Metamathematics that the writer of this paper should read.

    This paper is not scientific. It is not mathematical.

    This paper expounds nothing new, original, or worthwhile.

    This paper is nothing more than a waste of time. At best it made up some new terminology for someone else's achievements. It would be even more entertaining if the title included a few other meaningless buzzwords/buzzphrases, such as: "paradigm shift".
