Internet Emulator

John3 writes "InternetNews is reporting that PlanetLab is getting closer to reality. According to the article, a consortium of universities (including Princeton) is launching a test-bed platform based on Red Hat Linux. This project is different from Internet2 and the other "alternate Internet" networks under development, and seems to offer the most benefit to distributed computing projects rather than to generic WAN/Internet communications."
  • Is it just me... (Score:5, Insightful)

    by Dthoma ( 593797 ) on Tuesday June 24, 2003 @04:54PM (#6288374) Journal
    ...or was the article blurb just a bunch of buzzwords stuck together? I mean, each of the clauses in it on its own made sense but the whole blurb just seemed kind of incoherent. It's very thin on actual specifics; this sounds like it could just be more vapourware, unfortunately.
  • by loveandpeace ( 520766 ) * on Tuesday June 24, 2003 @04:55PM (#6288387) Homepage Journal
    one of the things i find so interesting about PlanetLab is the way employing standards has actually increased the flexibility of the whole product. too often, standards are a primary ossifying force in technological development, especially when created after the fact; by coming up with a common platform and software package at the outset, and by having flexibility as one of the primary goals considered in development, standards will actually help ensure PlanetLab works as it was intended.
  • Shiny! (Score:5, Insightful)

    by cultobill ( 72845 ) on Tuesday June 24, 2003 @04:55PM (#6288389)
    I can't help but say that most CS/IT majors need this. I've seen too many people write apps (simple ones even) that relied on that ethernet connection that the dorms give, 10Mbit between machines. "Scale down? Who has less than a fast cable modem these days?"

    Now they just need to break the schedulers on the machines, to make them randomly almost-starve a process to make sure it can cope with a slow machine.
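    The "randomly almost-starve a process" idea above can be approximated in user space without touching the scheduler. A minimal sketch, assuming Python; `starve` and `process_items` are hypothetical names, and the stall probabilities are arbitrary:

    ```python
    import random
    import time

    def starve(p_stall=0.3, max_stall=0.05):
        """Randomly stall the caller, crudely emulating a process that is
        almost starved by the scheduler on an overloaded machine."""
        if random.random() < p_stall:
            time.sleep(random.uniform(0, max_stall))

    def process_items(items):
        """Toy work loop; calling starve() between items exercises
        timing-sensitive code under jittery scheduling."""
        results = []
        for item in items:
            starve()
            results.append(item * 2)
        return results

    out = process_items([1, 2, 3])
    print(out)  # → [2, 4, 6], just slower and less predictably
    ```

    The results are unchanged; only the timing becomes erratic, which is exactly what surfaces hidden assumptions about how fast the host runs.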
  • by SoSueMe ( 263478 ) on Tuesday June 24, 2003 @05:02PM (#6288455)
    ...and allow the public to join the project?
    Different than the Internet 2 project or even Grid computing, the group says the most obvious benefit is that network services installed on PlanetLab experience all of the behaviors of the real Internet where the only thing predictable is unpredictability (latency, bandwidth, paths taken).

    If you want to emulate all the behaviors of the real internet, you need to welcome the hackers, crackers, and script kiddies, not to mention the "moms".

    Forget about the AOLers, we don't need 'em.
  • by maxume ( 22995 ) on Tuesday June 24, 2003 @05:08PM (#6288518)
    Wouldn't it be cheaper to use a station wagon full of hard disks? The cost per GB on hard disks isn't that much higher than it is for DVD-R media, and if you bother to factor in the amount of time it would take to create the DVD-Rs versus filling the hard drives, they might come in cheaper. Should be faster to read into, too.

    I know that some companies are offering their GIS datasets on hard disk instead of CD-R now, but they do charge a bit more. Backing up to CD-R is pretty useless for 40 gigs of data though. Ramble ramble.

  • by John3 ( 85454 ) <john3@@@cornells...com> on Tuesday June 24, 2003 @05:12PM (#6288548) Homepage Journal
    Doh...never mind. :-)

    After yet another read of the article it looks like they are just building a mock-up Internet on which to test their distributed apps. This would allow them to see how their apps will perform when linked over the Internet rather than in a closed-lab 100Mbit network environment.

    This would help them avoid comments like "Gee, those data packets sure take a long time to get back to us" once they move their app to the real world outside the lab.
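    The gap between a lab LAN and the real Internet that this comment describes is mostly latency, and it can be faked in tests before an app ever leaves the lab. A minimal sketch, assuming Python; `wan_latency` and `fetch` are hypothetical names, and the delay figures are rough WAN-like guesses, not measurements:

    ```python
    import functools
    import random
    import time

    def wan_latency(base_ms=80, jitter_ms=40):
        """Decorator that delays each call by a random WAN-like round-trip
        time, instead of the near-zero latency of a lab LAN."""
        def wrap(fn):
            @functools.wraps(fn)
            def inner(*args, **kwargs):
                time.sleep((base_ms + random.uniform(0, jitter_ms)) / 1000.0)
                return fn(*args, **kwargs)
            return inner
        return wrap

    @wan_latency(base_ms=80, jitter_ms=40)
    def fetch(key):
        # stand-in for a remote lookup that would cross the Internet
        return {"a": 1, "b": 2}.get(key)

    t0 = time.time()
    value = fetch("a")
    elapsed = time.time() - t0
    print(value, round(elapsed, 3))
    ```

    Code that quietly assumes sub-millisecond responses tends to fail loudly under a wrapper like this, which is the point.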

  • I did RTFA (Score:3, Insightful)

    by Ricin ( 236107 ) on Tuesday June 24, 2003 @05:19PM (#6288615)
    up until the 3rd paragraph (emphasis mine):

    ''[The Web is] so successful and so many people depend on it, it's become impossible to go to the core of the Internet and make radical changes to introduce the kind of new services we see people wanting to deploy,'' Princeton University scientist and Intel Research member Larry Peterson said during a conference call to the press.

    How are changes so "radical" that they need a newly designed system merely for development and testing ever going to be gradually introduced into the "core of the Internet"?

    Won't fly IMHO.

  • Re:Shiny! (Score:3, Insightful)

    by Patrick ( 530 ) on Tuesday June 24, 2003 @09:19PM (#6290435)
    I can't help but say that most CS/IT majors need this. I've seen too many people write apps (simple ones even) that relied on that ethernet connection that the dorms give, 10Mbit between machines. "Scale down? Who has less than a fast cable modem these days?"

    PlanetLab won't help much with that. Most of the PlanetLab nodes are pretty well connected, certainly better than modems. It lets you test latency pretty realistically, given that the nodes span the globe.

    Modelnet [duke.edu] might be a better bet for emulating modem-dominated networks.
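    The modem-dominated case the reply raises is a bandwidth cap rather than a latency problem, and a crude cap is easy to sketch. A minimal illustration, assuming Python; `ModemPipe` is a hypothetical name, and 7000 bytes/sec is a rough stand-in for a 56k modem:

    ```python
    import time

    class ModemPipe:
        """Crude throttle: sleeps long enough that each payload appears
        to move at `rate` bytes/sec, roughly emulating a 56k modem
        instead of a well-connected PlanetLab-style node."""
        def __init__(self, rate=7000):  # ~56 kbit/s expressed as bytes/sec
            self.rate = rate

        def send(self, data):
            # block for the time the payload would take at the capped rate
            time.sleep(len(data) / self.rate)
            return len(data)

    pipe = ModemPipe(rate=7000)
    t0 = time.time()
    sent = pipe.send(b"x" * 3500)   # half a second's worth at 7000 B/s
    elapsed = time.time() - t0
    print(sent, round(elapsed, 2))
    ```

    Real emulators like Modelnet shape traffic in the network path itself; a sleep-based throttle like this only works for single-process tests, but it is enough to show whether an app degrades gracefully at modem speeds.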
