
Open 3D Scientific Visualization Toolkit

Mark Leaman writes "The Science Museum of Minnesota has just announced an online community site for scientific visualization, including their Open 3D Visualization Toolkit, which includes Blender and the GIMP among its core development tools. Frustrated with a lack of consolidated resources and discussion about open-source scientific visualization development tools, the Science Museum of Minnesota's Learning Technologies Department decided to develop their own."
This discussion has been archived. No new comments can be posted.

  • by Dancin_Santa ( 265275 ) <DancinSanta@gmail.com> on Saturday January 22, 2005 @11:12AM (#11441122) Journal
    These museums, with very few exceptions are almost purely supported with government funds. They just can't make back the cost of upkeep, much less salaries, on the few dollars they make through admission fees.

    There are a few that can make ends meet by appealing to private business, but for the most part these museums are supported with public money.

    Now the point of all this government talk is that sometimes it takes the government to do something good and worthwhile for the general public. If it were up to the private sector, such an undertaking would 1) not have been undertaken in the first place and 2) if it were developed, it wouldn't have been released as OSS.

    Hooray for these hackers! And thank god they've got an enlightened government supporting them.
    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Saturday January 22, 2005 @12:03PM (#11441351)
      Comment removed based on user account deletion
      • That's because most universities in the U.S. at least have turned to public/private ventures and patents to bring in more revenues. This has many benefits, including placing students in nice jobs, but the downside is an increased focus on doing what the private sector wants.
        • Well if you consider it though, government-funded academic projects would have to fall under the BSD license. Corporations pay taxes, which among other things fund academic institutions. So any work derived from their tax dollars would need to be accessible to them, and the GNU license is too encumbering for them.

          -PHiZ
      • There is actually no express need for academics to get into the Open Source Movement. In my university library, MS Windows Professional is available for $10 for installation on the staff computers; other packages are similarly priced. With big companies giving away their software to universities, and with support options available, not many would want to work with Linux. Consequently, there is no development geared towards the Linux OS. All the professors in my department (EE) use Windows XP without excepti
        • Strange. I would think that in an EE/CS academic environment, the availability of source code alone would be a compelling draw. That and the relative freedom from virii and spyware...
          • It's been a while since you've met a lot of CS students, hasn't it? Most are in it thinking they'll make lots and lots of money after they graduate. They don't care about software wanting to be free. They don't even like coding. This represents about 80% of the CS student body, in my experience. They're not clever problem solvers, not at all. Most hate their major. They just want a nice cushy job in corporate America so they can buy whatever BMW they want.

            The CS department at my school forces kids
            • then the school really needs to do a better job convincing them that they are in the wrong major.

              If you don't enjoy your major, get out of it. What I think is really cool about schools like the University of Chicago is that they force a fairly wide spread of classes on their students and then don't let them formally declare a major until the end of their sophomore year. This forces many a student to discover new things and pick something they like.

              • I totally agree. I chose ES as a major because I'm completely and utterly fascinated by natural systems. I love learning about connects and interconnects between the various spheres. Computers interest me, but only superficially, and thanks to the open source community I can teach myself a lot for free. UCR - the school I go to - does a similar thing with its breadth requirements. Humanities majors are supposed to take basic science classes and science majors are supposed to take basic humanities class

              • then the school really needs to do a better job convincing them that they are in the wrong major.

                For a very long time many students have been choosing careers in medicine for the same reason: that it is financially rewarding.

                But, having to get all A's in courses like biochemistry has usually helped to ensure that only the most capable students get through the system. Not always, but it does a reasonably effective job of weeding out the less intelligent and the lazy.

                Likewise, most university CS courses

            • It's been a while since you've met a lot of CS students, hasn't it?

              Can't knock CS grads as a group, but... To be honest, the best programmers I've worked with in the past have tended to spend time in other (often unrelated) fields before getting into programming as a profession. I don't have a theory as to why that should be, however...

              • by Anonymous Coward
                I do: a CS degree is a hammer looking for a nail. Go out in the world and learn that there are problems that need to be solved, then become a programmer and solve them.
      • One thing that always disappoints me is the lack of involvement of Academia in helping OSS

        Don't know what you're talking about, really. Check the sciences, almost all OSS is academic. The OSS tools I use for research were all made by students or profs or multi-university collaborations. If you mean big projects that solve non-academic problems, like spreadsheets and word processors - well, why should researchers (outside of CS people perhaps) involve themselves with that?

        • Don't know what you're talking about, really. Check the sciences, almost all OSS is academic. The OSS tools I use for research were all made by students or profs or multi-university collaborations.

          Definitely. Modelling programs, algorithms, basically all scientific models are open source because they HAVE to be. You have to allow your colleagues to look at your work simply because that's how science works. Where the open source disconnect occurs is in data analysis. Most sciences that aren't entirel
      • Actually I find the opposite, with one small qualifier. There are lots... literally millions of OSS projects written and maintained by universities, but they are usually very specialist. There is normally not any research value in the big-name OSS projects like OO.org, Apache, etc. However, try doing a little survey of software for natural language processing; you'll find most of the core research is implemented through OSS projects.

    • I really like this museum, it's a good asset to my state. The exhibits are usually pretty nice.

      It does have corporate benefactors as well as individual memberships [smm.org]. The whole thing was started by business. As far as I know it doesn't use much public funding.

      It even built a very nice new facility in 1999. This museum does more than earn the cost of upkeep.
    • These museums, with very few exceptions are almost purely supported with government funds. They just can't make back the cost of upkeep, much less salaries, on the few dollars they make through admission fees.

      That is almost entirely false, at least for the SMM. It has a yearly budget of ~$25 million. Of that, $750,000 is government money. The rest is from ticket/food/merchandise sales (roughly $20 million) and private/corporate donations (~$4 million). See here [state.mn.us] - warning PDF.

      It does use some federal mone
  • by Anonymous Coward
    Right now they're visualising squat:

    Warning: mysql_connect(): Too many connections in /Users/silver/Sites/visualize/includes/database.mysql.inc on line 31
    Too many connections

    LOL!
  • Open Data (Score:3, Insightful)

    by Doc Ruby ( 173196 ) on Saturday January 22, 2005 @11:51AM (#11441296) Homepage Journal
    Where is the repository for open scientific data for visualization? The NASA website of raw data decoded from the streams sent by our probes? The USGS GPS models? CAT/MRI scan files from dead people? X-ray crystallography data from public research institutions? Their CD distro is a good start, with models from their Turkish dig site. Without data, this tool is just a toy.
    • Re:Open Data (Score:1, Interesting)

      by Anonymous Coward
      I know of half a dozen scientific data repositories in my field and dozens of researchers who have data on websites (including ours). I suggest you try Google. Nothing you listed is in my field. The one that amazes me most is the MODIS data centers. You can request raw data files that are 500MB in size. It takes up to an hour for the data to be online, but it's free and open to everyone.
    • A lot of data probably isn't available simply because the people who create it never think others would be interested in it. Or don't have the time to explain the format in simple terms. I do computational fluid dynamics and create terabytes of data from my simulations. Even reduced sets of this data would be quite large, so I have nowhere to put them.
    • Re:Open Data (Score:2, Informative)

      by pedroloco ( 778593 )

      As far as NASA planetary datasets go, try the Planetary Data System [nasa.gov]

      Some of the USGS topo datasets are available for free download from the EROS Data Center [usgs.gov].

    • You can get the Visible Human data set here:

      http://www.nlm.nih.gov/research/visible/getting_data.html
    • Where is the repository for open scientific data for visualization? The NASA website of raw data decoded from the streams sent by our probes? The USGS GPS models? CAT/MRI scan files from dead people? X-ray crystallography data from public research institutions? Their CD distro is a good start, with models from their Turkish dig site. Without data, this tool is just a toy.

      I'd agree with the last statement - if you have data, like me, this is damned cool. If you don't, why would you need it anyway?

      • Looking at the visualizations is cool, even when it's just for fun. And all of that data that I mentioned (except maybe the CAT/MRI) belongs to the public. We just need tools, like the Internet and this toolkit, to use it.
        • Looking at the visualizations is cool, even when it's just for fun. And all of that data that I mentioned (except maybe the CAT/MRI) belongs to the public. We just need tools, like the Internet and this toolkit, to use it.

          Then the entire gripe is what, that you want everything to not only be open, but conveniently packaged for you?

          • I don't know if it's a "gripe", but yeah, of course I want it to be open and conveniently packaged - in an open data format, with a program to use it. What else would I want? Something hard or impossible to use? As for "everything", it's not like I asked for Exxon's seafloor soundings data, or the DoE nuclear test data.
            • I don't know if it's a "gripe", but yeah, of course I want it to be open and conveniently packaged - in an open data format, with a program to use it. What else would I want? Something hard or impossible to use? As for "everything", it's not like I asked for Exxon's seafloor soundings data, or the DoE nuclear test data.

              These are unrelated programs you're asking for. You want a toolkit developed by one school to work with a database from another organization. As someone who works with data, that ain't goi

              • How long have you been working with your data? I've seen data formats come and go in my 28 years in computing, mostly "come" - not so much "go". And converters stick around, too. If you're content with your proprietary data ghetto, I envy you. I'm one who likes mungers, and even stomachs XML for exactly this kind of cross-app data transfer. My world is getting better, while the proprietary data world is staying the same.
                • How long have you been working with your data? I've seen data formats come and go in my 28 years in computing, mostly "come" - not so much "go". And converters stick around, too. If you're content with your proprietary data ghetto, I envy you. I'm one who likes mungers, and even stomachs XML for exactly this kind of cross-app data transfer. My world is getting better, while the proprietary data world is staying the same.

                  Don't know what your point is - my "proprietary format" is ASCII. Don't think that's

    • This is a bit of a disingenuous comment. When exactly should the data be put online? Straight away after it is acquired? Then where is the incentive for the scientists who acquired it? After the first few papers on the data are published? Then will it be relevant? Furthermore, data isn't always amenable to analysis without a good knowledge of the equipment that generated it (and not just the general class of equipment - often you need to know the quirks of the specific device). Additional documentation of t
      • Sure your comment is disingenuous. The straight answer to your questions is "at all", or "whenever possible". For our own interests, unavailable data has much less value. And publishing it to people like Slashdotters is marketing for further funding. Not to mention that practically everyone outside the discipline generating the data is in the same position as "Slashdotters" in getting the data, though many of us can create more practical value when it's exchanged. In the past, scientists used to all learn L
        • No, I meant your comment is disingenuous. But we'll stop name calling here :)

          We will never get "marketing for further funding" from openly publishing raw data. That's not a concern to the average scientist at all. We get funding by getting our data peer reviewed and published in journals and by being able to point to other scientists who are sufficiently impressed with what we do to want to collaborate with us. Bluntly, lay people are not the target for raw data. Naturally there is an obligation to share t
          • I meant that opening the data "at all", by publishing it on websites, is a good start. Without necessarily documenting it, or converting it to a standard data format. Just put the file, or a DB CGI query, in a link. Publishing the data helps create an audience for it, especially among geeks across the world. That in turn can help raise the money that funds the research, especially when orchestrated by a good fundraiser. Of course none of that compares to peer review. But publishing the data can be a small c
      • the "make it open" mantra is also very popular in science, especially academia. I was at the American Geophysicist Union meeting in San Fracisco this december, and open data was the name of the game. Infact, open data is one of the main tenents of science. More and more data is becoming open and available. The USGS [usgs.gov] make data available. As does the professor i used to work for (i'd link you, but he's in the middle of an interstate move and the server doesn't have a new domain name yet). There was a who
        • Cool - that sounds good. Please don't think I'm against the idea of open data. However, it just sounds like it could end up being something that becomes a requirement for funding without the necessary additional resources being given to scientists that allows them to actually continue acquiring new data. Personally, I'll freely admit to being a little biased in that the issue of open access to publications and peer-reviewed articles seems a much more pressing issue and something that, in a much shorter term,
  • Not the only one (Score:2, Informative)

    by Anonymous Coward
    Frustrated with a lack of consolidated resources and discussion about open-source scientific visualization development tools

    This is the point where I remind people of OpenDX [opendx.org], which is the open-sourced IBM Visualization Data Explorer. DX used to be an extremely expensive commercial product, but it's been open source for a couple of years now.

    It's very good. If you're into scientific visualization it's worth examining.

    • I started trying to use OpenDX, and immediately found that it wouldn't import 16-bit grayscale images. If you know of a way to get around that, I'd be interested in hearing about it. Otherwise, it's useless to me.
  • There are toolkits available for 3D visualization that are open source. I used a couple for some work in a seminar a year or so ago. http://www.vtk.org and http://www.itk.org (owned, pretty obviously, by the same people). Their principal application has been in medical work, but I used the segmentation and registration data to begin some work on tracking torsos in video.
  • by hawkstone ( 233083 ) on Saturday January 22, 2005 @12:14PM (#11441415)
    Sorry, but I fail to see what Blender and the GIMP have to do with real scientific visualization. Blender is for 3D modelling, and the GIMP is for image processing.

    If you're looking for complete, open source scientific visualization and data analysis packages, try VisIt, which supports dozens of input formats and runs on Linux, Windows, and MacOSX. Pick it up at http://www.llnl.gov/visit [llnl.gov], or get the latest binaries from FTP here [llnl.gov].

    I have less knowledge of ParaView, but it is also free: http://www.paraview.org [paraview.org].

    Both of these are also developed in part by the national labs; they can run in parallel to handle terabytes of data, so if you've got a small dataset they should be smokin' fast, and if you've got your own cluster you should be able to visualize some huge data.

    If you're looking for just a toolkit to build your own application, try OpenDX [opendx.org] or VTK [kitware.com].

    • OpenDX and MayaVI (Score:3, Informative)

      by Noksagt ( 69097 )
      I think OpenDX is a bit more than just a toolkit. It also has a great GUI for doing visualization, without the need for too much coding (somewhat analogous to LabVIEW, I suppose). I have found I really like MayaVI [sourceforge.net], which is a GUI for VTK. MayaVI/VTK are Python scriptable, which is great.
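
      To give a feel for what "Python scriptable" means in practice, here's a minimal sketch of a VTK pipeline driven from Python. The synthetic source and the isovalue are placeholders I picked for illustration (nothing MayaVI-specific), and older VTK releases used SetInput() instead of SetInputConnection(), so treat this as a sketch:

      # Minimal VTK-from-Python sketch: build a synthetic scalar field,
      # extract an isosurface, and show it in an interactive window.
      import vtk

      source = vtk.vtkRTAnalyticSource()            # built-in synthetic 3D scalar field

      contour = vtk.vtkContourFilter()              # pull out one isosurface
      contour.SetInputConnection(source.GetOutputPort())
      contour.SetValue(0, 150.0)                    # isovalue chosen for illustration

      mapper = vtk.vtkPolyDataMapper()              # geometry -> graphics primitives
      mapper.SetInputConnection(contour.GetOutputPort())
      actor = vtk.vtkActor()
      actor.SetMapper(mapper)

      renderer = vtk.vtkRenderer()                  # render window / interactor boilerplate
      renderer.AddActor(actor)
      window = vtk.vtkRenderWindow()
      window.AddRenderer(renderer)
      interactor = vtk.vtkRenderWindowInteractor()
      interactor.SetRenderWindow(window)
      window.Render()
      interactor.Start()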
      • You're probably right about OpenDX, but I haven't used it. Thanks for the clarification.

        Also, I think VTK has a native Tk interface, and I know VisIt is fully exposed through a Python API.
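
        If it helps anyone, this is roughly what driving VisIt through that Python API looks like. The database and variable names below are made-up placeholders, and the exact call names may differ between VisIt releases, so take it as a sketch rather than gospel:

        # Run inside VisIt's command-line interface, e.g.:  visit -cli -s plot_temperature.py
        # "example.silo" and "temperature" are placeholder names.
        OpenDatabase("example.silo")
        AddPlot("Pseudocolor", "temperature")
        DrawPlots()
        SaveWindow()    # write the current view out as an image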
  • by Wills ( 242929 ) on Saturday January 22, 2005 @12:59PM (#11441629)

    Open-source Visualisation software:

    "[We, the Science Musuem of Minnesota,] are frustrated by a lack of consolidated resources and discussion about open-source, scientific visualization development tools"

    Counter-examples:
    • MayaVi [sourceforge.net] is also quite nice. And it's Python.
    • Also check out Partiview [uiuc.edu], a fast open source viewer from the NCSA. It has its limitations, but can handle large animated multidimensional data. The AMNH uses it as the viewer for its nifty planetarium-on-your-laptop Digital Universe, and we've used it to visualize cosmic ray showers, dark matter simulations, global computer networks, clustering patterns, and as a general infovis tool. It also works in stereo, which is really nice when you have two projectors and polarizing filters. The Linux binaries suppl
  • "I" before "E" except after "C" or when sounded like "A" as in neighbor or way. But their weird and either, foreign seize neither, leisure forfeit and height are exceptions spelt right.
  • If you're not familiar with the history of the open-source 3D modelling tool known as "Blender," I highly recommend taking 5 minutes to read about it here: http://www.blender3d.org/cms/History.53.0.html It's a great story of how this particular set of proprietary code escaped to the freedom of open source under the GNU General Public License. (With a little help from an incredible fan-user base.)

  • I've tried all of the 3D-viz programs and they are all gigantic, unwieldy albatrosses. They are huge and only deal with custom file formats.

    Enter Animabob. You can take a simple 3-dimensional matrix, dump it to an output file, get a color map, then just run Animabob on the 3D matrix. It's incredibly simple. The rest of these programs, which I abhor, require custom file formats and various other crap to get them to work.

    Animabob just requires an x, y, z matrix dump to work!

    It's available at: http:/
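
    For anyone wanting to try that, here's a rough sketch of the kind of raw x, y, z matrix dump described above (NumPy used for convenience). Whether Animabob wants a particular header, byte order, or value scaling is not something I can vouch for here, so check its docs before relying on this:

    # Rough sketch: write a 3D scalar field as a flat 8-bit binary volume.
    # Header/byte-order details for any particular viewer are NOT handled here.
    import numpy as np

    nx, ny, nz = 64, 64, 64
    x, y, z = np.mgrid[-1:1:nx*1j, -1:1:ny*1j, -1:1:nz*1j]
    field = np.exp(-4.0 * (x**2 + y**2 + z**2))      # a smooth blob to visualize

    # Scale to 0..255 and dump the voxels in x, y, z order
    scaled = 255 * (field - field.min()) / (field.max() - field.min())
    scaled.astype(np.uint8).tofile("volume_64x64x64.raw")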
