Science Forums

Super Computer



http://www.bbc.co.uk/news/technology-21993132

 

I don't know too much about super computers, but isn't there still some application that something this fast could perform well? A corporation, other areas of government, etc.?

 

As I said, I don't have too much knowledge of this subject, but aren't quantum computers the wave of the future?

 

Either way the capabilities of these super computers are cool.


The biggest problem with super computers is that they've all progressed to getting their "speed" by increasing the number of processors rather than by making any single processor faster.

 

What programmers know from hard experience is that to take advantage of this parallelism, you need software tools that exploit it without making the programmer do all the work of breaking the problem down, especially for problems where the optimal division of labor across processors isn't clear from a top-level analysis of the problem and the data to be manipulated.
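
For a toy illustration of the kind of help those tools provide, here's a minimal sketch in C using OpenMP (my own example, not anything from the article): one directive asks the compiler and runtime to carve the loop up across whatever cores are available, so the programmer never hand-partitions the data.

    /* OpenMP sketch: the runtime splits the loop across threads and combines
       the per-thread partial sums; the programmer only promises that the
       iterations are independent of each other. */
    #include <omp.h>
    #include <stdio.h>

    int main(void) {
        const long n = 100000000L;
        double sum = 0.0;

        #pragma omp parallel for reduction(+:sum)
        for (long i = 0; i < n; i++)
            sum += 1.0 / (double)(i + 1);   /* harmonic series, just busywork */

        printf("sum = %f (up to %d threads)\n", sum, omp_get_max_threads());
        return 0;
    }

This only works so cleanly because the iterations don't depend on one another; once they do, the tooling gets much less helpful, which is the next point.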

 

We've gotten a few technologies that attempt to deal with this (parallel computing libraries and functional programming languages), but they're still not terribly good at dealing with massive parallelism on large-scale, interdependent datasets.

 

So the bottom line is that someone else certainly could use Roadrunner, but even if they got it for free, they might not be able to afford having the software written to solve their problem. Moreover, if their problem is 1/10 the size of what Roadrunner is used for, they can get results much more cheaply from a much smaller processor array, which these days doesn't cost very much; the programming cost is still going to be the same.

 

I can also imagine that the A/C costs for that dang thing rival those of a small town.

 

 

We can only see a short distance ahead, but we can see plenty there that needs to be done, :phones:

Buffy


Here's another article, confirming the problem is power consumption:

 

Costing more than $120 million, Roadrunner's 296 server racks covering 6,000 square feet were connected with InfiniBand and contained 122,400 processor cores. The hybrid architecture used IBM PowerXCell 8i CPUs (an enhanced version of the Sony PlayStation 3 processor) and AMD Opteron dual-core processors. The AMD processors handled basic tasks, with the Cell CPUs "taking on the most computationally intense parts of a calculation—thus acting as a computational accelerator," Los Alamos wrote.

...

Petaflop machines aren't automatically obsolete—a petaflop is still speedy enough to crack the top 25 fastest supercomputers. Roadrunner is thus still capable of performing scientific work at mind-boggling speeds, but has been surpassed by competitors in terms of energy efficiency. For example, in the November 2012 ratings Roadrunner required 2,345 kilowatts to hit 1.042 petaflops and a world ranking of #22. The supercomputer at #21 required only 1,177 kilowatts, and #23 (clocked at 1.035 petaflops) required just 493 kilowatts.

 

2.3 megawatts is enough to power about 2000 homes (an average US home draws roughly 1.2 kilowatts)....

 

Continuing the other point I made above: as the quote says, it uses a mixed processor architecture, which would require a custom operating system and libraries to manage communication between the parts of an application running on one kind of processor and the parts running on the other. One of those programming exercises that stretches one's brain.
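
To make that concrete, here's a minimal coordinator/worker sketch in C using plain MPI. It is emphatically not Roadrunner's actual Cell toolchain, just an illustration of the division-of-labor idea: one rank plays the housekeeping "host" role while the others play the "computational accelerator" role on slices of the data.

    /* One "host" rank coordinates and collects; the other ranks do the
       heavy arithmetic on their own slice of the problem. */
    #include <mpi.h>
    #include <stdio.h>

    #define N 1000000L   /* total elements; assumed divisible by worker count */

    int main(int argc, char **argv) {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        double local_sum = 0.0;
        if (rank != 0 && size > 1) {
            long chunk = N / (size - 1);             /* each worker's share */
            long start = (long)(rank - 1) * chunk;
            for (long i = start; i < start + chunk; i++)
                local_sum += (double)i * (double)i;  /* stand-in for real work */
        }

        /* Rank 0 contributes nothing but gathers the partial results. */
        double total = 0.0;
        MPI_Reduce(&local_sum, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("total = %g\n", total);

        MPI_Finalize();
        return 0;
    }

Even in this toy, notice how much of the code is about who talks to whom rather than about the arithmetic itself; on a real hybrid machine the host and accelerator sides may not even share an instruction set, which is where the custom libraries come in.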

 

A good designer must rely on experience, on precise, logic thinking; and on pedantic exactness. No magic will do, :phones:

Buffy


I understand; converting the computer to a new application seems to be more work than it's worth.

 

2.3 megawatts, wow, :o that's a lot of power. It's hard to believe the equipment needs that much to operate. We run at 3.2 megawatts in our water plant, but that's used to drive several 2000 hp motors on pumps. Our electric bills are in the millions, so that must be a big factor for anyone looking to adapt the computer for their use.

 

I wonder if they used the computer to run a simulation on how to run something like this more efficiently, thus putting itself out of a job? :blink:


While all these big computers are amazing to behold, LANL’s new “Cielo”, the replacement for their “Roadrunner”, is only the 18th fastest by most analyses, a bit slower than ORNL’s “Jaguar” was in 2009, and about 1/20th as fast as Jaguar’s upgrade, “Titan”.

 

Following the technical details of these machines is difficult for a mere software person like me (it feels like fiddling with sports statistics, really, as my grasp of the meaning of the hardware specs is sloppy at best), but the main improvements of late appear to me to be in the internal network hardware (“routers” by any other name) through which their many “blade”-style boards (they all have on the order of 10,000) exchange and share memory, for example from SeaStar2s to Cray Geminis.

 

As I said, I don't have too much knowledge of this subject, but aren't quantum computers the wave of the future?

That very much depends on who you ask, and what you mean by “quantum computer”. The term refers to two very different kinds of machines, both very different from super/petaflop-class computers like Cielo and Titan.

 

One kind, which you can actually buy, is essentially a sort of tamper-detecting network security device that uses conventional electronics and quantum mechanical principles to assure that some small, very important network communication, such as a cryptographic key, can’t have been read by anything other than its intended recipient machine without that interception being detected.

 

The other kind, which we’re referring to when we speculate they may be the “future of computing”, and which are strange lab prototypes that more resemble solid-state physics experiments than computers, are intended, essentially, to execute a near-infinite number of parallel “guessing” programs, then select the one that guessed right. IMHO, however, there remain some deep physics questions as to whether they can ever actually work. Some pretty good physics has been written reaching tentative answers of both “yes” and “no”.

 

2.3 megawatts, wow, :o that's a lot of power. It's hard to believe the equipment needs that much to operate. We run at 3.2 megawatts in our water plant, but that's used to drive several 2000 hp motors on pumps. Our electric bills are in the millions, so that must be a big factor for anyone looking to adapt the computer for their use.

For comparison, the computer-running organization I’m a small part of has 5 large data centers, which collectively draw about 10 MW. They’re used to support the health care of about 3% of the US population.

 

Much of the work I’ve done in the past decade has involved coping with moving to new data centers because old ones simply could not get enough power from their local electric companies.


http://www.space.com/20484-how-spiral-galaxies-evolve.html

 

Unless I missed it, the article doesn't say which SC they used.

It took a bit of digging, but from the last line of D'Onghia, Vogelsberger and Hernquist’s 2 Apr 2013 paper “Self-perpetuating Spiral Arms in Disk Galaxies” (available via arXiv), I found:

Numerical simulations were performed on the Odyssey supercomputer at Harvard University.

This machine is mentioned in the credits to the pretty video at Space.com and other news sites, but before digging, I wasn’t sure if it was used in the actual simulations, or just in rendering.

 

Harvard’s Odyssey supercomputer is a fairly old (2008), small one (4,096 CPU cores, 16 TB of memory; compare to Cielo’s 107,264 cores and 221 TB). It draws about 180 kW (vs. Cielo’s 4,000 kW). Nowhere in its webpages did I find a speed rating, but I’d guess it’s in the 1-10 TFLOPS range, about 1/1000th the speed of Cielo, and 1/10000th that of ATLAS.

 

As any programmer will tell you, though, it’s not the size of your tool that’s important, but how you use it. :) The big national lab computers are mainly used to simulate nuclear weapon explosions, allowing them to be designed without treaty-violating testing. Odyssey is used by astronomers and astrophysicists like D’Onghia and coauthors to model galaxies and the like. Per their paper, they’re using code written in 2005, version 3 (or maybe 2) of the GADGET software package, which was originally written by Volker Springel.
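
For a sense of the kind of arithmetic such simulations spend their cycles on, here's a toy direct-summation gravity step in C. It is only an illustration of the underlying O(N²) problem; real packages like GADGET use much cleverer tree and particle-mesh methods to avoid exactly this brute-force loop.

    /* Toy N-body "kick": every particle feels the gravity of every other.
       Real galaxy codes approximate this O(N^2) sum; this is the naive idea. */
    #include <math.h>
    #include <stdio.h>

    #define N   1000     /* particle count (tiny compared to real runs) */
    #define G   1.0      /* gravitational constant in code units        */
    #define EPS 1e-3     /* softening length, avoids singular forces    */

    static double x[N][3], v[N][3], m[N];

    static void kick(double dt) {
        for (int i = 0; i < N; i++) {
            double a[3] = {0.0, 0.0, 0.0};
            for (int j = 0; j < N; j++) {
                if (j == i) continue;
                double d[3], r2 = EPS * EPS;
                for (int k = 0; k < 3; k++) {
                    d[k] = x[j][k] - x[i][k];
                    r2 += d[k] * d[k];
                }
                double inv_r3 = 1.0 / (r2 * sqrt(r2));
                for (int k = 0; k < 3; k++)
                    a[k] += G * m[j] * d[k] * inv_r3;
            }
            for (int k = 0; k < 3; k++)
                v[i][k] += a[k] * dt;   /* velocities only; positions drift in another step */
        }
    }

    int main(void) {
        for (int i = 0; i < N; i++) { m[i] = 1.0 / N; x[i][0] = (double)i / N; }
        kick(0.01);
        printf("v[0] = (%g, %g, %g)\n", v[0][0], v[0][1], v[0][2]);
        return 0;
    }

Doubling the particle count quadruples the work in this naive form, which is a big part of why galaxy modelers want supercomputer time in the first place.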

 

These folk serve to keep me humble when I start to fantasize that any of my little simulators deserve much praise.


  • 4 months later...

How do "Flops" compare to "Cycles Per Second" {What most call "Hertz"...}

 

I've never seen that addressed.

 

Could the huge Super Computers be used for "Completely Lifelike Rendering/Animation"...

 

Meaning there would be no way to tell if it was real or virtual—no matter how closely the Tapes were analyzed?

 

Or is that still in the future?

 

 

Saxon Violence


How do "Flops" compare to "Cycles Per Second" {What most call "Hertz"...}

The acronym FLOPS in the computer context means “floating point operations per second”. A floating-point number is any of several ways of representing what in pure math we’d call a limited-range subset of the rational numbers. They’re very useful in many kinds of computing, especially scientific simulations, since they can represent many physical quantities as accurately as needed.
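
A quick, concrete illustration of those limits (my own toy example in C, not something from the post above): floating-point formats trade exactness for range, so some ordinary decimals and large integers can't be stored exactly.

    /* Floating-point numbers cover an enormous range, but with limited
       precision, so some everyday values cannot be represented exactly. */
    #include <stdio.h>
    #include <float.h>

    int main(void) {
        double a = 0.1 + 0.2;
        printf("0.1 + 0.2 as a double  = %.17g (not exactly 0.3)\n", a);

        float big = 16777216.0f;   /* 2^24; the next integer, 16777217, has no float representation */
        printf("16777216 + 1 as float  = %.1f (the +1 is lost)\n", big + 1.0f);

        printf("a double carries about %d significant decimal digits\n", DBL_DIG);
        return 0;
    }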

 

“Cycles per second”, which is typically measured in the SI unit hertz, usually refers to the rate of the “clock” signal controlling some piece of computer hardware. Another useful measurement is IPS, “instructions per second” (rarely seen without a prefix, such as M for million, making MIPS): how many instructions are executed per second.

 

Since some instructions take many clock cycles to execute, some hardware needs many instructions to perform a single floating-point calculation, and other hardware can retire several floating-point operations per cycle, the relationship between FLOPS, IPS, and clock rate for even a simple, single-CPU computer can be complicated.
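
As a rough worked example of how the units relate (all of the figures below are made up for illustration, not the specs of any machine mentioned in this thread):

    /* Back-of-envelope peak-FLOPS estimate from clock rate, operations
       per cycle, and core count; every number here is hypothetical. */
    #include <stdio.h>

    int main(void) {
        double clock_hz        = 2.5e9;     /* 2.5 GHz clock                     */
        double flops_per_cycle = 4.0;       /* e.g. a SIMD unit doing 4 per tick */
        double cores           = 100000.0;  /* cores across the whole machine    */

        double peak = clock_hz * flops_per_cycle * cores;
        printf("theoretical peak: %.2f petaFLOPS\n", peak / 1e15);
        return 0;
    }

That multiplication gives a theoretical peak of 1 petaFLOPS; real programs rarely come close to such a peak, which is part of the point made a couple of paragraphs below.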

 

For present-day supercomputers, which are really very high-speed networks of many smaller computers that are themselves sorts of networks of simpler computer chips, and where the hardware making the network work contains specialized, complicated computers of its own (“routers” is a reasonable term), the relationship gets even more complicated. These many “parallelized” computers must report to and be managed by special controlling computers to produce usable results, further complicating their description.

 

So while standard metrics of computer performance give a single FLOPS or MIPS rating to a given computer (which in nearly all cases is really a network of computers), it’s good to be mindful that this metric isn’t truly constant, but depends on exactly what sort of program the computer is running. A computer with a higher FLOPS or MIPS rating might actually run a given program slower than one with lower ratings that is better suited to that program.

 

 

Could the huge Super Computers be used for "Completely Lifelike Rendering/Animation"...

Given the very high performance of even low-end present-day supercomputers, I’m pretty sure the answer to this is a limited “yes”. The human eye has a finite number of receptors, and it and its related nerves and brain systems have a limited processing speed, beyond which more detailed and faster graphics are simply wasted.

 

The “limited” part has to do with what’s being modeled, and for how long. Even the best and most precise simulations are unable to duplicate many systems – especially those termed chaotic, which include such everyday ones as the weather and the motion of many bodies in space – so while a casual observer might find such a simulation “completely lifelike”, a smart one might be able to tell when even a very precise simulation became subtly, or un-subtly, wrong.

 

We’re also far from being able to really simulate more than tiny, constrained biochemical processes, so any simulation involving those wouldn’t hold up to close and/or prolonged scrutiny: you couldn’t cut it up and look at it under a microscope without destroying the rendering’s illusion of real life.

 

Meaning there would be no way to tell if it was real or virtual—no matter how closely the Tapes were analyzed?

For everyday visual recordings, where we’re not allowed to carefully watch the weather or examine biological material under a microscope, I think the answer to this is “yes”. This is especially true if the quality/resolution/frame rate of the recording is low, much lower than that of the simulation that produced it.

 

Keep in mind, though, that while even low-end supercomputers can make practically perfect videos, typical video display devices – “screens”, for the most part – are far from perfect reproduction devices, being optimized for typical human eyes and sight systems. So while it could be difficult to tell whether what you’re watching on a screen was recorded from nature by a camera or generated from a model of nature by a computer, that you’re watching a screen, rather than directly experiencing nature, is easy to tell. I could be wrong, but I don’t think even the best immersive VR systems come close to really fooling anybody, yet.

 

I suspect this is less of a problem than one might think, though, because we humans like, when we’re sure it’s not to our detriment, to be fooled, and perhaps even prefer our visual reality rezzed down a bit below natural. When a VR technology is good enough is, I think, largely a matter of when it can present everything we care about in a particular situation as effectively as, or more effectively than, our natural perception of actual reality, rather than of when we can no longer distinguish between them. Some, I among them, would argue we’re already there, using fairly cheap hardware like a PS3 and well-written, optimized software like the video game BioShock Infinite running under Unreal Engine.


  • 7 months later...

Many governments are investing huge amounts of money in quantum computer research, but it appears such machines are still far from being fully functional. The supercomputers of today are quite extraordinary, and it is predicted that in the next 15 years they will be able to produce weather models that will be very accurate over two-week periods.


...and it is predicted that in the next 15 years they will be able to produce weather models that will be very accurate over two-week periods.

 

They basically do this today. Accuracy is about 90% one week out and deteriorates from there, but two-week forecasts are still pretty good. It's arguable that "throwing more hardware and data" at the problem is already hitting sharply diminishing marginal returns: they could double the horsepower and the amount of data being fed into the system and they'd only gain a percentage point or two of accuracy, because the quantum/chaotic effects have far too much influence out past a couple of weeks.
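
A toy illustration of that sensitivity (my own example in C using the logistic map, not an actual weather model): two starting values that differ by one part in a billion track each other for a while and then diverge completely, no matter how precisely each step is computed.

    /* Sensitive dependence on initial conditions: a 1e-9 difference in the
       starting value of a chaotic map eventually dominates the result. */
    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double a = 0.400000000;
        double b = 0.400000001;              /* differs from a by 1e-9 */
        for (int step = 1; step <= 60; step++) {
            a = 3.9 * a * (1.0 - a);         /* logistic map in its chaotic regime */
            b = 3.9 * b * (1.0 - b);
            if (step % 10 == 0)
                printf("step %2d: a = %.6f  b = %.6f  |a-b| = %.6f\n",
                       step, a, b, fabs(a - b));
        }
        return 0;
    }

Doubling the precision or the compute only pushes the divergence a little further out, which is the forecasting analogue of the diminishing returns above.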

 

Climate change, on the other hand, is about averaged trends over the long term, so the predictions years out are proving quite accurate (although outcomes are turning out to be worse than predicted, because the models are too conservative due to pressure from the naysayers...).

 

 

Weather forecast for tonight: dark. Continued mostly dark tonight, with widely scattered light in the morning, :phones:

Buffy 


  • 1 month later...

We can only see a short distance ahead, but we can see plenty there that needs to be done

 

This is so true on so many levels it is almost funny. The longer we live, the more we tack onto that list of things to do. The technology my kids have now is nothing like what I had when I was a kid, and that's just one small example of the change. I wonder what the world is going to be like when I am 80. We have already seen a great deal of change in the past 30 years to make our lives easier/better/etc., but what will come in the next 30?

 
