Video gamers’ cravings for ever-more-realistic play have spawned a technological arms race that could help cure cancer, predict the next big earthquake in San Francisco and crack many other mathematical puzzles currently beyond the reach of the world’s most powerful computers.
The lab tests come amid growing efforts to harness the GPU for general high-performance computing, and the UNC paper promises to be something of a showstopper at the weeklong gathering of the supercomputing elite: according to the Chapel Hill team, a low-cost parallel data-processing GPU system conservatively outperforms the latest CPU-based systems by a factor of two to five on a wide variety of tasks.
Signs of a breakthrough are coming as Nvidia and ATI, the two dominant GPU makers, are opening up their technology for non-graphics-related applications.
On Wednesday, Nvidia announced the industry’s first C-compiler development environment for the GPU, called CUDA, a move that will make it easier to tap the GPU for custom applications, from product design to number crunching.
“The GPU now looks like a CPU,” Keane said. “CUDA provides a very flexible and accessible way to access the amazing performance inside the GPU in a way people can actually use.”
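For readers wondering what "the GPU now looks like a CPU" means in practice, here is a minimal sketch of a CUDA program (not taken from Nvidia's materials; the kernel name vecAdd, the array size, and the 256-thread block size are illustrative choices) that adds two large arrays in parallel, one element per GPU thread:

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread adds a single pair of elements. The body is the same
// C a CPU loop would run; CUDA simply runs it across thousands of
// lightweight threads at once.
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Host-side input and output buffers.
    float *h_a = (float *)malloc(bytes);
    float *h_b = (float *)malloc(bytes);
    float *h_c = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Device-side copies of the same data.
    float *d_a, *d_b, *d_c;
    cudaMalloc((void **)&d_a, bytes);
    cudaMalloc((void **)&d_b, bytes);
    cudaMalloc((void **)&d_c, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    vecAdd<<<blocks, threadsPerBlock>>>(d_a, d_b, d_c, n);

    // Copy the result back and spot-check it.
    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %.1f (expected 3.0)\n", h_c[0]);

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    free(h_a); free(h_b); free(h_c);
    return 0;
}

Compiled with Nvidia's nvcc, the kernel launch replaces the explicit loop a CPU version would need; running that kind of data-parallel loop body across the GPU's many execution units is, broadly, where speedups of the sort the UNC team reports come from.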
I appreciate the fact that they've mentioned the downside of number-crunching; I'm sick to death of technology being derided because it was oversold.
let the tech posts begin!
hey, is she missing fingers?
That is an amazing picture.
Please tell me there are more like this. Where did you find it, SN?
Oh, and cancer research is good, too.
So when are we going to move away from this rasterization crap and get some vector action happening?
Will AMD produce a quad-core CPU/GPU? Sounds clever:
Dual general-purpose integer CPU cores and an on-chip CrossFire solution with two GPUs.
All glued together with HyperTransport. Talk about media-center performance. Just think: smaller boxes, cooler and more powerful devices. One could almost do away with the chipset entirely, once someone devises a HyperTransport-enabled RAM module.
and BTW, nice booth babe… 🙂
#5
You're talking about AMD; chances are their machines will resemble an industrial fan with a chip attached.
This is all hype from the graphics chip makers. I remember reading an article six years ago in which the graphics companies were saying the same thing. They're just too full of themselves.
LOL – Pretty entertaining. AMD acting like it can move the market after Intel did the same to them, moving from 64-bit to dual cores.
"AMD acting like it can move the market after Intel did the same to them, moving from 64-bit to dual cores."
Um, maybe you're kidding, but AMD certainly moved the market with the first 64-bit and the first dual-core CPUs.
Interesting to see how this shakes out.
Apple & Microsoft are using this for graphical eye-candy in their OS interfaces, and Apple is using the GPU to allow OS-level photo & video editing - applying filters, etc. And it's allowing 3rd-party developers to use the frameworks in their own applications (see the per-pixel sketch after this comment).
But… this will open another "digital divide" between the people who can afford a "decent" video card and those – especially on low-end laptops – who are stuck with generic, on-board graphics, e.g. Intel GMA 900/950.
Will I be happy with the performance of a machine with GMA 950 graphics, or would I be better served buying a laptop with the same processor – but an up-scale ATI/Nvidia GPU?
Will the software tie-ins to the GPU – in the next couple of years – dramatically expand the performance difference between a GPU-equipped and a GMA-equipped computer? [all else being equal]
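To give a sense of the per-pixel work those OS frameworks hand off to the GPU, here is a rough CUDA sketch (my own illustration, not Apple's or Microsoft's code; the function names brighten and brightenOnGpu and the 16x16 block size are made up for the example) of a brightness filter that assigns one GPU thread to each pixel of a grayscale image:

#include <cuda_runtime.h>

// One thread per pixel: scale an 8-bit grayscale image's brightness by
// 'gain' and clamp to the valid range. This is the shape of the work an
// OS-level image filter pushes onto the GPU.
__global__ void brighten(const unsigned char *in, unsigned char *out,
                         int width, int height, float gain)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height)
        return;

    float v = in[y * width + x] * gain;
    out[y * width + x] = (unsigned char)(v > 255.0f ? 255.0f : v);
}

// Host-side launch helper: cover the image with 16x16-pixel tiles, one
// thread block per tile. 'd_in' and 'd_out' are device buffers.
void brightenOnGpu(const unsigned char *d_in, unsigned char *d_out,
                   int width, int height, float gain)
{
    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x,
              (height + block.y - 1) / block.y);
    brighten<<<grid, block>>>(d_in, d_out, width, height, gain);
}

A discrete ATI/Nvidia part simply has far more parallel units to throw at loops like this than integrated graphics does, which is roughly what the performance-gap question above comes down to.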
#10 – Will I be happy with the performance of a machine with GMA 950 graphics, or would I be better served buying a laptop with the same processor – but an up-scale ATI/Nvidia GPU?
Why is that question important?
You will be happy with the run-of-the-mill graphics chip if you are an Office/Internet user. If you are the average user, you'll never know the difference.
If you are a media content developer or a gamer, you'll need the upscale power… but that raises the question: why a laptop?
My brother still uses a 100,000,000-year-old Commodore Amiga for graphical projects. I wonder how far our socks would be blown off if its chipset could be brought up to current tech standards. Maybe the problem with current computers is more the architecture than the density of microscopic transistors.
YAY!~ Booth babes!!!