Processor, meet graphics
The ink is now dry on the AMD/ATI merger, so we can start speculating about what it means for the industry. I wonder what Intel thinks of the situation?
Advanced Micro Devices and ATI Technologies have finalized their $5.4 billion merger and are now on track to go forward with plans to fuse the CPU and GPU, integrating AMD’s x86 processor with a graphics processor on a single piece of silicon by early 2009.
The project is codenamed “Fusion” and is designed to lead to chips for a wide range of applications, from laptops to servers, allowing a PC’s graphics processor and the “brain” of the computer to be integrated on one chip on the motherboard.
Even considering the increased power efficiency of a single device, chip density is already causing thermal management issues. Will an AMD/ATI “Fusion” chip cause more problems than it solves?
I like the code name.
Ultimately, wouldn’t putting all graphics functions on a CPU result in the same kind of shared-memory problems (slower video response, primarily) as memory-sharing video chip-on-the-mobo designs? For a better user experience on those, you throw in your own video card with its own processor and memory. I don’t see how a consolidated chip would change that, except maybe to lock out third-party video cards.
Maybe they’re planning specialty processors for portable DVD players, set-top boxes and the like. Would those markets justify a $5.4bn purchase?
Was the razor named after you also? That was really cold. : )
This should make Intel and nVidia stock go up.
It sounds like the chipset could double as a space heater as well.
No ties, one wearing a golf shirt… get it?
#6
Couldn’t have said it better 🙂
NOW you will be stuck with the MOBO chipset video…
I can understand the desire for fewer components. Simply put, the vast majority of computers are bought by people who do not need the latest and greatest hardware. Cost is very important to them and if they can shave a few bucks and claim greater performance for the bargain crowd, then it might work.
My guess is the idea will die on the product acceptance side. Integrating graphics with the chipset has proven to be the way to go – not with the CPU.
– Enthusiast segment will reject it
– Low end is better served with the cheaper chipset integration path
#1….Mr. Fusion….HAHAHAHAHAHAHAHAHAHAHAHAHA
As soon as I read it, I just knew you would be pleased.
#2 Ultimately, wouldn’t putting all graphics functions on a CPU result in the same kind of shared-memory problems (slower video response, primarily)
—-
Not exactly. The original Xbox had a vastly different bus architecture; its video and system RAM were shared, with only 64 MB to use.
The reason shared RAM is BAD for a normal PC is that video traffic has to pass through a slower bus to reach the RAM, work its way around the CPU, and come back. That’s why it’s better for video cards to have their own local RAM, which is mostly used for texture storage.
The reason we have video cards approaching 1 GB is that games are starting to use LARGE texture sizes, and many different unique textures that apply to only a few surfaces (or just one).
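Just to put rough numbers on the bus argument, here’s a quick back-of-envelope sketch in Python; the bandwidth figures are my own assumptions for 2006-era parts (shared DDR2 system memory versus a card’s dedicated GDDR3), not spec-sheet values.

# Back-of-envelope: moving texture data over shared system RAM vs. dedicated VRAM.
# Bandwidth numbers below are rough assumptions, not measured or official figures.

texture_mb = 256.0            # amount of texture data to move, in MB
shared_ram_gb_s = 6.4         # assumed usable bandwidth of shared DDR2 system memory (GB/s),
                              # which the CPU is also fighting over
local_vram_gb_s = 40.0        # assumed bandwidth of a card's dedicated GDDR3 (GB/s)

def transfer_ms(size_mb, bandwidth_gb_s):
    """Milliseconds to move size_mb of data at the given bandwidth."""
    return size_mb / (bandwidth_gb_s * 1024.0) * 1000.0

print("over shared system RAM: %.1f ms" % transfer_ms(texture_mb, shared_ram_gb_s))
print("out of dedicated VRAM:  %.1f ms" % transfer_ms(texture_mb, local_vram_gb_s))

Same textures either way; the only difference is the pipe they have to squeeze through, which is the whole point of giving the card its own memory.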
However, the one thing I don’t like about a combo video/CPU is that it will be more of a challenge to know which chip is better than another. They can create many chips with the same CPU performance but different grades of GPU performance… and vice versa. The market isn’t ready for something like this; it would just further confuse consumers about what to buy, which is why video and CPU have always been separate.
Now, if this were for portable devices and notebook PCs, then I think it would make sense, since the consumer wouldn’t be making direct purchases of these hybrid GPU/CPUs. It would all be OEM anyway.
#6 I hear AMD is now working out the design details of its new heat sink with George Foreman.
Crap, does this mean that we will now have to install AMD/motherboard drivers in myriad different and cryptic orders to get them working?
Can you say All-In-Wonder (how to get this dang thing to keep working).
Geez, don’t you guys remember when the math co-processor was OUTSIDE the CPU and sold as a $300 option?!?!?
Only good can come of such integration: cheaper prices. The sub-$100 computer and the sub-$300 laptop are good things.
When you integrate, the total number of amps consumed is LESS: less energy, less overall heat. Better for the environment.
16,
NOT when two corporations want money for the work.
THEN try to upgrade your system without getting a NEW CPU.
I think the factors driving this purchase are:
1. Synergies in chip design and manufacturing — each party has strengths and talents that can benefit the other.
2. Moore’s Law — “We can barely figure out how to use 100 million transistors; what will we do with 200 million?” “I know — graphics!” A variation on the multi-core processor. You can only use so many CPU cores at a time — what possible advantage will the 80-core CPU have?
3. The corporate market — Companies that need PCs for thousands of desks would love to save even the few dollars that on-the-motherboard graphics add to the cost per machine. Integrated graphics should be more reliable than cards, and even more reliable than mobo chipset graphics.
Uncle Patso
18,
#3…
A corp that purchases a $600 graphics card JUST to type letters is a FOOL and will go bankrupt (I hope)… Integrating it only FORCES you to pay MORE every time you need a NEW graphics card… because you can’t get one without buying a NEW CPU along with it.