AMD, the world's second-largest CPU manufacturer, has big plans for the future of the CPU and the GPU. Since its purchase of ATI, AMD has been working for years on a project named AMD Fusion, which will place the CPU and GPU on the same piece of silicon. The resulting APU, or Accelerated Processing Unit, has a clear advantage: CPUs are designed for serial calculations and GPUs for parallel ones, so combining them creates a single core where serial and parallel processing can happen side by side. AMD has already demonstrated a working example of the APU and plans to debut the technology in 2011.
What do you think of AMD Fusion? Is it innovative and do you think it will work well?
Answer by r0bErT4u · Jul 09, 2010 at 06:11 AM
Many motherboards already have integrated graphics. Combining the CPU & GPU is a logical step, especially for AMD & ATI. Eventually everything will be combined into one solid-state part, including the operating system. If it breaks, replace that part.
Sorry, but no supporting links or pictures ... just the way I see things going. A bunch of script monkeys for support, and idiot technicians who just replace parts. No more working on the OS or software; just replace it. We geeks will go the way of the dinosaurs. The Apple Lemmings will continue to buy Apple gadgets and never question the loss of privacy & ownership of the products they paid for.
Answer by Leapo · Jul 07, 2010 at 12:18 AM
It will be nice for low-power systems (like low-end desktops) and laptops, ensuring that such systems come with a decent graphics processor onboard. It's not going to change a whole lot for mid-range and high-end desktop computers: dedicated graphics cards will still be faster than the integrated GPU, so you'll still see plenty of graphics cards being sold.
If they're smart about it, they'll figure out some way to use those extra GPU units on the CPU for general computation when a discrete graphics card is present. That way, they don't go to waste. A rough sketch of what that could look like on the software side is below.
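To make the idea concrete, here is a minimal sketch, assuming an OpenCL 1.x host program (the device loop and scheduling comment are illustrative assumptions, not AMD's actual mechanism): it enumerates every GPU-class device, which would include the APU's on-die GPU alongside any discrete card, so general-purpose kernels could be handed to whichever has spare compute units.

    /* List every OpenCL GPU device so compute work could be sent to the
     * integrated (APU) GPU even while a discrete card handles graphics.
     * Illustrative sketch only; assumes an OpenCL runtime is installed.
     * Build (Linux): gcc list_gpus.c -lOpenCL -o list_gpus */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void)
    {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;

        if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS) {
            fprintf(stderr, "No OpenCL platforms found\n");
            return 1;
        }

        for (cl_uint p = 0; p < num_platforms; ++p) {
            cl_device_id devices[8];
            cl_uint num_devices = 0;

            /* Ask only for GPU-class devices: this would return both the
             * APU's on-die GPU and any discrete graphics card. */
            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8,
                               devices, &num_devices) != CL_SUCCESS)
                continue;

            for (cl_uint d = 0; d < num_devices; ++d) {
                char name[256];
                cl_uint compute_units = 0;

                clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                                sizeof(name), name, NULL);
                clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                                sizeof(compute_units), &compute_units, NULL);

                /* A scheduler could dispatch general-purpose kernels to
                 * whichever device is idle instead of letting the
                 * integrated GPU sit unused. */
                printf("GPU: %s (%u compute units)\n", name, compute_units);
            }
        }
        return 0;
    }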
Answer by Justen Robertson · Jul 07, 2010 at 12:19 PM
One of the nice things about the two pieces of hardware being separate is that you can upgrade them independently. It's not too hard to fork out $150 or so every year for a two-year-old video card so you can run modern games at reasonable framerates, but if you have to upgrade an integrated solution you're looking at a larger expense (especially with CPU manufacturers' love of changing sockets, and thus motherboards, versus GPUs, which tend to stick with the same slot type for half a decade or more). The real problem, I think, is that PC gaming is not very economical or practical in general, but this is going to be a hard sell to serious PC gamers. As mentioned above, it may do well in laptops and netbooks if AMD can manage to keep the heat down, which they're horrible at (as I write this, the fan in my tablet PC is howling like a banshee to keep the AMD CPU inside below 90 degrees centigrade).