

AMD, the world's second-largest CPU manufacturer, has big plans for the future of the CPU and the GPU. With the purchase of ATI, AMD has been working for years on a project named AMD Fusion, which will place the CPU and GPU onto the same piece of silicon. The resulting APU, or Accelerated Processing Unit, has inherent advantages: CPUs and GPUs were designed for serial and parallel calculations respectively, and combining them forms a central core where serial and parallel processing can occur simultaneously. AMD has already demonstrated a working example of the APU, and plans to debut the technology in 2011.

Link to Wikipedia article

What do you think of AMD Fusion? Is it innovative and do you think it will work well?

asked Jul 06 '10 at 22:51 by Baoster Wowster

edited Jul 06 '10 at 23:12 by Leapo


I think it will work well, and if Apple plans on switching to AMD processors, that would be one of the best times to switch.

answered Jul 06 '10 at 23:10 by catchatyou

Many motherboards already have integrated graphics. Combining the CPU & GPU is a logical step, especially for AMD & ATI. Eventually everything will be combined into one solid-state part, including the operating system. If it breaks, replace that part.

Sorry, but no supporting links or pictures ... just the way I see things going. A bunch of script monkeys for support, and idiot technicians that just replace parts. No more working on the OS or software; just replace it. We geeks will go the way of the dinosaurs. The Apple lemmings will continue to buy Apple gadgets and not question the loss of privacy & ownership of the products they paid for.

answered Jul 09 '10 at 05:11 by r0bErT4u

I wouldn't worry about geeks and hackers going anywhere. You may not have noticed (and I wouldn't blame you), but the cost of home manufacturing is plummeting, and open-source hardware (including CPUs) is making huge progress. For what it cost 20 years ago to OWN a home computer, you can now own most of the machines you need to build your own from raw materials. It's a good time for us to be alive. Let the "lemmings" have their brain-free gadgets; we'll hack together our own. In another decade it will be unthinkable for people like us to buy technology pre-assembled.

(Jul 12 '10 at 12:31) Justen Robertson

It will be nice for low-power systems (like low-end desktops) and laptops, assuring that such systems will come with a decent graphics processor onboard. It's not going to change a whole lot for mid-range and high-end desktop computers. Dedicated graphics cards will still be faster than the integrated GPU, so you'll still see plenty of graphics cards being sold.

If they're smart about it, they'll figure out some way to use those extra GPU units on the CPU for general computation when a discrete graphics card is present. That way, they don't go to waste.
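
To illustrate the idea, here is a minimal sketch in C using the OpenCL host API, which AMD's GPUs already support. It simply enumerates every GPU the platform exposes; on a Fusion-style system that list could include both the on-die GPU and a discrete card, and a scheduler could hand compute kernels to whichever one would otherwise sit idle. This is an assumption about how such offload might be wired up, not AMD's announced mechanism:

    /* Sketch: enumerate all GPUs via OpenCL so general-purpose work can
       be routed to an integrated GPU while a discrete card does graphics.
       Illustrative only; error handling kept minimal. */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platform;
        cl_uint num_platforms;
        if (clGetPlatformIDs(1, &platform, &num_platforms) != CL_SUCCESS
                || num_platforms == 0) {
            fprintf(stderr, "no OpenCL platform found\n");
            return 1;
        }

        cl_device_id devices[8];
        cl_uint num_devices;
        /* Ask for every GPU; a Fusion-style system could report both the
           on-die GPU and a discrete card here. */
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices,
                           &num_devices) != CL_SUCCESS) {
            fprintf(stderr, "no GPU devices found\n");
            return 1;
        }

        for (cl_uint i = 0; i < num_devices; ++i) {
            char name[256];
            cl_uint units;
            clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof name, name, NULL);
            clGetDeviceInfo(devices[i], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof units, &units, NULL);
            printf("GPU %u: %s (%u compute units)\n", i, name, units);
        }
        return 0;
    }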

answered Jul 06 '10 at 23:18 by Leapo

edited Jul 07 '10 at 10:26

AMD is planning to target both netbooks and desktops. AMD demonstrated that the APU developed for netbooks can run modern graphics-intensive games at good quality, so the power of the desktop-targeted APU could possibly compete with top-end CPUs and GPUs.

(Jul 07 '10 at 08:47) Baoster Wowster

Yes, low-end desktops that would otherwise have integrated graphics will benefit from this.

It will in no way be able to compete with top-end graphics cards. Modern CPUs have 758 million transistors, while modern GPUs have 3 billion.

An APU that could compete with a modern GPU would need to be more than 5 times larger than current CPUs, would have to be rated at over 300 watts, and would need a heatsink the size of a cinder block to keep it cool. Such a pointless monster of a chip would also cost in the neighborhood of $800. It's just not feasible.

(Jul 07 '10 at 10:22) Leapo

The netbook-oriented version is targeted at under 10 watts, and the desktop-oriented version at under 100 watts. o_o

(Jul 07 '10 at 10:40) Baoster Wowster

Maybe you should research a little more... xD

AMD is planning to base the GPU portion of the chip on the Radeon series and the CPU portion on its Athlon II and Phenom II x64 CPUs. Thus, the transistor count is as high as any combination of current CPUs and GPUs. However, by placing the GPU and CPU on the same chip, AMD has reduced latency between the two units and lowered the power requirements of the APU. The APU is not about pure power and wattage; it's about efficiency. Apparently even 9-watt APUs can run modern games quite well (this is comparable to Intel's 8.5-watt Atom processors). AMD plans for the quad-core version of its APUs to run at about 55 watts, so who knows what a 100-watt APU may be capable of?

(Jul 07 '10 at 12:46) Baoster Wowster

I've done plenty of research. You, on the other hand, don't appear to have a very firm grasp of just what is, and is not, possible with this type of processor.

What AMD is producing is a low-wattage CPU with the rough equivalent of a Radeon HD5550-class GPU onboard.

As I keep telling you, such a chip will not come anywhere close to matching the performance of high-end graphics cards like the HD5970 or the GeForce GTX480. Like I said before, it would take a chip with almost 4 billion transistors (with 3 billion of those dedicated to graphics) to match current high-end graphics cards. Such a monstrously large processor would need hundreds of watts of power and put out an absurd amount of heat.

For reference, Fusion CPUs are estimated to be closer to the 1.2 billion transistor mark. A far cry from the high-end GPUs of today.

(Jul 07 '10 at 16:07) Leapo

Yes, but you have to recognize that AMD Fusion's microarchitecture is completely different from any microarchitecture of today. Currently, increasing performance has been solely about doubling the number of transistors on a piece of silicon every 2 years. Placing the CPU and GPU side by side has its inherent advantages: all data moves within the chip without crossing an external bus, the CPU and GPU will run on par with each other, the GPU will be able to access the cache, etc. These differences in microarchitecture alone are enough to compensate for the transistor count.
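
As an illustration of why staying on-die matters, here is a hedged sketch in C using OpenCL's CL_MEM_ALLOC_HOST_PTR flag. On hardware where the CPU and GPU share memory, mapping such a buffer lets the CPU write directly into memory the GPU will read, instead of staging a copy across an external bus; whether the copy is actually elided depends on the runtime:

    /* Sketch: a zero-copy-style buffer. On shared-memory hardware the
       map below can avoid a bus transfer entirely. Illustrative only;
       error handling kept minimal. */
    #include <stdio.h>
    #include <string.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platform;
        cl_device_id device;
        cl_int err;
        clGetPlatformIDs(1, &platform, NULL);
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
        cl_command_queue q = clCreateCommandQueue(ctx, device, 0, &err);

        /* Ask the runtime for memory both CPU and GPU can reach. */
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_ALLOC_HOST_PTR,
                                    1024 * 1024, NULL, &err);

        /* Map instead of copy: the CPU fills the buffer in place. */
        float *p = clEnqueueMapBuffer(q, buf, CL_TRUE, CL_MAP_WRITE, 0,
                                      1024 * 1024, 0, NULL, NULL, &err);
        memset(p, 0, 1024 * 1024);
        clEnqueueUnmapMemObject(q, buf, p, 0, NULL, NULL);
        clFinish(q);

        clReleaseMemObject(buf);
        clReleaseCommandQueue(q);
        clReleaseContext(ctx);
        printf("buffer prepared without an explicit host-to-device copy\n");
        return 0;
    }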

(Jul 07 '10 at 22:24) Baoster Wowster

Increasing performance is NOT based solely upon doubling the number of transistors on a given chip. More efficient instruction sets and faster buses are major contributing factors as well.

Yes, placing the CPU and GPU side by side does have advantages. The small number of GPU stream processors that AMD is going to include on these Fusion CPUs perform very efficiently. This DOES NOT make up for the difference in transistor count between the integrated GPU on a Fusion CPU and a high-end graphics card like the HD5970 or the GTX480.

The integrated GPU on Fusion CPUs performs about the same as an HD5550. That's roughly 32 times slower than an HD5970, and I'll tell you exactly why that's the case:

  • A Fusion CPU will have roughly 30 GB/s of memory bandwidth when using DDR3. Modern GPUs have more like 256 GB/s of memory bandwidth using GDDR5.
  • Not only will it be starved of memory bandwidth, the integrated GPU has to share what little bandwidth there is with the CPU. A dedicated GPU has its own dedicated RAM.
  • A Fusion CPU will have roughly 320 GPU stream processors; an HD5970 has 3200.

It is not up to par with high-end GPUs, end of story. What part of this don't you understand?
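
As a back-of-the-envelope check, here is a tiny C program working through the round figures listed above. The raw ratios it prints are only part of the story; the overall "roughly 32 times" estimate would also fold in clock speeds, architecture, and the fact that the integrated GPU shares its bandwidth with the CPU:

    /* Ratios implied by the figures quoted in this thread. */
    #include <stdio.h>

    int main(void) {
        double fusion_bw = 30.0;    /* GB/s, dual-channel DDR3, shared with CPU */
        double discrete_bw = 256.0; /* GB/s, GDDR5 on a high-end card */
        int fusion_sp = 320;        /* stream processors, integrated GPU */
        int discrete_sp = 3200;     /* stream processors, HD5970 */

        printf("memory bandwidth gap: %.1fx\n", discrete_bw / fusion_bw); /* ~8.5x */
        printf("stream processor gap: %dx\n", discrete_sp / fusion_sp);   /* 10x */
        return 0;
    }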

(Jul 08 '10 at 23:09) Leapo

You seem to think that AMD Fusion will be similar to Intel's integrated graphics/GMA... This is not the case. AMD will base its Fusion processors on its discrete graphics cards. Intel's GMAs can only handle Windows Aero; AMD Fusion can handle modern games.

When I say compete with high-end graphics cards, why do you suddenly try to compare AMD Fusion to the GTX480... You can't even compare any other graphics card of today to the GTX480... Here's a reference list of what is considered a high-end GPU. Drawing a concrete comparison between AMD Fusion and current GPUs/CPUs is impossible; they're completely different microarchitectures. We will have to wait for benchmark tests before we can actually place AMD's APUs above or below certain GPUs/CPUs.

Also, would you please cite those specs? Any released specs are most likely for the netbook line of Fusion APUs, which is the only line AMD has demonstrated publicly; this line is targeted at competing with Intel's Atom line of CPUs, and if you're trying to compare an Atom/netbook-level APU to top-end graphics... I'd say that's not a fair comparison.

(Jul 09 '10 at 04:22) Baoster Wowster

Yes, I compared it to a top-end graphics card, because you quite clearly said you thought it would compete with top-end graphics cards. And I quote:

the power of the desktop targeted APU could possibly compete with top-end CPUs and GPUs.

It simply will not be able to compete with anything I would consider "top-end."

Yes, I realize it's quite different from almost anything we have today as far as graphics processors go. That doesn't change the fact that you can't squeeze a 5970's worth of hardware into a CPU-size package. The specs will be lower, and the performance will be lower. This I can assure you.

Dual-channel DDR3 tops out at about 30 GB/s. The last high-end card with that little memory bandwidth was the GeForce FX 5950, and some people believe that even its GPU was bottlenecked by it.
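
For reference, that ceiling falls straight out of the arithmetic: peak DDR bandwidth is channels x bus width x transfer rate. A small sketch in C, assuming a dual-channel controller with 64-bit channels and DDR3-1866 (about the fastest speed grade available):

    /* Peak theoretical bandwidth of dual-channel DDR3-1866. */
    #include <stdio.h>

    int main(void) {
        int channels = 2;              /* dual-channel memory controller */
        int bytes_per_xfer = 8;        /* 64-bit channel = 8 bytes per transfer */
        double xfers_per_sec = 1866e6; /* DDR3-1866: 1866 MT/s */
        double gbps = channels * bytes_per_xfer * xfers_per_sec / 1e9;
        printf("peak bandwidth: %.1f GB/s\n", gbps); /* ~29.9 GB/s */
        return 0;
    }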

Note: This does not mean I don't believe it will perform admirably in modern games. It most likely will, but again, not nearly as well as a real top-end graphics card.

(Jul 09 '10 at 04:35) Leapo

the power of the desktop targeted APU could possibly compete with top-end CPUs and GPUs.

Not the netbook version that AMD has revealed and demonstrated.

(Jul 09 '10 at 05:34) Baoster Wowster

Yes, I know. The desktop-targeted one has no chance of competing with top-end discrete graphics cards either. It's not technically feasible to make a CPU that large.

Even if they did attempt to go for a GPU that fast on the CPU, you'd also need 8 times more memory lanes on the motherboard, and lots of small sticks of DDR3 in parallel, in order to muster up the massive amount of bandwidth a GPU that fast would require. That's a pointlessly expensive design.

(Jul 09 '10 at 10:53) Leapo

One of the nice things about the two pieces of hardware being separate is that you can upgrade them independently. It's not too hard to fork out $150 or so every year for a two-year-old video card so you can run modern games at reasonable framerates, but if you have to upgrade an integrated solution you're looking at a larger expense (especially given CPU manufacturers' love of changing sockets, and thus motherboards, versus GPUs, which tend to stick with the same slot type for half a decade or more). The real problem, I think, is that PC gaming is not very economical or practical in general, but this is going to be a hard sell to serious PC gamers. As mentioned above, it may do well in laptops and netbooks if AMD can manage to keep the heat down, which they're horrible at (as I write, the fan in my tablet PC is howling like a banshee to keep the AMD CPU inside running below 90 degrees centigrade).

answered Jul 07 '10 at 11:19 by Justen Robertson

This will not stop you from buying the latest-and-greatest PCIe graphics card as an upgrade to the integrated solution. Better still, when you use a dedicated graphics card, those extra units on the CPU can be used to increase its parallel computing performance.

Installing a dedicated graphics card will be an even bigger performance boost than before. Sounds nice, eh?

Also, AMD processors have been very cool-running since the release of the original Athlon 64 back in 2003. I have a Phenom II X4 running at 4GHz right now, and it idles at only 32°C (load temperatures never exceed 45°C). I would check to make sure your heatsink is making proper contact, if I were you.

(Jul 07 '10 at 16:20) Leapo

My Athlons always ran cool (even with some of the ridiculous overclocking and modding we all used to do to them), but AMD's mobile CPUs are notorious for running hot. I actually can't reseat the heatsink & fan, as they used some sort of thermal cement on it and I'm afraid I'll damage the chip if I try to remove it. Worse yet, it's one of those nasty little proprietary ducted fans, so I'd have to severely mod the chassis to slap something better on it. The whole TouchSmart series is known for its heat & fan issues - I guess that's what we get for being early adopters. Anyway, yeah, it may be more of an engineering problem on HP's part than on AMD's, but I continue to see similar complaints about AMD mobile CPUs running hot in other systems, so I'm blaming them. :)

(Jul 07 '10 at 18:02) Justen Robertson

Yes, I agree with Justen Robertson. Mobile AMD CPUs overheat a lot. The fans and cooling systems on notebooks just can't handle AMD. xD

(Jul 07 '10 at 22:27) Baoster Wowster