
Thread: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

  1. #1
    mrcrusty's Avatar Primicerius
    Join Date
    Jan 2007
    Location
    Australia
    Posts
    3,090

    Default The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    Disclaimer: This post is really my thoughts wandering around. Any resemblance to a professional article on a tech website is desired but completely coincidental.

    An odd thing sparked this topic: while coming home on the train from university today, I noticed many people had laptops. Some were working, others were on the net (Facebook, YouTube, etc.) and others were playing games.

    What intrigued me was the technology "under the hood", so to speak. While technology is always growing and ever evolving, the past 5 years or so have mainly seen existing technologies in personal computing being added to and made more powerful. We have more powerful CPUs with more cores. We have more powerful GPUs with more cores. We have higher density RAM that runs at faster speeds. We now utilise flash memory heavily as a permanent form of storage (SSDs) in addition to a temporary, light-duty form of storage (flash drives).

    To say there have been no technological advancements in the area of personal computing hardware is silly. It's simply not true, yet the technology and hardware advancements we have made since 2006 have mostly been evolutionary in nature. My current PC compared to my old one from 2007 is bigger, faster and stronger, but aside from more USB ports and using SATA over IDE, there's been little difference to either the system composition or what role each component plays in my system.

    There is something to be said about the so-called move towards cloud computing, or the increasing focus on parallel computing, but both, particularly the latter, are held back by software limitations far more than by hardware limitations.

    Over the next 5 years, however, and especially beyond them, we will see some major changes to how we view personal computing hardware.

    There are many areas here I would like to cover, but the random thoughts buzzing through my head without pictures, graphs or interesting excerpts on the net to provide context would make the read rather un-enjoyable (maybe next time, LightPeak, PT switched mode power supplies, Graphene).

    So I'll focus on the elephant in the room for portable computer users and gamers alike: Accelerated Processing Units.


    Now the APU is just a fancy term AMD came up with to classify their AMD Fusion project.

    Spoiler Alert, click show to read: 

    Not to mention it's a popular character on The Simpsons.


    What it really means, however, is that they will take two separate and distinct elements of a computer, the Central Processing Unit and the Graphics Processing Unit, and combine them onto a single integrated circuit (chip).

    Despite what AMD's marketing department would have you believe, it's hardly a new or original concept. The idea of integrating a graphics processor into a system without the need for a physically separate card has been around for a very long time.

    Discrete (separate) graphics processors only became a commercial necessity in the 1980s, despite computers being in use, and able to present images, for decades before then. As 2D images became more complicated and detailed, it became too draining for a single chip (the CPU) to handle them on its own in addition to the other instructions it needed to process. This, combined with the move to 3D imaging on PCs, created the need for discrete graphics cards.

    More recently, integrated chips, usually embedded in the motherboard's chipset, have come to make up around 75% of graphics processors around the world. This makes Intel the largest graphics processor manufacturer on the planet, with over 50% of total graphics processor market share, dwarfing both AMD and Nvidia. Funny, really.

    Spoiler Alert, click show to read: 

    Integrated v Discrete graphics on an AMD/ATI desktop chipset + Integrated v Discrete on notebooks.
    Read the full article here.


    Total graphics market share. Q2, 2010.
    Read the full article here.


    Even more recently, Intel released their Sandy Bridge platform CPUs with an integrated graphics processor attached to the CPU, similar to AMD's Fusion.

    So why the big hubbub about AMD's Fusion project if it's not new, or original?

    It represents the direction AMD wants to take into the future, and likely signals what is to come from the other major players in the industry (Intel, Nvidia, etc.).

    For the last 15 years or so, integrated graphics chips have had the crippling weakness of being too lightweight for 3D applications, sometimes including videos but most notably games and rendering applications. This is why many people today still use discrete graphics cards.

    AMD Fusion aims to radically change how portable and personal computers are seen by providing an amazing performance-per-watt ratio. They are aiming for (eventually) a high end CPU and a high end GPU with lower power consumption than their individual counterparts, through efficient design, a smaller transistor size and improved utilisation of both GPU and CPU in parallel.

    AMD embarked on this project in 2006. 2006 is also the year that they bought ATI Technologies, the second largest discrete graphics card manufacturer at the time, and the year they lost the performance crown to Intel in the CPU department. They have yet to take it back since. An interesting coincidence, but it also indicates that AMD Fusion is more than a single product; it's a platform upon which future products and designs are made. Otherwise, why take so long to develop it at the expense of other products?

    Spoiler Alert, click show to read: 
    Fusion is expected to make its debut in the late 2008/early 2009 timeframe. AMD said that it will use the technology within all major computing segments, including mobile, desktop, workstation, server, consumer electronics as well as products for emerging markets. Further details were not provided, but it is obvious that AMD can play with several ideas, ranging from strategies that reduce the cost of today's CPU/GPU combinations to high performance platforms that leverage the floating point capabilities of graphics engines. AMD believes that "modular processor designs leveraging both CPU and GPU compute capabilities will be essential in meeting the requirements of computing in 2008 and beyond."
    I think AMD set their watch behind by two years, but that's okay. Better late than never.
    Read the full article here.


    You said "a platform upon which future products and designs are made." What do you mean by that?

    AMD intends to take this method of creating APUs and make it mainstream.

    The first generation AMD Fusion products have been aimed at netbooks and low to middle range notebooks, in addition to the HTPC market they are targeting with their desktop iterations of Fusion. As time goes on and AMD refines the process, they will progressively scale up and down from that middle ground, meaning they will create both higher end and lower end products.

    The higher end products will undoubtedly include high end notebooks and eventually move onto middle to high end desktops too.

    As for the lower end products, this means AMD will target the mobile market; tablets in particular, as the smartphone space is saturated with competition. The tablet market is crowded too, but there is room to maneuver. A phone is a phone no matter what, and is unlikely to need a design that can be scaled to be more powerful or weaker. The small screen also limits the usefulness of what Fusion aims to do. Tablets, however, may be a viable market, as they are increasingly being used for movies, games and multitasking.

    AMD Fusion has significant implications for these markets.

    Spoiler Alert, click show to read: 
    Seifert also addressed the tablet question, the issue that some said had forced Meyer out. Over the last couple of months, he said, usage models had shifted the market in favor of AMD, including 3D graphics and higher overall performance.

    AMD's customers may prefer to wait until the second-generation APU products coming next year before AMD can address a significant amount of those form factors, Seifert said. But some tablet makers are buying AMD chips now, he added.

    "So we see that there are parts of this form factor that move in our direction," Seifert said, according to the official transcript. "And we were really happy to see the Acer just in the beginning of the month launched the first Windows based tablet based on our low power Bobcat architecture. So we see that there are usage scenarios in this tablet segment that move towards our capability. At the same time, we are continuing to work hard on reducing our power consumption for our products in the low power segment and the same time being able to deliver and continue to deliver cutting edge graphics capability. So especially with the products--with the second generation APU products that are on the roadmap for next year, we feel confident that we can also address a significant amount of those form factors out in the tablet space."

    He also said that a significant source of profits remained the server segment, and that it was important "not to be distracted from where those opportunities are".
    AMD Interim CEO Thomas Seifert on AMD Fusion after Dirk Meyer resigns as AMD CEO.
    Read the full article here.

    “Think about the advantages you get from a power performance and a form-factor perspective when you can take GPU and CPU and put them on the same die. You get enormous power efficiencies which is going to enable, not only things like tablet and slate, but these great experiences when you use it,” said Leslie Sobon, vice president of marketing at AMD, in an interview with Techradar web-site.

    For example, the most popular slate-type PC of today – Apple iPad – lacks Adobe Flash, multi-tasking and a number of other crucial features. Everything that Apple iPad and its successors with ARM processors inside lack will be fully supported by AMD’s Ontario processors, which will ultimately revolutionize the market of tablet personal computers.

    "The form factor is one thing but you have to be able to have a great experience with a tablet. Multi-task, watch flash videos; there are things you are going to want to do as a user. People know what they want and web video is pretty much in the top three of every wish list. They might not think of it as web video – they certainly don’t think of it as Flash – all they know is when they go to YouTube they want it to work. They don't want it to flicker and all the better if they can get it in HD or upscale it and that kind of thing,” said Ms. Sobon.
    Fusion is teh greetest.
    Read the full article here.



    What can we realistically expect from 1st gen. AMD Fusion APUs (Llano & lower)?

    Now most of my writing is just hyping up Fusion but let's be reasonable. AMD Fusion will not be some incredible beast giving us HD 6950 and Phenom II x6 performance on a tiny chip with a TDP of 25w. At least not yet.

    It is my belief that the first generation AMD Fusion APUs will be very competitive in the current notebook market. They won't provide absolute performance supremacy over the competition at their price point, but they will be better in many cases. The CPU of Llano (K10-based Athlon II, not Phenom II as I had previously believed) is not likely to be much better than Intel's offerings, while the GPU will outshine Intel's integrated graphics processor and will be the cause of much of Fusion's initial success.
    The TDP is likely to be higher, but adequate for the extra graphical performance.

    Performance per watt, it will be better. It is my opinion, however, that AMD Fusion will not blow Intel's offerings out of the water.
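
    Since I keep going on about performance per watt, here's a trivial sketch of what I mean by the ratio. The numbers below are completely made up for illustration, not benchmarks of any real chip:

```python
# Hypothetical figures for illustration only, not real benchmark results.
def perf_per_watt(score, watts):
    """Performance-per-watt: a benchmark score divided by power draw."""
    return score / watts

# A separate CPU (35 W) plus discrete GPU (60 W) scoring 100 combined...
discrete = perf_per_watt(100, 35 + 60)
# ...versus an APU scoring only 80, but at 35 W total.
apu = perf_per_watt(80, 35)
print(apu > discrete)  # prints True: the APU wins on efficiency
```

    The point being: an APU doesn't need to beat a discrete setup on raw performance to win on efficiency, and efficiency is exactly what matters for notebooks.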

    Spoiler Alert, click show to read: 

    A video showing off AMD Fusion's Llano against a competing Intel Sandy Bridge computer.


    What does this all mean for my discrete graphics cards?

    In blunt terms, it means the era of discrete graphics cards is waning and may die off.

    AMD has half of the discrete graphics card market. They have made it abundantly clear that they wish to move onto an APU platform instead of carrying both a CPU and a GPU brand (though separate CPU development will continue for the high end desktop and especially server markets). This isn't going to happen overnight; it will be a gradual change spanning the better part of a decade. It is, however, the first step.

    Nvidia too, essentially the other half of the discrete GPU market, has signalled a different path for the future.

    At the Consumer Electronics Show 2011 in Las Vegas, Nvidia announced plans to develop an ARM based processor with an integrated graphics processor built for high performance computing. Coupled with the announcement that Microsoft will release Windows 8 with compatibility for both x86 and ARM processors, this likely means Nvidia could, in the future, aim for the desktop market too.

    Spoiler Alert, click show to read: 
    No, NVIDIA didn't finally take the wraps off its x86 project—assuming that it hasn't been cancelled, that's still a secret. But the chipmaker did unveil Project Denver, a desktop-caliber ARM processor core that's aimed squarely at servers and workstations, and will run the ARM port of Windows 8. This is NVIDIA's first attempt at a real general-purpose microprocessor design that will compete directly with Intel's desktop and server parts.

    The company has offered nothing in the way of architectural details, saying only that the project exists and that the company has had a team of crack CPU architects working secretly on it for some time. Indeed, NVIDIA CEO Jen-Hsun Huang's very brief but dramatic announcement of Denver raised more questions than it answered.
    Nvidia announces that it will make an ARM-based CPU for servers and workstations and... not much else.
    Read the full article here. Recommended: a very good article, plus a Fusion article, though I disagree with its conclusion.


    As we see the two biggest graphics processor manufacturers move away from the shrinking discrete graphics card market and diversify into SoC (system on chip) type products, research and development into making bigger and better graphics cards could obviously be compromised. This is compounded by the simple fact that software and application development lags behind hardware development by a significant margin. Software doesn't conform to Moore's Law (though I guess hardware has sometimes surpassed Moore's expectations); it doesn't potentially become twice as complex every couple of years, especially games. Hardware, on the other hand, doubles its transistor count roughly every two years according to the law, giving it the potential to double in performance with each generation.

    4870 -> 5870

    GTX 285 -> GTX 480

    It's not like it's unheard of.
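
    To put rough numbers on the doubling idea, here's a toy projection, assuming the textbook two-year doubling period and nothing else (real parts obviously don't track this exactly, generation to generation):

```python
def transistor_projection(initial_count, years, doubling_period=2.0):
    """Project a transistor count assuming it doubles every
    `doubling_period` years (Moore's observation, roughly)."""
    return initial_count * 2 ** (years / doubling_period)

# A 1-billion-transistor chip projected 10 years out:
print(transistor_projection(1e9, 10))  # 3.2e10, i.e. 32x the transistors
```

    That exponential is why hardware can so easily outrun the software written for it.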

    It will eventually reach a tipping point whereupon the graphical performance of the integrated chips will match that of the discrete or will be close to it.

    To put it more simply, if you could have a Phenom II x6 1055T and a Radeon HD 6870 today on a single APU chip that consumes less space, creates less heat and requires less power than either the 1055T or HD 6870 by itself, would you not buy the APU if it was at a similar price? You would have to have a very good reason to skip on the APU.

    That situation is very possible in the future, making it difficult to justify manufacturing all but the most powerful discrete graphics cards, which ironically make the least money. Large quantities of low to middle end graphics cards are where most of the profit is.

    Spoiler Alert, click show to read: 
    To really answer the question “will discrete GPUs die out?” we need to look at the quantum level. Despite having the power budget for nearly unbounded performance, one of the bottlenecks for discrete GPUs is PCI Express, the interface to the system. In the case of graphics workloads, normally PCI Express does not present a constant bottleneck, however, everyone and their dog has heard of parallel computing. This is the case where there is a lot of traffic between the discrete GPU, CPU and main memory, and PCI Express becomes a liability in terms of bandwidth and latency.
    Many people do not realize this, but the “life” of a CPU is dreadful and boring. It exists primarily waiting for data. The use of discrete GPUs for parallel computing through PCI Express will not improve the “quality of life” of the CPU. While some discrete GPUs will offer graphics and pure FLOP performance over an APU, the performance will be limited, in some cases, by the PCI Express interconnect.

    This is where AMD Fusion APUs will shine. AMD Fusion APUs have not only been designed to offer great graphics performance, they also have been designed to offer great parallel compute performance. The fact that the CPU core resides next to the GPU core connected by a bus of mere nanometers, helps diminish the bandwidth and latency issues presented to parallel computing on a PCIE bus.
    The design plan for successive generation of AMD APUs includes architectural innovation, as well as tighter and faster interconnects between the CPU cores and the GPU cores. One goal is to advance the parallel compute capabilities without sacrificing x86 and graphics performance.

    So, to finally answer the question whether discrete GPUs will die, the answer is: Hell no
    Don't disagree with me dammit.
    Read the full article here.
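
    The PCI Express point in that quote is easy to put rough numbers on. This is strictly back-of-the-envelope; the bandwidth figures are loose assumptions for current hardware (PCIe 2.0 x16 at roughly 8 GB/s each way, shared system memory at roughly 25 GB/s), not measurements of any real part:

```python
# Back-of-the-envelope: time to shuttle a buffer at a given bandwidth.
# Bandwidth figures below are rough assumptions, not measurements.
def transfer_ms(megabytes, gb_per_s):
    """Milliseconds to move `megabytes` of data at `gb_per_s` GB/s."""
    return megabytes / 1024 / gb_per_s * 1000

buffer_mb = 256
pcie_copy = transfer_ms(buffer_mb, 8.0)   # discrete GPU: data crosses PCIe
on_die = transfer_ms(buffer_mb, 25.0)     # APU: CPU and GPU share memory
print(pcie_copy, on_die)                  # roughly 31 ms vs 10 ms
```

    And that's bandwidth alone; for GPGPU workloads that bounce data back and forth constantly, the latency of every round trip across the bus hurts even more than the raw transfer time.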



    Keep in mind, this is all my opinion and speculation (aside from the quotes in the articles, of course), but I do find it probable that with both AMD and Nvidia focusing on more integrated solutions, the demand for discrete graphics cards will shrink, making them more of a novelty than a necessity for gamers within the next 10 years, if not sooner.

    It may not die out (in fact, it may never die out), but expect to see the low and even mid range graphics card market shrink substantially if not disappear altogether.

    Anyways, that's what more free time gets me. A lot of writing. I really should've used this time to writeup a ToTW entry, haven't gotten around to writing one in months even though I've reserved like half a dozen of the contests. Mega is gonna be pissed.

    If you haven't blown your brains out from boredom by now, thanks for reading.



    Think I'll actually post something else like this tomorrow, don't know, when I feel like writing, I guess I just feel like writing.
    Last edited by mrcrusty; March 02, 2011 at 05:04 AM.


  2. #2
    ♔DARTH LEGO♔'s Avatar Vicarius
    Join Date
    Feb 2009
    Location
    Durban, South Africa
    Posts
    2,593

    Default Re: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    nice info and a good read

    + cookies for you.

  3. #3

    Default Re: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    Ironically, this is the format(practical) which I was looking for to explain new AMD stuff. Thank you!

  4. #4
    Top-Tier-Tech's Avatar Protector Domesticus
    Join Date
    Feb 2009
    Location
    USA, state of Minnesota
    Posts
    4,258

    Default Re: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    I'm going to have to read this later when I have the time, but I'd just like to throw in that I own the new Zacate APU platform, the ASUS E35M1-M PRO. It scores well in WEI, and I have it running entirely passive (no airflow at all); it runs between 40 and 50C whilst pushing out 1920x1080 over HDMI with a medium load on the CPU. I would like to note that the CPU doesn't seem to be optimised very well, as whatever is being used on the one core shows a perfectly inverted usage graph on the other core in most tasks while monitoring it in Task Manager. Regardless, it seems to be a great low-wattage platform.
    My Gaming PC
    CPU: intel i7-2600k Quad-core @ 3.80Ghz.
    Motherboard: Asus Sabertooth P67
    RAM: 8GB G.SKILL Ares DDR3 1600
    GPU: 2, Zotac 448 core GTX 560ti's in SLI
    Storage: Crucial M4 256GB SSD
    PSU: Corsair CMPSU-1000HX Semi-modular
    Case: Coolermaster Cosmos II XL-ATX Full Tower
    Heatsink: Thermaltake HR-02 Passive CPU Cooler
    Keyboard: Logitech G19 with LCD Display
    Mouse: Logitech G700 Wireless
    Screens: LG Infinia 55LW5600 55 inch LED ~ Cinema 3D ~ 3 in Nvidia 3D Surround

  5. #5

    Default Re: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    I like incoherent ramblings.


  6. #6
    Freddie's Avatar The Voice of Reason
    Patrician

    Join Date
    Oct 2004
    Location
    UK
    Posts
    9,534

    Default Re: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    In blunt terms, it means the era of discrete graphics cards is waning and may die off.
    I doubt very much the discrete market will die off completely but I do agree the market is shrinking.

    From what I understand, Fusion is more than just a CPU with a GPU on the same die like Sandy Bridge; in the future it will be able to process tasks with optimum efficiency, as the APU can process commands in both serial and parallel depending on the task at hand.

    Hats off to AMD. I've always bought Intel processors, but credit where it's due: Fusion/Llano will either bury Atom or make Intel revise the whole platform, and Fusion is the way forward in my book.

    On a side note, I can't wait to see how well Windows 8 runs on an ARM platform. If it works, then no more x86 for a lot of computers. I just hope ARM can capitalise on its successes and doesn't make the mistakes of its predecessor (Acorn Computers).

  7. #7

    Default Re: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    I don't believe that discrete video cards will die off but they will become higher end and more specialized in my opinion. The situation today is somewhat analogous to the rise and eventual replacement of discrete sound cards in systems back in the late '80s through the '90s. Remember it used to be that most systems didn't even have or need sound cards - these cards were strictly for gamers. Then as gaming became more mainstream those sound cards were replaced for the most part by integrated chips. These days a sound card isn't needed by the vast majority who don't care what kind of sound quality they get from their computers - they just want to be able to listen to music and watch some streaming video. Sound cards are for the true gamers and audiophiles. Maybe the GPUs of the future will focus on 3d gaming or some such. Who knows where this is headed.

    On a tangent - your ramblings made me think of the old monochrome Hercules cards for some reason. As the idea of gaming became more visually oriented (instead of NetHack type games or the old MUDs) those cards were replaced with EGA, VGA, SVGA etc. It's interesting to be able to look back at the progress over the years and see where we've ended up. Pretty cool actually. And yes - I am a dinosaur.
    Last edited by PoleCat; August 10, 2011 at 02:02 AM.
    Piss Poor Tech Support of Last Resort

  8. #8
    mrcrusty's Avatar Primicerius
    Join Date
    Jan 2007
    Location
    Australia
    Posts
    3,090

    Default Re: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    Quote Originally Posted by Freddie View Post

    On a side note, I can't wait to see how well Windows 8 runs on an ARM platform. If it works, then no more x86 for a lot of computers. I just hope ARM can capitalise on its successes and doesn't make the mistakes of its predecessor (Acorn Computers).
    Funny, that is actually the next topic I want to write about. Not necessarily Windows 8 but about ARM & x86.


  9. #9

    Default Re: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    Stickied for the time being.
    +rep

  10. #10
    Top-Tier-Tech's Avatar Protector Domesticus
    Join Date
    Feb 2009
    Location
    USA, state of Minnesota
    Posts
    4,258

    Default Re: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    It's a whole lot more likely people won't notice this while it's stickied you know
    My Gaming PC
    CPU: intel i7-2600k Quad-core @ 3.80Ghz.
    Motherboard: Asus Sabertooth P67
    RAM: 8GB G.SKILL Ares DDR3 1600
    GPU: 2, Zotac 448 core GTX 560ti's in SLI
    Storage: Crucial M4 256GB SSD
    PSU: Corsair CMPSU-1000HX Semi-modular
    Case: Coolermaster Cosmos II XL-ATX Full Tower
    Heatsink: Thermaltake HR-02 Passive CPU Cooler
    Keyboard: Logitech G19 with LCD Display
    Mouse: Logitech G700 Wireless
    Screens: LG Infinia 55LW5600 55 inch LED ~ Cinema 3D ~ 3 in Nvidia 3D Surround

  11. #11
    iPwntUrMum's Avatar Miles
    Join Date
    Oct 2009
    Location
    Arkansas, US
    Posts
    330

    Default Re: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    Quote Originally Posted by ChaobSiroc View Post
    It's a whole lot more likely people won't notice this while it's stickied you know
    Like me.

    ...pointless post, sorry.
    I liked the incoherent rambling, though I only understood some of it.
    Processor: Intel Core 2 Quad Q8300 @ 2.5 GHz RAM: 4 GB 800 MHz
    Graphics: XFX ATi Radeon HD 5770
    OS: 64-bit Vista Home Premium Case: Dell Inspiron 518
    Keyboard: Logitech G15 Gaming Keyboard Mouse/Surface: Razer Imperator/Vespula

  12. #12

    Default Re: The incoherent ramblings of Mr. Crusty: AMD Fusion & the future of discrete GPUs.

    Interesting, thanks.
    What do we say to the God of Death? Not today.
