Disclaimer: This post is really my thoughts wandering around. Any resemblance to a professional article on a tech website is desired but completely coincidental.
Odd thing to spark this topic up, but while coming home from university on the train today, I noticed many people had laptops. Some were working, others were on the net (Facebook, YouTube, etc.) and others were playing games.
What intrigued me was the technology "under the hood", so to speak. While technology is always growing and evolving, the past 5 years or so have mainly seen existing personal computing technologies being extended and made more powerful. We have more powerful CPUs with more cores. We have more powerful GPUs with more cores, and higher density RAM that runs at faster speeds. We now utilise flash memory heavily as a permanent form of storage (SSDs) in addition to a portable, light-duty form of storage (flash drives).
To say there have been no technological advancements in the area of personal computing hardware is silly. It's simply not true, yet the technology and hardware advancements we have made since 2006 have mostly been evolutionary in nature. My current PC compared to my old one from 2007 is bigger, faster and stronger, but aside from more USB ports and using SATA over IDE, there's been little difference to either the system composition or what role each component plays in my system.
There is something to be said about the so-called move towards cloud computing, or the increasing focus on parallel processing, but both, particularly the latter, are held back by software limitations far more than hardware limitations.
During the next 5 years, however, and especially after that, we will see some major changes to how we view personal computing hardware.
There are many areas I would like to cover here, but random thoughts buzzing through my head without pictures, graphs or interesting excerpts from the net to provide context would make for a rather unenjoyable read (maybe next time: LightPeak, PT switched mode power supplies, graphene).
So I'll focus on the elephant in the room for portable computer users and gamers alike: Accelerated Processing Units.
Now, "APU" is just a fancy term AMD came up with to classify their AMD Fusion project.
What it really means, however, is that they will take two separate and distinct elements of a computer, the Central Processing Unit and the Graphics Processing Unit, and combine them onto a single integrated circuit (chip).
Despite what AMD's marketing department would have you believe, it's hardly a new or original concept. The idea of integrating a graphics processor into a system without the need for a physical, separate card has been around for a very long time.
Discrete (separate) graphics processors only arrived commercially in the 1980s, despite computers having been in use for decades before then; earlier systems presented images without one. As 2D images became more complicated and detailed, it became too draining for a single chip (the CPU) to handle on its own in addition to the other instructions it needed to process. This, combined with the move to 3D imaging on PCs, created the need for discrete graphics cards.
More recently, integrated graphics chips, usually embedded in the motherboard's chipset, have come to make up around 75% of graphics processors worldwide. This makes Intel the largest graphics processor manufacturer on the planet, with over 50% of total graphics processor market share, dwarfing both AMD and Nvidia. Funny, really.
Even more recently, Intel released their Sandy Bridge platform CPUs with an integrated graphics processor on the CPU die, similar to AMD's Fusion.
So why the big hubbub about AMD's Fusion project if it's not new, or original?
It represents the direction AMD wants to take into the future, and likely signals what is to come from the other major players in the industry (Intel, Nvidia, etc.).
For the last 15 years or so, integrated graphics chips have had the crippling weakness of being too weak for 3D applications, sometimes including video but most notably games and rendering applications. This is why many people today still use discrete graphics cards.
AMD Fusion aims to radically change how portable and personal computers are seen by providing an amazing performance-per-watt ratio. They aim for (eventually) a high end CPU and a high end GPU with lower power consumption than their individual counterparts, through efficient design, a smaller transistor size and improved utilisation of both GPU and CPU in parallel.
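To make the performance-per-watt idea concrete, here is a toy sketch. Every number in it is made up purely for illustration; these are not real benchmarks or TDPs of any actual CPU, GPU or APU.

```python
# Toy performance-per-watt comparison. All figures below are hypothetical,
# invented for illustration only; they are not real benchmark scores or TDPs.

def perf_per_watt(performance, watts):
    """Higher is better: useful work delivered per watt consumed."""
    return performance / watts

# Hypothetical discrete CPU + GPU pair versus an APU running the same workload:
discrete = perf_per_watt(performance=100, watts=125)  # 0.8
apu = perf_per_watt(performance=90, watts=65)         # roughly 1.38

# The APU can win on efficiency even while losing on raw performance.
print(apu > discrete)  # True
```

The point of the sketch: an APU doesn't have to beat the discrete pair outright; shaving enough watts off the denominator is what wins the ratio.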
AMD embarked on this project in 2006. 2006 is also the year they bought ATI Technologies, the second largest discrete graphics card manufacturer at the time, and the year they lost the performance crown to Intel in the CPU department. They have yet to retake it. Interesting coincidence, but it also indicates that AMD Fusion is more than a single product; it's a platform upon which future products and designs are built. Otherwise, why take so long to develop it at the expense of other products?
You said "a platform upon which future products and designs are made." What do you mean by that?
AMD intends to take this method of creating APUs and make it mainstream.
The first generation AMD Fusion products have been aimed at netbooks and low to mid range notebooks, in addition to the HTPC market they are targeting with the desktop iterations of Fusion. As time goes on and AMD refines the process, they will progressively scale up and down from that middle ground, creating both higher end and lower end products.
The higher end products will undoubtedly include high end notebooks and eventually move onto middle to high end desktops too.
As for the lower end, AMD will target the mobile market, tablets in particular, as the smartphone space is saturated with competition. The tablet market is crowded too, but there is room to manoeuvre. A phone is a phone no matter what, and is unlikely to need a design that can be scaled up or down in power; the small screen also limits the usefulness of what Fusion aims to do. Tablets, however, may be the market, as they are increasingly being used for movies, games and multitasking.
AMD Fusion has large consequences for these products.
What can we realistically expect from 1st gen. AMD Fusion APUs (Llano & lower)?
Now, most of my writing so far has just been hyping up Fusion, but let's be reasonable. AMD Fusion will not be some incredible beast giving us HD 6950 and Phenom II X6 performance on a tiny chip with a TDP of 25 W. At least not yet.
It is my belief that the first generation AMD Fusion APUs will be very competitive in the current notebook market. They won't provide absolute performance supremacy against the competition at their price point, but they will be better in many cases. The CPU in Llano (K10-based, i.e. Athlon II, not Phenom II as I had previously believed) is unlikely to be much better than Intel's offerings, while the GPU will outshine Intel's integrated graphics processor and will be the cause of much of Fusion's initial success. The TDP is likely to be higher, but adequate for the extra graphical performance.
In terms of performance per watt, it will be better. It is my opinion, however, that AMD Fusion will not blow Intel's offerings out of the water.
What does this all mean for my discrete graphics cards?
In blunt terms, it means the era of discrete graphics cards is waning and may die off.
AMD holds half of the discrete graphics card market. They have made it abundantly clear that they wish to move onto an APU platform instead of carrying both a CPU and a GPU brand (though separate CPU development will continue for the high end desktop and especially server markets). This isn't going to happen overnight; it will be a gradual change spanning the better part of a decade. Fusion is, however, the first step.
Nvidia too, essentially the other half of the discrete GPU market, has signalled a different path for the future.
At the Consumer Electronics Show 2011 in Las Vegas, Nvidia announced plans to develop an ARM based processor with an integrated graphics processor built for high performance computing. Coupled with Microsoft's announcement that Windows 8 will be compatible with both x86 and ARM processors, this likely means Nvidia could, in the future, aim for the desktop market too.
As graphics processors' two biggest manufacturers both move away from the shrinking discrete graphics card market and diversify into SoC (system on chip) type products, research and development into bigger and better graphics cards could obviously be compromised. This is compounded by the simple fact that software and application development lags significantly behind hardware development. Software doesn't conform to Moore's Law (though hardware has arguably surpassed Moore's expectations); it doesn't potentially become twice as complex every generation, especially games. Hardware, on the other hand, doubles its transistor count roughly every two years according to the law, giving it the potential to double in performance over the same period.
4870 -> 5870
GTX 285 -> GTX 480
It's not like it's unheard of.
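The compounding is easy to underestimate, so here is a toy projection of Moore's Law style doubling. The starting transistor count and the two-year doubling period are assumptions for illustration, not figures for any real chip.

```python
# Toy illustration of Moore's Law style growth (doubling every two years).
# The starting count of 1 billion transistors is hypothetical.

def transistors_after(start, years, doubling_period=2):
    """Projected transistor count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

print(transistors_after(1_000_000_000, 4))   # two doublings  -> 4 billion
print(transistors_after(1_000_000_000, 10))  # five doublings -> 32 billion
```

Five doublings in a decade is a 32x transistor budget, which is why a generational jump like 4870 to 5870 looks routine on the hardware side while software complexity crawls along far behind.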
It will eventually reach a tipping point whereupon the graphical performance of the integrated chips will match that of the discrete or will be close to it.
To put it more simply: if today you could have a Phenom II X6 1055T and a Radeon HD 6870 on a single APU chip that takes up less space, generates less heat and requires less power than either the 1055T or the HD 6870 by itself, would you not buy the APU at a similar price? You would need a very good reason to pass on it.
That situation is very possible in the future, making it difficult to justify manufacturing all but the most powerful discrete graphics cards, which ironically make the least money; large quantities of low to mid range cards are where most of the profit is.
Keep in mind, this is all my opinion and speculation (aside from the quotes in the articles, of course), but I do find it probable that with both AMD and Nvidia focusing on more integrated solutions, demand for discrete graphics cards will shrink, making them more of a novelty than a necessity for gamers within the next 10 years, if not sooner.
It may not die out (in fact, it may never die out), but expect the low and even mid range graphics card market to shrink substantially, if not disappear altogether.
Anyways, that's what more free time gets me: a lot of writing. I really should've used this time to write up a ToTW entry; I haven't gotten around to writing one in months even though I've reserved like half a dozen of the contests. Mega is gonna be pissed.
If you haven't blown your brains out from boredom by now, thanks for reading.
Think I'll actually post something else like this tomorrow. Don't know; when I feel like writing, I guess I just feel like writing.