
Thread: Intel vs AMD discussion: can AMD outclass Intel's last strong point in the CPU market?

  1. #21

    Re: Intel vs AMD discussion: can AMD outclass Intel's last strong point in the CPU market?

    Quote Originally Posted by Condottiere 40K
    Both Advanced Micro Devices and Intel are selling as much as they can produce and are suffering from a capacity crunch, though in the former's case, they claim it happened because they underestimated demand.


    With Intel, it's certain they'll prioritize server parts over Pentiums, leaving a clear field for AMD to clean up at the lower end.


    GlobalFoundries is making a profit because they didn't spend the capital to upgrade to seven nanometres (which may turn out to be a mistake); however, AMD is required to purchase a minimum number of wafers from them, which can cover the I/O dies, chipsets and/or older Ryzens.
    The Ryzen I/O die is on 12nm, probably GloFo's 12nm process, and their contract runs solidly through 2020.

    Four-core APUs for the desktop probably perform much the same whether they're manufactured at seven, twelve or fourteen nanometres; it matters more if they're meant to go into laptops. What appears to have happened last year is that the three thousand series moved into the twelve nanometre production lines vacated when the chiplets transferred over to seven nanometres; Ryzen Three is scheduled to be manufactured at seven nanometres plus, and Apple is moving on to five nanometres, all of which should open up capacity.


    That doesn't necessarily mean the 4200G will have six cores at seven nanometres, though I'm pretty sure that will be the case with the 6200G; or maybe they'll do that for a more premiumized 4400G. I just have hopes for it.
    It's hard to say for sure. I don't expect it this year or the next, since 7nm wafer supply is pretty limited. I also wouldn't be surprised to see even fewer desktop APUs in general. Like I keep saying, it's a pretty limited market.


    But looking at it from a meta perspective, it makes sense for all the technology and research dedicated to gaming consoles to feed back into laptop and desktop variants, since the concept of the chiplet is to create a simple processing unit that can be scaled across all products, which Threadripper is demonstrating.
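
    To put rough numbers on that scaling (my own illustration from public specs, so treat it as a sketch rather than anything confirmed above): one Zen 2 chiplet carries eight cores, so a Ryzen 7 3700X uses a single chiplet, a Ryzen 9 3950X pairs two for sixteen cores, a Threadripper 3970X stacks four for thirty-two, and the server EPYCs go up to eight chiplets for sixty-four cores; it's the same basic die every time, just more of them placed around the I/O die.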


    The consumer has to figure out how this benefits him, and how to optimize for it; in my case, the 2200G has very acceptable performance and falls within the specific budget I've set for what I'll use it for, which has to be balanced against equally economical components: cheap RAM and a cheap motherboard, whether used or discounted.


    My usual supplier reports that due to the Wuhan flu there are going to be shortages and delays (good luck finding face masks), and that resellers are likely to hike prices; there's currently an ongoing discount on Gigabyte products, and I'm wondering if I should pick up a pair of mATX boards with four RAM slots while the opportunity presents itself.
    Well, people may say that. What I'm really wondering is how long they'll milk these rumors to justify jacking up prices. The technology available to the general public is getting better, but personally I'm more excited about more custom designs.

  2. #22

    Re: Intel vs AMD discussion: can AMD outclass Intel's last strong point in the CPU market?

    It's hard to say how much the supply chain is going to be disrupted; my gut feeling says you might as well factor in a three-month delay as a precaution. The stock market is trying to figure out how much it's going to cost in lost productivity.


    As a precaution, and probably as an excuse, I bought three sets of discounted 3000Gs and mATX boards, since I doubt I could find anything cheaper with three-year guarantees; I couldn't find any dirt-cheap RAM and ran short of my personal stock, not that it matters much, since two of the sets are in reserve.


    I hear the price of the 3900X is dropping due to oversupply, so I'm going to guess that seven nanometre wafers are not an issue, at least for now.


    My belief is that as long as it makes financial and architectural sense, you add at least minimal graphics on the chip, so you don't need a discrete graphics card, especially if the user has to run diagnostics. There are several market segments, not necessarily niche ones, that would want onboard graphics without the added complexity of an extra component, whether businesses, or users in developing countries who will be satisfied with okay graphics for gaming.


    For me, the 3000Gs are placeholders in case I can't find better performers at the price I'm willing to pay; can they run Total War? I may find out.
    Eats, shoots, and leaves.
