
Thread: Promised performance increases observed - with Patch 16.1 - updated 05.02.2015

  1. #1

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by Crazyeyesreaper View Post
    A lot of it is the game engine, but DirectX is fairly deficient as well. No matter how well threaded they make the game, there are still limitations imposed by DirectX.

    After all, there is a reason AMD's Mantle API can speed up Battlefield 4 by up to 45%. Keep in mind that engine will scale to 8 cores, but sadly DirectX means not all of those cores are used properly. Regardless, Frostbite is one of the better-optimised engines, and it can still see a 45% increase from a better base API.

    Between API and engine limitations, CA has hit the wall, and hard. They need to start from scratch, but having pissed away their money, it's not gonna happen. CA will be ridden into the ground by SEGA for quick profits.
    45%? I'll believe that when I see it! (Well, "up to" could mean 5%... then I suspect people will start calling it a fail, which it is smelling of atm.)

    P.S. If Mantle is ever released, that is...

    Yep, they need to start from scratch. X years have proven it isn't working (well, to some extent, but considering the poor AI plus poor performance/optimisation, something ain't right).

  2. #2

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by Totalheadache View Post
    45%? I'll believe that when I see it! (Well, "up to" could mean 5%... then I suspect people will start calling it a fail, which it is smelling of atm.)

    P.S. If Mantle is ever released, that is...

    Yep, they need to start from scratch. X years have proven it isn't working (well, to some extent, but considering the poor AI plus poor performance/optimisation, something ain't right).

    Mantle is already released and working, and developers are already using it. It's just that some developers are not very organised and rush games out, as with BF4. It's not that Mantle itself isn't ready; they just haven't been able to implement it due to the massive fail of a release that BF4 went through.
    Some games are running Mantle with even better frame rate boosts than 45%, though. I have seen YouTube videos where some games are getting 100-200% boosts in performance over DX11.

    A lot of the performance gain will still depend on the capabilities of the game engine itself of course. There is absolutely no doubt in my mind that Mantle will totally replace DX11 for games within the next few years, if not sooner. Developers prefer it because it cuts out a lot of work for them too.

  3. #3

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by Tr0tskY View Post
    Mantle is already released and working, and developers are already using it. It's just that some developers are not very organised and rush games out, as with BF4. It's not that Mantle itself isn't ready; they just haven't been able to implement it due to the massive fail of a release that BF4 went through.
    Some games are running Mantle with even better frame rate boosts than 45%, though. I have seen YouTube videos where some games are getting 100-200% boosts in performance over DX11.

    A lot of the performance gain will still depend on the capabilities of the game engine itself of course. There is absolutely no doubt in my mind that Mantle will totally replace DX11 for games within the next few years, if not sooner. Developers prefer it because it cuts out a lot of work for them too.
    Well, BF4 wasn't a disaster for me. Only had a few crashes...

    It's funny how Mantle is late and people are all crying "it's DICE's fault". Link me some proof that this is the case.

    Actually, we all know zip, so as it stands it's up in the air. How on earth have you seen vids of Mantle when, from what I can see, it's unreleased?!

    P.S. Just because someone says that Mantle will do this and that doesn't mean it is true. Don't get me wrong, it might, but until the thing is properly released and analysed we can hypothesise all we want; it means diddly squat!

    All this speculation gets tiresome. Just wait, and then make claims.

    It shows a lack of intelligence to say "X is going to do this" etc. when there is nothing to back it up with. C'mon!

  4. #4

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by Totalheadache View Post
    Well, BF4 wasn't a disaster for me. Only had a few crashes...

    It's funny how Mantle is late and people are all crying "it's DICE's fault". Link me some proof that this is the case.

    Actually, we all know zip, so as it stands it's up in the air. How on earth have you seen vids of Mantle when, from what I can see, it's unreleased?!

    P.S. Just because someone says that Mantle will do this and that doesn't mean it is true. Don't get me wrong, it might, but until the thing is properly released and analysed we can hypothesise all we want; it means diddly squat!

    All this speculation gets tiresome. Just wait, and then make claims.

    It shows a lack of intelligence to say "X is going to do this" etc. when there is nothing to back it up with. C'mon!

    It's hilarious that you go around calling others uninformed and so on.

    http://www.youtube.com/watch?v=9rQZesWWGIs

    Here is a running, working tech demo using Mantle technology.

    Yes, the BF situation IS DICE's fault. Mantle is working right now; it just needs to be implemented by the game devs.

  5. #5
    alQamar's Avatar Citizen
    Join Date
    Jul 2012
    Location
    Dortmund, Germany
    Posts
    5,963

    Default Re: CA is joking us people with capable hardware

    Maybe you are right. I still hope that the beta will not reflect the final results, but I haven't seen any submissions of improvements with screenies and comparisons (8.1 vs. 9 beta) yet.
    NEW: Total War Saga: Britannia benchmark thread - last update: 10.05.2018
    HOW-TO-step-up-from-MBR-CSM-LEGACY-BOOT-to-UEFI-GPT
    Many of my past contributions from 2011-2017 contain content with links that are now broken. Unfortunately I had to delete all pictures linked on TWC that were hosted on imageshack.us. Read why
    If you are missing anything of interest, please let me know. Sorry for any inconvenience caused.

  6. #6
    alQamar's Avatar Citizen
    Join Date
    Jul 2012
    Location
    Dortmund, Germany
    Posts
    5,963

    Default Re: CA is joking us people with capable hardware

    I tested the scenario with an Ivy Bridge 3770K and a 670 FTW - completely different Windows install and hardware - with similar results. It is a shame Jabberwock commented on my work this way, as if it were a local / system-specific problem.
    In fact, this showed me that my results hold generally.
    Last edited by alQamar; January 26, 2014 at 08:48 AM.

  7. #7

    Default Re: CA is joking us people with capable hardware

    alQamar, I think most of the problems are caused by not enough VRAM.
    I can't believe that a game like Rome 2 doesn't use more than 2.5GB.
    I mean, if you think about all the textures loaded at once...

    But I will reinstall Rome 2 today, so I can do some testing with my Radeon HD 7970 GHz, 3GB graphics card.
    I think this card is more than enough to run Rome 2.

  8. #8

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by Leonidas II View Post
    alQamar, I think most of the problems are caused by not enough VRAM.
    I can't believe that a game like Rome 2 doesn't use more than 2.5GB.
    I mean, if you think about all the textures loaded at once...

    But I will reinstall Rome 2 today, so I can do some testing with my Radeon HD 7970 GHz, 3GB graphics card.
    I think this card is more than enough to run Rome 2.
    There actually aren't that many different textures to load in Rome 2, and they aren't even high-res.

    Rome 2 definitely cannot use more than 3GB, because it is a 32-bit game. No game in the world that runs as a 32-bit process can use more than 3GB.

    http://en.wikipedia.org/wiki/3_GB_barrier

    There is some slight variance depending on the mobo and so on, but it's always around this value.

  9. #9

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by Tr0tskY View Post
    Rome 2 definitely cannot use more than 3GB, because it is a 32-bit game. No game in the world that runs as a 32-bit process can use more than 3GB.
    http://en.wikipedia.org/wiki/3_GB_barrier
    Wrong, only RAM is limited, not VRAM. Every 32-bit application can use as much VRAM as it wants. Even 4, 8 or more GB.

    Also, if Rome 2 supported 8xMSAA or 16xMSAA, the amount of VRAM allocated would be much higher, simply because the driver needs it for the anti-aliasing.

    So, 32-bit applications can easily exceed 3GB of VRAM.

  10. #10

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by A Barbarian View Post
    Wrong, only RAM is limited, not VRAM. Every 32-bit application can use as much VRAM as it wants. Even 4, 8 or more GB.

    Also, if Rome 2 supported 8xMSAA or 16xMSAA, the amount of VRAM allocated would be much higher, simply because the driver needs it for the anti-aliasing.

    So, 32-bit applications can easily exceed 3GB of VRAM.
    I had to check this because I wasn't sure about it.

    Turns out you are incorrect.

    Any 32-bit application has a maximum memory address limit. That is just a fact. It absolutely cannot map any memory beyond the 32-bit address space. I'm sorry.

    This means the maximum combined memory (VRAM + RAM) that any 32-bit game can use is 4GB. This is why games like Skyrim crash all the time when people start throwing in lots of extra mods and increasing uGrids: memory leaks all over the place from running a 32-bit application and exceeding the memory limit. Crashes happen within seconds or minutes in this case.

    In any case, it is known for a fact that Rome 2 cannot use more than 3GB of VRAM at the moment. It's just not possible. 32-bit limits and all that.

    I don't know why you are talking about MSAA and stuff like that; post-processing like this is run mostly through calculations on the GPU, not by massively inflating video memory usage. That's why it causes such a frame drop.
    Last edited by Tr0tskY; January 26, 2014 at 01:29 PM.

  11. #11

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by Tr0tskY View Post
    Any 32-bit application has a maximum memory address limit. That is just a fact. It absolutely cannot map any memory beyond the 32-bit address space. I'm sorry.
    1) VRAM is not directly addressed like RAM. Only a small 'window' of the whole VRAM is mapped into the application at a time, so the 32-bit address limitation does not apply to it. Also, 32-bit applications can use 64-bit integers and can, for example, read/write files much larger than 4GB (see the small sketch after this list).

    2) MSAA is not a post-processing filter. MLAA, FXAA and SMAA are, and it's true that those require much less VRAM than MSAA. But MSAA (which is an optimised form of SSAA) requires a lot of VRAM because the graphics are rendered at a higher resolution internally.

    3) Skyrim, which is a 32-bit application, can easily create VRAM requirements above 4GB. Just use a high screen resolution like 2560x1440, very high resolution textures, and add 4xSSAA. (Here is someone with a Titan whose Skyrim is using >5GB of VRAM: http://forum.step-project.com/showthread.php?tid=2040)
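
    Regarding the file part of point 1, here is a minimal Win32 sketch (my own toy example - the file name is made up, and it assumes a file larger than 4GB actually exists): even a 32-bit build can seek and read past the 4GB mark, because file offsets are plain 64-bit integers, not pointers into the address space.
    Code:
    #include <windows.h>
    #include <cstdio>

    int main() {
        // Open a (hypothetical) file that is larger than 4GB.
        HANDLE f = CreateFileA("huge_file.bin", GENERIC_READ, FILE_SHARE_READ,
                               nullptr, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr);
        if (f == INVALID_HANDLE_VALUE) return 1;

        LARGE_INTEGER offset;
        offset.QuadPart = 6LL * 1024 * 1024 * 1024;         // seek to the 6GB mark
        SetFilePointerEx(f, offset, nullptr, FILE_BEGIN);   // works fine in a 32-bit build

        char buf[4096];
        DWORD bytesRead = 0;
        ReadFile(f, buf, sizeof(buf), &bytesRead, nullptr); // read data located beyond 4GB
        printf("Read %lu bytes from past the 4GB mark\n", bytesRead);

        CloseHandle(f);
        return 0;
    }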
    Last edited by A Barbarian; January 26, 2014 at 07:58 PM.

  12. #12

    Default Re: CA is joking us people with capable hardware

    Actually, patch 9 is the only one that brought some performance increase. The game still fails to load the GPU properly due to the CPU bottleneck, but now at least it's able to trigger GPU boost, lol. Before, Rome 2 caused my GTX 780 Ti to drop its frequency to 810MHz and GPU load was about 40%; now it's 1251MHz and 55% (peak 67%) respectively. No improvement in the benchmark (or a slight one - about 3 fps), but a huge improvement on the actual battlefield - I don't get single-digit fps when armies clash or when I zoom in anymore. 40v40 battles are playable; sieges... not so much =). However, zombie armies still plague my game: even in the benchmark, when the camera shows soldiers marching, the 6th row and beyond get thin legs and zombie faces, and then the proper textures pop in when they come closer. Looks ugly as hell.

  13. #13
    alQamar's Avatar Citizen
    Join Date
    Jul 2012
    Location
    Dortmund, Germany
    Posts
    5,963

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by NekOnOkO View Post
    The game still fails to load the GPU properly due to the CPU bottleneck, but now at least it's able to trigger GPU boost, lol. Before, Rome 2 caused my GTX 780 Ti to drop its frequency to 810MHz and GPU load was about 40%; now it's 1251MHz and 55% (peak 67%)
    Which preset do you use, please? Can you provide screenshots of the benchmark with the same preset, using 8.1 and the 9 beta with maxed-out settings? That would be a great help.


    But I will reinstall Rome 2 today, so I can do some testing with my Radeon HD 7970 GHz, 3GB graphics card.
    I think this card is more than enough to run Rome 2.
    I am looking forward to it. It would be nice if you could help me gather logs for this card with MSI Afterburner.
    Last edited by alQamar; January 26, 2014 at 10:54 AM.

  14. #14

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by La♔De♔Da♔Brigadier Graham View Post
    Couldn't agree more if I tried, old bean, and it pleases me to see someone like you, who obviously knows a thing or two about PC optimisation and all that malarkey, tirelessly trying to get your point across to CA!
    Don't ever change or give up the good fight, my fine German friend.

    P.S. Yes, moderators here are certainly more relaxed, and not even opposed to criticism in some cases!
    Here our "Peelers" do a splendid job!
    "old bean" "malarkey" "splendid" worth reading just for key words

  15. #15

    Default Re: CA is joking us people with capable hardware

    There actually aren't that many different textures to load in Rome 2, and they aren't even high-res.

    Rome 2 definitely cannot use more than 3GB, because it is a 32-bit game. No game in the world that runs as a 32-bit process can use more than 3GB.

    http://en.wikipedia.org/wiki/3_GB_barrier

    There is some slight variance depending on the mobo and so on, but it's always around this value.
    That sounds logical to me. Agreed.

  16. #16
    RedFox's Avatar When it's done.™
    Join Date
    Nov 2006
    Location
    Estonia
    Posts
    3,027

    Default Re: CA is joking us people with capable hardware

    Well, this is getting a bit technical, but I thought I'd just throw in my knowledge about memory, too:

    1) Each running process can access up to ~3GB of virtual addresses (that's with the /3GB or large-address-aware setup; the default user/kernel split is 2GB/2GB) - these addresses are mapped by the OS to physical memory.
    2) The upper part of the 32-bit address space is reserved for kernel-level drivers and also inter-process shared memory for DLLs.

    So, you can use at most ~3GB of memory in a 32-bit application, while the remainder is used by the system for shared memory and drivers.

    To access GPU memory, you ask the OS/driver to map some part of the GPU memory into your process's address space. When you're done writing data to GPU memory, you tell the OS that it can unmap that region when needed.
    In pseudocode:
    Code:
    gpuhandle->Map(...);                     // ask the driver to map a window of GPU memory into our address space
    memcpy(gpuhandle->data, someData, size); // copy someData to GPU memory through that window
    gpuhandle->Unmap();                      // release the mapping so the address range can be reused
    So to sum up, you can't map more than ~3GB of memory at once in a 32-bit application, although you can use clever tricks to temporarily offload data to disk or avoid keeping a copy of data in both system RAM and graphics RAM.
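
    If anyone wants to see that ceiling for themselves, here's a rough toy sketch (nothing to do with Rome 2): a 32-bit build that just keeps reserving virtual address space until it runs out. On a default setup it stops around 2GB, or closer to 3-4GB if the exe is built large-address-aware, long before physical RAM or VRAM is exhausted.
    Code:
    #include <windows.h>
    #include <cstdio>

    int main() {
        const SIZE_T chunk = 64 * 1024 * 1024;  // reserve in 64MB pieces
        SIZE_T totalMB = 0;

        // MEM_RESERVE claims virtual addresses without committing physical memory,
        // so the loop stops exactly when the 32-bit address space is used up.
        while (VirtualAlloc(nullptr, chunk, MEM_RESERVE, PAGE_NOACCESS) != nullptr)
            totalMB += chunk / (1024 * 1024);

        printf("Reserved %u MB of virtual address space before running out\n",
               (unsigned)totalMB);
        return 0;
    }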

  17. #17

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by RedFox View Post
    So to sum up, you can't map more than ~3GB of memory at once in a 32-bit application.
    True, but that doesn't limit the VRAM usage of a 32-bit application in any way.

    Yeah, sorry for all the technical stuff, but I am just trying to bust the myth that 32-bit applications can never exceed 3 or 4GB of VRAM usage. People mix it up all the time.

    As you said, only a small part of the complete VRAM is mapped into the 32-bit address space at any time. Therefore nothing prevents you from creating, for example, 8 x 1GB slices of VRAM and mapping them one after another into your 32-bit address space (only one is accessible at any time, the others are not).

    In DirectX, such a VRAM slice is called a 'DirectDraw surface'. You simply define the width and height in pixels and some other info, and you can allocate as much VRAM as you like (or as your system allows). If Windows runs out of VRAM, it will actually allocate the surface in system memory (RAM managed by Windows, not your app), and if you run out of RAM, the swapfile will be used.

    DirectX Surfaces:
    http://msdn.microsoft.com/en-us/library/aa911090.aspx
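
    Roughly like this, expressed in newer D3D9 terms rather than the old DirectDraw interface (just a sketch - d3d9.h and a created 'device' of type IDirect3DDevice9* are assumed, and the sizes are made up): create several large textures in the default pool, let them live in VRAM, and only the one you currently lock gets mapped into the 32-bit address space.
    Code:
    // Eight 4096x4096 ARGB textures: 64MB each, 512MB of VRAM in total,
    // far more than is ever mapped into the process at once.
    IDirect3DTexture9* slices[8] = {};
    for (int i = 0; i < 8; ++i) {
        device->CreateTexture(4096, 4096, 1, D3DUSAGE_DYNAMIC,
                              D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &slices[i], nullptr);
    }

    // Lock (map) exactly one slice, write to it through the returned pointer,
    // then unlock (unmap) it again so the address range can be reused.
    D3DLOCKED_RECT rect;
    slices[3]->LockRect(0, &rect, nullptr, D3DLOCK_DISCARD);
    // ... write pixel data through rect.pBits ...
    slices[3]->UnlockRect(0);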
    Last edited by A Barbarian; January 27, 2014 at 07:26 AM.

  18. #18
    RedFox's Avatar When it's done.™
    Join Date
    Nov 2006
    Location
    Estonia
    Posts
    3,027

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by A Barbarian View Post
    Yeah, sorry for all the technical stuff, but I am just trying to bust the myth that 32-bit applications can never exceed 3 or 4GB of VRAM usage. People mix it up all the time.
    Yes, you can exceed 4GB Graphics RAM usage if your operating system is able to map it for you.

    Quote Originally Posted by A Barbarian View Post
    Therefore nothing prevents you from creating, for example, 8 x 1GB slices of VRAM and mapping them one after another into your 32-bit address space (only one is accessible at any time, the others are not).
    You would need 1GB of contiguous Virtual Address space in your application, which can be a problem with large applications and would not make much sense.

    Quote Originally Posted by A Barbarian View Post
    If Windows runs out of VRAM, it will actually allocate the surface in system memory (RAM managed by Windows, not your app), and if you run out of RAM, the swapfile will be used.
    You must also take into account that automatically managed resources in DirectX keep a copy of the resource in system memory as well, so in fact you would still be limited by the total available virtual address space of ~3GB.
    http://msdn.microsoft.com/en-us/libr...=vs.85%29.aspx

    It's really hard to convince DirectX to mind its own business when you're dealing with your resources...

    Judging from how complex a game such as Rome 2 is, I'm not surprised they're hitting a barrier due to bad resource management and running out of 32-bit virtual addresses.
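
    To make the managed-pool point concrete, a rough D3D9 sketch (my own toy example, with 'device' again being an assumed IDirect3DDevice9*): the managed texture gets a system-RAM backing copy that eats into the ~3GB of virtual addresses, while the default-pool texture lives in VRAM only.
    Code:
    IDirect3DTexture9 *managedTex = nullptr, *defaultTex = nullptr;

    // D3DPOOL_MANAGED: the runtime keeps a system-RAM copy so it can restore the
    // texture after a lost device - that copy counts against the process's address space.
    device->CreateTexture(2048, 2048, 1, 0, D3DFMT_A8R8G8B8,
                          D3DPOOL_MANAGED, &managedTex, nullptr);

    // D3DPOOL_DEFAULT: the texture lives in video memory only, with no permanent
    // system-RAM shadow copy held by the runtime.
    device->CreateTexture(2048, 2048, 1, D3DUSAGE_DYNAMIC, D3DFMT_A8R8G8B8,
                          D3DPOOL_DEFAULT, &defaultTex, nullptr);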

  19. #19
    Baleur's Avatar Biarchus
    Join Date
    Sep 2004
    Location
    South Sweden
    Posts
    624

    Default Re: CA is joking us people with capable hardware

    Intel i5 2500k
    8gb 1333mhz ram
    nVidia 560TI (semi-latest drivers)
    Win 7 64bit

    Rome 2 settings: All maxed, all extreme, SSAO and Unlimited Video Memory. No distortion effects, no Vsync, no Depth of Field. 1680x1050. GEM shader mod to add hdr and color contrast / balance.

    Beta patch 9 pretty much doubled my battle framerate (I don't use the in-game benchmark tool, as it isn't representative of actual gameplay; instead I load up the same custom battle before and after the patch).
    Really, it's a revolutionary patch for i5s and 560 Tis. I still can't believe the increase.
    I'm now getting the same framerate with Extreme unit detail as I got with High unit detail previously.

  20. #20
    RedFox's Avatar When it's done.™
    Join Date
    Nov 2006
    Location
    Estonia
    Posts
    3,027

    Default Re: CA is joking us people with capable hardware

    Quote Originally Posted by Baleur View Post
    Intel i5 2500k
    8gb 1333mhz ram
    nVidia 560TI (semi-latest drivers)
    Win 7 64bit

    Rome 2 settings: All maxed, all extreme, SSAO and Unlimited Video Memory. No distortion effects, no Vsync, no Depth of Field. 1680x1050. GEM shader mod to add hdr and color contrast / balance.
    Now that's interesting. The i5-2500K is actually a pretty good CPU, according to http://www.cpubenchmark.net/cpu_list.php
    What was your FPS before and after?
