Is there any standard consensus about the amount of RAM considered good enough to play Total War games?
16 GB would be the best I'd say. It'll give you enough for what you need, and it'll be future-proof.
Proudly under the patronage of General Brewster of the Imperial House of Hader
Proud patron of 4zumi, Akar, CommodusIV, Swaeft and Peaman
16GB is the new 8GB; people used to say 8GB was enough back in the day. Newer TW games are 64-bit, so it's more likely they'll use more RAM, since other 64-bit titles like Star Citizen also use tons of RAM.
32GB is pretty expensive now and not really necessary even if you're running a small server + playing a game at the same time.
The AI Workshop Creator
Europa Barbarorum II AI/Game Mechanics Developer
The Northern Crusades Lead Developer
Classical Age Total War Retired Lead Developer
Rome: Total Realism Animation Developer
RTW Workshop Assistance MTW2 AI Tutorial & Assistance
Broken Crescent Submod (M2TW)/IB VGR Submod (BI)/Animation (RTW/BI/ALX)/TATW PCP Submod (M2TW)/TATW DaC Submod (M2TW)/DeI Submod (TWR2)/SS6.4 Northern European UI Mod (M2TW)
What the gentlemen already said. No need for 32GB, unless you have a big budget and really don't care about the money.
FYI, more memory won't increase your FPS by itself. It will help with general system stability and responsiveness though, and combined with an SSD you'll get quicker loading times, turn times, etc.
Thank you very much for your replies. I get the idea that 16 GB of DDR4 is enough RAM to play any Total War. Since I don't want to bother you by opening more threads about the same thing, I'd like to ask whether you know of any program to measure a game's performance and hardware utilisation. I want to study CPU, GPU and RAM usage in Empire: Total War carefully, mainly CPU usage. Which program can I use to see it?
I'd say MSI Afterburner is a useful tool for that.
For benchmarking gaming performance your best tools are:
- MSI Afterburner + RTSS
- MS Excel
Just make sure to enable the logging option, which saves the gaming session's data to a log file. You can then analyse that detailed log in MS Excel to calculate results and compare different games and hardware. In Excel you can also create charts like the ones professional websites publish when benchmarking a particular graphics card.
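To illustrate the kind of number-crunching you'd otherwise do in Excel, here's a minimal Python sketch that parses a logged session and computes average and 1% low FPS. The column names and the log excerpt are invented for the example, since Afterburner's CSV layout varies between versions:

```python
import csv
import io
import statistics

# Hypothetical excerpt of an Afterburner/RTSS session log exported as CSV.
# Real logs have more columns and version-dependent headers.
sample_log = """timestamp,Framerate,CPU usage,Memory usage
00:00:01,58.2,41,9210
00:00:02,61.7,44,9215
00:00:03,30.1,78,9230
00:00:04,59.9,43,9228
"""

def fps_stats(csv_text, fps_column="Framerate"):
    """Return (average FPS, 1% low FPS) from a CSV session log."""
    rows = csv.DictReader(io.StringIO(csv_text))
    fps = sorted(float(row[fps_column]) for row in rows)
    average = statistics.mean(fps)
    # "1% low" here: the mean of the slowest 1% of samples (at least one).
    slowest = fps[:max(1, len(fps) // 100)]
    return average, statistics.mean(slowest)

avg, low = fps_stats(sample_log)
print(f"average FPS: {avg:.1f}, 1% low: {low:.1f}")
```

From there it's the same idea as the Excel charts: dump the per-run numbers for each game or hardware configuration and compare them side by side.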
PS: Yes, 16GB of RAM is more than enough. Games like GTA V consume about 9.8GB of RAM even with 60+ tabs open in Chrome. So 16GB is plenty by today's standards and for future-proofing. Besides, with advancements in graphics, VRAM consumption has increased far more than RAM usage has in 64-bit games.
I run 32GB on my Desktop because I run Chrome in the background 24/7 and we all know what an absolute RAM hog Chrome is.
Patron: The Mighty Katsumoto
Sukiyama's Blog
Simple explanations of Austrian Economics POV on a number of issues.
Simplified Western Philosophy
Best of Thooorin, CS:GO Analyst and Historian.
Remember that using Chrome or even Firefox with hardware acceleration ON will affect GPU usage as well, which can somewhat hinder your gaming, mainly if you're an Alt-Tab gamer. But with the latest Win 10 update there's a Game Mode which can detect a full-screen game and deprioritise Chrome in the background until you bring it up again.
There are also some handy Chrome add-ons like Tab Wrangler, which suspends inactive tabs and parks them under a single icon on the far right so you can revive them again later.
Bloody hell, just disable the preloading option in Chrome and it runs fine. Unless it's a page with video or a real mess of bells and whistles all around, Chrome won't consume more than a few tens of MB of RAM per open tab. I hate when people complain about Chrome hogging RAM when it's so easy to fix. I've got a ~5-year-old laptop which was mid-grade at best on its release, with 8 GB of RAM, and I can run XCOM 2 or The Witcher 3 on medium-ish detail just fine with 20-ish Chrome tabs in the background, just to illustrate.
To be fair to Suki, Chrome is indeed a resource hog compared with other browsers. You can tweak it a bit ofc and improve its performance, as you can with every app.
As for Windows 10's Game Mode, it'll probably help a bit with minimum FPS (not much though) by allocating specific CPU cores to the game and leaving the other cores to the remaining processes, but it's bad for multitasking and high-end CPUs; that's why most tech and gaming sites suggest disabling it as soon as you freshly set up a new PC. Microsoft is still unable to fix it completely or give us a clear answer about it.
If 16 GB of RAM is enough and the standard for Total War games (and games in general), to what extent does RAM clock speed matter for Total War? Is there a noticeable difference in performance/FPS between 2133 MHz, 2400 MHz, 2666 MHz and 3000 MHz?
The fluctuation in gaming FPS due to RAM frequency and CAS latency (DDR4 or DDR3) is minimal and not worth paying extra for. Price-wise, 2666 MHz is the sweet spot; there's not much difference between any of them unless you go ultra high-end with marketing BS like "super heatsinks on RAM!".
100% agree. At most you'd see maybe 5 FPS more, or 15 in an exceptionally well-optimised game, but that's unlikely.
You can program things in C++ to be "super" optimised and use memory exactly as intended at multiple levels, but it's very unlikely a game dev would go to such lengths.
If I recall correctly, the higher-end Ryzen CPUs benefit from higher RAM speed to a greater extent than other CPUs, since Ryzen's Infinity Fabric interconnect runs at a clock tied to the memory clock.
I can confirm that, having a Ryzen 2600 in my own desktop. However, I really wonder whether you could tell the difference in a real-world situation playing games. There's a lot of info about this to be found, like this little video:
Actually, all the Ryzen series have had issues with RAM and motherboard compatibility since their release, especially at higher frequencies. Many users are still reporting them and, as of today, we have no official 100%-stable fix. It's basically a fault in Ryzen's core architecture. The upcoming third-gen Ryzen (supposedly) won't have the same issues (instability, XMP crashes, auto-lowering the frequency to 2666 MHz no matter the setting), but we'll have to wait and see.
People are usually referring to benches when they talk about RAM's effect on CPU performance. There is no game, to my knowledge, that would benefit from 3200 MHz or the most recent 3600 MHz RAM under any circumstances. But when we're talking about OC as a sport and benches as a hobby, then yeah, 4200 MHz DDR4 will give you a few more points on your final score. I'm currently sitting on 16GB of DDR4 and it's more than enough for me tbh, since I'm not running benches anymore... yeah, I got old
PS
Long story short, with Ryzen you'll need higher-frequency RAM to actually get your memory kit's rated MHz performance, always according to the manufacturer.
Last edited by ♔Greek Strategos♔; March 17, 2019 at 10:02 AM.
Earlier Ryzen CPUs were just poorly optimised for in games, since not many devs took them into consideration given Intel's dominance at the time. That resulted in RAM being utilised inconsistently, since it's the CPU that issues the commands and has to wait for the response. Besides, some games just end up poorly optimised for AMD, DOTA 2 for example.
Also, accurate RAM performance is hard to measure via games, and it doesn't really make much sense to try. Anyway, true latency is actually about the same across the same range of high-end frequencies; it's calculated from the actual clock-cycle time and the CAS latency.
As you can see, true latency at the higher frequencies ends up roughly the same.
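To put numbers on that: true latency in nanoseconds is the CAS cycle count times the clock period, and since DDR transfers twice per clock, the period works out to 2000 divided by the transfer rate in MT/s. A quick sketch, using common retail speed/CL pairings as the examples:

```python
def true_latency_ns(transfer_rate_mts, cas_cycles):
    # DDR transfers twice per clock, so the clock period in ns is
    # 2000 / (transfer rate in MT/s); multiply by the CAS cycle count.
    return 2000.0 * cas_cycles / transfer_rate_mts

for name, rate, cl in [("DDR4-2133 CL15", 2133, 15),
                       ("DDR4-2666 CL16", 2666, 16),
                       ("DDR4-3200 CL16", 3200, 16)]:
    print(f"{name}: {true_latency_ns(rate, cl):.2f} ns")
```

That works out to roughly 14.1 ns, 12.0 ns and 10.0 ns respectively; notice how the jump from 2666 to 3200 buys only a couple of nanoseconds once CAS latency is factored in, which is why kits with different frequency/CL combinations often land so close together.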
As for the video you linked:
1) The only reason you're seeing a somewhat considerable difference in the video is either poor optimisation or true RAM latency, which matters more than frequency.
2) The 2133 MHz kit is definitely underutilised, since it's rated to run at 2666 MHz with its XMP profile, which I suspect wasn't activated (I checked its specs).
Maybe it was an advertising stunt to promote the G.Skill Sniper RAM, since it's mainly marketed as gaming RAM. This is a very detailed article on the G.Skill RAM:
https://www.guru3d.com/articles_page...review,13.html
It clearly shows its performance when downclocked to 2133 MHz. You'll find that at 1080p resolution and above, the difference is usually under 3-4 FPS.