If you want lower CPU utilization, you actually have to raise, not lower, your graphics settings.
That's because most graphics settings (apart from 'unit size', 'blood', 'shadows' to some degree, and maybe a few others) load the GPU much more than the CPU. Only when your GPU is fully utilized will the CPU be used less. It's very rare for CPU and GPU to both sit at 100% at the same time while just running a game: either the GPU is waiting for the CPU or vice versa.
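The "either the GPU waits for the CPU or vice versa" point can be sketched with a toy frame-time model (the millisecond numbers are made up for illustration, not measurements):

```python
# Toy model: each frame, the CPU prepares work and the GPU renders it.
# The slower of the two sets the frame time; the faster one idles,
# which is why its reported utilization drops below 100%.

def frame_stats(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)  # the bottleneck sets the pace
    return {
        "fps": round(1000 / frame_ms, 1),
        "cpu_util": round(100 * cpu_ms / frame_ms),  # % of the frame the CPU is busy
        "gpu_util": round(100 * gpu_ms / frame_ms),
    }

# Low graphics settings: the GPU finishes quickly, so the CPU is the bottleneck.
print(frame_stats(cpu_ms=10, gpu_ms=5))   # CPU at 100%, GPU at 50%

# Higher settings: the GPU takes longer, so the CPU spends half the frame waiting.
print(frame_stats(cpu_ms=10, gpu_ms=20))  # GPU at 100%, CPU at 50%
```

Note how raising settings only changes which unit waits; the framerate is always set by the slower of the two.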
And that your 8600K is at (or close to) 100% is not really surprising, since it 'only' has 6 cores (albeit real cores). A 7700K, for example, only has 4 real cores, but it also has hyperthreading, which presents an additional 4 'virtual' cores to Windows. Those virtual cores are not as powerful as the real ones; they can improve performance by maybe 30%, depending on the application. But to an application, a 7700K with hyperthreading presents more logical cores (8) than your 8600K (6), so a properly threaded application will divide its work into more threads on it. (The 8700K is essentially an 8600K with hyperthreading enabled.)
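As a back-of-the-envelope comparison, using that rough ~30% figure as an assumption (it is not a benchmark result):

```python
# Rough throughput estimate: assume a virtual (hyperthreading) core adds
# about 30% of a real core's performance. This factor is an assumption
# taken from the text above, not a measured value.
HT_FACTOR = 0.30

def effective_cores(real_cores, has_ht):
    """Approximate core-equivalents of raw throughput."""
    return real_cores * (1 + HT_FACTOR) if has_ht else float(real_cores)

print(round(effective_cores(4, True), 1))   # 7700K: 4 real + HT  -> ~5.2
print(round(effective_cores(6, False), 1))  # 8600K: 6 real cores -> 6.0
print(round(effective_cores(6, True), 1))   # 8700K: 6 real + HT  -> ~7.8
```

So even though an application sees 8 logical cores on the 7700K versus 6 on the 8600K, the 8600K still wins on estimated raw throughput; the extra logical cores mainly help software that scales with thread count.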
I would use MSI Afterburner, which you can configure to monitor CPU and GPU utilization in-game, to find a good mix between the two. If your GPU utilization is usually below 100%, you have some headroom to enable more graphics settings. If you are unhappy with your framerate, you have to check whether the CPU or the GPU is the bottleneck. If it's the CPU, then unfortunately you don't have many options. The one setting that will lower CPU utilization the most is 'unit size'. 'Shadow' settings can also be taxing for the CPU (at least in former TW games), especially in unit close-ups.