Alright, so maybe they were the ones to drop the bombs on Japan, but when it comes to Germany that was more or less Russia's achievement in the field, with the Americans and Commonwealth being a distraction more than anything.
Why do Amuricans think they won it all?
Is it just the media's fault?
I have met Americans who, if you talk to them about WW2 or even WW1, act as if their president went over there, led the troops from the front first hand, and single-handedly defeated Germany.