I heard that there was a documentary series running in Germany, essentially akin to "America: The Story of Us" except about Germany. I believe the English translation of its title was "The Germans" or something similar.
I've been meaning to find this documentary ever since I heard about it. However, every time I google "The Germans," all I get is books or nationalist material that has nothing to do with what I'm looking for. If you know what I'm talking about, what was it called in German, and even better, where could I watch it? With or without subtitles.
Thanks for your patience and understanding.