Recommend me some WWII documentaries from as unbiased a perspective as possible.
I've recently wanted to learn a bit more about WWII, because in all honesty, apart from the basic narrative of Hitler=Bad, Allies=Good, I knew very little about it. Obviously, to understand Hitler's rise you need to go back to the First World War and the reparations imposed on Germany under the Treaty of Versailles, how they affected the country, and so on.
Anyway, I was recently watching a documentary that talked about Hitler creating something like 4 million jobs in 12 months and generally being great for the country for a while, but it pretty much skipped over the Holocaust and focused instead on the poor treatment of German POWs at the hands of the Allies. While that treatment may well have happened, the whole thing felt biased towards the Axis, like it was put together by Stormfront or something.
I'm sure there are plenty of historians on here who can recommend some stuff (books or films) that outlines what happened and when, without focusing too much on either side being right or wrong. It just seems like too fucked up a time not to have a decent understanding of it.