Sex And The City
Now that we're all enlightened and feminist, is it time we revisited this show? I know the indie lad consensus at the time was that it was just all about shoes, and one of them looked like a foot, and omg how dare they, etc etc, but is that really fair? I noticed it came up in the Bechdel thread and two posters agreed it was just women talking about men and shoes, which struck me as a bit of an outdated perspective. People still seem to maintain their anger about this show, while a newer show like Girls, which for me SATC definitely paved the way for (and they share a lot in common), is widely respected.
I'm not saying it was a masterpiece by any means, and I have to admit I haven't seen an episode in years, but it does feel like it has a bit of an unfair reputation, born at least partly of a pretty ugly gender-political landscape at the time it came out. Is it time we issued a collective retroactive apology for the way we treated a show that was genuinely trailblazing?
Thanks