Bit of a sensitive topic for these boards, and I apologise for that, but does anybody else get the impression that this country is getting increasingly right wing?
I'm no expert on politics, but when I speak to people (my parents, colleagues, people on my course, etc.) I get the impression that their views on certain things are going backwards.
I find some of the things people believe these days (and I think this has changed noticeably in the last couple of years) so difficult to agree with, and it's quite worrying.
Only the other day a woman at work told me she'd "never felt so racist as she does now". I mean, what the fuck? In my opinion David Cameron will almost certainly prove to be the right man, in the right place, at the right time, and I'm bracing myself for a Tory government.
Anybody have any opinions on this? Or am I just a rambling fool?