This is kind of a general question, spanning the last thirty years or so. A lot of my family left the UK during the 80s and 90s, saying it was terrible and awful. They emigrated, then returned a few years ago and said they loved it; apparently it's changed a lot for the better. However, a lot of people say the country is ruined now and has got progressively worse. So my question is: in your view, is the UK a better or worse country than it was in the 70s, 80s or 90s? (Not the 60s or earlier, as I think society was too different to compare.) Thanks for any replies.