Is it just me, or have we become increasingly cynical & pessimistic? Sometimes it's good to take a break from the news & politics. I did this for the summer so that I could refresh & come up for air. However, I am back, & it seems that not much has changed since I've been gone. It's still mostly about death, corruption, & bashing one another. No wonder there's road rage & anger management classes, & people don't smile or say thank you anymore.
What do we do? Can it change? I am hopeful, ever the optimist when it comes to these issues. I have to be. My daughters are young, & I want a better America, a better planet, a better human race for them!
I think I have said this before: we are getting worse, not better. I seem to lose hope when I watch the local news or national headlines. The interesting part is that more & more Americans seem to be watching "Reality TV" to get out of their own heads & become voyeurs into the lives of others. I didn't understand this as much until recently. But do we have to be "dumbed down" in the process? It sure seems that instead of solid, informative news, the juicy stories of sex, lies, & drugs will continue to prevail.