Sometimes I wonder what happened to humanity. And when it went wrong.
I don’t remember society being the way it is now when I was growing up, and that was only in the nineties and two thousands, which to be fair was twenty to thirty years ago. But still, that isn’t that long. It’s not even an entire generation, and yet…
When did family stop being the most important thing in people’s lives?
When did people decide a career was more important than raising their own children?
When did society start to look at stay-at-home moms as something bad?
When did society decide the “I” was more important than the “we”?
Everyone talks big and claims to be so good… politically correct, posting about the right stuff, using all the hashtags, being pro whatever is woke at the moment and supporting the current thing… but who is actually doing anything productive to improve whatever it is they are standing up for? Who has actually gotten their hands dirty? Who knows from real life that the issue they so strongly support is really an issue and not something the media has created to sell their papers?
Who is doing something to make people’s lives better?
Who is actually listening to what others have to say with an open ear and mind, willing to learn and see the other side of a matter?
And how is it that how I feel or what I think only matters if I agree with the current thing, but things “aren’t about how I feel or what I think” when I disagree with it? People claim to be inclusive and for everyone, yet refuse to acknowledge that what the other side thinks and feels is just as valid.
We all matter and all our opinions count.
Take your time and listen to people: listen to learn, not to argue. The world would be a much better place if people were just willing to listen to one another and try to understand the other side.