In society today, we're too afraid to say how we really feel, in case it hurts someone's feelings or ruins a friendship. We never step back and say, "This really hurt/bothered me" or "You really pissed me off by doing this." Why are we so afraid to tell someone how we really feel if we believe the person we're telling is truly a friend?
Have we become such a weak society that we depend on friends to stick by us only as long as we don't tell the truth? Are we so dependent on people staying close to us that we would rather be hurt by them continuously instead of telling them they're hurting us or pissing us off? Why do we insist on being hurt or pissed off on a regular basis just to keep these people in our lives? What kind of people are we becoming?
My dad always says that "if you allow someone to treat you like a doormat, then you are a doormat." But if you stick up for yourself and tell people that you are not a doormat, you risk losing that person. People are in your life for a reason, but if they're walking all over you, then what is the reason they're in your life? If you told someone how you truly felt and they decided to leave, were they really worth it at all? You should be able to tell people that you're not happy without fear of repercussion (within reason, of course).

Growing up, I was always told that your friends are your friends, regardless of what you do or say. If you're pissed off, you should be able to say so without them getting up and leaving on you. So where did that idea go? Where did this strong will disappear to in our society? Why are people so easily offended, and why do they run away from good people when confronted?
These are questions I've been asking myself for quite some time, and with recent events this week, they're very much on my mind. I'm seriously trying to answer them.
I guess I have a lot more thinking to do...