I've been out of the nursing profession for some six years now, and although I've kept tabs on what's been going on in nursing, I've noticed that only recently have more male nurses been shown in your paper. While recent headlines have no doubt placed male nurses in a somewhat bad light, I want to say that, in my opinion, nursing still has a long way to go before male nurses are considered equals.
When I graduated in 1992 and passed my boards in July of that year, I had absolutely no idea of the crap I would have to face just to practice nursing. For example, when I was an RN at a rural hospital, I was physically assaulted by an employee there who felt men had no business in nursing.
At another rural hospital, simply for stating that I was tired of the nonsense I had to deal with, I was told by the DON that if I didn't resign IMMEDIATELY, I would be terminated, and she (yes, a woman) would do everything in her power to see to it that I could never find another job. I've been accused of sexual harassment out of convenience rather than any actual wrongdoing (there was none whatsoever). I've been accused of abuse with absolutely no evidence of anything of the sort, yet I was terminated from jobs almost at a whim.
It was when I suffered a stroke in 1997 that I decided nursing wasn't worth the crap. People who read this may criticize me or my motives, but I ask you this: with all the problems nurses face today, can any of you honestly say that you have actually helped your profession? Or, to put it bluntly, can any one of you who wishes to criticize me honestly say you haven't seen things like those I've just described?
I openly challenge anyone who reads this to prove me absolutely wrong on everything I've said.