Anyone who would like to vent out their feelings about Doctors, you can do it here!
I've had enough personal experience with doctors to know and feel like they really don't seem to understand, with the exception of one I know who has been very helpful. He's male, though, and I wanted to see a female doctor, as you can probably understand.
Doctors just don't take things seriously! They treat you like you're stressed, depressed and just another typical case!
I, for one, have reason to be so critical, because it cost my dear Nana her life! They always kept me in the dark about such things, but she was tired and bloated, and her tummy was getting bigger and bigger. She went to the hospital a few times, and I can remember her complaining about how she was being treated like she was stupid. They even offered her a huge meat sandwich when she couldn't eat; food just passed right through her. They didn't take her seriously! AND DO YOU KNOW WHAT! She had cancer! Ovarian cancer!!! If they'd taken her seriously they would've found it in time. It was horrible.
To make things worse, my friend went through the exact same thing: her Nana passed away because the doctors didn't take her seriously!!! And now I am dealing with the exact same thing myself. Why?