I had a skin infection once and tried to tell my derm that I'd read on a forum what it most likely was and how to treat it
happens all the time
Tell me about it
Had a toe infection years ago.
Went to the doctor.
Doctor literally said: "it doesn't look like an ingrown toenail, but let's just do the standard treatment for ingrown toenails"
The standard treatment: surgically REMOVE!! the fucking toenail.
Doc told me to make a new appointment for the surgery.
I didn't do it. Instead, I started applying tea tree essential oil to the toe every day.
After a few weeks the infection was gone.
And it makes me FURIOUS that health professionals pretend this stuff has no scientific explanation.
When it literally does, there are papers on it and actual research. But they'd rather have you buy chemical pharma drugs that can be patented and branded.
Also, why do they all have to put on the "I'm the doctor here so stfu" attitude?
I think it's psychological.
They study like crazy for years and years and then think they are somehow superior.
In my school, everyone who got a straight-A Abitur went to study medicine. Not out of personal interest, but because they knew it would give them prestige, power over human life, and (in the long run) tons of money.
A few years ago I was in the hospital. I had to take legal action against the fucking hospital because they refused to treat me appropriately.