Anyone else bitter about their dental school experience?

I graduated in 2016, and I hated those years, other than my classmates and a few of the instructors. The instructors who made your life miserable are the ones you never forget, especially the ones who openly criticized you in front of patients or other classmates. And then there was all the racism and sexism. I remember certain instructors who would go out of their way to help the young, pretty female students, yet ignore the male students when they reached out for help. It was pretty blatant. I also looked around on DentalTown.com and remember reading stories about certain professors who would sleep with their students.

Then there was the political BS from an administration that didn't seem to care about the students. From talking with other dental students, it seems this is pretty much universal everywhere. Apparently, it was way worse in the '80s and earlier.

I hope those instructors are rotting in an alley somewhere. I wouldn't piss on them if they were on fire. I wish them nothing but misery.

Seriously, fuck dental school.