I live in a small Southern community in the United States, a place where conservative Christians are the overwhelming majority. I work in a little family-owned restaurant - the oldest restaurant in town. Because both my parents have noteworthy reputations here, customers CONSTANTLY recognize me as their daughter. On top of that, all the other servers there are the stereotypical Southern white female - lots of makeup, hair and nails always done, regular visits to tanning beds.
When I first started working there about a year ago, I showed up with no makeup, my long hair pulled back, wearing the required all-black attire. I had chosen men's pants and men's shirts to wear there... and very quickly, I noticed that my tips were nowhere close to what the others were making. So I altered my appearance because I seriously needed the money - I at least brush my hair before going in, and I even wear some makeup. Earrings too, if I remember. This isn't the first time I've dealt with makeup and the like; being from this part of the world, I'm well accustomed to dressing a certain way just to fit in. But I'm noticing more and more that it makes me feel disgusted with myself. Yes, I am biologically female; no, I do not feel female; no, I don't feel entirely male... but being "forced" to come across as hyperfeminine for my job feels like it's compromising my true self.
I've looked for other jobs (to no avail), so I feel kind of stuck. On top of that, I live at home with my parents right now.
LUCKILY, I'll be going back up to college in the fall, and I'm so thankful to attend a school that is extremely open and accepting about such things. But that doesn't help how I feel right now. I don't really see much of a solution to this, since I'll be leaving in a month or two. I guess this thread was mostly just for me to vent about how frustrating it is to be in such a situation!
Thanks for listening,
Connor