Hello ladies! I was just wondering this earlier today as I was sitting around in my PJs and enjoying my morning coffee. My roommate came home from his job (he works overnights) and he said to me "Oh, no Samantha today?" and I just shot him a strange look. I said "Still Samantha, with or without the wig..." and he just kind of rolled his eyes and went up to his room.
Then it got me thinking. Why the hell do I wear the wig anyways? I mean, don't get me wrong, when I look in the mirror, I want to see a girl looking back at me, and that is quite hard to achieve without a wig at this time in my life. But I find wigs to be uncomfortable at times, and I don't want to dirty them up when I'm just lounging around my house in my jammies. Of course, when I'm getting ready to go out, I will never forget to put my wig on, but I'm thinking about just not wearing it when it's just me at the house. Before, I needed it to feel like a woman, but lately I feel like a woman anyways. I wear a bra, I have painted nails, I have a girly eyebrow piercing, I practically never wear any guys' clothes anymore... I don't need the wig to feel justified in expressing my femininity anymore.
I used to feel stupid when I would put on my girl clothes without also doing my makeup and putting on a wig, but now I'm just like "screw it, this is me!" and wear whatever I want to. After laser treatments, I have like a 4-day period where my face is too rough to shave, but I've still gone out in public as a girl with a thick beard shadow, and I just don't really care anymore. I've definitely had some strange looks, but that's on them if they're bothered by my appearance. I love me and I'm not afraid to admit that anymore.
<3
(I'm not saying "anymore" anymore, I swear)