What do you think of all the recent uproars because men control women's beauty industry, with all big brands having male CEOs and stuff?
I think that women are still wearing shoes that cause their feet to fracture (which I’ve experienced personally), since they stress the bones and are entirely impractical (and make it harder to run away). We’re still pouring hot wax on extremely sensitive parts of our bodies. Wearing bras, censoring nipples. We’re subconsciously attempting to live up to a standard that doesn’t even exist. To an extent, men created this “ideal version” of a woman in order to sell product. But where along the line did women adopt these processes as their own and perpetuate them? I think it’s a very strange moment: women are aware of the beauty standards men have been perpetuating (look at current street style photos with sneakers instead of heels, look at Marc Jacobs’ bare-faced models this season), and they’re choosing to either adopt them, make them their own choice, or go against them. Either way, I feel like women are making a bigger statement now than ever before about wanting to dress for themselves and not because someone told them to dress a certain way. It doesn’t matter who’s selling them the product.
East Village rooftops | Manhattan New York City
If they don’t need you, it’s okay. You don’t live for other people.