I’m trying to get perspective on this particular beauty standard and how I want to approach it. Do people whiten their teeth where you live? Is it seen as expected to do so? Do you live in a city?

I have healthy teeth that have nevertheless seen a lot of tea and coffee. I have generally thought of this as similar to wrinkles, i.e. a natural thing bodies do that I don’t want to pay money to fix since it isn’t broken. I still think this. But I have been feeling lately like there might be more actual social stigma to my teeth being discolored. I am wondering: is this at all real? Has whitening teeth become an expected thing for all adults to do now? I thought I’d ask how other people feel and think about this and what the general norm is in your social circle.

Edit: thanks for the responses everybody.

  • rand_alpha19@moist.catsweat.com · 5 months ago

    It’s relatively common where I live, but it’s not often talked about. My adult teeth came in slightly yellow, and my parents had me try whitening them before a dentist told us that, in that case, it’s not something whitening can fix, so I stopped caring.

    Maybe if you have staining from food or smoking it’s different, but to me it seems like a bit of a waste of time in terms of beautification unless you have nothing else on your face you could improve instead. Hair and skin improvements make more of an impact IMO.