I am by no means an expert, but it seems to me that “your body does not belong to you” is a major theme of right-wing authoritarianism and, interestingly, of modern USAmerican thinking. It underpins so much, from abortion restrictions to forcing kids to hug their relatives. Your body belongs to the state, or God, or your husband, or your boss, or your doctor. It shows up in everything from opposition to trans and gay liberation to forcing autistic people to make eye contact to making cashiers stand for no reason. Your body does not belong to you, yet taking care of your body is your responsibility and yours alone, and if you fail in some way, you deserve the consequences.