Disclaimer: I live in a Red state. It is election season, and many of the political ads follow a specific script. Lately, I've seen ads featuring women slamming DEI (Diversity, Equity, and Inclusion), which sent me down a rabbit hole. On the surface, diversity, equity, and inclusion sound like common sense: why would any rational person be opposed? Don't we want to promote a culture where individuals are given opportunities based on their skills, regardless of their gender, background, or physical abilities? When did recognizing the contributions of minorities, women, and the disabled become a "bad" thing?