Oct 12, 2022
It's not like racism ever went away in the United States. It just sort of went undercover and became something a little too indecent to talk about in public, like sex or something.
Trump made overt racism okay again, and his followers are delighted.