America's companies seem fully intent on selling wokeness rather than products. Whether it's changing their corporate image to a rainbow version on social media (except on the Middle East-focused accounts), hiring based on "diversity" rather than merit, or pushing any other form of leftism via what ...