I am experiencing some cognitive dissonance trying to absorb the idea of Disney as a valiant warrior for tolerance and social justice. It's not so much that I think Capitalism is Evil; it's that during my formative years Walt Disney presented as sort of a Nazi. During the '60s, Walt declared war on hippies, refusing to hire anybody with even a smidgen of facial hair (despite his own iconic 'stache) and reportedly turning away Disneyland customers who looked remotely countercultural. Walt is long dead, of course, and his personal vendettas have been discarded in favor of simply selling product to the largest possible audience, but it still seems strange. Not that it stops me from thoroughly enjoying the confrontation, or from applauding every time Disney scores another point against DeSantis.