Disney, for all of our happy memories of it, has been at the forefront of changing America for the worse. We just didn't know much about it until recently. I first learned about it when some friends' kids and a nephew were trying to get jobs there. One of them did, and he told me about all of the cultlike, grotesque, and political stuff.
I tried to warn others about it, but was accused of attacking America and being un-American.
Another friend's nephew went to work there on the stage entertainment side. He was a staunch conservative for many years, but finally gave in to the world's pressure and, even though he still knows conservative values are clearly more in the right across the board, now claims he's a liberal and, after fighting it for many years, gay. He was raised conservative and Christian, and his parents never divorced, but he never lived up to his own high expectations of himself, got into drugs and alcohol, was taken advantage of and programmed by other kids and TV from an early age, and now he's lost.
I have no idea just how much influence Disney had, but the company had been seen by so many as clean and wholesome that everything it put out was soaked up without question or scrutiny for a very long time. I stopped watching anything Disney after the movie with the black hole. Something about it didn't sit right with me back then. I only saw things sporadically on video releases at friends' or girlfriends' houses much later.
Very disappointing history there, considering how loved they were by so many.