When I was a kid I spent every Saturday afternoon in a movie theater watching all the old westerns and comedies. But those days ended long ago.

During World War II, Hollywood got behind the war effort 100 percent. Celebrities sold war bonds, did USO tours, and made movies about war heroes. Those days also ended long ago.

Hollywood Has Always Been Political

Hollywood used to be a mirror of America. Movies entertained us, made us laugh, made us cry, made us think.

Today it seems that movies have distorted the image in that mirror. If filmmakers disagree with what the mirror shows, they simply change it to reflect the image they prefer.

Rather than going with the Mona Lisa, let's make it look more like Barbie to sell a few more tickets.

The more political the celluloid message, the more Hollywood pats itself on the back for its artistic contributions.

Hollywood has always embellished the truth. Any biography will likely have a few changes to bring the life depicted more in line with the producer's ideology.

Some Final Thoughts

Instead of the Oscars, I think I'll throw a copy of "Dirty Harry" in the DVD player and watch Clint Eastwood vanquish the bad guys.

At least I won't hear anything about gun control in that one. What are your thoughts on the Oscars? Leave your comments below.