The Oscars are the most hallowed and important event on the Hollywood calendar… which goes to show you just how ridiculous Hollywood is.
The Oscars used to mean something. They used to give the Oscar to the best movie, whether it was Jurassic Park or Citizen Kane. Those were great movies that people actually enjoyed.
These days the Oscars are just about Hollywood saying “look at me, look how great I am!” The Oscar is given not to the best or most entertaining movie, but to the “most artistically important.”
I know that actors are artists and I have great respect for that, but not all art is good. Some art just sucks, whether it’s music or finger painting. (Stop putting that crap on Instagram, your kid isn’t a Picasso.) Hollywood seems to think that film is only art when not a lot of people like it, which is ridiculous.
I’m not advocating that Michael Bay win an Oscar. I’m advocating that movies people actually went to see win a few.
What do you think about the Oscars? Awesome or pathetic? Let us know in the comments.