Is it fair to say that Western movies often fail to illustrate the true image of the traditional cowboy?

1 Answer

brettd | High School Teacher | (Level 2) Educator Emeritus

This is, of course, largely a matter of opinion; as with cinema generally, beauty and truth are always in the eye of the beholder. But with this question, I think the answer is pretty clearly yes: for the most part, Hollywood has not been able or willing to accurately portray the life of the typical American cowboy.

In the classic westerns that became popular in the 1950s and '60s, you notice that cowboys always wore white (first myth) and were always spotlessly clean (second myth). They seemed to use their guns every day (third myth) and were always defending the beautiful woman, though there weren't many women in the cowboy West. Many other movies go to the opposite extreme and portray cowboys as utterly ruthless and despicable characters; while sociopathy may have been more common in the West than in the urban settings of the time, I wouldn't say it was decisively so.

There are exceptions, of course, and one has to remember that Hollywood's main purpose is to tell a story, not necessarily to tell the truth. Its purpose is to fill seats and make money, and historical truth is not always entertaining.
