An article a while back in Grantland caught my eye: “Dystopia Is the New Western.” Fans of Western fiction, and even more so those who dabble in writing it, are forced at times to ask two questions: What has happened to the Western? And does it have a future?
America can boast of inventing a number of essential “art” forms: the Blues and Jazz; bourbon; baseball and basketball; and the Western, which gave birth to Hard-Boiled Fiction.
The Western, with its themes of “man (human) vs. wilderness,” rugged individualism, and the essential corruption of big business, is in the DNA of the American psyche. It is our definitive and defining myth.
And yet, the Western has all but gone away. And in its place… dystopia.
I have been working my way through West of Everything by Jane Tompkins. It is a “feminist” look at Westerns by someone who loves them. It is also the best book about Westerns and popular culture I have ever read.
To quote the book’s summary at Amazon, Tompkins believes that Westerns were born out of:
a reaction against popular women’s novels and women’s invasion of the public sphere [in the late 19th Century]. With Westerns, men were reclaiming cultural territory, countering the inwardness, spirituality, and domesticity of the sentimental writers, with a rough and tumble, secular, man-centered world.
She also makes a convincing case that for men, and for women too, born roughly between the 1920s and 1960s, the Western more than anything else shaped our view of masculinity. I know for myself that this is most certainly true.
And yet I need to be honest with myself as well. While I know that the view of masculinity depicted in Westerns is not realistic, and is often unhealthy, it is a view of masculinity I do not want to let go of.
And so that is why I do not want to let go of Westerns. Why I read them even though the writing is often terrible. Why I want to write them. Why I want them to be better written than they are. Why I pop in a John Wayne DVD whenever life seems confusing and pointless. Why I still think the Western, along with Jazz, is the most American of all art forms.
And when we have totally lost both the Western and Jazz, what will be left? Dystopia.