Thursday, December 22, 2011

Hitch Slapped

I am deeply saddened to learn of the passing of journalist and author Christopher Hitchens. This insightful man had the ability to both inspire and anger me within the same article or lecture. Whether he was defending the Iraq War, explaining why women are not funny, or condemning religion, Hitchens pushed the conversation forward.[1] A former socialist who remained Marxist in his thinking, Hitchens was articulate, well-read, and deeply engaged with the modern world. Writing for left-leaning publications such as The Nation and Slate as well as for the largely conservative Hoover Institution, Hitchens was not afraid to cross ideological boundaries. His tirades against religion (including my own) brought down the wrath of the so-called Religious Right, while his advocacy for intervention in Iraq baffled many on the Left.[2]

I first came across Christopher Hitchens soon after my mission in 2007. As my wife can tell you, I had little interest in politics at the time. I was still in a kind of missionary mode, mainly reading gospel-related academic and apologetic material from organizations like FARMS and FAIR, with brief interludes for the news. Though I cannot recall whether it was a video or an article of his that I came across first, I remember being struck by how articulate and intelligent the man was. Out of curiosity, I began to browse Mr. Hitchens' work and found myself challenged and impressed by many of his columns, debates, and interviews. A man-crush was born.

Of course, Hitchens was not immune to sloppy thinking. For example, his attacks on religion are at their best when aimed at modern religious and political extremism, but stumble greatly when judged within the context of history or philosophy. His importance in my own thinking, however, does not stem from anything in particular he wrote or lectured about. Instead, it comes from what is manifested in his writing and persona: a hunger for knowledge, intellectual honesty, and a deep moral concern for real human beings.[3] Drawing on lessons from history, literature, recent events, and personal experience, Hitchens was a formidable public intellectual. Not a specialist by any means, but a well-informed, reasonable individual.

In other words, he was a Jerry Cantrell intellectual.

Cantrell, the guitarist and co-vocalist of Alice in Chains, had a similar influence on me in the realm of guitar playing. While my interest in playing was spawned by my love of pop-punk/indie bands such as Blink 182, Green Day, Jimmy Eat World, and the Ataris, I was soon taken in by the speed of metal, the groove of blues, and the epic, versatile sound of classic rock.[4] Unfortunately, I was too slow to keep up with Kirk Hammett, Dave Mustaine, or Dimebag Darrell; too stiff to match the feel of Stevie Ray Vaughan, Robben Ford, or Jimi Hendrix; and too limited in style to create the sound of Jimmy Page, David Gilmour, or Brian May. However, Jerry Cantrell (mainly in the form of Alice in Chains) provided a blues-based, melodic metal I could rock out to. More importantly, he provided a type of playing that seemed achievable: not because his playing was sub-par, but because it evidenced a moderate partaking of the best rock music had to offer. Cantrell was not a shredder, a blues master, or a progressive rock composer (he still isn't). But he was and is a fine guitar player, lyricist, and all-around musician. He instilled in me confidence and inspiration in my first few years of playing and remains influential even today. Likewise, Hitchens was not an economist, scientist, or historian. But he was a fine writer, thinker, and debater. More importantly, he made me take a closer look at virtually everything.




Though you may not have wanted our prayers, I will nonetheless say this: God bless you, Christopher Hitchens. I hope you are finding more of the happiness, truth, beauty, and wisdom you always sought.[5]



1. See his books A Long Short War: The Postponed Liberation of Iraq (New York: Plume, 2003) and God Is Not Great: How Religion Poisons Everything (New York: Twelve, 2007), along with his article "Why Women Aren't Funny," Vanity Fair (January 2007).

2. See Daniel Peterson's review of Hitchens' attacks on Mormonism and religion in general in "Editor's Introduction: God and Mr. Hitchens," FARMS Review 19:2 (2007). For Hitchens' post-invasion views on Iraq, see his "So, Mr. Hitchens, Weren't You Wrong About Iraq?" Slate (March 19, 2007).

3. Catholic philosopher Edward Feser writes, "Of the four horsemen of the New Atheism, Hitchens was the only one I found likable, and the only one possessed of a modicum of wisdom about the human condition, or at least as much wisdom about the human condition as one can have while remaining essentially a man of the Left. While there was rather too obviously something of the champagne socialist about him, I do not doubt that he had real concern for real human beings -- rather than merely for grotesque abstractions like “the working class” or “humanity” -- and that he showed real moral and even physical courage in defense of what he sincerely took to be the best interests of real human beings." This concern can be found in his personal reasons for defending the Iraq invasion, which often stressed the genocides under Hussein rather than the potential nuclear danger (though Hitchens certainly did not ignore the latter factor). For more on the genocidal nature and WMD potential of Iraq, see the following: Daniel Henninger, "If Saddam Had Stayed," The Wall Street Journal (Sept. 2, 2010); Susan Sachs, "A Grim Graveyard Window on Hussein's Iraq," The New York Times (June 1, 2003); John F. Burns, "Uncovering Iraq's Horrors in Desert Graves," The New York Times (June 5, 2006); Burns, "How Many People Has Hussein Killed?" The New York Times (Jan. 26, 2003); David E. Sanger, Thom Shanker "A Region Inflamed: Weapons; For the Iraqis, a Missile Deal That Went Sour; Files Tell of Talks With North Korea," The New York Times (Dec. 1, 2003); Sanger, "After the War: Weapons Programs; Iraqi Says Hussein Planned to Revive the Nuclear Program Dismantled in 1991," The New York Times (June 27, 2003); Jim Lacey, "Saddam: What We Now Know," National Review Online (Sept. 14, 2011); U.S. Agency for International Development, "Iraq's Legacy of Terror: Mass Graves": http://www.globalsecurity.org/intell/library/reports/2004/040317_iraq_mass_graves.pdf.

4. A nod to my brother-in-law Juan for being a major player in this.

5. See Hitchens' moving final piece in Vanity Fair, "Trial of the Will," along with Mark Judge's commentary. New York Times columnist Ross Douthat weighs in on the Christian attraction to Hitchens.

Friday, November 25, 2011

Happy Thanksgiving

A couple of nights ago at work, I had a discussion with a co-worker who lost his son this past year, only to discover a few months later that his wife had an aggressive form of breast cancer. As I asked him about his holiday plans, he explained that scheduling conflicts meant his wife would go through her chemo treatment the day before Thanksgiving, without sufficient time to recover before the holiday. The holiday would largely consist of him taking care of his recovering, miserable spouse. He explained that the worst part was watching. Physical pain and misery were something he did not mind enduring himself. But having to watch a loved one suffer was something else entirely. I shook this good man's hand, promised him my thoughts, prayers, and support, and slowly walked back across the trailer yard. As I did, I could not help the rush of tears that came as I thought about the struggles of my friend and his family. I had to regain my composure and wipe my eyes as I made it back to the loading dock.

One may say that such interactions should make us grateful for all the things we have. To some extent, this seems to me rather obscene. It is as if one were to take the attitude "better him than me." Suffering is part of fallen nature, yet it appears to be the most incomprehensible aspect of a world supposedly created and watched over by a loving God.[1] To be grateful that others are suffering rather than yourself is immoral in every sense of the word.



However, I highly doubt this is what is meant by the phrase "count your blessings." Gratitude is an emotion and attitude toward God, mankind, and life as a whole. It is a deep sense of appreciation for the very experience of life and those in it; an outlook bred of genuine humility and awe. Numerous studies demonstrate the power of positive emotions. Negative emotions tend to restrict and narrow focus and thinking, while positive emotions broaden one's creative horizon. Positive emotions can also undo the effects of negative emotions, including by increasing the ability to cope with trauma. Studies indicate that as one's coping ability increases, so do positive emotions. "These findings suggest that, over time, positive emotions and broad-minded coping mutually build on one another, leading to improved coping skills and triggering an upward spiral toward enhanced emotional well-being."[2] Another study found that practiced gratitude led to "more progress on [participants'] goals, fewer physical complaints, more frequent physical exercise, more optimism, and higher overall well-being. So, feeling the pleasant emotion of gratitude in the short run led to more optimal functioning and emotional well-being in the long run."[3] This upward spiral can extend to groups and organizations because witnessing moral behavior (e.g., helpfulness and gratitude) elevates and inspires others to become more helpful and gracious. Giving and receiving gifts, along with the associated gratitude, is "the moral memory of mankind. By mutual giving, people become tied to each other by a web of feelings of gratitude. Gratitude is the motive that moves us to give in return, and thus creates the reciprocity of service and counterservice."[4]

An interesting midrashic telling of Moses' story notes that in the description of the first 3 of the 10 plagues in Exodus - the blood, frogs, and lice - it was Aaron rather than Moses who struck the Nile River and the sand, the sources of these plagues. Why so? Because the Nile...had protected [Moses] from Pharaoh's decree that all male Israelite infants be drowned at birth. Similarly, the sand - which had concealed the body of the Egyptian taskmaster Moses had killed...had saved Moses from Pharaoh's wrath and from prosecution and death. In gratitude to the Nile and to the sand, Moses did not want to be the one to smite them with his staff, and Aaron was delegated by God to do so. The moral the rabbis were conveying is that if one has to show gratitude even to inanimate objects, how much more must we show gratitude to humans who have benefited us?[5]

Gratitude is an essential quality that is too often forgotten and undervalued. This Thanksgiving holiday, reflect on the love that makes the anguish possible, the friends and family that make life worth living, and the unique feeling of what it is to even exist. Be grateful for the blessings all around you by becoming a blessing to all around you.



UPDATE: President Eyring has a new article in the December 2011 Ensign entitled "The Choice to Be Grateful."


1. For reading on the problem of evil and suffering, see Truman G. Madsen, "Human Anguish and Divine Love," Four Essays on Love (Provo, UT: Communications Workshop, 1971); Blake T. Ostler, David L. Paulsen, "Sin, Suffering, and Soul-Making: Joseph Smith on the Problem of Evil," Revelation, Reason, and Faith: Essays in Honor of Truman G. Madsen, eds. Donald W. Parry, Daniel C. Peterson, Stephen D. Ricks (Provo, UT: FARMS, 2002); Loyd Ericson, "'Which Thing I Had Never Supposed': The Problem of Evil and the Problem of Man," Sunstone 159 (June 2010); David B. Hart, "Tsunami and Theodicy," First Things (March 2005).

2. Barbara L. Fredrickson, "Gratitude, Like Other Positive Emotions, Broadens and Builds," The Psychology of Gratitude, eds. Robert Emmons and Michael McCullough (New York: Oxford University Press, 2004), 156.

3. Fredrickson, 2004, 154.

4. Aafke Elisabeth Komter, "Gratitude and Gift Exchange," The Psychology of Gratitude, eds. Emmons and McCullough (New York: Oxford University Press, 2004), 203-204.

5. Solomon Schimmel, "Gratitude in Judaism," The Psychology of Gratitude, eds. Emmons and McCullough (New York: Oxford University Press, 2004), 44-45.

Saturday, November 12, 2011

Hip to Be Square

Last September, I wrote about Kenda Creasy Dean's research published in her book Almost Christian: What the Faith of Our Teenagers Is Telling the American Church (New York: Oxford University Press, 2010).[1] Dean's work was in some ways a sociological introduction to the emerging, American-based Christian culture (particularly of the evangelical flavor). This subject has intrigued me as of late, fueled by my various contacts with campus Christians during my undergraduate studies. The evangelical culture among college students ranged from the likes of Mark Driscoll to that of John Piper (though some would embrace both).

One particular college course featured a fair number of Christian students, many of whom gave presentations explicitly about Christ or Christian life. I was surprised not only by the number of Christian-themed presentations, but by the frankness of them (admirable, if slightly uncomfortable). The body language, articulation, and dress of these fellow Christian students provided an interesting basis for comparison. Many were average in their dress and appearance. However, some embraced a counter-cultural fashion, talked about how their "eyes had been opened" by [insert freshman college course here], and how they were seeking a church that "accepted them for who they are." I've become increasingly aware of the strains of this Christian subculture in my community and specifically at the university. I've witnessed first-hand the mating of university subculture ideas with LDS doctrine in an institute class setting.

This is the very Moralistic Therapeutic Deism described in Dean's book: "No pretense at changing lives; a low commitment, compartmentalized set of attitudes aimed at "meeting my needs" and "making me happy" rather than bending my life into a pattern of love and obedience to God."[2] As Dean recognizes, "Moralistic Therapeutic Deism cannot exist on its own. It requires a host, and American Christianity has proven to be an exceptionally gracious one."[3] The two great commandments have been reduced from "love God" and "love thy neighbor as thy self" to "believe there is a god" and "be nice to people while feeling good about yourself." I was excited to hear Elder Christofferson quote from Dean's work in his excellent Conference talk last April:

"By contrast,” as one author declares, “the God portrayed in both the Hebrew and Christian Scriptures asks, not just for commitment, but for our very lives. The God of the Bible traffics in life and death, not niceness, and calls for sacrificial love, not benign whatever-ism."

I struggled for some time to find the exact term for what I was witnessing at the university and institute, until I discovered that one had already been coined: hipster. What I had been observing was nothing short of what writer and journalist Brett McCracken calls hipster Christianity. In his book Hipster Christianity: When Church and Cool Collide (Grand Rapids, MI: Baker Books, 2010), McCracken takes great care to explain the hipster mentality and the way it has infiltrated the various Christian denominations. The very notion of hip invokes raw individualism, rebellion against the status quo, the maintenance of a "unique" public image, and immersion in present-day consumerism. He describes twelve common types of hipster (for more detail, see his book):
  1. The Natural - hipness just flows naturally.
  2. The Newbie - the clingy, fickle freshman who saw a movie or concert that "changed his/her life" and "opened his/her eyes." 
  3. The Artist - the starving, bohemian artist.
  4. The Academic - the bookish intellectual. 
  5. The Dilettante - a fringe-lover who knows little of the actual fringes he/she embraces (I think "poser" would be an adequate alternative name). 
  6. The Mountain Man - unshaven, nature-loving, agrarian-nostalgic macho man (or possibly woman).
  7. The Shaman Mystic - supposedly in touch with the energies of the universe. 
  8. The Detached Ironic - witty, sarcastic class clown.
  9. The Yuppie - Patrick Bateman without the murderous tendencies.
  10. The Flower Child - born to hippie parents and stands in "solidarity with the poor and, well, everyone except the white bourgeoisie." (pg. 60)
  11. The Expat - traveling, humanitarian types.
  12. The Activist - the annoying protester who "raises awareness" of the "evils" of things like globalization.


A five-year project by the Barna Group found the six major reasons young adults leave church life to be 1) the overprotective environment, 2) shallow church experience, 3) the perceived antagonism toward science, 4) the supposed judgmental attitudes toward sex, 5) the theological exclusiveness of Christianity, and 6) believers' unfriendliness toward doubt.[4] Today's young Christian hipsters dislike the ultra-conservative fundamentalism of Pat Robertson's The 700 Club and instead favor more "liberal" theologians such as N.T. Wright, C.S. Lewis, G.K. Chesterton, and Dietrich Bonhoeffer (I must have hipster tendencies). These hipsters prefer a postmodern approach to Christianity (something Mormonism certainly understands).[5] The Gospel becomes more important than the Church (echoes of Elder Poelman), dialogue replaces argumentation (this fits Joseph Smith's fundamental principles of Mormonism), and actions speak louder than words (this is key to all three Abrahamic religions: Judaism, Christianity, and Islam). These things require a more intellectually satisfying, proactive Christianity (I'm all for moving past the "gospel made easy!" approach we often find in Sunday School classes). Unfortunately, many of these positive attributes are rooted in recent reactions to modernity rather than in any deep spiritual reflection; in political leanings and activism rather than Christian ethics. It is merely the manifestation of a culture obsessed with shock value and "edginess." In other words, it is a fad and, as McCracken wisely notes, "True relevance is not a fad."[6] Narcissism, alienation, rebellion, and a reduction to the visually stimulating are neither appropriate nor sustainable foundations for a modern approach to the gospel, despite being dressed up in flattering terms. "When I asked my undergraduate students to name the characteristics that best described their generation," writes psychologist Jean Twenge, "the two most popular answers were "independent" and "open-minded.""[7] Twenge's research finds that social norms and manners are increasingly cast aside, whether at stop signs or in the classroom.[8]

Big words, big protests, and big egos. These seem like the very products of Moralistic Therapeutic Deism (and the surrounding culture in general, but that is for another post). The sad part, according to Dean, is that the churches are to blame: "Why do teenagers practice Moralistic Therapeutic Deism? Not because they have misunderstood what we have taught them in church. They practice it because it is what we have taught them in church. In fact, American teenagers are barometers of a major theological shift taking place in the United States."[9] Churches have made accidental Christian hipsters of their youth.



Fortunately for Latter-day Saints, Dean's research along with the National Study of Youth and Religion found that LDS kids are less likely to drink, smoke, and engage in risky behavior, while more likely to postpone sex (to age 18 instead of the average 16.5; 13% of Mormon teens identify themselves as not being virgins compared to the average 20%). "Mormon teenagers rank ahead of other youth in terms of spiritual vitality, hope for the future, and overall health and well-being." Dean also finds that Mormon teenagers are more likely than other teenagers to hold religious beliefs similar to their parents, attend weekly religious services, and talk about religious matters in their families. Religious faith is seen as "extremely important" in shaping daily life, demonstrated by the fact that Mormon youths participate in more religious practices than most teenagers and are more articulate about church teachings. Early morning seminary, family home evening, and two-year missions prepare Mormon children for adulthood. All in all, "Mormon teenagers tend to be the "spiritual athletes" of their generation, conditioning for an eternal goal with an intensity that requires sacrifice, discipline, and energy." As NSYR researcher John Bartkowski put it, "The story we tell about Mormon youth is not that all is well, but compared with other teens they're more knowledgeable about their faith, more committed to their faith, and have more positive social outcomes associated with their faith."[10]

Despite these inspiring comments, author and scholar Jana Riess correctly notes in her online review, "One complaint I have with Dean’s book is that she seems to assume that Moralistic Therapeutic Deism doesn’t exist in Mormonism, which it does despite the aforementioned high levels of religiosity." My earlier observation regarding hipster subculture and Mormon youth apparently was not far off. A recent article in The New York Times describes "a young generation of Mormons [that] has adopted a fashion-forward urban aesthetic (geek-chic glasses, designer labels and plenty of vintage) that wouldn’t look out of place at a Bushwick party." A trendy subculture has slowly developed in response to the former "bias against being 'cool' in the Mormon world." While every generation goes through similar stages, the period known as "emerging adulthood" is growing much longer. I worry about how long this desire to be "hip" will last among young Mormon adults and what effects, if any, it may have on the following generation.[11] Instead of "finding ourselves" in come-and-go trends, we need to root ourselves in Christ. "We will never truly be at peace with ourselves, comfortable in our skin, and happy with who we are, outside of the one who created us and calls us into his presence and eternal fulfillment," writes McCracken. "Here--in the service of Christ and with God as the center and core of our being--our identities become more fully realized than we've ever known. If that's not cool, I don't know what is."[12]

As members of the Church, we should always remember that it is hip to be square.   




1. Commenting on the recent Pew Forum findings, I half-jokingly wrote, "I can hear it now: "See! This proves Mormons aren't real Christians! If they were, they'd be scoring down here with the rest of us! Mormons reject Christ just like their atheist and Jewish friends!" Either that or something worse like the Mormons and Masons have infiltrated the Pew Forum." Ironically, another Pew Forum study finds that most non-LDS Christians identify Mormons as Christian. The category "White Evangelical" had the highest percentage (45%) of 'No's when it came to the question, "Are Mormons Christian?" My friend Daniel McClellan has some excellent comments on self-identification and Christianity in his online debate with James White of Alpha and Omega Ministries.

2. Dean, 2010, 30.

3. Ibid.

4. This should not be too alarming. As Rodney Stark and Byron Johnson of Baylor University explained in The Wall Street Journal, "The national news media yawned over the Baylor Survey's findings that the number of American atheists has remained steady at 4% since 1944, and that church membership has reached an all-time high. But when a study by the Barna Research Group claimed that young people under 30 are deserting the church in droves, it made headlines and newscasts across the nation—even though it was a false alarm. Surveys always find that younger people are less likely to attend church, yet this has never resulted in the decline of the churches. It merely reflects the fact that, having left home, many single young adults choose to sleep in on Sunday mornings. Once they marry, though, and especially once they have children, their attendance rates recover. Unfortunately, because the press tends not to publicize this correction, many church leaders continue unnecessarily fretting about regaining the lost young people." In other words, once they grow up a little, these church deserters often return.

5. McGuire's point about Nephi's vision is encapsulated in Terryl Givens, "The Book of Mormon and Dialogic Revelation," Journal of Book of Mormon Studies 10:2 (2001) and further expounded on in chapter 8 of Givens, By the Hand of Mormon: The American Scripture That Launched a New World Religion (New York: Oxford University Press, 2002). For a Mormon philosophical approach to postmodernism, see James E. Faulconer, "The Myth of the Modern; the Anti-Myth of the Postmodern," FARMS Review 20:1 (2008).

6. McCracken, 2010, 234.

7. Jean M. Twenge, Generation Me: Why Today's Young Americans Are More Confident, Assertive, Entitled - and More Miserable Than Ever Before (New York: Free Press, 2006), pg. 24.

8. "In 1979, 29% of people failed to stop at a particular stop sign in a New York suburb, but by 1996 a stunning 97% of drivers did not stop at all...In 2002, 74% of high school students admitted to cheating, up from 61% in 1992. In 1969, only 34% of high school students admitted to cheating, less than half of the 2002 number. This continues into college; a 2002 survey found that 80% of students at Texas A&M University admitted to cheating...Not only are teens more likely to cheat, but they are resigned to cheating among their peers. In a 1997 survey, 88% of high school students said that chearing was common at their school. Three times as many high school stuents in 1969 compared to 1989 said they would report someone they saw cheating. Also in 1989, an incredible 97% of high school students said they had let someone else copy their work. The disregard for rules continues outside the classroom: in 2000, 26% of high school boys admitted they had shoplifted from a store at least once." (Twenge, 2006, 26-27)

9. Dean, 2010, 29.

10. Ibid., 51.

11. See the review of Christian Smith with Patricia Smith, Souls in Transition: The Religious and Spiritual Lives of Emerging Adults (Oxford University Press, 2009) in The Wall Street Journal. This book is on my extensive "to-read" list.

12. McCracken, 2010, 247.

Thursday, November 10, 2011

Rand, Selflessness, and the Silly Undergrad

A relatively recent online debate grabbed my attention when an individual (who shall remain nameless) more-or-less equated the Austrian theory of economics with Ayn Rand and her virtue of selfishness. While Rand's individualism and defense of capitalism certainly make her a fellow traveler among the Austrians, this individual had painted Austrian theory as nothing more than greed-fueled anarchism. Most likely unaware of Rand's breaks with Rothbard's anarchism, Mises' praxeology, and Hayek's grounding of ethics in traditional morality (it was consistently asserted in the debate that Hayek was some kind of anarchist), this critic of conservatism had no problem painting with a broad brush.[1] I explained that I have been critical of Rand's rhetoric regarding selfishness, yet pointed out that she basically redefined the term in an attempt to strip it of its negative baggage (whether wisely or not).



As Rand states in the introduction to her The Virtue of Selfishness,

The title of this book may evoke the kind of question that I hear once in a while: "Why do you use the word 'selfishness' to denote virtuous qualities of character, when that word antagonizes so many people to whom it does not mean the things you mean?" To those who ask it, my answer is, "For the reason that makes you afraid of it." [2]

The use of the word 'selfishness' was largely for shock value, not to mention a product of her extreme disdain for anything collectivist, born of her experience as a youth in Russia.[3] After a fairly lengthy exchange, this individual made a separate post to "educate" me (a silly undergraduate, something I neither am now nor was at the time) on the meaning of altruism in contrast to selfishness. The definition of 'altruism' was provided, along with the notion that to support her form of altruism (i.e. wealth redistribution) was to be caring, moral, and (as her post implied) Christian. She further implied that support of the market system was inhumane, selfish, and spat in the face of Jesus Himself.

Ignoring the multiple problems that presented themselves throughout her barrage of ill-mannered responses, I wanted to address the relationship between selflessness and selfishness. Ayn Rand has had little influence on my worldview, in large part due to her atheism and Objectivism. While I can understand her appeal to market proponents, I have never quite understood the borderline obsession.


However, her comments regarding the "selfish" nature of serving others are interesting. Joseph Smith reportedly told Oliver B. Huntington that "some people entirely denounce the principle of self-aggrandizement as wrong. ‘It is a correct principle,’ [Joseph] said, ‘and may be indulged upon only one rule or plan–and that is to elevate, benefit and bless others first. If you will elevate others, the very work itself will exalt you. Upon no other plan can a man justly and permanently aggrandize himself’.”[4] On this, the late philosopher Truman G. Madsen wrote,

God, taught the Prophet, loves Himself in an inclusive way and hence "everything God does is to aggrandize His kingdom." Such love expands the "self" to include all selves, all life; and God, therefore, cannot be happy except in the happiness of all creatures. Call that "selfish" if you like. But notice that the opposite is a selfishness which seeks something in indifference to or at the expense of others. We are commanded to be selfish as God is. Joseph Smith taught that there is a law (not, if I understand him, of God's making but in the very nature of things) that "upon no other principle can a man permanently and justly aggrandize himself." This is the meaning of the Master's cryptic phrase: "Lose yourself...and find yourself."[5]

Using a version of "The Prisoner's Dilemma" game and fMRI, a team of researchers from Emory University found that activation in the reward-processing regions of the brain (i.e. nucleus accumbens, caudate nucleus, ventromedial frontal/orbitofrontal cortex, rostral anterior cingulate cortex) took place during cooperative situations. This data demonstrates that what is known as altruism is in fact intrinsically rewarding.[6] Related results were found in another study, which provided participants the choice of either collecting a maximum of $128 or donating to a variety of charities. Scans during the process revealed that "the midbrain ventral tegmental area (VTA), the dorsal striatum, and the ventral striatum were activated by both pure monetary rewards and decisions to donate..., suggesting that donating to societal causes and earning money share anatomical systems of reward reinforcement expectancy...This finding is compatible with the putative role of the "warm glow" ("joy of giving") effect, the rewarding experience associated with anonymous donations."[7] The fronto-limbic activity is connected to "more basic social and motivational mechanisms" stimulated by such things as "food, sex, drugs, and money."[8] Even without any evidence of direct material or reputation gains or reciprocity, charity is neurologically rewarding.

Author and neuroscientist Sam Harris defines morality as that which produces the well-being of conscious creatures. Drawing on studies of moral cognition, he recognizes the existence of a "reward component of genuine altruism (often called the "warm glow" associated with cooperation)" and that "we know from neuroimaging studies that cooperation is associated with heightened activity in the brain's reward regions." From this evidence, Harris concludes, "Here...the traditional opposition between selfish and selfless motivation seems to break down. If helping others can be rewarding, rather than merely painful, it should be thought of as serving the self in another mode."[9]

It is perhaps worth noting that research conducted by Arthur C. Brooks of Syracuse University (now president of the American Enterprise Institute) has shown that those in favor of free enterprise and less government donate four times as much money as redistributionists (even when controlling for income), give more blood, and volunteer more hours.[10] Not only is free enterprise statistically linked with charity, but charity is statistically linked with reported happiness. When controlling for income, education, age, race, gender, religion, and children, "conservatives are, on average, 7.5 percentage points more likely than liberals to say they are very happy."[11]

With all the morally superior sneering that takes place on my debate opponent's wall, I wonder how she feels about being neurologically selfish in her altruistic pursuits. On top of that, I wonder if she cares that the ideas she advocates harm not only those she intends to help, but also her own happiness and well-being.



1. The principles behind policies are often more important than the policies themselves. In other words, just because Rand and other market-oriented voices came to similar conclusions does not mean that they hold the same principles for doing so. In his testimony favoring Robert Bork's 1987 Supreme Court nomination, Thomas Sowell explained how principles behind policies take on a life of their own. For further reading on Rand's relationship with economists of the Austrian theory (and her life and politics in general), see The Journal of Ayn Rand Studies 6:2 (Spring 2005); Jennifer Burns, Goddess of the Market: Ayn Rand and the American Right (New York: Oxford University Press, 2009). For an overview, see Reason TV's interview with historian and author Jennifer Burns.

2. Ayn Rand, The Virtue of Selfishness: A New Concept of Egoism (New York: Signet, 1964 [1961]), 5.

3. "It was a wintry day in 1918 when the Red Guard pounded on the door of Zinovy Rosenbaum's chemistry shop. The guards bore a seal of the State of Russia, which they nailed upon the door, signaling that it had been seized in the name of the people. Zinovy could at least be thankful the mad whirl of revolution had taken only his property, not his life. Alisa [Ayn], twelve at the time, burned with indignation. The shop was her father's; he had worked for it, studied long hours at university, dispensed valued advice and medicines to his customers. Now in an instant it was gone, taken to benefit nameless, faceless peasants, strangers who could offer her father nothing in return. The soldiers had come in boots, carrying guns, making clear that resistance would mean death. Yet they had spoken the language of fairness and equality, their goal to build a better society for all. Watching, listening, absorbing, Alisa knew one thing for certain: those who invoked such lofty ideals were not to be trusted. Talk about helping others was only a thin cover for force and power. It was a lesson she would never forget." (Burns, 2009, 9)

4. Quote and reference provided in this excellent post at Life on Gold Plates.

5. Truman G. Madsen, "Joseph Smith and the Sources of Love" in his Four Essays on Love (Provo, UT: Communications Workshop, 1971), 13-14. To clarify, I by no means am attempting to equate the philosophy of Ayn Rand with that of Joseph Smith.

6. James K. Rilling, David Gutman, Thorsten Zeh, Giuseppe Pagnoni, Gregory Berns, Clint Kilts, "A Neural Basis for Social Cooperation," Neuron 35 (2002).

7. Jorge Moll, Frank Krueger, Roland Zahn, Matteo Pardini, Ricardo de Oliveira-Souza, Jordan Grafman, "Human Fronto-Mesolimbic Networks Guide Decisions About Charitable Donations," Proceedings of the National Academy of Sciences 103:42 (2006): 15624.

8. Moll et al., 2006: 15625.

9. Sam Harris, The Moral Landscape: How Science Can Determine Human Values (New York: Free Press, 2010), 91-92.

10. Arthur Brooks, "Tea Partiers and the Spirit of Giving," The Wall Street Journal (Dec. 24, 2010). For a book length treatment of this subject, see his Who Really Cares: The Surprising Truth About Compassionate Conservatism - America's Charity Divide: Who Gives, Who Doesn't, and Why It Matters (New York: Basic Books, 2006).

11. Brooks, 2006, 110.

Monday, October 31, 2011

This Is Halloween



Prior to the rise of what Science 2.0's Hank Campbell calls today's "torture porn," what we now know as "horror films" were largely dissociated from Halloween (1931's Dracula was released on Valentine's Day). While Orson Welles' October 1938 broadcast of War of the Worlds provided the first inklings of the marriage between Halloween and Hollywood horror, it was not until John Carpenter's Halloween (1978) that the two were officially wed.

The Celtic festivities of Samhain (mentioned in the Halloween sequels) had more to do with agriculture and the changing of seasons than with the art of scaring. Nonetheless, the sense of the supernatural was heightened by the belief in spirits brought on by the oncoming winter (the season being associated with death and decay). These spirits were possibly kept at bay with the practice of animal or even human sacrifice (Julius Caesar wrote of the Druids' use of a wicker man), though this is difficult to prove.

Despite these pagan roots, the most recognizable practices derive from the medieval Christian holy days of All Souls' and All Saints' Day. For example, the ritual of "souling" involved the baking of cakes to be distributed to relatives and the poor in return for prayers for the souls in purgatory. Many would go from door to door requesting food in exchange for prayers for the dead. This house-to-house activity included the carrying of a hollowed-out turnip, which represented a soul trapped in purgatory.

The Protestant Reformation helped rid Halloween of its more Catholic elements, focusing instead on the marriage prospects of adolescents rather than on those trapped in purgatory. Courting and divination practices linked to future marriages became the custom of the day. Between its changing contexts, Halloween was often a night filled with pranks and the undermining of social norms. As these disturbances became less tolerated in the early 20th century, Halloween evolved into a more familial holiday. After surviving the overblown "razor-in-the-apple" scares, the real threat of the Great Society, and the Hollywood gore-fest, the holiday continues to be a night of overturning social norms in a variety of ways (including dressing like a total slut).[1]



Still, Halloween continues its relationship with the spooky and the supernatural, inspiring numerous Halloween specials on various TV stations. As far as I'm concerned, if your Halloween night does not consist of murderous preachers, showers with schizophrenics, old-fashioned haunted houses, real-life carnies, the devil's baby shower, possessed hotel caretakers, or all of the above, then you are not doing it right.[2]


1. For a detailed treatment of Halloween's evolution and prominence in North American culture, see Nicholas Rogers, Halloween: From Pagan Ritual to Party Night (New York: Oxford University Press, 2002). Another interesting study on Halloween consumerism can be found here.

2. I admit to not doing it right. Since I will be unable to celebrate Halloween in any recognizable way due to work, I decided to read the above academic material instead.

Tuesday, October 4, 2011

Paul and the Merkabah

Paul's vision on the road to Damascus has often puzzled me. The standard telling of the story consists of a devout Pharisee, a persecutor of the Christians, who is converted through a vision of the resurrected Jesus Christ. The question that always accompanied my reading of Acts 9 concerned the catalyst of Paul's vision. My previous assessment drew comparisons to the experience of Alma the Younger: an angelic appearance or theophany brought about by the prayers and suffering of others. While this may very well be the case with Paul, I am convinced that the vision rests comfortably within the context of merkabah mysticism, a tradition rooted in Ezekiel's vision of the anthropomorphic God upon His chariot-throne.[1] As Rice University's April DeConick explains,

The centerpiece of this [priestly] cosmology is the belief that God has a "body," called the "Glory" or Kavod of YHWH. This idea grew out of the study of certain Jewish scriptures, particularly sections of Ezekiel that describe his visions of an enthroned "likeness as the appearance of a Man ('adam)," a Man who looked like "fire" with "brightness around him." This is "the appearance of the likeness of the Glory (kavod) of YHWH" (Ezek 1:28). This figure is the very manifestation of the hidden YHWH, depicted in the scriptures as an anthropomorphic figure of fire or light (see Ezek 1:27-28; 8:2; Isa 6:1-4). He presides over the created order, oftentimes seated upon his merkabah, a special throne consisting of two cherubim with wings spread over the kapporet, the lid of the ark of the covenant in the temple.[2]



Christopher Rowland, Patricia Gibbons, and Vicente Dobroruka describe how later visionaries "saw again" what Ezekiel saw:

In some forms of the interpretation of Ezek 1 the meaning of the text may have come about as the result of "seeing again" what Ezekiel saw. The visionary's own experience of what had appeared to Ezekiel becomes itself the context for a creative interpretation of the text...In some circles this led to renewed visionary experience as expounders saw again what had appeared to the prophet, but in their own way and appropriate for their own time.[3]

This not only demonstrates the power and importance of prayer in receiving revelation, but also the power and importance of the scriptures.[4]


1. See William Hamblin's lecture on the merkabah tradition in Ezekiel and its connection to the temple at David Larsen's blog.

2. April D. DeConick, "What Is Early Jewish and Christian Mysticism?" in Paradise Now: Essays on Early Jewish and Christian Mysticism, ed. April D. DeConick (Atlanta, GA: SBL, 2006), 11-12.

3. Christopher Rowland, with Patricia Gibbons and Vicente Dobroruka, "Visionary Experience in Ancient Judaism and Christianity," in Paradise Now: Essays on Early Jewish and Christian Mysticism, ed. April D. DeConick (Atlanta, GA: SBL, 2006), 56.

4. For more on Paul and his conversion, see Alan F. Segal, Paul the Convert: The Apostolate and Apostasy of Saul the Pharisee (New Haven, CT: Yale University Press, 1990).

Welfare Principles and the Welfare State

There are many good people and organizations in the world that are trying to meet the pressing needs of the poor and needy everywhere. We are grateful for this, but the Lord's way of caring for the needy is different from the world's way...He is not interested only in our immediate needs. He is concerned about our eternal progression. For this reason, the Lord's way has always included self-reliance and service to our neighbor in addition to care for the poor.

- President Dieter F. Uchtdorf, General Conference October 2011

These powerful words from President Uchtdorf's important and timely talk echo the thoughts of President Marion G. Romney from years ago:

Many programs have been set up by well-meaning individuals to aid those who are in need. However, many of these programs are designed with the shortsighted objective of “helping people,” as opposed to “helping people help themselves.” Our efforts must always be directed toward making able-bodied people self-reliant...The practice of coveting and receiving unearned benefits has now become so fixed in our society that even men of wealth, possessing the means to produce more wealth, are expecting the government to guarantee them a profit. Elections often turn on what the candidates promise to do for voters from government funds. This practice, if universally accepted and implemented in any society, will make slaves of its citizens. We cannot afford to become wards of the government, even if we have a legal right to do so. It requires too great a sacrifice of self-respect and political, temporal, and spiritual independence.



Commenting on the recent riots in Britain, former prison doctor and psychiatrist Theodore Dalrymple writes,

The riots are the apotheosis of the welfare state and popular culture in their British form. A population thinks (because it has often been told so by intellectuals and the political class) that it is entitled to a high standard of consumption, irrespective of its personal efforts; and therefore it regards the fact that it does not receive that high standard, by comparison with the rest of society, as a sign of injustice. It believes itself deprived (because it has often been told so by intellectuals and the political class), even though each member of it has received an education costing $80,000, toward which neither he nor—quite likely—any member of his family has made much of a contribution; indeed, he may well have lived his entire life at others’ expense, such that every mouthful of food he has ever eaten, every shirt he has ever worn, every television he has ever watched, has been provided by others. Even if he were to recognize this, he would not be grateful, for dependency does not promote gratitude. On the contrary, he would simply feel that the subventions were not sufficient to allow him to live as he would have liked.

This is quite different from the continued coddling by many who see the riots as a desperate act of the poor and downtrodden. Despite the maddening psychobabble of many, actual psychologists have recognized the narcissistic tendencies of the Entitlement Generation for some time. "Many young people also display entitlement," notes psychologist Jean Twenge, "a facet of narcissism that involves believing that you deserve and are entitled to more than others...Several studies have found that narcissists lash out aggressively when they are insulted or rejected."[1] This coincides with another researcher's finding that the perception of low status elicits misery, leading to animosity and aggression. Some are under no illusions as to where this perception of a deserved higher status came from. "The entitlement mindset didn’t come from nowhere," writes one columnist. "It came from us. It came from a generation of adults who believed that kids should never be allowed to fail, or told the truth about their abilities, or learn that getting what you want is sometimes hard." As The Australian reports, "It may seem compassionate to give people money, but passive welfare over the long term is a disaster for the recipient's self-respect, motivation, general morale and ultimately their sanity." Thus, by all accounts, "the European model right now is a wretched failure." Drawing comparisons to the dystopian novel and film A Clockwork Orange, one writer views the riots as "[w]hat happens when you teach people that profits are theft, that inequality of outcome is injustice, and that it is a basic human right for every citizen to have "access" to all the consumer goods their eyes behold[.]"

The disgraceful attempt of many to justify or overlook the inexcusable actions of these rioters is rich with as much cognitive dissonance as sheer hypocrisy. It is the same nonsense trumpeted by the media, politicians, and public intellectuals during the riots of the 1960s. Chanting "Burn, Baby, Burn," rioters looted and set stores ablaze. These riots were characterized as "'uprisings' against poverty and white racism."[2] Instead of stealing necessary goods (the deprivation of such being the very definition of "poverty"), rioters targeted things such as liquor, cigarettes, and drugs. Dry-cleaned clothes and pawn shop items were also looted, despite the fact that these were the property of black community residents. Though the militant interpretation claimed this to be a response to racism, African American residents took a largely negative view of the situation. Most arrested looters admitted nothing more than a personal desire for material gain. Small businesses, which employed a large number of blacks, were the main targets of looting and arson. Black-owned businesses fared no better, even with signs reading "Soul Brother" or "Very, Very, Very, Very Black." The 1992 Los Angeles riot following the Rodney King incident left multiple Korean-owned, African American-owned, and Hispanic-owned businesses in ruin. This supposed protest against "racial injustice" created a kind of riot ideology.[3] In our modern context, the injustice has moved from racial to economic and social. "[M]uch of the furor is because poverty is now seen as a relative, not an absolute, condition," writes historian Victor Davis Hanson. "Per capita GDP is $47,000 in the U.S. and $35,000 in Britain. In contrast, those rioting in impoverished Syria (where per capita GDP is about $5,000) or Egypt (about $6,000) worry about going to bed hungry or being shot for expressing their views — not about wanting a new BlackBerry or a pair of Nikes. Inequality, not Tiny Tim–like poverty, is the new Western looter’s complaint." The problems stem not only from government policy but also from a cultural paradigm shift in regard to morality.[4]

This is what happens when individuals leave behind the welfare principles of the gospel with the support and encouragement of their government leaders.


1. Jean M. Twenge, Generation Me: Why Today's Young Americans Are More Confident, Assertive, Entitled - and More Miserable Than Ever Before (New York: Free Press, 2006), 70-71 (italics mine).

2. Jonathan J. Bean, "'Burn, Baby, Burn!': Small Business in the Urban Riots of the 1960s," The Independent Review 5:2 (Fall 2000): 165.

3. See Bean, 2000 in its entirety.

4. The following selection addresses this morality shift as well as the need for traditional values and economic freedom: Charles Murray, "Europe Syndrome," The Wall Street Journal (March 25, 2009); Jonathan Sacks, "Reversing the Decay of London Undone," The Wall Street Journal (Aug. 20, 2011); Sacks, "Markets and Morals," First Things (Aug/Sept 2000); Edward Feser, "Hayek on Tradition," Journal of Libertarian Studies 17:1 (Winter 2003).

Sunday, September 25, 2011

Bought and Paid For

The following from Harvard law professor and Massachusetts Senate candidate Elizabeth Warren has become a rather popular "Like" option on Facebook and elsewhere:



Of course, there is much that could be and has been said about Warren's comments.[1] However, I want to focus on the following:

You were safe in your factory because of police forces and fire forces that the rest of us paid for. You didn't have to worry that marauding bands would come and seize everything in your factory and hire someone to protect against this because of the work the rest of us did.

The implication appears to be that anarchy would unfold and marauding bandits would become the norm if government-run police forces were not intact. Of course, it may be desirable for governments to provide such services, but these can be handled at a small-scale and/or local level. Moreover, historical scholarship over the last few decades suggests that the need for state-sponsored enforcement agencies may be less than we think. Contrary to Western folklore, some historians argue that the "wild" West, despite its lack of government institutions, was actually less violent than today's urban centers. For example, according to historian Thomas Woods, Dodge City had only five killings in 1878, the highest annual total in its frontier history. Five of the major cattle towns (Abilene, Caldwell, Dodge City, Ellsworth, and Wichita) produced only 45 reported homicides from 1870 to 1885. As for "marauding," fewer than a dozen bank robberies took place in the entire frontier West from 1859 to 1900. Overall, crime (including burglary, robbery, and rape) might have been lower in the less regulated Old West than in our modern age.[2] Even West Coast gold miners were capable of establishing lawful mechanisms without the help of a centralized government:

This outcome is all the more remarkable when we recall some of the details of miners' lives. These were men of vastly different backgrounds, who were complete strangers (and thus possessed no preexisting community camaraderie upon which to build) and who intended not to put down roots and stay for years but simply to get rich from their gold finds and return home...The miners settled disputes either through a district-wide meeting or by an elected jury or alcalde. The alcalde kept his position only as long as the miners accepted his rulings as just. They replaced those whose judgments did not conform to generally accepted standards of justice. Crime was also notably low in the districts, a fact attributed to widespread gun ownership among the miners as well as to the efficient nature of the miners' legal system.[3]

This feat was accomplished through "private, voluntary mechanisms." These mechanisms "successfully carried out the very functions of which the private sector is routinely assumed to be incapable: defining and enforcing property rights, adjudicating disputes, and protecting people against all manner of crimes."[4] In other words, the exaggerations of the writer in Clint Eastwood's Unforgiven are not far off.

From another perspective, law and order arises even among those who could be called "marauders." Economist Peter Leeson of George Mason University has shown the incredible organization and cooperation amongst 17th and 18th century pirates. As science writer Michael Shermer explains in his review of Leeson's work,

Pirate societies, in fact, provide evidence for [Adam] Smith’s theory that economies are the result of bottom-up spontaneous self-organized order that naturally arises from social interactions, as opposed to top-down bureaucratic design. Just as historians have demonstrated that the “Wild West” of 19th-century America was a relatively ordered society in which ranchers, farmers and miners concocted their own rules and institutions for conflict resolution way before the long arm of federal law reached them, Leeson shows how pirate communities democratically elected their captains and constructed constitutions...From where, then, did the myth of piratical lawlessness and anarchy arise? From the pirates themselves, who helped to perpetrate the myth to minimize losses and maximize profits.

The real question, however, regards the efficiency of pirate organizations. How did these "lawless" ships compare to other government-sanctioned ships such as merchant ships or privateers? Quite well, actually. Internal conflict was largely avoided due to the fact that adherence to particular rules determined the success of the crew. Voluntarily created order led to an efficient enterprise. Ironically, "pirates were more orderly, peaceful, and well organized among themselves than many of the colonies, merchant ships, or vessels of the Royal Navy."[5] "Pirate democracy," Leeson writes, "ensured that pirates got precisely the kind of captain they desired. Because pirates could popularly depose any captain who did not suit them and elect another in his place, pirate captains' ability to prey on crew members was greatly constrained compared to that of merchant ship captains. Similarly, because pirates were both principals and agents of their ships, they could divide authority on their vessel to further check captains' ability to abuse crew members without loss."[6] Leeson calls this "piratical checks and balances." This "democratic or self-governing vessel organization...facilitated crew cooperation at least as successfully as autocratic vessel organization, and probably more so."[7]



The concept of spontaneous order may be unintelligible to Ms. Warren and the like, but that does not mean it has not been or cannot be done.[8]



1. Aaron Ross Powell of the Cato Institute has an excellent response to Warren's "fair play" philosophy, while National Review's Rich Lowry emphasizes the importance of individual drive and innovation. Robert P. Murphy and George Will also weigh in. As I've pointed out before, there is ample evidence that low taxation actually increases economic growth to the benefit of everyone. If that does not fulfill Warren's form of social contract, I'm not sure what would.

2. See Thomas E. Woods, Jr., 33 Questions About American History You're Not Supposed to Ask (New York: Three Rivers Press, 2007), Ch. 6: "Was the 'Wild West' Really So Wild?" Update: More sophisticated models of 19th-century homicide provide strong evidence that, in per capita terms, these towns were extremely homicidal. See Randolph Roth, Michael D. Maltz, Douglas L. Eckberg, "Homicide Rates in the Old West," Western Historical Quarterly 42 (Summer 2011): 173-196. Law professor Adam Winkler also notes that strict gun laws were regularly enforced in the Old West. This makes the issue far more complex than stated in the post.

3. Ibid., 50-51. Civil disputes in medieval England were also handled in a voluntary manner prior to the creation of centralized law enforcement. Kings viewed the court system as a potential source of revenue: rather than providing full restitution to the victim of the crime, they introduced government fines and punishments. See Nicholas A. Curott, Edward P. Stringham, "The Rise of Government Law Enforcement in England," in The Pursuit of Justice: Law and Economics of Legal Institutions, ed. Edward J. Lopez (New York: Palgrave Macmillan, 2010). Robert P. Murphy argues for a modern version of stateless law.

4. Ibid., 51.

5. Peter T. Leeson, "An-arrgh-chy: The Law and Economics of Pirate Organization," Journal of Political Economy 115:6 (2007), 1076.

6. Leeson, 2007: 1065.

7. Leeson, 2007: 1087.

8. For more on intellectuals and their biases, see Thomas Sowell, Intellectuals and Society (New York: Basic Books, 2009); F.A. Hayek, "The Intellectuals and Socialism," The University of Chicago Law Review (Spring 1949).

Sunday, September 11, 2011

Ten Years Gone: The Secular and the Sacred

*Earlier this year, the American Atheists sued "over the inclusion of cross-shaped steel beams, dubbed the 'World Trade Center Cross,' in the exhibit at the National September 11th Memorial and Museum." The given reason was that the beams' inclusion "promotes Christianity over all other religions on public property and diminishes the civil rights of non-Christians." I find such reasoning difficult to swallow. This seems to be rooted in the tiresome debate over whether or not America is a "Christian nation" (a problematic phrase to begin with)[1] as well as the tendency for some atheists to overreact to anything that comes remotely close to mentioning Jesus.[2]



With today being the 10th anniversary of the 2001 terrorist attacks, I have reflected on the uproar over a religious symbol at Ground Zero. Of course, there are those who have tried to distinguish between the religious and secular expressions of human nature. For example, in his amusing yet shallow book God Is Not Great: How Religion Poisons Everything, author Christopher Hitchens admits, "We are not immune to the lure of wonder and mystery and awe: we have music and art and literature, and find that serious ethical dilemmas are better handled by Shakespeare and Tolstoy and Schiller and Dostoyevsky and George Eliot than in the mythical morality tales of the holy books. Literature, not scripture, sustains the mind and - since there is no other metaphor - the soul."[3] He suggests, "The loss of faith can be compensated by the newer and finer wonders that we have before us, as well as by the immersion in the near-miraculous work of Homer and Shakespeare and Milton and Tolstoy and Proust, all of which was also 'man-made' (though one sometimes wonders, as in the case of Mozart)."[4] Obviously, Mr. Hitchens misses the irony in his statement, an irony pointed out brilliantly by Daniel Peterson:

[Without religion], we would be without Bach's "St. Matthew Passion," Schubert's "Mass in G," Mozart's "Requiem," Vivaldi's "Gloria," Wagner's "Parzifal" and Handel's "Messiah." We'd have neither the musical compositions of John Tavener and Arvo Part nor the choral music of John Rutter. (For that matter, there wouldn't be many choirs.) Nor would we have gospel music. Dante's "Divine Comedy"? Erased. Likewise, Milton's "Paradise Lost," Chaucer's "Canterbury Tales" and Goethe's "Faust" would be gone, as would the Arthurian legends...We couldn't read Shusako Endo's "Silence," most poems of T.S. Eliot, the novels of G.K. Chesterton and Dostoevsky, or the writings of C.S. Lewis. There would be no Augustine, no Aquinas, no Kierkegaard. "Les Misérables" would make no sense. Lincoln's majestic "Second Inaugural Address" would be unthinkable.[5]

September 11, 2001 was one of many dates that reminded us of the fallen nature of humankind and the fragility of life. It also reminded us of the need for redemption. The controversy over the World Trade Center Cross confirms my suspicion that some Americans have forgotten the necessity and power of the sacred myth. Catholic theologian Tim Muldoon recently wrote,

In studying the Western cultural tradition, it is clear to me that good societies, those comprised of people with a shared sense of purpose that spills over into care for their fellow citizens, are those built around a shared myth. In ancient Greece, there was a myth of the hero; in classical Greece, of virtue; in ancient Israel, of divine protection; in Christian Europe, of Christ the King. By "myth" here I do not mean a false story, nor do I suggest that all myths are equal. Instead I mean a story that gives shape to a culture, gives its efforts meaning, gives its people a sense of what they strive for as a community.

Despite Hitchens's praise, art more or less failed us in the wake of 9/11, particularly the film industry.[6] Can the symbol of the cross provide any comfort or reconciliation for the hardened secularist? I believe it can. As Oxford professor and The Lord of the Rings author J.R.R. Tolkien stated in his famous essay "On Fairy-Stories,"

The Gospels contain a fairy-story, or a story of a larger kind which embraces all the essence of fairy-stories. They contain many marvels—peculiarly artistic, beautiful, and moving: “mythical” in their perfect, self-contained significance; and among the marvels is the greatest and most complete conceivable eucatastrophe [i.e. "The Consolation of the Happy Ending"]. But this story has entered History and the primary world; the desire and aspiration of sub-creation has been raised to the fulfillment of Creation. The Birth of Christ is the eucatastrophe of Man's history. The Resurrection is the eucatastrophe of the story of the Incarnation. This story begins and ends in joy. It has pre-eminently the “inner consistency of reality.” There is no tale ever told that men would rather find was true, and none which so many sceptical men have accepted as true on its own merits. For the Art of it has the supremely convincing tone of Primary Art, that is, of Creation. To reject it leads to sadness or to wrath. It is not difficult to imagine the peculiar excitement and joy that one would feel, if any specially beautiful fairy-story were found to be “primarily” true, its narrative to be history, without thereby necessarily losing the mythical or allegorical significance that it had possessed...The joy would have exactly the same quality, if not the same degree, as the joy which the “turn” in a fairy-story gives: such joy has the very taste of primary truth...It looks forward (or backward: the direction in this regard is unimportant) to the Great Eucatastrophe. The Christian joy, the Gloria, is of the same kind; but it is preeminently (infinitely, if our capacity were not finite) high and joyous. But this story is supreme; and it is true. Art has been verified. God is the Lord, of angels, and of men—and of elves. Legend and History have met and fused.

The story of atonement, resurrection, redemption, and eventual justice is one that transcends cultures, nations, and ideologies. It is a reminder of the good that also paradoxically exists with the evil in this world. It is a reminder that human nature has what we could refer to as divine qualities and potential, despite its inherent weaknesses. Finally, it is a reminder that hope can triumph over current circumstances, allowing society to recover a normal balance once more.

The difference between the hope found in the Gospels and the hope found in other literary pieces is that there is the possibility that the hope of the Gospels is real.


*Yes, there is a Led Zeppelin reference in the title. And I just realized that I kind of ripped off Mircea Eliade's title The Sacred and the Profane.

1. For informative readings on religion and the American founding, see Jon Meacham, American Gospel: God, the Founding Fathers, and the Making of a Nation (New York: Random House, 2007) and David L. Holmes, The Faiths of the Founding Fathers (New York: Oxford University Press, 2006).

2. As referenced previously on my blog, Hank Campbell has an excellent article on avoiding atheist stereotypes.

3. Christopher Hitchens, God Is Not Great: How Religion Poisons Everything (New York: Twelve, 2007), 5; emphasis mine.

4. Hitchens, 2007: 151.

5. See also Peterson, "Editor's Introduction: God and Mr. Hitchens," FARMS Review 19:2 (2007).

6. It could be argued that art has been failing us for a long time given its continual rejection of sacred beauty. See Roger Scruton's Beauty (New York: Oxford University Press, 2009), BBC special "Why Beauty Matters," or City Journal article "Beauty and Desecration."

Monday, September 5, 2011

A More Perfect Union?




*While Elia Kazan's 1954 masterpiece On the Waterfront is remembered primarily for its artistic achievement (garnering eight Academy Awards, including Best Picture and Best Actor for a young Marlon Brando), its portrayal of political corruption within unionized labor reflected a reality that was and still is all too true.[1] By this time, Kazan had lost considerable respect in some circles (and seen his director's fees halved) due to his testimony before the House Un-American Activities Committee, "an organization charged with weeding out those who were sympathetic to the Communist Party."[2] A former Communist himself, Kazan identified under oath several associates affiliated with the Communist Party. The courage of Brando's Terry Malloy in speaking out against the corrupt unionists has been read as a probable apologia for Kazan's own political confession. However, his marginalization by those who opposed his testimony continued even late in life, as demonstrated by the crowd's divided reaction to his Honorary Oscar in 1999. As shown in Kazan's treasured film, unions have a tendency to implement an extreme form of coercive tribalism and marginalization in order to achieve their objectives. This occurs in the form of protecting union "brothers and sisters" at the expense of virtually everyone else. To oppose their methods is, according to some unionists, to be like Adolf Hitler.[3]

In the crusade to “save jobs,” technological and operational progress is hindered greatly. Instead of allowing innovative steps to be taken to further productivity and company health (which usually includes more hires and increased wages), inefficient jobs are maintained for the mere sake of “saving jobs.” Rather than being means to an end, jobs become the ends themselves in the hands of the union. As GMU economist Russell Roberts explains,
The story goes that Milton Friedman was once taken to see a massive government project somewhere in Asia. Thousands of workers using shovels were building a canal. Friedman was puzzled. Why weren't there any excavators or any mechanized earth-moving equipment? A government official explained that using shovels created more jobs. Friedman's response: "Then why not use spoons instead of shovels?"
New technology leads to higher productivity and lower costs. “Those lower costs lead to lower prices as businesses compete with each other to appeal to consumers,” says Roberts. “The result is a higher standard of living for consumers. The average worker has to work fewer and fewer hours to earn enough money to buy a dozen eggs or a pair of shoes or a flat-screen TV or a new car that's safer and gets better mileage than the cars of yesteryear…When it gets cheaper to make food and clothing, there are more resources and people available to create new products that didn't exist before…So many job descriptions exist today that didn't even exist 15 or 20 years ago. That's only possible when technology makes workers more productive.”
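To make the arithmetic behind Roberts's point concrete, here is a minimal sketch in Python. The prices, wages, and productivity changes are hypothetical, chosen only to illustrate the mechanism; they are not historical data:

```python
# Minimal sketch with hypothetical numbers: how productivity growth
# shrinks the labor-hours needed to buy a good.

def hours_to_buy(price: float, hourly_wage: float) -> float:
    """Work-hours an average worker needs to afford one unit."""
    return price / hourly_wage

# Hypothetical "yesteryear": eggs cost $3.00 a dozen, the wage is $6.00/hour.
before = hours_to_buy(price=3.00, hourly_wage=6.00)   # 0.50 hours

# Suppose technology doubles productivity: competition pushes the price
# down to $1.50 while the wage rises to $9.00.
after = hours_to_buy(price=1.50, hourly_wage=9.00)    # ~0.17 hours

print(f"Hours of work per dozen eggs: {before:.2f} -> {after:.2f}")
# The freed-up hours and workers can then move into products and job
# descriptions that did not exist before.
```

The point is not the particular numbers but the direction: whenever prices fall relative to wages, the same hour of labor buys more, which is exactly what "a higher standard of living" means here.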





Sound economic theory, however, has no bearing on the “mobster mentality” found within many unions.[4] Accountability is diffused among the collective whole, watered down until it is virtually non-existent. Individual clashes with the mob mentality often lead to threats, property destruction, and union-on-union violence. While most support the basic idea of unionized labor, one cannot help but recognize the eroding effect unions have on both businesses and the economy as a whole.
Despite the common trumpeting of the so-called Progressive Era of the early 20th century, the economic reforms of that era have been some of the most damaging in American history. These early progressives sought to "move beyond the political principles of the American founding."[5] For all their claims of social justice, what these progressive policies ended up creating was social control. This manifested itself in the exclusion of "unfit workers," mainly blacks and immigrants.[6] The "Labor Unions" entry in The Concise Encyclopedia of Economics reads,
Economist Ray Marshall, although a prounion secretary of labor under President Jimmy Carter, made his academic reputation by documenting how unions excluded blacks from membership in the 1930s and 1940s. Marshall also wrote of incidents in which union members assaulted black workers hired to replace them during strikes. During the 1911 strike against the Illinois Central, noted Marshall, whites killed two black strikebreakers and wounded three others at McComb, Mississippi. He also noted that white strikers killed ten black firemen in 1911 because the New Orleans and Texas Pacific Railroad had granted them equal seniority. Not surprisingly, therefore, black leader Booker T. Washington opposed unions all his life, and W. E. B. DuBois called unions the greatest enemy of the black working class. Another interesting fact: the “union label” was started in the 1880s to proclaim that a product was made by white rather than yellow (Chinese) hands. More generally, union wage rates, union-backed requirements for a license to practice various occupations, and union-backed labor regulations such as the minimum wage law and the Davis-Bacon Act continue to reduce opportunities for black youths, females, and other minorities.
Many point to unions as the source of higher wages, safer working conditions, and shorter work days. However, as explained above, it is increasing technology, wealth, and productivity that make these circumstances both plausible and sustainable. Of course, unions have been "successful" at these various activities as well. "In doing so, however, they have reduced the number of jobs available in unionized companies. That second effect occurs because of the basic law of demand: if unions successfully raise the price of labor, employers will purchase less of it. Thus, unions are a major anticompetitive force in labor markets. Their gains come at the expense of consumers, nonunion workers, the jobless, taxpayers, and owners of corporations."[7]
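That "basic law of demand" can be illustrated with a stylized linear labor-demand curve. Everything below is made up for illustration; the intercept, slope, and wages are not estimates of any actual labor market:

```python
# Stylized linear labor-demand curve: employment falls as the wage rises.
# All numbers are illustrative, not estimates of any real market.

def labor_demanded(wage: float, intercept: float = 1000.0, slope: float = 20.0) -> float:
    """Workers employers are willing to hire at a given hourly wage."""
    return max(0.0, intercept - slope * wage)

market_wage = 15.0   # hypothetical competitive wage
union_wage = 20.0    # hypothetical union-negotiated wage

jobs_before = labor_demanded(market_wage)   # 1000 - 20 * 15 = 700 jobs
jobs_after = labor_demanded(union_wage)     # 1000 - 20 * 20 = 600 jobs

print(f"Jobs at ${market_wage:.0f}/hr: {jobs_before:.0f}")
print(f"Jobs at ${union_wage:.0f}/hr: {jobs_after:.0f}")
```

On these toy numbers, the insiders who keep their jobs earn a third more, while 100 would-be workers are priced out entirely; the gain is visible, the loss diffuse.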
Many vocal activists claim that unions built the middle class. A recent study concludes that the decline in union membership has resulted in the decline of the middle class and stagnant income over the past several decades. This helps foster the myth that “the poor are getting poorer” and that unions are the only thing keeping evil corporations from turning the entire population into quasi-slaves. However, the statistics often used in these studies are highly misleading. As economist Steven Horwitz writes,
According to researchers at the University of Michigan, the poorest fifth of households in 1975 earned, on average, almost $28,000 more per year by 1991, adjusted for inflation. According to U.S. Treasury data, an astounding 86 percent of households that comprised the bottom fifth in 1979 had climbed out of poverty by 1988…The vast majority of American households do move up: The reality of the U.S. economy over the last 30 years is that everyone has gotten richer in absolute terms, and significantly so. Simply measuring income and wealth tells us very little about the lifestyle of typical Americans. For example, poor Americans today are more likely to own basic household goods like washing machines, dishwashers, color TVs, refrigerators, and toasters than the average household was in 1973.


Economist Thomas Sowell recognizes the talk of "poor getting poorer" as the fallacy of "confusing the fate of statistical categories with the fate of flesh-and-blood human beings."[8] The reason? "It is an undisputed fact that the average real income - that is, money income adjusted for inflation - of American households rose by only 6 percent over the entire period from 1969 to 1996...But it is an equally undisputed fact that the average real income per person in the United States rose by 51 percent over the very same period."[9] The two facts are reconciled by household size: the average number of persons per American household fell substantially over those decades, so income per person could rise far faster than income per household.
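Sowell's two facts can be checked for consistency with a few lines of arithmetic. In this sketch, the 6 percent and 51 percent figures come from the quote above; the base household size is hypothetical:

```python
# Sowell's two "undisputed facts" are arithmetically consistent only if
# households shrank. Growth rates from the quote; base size hypothetical.

household_income_growth = 1.06   # household income up 6% (1969-1996)
per_person_income_growth = 1.51  # per-person income up 51%, same period

# Since per-person income = household income / household size,
# size_new / size_old = household_growth / per_person_growth.
size_ratio = household_income_growth / per_person_income_growth
print(f"Implied change in average household size: {size_ratio:.2f}x")  # ~0.70x

# With a hypothetical base of 3.0 persons per household in 1969, these
# figures would imply roughly 2.1 persons per household by 1996.
print(f"3.0 persons -> {3.0 * size_ratio:.1f} persons")
```

Treating the averages as exact aggregates, the only way both income figures can hold is for the average household to have shrunk by roughly 30 percent, which is precisely the shrinking-household reconciliation described above.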



As one commentator notes, “In other words, if the middle class in America has shrunk, it is only because so many formerly middle-class households have moved to the upper-income brackets, while a significant number of households previously in the lower brackets have moved up to the middle class and beyond.”
True growth comes from capital investment, creation of wealth, technological advances, increased skill and education, and competition. Based on my own observations and those of more seasoned employees, historians, and economists, unions tend to engage in the very thing they claim to be fighting against: exploitation and coercion. Though there is still "a role for labor unions, just as in a free market there is still a role for agents or managers who help their clients find work and negotiate their contracts,"[10] this must come without government attachments. To politicize the union does a great disservice to both the labor force and the company as a whole.[11]

With that, Happy Labor Day!

*This is an edited excerpt from my internship research paper required for graduation.

UPDATE: Here is an informative analysis of the "rising income inequality."


1. On the Waterfront was partly based on the Pulitzer Prize-winning New York Sun series "Crime on the Waterfront" by journalist Malcolm Johnson.

2. Daidria Curnutte, "The Politics of Art: Elia Kazan and the Scandal Over On the Waterfront," The Film Journal 2 (July 2002).

3. Ironically, the Nazi comparison backfires.

4. Economists like Charles Baird have gone so far as calling labor unions "labor cartels."

5. Ronald J. Pestritto, William J. Atto, "Introduction to American Progressivism," in American Progressivism: A Reader, eds. Pestritto, Atto (Lanham, MD: Lexington Books, 2008), 2.

6. See David E. Bernstein, Thomas C. Leonard, "Excluding Unfit Workers: Social Control Versus Social Justice in the Age of Economic Reform," Law and Contemporary Problems 72 (2009).

7. Morgan O. Reynolds, "Labor Unions," The Concise Encyclopedia of Economics, ed. David Henderson, 2nd ed. (Indianapolis, IN: Liberty Fund, 2007).

8. Thomas Sowell, Economic Facts and Fallacies, 2nd ed. (New York: Basic Books, 2011), 140.

9. Ibid., 140-141.

10. Robert P. Murphy, The Politically Incorrect Guide to Capitalism (Washington, DC: Regnery Publishing, 2007), 25.

11. For further reading, see Morgan O. Reynolds, Power and Privilege: Labor Unions in America (New York: Universe Books, 1984); Thomas E. Woods, Jr., "What Made American Wages Rise? (Hint: It Wasn't Unions or the Government.)" in his 33 Questions About American History You're Not Supposed to Ask (New York: Three Rivers Press, 2007).