Thursday, December 22, 2011

Hitch Slapped

I am deeply saddened to learn of the passing of journalist and author Christopher Hitchens. This insightful man had the ability to both inspire and anger me within the same article or lecture. Whether he was defending the Iraq War, explaining why women are not funny, or condemning religion, Hitchens pushed the conversation forward.[1] A former socialist, though still Marxist in his thinking, Hitchens was articulate, well-read, and heavily involved in the modern world. Writing for left-leaning publications such as The Nation and Slate or the largely conservative Hoover Institution, Hitchens was not afraid to cross ideological boundaries. His tirades against religion (including my own) brought down the wrath of the so-called Religious Right, while his advocacy for intervention in Iraq baffled many of those on the Left.[2]

I first came across Christopher Hitchens soon after my mission in 2007. As my wife can tell you, I had little interest in politics at the time. I was still in a kind of missionary mode, mainly reading gospel-related academic and apologetic material from FARMS and FAIR with brief interludes for the news. Though I cannot recall whether it was a video or an article of his that I came across first, I nonetheless remember being struck by how articulate and intelligent the man was. Out of curiosity, I began to browse Mr. Hitchens' work and found myself challenged and impressed by many of his columns, debates, and interviews. A man-crush was born.

Of course, Hitchens was not immune to sloppy thinking. For example, his attacks on religion are best when presented in terms of modern religious and political extremism, but stumble greatly when judged within the context of history or philosophy. His importance in my own thinking, however, does not stem from anything in particular he has written or lectured about. Instead, his importance comes from what is manifested in his writing and persona: a hunger for knowledge, intellectual honesty, and a deep moral concern for real human beings.[3] Drawing on lessons from history, literature, recent events, and personal experience, Hitchens was a formidable public intellectual. Not a specialist by any means, but a well-informed, reasonable individual.

In other words, he was a Jerry Cantrell intellectual.

Cantrell, the guitarist and co-vocalist of Alice in Chains, had a similar influence on me in the realm of guitar playing. While my interest in playing was spawned by my love of pop-punk/indie bands such as Blink 182, Green Day, Jimmy Eat World, and the Ataris, I was soon taken in by the speed of metal, the groove of blues, and the epic, versatile sound of classic rock.[4] Unfortunately, I was too slow to keep up with Kirk Hammett, Dave Mustaine, or Dimebag Darrell; too stiff to match the feel of Stevie Ray Vaughan, Robben Ford, or Jimi Hendrix; and too limited in style to create the sound of Jimmy Page, David Gilmour, or Brian May. However, Jerry Cantrell (mainly in the form of Alice in Chains) provided a blues-based, melodic metal I could rock out to. More importantly, he provided a type of playing that seemed achievable: not because his playing was sub-par, but because it evidenced a moderate partaking of the best rock music had to offer. Cantrell was not a shredder, a blues master, or a progressive rock composer (he still isn't). But he was and is a fine guitar player, lyricist, and all-around musician. He gave me confidence and inspiration in my first few years of playing and remains influential even today. Likewise, Hitchens was not an economist, scientist, or historian. But he was a fine writer, thinker, and debater. More importantly, he made me take a closer look at virtually everything.




Though you may not have wanted our prayers, I will nonetheless say this: God bless you, Christopher Hitchens. I hope you are finding more of the happiness, truth, beauty, and wisdom you always sought.[5]



1. See his books A Long Short War: The Postponed Liberation of Iraq (New York: Plume, 2003), God Is Not Great: How Religion Poisons Everything (New York: Twelve, 2007), and his article "Why Women Aren't Funny," Vanity Fair (January 2007).

2. See Daniel Peterson's review of Hitchens' attacks on Mormonism and religion in general in "Editor's Introduction: God and Mr. Hitchens," FARMS Review 19:2 (2007). For Hitchens' post-invasion views on Iraq, see his "So, Mr. Hitchens, Weren't You Wrong About Iraq?" Slate (March 19, 2007).

3. Catholic philosopher Edward Feser writes, "Of the four horsemen of the New Atheism, Hitchens was the only one I found likable, and the only one possessed of a modicum of wisdom about the human condition, or at least as much wisdom about the human condition as one can have while remaining essentially a man of the Left. While there was rather too obviously something of the champagne socialist about him, I do not doubt that he had real concern for real human beings -- rather than merely for grotesque abstractions like “the working class” or “humanity” -- and that he showed real moral and even physical courage in defense of what he sincerely took to be the best interests of real human beings." This concern can be found in his personal reasons for defending the Iraq invasion, which often stressed the genocides under Hussein rather than the potential nuclear danger (though Hitchens certainly did not ignore the latter factor). For more on the genocidal nature and WMD potential of Iraq, see the following: Daniel Henninger, "If Saddam Had Stayed," The Wall Street Journal (Sept. 2, 2010); Susan Sachs, "A Grim Graveyard Window on Hussein's Iraq," The New York Times (June 1, 2003); John F. Burns, "Uncovering Iraq's Horrors in Desert Graves," The New York Times (June 5, 2006); Burns, "How Many People Has Hussein Killed?" The New York Times (Jan. 26, 2003); David E. Sanger, Thom Shanker "A Region Inflamed: Weapons; For the Iraqis, a Missile Deal That Went Sour; Files Tell of Talks With North Korea," The New York Times (Dec. 1, 2003); Sanger, "After the War: Weapons Programs; Iraqi Says Hussein Planned to Revive the Nuclear Program Dismantled in 1991," The New York Times (June 27, 2003); Jim Lacey, "Saddam: What We Now Know," National Review Online (Sept. 14, 2011); U.S. Agency for International Development, "Iraq's Legacy of Terror: Mass Graves": http://www.globalsecurity.org/intell/library/reports/2004/040317_iraq_mass_graves.pdf.

4. A nod to my brother-in-law Juan for being a major player in this.

5. See Hitchens' moving final piece in Vanity Fair "Trial of the Will" along with Mark Judge's commentary. New York Times columnist Ross Douthat weighs in on the Christian attraction to Hitchens.

Friday, November 25, 2011

Happy Thanksgiving

A couple of nights ago at work, I had a discussion with a co-worker of mine who lost his son this past year, only to discover a few months later that his wife had an aggressive form of breast cancer. As I asked him about his holiday plans, he explained that scheduling conflicts meant his wife would go through her chemo treatment the day before Thanksgiving, without sufficient time to recover before the holiday. The holiday would largely consist of him taking care of his recovering, miserable spouse. He explained that the worst part was watching. Physical pain and misery were something he did not mind enduring himself. But having to watch a loved one suffer was something else entirely. I shook this good man's hand, promised him my thoughts, prayers, and support, and slowly walked back across the trailer yard. As I did, I could not help the rush of tears that came as I thought about the struggles of my friend and his family. I had to regain my composure and wipe my eyes as I made it back to the loading dock.

One may say that such interactions should make you grateful for all the things you have. To some extent, this seems to me rather obscene. It is as if one is to take the attitude "better him than me." Suffering is part of fallen nature, yet appears to be the most incomprehensible aspect of a world supposedly created and watched over by a loving God.[1] To be grateful that others are suffering rather than yourself is immoral in every sense of the word.



However, I highly doubt this is what is meant by the phrase "count your blessings." Gratitude is an emotion and attitude toward God, mankind, and life as a whole. It is a deep sense of appreciation for the very experience of life and those in it; an outlook bred out of genuine humility and awe. Numerous studies demonstrate the power of positive emotions. Negative emotions tend to restrict and narrow focus and thinking, while positive emotions broaden one's creative horizon. Positive emotions can also undo the effects of negative emotions and increase one's ability to cope with trauma. Studies indicate that as one's coping ability increases, so do positive emotions. "These findings suggest that, over time, positive emotions and broad-minded coping mutually build on one another, leading to improved coping skills and triggering an upward spiral toward enhanced emotional well-being."[2] Another study found that practiced gratitude led to "more progress on [participants'] goals, fewer physical complaints, more frequent physical exercise, more optimism, and higher overall well-being. So, feeling the pleasant emotion of gratitude in the short run led to more optimal functioning and emotional well-being in the long run."[3] This upward spiral can extend to groups and organizations because witnessing moral behavior (e.g. helpfulness or gratitude) elevates and inspires others to become more helpful and gracious. Giving and receiving gifts, along with the associated gratitude, is "the moral memory of mankind. By mutual giving, people become tied to each other by a web of feelings of gratitude. Gratitude is the motive that moves us to give in return, and thus creates the reciprocity of service and counterservice."[4]

An interesting midrashic telling of Moses notes that in the description of the first 3 of the 10 plagues in Exodus - the blood, frogs, and lice - it was Aaron rather than Moses who struck the Nile River and the sand, the sources of these plagues. Why so? Because the Nile...had protected [Moses] from Pharaoh's decree that all male Israelite infants be drowned at birth. Similarly, the sand - which had concealed the body of the Egyptian taskmaster Moses had killed...had saved Moses from Pharaoh's wrath and from prosecution and death. In gratitude to the Nile and to the sand, Moses did not want to be the one to smite them with his staff, and Aaron was delegated by God to do so. The moral the rabbis were conveying is that if one has to show gratitude even to inanimate objects, how much more must we show gratitude to humans who have benefited us?[5]

Gratitude is an essential quality that is too often forgotten and undervalued. This Thanksgiving holiday, reflect on the love that makes the anguish possible, the friends and family that make life worth living, and the unique feeling of what it is to even exist. Be grateful for the blessings all around you by becoming a blessing to all around you.



UPDATE: President Eyring has a new article in the December 2011 Ensign entitled "The Choice to Be Grateful."


1. For reading on the problem of evil and suffering, see Truman G. Madsen, "Human Anguish and Divine Love," Four Essays on Love (Provo, UT: Communications Workshop, 1971); Blake T. Ostler, David L. Paulsen, "Sin, Suffering, and Soul-Making: Joseph Smith on the Problem of Evil," Revelation, Reason, and Faith: Essays in Honor of Truman G. Madsen, eds. Donald W. Parry, Daniel C. Peterson, Stephen D. Ricks (Provo, UT: FARMS, 2002); Loyd Ericson, "'Which Thing I Had Never Supposed': The Problem of Evil and the Problem of Man," Sunstone 159 (June 2010); David B. Hart, "Tsunami and Theodicy," First Things (March 2005).

2. Barbara L. Fredrickson, "Gratitude, Like Other Positive Emotions, Broadens and Builds," The Psychology of Gratitude, ed. Robert Emmons, Michael McCullough (New York: Oxford University Press, 2004), 156.

3. Fredrickson, 2004, 154.

4. Aafke Elisabeth Komter, "Gratitude and Gift Exchange," The Psychology of Gratitude, ed. Emmons, McCullough (New York: Oxford University Press, 2004), 203-204.

5. Solomon Schimmel, "Gratitude in Judaism," The Psychology of Gratitude, ed. Emmons, McCullough (New York: Oxford University Press, 2004), 44-45.

Saturday, November 12, 2011

Hip to Be Square

Last September, I wrote about Kenda Creasy Dean's research published in her book Almost Christian: What the Faith of Our Teenagers Is Telling the American Church (New York: Oxford University Press, 2010).[1] Dean's work was in some ways a sociological introduction to the emerging, American-based Christian culture (particularly of the evangelical flavor). This subject has intrigued me as of late, fueled by my various contacts with campus Christians during my undergraduate studies. The evangelical culture among college students ranged from the likes of Mark Driscoll to that of John Piper (though some would embrace both).

One particular college course featured a fair number of Christian students, many of whom gave presentations explicitly about Christ or Christian life. I was surprised not only by the number of Christian-themed presentations, but also by their frankness (admirable, if slightly uncomfortable). The body language, articulation, and dress of these fellow Christian students provided an interesting point of comparison. Many were average in their dress and appearance. However, some embraced a counter-cultural fashion, talked about how their "eyes had been opened" by [insert freshman college course here], and how they were seeking a church that "accepted them for who they are." I've become increasingly aware of the strains of this Christian subculture in my community and specifically at the university. I've witnessed first-hand the mating of university subculture ideas with LDS doctrine in an institute class setting.

This is the very Moralistic Therapeutic Deism described in Dean's book: "No pretense at changing lives; a low commitment, compartmentalized set of attitudes aimed at "meeting my needs" and "making me happy" rather than bending my life into a pattern of love and obedience to God."[2] As Dean recognizes, "Moralistic Therapeutic Deism cannot exist on its own. It requires a host, and American Christianity has proven to be an exceptionally gracious one."[3] The two great commandments have been reduced from "love God" and "love thy neighbor as thy self" to "believe there is a god" and "be nice to people while feeling good about yourself." I was excited to hear Elder Christofferson quote from Dean's work in his excellent Conference talk last April:

"By contrast,” as one author declares, “the God portrayed in both the Hebrew and Christian Scriptures asks, not just for commitment, but for our very lives. The God of the Bible traffics in life and death, not niceness, and calls for sacrificial love, not benign whatever-ism."

I struggled for some time with the exact term by which to describe what I was witnessing at the university and institute, until I discovered that one had already been coined: hipster. What I had been observing was nothing short of what writer and journalist Brett McCracken calls hipster Christianity. In his book Hipster Christianity: When Church and Cool Collide (Grand Rapids, MI: Baker Books, 2010), McCracken takes great care to explain the hipster mentality and the way it has infiltrated the various Christian denominations. The very notion of hip evokes raw individualism, rebellion against the status quo, the maintenance of a "unique" public image, and immersion in present-day consumerism. He helpfully describes twelve common types of hipster (for more detail, see his book):
  1. The Natural - hipness just flows naturally.
  2. The Newbie - the clingy, fickle freshman who saw a movie or concert that "changed his/her life" and "opened his/her eyes." 
  3. The Artist - the starving, bohemian artist.
  4. The Academic - the bookish intellectual. 
  5. The Dilettante - a fringe-lover who knows little of the actual fringes he/she embraces (I think "poser" would be an adequate alternative name). 
  6. The Mountain Man - unshaven, nature-loving, agrarian-nostalgic macho man (or possibly woman).
  7. The Shaman Mystic - supposedly in touch with the energies of the universe. 
  8. The Detached Ironic - witty, sarcastic class clown.
  9. The Yuppie - Patrick Bateman without the murderous tendencies.
  10. The Flower Child - born to hippie parents and stands in "solidarity with the poor and, well, everyone except the white bourgeoisie." (pg. 60)
  11. The Expat - traveling, humanitarian types.
  12. The Activist - the annoying protester who "raises awareness" of the "evils" of things like globalization.


A five-year project by the Barna Group found the six major reasons young adults leave church life to be 1) the overprotective environment, 2) shallow church experience, 3) the perceived antagonism toward science, 4) the supposed judgmental attitudes toward sex, 5) the theological exclusiveness of Christianity, and 6) believers' unfriendliness toward doubt.[4] Today's young Christian hipsters dislike the ultra-conservative fundamentalism of Pat Robertson's The 700 Club and instead favor more "liberal" theologians such as N.T. Wright, C.S. Lewis, G.K. Chesterton, and Dietrich Bonhoeffer (I must have hipster tendencies). These hipsters prefer a postmodern approach to Christianity (something Mormonism certainly understands).[5] The Gospel becomes more important than the Church (echoes of Elder Poelman), dialogue replaces argumentation (this fits Joseph Smith's fundamental principles of Mormonism), and actions speak louder than words (this is key to all three Abrahamic religions: Judaism, Christianity, and Islam). These things require a more intellectually satisfying, pro-active Christianity (I'm all for moving past the "gospel made easy!" we often find in Sunday School classes). Unfortunately, many of these positive attributes are rooted in recent reactions to modernity rather than any deep spiritual reflection; in political leanings and activism rather than Christian ethics. It is merely the manifestation of a culture obsessed with shock value and "edginess." In other words, it is a fad and, as McCracken wisely notes, "True relevance is not a fad."[6] Narcissism, alienation, rebellion, and a reduction to the visually stimulating are neither appropriate nor sustainable foundations for a modern approach to the gospel, despite being dressed up in flattering terms. "When I asked my undergraduate students to name the characteristics that best described their generation," writes psychologist Jean Twenge, "the two most popular answers were 'independent' and 'open-minded.'"[7] Twenge's research finds that social norms and manners are increasingly cast aside, from running stop signs to cheating in school.[8]

Big words, big protests, and big egos. These seem like the very products of Moralistic Therapeutic Deism (and the surrounding culture in general, but that is for another post). The sad part, according to Dean, is that the churches are to blame: "Why do teenagers practice Moralistic Therapeutic Deism? Not because they have misunderstood what we have taught them in church. They practice it because it is what we have taught them in church. In fact, American teenagers are barometers of a major theological shift taking place in the United States."[9] Churches have made accidental Christian hipsters of their youth.



Fortunately for Latter-day Saints, Dean's research along with the National Study of Youth and Religion found that LDS kids are less likely to drink, smoke, and engage in risky behavior, while more likely to postpone sex (to age 18 instead of the average 16.5; 13% of Mormon teens identify themselves as not being virgins compared to the average 20%). "Mormon teenagers rank ahead of other youth in terms of spiritual vitality, hope for the future, and overall health and well-being." Dean also finds that Mormon teenagers are more likely than other teenagers to hold religious beliefs similar to their parents, attend weekly religious services, and talk about religious matters in their families. Religious faith is seen as "extremely important" in shaping daily life, demonstrated by the fact that Mormon youths participate in more religious practices than most teenagers and are more articulate about church teachings. Early morning seminary, family home evening, and two-year missions prepare Mormon children for adulthood. All in all, "Mormon teenagers tend to be the "spiritual athletes" of their generation, conditioning for an eternal goal with an intensity that requires sacrifice, discipline, and energy." As NSYR researcher John Bartkowski put it, "The story we tell about Mormon youth is not that all is well, but compared with other teens they're more knowledgeable about their faith, more committed to their faith, and have more positive social outcomes associated with their faith."[10]

Despite these inspiring comments, author and scholar Jana Riess correctly notes in her online review, "One complaint I have with Dean’s book is that she seems to assume that Moralistic Therapeutic Deism doesn’t exist in Mormonism, which it does despite the aforementioned high levels of religiosity." My earlier observation regarding hipster subculture and Mormon youth apparently was not far off. A recent article in The New York Times describes "a young generation of Mormons [that] has adopted a fashion-forward urban aesthetic (geek-chic glasses, designer labels and plenty of vintage) that wouldn’t look out of place at a Bushwick party." A trendy subculture has slowly developed in response to the former "bias against being 'cool' in the Mormon world." While every generation goes through similar stages, the period known as "emerging adulthood" is getting much longer. I worry about how long this desire to be "hip" will last among young Mormon adults and what effects it may have on the following generation (if any).[11] Instead of "finding ourselves" in come-and-go trends, we need to be rooting ourselves in Christ. "We will never truly be at peace with ourselves, comfortable in our skin, and happy with who we are, outside of the one who created us and calls us into his presence and eternal fulfillment," writes McCracken. "Here--in the service of Christ and with God as the center and core of our being--our identities become more fully realized than we've ever known. If that's not cool, I don't know what is."[12]

As members of the Church, we should always remember that it is hip to be square.   




1. Commenting on the recent Pew Forum findings, I half-jokingly wrote, "I can hear it now: "See! This proves Mormons aren't real Christians! If they were, they'd be scoring down here with the rest of us! Mormons reject Christ just like their atheist and Jewish friends!" Either that or something worse like the Mormons and Masons have infiltrated the Pew Forum." Ironically, another Pew Forum study finds that most non-LDS Christians identify Mormons as Christian. The category "White Evangelical" had the highest percentage (45%) of 'No's when it came to the question, "Are Mormons Christian?" My friend Daniel McClellan has some excellent comments on self-identification and Christianity in his online debate with James White of Alpha and Omega Ministries.

2. Dean, 2010, 30.

3. Ibid.

4. This should not be too alarming. As Rodney Stark and Byron Johnson of Baylor University explained in The Wall Street Journal, "The national news media yawned over the Baylor Survey's findings that the number of American atheists has remained steady at 4% since 1944, and that church membership has reached an all-time high. But when a study by the Barna Research Group claimed that young people under 30 are deserting the church in droves, it made headlines and newscasts across the nation—even though it was a false alarm. Surveys always find that younger people are less likely to attend church, yet this has never resulted in the decline of the churches. It merely reflects the fact that, having left home, many single young adults choose to sleep in on Sunday mornings. Once they marry, though, and especially once they have children, their attendance rates recover. Unfortunately, because the press tends not to publicize this correction, many church leaders continue unnecessarily fretting about regaining the lost young people." In other words, once they grow up a little, these church deserters often return.

5. McGuire's point about Nephi's vision is encapsulated in Terryl Givens, "The Book of Mormon and Dialogic Revelation," Journal of Book of Mormon Studies 10:2 (2001) and further expounded on in chapter 8 of Givens, By the Hand of Mormon: The American Scripture That Launched a New World Religion (New York: Oxford University Press, 2002). For a Mormon philosophical approach to postmodernism, see James E. Faulconer, "The Myth of the Modern; the Anti-Myth of the Postmodern," FARMS Review 20:1 (2008).

6. McCracken, 2010, 234.

7. Jean M. Twenge, Generation Me: Why Today's Young Americans Are More Confident, Assertive, Entitled - and More Miserable Than Ever Before (New York: Free Press, 2006), 24.

8. "In 1979, 29% of people failed to stop at a particular stop sign in a New York suburb, but by 1996 a stunning 97% of drivers did not stop at all...In 2002, 74% of high school students admitted to cheating, up from 61% in 1992. In 1969, only 34% of high school students admitted to cheating, less than half of the 2002 number. This continues into college; a 2002 survey found that 80% of students at Texas A&M University admitted to cheating...Not only are teens more likely to cheat, but they are resigned to cheating among their peers. In a 1997 survey, 88% of high school students said that chearing was common at their school. Three times as many high school stuents in 1969 compared to 1989 said they would report someone they saw cheating. Also in 1989, an incredible 97% of high school students said they had let someone else copy their work. The disregard for rules continues outside the classroom: in 2000, 26% of high school boys admitted they had shoplifted from a store at least once." (Twenge, 2006, 26-27)

9. Dean, 2010, 29.

10. Ibid., 51.

11. See the review of Christian Smith with Patricia Snell, Souls in Transition: The Religious and Spiritual Lives of Emerging Adults (New York: Oxford University Press, 2009) in The Wall Street Journal. This book is on my extensive "to-read" list.

12. McCracken, 2010, 247.

Thursday, November 10, 2011

Rand, Selflessness, and the Silly Undergrad

A relatively recent online debate grabbed my attention when an individual (who shall remain nameless) more-or-less claimed that the Austrian theory of economics was to be equated with Ayn Rand and her virtue of selfishness. While Rand's individualism and defense of capitalism certainly make her a fellow traveler among the Austrians, this individual had painted Austrian theory as nothing more than greed-fueled anarchism. Most likely unaware of Rand's breaks with Rothbard's anarchism, Mises' praxeology, or Hayek's grounding of ethics in traditional morality (it was consistently asserted in the debate that Hayek was some kind of anarchist), this critic of conservatism had no problem painting with a broad brush.[1] I explained that I have been critical of Rand's rhetoric regarding selfishness, yet pointed out that she basically redefined the term in an attempt to strip it of its negative baggage (whether wisely or not).



As Rand states in the introduction to her The Virtue of Selfishness,

The title of this book may evoke the kind of question that I hear once in a while: "Why do you use the word 'selfishness' to denote virtuous qualities of character, when that word antagonizes so many people to whom it does not mean the things you mean?" To those who ask it, my answer is, "For the reason that makes you afraid of it." [2]

The use of the word 'selfishness' was largely for shock value, and it reflected her extreme disdain for anything collectivist, born of her experience as a youth in Russia.[3] After a fairly lengthy exchange, this individual made a separate post to "educate" me (a silly undergraduate in her eyes, though I was not silly then and am not an undergraduate now) on the meaning of altruism in contrast to selfishness. The definition of 'altruism' was provided, along with the notion that to support her form of altruism (i.e. wealth redistribution) was to be caring, moral, and (as her post implied) Christian. She further implied that support of the market system was inhumane and selfish, and that it spat in the face of Jesus Himself.

Ignoring the multiple problems that presented themselves throughout her barrage of ill-mannered responses, I wanted to address the relationship between selflessness and selfishness. Ayn Rand has had little influence on my worldview, in large part due to her atheism and Objectivism. While I can understand her appeal to market proponents, I have never quite understood the borderline obsession.


However, her comments regarding the "selfish" nature of serving others are interesting. Joseph Smith reportedly told Oliver B. Huntington that "some people entirely denounce the principle of self-aggrandizement as wrong. ‘It is a correct principle,’ [Joseph] said, ‘and may be indulged upon only one rule or plan–and that is to elevate, benefit and bless others first. If you will elevate others, the very work itself will exalt you. Upon no other plan can a man justly and permanently aggrandize himself’.”[4] On this, the late philosopher Truman G. Madsen wrote,

God, taught the Prophet, loves Himself in an inclusive way and hence "everything God does is to aggrandize His kingdom." Such love expands the "self" to include all selves, all life; and God, therefore, cannot be happy except in the happiness of all creatures. Call that "selfish" if you like. But notice that the opposite is a selfishness which seeks something in indifference to or at the expense of others. We are commanded to be selfish as God is. Joseph Smith taught that there is a law (not, if I understand him, of God's making but in the very nature of things) that "upon no other principle can a man permanently and justly aggrandize himself." This is the meaning of the Master's cryptic phrase: "Lose yourself...and find yourself."[5]

Using a version of "The Prisoner's Dilemma" game and fMRI, a team of researchers from Emory University found that activation in the reward-processing regions of the brain (i.e. nucleus accumbens, caudate nucleus, ventromedial frontal/orbitofrontal cortex, rostral anterior cingulate cortex) took place during cooperative situations. This data demonstrates that what is known as altruism is in fact intrinsically rewarding.[6] Related results were found in another study, which provided participants the choice of either collecting a maximum of $128 or donating to a variety of charities. Scans during the process revealed that "the midbrain ventral tegmental area (VTA), the dorsal striatum, and the ventral striatum were activated by both pure monetary rewards and decisions to donate..., suggesting that donating to societal causes and earning money share anatomical systems of reward reinforcement expectancy...This finding is compatible with the putative role of the "warm glow" ("joy of giving") effect, the rewarding experience associated with anonymous donations."[7] The fronto-limbic activity is connected to "more basic social and motivational mechanisms" stimulated by such things as "food, sex, drugs, and money."[8] Even without any evidence of direct material or reputation gains or reciprocity, charity is neurologically rewarding.
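For readers unfamiliar with the game itself, the sketch below lays out the standard two-player Prisoner's Dilemma payoff structure in Python. The payoff numbers are hypothetical placeholders (not the values or protocol used in the Emory study); the point is simply that defection is always the narrowly self-interested best reply, which is what makes the finding that mutual cooperation engages the brain's reward circuitry so striking.

```python
# Illustrative sketch of the standard two-player Prisoner's Dilemma payoff
# structure referenced above. The payoff values are hypothetical placeholders
# (not those used in the Emory study); they merely satisfy the usual ordering
# T > R > P > S (temptation > mutual cooperation > mutual defection > sucker).

PAYOFFS = {               # (my move, partner's move) -> (my payoff, partner's payoff)
    ("C", "C"): (3, 3),   # R: both cooperate
    ("C", "D"): (0, 5),   # S, T: I cooperate, partner defects
    ("D", "C"): (5, 0),   # T, S: I defect, partner cooperates
    ("D", "D"): (1, 1),   # P: both defect
}

def best_response(partner_move: str) -> str:
    """Return the move that maximizes my own payoff against a fixed partner move."""
    return max("CD", key=lambda my_move: PAYOFFS[(my_move, partner_move)][0])

if __name__ == "__main__":
    # Defection is the narrowly "selfish" best reply no matter what the partner does...
    print(best_response("C"), best_response("D"))        # -> D D
    # ...yet mutual cooperation leaves both players better off than mutual defection.
    print(PAYOFFS[("C", "C")], PAYOFFS[("D", "D")])      # -> (3, 3) (1, 1)
```

Nothing here depends on the exact numbers; any payoffs satisfying the ordering in the comments produce the same tension between individual incentive and mutual benefit that the researchers' participants faced.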

Author and neuroscientist Sam Harris defines morality as that which produces the well-being of conscious creatures. Drawing on studies of moral cognition, he recognizes the existence of a "reward component of genuine altruism (often called the "warm glow" associated with cooperation)" and that "we know from neuroimaging studies that cooperation is associated with heightened activity in the brain's reward regions." From this evidence, Harris concludes, "Here...the traditional opposition between selfish and selfless motivation seems to break down. If helping others can be rewarding, rather than merely painful, it should be thought of as serving the self in another mode."[9]

It is perhaps worth noting that research conducted by Arthur C. Brooks of Syracuse University (now president of the American Enterprise Institute) has shown that those in favor of free enterprise and less government donate four times as much money as redistributionists (even when controlling for income), give more blood, and volunteer more hours.[10] Not only is free enterprise statistically linked with charity, but charity is statistically linked with reported happiness. When controlling for income, education, age, race, gender, religion, and children, "conservatives are, on average, 7.5 percentage points more likely than liberals to say they are very happy."[11]

With all the morally superior sneering that takes place on my debate opponent's wall, I wonder how she feels about being neurologically selfish in her altruistic pursuits. On top of that, I wonder whether she cares that the ideas she advocates harm not only those she intends to help, but her own happiness and well-being as well.



1. The principles behind policies are often more important than the policies themselves. In other words, just because Rand and other market-oriented voices came to similar conclusions does not mean that they hold the same principles for doing so. In his testimony favoring Robert Bork's 1987 Supreme Court nomination, Thomas Sowell explained how principles behind policies take on a life of their own. For further reading on Rand's relationship with economists of the Austrian theory (and her life and politics in general), see The Journal of Ayn Rand Studies 6:2 (Spring 2005); Jennifer Burns, Goddess of the Market: Ayn Rand and the American Right (New York: Oxford University Press, 2009). For an overview, see Reason TV's interview with historian and author Jennifer Burns.

2. Ayn Rand, The Virtue of Selfishness: A New Concept of Egoism (New York: Signet, 1964 [1961]), 5.

3. "It was a wintry day in 1918 when the Red Guard pounded on the door of Zinovy Rosenbaum's chemistry shop. The guards bore a seal of the State of Russia, which they nailed upon the door, signaling that it had been seized in the name of the people. Zinovy could at least be thankful the mad whirl of revolution had taken only his property, not his life. Alisa [Ayn], twelve at the time, burned with indignation. The shop was her father's; he had worked for it, studied long hours at university, dispensed valued advice and medicines to his customers. Now in an instant it was gone, taken to benefit nameless, faceless peasants, strangers who could offer her father nothing in return. The soldiers had come in boots, carrying guns, making clear that resistance would mean death. Yet they had spoken the language of fairness and equality, their goal to build a better society for all. Watching, listening, absorbing, Alisa knew one thing for certain: those who invoked such lofty ideals were not to be trusted. Talk about helping others was only a thin cover for force and power. It was a lesson she would never forget." (Burns, 2009, 9)

4. Quote and reference provided in this excellent post at Life on Gold Plates.

5. Truman G. Madsen, "Joseph Smith and the Sources of Love" in his Four Essays on Love (Provo, UT: Communications Workshop, 1971), 13-14. To clarify, I by no means am attempting to equate the philosophy of Ayn Rand with that of Joseph Smith.

6. James K. Rilling, David Gutman, Thorsten Zeh, Giuseppe Pagnoni, Gregory Berns, Clint Kilts, "A Neural Basis for Social Cooperation," Neuron 35 (2002).

7. Jorge Moll, Frank Krueger, Roland Zahn, Matteo Pardini, Ricardo de Oliveira-Souza, Jordan Grafman, "Human Fronto-Mesolimbic Networks Guide Decisions About Charitable Donations," Proceedings of the National Academy of Sciences 103:42 (2006): 15624.

8. Moll et al., 2006, 15625.

9. Sam Harris, The Moral Landscape: How Science Can Determine Human Values (New York: Free Press, 2010), 91-92.

10. Arthur Brooks, "Tea Partiers and the Spirit of Giving," The Wall Street Journal (Dec. 24, 2010). For a book length treatment of this subject, see his Who Really Cares: The Surprising Truth About Compassionate Conservatism - America's Charity Divide: Who Gives, Who Doesn't, and Why It Matters (New York: Basic Books, 2006).

11. Brooks, 2006, 110.

Monday, October 31, 2011

This Is Halloween



Prior to the rise of what Science 2.0's Hank Campbell calls today's "torture porn," what we now know as "horror films" were largely dissociated from Halloween (1931's Dracula was released on Valentine's Day). While Orson Welles' October broadcast of War of the Worlds provided the first inklings of the marriage between Halloween and Hollywood horror, it was not until John Carpenter's Halloween (1978) that the two were officially wed. The Celtic festivities of Samhain (mentioned in the Halloween sequels) had more to do with agriculture and the changing of seasons than the art of scaring. Nonetheless, the sense of the supernatural was heightened due to the belief in spirits brought on by the oncoming winter (the season being related to death and decay). These spirits were possibly kept at bay with the practice of animal or even human sacrifice (Julius Caesar wrote of the Druids' use of a wicker man), though this is difficult to prove. Despite these pagan roots, the most recognizable practices derive from the medieval Christian holy days of All Souls' and All Saints' Day. For example, the rituals of "souling" involved the baking of cakes to be distributed to relatives and the poor in return for prayers for the souls in purgatory. Many would go from door to door requesting food in exchange for prayers for the dead. This house-to-house activity included the carrying of a hollowed-out turnip, which represented a soul trapped in purgatory. The Protestant Reformation helped rid Halloween of its more Catholic elements, focusing instead on the marriage prospects of adolescents rather than those trapped in purgatory. Courting and divination practices linked to future marriages became the custom of the day. Between its changing contexts, Halloween was often a night filled with pranks and the undermining of social norms. As these disturbances became less tolerated in the early 20th century, Halloween evolved into a more familial holiday. After surviving the overblown "razor-in-the-apple" scares, the real threat of the Great Society, and the Hollywood gore-fest, the holiday continues to be a night of overturning social norms in a variety of ways (including dressing like a total slut).[1]



Still, Halloween continues its relationship with the spooky and the supernatural, inspiring numerous Halloween specials on various TV stations. As far as I'm concerned, if your Halloween night does not consist of murderous preachers, showers with schizophrenics, old-fashioned haunted houses, real-life carnies, the devil's baby shower, possessed hotel caretakers, or all of the above, then you are not doing it right.[2]


1. For a detailed treatment of Halloween's evolution and prominence in North American culture, see Nicholas Rogers, Halloween: From Pagan Ritual to Party Night (New York: Oxford University Press, 2002). Another interesting study on Halloween consumerism can be found here.

2. I admit to not doing it right. Since I will be unable to celebrate Halloween in any recognizable way due to work, I decided to read the above academic material instead.

Tuesday, October 4, 2011

Paul and the Merkabah

Paul's vision on the road to Damascus has often puzzled me. The standard telling of the story consists of a devout Pharisee, a persecutor of the Christians, who is converted through a vision of the resurrected Jesus Christ. The question that always accompanied my reading of Acts 9 concerned the catalyst of Paul's vision. My previous assessment drew comparisons to the experience of Alma the Younger: an angelic appearance or theophany brought about by the prayers and suffering of others. While this may very well be the case when it comes to Paul, I am convinced that the vision rests comfortably within the context of merkabah mysticism, a reference to Ezekiel's vision of the anthropomorphic God upon His chariot-throne.[1] As Rice University's April DeConick explains,

The centerpiece of this [priestly] cosmology is the belief that God has a "body," called the "Glory" or Kavod of YHWH. This idea grew out of the study of certain Jewish scriptures, particularly sections of Ezekiel that describe his visions of an enthroned "likeness as the appearance of a Man ('adam)," a Man who looked like "fire" with "brightness around him." This is "the appearance of the likeness of the Glory (kavod) of YHWH" (Ezek 1:28). This figure is the very manifestation of the hidden YHWH, depicted in the scriptures as an anthropomorphic figure of fire or light (see Ezek 1:27-28; 8:2; Isa 6:1-4). He presides over the created order, oftentimes seated upon his merkabah, a special throne consisting of two cherubim with wings spread over the kapporet, the lid of the ark of the covenant in the temple.[2]



Christopher Rowland, writing with Patricia Gibbons and Vicente Dobroruka, observes along similar lines:

In some forms of the interpretation of Ezek 1 the meaning of the text may have come about as the result of "seeing again" what Ezekiel saw. The visionary's own experience of what had appeared to Ezekiel becomes itself the context for a creative interpretation of the text...In some circles this led to renewed visionary experience as expounders saw again what had appeared to the prophet, but in their own way and appropriate for their own time.[3]

This not only demonstrates the power and importance of prayer in receiving revelation, but also the power and importance of the scriptures.[4]


1. See William Hamblin's lecture on the merkabah tradition in Ezekiel and its connection to the temple at David Larsen's blog.

2. April D. DeConick, "What Is Early Jewish and Christian Mysticism?" in Paradise Now: Essays on Early Jewish and Christian Mysticism, ed. April D. DeConick (Atlanta, GA: SBL, 2006), 11-12.

3. Christopher Rowland, with Patricia Gibbons and Vicente Dobroruka, "Visionary Experience in Ancient Judaism and Christianity," in Paradise Now: Essays on Early Jewish and Christian Mysticism, ed. April D. DeConick (Atlanta, GA: SBL, 2006), 56.

4. For more on Paul and his conversion, see Alan F. Segal, Paul the Convert: The Apostolate and Apostasy of Saul the Pharisee (New Haven, CT: Yale University Press, 1990).

Sunday, September 11, 2011

Ten Years Gone: The Secular and the Sacred

*Earlier this year, the American Atheists sued "over the inclusion of cross-shaped steel beams, dubbed the "World Trade Center Cross," in the exhibit at the National September 11th Memorial and Museum." The given reason was that the beams' inclusion "promotes Christianity over all other religions on public property and diminishes the civil rights of non-Christians." I find such reasoning difficult to swallow. This seems to be rooted in the tiresome debate over whether or not America is a "Christian nation" (a problematic phrase to begin with)[1] as well as the tendency for some atheists to overreact to anything that remotely comes close to mentioning Jesus.[2]



With today being the 10th anniversary of the 2001 terrorist attacks, I have reflected on the uproar over a religious symbol at Ground Zero. Of course, there are those who have tried to distinguish between the religious and secular expression of human nature. For example, in his amusing, yet shallow book God Is Not Great: How Religion Poisons Everything, author Christopher Hitchens admits, "We are not immune to the lure of wonder and mystery and awe: we have music and art and literature, and find that serious ethical dilemmas are better handled by Shakespeare and Tolstoy and Schiller and Dostoyevsky and George Eliot than in the mythical morality tales of the holy books. Literature, not scripture, sustains the mind and - since there is no other metaphor - the soul."[3] He suggests, "The loss of faith can be compensated by the newer and finer wonders that we have before us, as well as by the immersion in the near-miraculous work of Homer and Shakespeare and Milton and Tolstoy and Proust, all of which was also 'man-made' (though one sometimes wonders, as in the case of Mozart)."[4] Obviously, Mr. Hitchens misses the irony in his statement; an irony that is pointed out brilliantly by Daniel Peterson:

[Without religion], we would be without Bach's "St. Matthew Passion," Schubert's "Mass in G," Mozart's "Requiem," Vivaldi's "Gloria," Wagner's "Parzifal" and Handel's "Messiah." We'd have neither the musical compositions of John Tavener and Arvo Part nor the choral music of John Rutter. (For that matter, there wouldn't be many choirs.) Nor would we have gospel music. Dante's "Divine Comedy"? Erased. Likewise, Milton's "Paradise Lost," Chaucer's "Canterbury Tales" and Goethe's "Faust" would be gone, as would the Arthurian legends...We couldn't read Shusako Endo's "Silence," most poems of T.S. Eliot, the novels of G.K. Chesterton and Dostoevsky, or the writings of C.S. Lewis. There would be no Augustine, no Aquinas, no Kierkegaard. "Les Misérables" would make no sense. Lincoln's majestic "Second Inaugural Address" would be unthinkable.[5]

September 11, 2001 was a date in a list of many that reminded us all of the fallen nature of humankind and the fragility of life. It also reminded us of the need for redemption. The issue over the World Trade Center Cross confirms my suspicions that some Americans have forgotten the necessity and power of the sacred myth. Catholic theologian Tim Muldoon recently penned,

In studying the Western cultural tradition, it is clear to me that good societies, those comprised of people with a shared sense of purpose that spills over into care for their fellow citizens, are those built around a shared myth. In ancient Greece, there was a myth of the hero; in classical Greece, of virtue; in ancient Israel, of divine protection; in Christian Europe, of Christ the King. By "myth" here I do not mean a false story, nor do I suggest that all myths are equal. Instead I mean a story that gives shape to a culture, gives its efforts meaning, gives its people a sense of what they strive for as a community.

Despite Hitchens's praise, art more-or-less failed us in the wake of 9/11, particularly the film industry.[6] Can the symbol of the cross provide any comfort or reconciliation for the hardened secularist? I believe it can. As Oxford professor and The Lord of the Rings author J.R.R. Tolkien stated in his famous essay "On Fairy-Stories,"

The Gospels contain a fairy-story, or a story of a larger kind which embraces all the essence of fairy-stories. They contain many marvels—peculiarly artistic, beautiful, and moving: “mythical” in their perfect, self-contained significance; and among the marvels is the greatest and most complete conceivable eucatastrophe [i.e. "The Consolation of the Happy Ending"]. But this story has entered History and the primary world; the desire and aspiration of sub-creation has been raised to the fulfillment of Creation. The Birth of Christ is the eucatastrophe of Man's history. The Resurrection is the eucatastrophe of the story of the Incarnation. This story begins and ends in joy. It has pre-eminently the “inner consistency of reality.” There is no tale ever told that men would rather find was true, and none which so many sceptical men have accepted as true on its own merits. For the Art of it has the supremely convincing tone of Primary Art, that is, of Creation. To reject it leads to sadness or to wrath. It is not difficult to imagine the peculiar excitement and joy that one would feel, if any specially beautiful fairy-story were found to be “primarily” true, its narrative to be history, without thereby necessarily losing the mythical or allegorical significance that it had possessed...The joy would have exactly the same quality, if not the same degree, as the joy which the “turn” in a fairy-story gives: such joy has the very taste of primary truth...It looks forward (or backward: the direction in this regard is unimportant) to the Great Eucatastrophe. The Christian joy, the Gloria, is of the same kind; but it is pre-eminently (infinitely, if our capacity were not finite) high and joyous. But this story is supreme; and it is true. Art has been verified. God is the Lord, of angels, and of men—and of elves. Legend and History have met and fused.

The story of atonement, resurrection, redemption, and eventual justice is one that transcends cultures, nations, and ideologies. It is a reminder of the good that also paradoxically exists with the evil in this world. It is a reminder that human nature has what we could refer to as divine qualities and potential, despite its inherent weaknesses. Finally, it is a reminder that hope can triumph over current circumstances, allowing society to recover a normal balance once more.

The difference between the hope found in the Gospels and the hope found in other literary pieces is that there is the possibility that the hope of the Gospels is real.


*Yes, there is a Led Zeppelin reference in the title. And I just realized that I kind of ripped off Mircea Eliade's title The Sacred and the Profane.

1. For informative readings on religion and the American founding, see Jon Meacham, American Gospel: God, the Founding Fathers, and the Making of a Nation (New York: Random House, 2007) and David L. Holmes, The Faiths of the Founding Fathers (New York: Oxford University Press, 2006).

2. As referenced previously on my blog, Hank Campbell has an excellent article on avoiding atheist stereotypes.

3. Christopher Hitchens, God Is Not Great: How Religion Poisons Everything (New York: Twelve, 2007), 5; emphasis mine.

4. Hitchens, 2007, 151.

5. See also Peterson, "Editor's Introduction: God and Mr. Hitchens," FARMS Review 19:2 (2007).

6. It could be argued that art has been failing us for a long time given its continual rejection of sacred beauty. See Roger Scruton's Beauty (New York: Oxford University Press, 2009), BBC special "Why Beauty Matters," or City Journal article "Beauty and Desecration."

Monday, September 5, 2011

A More Perfect Union?




*While Elia Kazan’s 1954 masterpiece On the Waterfront is remembered primarily for its artistic achievement (garnering eight Academy Awards, including Best Picture and Best Actor for a young Marlon Brando), the portrayal of political corruption within unionized labor reflected a reality that was and still is all too true.[1] Kazan at this time had lost considerable respect in some circles (as well as halved director’s fees) due to his testimony before the House Un-American Activities Committee, “an organization charged with weeding out those who were sympathetic to the Communist Party.”[2] A former Communist himself, Kazan under oath identified several associates affiliated with the Communist party. The courage of Brando’s Terry Malloy to speak out against the corrupt unionists has been noted as a probable apologia for Kazan’s own political confession. However, his marginalization by those who opposed his testimony continued even late in life, as demonstrated by the crowd’s divided reactions to his Honorary Oscar in 1999. As shown in Kazan's treasured film, unions have a tendency to implement an extreme form of coercive tribalism and marginalization in order to achieve their objectives. This occurs in the form of protecting union “brothers and sisters” at the expense of virtually everyone else. To oppose their methods is to be like Adolf Hitler according to some unionists.[3]

In the crusade to “save jobs,” technological and operational progress is hindered greatly. Instead of allowing innovative steps to be taken to further productivity and company health (which usually includes more hires and increased wages), inefficient jobs are maintained for the mere sake of “saving jobs.” Rather than being means to an end, jobs become the ends themselves in the hands of the union. As GMU economist Russell Roberts explains,

The story goes that Milton Friedman was once taken to see a massive government project somewhere in Asia. Thousands of workers using shovels were building a canal. Friedman was puzzled. Why weren't there any excavators or any mechanized earth-moving equipment? A government official explained that using shovels created more jobs. Friedman's response: "Then why not use spoons instead of shovels?"

New technology leads to higher productivity and lower costs. "Those lower costs lead to lower prices as businesses compete with each other to appeal to consumers," says Roberts. "The result is a higher standard of living for consumers. The average worker has to work fewer and fewer hours to earn enough money to buy a dozen eggs or a pair of shoes or a flat-screen TV or a new car that's safer and gets better mileage than the cars of yesteryear…When it gets cheaper to make food and clothing, there are more resources and people available to create new products that didn't exist before…So many job descriptions exist today that didn't even exist 15 or 20 years ago. That's only possible when technology makes workers more productive."





Sound economic theory, however, has no bearing on the “mobster mentality” found within many unions.[4] Accountability is diffused among the collective whole, watered down until it is virtually non-existent. Individual clashes with the mob mentality often lead to threats, property destruction, and union-on-union violence. While most support the basic idea of unionized labor, one cannot help but recognize the eroding effect unions have on both businesses and the economy as a whole.

Despite the common trumpeting of the so-called Progressive Era in the early 20th century, the economic reforms of that era have been some of the most damaging in American history. These early progressives sought to "move beyond the political principles of the American founding."[5] Despite claims for social justice, what these progressive policies ended up creating was social control. This manifested itself in the exclusion of “unfit workers,” mainly blacks and immigrants.[6] The “Labor Unions” entry in The Concise Encyclopedia of Economics reads,

Economist Ray Marshall, although a prounion secretary of labor under President Jimmy Carter, made his academic reputation by documenting how unions excluded blacks from membership in the 1930s and 1940s. Marshall also wrote of incidents in which union members assaulted black workers hired to replace them during strikes. During the 1911 strike against the Illinois Central, noted Marshall, whites killed two black strikebreakers and wounded three others at McComb, Mississippi. He also noted that white strikers killed ten black firemen in 1911 because the New Orleans and Texas Pacific Railroad had granted them equal seniority. Not surprisingly, therefore, black leader Booker T. Washington opposed unions all his life, and W. E. B. DuBois called unions the greatest enemy of the black working class. Another interesting fact: the “union label” was started in the 1880s to proclaim that a product was made by white rather than yellow (Chinese) hands. More generally, union wage rates, union-backed requirements for a license to practice various occupations, and union-backed labor regulations such as the minimum wage law and the Davis-Bacon Act continue to reduce opportunities for black youths, females, and other minorities.

Many often point to unions as the source of higher wages, safer working conditions, and shorter work days. However, as explained above, it is increasing technology, wealth, and productivity that make these circumstances both plausible and sustainable. Of course, unions have been "successful" at these various activities as well. "In doing so, however, they have reduced the number of jobs available in unionized companies. That second effect occurs because of the basic law of demand: if unions successfully raise the price of labor, employers will purchase less of it. Thus, unions are a major anticompetitive force in labor markets. Their gains come at the expense of consumers, nonunion workers, the jobless, taxpayers, and owners of corporations."[7]

Many vocal activists claim that unions built the middle class. A recent study concludes that the decline in union membership has resulted in the decline of the middle class and stagnant income over the past several decades. This helps foster the myth that “the poor are getting poorer” and that unions are the only thing keeping evil corporations from turning the entire population into quasi-slaves. However, the statistics often used in these studies are highly misleading. As economist Steven Horwitz writes,
According to researchers at the University of Michigan, the poorest fifth of households in 1975 earned, on average, almost $28,000 more per year by 1991, adjusted for inflation. According to U.S. Treasury data, an astounding 86 percent of households that comprised the bottom fifth in 1979 had climbed out of poverty by 1988…The vast majority of American households do move up: The reality of the U.S. economy over the last 30 years is that everyone has gotten richer in absolute terms, and significantly so. Simply measuring income and wealth tells us very little about the lifestyle of typical Americans. For example, poor Americans today are more likely to own basic household goods like washing machines, dishwashers, color TVs, refrigerators, and toasters than the average household was in 1973.


Economist Thomas Sowell recognizes the talk of "poor getting poorer" as the fallacy of "confusing the fate of statistical categories with the fate of flesh-and-blood human beings."[8] The reason? "It is an undisputed fact that the average real income - that is, money income adjusted for inflation - of American households rose by only 6 percent over the entire period from 1969 to 1996...But it is an equally undisputed fact that the average real income per person in the United States rose by 51 percent over the very same period."[9]



As one commentator notes, “In other words, if the middle class in America has shrunk, it is only because so many formerly middle-class households have moved to the upper-income brackets, while a significant number of households previously in the lower brackets have moved up to the middle class and beyond.”
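To see how a household average can stagnate even while every individual gets richer, here is a small back-of-the-envelope calculation. The figures are hypothetical, chosen only to mimic the shape of Sowell's 6-percent-versus-51-percent comparison; the point is the arithmetic relationship, not the actual data.

```python
# Purely illustrative numbers (not from Sowell or the Michigan study):
# per-person real income rises sharply, but households also get smaller,
# so the *household* average barely moves.
income_per_person_1969 = 20_000
income_per_person_1996 = 20_000 * 1.51      # a 51% rise per person

persons_per_household_1969 = 3.0
persons_per_household_1996 = 2.1            # smaller households by 1996

household_income_1969 = income_per_person_1969 * persons_per_household_1969
household_income_1996 = income_per_person_1996 * persons_per_household_1996

growth_per_person = income_per_person_1996 / income_per_person_1969 - 1
growth_per_household = household_income_1996 / household_income_1969 - 1

print(f"Per-person growth:    {growth_per_person:.0%}")     # 51%
print(f"Per-household growth: {growth_per_household:.0%}")  # ~6%
```

If households shrink over the period (as American households in fact did), the per-household average hides most of the per-person gain.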
True growth comes from capital investment, the creation of wealth, technological advances, increased skill and education, and competition. Based on my own observations and those of more seasoned employees, historians, and economists, unions tend to engage in the very things they claim to be fighting against: exploitation and coercion. Though there is still "a role for labor unions, just as in a free market there is still a role for agents or managers who help their clients find work and negotiate their contracts,"[10] this must come without government attachments. To politicize the union does a great disservice to both the labor force and the company as a whole.[11]

With that, Happy Labor Day!

*This is an edited excerpt from the internship research paper required for my graduation.

UPDATE: Here is an informative analysis of the "rising income inequality."


NOTES

1. On the Waterfront was partly based on the Pulitzer Prize-winning New York Sun series "Crime on the Waterfront" by journalist Malcolm Johnson.

2. Daidria Curnutte, "The Politics of Art: Elia Kazan and the Scandal Over On the Waterfront," The Film Journal 2 (July 2002).

3. Ironically, the Nazi comparison backfires.

4. Economists like Charles Baird have gone so far as to call labor unions "labor cartels."

5. Ronald J. Pestritto, William J. Atto, "Introduction to American Progressivism," in American Progressivism: A Reader, eds. Pestritto, Atto (Lanham, MD: Lexington Books, 2008), 2.

6. See David E. Bernstein, Thomas C. Leonard, "Excluding Unfit Workers: Social Control Versus Social Justice in the Age of Economic Reform," Law and Contemporary Problems 72 (2009).

7. Morgan O. Reynolds, "Labor Unions," The Concise Encyclopedia of Economics, ed. David Henderson, 2nd ed. (Indianapolis, IN: Liberty Fund, 2007).

8. Thomas Sowell, Economic Facts and Fallacies, 2nd ed. (New York: Basic Books, 2011), 140.

9. Ibid., 140-141.

10. Robert P. Murphy, The Politically Incorrect Guide to Capitalism (Washington, DC: Regnery Publishing, 2007), 25.

11. For further reading, see Morgan O. Reynolds, Power and Privilege: Labor Unions in America (New York: Universe Books, 1984); Thomas E. Woods, Jr., "What Made American Wages Rise? (Hint: It Wasn't Unions or the Government.)" in his 33 Questions About American History You're Not Supposed to Ask (New York: Three Rivers Press, 2007).

Monday, August 29, 2011

Zero Population

A recent thread at Mormon Dialogue and Discussion Board mentioned this lovely piece of nostalgia:



The poster apparently shares the same concerns as these hip '80s rebels. Given the centuries-old overpopulation scare that continues even today, it is easy to wonder whether the Mormon emphasis on childbearing and family life is misguided. Are large Mormon families joining the ranks of the Beckhams as "bad role models and environmentally irresponsible"? Are we not worried about "improving family planning or reducing global inequality"? Do carbon emissions and diminishing resources mean nothing to us? Unfortunately for the alarmists, our natural resources are doing just fine. The food supply is not "shrinking away" and the oil is not "depleting."



Economist Bryan Caplan and science writer Matt Ridley have recently taken this Malthusian view and turned it on its head by claiming that increasing population is actually a good thing. Caplan writes,

The case against population is simple: Assume a fixed pie of wealth, and do the math. If every person gets an equal slice, more people imply smaller slices. The flaw in this argument is that people are producers as well as consumers. More sophisticated critics of population appeal to the diminishing marginal product of labor. As long as doubling the number of producers less than doubles total production, more people imply smaller slices. These anti-population arguments have strong intuitive appeal. But they face an awkward fact: During the last two centuries, both population and prosperity exploded. Maybe the world just enjoyed incredibly good luck, but it makes you wonder: Could rising population be a cause of rising prosperity? Yes. Economists’ central discovery about economic growth is that new ideas are more important than labor or capital. The main reason we’re richer than we used to be is that we know more than we used to know. We know how one man can grow food for hundreds. We know how to build flying machines. We know how to build iPhones. Best of all: Once one person discovers a new idea, billions can cheaply adopt it.
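Caplan's contrast can be put into a few lines of arithmetic. The numbers and the production function below are my own illustration, not his; they simply show why "more people" means smaller slices when the stock of ideas is held fixed, and larger slices when ideas grow along with the population.

```python
# Illustrative only: the figures and the Cobb-Douglas form are my own sketch,
# not Caplan's. They show the shape of the arguments, nothing more.

# 1. Fixed pie: more people means smaller slices, by definition.
pie = 1_000_000
for people in (100, 200):
    print(people, "people, slice =", pie / people)

# 2. Diminishing marginal product: doubling workers less than doubles output,
#    so output per person falls IF the stock of ideas (A) stays fixed...
def output(ideas, workers, alpha=0.7):
    return ideas * workers ** alpha

for workers in (100, 200):
    y = output(ideas=1.0, workers=workers)
    print(workers, "workers, output per worker =", round(y / workers, 3))

# 3. ...but if more people also means more ideas (here A grows with population),
#    output per person rises as population rises.
for workers in (100, 200):
    y = output(ideas=workers ** 0.5, workers=workers)
    print(workers, "workers (ideas grow too), output per worker =", round(y / workers, 3))
```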

Matt Ridley refers to this as "ideas having sex." The way ideas mate, according to Ridley, is exchange. Increased exchange and specialization (i.e., market systems) lead to increased productivity and, consequently, rising living standards. "The poor in the developing world grew their consumption twice as fast as the world as a whole between 1980 and 2000…Despite a doubling of the world population, even the raw number of people living in absolute poverty...has fallen since the 1950s. The percentage living in such absolute poverty has dropped by more than half to less than 18 percent."[1] In fact, it was labor's shift from the family to the market that made it plausible for couples to consider having fewer children.[2] As Ridley notes, "Human beings are a species that stops its own population expansions once the division of labour reaches the point at which individuals are all trading goods and services with each other, rather than trying to be self-sufficient. The more interdependent and well-off we become, the more population will stabilise well within the resources of the planet...Most economists are now more worried about the effects of imploding populations than they are about exploding ones."[3] Birth rates worldwide have been falling since the 1960s, with "the raw number of new people added each year...falling since the late 1980s" (this is across all cultures, including Mormons).[4]



As of now, if the entire world population were to live at a density equal to that of New York City, we could all fit here in Texas. The Economist has estimated that each subsequent billion will take slightly longer to reach and that global population growth will plateau at about nine billion people in 2050. "A 2003 assessment by the United Nations concurs," writes RealClearScience editor Alex Berezow. "The UN projects, under its medium-growth scenario, that the human population will remain relatively stable at 9 billion until the year 2300." Wheaton business professor Seth Norton has documented the effects of liberalized economic institutions on fertility rates: high economic freedom and the rule of law (and thus higher economic growth) tend to lead to lower fertility rates.[5] Or, as Reason's Ronald Bailey summarizes, "economic freedom actually generates an invisible hand of population control." It was this kind of evidence that convinced business professor and economist Julian Simon decades ago to make a wager with alarmists: "select any raw material you wanted - copper, tin, whatever - and select any date in the future, 'any date more than a year away,' and Simon would bet that the commodity's price on that date would be lower than what it was at the time of the wager."
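Before getting to that wager, the Texas claim is easy to check with rough, round figures. The numbers below are approximate 2011-era values I am supplying for illustration; they are not taken from the linked sources.

```python
# Rough, round figures supplied for illustration (not from the linked sources):
nyc_people = 8_200_000          # approx. New York City population
nyc_area_sq_mi = 305            # approx. NYC land area in square miles
texas_area_sq_mi = 261_000      # approx. Texas land area in square miles
world_population = 7_000_000_000

nyc_density = nyc_people / nyc_area_sq_mi          # people per square mile
texas_capacity = nyc_density * texas_area_sq_mi    # people Texas could hold at NYC density

print(f"NYC density: {nyc_density:,.0f} people/sq mi")
print(f"Texas at NYC density: {texas_capacity:,.0f} people")
print("Fits the world population?", texas_capacity >= world_population)
```

At roughly 27,000 people per square mile, Texas's land area comes out to a capacity on the order of seven billion people.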

[Environmentalist Paul] Ehrlich and his colleagues picked five metals that they thought would undergo big price rises: chromium, copper, nickel, tin, and tungsten. Then, on paper, they bought $200 worth of each, for a total bet of $1,000, using the prices on September 29, 1980, as an index. They designated September 29, 1990, 10 years hence, as the payoff date. If the inflation-adjusted prices of the various metals rose in the interim, Simon would pay Ehrlich the combined difference; if the prices fell, Ehrlich et alia would pay Simon. Then they sat back and waited. Between 1980 and 1990, the world's population grew by more than 800 million, the largest increase in one decade in all of history. But by September 1990, without a single exception, the price of each of Ehrlich's selected metals had fallen, and in some cases had dropped through the floor. Chrome, which had sold for $3.90 a pound in 1980, was down to $3.70 in 1990. Tin, which was $8.72 a pound in 1980, was down to $3.88 a decade later. Which is how it came to pass that in October 1990, Paul Ehrlich mailed Julian Simon a check for $576.07. A more perfect resolution of the Ehrlich-Simon debate could not be imagined. All of the former's grim predictions had been decisively overturned by events. Ehrlich was wrong about higher natural resource prices, about "famines of unbelievable proportions" occurring by 1975, about "hundreds of millions of people starving to death" in the 1970s and '80s, about the world "entering a genuine age of scarcity."
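To make the mechanics of the settlement concrete, here is the arithmetic for a single leg of the bet, using the tin prices quoted above. For simplicity this sketch ignores the inflation adjustment the actual wager used, so it will not reproduce the $576.07 figure; it only shows how each $200 position was scored.

```python
# Settlement logic for one leg of the Simon-Ehrlich bet, using the tin prices
# quoted above. The real wager used inflation-adjusted prices; this sketch does not.
stake = 200.00                     # dollars of tin "bought" on paper in 1980
price_1980 = 8.72                  # $/lb of tin, Sept. 29, 1980
price_1990 = 3.88                  # $/lb of tin, Sept. 29, 1990

pounds_held = stake / price_1980   # ~22.9 lb of tin on paper
value_1990 = pounds_held * price_1990

change = value_1990 - stake        # negative -> Ehrlich owes Simon on this leg
print(f"Tin position in 1990: ${value_1990:.2f} (change: ${change:.2f})")

# The payoff was the combined change across all five metals; since every metal
# fell in real terms, Ehrlich owed Simon the total, which came to $576.07.
```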

This brings us to the final point, which is made perfectly by Caplan: "Libertarians could celebrate these changes as proof that the problem of overpopulation solves itself whether or not governments do anything about it. But if Julian Simon and the intellectual tradition he inspired were right, libertarians should be experiencing severe cognitive dissonance." We should continue to want children despite our economic success (a success that allows us to actually afford more children). One does not have to take the Tiger Mom approach of "no time, no bucks, no fun" (Caplan has been called the "Anti-Tiger Mom"). Having children should be encouraged for both its social and private benefits, as Caplan has recently argued.[6] Not to mention that it appears to be the first commandment given following the Creation.

Be fruitful, and multiply, and replenish the earth... (Genesis 1:28)




UPDATE: Ronald Bailey's latest (Aug. 30, 2011) Reason article reviews the recent data on trade, prosperity, and fertility. He concludes, "In the modern era, trade liberalization promotes a virtuous cycle that boosts incomes, raising the value of education, resulting in the protection of women’s rights, and eventually inducing a fall in fertility rates." Also, thanks to Tyler Anderson for pointing out Elder Nelson's comments on overpopulation. Mike Gallagher has an informative article over at the BBC, while Matt Ridley and Nicholas Eberstadt chime in on reaching the 7 billion mark. A Catholic philosopher weighs in on Caplan's theory, while another stresses fact checking.

NOTES

1. Matt Ridley, The Rational Optimist: How Prosperity Evolves (New York: HarperCollins, 2010), 15.

2. For an overview of the effects of capitalism on the family structure, see Steven Horwitz, "Capitalism and the Family," The Freeman 57:6 (July/Aug 2007).

3. Ridley, 2010, 211.

4. Ridley, 2010, 205.

5. Seth Norton, "Population Growth, Economic Freedom, and the Rule of Law," PERC Policy Series PS-24 (Feb. 2002).

6. Caplan's newest book is Selfish Reasons to Have More Kids: Why Being a Great Parent Is Less Work and More Fun Than You Think (New York: Basic Books, 2011).

Wednesday, August 24, 2011

Casting Lots: Revelatory Divination

And they prayed, and said, Thou, Lord, which knowest the hearts of all men, shew whether of these two thou hast chosen...And they gave forth their lots; and the lot fell upon Matthias; and he was numbered with the eleven apostles. (Acts 1:24, 26)

This concept of casting lots at times bothers today's Latter-day Saints. Attempts are often made to reconcile the "giving forth of lots" with modern apostolic practices. However, this anachronistic approach to the scriptures distorts the original setting and clouds proper understanding. Elder Neal A. Maxwell said it best: "Only by searching the scriptures, not using them occasionally as quote books, can we begin to understand the implications as well as the declarations of the gospel." New Testament scholar Ben Witherington III explains this strange event in Acts 1 as follows:

This process [i.e. casting lots] for determining God's will was traditional in Judaism (cf. Lev. 16:8; Num. 26:55; Jon. 1:7-8; 1QS 5:3, 6:16), and there is probably no implied criticism of it by Luke, though scholars have often contrasted this story with those which follow Pentecost where the guidance of the Spirit is relied upon. Clearly, Luke thinks the choice here (and so presumably the method) valid for its day - the disciples could not be criticized for not relying on a source of power and discernment they had not yet received. The process was likely the same as we see in 1 Chron. 26:13-14 - stones in some way marked to distinguish them were placed in a container or jar and shaken until one came out, in this case the one that represented Matthias.[1]

What exactly is a lot? "A lot is an object used as a counter in determining a question by chance. To select by lot may include rolling dice or picking straws and the like. The prevailing wisdom was that "the lot is cast into the lap, but the decision is the Lord's alone" (Prov 16:33). The point is that God was behind the apparent randomness of the lot. For example, Saul was chosen the first king of mythical Israel by lot (1 Sam 14:41-42: the devices for casting lots here are called Urim and Thummim)."[2]

Cultural practices and influences should be explored, not ignored.[3]


NOTES


1. Ben Witherington III, The Acts of the Apostles: A Socio-Rhetorical Commentary (Grand Rapids, MI: Eerdmans, 1998), 125-126.

2. Bruce J. Malina, John J. Pilch, Social-Science Commentary on the Book of Acts (Minneapolis, MN: Fortress Press, 2008), 26-27. It should be noted that some scholars disagree with the simplistic comparison of the Urim and Thummim to casting lots. TB Yoma 21b states that the Urim and Thummim along with the spirit of prophecy that accompanied them were missing from the Second Temple. The description of the Urim and Thummim found within the biblical texts includes them being "attached to the high priest’s breastplate which hung from his ephod-apron by gemstone buttons on his shoulders (Exodus 28:28-30; Lev. 8:8). With the urim and thummim attached, the breastplate becomes the breastplate of judgment...and the entire ephod becomes a method for accessing the divine will, a method of prophecy (Num. 27:18-21). These biblical sections are customarily assigned to P, which is usually considered second temple. The urim and thummim appear more frequently in the LXX and the Samaritan Pentateuch than they do in the MT. Rofé suggests therefore that several references to them have been expunged from the MT." (Lisbeth S. Fried, "Did the Second Temple High Priests Possess the Urim and Thummim?" The Journal of Hebrew Scriptures 7:3, 2007: 4-5)

3. Cultural exploration and research within the biblical writings might also help ease one's discovery of the folk-magic, Masonic, and other 19th-century influences on the early Saints and their revelations.