Tag Archives: education

Souls in a Petri Dish (Review: Letter to a Christian Nation)

24 May

“Atheists are the most reviled minority in America.”

Sam Harris has it exactly right.  Polls—even some taken shortly after 9/11—show that the majority of Americans would rather have a Muslim president than one who doesn’t believe in any God at all.  Maybe that seems hard to believe when we think back to the horror over our current President’s highly suspicious middle name, but the numbers bear it out.  Atheists aren’t likely to achieve high office.

Maybe that’s why Thomas Paine, one of the most famous nonbelievers in American history, is the most notable of our founding fathers not to have a monument.  He doesn’t even get a mention in the Valley Forge segment of the recent History Channel documentary America: The Story of Us (which is otherwise both moving and surprisingly objective).  George Washington thought the political pamphleteer important and inspiring enough to have his words read to his starving, freezing men at Valley Forge (and thus keep the army together through a terrible winter)—but this isn’t the Age of Reason anymore.

In high school, I was nostalgic for the political pamphlets of Thomas Paine rallying patriots to the Revolutionary cause.  Nostalgic not because I’d been there (though I’m still holding out for time travel), but because today’s political debates involve far more mudslinging and snide soundbites than meaningful debate, and because—to someone who compliments acquaintances on brilliant extended metaphors in emails and cries after every re-reading of Plato’s Phaedo—good rhetoric is so, so hard to find.

Especially on the issue of religion and faith.  On a small scale, the University of Alabama club “triple-A,” Alabama Atheists and Agnostics, had its chalking vandalized by devout Southern Christians about half a dozen times this past year.  Pouring slushies on a chalk portrait of Darwin is the college equivalent of shutting down intellectual debate, I guess—and that shutdown is something atheists face all over the “Christian nation” of the United States.

I can’t help but have a wonderful time reading the gleefully irreverent Christopher Hitchens.  As might be expected, I can’t say the same for my ex-roommate at UA, who never looked at me the same after she found God Is Not Great: How Religion Poisons Everything on my Kindle.  But Sam Harris’s Letter to a Christian Nation satisfied my sentimental longing for Thomas Paine-esque writing, and then some.  His short book—more a manifesto—echoes Paine’s celebrated Age of Reason in that he’s not on the defensive.  Harris explains that the New Atheism isn’t just a negative (not believing in God): it’s about a positive too, belief in science and reason.

Last fall, I awarded Thomas Paine the Scattering’s premier literary award—the Heretic Badge of Honor—for his 1794 Age of Reason.  Today, I’m awarding the Heretic Badge to Sam Harris for Letter to a Christian Nation, for taking up the torch.  He writes in his conclusion, after all, that:

“This letter is the product of failure—the failure of the many brilliant attacks upon religion that preceded it, the failure of our schools to announce the death of God in a way that each generation can understand, the failure of the media to criticize the abject religious certainties of our public figures—failures great and small that have kept almost every society on this earth muddling over God and despising those who muddle differently.”

In his letter, the New Atheist does revive for a modern audience some ideas that strongly reminded me of past doubters.  The foundation of atheism, he argues, is a scientific mindset, but that might mean something different from what many people expect:

“The core of science is not controlled experiment or mathematical modeling; it is intellectual honesty.  It is time we acknowledged a basic feature of human discourse: when considering the truth of a proposition, one is either engaged in an honest appraisal of the evidence and logical arguments, or one isn’t.”

Atheists don’t revile God (although, as Harris points out, there’s plenty of cause to do just that)—we respect rationality.  That’s the scientific mindset.

This definition of “intellectual honesty” struck me as particularly reminiscent of Thomas Paine’s Age of Reason, in which he wrote my favorite 18th-century quote:

“It is necessary to the happiness of man that he be mentally faithful to himself. Infidelity does not consist in believing or disbelieving; it consists in professing to believe what he does not believe.”

Or, for Harris, in professing to believe what is impossible to believe in modern society, with so much scientific evidence stacked against the need for a God.  It’s as incongruous as, well—to use one of my favorite expressions from the Letter—“souls in a Petri dish.”

I won’t go into detail on Harris’s arguments, because I couldn’t begin to write more clearly or concisely than he does in Letter to a Christian Nation.  And personally, I wonder how many of the Christians the book’s addressed to will actually read it—but for those who do or are considering it, let me say that while it’s bold and certainly controversial, it’s written in some of the clearest, most logical prose I’ve ever read.  It’s accessible, and written more to persuade than to inflame (as some of Hitchens’s writings do).

The book’s only 900 locations on the Kindle (as opposed to the 5,000-8,000 of the average novel), so I’d place it at about 100 pages.  In any case, it’s a one-afternoon read.  So head on outside on this beautiful summer (is it summer yet?  I never really know) day, relax in the sun, wear a hat or a beekeeper’s veil if you’re easily sunburned, listen to the rustle of leaves in the wind and insects buzzing in the grass, and remember that you can thank evolution for it all—not God.

Happy Pentecost, everyone!

Letter to a Christian Nation is available in paperback as well as an ebook on Amazon for $8.64.

Gender and Power in Classical Japan (part 1 of 3)

11 May

Though the Nara period of the 8th century saw the adoption of Confucian values and ideals into Japanese society, and in particular their application to gender roles and the status of women, the cultural link between China and the “land of Wa” (Murphey 208) had already been forged a century before.

While trade has, in most parts of the world and periods of history, been an effective, if mainly passive and gradual, means of cross-cultural contact, Prince Shotoku of the Yamato court set the precedent for actively seeking out cultural exchange: in the early 7th century, Shotoku sent the first large-scale, official embassies to China from Japan, “determined to tap the riches of Chinese civilization at their source and to bring back to Japan everything they could learn or transplant” (Murphey 213).

In these centrally-planned delegations, traders were generally replaced by emissaries who could best bring back cultural, rather than material, commodities: students, scholars, artists, and monks, among others.  But while these delegations early on established a pattern for vibrant cultural exchange, the Sinification of Japanese institutions, seen most clearly during the Nara period, resulted in an abandonment of indigenous matriarchal traditions for new legal codes and societal values which eroded female authority.

One of the earliest accounts of Japanese culture and governance comes from the Chinese: the Account of the Three Kingdoms, believed to have been written about 290 CE, describes Japanese society as a collection of “clans… some ruled by kings and some by queens” (Murphey 208).

At this early period of decentralized clans, therefore, a patriarchal system had yet to achieve cultural hegemony.

Women, the Chinese record indicates, also played a significant role in important divinatory and ritualistic matters.  According to the Account of the Three Kingdoms, one prominent local leader was “an unmarried queen who as a kind of high priestess ruled over several ‘kingdoms,’ or clans, and was considered important enough to have one of the largest tombs and mounds erected for her on her death” (Murphey 208).

These tombs, along with the relatively crude clay haniwa figurines found in and around them, are believed to date to as far back as the 3rd century CE—the presence of haniwa pottery depicting female shamans indicates that the Chinese account of a priestess-queen was not an anomaly, but instead reveals matriarchy as a relatively common pattern in early Japanese society.

Early legends and mythology, too, highlight the significance of women in Japanese religious tradition, at least before the introduction of Buddhism and, later, Confucianism.

Though rulers used the Chinese honorific of “emperor” even at the first formation of a Japanese state from the previously independent clans, these leaders nonetheless rejected the predominantly secular basis of Chinese government and established instead a principle of divine kingship.  While Chinese emperors legitimized their reigns by claiming to rule by the “Mandate of Heaven,” tianming, this was an essentially political, not religious, term—in Confucius and the Analects, tianming is also translated as “what is ordained by Heaven” (Analects 2:4).

By contrast, Japanese emperors claimed direct descent from heaven, not simply the wisdom to act in accordance with its wishes.  Legitimacy for Japanese rulers was based on descent from the Sun Line, those who “allegedly descended directly from the sun goddess Amaterasu… the titular deity of Japan” (Murphey 206).  Significantly, this mythical founder of the Japanese royal line is a woman.

This is an excerpt from a paper written for a University of Alabama Asian Civ course.  My sources will be included at the end of part 3.  This is a subtle reminder to please cite your own sources if using any of the info here—“subtle” being defined as obnoxiously obvious.

G is for Grande

16 Apr

I started chortling in class yesterday during a lecture on Ignatius Loyola’s mystical theology.  My professor, who already thinks I’m strange since I interned at the Ayn Rand Institute last summer and have a habit of tearing up during lectures in which heretics and other historical figures are executed, looked at me sharply.

“Morales!  What’s so amusing?”

I explained that, many years ago, I was told that the Bene Gesserit of Frank Herbert’s Dune series had been based off the Jesuit order, and that I’d been thinking of that the whole time I was reading The Spiritual Exercises of St. Ignatius Loyola. Dr. M—- asked me to elaborate.

After I explained that the Jesuits evolved into a far-future, all-female religious order that controls education throughout the universe, masterminds a creepy eugenics program (until the Worm Who Is God comes around), literally has a collective consciousness, and plants the seeds of religious messianism everywhere it goes, everyone in the class (including my professor) was laughing, and Morales got some extra weird points.

In any case, the unusual jocularity the above incident unleashed carried over through the rest of the period, during which everything about Ignatius Loyola seemed absolutely hilarious to absolutely everyone.  I believe I must be a miracle-worker, as Loyola’s Spiritual Exercises are almost as terrifying as Calvin’s Institutes of the Christian Religion.  And we all know how much Calvin frightens me.

I really have no desire to get into epistemology today, but briefly: Iggy (that’s what all the cool History majors call him) embodies an interesting paradox in Counter-Reformation theology.  He was influenced by the contemporary currents of Spanish mysticism, emphasizing a personal piety and deep emotional connection with God, but was also very, very, very institutional.  Personal prayer did not, for Loyola, make the Catholic Church unnecessary.  Example:

If we wish to be sure that we are right in all things, we should always be ready to accept this principle: I will believe that the white I see is black, if the hierarchical Church so defines it.

In this way, he imbibed mystical ideas without taking the Protestant route toward private interpretation of Scripture.  But then, unlike Luther, Calvin, and co., he was an ex-military man (the Exercises abound with references to being a spiritual “knight”) and highly valued order and discipline.

That’s the theme of the first week of the Exercises, in the “Particular Examination of Conscience” (and the spark for more outrageous hilarity in Dr. M—-’s class).  He begins with a chart that looks something like this:

G
G
G
G
G
G
G

[I feel like this is becoming one of those dreadful “Read to the Bottom!” emails.  Alas.]

The idea is that each “G” stands for a day of the week.  On Sunday, the largest G, the person making the exercises chooses a particular sin.  Dr. M—-’s example: being late to class (said as he skewered the young man in front of me with a glance).  For months this semester, I wasn’t even sure why he bothered coming to class at all, considering he was always late, blocked my view of the whiteboard, and spent the entire period “taking notes” on his laptop… otherwise known as visiting Facebook and Star Trek forums.  But then I saw him set up GarageBand to record the lecture, and everything fell into place.

Each day, then, the young Trekkie in front of me would tally up how many times he was late to class—on Sunday, then Monday, then Tuesday, and so on.  The Gs get smaller each day because a truly disciplined spiritual knight would be improving throughout the week.

All very logical, and possibly the inspiration for Benjamin Franklin’s outrageous moral perfection chart some centuries later.

The only thing that bothered our class was the question of what—for the love of God—the “G” stood for.

Dr. M—- suggested giorno, the Italian word for day; but of course Loyola wrote in Spanish.  Italian and Spanish have many cognates, but this, unfortunately, is not one of them.  We racked our brains but couldn’t think of anything else.  Dr. M—- and I chortled like co-conspirators.

Indeed, I determined I would find out, if I had to tear apart the entire Internet to do it.

Getting back to my dorm, I wondered if perhaps G stood for God.  Maybe in the original Spanish, the letter had been D for Dios, and the translators changed it.  With this happy thought in mind, I began to comb Spanish Jesuit websites for the original text.  No good.  The original Spanish had “G” too, and seemingly no reason for it.  I despaired that this was to be forever a divine mystery in my mind, tormenting and mocking me.  I would never sing the alphabet song with fondness ever again.

But then I found another copy of the Spanish text online, and—miracle of miracles!—it included the annotation that G is for grande (Spanish for big, or large).  This actually makes perfect sense, since the tally of sins is supposed to be larger on the first day than on the second, and so on down to the seventh.

Thus I have written this post for the other despairing students out there who wonder what the heck was going through dear Iggy’s mind (too much self-flagellation that day, I suppose).  And to make things even easier, I’ll include some phrases said students might search:

Ignatius Loyola Spiritual Exercises “G”

What does “G” stand for in the Spiritual Exercises?

Particular Examination of Conscience “G”

Friggin confusing directions Loyola

Hopefully, this will be of help.

Medieval Innovators

6 Apr

We have a somewhat iconoclastic culture.

There’s not much in America so sacred or time-honored that it escapes the gleefully malicious satirists among us.

Here’s an example for all you Catholics out there, since we know the Pope got some serious bad PR this last Lenten season.  From mathematician and satirist Tom Lehrer, circa 1965: “The Vatican Rag.”

The American humorist par excellence, Mark Twain, made a name for himself poking fun at the upper crust of New England society in The Innocents Abroad—a travelogue of his time as a subversive among the reputables on the first U.S. cruise vacation.  (Neither priest, nor Parisian, nor any passenger escaped his pen.)

H.L. Mencken—probably best known for covering the Scopes “Monkey Trial” (see E.K. Hornbeck in Inherit the Wind)—got everyone else.

And hitting on the most incendiary issues of today, a rebel vlogger across the pond even dared to criticize that most vaunted book—Twilight.

I’m of the opinion that this is just one facet of a larger cultural trend—loving the innovator.

If there’s anything Americans don’t value—save the Bill of Rights, in theory—it’s tradition.  Nothing’s off limits, and nothing’s too far (see the 1960s).  Nobody’s looking backwards; it’s all about what’s new.

Why was there so much hype about the iPad, anyway?  I was hearing awed whispers about the fabled “Apple Tablet” long before the newest tool in Steve Jobs’s plot to take over the world was unveiled.  First-weekend sales weren’t as high as expected, after all, and I’m still crossing my fingers for Amazon to win the ebook/ereader war (mainly because I already have a Kindle…)

In any case—Steve Jobs has a reputation as an innovator, the technological visionary of the new millennium, and possibly for the rest of time if he sticks around for the Singularity.  We want ingenuity, creativity, anything and everything new.

I’m all for forward ho!—and thoroughly looking forward to the “Nerd Rapture,” as I’m told those heathen Luddite doubters call the Singularity (just wait—they’ll be sorry when they’re not among the elect)—but it’s still fascinating to look back and see just how different things were way back when.

[I realize that my last post was about consistency throughout history—well guess what? There’s also chaos!  And vast cultural changes.  The past is an alien, alien place, and pretty scary sometimes.]

In late-medieval Europe, mid-16th century—also known as the Reformation, or, as I like to call it, “That-Time-When-Martin-Luther’s-Mental-Breakdown-Led-to-a-Theological-Revolution-and-Everything-Went-Effing-Crazy”—reformers were tripping over each other trying to argue that they weren’t innovators.  “Innovation,” in fact, was a pejorative word.

Strange, because John Calvin’s name is almost synonymous with the (literal) iconoclasm of the period.  Down with superstition!  With empty ritual and the cult of saints!  Down with papal tyranny, false prophets and false sacraments!  That sounds pretty radical—and of course, it was.  But just as “innovation” meant something different then, so did “tradition.”

Survivors of the Catholic school system or of the Confraternity of Christian Doctrine classes (aka, CCD, or those kids who messed up everything in our desks) will know that the Catholic Church bases its authority on two things: Scripture, and Tradition.  Scripture’s pretty obvious—it’s that book with the cross on it.  Tradition, however, isn’t contained in any one book.  It’s the theory and praxis of centuries (millennia?) of institutional development: papal bulls and decrees and encyclicals, ceremonies and ritual, writings of Church fathers and Church doctors (some women in there, surprisingly).

This is the tradition the Protestant reformers were Protestanting against.  For Luther and the gang, Scripture was the only basis for orthodoxy—anything else was the work of the papists trying to ensnare you and your property.

[I’ll let Karlstadt and Eck debate it out at Leipzig—as I’ve said, my salvation lies in the prophecies of Ray Kurzweil.]

And yet, despite their tirades against the tradition of the papacy, Protestant reformers made very clear that they had a tradition too—that of the original, first-centuries Apostolic church.

John Calvin’s Response to Sadoleto is a great example.  The Reader’s Digest version:

Where last we left off, Calvin was fleeing Paris under suspicion of heresy and being a really good speechwriter.  He headed from Paris to Basel (in Switzerland), then from Basel to Italy (where he hung out with a pro-Protestant duchess and got a nice suntan), then back to Basel again, from Basel to Paris, and Paris to Strasbourg—or that was the plan.  The French Valois dynasty was in the middle of a war with (of course) the Spanish Hapsburgs, and so Calvin was detoured into a city on the edge of a lake, near the French border: Geneva.

So it’s summer 1536 and Calvin arrives, gets bullied by fellow predestinarian William Farel into staying, and the two of them take over the city.  Or, kind of.  The city council and local Genevans start getting cold feet about the whole crypto-theocracy thing, so they throw Calvin and Farel out of town (a couple years later they invite them back and everything goes to hell, but that’s another story—and Calvin would probably argue that everyone went to heaven.  Ah well).

After Calvin got booted, the clever Catholic Cardinal Sadoleto decided it was his time to jump back into the fray and try to bring the city back from the edge, back into the fold of the Mother Church.  Basically, he wrote an open letter encouraging a return to Catholicism, arguing that Calvin and co. were nothing more than a bunch of (wait for it…) innovators.

Code for: no respect for authority, for sanctity, for tradition.  The Catholic Church had been around, after all, for 1,500 years.  That’s a long time, and a big rival for lone men like Luther and Calvin to take on with only the strength of their consciences and heretical vernacular bibles.

But though the city had turned out Calvin, it hadn’t turned its back on Protestantism, and it searched in vain for someone to respond to the cardinal’s claims.  I imagine you can guess that those fickle, fickle Genevans turned back to Calvin.  And Calvin, being a genius and a really fast writer, got his reply out in a matter of days—including possibly the best medieval burn I’ve read this semester:

You call us crafty men, enemies of Christian unity and peace, innovators on things ancient and well-established, seditious, alike pestiferous to souls, and destructive both publicly and privately to society at large.  I am unwilling, however, to dwell on each of these points.

But… You know, Sadoleto, and if you venture to deny, I will make it palpable to all that you knew yet cunningly and craftily disguised the fact, not only that our agreement with antiquity is far closer than yours, but that we have attempted to renew that ancient form of the church which, at first sullied and distorted by illiterate men of indifferent character, was afterward flagitiously mangled and almost destroyed by the Roman pontiff and his faction.

First of all—how awesome is it to accuse someone’s colleagues of being illiterate and then throw down a word like flagitiously.  Ouch.  That’s almost as harsh as double predestination itself.

But most importantly—innovation was clearly not desirable.

And yet, like it or not, the times they were a’changin’.  A very literate populace devoured the letters as fast as they could be printed.  Of course, there’s a certain delicious irony in denouncing innovation by means of the most influential invention in human history (excepting fire…maybe): the printing press.

I’ll leave you with the words of Thomas Cromwell, via James Frain and The Tudors: “It’s called a printing press, my Lord, and it will change the world.”

Pregnant on Primetime

24 Jan

After Lifetime’s debut of the original movie The Pregnancy Pact, the verdict is in: America has a cultural fascination with sensational pregnancies—and the more controversial, the better (remember Octomom?).

The timeline could be said to start on April 10, 2007, with the premiere of Discovery Health’s (later, TLC’s) Jon & Kate Plus 8, which followed the beleaguered parents of a pair of twins and a set of sextuplets.  The late-2009 divorce made headlines everywhere; meanwhile, viewership skyrocketed to 10.6 million for the episode announcing the couple’s separation—talk about sensation.

Next came 17 Kids and Counting (as of today, the count’s up to 19) in September 2008, documenting the surprisingly tranquil Duggar household (the only marital problem I can posit seems to be the ongoing quest for more names starting with “J” for the ever-increasing Duggar brood).  A handful of one-hour specials aired starting in 2004, but publicity shot up once Michelle Duggar’s number of kids could be rounded up to 20.  (Just one more, Mom…)

On the fictional front, The Secret Life of the American Teenager brought teen pregnancy to ABC Family in a big way.  With a middle-aged Molly Ringwald, a theme song titled “Let’s Do It,” and possibly the most obnoxiously didactic dialogue I’ve ever heard, “Secret Life” succeeds 7th Heaven (no surprise there: they had the same producer) as the only show on primetime to exceed The 700 Club in painful moralizing—the best way I can describe the cast is as sermons with legs.  From the New York Times: “‘Secret Life’ doesn’t take the fun out of teenage pregnancy, it takes the fun out of television.”

To be honest, I want the three minutes of my life back that I spent writing that paragraph above—the program is probably the worst show you’re not watching (at least, I sincerely hope you’re not watching).  The reason I mention it is that “Secret Life,” despite its abysmal reviews, still has millions of viewers.  The only explanation I can offer is the guilty pleasure hypothesis: pregnancy is something of a taboo subject in a country where the spirit of Puritanism yet lingers around social discussions (or lack thereof) about sexuality.

In any case, summer 2009 saw MTV’s 16 and Pregnant bring some reality to the issue of teenage pregnancy—the spin-off series Teen Mom continues to document the lives of the young women from the first series.  And for every insult I paid “Secret Life,” Teen Mom gets a thumbs-up (about the number of thumbs the Duggar kids have all told).  Three of the four young mothers are exceptionally admirable (one still insists on going out clubbing all night, but 3 out of 4 ain’t bad):

Maci devotes every ounce of attention to her son Bentley, even when her (loser) fiancé doesn’t lift a finger to help; she’s about as responsible as I can imagine any mother of any age to be.  Amber doggedly pursues her GED while caring for baby Leah; and Catelynn made the tough choice of giving her daughter up for adoption with considerable grace.  The hardships of being a teen mom come across loud and clear—but the heroines are a lot more admirable than Molly Ringwald’s fictional daughter.

Which brings us to the most recent depiction of sensational pregnancy on television: Lifetime’s The Pregnancy Pact, based off of the June 2008 news story about a supposed “pact” among 18 girls at a Gloucester, MA high school to get pregnant and raise their kids together.  While the existence of such an agreement was never proven, the media had a field day, capitalizing on the cultural fixation that makes us shake our heads in disapproval while staying glued to the television screen.

Lifetime’s fictionalized account raises the issue of this sort of hypocrisy and more.  I was personally pleased to see a blogger-turned-investigative-reporter take center stage, but that’s ultimately less important than the interesting question The Pregnancy Pact raises about America’s sometimes-contradictory cultural values.

Sara, our protagonist, is the daughter of no less than the Family Values Committee president, and it shows—“All I need to make me happy is to get married and have kids.  That’s all I want,” she insists, when blogger Sidney (who was the little sister in Hocus Pocus the last time I saw her, by the way) questions why such a bright girl would want to get pregnant so young.

It seems like a contradiction on Sara’s part, to be so traditional in her values and yet ignore the injunction to abstain, but there’s just as glaring a paradox in the rhetoric of groups like the Gloucester “Family Values Committee.”  Unmarried pregnancy is a mistake, but babies are a gift from God?  Something of a ‘hate the sin, love the sinner’ sort of outlook, I’d imagine, but confusing nonetheless.  It confuses Sara and her Gloucester friends, at least.

The Pregnancy Pact does get a bit pedagogical toward the end, but I’m willing to forgive in this case—the message isn’t the “Secret Life” sort of Teenage Pregnancy is Bad; Don’t Have Sex.  Nor is it simply a blanket call for contraceptives in schools.  In this case, the idea is a little more subtle:

“I’m beginning to see that talking about this is a good thing,” Sidney’s ex-boyfriend admits at one point.

And I’m of the opinion that this sort of attitude applies just as strongly outside the Gloucester city limits—as long as we’re watching these shows behind closed doors, we might as well admit what TV producers already know: we need a cultural outlet for dialogue about sex, pregnancy, and the things society tells us are aberrant.  At the risk of being didactic myself, I’ll quote Lifetime’s crusading blogger Sidney Bloom’s response:

“What we need to do now is have a real conversation.”

It may not be a coincidence that the mysteriously exploding pregnancy rate happened in Gloucester, after all—old Nat Hawthorne set The Scarlet Letter in Massachusetts for a reason: that Puritan mindset doesn’t die easily.

Natural Skeptics: Kids and the Santa Myth

5 Dec

It’s the most wonderful time of the year.  And for a lot of people—when it comes to young children—“wonder” is the key word.  Nothing captures the magic of childhood Christmases like memories of waking up in the morning to find that, somehow, while you were sleeping, Santa Claus arrived, ate the cookies and milk you left for him (giving Rudolph the carrots, of course), and filled the room with presents.

(If I wanted to be flippant, I’d say that there’s nothing more wonderful than an overweight man breaking into one’s house through as innocuous a feature as a chimney, but in the spirit of the holidays, I won’t mention such a thing.)

Those memories are tinged with nostalgia for the world-weary adults (and in this case, ‘adult’ can mean eight-year-olds) whose ideals are eventually shattered by the knowledge that Santa Claus is Mom or Dad tiptoeing around downstairs after ushering in sleep with warm milk and soporific poems about sugarplums (what the heck are those?) and mice not stirring in the house (I’d hope so).

But the subsequent disillusionment doesn’t seem to prevent ultimately looking back on those memories with fondness—the carefree days watching cartoons on Saturday mornings and believing in something magical.

And there’s a lot of good in the Santa myth—it’s about joy, justice (being rewarded for meritorious behavior), and good will.  He’s jolly; he enjoys American commerce and gastronomy; and his mode of transportation hardly leaves a carbon footprint.  How could there possibly be a downside?

For one thing, it’s a lie.  Even in the service of magic and childhood wonder, it’s dishonest, and sets a precedent—the lies have to grow.

Kids are natural skeptics: they drive parents crazy with the constant “why?”  I distinctly remember the dreadful time before I learned to read—jealous of my older sister’s lexicographic skills, I would scribble in a notebook and pretend it was my diary, but when I looked back on the pages, I couldn’t remember what my “sentences” were supposed to mean.  And sitting in the backseat of the car, I’d point out every billboard and ask what it meant until both she and my mother stopped answering.  Horrible frustration—I wanted to know.

Most kids are curious—about why the ocean is blue, whether colors look the same to everyone, what billboards say, or anything they don’t understand.  The world’s a mysterious place when you’re little (shoot, it is when you’re big), and let’s face it, Santa Claus is a mysterious guy.

Eventually, kids become skeptical about his mysterious abilities and begin to ask completely commonsense questions: how does he visit every house in 24 hours? why don’t all reindeer fly? how can a morbidly obese man fit down our chimney? (That one was particularly relevant in my house, which didn’t have a chimney.)

But parents think back to the “magic” of their early days and respond with vague claims about the supernatural.  The process of coming to a reasonable, logical conclusion is forestalled, supposedly for the good of the child in his or her fragile formative years.

That’s just it.  They are formative years.

Deflecting answers or making up far-fetched explanations to fend off questions discourages this completely healthy, completely natural, and almost universal skepticism in children.

And truth be told, attachment to the man in the red suit seems to tilt pretty heavily to the adult side.  Adults don’t want to deny their children the sense of wonder they remember; kids don’t want to be denied answers.  Cross-purposes, friends.  There’s nothing more frustrating than deflection, even now (thank goodness for Wikipedia).

Skepticism is an important thing to learn early—it’s critical thinking, reasoning through problems, learning about how the world works, the scientific method.  And in a media-saturated culture, with politicians and newscasters and advertisers and writers throwing information at us from every direction, it’s more important than ever to have a mechanism to sift the wheat from the chaff.

Skeptical thinking isn’t cynicism or disillusionment; and it certainly doesn’t have to mean a Burgermeister-Meisterburger ban on Christmas (or maybe that was Oliver Cromwell…).

Once again, I have to point to Carl Sagan: The Demon-Haunted World is a manifesto for critical thinking, and I agree with every reviewer who commented that it needs to be read by every high school student who can get a copy (buy it, share it, steal it… well, maybe not the last).  He writes:

Every now and then, I’m lucky enough to teach a kindergarten or first-grade class.  Many of these children are natural-born scientists—although heavy on the wonder side and light on skepticism.  They’re curious, intellectually vigorous.  Provocative and insightful questions bubble out of them.  They exhibit enormous enthusiasm.  I’m asked follow-up questions.  They’ve never heard of the notion of a “dumb question.”

And then we get to high school.  We memorize dates and facts, read Albert Camus, flirt with nihilism.  Don’t tell me it’s childhood trauma from that Christmas day the magic died.  It’s because we never learned critical thinking at all.  Just the opposite, in fact—as children, every time we tried, we were discouraged.  Questions are brushed off or patronized (“because the ocean reflects the sky, dear”; “because the reindeer are magic!”).  Memories of Santa Claus bring a lump to our collective throat?  Maybe because we think he holds a monopoly on magic.

There’s just as much excitement in learning how to read, or in finding a solution to a problem all by yourself (don’t kids incessantly insist on doing things on their own?)—maybe more.  In any case, that’s the kind of wonder that lasts.

Anarchy on the Internet (and why it’s good)

24 Nov

Everyone knows that middle-aged sexual predators lurk in chatrooms, posing as insecure tweens looking for a friend; or friend other insecure tweens on MySpace; or that if you don’t lock up your wireless network tight, terrorists are going to tap into it and turn your naivete into massive-scale crime; or that that email with the suspicious subject line is a virus that’s going to delete all your files (even if you do have a Mac); and that if you don’t forward this message of holiday cheer to 42 people by midnight, an axe murderer will sneak into your room at 3 am and— ZZSWAR9ARG7Z

You get the point.  There are dangers hiding behind every hyperlink.

I don’t mean to be flippant (no, that’s a lie; I do, but it’s strictly rhetorical)—the Internet can be a scary place, and scary people use it.  I’m all for parental controls and spam queues.  What I’m not for is the underlying premise beneath Internet fear-mongering—because it’s not always just “Stranger Danger.”

Some of the outcry against danger (or obscenity, or perversion, etc, et al) comes with a call to action that frightens me more than any technological boogeyman—if the Internet is dangerous because it’s so open, because anyone can do, really, anything, why not regulate?

In 1993, SF author Bruce Sterling (“Junk DNA,” remember?) wrote an article called “A Short History of the Internet,” which you can find in its entirety online, and which I highly recommend.  For my part, I’ll focus on just a few key facts, some of the points from the reading assignment for today’s American Studies lecture on “The Internet Revolution.”  So:

1. The very openness and decentralization that make the Internet “dangerous” were built into its most basic structure—from the perspective of a Cold War scientist, you see, a communication network would have to be as decentralized as possible in order to still function after a nuclear holocaust wiped out God-knew-where in the United States.  With this in mind, the less authority, the better (sounds strange for a military-government program, doesn’t it?).

2. And after decades of evolution, that’s what we still have: no authority.  Sterling asks:

Why do people want to be “on the Internet?”  One of the main reasons is simple freedom.  The Internet is a rare example of a true, modern, functional anarchy.  There is no “Internet Inc.”  There are no official censors, no bosses, no board of directors, no stockholders.  In principle, any node can speak as a peer to any other node, as long as it obeys the rules of the TCP/IP protocols, which are strictly technical, not social or political.

Sixteen years after those words hit shelves in The Magazine of Fantasy and Science Fiction, that’s still true: it’s simple science fact, and no less amazing for it.

Online, you are what you type, upload, or post—identities are fluid.  It’s true that might mean a fifty-year-old man staring at a glowing screen in his basement could pretend to be a junior high girl on some Edward Cullen fan site, but it also means that young Peter Wiggin can blog and be seen by the world as an elder statesman.

It’s freedom to be creative without the stigma of age, sex, race, or anything else that might lead someone to prejudge you before looking at your work or ideas: online, you are your ideas.

Blogger and SF writer Cory Doctorow’s name (which I feel I mention every other post) is almost synonymous with Internet freedom.  Publishing his novels under a Creative Commons license for free distribution online (DRM-free, I might add), Doctorow could almost be a character from one of his own books—Alan/Adam/Albert/Avi, for example, from Someone Comes to Town, Someone Leaves Town, spends the time he’s not brooding over his troubled childhood (as the eldest son of a mountain and a washing machine) setting up a free, open wireless network for the people of his local town.

(I did say almost a character.)  In any case, he practices what he preaches, and in all his books shows just how cool our world is.  I’m going to have to quote Makers again—we’re living in the “weirdest and best time” in the history of the world.  Witness the astonishing success of modern anarchy:

“No one needed to draw a map of the Web,” Kurt said, “It just grew and people found its weird corners on their own.  Networks don’t need centralized authority, that’s just the chains on your mind talking.”

I have to give my professor credit—revolution was a good title for the lecture.  Even after our first Revolution, observers (read: Alexis de Tocqueville) noticed a tension in American society between liberty and equality, freedom and democracy.  Oftentimes, they clash (see any debate on social welfare programs—the object is equality of outcome, but at the expense of the freedom to use and dispose of one’s own property and money).

But no political arguments in this post about liberty and equality: the anarchy of the Internet is one of the only places where you don’t really have to choose.

Home Sweet Homepage: Growing Up in Cyberspace

2 Nov

“Have you ever played the Wikipedia game?”

After an exasperating few seconds struggling to articulate her mental picture of the Internet — “I can’t fit it into my head!” she protested — college sophomore Nicole Hugo landed upon one of her favorite exercises in procrastination as a suitable analogy for the greater World Wide Web.  To play, she explained, you need only think of a topic, any topic, and attempt to wind your way through the hyperlinked labyrinth of Wikipedia’s three-million-odd articles until reaching, at long last, a page dealing with the subject you’d originally chosen.  The “Wikipedia game,” she claimed, is the Internet writ small: “You can start off one place and end up where you want to go because everything is so interconnected.”

An effortless exercise for Nicole, who rated herself a modest 10.5 in Internet proficiency (on a scale of 1 to 10), the game suggests an interesting topological conception of the Internet — for cyber-navigators like Hugo, information retrieval is so simple as to be a game; the challenge lies in exploring the geography, covering the terrain from point A to point B in the most creative way possible (her example: from roadrunners to Japanese anime).  In the minds of these college-age students, raised to treat it less as a tool than as an environment, the Internet has become a “place” — one they find as comfortable and natural as the world outside the computer monitor.

Leah Jacobs, perhaps the only member of the Millennial Generation to express the wish that the Internet had never been invented, is not one of those students.

“I was always behind on the Internet,” she confessed — “This is embarrassing, but if I heard a song I liked on the radio, I would go on the iTunes search engine and try to guess the name of the song until something came up; I didn’t know I could go on Google and just type in a couple lyrics and press enter.  I only realized this the other day.”

Leah, who sees the Internet predominantly as an instrument of time wasting (at one point she compared Facebook to a drug addiction), approved of the utility of only a select few Internet features: email and Mapquest, in particular, though she still prefers Rand McNally.  While she originally rated herself a 7 in Internet competence, about an hour after the interview Leah must have had second thoughts; she emailed me back with the suggestion that I change her self-evaluation to a 4.  Perhaps the drop to a below-average ranking has something to do with her shaky command of Digital Age jargon: over the course of our conversation Leah groped for the term “flash drive” — “those little things that you put in the USB and save stuff to and take them somewhere else?” — and, when referring to websites she frequents, commonly cited the full URL (“dot com” and all).

Leah attributes her limited proficiency online to a long series of technological deficiencies in the Jacobs household: “It’s definitely how I was brought up — I was a sheltered child,” she said, adding that her family’s falling “behind” extended beyond computers.  As Leah remembers: “It was this big exciting thing when we got one cell phone… until we realized we couldn’t call each other.”

Though she was, like most millennials, exposed to computers very young, these early experiences served mainly to establish the utilitarian view of the Internet she holds today — while her “mom was obsessed with buying computer games,” they were strictly educational programs, and online activity involved nothing but email until high school.  “I think of the Internet as a tool to help me do work,” she said; “Other than that, it’s better to actually do things with people, to see them and talk to them.”

That sentiment was echoed by another Internet detractor, Dana Cooper, who spends her time online in similarly practical pursuits: homework, research, email.  This too reflects a pattern forged in childhood: while Dana learned to type before she could spell her own last name, her earliest memories of the Internet emphasize the academic uses she puts it to today.  “I remember being eight years old, looking up stuff online to do a project for science, on manatees,” she laughed, “And I did it all on my own, like I knew how to look stuff up online” — ten years later, the pride in her voice is still audible.

But Dana, if more adept than Leah at what she termed “the tricks of the trade,” still deplores the amount of time her classmates spend on the Internet.  “I think it’s embarrassing,” she said, citing a recent visit to a friend’s house as an example of the excessive lengths her peers will go to in pursuit of the perfect social networking profile — “She was putting up her pictures online, and fixing them and changing them and cropping them here or there, and I felt like, you can find out way too much stuff about people.  It’s invasive.”

From here, Dana begins to diverge from Leah’s functionalism: while Jacobs considers face-to-face communication the only true form of human contact, Dana indicates that the Internet does provide a setting for real social interaction, even if she isn’t a participant.  “Sometimes I do feel like an outsider,” she lamented; “If I go on some site and see what other people are writing, it’s kind of strange, like, why am I reading this if I’m not a part of it?”

For Dana, reading the posts of an online forum, the comments on someone’s YouTube account, or the wall posts between individuals over Facebook is akin to eavesdropping — the content a person generates online, in this view, is an accurate reflection of the flesh-and-blood human being behind the HTML.  To Dana, you are who you are online, and she, without a virtual identity (even one as cropped and edited as her friend’s), is less than an outsider: “If the Internet was real life,” she mused, “I would be non-existent!”

Nicole Hugo and Britta Kilkenny, another college sophomore and Internet true believer, elaborated on this idea — “Everything I post is true,” Nicole said of her Facebook profile, “So I guess it does show who I am, or at least who I think I am”; Britta agreed, speculating that “someone could get a pretty good idea” of her personality and daily life if she updated her status more often.

What Dana termed “invasive,” Nicole and Britta see as a means of accurately representing themselves in a new medium, online.  And for Nicole, even something as short and presumably impersonal as a YouTube video comment can extend meaningful human contact — because she sees content that may not include so much as the creator’s name as representing an actual person, Hugo tries always to be encouraging and supportive: “These are usually people that I really enjoy, that I subscribe to, and I think they appreciate it when you write something positive.”  Compare this to Leah, for whom “people” exist only in physical space.

But in cyberspace (at least for Nicole, Britta, and Dana), manifestation as a tangible, carbon-based life form isn’t necessary; one’s virtual identity, perhaps a thing of words alone, is just as real and just as capable of social interaction.  The germs of this mindset as well may have been planted in early childhood — when asked what website they most remember visiting during their first days of Internet travels, all but Leah answered without hesitation: “Neopets.”

Launched in 1999, Neopets was something of an early online community for kids; set on the virtual planet of Neopia, the site let members choose their own pets from a wide selection of chimerical animals and then feed, clothe, and entertain them.  Discussing Neopets, Dana highlighted both the independent nature of her early Internet use and the sense that the website was more a virtual environment than a game: “You could have your own pet, and it was yours, and you cared for it.  Then I could make money and have all my own material possessions, but on the Internet, you know?”  Britta, too, agreed that “it was pretty intense,” recalling that the website allowed for messaging among “neofriends” and even “battles” between the pets of complete strangers, an online cognate of the then-popular Pokemon card game.

Though she played as well, Nicole Hugo found Neopets rather too tame for her tastes — her online communication with others, even with strangers, proved more direct: chat rooms.

“Now that I think about it, that was probably extra sketch,” she said of these first, daring forays into the online social environment — nevertheless, a fourth-grade girl in 1999 could hardly expect to maintain her reputation without participating in at least one Backstreet Boys chat room.  A self-described “bad kid,” Nicole very early developed a cool confidence on the Internet.  “Because we didn’t have a babysitter,” she explained, free time was spent on the computer in her father’s office; this informal, private practice — “click-click-click, that’s all I knew how to do” — gave Nicole the basic skills she needed to manipulate a mouse, keyboard, and search engine in order to pursue her personal interests as a ravenous fangirl.

It’s no coincidence that Nicole and Britta, who at a young age began to see the Internet as a medium for social interaction, today treat websites like Facebook and YouTube as settings for communication as comfortable and personal as any face-to-face conversation.  Dana too, a Neopets veteran, sees the Internet as a “place,” an environment with its own geography — “Everyone talks about how big the Internet is, and I know, because I can go on for hours and hours and still feel like I’ve never gotten into the core of it.”

Her use of the word “big” here, interestingly, refers neither to popularity nor to extension across a physical expanse of space: Dana references the simply overwhelming number of websites, a dense virtual world she can’t penetrate because, without a Facebook profile or other virtual identity, she remains on the periphery, “non-existent.”  And while Dana can’t reach the “core” of the Internet, Leah Jacobs — who described her family as “technologically deprived” — seems constantly out of her depth, floundering with the search engines and information-retrieval processes that digital natives like Nicole Hugo consider so familiar as to constitute a game.

Ultimately, these four millennials represent a continuum in their treatment of the Internet — from a practical tool and nothing more, to a virtual environment in which real human interaction can occur.  The determinant?  Who cut their teeth on mousepads and keyboards, and who had to settle for pacifiers.

 

This is a slightly re-tooled version of a paper I recently wrote for an American Studies course.  The research is original, though names of interviewees have been changed to protect their identities (especially that of “Leah” who really doesn’t trust the Internet).  Feel free to use my research—just cite your sources.  Because remember, kids: plagiarism will send you straight to the Gates of Hell with the inefficient, the indifferent, and Pope Celestine V.  Or so says Dante.

“Step Aside, Shakespeare”?

31 Aug

Don’t mess with the Bard.

So, maybe we don’t know if “Shakespeare” was an alias, or whether the writer of the plays and poetry was Sir Francis Bacon or Edward de Vere or really and truly the son of a farmer from Stratford-upon-Avon.  What we can agree on, however, is that he was one of the greatest (heck, he was the greatest) writers in English history.

Or at least, I thought we could agree on that.

Apparently not, according to my lovely university student newspaper, which ran an article today in which the editorial board calls, by consensus, for a “reform” of English education.  “Learning,” they say:

Should not be a chore.  Rather, it should be a pleasure.  It’s time to reform the way we teach people to embrace reading.  Make way for education, not anguish.  Step aside, Shakespeare.

Now generally, I have little to complain about when it comes to the Crimson White.  Mediocre writing on moderately interesting topics usually makes for a thoroughly mild reading experience.  There’s the occasional angry letter to the editor, of course, and the self-important guest column from some student luminary, but overall—there’s just not much to say about the Crimson White.

Which is why I’m not too surprised to find contempt for “a thick classic filled with incomprehensible prose and old-fashioned themes” within its pages.

I agree with the editorial board that English education in most high schools and elementary schools doesn’t always instill a ravenous hunger for knowledge in its students.  But don’t fault the material for that—fault the completely undeserved stigma that you, campus journalists, are perpetuating in your own columns.

Shakespeare was a friggin’ genius.

You might not understand some of his “incomprehensible prose,” but did anyone ever tell you just how much of the prose we use today, in our most prosaic conversations, was invented by good ol’ Bill?  Here’s a small sample:

“All that glitters is not gold” (Merchant of Venice)

“The be-all and end-all” (Macbeth)

“Best foot forward”  (King John)

“Dog will have his day” (Hamlet)

“Eaten me out of house and home” (Henry IV)

“Jealousy is the green-eyed monster” (Othello)

“Kill with kindness” (Taming of the Shrew)

“In a pickle” (The Tempest)

“There’s a method to my madness” (Hamlet)

“Neither rhyme nor reason” (As You Like It)

“Too much of a good thing” (As You Like It)

“Wild goose chase” (Romeo and Juliet)

And that’s only a small selection.  Along with common turns of phrase we use, he coined words as well: accused (n), accommodation, amazement (n), bachelorship, bandit, birthplace, cold-blooded, coldhearted, to compromise, dauntless, deafening, dexterously, to educate, enthroned, eyeball, eyesore, fortune-teller, gloomy, hoodwinked, housekeeping, invitation, lackluster, leapfrog, majestic, manager (n), multitudinous, obscene, puppy-dog, bedazzled, and many, many more.

(Yes—you can thank William Shakespeare for the name of that staple of infomercials with which you can stick rhinestones on your jeans, or whatever.)

So while the sentence structure might be difficult for us today, just think: people in Shakespeare’s time didn’t know what the heck he was talking about either.

But not only this (and this is no small matter, to essentially reinvent the English language into its modern form), Shakespeare also wrote some of the most beautiful poetry in this language:

All the world’s a stage,

And all the men and women merely players:

They have their exits and their entrances;

And one man in his time plays many parts  (As You Like It)

This is by no means the best example, but it does refute the Crimson White’s assertion that “classics” deal only with “old-fashioned themes.”  Read the lines above and tell me that they don’t apply just as well today as they did in Elizabethan England.

Not to mention that Shakespeare was way ahead of his time dealing with issues such as race (Othello) and gender (Merchant of Venice, As You Like It), which a lot of Americans didn’t start thinking about until the 1960s and 70s.

Even the structure of his plays shows how far ahead of the game Shakespeare was: Antony and Cleopatra, for example, switches between wildly different settings—Rome, Alexandria, Messina, Syria, on board a ship at sea.  And he didn’t have CGI, either.  Shakespeare was anticipating the screenplay.

Yes, I agree that English education is not always inspiring, but that doesn’t mean we should knock writers such as Shakespeare off the pedestal they so rightly deserve.  Don’t lower the bar and erect a monument to Averageness: show students that there already are monumental works of literature, and that they are understandable, and that they are incredibly relevant even today.

 

Coined quotes and words from:

http://www.pathguy.com/shakeswo.htm