Tag Archives: knowledge

Be A Good Cyber-Citizen: Edit Wikipedia

18 Mar

After reading China Mieville’s Perdido Street Station, my Facebook religious views status read “Cult of Palgolak” for a couple months.  Because the truth is, if I were to believe in a deity, it would totally be this one:

(Wikipedia) Palgolak is the god of knowledge, who features in the novel Perdido Street Station. Palgolak is typically depicted as either a human or a Vodyanoi, sitting in a bathtub that floats mystically across the cosmos’ infinite dimensions, observing and learning. It is believed that anything learned by a follower of Palgolak is also known by Palgolak himself, a quality that gives his worshipers desire for knowledge.

And from the book itself:

He was an amiable, pleasant deity, a sage whose existence was entirely devoted to the collection, categorization, and dissemination of information … Everything known by a worshipper was immediately known by Palgolak, which was why they were religiously charged to read voraciously. But their mission was only secondarily for the glory of Palgolak, and primarily for the glory of knowledge, which was why they were sworn to admit all who wished to enter into their library.

I love the idea of a religion completely devoted to the creation, consumption, and dissemination of knowledge–I tell my classmates, only half-joking, that Amelia Gayle Gorgas Library on campus is my church.  But in this digital age of ours, the real manifestation of the Cult of Palgolak and library cathedral would have to be Wikipedia.

Wikipedia editors are volunteers, contributing to the largest encyclopedia in human history, available to anyone with an Internet connection.  It’s the bane of teachers, the holy grail of homework, the first resource of anyone looking for information on any topic in almost any language, and the most successful digital humanities project… ever.  Democratization of information ftw!

This semester I’m taking a class through the UA history department called Intro to Digital Humanities.  Basically, DH (and no, I don’t mean Deathly Hallows) is about integrating technology into traditional scholarship (particularly in the more qualitative fields of the humanities).  What our excellent professor suggested the first day of class, however, is that DH is a set of values too: an ethos of collaboration and sharing that the Internet makes possible on a wider scale than ever.  Wikipedia definitely brings together technology and research, but it also demonstrates that collaborative spirit in action.  It’s kind of a crazy utopian idea when you think about it–but it’s working.

Last class, we had a guest speaker join us, a professor of Middle-Eastern History from Florida State University.  We had a long debate about the value of Wikipedia (the class seemed to divide fairly quickly into idealists and skeptics)–and then our guest asked how many of us had ever edited a Wikipedia article.  No hands.  And then we broke for Spring Break.

But it made me think–we’re in a class all about the sharing of information, and we’re not even contributing to the greatest such project in human history.  Volunteering to edit Wikipedia is an act of democratic participation–maybe even a sign of good global citizenship.  You don’t have to be a worshipper of Palgolak to start to feel that participation is almost a moral duty.  Which is why I’m making a belated New Year’s resolution to be an active Wikipedia editor.  Current task? The Unreferenced Articles WikiProject.  As a history student, I’ve got some mad citation skills.

So now the only question is– when can I put this on my resume under community service?

Spirit Eyes (review: The White Hairs)

9 Aug

The ill-starred Titanic’s bandleader Wallace Hartley drowned in the frigid Atlantic the night the ship went down, music case strapped to his chest as he and the ship’s seven other musicians played on the sloping deck for passengers strapping on vests and swarming into lifeboats.  Survivors recall hearing the popular upbeat, syncopated ragtime tunes of the day, the hymn “Nearer My God to Thee,” and the haunting “Dream of Autumn.”

Reports conflict as to which was the very last song the band played before they all went under, but we don’t need to know to be moved by the fact that eight men would—knowing that they were in their last hour of life—choose to create something beautiful for others.  It’s a tragedy, but awe-inspiring in the most honest sense of the word.

That’s the story—and sentiment—that came to mind as I read Noah K. Mullette-Gillman’s novel of “spiritual mythology,” The White Hairs.  The book follows the physical and mental journeys of a mysterious, non-human snow creature called Farshoul, whose people have learned to loose their souls from their bodies and travel the astral plane.  Their self-proclaimed status as the most “advanced” race on the planet comes not from science, technology, or the other trappings of human civilization, but from this mystical ability to see straight to the soul.  The humans, on the other hand, are “The Unconscious Ones” or “The Mindless.”

Very simply, of Farshoul—“He had been told they lived unexamined lives.”

But Farshoul is a snow-Socrates who thinks perhaps there’s more to the human than meets the eye—or rather, that meets the anatomical, biological, physical and physiological eye.

My favorite scene in the book comes when Farshoul uses his spirit eyes to travel far from his home on the snowy slopes of a mountain range out to sea, where a human ship is caught in a desperate position: just minutes away from capsizing in a massive storm.  Initially, Farshoul pities them their frantic prayers and pleas for help—

“Farshoul’s people knew that souls die when their bodies die… It was as if their next world was supposed to make everything that was wrong in the current one all right.”

It doesn’t help the human reputation, either, that the captain shoots half of his men as they lay strapped in their beds—a quicker death than drowning, he explains over their helpless protests.  And yet, as Farshoul continues to eavesdrop on the sailors’ last moments, he witnesses something more elevated: they begin to sing.

“They were crying and they all knew they were dying and they chose to spend their last moments of consciousness making something beautiful.  He hadn’t thought they were capable of anything so wise and advanced.”

It could have been ragtime, and it could have been “Nearer My God to Thee,” but either way it was something sublime, a reminder of a fact we so often forget: there are beautiful facets of human nature too.

The White Hairs is fiction (though, if you want to believe that the abominable snowman is a soul-traveling, mystic superhero, go right ahead).  There is, however, a lot of truth to it.  The rest of the novel might not have so perfect an historical cognate as the story of Wallace Hartley and the Titanic’s orchestra, but there is absolutely no way to read this book and not see it as a parable for our own very material-centric world.

Farshoul’s “society was set up around their relationship with the intangible,” Mullette-Gillman writes—putting very clearly what the first sentence of the book already hinted at:

Farshoul watched as the long white hairs on his arms became translucent.  He watched as they faded away.

What the author does in The White Hairs (the name of the mysterious snow race, in case you were wondering) is give us a story about stripping away all the superfluous aspects of our own lives to find the human spiritual beauty still present in such a cluttered, materialistic world.

Without that ability, Farshoul learns at one point—after losing his own spirit eyes in a fight with a vicious dog made of hateful red energy—life becomes empty and hollow: “He was not capable of seeing the most beautiful qualities of anyone after his injuries.  He was incapable of love.”

But even for us most Unconscious Ones, all is not lost.  As Farshoul’s mother tells him (proving, perhaps, that wisdom does come with white hairs):

“You didn’t lose anything you can’t find again.”

Author Response: Faith, Science, and The Proximian

4 Aug

After I raised some objections in my review to what I see as an incongruous blending of biblical literalism into his science fiction novel The Proximian, I wanted to make sure author Dennis Phillips had a hearing too.  He felt strongly enough to leave a very generous comment on my review, and I felt strongly enough to re-publish my review of atheist Sam Harris’s Letter to a Christian Nation.  But not everyone scrolls down to read the comments, so here’s Mr. Phillips’s response to an admittedly critical reception of The Proximian from the Scattering:

Thank you, Isabela for taking time to review my work, and thank you for your kind comments. You mentioned in an email to me that you are an atheist. As such, I understand that you would be biased with regard to any blending of religion in science.

You seem to believe there is some disconnect between faith and science. I do not. You seem to believe that if someone is true to science, that they cannot be religious, which is why you wrote “I find it difficult to believe that an astrophysicist like Carl Sage could accept Creationism.” Yet many like him have and do. I can no more prove the existence of God, than you can disprove it. In the end, a belief in God, like a belief in the theory of evolution, must be a matter of faith.

Many thanks again and best wishes for a successful future,

Dennis Phillips, author, The Proximian

And here’s my comment in reply:

I’d hate to get into a theological argument in a comment thread, but one thing you said in particular stuck out to me as rather off– “A belief in God, like a belief in the theory of evolution, must be a matter of faith.”

Absolutely not!

Tim Minchin said it best: “Science adjusts its views based on what’s been observed. Faith is denial of observation so that belief can be preserved.”

In regard to the “absence of evidence” vs. “evidence of absence” argument, here’s briefly what I have to say:

Observation and experimentation is the basis of science, and these pillars allow not only for dialogue but the opportunity for other scientists and researchers to disprove a hypothesis–and so get closer to the truth (knowing something is wrong is just as valuable as knowing which answer is right). The fact that one cannot, as you mentioned, disprove the existence of God only serves to highlight the very real disconnect between faith and science: that’s completely the opposite of the scientific ethos.

Sam Harris’s short “Letter to a Christian Nation” would be a great resource for anyone wishing to better understand atheism.

– Isabela Morales

Well, the debate isn’t going to be solved in the comment thread of a second-tier science fiction review blog, but I hope that gives readers a more rounded-out view of author Dennis Phillips’s philosophy and reasons for including some Genesis apocrypha in his novel.  The stakes, as he let me know, are high:

One of us is wrong. We can’t both be right. And if I’m wrong, when I die, I’ve lost nothing; but if you’re wrong, then some day, when you die, you’ve lost everything.


Souls in a Petri Dish (Review: Letter to a Christian Nation)

4 Aug

“Atheists are the most reviled minority in America.”

Sam Harris has it exactly right.  Polls—even some taken shortly after 9/11—show that the majority of Americans would rather have a Muslim president than one who doesn’t believe in any God at all.  Maybe that seems hard to believe when we think back to the horror over our current President’s highly suspicious middle name, but the numbers bear it out.  Atheists aren’t likely to achieve high office.

Maybe that’s why Thomas Paine, one of the most famous nonbelievers in American history, is the most notable of our founding fathers not to have a monument.  He doesn’t even get a mention in the Valley Forge segment of the recent History Channel documentary America: The Story of Us (which is otherwise both moving and surprisingly objective).  George Washington thought the political pamphleteer important and inspiring enough to read to his starving, freezing men at Valley Forge (and thus keep the army together through a terrible winter)—but this isn’t the Age of Reason anymore.

In high school, I was nostalgic for the political pamphlets of Thomas Paine rallying patriots to the Revolutionary cause.  Nostalgic not because I’d been there (though I’m still holding out for time travel), but because today’s political debates involve so much more mudslinging and snide soundbites than any meaningful debate, and because—to someone who compliments acquaintances on brilliant extended metaphors in emails and cries after every re-reading of Plato’s Phaedo—good rhetoric is so, so hard to find.

Especially on the issue of religion and faith.  On a small scale, the University of Alabama club “triple-A,” Alabama Atheists and Agnostics, had its chalking vandalized by devout Southern Christians about half a dozen times this past year.  Pouring slushies on a chalk portrait of Darwin is the college equivalent of a shut-down of intellectual debate, I guess—which is something atheists face in the “Christian nation” of the United States.

I can’t help but have a wonderful time reading the gleefully irreverent Christopher Hitchens.  As might be expected, I can’t say the same for my ex-roommate at UA, who never looked at me the same after she found God Is Not Great: How Religion Poisons Everything on my Kindle.  But Sam Harris’s Letter to a Christian Nation satisfied my sentimental longing for Thomas Paine-esque writing, and then some.  His short book—more a manifesto—echoes Paine’s celebrated Age of Reason in that he’s not on the defensive.  Harris explains that the New Atheism isn’t just a negative (not believing in God): it’s about a positive too, belief in science and reason.

Last fall, I awarded Thomas Paine the Scattering’s premier literary award—the Heretic Badge of Honor—for his 1794 Age of Reason.  Today, I’m awarding the Heretic Badge to Sam Harris for Letter to a Christian Nation, for taking up the torch.  He writes in his conclusion, after all, that:

“This letter is the product of failure—the failure of the many brilliant attacks upon religion that preceded it, the failure of our schools to announce the death of God in a way that each generation can understand, the failure of the media to criticize the abject religious certainties of our public figures—failures great and small that have kept almost every society on this earth muddling over God and despising those who muddle differently.”

In his letter, the New Atheist does revive for a modern audience some ideas that reminded me of past doubters very strongly.  The foundation of atheism, he argues, is a scientific mindset, but that might mean something different than many people expect:

“The core of science is not controlled experiment or mathematical modeling; it is intellectual honesty.  It is time we acknowledged a basic feature of human discourse: when considering the truth of a proposition, one is either engaged in an honest appraisal of the evidence and logical arguments, or one isn’t.”

Atheists don’t revile God (although, as Harris points out, there’s a whole lot of evidence to do just that)—we respect rationality.  That’s the scientific mindset.

This definition of “intellectual honesty” struck me as particularly reminiscent of Thomas Paine’s Age of Reason, in which he wrote my favorite 18th-century quote:

“It is necessary to the happiness of man that he be mentally faithful to himself. Infidelity does not consist in believing or disbelieving; it consists in professing to believe what he does not believe.”

Or for Harris, what is impossible to believe in, in modern society, with so much scientific evidence stacked against the need for a God.  It’s as incongruous as, well, to use one of my favorite expressions from the Letter—“souls in a Petri dish.”

I won’t go into detail on Harris’s arguments, because I couldn’t begin to write more clearly or concisely than he does in Letter to a Christian Nation.  And personally, I wonder how many of the Christians the book’s addressed to will actually read it—but for those who do or are considering it, let me say that while it’s bold and certainly controversial, it’s written in some of the most clear, logical prose I’ve ever read.  It’s accessible, and written more to persuade than inflame (like some of Hitchens’s writings).

The book’s only 900 locations on the Kindle (as opposed to the 5,000-8,000 of the average novel), so I’d place it at about 100 pages.  In any case, it’s a one-afternoon read.  So head on outside on this beautiful summer (is it summer yet?  I never really know) day, relax in the sun, wear a hat or a beekeeper’s veil if you’re easily sunburned, listen to the rustle of leaves in the wind and insects buzzing in the grass, and remember that you can thank evolution for it all—not God.

Happy Pentecost, everyone!

Letter to a Christian Nation is available in paperback as well as an ebook on Amazon for $8.64

God Emperor of Quantum Physics

7 May

The universe wants what the universe wants.

I think I started reading Frank Herbert’s Dune series sophomore year of high school—I’m pretty sure it was sophomore year because freshman year I was obsessed with Watership Down for some reason I can’t quite remember, though I ended up using the Bene Gesserit Litany Against Fear as my senior yearbook quote (a decision I shall never regret).  So it was between years one and four, and I’m pretty sure it wasn’t three, as year three I was keeping a Word document of Algernon Charles Swinburne poems.

But that’s not really relevant.

What’s relevant is that next Tuesday the Scattering will be one year old, and in celebration, a return to the beginning is in order.

Shockingly, this blog does not derive its name from the scattered nature of my thoughts and the tangents that posts often go off on [see first paragraph].  “The Scattering” is actually a far-future event that takes place at the end of the fourth book in Frank Herbert’s famous series—God Emperor of Dune.  It’s a species-wide diaspora of sorts, with human beings spreading out across the universe after the (spoiler alert) murder of Leto Atreides II, half-human/half-sandworm dictator.

— begin tangent –

Whenever I hear Boz Scaggs’s “Lido Shuffle” (and I hear it quite a bit when my iPod’s on shuffle), I subtly change the lyrics to pay tribute to said God Emperor’s death:

Leto (woah-oah), he’s for the money, he’s for the show—drowning in the Idaho-o-o-o-o.

— end tangent –

The Scattering was the ultimate end of Leto’s Golden Path, a prescient vision that turned into a guide for all the horrible choices he had to make in his 3,000+ year creepy hybrid life—the only path that would prevent the total destruction of homo sapiens sapiens.  Essentially, if humans scattered to the millions of planets and billions of stars, no force (not even themselves) could ever destroy them all.

And thus, Leto II saved humanity.

But the most interesting part of the series, for me, was Herbert’s idea that simply seeing the future made the future.  When Leto’s father Paul (a messiah himself, if not a god) had his visions, he was tormented by the terrible things he saw.  And yet, the very fact that he saw them predisposed him to follow the paths he’d glimpsed—for all he knew, the alternatives he was blind to could be worse.  But Paul couldn’t take the pressure, and left his son to choose the devil he knew.

But after watching the most recent episode of FlashForward (“Course Correction”), I’ve begun to wonder whether Herbert’s ideas weren’t entirely science fiction.

For those of you who aren’t watching ABC’s new series in the hopes that it can fill the void that LOST will leave in just a few weeks, FlashForward is a science fiction drama focused on an event known as The Blackout—a couple minutes of time when the whole world went unconscious, or rather: the whole world shifted consciousness and mentally traveled six months into the future.  Everyone glimpsed what would happen (or not happen—if they’d be dead) to them on a particular day in April, and everyone freaked out.  Free will versus destiny debates broke out everywhere, and philosophy professors all over the country saw a sudden spike in their research grants.  (Well that last part’s speculation, but I’m pretty sure ABC has it in backstory somewhere.)

Central to the mystery of the blackout are Simon Campos and Lloyd Simcoe, two quantum physicists whose experiments may or may not have had something to do with the world-changing event.  In any case, they’re experts now, making talk show appearances and working with the FBI.  And in “Course Correction,” Simcoe makes a particularly interesting statement about what happens when people see the future.  To avoid butchering science, I’ll leave explanation to the experts—

From a 1998 Science Daily article:

One of the most bizarre premises of quantum theory, which has long fascinated philosophers and physicists alike, states that by the very act of watching, the observer affects the observed reality.

In a study reported in the February 26 issue of Nature (Vol. 391, pp. 871-874), researchers at the Weizmann Institute of Science have now conducted a highly controlled experiment demonstrating how a beam of electrons is affected by the act of being observed. The experiment revealed that the greater the amount of “watching,” the greater the observer’s influence on what actually takes place.

When a quantum “observer” is watching

Quantum mechanics states that particles can also behave as waves. This can be true for electrons at the submicron level, i.e., at distances measuring less than one micron, or one thousandth of a millimeter. When behaving as waves, they can simultaneously pass through several openings in a barrier and then meet again at the other side of the barrier. This “meeting” is known as interference.

Strange as it may sound, interference can only occur when no one is watching. Once an observer begins to watch the particles going through the openings, the picture changes dramatically: if a particle can be seen going through one opening, then it’s clear it didn’t go through another. In other words, when under observation, electrons are being “forced” to behave like particles and not like waves. Thus the mere act of observation affects the experimental findings.

(today in 2010, this premise is generally accepted among the physics in-crowd)

And so, to oversimplify in every way: once you see something, you make it real.  Or as Lloyd Simcoe explained, the universe wants it to happen—and if you try to thwart your [fate], the universe will “course correct.”

This doesn’t only apply to the visions seen by all the poor denizens of FlashForward world, but Paul and Leto of the Duniverse as well.

Paul Atreides saw the Golden Path, but found it too horrifying to comprehend—he didn’t find the idea of millennia of sandtrout cilia invading his privy organs terribly appealing.  By not following his vision, Paul should have changed the future, but the Duniverse course corrected through Leto.  From Children of Dune:

Already he could feel how far he’d drifted from something recognizably human. Seduced by the spice which he gulped from every trace he found, the membrane which covered him no longer was sandtrout, just as he was no longer human. Cilia had crept into his flesh, forming a new creature which would seek its own metamorphosis in the eons ahead. You saw this, father, and rejected it, he thought. It was a thing too terrible to face. Leto knew what was believed of his father, and why. Muad’Dib died of prescience. But Paul Atreides had passed from the universe of reality into the alam al-mythal while still alive, fleeing from this thing which his son had dared.

My advice: think twice before you let the spice flow.

G is for Grande

16 Apr

I started chortling in class yesterday during a lecture on Ignatius Loyola’s mystical theology.  My professor, who already thinks I’m strange since I interned at the Ayn Rand Institute last summer and have a habit of tearing up during lectures in which heretics and other historical figures are executed, looked at me sharply.

“Morales!  What’s so amusing?”

I explained that, many years ago, I was told that the Bene Gesserit of Frank Herbert’s Dune series had been based off the Jesuit order, and that I’d been thinking of that the whole time I was reading The Spiritual Exercises of St. Ignatius Loyola. Dr. M—- asked me to elaborate.

After explaining that the Jesuits evolved into a far-future all-female religious order that controls education throughout the universe, masterminds a creepy eugenics program (until the Worm Who Is God comes around), literally has a collective consciousness, and plants the seeds of religious messianism everywhere they go, everyone in the class (including my professor) was laughing, and Morales got some extra weird points.

In any case, the unusual jocularity the above incident unleashed carried over through the rest of the period, during which everything about Ignatius Loyola seemed absolutely hilarious to absolutely everyone.  I believe I must be a miracle-worker, as Loyola’s Spiritual Exercises are almost as terrifying as Calvin’s Institutes of the Christian Religion.  And we all know how much Calvin frightens me.

I really have no desire to get into epistemology today, but briefly: Iggy (that’s what all the cool History majors call him) embodies an interesting paradox in Counter-Reformation theology.  He was influenced by the contemporary currents of Spanish mysticism, emphasizing a personal piety and deep emotional connection with God, but was also very, very, very institutional.  Personal prayer did not imply an unnecessary Catholic Church.  Example:

If we wish to be sure that we are right in all things, we should always be ready to accept this principle: I will believe that the white I see is black, if the hierarchical Church so defines it.

In this way, he imbibed mystical ideas without taking the Protestant route toward private interpretation of Scripture.  But then, unlike Luther, Calvin, and co., he was an ex-military man (the Exercises abound with references to being a spiritual “knight”) and highly valued order and discipline.

That’s the theme of the Exercises‘s first week in the “Particular Examination of Conscience” (and the spark for more outrageous hilarity in Dr. M—-‘s class).  He begins with a chart that looks something like this:

[Image: Loyola’s chart, a column of seven Gs, one for each day of the week, each smaller than the last]
[I feel like this is becoming one of those dreadful “Read to the Bottom!” emails.  Alas.]

The idea is that each “G” stands for a day of the week.  On Sunday, the largest G, the person making the exercises chooses a particular sin.  Dr. M—-’s example: being late to class (said as he skewered the young man in front of me with a glance.  For months this semester, I wasn’t sure why he even bothered coming to class, considering he was always late, blocked my view of the whiteboard, and spent the entire period “taking notes” on his laptop… otherwise known as visiting Facebook and Star Trek forums.  But then I saw him set up GarageBand to record the lecture, and everything fell into place.)

Each day, then, the young trekkie in front of me would tally up how many times he was late to class on Sunday, then Monday, then Tuesday, and so on.  The Gs get smaller each day because a truly disciplined spiritual knight would be improving throughout the week.

All very logical, and possibly the inspiration for Benjamin Franklin’s outrageous moral perfection chart some centuries later.

The only thing that bothered our class was the question of what–for the love of God–the “G” stood for.

Dr. M—- suggested giorno, the Italian word for day; but of course Loyola wrote in Spanish.  Italian and Spanish have many cognates but this, unfortunately, is not one of them.  We racked our minds but couldn’t think of anything else.  Dr. M—- and I chortled like co-conspirators.

Indeed, I determined I would find out, if I had to tear apart the entire Internet to do it.

Getting back to my dorm, I wondered if perhaps G stood for God.  Maybe in the original Spanish, the letter had been D for Dios, and the translators changed it.  With this happy thought in mind, I began to comb Spanish Jesuit websites for the original text.  No good.  The original Spanish had “G” too, and seemingly no reason for it.  I despaired that this was to be forever a divine mystery in my mind, tormenting and mocking me.  I would never sing the alphabet song with fondness ever again.

But then I found another copy of the Spanish text online, and–miracle of miracles!–it included the annotation that G is for Grande (Spanish for big, or large).  This actually makes perfect sense, since the Gs are meant to be larger on the first day than the second, shrinking all the way down to the seventh.

Thus I have written this post for the other despairing students out there who wonder what the heck was going through dear Iggy’s mind (too much self-flagellation that day, I suppose).  And to make things even easier, I’ll include some phrases said students might search:

Ignatius Loyola Spiritual Exercises “G”

What does “G” stand for in the Spiritual Exercises?

Particular Examination of Conscience “G”

Friggin confusing directions Loyola

Hopefully, this will be of help.

The Singularity is Near — Again

28 Jan

(or, what in the world nanorobots have to do with 10,000 BC)

Sentient computers and chimerical cyborgs sound far-fetched, but if inventor Ray Kurzweil is right, the future’s going to be even weirder—and isn’t as far away as we imagine.

Kurzweil is an accomplished inventor, entrepreneur, and author, praised by Forbes magazine as “the ultimate thinking machine.”  A high compliment, really, considering what Kurzweil sees thinking machines becoming after the “Technological Singularity” he predicts.

The Singularity, according to Kurzweil in his book The Singularity is Near, is a transition stage in human history—it’s the point where we “transcend the limitations of our bodies and brains.”  More than that:

We will gain power over our fates.  Our mortality will be in our own hands.  We will be able to live as long as we want (a subtly different statement from saying we will live forever).  We will fully understand human thinking and will vastly extend its reach.  By the end of this century, the nonbiological portion of our intelligence will be trillions of trillions of times more powerful than unaided human intelligence.

(And this circa 2005.)

It’s practically a truism today that your new laptop is outdated the day you buy it (hey, maybe we’ll all have iPads tomorrow); but however lightly we make those comments, the real truth is that the rate of change, the acceleration of technological advancement, is kind of frightening.  This is the basis of the concept of the Singularity: the observation of accelerating technology.

If we tried to plot technological advances on a chart, even just intuitively, we’d probably get a pretty steep incline.  The problem, as Kurzweil sees it, is that our line would likely be straight.  However steep it is, a straight line assumes a constant slope—in other words, technology keeps marching forward, but the rate of that change remains constant over time.  If the rate of change itself is increasing, we get not a straight line but an exponential curve, one that, at a certain point, shoots up to become almost completely vertical.

Recall that this theoretical curve doesn’t show how advanced our technology is, but how fast it’s advancing, which is a rather more shocking thought.  And that, friends, is how we get to the Singularity.
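The linear-versus-exponential point is easy to check with a little arithmetic.  Here’s a minimal sketch (my own toy model, not Kurzweil’s math): it assumes a hypothetical capability curve that doubles every two years, then compares it against the “intuitive” straight-line forecast drawn from one recent year of progress.

```python
# A toy model of the linear-vs-exponential argument.
# Assumption (mine, for illustration): "capability" doubles every
# two years -- a stand-in for any exponentially improving technology.

def capability(t):
    """Exponential progress: doubles every 2 years (hypothetical)."""
    return 2 ** (t / 2)

def intuitive_forecast(t, window=1.0):
    """Straight-line extrapolation using the rate of progress
    observed over a short recent window -- the 'intuitive view'
    that the current pace will simply continue."""
    slope = (capability(window) - capability(0)) / window
    return capability(0) + slope * t

# Viewed over a brief duration, the straight line hugs the exponential...
short_err = 1 - intuitive_forecast(2) / capability(2)

# ...but over fifty years it misses almost all of the growth.
long_err = 1 - intuitive_forecast(50) / capability(50)

print(f"forecast shortfall after  2 years: {short_err:.1%}")
print(f"forecast shortfall after 50 years: {long_err:.5%}")
```

Run it and the straight line undershoots by less than ten percent at two years (which is why our intuition feels trustworthy), but captures almost none of the growth at fifty.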

Kurzweil discussed this much, much better in his 2001 essay, “The Law of Accelerating Returns”:

When people think of a future period, they intuitively assume that the current rate of progress will continue for future periods. However, careful consideration of the pace of technology shows that the rate of progress is not constant, but it is human nature to adapt to the changing pace, so the intuitive view is that the pace will continue at the current rate. Even for those of us who have been around long enough to experience how the pace increases over time, our unexamined intuition nonetheless provides the impression that progress changes at the rate that we have experienced recently. From the mathematician’s perspective, a primary reason for this is that an exponential curve approximates a straight line when viewed for a brief duration.

So maybe explaining the mechanics of the Singularity takes a mathematician, but in The Singularity is Near Kurzweil also waxes a little poetic himself when describing what happens after we get past that “knee-bend” in the curve (which arrives in the next few decades, quoth our futurist):

The Singularity will represent the culmination of the merger of our biological thinking and existence with our technology, resulting in a world that is still human but transcends our biological roots.  There will be no distinction, post-Singularity, between human and machine or between physical and virtual reality.  If you wonder what will remain unequivocally human in such a world, it’s simply this: ours is the species that inherently seeks to extend its physical and mental reach beyond current limitations.

At least for me, this is thrilling, chilling, mind-boggling, and vaguely horrifying, all at once.  It’s a world I’m going to leave to Charles Stross and David Louis Edelman to describe for now, until it comes around outside the realm of science fiction.

Because I’m pretty convinced it is.

Now I’m not any sort of scientist, and math majors terrify me, but I do know a little bit about history, and I can say this much—this won’t be humanity’s first Technological Singularity.

We are the species that inherently seeks to extend its physical and mental reach, Kurzweil asserts, and he’s absolutely right—human beings need technology to survive, a fact that’s been true since well back in prehistory.  We don’t have deadly claws, warm pelts, or species-wide instincts to help us survive; we can’t photosynthesize when we’re hungry.  What we do have, and have always had, however, is a reasoning mind; and what we do, and always have done, is use it to make the things we need but didn’t have the good fortune to be born with—stone choppers and spears, atlatls, the Amazon Kindle.

And in about 10,000 BCE, the sudden explosion of this early human innovation resulted in the birth of what we call “civilization”—agriculture, written language, cities.  And it happened, mysteriously, after hundreds of thousands of years of seeming stasis.  Our brains—human hardware, so to speak—were anatomically modern, but for whatever reason, it appears that we spent over 100 millennia just hanging out on the savannah waiting for that “great leap forward.”  (And if anyone comments that it was the aliens or Atlanteans, so help me God I will start a flame war.)

Key words here: hundreds of thousands of years of seeming stasis.

The metaphor I like best is that of a snowball.  You start small, with just a handful, which grows as it rolls along, perhaps down a hill.  As it rolls, it picks up more snow, but also more speed, so that before you know it there’s an avalanche running down the other kids at the bottom of the hill.

Remember that the early stages of even an exponential curve start slowly, an incremental increase that gradually builds until that knee-bend moment when the rate of change shoots up into the sky.  I think 10,000 BCE was that bend in the curve.

Human beings weren’t doing nothing out on the savannah all those millennia—they were building up the rudimentary foundations necessary for “civilization” to emerge.  Cities don’t just sprout up overnight.

– 2.5 million years ago, we began to use stone scrapers to butcher dead animals we scavenged.

– 1.6 million years ago saw the development of the first hand axes.

– 1.5 million years ago, Homo erectus began to manipulate fire (we’re not even at anatomically modern humans yet).

And then, in 10,000 BCE, we came to the Neolithic Revolution, with the development of agriculture.  Only a few thousand years later (a very short interval, considering the millions between changes in stone tool technology) came the earliest written languages.

In other words, 10,000 BCE began humanity’s First Singularity.

So it’s not cyborgs.  Still, Kurzweil defines the Singularity as the merger of human and artificial, of “biological thinking and existence with our technology.”

With the development of agriculture, people could generate surplus food and gain some protection from the vicissitudes of hunting and gathering—but it also tied early humans to certain land, to certain methods of production, to a lifestyle in which they “existed with their technology” to an unprecedented extent.

The decline of hunting and gathering meant the settlement of large groups of people together—thousands—rather than small nomadic bands or family groups.  Permanent houses and community structures were erected, meaning that in these first cities, we literally lived in our technology.

With writing, even how we interacted with each other (trade) and how we thought (religion and politics) came to be shaped by our technology.

10,000 BCE is exactly what Kurzweil describes—“civilization,” as opposed to nature, is after all the set of human constructs that humans live in.  It’s existence within our technology—a merger unprecedented in human (pre)history.

For that reason, I’m inclined to believe that Kurzweil’s right about a modern technological snowball leading to a radical paradigm shift.  It’s not like it hasn’t happened before.

Asimov and Machiavelli: Go Team Pragmatism!

23 Jan

As his Wikipedia article will tell you, sci-fi legend Isaac Asimov’s books have been published in nine out of ten categories of the Dewey Decimal System: everything but the 100s, philosophy.

That’s hardly surprising, considering that he wrote over 470 books.  That’s more than 6.5 books a year, assuming he began pounding on that typewriter in 1920, as an infant.  But you know, considering his resume, I wouldn’t be too surprised about that either.

And so, having finished Asimov and Robert Silverberg’s Nightfall this afternoon, I’m compelled to argue that Asimov deserves that tenth spot from Melvil Dewey: placed next to Asimov’s Foundation series, Nightfall displays a striking similarity, and a philosophical one at that.

It’s called Pragmatism.

As a formal philosophy, Pragmatism was developed primarily by William James and another Dewey (John)—but it could be argued just as well that Niccolo Machiavelli was one of its original proponents, all the way back in the 16th century.  With his political treatise The Prince, Florence’s most infamous son laid the foundations for political science as we know it.

And while “Machiavellian” has become synonymous with cunning, deceit, and unscrupulous manipulation (and a byword for characters like LOST’s Ben Linus and Gormenghast’s Steerpike—look it up; Mervyn Peake needs to get some readers this side of the pond), the ultimate intention of The Prince isn’t to be a guidebook for aspiring megalomaniacs.  It’s simply pragmatic: meaning, basically, that what’s true is what works.

That’s a strange definition at first read.  But the Pragmatist relies on a re-working of what we mean by the word “truth.”  Truth, conventionally conceived, is something we discover in a dusty library perusing ancient documents, or on a mountaintop communing with the divine, or paging through Wikipedia.  It’s something immutable, unchanging, and something that can be determined objectively.  It’s what correctly describes reality (formally, by the way, this is called the “Correspondence Theory of Truth,” but no one really needs to know that unless they have an upcoming dinner party to sound pretentious at or something).

The Pragmatist rejects this concept of truth.  Science shows us, after all, that theories are always being contested, revised, and contested again—it’s why we eschew attaching the word “law” even to the works of Newton or Einstein.  Science isn’t about dogma.  And that—quoth the Niccolo Machiavelli inside the Pragmatist—is why, when it comes to searching for “truth,” we should be more like scientists.  Truth doesn’t come by research or revelation, but rather by experiment.  We test, tinker, and investigate a question until we find something that works.  We’re actors in the world, after all—not passive observers.  The “truth” should facilitate successful action in the world: it has to be practical.

So let’s be semi-scientific for a moment:

Quantum theory is absolutely mind-boggling (at least for a layperson like myself): it confuses cause and effect, posits zombie cats both alive and dead at the same time, and raises the metaphysically bothersome proposition of an observer-created reality—but it works.  The predictions of quantum mechanics have been validated as extraordinarily accurate.  And so, for now, it’s true.

Which brings us back to Asimov (and if you haven’t read either Foundation or the novelization of “Nightfall,” then please be warned: thar be spoilers yonder)—

Asimov’s novels are filled with tough-minded pragmatists making horrifying decisions in horrifying circumstances about the horrifying future of humanity—usually against their deepest convictions and consciences.  The reasons tend to be pragmatic.

Take Captain Golan Trevize of Foundation and Earth (who I gleefully lambasted in my very first blog post, so long, long ago):

Trevize is the consummate individualist, something of a space cowboy who might have provided inspiration for the rebel pseudo-criminal Captain Malcolm Reynolds of Joss Whedon’s cult hit Firefly.  Trevize does, after all, kind of steal a Foundation cruiser, and he rejects with every ounce of his being the idea that the course of humankind’s future has been predetermined by the psychohistorical predictions and guidance of the ancient mathematician Hari Seldon and his secret planet of followers.  For Trevize, free will is everything.

The stakes only get higher when he learns that part of that planned course involves a friggin’ creepy galactic hive mind, Gaia.  “A superorganism,” Bantam’s back book cover explains:

“Gaia is a holistic planet with a common consciousness so intensely united that every dewdrop, every pebble, every being, can speak for all—and feel for all.  It is a realm in which privacy is not only undesirable, it is incomprehensible.”

The prospect, for Trevize, is repulsive (as it would be for those of us who abhor the Borg—which, in the late 1980s, was terrorizing one Captain Picard and co. alongside science fiction’s other Cold War kids).  And yet—he picks it.  Trevize alone (by some plot twist I still don’t fully comprehend) can choose or derail this future, and for the sake of species-wide unity in the face of possible extraterrestrial invasion, decides that the horror of total absorption of the individual is better than total annihilation of the species—though honestly, I will seriously debate this point, and I’ve got Mal Reynolds at my back.  In any case, former ideals are suppressed, and Golan Trevize does what he sees as most practical, damn him.

It’s a similar choice Theremon 762 of Kalgash has to face in Nightfall.  The novel, based on Asimov’s legendary short story of the same name—possibly the most famous short story of the entire genre—imagines a world in which six suns in the sky make Darkness unnatural and completely unimaginable, mentioned only in enigmatic texts of a creepy religious cult, the Apostles of Flame.  According to the Apostles, their incomprehensible Book of Revelation, and their steely-eyed leader Folimun, Darkness will descend every 2,049 years—one nightfall per two millennia—when the mysterious Stars will appear to suck out men’s souls.

Naturally, Theremon brushes this off as mystical mumbo-jumbo—even when scientific evidence from multiple academic fields begins to, disconcertingly, back up the Apostles’ claims.  He laughs it off, encourages public disbelief with his vicious rhetoric, and goes completely off his rocker for a couple days when night does fall, civilization does descend into utter madness, and the innumerable Stars unseat him from his cozy little place at the center of the universe.

But our hero’s better off than most—his sanity returns, and he embarks on a quest with a very few other mentally stable companions to reach Amgando, the site of what’s supposed to be a new provisional government.  The goal?  Combat the religious totalitarianism of the Apostles of Flame, who had been preparing for centuries for this apocalypse and are gearing up for world conquest.

Theremon, recall, hates the Apostles.  He hates them with a fiery passion.  He hates them for their mysticism, their anti-scientific attitude, their repressive dogma and creepy hooded robes.  Even if they were right all along.

But still, in a penultimate-page plot twist, Theremon joins them.  “Folimun,” he says of the Apostles’ leader:

“Is a totally ruthless, almost monstrously rational man who believes that the only thing that’s of real importance is the survival of civilization.  Folimun knows that in a time of total madness the best hope of pulling things together is religious totalitarianism.  You and I may think the gods are just old fables, but there are millions and millions of people who have a different view—and now they have an absolute dread of the gods.  The Apostles are in a better position to set up a world government.”

Theremon, for all he abhors the Apostles, can almost admire the “monstrously rational” Folimun.  “I hate the idea,” Theremon says; nevertheless, seeing it as the most practical route for the preservation of humanity (or Kalgashity… I’m not entirely sure what they are), he, not terribly enthusiastically, jumps on the bandwagon.  In a world of logic and reason, science was salvation; but with half the world gibbering lunatics, it’s the Book of Revelation to the rescue.  That’s pragmatism.

Now I don’t know what that says about Asimov’s religious views, but his philosophy’s pretty clear.  So just give him a spot in those 100s, already—okay?

Disclaimer: I am not a Pragmatist.  Really.  I make fun of it all the time here on the Scattering.  But I must admit—I kind of love Machiavelli… him and Thomas Cromwell… and Ben Linus.  Especially Ben Linus.  This is a psychological defect on my part, and should not be taken as an indication that I am a Pragmatist.  Alas, alack—how ashamed Ayn Rand would be.

Democratizing Technology, again

14 Dec

The more we control our technology, the better we like it.

This weekend I took the advice of 17-year-old Marcus, hacker protagonist of Cory Doctorow’s young adult novel Little Brother (2009’s Prometheus Award winner for Libertarian SF, by the way):

A computer is the most complicated machine you’ll ever use. It’s made of billions of micro-miniaturized transistors that can be configured to run any program you can imagine. But when you sit down at the keyboard and write a line of code, those transistors do what you tell them to.

Most of us will never build a car. Pretty much none of us will ever create an aviation system. Design a building. Lay out a city. Those are complicated machines, those things, and they’re off-limits to the likes of you and me. But a computer is like, ten times more complicated, and it will dance to any tune you play. You can learn to write simple code in an afternoon. Start with a language like Python, which was written to give non-programmers an easier way to make the machine dance to their tune. Even if you only write code for one day, one afternoon, you have to do it. Computers can control you or they can lighten your work — if you want to be in charge of your machines, you have to learn to write code.

It’s less a subtle suggestion than a call to arms, and I was inspired: I messed around with Python for an afternoon—because I’m a Humanities student and very proud of the fact, I used my very limited skills to write a program that quotes Hamlet at you, substituting your name for Ophelia’s (user_reply = raw_input("Get thee to a nunnery!")).

It was amazing.  And Cory Doctorow/Marcus was absolutely right: every time that little program carried out a command I typed, I shouted very loudly and very excitedly, astonished at the power of even my very, very limited abilities.  Doctorow’s point is almost as cool as my hamlet.py program itself: computers are complicated, and the Internet can be frightening, but we don’t have to be passive users.
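
For the curious, here’s a minimal sketch of what hamlet.py did—reconstructed from memory, with the exact wording and names being my own stand-ins, and Python 3’s input() replacing the Python 2 raw_input() I actually used:

```python
# hamlet.py -- a reconstruction of my afternoon project.
# (The quoted line and function names are illustrative, not the original.)

def hamlet_quote(name):
    """Quote Hamlet's nunnery speech with the user's name in Ophelia's place."""
    return ("Get thee to a nunnery, {}! "
            "Why wouldst thou be a breeder of sinners?".format(name))

# Interactive version, for running at the terminal:
#   user_name = input("What's your name? ")
#   print(hamlet_quote(user_name))

print(hamlet_quote("Ophelia"))
# -> Get thee to a nunnery, Ophelia! Why wouldst thou be a breeder of sinners?
```

Ten lines, one afternoon, and the machine dances to your tune—exactly as advertised.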

It’s democratization of technology.

We’ve probably all heard the faux-Chinese curse once or twice before: May you live in interesting times.  Interesting being dangerous, of course.  Well, this is an interesting time (Cory Doctorow: it’s one of the “best and weirdest” times in human history), and what makes it interesting—the powers, good and bad, of technology—are shared by some of the other weird and best periods in history.  Take 1850 to 1900:

Historians have a lot of names for the mid-to-late 1800s—Mark Twain’s designation of the period as the “Gilded Age” is probably the best known.  The second half of the 19th century, Twain believed, demonstrated unprecedented superficiality, decadence, and extravagant displays of wealth.

But Twain seems to ignore where that wealth came from—the growth of industry in the northeast, so productive that we call it the “Second Industrial Revolution.”  This was the time of Thomas Edison, Nikola Tesla, and Michael Faraday, among many, many others.  Advertisement, commercialization of the automobile (Henry Ford), and the mass production of consumer goods all began at this time.  Not to mention its effect on communication: for the telephone alone, this half-century has even been labeled the “Technical Revolution.”

Technology was changing lives, making the world smaller, and spreading ideas like never before.  Great inventors may stand on the shoulders of giants—no one’s disputing Newton here—but there are clear periods of time in which our gradual technological snowball triggers an avalanche, and everything shakes up.

The later 1400s were one of those times, with the invention of the printing press and movable type (in Europe, at least).  It was the first major democratization of knowledge in human history, and led to one of the greatest social upheavals in Western Civilization: the Protestant Reformation.

Another one of those inventive periods was the latter half of the 19th-century, the scientific and technological “revolution” that Mr. Twain called the Gilded Age.  And—in this case, I think unjustly—the quips of a great satirist like Clemens tend to stick.

There’s a competing epithet for the period, however: the Age of Optimism.  Hearing a friend or family member in your ear from thousands of miles away must have seemed like magic.  Confidence in the power of technology skyrocketed.

But power can go the other way too—and when WWII’s atomic bomb proved, as Carl Sagan writes, that “scientists knew sin,” faith in technology waned.  In the 1970s, Vietnam War protestors burned their draft cards in napalm, another contribution of science to the destruction of humankind.

No wonder our history books never mention that the “Gilded Age” optimism might have been warranted.

But I increasingly think that we’re in one of those technological avalanche periods—like the printing press and the telephone, the Internet has only further democratized knowledge and communication.  (Remember Bruce Sterling’s comment?  The Internet is the world’s, history’s, only “functional anarchy.”)

Newsweek shocked me this week with their feature: The Decade in Review.  At #8 on the happy endings list was the story of Martin Tankleff, convicted in 1990 for murdering his parents—as it turns out, he didn’t.  It’s a brief story, literally six lines, and half of them are these:

Because of new technologies, we can prove that mistakes were made.  Technology is neutral.  It is dispositive proof of objective fact, and it brings us closer to truth.

So it’s not unbridled optimism, but that’s a pretty confident assessment of the benefits of technological advances—and the talk about “objective fact” makes me grin with the thought that postmodernism might be falling out of favor.

I think we’re starting to trust technology again.

That can be dangerous—more than ever, there’s the risk of this ubiquitous, ever-more-powerful technology being twisted to watch us, track our movements, and surveil ordinary people’s activities in the grand tradition of an Orwellian-dystopian nightmare.  But like Cory Doctorow writes—and Newsweek, surprisingly, echoes—technology is neutral, objective, and it does our bidding.  Like the printing press, the Internet has democratized information to a greater extent than ever (ever) before; it’s our technology, and if we can learn to control it, maybe that 19th-century confidence can overwhelm the cynicism.

Just mess around with Python for a couple hours.  It’s painless; I promise.

Wheels within Wheels: Strange Lights in Norway and Ezekiel

10 Dec

Thousands of Norwegians witnessed a bizarre spiral of light in the sky on the morning of December 9.  Take a look:

(Read the original article and see more photographs at The Sun)

Update: 12/11 — Looks like the word is in, and the mysterious light show was a Russian missile launch after all.

Still, for what it’s worth, my first thought back on the 9th, after I snapped my gaping jaw shut and got over the initial shock, was that the images from Norway bore a striking resemblance to what some researchers have pointed to as a description of UFOs in the Bible.

Ezekiel 1:16 (New American Standard Bible)

The appearance of the wheels and their workmanship was like sparkling beryl, and all four of them had the same form, their appearance and workmanship being as if one wheel were within another.

“Beryl,” notably, is a transparent mineral, often blue or green in color (much like in the pictures of the spiral over Norway).

The similarity of the spiral seen to the description of “wheels within wheels,” I think, needs little explanation.

What I love most (even in my awe, in the traditional sense of wonder and fear) is that this suggests, if nothing else, that modern analysis of early texts might want to give a little more credit to the historiography of ancient peoples.  Now I’m not saying that the spiraling light was the fiery chariot of cherubim; that’s probably less likely than an alien invasion.  But it is interesting to consider that, if this fiery wheel does turn out to have been a natural “astral phenomenon,” as some were claiming earlier, it may have been seen sometime in the distant past as well.

We’re all quick and willing to apply “to err is human” to people of earlier centuries (how much more so for millennia?) without considering that we ourselves could be lacking vital information.  Human eyewitness accounts haven’t changed so much since biblical times, even if the equipment to capture something as bizarre as a fiery “wheel within a wheel” is now available.  After all, a number of Norwegians, grasping for words, compared the sight to a “Catherine wheel.”

Just a thought.