Tag Archives: Cory Doctorow

Parallel Universes Where the South Won the War

25 Feb

Whooosh, it’s the Flash-Sideways… again!

I’ll have to wait 8 years or so before I can say with some validity that this is the kind of question professional historians daydream about without daring to publish, but I’ll make an educated guess that they do. I imagine some Civil War re-enactors daydream and talk about it shamelessly, but as an undergrad history major who only pretends to be an Alabamian, I really can’t speak with authority on either count. I do know that in 9th grade my world history class reenacted the Battle of Waterloo, and I got all of us disgruntled French soldiers pumped up on ABBA before we waged a water-balloon shock-and-awe campaign on that smug 14-year-old Duke of Wellington and put Napoleon back on the throne for good.

Our teacher wasn’t terribly pleased.

But that sort of counterfactual history is the bread and butter of science fiction writers–remember Murray Leinster’s Sidewise in Time? And it’s not just temporal shifts in general that SF writers posit, either, but specifically Confederate/Nazi Americas.  Another classic example: Philip K. Dick’s (love him!) The Man in the High Castle.

PKD’s novel was first published in 1962–I call that the “coherent period.”  Come 1978 and you get something like VALIS, which almost makes House of Leaves look intelligible.  Almost.

(And that, my friends, is called postmodern name-dropping, included in honor of my *friend* and fellow blogger (well, she’s 3 posts in), Marina Roberts, who doesn’t get one of my weird pseudonyms because she’s an anthropology major and I don’t care about her Internet safety, and who called me a “scary hipster” the other day, which really touched a soft spot because I’ve never even been in a thrift shop.  The salespeople kind of freak me out.  But then, I won’t step into J. Crew either.)

In any case, and to preach to the choir, The Man in the High Castle is fantastic and well-deserving of its Hugo.  Remember the Sidewise Award for Alternate History?  I’ll bet that in an alternate universe where he lived to 1995, PKD won that too.

For anyone who hasn’t had the pleasure of a glimpse into Philip K. Dick’s twisted mind, here are a couple plot summaries for TMITHC.  Since it’s Phil, I figured we might need two:

Dick’s Hugo Award-winning 1962 alternative history considers the question of what would have happened if the Allied Powers had lost WWII. Some 20 years after that loss, the United States and much of the world has now been split between Japan and Germany, the major hegemonic states. But the tension between these two powers is mounting, and this stress is playing out in the western U.S.

What if the Allies had lost the Second World War …? The Nazis have taken over New York – the Japanese control California. In a neutral buffer zone existing between the two states an underground author offers his own vision of reality, an alternative world that offers hope to the disenchanted …Hugo Award winner Philip K Dick is one of the most original contributors to American sci-fi, and his books were the basis for the critically acclaimed films “Blade Runner” and “Total Recall”.

So it’s not exactly a Confederate States of America, but slavery is legal in this alternate America and freedom definitely isn’t ringing.  It’s a book that came to mind earlier this week as I watched a screening of Kevin Willmott’s “Confederate States of America,” an event hosted by the University of Alabama’s brilliant history department.  From Wikipedia, because I’m lazy:

C.S.A.: The Confederate States of America is a 2004 mockumentary directed by Kevin Willmott. It is a fictional “tongue-in-cheek” account of an alternate history, in which the Confederates won the American Civil War, establishing the new Confederate States of America (that incorporates the former United States as well).

The film primarily details significant political and cultural events of C.S.A. history from its founding until the 2000s. This viewpoint is used to satirize real-life issues and events, and to shed light on the continuing existence of discrimination in American culture.

A particularly interesting segment of the film involves the CSA’s participation (or lack thereof) in World War II.  In this reality, Hitler visited the States, whose Aryan leaders impressed him with the slave economies in North and South alike.

More insightful bloggers than me (read Cory Doctorow, and yes, that’s a double entendre) have written about the role of science fiction in society; a couple years ago, he wrote an article titled “Radical Presentism,” about “the way that science fiction reflects the present more than the future.”

For some years now, science fiction has been in the grips of a conceit called the “Singularity”—the moment at which human and machine intelligence merge, creating a break with history beyond which the future cannot be predicted, because the post-humans who live there will be utterly unrecognizable to us in their emotions and motivations.

Read one way, it’s a sober prediction of the curve of history spiking infinity-ward in the near future (and many futurists will solemnly assure you that this is the case); read another way, it’s just the anxiety of a generation of winners in the technology wars, now confronted by a new generation whose fluidity with technology is so awe-inspiring that it appears we have been out-evolved by our own progeny.

Confederate States of America isn’t science fiction in the way that The Man in the High Castle is, but both are counterfactual (alternate) histories–telling us less about what could have happened than what is happening.

In the question-and-answer session after the film screening, director Kevin Willmott (hell yes, he was there) suggested that–to paraphrase–the reason the South fought so hard in the Civil War was that they felt betrayed.  The country started out as the Confederate States of America–slavery and the three-fifths clause were enshrined in our very Constitution, and let’s not forget how very many Virginians we had for presidents back in the day.

Every move we make toward true democracy, Willmott argued (and that’s not just emancipation, but Civil Rights, repealing Don’t Ask Don’t Tell, etc.), makes the country more of the United States, and less the Confederate States.

If filmmaking ever falls through for Willmott, I’d suggest a career in science fiction.

Btdubs, I’m currently fascinated by the I Write Like tool, which has for months told me that in both the blogosphere and academe I write like H.P. Lovecraft.  Probably the prohibitively long sentences.  This post, however, has been textually analyzed and came out, quote the machine, as comparable to Cory Doctorow.  Nothing’s comparable to Cory Doctorow’s writing, but I’ll take what I can get.


Democratizing Technology, again

14 Dec

The more we control our technology, the better we like it.

This weekend I took the advice of 17-year-old Marcus, hacker protagonist of Cory Doctorow’s young adult novel Little Brother (2009’s Prometheus Award winner for Libertarian SF, by the way):

A computer is the most complicated machine you’ll ever use. It’s made of billions of micro-miniaturized transistors that can be configured to run any program you can imagine. But when you sit down at the keyboard and write a line of code, those transistors do what you tell them to.

Most of us will never build a car. Pretty much none of us will ever create an aviation system. Design a building. Lay out a city. Those are complicated machines, those things, and they’re off-limits to the likes of you and me. But a computer is like, ten times more complicated, and it will dance to any tune you play. You can learn to write simple code in an afternoon. Start with a language like Python, which was written to give non-programmers an easier way to make the machine dance to their tune. Even if you only write code for one day, one afternoon, you have to do it. Computers can control you or they can lighten your work — if you want to be in charge of your machines, you have to learn to write code.

It’s less a subtle suggestion than a call to arms, but I was inspired anyway and messed around with Python for an afternoon—because I’m a Humanities student and very proud of the fact, I used my very limited skills to write a program that quotes Hamlet at you, substituting your name for Ophelia’s (user_reply = raw_input (“Get thee to a nunnery!”)).
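Something in that spirit, anyway: here’s a minimal sketch of the kind of thing my hamlet.py did, reconstructed for illustration rather than copied from the original file, and using Python 3’s input() where my snippet above used Python 2’s raw_input().

# hamlet.py: a minimal sketch of the idea (my reconstruction, not the original script).
# Swap the user's name into Hamlet's lines to Ophelia (Act III, Scene 1).
name = input("Who's there? ")  # use raw_input() on Python 2

lines = [
    "Get thee to a nunnery, {0}!",
    "I did love you once, {0}.",
    "God hath given you one face, {0}, and you make yourselves another.",
]

for line in lines:
    print(line.format(name))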

It was amazing.  And Cory Doctorow/Marcus was absolutely right: every time that little program carried out a command I typed, I shouted, very loudly and very excitedly, at the astonishing power of even my limited abilities (very, very limited abilities).  Doctorow’s call to arms gets at something almost as cool as my hamlet.py program: computers are complicated, and the Internet can be frightening, but we don’t have to be passive users.

It’s democratization of technology.

We’ve probably all heard the faux-Chinese curse once or twice before: May you live in interesting times.  Interesting being dangerous, of course.  Well, this is an interesting time (Cory Doctorow: it’s one of the “best and weirdest” times in human history), and what makes it interesting—the powers, good and bad, of technology—is shared with some of the other weirdest and best periods in history.  Take 1850 to 1900:

Historians have a lot of names for the mid-to-late 1800s—Mark Twain’s designation of the period as the “Gilded Age” is probably the best known.  The second half of the 19th century, Twain believed, demonstrated unprecedented superficiality, decadence, and extravagant displays of wealth.

But Twain seems to ignore where that wealth came from—the growth of industry in the northeast, so productive that we call it the “Second Industrial Revolution.”  This was the time of Thomas Edison, Nikola Tesla, and Michael Faraday, among many, many others.  Advertising, the commercialization of the automobile (Henry Ford), and the mass production of consumer goods all began at this time.  Not to mention its effect on communication: for the telephone alone, this half-century has even been labeled the “Technical Revolution.”

Technology was changing lives, making the world smaller, and spreading ideas like never before.  Great inventors may stand on the shoulders of giants—no one’s disputing Newton here—but there are clear periods of time in which our gradual technological snowball triggers an avalanche, and everything shakes up.

The later 1400s were one of those times, with the invention of the printing press and movable type (in Europe, at least).  It was the first major democratization of knowledge in human history, and led to one of the greatest social upheavals in Western Civilization: the Protestant Reformation.

Another one of those inventive periods was the latter half of the 19th century, the scientific and technological “revolution” that Mr. Twain called the Gilded Age.  And—in this case, I think unjustly—the quips of a great satirist like Clemens tend to stick.

There’s a competing epithet, however: the Age of Optimism.  Hearing a friend or family member in your ear across thousands of miles must have seemed like magic.  Confidence in the power of technology skyrocketed.

But power can go the other way too—and when WWII’s atomic bomb proved, as Carl Sagan writes, that “scientists knew sin,” faith in technology waned.  In the 1970s, Vietnam War protestors burned their draft cards in napalm, another contribution of science to the destruction of humankind.

No wonder our history books never mention that the “Gilded Age” optimism might have been warranted.

But I increasingly think that we’re in one of those technological avalanche periods—like the printing press and the telephone, the Internet has only further democratized knowledge and communication.  (Remember Bruce Sterling’s comment?  The Internet is the world’s, history’s, only “functional anarchy.”)

Newsweek shocked me this week with their feature: The Decade in Review.  At #8 on the happy endings list was the story of Martin Tankleff, convicted in 1990 of murdering his parents—as it turns out, he didn’t.  It’s a brief story, literally six lines, and half of them are these:

Because of new technologies, we can prove that mistakes were made.  Technology is neutral.  It is dispositive proof of objective fact, and it brings us closer to truth.

So it’s not unbridled optimism, but that’s a pretty confident assessment of the benefits of technological advances—and the talk about “objective fact” makes me grin with the thought that postmodernism might be falling out of favor.

I think we’re starting to trust technology again.

That can be dangerous—more than ever, there’s the risk of this ubiquitous, ever-more-powerful technology being twisted to watch us, track our movements, and surveil ordinary people’s activities in the grand tradition of an Orwellian-dystopian nightmare.  But as Cory Doctorow writes—and Newsweek, surprisingly, echoes—technology is neutral, objective, and it does our bidding.  Like the printing press, the Internet has democratized information to a greater extent than ever (ever) before; it’s our technology, and if we can learn to control it, maybe that 19th-century confidence can overwhelm the cynicism.

Just mess around with Python for a couple hours.  It’s painless; I promise.

Anarchy on the Internet (and why it’s good)

24 Nov

Everyone knows that middle-aged sexual predators lurk in chatrooms, posing as insecure tweens looking for a friend, or friending other insecure tweens on MySpace; that if you don’t lock up your wireless network tight, terrorists are going to tap into it and turn your naivete into massive-scale crime; that that email with the suspicious subject line is a virus that’s going to delete all your files (even if you do have a Mac); and that if you don’t forward this message of holiday cheer to 42 people by midnight, an axe murderer will sneak into your room at 3 am and— ZZSWAR9ARG7Z

You get the point.  There are dangers hiding behind every hyperlink.

I don’t mean to be flippant (no, that’s a lie; I do, but it’s strictly rhetorical)—the Internet can be a scary place, and scary people use it.  I’m all for parental controls and spam queues.  What I’m not for is the premise underlying Internet fear-mongering—because it’s not always just “Stranger Danger.”

Some of the outcry against danger (or obscenity, or perversion, etc, et al) comes with a call to action that frightens me more than any technological boogeyman—if the Internet is dangerous because it’s so open, because anyone can do, really, anything, why not regulate?

In 1993, SF author Bruce Sterling (“Junk DNA,” remember?) wrote an article called “A Short History of the Internet,” which you can find in its entirety online, and which I highly recommend.  For my part, I’ll focus on just a few key facts, some of the points from the reading assignment for today’s American Studies lecture on “The Internet Revolution.”  So:

1. The very openness and decentralization of the Internet that makes it “dangerous” was built into its most basic structure—from the perspective of a Cold War scientist, you see, a communication network would have to be as decentralized as possible in order to still function after a nuclear holocaust wiped out God-knew-where in the United States.  With this in mind, the less authority—the better (sounds strange for a military-government program, doesn’t it?).

2. And after decades of evolution, that’s what we still have: no authority.  Sterling asks:

Why do people want to be “on the Internet?”  One of the main reasons is simple freedom.  The Internet is a rare example of a true, modern, functional anarchy.  There is no “Internet Inc.”  There are no official censors, no bosses, no board of directors, no stockholders.  In principle, any node can speak as a peer to any other node, as long as it obeys the rules of the TCP/IP protocols, which are strictly technical, not social or political.

Sixteen years after those words hit shelves in The Magazine of Fantasy and Science Fiction, that’s still true: it’s simple science fact, and no less amazing for it.
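You can see the technical side of that claim up close.  Here’s a tiny illustration of my own (a sketch, nothing from Sterling’s essay): one “node” listens on a TCP socket, a peer connects and speaks to it, and nothing resembling an “Internet Inc.” has to approve the exchange; the only rules either side obeys are the protocol’s.

import socket
import threading

# Set up a listening "node" first, so a peer has something to connect to.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 9000))
srv.listen(1)

def echo_once():
    # Accept a single connection and answer whatever the peer sends.
    conn, _ = srv.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"Peer received: " + data)

listener = threading.Thread(target=echo_once)
listener.start()

# Any other node can now speak to it directly, as long as it follows TCP/IP.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", 9000))
    cli.sendall(b"Hello from a peer node")
    print(cli.recv(1024).decode())

listener.join()
srv.close()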

Online, you are what you type, upload, or post—identities are fluid.  True, that fluidity might mean a fifty-year-old man staring at a glowing screen in his basement pretending to be a junior high girl on some Edward Cullen fan site, but it also means that young Peter Wiggin can blog and be seen by the world as an elder statesman.

It’s freedom to be creative without the stigma of age, sex, race, or anything else that might lead someone to prejudge you before looking at your work or ideas: online, you are your ideas.

Blogger and SF writer Cory Doctorow’s name (which I feel I mention every other post) is almost synonymous with Internet freedom.  Publishing his novels under a Creative Commons license for free distribution online (DRM-free, I might add), Doctorow could almost be a character from one of his own books—Alan/Adam/Albert/Avi, for example, from Someone Comes to Town, Someone Leaves Town, spends the time he’s not brooding about his troubled childhood as the eldest son of a mountain and a washing machine, setting up a free, open, wireless network for the people of his local town.

(I did say almost a character.)  In any case, he practices what he preaches, and in all his books shows just how cool our world is.  I’m going to have to quote Makers again– we’re living in the “weirdest and best time” in the history of the world.  Witness the astonishing success of modern anarchy:

“No one needed to draw a map of the Web,” Kurt said, “It just grew and people found its weird corners on their own.  Networks don’t need centralized authority, that’s just the chains on your mind talking.”

I have to give my professor credit—“revolution” was a good word for the lecture title.  Even after our first Revolution, observers (read: Alexis de Tocqueville) noticed a tension in American society between liberty and equality, freedom and democracy.  Oftentimes, they clash (see any debate on social welfare programs—the object is equality of outcome, but at the expense of the freedom to use and dispose of one’s own property and money).

But no political arguments in this post about liberty and equality: the anarchy of the Internet is one of the only places where you don’t really have to choose.

Nostalgic, Prescient (and very, very memorable) Science Fiction

23 Nov

Somehow, without me noticing, the science fiction writers I remember from magazines of the early-2000s appeared on my bookshelf again.

For the last few weeks, I’ve been on a mission to find copies of the first SF stories I can remember reading—two of them I knew for sure came from an issue of Asimov’s Science Fiction magazine; two of them might be in one of a number of old anthologies of my grandfather’s; and one of them might just be from a dream I had years ago and inflated into a dystopian epic (it happens).

In any case, after diligent Google searching and telephone inquiries with a used bookstore in Oregon, I was able to get a listing of the titles and authors of short stories in Asimov’s from 2002 to 2005.  The problem was that it’s a monthly magazine, and I couldn’t remember if my subscription had begun when I was a freshman in high school, or when my older sister first brought back those QSP-issued order forms for the annual magazine drive.

So: after nearly 7 years, I couldn’t remember the authors, or the titles (shoot, I couldn’t even remember the year).  This may have something to do with the fact that back in those halcyon days of yore, I was a very sweet, very impressionable middle-school girl who found herself horrified by the lurid cover illustrations and pulp fiction content of the publication—a semi-nude, iridescent faerie was not, after all, what Dune and Contact had prepared me for.

I read no more than two or three issues, tossed the rest out, and did not renew my subscription.  I would stick to the classics, I decided.

But for 7 years I’ve managed to vividly remember two stories—or at least, bizarre details from two stories—from one of the few issues I’d read.

The first was about a woman with some sort of genetically-engineered pets franchise: they had a strange name (ploompies?  ploofties?) and were globular, translucent, pulsing masses of the buyer’s own DNA.  And somehow, these creatures were so appealing that the owner could hardly help but bite into them—and get a taste of something sharp and metallic (in my orthodontics-oriented middle-school mind, that jagged pain you get from biting down on a piece of tinfoil with a filled tooth).

The second story had something to do with a girl and her dog; they lived in the “real world,” or rather, the physical world, because when she grew up, she would have to abandon her body and live in a completely virtual world, like the Internet.  Some accident happens to the girl, and her body is lost—she herself is just barely uploaded in time, but the dog can’t be saved.

This isn’t much to go on.  But paging through lists of titles online, I spotted one called “Junk DNA.”  Alarms went off in the brainpan.  I bought a used copy of the January 2003 issue of the magazine, and checked my PO Box daily until it arrived.

The first story, about the bizarre pets (Pumptis, as it turns out), was indeed “Junk DNA,” by Bruce Sterling and Rudy Rucker.  And here’s the passage that had so stuck with me:

In a dizzying moment of raw devotion, Janna suddenly found herself sinking her teeth into the unresisting flesh of the Pumpti.  Crisp, tasty, spun-cotton candy, deep-fried puffball dough, a sugared beignet.  And under that a salty, slightly painful flavor—bringing back the memory of being a kid and sucking on the root of a lost tooth.

Why that particular imagery was so memorable, I don’t know.  More interesting is the fact that the genre of the story is one I’ve been raving about for the past few months:

“Junk DNA” is a science fiction story about a business venture and all the backroom politicking that goes along with economics, invention, and the market.  Sound a bit like…?

(My post on) Cory Doctorow and Makers, his very recent epic of robotics, business, and the “New Work” (like the New Deal, but way more free market);

(My post on) David Louis Edelman and his Jump 225 series, for which “cyberpunk” hardly does justice as a classification—the corporate intrigue behind Bio/Logic and MultiReal (and how could there not be corporate intrigue with sociopathic entrepreneur Natch at the helm?) is just as intense as the science;

Charles Stross and Glasshouse, which won the 2007 Prometheus Award for “libertarian SF” (This, friends, is my life goal), or The Atrocity Archives, which is something of a spy thriller with a science fiction element closer to Lovecraftian horror than anything else (take a look at the January 2003 cover illustration and you’ll see where I’ve found a connection with Lovecraft).

Even one of the authors, Bruce Sterling, will be appearing on my bookshelf when The Caryatids arrives in the mail in a couple weeks.  And the last page of the January 2003 issue is a sort of preview-of-coming-attractions feature, listing authors and stories for the next issue—one of them, by the way, is Charlie Stross.

To think, I thought these were new discoveries.

Mystery Story #2 also happened to be in the Jan. 2003 issue—“Pick My Bones With Whispers,” by Sally McBride.  This was a major lucky break, as I would never have remembered that the second story imprinted on my malleable brain had been the winner of the Pretentious Title Award for 2003.  (Is McBride trying to be ironic?  I sincerely hope so…)

And once again, the topics that fascinate me today, I discover, are absolutely nothing new.  The research I recently did on the millennial generation’s changing conception of the Internet (or, for them/us, Cyberspace)—from a tool to a place that has been increasingly explored since childhood—is all there in the saga of Lizbeth and her faithful virtual pup, Fritz:

Though I’m twelve, there’s still a lot I can’t do in the children’s Net areas, even if Fritz was letting me in deeper and deeper all the time.  There were dark places I couldn’t go, forbidden subjects I couldn’t get data on, tantalizing things I couldn’t see or join or do.  Sometimes it was humiliating to be a flesh-and-blood person.

This sounds so much like one of the responses I got from an interviewee for my paper that it’s almost shocking.  She doesn’t use the Internet to the same extent as her peers—and so (like Lizbeth, albeit less dramatically) resists absorption into Cyberspace.  She told me:

“Everyone talks about how big the Internet is, and I know, because I can go on for hours and hours and still feel like I’ve never gotten into the core of it.  If the Internet was real life, I would be non-existent.”

This interviewee in particular doesn’t care for science fiction—she enjoys borrowing my DVDs of Firefly, but that’s about it.  No 2003 Asimov’s Science Fiction for her.  And still, she easily could have spoken those lines from McBride’s story.

This—like the theme and subject matter of recent novels by authors like Stross, Edelman, and Doctorow—tells me that there’s something in the culture today that stories like “Junk DNA” and “Pick My Bones With Whispers” (I’m sorry, I still really can’t type that without cracking up) picked up on back in 2003: the increasing interconnectedness of technology and economics, and the transformation of the Internet into an environment rather than just a tool.

Getting that old magazine in the mail today was like a wave of nostalgia, but after reading through those stories again, the sentimentality was gone—the things I missed and remembered for 7 years are mainstream now.

Cory Doctorow’s Makers predicts the present

10 Nov

I finished Cory Doctorow’s latest novel last Sunday, at a commercial break during the season finale of Mad Men.  Notable for portraying a Don Draper losing his cool, one scene has the frustrated ad man shouting:

“I want to work!  I want to build something of my own.”

In the context of having just read Makers, that’s a telling line.  But I’ll explain—

Twice the size of Eastern Standard Tribe and absolutely dwarfing his first novel Down and Out in the Magic Kingdom, Makers is an epic of economics, technology, corporate psychopaths, and people who “just want to make things.”  It isn’t Homer, but there’s definitely something Greek about the fates of a number of central characters—fatal flaws, tragic irony, all that—and by the time I logged out of Preview and filed Makers away on my desktop with the other PDF copies of Doctorow’s books, I wasn’t sure whether to laugh or cry, but I was sure that Doctorow had succeeded in capturing and articulating something significant in the culture.

Here’s why:

Makers tells the story of the birth and untimely death of an economic movement reporter/blogger (and initial protagonist) Suzanne Church calls the “New Work,” aimed at turning ancient, lumbering “dinocorps” into flexible teams of innovators.  In the words of Tjan, one of the “suits” who works with the original New Work team of Perry Gibbons and Lester Banks (an eccentric pair of hacker-inventors who act more like an old married couple than a business partnership), the purpose of the New Work is this:

“We’re going to create a new class of artisans who can change careers every 10 months, inventing new jobs that hadn’t been imagined a year before.”

And when reporter Suzanne questions the stability of that sort of system, he argues passionately for his idea—

“That’s a functional market,” he insists, going on to deliver a free market sermon that made me want to stand and applaud:

“If you want to make a big profit, you’ve got to start over again, invent something new, and milk it for all you can before the first imitator shows up. The more this happens, the cheaper and better everything gets. It’s how we got here, you see. It’s what the system is for. We’re approaching a kind of pure and perfect state now, with competition and invention getting easier and easier—it’s producing a kind of superabundance that’s amazing to watch. My kids just surf it, make themselves over every six months, learn a new interface, a new entertainment, you name it. Change-surfers…”

It doesn’t sound like stereotypical science fiction—alien invasion, cyborgs, and omnipresent governments of a nightmarish dystopia.  It’s our world, the world today, with bloggers outstripping traditional print papers and corporate bureaucracy doing its best to smother fluid, mobile models of small groups of innovators under the weight of its own inertia.  It’s at once the “weirdest and best time the world has yet seen.”

But Dickens and faux-Chinese proverbs will out—if it was the best of times, it was the worst of times, and everyone knows that to live in interesting times is a curse.

That’s something Doctorow’s characters learn the hard way.

For the sake of saving the plot for the readers, I’ll just say this: the brilliant whirlwind of creativity we find in Part I isn’t the end—Parts II and III take us from this dizzying height to a doldrums of frustration and stagnation.

It’s what our high school English teachers would call a “chiasmus,” a crossing of paths or slopes or fate lines, with characters selling-out or buying in or suing or countersuing each other so fast that I can’t tell who to root for anymore.

But if one thing is clear, it’s this:

The title of the novel could apply to literally every character in the story, all of whom express at some point or another a deep desire to make/do/create.  But their hands are tied by red tape, or they’re strangled by lawsuits, or their fate lines are snipped off with the shears of the bureaucratic Atropos: inertia.

“All he wanted was to have good ideas and make them happen,” Doctorow writes of Sammy Page, a Disney dinocorps executive trapped by the rigid structure of the company.  “Basically, he wanted to be Lester.”

But the politics engulfing Lester’s once-happy life as an inventor selling his creations on eBay make it so that even Lester can’t “be Lester”: “Why couldn’t he just make stuff and do stuff? Why did it always have to turn into a plan for world domination?” he thinks.

Sound something like Don Draper?  (or maybe Howard Roark?)

Cory Doctorow writes on his blog that his science fiction represents “radical presentism,” a prediction of the present rather than the future—meaning that in Makers it’s our world today he’s looking at, probably the reason the novel’s so unnerving.  Doctorow creates a cognate of modern America, a place in a frenzy of invention and creativity—until the idealism dies.  The tragedy is that the frustrated desire to just make stuff doesn’t die along with it.

AMC, at least, is catching on to the cultural drift—if a little after Doctorow—and I have to say, the Mad Men finale is a bit more optimistic than the fifteen-years-later epilogue of Doctorow’s epic.  The smaller, more flexible new agency of Sterling Cooper Draper Pryce, after all, is staffed completely by makers.

Death of Books?

17 May

First question– Can we please stop burning libraries?

While 212 BC saw the initiation of a construction project awe-inspiring even today–the Great Wall of China–it also saw the death of thousands upon thousands of ancient Chinese texts, ordered burned by Emperor Qin Shi Huang.

Whether by Julius Caesar “accidentally” during the Alexandrian War, by Christian fanatics at the order of Emperor Theodosius I, or by the Muslim Caliph Umar after the Battle of Heliopolis–someone burned the massive Library at Alexandria (someone who found the ancient manuscripts enough fuel to heat baths for the soldiers for six months, according to legend).

Of the thousands of Mayan codices encountered by the Spanish conquistadors, fewer than five are known to exist today.

History is written by the winners of wars?  How about– history is erased by the winners of wars.  Which brings us to our:

Second question– How much irretrievable knowledge has been lost to suppression over the course of civilization?

The hearts of Humanities majors everywhere are breaking.

But reading Cory Doctorow via free downloads (thanks once again, Creative Commons) reminds me that books are growing increasingly difficult to kill, at least completely.

If the printing press catalyzed a storm of information dissemination, what can we call the Internet?  Hurricane? Maelstrom?  And as much as I love holding a tangible, physical object while I read, I have to admit that there’s one very important advantage to distributing books online: they’re harder to burn.

I’m with Doctorow on this one:

“Here’s a thing I’ve noticed about the present: more people are reading more words off of more screens than ever before. Here’s another thing I’ve noticed about the present: fewer people are reading fewer words off of fewer pages than ever before. That doesn’t mean that the book is dying—no more than the advent of the printing press and the de-emphasis of Bible-copying monks meant that the book was dying—but it does mean that the book is changing. I think that literature is alive and well: we’re reading our brains out! I just think that the complex social practice of “book”—of which a bunch of paper pages between two covers is the mere expression—is transforming and will transform further.”

Maybe I can’t make notes in the margins of a PDF file, and maybe reading off the glowing monitor of my laptop is more of a strain on the eyes, and maybe there’s no more “new book smell” or breaking in of a crease-free spine, but that’s not too much to complain about for the sake of increasing the circulation of histories (or any other stories).  Technology doesn’t just transform, it evolves.

And after all those tragedies-by-fire in the ancient world, the printing press in the Middle Ages and the Creative Commons licenses today are in the same line:

Survival adaptations.

Read science fiction online?  Check. http://craphound.com

Update: Turns out I can make notes in the margins of my Kindle, and the screen is much more like a print page than a computer monitor (no strain!).  This is not a paid advertisement or anything–I’m just very happy, and wanted to share another example of lovely technological “evolution.”