Think about Facebook: An angry reverie on software

26 May 2010


[Note on 5 March 2012: Today this has been making the rounds on Twitter among some of the readers and writers I most admire. I’m happy. I’m also terribly self-conscious, because this was sneezed out over a few evenings two years ago in response to one of the periodic waves of paranoia about Facebook, and without false modesty there’s a lot about it that makes me cringe to re-read. It’s a first draft without typos, basically. But I don’t want to sandbag too much: I’m really proud that smart folks are finding something good here.

If e-mail is too old-fashioned, you’re welcome to tweet at me: @vruba.]

  1. Purpose
  2. Ten thousand years of user-generated content
  3. Apology Preface
  4. Engagement
  5. Philosophy of technology
  6. Iodine (abstraction)
  7. To the people
  8. Binary, guilt, hothouse, elves, hummingbirds, fear
  9. A Wikipedia bookmarklet
  10. More sullen frustration (skippable)
  11. La différance & der Verfremdungseffekt
  12. The iPad is bad
  13. The iPad is good
  14. Weapons of mass destruction and farm tools
  15. Facebook is bad
  16. Facebook is good
  17. Software is the dangerous Other
  18. Software is us
  19. How to think about Facebook

Purpose

Every few weeks lately I’m reminded of Stewart Brand’s Purpose for the first Whole Earth Catalog in late 1969:

We are as gods and might as well get good at it. So far, remotely done power and glory – as via government, big business, formal education, church – has succeeded to the point where gross defects obscure actual gains. In response to this dilemma and to these gains a realm of intimate, personal power is developing – power of the individual to conduct his own education, find his own inspiration, shape his own environment, and share his adventure with whoever is interested. Tools that aid this process are sought and promoted by the Whole Earth Catalog.

I am annoyed by about half the words there, but I do like the gist. It reminds me of liberation theology, which was showing up at about the same time. These were both part of a tradition of access to tools – a Whole Earth Catalog slogan – that goes back through people like Lilburne, Komenský, and Hypatia, for three whose names, but not their stories, were in the history books of my childhood. Their tradition is humanism, the idea that people are the standard for their things and inherently deserve power.

Ten thousand years of user-generated content

As I was thinking about how people talk about Facebook on Facebook, my mother sent me a video of Rory Sutherland defending advertising in mid 2009. Something here shakes me:

This [slide: farmer carrying melons in a bucket yoke] is what’s called in the digital world user-generated content, though it’s called agriculture in the world of food [audience laughs]. This [slide: people cooking] is called a mashup, when you take content that someone else has produced and you do something new with it; in the world of food we call it cooking.

It’s that the audience laughs. Even with credit to Sutherland’s delivery, I think these comparisons should be obvious. If they’re surprising enough to be funny, it means that the people who should be naturalizing infotech are failing. The humor is a spark across a conceptual gap that should not exist.

Apology Preface

Hi. I’m one of the ones failing. I think a little about some parts of these things, but I don’t explain them much. So I guess I’ll try.

If you want an essay with practical ideas for dealing with Facebook, Joshua’s working on that. Watch for it. The draft I saw was clear, perceptive, and useful – good advice.

What you’re reading is none of that. It’s not even bad advice anymore, and it’s a stretch to call it an essay. If you gave me retrograde amnesia, told me you had written it to be a proper essay, and asked for edits, I would think less of you. I would try to find a gentle way of telling you to re-write it with some kind of hierarchy or at least linearity of ideas, a lot more rigor, much less whine, and consistent tone. I’d be especially curious why the witch calls you by my name.

Not an essay, not an essay. More the kind of fantasia/tirade that I would write in a personal letter and hope its stridency, unpleasantly kaleidoscopic flow, and living darlings would be accepted in a spirit of headlong sincerity.

Okay, look. What I’m trying to say is that this maybe isn’t good enough to publish, but it’s meant for a moment that’s already closing, and even if it’s pretty embarrassing I think it’s worth it. If it’s tedious, sorry but I told you so, and go read Joshua’s thing.

This is a little pile of postcards from a hacker/ICT view on software as a human thing, as human as weaving or cooking. That’s what seemed important to show after I was scared by the laughter at Sutherland’s comparisons.

Engagement

I often remember the Leatherman Wave (’98 version) that I lost in mid 2006, and what Bruce Sterling said about multitools in late 2008:

A multitool changes your perceptions of the world. Since you lack your previous untooled learned-helplessness, you will slowly find yourself becoming more capable and more observant. If you have pocket-scissors, you will notice loose threads; if you have a small knife you will notice bad packaging; if you have a file you will notice flashing, metallic burrs, and bad joinery. If you have tweezers you can help injured children, while if you have a pen, you will take notes. Tools in your space, saving your time. A multitool is a design education.

What a good feeling: to be capable and observant, to be critical and creative and involved in your environment. I wish this on everyone, but it’s complicated. It’s involved with power and wealth. It comes from different things in different contexts for different people. It’s not always good: it can be a distraction or an ego trip. But I think it’s worth it, because people who notice things changing them and themselves changing things are happier and make better decisions.

Philosophy of technology

What’s the name of thinking that people need multitools? There isn’t a good one because we have no popular philosophy of technology. We have everyday words for theories about the is and the ought of politics and music, but not of nanotech and the social Web. We have luddite and gadget freak and that’s about it.

If I say I’m a democratic-transhumanist technoprogressive with interests in the Long Now, postgenderism, ad-hoc collaboration, technopaganism, regenerative design, upcycling, extended personhood, smart dust, wiki-GIS, unschooling, free culture, artistic patronage, attribution-only licensing, gacaca courts, TRCs, permaculture, village fabs, radical IP reform, trust gel, ICT4D 2.0, nuclear disarmament, ambitious space colonization, stewardship-oriented deep ecology, programming for kids, fast military MARO, the capabilities approach, solar and wind power, distributed banking, subsidiarity, gift economies, moderately liberal infoethics, homesteading, SETI/METI, Nussbaum’s critique of the wisdom of repugnance, strong crypto, geoengineering, freedom to tinker, optional localism, HEAP, seed and culture banks, anarcho-capitalism in internalizing ordoliberalism, R2P/post-Westphalianism, blue Mars, neurodiversity, and the culture of abundance, I’m not saying things that make sense to my neighbors.

I worry we’re doing things we don’t know how to talk about. We don’t have to use my favorite silly neologisms, but the things they’re trying to describe – big plans about what to do and how – should be made, owned, and remade by everyone.

Iodine (abstraction)

The best low-tech source of the micronutrient iodine is things that grow in seawater. Until iodized table salt, most human societies (especially those on mountain soils) either ate some seafood or had a high rate of thyroid and developmental problems. There was even a fringe theory that the Neanderthals were debilitated by iodine deficiency disorders, presumably because they couldn’t brain hard enough to run trade routes to the ocean.

This should shake an idea I often see about recent technology, that it newly separates people from the foundations of their lives. Of the 100,000,000,000 or so people who have lived, my guess is that few of us have personally harvested all the fish, shellfish, and seaweed we’ve each eaten, or prepared our own iodized salt. Nor made all our own clothing, or all our own myths, or all our own fires, or all our own words, or loved only ourselves. So when people say that infotech makes us dependent, I haven’t heard an indictment. We have always depended on things outside our control. We’ve always lived mediated, fragmented lives. That’s just the fact of being talking, tool-using social animals, and it’s not interesting to complain about. If you really think dependence and a finite horizon of engagement are necessarily bad, I sure hope you get to the ocean before your thyroid gland swells up like an overfed puppy.

In software engineering we’re gaga over something called information hiding, encapsulation, modeling, or abstraction. We mean making it easy to use things as tools. Many pieces of code are extremely complex, but we admire them when we can interact with them in simple ways. The first wiki, the Portland Pattern Repository’s, is a 15-year attempt to apply Christopher Alexander’s ideas about design patterns in architecture to software, and it’s full of argumentation about how to abstract things.

An abstraction is an interface: a line between systems, or a dotted line within a system, that only a little information can cross. A car’s controls are an abstraction of its workings. Congress is an abstraction of the will of the citizens. A user interface is an abstraction of software. Maps are abstractions of land. These are ways we use to concentrate on what without getting distracted by too much how.
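
If you want the smallest concrete taste of this in the programmer’s sense – a doodle, not a definition, written in the same JavaScript your browser already speaks, with every name in it made up for the occasion – here is a pretend car that hands you a dashboard and keeps its engine to itself:

  // A toy abstraction. The object handed back at the end is the interface:
  // the dotted line that only a little information gets to cross.
  function makeCar() {
    var rpm = 0;     // hidden workings; nothing outside can reach these directly
    var fuel = 50;
    function burn(n) { fuel = Math.max(0, fuel - n); }   // internal detail, never exposed
    return {
      accelerate: function () { rpm += 500; burn(1); },
      brake:      function () { rpm = Math.max(0, rpm - 500); },
      dashboard:  function () { return { rpm: rpm, fuel: fuel }; }
    };
  }

  var car = makeCar();
  car.accelerate();
  car.dashboard();   // { rpm: 500, fuel: 49 } – the what, without the how

You can drive it without ever hearing about burn, which is the whole trick, and the same trick your real car’s pedals play on you.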

But abstractions are never perfect. Sometimes the information they’re supposed to hide leaks through and distracts us. Other times, we get so cozy with them that the opposite happens: we start to act as if the abstractions themselves are the real thing.

These are problems that come up whenever we try to represent or summarize or symbolize anything. Languages and scientific theories have analogous trouble. Even economies: the specialization of industrial society is based on abstracting people’s labor into narrow roles, so that instead of going to a farmer and having to listen to them talk about irrigation systems and tractors, I can go to the big-brand supermarket and reliably pay a set amount of money for a consistent amount of food of a certain minimum quality. To some people this is glorious. To other people it’s a disaster. I think it’s a little more complicated than that.

To the people

A networked computer is liberation. It’s free speech. It’s access to public opinion, medical advice, police, and banking. It’s a directory of people selling what you consume and people buying what you produce. It’s an atlas and an encyclopedia. It’s a library of techniques for every kind of human enterprise – building stoves, delivering babies, gardening, storytelling, running meetings, starting a business, mourning, catching fish, vector calculus, basket-weaving. It’s disaster avoidance and relief. It’s entertainment and conversation. It keeps you in touch with who you love. It’s a step toward almost anything otherwise just out of reach.

But.

There are about 6.7 billion of us.

About 0.02 billion of us know how to make a networked computer do something new.

About 2 billion of us have access to e-mail and the web.

About 4 billion of us have access to soap, toilets, and safe drinking water.

About 4 billion of us have access to some kind of banking.

About 5 billion of us have access to a networked computer.

Something is wrong.

The hardware side of the computer revolution is slathered in win. More people have a computer than can bathe properly. If you time-warped Stewart Brand from 1969 and told him that three quarters of the people alive have more analytical and connective power in their pockets (if they have pockets) than Armstrong and Aldrin had on the moon, he would be within his rights to cry with joy. We’re doing really, really well getting hardware power to the people.

In software we’re wrapped in fail. These portable networked computers’ interfaces consist mainly of typing long series of numbers, which was clunky and old-fashioned in 1969. They are used almost entirely to have voice conversations, which is a lot better than nothing, but it’s a lot less than it could be. A billion of their users who have no other option are not using them for banking. Their obvious applications in communal mapping are pretty much ignored. And – it would really sting to give this news to 1969’s Stewart Brand – they are not programmable. Power to the people seems to be those powers that the manufacturer decided on.

Now hang on, Charlie, you might say; you’re oversimplifying. These are super wicked problems to do with everything from governance to religion, and throwing technology at them is really naïve. For starters, you’re treating the lack of access to infotech as a cause of things like poverty and oppression, not as a symptom. Cell phones – which are not computers, so stop calling them that – are not liberation, they just let you take advantage of what you have. They depend on things like cell towers and battery chargers, which depend on traditional stability and infrastructure. A lot of the people you’re throwing around as big numbers are already at the edge of their means to split the cost of a half-busted screenless phone. Implying that three times the population of the US is being kept from improving its financial management by poor user interface design alone is simply irresponsible. This is all what Heeks calls pro-poor thinking, and it’s patronizing. There are just too many assumptions about everyone needing what you want.

Interesting points well made, my dear sock puppet. It’s true I’m just doodling. But one of my larger points is that if you want to have a say in how people use software in twenty years, you’d better be doodling today. The people who ruminate much about how to use software right now are mostly the ones whose parents were rich and quirky enough to let us use computers twenty years ago, and as fond as I am of some of us, it gives me the wilburs to think of us as the only people in charge of software.

And look, when I sigh in this infuriating way at poor people and their ugly Nokias, part of what I’m trying to do is bring you in on a perspective that I have on you (maybe) and your ugly Dell (as it may be). The machine you’re reading this on is way beyond a cell phone. Is it making your life as much better as it could? And if you suspect that what I really mean is “why don’t you let your computer control more of your life”, why is that? Why shouldn’t your computer serve you better by getting out of your face more? What if you used computers completely differently? How could you use computers to be more feeling, more in the moment, more human?

You have permission to ask this kind of question.

Actually, no, you don’t have permission.

The whole point is that you don’t need permission.

In fact, if you’re reading this, not asking more of software is a luxury. I’m not here to yell at you – I’m chin-deep on tippy-toes in my own indulgences – but please keep in mind that your experiences and experiments with computers are useful to society.

If software is to be one of the main media of our power, one of our main abstractions, then we need the kind of radically distributed control over it that we have over our languages, or, at least, our transportation. If it wants to be in on our culture, it can’t keep secrets from us. I want us to know what it means to be on Facebook in the way that we know what it means to rent a house or belong to a church or be vegetarian.

Binary, guilt, hothouse, elves, hummingbirds, fear

I’ve been to the ends of the luddite/gadget freak spectrum. I’ve chopped down trees for firewood, built my own Gentoo boxes, made chisels, operated a nuclear reactor, dug outhouse holes, written parsers, set type, rented render farms, killed my dinner, collected 1.6 million GPS fixes, and stayed up all night around a bonfire. I can’t count the times I’ve been dismissed as a dewy-eyed hippie and as a para-autistic geek. And, having been from end to end, I can say there’s no there there. The spectrum is a fake idea. The ways people think about tools are not tractable by a pro- to anti-technology dimension.

(The extremes aren’t there. Famous nature poets write odes to their Macs; famous computer scientists have no e-mail addresses. You should see how freaked out programmers get about electronic voting. I know someone who lives in a mossy cabin in the woods and loves watching Mythbusters on their iPhone, and I know a programmer who wants to colonize Mars but thinks phones are dehumanizing, and these are not exceptions. Almost everyone is complicated. As for the serious ascetics and gargoyles, there are a few thousand of them among billions of us, and if they have a disproportionate share of our imaginations, as saints or as demons, it’s our own fault. They symbolize our fantasies about giving up on compromise. Join them if you want, but don’t worry about them.)

Here’s what I’m sick of. When I talk to people about applied philosophy of technology, they get apologetic. Hardware techs feel guilty for liking to go on hikes without electronics. Crunchy folk feel guilty for using e-mail instead of postcards. It throws me, as if they’re confessing to victimless sins of omission in cults they’ve only heard of. Where is it written that we should take cameras on hikes or that postcards are necessarily better? For goodness’ sake, it’s our culture. If it chafes, let it out. If it drags, take it in. If it has loose threads, cut them off or tie them up or learn to like them – but quit apologizing and take some responsibility for your needs and tastes. Make, own, and remake your approach to technology.

Or is it too late to matter? I’m thinking of Teresa Nielsen Hayden on fanfic and sf:

New ways of telling stories develop most readily when you have a population that’s hungry for the product, the creators have little or no dignity at stake, and there are open channels for feedback and discussion. The American comic book developed like that. So did Kabuki, Bunraku, and Elizabethan theatre.

The hothouse age seems to be over for software, except here and there. We’re more serious now. The internet started making money, people got used to the web, spam appeared – I don’t know what all happened, but playful and flowery hypotheses have gradually become deontological edifices that you have to apologize for believing too much or too little. Stuff is less strange. We are losing perspective. You see fewer elves around. In early 1995, St Jude was riffing like a mad genius:

Girls need modems. […] I’m a future hacker; I’m trying to get root access to the future. I want to raid its system of thought. Grrr. […] Machines disappoint me. I just can’t love any of these wares, hard or soft. I’m nostalgic for the future. We need ultrahigh res! Give us bandwidth or kill us! Let’s see the ultraviolet polka-dot flowers that hummingbirds see, and smell ’em like the bees do.

St Jude is dead now, and who’s talking about girls, the future, root access, love, bandwidth and hummingbirds? Who even remembers that St Jude did? Who remembers what she said about feminism, hacking and martial arts? Who remembers what she called her “lefto-revolutionist programming commune”? Many questions she raised have answers now, but many don’t, and no one seems to be raising new ones. Maybe I’m just not listening right. But it seems like we’re doing things we don’t talk about.

A Wikipedia bookmarklet

Open your browser’s bookmark manager. Make a bookmark in your bookmark bar with the address (or URL or location) javascript:location = "http://en.wikipedia.org/wiki/" + window.getSelection() and whatever name you want – I use “W”. Now you can highlight any phrase in the window, click the bookmarklet, and get a Wikipedia page for it.

I happen to use Safari, and I’ve made this the first item in my bookmark bar, which means I can set it off by typing ⌘–1. This is one of my favorite tools. It makes a lot of stuff I read make a lot more sense. I hope people make references I don’t recognize. Polytely? Bosnian mujahideen? Utility fog? Gender feminism? Trie? Wisdom of repugnance? ICT4D? Select, ⌘–1, read, back. It makes me more literate. It’s sugar on my tongue. It’s a hint of what software should be like. When I say fuzzy things about lowering barriers between people and information and whatever, this is the kind of thing I’m thinking of. Only instead of it being something you have to be told by a middle-class white man who spends too much time on the computer, I’m trying to think what it would be like if 5,000,000,000+ people could discover this sort of stuff with tools they mostly already have. Really excellent, I think. That’s humanism.

For what it’s worth, if you make this bookmarklet you’re a programmer, at least in the sense that you’re a gardener when you raise a geranium. And here’s the thing – the code you’re using turns out to be pretty intelligible if you look at it charitably, with a good-faith try at engagement. The javascript: says that the following is a kind of script, i.e. instructions, not an address. Then you set the web browser’s location – the page you’re looking at – to the base address of Wikipedia entries + this window’s selection. (A . in this context is like a possessive ’s, and the () suffix marks imperative verbs, roughly.)
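
If you feel like fiddling further, here is a slightly fussier variation – only a sketch, nothing canonical – that trims stray whitespace, encodes awkward characters, and does nothing at all when nothing is selected. Fold it back onto a single line before pasting it into the bookmark’s address:

  javascript:(function () {
    var phrase = String(window.getSelection()).trim();   // whatever you highlighted
    if (phrase) {                                         // ignore empty selections
      location = "http://en.wikipedia.org/wiki/" + encodeURIComponent(phrase);
    }
  })()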

This ain’t English, but neither is it long division, which I gather is what a lot of people think “code” is. It’s a language made by people for people to use information systems made by people for people. I don’t think it’s harder than growing a geranium or baking a pie.

In fact, I suspect that the tricky part of this was dealing with your browser’s bookmark manager. All the ones I’ve seen are awkward. That’s the problem.

As long as your windowsill with a blooming geranium or a cooling pie on it is immeasurably more wonderful than your bookmark bar, you’re justified in ignoring your bookmark bar. But what makes a windowsill so wonderful is that you do those things with it. If you ignore and resent a place, it’s going to be worth ignoring and resenting.

More sullen frustration (skippable)

Sometimes being a software person here and now feels like being a farmer watching people sit around an overgrown farm in midsummer to eat Pop Tarts® and gripe about how terrible food is. Oh, food is so unhealthy, it’s manufactured so industrially, did you know the gas used to produce food weighs more than it does, the one kind of food I like is non-dairy creamer because at least it’s honest, have you heard how bad the high-fructose corn syrup in food is, the consolidation of small farms, hunger at home and abroad, GMOs, monocultures, enzymes, blah blah, and meanwhile plums and raspberries are rolling off their heads and splashing their Big Gulps® all over their Volcano Nachos®. Gross.

If you’ve helped someone creaky-old with their new tools, you might have a sense of this – how they assume the first way they got it to work is the only way, the incurious mixture of wonder and discomfort, the refusal to invest half an hour of learning now to save two minutes of fiddling every day for the rest of their lives – that is something like the feeling I think a lot of software people get just from looking out the window when we’re in bad moods. And hoo boy am I ever in a bad mood. I’m going broke looking for a job whose description doesn’t contain the phrase “we give the customer the shaft using exciting high technologies such as Window 7, Social Mediums Marketing, and Web!”. Feh.

You understand I’m not complaining about old people. They have an excuse: they’re old. And, actually, I’m not complaining about non-programmers. They have an excuse: the software environment is not like a farm, or if it is, it’s like an ugly and mandatory one with mostly greedy or distracted farmers. It’s our fault. We got lost. We aren’t thinking hard enough about farming. We have big ideas about how we’re connecting people, and we have little tricks for making code run faster, but we lack the mid-range theories to reliably sinter them into something wholesome.

The tools most people use to write things together on computers are mortifyingly crude if you’ve seen the clever ways that programmers collaborate on shared code or how geneticists make maps of genomes from a zillion little fragments. Nick and I wrote a tool to make co-writing easier. Neither it nor its many technically inferior competitors have caught on. I don’t really know why this is. And even more than I’m bitter that it’s not saving thousands of people a whole lot of time and patience (which is pretty bitter), I’m bitter that I don’t know why not. I don’t know how to ask the questions that let me know how people want to collaborate on writing, and they don’t know how to tell me. It’s hard to make things better when you can’t figure out why they suck.

I know an ATM which, when you’re done, says “ENTER TO EXIT”. Let’s count some of the most obvious things wrong with that:

  1. “Enter to exit” is a stupid, stupid phrase. Don’t say things like that.

  2. All caps is not English orthography for running text.

  3. The “enter” button should be called “OK”. If I tell you my PIN, am I entering it into you? Ugh. No, when we do the kinds of things we do with ATMs, we’re not “entering data”, we’re choosing and typing. When we’ve typed something right, it’s OK. We already have words for stuff.

  4. “Exit” is only the natural word from the programmer’s point of view: exit() is a direction for a program to leave the stage. But when you walk away from the ATM, you’re not exiting – i.e., leaving from inside – anything. This is, at best, jargon misapplied from Windows®.

  5. “Exiting” is the only option. The ATM knows you’re done with it. It should say something like “Transaction closed – thank you!” to let you know you’re not leaving your account open to the next person. Don’t make people press buttons for no reason.

Suspect anything that has its own vocabulary without clear justifications. It either doesn’t care about you or it’s trying to fool you. This ATM’s designers don’t care about you. They didn’t trouble themselves to make a good citizen of the society of humans’ things. Double ugh.

If you see computers with an idea of what they could be, most of what they are is confusing. Here is this gorgeous medium, this trail into the sunny uplands, this multitool for brains, this sampo for science and empowerment, liberation, and we end up with hundreds of millions of people spending their days at keyboards, mice, windows, and Windows®, doing drudge work, clicking rude and semiliterate dialog boxes – and billions of people carrying amazing computers that charade as a slight step up from rotary telephones. Triple ugh.

I would like more people to see how much of a disaster this is. I think some competent software people are alert, although obviously not enough to make usable bookmark managers or ATMs that don’t put a twitch in my eyelid. What worries me most today is that users (a word for victims of things like heroin and tobacco but here meaning people who consume but do not produce software) are just going to get more alienated from it. “Computers” will become inexplicable things, Other things, inhuman things, irresistible and noncontingent and unreasonable, absurd, meaningless, ahistoric interlopers in culture, boring at best.

Quadruple ugh.

La différance & der Verfremdungseffekt

So our abstractions of software suck. Bookmark managers make you think in their own terms. Conversely, Facebook starts to seem like socialization itself. In both cases, our tools are taking over from our original intentions. The phrase “when all you have is a hammer, everything looks like a nail” is very popular in software engineering.

In government they have this idea, freshly fashionable, of regulatory capture, which is what happens when a regulator gets so cozy with the regulated that it begins to pick up the worldview that it’s supposed to be correcting, and is seduced into uselessness. A vital boundary is melted. An interaction that should be explicit is silenced. Something a little like this happens to us when we use many tools: what should be means turn into ends in themselves.

Only where does that “should” come from? If tools themselves make us happy, why not? If I see someone using a motorcycle for entertainment, or joy, as well as for transportation, or getting pleasure as well as nutrition from food, how could I call them wrong? Complaining that people get distracted by epiphenomena is saying that we shouldn’t find satisfaction in the journey itself. People on Facebook playing Farmville or becoming fans of Becoming A Fan: this is the famous stopping and smelling the roses.

When I was small, one of the crazy old neighbor lady’s cats peed on my shoe, so I snuck over and peed in her garden. The next day she came to me and asked, as was her habit of asking odd personal questions, “Little Charlie, what are your three least favorite things?”

“Well,” I said (laughing to myself because I had peed in her garden), “let me think. I suppose they would have to be stupid puns, continental philosophy, and … uh … the French language.”

“I see, I see,” she said, with a strange cawing laugh. “I saw you take a waz on my tomatoes, young man, and I curse you: one day, the word you need will be a stupid pun from French continental philosophy!”

She died later that year. I remember feeling guiltily relieved at the funeral because I thought the curse died with her. As I got older and remembered the incident, of course I assumed it was just her way of making me regret peeing in her garden, which I certainly did at the time; I was terrified. But now I want Derrida’s idea of différance. This is so infuriating that I hereby un-recant peeing in her garden.

Différance says that there are no final repositories of meaning. Meaning is about otherness; it’s a pointing-away. Things defer it to different things – which, apparently, is considered a real knee-slapper in a certain horrible little language.

If différance seems like a fancy way of dismissing rationality, think it over. As Robert Anton Wilson wrote in 1987, in the course of debunking some especially dumb right-wing essentialism:

Science, incidentally, not only ignores the question of indwelling “essences” by looking instead at measurable relationships, but science also does not agree that knowledge is obtained through Rothbard’s medieval “investigation by reason,” i.e., by inventing definitions and then deducing what your definitions implicitly assumed. Science investigates by experiment.

Différance is only a new word for this staid idea that our concepts have no truth in themselves, and you can’t learn anything from a word with no external definition; if you want to know something as well as you can, you have to put it in terms of something else, like a calibrated instrument or a statistical test or a personal experience. And those things in turn, etc., duh.

But we use essentialism (or at least a naïvely strong sort of rule utilitarianism – enjoy that bookmarklet) all the time in everyday life. I used it when I said that humans inherently deserve power. We set up tiny idols, like convenience or privacy, or power to the people, or pleasure, and follow them as if we understood them and knew them to be good in themselves, and we consider the things we do to get toward them less important even when their effects are bigger. We often act like there’s a set of legitimate goals and everything else is only raw material.

This has many faces. I imagine there are people who would be upset that I jumped on Sutherland’s joke up there, because picking apart a joke kills it. It does, but we get to do it anyway. We can’t safely divide the world into messages, which are fit objects for study, and media, which are not. Someone delivering a joke (or an ad, or music, or political talking points) may want to be transparent, but we don’t have to let them. We can say “hey, there’s a backstory here” – unless we tell each other that we have to lie back and laugh and not worry, because it’s just a joke and that’s somehow a pass.

There is also, I suspect, a disjoint set of people who were upset that I pretended not to notice that the audience didn’t laugh at Sutherland’s second gag, which takes a leg out from under the idea that there’s something to worry about there. (Everyone in this set is welcome to apply to be my unpaid editor.) If you read that section in slow motion, you can see me making a mistake opposite to the one I’m complaining about here: instead of concentrating on messages and ignoring media, I seized some interstitial space and claimed it was full of a meaning which is very hard to reliably show. This kind of paranoia is, in bulk, easily as bad an idea as the complacency of overapplied essentialism.

I want to spend most of my time in an intermediate state, where I get the productivity advantages of mostly dedicated attention to goals, but the ethics advantages of noticing problems in means – software, hand tools, urban planning, language, agricultural practices, and so on. I want to carry multitools without using them all the time.

So I need two processes: one kind of learning that takes something complex and lets me see through it, and another kind of learning that takes something I can’t see and shows me the complexity in it. The first kind we can call familiarization, and the second defamiliarization.

The playwright Brecht had this thing he called the Verfremdungseffekt, the distancing or alienation or defamiliarization effect, or problematization. He wanted his plays to wake people up, so they’re full of little jolts that break the dramatic illusion and ask you to think personally about the problems on the stage. He was taking a very strong abstraction, the audience-performer relationship, the willing suspension of disbelief and all that, and deliberately making it leak.

You can see versions of the Verfremdungseffekt wherever someone’s trying to get you to pay attention. I’ve been using it – in an amateurish way – all over the place. (It’s one of the main things that makes this not a proper essay.) Using a bunch of silly words for philosophy of technology was a way of trying to get you to look them up and think about them. Calling them silly was a way of trying to keep you from assuming that I’m comfortable with all these ideas myself. Telling a bizarre lie about peeing in someone’s garden was a way of trying to get you on your toes so you’d be both open to and suspicious of différance. Using highbrow foreign words for familiar ideas is a way of trying to make them unfamiliar enough to think about.

Brecht called his plays dialectical theatre. He wanted to provoke an adversarial process, a competition of ideas, a complex seeing. If you squint, things like the 2008 credit crisis or the 2010 Deepwater Horizon spill produce Verfremdungseffekts in economic narratives: they work to break regulatory capture in the way that Brecht tried to break the suspension of disbelief. They un-take-for-granted. They are like a pfeilstorch (I said, I hope you’re enjoying that bookmarklet) flapping in with weird news about how terrible our abstractions are.

I want to use these familiarization and defamiliarization effects like two hands, so that everything gets to be a subject and an object. I want complex seeing.

The iPad is bad

I’m writing this in mid 2010, seven weeks after the first iPad came out. The criticism so far seems to center on its being a toy for rich people. It would be boring to explain just how sterile and short-sighted this observation is. What’s actually interesting and scary to me is that it’s not for making things. It has no camera or general-purpose physical connectors, the onscreen keyboard is iffy at best, and running your software directly on it takes Apple’s permission.

There was a time when computers came with programming languages – amazing things like HyperCard, which was the link between hypertext in the literary sense and the HTTP of the World Wide Web. We’ve drifted until the hot new product is a tremendously sophisticated machine designed for passive media consumption.

The iPad is good

It’s been decades since a really new computer interface caught on. We’ve just been polishing the groove of windows, mice, and keyboards. Concepts like “saving” stuff are free-floating abstractions with no justification in current technology – skeuomorphs. The idea of only being able to point out a single pixel on a screen of modern dimensions, say – which is all a mouse lets you do – is just absurd.

The iPad is for touching.

We have a strange and dated word, cyberspace, that’s supposed to describe where we are when we don’t care where we are. When bodies are orphaned by frontal lobes going off to the cyberspace of phone conversations, they do things like doodling, pacing, and tidying. Most of us have been spending a lot of time there lately. The iPad is a naïve but bright exchange student from cyberspace to the real world: something you can read on the sofa with cocoa. It does less with complex, explicit human body behaviors (think about how weird keyboards and touch-typing are for a minute) and more with things like pointing and tapping. The version 1.0 I played with was a zoo of little problems, but it felt like something made for people.

I’ve been hanging out with a two-year-old lately, and it’s fun to see how the brain shifts into childcare gear. You realize you were ready all along. And when I climb a steep hillside, with no particular experience in rock climbing, I can feel simple instincts – which must be tens of millions of years old – keep my balance and momentum. It’s like blood is flowing to forgotten parts of the brain. There is a little of this sense in dragging an icon across the iPad’s screen.

Weapons of mass destruction and farm tools

Who can judge tools for the world?

I think nuclear weapons should be eradicated. But they seem to have kept us in a Nash equilibrium short of WWIII for 65 years. They organized the peace movement by giving it a great bête noire – they’re the best anti-war propaganda we’ve ever had. They kept Stalin and Mao on their own territory. Since GPS made the huge ones on spray-and-pray ICBMs obsolete, they are not so much worse than the new high-yield conventional explosives. It may be that nuclear weapons have done more than their part to keep the world together over the last few generations.

Machetes are a fine human-scale farm tool. You have to look at what you’re cutting. You think and work to use one, and you can keep your land clear and make paths, but you can’t level a forest. Machetes make land-use practices at least slightly conscious. In early 1994, machetes killed four times as many people as nuclear weapons did in mid 1945. The technology of the Rwandan genocide was mainly radios and farm tools. These are things I wish more people had.

So I don’t know much about how to give n stars out of five to a given tool. If I can’t even come to clear conclusions about obviously bad and good things like WMDs and farm tools, I’m so far from knowing what “appropriate technology” means in any broad practical sense that I suspect no one else knows either. The best I can say is that if you have a tool, you should try to have some perspective on it. Whether it’s a WMD or a farm tool doesn’t seem to matter as much as what you do with it.

Facebook is bad

By forming a cooperative group, people can present a united front with its own identity to organize behind. The members hold shares, which are co-ownerships and voices in the way the collective acts. Facebook – the thing that makes the thing we usually call Facebook – is one of these mutual action leagues, or “corporations”.

Most US executives learn the principle that their duty in the absence of more specific consensus among the co-owners is to maximize shareholder value. That means keeping the price up by doing whatever makes people want to buy more shares. (This comes from Dodge v. Ford, which, curiously, experts do not take the way that business school professors do.) And because most large corporations have a fairly diverse range of shareholders, they are not reliably in consensus. Shareholder value takes over.

So Facebook is not a person. It’s made of people, but it’s very abstracted from them. This is important because it means you can’t trust it or hate it the way you would a person. You cannot negotiate with it the way you can negotiate with a person. You don’t matter to it, not just in the way that you don’t matter to a person who doesn’t care about you, but in the way that you don’t matter to a cloud. Clouds tend to ignore even the most popular, fiery, and righteous petitions about clouds.

You can’t even negotiate with Facebook the way you can with a restaurant, because you are not Facebook’s customer. You are Facebook’s product: its customers are its advertisers. Or the product is the advertisers’ money and the customers are its shareholders. Either way, to Facebook, you are not a client, you are a worker.

Facebook’s interest in you is that you make it money, by clicking the ads or keeping other people on Facebook to click the ads, and that you not take its money, for example by making PR problems that keep people from clicking the ads, or reporting it doing something illegal. If there is some way for you to suffer that would cause Facebook more gain than harm, it will eventually get around to doing it.

Being upset at this is like being upset that if you go on a hike and swim in a heavy waterfall you might drown. It doesn’t matter that it’s pleasing to your eye. It’s not on your side.

Most of all, you can’t trust Facebook to watch out for your privacy, because Facebook is a tool for taking away privacy. It makes social attributes explicit. That’s how it works. That’s what it’s for.

Facebook isn’t how it has to be, you know. In 2000, the assumption in my circles was that in 2010 we’d all have our own sites – nothing too technical, just places you could put your photos and so on, but owned by you – and we’d plumb them together however it pleased us. All together, it might look roughly like Facebook, but it would be built with open-source (i.e., communal) tools using things like FOAF and RSS, and everyone would have what control they wanted over their own things. But nooo! Facebook was more glamorous and you didn’t have to type as many passwords or make as many complicated decisions. Gosh, why not let something manage your friendships for its own profit? Feh. These days FOAF is almost forgotten, and RSS is mostly just a way of smuggling the juice out of ugly blogs.

Maybe it was doomed. Jefferson, you know, had this idea of America as a land of yeomen farmers who would keep government local and honest through the self-sufficiency and good morals unquestionably produced by caring for one’s own land. In Query XIX of Notes on the State of Virginia, from 1787, he says:

Corruption of morals in the mass of cultivators is a phænomenon of which no age nor nation has furnished an example. […] Dependance begets subservience and venality, suffocates the germ of virtue, and prepares fit tools for the designs of ambition.

Or think of Thoreau complaining about the division of labor in Walden, 1854:

Shall we forever resign the pleasure of construction to the carpenter? […] No doubt another may also think for me; but it is not therefore desirable that he should do so to the exclusion of my thinking for myself.

These kinds of vague ideas that it would be nice for everyone to take more responsibility are easy to snicker at in retrospect (everyone building their own houses – really, Henry?) except the ones that worked and we take for granted. Once it was foolish to think that the masses could be “given” free general educations, or that every adult could be in charge of their own high-speed conveyance. Or would have a sophisticated computer, even if poorly designed. So some of the spirit of the agrarianists and transcendentalists gives me hope that we could all have some understanding and control of our software. And as long as that’s possible, Facebook is a kind of insult.

Facebook is good

What are you trying to hide? Why?

When people say things amounting to “a whole generation of people is revealing things like political views, which will make them poor employment material”, I’m kind of confused. How is this different from saying “a whole generation of women is wearing trousers, which will make them poor marriage material”? If Facebook allows unprofessional employers to discover that everyone does personal things in personal time, what is lost? And what could be gained?

If you’re a woman in America c. 1975, it’s none of my business to tell you where to wear pants. If you’re one of the first in your neighborhood or office, it’s probably going to be unpleasant in many ways. And if you’re on Facebook now, I’m not telling you to be totally carefree. It might get you fired. But I can say that people telling you that you simply must not be carefree are fools. It’s your choice.

Even moderate privacy is not good in itself. It’s good that it protects people from distraction and bigotry, but when it’s violated, the blame goes to the distractors and bigots, not to the person who accidentally let the truth about themselves be public. We should hope for a world where privacy is less necessary. But this is not the same as trying to get rid of it.

(You cannot have absolute privacy. Anonymous can, but you can’t. Whatever persona you choose is a leaky abstraction, and extra aspects of you will always come through. People who know Latin can often recognize each other from the way they turn phrases in English. A very little information about something like your commute can identify you. How many anonymous actions are there with a suspect pool that’s a big portion of the general population?)

And but so privacy is not dead. Protect yours. Help other people protect theirs. I have found many straight people who aren’t out on Facebook because they have gay friends who would get in trouble for it, so they’re trying to normalize that kind of privacy. This warms my heart: for all the objections that come to mind (maybe it’s ineffective, patronizing, discouraging outness, sandbagging, passively stigmatizing, insufficiently confrontational, or whatever), it’s a conscious decision toward a specific goal. It’s much better, to my way of thinking, than leaving the orientation field blank out of a bleary outrage that OMG, “corporations” will show me ads with models I find attractive, OMG (or whatever). Rule utilitarianism again.

Figure out what you’re trying to hide from whom and why. Act accordingly. Now you can use Facebook, or not use it, with a minimum of regrets. Your life is now better. You’re welcome. You might like to join me in some smug exasperation at people seeming surprised that they sometimes have to think about the information they produce.

How grateful I am for Facebook! If you see it as a dangerous waterfall, it’s real pretty. It’s kind of inspiring. And we have this marvelous system of corporate governance, litigiousness, and muckraking whereby it’s actually in Facebook’s own interest to do a reasonably good job of looking out for you. When it screws up, it’s mobbed by attorneys general and class action lawyers. It’s not on your side, no, but neither are the bees who made your honey, and that’s okay.

And look what it gives us. I’ve found old friends. I’ve had conversations with people I really like but who evidently gave up on e-mail somewhen. It’s kept me up on all the social housekeeping – marriages, dinner parties, and so on. Sure, I can imagine something better. But Facebook works. That’s big.

Software is the dangerous Other

How often do you do exactly what a stranger wants? Most of us live in extremely designed environments, where we can go an entire day without touching anything, except the air and people, that wasn’t designed by people. And the temperature and purity of indoor air, and the ways we touch people, are designed by people. All these artificial things embody intentions; they are all descriptions of how the world is and how it ought to be. A chair says what shape people are and how they sit. And software is an extreme of this: it’s pure, incorporeal intention. It makes claims at every level about the is and ought of the world and especially human behavior. It says how people think.

But it’s not how people think. Computers are very, very unlike people. Computers are aliens. They can be made to do things that persuade us they’re humane, but they just aren’t. They are dangerous.

Software is us

Computers are aliens.

What do we know about aliens? Nothing. Only science fiction – what Darko Suvin famously called “the literature of cognitive estrangement” in late 1972. It’s a genre about familiarizing the unfamiliar and vice-versa. It’s like horror without fear. Its subjects have always been enormous things that we don’t know how to think about: new social orders, new religions, new planets, many weird futures. And boy do we love aliens in science fiction, because our response to their incomprehensibility tells us who we are.

Star Trek is maybe not the ideal example of this, but at least it’s widely familiar. Here’s how the Platonic Trek episode works: in the A-story (as the writers called it), the Enterprise or its allies are in danger. Kirk or Picard demonstrates what it means to be a brave and civilized human. In the B-story, a personal relationship is in danger. Someone demonstrates what it means to be a good friend. At the end, the stories intersect.

A big point of science fiction, maybe the big point, is that you get that A-story to demonstrate virtue externally. To get that consistently outside science fiction, you need something like an enemy army, or nature, or the innumerable gunslingers of the Old West, and it tends to suck if you’re prone to intrusive thoughts about things like ethics and historical accuracy. Aliens and other such Big Weirds are an artificial frontier in an entertainment environment where we tend to get extremely patronizing about real unfamiliars.

We do have a real Big Weird that’s relatively free from, say, our shameful history with nature: computers.

Computers are a blank slate. They’re mirrors. They’re aliens.

We need that.

One of the really beautiful things about programming is having to go through your assumptions. Computers have no innate concepts, unless you could say that scissors have an innate concept of cutting. The proverbial 1s and 0s are already human terms for something that computers do not conceptualize in that way. They don’t even do arithmetic in our sense. They do things that we sometimes interpret as approximations of arithmetic. Computers do math as much as paint does art.
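
If you want a small, homely example, ask the JavaScript in your browser to add a tenth and two tenths. What it actually does is binary floating-point arithmetic, its own alien thing, which we then read as though it were the decimal arithmetic in our heads:

  0.1 + 0.2           // 0.30000000000000004
  0.1 + 0.2 === 0.3   // false: close enough for most purposes, but not our arithmetic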

Charles Babbage, who designed the first modern computing hardware, said c. 1863 that:

On two occasions I have been asked, – “Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?” In one case a member of the Upper, and in the other a member of the Lower House [of Parliament] put this question. I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

The software we’ve built up so far is a huge project to spell out our assumptions about arithmetic, communication, grammar, causality, reference, equivalence, concurrency – things with awkward names because we don’t usually have to think about them. And this project, 167 years and 7 months long so far, has been doing just what hammering out a pidgin with aliens should do: telling us about ourselves.

Augusta Ada King, Countess of Lovelace, wrote in October 1842 that:

It is however pretty evident, on general principles, that in devising for mathematical truths a new form in which to record and throw themselves out for actual use, views are likely to be induced, which should again react on the more theoretical phase of the subject. There are in all extensions of human power, or additions to human knowledge, various collateral influences, besides the main and primary object attained.

This is especially interesting for its context in British history, at the end of the Industrial Revolution and the beginning of the Victorian era. Only twelve years before, Captain Swing was afoot, smashing machines. (Select “Captain Swing”, click bookmarklet. I shouldn’t have to keep reminding you.) As befits a classic – this is the paper where she wrote the first software – it points in many directions. One way is to a meme that cyberpunk fans might call “the street finds its own uses for things”. In Brian Eno’s Revenge of the Intuitive, in January 1999, he said:

Even the “weaknesses” or the limits of these tools become part of the vocabulary of culture. I’m thinking of such stuff as Marshall guitar amps and black-and-white film – what was once thought most undesirable about these tools became their cherished trademark.

The Marshall guitar amplifier doesn’t just get louder when you turn it up. It distorts the sound to produce a whole range of new harmonics, effectively turning a plucked string instrument into a bowed one. A responsible designer might try to overcome this limitation – probably the engineers at Marshall tried, too. But that sound became the sound of, among others, Jimi Hendrix. That sound is called “electric guitar.”

It seems wonderful to me that computers are not a transparent medium. Thanks, différance. Computers reflect back on us. Their frustrations, when not (like that ATM) merely the fault of foolish humans, are like the rewarding difficulties of dealing with a toddler; they are little confrontations with my own assumptions; they are like the difficulties of camping; they are counsellors that feelingly persuade me what I am. They defamiliarize me from all my ideas. Aliens.

Writing most software, incidentally, now that we have this huge library of basic human ideas explained to computers, does not feel like doing mathematics in Lady Ada’s sense. I have no idea what integration is in a rigorous way – I’ve written things that integrate, but they do it with numerical methods so brutally ad-hoc that they’d give a real mathematician a nosebleed. When I’m doing graphics, I have to pull up the Wikipedia page on cosines. I avoid numbers in general.

Writing most software, in fact, feels like most writing. It’s a complicated, open-ended negotiation with your own intentions, with tools from other people, and with plans about how things should work in the world. It’s a kind of explaining or translating: you have an idea in your head and certain ways of representing ideas, and you daub and whittle and fold and hack until you have a device that might, if you’re lucky, put something like your idea in someone else’s head.

And it is about someone else’s head, not about the computer. The things computers do are all about people – they come from people and they are supposed to go to people. We do spend a lot of time thinking about the computer as a medium, the way a pianist might spend a lot of time thinking about fingerings. But the point is the music, and the point of the music is people. Programming is a way – a new way, that we don’t have good words for yet – for people to talk with each other.

The preface to the first edition of The Structure and Interpretation of Computer Programs, by Harold Abelson, Gerald Jay Sussman and Julie Sussman in 1984, starts with a quote:

A computer is like a violin. You can imagine a novice trying first a phonograph and then a violin. The latter, he says, sounds terrible. That is the argument we have heard from our humanists and most of our computer scientists. Computer programs are good, they say, for particular purposes, but they aren’t flexible. Neither is a violin, or a typewriter, until you learn how to use it.

– Marvin Minsky, Why Programming Is a Good Medium for Expressing Poorly-Understood and Sloppily-Formulated Ideas

Abelson, Sussman and Sussman go on to say something even more quoted in software circles:

we want to establish the idea that a computer language is not just a way of getting a computer to perform operations but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute.

And later:

Underlying our approach to this subject is our conviction that “computer science” is not a science and that its significance has little to do with computers. The computer revolution is a revolution in the way we think and in the way we express what we think. The essence of this change is the emergence of what might best be called procedural epistemology – the study of the structure of knowledge from an imperative point of view, as opposed to the more declarative point of view taken by classical mathematical subjects. Mathematics provides a framework for dealing precisely with notions of “what is.” Computation provides a framework for dealing precisely with notions of “how to.”

This is the textbook for practical computer science or theoretical software engineering, and to work through it (it’s a kit for writing the programming language it’s written in) is to receive a vision of humanism.

Software is written by people, for people. Sometimes it really sucks. But it’s our suck. We make it, we own it, and we can remake it. This means me, and this means you.

How to think about Facebook

The same ways you should think about every other thing people make and use.