March 10, 2023
Curiosity
How interesting the world is depends on how well it’s written.
January 14, 2023
I typed my doctoral dissertation in 1978 on my last electric typewriter, a sturdy IBM Model B.
My soon-to-be wife was writing hers out longhand, which I was then typing up.
Then one day we took a chapter to a local typist who was using a Xerox word processor which was priced too high for grad students or for most offices. When I saw her correcting text, and cutting and pasting, my eyes bulged out like a Tex Avery wolf.
As soon as KayPro IIs were available, I bought one from my cousin, who had recently opened a computer store.
The moment I received it and turned it on, I got curious about how the characters made it to the screen, and became a writer about tech. In fact, I became a frequent contributor to Pro-Files, the KayPro magazine, writing ’splainers about the details of how these contraptions worked.
I typed my wife’s dissertation on it — which was my justification for buying it — and the day when its power really hit her was when I used WordStar’s block move command to instantly swap sections 1 and 4 as her thesis advisor had suggested; she had unthinkingly assumed it meant I’d be retyping the entire chapter.
People noticed the deeper implications early on. E.g., Michael Heim, a fellow philosophy prof (which I had been, too), wrote a prescient book, Electric Language, in the early 1990s (I think) about the metaphysical implications of typing into an utterly malleable medium. David Levy wrote Scrolling Forward about the nature of documents in the Age of the PC. People like Frode Hegland are still writing about this and innovating in the text manipulation space.
A small observation I used to like to make around 1990 about the transformation that had already snuck into our culture: Before word processors, a document was a one-of-a-kind piece of writing like a passport, a deed, or an historic map used by Napoleon; a document was tied to its material embodiment. Then the word processing folks needed a way to talk about anything you could write using bits, thus severing “documents” from their embodiment. Everything became a document as everything became a copy.
In any case, word processing profoundly changed not only how I write, but how I think, since I think by writing. Having a fluid medium lowers the cost of trying out ideas, but also makes it easy for me to change the structure of my thoughts, and since thinking is generally about connecting ideas, and those connections almost always assume a structure that changes their meaning — not just a linear scroll of one-liners — word processing is a crucial piece of “scaffolding” (in Clark and Chalmers’s sense) for me and I suspect for most people.
In fact, I’ve come to recognize I am not a writer so much as a re-writer of my own words.
January 9, 2021
Twitter’s reasons for permanently banning Donald Tr*mp acknowledge a way in which post-modernists (an attribution that virtually no post-modernist claims, so pardon my shorthand) anticipated the Web’s effect on the relationship of author and reader. While the author’s intentions have not been erased, the reader’s understanding is becoming far more actionable.
Twitter’s lucid explanation of why it (finally) threw Tr*mp off its platform not only looks at the context of his tweets, it also considers how his tweets were being understood on Twitter and other platforms. For example:
“President Trump’s statement that he will not be attending the Inauguration is being received by a number of his supporters as further confirmation that the election was not legitimate…”
and
The use of the words “American Patriots” to describe some of his supporters is also being interpreted as support for those committing violent acts at the US Capitol.
and
The mention of his supporters having a “GIANT VOICE long into the future” and that “They will not be disrespected or treated unfairly in any way, shape or form!!!” is being interpreted as further indication that President Trump does not plan to facilitate an “orderly transition” …
Now, Twitter cares about how his tweets are being received because that reception is, in Twitter’s judgment, likely to incite further violence. That violates Twitter’s Glorification of Violence policy, so I am not attributing any purist post-modern intentions (!) to Twitter.
But this is a pretty clear instance of the way the Web is eroding the author’s authority to dismiss misreadings as not what they intended. The public may indeed be misinterpreting the author’s intended meaning, but it’s now clearer than ever that those intentions are not all we need to know. Published works are not subservient to authors.
I continue to think there’s value in trying to understand a work within the context of what we can gather about the author’s intentions. I’m a writer, so of course I would think that. But the point of publishing one’s writings is to put them out on their own where they have value only to the extent to which they are appropriated — absorbed and made one’s own — by readers.
The days of the Author as Monarch are long over because now how readers appropriate an author’s work is even more public than that work itself.
(Note: I put an asterisk into Tr*mp’s name because I cannot stand looking at his name, much less repeating it.)
September 25, 2020
When I was 10 and my next-door-neighbor, David Stolzenberg, was probably 13, we wrote a short story and submitted it to Boys’ Life, the magazine for Boy Scouts.
In an ancient box in a forgotten corner of our basement, I found the rejection letter. It is, I believe, my very first, kicking off a series of maybe a thousand. The tape marks in the corners suggest that I pasted this into a scrapbook at one point in my youth.
I don’t remember what the story was about. Maybe science fiction.
I do remember David well, though. He became a doctor, married, had children, and died in his thirties of cancer. I am still processing that.
April 15, 2019
In response to a tweet asking writers what they write out longhand, I replied that if I’m particularly at sea, I’ll write out an outline, usually with lots of looping arrows, on a pad. But only with a fountain pen. Ballpoints don’t work.
My old bloggy friend AKMA wondered how he’d known me so long without knowing that I’m a fountain pen guy. The truth is that I’ve only recently become one. I’ve liked them at various times over the course of my life, but only about four years ago did I integrate fountain pens into my personality.
It happened because I bought a $20 Lamy Safari on impulse in a stationery store. From there I got some single-digit Chinese fountain pens. Then, when I made some money on a writing contract, I treated myself to a $120 Lamy 2000, a lifetime pen. It’s pretty much perfect, from the classic 1960s design to the way the ink flows onto paper just wet enough and with enough scratchiness to feel like you’re on a small creek splashing over stones as it carves out words.
I have recently purchased a TWSBI ECO for $30. It has replaced my Safari as my daily pen. It’s lovely to write with, holds a lot of ink, and feels slightly sturdier than the Safari. Recommended.
Even though my handwriting is horrendous, I look forward to opportunities to write with these pens. But I avoid writing anything I’ll then have to transcribe because transcribing is so tedious. I do harbor a romantic notion of writing fiction longhand with a fountain pen on pads of Ampad “golden fibre.” Given that my fiction is worse than my handwriting, we can only hope that this notion itself remains a fiction.
So much of my writing is undoing, Penelope-like, the words I wove the day before that I am not tempted even a little to switch from word processors when the words and their order are the object. But when the words are mere vehicles, my thinking is helped — I believe — by a pen that drags its feet in the dirt.
July 26, 2014
My blogging has gone way down in frequency and probably in quality. I think there are two reasons.
First, I’ve been wrapped up in trying to plot a new book. I’ve known for about three years the set of things I want to write about, but I’ve had my usual difficult time figuring out what the book is actually about. For example, when I was planning Everything is Miscellaneous, I knew that I wanted to write about the importance of metadata, but it took a couple of years to figure out that it wasn’t a book about metadata, or a book about the virtue of messiness, or two dozen other attempts at a top line.
I’m going through the same process now. The process itself consists of me writing a summary of each chapter. Except they’re not summaries. They’re like the article version of each chapter and usually work out to about 2,000 words. That’s because a chapter is more like a path than a list, and I can’t tell what’s on the path until I walk it. Given that I work for a living, each complete iteration can take me 2-3 months. And then I realize that I have it all wrong.
I don’t feel comfortable going through this process in public. My investment of time into these book summaries is evidence of how seriously I take them, but my experience shows that nineteen times out of twenty, what I thought was a good idea is a very bad idea. It’s embarrassing. So, I don’t show these drafts even to the brilliant, warm and forgiving Berkman Book Club — a group of Berkfolk writing books — not only because it’s embarrassing but because I don’t want to inflict 10,000 words on them when I know the odds are that I’m going to do a thorough re-write starting tomorrow. The only people who see these drafts are my literary agents and friends David Miller and Lisa Adams, who are crucial critics in helping me to see what’s wrong and right in what I’ve done, and working out the next approach.
Anyway, I’ve been very focused for the past couple of months on figuring out this next book. I think I’m getting closer. But I always think that.
The second reason I haven’t been blogging much: I’ve been mildly depressed. No cause for alarm. It’s situational and it’s getting better. I’ve been looking for a new job because the Harvard Library Innovation Lab that I’ve co-directed, with the fabulous Kim Dulin, for almost five years has been given a new mission. I’m very proud of what we — mainly the amazing developers who are actually more like innovation fellows — have done, and I’m very sorry to leave. Facing unemployment hasn’t helped my mood. There have been some other stresses as well. So: somewhat depressed. And that makes it harder for me to post to my blog for some reason.
I thought you might want to know, not that anyone cares [Sniffles, idly kicks at a stone in the ground, waits for a hug].
November 24, 2013
I spent some time this morning happily browsing advice from famous writers on how to write, thanks to Maria Popova’s [twitter:BrainPickings] own writings on those writers writing about writing. Here’s Maria’s latest, which is about Anne Lamott’s Bird by Bird, an excellent (and excellently written!) piece that also contains links to famous writers on said topic.
Some of these pieces were familiar, some not, but all convinced me of one thing: writers should re-label their advice on how to write as “How I Write.” I find myself irked by every one of them into looking for counter-examples, even though I personally agree with much of what they say, and in many instances find their comments remarkably insightful.
Still, I want to push back when, for example, Susan Sontag says:
Your job is to see people as they really are, and to do this, you have to know who you are in the most compassionate possible sense. Then you can recognize others.
Yet you can’t throw a cat into a room full of writers without hitting someone wildly self-deceptive and unknowing. For example, Sontag’s own writing about writing ranges from breathtakingly perceptive to provocative to transparently self-aggrandizing.
Likewise, Elmore Leonard’s brilliant 10 rules of writing are clearly not rules for how to write, but rules for how to write like Elmore Leonard. (His ten rules are themselves a great example of his own style.) For instance, there’s #4:
Never use an adverb to modify the verb “said”
Ok.
I even find myself pushing back against one of his rules that I greatly admire:
“If it sounds like writing … rewrite it.”
I love that…except that what do we do with Bernini? His Apollo and Daphne statue — the one where Daphne’s fingers sprout translucent leaves — is so realistic and yet so marble that one cannot look at it without thinking, “Holy crap! That’s marble!!!” (By the way, I just violated Leonard’s rule #5: “Keep your exclamation points under control.” He’s right about that.) Likewise, are we sure that no poetry is allowed to sound like writing?
Meanwhile, David Ogilvy — the model for Don Draper in pitch-mode, and a writer I admire greatly — is stylistically in sync with Elmore Leonard, but disagrees with both Leonard’s and Sontag’s rules. (Note: That was a highly imperfect sentence. Welcome to my blog.) Agreeing with Leonard, Ogilvy demands simplicity and avoiding pretentious, abstract terms. But his second rule says:
Write the way you talk. Naturally.
What do you say to that, Elmore? If you write the way you talk, will it sound like writing? And, David, suppose you don’t talk so good?
And Ogilvy’s eighth rule says:
If it is something important, get a colleague to improve it.
I’m not sure that Sontag’s insistence that writing requires something like personal authenticity allows for editing by colleagues. Why can’t “Hire yourself the best goddamn editor you can find” be an important Rule for Writers? And before you assume that such a needy writer must be a pathetic schlub who on her/his own is writing schlock, keep in mind that The New Yorker has a tradition of featuring truly superb writers in part because of the strength of its editors.
Maria Popova’s essays on writers advising writers (which, let me reiterate, I admire and enjoy) include some pieces of advice that are incontestable, but in the bad sense that they verge on being tautologies. For example, Lamott says:
Perfectionism is the voice of the oppressor, the enemy of the people. It will keep you cramped and insane your whole life, and it is the main obstacle between you and a shitty first draft.
That’s certainly true if perfectionism means a paralyzing perfectionism, i.e., the sort of perfectionism that keeps you cramped and insane, and that prevents you from doing a shitty first draft. (You have to love Lamott’s rule-violating use of “shitty.”) But there is also a type of perfectionism that makes an author worry over every broken rhythm and soft imprecision, and that ultimately results in lapidary works. Also, I’d venture that for most authors, the real obstacle to getting to that shitty first draft is not perfectionism but the fact that they’re just too damn tired when they get home from work.
The thing is, I agree with Lamott about perfectionism. It’s one reason I like blogging. I’m in favor of filling in the spaces between writing and speaking, between publishing and drafting. Even so, I find myself so insistently pushing back against advice from writers that it makes me wonder why. Maybe…
…Maybe it’s because I don’t think there’s such a thing as “writing” except in its most literal sense: putting marks on a rectangular surface. Beyond that, there is nothing that holds the concept of writing together.
This still makes it better than “communication,” an abstraction that gets wrong what it is an abstraction from. Still, communication provides a useful analogy. To give advice on how to communicate well, one will have to decide ahead of time what type of communication one is referring to. Wooing? Convincing a jury? Praying? Writing a murder mystery? Asking for change from strangers? Muttering imprecations at the fact of dusk? Yelling “Fie! Her!!” in a crowded theater? Even basic rules like “Speak clearly” assume that one is communicating orally and that one is not Marlon Brando auditioning for a part. And even within any one domain or task of communication, the best practices are really about maintaining a form of rhetoric, not about communicating well.
There are plenty of tips about how to write the thing one wants to write. These tips can be very helpful. For example, I have a friend who swears by Write Or Die to help her get her shitty first draft down on paper. (No, my friend, your first drafts really aren’t shitty. I was using a technique I recommend that everyone use because I use it: the callback.) That tip works for her, but not for me. Still, I’m in favor of tips! But tips are “How I write” or “How I’ve heard some other people write,” not “How to write.”
How to write? I dunno. Lots of ways, I guess.
September 14, 2013
John Sundman is a heck of an interesting person. He’s been around the technology circuit from the Old Days (we’re peers in the chronological sense) but he also writes damn good fiction, some of which (Cheap Complex Devices [my review][sf site][goodreads]) is pretty sublime.
So how does a talented writer make a living in the Webby world? He and I have a long conversation about that and many other things.
August 16, 2012
I suspect there’s a lot of truth in Richard MacManus’ post at ReadWriteWeb about where Web publishing is going. In particular, I think the growth of topic streams is pretty much close to inevitable, whether this occurs via Branch + Medium (and coming from Ev Williams, I suspect that at the very least they’ll give Web culture a very heavy nudge) and/or through other implementations.
Richard cites two sites for this insight: Anil Dash and Joshua Benton at the Nieman Journalism Lab. Excellent posts. But I want to throw in a structural reason why topics are on the rise: authors don’t scale.
It is certainly the case that the Web has removed the hold the old regime had over who got to publish. To a lesser but still hugely significant extent, the Web has loosened the hold the old regime had on who among the published gets attention; traditional publishers can still drive views via traditional marketing channels, but tons more authors/creators are coming to light outside of those channels. Further, the busting up of mass culture into self-forming networks of interest means that a far wider range of authors can be known to groups that care about them and their topics. Nevertheless, there is a limit within any one social network — and within any one human brain — to how many authors can be emotionally committed to.
There will always be authors who are read because readers have bonded with them through the authors’ work. And the Web has enlarged that pool of authors by enabling social groups to find their own set, even if many authors’ fame is localized within particular groups. But there are only so many authors you can love, and only so many blogs you can visit in a day.
Topics, on the other hand, are a natural way to handle the newly scaled web of creators. Topics are defined as the ideas we’re interested in, so, yes, we’re interested in them! They also provide a very useful way of faceting through the aggregated web of creators — slicing through the universe of authors to pull in what’s interesting and relevant to the topic. There may be only so many topics you can be interested in (at least when topics get formalized, because there’s no limit to the things our curiosity pulls us toward), but within a topic, you can pull in many more authors, many of whom will be previously unknown and most of whose names will go by unnoticed.
I would guess that we will forever see a dialectic between topics and authors in which a topic brings an author to our attention to whom we then commit, and an author introduces a topic to which we then subscribe. But we’ve spent the past 15 years scaling authorship. We’re not done yet, but it’s certainly past time for progress in scaling topics.
September 3, 2011
When I run into someone who wants to talk with me about something I’ve written in a book, they quite naturally assume that I am more expert about what I’ve written than they are. But it’s almost certainly the case that they’re more familiar with it than I am because they’ve read it far more recently than I have. I, like most writers, don’t sit around re-reading myself. I therefore find myself having to ask the person to remind me of what I’ve said. Really, I said that?
But, over the past twenty-four hours, I’ve re-read myself in three different modes.
I’ve been wrapped up in a Library Innovation Lab project that we submitted to the Digital Public Library of America on Thursday night, with 1.5 hours to spare before the midnight deadline. Our little team worked incredibly hard all summer long, and what we submitted we think is pretty spectacular as a vision, a prototype of innovative features, and in its core, work-horse functionality. (That’s why I’ve done so little blogging this summer.)
So, the first example of re-reading is editing a bunch of explanatory Web pages — a FAQ, a non-tech explanation of some hardcore tech, a guided tour, etc. — that I wrote for our DPLA project. In this mode, I feel little connection to what I’ve written; I’m trying to edit it purely from the reader’s point of view, as if someone else had written it. Of course, I am oblivious to many of the drafts’ most important shortcomings because I’m reading them through the same glasses I had on when I wrote them. Things make sense to me that would not to readers who have the good fortune not to be me. Nevertheless, it’s just a carpentry job, trying to sand down edges and make the pieces fit. It’s the wood that matters, not whoever the carpenter happened to be.
In the second mode, I re-read something I wrote a long time ago. Someone on the Heidegger mailing list I audit asked for articles on Heidegger’s concept of the “world” in Being and Time and in The Origin of the Work of Art. I remembered that I had written something about that a couple of careers ago. So, I did a search and found “Earth, World and the Fourfold” in the 1984 edition of Tulane Studies in Philosophy. (It’s locked up nice and tight so, no, you can’t read it even if you want to. Yeah, this is a completely optimal system of scholarship we’ve built for ourselves. [sarcasm]) I used my privileged access via my university and re-read it. It’s a fully weird experience. I remember so little of the content of the article and am so disassociated from the academic (or more exactly, the pathetic pretender to the same) I was that it was like reading a message from a former self. Actually, it wasn’t like that. It was that exactly.
I actually enjoyed reading the article. For one thing, unsurprisingly, I agreed with its general outlook and approach. It argues that Heidegger’s shifting use of “world,” especially with regard to that which he contrasts it with, expresses his struggle to deal with the danger that phenomenology will turn reality into a mere appearance. How can phenomenology account for that which shows itself to us as being beyond the mere showing? That is, how do we understand and acknowledge the fact that the earth shows itself to us as that which was here before us and will outlast us?
Since this was the topic of my doctoral dissertation and has remained a topic of great interest to me — it runs throughout all my books, including Too Big to Know — it’s not startling that I found Previous Me’s article interesting. And yet, Present Me persistently asked two sorts of distancing questions.
First, granting that the question itself is interesting, why was this guy (Previous Me) so wrapped up in Heidegger’s way of grappling with it? To get to Heidegger’s answers (such as they are) you have to wade through a thicket wrapped in profound scholarship wrapped in arrogantly awful writing. Now, Present Me remembers the personal history that led Previous Me to Heidegger: an identity crisis (as we used to call it) that manifested itself intellectually, that could not be addressed by pre-Heideggerian traditional philosophy (because that tradition of philosophy caused the intellectual conundrum in the first place). But outside of that personal history, why Heidegger? So, the article reads to Present Me as a wrestling match within a bubble invisible to Previous Me.
Second, my internal editor was present throughout: Damn, that was an inelegant phrase! Wait, this paragraph needs a transition! What the hell did that sentence mean? Jeez, this guy sounds pretentious here!
So, reading something of mine from the distant past was a tolerable and even interesting experience because Previous Me was distant enough.
In the third mode, I am this weekend reading the page proofs of Too Big to Know. At this point in the manufacturing process known as “writing a book,” I am allowed only to make the most minor of edits. If a change causes lines on the page to shift to a new page, there can be consequences expensive to my publisher. So, I’m reading looking for bad commas and “its” instead of “it’s”, all of which should have been (and so far have been) picked up by Christine Arden, the superb copy-editor who went through my book during the previous pass. But I also am reading it looking for infelicities I can fix — maybe change an “a” to “the” or some such. This requires reading not just for punctuation but also for rhythm and meaning. In other words, I simultaneously have to read the book as if I were a reader, not just an editor. And that is a disconcerting, embarrassing, frustrating process. There are things about the book that pleasantly surprise me — Where did I come up with that excellent example! — but fundamentally I am focused on it critically. Worse, this is Present Me seeing how Present Me presents himself to the world. I am in a narcissistic bubble of self-loathing.
Which is too bad since this is the taste being left by what is very likely to be the last time I read Too Big to Know.
(My publisher would probably like me to note that the book is possibly quite good, and the people who have read it so far seem enthusiastic. But, how the hell would I know before you tell me?)