At a recent Fellows Hour at the Berkman Center the topic was something like “Whatever happened to blogging?,” with the aim of thinking about how Berkman can take better advantage of blogging as a platform for public discussion. (Fellow Hours are private. No, this is not ironic.) They asked me to begin with some reflections on what blogging once was, because I am old. Rather than repeating what I said, here are some thoughts heavily influenced by the discussion.
And an important preface: What follows is much more of a memoir than a history. I understand that I’m reporting on how blogging looked to someone in a highly privileged position. For example, the blogosphere (remember when that was a word?) as I knew it didn’t count LiveJournal as a blogging service, I think because it wasn’t “writerly” enough, and because of demographic differences that themselves reflect several other biases.
I apparently began blogging in 1999, which makes me early to the form. But I didn’t take to it, and it was only on Nov. 15, 2001 that I began in earnest (blogging every day for twelve years counts as earnest, right?), which puts me on the late edge of the first wave, I believe. Blogging at that point was generating some interest among the technorati, but was still far from mainstream notice. Or, to give another measure, for the first year or so, I was a top 100 blogger. (The key to success: If you can’t compete on quality, redefine your market down.)
Blogging mattered to us more deeply than you might today imagine. I’d point to three overall reasons, although I find it not just hard but even painful to try to analyze that period.
1. Presence. I remember strolling through the vendor exhibits at an Internet conference in the mid 1990s. It seemed to be a solid wall of companies large and small each with the same pitch: “Step into our booth and we’ll show you how to make a home page in just 3 minutes.” Everyone was going to have a home page. I wish that had worked out. But even those of us who did have one generally found them a pain in the neck to update; FTPing was even less fun then than it is now.
When blogs came along, they became the way we could have a Web presence that enabled us to react, respond, and provoke. A home page was a painting, a statue. My blog was me. My blog was the Web equivalent of my body. Being-on-the-Web was turning out to be even more important and more fun than we’d thought it would be.
2. Community. Some of us had been arguing from the beginning of the Web that the Web was more a social space than a publishing, informational or commercial space — “more” in the sense of what was driving adoption and what was making the Web the dominant shaping force of our culture. At the turn of the millennium there was no MySpace (2003) and no Facebook (2004). But there was blogging. If blogging enabled us to create a Web presence for ourselves, blogging was also self-consciously about connecting those presences into a community. (Note that such generalizations betray that I am speaking blindly from personal experience.)
That’s why blogrolls were important. Your blogroll was a list of links to the bloggers you read and engaged with. It was a way of sending people away from your site into the care of someone else who would offer up her own blogroll. Blogrolls were an early social network.
At least among my set of bloggers, we tried to engage with one another and to do so in ways that would build community. We’d “retweet” and comment on other people’s posts, trying to add value to the discussion. Of course not everyone played by those rules, but some of us had hope.
And it worked. I made friendships through blogging that have lasted to this day, sometimes without ever having been in the same physical space.
(It says something about the strength of our community that it was only in 2005 that I wrote a post titled No, I’m not keeping up with your blog. Until that point, keeping up was sort of possible.)
3. Disruption. We were aware that the practice of blogging upset many assumptions about who gets to speak, how we speak, and who is an authority. Although blogging is now taken for granted at best and can seem quaint at worst, we thought we were participating in a revolution. And we were somewhat right. The invisibility of the effects of blogging — what we take for granted — is a sign of the revolution’s success. The changes are real but not as widespread or deep as we’d hoped.
Of course, blogging was just one of the mechanisms for delivering the promise of the Net that had us so excited in the first place. The revolution is incomplete; it also runs deeper than we usually acknowledge.
To recapture some of the fervor, it might be helpful to consider what blogging was understood in contrast to. Here are some of the distinctions discussed at the time.
Experts vs. Bloggers. Experts earned the right to be heard. Bloggers signed up for a free account somewhere. Bloggers therefore add more noise than signal to the discussion. (Except: Much expertise has migrated to blogs, blogs have uncovered many experts, and the networking of bloggy knowledge makes a real difference.)
Professionals vs. Amateurs. Amateurs could not produce material as good as professionals because professionals have gone through some controlled process to gain that status. See “Experts vs. Bloggers.”
Newsletters vs. Posts. Newsletters and ‘zines (remember when that was a word?) lowered the barrier to individuals posting their ideas in a way that built a form of Web presence. Blogs intersected uncomfortably with many online newsletters (including mine). Because it was assumed that a successful blog needed new posts every day or so, content for blogs tended to be shorter and more tentative than content in newsletters.
Paid vs. Free. Many professionals simply couldn’t understand how or why bloggers would work for free. It was a brand new ecosystem. (I remember during an interview on the local Boston PBS channel having to insist repeatedly that, no, I really really wasn’t making any money blogging.)
Good vs. Fast. If you’re writing a couple of posts a day, you don’t have time to do a lot of revising. On the other hand, this made blogging more conversational and more human (where “human” = fallible, imperfect, in need of a spelpchecker).
One-way vs. Engaged. Writers rarely got to see the reaction of their readers, and even more rarely were able to engage with readers. But blogs were designed to mix it up with readers and other bloggers: permalinks were invented for this very purpose, as were comment sections, RSS feeds, etc.
Owned vs. Shared. I don’t mean this to refer to copyright, although that often was an important distinction between old media and blogs. Rather, in seeing how your words got taken up by other bloggers, you got to see just how little ownership writers have ever had over their ideas. If seeing your work get appropriated by your readers made you uncomfortable, you either didn’t blog or you stopped up your ears and covered your eyes so you could simulate the experience of a mainstream columnist.
Reputation vs. Presence. Old-style writing could make your reputation. Blogging gave you an actual presence. It was you on the Web.
Writing vs. Conversation. Some bloggers posted without engaging, but the prototypical blogger treated a post as one statement in a continuing conversation. That often made the tone more conversational and lowered the demand that one present the final word on some topic.
Journalists vs. Bloggers. This was a big topic of discussion. Journalists worried that they were going to be replaced by incompetent amateurs. I was at an early full-day discussion at the Berkman Center between Big Time Journalists and Big Time Bloggers at which one of the bloggers was convinced that foreign correspondents would be replaced by bloggers crowd-sourcing the news (except this was before Jeff Howe [twitter: crowdsourcing] had coined the term “crowd-sourcing”). It was very unclear what the relationship between journalism and blogging would be. At this meeting, the journalists felt threatened and the bloggers suffered a bad case of Premature Triumphalism.
Objectivity vs. Transparency. Journalists were also quite concerned about the fact that bloggers wrote in their own voice and made their personal points of view known. Many journalists — probably most of them — still believe that letting readers know about their own political stances, etc., would damage their credibility. I still disagree.
I was among the 30 bloggers given press credentials at the 2004 Democratic National Convention — which was seen as a milestone in the course of blogging’s short history — and attended the press conference for bloggers put on by the DNC. Among the people they brought forward (including not-yet-Senator Obama) was Walter Mears, a veteran, Pulitzer-winning journalist who had just started a political blog for the Associated Press. I asked whom he was going to vote for. He demurred: if he told us, he said, how could we trust his writing? I replied something like, “Then how will we trust your blog?” Transparency is the new objectivity, or so I’ve been told.
It is still the case that for the prototypical blog, it’d be weird not to know where the blogger stands on the issues she’s writing about. On the other hand, in this era of paid content, I personally think it’s especially incumbent on bloggers to be highly explicit not only about where they are starting from, but who (if anyone) is paying the bills. (Here’s my disclosure statement.)
For me, it was Clay Shirky’s Power Law post that rang the tocsin. His analysis showed that the blogosphere wasn’t a smooth ball where everyone had an equal voice. Rather, it was dominated by a handful of sites that pulled enormous numbers, followed by a loooooooooong tail of sites with a few followers. The old pernicious topology had reasserted itself. We should have known that it would, and it took a while for the miserable fact to sink in.
Yet there was hope in that long tail. As Chris Anderson pointed out in a book and article, the area under the long tail is bigger than the area under the short head. For vendors, that means there’s lots of money in the long tail. For bloggers, that means there are lots of readers and conversationalists under the long tail. More important, the long tail of blogs was never homogeneous; the small clusters that formed around particular interests can have tremendous value that the short head can never deliver.
So, were we fools living in a dream world during the early days of blogging? I’d be happy to say yes and be done with it. But it’s not that simple. The expectations around engagement, transparency, and immediacy for mainstream writing have changed in part because of blogs. We have changed where we turn for analysis, if not for news. We expect the Web to be easy to post to. We expect conversation. We are more comfortable with informal, personal writing. We get more pissed off when people write in corporate or safely political voices. We want everyone to be human and to be willing to talk with us in public.
So, from my point of view, it’s not simply that the blogosphere got so big that it burst. First, the overall media landscape does look more like the old landscape than the early blogosphere did, but at the more local level – where local refers to interests – the shape and values of the old blogosphere are often maintained. Second, the characteristics and values of the blogosphere have spread beyond bloggers, shaping our expectations of the online world and even some of the offline world.
[The next day:] Suw Charman-Anderson’s comment (below) expresses beautifully much of what this post struggles to say. And it’s wonderful to hear from my bloggy friends.
Tagged with: blogging • web 2.0
Date: January 8th, 2014 dw
I saw it on New Year’s Eve and liked it a lot. But I think it’s best taken as a series of brilliant set pieces. String them together and you have a fairly predictable narrative arc, and a thematic point ([SPOILER] greed is bad) that isn’t going to change anyone’s mind. But the set pieces are incredibly well done because Scorsese. And Leonardo DiCaprio is just great in it.
Some people are upset because the movie doesn’t condemn the behavior it depicts. Yikes. Scorsese is obviously showing us behavior he finds so extraordinarily bad that he was motivated to make a movie about it. To tack on some moralizing elements would only lessen the impact, because that would imply that we need to be told that the behavior depicted is bad.
Mike Ryan at HuffPo writes about this question and, citing Chuck Klosterman, compares Leo’s character to Archie Bunker. But there’s very little to understand about Archie. He’s a bigot and ignorant. Haha. Wolf of Wall Street instead shows us a subculture that is twisted and extreme, but is coherent within its own little world. There’s something to understand there, which is why Leo is able to give an Oscar-worthy performance. In that, it’s much like The Godfather or The Sopranos, not to mention Goodfellas. It is also more like American Psycho than like Wall Street. (And speaking of Oliver Stone, one of my very least favorite directors: if you want to be hit repeatedly with a gigantic Morality Hammer, watch Platoon, if you can get through it.)
Goodfellas is a better movie than Wolf (in my opinion, natch) because it is less predictable, the main character is more morally nuanced, there are more unforgettable characters, etc. But I thought Wolf was very good, very entertaining, and treated us like moral grownups.
Not that all of us are.
Tagged with: movies
Date: January 3rd, 2014 dw
Gary King [twitter:kinggarry], Director of Harvard’s Institute for Quantitative Social Science, has published an article (Open Access!) on the current status of this branch of science. Here’s the abstract:
The social sciences are undergoing a dramatic transformation from studying problems to solving them; from making do with a small number of sparse data sets to analyzing increasing quantities of diverse, highly informative data; from isolated scholars toiling away on their own to larger scale, collaborative, interdisciplinary, lab-style research teams; and from a purely academic pursuit focused inward to having a major impact on public policy, commerce and industry, other academic fields, and some of the major problems that affect individuals and societies. In the midst of all this productive chaos, we have been building the Institute for Quantitative Social Science at Harvard, a new type of center intended to help foster and respond to these broader developments. We offer here some suggestions from our experiences for the increasing number of other universities that have begun to build similar institutions and for how we might work together to advance social science more generally.
In the article, Gary argues that Big Data requires Big Collaboration to be understood:
Social scientists are now transitioning from working primarily on their own, alone in their offices — a style that dates back to when the offices were in monasteries — to working in highly collaborative, interdisciplinary, larger scale, lab-style research teams. The knowledge and skills necessary to access and use these new data sources and methods often do not exist within any one of the traditionally defined social science disciplines and are too complicated for any one scholar to accomplish alone.
He begins by giving three excellent examples of how quantitative social science is opening up new possibilities for research.
1. Latanya Sweeney [twitter:LatanyaSweeney] found “clear evidence of racial discrimination” in the ads served up by newspaper websites.
2. A study of all 187M registered voters in the US showed that a third of those listed as “inactive” in fact cast ballots, “and the problem is not politically neutral.”
3. A study of 11M social media posts from China showed that the Chinese government is not censoring speech but is censoring “attempts at collective action, whether for or against the government…”
Studies such as these “depended on IQSS infrastructure, including access to experts in statistics, the social sciences, engineering, computer science, and American and Chinese area studies.”
Gary also points to “the coming end of the quantitative-qualitative divide” in the social sciences, as new techniques enable massive amounts of qualitative data to be quantified, enriching purely quantitative data and extracting additional information from the qualitative reports.
Instead of quantitative researchers trying to build fully automated methods and qualitative researchers trying to make do with traditional human-only methods, now both are heading toward using or developing computer-assisted methods that empower both groups.
We are seeing a redefinition of social science, he argues:
We instead use the term “social science” more generally to refer to areas of scholarship dedicated to understanding, or improving the well-being of, human populations, using data at the level of (or informative about) individual people or groups of people.
This definition covers the traditional social science departments in faculties of schools of arts and science, but it also includes most research conducted at schools of public policy, business, and education. Social science is referred to by other names in other areas but the definition is wider than use of the term. It includes what law school faculty call “empirical research,” and many aspects of research in other areas, such as health policy at schools of medicine. It also includes research conducted by faculty in schools of public health, although they have different names for these activities, such as epidemiology, demography, and outcomes research.
The rest of the article reflects on pragmatic issues, including what this means for the sorts of social science centers to build, since community is “by far the most important component leading to success.” As Gary puts it, “If academic research became part of the X-games, our competitive event would be ‘extreme cooperation.’”
Tagged with: 2b2k • big data • social science
Date: January 2nd, 2014 dw
It’s been a disappointing year for those of us who enjoy Top Ten Top Ten lists. Last year I found an arguable ten or so for my Top Ten Top Ten Top Ten list.
This year, my Top Ten Top Ten Top Ten list contains only four:
Then there were some sites that seemed promising but can’t count in a list as rigorous as this. I have my standards, people!
Not only is this a disaster for this year’s list, it calls into question our progress toward the dream: a Top Ten Top Ten Top Ten Top Ten list. But I have hope. We can do this!
Tagged with: humor
Date: January 1st, 2014 dw
I’ve been spending TV time taking digital photographs of every page of our family photo albums. Sure, it’d be better to digitize each one individually, but it turns out that what I’m doing is way better than never getting around to doing it right.
Tagged with: photos
Date: December 30th, 2013 dw
The history of Western philosophy usually has a presumed shape: there’s a known series of Great Men (yup, men) who in conversation with their predecessors came up with a coherent set of ideas. You can list them in chronological order, and cluster them into schools of thought with their own internal coherence: the neo-Platonists, the Idealists, etc. Sometimes, the schools and not the philosophers are the primary objects in the sequence, but the topology is basically the same. There are the Big Ideas and the lesser excursions, the major figures and the supporting players.
Of course the details of the canon are always in dispute in every way: who is included, who is major, who belongs in which schools, who influenced whom. A great deal of scholarly work is given over to just such arguments. But there is some truth to this structure itself: philosophers traditionally have been shaped by their tradition, and some have had more influence than others. There are also elements of a feedback loop here: you need to choose which philosophers you’ll teach in philosophy courses, so you act responsibly by first focusing on the majors, and by so doing you confirm for the next generation that the ones you’ve chosen are the majors.
But I wonder if in one or two hundred years philosophers (by which I mean the PT-3000 line of Cogbots™) will mark our era as the end of the line — the end of the linear sequence of philosophers. Rather than a sequence of recognized philosophers in conversation with their past and with one another, we now have a network of ideas being passed around, degraded by noise and enhanced by pluralistic appropriation, but without owners — at least without owners who can hold onto their ideas long enough to be identified with them in some stable form. This happens not simply because networks are chatty. It happens not simply because the transmission of ideas on the Internet occurs through a p2p handoff in which each of the p’s re-expresses the idea. It happens also because the discussion is no longer confined to a handful of extensively trained experts with strict ideas about what is proper in such discussions, and who share a nano-culture that supersedes the values and norms of their broader local cultures.
If philosophy survives as anything more than the history of thought, perhaps we will not be able to outline its grand movements by pointing to a handful of thinkers but will point to the webs through which ideas passed, or, more exactly, the ideas around which webs are formed. Because no idea passes through the Web unchanged, it will be impossible to pretend that there are “ideas-in-themselves” — nothing like, say, Idealism which has a core definition albeit with a history of significant variations. There is no idea that is not incarnate, and no incarnation that is not itself a web of variations in conversation with itself.
I would spell this out for you far more precisely, but I don’t know what I’m talking about, beyond an intuition that the tracks end at the trampled field in which we now live.
Tagged with: 2b2k • too big to know
Date: December 28th, 2013 dw
I know it’s the day after the day after Christmas, but I’m still going to give you a gift. A gift of Schiff.
I heard Andras Schiff on the radio a couple of days ago and it reminded me how much I’ve enjoyed his discussions of Beethoven’s piano sonatas before he’s performed them. He plays with passion but has an analytic understanding of the compositions. And, no, I’m not sure why I used “but” as the conjunction in that sentence.
Anyway, you can download the lectures here, thanks to The Guardian. (Thank you, The Guardian!)
Schiff said on the radio the other day that as he gets older, his understanding increases but his technical ability decreases. It makes me hope that we get some software that lets a master like him manipulate musical notation to produce a digital version of the performance that he would have liked to be able to give. Or will it turn out that there are so many variables for how you strike a note and string them together that such software is like wishing that Meryl Streep could instruct a digital avatar to act as well as she does?
Tagged with: acting
Date: December 27th, 2013 dw
I had a chance to talk with Dan Brickley today, a semanticizer of the Web whom I greatly admire. He’s often referred to as a co-creator of FOAF, but these days he’s at Google working on Schema.org. He pointed me to the work Schema has been doing with online datasets, which I hadn’t been aware of. Very interesting.
Schema.org, as you probably know, provides a set of terms you can hide inside the HTML of your page that annotate what the visible contents are about. The major search engines — Google, Bing, Yahoo, Yandex — notice this markup and use it to provide more precise search results, and also to display results in ways that present the information more usefully. For example, if a recipe on a page is marked up with Schema.org terms, the search engine can identify the list of ingredients and let you search on them (“Please find all recipes that use butter but not garlic”) and display them in a more readable way. And of course it’s not just the search engines that can do this; any app that is looking at the HTML of a page can also read the Schema markup. There are Schema.org schemas for an ever-expanding list of types of information…and now datasets.
If you go to Schema.org/Dataset and scroll to the bottom where it says “Properties from Dataset,” you’ll see the terms you can insert into a page that talk specifically about the dataset referenced. It’s quite simple at this point, which is an advantage of Schema.org overall. But you can see some of the power of even this minimal set of terms over at Google’s experimental Schema Labs page where there are two examples.
The first example (click on the “view” button) does a specialized Google search looking for pages that have been marked up with Schema’s Dataset terms. In the search box, try “parking,” or perhaps “military.” Clicking on a return takes you to the original page that provides access to the dataset.
The second demo lets you search for databases related to education via the work done by LRMI (Learning Resource Metadata Initiative); the LRMI work has been accepted (except for the term useRightsUrl) as part of Schema.org. Click on the “view” button and you’ll be taken to a page with a search box, and a menu that lets you search the entire Web or a curated list. Choose “entire Web” and type in a search term such as “calculus.”
This is such a nice extension of Schema.org. Schema was designed initially to let computers parse information on human-readable pages (“Aha! ‘Butter’ on this page is being used as a recipe ingredient and on that page as a movie title”), but now it can be used to enable computers to pull together human-readable lists of available datasets.
I continue to be a fan of Schema because of its simplicity and pragmatism, and, because the major search engines look for Schema markup, people have a compelling reason to add markup to their pages. Obviously Schema is far from the only metadata scheme we need, nor does it pretend to be. But for fans of loose, messy, imperfect projects that actually get stuff done, Schema is a real step forward that keeps taking more steps forward.
Here’s a recipe for a Manhattan cocktail that I like. The idea of adding Kahlua came from a bartender in Philadelphia. I call it a Bogotá Manhattan because of the coffee.
You can’t tell by looking at this post that it’s marked up with Schema.org codes, unless you View Source. These codes let the search engines (and any other computer program that cares to look) recognize the meaning of the various elements. For example, the line “a splash of Kahlua” actually reads:
<span itemprop="ingredients">a splash of Kahlua</span>
“itemprop=ingredients” says that the visible content is an ingredient. This does not help you as a reader at all, but it means that a search engine can confidently include this recipe when someone searches for recipes that contain Kahlua. Markup makes the Web smarter, and Schema.org is a lightweight, practical way of adding markup, with the huge incentive that the major search engines recognize Schema.
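To make that concrete, here’s a minimal sketch (mine, not from the original post) of how any program — not just a search engine — can read Schema markup out of a page. It uses only Python’s standard library; the sample page and the IngredientParser class are hypothetical illustrations, not anything Schema.org itself provides.

```python
# A minimal sketch of reading Schema.org microdata from HTML.
# The sample page and IngredientParser are hypothetical illustrations.
from html.parser import HTMLParser

class IngredientParser(HTMLParser):
    """Collects the text of every element marked itemprop="ingredients"."""

    def __init__(self):
        super().__init__()
        self._in_ingredient = False
        self.ingredients = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) tuples
        if dict(attrs).get("itemprop") == "ingredients":
            self._in_ingredient = True
            self.ingredients.append("")

    def handle_data(self, data):
        # Accumulate the visible text inside the marked-up element
        if self._in_ingredient:
            self.ingredients[-1] += data

    def handle_endtag(self, tag):
        self._in_ingredient = False

page = """
<div itemscope itemtype="http://schema.org/Recipe">
  <span itemprop="name">Bogota Manhattan</span>
  <span itemprop="ingredients">1 shot bourbon</span>
  <span itemprop="ingredients">a splash of Kahlua</span>
</div>
"""

parser = IngredientParser()
parser.feed(page)
print(parser.ingredients)  # ['1 shot bourbon', 'a splash of Kahlua']
```

The same few lines work for any itemprop; a search engine just does this at Web scale, which is why marking up your page pays off.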
So, here goes:
A variation on the classic Manhattan — a bit less bitter, and a bit more complex.
Prep Time: 3 minutes
Yield: 1 drink
1 shot bourbon
1 shot sweet Vermouth
A few shakes of Angostura bitters
A splash of Kahlua
A smaller splash of grenadine or maraschino cherry juice
1 maraschino cherry and/or small slice of orange as garnish. Delicious garnish.
Shake together with ice. Strain and serve in a martini glass, or (my preference) violate all norms by serving in a small glass with ice.
Here’s the Schema.org markup for recipes.
So, some guy on a TV show I never saw said some stuff I don’t agree with about homosexuality. He thinks it’s a sin akin to a whole bunch of other sex-related sins. After the affair blew up, he responded, “I would never treat anyone with disrespect just because they are different from me. We are all created by the Almighty and like Him, I love all of humanity.” In the original interview he also described his experience as “white trash” working alongside African-Americans, saying that he never saw them mistreated. I believe him. He never saw that. Ok.
I don’t much care about the details of the incident, so if you want to tell me that I’m not understanding the horribleness of what he said, I’m not going to argue with you. I really haven’t researched it. But the debate is irking me.
I am reading too many of my compatriots — and, by the way, welcome to marriage equality, New Mexico! — saying that it was ok for A&E to fire Phil Robertson (the Duck Dynasty guy in question) because the First Amendment constrains the actions only of the government. So, I assume A&E had every legal and Constitutional right to fire Robertson for what he said.
So what? The question isn’t what A&E is allowed to do and what the First Amendment forbids. The question is: What makes this country a better place in which to live? Do we want to live in a place where you can’t state your opinion without worrying that you may be fired? How much variance from the orthodoxy are we willing to permit? And, yes, I feel the same way about refusing to buy from a local store because it has a political sign in its window that you disagree with. Your Republican hardware store owner has a right to make a living!
Do we really think America is better if the many people who think homosexuality is a sin are forbidden from saying so? The ironic revenge of Don’t ask, don’t tell?
Jeez. We need some room for disagreement here!
Just to anticipate the comments: Yes, I would feel the same way if he had said, “Everyone knows the Jews own the banks.” And, yes, there are things he could say that would make him so toxic that I’d agree that the network should fire him. For example, if he had threatened violence, or had used language so inflammatory that it could lead to violence. There are lines. We’re just drawing them wrong. IMO.
Tagged with: duck dynasty • free speech
Date: December 19th, 2013 dw