
May 11, 2014

[2b2k] “In Over Our Heads” – My Simmons commencement address

On Friday, I had the tremendous honor of being awarded a Doctor of Letters degree from Simmons College, and giving the Commencement address at the Simmons graduate students’ ceremony.

Simmons is an inspiring place, and not only for its deep commitment to educating women. Being honored this way — especially along with Ruth Ellen Fitch, Madeleine M. Joullié, and Billie Jean King — made me very, very happy.

Thank you so much to Simmons College, President Drinan, and the Board of Trustees for this honor, which means the world to me. I’m just awed by it. Also, Professor Candy Schwartz and Dean Eileen Abels, a special thank you. And this honor is extra-special meaningful because my father-in-law, Marvin Geller, is here today, and his sister, Jeannie Geller Mason, was, as she called herself, a Simmons girl, class of 1940. Afterwards, Marvin will be happy to sing you the old “We are the girls of Simmons C” college song if you ask.

So, first, to the parents: I have been in your seat, and I know how proud – and maybe relieved – you are. So, congratulations to you. And to the students, it’s such an honor to be here with you to celebrate your being graduated from Simmons College, a school that takes seriously the privilege of helping its students not only to become educated experts, but to lead the next cohort in their disciplines and professions.

Now, as I say this, I know that some of you may be shaking your inner heads, because a commencement speaker is telling you about how bright your futures are, but maybe you have a little uncertainty about what will happen in your professions and with your career. That’s not only natural, it’s reasonable. But, some of you – I don’t know how many — may be feeling beyond that an uncertainty about your own abilities. You’re being officially certified with an advanced degree in your field, but you may not quite feel the sense of mastery you expected.

In other words, you feel the way I do now. And the way I did in 1979 when I got my doctorate in philosophy. I knew well enough the work of the guy I wrote my dissertation on, but I looked out at the field and knew just how little I knew about so much of it. And I looked at other graduates, and especially at the scholars and experts who had been teaching us, and I thought to myself, “They know so much more than I do.” I could fake it pretty well, but actually not all that well.

So, I want to reassure those of you who feel the way that I did and do, I want to reassure you that that feeling of not really knowing what you should, that feeling may stay with you forever. In fact, I hope it does — for your sake, for your profession, and for all of us.

But before explaining, I need to let you in on the secret: You do know enough. It’s like Gloria Steinem’s response, when she was forty, to people saying that she didn’t look forty. Steinem replied, “This is what forty looks like.” And this is what being a certified expert in your field feels like. Simmons knows what deserving your degree means, and its standards are quite high. So, congratulations. You truly earned this and deserve it.

But here’s why it’s good to get comfortable with always having a little lack of confidence. First, if you admit no self-doubt, you lose your impulse to learn. Second, you become a smug know-it-all, and no one likes you. Third, and even worse, you become a soldier in the army of ignorance. Your body language tells everyone else that their questions are a sign of weakness, which shuts down what should have been a space for learning.

The one skill I’ve truly mastered is asking stupid questions. And I don’t mean questions that I pretend are stupid but that, like Socrates’s, end up showing the folly of all those around me. No, they’re just dumb questions. Things I really should know by now. And quite often it turns out that I’m not the only one in the room with those questions. I’ve learned far more by being in over my head than by knowing what I’m talking about. And, as I’ll get to, we happen to be in the greatest time for being in over our heads in all of human history.

Let me give you just one quick example. In 1986 I became a marketing writer at a local tech startup called Interleaf that made text-and-graphic word processors. In 1986 that was a big deal, and what Interleaf was doing was waaaay over my head. So, I hung out with the engineers, and I asked the dumbest questions. What’s a font family? How can the spellchecker look up words as fast as you type them? When you fill a shape with say, purple, how does the purple know where to stop? Really basic. But because it was clear that I was a marketing guy who was genuinely interested in what the engineers were doing, they gave me a lot of time and an amazing education. Those were eight happy years being in over my head.
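By the way, the answer to that last question turns out to be wonderfully simple once an engineer walks you through it. Here is a minimal sketch of the flood-fill idea in Python; this is purely illustrative on my part, not Interleaf’s actual code:

    from collections import deque

    def flood_fill(grid, start, fill_color):
        """Spread fill_color from start, stopping at differently colored pixels.

        grid is a list of lists of color values: a toy model of a raster
        image, not Interleaf's implementation.
        """
        rows, cols = len(grid), len(grid[0])
        r0, c0 = start
        target = grid[r0][c0]  # the color being replaced
        if target == fill_color:
            return grid
        queue = deque([start])
        while queue:
            r, c = queue.popleft()
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == target:
                grid[r][c] = fill_color  # paint this pixel...
                queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])  # ...and spread
        return grid

    demo = [[0, 0, 1],
            [0, 1, 1],
            [1, 1, 1]]
    print(flood_fill(demo, (0, 0), 7))  # [[7, 7, 1], [7, 1, 1], [1, 1, 1]]

The purple “knows where to stop” because the fill only spreads into pixels that match the color it started on; boundary pixels differ, so the spread halts there.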

I’m still way over my head in the world of libraries, which are incredibly deep institutions. Compared to “normal” information technology, the data libraries deal with is amazingly profound and human. And librarians have been very generous in helping me learn just a small portion of what they know. Again, this is in part because they know my dumb questions are spurred by a genuine desire to understand what they’re doing, down to the details.

In fact, going down to the details is one very good way to make sure that you are continually over your head. We will never run out of details. The world’s just like that: there’s no natural end to how closely you can look at a thing. And one thing I’ve learned is that everything is interesting if looked at at the appropriate level of detail.

Now, it used to be that you’d have to seek out places to plunge in over your head. But now, in the age of the Internets, all we have to do is stand still and the flood waters rise over our heads. We usually call this “information overload,” and we’re told to fear it. But I think that’s based on an old idea we need to get rid of.

Here’s what I mean. So, you know Flickr, the photo sharing site? If you go there and search for photos tagged “vista,” you’ll get two million photos, more vistas than you could look at if you made it your full time job.

If you go to Google and search for apple pie recipes, you’ll get over 1.3 million of them. Want to try them all out to find the best one? Not gonna happen.

If you go to Google Images and search for “cute cats,” you’ll get over seven million photos of the most adorable kittens ever, as well as some ads and porn, of course, because Internet.

So that’s two million vista photos. 1.3 million apple pie recipes. 7.6 million cute cat photos. We’re constantly warned about information overload, yet we never hear one single word about the dangers of Vista Overload, Apple Pie Overload, or Cute Kitten Overload. How have the media missed these overloads! It’s a scandal!

I think there’s actually a pretty clear reason why we pay no attention to these overloads. We only feel overloaded by that which we feel responsible for mastering. There’s no expectation that we’ll master vista photos, apple pie recipes, or photos of cute cats, so we feel no overload. But with information it’s different because we used to have so much less of it that back then mastery seemed possible. For example, in the old days if you watched the daily half hour broadcast news or spent twenty minutes with a newspaper, you had done your civic duty: you had kept up with The News. Now we can see before our eyes what an illusion that sense of mastery was. There’s too much happening on our diverse and too-interesting planet to master it, and we can see it all happening within our browsers.

The concept of Information Overload comes from that prior age, before we accepted what the Internet makes so clear: There is too, too much to know. As we accept that, the idea of mastery will lose its grip, and we’ll stop feeling overloaded even though we’re confronted with exactly the same amount of information.

Now, I want to be careful because we’re here to congratulate you on having mastered your discipline. And grad school is a place where mastery still applies: in order to have a discipline — one that can talk with itself — institutions have to agree on a master-able set of ideas, knowledge, and skills that are required for your field. And that makes complete sense.

But, especially as the Internet becomes our dominant medium of ideas, knowledge, culture, and entertainment, we are all learning just how much there is that we don’t know and will never know.

And it’s not just the quantity of information that makes true mastery impossible in the Age of the Internet. It’s also what it’s doing to the domains we want to master — the topics and disciplines. In the Encyclopedia Britannica — remember that? — an article on a topic extends from the first word to the last, maybe with a few suggested “See also’s” at the end. The article’s job is to cover the topic in that one stretch of text. Wikipedia has a different idea. At Wikipedia, the articles are often relatively short, but they typically have dozens or even hundreds of links. So rather than trying to get everything about, say, Shakespeare into a couple of thousand words, Wikipedia lets you click on links to other articles about what it mentions — to Stratford-on-Avon, or iambic pentameter, or about the history of women in the theater. Shakespeare at Wikipedia, in other words, is a web of linked articles. Shakespeare on the Web is a web. And it seems to me that that webby structure actually is a more accurate reflection of the shape of knowledge: it’s an endless series of connected ideas and facts, limited by interest, not an article that starts here and ends there. In fact, I’d say that Shakespeare himself was a web, and so am I, and so are you.

But if topics and disciplines are webs, then they don’t have natural and clear edges. Where does the Shakespeare web end? Who decides if the article about, say, women in the theater is part of the Shakespeare web or not? These webs don’t have clearcut edges. But that means that we also can’t be nearly as clear about what it means to master Shakespeare. There’s always more. The very shape of the Web means we’re always in over our heads.

And just one more thing about these messy webs. They’re full of disagreement, contradiction, argument, differences in perspective. Just a few minutes on the Web reveals a fundamental truth: We don’t agree about anything. And we never will. My proof of that broad statement is all of human history. How do you master a field, even if you could define its edges, when the field doesn’t agree with itself?

So, the concept of mastery is tough in this Internet Age. But that’s just a more accurate reflection of the way it always was even if we couldn’t see it because we just didn’t have enough room to include every voice and every idea and every contradiction, and we didn’t have a way to link them so that you can go from one to another with the smallest possible motion of your hand: the shallow click of a mouse button.

The Internet has therefore revealed the truth of what the less confident among us already suspected: We’re all in over our heads. Forever. This isn’t a temporary swim in the deep end of the pool. Being in over our heads is the human condition.

The other side of this is that the world is far bigger, more complex, and more unfathomably interesting than our little brains can manage. If we can accept that, then we can happily be in over our heads forever…always a little worried that we really are supposed to know more than we do, but also, I hope, always willing to say that out loud. It’s the condition for learning from one another…

…And if the Internet has shown us how overwhelmed we are, it’s also teaching us how much we can learn from one another. In public. Acknowledging that we’re just humans, in a sea of endless possibility, within which we can flourish only in our shared and collaborative ignorance.

So, I know you’re prepared because I know the quality of the Simmons faculty, the vision of its leadership, and the dedication of its staff. I know the excellence of the education you’ve participated in. You’re ready to lead in your field. May that field always be about this high over your head — the depth at which learning occurs, curiosity is never satisfied, and we rely on one another’s knowledge, insight, and love.

Thank you.


May 2, 2014

[2b2k] Digital Humanities: Ready for your 11AM debunking?

The New Republic continues to favor articles debunking claims that the Internet is bringing about profound changes. This time it’s an article on the digital humanities, titled “The Pseudo-Revolution,” by Adam Kirsch, a senior editor there. [This seems to be the article. Tip of the hat to Jose Afonso Furtado.]

I am not an expert in the digital humanities, but it’s clear from the people in the field whom I know that the meaning of the term is not yet settled. Indeed, the nature and extent of the discipline is itself a main object of study of those in the discipline. This means the field tends to attract those who think that the rise of the digital is significant enough to warrant differentiating the digital humanities from the pre-digital humanities. The revolutionary tone that bothers Adam so much is a natural if not inevitable consequence of the sociology of how disciplines are established. That of course doesn’t mean he’s wrong to critique it.

But Adam is exercised not just by revolutionary tone but by what he perceives as an attempt to establish claims through the vehemence of one’s assertions. That is indeed something to watch out for. But I think it also betrays a tin-eared reading by Adam. Those assertions are being made in a context the authors I think properly assume readers understand: the digital humanities is not a done deal. The case has to be made for it as a discipline. At this stage, that means making provocative claims, proposing radical reinterpretations, and challenging traditional values. While I agree that this can lead to thoughtless triumphalist assumptions by the digital humanists, it also needs to be understood within its context. Adam calls it “ideological,” and I can see why. But making bold and even over-bold claims is how discourses at this stage proceed. You challenge the incumbents, and then you challenge your cohort to see how far you can go. That’s how the territory is explored. This discourse absolutely needs the incumbents to push back. In fact, the discourse is shaped by the assumption that the environment is adversarial and the beatings will arrive in short order. In this case, though, I think Adam has cherry-picked the most extreme and least plausible provocations in order to argue against the entire field, rather than against its overreaching. We can agree about some of the examples and some of the linguistic extensions, but that doesn’t dismiss the entire effort the way Adam seems to think it does.

It’s good to have Adam’s challenge. Because his is a long and thoughtful article, I’ll discuss the thematic problems with it that I think are the most important.

First, I believe he’s too eager to make his case, which is the same criticism he makes of the digital humanists. For example, when discussing the use of algorithmic tools, he talks at length about Franco Moretti’s work, focusing on the essay “Style, Inc.: Reflections on 7,000 Titles.” Moretti used a computer to look for patterns in the titles of 7,000 novels published between 1740 and 1850, and discovered that they tended to get much shorter over time. “…Moretti shows that what changed was the function of the title itself.” As the market for novels got more crowded, the typical title went from being a summary of the contents to a “catchy, attention-grabbing advertisement for the book.” In addition, says Adam, Moretti discovered that sensationalistic novels tend to begin with “The” while “pioneering feminist novels” tended to begin with “A.” Moretti tenders an explanation, writing “What the article ‘says’ is that we are encountering all these figures for the first time.”

Adam concludes that while Moretti’s research is “as good a case for the usefulness of digital tools in the humanities as one can find” in any of the books under review, “its findings are not very exciting.” And, he says, you have to know which questions to ask the data, which requires being well-grounded in the humanities.

That you need to be well-grounded in the humanities to make meaningful use of digital tools is an important point. But here he seems to me to be arguing against a straw man. I have not encountered any digital humanists who suggest that we engage with our history and culture only algorithmically. I don’t profess expertise in the state of the digital humanities, so perhaps I’m wrong. But the digital humanists I know personally (including my friend Jeffrey Schnapp, a co-author of a book, Digital_Humanities, that Adam reviews) are in fact quite learned lovers of culture and history. If there is indeed an important branch of digital humanities that says we should entirely replace the study of the humanities with algorithms, then Adam’s criticism is trenchant…but I’d still want to hear from less extreme proponents of the field. In fact, in my limited experience, digital humanists are not trying to make the humanities safe for robots. They’re trying to increase our human engagement with and understanding of the humanities.

As to the point that algorithmic research can only “illustrate a truism rather than discovering a truth” — a criticism he levels even more fiercely at the Ngram research described in the book Uncharted — it seems to me that Adam is missing an important point. If computers can now establish quantitatively the truth of what we have assumed to be true, that is no small thing. For example, the Ngram work has established not only that Jewish sources were dropped from German books during the Nazi era, but also the timing and extent of the erasure. This not only helps make the humanities more evidence-based — remember that Adam criticizes the digital humanists for their argument-by-assertion — but also opens the possibility of algorithmically discovering correlations that overturn assumptions or surprise us. One might argue that we therefore need to explore these new techniques more thoroughly, rather than dismissing them as adding nothing. (Indeed, the NY Times review of Uncharted discusses surprising discoveries made via Ngram research.)
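To give a concrete sense of what that sort of evidence looks like: the public Google Books Ngram downloads are just rows of (ngram, year, match count, volume count), so a first pass at spotting a suppression is little more than summing a name’s counts per year and looking for a trough. A minimal Python sketch; the file name and the choice of name to track are my own illustrative assumptions, not the researchers’ actual pipeline:

    import csv
    from collections import defaultdict

    # Each row in the (hypothetical) ngrams_ger.tsv follows the public
    # Google Books Ngram export format:
    # ngram <TAB> year <TAB> match_count <TAB> volume_count
    counts = defaultdict(int)
    with open("ngrams_ger.tsv", newline="", encoding="utf-8") as f:
        for ngram, year, match_count, _volume_count in csv.reader(f, delimiter="\t"):
            if ngram == "Marc Chagall":  # an artist suppressed in Nazi-era print
                counts[int(year)] += int(match_count)

    # A deep trough from 1933 to 1945 relative to the surrounding decades is
    # the quantitative footprint of the erasure.
    for year in range(1920, 1956):
        print(year, counts.get(year, 0))

A real analysis would normalize each year’s count against the corpus’s total words for that year, but even this crude version shows how an assumption becomes a measurable claim.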

Perhaps the biggest problem I have with Adam’s critique I’ve also had with some digital humanists. Adam thinks of the digital humanities as being about the digitizing of sources. He then dismisses that digitizing as useful but hardly revolutionary: “The translation of books into digital files, accessible on the Internet around the world, can be seen as just another practical tool…which facilitates but does not change the actual humanistic work of thinking and writing.”

First, that underplays the potential significance of making the works of culture and scholarship globally available.

Second, if you’re going to minimize the digitizing of books as merely the translation of ink into pixels, you miss what I think is the most important and transformative aspect of the digital humanities: the networking of knowledge and scholarship. Adam in fact acknowledges the networking of scholarship in a twisty couple of paragraphs. He quotes the following from the book Digital_Humanities:

The myth of the humanities as the terrain of the solitary genius…— a philosophical text, a definitive historical study, a paradigm-shifting work of literary criticism — is, of course, a myth. Genius does exist, but knowledge has always been produced and accessed in ways that are fundamentally distributed…

Adam responds by name-checking some paradigm-shifting works, and snidely adds “you can go to the library and check them out…” He then says that there’s no contradiction between paradigm-shifting works existing and the fact that “Scholarship is always a conversation…” I believe he is here completely agreeing with the passage he thinks he’s criticizing: genius is real; paradigm-shifting works exist; these works are not created by geniuses in isolation.

Then he adds what for me is a telling conclusion: “It’s not immediately clear why things should change just because the book is read on a screen rather than on a page.” Yes, that transposition doesn’t suggest changes any more worthy of research than the introduction of mass market paperbacks in the 1940s [source]. But if scholarship is a conversation, might moving those scholarly conversations themselves onto a global network raise some revolutionary possibilities, since that global network allows every connected person to read the scholarship and its objects, lets everyone comment, provides no natural mechanism for promoting any works or comments over any others, inherently assumes a hyperlinked rather than sequential structure of what’s written, makes it easier to share than to sequester works, is equally useful for non-literary media, makes it easier to transclude than to include so that works no longer have to rely on briefly summarizing the other works they talk about, makes differences and disagreements much more visible and easily navigable, enables multiple and simultaneous ordering of assembled works, makes it easier to include everything than to curate collections, preserves and perpetuates errors, is becoming ubiquitously available to those who can afford connection, turns the Digital Divide into a gradient while simultaneously increasing the damage done by being on the wrong side of that gradient, is reducing the ability of a discipline to patrol its edges, and a whole lot more.

It seems to me reasonable to think that it is worth exploring whether these new affordances, limitations, relationships and metaphors might transform the humanities in some fundamental ways. Digital humanities too often is taken simply as, and sometimes takes itself as, the application of computing tools to the humanities. But it should be (and for many, is) broad enough to encompass the implications of the networking of works, ideas and people.

I understand that Adam and others are trying to preserve the humanities from being abandoned and belittled by those who ought to be defending the traditional in the face of the latest. That is a vitally important role, for as a field struggling to establish itself digital humanities is prone to over-stating its case. (I have been known to do so myself.) But in my understanding, that assumes that digital humanists want to replace all traditional methods of study with computer algorithms. Does anyone?

Adam’s article is a brisk challenge, but in my opinion he argues too hard against his foe. The article becomes ideological, just as he claims the explanations, justifications and explorations offered by the digital humanists are.

More significantly, focusing only on the digitizing of works and ignoring the networking of their ideas and the people discussing those ideas, glosses over the locus of the most important changes occurring within the humanities. Insofar as the digital humanities focus on digitization instead of networking, I intend this as a criticism of that nascent discipline even more than as a criticism of Adam’s article.


November 14, 2013

[2b2k] No more magic knowledge

I gave a talk at the EdTechTeacher iPad Summit this morning, and felt compelled to throw in an Angry Old Man slide about why iPads annoy me, especially as education devices. Here’s my List of Grievances:

  • Apple censors apps

  • iPads are designed for consumers. [This is false for these educators, however. They are using iPad apps to enable creativity.]

  • They are closed systems and thus lock users in

  • Apps generally don’t link out

That last point was the one that meant the most in the context of the talk, since I was stressing the social obligation we all have to add to the Commons of ideas, data, knowledge, arguments, discussion, etc.

I was sorry I brought the whole thing up, though. None of the points I raised is new, and this particular audience is using iPads in creative ways, to engage students, to let them explore in depth, to create, and to make learning mobile.

Nevertheless, as I was talking, I threw in one more: you can’t View Source the way you can in a browser. That is, browsers let you see the code beneath the surface. This capability means you can learn how to re-create what you like on pages you visit…although that’s true only to some extent these days. Nevertheless, the HTML code is right there for you. But not with apps.

Even though very few of us ever do peek beneath the hood — why would we? — the fact that we know there’s an openable hood changes things. It tells us that what we see on screen, no matter how slick, is the product of human hands. And that is the first lesson I’d like students to learn about knowledge: it often looks like something that’s handed to us finished and perfect, but it’s always something that we built together. And it’s all the cooler because of that.

There is no magic, just us humans as we move through history trying to make every mistake possible.


November 7, 2013

What I want those Google barges to be

We now know that the Google barges are “interactive learning spaces.” That narrows the field. They’re not off-shore data centers or Google Glass stores. They’re also not where Google keeps the porn (as Seth Meyers reported) and they’re not filled with bubblewrap for people to step on, although that would be awesome.

So here’s my hope for what “interactive learning spaces” means: In your face, Apple Store!

Apple Stores manifest Apple’s leave-no-fingerprints consumerist ideal. Pure white, squeaky clean, and please do come try out the tools we’ve decided are appropriate for you inferior Earth creatures.

Google from the beginning has manifested itself as comfortable with the messy bustle of the Net, especially when the bustlers are hyper-geeky middle class Americans.

So, I’m hoping that the “interactive learning spaces” are places where you can not only get your email on a Chromebook keyboard, play a game on an Android tablet, and take a class in how to use Google Glass, but also actually build stuff, learn from other “customers,” and hang out because the environment itself — not just the scheduled courses — is so stimulating and educational. Have hackathons there, let the community schedule classes and talks, make sure that Google engineers hang out there and maybe even do some of their work there. Open bench everything!

That’s what I hope. I look forward to being disappointed.


November 6, 2013

[2b2k] Is the Net shortcutting our kids out of learning?

I was invited to give a talk yesterday afternoon to the faculty at Brookline High School where all three of our children were educated, and that graduated my wife and both of her parents. Furthermore, the event was held in the Black Box, a performance space I watched our youngest child perform in many times. (Go T-Tones!) So, it was thrilling and quite intimidating, even though the new headmaster, Deb Holman [twitter: bhsheadmaster] could not be more welcoming and open.

There were some great (= hard) questions, and a lot of skepticism about my comments, but not all that much time to carry on a conversation. After most people left, a couple of teachers stayed to talk.

One said that she thoroughly disagrees with my generally positive characterization of the Internet. In her experience, it is where children go to get quick answers. Rather than provoking them and challenging them, the Net lets them get instant gratification, and shuts down their curiosity.

We talked for a while. Her experience certainly rings true. After all, I go to the Net for quick answers also, and if I had to write an assignment on, say, The Great Gatsby, and I wanted to finish it before The Walking Dead comes on, I’d be out on the Net. And I’d get it done much faster than in the old days when I’d have to go to the library.

I’m still not sure what to make of this phenomenon. Did the old library experience of looking things up in the card catalog or in the Periodical Index make me any more thoughtful than googling does now? In fact, I’m more likely to see more ideas and opinions on the Net than in a trip to the library. On the other hand, the convenience of the Net means that I can just look up some ideas rather than having to work through them myself; the Net is letting students short-circuit the process of forming ideas. Perhaps the old difficulty of accessing materials added friction that usefully slowed down thought. I don’t know. I don’t feel that way about my own experience, but I am not a high school student, and I’m pretty self-deluding to begin with.

Anyway, that’s pretty much the issue the second teacher brought up after the talk. Keep in mind that BHS has an extraordinary set of teachers, always caring and frequently quite inspiring. She is in the School Within a School, which is more loosely structured than the rest of BHS. When she gives writing assignments, she tells her students to come up with an idea that will surprise her, and to express it in their own voice. Very cool.

Her concern is that the jangle of the Net keeps students from mulling over ideas. Thought comes from a private and individual place, she believes, and students need that stillness and aloneness.

I can’t disagree with her. I want students to understand — to experience — the value of solitude and quiet, and to have internalized enough information that they can have it at hand to play with and synthesize. And yet…

…I’m not convinced that private thought is realest thought. I know that who I am when I’m alone doesn’t feel more real than when I am with others, and in many ways feels less authentic; I’ve written before about the inner narrator who accompanies me when I visit someplace new alone, making me feel more crazy than authentic. In a similar way, I’m not ready to accept that private thinking is the best thinking or the most authentic thinking. It has its place, of course, but personally (data point of one!) I think best when engaged with others, or when I’m writing while imagining my words engaging with others.

We have, it seems to me, overvalued private thinking, which is certainly not to say that it has no value. We have likewise undervalued social thinking. But now we think in public, out loud, with others. Most of our public engagements of course are not particularly deep or thoughtful in any normal use of the term. That’s why we need to be educating our children to appreciate thinking out loud with others, and teaching them how to do it. It’s in these public multi-way discussions that ideas and knowledge develop.

While there are many ways in which public thinking can go wrong, it has the advantage of revealing the mechanisms of knowledge in all their fallibility. We are still carrying over the cultural wish for black box authorities whom we can trust simply because they were the ones who said it. We need to steer our children away from that wish for inhuman knowledge, and thus toward recognizing how ideas and knowledge actually develop. Public thinking does that. At least it should. And it will do it more if our children learn to always wonder how knowledge has been brought forward. Especially when the ideas seem so obvious.

This is one reason I find the “flipped classroom” idea so interesting. (Good discussion of this yesterday on On Point.) I was asked yesterday what I’d like BHS to do if I could have it do anything. I answered rather badly, but part of it would have to be that students learn how to engage with one another socially so that they build knowledge together, and this knowledge tolerates disagreement, is assumed to be public, and is aware of itself as a product of social engagement. Of course that happens already in classrooms — and more so (presumably) in flipped classrooms — but we should be preparing our students for doing this virtually as well as in real space because the “real” discussions will increasingly be online where there is a wealth of sources to draw upon and to argue about.

But it’s hard to see how we get there so long as we continue to assign papers and reports as the primary type of knowledge artifact, isn’t it? (I’m not even going to mention standardized testing.) Doing so implicitly tells students that knowing is what you do alone: foraging sources, coming back with useful bits, and then engaging in an internal thought process that renders them into one of the conventional written forms. In that frame, the Net looks like an uncurated library, overflowing with lies, studded with occasional truths.

Instead, students could be required to explore a topic together, in public (or at least in the protected public of their class), discussing, arguing, joking, and evaluating one another’s sources. In that frame, the Net looks like a set of discussions, not an information resource at the end of the Information Highway. After all, kids don’t come into a class interested in The Great Gatsby. The teacher will help them to see what’s interesting about the novel, which is crucial and not easy to do. But primarily we get interested in things through one another. My interest steers yours, and yours amplifies mine. Our interest in The Great Gatsby is mediated and amplified by our interest in one another. We make the world interesting together. The Net does this all the time. Papers and reports rarely do. In their pursuit of demonstrating mastery, they too often drive the interest right out of the topic — less so at a wonderful school like BHS where teachers ask students to write in their own voice and come up with ideas that surprise them both.

Anyway, I came out of the session very stimulated, very thankful that so many of my relatives had the great good luck to attend that institution, and ever thankful to our teachers.


May 15, 2013

[meshcon] Ryan Carson of Treehouse

Ryan Carson [twitter:RyanCarson] of Treehouse is keynoting the Mesh Conference. He begins his introduction of himself by saying he is a father, which I appreciate. Treehouse is an “online education company that teaches technology. We hope we can remove the need to go to university to do technology.”

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

Treehouse “treasures personal time.” They work a 4-day week, 8 hours a day, although they pay for a full 40-hour week. He asks how many people in the audience work for themselves or run their own company; half the people raise their hands. “We have a fundamental belief that people can work smarter, and thus faster…We use a lot of tools that decrease drag.” E.g., they have an internal version of Reddit called “Convoy.” It keeps conversation out of email. “We ask people to never put anything in email that isn’t actionable.” A 4 day week also makes recruiting easy.

“As a father, I realize I’m going to die, sooner rather than later. If I work four days a week, I can spend 50% more of my life with my wife and kids.”

Q: Why not a 3 day week?

A: It’s a flag to say “We believe personal time is important.” We’ll do whatever we have to. I’ve told people not to send email over the weekend because it makes work for others.

Q: How about flex time instead?

A: We have tried that, and we let people work from home. “People are smart and motivated and want to succeed. We presume that about people.” We’re demanding, and we’ll fire people if they don’t perform. But you have to institute practices, and not just say that you believe in personal time.

Q: Do you have investors? How do they respond?

A: We have $12M in investment. But we didn’t raise money until after we were profitable. I used my experience running 3 prior companies to give investors confidence. And no one asked about the 4 day week. It doesn’t seem to matter to them. My prior company was an events company and it got bought by a company that worked 5 days a week, and it was messy. I think our team there is now working 5 days.

Q: How do you provide 7 day a week support?

A: Our support team time shifts.

Q: How do you control email so that it’s only actionable?

A: It’s a policy. Also, we use Boomerang which lets us schedule when email is sent.

Now Ryan talks about the tools they use to facilitate a distributed team: about 30 people in Orlando, 8 in Portland, and the rest are distributed in the US and UK. “We don’t have a headquarters.” We are an Internet company. We use Convoy: part water cooler, part news distribution. Notes from meetings go there. It took a dev about a day to create Convoy.

We also use Campfire, a chat program. And Trello for task management. And Google Hangouts. (He notes that you have to be wired, not wifi, and have good gear, for Hangouts to work well.)

Q: Do you have to work over the weekend when there’s a hard deadline? And do you put more of an emphasis on planning?

A: Yes, we sometimes have worked over the weekend. And we’ve sometimes had a problem with people working too much. I think some people work without telling us, especially developers and designers. But if they have to work, their managers have failed. And it does mean we have to plan carefully.

Q: What are your annual meetups like?

A: It’s a full week. No agenda, no working. Pure get drunk, have fun. People work much harder if they like each other and believe in each other.

Now on education. By 2020, there will be 1,000,000 more jobs in tech than students to fill them. Nine out of ten high schools don’t even offer computer programming classes. [Really? Apparently so. Wow.] Treehouse tries to address this, along with Udacity, Codecademy, Code School. In a video, Ryan says that Treehouse will cost you about $300 for an entire course of tech education, making you ready to enter the workforce. “The education system is a racket. Universities have milked us dry for ten years.” 40% of jobs in STEM are in computer science, but only 2% of STEM students are studying it. “In 41 out of 50 states coding classes don’t count toward high school graduation math or science requirements.” “In the future, most students won’t get a four year degree, and I think that’s a good thing. We are moving toward a trade school model.”

Q: Many companies use college degrees as a filter. How do you filter?

A: In 5 yrs there won’t be enough graduates for you to hire anyone because Google and FB will pay them $500,000/year. At Treehouse we apply points. You can see someone’s skills.

Q: What will people miss out on if they don’t go to college?

A: People will miss out on the social aspect, but people can’t afford to go into debt for that. College as the next step is a new idea in the past 15 years. [Really?] You’ll have free liberal arts education available through free online courses. You’ll pay for trade school training. “We’ll just have to have faith that people can be responsible adults without going to university.”

Q: How do you help people who complete your courses find job?

A: We’re rolling out an entire department for this. As you learn on Treehouse, you get points and start to establish your rank. Employers will be able to search our database saying, e.g., “I want someone with over 1,000 points in CSS, 800 points in Javascript, and 500 points in business.”
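[To make that kind of query concrete, here is a toy sketch in Python; the field names, thresholds, and data are my own inventions for illustration, not Treehouse’s actual schema:]

    # Hypothetical candidate records; Treehouse's real data model is not public.
    candidates = [
        {"name": "A. Rivera", "css": 1250, "javascript": 900, "business": 610},
        {"name": "B. Chen", "css": 800, "javascript": 1100, "business": 450},
    ]

    # "Over 1,000 points in CSS, 800 in JavaScript, 500 in business."
    matches = [c for c in candidates
               if c["css"] > 1000 and c["javascript"] > 800 and c["business"] > 500]
    print([c["name"] for c in matches])  # ['A. Rivera']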

Q: How are you going to mesh these ideas into traditional education?

A: Sub-par universities will die. Education will be completely different in 10 years. We don’t know what it will be.

Ryan says that he’s not doing this for the money. “People who need education can’t afford it.”

[Judy Lee tweeted that Ryan should have asked us how many in the audience have a university degree, and how many of us regret it. Nice.]


March 28, 2013

[annotation][2b2k] Phil Desenne on Harvard annotation tools

Phil Desenne begins with a brief history of annotation tools at Harvard. There are a lot, for annotating everything from texts to scrolls to music scores to video. Most of them are collaborative tools. The collaborative tool has gone from Adobe AIR to Harvard iSites, to open source HTML 5. “It’s been a wonderful experience.” It’s been picked up by groups in Mexico, South America and Europe.

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

Phil works on edX. “We’re beginning to introduce annotation into edX.” It’s being used to encourage close reading. “It’s the beginning of a new way of thinking about teaching and assessing students.” Students tag the text, which “is the beginning of a semantic tagging system…Eventually we want to create a semantic ontology.”

What are the implications for the “MOOC Generation”? MOOC students are out finding information anywhere they can. They stick within a single learning management system (LMS). LMS’s usually have commentary tools “but none of them talk with one another. Even within the same LMS you don’t have cross-referencing of the content.” We should have an interoperable layer that rides on top of the LMS’s.

Within edX, there are discussions within classes, courses, tutorials, etc. These should be aggregated so that the conversations can reach across the entire space, and, of course, outside of it. edX is now working on annotation systems that will do this. E.g., imagine being able to discuss a particular image or fragments of videos, and being able to insert images into streams of commentary. Plus analytics of these interactions. Heatmaps of activity. And a student should be able to aggregate all her notes, journal-like, so they can be exported, saved, and commented on. “We’re talking about a persistent annotation layer with API access.” “We want to go there.”

For this we need stable repositories. They’ll use URNs.
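To picture what such an interoperable layer might store, here is a hedged sketch of a single annotation record in Python, loosely modeled on the Open Annotation vocabulary that efforts like this drew on; the identifiers, URNs, and fragment values are invented for illustration, not edX’s actual schema:

    import json

    # One annotation pinned to a 30-second stretch of a video. Loosely modeled
    # on the Open Annotation / W3C Web Annotation vocabulary; every identifier
    # below is hypothetical.
    annotation = {
        "@context": "http://www.w3.org/ns/anno.jsonld",
        "id": "urn:example:annotation:42",
        "type": "Annotation",
        "creator": "urn:example:student:jdoe",
        "body": {
            "type": "TextualBody",
            "value": "This claim contradicts the reading assigned in week 3.",
            "purpose": "commenting",
        },
        "target": {
            "source": "urn:example:video:intro-lecture",  # stable URN, not a fragile URL
            "selector": {
                "type": "FragmentSelector",
                "value": "t=120,150",  # seconds 120-150 of the video
            },
        },
    }

    print(json.dumps(annotation, indent=2))

Because the target is a stable URN plus a selector rather than a page inside one LMS, the same record can be aggregated, exported, and cross-referenced across systems, which is the point of the layer Phil describes.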


March 6, 2013

[2b2k] Cliff Lynch on preserving the ever-expanding scholarly record

Cliff Lynch is giving a talk this morning to the extended Harvard Library community on information stewardship. Cliff leads the Coalition for Networked Information, a project of the Association of Research Libraries and Educause, that is “concerned with the intelligent uses of information technology and networked information to enhance scholarship and intellectual life.” Cliff is helping the Harvard Library with the formulation of a set of information stewardship principles. Originally he was working with IT and the Harvard Library on principles, services, and initial projects related to digital information management. Given that his draft set of principles are broader than digital asset management, Cliff has been asked to address the larger community (says Mary Lee Kennedy).

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

Cliff begins by saying that the principles he’s drafted are for discussion; how they apply to any particular institution is always a policy issue, with resource implications, that needs to be discussed. He says he’ll walk us through these principles, beginning with some concepts that underpin them.

When it comes to information stewardship, “university community” should include grad students whose research materials the university supports and maintains. Undergrads, too, to some extent. The presence of a medical school here also extends and smudges the boundaries.

Cliff then raises the policy question of the relation of the alumni to the university. There are practical reasons to keep the alumni involved, but particularly for grads of the professional schools, access to materials can be crucial.

He says he uses “scholarly record” for human-created things that convey scholarly ideas across time and space: books, journals, audio, web sites, etc. “This is getting more complicated and more diverse as time goes on.” E.g., author’s software can be part of that record. And there is a growing set of data, experimental records, etc., that are becoming part of the scholarly record.

Research libraries need to be concerned about things that support scholarship but are not usually considered part of the historical record. E.g., newspapers, popular novels, movies. These give insight into the scholarly work. There are also datasets that are part of the evidentiary record, e.g., data about the Earth gathered from sensors. “It’s so hard to figure out when enough is enough.” But as more of it goes digital, it requires new strategies for acquisition, curation and access. “What are the analogs of historical newspapers for the 21st century?” he asks. They are likely to be databases from corporations that may merge and die and that have “variable and often haphazard policies about how they maintain those databases.” We need to be thinking about how to ensure that data’s continued availability.

Provision of access: Part of that is being able to discover things. This shouldn’t require knowing which Harvard-specific access mechanism to come to. “We need to take a broad view of access” so that things can be found through the “key discovery mechanisms of the day,” beyond the institution’s. (He namechecks the Digital Public Library of America.)

And access isn’t just for “the relatively low-bandwidth human reader.” [API's, platforms and linked data, etc., I assume.]

Maintaining a record of the scholarly work that the community does is a core mission of the university. So, he says, in his report he’s used the vocabulary of obligation; that is for discussion.

The 5 principles

1. The scholarly output of the community should be captured, preserved, organized, and made accessible. This should include the evidence that underlies that output. E.g., the experimental data that underlies a paper should be preserved. This takes us beyond digital data to things like specimens and cell lines, and requires including museums and other partners. (Congress is beginning to delve into this, Cliff notes, especially with regard to preserving the evidence that enables experiments to be replicated.)

The university is not alone in addressing these needs.

2. A university has the obligation to provide its community with the best possible access to the overall scholarly record. This is something to be done in partnership with research libraries around the world. But Harvard has a “leadership role to play.”

Here we need to think about providing alumni with continued access to the scholarly record. We train students and then send them out into the world and cut off their access. “In many cases, they’re just out of luck. There seems to be something really wrong there.”

Beyond the scholarly record, there are issues about providing access to the cultural record and sources. No institution alone can do this. “There’s a rich set of partnerships” to be formed. It used to be easier to get that cultural record by buying it from book jobbers, DVD suppliers, etc. Now it’s data with differing license terms and subscription limitations. A lot of it is out on the public Web. “We’re all hoping that the Internet Archive will do a good job,” but most of our institutions of higher learning aren’t contributing to that effort. Some research libraries are creating interesting partnerships with faculty, collecting particular parts of the Web in support of particular research interests. “Those are signposts toward a future where the engagement to collect and preserve the cultural records scholars need is going to get much more complex” and require much more positive outreach by libraries, and much more discussion with the community (and the faculty in particular) about which elements are going to be important to preserve.

“Absolutely the desirable thing is to share these collections broadly,” as broadly as possible.

3. “The time has come to recognize that good stewardship means creating digital records of physical objects” in order to preserve them and make them accessible. They should be stored away from the physical objects.

4. A lot goes on here in addition to faculty research. People come through putting on performances, talks, colloquia. “You need a strategy to preserve these and get them out there.”

“The stakes are getting much higher” when it comes to archives. The materials are not just papers and graphs. They include old computers and storage materials, “a microcosm of all of the horrible consumer recording technology of the 20th century,” e.g., 8mm film, Sony Betamax, etc.

We also need to think about what to archive of the classroom. We don’t have to capture every calculus discussion section, but you want to get enough to give a sense of what went on in the courses. The documentation of teaching and learning is undergoing a tremendous change. The new classroom tech and MOOCs are creating lots of data, much of it personally identifiable. “Most institutions have little or no policies around who gets to see it, how long they keep it, what sort of informed consent they need from students.” It’s important data and very sensitive data. Policy and stewardship discussions are need. There are also record management issues.

5. We know that scholarly communication is being transformed (not as fast as some of us would like — online scientific journals often look like paper versions) by the affordances of digital technology. “Create an ongoing partnership with the community and with other institutions to extend and broaden the way scholarly communication happens.” The institutional role is terribly important in this. We need to find the balance between innovation and sustainability.

Q&A

Q: Providing alumni with remote access is expensive. Harvard has about 100,000 living alumni, which includes people who spent one semester here. What sort of obligation does a university have to someone who, for example, spent a single semester here?

A: It’s something to be worked out. You can define alumnus as someone who has gotten a degree. You may ask for a co-payment. At some institutions, active members of the alumni association get some level of access. Also, grads of different schools may get access to different materials. Also, the most expensive items are typically those for which there are a commercial market. For example, professional grade resources for the financial industry probably won’t allow licensing to alumni because it would cannibalize their market. On the other hand, it’s probably not expensive to make JSTOR available to alumni.

Q: [robert darnton] Very helpful. We’re working on all 5 principles at Harvard. But there is a fundamental problem: we have to advance simultaneously on the digital and analog fronts. More printed books are published each year, and the output of the digital increases even faster. The pressures on our budget are enormous. What do you recommend as a strategy? And do you think Harvard has a special responsibility since our library is so much bigger than any except the Library of Congress? Smaller libraries can rely on Hathi etc. to acquire works.

A: “Those are really tough questions.” [audience laughs] It’s a large task but a finite one. Calculating how much money would take an institution how far “is a really good opportunity for fund raising.” Put in place measures that talk about the percentage of the collection that’s available, rather than a raw number of images. But, we are in a bad situation: continuing growth of traditional media (e.g., books), enormous expansion of digital resources. “My sense is…that for Harvard to be able to navigate this, it’s going to have to get more interdependent with other research libraries.” It’s ironic, because Harvard has been willing to shoulder enormous responsibility, and so has become a resource for other libraries. “It’s made life easier for a lot of the other research libraries” because they know Harvard will cover around the margins. “I’m afraid you may have to do that a little more for your scholars, and we are going to see more interdependence in the system. It’s unavoidable given the scope of the challenge.” “You need to be able to demonstrate that by becoming more interdependent, you’re getting more back than you’re giving up.” It’s a hard core problem, and “the institutional traditions make the challenge here unique.”


February 5, 2013

[berkman] Diana Kimball: Coding as a Liberal Art

Diana Kimball [twitter:dianakimball] is giving a Berkman lunchtime talk on coding as a liberal art. She’s a Berkman Fellow and at the Harvard Business School. (Here are some of her posts on this topic.)

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

She says that she’s loved computers since she was a kid. But when she went to Harvard as an undergrad she decided to study history, in part because there’s a natural specialization that happens in college: the students who come in as coders are fantastic at coding, whereas Diana had greater strengths as a writer of prose. She found HTML and programming intimidating. But in her third year, she got interested in coding and Internet culture. She was one of the founders of ROFLcon [yay!]. She got hired by Microsoft after college, as a technical product manager with the PowerPoint team in Silicon Valley. “This was culture shock in the best possible way.”

When she graduated in 2009, she and some friends started the SnarkMarket blog that considers what the new liberal arts might be (inspired by Kottke). She wrote an essay that’s a proposal for coding and decoding. She reads it. (It’s short.) An excerpt:

Coding and Decoding is about all modes of communication, and all are in its view. But it is built with particular attention to the future, and what that future will be like. Technological experts can seem like magicians, conjuring effects wordlessly. By approaching that magic as a collection of component parts instead of an indivisible miracle, we can learn to see through these sleights of typing hands. In seeing through, we will learn to perform them ourselves; and think, as magicians, about the worlds we will build.

Language, now, is about more than communication. It is the architecture behind much of what we experience. Understanding that architecture will allow us to experience more.

Her boyfriend taught her how to code. They spent a lot of time on it. “He picked up on something I’d said and took it seriously.” After two years at Microsoft, she was enthusiastic, but still a beginner. It wasn’t until she started at Harvard Business School that coding really took off for her. The entrepreneurial atmosphere encouraged her to just do it. Plus, she was more of a geek than most of the other students. “This was great for my identity, and for my confidence.” She also found it a social refuge. “It takes a lot of time to get over the hump.” She refers to Ellen Ullman’s “Close to the Machine” that talks about the utility of being arrogant enough to obsess over a project, cycling back to humility.

She decided to code up her own site for a project for school, even though the team had been given the money to hire devs for the task. Last fall she took the famous CS50 course [Harry Lewis, who created the course in about 1981, is sitting next to me.] CS50 teaches C, targeted at people who are either taking only that one class, or are going to take many more. For her final project, she built something that used multiple APIs, and she was very proud of it. She’s also proud of her Ruby projects folder. Each project is something she was trying to teach herself. She’s more proud of the list than the finished products.

“Learning to code means reclaiming patience and persistence and making them your stubborn own.” [nice]

Ideally, everyone should be exposed to programming, starting at 5 yrs old, or even earlier, Diana says. Seymour Papert’s “Mindstorms” has greatly influenced her thinking about how coding fits into education and citizenship. At a university, it ought to be taken as a liberal art. She quotes Wikipedia’s definition. And if “grammar, rhetoric, and logic were the core of the liberal arts,” then that sounds like coding. [Hmm.] What the law was to the liberal arts, programming ought to be, i.e., that which you try if you don’t know what else to do with your liberal arts degree.

Why isn’t it seen that way? When computer scientists teach you, they teach the way they learned: at school. But many of the best programmers are self-taught. CS50 does give a variety of assignments, but it’d be better if students solved their own problems much earlier.

But the number one problem is the academic attitude, she says. Students get fixated on the grade, even when it doesn’t matter. Coding is critical for children because debugging is part of it, as Papert says. But grades are based on the endpoints. Coding is much more like life: You’re never done, you can always make it better.

Diana has a proposal. Suppose coding classes were taught like creative writing workshops. Take one whenever you’re ready. Taught by hackers, especially autodidacts. It’d vary in substance — algorithms, APIs, etc. — and you’d get to choose. You’d get to see something on screen that you’d never seen before. And you’d be evaluated on ingenuity and persistence, rather than only on how well your code runs.

She describes what her syllabus would look like:

“Coding should be taught in the same breath as expository writing… Everyone deserves to be exposed to it.” She’s not sure if it should be required.

She quotes Papert: “…the most powerful idea of all is the idea of powerful ideas.” There’s no better example of this, she says, than open source software. And David Foster Wallace’s commencement address: “Learning how to think really means learning to exercise some control over how and what you think…If you cannot exercise this sort of choice in adult life, you will be totally hosed.” Diana says that’s her. She was wrapped up in writing from an early age. She has a running internal commentary. [Join the club!] Coding swaps in a different monologue, one in which she’s inventing things. That’s the greatest gift: her internal monologue is much more useful and interesting. “If you wanted to be a novelist in 1900, you’d want to be a programmer today.” The experience of creating something that people use is so motivating.

Q&A

Q: Would you be willing to webcast yourself programming and let people join in? I do this all the time when I’m at hackathons. I think, OMG, there must be 10,000 kids in India who want to be here. And so here they are. “Hackers at Berkeley” does this really well.

A: That’s awesome. I want more people to have more access to that experience of sharing.

Q: Are you familiar with RailsBridge — non-computer scientists who are teaching themselves how to code via weekend workshops?

A: RailsBridge is extraordinary. It’s great to see this happening outside of the university context.

Q: [me] Great talk, and I’m a humanities major who spends most of his hobby time programming. But aren’t you recommending the thing that you happen to love? And programming as opposed to traditional logic is an arbitrary set of rules…

A: Yes, but it would be really useful if more people loved it. We could frame it in a way that is exciting for humanities majors. I’m proposing an idea rather than making an airtight argument. “You’re basically right but I don’t really care” (she says, laughing :).

Q: I like your idea of teaching it like a writers workshop so that it doesn’t turn into just another course. But I’m not sure that colleges are the best at doing that.

A: Not everyone loves programming.

Q: [Harry Lewis] I take responsibility for eliminating the Harvard requirement for a programming course. Also, take a look at code.org. Third, the academic world treats computer science the way it does because of our disciplinary specialization. That label — computer science — came about in the context of fields like political science, and arose when computers were used not for posting web sites but for putting people on the Moon, where a bug could kill someone. The fact that CompSci exists in academic departments will make it very difficult for your vision of computing to exist, just as creative writing is often an uneasy fit in English curricula.

A: That’s very fair. I know it’d be hard. RIT has separate depts for CompSci and coding.

Q: There’s an emergent exploration of coding in arts schools, with a much more nimble, plug-and-play approach, very similar to the one you describe. My question: What do the liberal arts have to offer coding? Much of coding is quite new, e.g., open source. These could be understood within a historical context. Maybe these need to be nurtured, explored, broken. Does seeing coding as a liberal art have something to offer software development?

A: ITP is maybe the best example of artists working with coders. The liberal arts can teach programmers so much!

Q: Can we celebrate failure? That’d be a crucial part of any coding workshop.

A: Yes! Maybe “find the most interesting bug” and reward introspection about where you’ve gone wrong. But it’s hard in a class like CS50 where you’re evaluating results.

Q: This is known as egoless programming. It’s 40 years old, from Gerald Weinberg [no relation].

Q: You’re making a deeper point, which is not just about coding. The important thing is not the knowledge you get, but the way you get there: being self-reflective about how you came to learn what you’ve learned. You can do this with code, but with anything else too.

A: You’re so right. Introspection about the meta-level of learning is not naturally part of a course. But Ruby is an introspective language: you can ask any object what it is, and it will tell you. This is a great mirror for trying to know yourself better.
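
To make that concrete, here’s a minimal illustration of Ruby’s introspection (my example, not one from the talk; exact output varies by Ruby version):

    x = 42
    puts x.class                       # => Integer (Fixnum on older Rubies)
    puts x.respond_to?(:even?)         # => true: ask what messages it answers

    class Course
      def initialize(title)
        @title = title                 # one piece of internal state
      end
    end

    c = Course.new("CS50")
    puts c.class                       # => Course: the object says what it is
    puts c.instance_variables.inspect  # => [:@title]: and what it's made of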

Q: What would you pick to teach?

A: I love Ruby. It would be a good choice because there’s a supportive community so students can learn on their own afterwards, and it’s an introspective language. And the lack of ornament in Ruby (no curly braces and little punctuation) makes it much more like English. The logic is much more visible. (My preference is Sinatra, not Rails.)
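
For a sense of that lack of ornament, here’s a complete Sinatra app (my sketch, not Diana’s; it assumes the sinatra gem is installed):

    require 'sinatra'

    # A whole web app: each route reads almost like an English sentence.
    get '/' do
      "Hello, liberal arts!"
    end

    get '/greet/:name' do
      "Hello, #{params[:name]}!"
    end

Save it as app.rb, run “ruby app.rb”, and visit http://localhost:4567.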

Q: What sort of time commitment would the average person have to put in to get a basic grasp of a programming language? And does it differ for adults vs. children?

A: I’d love to see research on this. [Audience: Rottmeyers, CMU (?)] A friend of mine reported spending 20 hours. The learning curve is very halting at first. It’s hard to teach yourself; it helps to have a supportive in-person environment. CS50 is a 10-20 hour/week commitment, and who has that sort of time except full-time students? To teach yourself, start out a few hours at a time.

Q: How about where the MOOCs are going? Can you do a massive open online course in CompSci that would capture some of what you’re talking about?

A: The field is so focused on efficiency that MOOCs seem like an obvious idea. I think that a small workshop is the right way to start. CS50 involves so much fear of failure, and demands so much resilience, that it wouldn’t have been a good way for me to start. And at CS50, you can’t let others read your code.

Q: We shouldn’t lump together computer science and programming. Programming is just a layer of expression on top of computer science. You don’t need CompSci to become a programmer. And the Net is the new computer; we’re gluing together services from across the Net. That will change how people think about programming because everyone will be able to do it. The first language everyone should learn is ifttt.com.
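
That gluing can be surprisingly small. Here’s a hedged sketch in Ruby (stdlib only, Ruby 2.4 or later; both service URLs are hypothetical stand-ins, not real endpoints):

    require 'net/http'
    require 'json'
    require 'uri'

    # Pull structured data from one service...
    weather = JSON.parse(Net::HTTP.get(URI("https://api.example.com/weather?city=boston")))

    # ...and, if a condition holds, push a message to another.
    if weather["forecast"] == "rain"
      Net::HTTP.post(URI("https://hooks.example.com/notify"),
                     { text: "Bring an umbrella" }.to_json,
                     "Content-Type" => "application/json")
    end

That if-this-then-that shape is exactly what ifttt.com packages up without any code at all.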

Q: I’m a NY Times journalist. I love languages. And I love the analogy you draw. I’m 30. Do you think coding is really essential? Would it open my eyes as a journalist?

A: It’s never too late. If you keep asking the question, you should probably do it. You don’t have to be good at it to get a lot out of it. It’s so cool that your children are learning multiple languages including coding. Learn alongside them.

November 16, 2012

[2b2k] MOOCs as networks

Siva Vaidhyanathan [twitter: sivavaid] has a really well-done (as usual) article that reminds us that for all the excitement about Massive Open Online Courses — which he shares — we still have to figure out how to do them right. There are lots of ways to go wrong. (And I should note here that I’m posting this in order to: (1) recommend Siva’s article, and (2) make an obvious point about MOOCs. Feel free to stop here.)

The fundamental issue, of course, is that real-world ed doesn’t scale very well. The largest classes in the real world are in the hundreds (oh, maybe some school has a course with thousands), and those classes are generally not held up as paradigms of Western ed. Further, traditional ed doesn’t scale in the sense that not everyone gets to go to college.

So, now we have a means for letting classes get very big indeed. Hundreds of thousands. Put in the terms of Too Big to Know, the question is: how do you make that enormous digital classroom smarter than the individuals in it? 2B2K’s answer (such as it is) is that you make a room smart by enabling its inhabitants to create a knowledge network.

  • Such a network would, at a minimum, connect all the participants laterally, as well as involve the teacher.

  • It would encourage discussion of course topics, but welcome discussions that go off topic and engage students socially.

  • It would enable the natural experts and leaders among the students to emerge.

  • It would encourage links within and outside of the course network.

  • This network would enable students to do their work online and together, and make those processes and their traces fully available to the public.

  • All the linking, discussions, answered questions, etc., would be fed back into the system, making it available to everyone. (This assumes there are interactions that produce metadata about which contributions are particularly useful; see the sketch after this list.)

  • It would encourage (via software, norms, and evaluations) useful disagreements and differences. It doesn’t always try to get everyone onto exactly the same page. Among other things, this means tolerating — appreciating and linking to — local differences among the students.

  • It would build upon the success of existing social tools, such as liking, thumbs upping, following…

  • Students would be encouraged to collaborate, rather than being evaluated only as individual participants.

  • The learning process would result in a site that has continuing value to the next students taking the course and to the world.
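
To make that metadata feedback loop a bit more concrete, here’s a deliberately tiny sketch (mine, purely illustrative, in Ruby) of contributions whose usefulness signals feed back into what the network surfaces:

    # A contribution (post, answered question, link) plus usefulness metadata.
    Contribution = Struct.new(:author, :text, :votes) do
      def useful!          # a classmate clicks "thumbs up"
        self.votes += 1
      end
    end

    network = [
      Contribution.new("amy", "Here's a worked example of problem 3", 0),
      Contribution.new("raj", "This off-topic Papert thread is great", 0),
    ]

    network[0].useful!
    network[0].useful!
    network[1].useful!

    # Feed the metadata back: surface the most useful contributions first.
    network.sort_by { |c| -c.votes }.each do |c|
      puts "[#{c.votes}] #{c.author}: #{c.text}"
    end

A real system would obviously need identity, decay, and spam-resistance; the point is only that the network’s own activity becomes its ranking signal.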

I’m not trying to present a Formula for Success, because I have no idea what will actually work or how to implement any ideas. Fortunately, there are tons of really smart people working on this now, with a genuine spirit of innovation. All I’m really saying is something obvious: to enable education to scale so that MOOCs don’t become what no one wants them to be — cyber lecture halls — it’s useful to think about the “classroom” as a network.
