May 8, 2014
March 15, 2014
I just posted at Medium.com about why it’s important to remember the difference between the Net and the Web. Here’s the beginning:
A note to NPR and other media that have been reporting on “the 25th anniversary of the Internet”: NO, IT’S NOT. It’s the 25th anniversary of the Web. The Internet is way older than that. And the difference matters.
The Internet is a set of protocols — agreements — about how information will be sliced up, sent over whatever media the inter-networked networks use, and reassembled when it gets there. The World Wide Web uses the Internet to move information around. The Internet by itself doesn’t know or care about Web pages, browsers, or the hyperlinks we’ve come to love. Rather, the Internet enables things like the World Wide Web, email, Skype, and much much more to be specified and made real. By analogy, the Internet is like an operating system, and the Web, Skype, and email are like applications that run on top of it.
This is not a technical quibble. The difference between the Internet and the Web matters more than ever for at least two reasons.
January 12, 2014
William McGeveran [twitter:BillMcGev] has written an article for the University of Minnesota Law School that suggests how to make “frictionless sharing” well-behaved. He defines frictionless sharing as “disclosing individuals’ activities automatically, rather than waiting for them to authorize a particular disclosure.” For example:
Bill’s article considers the pros and cons:
Bill is not trying to build walls. “The key to online disclosures … turns out to be the correct amount of friction, not its elimination.” To assess what constitutes “the correct amount,” he offers a heuristic, which I am happy to call McGeveran’s Law of Friction: “It should not be easier to ‘share’ an action online than to do it.” (Bill does not suggest naming the law after him! He is a modest fellow.)
One of the problems with the unintentional sharing of information is “misclosures,” a term he attributes to Kelly Caine.
Not only does this reveal, say, that you’ve been watching Yoga for Health: Depression and Gastrointestinal Problems (to use an example from Sen. Franken that Bill cites), it reveals that fact to your most intimate friends and family. (In my case, the relevant example would be The Amazing Race, by far the worst TV I watch, but I only do it when I’m looking for background noise while doing something else. I swear!) Worse, says Bill, “preference falsification” — our desire to have our known preferences support our social image — can alter our tastes, leading to more conformity and less diversity in our media diets.
Bill points to other problems with making social sharing frictionless, including reducing the quality of information that scrolls past us, turning what could be a useful set of recommendations from friends into little more than spam: “…friends who choose to look at an article because I glanced at it for 15 seconds probably do not discover hidden gems as a result.”
Bill’s aim is to protect the value of intentionally shared information; he is not a hoarder. McGeveran’s Law thus tries to add in enough friction that sharing is intentional, but not so much that it gets in the way of that intention. For example, he asks us to imagine Netflix presenting the user with two buttons: “Play” and “Play and Share.” Sharing would then require exactly as much work as playing, satisfying McGeveran’s Law. But having only a “Play” button that then automatically shares the fact that you just watched Dumb and Dumberer distinctly fails the Law because it does not “secure genuine consent.” As Bill points out, his Law of Friction is tied to the technology in use, and thus is flexible enough to be useful even as the technology and its user interfaces change.
I like it.
Categories: culture, policy Tagged with: privacy • programming the social
Date: January 12th, 2014 dw
November 17, 2013
Noam Chomsky and Barton Gellman were interviewed at the Engaging Big Data conference put on by MIT’s Senseable City Lab on Nov. 15. When Prof. Chomsky was asked what we can do about government surveillance, he reiterated his earlier call for us to understand the NSA surveillance scandal within an historical context that shows that governments always use technology for their own worst purposes. According to my liveblogging (= inaccurate, paraphrased) notes, Prof. Chomsky said:
I was glad that Barton Gellman — hardly an NSA apologist — called Prof. Chomsky on his lumping of the NSA with the Stasi, for there is simply no comparison between the freedom we have in the US and the thuggish repression omnipresent in East Germany. But I was still bothered, albeit by a much smaller point. I have no serious quarrel with Prof. Chomsky’s points that government incursions on rights are nothing new, and that governments generally (always?) believe they are acting for the best of purposes. I am a little bit hung-up, however, on his equivocating on “information.”
Prof. Chomsky is of course right in his implied definition of information. (He is Noam Chomsky, after all, and knows a little more about the topic than I do.) Modern information is often described as a measure of surprise. A string of 100 alternating ones and zeroes conveys less information than a string of 100 bits that are less predictable, for if you can predict with certainty what the next bit will be, then you don’t learn anything from that bit; it carries no information. Information theory lets us quantify how much information is conveyed by streams of varying predictability.
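The idea can be made concrete with a small sketch. (The function and the toy bit strings below are mine, not from the talk.) Measuring the conditional entropy of each bit given the one before it shows that a perfectly alternating string carries no surprise at all, while a coin-flip string carries roughly one bit per symbol:

```python
from collections import Counter
from math import log2
import random

def conditional_entropy(bits):
    """H(next bit | previous bit), estimated from bigram frequencies.

    0.0 means each bit is fully determined by the one before it;
    1.0 means each bit is a coin flip regardless of what preceded it.
    """
    pairs = Counter(zip(bits, bits[1:]))   # counts of (previous, next)
    prev = Counter(bits[:-1])              # counts of the previous bit alone
    total = len(bits) - 1
    h = 0.0
    for (a, b), count in pairs.items():
        p_pair = count / total             # P(a then b)
        p_next = count / prev[a]           # P(b | a)
        h -= p_pair * log2(p_next)
    return h

alternating = "01" * 50                    # perfectly predictable
random.seed(0)
noisy = "".join(random.choice("01") for _ in range(100))

print(conditional_entropy(alternating))   # 0.0 — no surprise, no information
print(conditional_entropy(noisy))         # close to 1 bit per symbol
```

Note that a plain frequency count would miss this: the alternating string is half ones and half zeroes, so only by conditioning on context does its total predictability show up.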
So, when U.S. security folks say they are spying on us for our own security, are they saying literally nothing? Is that claim without meaning? Only in the technical sense of information. It is, in fact, quite meaningful, even if quite predictable, in the ordinary sense of the term “information.”
First, Prof. Chomsky’s point that governments do bad things while thinking they’re doing good is an important reminder to examine our own assumptions. Even the bad guys think they’re the good guys.
Second, I disagree with Prof. Chomsky’s generalization that governments always justify surveillance in the name of security. For example, governments sometimes record traffic (including the movement of identifiable cars through toll stations) with the justification that the information will be used to ease congestion. Tracking the position of mobile phones has been justified as necessary for providing swift EMT responses. Governments require us to fill out detailed reports on our personal finances every year on the grounds that they need to tax us fairly. Our government hires a fleet of people every ten years to visit us where we live in order to compile a census. These are all forms of surveillance, but in none of these cases is security given as the justification. And if you want to say that these other forms don’t count, I suspect it’s because it’s not surveillance done in the name of security…which is my point.
Third, governments rarely cite security as the justification without specifying what the population is being secured against; as Prof. Chomsky agrees, that’s an inherent part of the fear-mongering required to get us to accept being spied upon. So governments proclaim over and over what threatens our security: Spies in our midst? Civil unrest? Traitorous classes of people? Illegal aliens? Muggers and murderers? Terrorists? Thus, the security claim isn’t made on its own. It’s made with specific threats in mind, which makes the claim less predictable — and thus more informational — than Prof. Chomsky says.
So, I disagree with Prof. Chomsky’s argument that a government that justifies spying on the grounds of security is literally saying something without meaning. Even if it were entirely predictable that governments will always respond “Because security” when asked to justify surveillance — and my second point disputes that — we wouldn’t treat the response as meaningless but as requiring a follow-up question. And even if the government just kept repeating the word “Security” in response to all our questions, that very act would carry meaning as well, like a doctor who won’t tell you what a shot is for beyond saying “It’s to keep you healthy.” The lack of meaning in the Information Theory sense doesn’t carry into the realm in which people and their public officials engage in discourse.
Here’s an analogy. Prof. Chomsky’s argument is saying, “When a government justifies creating medical programs for health, what they’re saying is meaningless. They always say that! The Nazis said the same thing when they were sterilizing ‘inferiors,’ and Medieval physicians engaged in barbarous [barber-ous, actually – heyo!] practices in the name of health.” Such reasoning would rule out a discussion of whether current government-sponsored medical programs actually promote health. But that is just the sort of conversation we need to have now about the NSA.
Prof. Chomsky’s repeated appeals to history in this interview cover up exactly what we need to be discussing. Yes, both the NSA and the Stasi claimed security as their justification for spying. But far from that claim being meaningless, it calls for a careful analysis of the claim: the nature and severity of the risk, the most effective tactics to ameliorate that threat, the consequences of those tactics on broader rights and goods — all considerations that comparisons to the Stasi and Genghis Khan obscure. History counts, but not as a way to write off security considerations as meaningless by invoking a technical definition of “information.”
Categories: infohistory, policy Tagged with: information • nsa • policy • security
Date: November 17th, 2013 dw
November 15, 2013
The sociologist Saskia Sassen is giving a plenary talk at Engaging Data 2013. [I had a little trouble hearing some of it. Sorry. And in the press of time I haven’t had a chance to vet this for even obvious typos, etc.]
1. The term Big Data is ambiguous. “Big Data” implies we’re in a technical zone. It becomes a “technical problem,” as when morally challenging technologies are developed by scientists who think they are just dealing with a technical issue. Big Data comes with a neutral charge. “Surveillance” brings in the state, the logics of power, how citizens are affected.
Until recently, citizens could not relate to a map that came out in 2010 that shows how much surveillance there is in the US. It was published by the Washington Post, but it didn’t register. 1,271 govt orgs and 1,931 private companies work on programs related to counterterrorism, homeland security and intelligence. There are more than 1 million people with top-secret clearance, and maybe a third are private contractors. In DC and environs, 33 building complexes are under construction or have been built for top-secret intelligence since 9/11. Together they are 22x the size of Congress. Inside these environments, the govt regulates everything. By 2010, DC had 4,000 corporate office buildings that handle classified info, all subject to govt regulation. “We’re dealing with a massive material apparatus.” We should not be distracted by the small individual devices.
Cisco lost 28% of its sales, in part as a result of being tainted by the NSA’s taking of its data. This is alienating citizens and foreign govts. How do we stop this? We’re dealing with a kind of assemblage of technical capabilities, tech firms that sell the notion that for security we all have to be surveilled, and people. How do we get a handle on this? I ask: Are there spaces where we can forget about them? Our messy, nice complex cities are such spaces. All that data cannot be analyzed. (She notes that she did a panel that included the brother of a Muslim who has been indefinitely detained, so now her name is associated with him.)
3. How can I activate large, diverse spaces in cities? How can we activate local knowledges? We can “outsource the neighborhood.” The language of “neighborhood” brings me pleasure, she says.
If you think of institutions, they are codified, and they notice when there are violations. Every neighborhood has knowledge about the city that is different from the knowledge at the center. The homeless know more about rats than the center. Make open access networks available to them into a reverse wiki so that local knowledge can find a place. Leak that knowledge into those codified systems. That’s the beginning of activating a city. From this you’d get a Big Data set, capturing the particularities of each neighborhood. [A knowledge network. I agree! :)]
The next step is activism, a movement. In my fantasy, at one end it’s big city life and at the other it’s neighborhood residents enabled to feel that their knowledge matters.
Q: If local data is being aggregated, could that become Big Data that’s used against the neighborhoods?
A: Yes, that’s why we need neighborhood activism. The politicizing of the neighborhoods shapes the way the knowledge is used.
Q: Disempowered neighborhoods would be even less able to contribute this type of knowledge.
A: The problem is to value them. The neighborhood has knowledge at ground level. That’s a first step of enabling a devalued subject. The effect of digital networks on formal knowledge creates an informal network. Velocity itself has the effect of informalizing knowledge. I’ve compared environmental activists and financial traders. The environmentalists pick up knowledge on the ground. So, the neighborhoods may be powerless, but they have knowledge. Digital interactive open access makes it possible to bring together those bits of knowledge.
Q: Those who control the pipes seem to control the power. How does Big Data avoid the world being dominated by brainy people?
A: The brainy people at, say, Goldman Sachs are part of a larger institution. These institutions have so much power that they don’t know how to govern it. The US govt has been the most powerful in the world, with the result that it doesn’t know how to govern its own power. It has engaged in disastrous wars. So “brainy people” running the world through the Ciscos, etc., I’m not sure. I’m talking about a different idea of Big Data sets: distributed knowledges. E.g., Forest Watch uses indigenous people who can’t write, but they can tell before the trained biologists when there is something wrong in the ecosystem. There’s lots of data embedded in lots of places.
[She’s aggregating questions] Q1: Marginalized neighborhoods live being surveilled: stop and frisk, background checks, etc. Why did it take tapping Angela Merkel’s telephone to bring awareness? Q2: How do you convince policy makers to incorporate citizen data? Q3: There are strong disincentives to being out of the mainstream, so how can we incentivize difference?
A: How do we get the experts to use the knowledge? For me that’s not the most important aim. More important is activating the residents. What matters is that they become part of a conversation. A: About difference: Neighborhoods are pretty average places, unlike forest watchers. And even they’re not part of the knowledge-making circuit. We should bring them in. A: The participation of the neighborhoods isn’t just a utility for the central govt but is a first step toward mobilizing people who have been reduced to thinking that they don’t count. I think this is one of the most effective ways to contest the huge apparatus with the 10,000 buildings.
Categories: culture, policy Tagged with: 2b2k • big data • cities • ed2013 • liveblog • surveillance
Date: November 15th, 2013 dw
I’m at the Engaging Data 2013 conference where Noam Chomsky and Pulitzer Prize winner (twice!) Barton Gellman are going to talk about Big Data in the Snowden Age, moderated by Ludwig Siegele of the Economist. (Gellman is one of the three people Snowden entrusted his documents to.) The conference aims at having us rethink how we use Big Data and how it’s used.
LS: Prof. Chomsky, what’s your next book about?
NC: Philosophy of mind and language. I’ve been writing articles that are pretty skeptical about Big Data. [Please read the orange disclaimer: I’m paraphrasing and making errors of every sort.]
LS: You’ve said that Big Data is for people who want to do the easy stuff. But shouldn’t you be thrilled as a linguist?
NC: When I got to MIT in 1955, I was hired to work on a machine translation program. But I refused to work on it. “The only way to deal with machine translation at the current stage of understanding was by brute force, which after 30-40 years is how it’s being done.” A principled understanding based on human cognition is far off. Machine translation is useful but you learn precisely nothing about human thought, cognition, language, anything else from it. I use the Internet. Glad to have it. It’s easier to push some buttons on your desk than to walk across the street to use the library. But the transition from no libraries to libraries was vastly greater than the transition from libraries to Internet. [Cool idea and great phrase! But I think I disagree. It depends.] We can find lots of data; the problem is understanding it. And a lot of data around us go through a filter so it doesn’t reach us. E.g., the foreign press reports that Wikileaks released a chapter about the secret TPP (Trans Pacific Partnership). It was front page news in Australia and Europe. You can learn about it on the Net but it’s not news. The chapter was on Intellectual Property rights, which means higher prices for less access to pharmaceuticals, and rams through what SOPA tried to do, restricting use of the Net and access to data.
LS: For you Big Data is useless?
NC: Big data is very useful. If you want to find out about biology, e.g. But why no news about TPP? As Sam Huntington said, power remains strongest in the dark. [approximate] We should be aware of the long history of surveillance.
LS: Bart, as a journalist what do you make of Big Data?
BG: It’s extraordinarily valuable, especially in combination with shoe-leather, person-to-person reporting. E.g., a colleague used traditional reporting skills to get the entire data set of applicants for presidential pardons. Took a sample. More reporting. Used standard analytics techniques to find that white people are 4x more likely to get pardons, that campaign contributors are also more likely. The same would likely hold in urban planning [which is Senseable City Lab’s remit]. But all this leads to more surveillance. E.g., I could make the case that if I had full data about everyone’s calls, I could do some significant reporting, but that wouldn’t justify it. We’ve failed to have the debate we need because of the claim of secrecy by the institutions in power. We become more transparent to the gov’t and to commercial entities while they become more opaque to us.
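The kind of analysis Gellman describes can be sketched in a few lines. (The counts below are invented for illustration only — they are not the Post’s actual pardon data, and the real reporting involved sampling and much more careful methodology.)

```python
# Hypothetical counts, invented for illustration — not the Post's actual data.
pardons = {
    "white":    {"applied": 1200, "pardoned": 144},
    "nonwhite": {"applied": 700,  "pardoned": 21},
}

def pardon_rate(group):
    """Fraction of applicants in the group who received a pardon."""
    g = pardons[group]
    return g["pardoned"] / g["applied"]

# The headline statistic is just a ratio of the two groups' rates.
ratio = pardon_rate("white") / pardon_rate("nonwhite")
print(f"white applicants pardoned at {ratio:.1f}x the rate of nonwhite applicants")
```

The point is that once the raw records are in hand, the “standard analytics” are simple; the hard, shoe-leather part is obtaining and verifying the data set in the first place.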
LS: Does the availability of Big Data and the Internet automatically mean we’ll get surveillance? Were you surprised by the Snowden revelations?
NC: I was surprised at the scale, but it’s been going on for 100 years. We need to read history. E.g., the counter-insurgency “pacification” of the Philippines by the US. See the book by McCoy [maybe this]. The operation used the most sophisticated tech at the time to get info about the population to control and undermine them. That tech was immediately used by the US and Britain to control their own populations, e.g., Woodrow Wilson’s Red Scare. Any system of power — the state, Google, Amazon — will use the best available tech to control, dominate, and maximize their power. And they’ll want to do it in secret. Assange, Snowden and Manning, and Ellsberg before them, are doing the duty of citizens.
BG: I’m surprised how far you can get into this discussion without assuming bad faith on the part of the government. For the most part what’s happening is that these security institutions genuinely believe most of the time that what they’re doing is protecting us from big threats that we don’t understand. The opposition comes when they don’t want you to know what they’re doing because they’re afraid you’d call it off if you knew. Keith Alexander said that he wishes that he could bring all Americans into this huddle, but then all the bad guys would know. True, but he’s also worried that we won’t like the plays he’s calling.
LS: Bruce Schneier says that the NSA is copying what Google and Yahoo, etc. are doing. If the tech leads to snooping, what can we do about it?
NC: Govts have been doing this for a century, using the best tech they had. I’m sure Gen. Alexander believes what he’s saying, but if you interviewed the Stasi, they would have said the same thing. Russian archives show that these monstrous thugs were talking very passionately to one another about defending democracy in Eastern Europe from the fascist threat coming from the West. Forty years ago, RAND released Japanese docs about the invasion of China, showing that the Japanese had heavenly intentions. They believed everything they were saying. I believe these are universals. We’d probably find it for Genghis Khan as well. I have yet to find any system of power that thought it was doing the wrong thing. They justify what they’re doing for the noblest of objectives, and they believe it. The CEOs of corporations as well. People find ways of justifying things. That’s why you should be extremely cautious when you hear an appeal to security. It literally carries no information, even in the technical sense: it’s completely predictable and thus carries no info. I don’t doubt that the US security folks believe it, but it is without meaning. The Nazis had their own internal justifications.
BG: The capacity to rationalize may be universal, but you’ll take the conversation off track if you compare what’s happening here to the Stasi. The Stasi were blackmailing people, jailing them, preventing dissent. As a journalist I’d be very happy to find that our govt is spying on NGOs or using this power for corrupt self-enriching purposes.
NC: I completely agree with that, but that’s not the point: The same appeal is made in the most monstrous of circumstances. The freedom we’ve won sharply restricts state power to control and dominate, but they’ll do whatever they can, and they’ll use the same appeals that monstrous systems do.
LS: Aren’t we all complicit? We use the same tech. E.g., Prof. Chomsky, you’re the father of natural language processing, which is used by the NSA.
NC: We’re more complicit because we let them do it. In this country we’re very free, so we have more responsibility to try to control our govt. If we do not expose the plea of security and separate out the parts that might be valid from the vast amount that’s not valid, then we’re complicit because we have the opportunity and the freedom.
LS: Does it bug you that the NSA uses your research?
NC: To some extent, but you can’t control that. Systems of power will use whatever is available to them. E.g., they use the Internet, much of which was developed right here at MIT by scientists who wanted to communicate freely. You can’t prevent the powers from using it for bad goals.
BG: Yes, if you use a free online service, you’re the product. But if you use a for-pay service, you’re still the product. My phone tracks me and my social network. I’m paying Verizon about $1,000/year for the service, and VZ is now collecting and selling my info. The NSA couldn’t do its job as well if the commercial entities weren’t collecting and selling personal data. The NSA has been tapping into the links between their data centers. Google is racing to fix this, but a cynical way of putting this is that Google is saying “No one gets to spy on our customers except us.”
LS: Is there a way to solve this?
BG: I have great faith that transparency will enable the development of good policy. The more we know, the more we can design policies to keep power in place. Before this, you couldn’t shop for privacy. Now a free market for privacy is developing as the providers now are telling us more about what they’re doing. Transparency allows legislation and regulation to be debated. The House Repubs came within 8 votes of prohibiting call data collection, which would have been unthinkable before Snowden. And there’s hope in the judiciary.
NC: We can do much more than transparency. We can make use of the available info to prevent surveillance. E.g., we can demand the defeat of TPP. And now hardware in computers is being designed to detect your every keystroke, leading some Americans to be wary of Chinese-made computers, but the US manufacturers are probably doing it better. And manufacturers for years have been trying to design fly-sized drones to collect info; that’ll be around soon. Drones are a perfect device for terrorists. We can learn about this and do something about it. We don’t have to wait until it’s exposed by Wikileaks. It’s right there in mainstream journals.
LS: Are you calling for a political movement?
NC: Yes. We’re going to need mass action.
BG: A few months ago I noticed a small gray box with an EPA logo on it outside my apartment in NYC. It monitors energy usage, useful for preventing brownouts. But it measures down to the apartment level, which could be useful to the police trying to establish your personal patterns. There’s no legislation or judicial review of the use of this data. We can’t turn back the clock. We can try to draw boundaries, and then have sufficient openness so that we can tell if they’ve crossed those boundaries.
LS: Bart, how do you manage the flow of info from Snowden?
BG: Snowden does not manage the release of the data. He gave it to three journalists and asked us to use our best judgment — he asked us to correct for his bias about what the most important stories are — and to avoid direct damage to security. The documents are difficult. They’re often incomplete and can be hard to interpret.
Q: What would be a first step in forming a popular movement?
NC: Same as always. E.g., the women’s movement began in the 1960s (at least in the modern movement) with consciousness-raising groups.
Q: Where do we draw the line between transparency and privacy, given that we have real enemies?
BG: First you have to acknowledge that there is a line. There are dangerous people who want to do dangerous things, and some of these tools are helpful in preventing that. I’ve been looking for stories that elucidate big policy decisions without giving away specifics that would harm legitimate action.
Q: Have you changed the tools you use?
BG: Yes. I keep notes encrypted. I’ve learned to use the tools for anonymous communication. But I can’t go off the grid and be a journalist, so I’ve accepted certain trade-offs. I’m working much less efficiently than I used to. E.g., I sometimes use computers that have never touched the Net.
Q: In the women’s movement, at least 50% of the population stood to benefit. But probably a large majority of today’s population would exchange their freedom for convenience.
NC: The trade-off is presented as being for security. But if you read the documents, the security issue is how to keep the govt secure from its citizens. E.g., Ellsberg kept a volume of the Pentagon Papers secret to avoid affecting the Vietnam negotiations, although I thought the volume really only would have embarrassed the govt. Security is in fact not a high priority for govts. The US govt is now involved in the greatest global terrorist campaign that has ever been carried out: the drone campaign. Large regions of the world are now being terrorized. If you don’t know if the guy across the street is about to be blown away, along with everyone around, you’re terrorized. Every time you kill an Al Qaeda terrorist, you create 40 more. It’s just not a concern to the govt. In 1950, the US had incomparable security; there was only one potential threat: the creation of ICBMs with nuclear warheads. We could have entered into a treaty with Russia to ban them. See McGeorge Bundy’s history. It says that he was unable to find a single paper, even a draft, suggesting that we do something to try to ban this threat of total instantaneous destruction. E.g., Reagan tested Russian nuclear defenses that could have led to horrible consequences. Those are the real security threats. And it’s true not just of the United States.
Categories: big data, egov, journalism, libraries, liveblog, policy, politics Tagged with: big data • journalism • libraries • liveblog • nsa • snowden
Date: November 15th, 2013 dw
August 21, 2013
The FCC’s Open Internet Advisory Committee’s 2013 Annual Report has been posted. The OIAC is a civilian group, headed by Jonathan Zittrain [twitter:zittrain]. The report is rich, but I want to point to one part that I found especially interesting: the section on “specialized services.”
Specialized services are interesting because when the FCC adopted the Open Internet Order (its “Net Neutrality” policy), it permitted the carriers to use their Internet-delivery infrastructure to provide some specific type of content or service alongside the Internet. As Harold Feld put it in 2009, in theory the introduction of “managed services”
The danger is that the providers will circumvent the requirement that they not discriminate in favor of their own content (or in favor of content from companies that pay them) by splintering off that content and calling it a special service. (For better explanations, check Technoverse, Ars Technica, Commissioner Copps’ statement.)
So, a lot comes down to the definition of a “specialized service.” This Annual Report undertakes the challenge. The summary begins on page 9, and the full section begins on p. 66.
I won’t pretend to have the expertise to evaluate the definitions. But I do like the principles that guided the group:
The Specialized Services group was led by David Clark, and manifests a concern for what Jonathan Zittrain calls “generativity“: it’s not enough to measure the number of bits going through a line to a person’s house; we also have to make sure that the user is able to do more with those bits than simply consume them.
I’m happy to see the Committee address the difficult issue of specialized services, and to do so with the clear intent of (a) not letting access to the open Internet be sacrificed, and (b) not allowing special services to be an end run around an open Internet.
Note: Jonathan Zittrain is my boss’ boss at the Harvard Law Library. I had known him through the Berkman Center for ten years before that.
Categories: net neutrality, policy Tagged with: fcc • net neutrality • zittrain
Date: August 21st, 2013 dw
April 9, 2013
Derek Khanna is giving a Berkman talk on trying to connect the dots so that policy-makers “get it.” “How do we even frame discussions about the economy and innovation?” Copyright law hasn’t been reassessed in at least 15 years, he says. He begins with his backstory: He’s from Mass. Worked for Romney and Scott Brown. (Derek wrote the copyright reform report for the Republican Study Group.)
Rule 1: “Being right is just part of the battle.” Rule 2: “It’s less important what you say…It’s most important who says it.” Rule 3: “Control the framing of the issue.” E.g., we [copyright reformers] frame copyright very differently than does Capitol Hill.
Take SOPA. He quotes Adam Green saying it’s not a matter of right vs. wrong but old vs. new. Staffers had been warning about SOPA, but suddenly the public engaged. The result was astounding: Co-sponsors became opponents of the bill. Derek says it wasn’t Google that killed SOPA. It was the 3 million people reaching out to Congress that killed it. “People like Elizabeth Stark, Alexis Ohanian [reddit] and Aaron Swartz.” The RIAA and MPAA like to frame it as having lost to Google rather than having lost to the American people. (He points to a Mario Savio speech that begins “There’s a time when the operation of the machine becomes so odious…”) SOPA remains very much on Congress’ mind, he says.
The framing was “perfect”: SOPA will censor the Internet and inhibit innovation.
Most conversations about copyright are framed as: Piracy is rampant, costing American jobs. Content is a crucial export, “the only thing produced in the US any more.” Copyright is thus good, but more copyright is better.
Derek set out to reframe it in his “Three Myths of Copyright.” At a panel he asked “Who thinks terrorism is bad? Who thinks the TSA is the only way to protect us?” Likewise, is copyright the only way to protect content when it makes 23M Americans into felons? He points to the difference between the original copyright law and the current one. To conservatives, it can be framed as looking like a wild divergence from the original intent.
The “Three Myths” memo went out and was supported by conservatives until 24 hours later when it was pulled. A few weeks later, Derek was fired. He’s continuing but he thinks that when you’re on the outside, you have to fight small, strategic battles.
Idea + Movement + Effort = Legislation
A few weeks ago the head of the copyright office endorsed many of the reforms in “Three Myths,” updating copyright for the digital generation. The day before, the content industry had made the old argument in Roll Call. The other side isn’t countering. The content lobby knows that Roll Call is read by Congress. We need similar expertise.
How do we start?
“We lack the institutional capacity to quickly intervene in the political process in the way the content industry has. We therefore need to be smarter and more tactical.” We should start with smaller battles. We should avoid the narrative of “fighting the Man,” that companies are evil, etc. That won’t win over a party that sees itself as a party of business. “Instead, foster a David v. Goliath narrative.” The media like that narrative.
We should not talk about piracy. And even if the DMCA needs to be replaced, that’s a non-starter on Capitol Hill.
Derek’s first campaign was on cellphone unlocking, after the Librarian of Congress said it was now illegal (i.e., ending the DMCA exemption) to enable your phone to be used on a different carrier. Unlocking would increase competition among carriers. Derek wrote an article for The Atlantic that pointed out that the technology for the blind also has to be exempted every three years, a clear example of how the system is broken. Derek expected this issue to be hard. It didn’t get any mainstream media attention. It has a $32M lobbying effort on the other side. “That’s a problem on Capitol Hill: We don’t have a lobby for the future.” It requires making hypothetical arguments.
But as the argument went on, examples emerged. E.g., Republic Wireless offers very cheap connectivity, but it depends on users bringing in unlocked phones.
Derek started a White House petition that got 114,000 signatures, the largest at the time. In part this worked because of people’s prior experience with SOPA. There were positive arguments on Left and Right. Left: It’s a matter of fairness. Right: Property rights. Derek added to this the value of innovation as a cross-party value.
After the petition, the FCC announced an investigation, and the White House came out in favor of unlocking. Before that, Derek had urged Congressfolks to come out in favor of it, if only because he was worried that once Obama came out in favor, the right would take the other side. But shortly after Obama endorsed, some conservatives came out in favor. Bills were introduced in both chambers.
Unfortunately, we have no way of mobilizing the 114,000 people who signed the petition; the names couldn’t be captured.
Why was it successful?
Derek presented this at a conservative org and got called a Marxist. Fox Business also: “You’re just against contracts.” “When you take up an issue, you have to know where your third rails are.” Response: The contract is between you and your carrier; the feds shouldn’t be arresting people for violating a contract.
Why is it important? It’s the first time Congress has questioned the DMCA. We might get a hearing on it. Congress is unaware of the implications of the DMCA. It also helped Congress realize that international treaties are being used as a backdoor for these restrictions. It may affect the Trans-Pacific Partnership treaty. And it helped identify allies.
Bottom line: “A free society shouldn’t have to petition its govt every 3 years to allow access to tech.” It’s akin to free speech, he says.
On the CFAA: “The statute is terrible.” There’s consensus about this. “But no one has written about it in the Weekly Standard or Politico.” It hasn’t reached Congress’ attention. Most members of Congress think that the sky is falling when it comes to cybersecurity. Every time a cybersec bill comes up, Congress has experts telling them that we are in deep peril. “Essentially the argument for reforming the CFAA is that we need to reduce the DoJ’s discretion.” You have to defeat that training. Meet with Rogers or McCain or the other cyber-hawks and convince them that the CFAA needs to be reformed, that we can target hacking with a more narrowly focused bill.
Q: Can we try to drive a wedge in the opposition?
A: Yes. The RIAA’s and MPAA’s policies don’t foster innovation in their own industry. Over 100 wireless carriers supported us on unlocking.
Q: You said that people who “get” tech are on the side of openness, etc. That optimistically suggests that if we educate people, they’ll take more common sense positions on tech.
A: Not entirely. Congress listens to people they trust, who are the RIAA, MPAA…
Q: …But even if Congressfolks fully understood tech, would the funds they get from the content industry still sway them?
A: Yes, some understand and still oppose us. But the ones who understand generally agree with us. The story is more complex: The MPAA/RIAA are very liberal, but the right still tends toward copyright protection.
Q: Why is the content industry so powerful, given the size of Google, etc.?
A: AT&T and Verizon are both in the top ten of lobbying companies: $32M. Google spends about $6M on lobbying. “No tech company had a DC presence until Microsoft” when it was about to be broken up. Also, as the tech companies invest heavily to survive, say, patent law, why would you favor wholesale patent law change? Also, when the RIAA/MPAA sue kids, the money goes back into lobbying, not to the artists. They’re self-funding. But the tech industry has to justify why they’re spending money on lobbying.
Q: In Pakistan, piracy is rampant. Doesn’t that hurt innovation?
A: Piracy is real. But those generally weren’t lost sales. The obsession with piracy is the problem.
Q: How about the role of public interest groups?
A: I’m a big fan of Public Knowledge and EFF, etc. But they need supplementing with more activist movements.
Q: If we focus on small victories, will people think we’re not doing enough? Will you have to keep winning bigger and bigger?
A: You can exist at a level for a while, if you’re strategic about it. Eventually you have to move on to bigger battles.
Q: How about the importance of multistakeholder partnerships?
A: You need as many allies as you can. E.g., I’m interested in orphan works: works still in copyright whose rights holders can’t be found. Our interests are in line with the RIAA.
Q: Are we in a moment like the environmental movement before it formed under a single banner?
A: I’m not an expert on the environmental movement. There are lots of lessons to be learned from them.
Q: Is there a schism among conservatives over copyright reform?
A: I haven’t seen much of a schism. The best argument I’ve heard is the natural rights one: copyright ought to exist forever. But that’s not the system we’ve adopted. Our founding fathers rejected it. I’d like to build a cross-party coalition, but that’s a long-term goal.
Q: Did you get pushback on using the WH petition mechanism?
A: I got some from privacy folks.
Q: When we win a battle, the other side comes up with something more drastic. E.g., we won a first sale argument, but the right may be preparing something much more drastic. How can we avoid that?
A: I’m not sure they’re going to try to reverse the first sale doctrine, but we need to have our eyes open.
Q: What should we do right now?
A: We’d like to start to bring together the CISPA coalition.
Categories: berkman, liveblog, policy, politics Tagged with: cfaa • copyright • policy • politics • sopa
Date: April 9th, 2013 dw
November 26, 2012
It turns out that I am less concerned about privacy than are most of my friends (not you, Jeff!), but this petition makes complete sense to me. The ECPA is a textbook example of a law that’s been outstripped by technology.
June 29, 2012
Eric Schmidt is being interviewed by Jeff Goldberg about the Net and Democracy. I’ll do some intermittent, incomplete liveblogging…
NOTE: Posted without having even been re-read. Note note (a few hours later): I’ve done some basic cleanup.
After some amusing banter, Jeff asks Eric about how responsible he felt Google was for Arab Spring. Jeff in passing uses the phrase “Internet revolution.”
ES: Arab Spring was enabled by a failure to censor the Internet. Google enabled people to organize themselves. Especially in Libya, five different militias were able to organize their armed revolt by using the Net. It’s unfair to the people who died to call it an “Internet revolution.” But there were fewer people who died, in part because of the incessant media coverage. And we’ve seen that it’s very easy to start what some call an Internet revolution, but very hard to finish it.
JG: These were leaderless revolutions, crowdsourced revolutions. But in Egypt the crowd’s leaders were easily pushed aside after Mubarak fell.
ES: True leaders are very hard to find. In Libya, there are 80 militias, armed to the teeth. In most of the countries there were repressed Muslim groups that have emerged as leaders because they organized while repressed. Whoever takes over inherits financial and social problems, and will be thrown out if they fail.
JG: Talk about Google’s tumultuous relationship with China…
ES: There are lots of reasons to think that China works because its citizens like its hierarchical structure. But I think you can’t build a knowledge society without freedom. China wants to be a knowledge society. It’s unclear if China’s current model gets them past a middle income GDP. Google thought that if we gave them free access to info, the Chinese people would revolt. We were wrong, and we moved Google to Hong Kong, on the open side of the Great Firewall. (We had to because that’s the Chinese law.) Now when you enter a forbidden query, we tell the user that it’s likely to be blocked. We are forbidden from announcing what the forbidden terms are because we don’t want employees put in jail.
JG: Could Arab Spring happen in China? Could students organize Tiananmen Square now?
ES: They could use the Chinese equivalent of Twitter. But if someone organizes a protest, two people show up, plus 30 media, and 50 police.
JG: Google’s always argued that democratization of info erodes authoritarian control. Do you still believe that?
ES: The biggest thing I’ve learned is how hard it is to learn about the differences among people in and within countries. I continue to believe that this device [mobile phone] will change the world. The way to solve most of the world’s problems is by educating people. Because these devices will become ubiquitous, it’ll be possible to see how far we humans can get. With access to the Net, you can sue for justice. In the worst case you can actually shame people.
JG: And these devices can be used to track people.
ES: Get people to understand they have choices, and they will eventually organize. Mobiles tend to record info just by their nature. The phone company knows where you are right now. You’re not worried about that because a law says the phone company can’t come harass you where you’re sitting. In a culture where there isn’t agreement about basic rights…
JG: Is there evidence that our democracy is better off for having the Internet?
ES: When we built the Net, that wasn’t the problem we were solving. But more speech is better. There’s a lack of deliberative time in our political process. Our leaders will learn that they’ll make better decisions if they take a week to think about things. Things will get bad enough that eventually reason will prevail. We complain about our democracy, but we’re doing quite well. The US is the beacon of innovation, not just in tech, but in energy. “In God we trust … all others have to bring data.” Politicians should just start with some facts.
JG: It’s easier to be crazy and wrong on the Net.
ES: 0.5% of Americans are literally crazy. Two years ago, their moms got them broadband connections. And they have a lot of free time. Google is going to learn how to rank them. Google should enable us to hear all these voices, including the crazy people, and if we’re not doing that, we’re not doing our job.
JG: I googled “Syria massacre” this morning, and the first story was from Russia Today that spun it…
ES: It’s good that you have a choice. We have to educate ourselves and our children. Not everything written is true, and very powerful forces want to convince you of lies. The Net allows that, and we rank against it, but you have to do your own investigation.
JG: Google is hitting PR problems. Talk about privacy…
ES: There’s no delete button on the Net. When you’re a baby, no one knows anything about you. As you move through life, inevitably more people know more about you. We’re going to have to learn about that. The wifi info gathering by StreetView was an error, a mistake, and we’ve apologized for it.
JG: The future of journalism?
ES: A number of institutions are figuring out workable models. The Atlantic [our host]. Politico. HuffingtonPost. Clever entrepreneurs are figuring out how to make money. The traditional incumbents have been reduced in scale, but there are plenty of new voices. BTW, we just announced a tablet with interactive, dynamic magazines. To really worry about: We grew up with the bargain that newspapers had enough cash flow to fund long term investigative research. That’s a loss to democracy. The problem hasn’t been fully solved. Google has debated how to solve it, but we don’t want to cross the content line because then we’d be accused of bias in our rankings.
JG: Will search engines search for accuracy rather than popularity?
ES: Google’s algorithms are not about popularity. They’re about link structures, and we start from well-known sources. So we’re already there. We just have to get better.
JG: In 5 yrs what will the tech landscape look like?
ES: Moore’s Law says that in 5 yrs there will be more power for less money. We forget how much better our hardware is now than even 5 years ago. And for disks and fiber optic connections, improvement is faster than Moore’s Law. Google is doing a testbed optical installation. At that bandwidth all media are just bits. We anticipate a lot of specialty devices.
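Schmidt’s claim can be put in rough numbers. Assuming the classic doubling period of about two years (my assumption; he doesn’t specify one), five years of Moore’s Law compounds to roughly a 5.7x improvement:

```python
# Back-of-envelope Moore's Law projection: capability doubles every
# ~2 years (assumed period), so after `years` years the improvement
# factor is 2 ** (years / doubling_period).

def moores_law_factor(years, doubling_period=2.0):
    """Multiplicative improvement after `years` at the given doubling period."""
    return 2 ** (years / doubling_period)

print(round(moores_law_factor(5), 1))   # ~5.7x in five years
print(round(moores_law_factor(10), 1))  # ~32x in a decade
```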
JG: How do you expect an ordinary, competent politician to manage the info flow? Are we inventing tech that is past our ability to process info?
ES: The evidence is that the tech is bringing more human contact. The tech lets us express our humanity. We need a way of sorting politicians better. I’d suggest looking for leaders who work from facts.
JG: Why are you supporting Obama?
ES: I like having a smart president.
JG: Is Romney not smart?
ES: I know him. He’s a good man. I like Obama’s policies better.
Q: Our connectivity is 3rd world. Why haven’t we been able to upgrade?
A: The wireless networks are running out of bandwidth. The prediction is they’ll be saturated in 2016. Maybe 2017. That’s understandable: Before, we were just typing online and now we’re watching movies. The White House in a few weeks is releasing a report that says we can share bandwidth to get almost infinite bandwidth. Rather than allocating whole chunks of spectrum that go mostly unused, we think we can fix this problem using interference databases. [I think but please correct me: A database of frequency usages, so that unused frequencies in particular geographic areas can be used for new signals.]
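The bracketed guess above can be made concrete with a toy model. This is a hypothetical sketch, not any real FCC or “white space” database API; the areas, channel numbers, and function names are all invented for illustration:

```python
# Toy "interference database": licensed frequency usage is recorded per
# geographic area, and a query returns the channels unused in that area,
# which could then be made available for new (secondary) signals.

LICENSED = {
    # area -> set of channels already in use there (illustrative data)
    "boston": {2, 5, 7},
    "rural_vt": {2},
}

ALL_CHANNELS = set(range(1, 11))  # pretend the band has channels 1..10

def free_channels(area):
    """Channels available for secondary use in the given area."""
    in_use = LICENSED.get(area, set())
    return sorted(ALL_CHANNELS - in_use)

print(free_channels("boston"))    # fewer channels free in a dense market
print(free_channels("rural_vt"))  # more spectrum free in a rural area
```

The point of the scheme: spectrum allocation becomes a per-location lookup rather than a nationwide exclusive grant, so the same frequencies can carry different signals in different places.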
A: The digital can enhance our physical connections. E.g., a grandmother skyping with a grandchild.
JG: You said you can use the Net to shame govts. But there are plenty of videos of Syria doing horrible things, but it’s done no good.
ES: There are always particularly evil people. Syria is the exception. Most countries, even autocratic ones, are susceptible to public embarrassment.
Q: Saying “phones by their nature collect data” evades responsibility.
ES: I meant that in order to do their work, they collect info. What we allow to be done with that info is a legal, cultural issue.
Q: Are we inherently critical thinkers? If not, putting info out there may not lead to good decisions.
ES: There’s evidence that we’re born to react quickly. Our brains can be taught reasoning. But it requires strong family and education.
Q: Should there be a bill of rights to simplify the legalese that express your privacy rules?
ES: It’s a fight between your reasonable point of view, and the lawyers and govt that regulate us. Let me reassure you: If you follow the goal of Google to have you as a customer, the quickest way to lose you is to misuse your information. We are one click away from competitors who are well run and smart. [unless there was money in it, or unless they could get away with it, or…]
Q: Could we get rid of representative democracy?
ES: It’ll become even more important to have democratic processes because it’s all getting more complicated. For direct democracy we’d have to spend all day learning about the issues and couldn’t do our jobs.
JG: David Brooks, could you comment? Eric is an enormous optimist…
ES: …The evidence is on my side!
JG: David, are you as sanguine that our politicians will learn to slow their thinking down, and that Americans have the skills to discern the crap from the true?
David Brooks: It’s not Google’s job to discern what’s true. There are aggregators to do this, including the NYT and TheBrowser. I think there’s been a flight to quality. I’m less sanguine about attention span. I’m less sanguine about confirmation bias, which the Web makes easier.
ES: I generally agree with that. There’s evidence that we tend to believe the first thing we hear, and we judge plus and minus against that. The answer is always for me culture, education.
Q: Will there be a breakthrough in education?
ES: Education changes much more slowly than the world does. Sometimes it seems to me that education is run for the benefit of the teachers. They should use measurable outcomes, A/B testing. There’s evidence that physics can be taught better by setting a problem, then doing a collaborative effort, then another problem…
Categories: censorship, echo chambers, education, egov, liveblog, media, net neutrality, policy, politics Tagged with: arab spring • aspenideas • democracy • e-democracy • eric schmidt • google • liveblog • privacy
Date: June 29th, 2012 dw
Joho the Blog by David Weinberger is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License.
Creative Commons license: Share it freely, but attribute it to me, and don't use it commercially without my permission.