Joho the Blog: privacy archives (page 2 of 6)

September 5, 2013

Pew Internet survey on Net privacy: Most of us have done something about it

Pew Internet has a new study out that shows that most of us have done something to maintain our privacy (or at least the illusion of it) on the Net. Here’s the summary from the report’s home page:

A new survey finds that most internet users would like to be anonymous online, but many think it is not possible to be completely anonymous online. Some of the key findings:

  • 86% of internet users have taken steps online to remove or mask their digital footprints, ranging from clearing cookies to encrypting their email.

  • 55% of internet users have taken steps to avoid observation by specific people, organizations, or the government.

The representative survey of 792 internet users also finds that notable numbers of internet users say they have experienced problems because others stole their personal information or otherwise took advantage of their visibility online. Specifically:

  • 21% of internet users have had an email or social networking account compromised or taken over by someone else without permission.

  • 12% have been stalked or harassed online.

  • 11% have had important personal information stolen such as their Social Security Number, credit card, or bank account information.

  • 6% have been the victim of an online scam and lost money.

  • 6% have had their reputation damaged because of something that happened online.

  • 4% have been led into physical danger because of something that happened online.

You can read the whole thing online or download the pdf, for free. Thank you, Pew Internet!


March 28, 2013

[berkman] Dan Gillmor on living off the privacy grid

Dan Gillmor is giving a Berkman lunchtime talk about his Permission Taken project. Dan, who has been very influential on my understanding of tech and has become a treasured friend, is going to talk about what we can do to live in an open Internet. He begins by pointing to Jonathan Zittrain’s The Future of the Internet and Rebecca MacKinnon’s Consent of the Networked [two hugely important books].

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

He says that the intersection of convenience and freedom is narrowing. He goes through a “parade of horribles” [which I cannot keep up with]. He pauses on Loic Le Meur’s [twitter:loic] tweet: “A friend working for Facebook: ‘we’re like electricity.'” If that’s the case, Dan says, we should maybe even think about regulation, although he’s not a big fan of regulation. He goes through a long list of what apps ask permission to do on your mobile. His example is Skype. It’s a long list. Bruce Schneier says when it comes to security, we’re heading toward feudalism. Also, he says, Skype won’t deny it has a backdoor. “You should assume they do,” he says. The lock-in is getting tighter and tighter.

We do this for convenience. “I use a Kindle.” It makes him uncomfortable but it’s so hard to avoid lock-in and privacy risks. The fight against SOPA/PIPA was a good point. “But keep in mind that the copyright cartel is a well-funded smart group of people who never quit.” He says that we certainly need better laws, rules, and policies. “That’s crucial.” But his question this afternoon is what we as individuals can do. Today he’s going to focus on security countermeasures, although they’re not enough. His project, which might become a book, will begin simply, because it’s aimed at the broad swath of people who are not particularly technically literate.

“Full disk encryption should be the default. It’s not. Microsoft charges extra for it. Mac makes it pretty easy. So does Ubuntu.”
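
[My aside, not Dan’s: full-disk encryption is an operating-system feature (FileVault on the Mac, LUKS on Ubuntu, BitLocker on Windows), not something you script. As a stand-in illustration of the underlying idea, encrypting data at rest with a key only you hold, here’s a minimal file-level sketch using Python’s cryptography package.]

```
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this somewhere safe, e.g. a password manager
fernet = Fernet(key)

plaintext = b"contents of tax-return-2013.pdf"
ciphertext = fernet.encrypt(plaintext)      # what actually sits on disk

# Only someone holding the key can recover the plaintext.
assert fernet.decrypt(ciphertext) == plaintext
```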

Disable intrusive browser extensions.

Root your phone. That’s not perfect. E.g., it makes you vulnerable to some attacks. But the tradeoff is that you now control your phone.

Dan blocks apps from particular permissions. Sometimes that keeps the app from working. “I accept that.” This is a counter to vendors insisting that you give them all the rights.

Use Tor [The Onion Router], even though he assumes “some of the exit nodes” are being run by the CIA. Tor, he explains, is a way of browsing the Web with some reasonable likelihood that your ISP doesn’t know what you’re actually looking at, and that what you’re looking at doesn’t know where you’re coming from. This, he says, is important for whistleblowers, etc.
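
[Again my aside, not Dan’s: a minimal sketch of what using Tor looks like in practice, assuming a Tor client is running locally and exposing its usual SOCKS proxy on port 9050, and that the requests and pysocks packages are installed (pip install requests[socks]). The check.torproject.org endpoint simply reports whether the request arrived over Tor.]

```
import requests

# Route traffic through the local Tor client's SOCKS proxy.
# socks5h means DNS lookups also happen inside Tor, so your ISP
# doesn't even see which hostnames you resolve.
TOR_PROXY = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

resp = requests.get("https://check.torproject.org/api/ip",
                    proxies=TOR_PROXY, timeout=30)
print(resp.json())  # e.g. {"IsTor": true, "IP": "<some exit node's address>"}
```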

When loyalty cards came out, he and his friend used to randomly swap them to make the data useless. The last time he got one, he filled in his address as 1600 Pennsylvania Ave., and the guy in the store said, “It’s amazing how many people live there.” If you use a false address with a card, it may not work. If you do it on line, you’re committing a felony under the Computer Fraud and Abuse Act. The revisions are going in the wrong direction. “This is terrifying…We have to do something collectively.”

Pick your platform carefully. “I was the biggest Apple person around…I was a Mac bigot for years.” At press events, he’d be the only person (besides John Markoff) to have a Mac. Many things happened, including Apple suing websites wanting to do journalism about Apple. Their “control freakery” and arrogance with the iPhone was worse. “Now that everyone except me at a press event has a Mac, I get worried.” Now the Mac is taking on the restrictions of the iPhone operating system (iOS). “I want to do what I want with my own computer.” All computer makers are moving to devices that you can’t even open. “Everyone wants to be Apple.”

Own your own domain. Why are journalists putting their work on Facebook or other people’s platforms? Because it brings distribution and attention. “We do these things on ‘free’ platforms at their sufferance.” “We all should have a place on the Web that is owned by us,” even if we don’t do most of our work there. Dan is going to require students to get their own domain name.

Dan says his book/project is going to present a gradient of actions. At the further end, there’s Linux. Dan switched last year and has found it almost painless. “No one should have to use the command line if they don’t want to,” and Linux isn’t perfect about that yet. “Even there it’s improving.” He says all the major distributions are pretty. He uses Ubuntu. “Even there there’s some control-freakery going on.” Dan says he tried Linux every year for 10 years, and now he finds it “ready for prime time.” He says some control features being introduced to Windows, for reasonable reasons, are making life harder for Linux users. [I’m not sure what he’s referring to.]

Dan says the lockdown is caused by self-interest, not good vs. evil. He hopes that we can start to make the overlap of convenience and freedom larger and larger.

Q&A

Q: If you should have your own domain, you should also do your own hosting, run your own Apache server, etc.

A: You can’t be independent of all external services unless you really want that. There’s a continuum here. My hosting is done by someone I know personally. We really need systematic and universal encryption in the cloud, so whoever is storing your stuff can’t muck with it unless you give them permission. That raises legal questions for them.

Q: I really like what you’re saying. I’m not a specialist and it sounds like a conversation among a very small number of people who are refined specialists in this area. How do you get this out and more accessible? Could this be included in basic literacy in our public schools? On the other hand, I worry there’s a kind of individualism: You know how to do it, so you get to do it, but the rest don’t. How do we build a default position for people who can’t manage this for themselves?

A: Yes, I worry that this is for geeks. But I’m not aiming this project at geeks. It’s more aimed at my students, who have grown up thinking Facebook is the Internet and that the MacBook Air gives them complete freedom [when in fact it can’t be opened and modified]. The early chapters will be on what you can do with whatever it is you use. It won’t solve the problem, but it will help. And then take people up a ramp to get them as far as they’re comfortable doing. In really clear language, I hope. And it’d be a fine idea to make this part of digital literacy education. I’m a huge fan of CodeAcademy; Douglas Rushkoff wrote a wonderful book called “Program or Be Programmed,” and I think it does help to know some of this. [See Diana Kimball’s Berkman Talk on coding as a liberal art.] It’s not going to be in big demand any time soon. But I hope people can see what’s at risk, what they’re losing, and also what they gain by being locked down.

Q: Do you think freedom and convenience will grow further apart? What are the major factors?

A: Overall, the bad direction is still gaining. That’s why I’m doing this. I don’t think people are generally aware of the issues. It’ll help if we can get word out about what’s at risk and what the choices are. If people are aware of the issues and are fine with giving up their freedom, that’s their choice. We’ve been trading convenience for the illusion of security. “We put our hands up in scanners as if we’re being frisked.” There’s more money and power on the control side. Every major institution is aligned on the same side of this: recentralizing the technology that promised radical decentralization. That’s a problem. I’m going to try to convince people to use tech that doesn’t do that, and to push for better policies, but …

Q: What exactly are you concerned about? I feel free to do anything I want on the Internet. Maybe the govt is managing me. Marketers definitely are. I worry about hackers stealing my identity. But what are the risks?

A: “I think a society that is under pervasive surveillance is a deadened society in the long run.” It’s bad for us “in every way that I can imagine” except for the possibility that it can stop a certain amount of crime. “But in dictatorships, the chief criminals are the govt and the police, so it doesn’t solve the problem.” The FBI wants a backdoor into every technology. If they get one, it will be used by bad people. This stuff doesn’t stay secret forever. The more you harden the defenses, the more room there is for really bad actors to get in. Those are some of the main reasons.

Q: How can Tor help whistleblowers? Do you have other advice for journalists?

A: I have a chapter in a book that’s coming out about journalists and closed platforms. Journalists need to learn about security right away because they’re putting the lives of their sources at risk. The Committee to Protect Journalists has done important work on helping journalists understand the risks and mitigate them. It’s a crucial issue that hasn’t gotten enough attention inside the craft, although I had my PGP signature at the bottom of my column for 6 years and got 2 emails that used it, one of which said he just wanted to know if it worked. Also, you should be aware that you can’t anticipate every risk. E.g., if the US govt wants to find out what I’m talking about online, they’ll figure out a way to do it. They could break into my house and put up cameras. But like the better deadbolt lock stopping amateur criminals, better security measures will discourage some intrusions. When I do my online banking, I do it from a virtual machine that I use only for that; it has never gone anywhere else on the Internet. I don’t think that’s totally paranoid. There are still risks.

Q: The Supreme Court just affirmed first sale of materials manufactured outside of the US. Late-stage capitalism wants to literally own its markets, offline as well as online. How much of that wider context do you want to get into?

A: If the Court hadn’t affirmed first sale, every media producer would have moved all their production facilities offshore so that we wouldn’t be able to resell it. These days we buy licenses, not goods. Increasingly, physical goods will have software components. That’s an opportunity for the control crowd to keep you from owning anything you buy. In Massachusetts, the car repair shops got a ballot measure saying they get access to the software in cars; that was marvelous. BTW, I’m making common cause with some friends on the Right. Some of the more far-seeing people on the Right are way ahead in thinking about this. E.g., Derek Khanna. I will be an ally of anybody.

Q: [harry lewis] Great project. Here’s your problem: What are you worried about? This is a different sort of surveillance society. This is the opposite of the Panopticon, where everyone knows they’re being spied upon. People won’t be motivated until there are breaches. The incentive of the surveillors is to do it as unobtrusively as possible. You’ll never know why your life insurance premium is $100 higher than mine. You won’t ever see the data paths that led to that, because the surveillance will be happening at a level that is completely invisible to the individual. It’ll be hard to wake people up. “A surveillance society is a deadened society” only if people know they’re being surveilled.

A: If they don’t see a consequence, then they won’t act. If the govt a generation ago had told you that you will henceforth carry a tracking device so we can know where you are at any time, there would have been an uproar. But we did it voluntarily [holding up a mobile phone]. The cell tower has to know where you are, but I’d like to find a way to spoof everything else for everyone else. (You should assume your email is being read on your employer’s server, Dan says.)

Q: I worry about creating a privacy of the elite that only a small segment can access. That creates a dangerous scenario. Should there be govt regulations to make sure we’re all operating with the same levels of privacy?

A: It’s an important point. The govt rules won’t be the ones you want. We need to create market-based solutions. Markets work better than advice or edicts.

Q: But hasn’t the market spoken, and it’s the iPhone?

A: The iPhone has important security features. But people aren’t scared enough to create a market.

A: The ACLU should be advised on how to create pamphlets that will reach people.

A: So much of hacker culture and open source culture are based on things being difficult. Many of the privacy tools work but are too hard to use. There is a distinct lack of design, and we don’t see poorly designed things as legitimate. And that’s a fairly easy thing to fix. A: Yes.

Q: Younger people don’t seem to care about privacy. Is there a generational shift?

A: There are two possibilities for the future. My hope is that we’ll all start cutting each other more slack; everyone will recognize that we all did unbelievably stupid, even possibly criminal things, in our 20s. I still do plenty of stupid things. But it worries me that cultures sometimes grow less tolerant. This could be catastrophic, if the country goes toward the Right.

A: There are tools to make it easy to do this. E.g., CryptoParty.org, the Pirate Party. And are there alternatives to social media that are ready for prime time?

A: Still pretty geeky, but it’s a wonderful start. But many of the tools cost money.

Q: Any thoughts about ways to use govt and corporate interests to promote your goals. E.g., protect the children.

A: I’ll rename this Protect the Children and then everyone will do what I want :) Overall, the problem is that power is shifting, pulling back into the center. This has long term negative consequences. But speculating on what the consequences will be is never as effective as showing what’s going wrong now. I want the power to be distributed. “I’m pretty worried, although I’m a relentless optimist.” “I’m a resister.”


November 26, 2012

Petition: Update the Electronic Communications Privacy Act

Jonathan Kamens has started a petition to the White House to update the Electronic Communications Privacy Act (ECPA). Here’s the nut of his explanation:

The Electronic Communications Privacy Act protects the privacy of your email, requiring law-enforcement authorities to show probable cause and obtain a search warrant before they can read it. Except it doesn’t, really, because under the ECPA, any email left “on the server” for more than 180 days is considered “abandoned,” and any prosecutor in the country can get access to it simply by signing a letter requesting such access.

The law was written in the days when people’s email stayed on the server only until they downloaded it from there to their desktop computer over a slow modem. Nowadays, however, virtually everybody leaves their email “on the server” so that it can be accessed from anywhere on any device. So virtually everybody’s old email is accessible to law-enforcement authorities without a search warrant. This is horrendously unacceptable, and this petition calls on Congress to amend the law and on the Obama administration to support and push for such an amendment.

It turns out that I am less concerned about privacy than are most of my friends (not you, Jeff!), but this petition makes complete sense to me. The ECPA is a textbook example of a law that’s been outstripped by technology.
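
To make the 180-day rule concrete, here’s a toy sketch of the distinction the petition describes (the dates are made up; the 180-day threshold is the one in the statute):

```
from datetime import date, timedelta

ECPA_ABANDONMENT_DAYS = 180

def warrant_required(received: date, today: date) -> bool:
    """Under the ECPA as written, mail older than 180 days 'on the server'
    is treated as abandoned and loses the warrant requirement."""
    return (today - received) <= timedelta(days=ECPA_ABANDONMENT_DAYS)

today = date(2012, 11, 26)
print(warrant_required(date(2012, 10, 1), today))   # True: still needs a warrant
print(warrant_required(date(2012, 1, 15), today))   # False: "abandoned"
```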


July 2, 2012

How I became a creepy old man

I was checking Facebook yesterday afternoon, as I do regularly every six months or so. It greeted me with a list of friend requests. One was from the daughter of a colleague. So I accepted on the grounds that it was unexpected but kind of cute that she would ask.

Only after I clicked did I realize that the list was not of requests but of suggestions for people I might want to friend. So, now the daughter of a colleague has received a friend request from a 61-year-old man she’s never heard of, and I’m probably going to end up on the No Fly list.

The happy resolution: I contacted my colleague to let him know, and he took it as an opportunity to have a conversation with his daughter about how to handle friend requests from people she doesn’t know, especially pervy-looking old men.


June 29, 2012

[aspen] Eric Schmidt on the Net and Democracy

Eric Schmidt is being interviewed by Jeff Goldberg about the Net and Democracy. I’ll do some intermittent, incomplete liveblogging…

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

NOTE: Posted without having even been re-read. Note note (a few hours later): I’ve done some basic cleanup.

After some amusing banter, Jeff asks Eric about how responsible he felt Google was for Arab Spring. Jeff in passing uses the phrase “Internet revolution.”

ES: Arab Spring was enabled by a failure to censor the Internet. Google enabled people to organize themselves. Especially in Libya, five different militias were able to organize their armed revolt by using the Net. It’s unfair to the people who died to call it an “Internet revolution.” But there were fewer people who died, in part because of the incessant media coverage. And we’ve seen that it’s very easy to start what some call an Internet revolution, but very hard to finish it.

JG: These were leaderless revolutions, crowdsourced revolution. But in Egypt the crowd’s leaders were easily pushed aside after Mubarak fell.

ES: True leaders are very hard to find. In Libya, there are 80 militias, armed to the teeth. In most of the countries there were repressed Muslim groups that have emerged as leaders because they organized while repressed. Whoever takes over inherits financial and social problems, and will be thrown out if they fail.

JG: Talk about Google’s tumultuous relationship with China…

ES: There are lots of reasons to think that China works because its citizens like its hierarchical structure. But I think you can’t build a knowledge society without freedom. China wants to be a knowledge society. It’s unclear if China’s current model gets them past a middle income GDP. Google thought that if we gave them free access to info, the Chinese people would revolt. We were wrong, and we moved Google to Hong Kong, on the open side of the Great Firewall. (We had to because that’s the Chinese law.) Now when you enter a forbidden query, we tell the user that it’s likely to be blocked. We are forbidden from announcing what the forbidden terms are because we don’t want employees put in jail.

JG: Could Arab Spring happen in China? Could students organize Tiananmen Square now?

ES: They could use the Chinese equivalent of Twitter. But if someone organizes a protest, two people show up, plus 30 media, and 50 police.

JG: Google’s always argued that democratization of info erodes authoritarian control. Do you still believe that?

ES: The biggest thing I’ve learned is how hard it is to learn about the differences among people in and within countries. I continue to believe that this device [mobile phone] will change the world. The way to solve most of the world’s problems is by educating people. Because these devices will become ubiquitous, it’ll be possible to see how far we humans can get. With access to the Net, you can sue for justice. In the worst case you can actually shame people.

JG: And these devices can be used to track people.

ES: Get people to understand they have choices, and they will eventually organize. Mobiles tend to record info just by their nature. The phone company knows where you are right now. You’re not worried about that because a law says the phone company can’t come harass you where you’re sitting. In a culture where there isn’t agreement about basic rights…

JG: Is there evidence that our democracy is better off for having the Internet?

ES: When we built the Net, that wasn’t the problem we were solving. But more speech is better. There’s a lack of deliberative time in our political process. Our leaders will learn that they’ll make better decisions if they take a week to think about things. Things will get bad enough that eventually reason will prevail. We complain about our democracy, but we’re doing quite well. The US is the beacon of innovation, not just in tech, but in energy. “In God we trust … all others have to bring data.” Politicians should just start with some facts.

JG: It’s easier to be crazy and wrong on the Net.

ES: 0.5% of Americans are literally crazy. Two years ago, their moms got them broadband connections. And they have a lot of free time. Google is going to learn how to rank them. Google should enable us to hear all these voices, including the crazy people, and if we’re not doing that, we’re not doing our job.

JG: I googled “Syria massacre” this morning, and the first story was from Russia Today that spun it…

ES: It’s good that you have a choice. We have to educate ourselves and our children. Not everything written is true, and very powerful forces want to convince you of lies. The Net allows that, and we rank against it, but you have to do your own investigation.

JG: Google is hitting PR problems. Talk about privacy…

ES: There’s no delete button on the Net. When you’re a baby, no one knows anything about you. As you move through life, inevitably more people know more about you. We’re going to have to learn about that. The wifi info gathering by StreetView was an error, a mistake, and we’ve apologized for it.

JG: The future of journalism?

ES: A number of institutions are figuring out workable models. The Atlantic [our host]. Politico. HuffingtonPost. Clever entrepreneurs are figuring out how to make money. The traditional incumbents have been reduced in scale, but there are plenty of new voices. BTW, we just announced a tablet with interactive, dynamic magazines. To really worry about: We grew up with the bargain that newspapers had enough cash flow to fund long term investigative research. That’s a loss to democracy. The problem hasn’t been fully solved. Google has debated how to solve it, but we don’t want to cross the content line because then we’d be accused of bias in our rankings.

JG: Will search engines search for accuracy rather than popularity?

ES: Google’s algorithms are not about popularity. They’re about link structures, and we start from well-known sources. So we’re already there. We just have to get better.
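
[My illustration, not Schmidt’s: a toy link-structure ranking in the spirit of PageRank, seeded uniformly rather than from “well-known sources,” and nothing like Google’s production algorithm. The point is just that rank flows along links rather than coming from raw popularity.]

```
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

toy_web = {
    "well-known-source": ["blog-a", "blog-b"],
    "blog-a": ["blog-b"],
    "blog-b": ["well-known-source"],
}
print(sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]))
```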

JG: In 5 yrs what will the tech landscape look like?

ES: Moore’s Law says that in 5 yrs there will be more power for less money. We forget how much better our hw is now than even 5 years ago. And it’s faster than Moore’s Law for disks and fiber optic connections. Google is doing a testbed optical installation. At that bandwidth all media are just bits. We anticipate a lot of specialty devices.
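
[Back-of-the-envelope version of that claim, my arithmetic rather than his: if capability per dollar doubles roughly every 24 months, five years compounds to a bit more than 5x.]

```
months = 5 * 12
doubling_period_months = 24   # assumed doubling period
improvement = 2 ** (months / doubling_period_months)
print(round(improvement, 2))  # ~5.66x more capability per dollar
```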

JG: How do you expect an ordinary, competent politician to manage the info flow? Are we inventing tech that is past our ability to process info?

ES: The evidence is that the tech is bringing more human contact. The tech lets us express our humanity. We need a way of sorting politicians better. I’d suggest looking for leaders who work from facts.

JG: Why are you supporting Obama?

ES: I like having a smart president.

JG: Is Romney not smart?

ES: I know him. He’s a good man. I like Obama’s policies better.

Q&A

Q: Our connectivity is 3rd world. Why haven’t we been able to upgrade?

A: The wireless networks are running out of bandwidth. The prediction is they’ll be saturated in 2016. Maybe 2017. That’s understandable: Before, we were just typing online and now we’re watching movies. The White House in a few weeks is releasing a report that says that we can share bandwidth to get almost infinite bandwidth. Rather than allocating a whole chunk that leaves most of it unused, using interference databases we think we can fix this problem. [I think but please correct me: A database of frequency usages so that unused frequencies in particular geographic areas can be used for new signals.]
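
[Continuing my guess above: a toy sketch of such a frequency-usage database. The areas, channels, and users are invented purely for illustration.]

```
SPECTRUM_REGISTRY = {
    # (area, channel in MHz): licensed user currently transmitting there
    ("Aspen, CO", 512): "TV broadcaster",
    ("Aspen, CO", 518): "wireless mic operator",
    ("Boston, MA", 512): "TV broadcaster",
}

ALL_CHANNELS = [506, 512, 518, 524]

def free_channels(area):
    """Channels with no registered user in this area -> usable for new signals."""
    return [ch for ch in ALL_CHANNELS if (area, ch) not in SPECTRUM_REGISTRY]

print(free_channels("Aspen, CO"))   # [506, 524]
print(free_channels("Boston, MA"))  # [506, 518, 524]
```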

A: The digital can enhance our physical connections. E.g., a grandmother skyping with a grandchild.

JG: You said you can use the Net to shame govts. But there are plenty of videos of Syria doing horrible things, but it’s done no good.

ES: There are always particularly evil people. Syria is the exception. Most countries, even autocratic ones, are susceptible to public embarrassment.

Q: Saying “phones by their nature collect data” evades responsibility.

ES: I meant that in order to do their work, they collect info. What we allow to be done with that info is a legal, cultural issue.

Q: Are we inherently critical thinkers? If not, putting info out there may not lead to good decisions.

ES: There’s evidence that we’re born to react quickly. Our brains can be taught reasoning. But it requires strong family and education.

Q: Should there be a bill of rights to simplify the legalese that express your privacy rules?

ES: It’s a fight between your reasonable point of view, and the lawyers and govt that regulate us. Let me reassure you: If you follow the goal of Google to have you as a customer, the quickest way to lose you is to misuse your information. We are one click away from competitors who are well run and smart. [unless there was money in it, or unless they could get away with it, or…]

Q: Could we get rid of representative democracy?

ES: It’ll become even more important to have democratic processes because it’s all getting more complicated. For direct democracy we’d have to spend all day learning about the issues and couldn’t do our jobs.

JG: David Brooks, could you comment? Eric is an enormous optimist…

ES: …The evidence is on my side!

JG: David, are you as sanguine that our politicians will learn to slow their thinking down, and that Americans have the skills to discern the crap from the truth?

David Brooks: It’s not Google’s job to discern what’s true. There are aggregators to do this, including the NYT and TheBrowser. I think there’s been a flight to quality. I’m less sanguine about attention span. I’m less sanguine about confirmation bias, which the Web makes easier.

ES: I generally agree with that. There’s evidence that we tend to believe the first thing we hear, and we judge plus and minus against that. The answer is always for me culture, education.

Q: Will there be a breakthrough in education?

ES: Education changes much more slowly than the world does. Sometimes it seems to me that education is run for the benefit of the teachers. They should do measurable outcomes, A-B testing. There’s evidence that physics can be taught better by setting a problem and then do a collaborative effort, then another problem…


December 6, 2011

[berkman] Jeff Jarvis on Publicness

Jeff Jarvis is giving a lunch time talk about his new book, Public Parts. He says he’s interested in preserving the Net as an open space. Privacy and publicness depend on each other. Privacy needs protection, he says, but we are becoming so over-protective that we are in danger of losing the benefits of publicness. (He apologizes for the term “publicness” but did not want to use the marketing term “publicity.”)

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

He begins with a history of privacy. In 1890, Brandeis wrote an article about privacy, in response to the rise of Kodak cameras. The NYT wrote about “fiendish Kodakers lying in wait.” Teddy Roosevelt banned photo-taking in public parks. Technology seems often to raise privacy concerns. After Gutenberg some authors did not want their name associated with works. Some say that privacy arose in Britain as a result of the creation of the back stairs. As tech advances, we need to find new norms. Instead, we tend to legislate to try to maintain the status quo.

Now for publicness, he begins by referring to Habermas: the public sphere arose in the 18th C in coffee houses and salons as a counterweight to the power of governments. But, Canadian researchers began The Making Publics Project that concluded that people had the tools for making publics before the 18th C. E.g., printed music, art, etc. all enabled the creation of publics. When a portrait of a Dutch gentleman was shown in Venice, if a Dutch man showed up, he looked like “them,” which helped define the Venetians as “us” (for example).

Mass media made us into a mass. It pretended to speak for us. Online, though, we can each make a public. E.g., Occupy Wall Street, and before that Arab Spring. He recounts tweeting angrily, and after a few glasses of wine, “Fuck you Washington! It’s our money.” Someone suggested to him that there were these new things called “hashtags,” and that this one should be #FUwashington. 110,000 tweets later, the hashtag had become a platform. “People viewed in this empty vessel what they wanted to.” Indeed, the first recorded use of #occupywallstreet was in a tweet that consisted of: “#fuwashington #occupywallstreet.” [Note: It might be #OWS.] Now the public is a network.

We’re going through a huge transition, he says. He refers to the Gutenberg Parenthesis. Before Gutenberg, knowledge was passed around, person to person. It was meant to honor and preserve ancient knowledge. After Gutenberg, knowledge became linear. There are beginnings and ends and boxes around things. It’s about product. There’s a clear sense of ownership. It honors current knowledge and its authors. Then you get to the other side of the parenthesis, and there are similarities. More passing it around, more remixing, less sense of ownership. The knowledge we revere starts to become the network itself. Our cognition of the world changes. The CTO of the Veterans Admin calls the Internet the Eighth Continent. “I used to think of the Internet as a medium,” but now he thinks of it more as a place, although there are problems with the place metaphor. (“All metaphors are wrong,” interjects Doc Searls. “That’s why they work.”) It was a hard transition into the parenthesis, and it’ll be hard coming out of it. It took 50 years after Gutenberg for books to come into their own, and 100 years to recognize the impact of books. We’re still looking at the Net using the past as our analog.

To talk about publicness, Jeff had to go through “the gauntlet of privacy.” He looked for a good definition of privacy. Control is part of it, but “privacy” is an empty vessel itself. “I came to believe that privacy should be seen as an ethic.” It’s about the responsibility for making ethical decisions about sharing information. People and companies have different responsibilities here, of course. “There should be an ethic that people should be able to know who has access to their information. And it should be portable.” He gives a shout out to Doc Searls’ ProjectVRM.

If privacy is an ethic of knowing, publicness is an ethic of sharing. Not everything should be shared, of course, but there’s a generosity of sharing that should have us thinking about how sharing can benefit us. “I shared info about my prostate cancer on line, which means I was sharing information about my non-functioning penis. Why would I do that?” He has friends who learned of this because he was public, and some who shared with him great information about what he was about to go through. One guy started out under a pseudonym but then started using his real name. A woman told her story about how her husband died needlessly. Jeff refers to Xeni Jardin’s posting of her mammogram and how this will likely save some lives. [Xeni, we are all thinking about you! And love you!]

“I am not utopian,” Jeff says, “because I’m not predicting a better world.” But we should be imagining the best that can happen, as well as the worst. There are many benefits to publicness. Bringing trust. Improving relationships. It enables collaboration. It disarms the notion of the stranger. It disarms stigmas: coming out of the closet disarms the old stigma (although, Jeff adds, no one should be forced out of a closet). Gov’t is too often secret by default, and that should be switched; the same is not true for individuals where the default should always be a choice. We should make it clear that the Internet is a shitty place to put secrets. Facebook has made mistakes about privacy, but 800M have joined because they want to share. Zuckerberg believes he is not changing but enabling human nature. By nature we want to share.

Jeff got accused by someone of “over-sharing” which he finds an odd phrase. It means “shut up.” The guy does not have to follow Jeff or read his blog. “I wasn’t over-sharing. He was over-listening.”

Companies should share more because it opens up the ability to collaborate. In What Would Google Do? Jeff speculated about a company that might design cars collaboratively. Many scoffed. But Local Motors is now doing it.

When Google pulled out of China, they did the right thing, he says. But can we expect companies to protect the Internet? Nah. Google did a devil’s deal with Verizon. Gov’t also can’t protect the Net. Jeff went to the E-G8 where he asked Sarkozy to take a Hippocratic Oath “to first do no harm.” Sarkozy replied that it’s not harm to protect your children. There are unintended consequences, e.g., danah boyd’s study of the consequence of COPPA. More than half of the 12 yr olds had Facebook pages, most of which had been created with the help of parents, violating the terms of use. Thus, COPPA is requiring families to lie. COPPA has resulted in young people being the worst served segment on the Net because it’s too risky to build a kid site. We need to protect our children, but we also have to protect the Net.

So, who has to protect the Net? We do. The people of the Net. Jeff went back to the Sullivan Principles (while noting that he’s not equating YouTube censorship with Apartheid) about corporate responsibility when dealing with South Africa. We need a discussion of such principles for doing business on the Net. The discussion will never end, he says, but it gives us something to point at. His own principles, he says, are wrong, but they are: 1. You have a right to connect. (Not that you have a right to demand a connection, but you can’t be disconnected.) 2. Privacy as an ethic of knowing and publicness as an ethic of sharing. 3. What’s public is a public good. The Germans allow citizens to demand Google pixelate Street View, resulting in a degradation of a useful tool. Google is taking pictures of public places in public views. Illinois and MA do not allow you to audio record police officers. Reducing what’s public reduces the value of the public. What are the principles at work here? 4. Institutions’ info should become public by default. 5. Net neutrality. 6. The Net must remain open and distributed. “The fact that no one has sovereignty is what makes the Net the Net.”

“I am not a technodeterminist,” he says. “We are at a point of choice. We need to maintain our choices. If we don’t protect them, well-meaning and ill-meaning companies will take away those choices.” He points to Berkman as a leading institute for this. “I don’t blame Sarkozy for holding the event. I blame us for not holding our own event, the WE-G8, because it is our Internet.”

Jeff now does The Oprah.

Q: How about Google Plus requiring real names?
A: Anonymity has its place on line. So do pseudonyms. They protect the vulnerable. But I understand that real names improve the discourse. I get the motivation, but they screwed it up. They were far too literal in what someone’s identity is. I think Google knows this now. They’re struggling with a principle and a system. I do understand trying to avoid having the place overrun by fake identities and spam.

Q: German Street View is really about scale. It’s one thing for someone to take a picture of your house. It’s another for Google to send a car to drive down every street and post the pictures for the world. For some people it crosses the ethics of privacy. Why isn’t that a valid choice?
A: But it’s a public view. If you own the building, do you own the view of it? But you’re right about scale. But we need to protect the principle that what is public is the public good.

Q: We have a vacation rental. Any bad guy can use Street View to see if it’s worth robbing.
A: Riverhead LI used Google Earth to look for pools in backyards that had no permits. People were in an uproar. But it could also save children’s lives.

Q: [me] Norms are not the same as ethics. Can you talk about the difference? To what extent should privacy as an ethic of knowing be a norm? Etc.
A: Privacy as an ethic should inform the norms. I’ve been talking about my desire for a return of the busy signal… [missed a bit of this.]

Q: What about the ethics of having info shared for you? As people post photos of each other, enormous amounts of info will be shared…
A: We’re trying to adjust to this as a society. Currently, FB tells me if I’m tagged in a photo and lets me say no. It’s wrong if someone tricks you out of info, or violates a presumed confidence. Tyler Clementi, who committed suicide after a picture of him was posted…the failure was human, not the technology’s.
A: Why don’t we share all of our health? We’d get more support. We’d have more data that might help. But health insurance would misuse it. Job applicants being disqualified? We could regulate against this. The real reason is stigma. “In this day and age, for anyone to be ashamed of sickness is pathetic.” The fact that we can use illness against people says more about our society.
A: Part of your message is that publicness is our best weapon against stigma.

Q: [espen andersen, who also blogged this talk] In Norway the gov’t publishes how much money people make. That arose when you had to go down to City Hall to get the info. Now there are FB mashups. So what about info that’s used for unintended purposes? And how about the Data Storage Directive that in Europe requires the storage of data “just in case.”
A: Helen Nissenbaum says the key to privacy is context. But it’s hard to know what the context is in many cases. Apparently Norway is rethinking its policy. But there was a cultural benefit that it’d be a shame to lose. Google threatened to pull Gmail out of Germany because of the data storage requirements. Why in the US does email have less protection than mail?
Q: I’m a member of the group suing the Norwegian govt on the grounds that that law is unconstitutional. But no one ever sets targets.

Q: Public by default, private by necessity: Yes. Where’s the low-hanging fruit for universities?
A: Lessig reminds us that if we only use govt data to get the bastards, govt will see openness as an enemy. We need also to be showing the positive benefit of open data. Universities will be in the next wave of disruption of the Net. Around the world, how many instructors write a lecture about capillary action, and how many of them are crap? The fact that you have Open Course lets you find the best lectures in the world. You can find and reward the best. Local education becomes more like tutoring. Why should students and teachers be stuck with one another? I’m reading DIY U and it’s wonderful. It’ll change because of the economics of education.

A: [I had trouble hearing this long question. He recommended going back to Erving Goffman, and pointed out that Net publicness is different if you’re famous.]
A: You’re talking about what a public is. We have thought that the public mean everyone. But now we can create limited publics around things. (Jeff points to a problem with circles in G+ : People think they create private spheres, but they don’t.) FB confused a public with the public; when it changed the defaults, people thought they were talking to a public but were in fact talking to the public.

Q: [me] Norms of privacy help define publics. Are you arguing for a single norm? Why not? [this was my question and I actually asked it much worse than this.]
A: I’m arguing for choice.
Q: Are Americans wrong for being modest in saunas?
A: Nope. [I’ve done a terrible job of capturing this.]


October 31, 2011

The Firefox difference

Sebastian Anthony points to a distinguishing philosophy of Firefox that was not clear to me until I read it. The title is “Firefox is the cloud’s biggest enemy,” which he in the comments admits is not entirely apt. Rather, Firefox wants you to own and control your data; it uses the cloud, but encrypts your data when it does. This is a strong differentiation from Google Chrome and Microsoft IE.
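
Here’s a rough sketch of that client-side-encryption idea in general terms (my illustration of the principle, not Firefox Sync’s actual protocol): derive a key from a secret that never leaves your machine, encrypt locally, and let the cloud store only ciphertext.

```
import base64, hashlib, os
from cryptography.fernet import Fernet

def key_from_passphrase(passphrase: bytes, salt: bytes) -> bytes:
    raw = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)
    return base64.urlsafe_b64encode(raw)       # Fernet wants a base64 32-byte key

salt = os.urandom(16)                           # kept locally alongside your setup
fernet = Fernet(key_from_passphrase(b"correct horse battery staple", salt))

bookmarks = b'{"bookmarks": ["https://www.hyperorg.com/blogger/"]}'
blob_for_the_cloud = fernet.encrypt(bookmarks)  # all the sync server ever sees

assert fernet.decrypt(blob_for_the_cloud) == bookmarks
```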


June 28, 2011

How much do you trust the Internet?

I love the Internet. I trust what I learn from it, or, more exactly, I generally trust my ability not to be fooled by it. But, like all of us (?), I have a limit.

For example, Google has a new service called What Do You Love? It’s mainly a marketing tool: Tell it something you love, and it will aggregate that term across many of the services Google offers: Send an email, find it on a Google Map, find Google Groups that mention it, etc.

So, I entered “Ted Kaczynski” (the Unabomber), and WDYL cheerfully created the online equivalent of one of those creepy walls of souvenirs that clinch a suspect as the crazypants stalker/killer in cheesy crime movies. I enjoyed it, anyway, even as I made a joke to myself about now probably being on a Homeland Security watch list.

But I realized that there were limits to what I would enter into the site for fear of government consequences: “I love terrorism.” “I love child porn.” I’m actually even nervous putting those sentences into this post as examples. (Granted, either would make for WDYL responses that are more disturbing than amusing.)

So, a part of me apparently believes that the government is watching. And that the government has no sense of humor.


June 10, 2011

[hyperpublic] Herbert Burkert

Herbert Burkert of the U. of St. Gallen is giving a talk. He claims to be ill at ease because he’s a lawyer talking about art, but I’m betting his unease is misplaced :) [Note after the talk: Yup, it was totally misplaced. Delightful talk.]

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

He will structure his comments around two people. 1. John Peter Willebrand (1719-1786). He wrote “the outline of a beautiful city,” rules for “enhancing social happiness in cities.” He tried to coerce people into beauty. Design talk and architecture talk are dangerous, says Herbert. E.g., Le Corbusier designed how people should live. Idealists and Totalitarians do this. Contemporary designers have a more benevolent tone. So, Herbert’s first criterion: Are you actually designing for people? For example, are you imposing your idea of privacy or theirs? And are you sure that their privacy is everybody’s privacy? How much space, and how many opportunities to develop and live their own lives, do you give to others?

The second person: Lina Bo Bardi (1914-1992). She was an Italian architect once charged with turning a factory ground into a recreational area in São Paulo. What she built challenged ideas about the relation of work and recreation. The windows look like holes blown into a prison wall. From this Herbert infers that designers should be giving opportunities for social gathering, for cross-generational communication, cross-cultural communication, for variety, and for protected openness. The relation between private and public is a continuum. Is the low wall between seating areas a metaphor for scaled privacy, or should we just give up on metaphors, at least those from architecture, because we fail to grasp the essence of electronic communication?


[hyperpublic] The risks and beauty of hyperpublic life

Jeff Jarvis moderates a discussion. “We need principles to inform the architecture” for where we want to go, rather than waiting for terms of service from the sites we use. We need a discussion about terms of service for the Internet. (He’s careful to note that he’s not suggesting there be a single terms of service for the Net.) We all have a stake in the discussion of the public and private, Jeff says. We should be careful about our metaphors, he says, citing Doc Searls’ cautioning against calling it a medium since a medium is something that can be owned and controlled. “It is a platform for publics,” Jeff says.

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

Adam Greenfield, director of Urbanscale, a shop in NYC. He’s interested in how you design public spaces. He wants to push back against the idea of a networked city of the future. We already live in the networked city. Locative and declarative media (e.g., FourSquare) is widely adopted. We live among declarative objects, not just people declaring their position, e.g., TowerBridge tweets its open and closed states. In Tokyo, the side of a building is a QR code; it is an object with an informational shadow. Objects are increasingly capable of gathering, processing, displaying, transmitting and/or taking action on info…which implies new modes of surveillance. His contention: Tens of millions of people are already living in these conditions, and we therefore need a new theory of networked objects.


He offers a taxonomy of what this class of objects implies. He begins with traffic signals controlled by motion detectors. The info is not uploaded to the network, and it has a clear public good. It is prima facie unobjectionable.


Then there is the mildly disruptive and disrespectful object. E.g., a sensor detects someone passing by a billboard that reacts to that presence. There’s no public good here. More concerning is a touch screen vending machine that makes gender assumptions based on interpretation of a camera image. Further, that info is gathered and used.


Another step into the disturbing: Advertisers scan faces and make predictive and prospectively normal assumptions that they sell to marketers.


But what about when power and knowledge reside in an ensemble of discrete things? E.g., in Barcelona, access to some streets depends on a variety of sensors and signs. It can even reside in code: surveillance camera software gets upgraded with a referendum.


We should be talking about public objects: Any discrete object in the common spatial domain intended for the use and enjoyment of the general public [with a couple of refinements that went by too fast]. They should be open as in API, and their goods should be non-rivalrous and non-excludable. This is great, but we should remember that this increases the “attack surface” for hostile forces. Also, we need to evolve the etiquettes and protocols of precedence and deconfliction. We should do this because it moves against the capture of public space by private entities, and it opens up urban resources like we’ve never seen (discoverable, addressable, queryable and scriptable). The right to the city should be underwritten by the architecture of its infrastructure.


Q: [jeff] Why is the gender-sensing vending machine creepy? Would you be ok if it guessed but let you correct it or ignore it?
A: I’ve been working with informed consent, but I heard this morning that that may not be the best model. We don’t want to over-burden people with alert boxes, etc.


Jeffrey Huang talks about a case study: the design of a new campus in the deserts of Ras Al Khaimah (one of the Emirates). In 2009, the Sheikh agreed to fund the creation of the new campus. Jeff and others were brought in to design the campus, from bare sand up. The initial idea for the project was to avoid creating a typical gated campus, but rather to make it open. They also wanted to avoid the water costs of creating a grass-lawned campus. “The ambition was to grow a campus where it made sense ecologically”: buildings where there’s natural cooling winds, etc. They’re designing large, fluid, open spaces, where “seeing and being seen is maximized.” There would be a network of sensors so that campus would be aware of what’s going on inside, including recognizing where individuals are. People’s profile info could be projected into their shadows. They wonder if they need special places of privacy. “There should be less necessity to design the private if and only if the hyperpublicness is adequately designed.” E.g. if no one owns the data, there’s full transparency about who looks at the data and what’s being captured.


Betsy Masiello from Google’s policy team gives some informal remarks. To her, a hyperpublic life implies Paris Hilton: Constant streaming, making your behavior available for everyone to see. But, she says, what this panel is really about is a data-driven life. It’s important not to blur the two. There’s public good that comes from big data analysis, and some baseline economic good.


She says she thinks about predictive analytics in two ways. 1. Analysis done to give you predictions about what you might like. It’s clear to the user what’s going on. 2. Predictions based on other people’s behavior. E.g., search, and Adam’s soda machine. Both create value. But what are the risks? The risk is a hyperpublic life. The risk of all this data is that it gets attached to us, gets re-identified, and gets attached to your identity. But this misses something…


E.g., she came across a Jonathan Franzen 1998 essay, “The Imperial Bedroom.” “Without shame there can be no distinction between public and private,” he wrote. She says you can feel shame even if you’re anonymous, but Franzen is basically right. Which brings her to a positive solution. “The design problem is how to construct and identify multiple identities, and construct and manage some degree of anonymity.” It is true that our tech will allow us to identify everyone, but policy requirements could punish companies from doing so. Likewise, there are some policy decisions that would make it easier to maintain multiple identities online.
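
[My aside: the re-identification worry can be made concrete with a toy linkage attack. An “anonymized” dataset stripped of names is joined to a public one on quasi-identifiers like ZIP code and birth date; all names and records below are invented.]

```
# "Anonymized" records: names removed, quasi-identifiers left in.
health_records = [
    {"zip": "02138", "birth": "1950-07-31", "diagnosis": "hypertension"},
    {"zip": "02139", "birth": "1982-03-02", "diagnosis": "asthma"},
]

# A separate, public dataset (e.g. a voter roll) with the same quasi-identifiers.
voter_roll = [
    {"name": "J. Doe", "zip": "02138", "birth": "1950-07-31"},
    {"name": "R. Roe", "zip": "02139", "birth": "1982-03-02"},
]

for record in health_records:
    for voter in voter_roll:
        if (voter["zip"], voter["birth"]) == (record["zip"], record["birth"]):
            print(voter["name"], "->", record["diagnosis"])  # identity re-attached
```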


Jeff: Your fear of re-identification surprises me.
Betsy: The hyperidentity public is created from the lack of contexts, without people knowing about it. People don’t know how all their contexts are becoming one. I think people want a separation of data used to personalize an ad from data they are sharing with their friends.
Jeff: This is John Palfrey’s “breakwalls”… But I’d think that Google would want as few restrictions as possible. They create liabilities for Google.
Betsy: That’s the design challenge. Search engines and Japanese soda machines haven’t gotten it right yet.
Jeff: What are the emerging principles? Separating gathering from usage. Control. Transparency…


Adam: I don’t think there’s anonymous data any more.
Betsy: Yes, but could we create it via policy?
Adam: There are some fine uses of predictive analytics. E.g., epidemiology. But not when the police use it to predict crimes.
Jeff: Why not? Ok, we’ll talk later.


Q: What about third party abuse?
Adam: Our principle should be “First, do no harm.”
Huang: It’s a problem often because the systems don’t know enough. Either roll it back, or train it so it can make better distinctions.
Betsy: You can maybe get. FourSquare is an individual stating her identity. The flip is anonymous data about locations. That provides tremendous value, and you can do that while protecting the identities.
Jeff: But if “you can’t protect it, don’t collect it,” then we’ll never collect anything and won’t get any of those benefits.


Q: [latanya] It’s not true that only those with something to hide want to remain anonymous. E.g., if you hide all the positive results of HIV tests, you can see who has HIV. You have to protect the privacy of those who do not have HIV.
Jeff: But I got benefit from going public with my prostate cancer.
Latanya: But we live in a world of parallel universes. You got to control which ones knew about your cancer.


Q: [I couldn’t hear it]
Betsy: You don’t need to reveal anything about the individual pieces of data in a big data set in order to learn from it.
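
[My aside: Betsy doesn’t name a mechanism, but differential privacy is one way to make that claim concrete; publish a noisy aggregate so the statistic stays useful while any one person’s presence is hidden in the noise.]

```
import random

def laplace_noise(scale: float) -> float:
    # The difference of two independent exponentials is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def noisy_count(true_count: int, epsilon: float = 0.5) -> float:
    # A counting query has sensitivity 1: adding or removing one person
    # changes the count by at most 1, so noise scaled to 1/epsilon suffices.
    return true_count + laplace_noise(1.0 / epsilon)

print(noisy_count(10_000))  # close to 10,000, but no individual is exposed
```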


Q: (jennie toomey) There are lots of things we want kept private that have nothing to do with guilt or shame. Much of what we keep private we use to create intimacy.
Betsy: I was quoting Franzen.


Q: Privacy means something different in non-democratic societies.
Adam: We know historically that if info can be used against us, it eventually will be.


Q: Recommended: Solove’s The Taxonomy of Privacy
Adam: The info collected by E. Germany was used against people after E. Germany fell.
Jeff: But if we only listen to the fears, we won’t get any of the benefits.

