Joho the Blog » privacy

July 1, 2014

[2b2k] That Facebook experiment

I have an op-ed/column up at CNN about the Facebook experiment. [The next day: The op-ed led to 4 mins on the Jake Tapper show. Oh what the heck. Here’s the video.]

All I’ll say here is how struck I am again (as always) about the need to leave out most of everything when writing goes from web-shaped to rectangular.

Just as a quick example, I’m not convinced that the Facebook experiment was as egregious as the headlines would have us believe. But I made a conscious decision not to address that point in my column because I wanted to make a more general point. The rectangle for an op-ed is only so big.

Before I wrote the column, I’d observed, and lightly participated in, some amazing discussion threads among people who bring many different sorts of expertise to the party. Disagreements that were not just civil but highly constructive. Evidence based on research and experience. Civic concern. Emotional connections. Just amazing.

I learned so much from those discussions. What I produced in my op-ed is so impoverished compared to the richness in that tangle of linked differences. That’s where the real knowledge lives.


April 25, 2014

[nextweb] Ancilla Tilia on how we lost our privacy

Ancilla Tilia [twitter: ncilla] is introduced as a former model. She begins by pointing out that when this audience was asked last year if they were worried about the privacy implications of Google Glass, only two people said yes. One was her. We have not heard enough from people like Bruce Schneier, she says. She will speak to us as a concerned citizen.

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

Knowledge is power, she says. Do we want to give away info about ourselves that will be available in perpetuity, that can be used by future governments and corporations? The theme of this conference is “Power to the people,” so let’s use our power.

She says she had a dream. She was an old lady talking with her grand-daughter. “What’s this ‘freedom’ thing I’ve been hearing about? The kids at school say the old people used to have it.” She answered, “It’s hard to define. You don’t realize what it is until you stop having it. And you stop having it when you stop caring about privacy.” We lost it step by step, she says. By paying with our bank cards, every transaction was recorded. She didn’t realize the CCD’s were doing face recognition. She didn’t realize when they put RFID chips in everything. And license plate scanners were installed. Fingerprint scanners. Mandatory ID cards. DNA data banks. Banning burqas meant that you couldn’t keep your face covered during protests. “I began to think that ‘anonymous’ was a dirty word.” Eye scanners for pre-flight check. Biometrics. Wearables monitoring brainwaves. Smart TVs watching us. 2013’s mandatory pet chipping. “And little did I know that our every interaction would be forever stored.” “When journalists started dying young, I didn’t feel like being labeled a conspiracy nut.” “I didn’t know what a free society was until I realized it was gone, or that we have to fight for it.”

Her granddaughter looks at her doe-eyed, and Ancilla can’t explain any further.


February 16, 2014

First post at Medium.com: The Internet is not a Panopticon

I’ve been meaning to try Medium.com, a magazine-bloggy place that encourages carefully constructed posts by providing an elegant writing environment. It’s hard to believe, but it’s even better looking than Joho the Blog. And, unlike HuffPo, there are precious few stories about side boobs. So I finally did, and might do so again.

The piece is about why we seem to keep insisting that the Internet is a panopticon when it clearly is not. So, if you care about panopticons, you might find it interesting. Here’s a bit from the beginning:

A panopticon was Jeremy Bentham’s (1748-1832) idea about how to design a prison or other institution where people need to be watched. It was to be a circular building with a watchers’ station in the middle containing a guard who could see everyone, but who could not himself/herself be seen. Even though not everyone could be watched at the same time, prisoners would never know when they were being watched. That’d keep ’em in line.

There is indeed a point of comparison between a panopticon and the Internet: you generally can’t tell when your public stuff is being seen (although your server logs could tell you). But that’s not even close to what a panopticon is.

…So why did the comparison seem so apt?


January 16, 2014

Some sources on Snowden and the NSA

Axel Arnbak has aggregated some papers about Snowden and the NSA revelations that you might find useful. Nicely, it includes more than just US sources and interests.


January 12, 2014

McGeveran’s Law of Friction

William McGeveran [twitter:BillMcGev] has written an article for the University of Minnesota Law School that suggests how to make “frictionless sharing” well-behaved. He defines frictionless sharing as disclosing “individuals’ activities automatically, rather than waiting for them to authorize a particular disclosure.” For example:

…mainstream news websites, including the Washington Post, offer “social reading” applications (“apps”) in Facebook. After a one-time authorization, these apps send routine messages through Facebook to users’ friends identifying articles the users view.

Bill’s article considers the pros and cons:

Social media confers considerable advantages on individuals, their friends, and, of course, intermediaries like Spotify and Facebook. But many implementations of frictionless architecture have gone too far, potentially invading privacy and drowning useful information in a tide of meaningless spam.

Bill is not trying to build walls. “The key to online disclosures … turns out to be the correct amount of friction, not its elimination.” To assess what constitutes “the correct amount” he offers an heuristic, which I am happy to call McGeveran’s Law of Friction: “It should not be easier to ‘share’ an action online than to do it.” (Bill does not suggest naming the law after him! He is a modest fellow.)

One of the problems with the unintentional sharing of information is “misclosures,” a term he attributes to Kelly Caine.

Frictionless sharing makes misclosures more likely because it removes practical obscurity on which people have implicitly relied when assessing the likely audience that would find out about their activities. In other words, frictionless sharing can wrench individuals’ actions from one context to another, undermining their privacy expectations in the process.

Not only does this reveal, say, that you’ve been watching Yoga for Health: Depression and Gastrointestinal Problems (to use an example from Sen. Franken that Bill cites), it reveals that fact to your most intimate friends and family. (In my case, the relevant example would be The Amazing Race, by far the worst TV I watch, but I only do it when I’m looking for background noise while doing something else. I swear!) Worse, says Bill, “preference falsification” — our desire to have our known preferences support our social image — can alter our tastes, leading to more conformity and less diversity in our media diets.

Bill points to other problems with making social sharing frictionless, including reducing the quality of information that scrolls past us, turning what could be a useful set of recommendations from friends into little more than spam: “…friends who choose to look at an article because I glanced at it for 15 seconds probably do not discover hidden gems as a result.”

Bill’s aim is to protect the value of intentionally shared information; he is not a hoarder. McGeveran’s Law thus tries to add in enough friction that sharing is intentional, but not so much that it gets in the way of that intention. For example, he asks us to imagine Netflix presenting the user with two buttons: “Play” and “Play and Share.” Sharing would then require exactly as much work as playing, satisfying McGeveran’s Law. But having only a “Play” button that then automatically shares the fact that you just watched Dumb and Dumberer distinctly fails the Law because it does not “secure genuine consent.” As Bill points out, his Law of Friction is tied to the technology in use, and thus is flexible enough to be useful even as the technology and its user interfaces change.
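To make the heuristic concrete, here is a minimal sketch of the two designs, assuming hypothetical play and share functions (nothing here is Netflix’s or Facebook’s actual API):

```python
# Minimal sketch of McGeveran's Law of Friction in a player UI: sharing must
# take at least as much deliberate effort as the underlying act itself.
# All names are hypothetical stand-ins, not any real service's API.

def play(title: str) -> None:
    print(f"Now playing: {title}")

def share_to_feed(title: str) -> None:
    print(f"Shared to your feed: {title}")

def on_button_press(button: str, title: str) -> None:
    if button == "play":
        play(title)                  # one click, no disclosure
    elif button == "play_and_share":
        play(title)
        share_to_feed(title)         # still one click, but the user chose it
    # Deliberately absent: a path that shares as a side effect of plain "play".
    # That design would fail the Law, since sharing would take less effort
    # than the act being shared.

on_button_press("play", "Yoga for Health")
on_button_press("play_and_share", "The Amazing Race")
```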

I like it.


November 13, 2013

Protecting library privacy with a hard opt-in

Marshall Breeding gave a talk today to the Harvard Library system as part of its Discoverability Day. Marshall is an expert in discovery systems, i.e., technology that enables library users to find what they need and what they didn’t know they needed, across every medium and metadata boundary.

It’s a stupendously difficult problem, not least because the various providers of the metadata about non-catalog items — journal articles, etc. — don’t cooperate. On top of that, there’s a demand for “single searchbox solutions,” so that you can not only search everything the Googley way, but the results that come back will magically sort themselves in the order of what’s most useful to you. To bring us closer to that result, Marshall said that systems are beginning to use personal profiles and usage data. The personal profile lets the search engine know that you’re an astronomer, so that when you search for “mercury” you’re probably not looking for information about the chemical, the outboard motor company, or Queen. The usage data will let the engine sort based on what your community has voted on with its checkouts, recommendations, etc.

Marshall was careful to stipulate that using profiles or usage data will require user consent. I’m very interested in this because the Library Innovation Lab where I work has created an online library browser — StackLife — that sorts results based on a variety of measures of Harvard community usage. StackLife computes a “stackscore” based on a simple calculation of the number of checkouts by faculty, grad students or undergrads, how many copies are in Harvard’s 73 libraries, and potentially other metrics such as how often it’s put on reserve or called back early. The stackscores are based on 10-year aggregates without any personal identifiers, and with no knowledge of which books were checked out together. And our Awesome Box project, now in more than 40 libraries, provides a returns box into which users can deposit books that they thought were “awesome,” generating particularly delicious user-based (but completely anonymized) data.
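The post doesn’t give the actual formula, but the flavor of that kind of scoring is easy to sketch. Everything below, including the weights and field names, is an illustrative assumption rather than StackLife’s real implementation:

```python
# Illustrative sketch of a StackLife-style "stackscore": a ranking signal
# computed only from 10-year aggregate counts, with no personal identifiers.
# Weights and fields are assumptions, not the Library Innovation Lab's code.

from dataclasses import dataclass

@dataclass
class UsageAggregate:
    faculty_checkouts: int     # totals over the aggregation window
    grad_checkouts: int
    undergrad_checkouts: int
    copies_held: int           # copies across the library system
    reserve_placements: int
    early_recalls: int

WEIGHTS = {
    "faculty_checkouts": 3.0,
    "grad_checkouts": 2.0,
    "undergrad_checkouts": 1.0,
    "copies_held": 0.5,
    "reserve_placements": 1.5,
    "early_recalls": 1.0,
}

def stackscore(agg: UsageAggregate) -> float:
    """Combine anonymized aggregates into a single score for sorting results."""
    return sum(weight * getattr(agg, name) for name, weight in WEIGHTS.items())

book = UsageAggregate(faculty_checkouts=12, grad_checkouts=30,
                      undergrad_checkouts=55, copies_held=4,
                      reserve_placements=6, early_recalls=2)
print(stackscore(book))
```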

Marshall is right: usage data is insanely useful for a community, and I’d love for us to be able to get our hands on more of it. But I got into a Twitter discussion with Mark Ockerbloom [twitter:jmarkockerbloom] and John Wilbanks [twitter:wilbanks], two people I greatly respect, about the danger of re-identification, and I agree that a simple opt-in isn’t enough, because people may not fully recognize the possibility that their info may be made public. So, I had an idea.

Suppose you are not allowed to do a “soft” opt-in, by which I mean an opt-in that requires you to read some terms and tick a box that permits the sharing of information about what you check out from the library. Instead, you would be clearly told that you are opting in to publishing your check-outs. Not to letting your checkouts be made public if someone figures out how to get them, or even to making your checkouts public to anyone who asks for them. No, you’d be agreeing to having a public page with your name on it that lists your checkouts. This is a service a lot of people want anyway, but the point would be to make it completely clear to you that ticking the checkbox means that, yes, your checkouts are so visible that they get their own page. And if you want to agree to the “soft” opt-in, but don’t want that public page posted, you can’t.

Presumably the library checkout system would allow you to exempt particular checkouts, but by default they all get posted. That would, I think, drive home what the legal language expressed in the “soft” version really entails.
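To make the rule concrete, here is a minimal sketch of the logic the proposal implies. The data model and names are mine, invented for illustration, not any real library system’s:

```python
# Sketch of the "hard" opt-in: checkout data may be shared only if the patron
# has agreed to a public page, under their name, listing their checkouts.
# Individual items can be exempted, but the default is that they are posted.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Patron:
    name: str
    public_page_opt_in: bool = False           # the single, hard switch
    exempted_items: List[str] = field(default_factory=list)
    checkouts: List[str] = field(default_factory=list)

def publishable_checkouts(patron: Patron) -> List[str]:
    """What appears on the patron's public page -- and, by this proposal,
    the only checkout data the library may share at all."""
    if not patron.public_page_opt_in:
        return []                               # there is no "soft" path
    return [item for item in patron.checkouts
            if item not in patron.exempted_items]

alice = Patron(name="Alice", public_page_opt_in=True,
               checkouts=["Moby-Dick", "Yoga for Health"],
               exempted_items=["Yoga for Health"])
print(publishable_checkouts(alice))             # ['Moby-Dick']
```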

 


Here are a couple of articles by Marshall Breeding: 1. Infotoday 2. Digital Shift


September 5, 2013

Pew Internet survey on Net privacy: Most of us have done something about it

Pew Internet has a new study out that shows that most of us have done something to maintain our privacy (or at least the illusion of it) on the Net. Here’s the summary from the report’s home page:

A new survey finds that most internet users would like to be anonymous online, but many think it is not possible to be completely anonymous online. Some of the key findings:

  • 86% of internet users have taken steps online to remove or mask their digital footprints, ranging from clearing cookies to encrypting their email.

  • 55% of internet users have taken steps to avoid observation by specific people, organizations, or the government.

The representative survey of 792 internet users also finds that notable numbers of internet users say they have experienced problems because others stole their personal information or otherwise took advantage of their visibility online. Specifically:

  • 21% of internet users have had an email or social networking account compromised or taken over by someone else without permission.

  • 12% have been stalked or harassed online.

  • 11% have had important personal information stolen such as their Social Security Number, credit card, or bank account information.

  • 6% have been the victim of an online scam and lost money.

  • 6% have had their reputation damaged because of something that happened online.

  • 4% have been led into physical danger because of something that happened online.

You can read the whole thing online or download the pdf, for free. Thank you, Pew Internet!


March 28, 2013

[berkman] Dan Gillmor on living off the privacy grid

Dan Gillmor is giving a Berkman lunchtime talk about his Permission Taken project. Dan, who has been very influential on my understanding of tech and has become a treasured friend, is going to talk about what we can do to live in an open Internet. He begins by pointing to Jonathan Zittrain’s The Future of the Internet and Rebecca MacKinnon’s Consent of the Networked [two hugely important books].

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

He says that the intersection of convenience and freedom is narrowing. He goes through a “parade of horribles” [which I cannot keep up with]. He pauses on Loic Le Meur’s [twitter:loic] tweet: “A friend working for Facebook: ‘we’re like electricity.'” If that’s the case, Dan says, we should maybe even think about regulation, although he’s not a big fan of regulation. He goes through a long list of what apps ask permission to do on your mobile. His example is Skype. It’s a long list. Bruce Schneier says when it comes to security, we’re heading toward feudalism. Also, he says, Skype won’t deny it has a backdoor. “You should assume they do,” he says. The lock-in is getting tighter and tighter.

We do this for convenience. “I use a Kindle.” It makes him uncomfortable but it’s so hard to avoid lock-in and privacy risks. The fight against SOPA/PIPA was a good point. “But keep in mind that the copyright cartel is a well-funded smart group of people who never quit.” He says that we certainly need better laws, rules, and policies. “That’s crucial.” But his question this afternoon is what we as individuals can do. Today he’s going to focus on security countermeasures, although they’re not enough. His project, which might become a book, will begin simply, because it’s aimed at the broad swath of people who are not particularly technically literate.

“Full disk encryption should be the default. It’s not. Microsoft charges extra for it. Mac makes it pretty easy. So does Ubuntu.”

Disable intrusive browser extensions.

Root your phone. That’s not perfect. E.g., it makes you vulnerable to some attacks. But the tradeoff is that you now control your phone.

Dan blocks apps from particular permissions. Sometimes that keeps the app from working. “I accept that.” This is a counter to vendors insisting that you give them all the rights.

Use Tor [The Onion Router], even though he assumes “some of the exit nodes” are being run by the CIA. Tor, he explains, is a way of browsing the Web with some reasonable likelihood that your ISP doesn’t know what you’re actually looking at, and that what you’re looking at doesn’t know where you’re coming from. This, he says, is important for whistleblowers, etc.
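As an aside (not something Dan showed), routing ordinary web requests through a locally running Tor client is roughly this simple. The sketch assumes Tor is listening on its default SOCKS port, 9050, and that the PySocks extra for requests is installed (pip install requests[socks]):

```python
# Send a request through a local Tor client via its SOCKS proxy.
# Assumes a Tor daemon on 127.0.0.1:9050 (the default SOCKS port).

import requests

TOR_PROXY = "socks5h://127.0.0.1:9050"  # socks5h: DNS lookups also go through Tor

session = requests.Session()
session.proxies = {"http": TOR_PROXY, "https": TOR_PROXY}

# check.torproject.org reports whether the request arrived from a Tor exit node.
resp = session.get("https://check.torproject.org/", timeout=60)
print("Congratulations" in resp.text)   # True if the page says you're using Tor
```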

When loyalty cards came out, he and his friend used to randomly swap them to make the data useless. The last time he got one, he filled in his address as 1600 Pennsylvania Ave., and the guy in the store said, “It’s amazing how many people live there.” If you use a false address with a card, it may not work. If you do it online, you’re committing a felony under the Computer Fraud and Abuse Act. The revisions are going in the wrong direction. “This is terrifying…We have to do something collectively.”

Pick your platform carefully. “I was the biggest Apple person around…I was a Mac bigot for years.” At press events, he’d be the only person (besides John Markoff) to have a Mac. Many things happened, including Apple suing websites wanting to do journalism about Apple. Their “control freakery” and arrogance with the iPhone was worse. “Now that everyone except me at a press event has a Mac, I get worried.” Now the Mac is taking on the restrictions of the iPhone operating system (iOS). “I want to do what I want with my own computer.” All computer makers are moving to devices that you can’t even open. “Everyone wants to be Apple.”

Own your own domain. Why are journalists putting their work on Facebook or other people’s platforms? Because it brings distribution and attention. “We do these things on ‘free’ platforms at their sufferance.” “We all should have a place on the Web that is owned by us,” even if we don’t do most of our work there. Dan is going to require students to get their own domain name.

Dan says his book/project is going to present a gradient of actions. At the further end, there’s Linux. Dan switched last year and has found it almost painless. “No one should have to use the command line if they don’t want to,” and Linux isn’t perfect about that yet. “Even there it’s improving.” He says all the major distributions are pretty. He uses Ubuntu. “Even there there’s some control-freakery going on.” Dan says he tried Linux every year for 10 years, and now he finds it “ready for prime time.” He says some control features being introduced to Windows, for reasonable reasons, are making life harder for Linux users. [I’m not sure what he’s referring to.]

Dan says the lockdown is caused by self-interest, not good vs. evil. He hopes that we can start to make the overlap of convenience and freedom larger and larger.

Q&A

Q: If you should have your own domain, you should also do your own hosting, run your own Apache server, etc.

A: You can’t be independent of all external services unless you really want that. There’s a continuum here. My hosting is done by someone I know personally. We really need systematic and universal encryption in the cloud, so whoever is storing your stuff can’t muck with it unless you give them permission. That raises legal questions for them.

Q: I really like what you’re saying. I’m not a specialist and it sounds like a conversation among a very small number of people who are refined specialists in this area. How do you get this out and more accessible? Could this be included in basic literacy in our public schools? On the other hand, I worry there’s a kind of individualism: You know how to do it, so you get to do it, but the rest don’t. How do we build a default position for people who can’t manage this for themselves.

A: Yes, I worry that this is for geeks. But I’m not aiming this project at geeks. It’s more aimed at my students, who have grown up thinking Facebook is the Internet and that the MacBook Air gives them complete freedom [when in fact it can’t be opened and modified]. The early chapters will be on what you can do with whatever it is that you use. It won’t solve the problem, but it will help. And then take people up a ramp to get them as far as they’re comfortable going. In really clear language, I hope. And it’d be a fine idea to make this part of digital literacy education. I’m a huge fan of CodeAcademy; Douglas Rushkoff wrote a wonderful book called “Program or Be Programmed,” and I think it does help to know some of this. [See Diana Kimball’s Berkman Talk on coding as a liberal art.] It’s not going to be in big demand any time soon. But I hope people can see what’s at risk, what they’re losing, and also what they gain by being locked down.

Q: Do you think freedom and convenience will grow further apart? What are the major factors?

A: Overall, the bad direction is still gaining. That’s why I’m doing this. I don’t think people are generally aware of the issues. It’ll help if we can get word out about what’s at risk and what the choices are. If people are aware of the issues and are fine with giving up their freedom, that’s their choice. We’ve been trading convenience for the illusion of security. “We put our hands up in scanners as if we’re being frisked.” There’s more money and power on the control side. Every major institution is aligned on the same side of this: recentralizing the technology that promised radical decentralization. That’s a problem. I’m going to try to convince people to use tech that doesn’t do that, and to push for better policies, but …

Q: What exactly are you concerned about? I feel free to do anything I want on the Internet. Maybe the govt is managing me. Marketers definitely are. I worry about hackers stealing my identity. But what are the risks?

A: “I think a society that is under pervasive surveillance is a deadened society in the long run.” It’s bad for us “in every way that I can imagine” except for the possibility that it can stop a certain amount of crime. “But in dictatorships, the chief criminals are the govt and the police, so it doesn’t solve the problem.” The FBI wants a backdoor into every technology. If they get one, it will be used by bad people. This stuff doesn’t stay secret forever. The more you harden the defenses, the more room there is for really bad actors to get in. Those are some of the main reasons.

Q: How can Tor help whistleblowers? Do you have other advice for journalists?

A: I have a chapter in a book that’s coming out about journalists and closed platforms. Journalists need to learn about security right away because they’re putting the lives of their sources at risk. The Committee to Protect Journalists has done important work on helping journalists understand the risks and mitigate them. It’s a crucial issue that hasn’t gotten enough attention inside the craft, although I had my PGP signature at the bottom of my column for 6 years and got 2 emails that used it, one of them from someone who just wanted to know if it worked. Also, you should be aware that you can’t anticipate every risk. E.g., if the US govt wants to find out what I’m talking about online, they’ll figure out a way to do it. They could break into my house and put up cameras. But just as a better deadbolt lock stops amateur criminals, better security measures will discourage some intrusions. When I do my online banking, I do it from a virtual machine that I use only for that; it has never gone anywhere else on the Internet. I don’t think that’s totally paranoid. There are still risks.

Q: The Supreme Court just affirmed first sale of materials manufactured outside of the US. Late-stage capitalism wants to literally own its markets, offline as well as online. How much of that wider context do you want to get into?

A: If the Court hadn’t affirmed first sale, every media producer would have moved all their production facilities offshore so that we wouldn’t be able to resell it. These days we buy licenses, not goods. Increasingly, physical goods will have software components. That’s an opportunity for the control crowd to keep you from owning anything you buy. In Massachusetts, the car repair shops got a ballot measure saying they get access to the software in cars; that was marvelous. BTW, I’m making common cause with some friends on the Right. Some of the more far-seeing people on the Right are way ahead in thinking about this. E.g., Derek Khanna. I will be an ally of anybody.

Q: [harry lewis] Great project. Here’s your problem: What are you worried about? This is a different sort of surveillance society. This is the opposite of the Panopticon, where everyone knows they’re being spied upon. People won’t be motivated until there are breaches. The incentive of the surveillors is to do it as unobtrusively as possible. You’ll never know why your life insurance premium is $100 higher than mine. You won’t ever see the data paths that led to that, because the surveillance will be happening at a level that will be completely invisible to the individual. It’ll be hard to wake people up. “A surveillance society is a deadened society” only if people know they’re being surveilled.

A: If they don’t see a consequence, then they won’t act. If the govt a generation ago had told you that you will henceforth carry a tracking device so we can know where you are at any time, there would have been an uproar. But we did it voluntarily [holding up a mobile phone]. The cell tower has to know where you are, but I’d like to find a way to spoof everything else for everyone else. (You should assume your email is being read on your employer’s server, Dan says.)

Q: I worry about creating a privacy of the elite that only a small segment can access. That creates a dangerous scenario. Should there be govt regulations to make sure we’re all operating with the same levels of privacy?

A: It’s an important point. The govt rules won’t be the ones you want. We need to create market-based solutions. Markets work better than advice or edicts.

Q: But hasn’t the market spoken, and it’s the iPhone?

A: The iPhone has important security features. But people aren’t scared enough to create a market.

A: The ACLU should be advised on how to create pamphlets that will reach people.

Q: So much of hacker culture and open source culture is based on things being difficult. Many of the privacy tools work but are too hard to use. There is a distinct lack of design, and we don’t see poorly designed things as legitimate. And that’s a fairly easy thing to fix.

A: Yes.

Q: Younger people don’t seem to care about privacy. Is there a generational shift?

A: There are two possibilities for the future. My hope is that we’ll all start cutting each other more slack; everyone will recognize that we all did unbelievably stupid, even possibly criminal things, in our 20s. I still do plenty of stupid things. But it worries me that cultures sometimes grow less tolerant. This could be catastrophic, if the country goes toward the Right.

Q: There are tools to make it easy to do this. E.g., CryptoParty.org, the Pirate Party. And are there alternatives to social media that are ready for prime time?

A: Still pretty geeky, but it’s a wonderful start. But many of the tools cost money.

Q: Any thoughts about ways to use govt and corporate interests to promote your goals? E.g., protect the children.

A: I’ll rename this Protect the Children and then everyone will do what I want :) Overall, the problem is that power is shifting, pulling back into the center. This has long term negative consequences. But speculating on what the consequences will be is never as effective as showing what’s going wrong now. I want the power to be distributed. “I’m pretty worried, although I’m a relentless optimist.” “I’m a resister.”


November 26, 2012

Petition: Update the Electronic Communications Privacy Act

Jonathan Kamens has started a petition to the White House to update the Electronic Communications Privacy Act (ECPA). Here’s the nut of his explanation:

The Electronic Communications Privacy Act protects the privacy of your email, requiring law-enforcement authorities to show probable cause and obtain a search warrant before they can read it. Except it doesn’t, really, because under the ECPA, any email left “on the server” for more than 180 days is considered “abandoned,” and any prosecutor in the country can get access to it simply by signing a letter requesting such access.

The law was written in the days when people’s email stayed on the server only until they downloaded it from there to their desktop computer over a slow modem. Nowadays, however, virtually everybody leaves their email “on the server” so that it can be accessed from anywhere on any device. So virtually everybody’s old email is accessible to law-enforcement authorities without a search warrant. This is horrendously unacceptable, and this petition calls on Congress to amend the law and on the Obama administration to support and push for such an amendment.

It turns out that I am less concerned about privacy than are most of my friends (not you, Jeff!), but this petition makes complete sense to me. The ECPA is a textbook example of a law that’s been outstripped by technology.


July 2, 2012

How I became a creepy old man

I was checking Facebook yesterday afternoon, as I do regularly every six months or so. It greeted me with a list of friend requests. One was from the daughter of a colleague. So I accepted on the grounds that it was unexpected but kind of cute that she would ask.

Only after I clicked did I realize that the list was not of requests but of suggestions for people I might want to friend. So, now the daughter of a colleague has received a friend request from a 61-year-old man she’s never heard of, and I’m probably going to end up on the No Fly list.

The happy resolution: I contacted my colleague to let him know, and he took it as an opportunity to have a conversation with his daughter about how to handle friend requests from people she doesn’t know, especially pervy-looking old men.

