Joho the Blog » security

November 17, 2013

Noam Chomsky, security, and equivocal information

Noam Chomsky and Barton Gellman were interviewed at the Engaging Big Data conference put on by MIT’s Senseable City Lab on Nov. 15. When Prof. Chomsky was asked what we can do about government surveillance, he reiterated his earlier call for us to understand the NSA surveillance scandal within an historical context that shows that governments always use technology for their own worst purposes. According to my liveblogging (= inaccurate, paraphrased) notes, Prof. Chomsky said:

Governments have been doing this for a century, using the best technology they had. I’m sure Gen. Alexander believes what he’s saying, but if you interviewed the Stasi, they would have said the same thing. Russian archives show that these monstrous thugs were talking very passionately to one another about defending democracy in Eastern Europe from the fascist threat coming from the West. Forty years ago, RAND released Japanese docs about the invasion of China, showing that the Japanese had heavenly intentions. They believed everything they were saying. I believe this is universal. We’d probably find it for Genghis Khan as well. I have yet to find any system of power that thought it was doing the wrong thing. They justify what they’re doing for the noblest of objectives, and they believe it. The CEOs of corporations as well. People find ways of justifying things. That’s why you should be extremely cautious when you hear an appeal to security. It literally carries no information, even in the technical sense: it’s completely predictable and thus carries no info. I don’t doubt that the US security folks believe it, but it is without meaning. The Nazis had their own internal justifications. [Emphasis added, of course.]

I was glad that Barton Gellman — hardly an NSA apologist — called Prof. Chomsky on his lumping of the NSA with the Stasi, for there is simply no comparison between the freedom we have in the US and the thuggish repression omnipresent in East Germany. But I was still bothered, albeit by a much smaller point. I have no serious quarrel with Prof. Chomsky’s points that government incursions on rights are nothing new, and that governments generally (always?) believe they are acting for the best of purposes. I am a little bit hung-up, however, on his equivocating on “information.”

Prof. Chomsky is of course right in his implied definition of information. (He is Noam Chomsky, after all, and knows a little more about the topic than I do.) Modern information is often described as a measure of surprise. A string of 100 alternating ones and zeroes conveys less information than a string of 100 bits that are less predictable, for if you can predict with certainty what the next bit will be, then you don’t learn anything from that bit; it carries no information. Information theory lets us quantify how much information is conveyed by streams of varying predictability.
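The technical sense of “information” can be made concrete with a few lines of code. The sketch below is purely illustrative (the function name and the two-bit block model are my own choices, not anything from the interview): measuring Shannon entropy over two-bit blocks, the fully predictable alternating string scores zero bits of surprise, while a coin-flip string scores near the two-bit-per-block maximum.

```python
import math
import random
from collections import Counter

def block_entropy(bits, k=2):
    """Shannon entropy in bits per non-overlapping k-bit block:
    H = -sum(p * log2(p)) over the observed block frequencies."""
    blocks = [bits[i:i + k] for i in range(0, len(bits) - k + 1, k)]
    n = len(blocks)
    counts = Counter(blocks)
    # "+ 0.0" normalizes the float -0.0 that arises when H is exactly zero
    return -sum((c / n) * math.log2(c / n) for c in counts.values()) + 0.0

alternating = "01" * 50  # 100 alternating bits: perfectly predictable
random.seed(0)           # fixed seed so the sketch is reproducible
coinflips = "".join(random.choice("01") for _ in range(100))

print(block_entropy(alternating))  # 0.0 -- every block is "01", no surprise
print(block_entropy(coinflips))    # near 2.0, the per-block maximum
```

In these terms, a message you can predict with certainty is the alternating string: whatever its social meaning, it adds zero bits.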

So, when U.S. security folks say they are spying on us for our own security, are they saying literally nothing? Is that claim without meaning? Only in the technical sense of information. It is, in fact, quite meaningful, even if quite predictable, in the ordinary sense of the term “information.”

First, Prof. Chomsky’s point that governments do bad things while thinking they’re doing good is an important reminder to examine our own assumptions. Even the bad guys think they’re the good guys.

Second, I disagree with Prof. Chomsky’s generalization that governments always justify surveillance in the name of security. For example, governments sometimes record traffic (including the movement of identifiable cars through toll stations) with the justification that the information will be used to ease congestion. Tracking the position of mobile phones has been justified as necessary for providing swift EMT responses. Governments require us to fill out detailed reports on our personal finances every year on the grounds that they need to tax us fairly. Our government hires a fleet of people every ten years to visit us where we live in order to compile a census. These are all forms of surveillance, but in none of these cases is security given as the justification. And if you want to say that these other forms don’t count, I suspect it’s because it’s not surveillance done in the name of security…which is my point.

Third, governments rarely cite security as the justification without specifying what the population is being secured against; as Prof. Chomsky agrees, that’s an inherent part of the fear-mongering required to get us to accept being spied upon. So governments proclaim over and over what threatens our security: Spies in our midst? Civil unrest? Traitorous classes of people? Illegal aliens? Muggers and murderers? Terrorists? Thus, the security claim isn’t made on its own. It’s made with specific threats in mind, which makes the claim less predictable — and thus more informational — than Prof. Chomsky says.

So, I disagree with Prof. Chomsky’s argument that a government that justifies spying on the grounds of security is literally saying something without meaning. Even if it were entirely predictable that governments will always respond “Because security” when asked to justify surveillance — and my second point disputes that — we wouldn’t treat the response as meaningless but as requiring a follow-up question. And even if the government just kept repeating the word “Security” in response to all our questions, that very act would carry meaning as well, like a doctor who won’t tell you what a shot is for beyond saying “It’s to keep you healthy.” The lack of meaning in the Information Theory sense doesn’t carry into the realm in which people and their public officials engage in discourse.

Here’s an analogy. Prof. Chomsky’s argument is saying, “When a government justifies creating medical programs for health, what they’re saying is meaningless. They always say that! The Nazis said the same thing when they were sterilizing ‘inferiors,’ and Medieval physicians engaged in barbarous [barber-ous, actually - heyo!] practices in the name of health.” Such reasoning would rule out a discussion of whether current government-sponsored medical programs actually promote health. But that is just the sort of conversation we need to have now about the NSA.

Prof. Chomsky’s repeated appeals to history in this interview cover up exactly what we need to be discussing. Yes, both the NSA and the Stasi claimed security as their justification for spying. But far from that claim being meaningless, it calls for a careful analysis of the claim: the nature and severity of the risk, the most effective tactics to ameliorate that threat, the consequences of those tactics on broader rights and goods — all considerations that comparisons to the Stasi and Genghis Khan obscure. History counts, but not as a way to write off security considerations as meaningless by invoking a technical definition of “information.”


September 5, 2013

Pew Internet survey on Net privacy: Most of us have done something about it

Pew Internet has a new study out that shows that most of us have done something to maintain our privacy (or at least the illusion of it) on the Net. Here’s the summary from the report’s home page:

A new survey finds that most internet users would like to be anonymous online, but many think it is not possible to be completely anonymous online. Some of the key findings:

  • 86% of internet users have taken steps online to remove or mask their digital footprints — ranging from clearing cookies to encrypting their email.

  • 55% of internet users have taken steps to avoid observation by specific people, organizations, or the government.

The representative survey of 792 internet users also finds that notable numbers of internet users say they have experienced problems because others stole their personal information or otherwise took advantage of their visibility online. Specifically:

  • 21% of internet users have had an email or social networking account compromised or taken over by someone else without permission.

  • 12% have been stalked or harassed online.

  • 11% have had important personal information stolen such as their Social Security Number, credit card, or bank account information.

  • 6% have been the victim of an online scam and lost money.

  • 6% have had their reputation damaged because of something that happened online.

  • 4% have been led into physical danger because of something that happened online.

You can read the whole thing online or download the pdf, for free. Thank you, Pew Internet!


December 11, 2011

If credit card companies cared about security…

1. When there’s a security issue, they wouldn’t robocall people and ask them to provide personal information. They would robocall people and ask them to call the number on the back of their cards.

2. They would put people’s photographs on their credit cards. Citi used to offer that as a free option, but apparently has discontinued the practice.


April 7, 2011

Citicard does its best to train us in horrible security practices

Citibank continues to train its customers to use terrible security processes.

This morning I got a call from a robot that claimed to be from Citibank. When I refused to type in my zip code, and then waited for two minutes of repeated requests to do so, it transferred me to a human who wanted me to give him my name, undoubtedly to be followed by a request for my password. Thus does Citibank train its users to divulge personal information to anyone with an automated phone dialer.

This is the same outfit that no longer offers to put a thumbnail photo of you on your credit card, which is a pretty good way to foil card-grabbing bastards. It also used to embed an image of your signature on the front of the card. Again, a cheap and effective prophylactic measure that it no longer offers.

This is also the same outfit that is very happy to sell us monthly services — $10/month last time I looked — that inform us when Citibank has failed to protect us from identity theft.


December 1, 2009

Sprint Nextel informs on its customers 8M times

Chris Soghoian reports:

Sprint Nextel provided law enforcement agencies with its customers’ (GPS) location information over 8 million times between September 2008 and October 2009. This massive disclosure of sensitive customer information was made possible due to the roll-out by Sprint of a new, special web portal for law enforcement officers.

The evidence documenting this surveillance program comes in the form of an audio recording of Sprint’s Manager of Electronic Surveillance, who described it during a panel discussion at a wiretapping and interception industry conference, held in Washington DC in October of 2009.

It is unclear if Federal law enforcement agencies’ extensive collection of geolocation data should have been disclosed to Congress pursuant to a 1999 law that requires the publication of certain surveillance statistics — since the Department of Justice simply ignores the law, and has not provided the legally mandated reports to Congress since 2004.


June 19, 2009

FlyClear: Cutting in line so the terrorists won’t win

At the Reagan Airport (would I be jumping the gun to start calling it the Obama Airport already?), Clear has a little square of space right before the security inspection stations. For $200/year, you can skip the long lines and go for the exceedingly short line to Clear. There the uniformed employees will compare some of your body parts (iris and fingerprints) with the information on the Clear card you present. Once you’re through, you can go straight to the Conveyor of Transparencies where you rejoin the hoi polloi so that the TSA can make sure your shoes aren’t on fire.

What I don’t get is why Clear has to give you an extra special biometric scan. Why can’t they just do what the TSA folks do: Look at your driver’s license, look at you, and wave you on through? All I can figure is that Clear’s market research showed that people would be more willing to pay to cut in line — which is what Clear is really about — if there’s a pretense that it enhances security.

As far as whether all the fancy-shmancy biometrics — heck, my face is the only biometric I need! — actually increases security, if I were an evil do-er, I’d just bribe a Clear airport employee. They don’t go through security clearances the way TSA folks do, at least according to the Clear employee I asked.


June 22, 2009: Clear just went out of business.


June 16, 2009

Google: Make security the default (Now with Iranian tweets)

Chris Soghoian has posted an open letter to Google, asking it to make encryption the default. This is in line with the talk he gave recently at the Berkman Center.

[Update later that day: Two hours after releasing the letter, Google agreed to try setting encryption as the default for a subset of users, as a trial. If it works out, they'll consider expanding it.]

Also, Jonathan Zittrain has posted about why the Iranians have problems blocking Twitter.


May 26, 2009

[berkman] Chris Soghoian on privacy in the cloud

Chris Soghoian is giving a Berkman lunchtime talk called: “Caught in the Cloud: Privacy, Encryption, and Government Back Doors in the Web 2.0 Era,” based on a paper he’s just written. In the interest of time, he’s not going to talk about the “miscreants in government” today.

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

Pew says that “over 69% of Americans use webmail services, store data online, or otherwise use software programs such as word processing applications whose functionality is in the cloud.” Chris’ question: Why have cloud providers failed to provide adequate security for their customers? (“Cloud computing” = users’ data is stored on a company server and the app is delivered through a browser.)

He says that providers are moving to the cloud because they don’t have to worry about privacy. Plus they can lock out troublesome users or countries. It lets them protect patented algorithms. They can do targeted advertising. And they can provide instant updates. Users get cheap/free software, auto revision control, easy collaboration, and worldwide accessibility. Chris refers to “Cloud creep”: the increasing use of cloud computing, its installation on new PCs, etc. Vivek Kundra switched 38,000 DC employees over to Google Docs before he became Federal CIO. “It’s clear he’s Google-crazy.” Many people may not even know they’ve shifted to the cloud. Many cloud apps now provide offline access as well. HTML 5 (Firefox 3.5) provides offline access without even requiring synchronizers such as Google Gears.

Chris says that using a single browser to access every sort of site — from safe to dangerous — is bad practice. Single-site browsers avoid that. E.g., Mozilla Prism keeps its site in its own space. With Prism, you have an icon on your desktop for, e.g., Google Docs. It opens in a browser that can’t go anywhere else; it doesn’t look like a cloud app. “It’s a really cool technology.” Chris uses it for online banking, etc.

Conclusion of Part 1 of Chris’ talk: Cloud services are being used increasingly, and users don’t always know it.

Part 2

We use encryption routinely. SSL/TLS is used by banks, e-commerce, etc. But the cloud providers don’t use SSL for much other than the login screen. Your documents, your spreadsheets, etc., can easily be packet-sniffed. Your authentication cookies can be intercepted. That lets someone log in, modify, delete, or pretend to be you. “This is a big deal.” (The “Cookie Monster” tool lets you hijack authentication cookies. AIMJECT lets you intercept IM sessions; you can even interject your own messages.)
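To make the cookie-interception point concrete: a session cookie set without the Secure attribute is transmitted over plain HTTP, where a packet sniffer can copy and replay it. A minimal sketch using Python’s standard library (the cookie name and value are hypothetical, purely for illustration) of the Set-Cookie attributes that mitigate hijacking:

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session_id"] = "abc123"          # hypothetical session token
cookie["session_id"]["secure"] = True    # browser sends it over HTTPS only
cookie["session_id"]["httponly"] = True  # hidden from page JavaScript

# The header a server would emit; a Secure cookie never crosses the
# wire in cleartext, so a sniffer on open wifi has nothing to replay.
header = cookie["session_id"].OutputString()
print("Set-Cookie:", header)
```

Of course, Secure only helps if the site serves the whole session over HTTPS, which is exactly the default Chris is arguing providers should adopt.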

This problem has been known since August 2007, and all the main cloud providers were notified. It took Google a year to release a fix, and even so it hasn’t been turned on by default. Facebook, Yahoo mail, Microsoft, etc. don’t even offer SSL. Google says it doesn’t turn it on by default because it can slow down your computer, because it has to decrypt your data. But Google does require you to use it for Google Health, because the law requires it. To get SSL for gmail, you have to go 5 levels down to set it.

So, why doesn’t Google provide SSL by default? Because it takes “vastly more processing power,” and thus is very expensive for Google. SSL isn’t a big deal when done on your computer (the client computer), but for cloud computing, it would all fall on Google’s shoulders. “If 100% of Google’s customers opt to use SSL, it sees no new profits, but higher costs.” “And Google is one of the better ones.” The only better one, in Chris’ view, is Adobe, which turns it on by default for its online image editing service. [Here's a page that tells you how to turn on SSL for a Google Accounts account.]

Chris thinks that cloud computing security may be a type of “shrouded attribute,” i.e., an attribute that isn’t considered when making a buying decision. But, Chris says, defaults matter. E.g., if employers opt employees into a 401(k), no one opts out, but if you leave it to employees to opt in, fewer than half do. Facebook, for example, seems to blame the user for not turning privacy features on. “Users should be given safe services by default.”

Part 3: Fixing it

Chris draws analogies to seatbelts and tobacco legislation. He recommends that we go down the cigarette pathway first: Raise public awareness so that people demand mandatory warnings for insecure apps. E.g., “WARNING: Email messages that you write can be read, intercepted or stolen. Click here to turn on protection…” [Chris' version was better. Couldn't type fast enough.]

Or, if necessary, we could pass regulations mandating SSL. The FTC could rule that companies that claim their services are safe are lying.

Q: [me] How much crime does this enable? A: The tools are out there. But there's no data because intercepting packets leaves no traces.

Q: How about OpenID?
A: The issue of authentication cookies is the same.

Q: Should we have a star rating system?
A: Maybe.

Q: The lack of data about the crime is a problem for getting people to act. Maybe you should look at the effect on children: Web sites aimed at children, under-18-year-olds using Facebook…
A: Good idea! Although Google’s terms of service don’t allow people under 18 to use any of their services.

Q: People also feel there’s safety in numbers.

Q: How much more processing power would SSL require from Google?
A: Google custom builds its servers. Adding in a new feature would require crypto-co-processor cards. I don’t think they have those. They’d have to deploy them.

Q: There are GreaseMonkey scripts that require FB to use SSL. Worthwhile?
A: FB won’t accept SSL connections.

Q: Google Chrome’s incognito mode? Does it help with anything?
A: It helps with porn. That cleans up your history, but it doesn’t encrypt traffic.

Q: The vast majority of people where I live don’t lock their house doors. And [says someone else] people don’t lock their mailboxes even though they contain confidential docs.
A: Do you walk around with your ATM PIN number on your forehead? Your bank uses SSL because it’s legally responsible for electronic break-ins, whereas Google isn’t.
A: The risk is small if you’re using a wired ethernet connection or a protected wifi connection.

Q: With seatbelts and smoking, your life’s at risk. For Gmail, the risk seems different. There aren’t data, screaming victims, etc. It makes the demand for regulation harder to stimulate.
A: The analogy doesn’t work 100%. But I think the disanalogy works in my favor: It’s hard to have a cigarette that doesn’t harm you, but it’s easy to have a secure SSL connection.

Q: Shouldn’t business care about this?
A: Yes, CIO’s can make that decision and turn on encryption for the entire org. Consumers have to be their own CIOs.

[from the IRC] Maybe the government wants Google to be insecure to enable snooping.
A: Allow me to put on my tin foil hat. Last year the head of DNI said that the gov’t collects vast amounts of traffic. We don’t know how they’re doing it, which networks they’re collecting data from. If Google and AT&T, etc., turned on SSL by default, the gov’t’s job would be much harder. Google has other reasons to keep SSL off, but it works out to the gov’t’s benefit.

Q: Does Adobe’s online wordprocessor, Buzzword, offer SSL for its docs?
A: Don’t know. [It does]


May 11, 2009

Smart and secure grids and militaries

The piece I wrote about Robin Chase prompted Andrew Bochman to send me an email. Andy is an MIT and DC energy tech guy (and, it turns out, a neighbor) who writes two blogs: The Smart Grid Security Blog and the DoD Energy Blog. Neither of these topics would make it into my extended profile under “Interests,” but I found myself sucked into them (confirming my rule of thumb that everything is interesting if looked at in sufficient detail). So many things in the world to care about!



March 16, 2009

Extra Sensory Keyboard Detection

Researchers have discovered ways to pick up your keystrokes by reading tiny scraps of electromagnetic radiation, or with PS2-connected keyboards, just by plugging into the power grid. It turns out Cryptonomicon wasn’t paranoid enough!



