First panel at HyperPublic conf. Hurriedly typed and not re-read.
NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.
Paul Dourish: Think of privacy not so much as something that people have, but something that people do. “What are people doing when they are doing public, or doing private?” Think of doing privacy as one of the ways of engaging with a group.
And pay attention to the multiple publics we deal with when encountering media objects. When we encounter a media object we think “this is aimed at people like me.” Publics = complicated sameness and difference. For example, for a couple of years, he looked at paroled sex offenders in California who are being tracked with GPS. How do you think about space if you have to first worry about coming within 2,000 feet of a school, library, etc.? That reconfigures the scale at which public space is encountered: since it’s impossible to navigate at a level of 2,000 feet, these people think about which towns are safe for them. Instead of privacy, it helps to think in terms of our accountability to others.
Jonathan Zittrain suggests an iPhone app that maps routes that take account of the sex offenders’ rule of avoiding schools, etc. He also raises the relation of privacy and identity.
Laurent Stalder mentions work on what privacy meant within a house in the 1880s in England. Artifacts were introduced that affected privacy, from sliding doors to doorbells. Then he shows a 2008 floor plan that distinguishes much less between public and private, inside and outside — the rise of a differentiated set of threshold devices. What is the role of the architect when spaces are filled with an endless stream of people, information, fluids…? Laurent points to the continual renegotiation of borders and their consistency. [I had trouble hearing some of the talk; the room does not have good acoustics. Nor do my ears.] In conversation with JZ, Laurent contrasts two Harvard buildings, one of which has a clear inside and outside, and another that has a long transitional state.
John Palfrey says that lawyers are so engaged in the question of privacy because they too are designers, but of rule-sets. Lawyers have not done a great job in determining which rule-set about privacy will enable us to thrive. He makes three points: 1. The importance of human experience in these spaces. We are public by default, he says, crediting danah boyd. We’re learning that though we often trade convenience for control, we care about it in particular contexts, a changing set of practices. 2. The old tools haven’t worked well for us with privacy. E.g., the 4th Amendment doesn’t fit the cyber world well. 3. The systems that tend to work best are highly interoperable with one another; we don’t want to type the same info into multiple systems. Open, interoperable systems succeed. But that gives rise to privacy problems. We need places — breakwalls — where the data can be either slowed or stopped.
JZ points out that JP is, like Laurent, talking about having long thresholds.
JZ imagines a world in which many people “lifestream” their lives and we are able to do a query to see who was where at just about any time. That makes Google StreetView’s photo-ing of houses seem like nothing, he says.
In response to Jeff Jarvis’ question, Paul reminds us that the social takes up the architectural, so that the same threshold space (or any space) can take on different privacy norms for different cultures and sub-cultures.
JZ: Architectural spaces last for decades or centuries. Online spaces can be reconfigured easily. The “house” you moved into can be turned into something different by the site’s owners. E.g., Facebook tinkers with the space you use by changing
Q: What is the purpose of the threshold?
Laurent: Connection and separation
Q: Don’t we want some type of digital threshold that does the job of introducing, transitioning, informing, etc. “You keep some of where you were in where you are.” The lack of that affects identity and more.
JZ: You can imagine a web site that shows you where other people are visiting from. “Wow, a lot of folks are coming from AOL. This must not be a cool site.” :)
Paul: It’s important to historicize sites appropriately so we understand where they came from.
Me: It’s possible to misuse architectural spaces, because architecture is always intensely local. So, will privacy norms ever settle down in the global Web?
The invention of the chimney enabled privacy in homes, as opposed to a central fire. [Having trouble hearing] Will the poor not have Internet privacy, while the affluent do?
As important as the Net spaces are the spaces in which people use the Net. E.g., Net cafes in the developing world. Access and capital change publicness and privacy.
Paul: In China, people go to public spaces to play online games. (He says that they consider World of Warcraft as a Chinese game in its values.) There certainly won’t be global agreements about privacy norms. Nor does there have to be, because your encounters with them always occur in local settings.
JZ: And within these spaces can be communities with their own norms.
Categories: liveblog Tagged with: architecture • hyperpublic • privacy • public Date: June 10th, 2011 dw
Urs Gasser opens the conference by reflecting on the recent Swiss court decision that Google StreetView has to go to extraordinary lengths to obscure personal identifiers (like faces and racial identity), especially in front of “sensitive” public areas.
Judith Donath points to our increased discomfort with the new lines between public and private: we are in an environment where it’s not only hard to separate the two, but where the old well-defined norms no longer work. This is not solely a problem of the online world, she points out. How do we understand this new public as designers? In the online world, you can be public while sitting alone in your room.
She says she was one of the initiators of this interdisciplinary conference (also: Jeff Huang) because different fields have different ideas of what is desirable. E.g., lawyers traditionally think that privacy is a goal in itself.
Judith says we’re looking at this topic at a time in human history when we’ve had an almost unprecedented amount of privacy; e.g., we are more mobile and thus can shed our prior public selves. We have also been more isolated and alienated: we can live without engaging with others, in a city of strangers, in a workplace where all our ties are weak, etc.
She reminds us that during the day we should be thinking about how what we learn can be applied to help build a better civil society.
Categories: culture Tagged with: hyperpublic • privacy • public Date: June 10th, 2011 dw
The upcoming HyperPublic conference has posted a provocation I wrote a while ago but didn’t get around to posting, on rebooting library privacy now that we’re in the age of social networks. (Ok, so the truth is that I didn’t post it because I don’t have a lot of confidence in it.) Here’s the opening couple of subsections:
Why library privacy matters
Without library privacy, individuals might not engage in free and open inquiry for fear that their interactions with the library will be used against them.
Library privacy thus establishes libraries as a sanctuary for thought, a safe place in which any idea can be explored.
This in turn establishes the institution that sponsors the library — the town, the school, the government — as a believer in the value of free inquiry.
This in turn establishes the notion of free, open, fearless inquiry as a social good deserving of support and protection.
Thus, the value of library privacy scales seamlessly from the individual to the culture.
Privacy among the virtues
Library privacy therefore matters, but it has never been the only or even the highest value supported by libraries.
The privacy libraries have defended most strictly has been privacy from the government. Privacy from one’s neighbors has been protected rather loosely by norms, and by policies inhibiting the systematic gathering of data. For example, libraries do not give each user a private reading booth with a door and a lock; they thus tolerate less privacy than provided by a typical clothing store changing room or the library’s own restrooms. Likewise, few libraries enforce rules that require users to stand so far apart on check-out lines that they cannot see the books being carried by others. Further, few libraries cover all books with unlabeled gray buckram to keep them from being identifiable in the hands of users.
Privacy from neighbors has been less vigorously enforced than privacy from government agents because neighborly violations of privacy are perceived to be less consequential, and because there are positive values to having shared social spaces for reading.
While privacy has been a very high value for libraries, it has never been an absolute value, and is shaded based on norms, convenience, and circumstance.
more…
Categories: libraries Tagged with: dpla • libraries • privacy Date: May 19th, 2011 dw
Paul Ohm (law prof at U of Colorado Law School — here’s a paper of his) moderates a panel among those with lots of data. Panelists: Jessica Staddon (research scientist, Google), Thomas Lento (Facebook), Arvind Narayanan (post-doc, Stanford), and Dan Levin (grad student, U of Mich).
Dan Levin asks what Big Data could look like in the context of law. He shows a citation network for a Supreme Court decision. “The common law is a network,” he says. He shows a movie of the citation network of first thirty years of the Supreme Court. Fascinating. Marbury remains an edge node for a long time. In 1818, the net of internal references blooms explosively. “We could have a legalistic genome project,” he says. [Watch the video here.]
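Dan’s “the common law is a network” idea can be sketched in a few lines: treat decisions as nodes, citations as directed edges, and ask which decisions accumulate the most incoming citations. The case names below are real landmark cases, but the edges are invented for illustration, not an actual citation dataset.

```python
# Minimal sketch: the common law as a directed citation graph.
# Edges run from the citing decision to the cited decision.
from collections import defaultdict

citations = [
    ("Fletcher v. Peck", "Marbury v. Madison"),
    ("McCulloch v. Maryland", "Marbury v. Madison"),
    ("Cohens v. Virginia", "Marbury v. Madison"),
    ("Cohens v. Virginia", "McCulloch v. Maryland"),
]

def most_cited(edges):
    """Count incoming citations and return the most-cited decision."""
    indegree = defaultdict(int)
    for citing, cited in edges:
        indegree[cited] += 1
    return max(indegree, key=indegree.get)

print(most_cited(citations))  # -> Marbury v. Madison
```

A real “legal genome” analysis would run this over the full corpus of reported decisions, watching the in-degree distribution evolve over time, as Dan’s video does.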
What will we be able to do with big data?
Thomas Lento (Facebook): Google flu tracking. Predicting via search terms.
Jessica Staddon (Google): Flu tracking works pretty well. We’ll see more personalization to deliver more relevant info. Maybe even tailor privacy and security settings.
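At its simplest, the flu-tracking idea the panelists mention reduces to fitting a relationship between flu-related search-query volume and reported cases, then predicting cases from new query volumes. A toy sketch with invented numbers (the actual Flu Trends model used many query terms and far more careful fitting):

```python
# Toy version of the Flu Trends idea: ordinary least squares
# relating weekly query volume to reported flu cases.
def fit_line(xs, ys):
    """Fit y = a*x + b by least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

queries = [100, 200, 300, 400]  # weekly flu-related search volume (invented)
cases = [10, 20, 30, 40]        # reported flu cases (invented)

a, b = fit_line(queries, cases)
print(round(a * 250 + b))  # predicted cases for 250 queries -> 25
```

The point Helen Nissenbaum raises later is visible even here: nothing in this aggregate model requires knowing which individual typed which query.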
Dan: If someone comes to you as a lawyer and asks if she has a case, you’ll do a better job deciding if you can algorithmically scour the PACER database of court records. We are heading for a legal informatics revolution.
Thomas: Imagine someone could tell you everything about yourself, and cross ref you with other people, say you’re like those people, and broadcast it to the world. There’d be a high potential for abuse. That’s something to worry about. Further, as data gets bigger, the granularity and accuracy of predictions gets better. E.g., we were able to beat the polls by doing sentiment analysis of msgs on Facebook that mention Obama or McCain. If I know who your friends are and what they like, I don’t actually have to know that much about you to predict what sort of ads to show you. As the computational power gets to the point where anyone can run these processes, it’ll be a big challenge…
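The sentiment analysis Thomas describes can be caricatured as word counting: tally positive versus negative words in messages that mention each candidate. The word lists and messages below are invented for illustration; the real Facebook analysis was far more sophisticated.

```python
# Toy sentiment tally: score each candidate by positive minus
# negative words in messages mentioning them. Lists are invented.
POSITIVE = {"great", "love", "hope", "win"}
NEGATIVE = {"bad", "hate", "fail", "lose"}

def sentiment_score(messages, candidate):
    score = 0
    for msg in messages:
        words = set(msg.lower().split())
        if candidate.lower() in words:
            score += len(words & POSITIVE) - len(words & NEGATIVE)
    return score

msgs = [
    "I love what Obama said today",
    "Obama will win this",
    "McCain had a bad night",
]
print(sentiment_score(msgs, "Obama"))   # -> 2
print(sentiment_score(msgs, "McCain"))  # -> -1
```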
Jessica: Companies have a heck of a lot to lose if they abuse privacy.
Helen Nissenbaum: The harm isn’t always to the individual. It can be harm to the democratic system. It’s not about the harm of getting targeted ads. It’s about the institutions that can be harmed. Could someone explain to me why to get the benefits of something like the Flu Trends you have to be targeted down to the individual level?
Jessica: We don’t always need the raw data for doing many types of trend analysis. We need the raw data for lots of other things.
Arvind: There are misaligned incentives everywhere. For the companies, it’s collect data first and ask questions yesterday; you never know what you’ll need.
Thomas: It’s hard to understand the costs and benefits at the individual level. We’re all looking to build the next great iteration or the next great product. The benefits of collecting all that data is not clearly defined. The cost to the user is unclear, especially down the line.
Jessica: Yes, we don’t really understand the incentives when it comes to privacy. We don’t know if giving users more control over privacy will actually cost us data.
Arvind describes some of his work on re-identification, i.e., taking anonymized data and de-anonymizing it. (Arvind worked on the deanonymizing of Netflix records.) Aggregation is a much better way of doing things, although we have to be careful about it.
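The linkage attack behind such re-identification work can be sketched simply: match an “anonymized” record against auxiliary public data by similarity of sparse attributes, here sets of rated movies. The data is invented, and the real Netflix attack scored ratings and dates with a weighted statistical measure, not plain set overlap.

```python
# Simplified linkage attack: find the public profile most similar
# to an anonymized record. All data here is invented.
def similarity(a, b):
    """Jaccard similarity between two sets of items."""
    return len(a & b) / len(a | b)

anonymous_record = {"Movie A", "Movie B", "Movie C", "Movie D"}

public_profiles = {
    "alice": {"Movie A", "Movie B", "Movie C"},
    "bob": {"Movie X", "Movie Y"},
}

best = max(public_profiles,
           key=lambda name: similarity(anonymous_record,
                                       public_profiles[name]))
print(best)  # -> alice
```

The lesson Arvind draws: because high-dimensional data like movie ratings is so sparse, even a handful of matching attributes can single a person out, which is why he favors aggregation over releasing “anonymized” row-level data.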
Q: In other fields, we hear about distributed innovation. Does big data require companies to centralize it? And how about giving users more visibility into the data they’ve contributed — e.g., Judith Donath’s data mirrors? Can we give more access to individuals without compromising privacy?
Thomas: You can do that already at FB and Google. You can see what your data looks like to an outside person. But it’s very hard to make those controls understandable. There are capital expenditures to be able to do big data processing. So, it’ll be hard for individuals, although distributed processing might work.
Paul: Help us understand how to balance the costs and benefits? And how about the effect on innovation? E.g., I’m sorry that Netflix canceled round 2 of its contest because of the re-identification issue Arvind brought to light.
Arvind: No silver bullets. It can help to have a middleman, which helps with the misaligned incentives. This would be its own business: a platform that enables the analysis of data in a privacy-enabled environment. Data comes in one side. Analysis is done in the middle. There’s auditing and review.
Paul: Will the market do this?
Jessica: We should be thinking about systems like that, but also about the impact of giving the user more controls and transparency.
Paul: Big Data promises vague benefits — we’ll build something spectacular — but that’s a lot to ask for the privacy costs.
Paul: How much has the IRB (institutional review board) internalized the dangers of Big Data and privacy?
Daniel: I’d like to see more transparency. I’d like to know what the process is.
Arvind: The IRB is not always well suited to the concerns of computer scientists. Maybe the current monolithic structure is not the best way.
Paul: What mode of solution of privacy concerns gives you the most hope? Law? Self-regulation? Consent? What?
Jessica: The one getting the least attention is the data itself. At the root of a lot of privacy problems is the need to detect anomalies. Large data sets help with this detection. We should put more effort into turning the data around to use it for privacy protection.
Paul: Is there an incentive in the corporate environment?
Jessica: Google has taken some small steps in this direction. E.g., Google’s “Got the wrong Bob” tool for Gmail warns you if you seem to have included the wrong person in a multi-recipient email. [It's a useful tool. I send more email to the Annie I work with than to the Annie I'm married to, so my autocomplete keeps wanting to send the Annie I work with information about my family. Got the wrong Bob catches those errors.]
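The idea behind a “Got the wrong Bob?”-style check can be sketched as recipient co-occurrence: flag the person on a draft’s recipient list who rarely appears alongside the others in past emails. This is an illustration of the concept, not Google’s actual algorithm, and the addresses are invented.

```python
# Sketch of a wrong-recipient check via co-occurrence counts.
from itertools import combinations
from collections import Counter

past_emails = [
    {"annie.work", "carol", "dave"},
    {"annie.work", "carol"},
    {"annie.work", "carol", "dave"},
    {"annie.home", "mom"},
]

def cooccurrence(history):
    """Count how often each pair of recipients appears together."""
    counts = Counter()
    for recipients in history:
        for pair in combinations(sorted(recipients), 2):
            counts[pair] += 1
    return counts

def suspicious_recipient(draft, history):
    """Return the draft recipient least connected to the others."""
    counts = cooccurrence(history)
    def ties(person):
        return sum(counts[tuple(sorted((person, other)))]
                   for other in draft if other != person)
    return min(draft, key=ties)

draft = {"annie.home", "carol", "dave"}
print(suspicious_recipient(draft, past_emails))  # -> annie.home
```

This is exactly the kind of privacy-protective use of large data sets Jessica is pointing at: the anomaly detection runs on data the service already holds.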
Dan: It’s hard to come up with general solutions. The solutions tend to be highly specific.
Arvind: Consent. People think it doesn’t work, but we could reboot it. M. Ryan Calo at Stanford is working on “visceral notice,” rather than burying consent at the end of a long legal notice.
Thomas: Half of our users have used privacy controls, despite what people think. Yes, our controls could be simpler, but we’ve been working on it. We also need to educate people.
Q: FB keeps shifting the defaults more toward disclosure, so users have to go in and set them back.
Thomas: There were a couple of privacy migrations. It’s painful to transition users, and we let them adjust privacy controls. There is a continuum between the value of the service and privacy: with total privacy the service would have no value. It also wouldn’t work if everything were open: people will share more if they feel they control who sees it. We think we’ve stabilized it and are working on simplification and education.
Paul: I’d pick a different metaphor: The birds flying south in a “privacy migration”…
Thomas: In FB, you have to manage all these pieces of content that are floating around; you can’t just put them in your “house” for them to be private. We’ve made mistakes but have worked on correcting them. It’s a struggle of a mode of control over info and privacy that is still very new.
Categories: too big to know Tagged with: 2b2k • bigdata • facebook • google • privacy Date: November 30th, 2010 dw
Maggie Fox [twitter:maggiefox] says we think about privacy wrong.
We can feel violated when what we thought was private goes into unwanted hands. “Violated” is a strong word, and originally meant someone crossing physical space, coming into your house. Our laws of privacy are all about physical place. “We suck at context. We think here and now is all there is.” Privacy is not universal, she says. The notion that you have a private space that no one else can come into is a Western concept. In fact, ours is an American concept. There is no Russian word for privacy. George Loewenstein’s study showed the cultural basis of privacy. He found that when guarantees of confidentiality were given and people were asked to disclose things, disclosure dropped by 50%. And the more informal the disclosure statement on a site was, the more they disclosed. People don’t think about privacy unless they’re told to think about it.
Privacy is a new concept, relative to human history. It is not global. Rooted in 18th century property law. And it’s very squishy (= contextual). And now we’re digital. But most people really aren’t all that interested in privacy. We leave breadcrumbs all the time. “In the digital revolution, that data is incredibly valuable, but not to Big Brother.” “If you’re a spy, you shouldn’t be on Twitter.” Worrying about that is a red herring.
We ought to be much more worried about advertisers’ use of data. Their business model is ending, Maggie says. They want to transition from trying to get all the eyeballs to getting the right eyeballs. There is a market for your data. Your privacy is no longer a place. It is a commodity — something people want to buy. You should worry more about Facebook than Big Brother.
So we need to approach privacy differently. Right now, we treat privacy as something that makes you feel weird when someone violates it, e.g., when your Mom refers to your FB page. But, the marketers aren’t just making you feel weird. They’re taking something from you: your data.
Your data has value, and you ought to extract that value. Advertising recognizes that with profit-sharing, discount, loyalty programs: you can track me in exchange for something I want.
The big sites like Amazon have value because of the data we’ve given them. Our aggregated data is the information age’s natural resource.
We need to think about privacy differently, Maggie concludes.
Q: [esther dyson] Will a company create a service that does represent the user?
A: Great question. I don’t have the answer.
Categories: cluetrain, liveblog, marketing Tagged with: marketing • privacy Date: November 17th, 2010 dw
Jonathan Zittrain [twitter: zittrain] explains the Facebook directory “leak,” which turns out not to be a leak at all.
Categories: social media Tagged with: facebook • privacy Date: July 29th, 2010 dw
David Hornik of August Capital is funnier than many of the guests on the Radio Berkman series of podcasts. Not to mention that he’s also smart. I interviewed him about privacy ’n’ stuff.
Categories: everythingIsMiscellaneous, social media Tagged with: berkman • moi • privacy Date: April 2nd, 2010 dw
From an email being circulated:
From: “Privacy”
Date: March 11, 2010 11:32:22 AM EST
To: undisclosed-recipients:;
Subject: [VACANCY ANNOUNCEMENT] Director of Privacy Policy and Senior Advisor
For privacy professionals looking for a challenging and rewarding assignment, consider a position as a privacy leader at the U.S. Department of Homeland Security Privacy Office. As the Director of Privacy Policy and Senior Advisor at the Department’s headquarters Privacy Office, you will have direct policy responsibility for complex and cutting-edge privacy issues such as social media, cloud computing, information security and risk management. The DHS Privacy Office is looking for an expert recognized in the privacy community who possesses creative and analytical problem solving skills and can build privacy solutions into the Homeland Security mission. Individuals should have the ability to lead change, lead people, build coalitions and resolve problems within the Department and at an inter-agency level. You will be working with one of the leading privacy offices in the Federal Government. If you are excited by what may be the challenge of a lifetime, we look forward to hearing from you.
The link for the position is provided below.
http://jobview.usajobs.gov/GetJob.aspx?JobID=86787394&q=os-20100443&brd=3876&vw=d&FedEmp=N&FedPub=Y&x=77&y=10&pg=1&re=10&AVSDM=2010-03-10+18:15:00&rc=2&TabNum=6
How about getting someone fantastic into that position? Maybe you…?
Categories: policy Tagged with: dhs • privacy Date: March 11th, 2010 dw
Julie Cohen is giving a Berkman lunch on “configuring the networked self.” She’s working on a book that “explores the effects of expanding copyright, pervasive surveillance, and the increasingly opaque design of network architectures in the emerging networked information society.” She’s going to talk about a chapter that argues that “access to knowledge” is a necessary but insufficient condition for human flourishing, and adds two additional conditions. (Quotes are from the Berkman site.) [NOTE: Ethan Zuckerman's far superior livebloggage is here.]
The book is motivated by two observations of the discourse around the Net, law, and policy in the U.S.
1. We make grandiose announcements about designing infrastructures that enable free speech and free markets, but at the end of the day, many of the results are antithetical to the interests of the individuals in that space by limiting what they can do with the materials they encounter.
2. There’s a disconnect between the copyright debate and the privacy debate. The free culture debate is about openness, but that can make it hard to reconcile privacy claims. We discuss these issues within a political framework with assumptions about autonomous choice made by disembodied individuals…a worldview that doesn’t have much to do with reality, she says. It would be better to focus on the information flows among embodied, real people who experience the network as mediated by devices and interfaces. The liberal theory framework doesn’t give us good tools. E.g., it treats individuals as separate from culture.
Julie says lots of people are asking these questions. They just happen not to be in legal studies. One purpose of her book is to unpack post modern literature to see how situated, embodied users of networks experience technology, and to see how that affects information law and policy. Her normative framework is informed by Martha Nussbaum‘s ideas about human flourishing: How can information law and policy help human flourishing by providing access to information and knowledge? Intellectual property laws should take this into account, she says. But, she says, this has been situated within the liberal tradition, which leads to indeterminate results. You lend it content by looking at the post modern literature that tells us important things about the relationship between self and culture, self and community, etc. By knowing how those relationships work, you can give content to human flourishing, which informs which laws and policies we need.
[I'm having trouble hearing her. She's given two "political reference points," but I couldn't hear either. :(]
[I think one of them is everyday practice.] Everyday practice is not linear, often not animated by overarching strategies.
The third political reference point is play. Play is an important concept, but the discussion of intentional play needs to be expanded to include “the play of circumstances.” Life puts random stuff in your way. That type of play is often the actual source of creativity. We should be seeking to foster play in our information policy; it is a structural condition of human flourishing.
Access to knowledge isn’t enough to supply a base for human flourishing because it doesn’t get you everything you need, e.g., right to re-use works. We also need operational transparency: We need to know how these digital architectures work. We need to know how the collected data will be used. And we also need semantic discontinuity: Formal incompleteness in legal and technical infrastructures. E.g., wrt copyright to reuse works you shouldn’t have to invoke a legal defense such as fair use; there should be space left over for play. E.g., in privacy, rigid arbitrary rules against transacting and aggregating personal data so that there is space left over for people to play with identity. E.g., in architecture, question the norm that seamless interoperability makes life better, because it means that data about you moves around without your having the ability to stop it. E.g., interoperability among social networks changes the nature of social networks. We need some discontinuity for flourishing.
Q: People need the freedom to have multiple personas. We need more open territory.
A: Yes. The common pushback is that if you restrict the flow of info in any way, we’ll slide down the slippery slope of censorship. But that’s not true and it gets in the way of the conversation we need to have.
Q: [charlie nesson] How do you create this space of playfulness when it comes to copyright?
A: In part, look at the copyright law of 1909. It’s reviled by copyright holders, but there’s lots of good in it. It set up categories that determined if you could get the rights, and the rights were much more narrowly defined. We should define rights to reproduction and adaptation that gives certain significant rights to copyright holders, but that quite clearly and unambiguously reserves lots to users, with reference to the possible market effect that is used by courts to defend the owners’ rights.
Q: [charlie] But you run up against the pocketbooks of the copyright holders…
A: Yes, there’s a limit to what a scholar can do. Getting there is no mean feat, but it begins with a discourse about the value of play and that everyone benefits from it, not just crazy YouTube posters but even the content creators.
JPalfrey asks CNesson what he thinks. Charlie says that having to assert fair use, to fend off lawsuits, is wrong. Fair use ought to be the presumption.
Q: [csandvig] Fascinating. The literature that lawyers denigrate as pomo makes me think of a book by an anthropologist and sociologist called “The Internet: An Ethnographic Approach.” It’s about embodied, local, enculturated understanding of the Net. Their book was about Trinidad, arguing that if you’re in Trinidad, the Net is one thing, and if you’re not, it’s another thing. And, they say, we need many of these cultural understandings. But it hasn’t happened. Can you say more about the lit you referred to?
A: Within mainstream US legal and policy scholarship, there’s no recognition of this. They’re focused on overcoming the digital divide. That’s fine, but it would be better not to have a broadband policy that thinks it’s the same in all cultures. [Note: I'm paraphrasing, as I am throughout this post. Just a reminder.]
A: [I missed salil's question; sorry] We could build a system of randomized incompatibilities, but there’s value in having them emerge otherwise than by design, and there’s value to not fixing some of the ones that exist in the world. The challenge is how to design gaps.
Q: The gaps you have in mind are not ones that can be designed the way a computer scientist might…
A: Yes. Open source forks, but that’s at war with the idea that everything should be able to speak to everything else. It’d
Q: [me] I used to be a technodeterminist; I recognize the profound importance of cultural understandings/experience. So, the Internet is different in Trinidad than in Beijing or Cambridge. Nevertheless, I find myself thinking that some experiences of the Net are important and cross cultural, e.g., that ideas are linked, there’s lots to see, people disagree, people like me can publish, etc.
A: You can say general things about the Net if you go to a high enough level of abstraction. You’re only a technodeterminist if you think there’s only way to get there, only one set of rules that get you there. Is that what you mean?
Q: Not quite. I’m asking if there’s a residue of important characteristics of the experience of the Net that cuts across all cultures. “Ideas are linked” or “I can contribute” may be abstractions, but they’re also important and can be culturally transformative, so the lessons we learn from the Net aren’t unactionably general.
A: Liberalism creeps back in. It’s a crappy descriptive tool, but a good aspirational one. The free spread of a corpus of existing knowledge…imagine a universal digital library with open access. That would be a universal good. I’m not saying I have a neutral prescription upon which any vision of human flourishing would work. I’m looking for critical subjectivity.
A: Network space changes based on what networks can do. 200 yrs ago, you wouldn’t have said Paris is closer to NY than Williamsburg VA, but today you might because lots of people go NY – Paris.
Q: [doc] You use geographic metaphors. Much of the understanding of the Net is based on plumbing metaphors.
A: The privacy issues make it clear it’s a geography, not a plumbing system. [Except for leaks :) ]
[Missed a couple of questions]
A: Any good educator will have opinions about how certain things are best reserved for closed environments, e.g., in-class discussions, what sorts of drafts to share with which other people, etc. There’s a value to questioning the assumption that everything ought to be open and shared.
Q: [wseltzer] Why is it so clear that the Net isn’t plumbing? We make bulges in the pipe as spaces where we can be more private…
A: I suppose it depends on your POV. If you run a data aggregation biz, it will look like that. But if you ask someone who owns such a biz how s/he feels about privacy in her/his own life, that person will have opinions at odds with his/her professional existence.
Q: [jpalfrey] You’re saying that much of what we take as apple pie is in conflict, but that if we had the right toolset, we could make progress…
A: There isn’t a single unifying framework that can make it all make sense. You need the discontinuities to manage that. Disputes arise, but we have a way to muddle along. One of my favorite books: How We Became Posthuman. She writes about the Macy conferences, out of which came cybernetics, including the idea that info is info no matter how it’s embodied. I think that’s wrong. We’re analog in important ways.
I had not heard of Flash cookies until Fernando Bermejo’s Berkman talk last week. Now he’s inaugurated his new blog (well, it’s his second post) with a posting about a new study. Fernando writes:
the white paper concludes “that companies making inappropriate or irresponsible use of the Flash technology are very likely asking for trouble (and potentially putting the rest of the online industry at risk of additional government regulation)”. As for [end users], flash cookies are characterized as “super-cookies which are dramatically more resilient than cookies due to their implementation and a general lack of knowledge about their existence among consumer”.
To remove Flash cookies – which have some peacetime uses – go here.