
June 10, 2011

[hyperpublic] Final panel: Cooperation without Coercion

At the final panel of the conference. Judith Donath is moderating.

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellchecker. Mangling other people’s ideas and words. You are warned, people.

Charlie Nesson asks: “When we talk about our space, who are we?” In Jeff Huang‘s presentation, it seemed like he was given the perfect hypothetical — a desert — to build a public and private place. “In cyber terms, we are people of the Net. What then is our domain? It’s the public domain. And if you are to build the public domain, then I believe the wisdom to follow from a lawyer’s point of view is the same wisdom that has more or less informed the world of real property. If you want an orderly world of real property, you build a registry. If you want an orderly world of bits, you build a registry.” This is Charlie’s new project: a registry of the public domain. They’re starting with IMSLP.org, a musical score library. It has 93,000 musical scores in the public domain, exquisitely put together.

The Net divides into two domains, says Charlie, one that is free and one that is not. Free means free of copyright and other encumbrances. Charlie wants to build our domain on a foundation solid in law. The registry he’s building identifies works as public domain, with links to the registrars attesting to this. He wants it to be populated by librarians with public domain collections. But, the problem with registries is litigation risk, i.e., the threat of lawsuit. “So the essence of this idea is to couple the registrar with a pro bono commitment of legal service from a law firm of repute to defend litigation based on infringement.”

Where do you find the institutions that want to protect privacy, asks Charlie. How about libraries, he suggests.

“I’m tough on privacy, Judith,” says Charlie, in response to a question. “I’ve never liked it.” He explains it’s so often based on fear and looks backwards.

Martin Nowak looks at cooperation in evolutionary terms: a donor pays a cost and a recipient gets a benefit. He explains game theory’s Prisoner’s Dilemma. Why do people cooperate? “Natural selection chooses defection,” rather than cooperation. In a mixed population, defection becomes increasingly popular. So, natural selection needs help to favor cooperation. Martin categorizes the factors into five mechanisms: kin selection, direct reciprocity, indirect reciprocity, spatial selection, and group selection.

Direct reciprocity (I help you, you help me). If you play the Prisoner’s Dilemma several times, the economics changes, as the Folk Theorem shows. Martin quickly summarizes Axelrod and Rapoport. [Too hard to live blog. Read Ethanz. Really. Now.] Errors turn out to ruin cooperation, so you need a process that allows for forgiveness. Martin’s doctoral dissertation showed that if everyone plays randomly, the right tactic is to always defect. A tit-for-tat strategy corrects that, and generous tit-for-tat (I may still cooperate even if you defect) provides a mathematical model for the evolution of forgiveness and cooperation. There are always oscillations; cooperation is never stable. We need structures that rebuild cooperation quickly after it is destroyed, because it always will be destroyed.
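The dynamics Martin describes are easy to simulate. Below is my own minimal sketch, not Nowak’s actual model: an iterated Prisoner’s Dilemma in donation-game form, with a 5% chance that an intended cooperation fails. The generosity level q = 1 − c/b is the value Nowak and Sigmund associate with generous tit-for-tat; the payoff values, noise rate, and round count are made up for illustration.

```python
import random

# Donation game phrased as a Prisoner's Dilemma:
# cooperating costs the donor C and gives the recipient B (B > C).
B, C = 3.0, 1.0

def all_defect(my_hist, their_hist):
    return False

def tit_for_tat(my_hist, their_hist):
    # Cooperate first, then copy the opponent's last move.
    return their_hist[-1] if their_hist else True

def generous_tft(my_hist, their_hist, q=1 - C / B):
    # After a defection, still cooperate with probability q (forgiveness).
    if not their_hist or their_hist[-1]:
        return True
    return random.random() < q

def play(s1, s2, rounds=1000, noise=0.05):
    """Average per-round payoff to player 1, with implementation errors."""
    h1, h2, score1 = [], [], 0.0
    for _ in range(rounds):
        a1, a2 = s1(h1, h2), s2(h2, h1)
        # Noise: an intended move is sometimes flipped (an "error").
        if random.random() < noise: a1 = not a1
        if random.random() < noise: a2 = not a2
        if a1: score1 -= C
        if a2: score1 += B
        h1.append(a1); h2.append(a2)
    return score1 / rounds

random.seed(0)
print("GTFT vs GTFT:", round(play(generous_tft, generous_tft), 2))
print("TFT  vs TFT :", round(play(tit_for_tat, tit_for_tat), 2))
print("ALLD vs ALLD:", round(play(all_defect, all_defect), 2))
```

With noise, always-defect pairs earn almost nothing, strict tit-for-tat pairs get locked into long retaliation spirals, and generous tit-for-tat pairs recover quickly, which is the point about errors and forgiveness above.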

Direct reciprocity allows for the evolution of cooperation if there’s a prospect of another round. Indirect reciprocity (I help you, someone helps me) leads to cooperation if reputation matters. You need natural selection to care about reputation, so to speak. “What you need for indirect reciprocity is gossip” to spread reputation. For that you need language. “You could argue this is the selection process that led to language.” “For direct reciprocity you need a face. For indirect reciprocity you need a name.” (David Haig) Our brain has both capabilities. If interactions are completely anonymous, you run into problems. Also, you need gossip to be relatively honest.

Spatial selection = neighbors help each other. Martin flips through some graphs showing that spatial selection favors cooperation if you have a few close friends. Likewise, evolutionary set theory says that people wanting to join particular groups can also lead to cooperation.

Judith: What about strong vs. weak ties?
Martin: We assume equal ties. There’s a trade-off between wealth and vulnerability.

Nicholas Negroponte asks himself a question every morning: Is he doing something that normal market forces would do anyway? If so, he stops. He wants to do that which market forces will not do.

There are now 3 million One Laptop Per Child laptops in the hands of kids. This isn’t huge, since OLPC would like to get laptops into the hands of about 500 million kids. Before OLPC, people assumed computers teach by imparting content. Instead, you want to see children teaching. 20-30% of the million Peruvian kids with OLPC machines are using them to teach their parents how to read.

Nicholas goes through some points he made in a talk at the UN recently. Among the points: Measurement is overrated. You only measure when the changes are so small that you can only see them by measurement.

Judith: When we see well-off kids sitting side by side looking into screens, we think it’s a nightmare of anti-sociality, but when we see your adorable photos of third world kids in the same position, it looks desirable?
Nicholas: I don’t see the well-off kids that way. And why don’t we make OLPC’s available in the US? Because the issues are deeper than that.

Q: Talk about anonymity…?
Jeff Jarvis: It’s foundational to democracy. It’s getting a bad name because of trolls. But it must be protected.

Q: This discussion is soaked in privilege. There’s much inscribed in the language that affects how people act. When you idolize the public space as a place where all can share their ideas safely, it feels really far away for me.

Q: (Charlie) Nicholas, you’ve said that Uruguay has given all 500,000 of its kids OLPCs. Given your position on measurement, what change will we see?
Nicholas: Their curiosity, the way they approach problems, the way they look at things…I think you’re going to see a nation that is far more creative than many other nations. Nicholas tells a story of a kid whose homework got 100K hits.
Martin: Who teaches them how to use it?
Nicholas: It’s genetic :) We’re going to do a scientific experiment in which we drop OLPC laptops out of helicopters onto remote villages and come back in a year and see how many have learned how to read.

Q: (urs gasser) One vision says build a great tool and see what happens. The other is to study human behavior scientifically. (Nicholas vs. Martin.) How difficult is it to translate scientific findings about human behavior into technology?
Martin: I’m fascinated by mathematics, but we do apply it to practical issues. In the field of cooperation, we’d like to bring the models closer to human observations. For example, many cultures like punishment, but I think it doesn’t work well to create cooperation because it creates complications. Reward seems better. So, we study that. We do the same experiment in multiple cultures. In Romania, for example, people differentiated between public and private outcomes, because they lacked faith that public engagement had positive outcomes.

Q: (zeynep) The Net has let the cooperative side of human nature be more manifest. Does your work in evolutionary biology take account of this?
A: The cooperation we see in the animal world must rely on direct observation. Humans can communicate. We don’t have to rely on our personal experience with another to decide whether to cooperate. The Net can help us to evaluate others quickly.


Categories: copyright, culture, education, liveblog, science Tagged with: commons • cooperation • evolution • hyperpublic • olpc • prisoner's dilemma • public domain Date: June 10th, 2011 dw


[hyperpublic] The risks and beauty of hyperpublic life

Jeff Jarvis moderates a discussion. “We need principles to inform the architecture” for where we want to go, rather than waiting for terms of service from the sites we use. We need a discussion about terms of service for the Internet. (He’s careful to note that he’s not suggesting there be a single terms of service for the Net.) We all have a stake in the discussion of the public and private, Jeff says. We should be careful about our metaphors, he says, citing Doc Searls’ cautioning against calling it a medium since a medium is something that can be owned and controlled. “It is a platform for publics,” Jeff says.


Adam Greenfield, director of Urbanscale, a shop in NYC. He’s interested in how you design public spaces. He wants to push back against the idea of a networked city of the future: we already live in the networked city. Locative and declarative media (e.g., FourSquare) are widely adopted. We live among declarative objects, not just people declaring their position; e.g., TowerBridge tweets its open and closed states. In Tokyo, the side of a building is a QR code; it is an object with an informational shadow. Objects are increasingly capable of gathering, processing, displaying, transmitting and/or taking action on info…which implies new modes of surveillance. His contention: Tens of millions of people are already living in these conditions, and we therefore need a new theory of networked objects.


He offers a taxonomy of what this class of objects implies. He begins with traffic signals controlled by motion detectors. The info is not uploaded to the network, and it has a clear public good. It is prima facie unobjectionable.


Then there is the mildly disruptive and disrespectful object. E.g., a sensor detects someone passing by a billboard that reacts to that presence. There’s no public good here. More concerning is a touch screen vending machine that makes gender assumptions based on interpretation of a camera image. Further, that info is gathered and used.


Another step into the disturbing: Advertisers scan faces and make predictive and prospectively normal assumptions that they sell to marketers.


But what about when power and knowledge reside in an ensemble of discrete things? E.g., in Barcelona, access to some streets depends on a variety of sensors and signs. It can even reside in code: surveillance camera software gets upgraded by referendum.


We should be talking about public objects: any discrete object in the common spatial domain intended for the use and enjoyment of the general public [with a couple of refinements that went by too fast]. They should be open, as in having an API, and their goods should be non-rivalrous and non-excludable. This is great, but we should remember that this increases the “attack surface” for hostile forces. Also, we need to evolve the etiquettes and protocols of precedence and deconfliction. We should do this because it moves against the capture of public space by private entities, and it opens up urban resources like we’ve never seen (discoverable, addressable, queryable and scriptable). The right to the city should be underwritten by the architecture of its infrastructure.
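Adam’s criteria for public objects (discoverable, addressable, queryable, scriptable, with non-rivalrous and non-excludable goods) can be read as an interface. The sketch below is purely my own illustration of that reading, not anything Adam specified; the bridge echoes his TowerBridge example, and all names and coordinates are made up.

```python
from dataclasses import dataclass, field
import json

@dataclass
class PublicObject:
    """A discrete object in the common spatial domain with an open API."""
    object_id: str            # addressable: a stable public identifier
    location: tuple           # discoverable: (lat, lon) in a public registry
    state: dict = field(default_factory=dict)

    def query(self) -> str:
        # Queryable: reads are open to anyone. They are non-rivalrous
        # (reading doesn't consume the state) and non-excludable
        # (no login, no API key).
        return json.dumps({"id": self.object_id,
                           "location": self.location,
                           "state": self.state})

    def subscribe(self, callback):
        # Scriptable: third parties can attach behavior to state changes.
        self._callbacks = getattr(self, "_callbacks", [])
        self._callbacks.append(callback)

    def set_state(self, **updates):
        self.state.update(updates)
        for cb in getattr(self, "_callbacks", []):
            cb(self)

# A bridge that declares its own open/closed state.
bridge = PublicObject("tower-bridge", (51.5055, -0.0754), {"open": False})
bridge.subscribe(lambda obj: print("bridge state:", obj.state))
bridge.set_state(open=True)   # prints: bridge state: {'open': True}
print(bridge.query())
```

The “attack surface” worry falls out of the same design: anything anyone can query and script, hostile parties can query and script too.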


Q: [jeff] Why is the gender-sensing vending machine creepy? Would you be ok if it guessed but let you correct it or ignore it?
A: I’ve been working with informed consent, but I heard this morning that that may not be the best model. We don’t want to over-burden people with alert boxes, etc.


Jeffrey Huang talks about a case study: the design of a new campus in the deserts of Ras Al Khaimah (one of the Emirates). In 2009, the Sheikh agreed to fund the creation of the new campus. Jeff and others were brought in to design the campus, from bare sand up. The initial idea for the project was to avoid creating a typical gated campus, but rather to make it open. They also wanted to avoid the water costs of creating a grass-lawned campus. “The ambition was to grow a campus where it made sense ecologically”: buildings where there’s natural cooling winds, etc. They’re designing large, fluid, open spaces, where “seeing and being seen is maximized.” There would be a network of sensors so that campus would be aware of what’s going on inside, including recognizing where individuals are. People’s profile info could be projected into their shadows. They wonder if they need special places of privacy. “There should be less necessity to design the private if and only if the hyperpublicness is adequately designed.” E.g. if no one owns the data, there’s full transparency about who looks at the data and what’s being captured.


Betsy Masiello from Google’s policy team gives some informal remarks. To her, a hyperpublic life implies Paris Hilton: Constant streaming, making your behavior available for everyone to see. But, she says, what this panel is really about is a data-driven life. It’s important not to blur the two. There’s public good that comes from big data analysis, and some baseline economic good.


She says she thinks about predictive analytics in two ways. 1. Analysis done to give you predictions about what you might like. It’s clear to the user what’s going on. 2. Predictions based on other people’s behavior. E.g., search, and Adam’s soda machine. Both create value. But what are the risks? The risk is a hyperpublic life. The risk of all this data is that it gets attached to us, gets re-identified, and gets linked to our identities. But this misses something…


E.g., she came across a 1998 Jonathan Franzen essay, “The Imperial Bedroom.” “Without shame there can be no distinction between public and private,” he wrote. She says you can feel shame even if you’re anonymous, but Franzen is basically right. Which brings her to a positive solution. “The design problem is how to construct and identify multiple identities, and construct and manage some degree of anonymity.” It is true that our tech will allow us to identify everyone, but policy requirements could punish companies for doing so. Likewise, there are some policy decisions that would make it easier to maintain multiple identities online.


Jeff: Your fear of re-identification surprises me.
Betsy: The hyperpublic identity is created from the collapse of contexts, without people knowing about it. People don’t know how all their contexts are becoming one. I think people want a separation between data used to personalize an ad and data they are sharing with their friends.
Jeff: This is John Palfrey’s “breakwalls”… But I’d think that Google would want as few restrictions as possible. They create liabilities for Google.
Betsy: That’s the design challenge. Search engines and Japanese soda machines haven’t gotten it right yet.
Jeff: What are the emerging principles? Separating gathering from usage. Control. Transparency…


Adam: I don’t think there’s anonymous data any more.
Betsy: Yes, but could we create it via policy?
Adam: There are some fine uses of predictive analytics. E.g., epidemiology. But not when the police use it to predict crimes.
Jeff: Why not? Ok, we’ll talk later.


Q: What about third party abuse?
Adam: Our principle should be “First, do no harm.”
Huang: It’s a problem often because the systems don’t know enough. Either roll it back, or train it so it can make better distinctions.
Betsy: You can maybe get both. FourSquare is an individual stating her identity. The flip side is anonymous data about locations. That provides tremendous value, and you can do that while protecting the identities.
Jeff: But if the rule is “if you can’t protect it, don’t collect it,” then we’ll never collect anything and won’t get any of those benefits.


Q: [latanya] It’s not true that only those with something to hide want to remain anonymous. E.g., if you hide all the positive results of HIV tests, you can still tell who has HIV: the hidden results give them away. You have to protect the privacy of those who do not have HIV as well.
Jeff: But I got benefit from going public with my prostate cancer.
Latanya: But we live in a world of parallel universes. You got to control which ones knew about your cancer.
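Latanya’s HIV example is worth making concrete, because the failure is counterintuitive: redacting only the sensitive rows turns the redaction itself into the disclosure. A toy sketch with hypothetical names and data:

```python
# Hypothetical data: suppressing only the "sensitive" value leaks it.
results = {"alice": "negative", "bob": "positive", "carol": "negative"}

# Naive redaction: publish only the non-sensitive rows.
published = {name: r for name, r in results.items() if r != "positive"}

# An observer who knows the full roster recovers the positives from
# who is *missing*: absence is itself information.
roster = set(results)
inferred_positive = roster - set(published)
print(inferred_positive)  # {'bob'}
```

This is why, as Latanya says, you must protect the records of the people with nothing to hide too: only then does suppression stop signaling.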


Q: [I couldn’t hear it]
Betsy: You don’t need to reveal anything about the individual pieces of data in a big data set in order to learn from it.
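Betsy doesn’t name a mechanism, but differential privacy is one standard way to make her claim concrete: add calibrated noise to aggregate answers so the released number barely depends on any one person’s row. The sketch below is my own illustration with made-up data, not anything discussed on the panel.

```python
import math
import random

def dp_count(records, predicate, epsilon=0.5):
    """Return a true count plus Laplace(1/epsilon) noise, so the output
    changes little whether any one individual is in the data or not."""
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace noise by inverse-transform from Uniform(-0.5, 0.5).
    u = random.random() - 0.5
    noise = -(1 / epsilon) * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))
    return true_count + noise

# Hypothetical records: ages of visitors to some location.
random.seed(42)
ages = [23, 35, 41, 29, 52, 47, 31, 38]
released = dp_count(ages, lambda a: a > 30)
print(round(released, 1))  # close to the true count of 6, but noisy
```

Averaged over many queries the answer is accurate, yet no single release pins down whether any particular person was counted, which is the separation Betsy is pointing at between learning from a data set and revealing its individual pieces.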


Q: (jennie toomey) There are lots of things we want kept private that have nothing to do with guilt or shame. Much of what we keep private we use to create intimacy.
Betsy: I was quoting Franzen.


Q: Privacy means something different in non-democratic societies.
Adam: We know historically that if info can be used against us, it eventually will be.


Q: Recommended: Daniel Solove’s “A Taxonomy of Privacy”
Adam: The info collected by E. Germany was used against people after E. Germany fell.
Jeff: But if we only listen to the fears, we won’t get any of the benefits.


Categories: liveblog Tagged with: hyperpublic • privacy Date: June 10th, 2011 dw


[hyperpublic] Panel 2: Experience and re-creation

Jeffrey Schnapp introduces the second panel.

Beatriz Colomina gives a brief talk called “Blurred Vision: Architectures of Surveillance.” [I continue to have difficulty hearing due to the room’s poor acoustics and my own age-appropriate hearing loss. Also, Beatriz talks very fast.] She begins with a photo of a scene framed by windows. Communication is about bringing the outside in. So is glass; glass has taken over more of the building. She points to skyscrapers made out of glass that have an x-ray aesthetic. It is no coincidence that glass houses and X-rays occur at the same time, she says. X-rays exposed the inside of the body to the public eye, while architecture was disclosing the inside of the house to the public eye. X-rays acclimatized us to living in glass houses, including the glass house of blogging. Beatriz talks about architecture that looks further inward, through more and more layers, beyond transparency. [I lack acoustic confidence that I’m getting this right. Sorry.] With our surveillance equipment, x-ray vision is becoming pervasive, changing the definition of the private.

danah boyd gives a talk: “Teen privacy strategies in networked publics.” She begins by explaining she’s an ethnographer. How do young people think about privacy? The myth is that they don’t care about it, but they do. They care about it, but they also participate in very public places. Just because they want to participate in a public doesn’t mean they want to be public. Being active in a public does not mean they want everything to be public to everyone.

Networked publics are publics that are enabled by network technologies, and that are simultaneously spaces constructed through tech and imagined communities. We are becoming public by default, and private through effort. danah quotes a 17-year-old who explains that rather than negotiating publics to make things available one by one, she posts in a public space so it’s available to all of them.

New strategies are emerging. Privacy = the ability to control a social situation, and to have agency to assert control over those situations. A 14-year-old danah interviewed thinks that he’s signalling the social norms in his communications, but people comment inappropriately, so he’s started using some explicit social structures. Another young person deletes comments to her posts after she’s read them, and deletes her own comments on other people’s posts the next day. She’s trying to make the structure work for her.

A 17-year-old likes her mother but feels her mother over-reacts to FB posts. So, when the teen broke up with her boyfriend, she posted the lyrics from “Always Look on the Bright Side of Life.” This is social steganography, i.e., hiding in plain sight, for that song is from the crucifixion scene in The Life of Brian.

danah points to an online discussion of a social fight. The kids knew the details. The adults did not know if they were allowed to ask. The kids’ careful use of pronouns controlled access to meaning.

Sometimes we can use the tech, and sometimes we have to adopt social norms. In all of our discussions of the role of law, tech, and the market in privacy, we ought to pay careful attention to the social norms they’re trying to overrule. (She hat-tips Lessig for these four.)

Ethan Zuckerman talks about the role of cute cats. Web 1.0 was about sharing info. Web 2.0 is about sharing photos of kittens. This has important implications for activists. The tools for kitten sharing are effective for activists. They’re easy to use, they’re pervasively viral, and there’s tremendous cost to a totalitarian regime trying to censor them because they have to throw out the cute cats with the revolutionary fervor. It raises the cost of censorship.

Ethan says that cute cats have a deep connection to activism. What happened in a dusty little town of 40,000 spread throughout Tunisia because of cute-cat social media. Protests happen, they get filmed and posted on Facebook. FB is pervasive, but makes it extremely hard to find the content, make sense of it, and translate it. So, local people find it and make sense of it, and feed it to Al Jazeera. Now people can see the events and decide if they want to join in.

Why FB? Because Tunisia has blocked just about everything except FB. They tried to block it in 2008, which resulted in a 3x increase in use, because Tunisians inferred there was something good about FB. The day before stepping down, Tunisia’s leader offered three concessions: the government won’t fire on crowds, it will lower the tax on bread, and it will allow Net freedom.

Tunisia confirms Ethan’s theory, but Egypt is counter-evidence: the Egyptian government simply shut off the Internet. China is manufacturing its own cute cats: you can post all the kitten vids you want on Chinese sites. “This is a much more effective way of combating the cute cat theory.” But it’s expensive and requires a huge amount of human labor to review.

What worries Ethan most is that we’re moving our public discourse into private spaces, e.g. FB and Google. “We’re leaving it up to the owners of these spaces whether we’ll be allowed to use these spaces for political purposes.” It’s not that these spaces are evil. Rather, these digital spaces have been designed for other purposes. They have an incentive to shut down profiles in response to complaints, especially when it’s in a different language. Also, the terms of service are often violated by activist content. And real name identity is often dangerous for activists.

Organizations are slowly but surely figuring out how to deal with this. But it’s slow and very difficult. E.g., videos of the army deliberately killing unarmed civilians violate YouTube’s terms of service. But YouTube made an exception, putting up a warning that it’s disturbing video. This is great, but it holds out some basic tensions. For example, it’s not good for advertisers and thus runs against YouTube’s business model.

The challenge is that we invented these tools to support a certain set of behaviors. We wanted friends to be able to exchange info, and we created terms of service for that. Now we’ve allowed those privately held spaces to become our networked public spheres. But the lines between private and public are not well suited for political and activist discourse. Do we ask corporations to continue hosting these, or do we try to come up with alternatives? We didn’t drive people to YouTube because it was good for activists, but for the other cute-cat reasons. Now we have to figure out the right tools.

Q: (zeynep) Value of real-name policies?
Ethan: It may be that we need public interest regulation of some of the policies of these corporations.
danah: Our tech will make real names no longer the best and only way to identify you. Systems of power will be able to identify people, and no amount of individual hiding within a collective will work. We need to rethink our relation to power as individuals and collectives.

Paul: We shouldn’t forget that it’s not just corporations. It’s American corporations. To shut down WikiLeaks, you just need Visa and Mastercard.
Ethan: WikiLeaks is vulnerable to credit card platforms because DDoS attacks made it move off its own platform to Amazon’s, and Amazon is vulnerable to such pressure. The Amazons have special responsibilities. Also, we’re now advising activists to always make sure there’s an English-language description of your material when you put it up on YouTube, etc., so that the YouTube admins can evaluate the take-down claims that arise.

Jeff Jarvis: Regulation is the wrong way. The question is what the definition of a public space for public speech is. Other than lobbying private corps, what’s the right way?
Ethan: Rebecca MacKinnon’s upcoming book, Consent of the Networked, argues that we need to have a revolutionary moment in which the users of these spaces rise up and use the companies that are open to supporting them. Ultimately, though, we don’t have a way to do a FB in a decentralized fashion. We can’t have a networked conversation without having some degree of centrality.
danah: Corporations have incentives that sometimes align with users’. There’s a lot of power when users think about alignment. Sometimes it’s about finding common interests, or shared norms, whether legal or social. It’s good to find those points of alignment.

Q: danah, have you seen designs that are more conducive to people following social norms?
danah: The design question can miss the way in which the tech is used in various contexts. E.g., you can design in tons of privacy, but nothing stops a parent from looking over the shoulder of a child. People will adjust if they understand the design. Design becomes especially important when there are changes. It’s important for designers to figure out how to tango with users as the design evolves.

Q: [tim from facebook] Every design for any networked system has consequences. The choice that has always bedevilled me: the same system that finds fake accounts for activists also identifies fake accounts from secret police. How do we avoid building systems that create a pseudo sense of privacy?
ethan: People do things with social platforms that we never intended, admirable or dangerous. How to figure out? It’s got to be an ongoing process. But, as danah says, changing those decisions can be dangerous and disruptive. We need to have some way of opening up that process. The activist community should be involved in evolving the terms of service, so that they recognize not just the needs of legitimate law enforcement but also the needs of activists and citizens. It should not just be a process for lawyers but also for citizens.
danah: What is the moral environment in which we want to live? What outs activists can also out human traffickers. Some of the hardest questions are ahead of us.


Categories: culture, liveblog, peace Tagged with: architecture • design • hyperpublic • privacy Date: June 10th, 2011 dw


[hyperpublic] First panel: Delineating public and private

First panel at HyperPublic conf. Hurriedly typed and not re-read.


Paul Dourish: Think of privacy not so much as something that people have, but something that people do. “What are people doing when they are doing public, or doing private?” Think of doing privacy as one of the ways of engaging with a group.

And pay attention to the multiple publics we deal with when encountering media objects. When we encounter a media object, we think “this is aimed at people like me.” Publics = complicated sameness and difference. For example, for a couple of years he looked at paroled sex offenders in California who are being tracked with GPS. How do you think about space if you have to first worry about coming within 2,000 feet of a school, library, etc.? That reconfigures the scale at which public space is encountered: since it’s impossible to navigate at a level of 2,000 feet, these people think about which towns are safe for them. Instead of privacy, it helps to think in terms of our accountability to others.

Jonathan Zittrain suggests an iPhone app that shows map routes that take account of the sex offenders’ rule of avoiding schools, etc. He also raises the relation of privacy and identity.

Laurent Stalder mentions work on what privacy meant within a house in the 1880s in England. Artifacts were introduced that affected privacy, from sliding doors to doorbells. Then he shows a 2008 floor plan that distinguishes much less between public and private, inside and outside — the rise of a differentiated set of threshold devices. What is the role of the architect when spaces are filled with an endless stream of people, information, fluids…? Laurent points to the continual renegotiation of borders and their consistency. [I had trouble hearing some of the talk; the room does not have good acoustics. Nor do my ears.] In conversation with JZ, Laurent contrasts two Harvard buildings, one of which has a clear inside and outside, and another that has a long transitional state.

John Palfrey says that lawyers are so engaged in the question of privacy because they too are designers, but of rule-sets. Lawyers have not done a great job in determining which rule-set about privacy will enable us to thrive. He makes three points: 1. The importance of human experience in these spaces. We are public by default, he says, crediting danah boyd. We’re learning that though we often trade convenience for control, we care about it in particular contexts, a changing set of practices. 2. The old tools haven’t worked well for us with privacy. E.g., the 4th Amendment doesn’t fit the cyber world well. 3. The systems that tend to work best are highly interoperable with one another; we don’t want to type the same info into multiple systems. Open, interoperable systems succeed. But that gives rise to privacy problems. We need places — breakwalls — where the data can be either slowed or stopped.

JZ points out that JP is, like Laurent, talking about having long thresholds.

JZ imagines a world in which many people “lifestream” their lives and we are able to do a query to see who was where at just about any time. That makes Google StreetView’s photo-ing of houses seem like nothing, he says.

In response to Jeff Jarvis’ question, Paul reminds us that the social takes up the architectural, so that the same threshold space (or any space) can take on different privacy norms for different cultures and sub-cultures.

JZ: Architectural spaces last for decades or centuries. Online spaces can be reconfigured easily. The “house” you moved into can be turned into something different by the site’s owners. E.g., Facebook tinkers with the space you use by changing…

Q: What is the purpose of the threshold?
Laurent: Connection and separation
Q: Don’t we want some type of digital threshold that does the job of introducing, transitioning, informing, etc.? “You keep some of where you were in where you are.” The lack of that affects identity and more.
JZ: You can imagine a web site that shows you where other people are visiting from. “Wow, a lot of folks are coming from AOL. This must not be a cool site.” :)
Paul: It’s important to historicize sites appropriately so we understand where they came from.

Me: It’s possible to misuse architectural spaces, because architecture is always intensely local. So, will privacy norms ever settle down on the global Web?
The invention of the chimney enabled privacy in homes, as opposed to the central fire. [Having trouble hearing] Will the poor not have Internet privacy, while the affluent do?
As important as the Net spaces are the spaces in which people use the Net. E.g., Net cafes in the developing world. Access and capital change publicness and privacy.
Paul: In China, people go to public spaces to play online games. (He says that they consider World of Warcraft a Chinese game in its values.) There certainly won’t be global agreements about privacy norms. Nor does there have to be, because your encounters with them always occur in local settings.
JZ: And within these spaces can be communities with their own norms.


Categories: liveblog Tagged with: architecture • hyperpublic • privacy • public Date: June 10th, 2011 dw


[hyperpublic] Judith Donath


Urs Gasser opens the conference by reflecting on the recent Swiss court decision that Google StreetView has to go to extraordinary lengths to obscure personal identifiers (like faces and racial identity), especially in front of “sensitive” public areas.

Judith Donath points to our increased discomfort with the new lines between public and private: we are in an environment where it’s not only hard to separate them, but where the old well-defined norms don’t work. This is not solely a problem of the online world, she points out. How do we understand this new public as designers? In the online world, you can be public while sitting alone in your room.

She says she was one of the initiators of this interdisciplinary conference (also: Jeff Huang) because different fields have different ideas of what is desirable. E.g., lawyers traditionally think that privacy is a goal in itself.

Judith says we’re looking at this topic at a time in human history when we’ve had an almost unprecedented amount of privacy; e.g., we are more mobile and thus can shed our prior public selves. We have also been more isolated and alienated: we can live without engaging with others, in a city of strangers, in a workplace where all our ties are weak, etc.

She reminds us that during the day we should be thinking about how what we learn can be applied to help build a better civil society.


Categories: culture Tagged with: hyperpublic • privacy • public Date: June 10th, 2011 dw



This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
TL;DR: Share this post freely, but attribute it to me (name (David Weinberger) and link to it), and don't use it commercially without my permission.

Joho the Blog uses WordPress blogging software.
Thank you, WordPress!