Joho the Blog » [hyperpublic] Panel 2: Experience and re-creation


Jeffrey Schnapp introduces the second panel.

Beatriz Colomina gives a brief talk called “Blurred Vision: Architectures of Surveillance.” [I continue to have difficulty hearing due to the room’s poor acoustics and my own age-appropriate hearing loss. Also, Beatriz talks very fast.] She begins with a photo of a scene framed by windows. Communication is about bringing the outside in. So is glass; glass has taken over more and more of the building. She points to skyscrapers made out of glass that have an X-ray aesthetic. It is no coincidence that glass houses and X-rays arose at the same time, she says. X-rays exposed the inside of the body to the public eye, while architecture was disclosing the inside of the house to the public eye. X-rays acclimatized us to living in glass houses, including the glass house of blogging. Beatriz talks about architecture that looks further inward, through more and more layers, beyond transparency. [I lack acoustic confidence that I’m getting this right. Sorry.] With our surveillance equipment, X-ray vision is becoming pervasive, changing the definition of the private.

danah boyd gives a talk: “Teen privacy strategies in networked publics.” She begins by explaining she’s an ethnographer. How do young people think about privacy? The myth is that they don’t care about it, but they do. They care about it, but they also participate in very public places. Just because they want to participate in a public doesn’t mean they want to be public. Being active in a public does not mean they want everything to be public to everyone.

Networked publics are publics that are enabled by network technologies, and that are simultaneously spaces constructed through tech and imagined communities. We are becoming public by default, and private through effort. danah quotes a 17-yr-old who explains that rather than negotiating separate publics to make things available one by one, she posts in a public space so it’s available to all of them.

New strategies are emerging. Privacy = the ability to control a social situation, and to have the agency to assert control over those situations. A 14-yr-old danah interviewed thinks that he’s signalling the social norms in his communications, but people comment inappropriately, so he’s started using more explicit social structures. Another young person deletes comments to her posts after she’s read them, and deletes her own comments on other people’s posts the next day. She’s trying to make the structure work for her.

A 17-yr-old likes her mother but feels her mother over-reacts to FB posts. So, when the teen broke up with her boyfriend, she posted the lyrics from “Always Look on the Bright Side of Life.” This is social steganography, i.e., hiding in plain sight, for that song is from the crucifixion scene in The Life of Brian.

danah points to an online discussion of a social fight. The kids knew the details. The adults did not know if they were allowed to ask. The kids’ careful use of pronouns controlled access to meaning.

Sometimes we can use the tech, and sometimes we have to adopt social norms. In all of our discussions of privacy and the role of law, tech, and the market, we ought to pay careful attention to the social norms they’re trying to overrule. (She hat-tips Lessig for these four.)

Ethan Zuckerman talks about the role of cute cats. Web 1.0 was about sharing info. Web 2.0 is about sharing photos of kittens. This has important implications for activists. The tools for kitten sharing are effective for activists: they’re easy to use, they’re pervasively viral, and there’s a tremendous cost to a totalitarian regime trying to censor them, because it has to throw out the cute cats along with the revolutionary fervor. It raises the cost of censorship.

Ethan says that cute cats have a deep connection to activism. What happened in a dusty little town of 40,000 spread throughout Tunisia because of cute-cat social media. Protests happen; they get filmed and posted on Facebook. FB is pervasive, but it’s extremely hard to find the content, make sense of it, and translate it. So, local people find it and make sense of it, and feed it to Al Jazeera. Now people can see the events and decide if they want to join in.

Why FB? Because Tunisia has blocked just about everything except FB. The government tried to block it in 2008, which resulted in a 3x increase in usage, because Tunisians inferred there was something good about FB. The day before stepping down, Tunisia’s leader offered three concessions: the government wouldn’t fire on crowds, it would lower the tax on bread, and it would allow Net freedom.

Tunisia confirms Ethan’s theory, but Egypt is counter-evidence. The Egyptian government shut off the Internet. China is manufacturing its own cute cats: you can post all the kitten vids you want on Chinese sites. “This is a much more effective way of combating the cute cat theory.” But it’s expensive and requires a huge amount of human labor to review.

What worries Ethan most is that we’re moving our public discourse into private spaces, e.g. FB and Google. “We’re leaving it up to the owners of these spaces whether we’ll be allowed to use these spaces for political purposes.” It’s not that these spaces are evil. Rather, these digital spaces have been designed for other purposes. They have an incentive to shut down profiles in response to complaints, especially when it’s in a different language. Also, the terms of service are often violated by activist content. And real name identity is often dangerous for activists.

Organizations are slowly but surely figuring out how to deal with this. But it’s slow and very difficult. E.g., videos of the army deliberately killing unarmed civilians violate YouTube’s terms of service. But YouTube made an exception, putting up a warning that it’s disturbing video. This is great, but it points to some basic tensions. For example, it’s not good for advertisers and thus runs against YouTube’s business model.

The challenge is that we invented these tools with a certain set of behaviors in mind. We wanted friends to be able to exchange info, and we created terms of service for that. Now we’ve allowed those privately held spaces to become our networked public spheres. But the lines between private and public are not well suited for political and activist discourse. Do we ask corporations to continue hosting these, or do we try to come up with alternatives? We didn’t drive people to YouTube because it was good for activists, but for the other cute-cat reasons. Now we have to figure out the right tools.

Q: (Zeynep) What’s the value of real-name policies?
Ethan: It may be that we need public interest regulation of some of the policies of these corporations.
danah: Our tech will make real names no longer the best and only way to identify you. Systems of power will be able to identify people, and no amount of individual hiding within a collective will work. We need to rethink our relation to power as individuals and collectives.

Paul: We shouldn’t forget that it’s not just corporations. It’s American corporations. To shut down WikiLeaks, you just need Visa and Mastercard.
Ethan: WikiLeaks is vulnerable to credit card platforms because DDoS attacks made it move off its own platform to Amazon’s, and Amazon is vulnerable to such pressure. The Amazons have special responsibilities. Also, we’re now advising activists to always make sure there’s an English-language description of your material when you put it up on YouTube, etc., so that the YouTube admins can evaluate the take-down claims that arise.

Jeff Jarvis: Regulation is the wrong way. The question is: what is the definition of a public space for public speech? Other than lobbying private corps, what’s the right way?
Ethan: Rebecca MacKinnon’s upcoming book, Consent of the Networked, argues that we need to have a revolutionary moment in which the users of these spaces rise up and favor the companies that are open to supporting them. Ultimately, though, we don’t have a way to do a FB in a decentralized fashion. We can’t have a networked conversation without some degree of centrality.
danah: Corporations have incentives that sometimes align with users’. There’s a lot of power when users think about that alignment. Sometimes it’s about finding common interests, or asserting norms at a legal or social level. It’s good to find those points of alignment.

Q: danah, have you seen designs that are more conducive to people following social norms?
danah: The design question can miss the way in which the tech is used in various contexts. E.g., you can design in tons of privacy, but nothing stops a parent from looking over the shoulder of a child. People will adjust if they understand the design. Design becomes especially important when there are changes. It’s important for designers to figure out how to tango with users as the design evolves.

Q: [tim from facebook] Every design for any networked system has consequences. The choice that has always bedeviled me: the same system that finds fake accounts for activists also identifies fake accounts from the secret police. How do we avoid building systems that create a false sense of privacy?
Ethan: People do things with social platforms that we never intended, admirable or dangerous. How to figure out which? It’s got to be an ongoing process. But, as danah says, changing those decisions can be dangerous and disruptive. We need to have some way of opening up that process. The activist community should be involved in evolving the terms of service, so that they recognize not just the needs of legitimate law enforcement, but also the needs of activists and citizens. It should not be just a process for lawyers but also for citizens.
danah: What is the moral environment in which we want to live? What outs activists can also out human traffickers. Some of the hardest questions are ahead of us.
