“This suggests that those interested in intervening in Wikipedia, or other peer-production based projects, might be better served by focusing on changing the terms of negotiation between interested parties, rather than technologically empowering individuals.”
24 hours of video per minute
That's the rate at which digital footage is being uploaded to YouTube, according to Michael Jones' opening keynote presentation at Future Media Fest. Jones, who is Chief Technology Advocate at Google, cited the number as part of his argument that digital communication technology is becoming ever more ubiquitous. Understandably, he saw Google as playing an important role in this ubiquitous information environment.
This image, of thousands of camera-phone eyes feeding days of video into Google as the minutes tick by, may, for some media theorists, call to mind the Panopticon, the model prison made famous by French philosopher Michel Foucault, in which prisoners arranged in transparent cells at the perimeter had their every move watched by a concealed figure in a central tower. The Panopticon, Foucault explained, was designed to teach prisoners to internalize the values of their guards: because they never knew whether a guard was watching, they began to watch themselves.
Is Google a modern-day Panopticon, watching over us all, invisibly guiding, Foucault would say “disciplining,” our behavior? Jones didn't think so. He went to great pains to describe Google as a passive entity. “We are your servant,” he said at one point. At another, he claimed, “we don't make decisions, we take what humans do and we amplify it.” As examples, he cited the ways in which Google tried to reflect the needs of its customers. He described how users of Google Maps were active participants in the process of drawing the maps that Google served, especially in developing countries. Explaining the motivations of contributors to the Google Maps project, Jones said, “they didn't want me to have their map, they wanted their friends to have their map.” Finally, in response to a questioner who asked how Google could claim to be a reflection of already existing behavior when values are always embedded in technology, Jones replied that data harvested from users was used to develop the technology itself. For example, he explained that the size of the buttons in Google's Gmail webmail service had not been designed by some top-down process of expertise; rather, different button sizes had been served to different users, and the ideal size had been determined from data collected on users' reaction times with each variant.
All this should, of course, be taken with a grain of salt. Anytime an executive officer of a major corporation argues that his company is basically powerless, it suggests the company has become aware of popular anxieties about its power. Certainly, this is true for Google. Jones' claims that Google is passive and reflective also seem to overlook an observation he made earlier in his presentation, when he noted that “Henry Ford changed the way cities were designed.” Just as the automobile transformed the American urban landscape, leading, among other things, to the rise of the suburbs, it is difficult to imagine that a technology as powerful as search could fail to transform our patterns of behavior.
That said, I think that Jones' apology for Google makes clear some important differences between the 19th-century technology of the Panopticon and the 21st-century technology of search. Unlike the Panopticon, where a human agent stood in the tower and imposed rational, intentional values on the confined prisoners, encouraging them to adopt regimented work habits and abandon dangerous transgressions, nothing human could possibly process the surveillance performed by Google. Just to watch a single day's worth of YouTube uploads (at 24 hours of video per minute, some 1,440 days of footage) would take nearly four years! Instead, what seems to stand at the center of Google's apparatus of search (to the extent that there is such a thing) is something else entirely, something lashed together out of computer algorithms and pre-conscious thought. Something that adjusts buttons without us noticing and sums together collective contributions to make a map.
This should not be, in and of itself, frightening. The mastery of human consciousness was always a bit of an illusion. However, I do think we may need to do some reflection about who the mechanisms of search benefit, and what larger transformations this shift from intention to algorithm may entail.
I put together two more Wikipedia word clouds, in part because I wanted an excuse to work on my Python coding skills, and in part because I enjoy word clouds as an interesting visualization. For these word clouds, I used a Python script to organize the information I scraped from the Wikipedia Zeitgeist page (see prior post for link). The resulting file listed the titles of articles and the number of times each article had been edited for the month(s) it had made the list. By running this file through the Wordle software, I was able to produce a word cloud that displays the titles with their relative sizes determined by the number of edits they had received in a single month.
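For the curious, the core of that step can be sketched roughly as follows. This is a minimal reconstruction, not my actual script: it assumes the scraped data has already been reduced to (title, edit count) pairs, and it emits Wordle's advanced-input format of one word:weight line per article, using the tilde character to keep multi-word titles together (as I recall Wordle's convention).

```python
# Convert (article title, edit count) pairs into Wordle's "advanced"
# input format: one "word:weight" line per article.
# NOTE: the input layout here is an assumption for illustration; the
# real scraper's output format may differ.

def to_wordle_lines(rows):
    """rows: iterable of (title, edits) pairs -> list of 'word:weight' lines.

    Wordle splits on spaces, so spaces in titles are replaced with '~',
    which Wordle treats as a non-breaking space."""
    lines = []
    for title, edits in rows:
        lines.append(f"{title.replace(' ', '~')}:{edits}")
    return lines

if __name__ == "__main__":
    sample = [("Virginia Tech massacre", 2500), ("George W. Bush", 1100)]
    print("\n".join(to_wordle_lines(sample)))
```

Pasting the resulting lines into Wordle's advanced page then sizes each title by its edit count.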
The image above shows that the Wikipedia article on the Virginia Tech Massacre probably has the largest number of edits in a single month for any one English Wikipedia article, though if you look closely (click through to the larger size on Flickr) you can see some articles, like the one on George W. Bush, represented by many smaller entries in the word cloud. This represents the many months that the George W. Bush article was one of the most edited articles on the English Wikipedia, even though it was never edited nearly as many times in a single month as the Virginia Tech Massacre article.
Here is the same data, with some of the less-edited articles left out. The result is less visually impressive, but a little more legible.
Next, I'll modify my script to count up all the edits and display a cloud showing which titles are the most edited articles on the English Wikipedia ever!
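That tallying step amounts to summing each title's monthly counts into an all-time total. A minimal sketch of the idea, again assuming (title, count) pairs rather than my script's actual data layout:

```python
from collections import Counter

def total_edits(monthly_rows):
    """Sum per-month edit counts into all-time totals per article.

    monthly_rows: iterable of (title, edits_that_month) pairs; a title
    appears once for each month it made the Zeitgeist list."""
    totals = Counter()
    for title, edits in monthly_rows:
        totals[title] += edits
    return totals

if __name__ == "__main__":
    rows = [("George W. Bush", 1100), ("George W. Bush", 950),
            ("Virginia Tech massacre", 2500)]
    # most_common() orders titles by total edits, ready for Wordle weighting
    for title, total in total_edits(rows).most_common():
        print(f"{title}: {total}")
```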
Just for fun, here's a quick and dirty wordcloud built by running the data from the most edited articles on the English Wikipedia (Found here: http://stats.wikimedia.org/EN/TablesWikipediaEN.htm#zeitgeist) through the IBM software that powers the website Wordle.
Wikipedia maintains a massive archive of statistical data on the project. Among this data is a list of the 50 most edited pages on the English Wikipedia. Of these 50 most edited pages, all but two are pages having to do with project maintenance, such as the page that is used to notify administrators of vandalism.
Ellen Ullman, in her excellent memoir Close to the Machine, describes an odd young man she briefly took up with as an on-again, off-again lover. Among his many obsessions was the notion of creating a cryptographic currency, a wholly anonymous and independent banking system. Well, it looks like someone has gone and implemented this idea. The BitCoin project “is a peer-to-peer network based digital currency.” It apparently derives its backing from CPU processor cycles. I'm not exactly sure how that works, but the Ron Paul-esque libertarian dreams of the creators are quite clear in their description of the project's advantages: “Be safe from the instability caused by fractional reserve banking and bad policies of central banks.”
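For what it's worth, the “CPU cycles” part is, as I understand it, a proof-of-work scheme: minting money requires burning processor time on a search problem that is expensive to solve but cheap to check. A hashcash-style sketch of the idea (the actual BitCoin protocol is considerably more involved, so treat this as illustration only):

```python
import hashlib

def proof_of_work(data: bytes, difficulty_bits: int) -> int:
    """Find a nonce such that SHA-256(data || nonce) begins with
    `difficulty_bits` zero bits.

    Finding the nonce takes about 2**difficulty_bits hash attempts on
    average, while checking it takes one -- the sense in which the
    currency is "backed" by CPU time."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(data: bytes, nonce: int, difficulty_bits: int) -> bool:
    """Cheaply confirm that someone spent the work to find this nonce."""
    digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))
```

Whether making money artificially costly to produce can substitute for the social trust behind ordinary currency is, of course, exactly the question I'm raising below.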
I'm pretty sure projects like this get something deeply wrong about the social relationships that money relies on, but I'm not sure exactly what. The individualist mindset behind it all is suspect; money relies on shared social relationships. However, the BitCoin folks clearly imagine a set of relationships among individuals, in the form of the peer-to-peer network. It is easy to explain why peer-to-peer networks do not describe the world as it currently exists. It is more difficult to explain why attempts to build them seem to inevitably fail.
Wikipedia's Neutral Point of View is, understandably, a controversial thing in some circles. I'm still investigating the actual impact of the NPOV, and I think its effects are complicated. However, I think some people are getting hung up on the word “Neutral,” which may imply a sort of “objectivity” that they find distasteful: namely, a vision of objective knowledge that is “neutral” because it is “pure,” somehow detached from the messy world of politics and subjectivity we live in. This critique of the NPOV, I think, is a mistake. I'm not the only one to make this argument; Wikipedians themselves make it in their FAQ for the NPOV, and Joe Reagle has made it as well.
Nietzsche, in The Genealogy of Morals, makes a critique of the notion of “pure reason” or “knowledge in itself” that is not dissimilar, I think, to the criticism sometimes leveled at the NPOV. He writes:
There is only a perspective seeing, only a perspective “knowing”; and the more affects we allow to speak about one thing, the more eyes, different eyes, we can use to observe one thing, the more complete will our “concept” of this thing, our “objectivity,” be.
I couldn't help but be struck by how similar Nietzsche's call for an “objectivity” based on a multitude of perspectives is to the NPOV itself! One section of the policy reads:
[The NPOV] is not a lack of viewpoint, but is rather an editorially neutral point of view. An article and its sub-articles should clearly describe, represent, and characterize all the disputes within a topic, but should not endorse any particular point of view. It should explain who believes what, and why, and which points of view are most common.
Of course, this proves nothing about how the NPOV is actually employed in practice, which is considerably more complicated. Still, I think it makes for an interesting comparison.
Hello! And welcome to the blog!
Here on Copyvillain, I hope to work toward a critical examination of what's been called Free Culture, Peer-to-Peer Culture, or Peer Production. Basically, all of these terms refer to the mode of producing culture that we find on Wikipedia, in which many loosely organized collaborators work together to produce a larger text without a strictly hierarchical organization.
I think it is important to explain what I mean when I say I intend to take a “critical” approach to this method of production. I do not come here to bury Wikipedia (or YouTube, or hacker spaces, or what have you). These are some of my favorite things. I think that these projects, and the people involved with them, are often animated by a tremendous idealism, and a wonderful sense that they are working together to build a better future for all of humanity.
I think that idealism is real, in fact I'm banking on it. What I'd like to do here is to argue that some of the cultural assumptions Peer Production brings with it from capitalism may serve to undercut the very idealistic goals that its practitioners embrace. My hope is that their genuine commitment to a better, fairer, human future will motivate them to move away from these assumptions and towards a new vision for Peer Production based on a broader understanding of human equality and shared responsibility.
Of course, I'll also be posting some short write ups on current news, just keeping abreast of things.