The five New Jersey men and seven women who decided Dharun Ravi’s fate last Friday had compelling reasons to find him guilty of a hate crime. Here was a young man training a webcam on his roommate, live-streaming the roommate’s intimate encounter with another man, and using social media tools such as Twitter to advertise the video and invite his friends to watch. These actions drove the roommate, Tyler Clementi, to leap to his death off the George Washington Bridge, and they led the jury to hold Ravi responsible. Talking to the press shortly after the verdict was read, Marcellus A. McRae, a former federal prosecutor who has been following the case, stated that the decision was “a watershed moment, because it says youth is not immunity.”
It isn’t. But the Internet just might be: If the Rutgers case is a watershed moment, it’s because it forces us to come to terms with the implications of technology having grown faster and wilder than the social norms and the legal edicts designed to keep it—and us—in check.
A few vital caveats: There can be little doubt that homophobia played a considerable role in the events leading to Clementi’s death, a painful reminder that much public education is still needed before one’s sexual orientation is no longer considered something to gawk at, mock, or assail. And there can be no doubt that Ravi, like the rest of us mortals, possesses a sense of agency, and therefore could have, and should have, treated Clementi with the dignity and respect he so dearly deserved. But these key points aside, let us examine the environment that allowed this tragedy to take shape.
As a professor of digital media, I spend much time both researching our nascent modes of communication and observing young men and women interacting with, and through, them. I’m still a geek at heart—technology tends to make me giddy—but the more I think about it, the more things seem grim. Put bluntly, I believe that for all its many and undeniable advantages, the suite of media, technologies, and practices collectively known as Web 2.0 is facilitating a radical reimagining of what it means to be human, dimming the critical faculties, sanctioning speed over contemplation, and spawning a host of what could only be called nonpersons.
That’s not my term. It’s Jaron Lanier’s. A celebrated computer scientist and a founding father of virtual reality, Lanier gradually came to see the vicissitudes of technology as a spiraling dive into barbarity. The problem, he argued in his controversial manifesto You Are Not a Gadget, is that Facebook, Twitter, and the other platforms that govern and regulate so many of our exchanges with our fellow human beings are information systems, and as such they demand, well, information: favorite bands, lists of friends, quick posts, snapshots.
But information, Lanier observed, under-represents reality. We are more than lovers of Phish or graduates of Yale, more than that person tagged in that photo from last night’s party. We contain multitudes too complex for 140 characters to capture. And yet, in our zeal to catch up with our friends and with the times, we reduce ourselves to data. We are so thrilled with the opportunity to keep in touch with so many people with such speed and facility that we agree to limit our thoughts and our feelings to dispatches that are easy to categorize and store in the growing database that is the contemporary Internet. This, Lanier laments, is based on a “philosophical mistake,” the belief “that computers can presently represent human thought or human relationships. These are things computers cannot currently do.”
Lanier is not alone in his critique. Sherry Turkle, a professor of social studies of science and technology at MIT, has come to similar conclusions. An enthusiastic technophile, Turkle too took a turn for the dark in her new book, Alone Together: Why We Expect More From Technology and Less From Each Other. Drawing on a staggering number of ethnographic interviews with young users of technology, she reported on a generation accustomed to thinking of communication as a ceaseless flow in which meaning is tangential and identities blurred.
This, sadly, is the world in which Clementi and Ravi came of age. Examining Ravi’s online missives, one finds little of the particular anti-gay vitriol evident, say, in the words and the deeds of the brutes who murdered Matthew Shepard. They, Aaron McKinney and Russell Henderson, drove Shepard to a remote location, tortured him, tied him to a fence, and left him to die. When tried, they argued the gay panic defense, namely that Shepard’s alleged sexual advances filled them with homicidal rage. Dharun Ravi, on the other hand, tweeted: “Roomate asked for the room till midnight. I went into Molly’s room and turned on my webcam. I saw him making out with a dude. Yay.” In Ravi’s words and actions, one finds only the sophomoric, the titillating, and the thoughtless, which is to say one finds only what one usually finds on the Internet. Ravi’s, then, is no hate crime. Nor is it a mere youthful indiscretion, a case of boys being boys. It is, rather, a solid example of the sort of thing that, because of the advent of fast and ubiquitous platforms of communication, passes for human interaction these days.
The other count Ravi faced, invasion of privacy, is equally complicated. In a much-publicized interview in 2010, just a few months before Ravi and Clementi met, Facebook’s founder, Mark Zuckerberg, announced that as far as he was concerned, the age of privacy was over. Had he started his company now, he announced, he would have insisted that all personal information be public by default.
Zuckerberg wasted little time in acting on his insight. Facebook, like most other web-based services, has since changed its opt-in default—assuming that users want all information kept private unless they specify otherwise—to an opt-out default, which assumes all information is public unless specific boxes are checked in the user’s profile settings. This is no small point. Considering how intricate the software has become, users now have to navigate through more than 50 settings and choose from more than 170 options just to control that seemingly most basic of things: their privacy. It’s a tough task for someone with a Ph.D., let alone a high-schooler. Doing nothing is far easier.
Add to that the omnipresence of cameras in everything from our computers to our phones, and the demand to post more, post quickly, and constantly replenish our walls and our feeds, and you may begin to understand why so much deeply personal stuff makes its way online these days. Teenagers haven’t suddenly become exhibitionists, any more than their counterparts who grew up when the telephone was introduced suddenly became chatterboxes. Then as now, we begin by fashioning tools and end up with the tools fashioning us.
In its own way, Judaism discovered this truth millennia ago; by repeatedly turning down those wishing to convert, for example, the rabbis fashioned a system that separates excited impulsives from thoughtful and sincere believers, accepting the latter into the fold and rejecting the former. We should do the same for our children, by ensuring that the thicket of screens they must navigate makes it hard to share an embarrassing photograph, not hard to keep it to themselves. Finding Ravi guilty of violating privacy laws that have fallen far behind the way we live now does little to solve this problem or address its roots.
And then, as always, there’s education. The children who grow up talking to each other through posts and tweets and likes can imagine no other way of interacting, and we—teachers and parents and other interested adults—are too often too quick to jump on the bandwagon and bless technological progress as inevitable. It’s anything but. As the author Zadie Smith observed in a superb essay about Facebook, so much about that ubiquitous platform, from its color scheme to its architecture, had to do with its founder’s personal preferences. There are other ways, none of them Luddite, to imagine media that are truly social. Rather than strain to be cool by joining our children online, we should offer them useful criticism and, at the very least, help them understand just what it is that they’re doing when they go online, and, more important, just what it is that they’re giving up. A good way to start is by teaching them code: For all of our dependence on software, the overwhelming majority of us are still shockingly ignorant of even its most basic building blocks. Code is as important now as the alphabet; let’s make sure our kids know how to read it by the time they turn 10.
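What might a first lesson in those basic building blocks look like? A minimal sketch, in Python, of the kind of thing a ten-year-old could read and write; the example is illustrative, not a prescribed curriculum:

```python
# A first lesson in code: a variable, a loop, and a condition --
# the basic building blocks of nearly all software.

def shout_vowels(word):
    """Return the word with its vowels capitalized."""
    result = ""                  # a variable stores a value
    for letter in word:          # a loop visits each letter in turn
        if letter in "aeiou":    # a condition makes a choice
            result += letter.upper()
        else:
            result += letter
    return result

print(shout_vowels("alphabet"))  # prints "AlphAbEt"
```

A child who can follow these twelve lines has already learned to read the grammar, if not yet the literature, of the medium shaping her life.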
None of this will bring back Tyler Clementi. And none of this, most likely, would be of much help to Dharun Ravi. But if we don’t do something to change the environment that enabled this tragedy, the next time it plays itself out, it’s ourselves we should put on trial.