EFF lays out plan to end online harassment

January 9, 2015
The Electronic Frontier Foundation (EFF), the non-profit digital rights advocacy group known for its strong public stances on topics like Net Neutrality, piracy and privacy, on Thursday expanded its focus with a blog entry identifying online harassment as a digital rights issue that stands in the way of true freedom of speech.

The big problem, as EFF co-authors Danny O'Brien and Nadia Kayyali readily acknowledge, is that it's hard to address online harassment without pushing up against the Internet's cardinal benefits of open communication and free discourse. 

"Unfortunately, it's not easy to craft laws or policies that will address those harms without inviting government or corporate censorship and invasions of privacy--including the privacy and free speech of targets of harassment," writes the EFF. "[But] there are ways to craft effective responses, rooted in the core ideals upon which the Internet was built, to protect the targets of harassment and their rights."

The (long) post goes on to lay out exactly what the EFF means by harassment -- defined as "[what] happens when Internet users attract the attention of the wrong group or individual." Escalating levels of harassment, from inundating a target with horrific violent or sexual imagery all the way up to death threats, are frequently used to silence women, minorities and members of other disadvantaged groups -- which runs counter to the whole "equality of speech" thing the Internet is supposed to be about. 

"The sad irony is that online harassers misuse the fundamental strength of the Internet as a powerful communication medium to magnify and coordinate their actions and effectively silence and intimidate others," the EFF wrote.

For a case in point, check out the ongoing #GamerGate "movement," which began as an effort to silence a woman in game development through bullying and has grown into an ongoing campaign of harassment involving credible death threats, the release of private information like home addresses and plenty of other unsavory behavior. 

The obvious solution would be stronger laws. But as the EFF notes, plenty of well-meaning laws already on the books carry troubling unintended consequences. They can also be applied unevenly: there are documented cases of police telling victims of online harassment to simply turn off the computer, or dismissing the abuse as kids being kids. 

From the private sector, "real name" policies like Facebook's are great for improving transparency and tying harassers back to their everyday lives. But they come at a cost for people who need the protection of online anonymity, from corporate whistleblowers to citizens of totalitarian regimes to victims of domestic violence. The same goes for practices like logging IP addresses or physical locations. 

Moreover, the private sector often relies on a community-led approach to taking down offensive content: It's up to individual users to flag something as problematic for review by the publisher (Facebook, Twitter, the webmaster at hand), and an overworked content-policing team is just as likely to take down an inoffensive page as one that offends. (That's if they respond at all.) And given that doing the right thing is not necessarily a profit center for any business, social networks are hardly incentivized to change that. 

"Companies' primary focus is on revenue and legal safety. Many would be happy to sacrifice free expression if it became too expensive," the EFF warned.

That's a tall stack of problems. How do you stop harassment without stepping on anyone's rights, without censoring users who really are just expressing an opinion, and without forcing Facebook and others to spend more money? 

"We think that the best solutions to harassment do not lie with creating new laws, or expecting corporations to police in the best interests of the harassed. Instead, we think the best course of action will be rooted in the core ideals underpinning the Internet: decentralization, creativity, community, and user empowerment," the EFF said.

The EFF's primary answer -- alongside better, more specific and more fairly enforced laws -- is to give users more control. The nonprofit group's advice ranges from helping communities police their social streams collectively, rather than leaving each individual to monitor their own, all the way up to opening more APIs so citizen developers can build their own anti-harassment tools.

For a good example of what this could look like in practice, check out Randi Harper's Good Game Auto Blocker, which maintains a centralized list of known Twitter harassers (started in the midst of the aforementioned #GamerGate) and lets any user run it and automatically block the lot of them. 
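To make the idea concrete, here is a minimal sketch of how a shared block list like that could be consumed, assuming Twitter's v1.1 blocks/create REST endpoint and OAuth1 credentials. The block-list URL, file format (one numeric user ID per line) and credentials below are placeholders for illustration, not details of Harper's actual tool.

```python
# Hypothetical sketch: apply a community-maintained block list to your own
# Twitter account. Assumes the v1.1 blocks/create.json endpoint and OAuth1
# credentials; the list URL and keys below are placeholders, not real values.
import requests
from requests_oauthlib import OAuth1Session

BLOCK_LIST_URL = "https://example.com/shared-blocklist.txt"  # hypothetical list, one user ID per line


def load_block_list(url):
    """Download the shared list of user IDs maintained by the community."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return [line.strip() for line in response.text.splitlines() if line.strip()]


def block_users(user_ids, credentials):
    """Block each listed account on behalf of the authenticated user."""
    twitter = OAuth1Session(
        credentials["consumer_key"],
        client_secret=credentials["consumer_secret"],
        resource_owner_key=credentials["access_token"],
        resource_owner_secret=credentials["access_token_secret"],
    )
    for user_id in user_ids:
        resp = twitter.post(
            "https://api.twitter.com/1.1/blocks/create.json",
            data={"user_id": user_id, "skip_status": "true"},
        )
        print(user_id, "blocked" if resp.ok else "failed (%s)" % resp.status_code)


if __name__ == "__main__":
    creds = {
        "consumer_key": "...",
        "consumer_secret": "...",
        "access_token": "...",
        "access_token_secret": "...",
    }
    block_users(load_block_list(BLOCK_LIST_URL), creds)
```

The point of the design is that the labor of identifying harassers is done once by the community, while each user keeps full control over whether, and when, to apply the list to their own account.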

The EFF also calls for simpler tools to see what information about yourself is available on the public net -- the better to guard against doxxing -- and easier ways to cloak yourself in anonymity without special technical know-how. Another noteworthy measure suggested by the EFF: tools that help users capture evidence of actual harassment. 
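As a rough illustration of what such a capture tool might do (this is an assumption, not anything the EFF has built), the sketch below saves a timestamped snapshot of a public page -- say, a harassing post -- so that a record survives even if the content is later deleted.

```python
# Illustrative sketch only: store a timestamped snapshot of a public page
# along with a content hash, so the target has evidence of the harassment
# even if the original post disappears.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

import requests


def capture(url, out_dir="evidence"):
    """Fetch the page and store its raw HTML plus a small metadata record."""
    response = requests.get(url, timeout=15)
    response.raise_for_status()

    captured_at = datetime.now(timezone.utc).isoformat()
    digest = hashlib.sha256(response.content).hexdigest()

    folder = Path(out_dir)
    folder.mkdir(exist_ok=True)
    (folder / (digest[:12] + ".html")).write_bytes(response.content)
    (folder / (digest[:12] + ".json")).write_text(json.dumps({
        "url": url,
        "captured_at": captured_at,      # when the snapshot was taken
        "status_code": response.status_code,
        "sha256": digest,                # fingerprint of the saved content
    }, indent=2))
    return digest


if __name__ == "__main__":
    print(capture("https://example.com/some-harassing-post"))
```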

And finally, the EFF suggested that people who see harassment just say something -- especially if they're not likely to become victims of harassment themselves. Solidarity is key. 

The EFF's suggestions are just that -- suggestions. But the group has a long history of speaking out, and its attention to the issue is a good indicator that, after a disastrous 2014, the tide is turning and 2015 may well be the year the tech industry gets serious about building better communities. 

(www.computerworld.com)

Matt Weinberger
