Stolen photos of stars find 'safe harbor' online

Published on NewsOK • September 4, 2014 • Modified: September 4, 2014 at 6:30 p.m.

WHY WAS A SAFE HARBOR NEEDED?

If websites could be held liable for copyright violations, they would be thrust into the position of making a judgment call on every piece of content before it's posted online. That would be a daunting task, given the volume of material that Web surfers share on the Internet today. About 144,000 hours of video are uploaded to YouTube alone each day, Twitter processes more than 500 million tweets per day, and Facebook's 1.3 billion users share billions of photos.

"The platforms that host that content can't readily police all of it the way that a newspaper can carefully select what should go in as a letter to the editor," says Harvard University Law School professor Jonathan Zittrain, who is also co-founder of the Berkman Center for Internet & Society.

Some pre-screening of content still happens. YouTube blocks some videos from being posted through a copyright-screening tool it developed after Google acquired the site.

Not all copyright violations are caught, though, so Google is still inundated with takedown requests. In the past month alone, Google says it received requests to remove more than 31 million links from its search engine index that direct traffic to content cited as infringing. That figure doesn't include content posted on YouTube or on its blogging service. Google says it complies with the overwhelming majority of the takedown requests.

It's probably a good thing that websites aren't asked to decide what's legal and what's not, says Corynne McSherry, intellectual property director for the Electronic Frontier Foundation, a group focused on digital rights. She worries that big companies would be likely to err on the side of caution and block more content than necessary, because they wouldn't want to risk being held liable for something that could dent their earnings and stock price. Small startups, meanwhile, would be prone to block even more content because they can't afford anything that could drain their finances.

"The Internet, as we know it, would not exist if it were not for the DMCA's safe harbor," McSherry says. "If we are ever in a position where Internet service providers have to monitor their sites, I think Internet users will lose."

DON'T WEBSITES ALREADY BLOCK OR REMOVE MATERIAL THAT DOESN'T INVOLVE COPYRIGHT VIOLATIONS?

Yes, but those decisions typically involve violations of a website's own rules. For instance, YouTube and Facebook try to block pornographic images from appearing on their services. Both of those sites, along with Twitter, also forbid graphic violence, such as the recent beheadings of U.S. journalists videotaped by the Islamic State militants who killed them. In many instances, though, the websites still rely on their own users to flag posted content that violates the terms of service.

"The lasting test here is of the ethical moment that users face when they choose to seek out or repost photos they know weren't meant to be public," Zittrain says.