Robots: as smart as the morons who program them

A great example of the law of unintended consequences: last night the livestream broadcast of the Hugo Awards at Worldcon got cut off. Well, technical issues happen to all sorts of broadcasts - except that this interruption wasn't a failure; it was the system behaving exactly as intended. The Worldcon livestream was blocked by Ustream for copyright violations.

Wait, a live broadcast violating copyright? How did that happen?

Ustream, like many other video sites, fingerprints known copyrighted commercial works - music videos, films, etc. - using techniques similar to those described in Avery Li-chun Wang's paper on Shazam's algorithm. If someone tries to upload a copyrighted work such as the latest Taylor Swift music video, the video site fingerprints the upload and spots a common pattern with the canonical Taylor Swift video; the upload is then marked as infringing copyright and access is disabled. Job done.
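To give a flavour of how this kind of matching works, here is a minimal sketch in the spirit of Wang's approach (not Ustream's actual system, whose details aren't public): spectrogram peaks are paired into combinatorial hashes, and a query matches a stored track when many hashes agree on a single time offset. The track names and synthetic peak data below are purely illustrative.

```python
from collections import Counter, defaultdict

def hashes_from_peaks(peaks, fan_out=3):
    """Pair each spectrogram peak (time, freq) with a few later peaks.
    Each (freq1, freq2, time_delta) triple becomes a hash; the anchor
    time is kept alongside so offsets can be compared at match time."""
    peaks = sorted(peaks)
    out = []
    for i, (t1, f1) in enumerate(peaks):
        for t2, f2 in peaks[i + 1 : i + 1 + fan_out]:
            out.append(((f1, f2, t2 - t1), t1))
    return out

def best_match(db, query_peaks):
    """db maps track name -> list of (hash, anchor_time) pairs.
    A genuine match shows up as many hashes agreeing on one offset
    between database time and query time; noise does not."""
    query = hashes_from_peaks(query_peaks)
    scores = {}
    for name, track_hashes in db.items():
        index = defaultdict(list)
        for h, t in track_hashes:
            index[h].append(t)
        offsets = Counter()
        for h, tq in query:
            for tdb in index.get(h, []):
                offsets[tdb - tq] += 1
        scores[name] = max(offsets.values()) if offsets else 0
    return max(scores, key=scores.get), scores

# Two synthetic "tracks" as (time, frequency-bin) peak lists.
track1 = [(t, (t * 7) % 50) for t in range(100)]
track2 = [(t, (t * 11) % 50) for t in range(100)]
db = {"track1": hashes_from_peaks(track1),
      "track2": hashes_from_peaks(track2)}

# A 20-second excerpt of track1, re-based to start at time 0 -
# like a clip shown during an awards ceremony.
query = [(t - 20, f) for (t, f) in track1 if 20 <= t < 40]
name, scores = best_match(db, query)
```

The crucial point for this story is that the matcher only answers "does this stream contain fingerprints of a known work?" - it has no notion of licence, permission, or fair use.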

What happened with Worldcon is that, in common with other awards ceremonies, they showed short video clips of the contenders for each award. Some of those clips were of copyright material; the fingerprinting robots spotted the material, flagged the livestream as infringing copyright and shut down access. This despite the fact that a) the copyright holders did not object to the inclusion of the material - indeed, they were keen for it to be used, and b) use of clips falls squarely within the safe harbour provided by fair use in copyright law.

So here's the downside of automatic policing of copyright infringement - it is only as smart and fair as the robots doing the policing. Given the rabid nature of copyright holders' lawyers, all the effort will go into making the robots effective; no-one with money will care whether or not they are fair.

