The following is an excerpt from an article by David Murphy | June 27, 2016 | Foxnews.com |
If you try posting copyrighted material on Facebook, like a music video that isn't yours, odds are good that the service's systems will be able to find that you've done so based on the unique fingerprint the video file has. If this fingerprint, or "hash," matches up with a known list of copyrighted material, that's it: your video is flagged and off it goes into the digital ether.
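The matching described above can be sketched as a simple lookup: compute a fingerprint of the uploaded file's bytes and check it against a database of known fingerprints. This is a minimal illustration only; the blocklist below is hypothetical, and real platforms typically use perceptual hashing that tolerates re-encoding, whereas a plain cryptographic hash like the one here only catches byte-identical copies.

```python
import hashlib

# Hypothetical blocklist of known fingerprints. Real systems maintain large,
# shared databases of flagged content rather than a hard-coded set.
KNOWN_HASHES = {
    hashlib.sha256(b"example flagged video bytes").hexdigest(),
}

def fingerprint(data: bytes) -> str:
    """Compute a fingerprint ("hash") of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Flag an upload if its fingerprint matches a known entry."""
    return fingerprint(data) in KNOWN_HASHES

print(is_flagged(b"example flagged video bytes"))  # True: exact byte match
print(is_flagged(b"an original home movie"))       # False: no match
```

Note that changing even one byte of the file produces a completely different SHA-256 hash, which is why production systems like YouTube's Content ID rely on content-aware (perceptual) matching instead of exact file hashes.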
And the same is true on a number of other platforms. Most have heard about YouTube's Content ID program, for example, which scans uploads (including their audio tracks) against an existing database of copyrighted work and flags anything that matches too closely.
However, new reports suggest that some of the bigger players in social media and content hosting are turning their matching tools from copyright to content: specifically, extremist and/or exceedingly violent content. While these techniques don't stop new videos from being posted to the site, they can help police those attempting to share videos that have already been flagged or removed.
As Reuters reports, Facebook and Google are two companies that are allegedly tuning their content-matching systems to eliminate extremist content. However, neither is talking about it, nor are they discussing how, exactly, they decide this kind of content fits the criteria for removal. Some content is obvious: a beheading, for example. But where does one draw the line between, say, encouraging violence and passionate rhetoric?
It's also unclear whether these companies are relying exclusively on their matching systems to find and remove this kind of content or whether some human review process is used to separate permissible from undesirable content.
The companies now doing this kind of content matching aren't staying quiet because they feel shy about policing their platforms. Rather, discussing the methods they use to flag and remove extremist content might give those posting it some kind of insight into how to beat the system. And these companies, who all likely have different standards about what's acceptable on their networks, would probably prefer that the effort involve as little cat-and-mouse as possible.
For more visit: Foxnews.com