Major marketers’ ads have likely been showing up on or near YouTube videos promoting terrorism, neo-Nazi groups and other objectionable web content for a long time. So why has the brand-safety problem suddenly burst into the open, prompting big advertisers such as General Motors, Walmart, Verizon, AT&T and Johnson & Johnson to stop spending on YouTube or other Google properties? Thank – or blame – Eric Feinberg, a longtime marketing-services executive who in recent months has made it his mission to find ad-supported content linked to terror and hate groups, then push links and screen shots proving it to journalists in the U.K. and U.S.
The resulting coverage has sparked a full-fledged advertiser revolt.
Feinberg owns Southfield, Mich.-based Gipec, short for Global Intellectual Property Enforcement Center, which employs “deep web interrogation” to find keywords and coding linked to terrorism and hate speech.
He’s also co-owner of a patent issued in December for a “computerized system and method for detecting fraudulent or malicious enterprises.” His system works in part by analyzing when videos and websites contain words that appear alongside such phrases as “kill Jews.” He’s logged thousands of sometimes innocuous- or obscure-sounding terms he says “co-trend” with such hate speech or exhortations to violence, which in turn helps him find offensive videos.
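Feinberg describes the approach only in general terms, and the patent does not spell out his implementation. As a rough illustration of what that kind of co-occurrence analysis can look like, the sketch below flags terms that show up alongside known hate phrases far more often than they appear in a corpus overall; the seed phrases, thresholds and scoring are assumptions for illustration, not details from Feinberg’s system.

```python
# Illustrative sketch of co-occurrence ("co-trending") keyword detection.
# Seed phrases, thresholds and scoring are assumed for illustration only;
# they are not taken from Feinberg's patent or Gipec's software.
from collections import Counter

SEED_PHRASES = {"kill jews"}  # known hate/terror phrases (assumed example)

def tokenize(text):
    return [w.strip(".,!?\"'").lower() for w in text.split()]

def co_trending_terms(documents, min_ratio=3.0, min_count=5):
    """Return terms that appear alongside seed phrases far more often
    than they appear across the whole corpus."""
    overall = Counter()    # how many documents contain each term
    with_seed = Counter()  # same, but only for documents with a seed phrase
    seed_docs = 0
    for doc in documents:
        text = doc.lower()
        words = set(tokenize(text))
        overall.update(words)
        if any(phrase in text for phrase in SEED_PHRASES):
            seed_docs += 1
            with_seed.update(words)
    total_docs = max(len(documents), 1)
    flagged = {}
    for term, count in with_seed.items():
        if count < min_count:
            continue
        seed_rate = count / max(seed_docs, 1)
        base_rate = overall[term] / total_docs
        if base_rate and seed_rate / base_rate >= min_ratio:
            flagged[term] = seed_rate / base_rate
    # Candidate "co-trending" terms, usable as search keys for new videos.
    return flagged
```

In practice the flagged terms would then be fed back as search queries to surface videos that never mention the seed phrases directly, which is how an innocuous-sounding word can become a marker for offensive content.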
His efforts with the media have been classic problem-solution marketing. Feinberg makes no bones about his interest in licensing his technology to Google and other digital platforms to monitor offensive content and keep ads away from it.
Certainly, Google knows plenty about artificial intelligence and machine learning, as its executives have eagerly informed marketers in public and private presentations for years. And last week, as major advertisers one after the other pressed “pause” on YouTube advertising, Google said in a blog post that it’s beefing up its tech efforts and hiring more people to prevent placement of ads with unsavory content. A spokesman declined to comment further.
But Feinberg, in a March 24 interview, says he doubts Google can succeed. At least, “not without violating my patent.”
Seemingly, there shouldn’t be a market for what Feinberg has to sell. Brand safety, or monitoring ad placements to prevent brands from appearing alongside porn and other embarrassing content, is a standard part of offerings from digital audience measurement firms such as Moat, which in late 2015 became the first such company invited in by YouTube to monitor the site for agencies and brands.
Moat CEO Jonah Goodhart declined to comment on Feinberg’s system because he’s not familiar with it, but conceded that the ad-placement issue “is a topic that seems to be important to folks.”
“I don’t know that an advertiser is able to see the URL where their ad runs by policy of some of the major sites,” Goodhart says in an e-mail. “So one can crawl the web and find bad content, but that’s a different thing than stopping an advertiser from appearing on it. Again, though, I don’t know him or his company so maybe they do something else.”
Feinberg wants to sell his tech to the digital platforms themselves, not to brands, agencies or third-party monitors. He says that’s because he’s not just out to make money but to stop terrorists and hate groups from making money off digital advertising. He likens the idea of selling his tech piecemeal to brands and agencies to “fixing your toilet or sink at the house when the problem is at the sewer or the reservoir.”
“They aren’t really understanding key trending or key words, and they’re not looking for it like we do,” says Feinberg. “I have a database of thousands of words and phrases linked to nefarious activity.”
One is a Serbian word roughly transliterated as “hanwa,” which has been co-trending with jihadist content online. YouTube Red, YouTube’s ad-free subscription service, advertised its movie “The Thinning” on one such “hanwa” video on Saturday, according to a link and screen shot he sent.
Whether all this will help Feinberg close a deal with Google or anyone else is hard to say. If he doesn’t and Google develops its own solution, he can try to stake a patent claim. Does that make him a patent troll? He argues he’s doing something different from patent litigators who apply an obscure patent to something tech firms were already doing anyway. Feinberg, by contrast, has gone to great and public lengths in recent weeks to demonstrate that his technology can root out problems Google hasn’t found.
Certainly, he’s no typical ad-tech guy. He spent nearly two decades running an event marketing, promotion and social-media activation firm focused on sporting events under the name Extreme Advertising & Promotion, based variously in New York and Boca Raton, Fla. Among his efforts were branded tractor-trailer wraps during the Super Bowl in Miami in 1999. He says he’s had numerous Fortune 500 clients over the years, including Motorola, General Motors, Sprint and PepsiCo.
Work on NFL promotions alerted Feinberg to the problem of counterfeit licensed merchandise, prompting him to form Fans Against Kounterfeit Enterprise, or FAKE, a company focused on rooting out digital ads from largely Russian and Chinese purveyors of fake but authentic-looking goods.
He says he also began working with musicians five years ago to track down pirated videos on YouTube and elsewhere, so the artists could either have them taken down or get compensated. Two years ago, following the terror attack on the Charlie Hebdo newspaper in France, he began turning his attention to terrorist-linked content online.
“I’m sickened by it,” says Feinberg, contrasting ads that support terrorism and hate groups with the brand promotions he once handled. “I’m proud of what I did. That’s why I’m doing this. It’s a level of passion.”