The Problem with Google’s Advertising Model (and what you can do about it)

How household brands are unwittingly "funding terrorism" through programmatic advertising

There has been a shift in how online advertising works. Because advertisers now have access to large amounts of consumer data, they are able to specifically target individual consumers. This should make advertising much more effective. The shift away from the old-fashioned blanket bombing approach to a targeted one has, however, had one unfortunate side-effect.

Programmatic advertising: how it works... or doesn't

Advertisers purchase an ad slot that will supposedly play to a suitable demographic (the right age, education level, etc.), but because most of these purchases are made by algorithm, it is likely they will have no idea which video their ad will appear alongside. They have an abundance of data on the audience, but almost none on the context.
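To make the asymmetry concrete, here is a minimal sketch of that kind of audience-only bid decision. All names (`decide_bid`, the request and campaign fields) are illustrative assumptions, not any real exchange's API; real bid requests carry many more fields.

```python
# Hypothetical sketch of an algorithmic bid decision that sees only
# audience data. Field names and the request format are invented here
# for illustration.

def decide_bid(bid_request, campaign):
    """Return a bid price if the audience matches the campaign's targeting,
    or None to skip the auction."""
    audience = bid_request["audience"]  # rich detail: age, education, ...

    matches = (
        campaign["min_age"] <= audience["age"] <= campaign["max_age"]
        and audience["education"] in campaign["education_levels"]
    )
    if not matches:
        return None  # no bid

    # Note what is *not* checked: the request carries nothing about the
    # video the ad will run alongside, so the buyer bids blind on context.
    return campaign["max_cpm"]

request = {"audience": {"age": 34, "education": "degree"}}
campaign = {"min_age": 25, "max_age": 45,
            "education_levels": {"degree"}, "max_cpm": 4.50}
print(decide_bid(request, campaign))  # → 4.5
```

The point of the sketch is the missing branch: nothing in the decision inspects the content the ad will sit next to.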

The problem here is twofold. On the one hand, Google (which incidentally "provides most of the plumbing that enables programmatic") is not adequately vetting the space it sells before making it available; on the other, media agencies are not checking what they buy on behalf of their clients – the household brands now very publicly linked to terrorism.

Check, please

Google-owned YouTube does monitor video content, but its filtering is mostly automated: the title, metadata and imagery are scanned for indications of material that breaches its rules. Beyond that, it leaves it up to users to flag offensive videos. The difficulty with this is that if you are among a niche video's target audience – say, a member of a right-wing political group – you're unlikely to object to what its creator posts.
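A toy sketch shows why metadata-only screening is easy to slip past. The function name, flagged-term list and video fields below are all invented for illustration; real moderation systems are far more sophisticated, but the blind spot is the same in kind.

```python
# Minimal sketch of keyword-based pre-screening that looks only at a
# video's title and tags. FLAGGED_TERMS and the field names are
# illustrative assumptions, not a real platform's policy.

FLAGGED_TERMS = {"violence", "extremist"}

def passes_automated_check(video):
    """Approve the video unless its title or tags contain a flagged term."""
    text = (video["title"] + " " + " ".join(video["tags"])).lower()
    return not any(term in text for term in FLAGGED_TERMS)

# A video with innocuous metadata sails through even if the footage
# itself breaches the rules; catching it then depends on viewers
# flagging it -- which a sympathetic niche audience will not do.
video = {"title": "Community update #12", "tags": ["news", "politics"]}
print(passes_automated_check(video))  # → True
```

The filter never looks at the footage, only at text the uploader controls, so anyone motivated to evade it simply writes neutral metadata.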

Round-up

There's a theme here: the ad-space auction is automated, and the vetting is done by algorithm. Moreover, a recent study suggests that around a third of the pageviews sites receive may actually come from bots rather than real consumers. So where are all the people?

As global marketing companies remove their clients' advertisements from Google and YouTube, it's clear that these technology giants need to manage their content better. This will involve improved data analysis and more advanced software, but it will also require a large amount of human judgment, something that has previously been lacking.

The other lesson is to do with us: we need to exercise our power as a curating crowd. We can't expect the internet to be self-governing – we, the users, have a responsibility to contribute to the web's governance if we want it to reflect our values.
