
What Is Google’s FLoC Technology?


Photo: David Ramos (Getty Images)

About two weeks ago, millions of Google Chrome users were signed up for an experiment they never agreed to be a part of. Google had just launched a test run for Federated Learning of Cohorts, or FLoC, a new kind of ad-targeting tech meant to be less invasive than the average cookie. In a blog post announcing the trial, the company noted that it would only impact a “small percentage” of random users across ten different countries, including the US, Mexico, and Canada, with plans to expand globally as the trial runs on.

These users probably won’t notice anything different when they click around in Chrome, but behind the scenes, the browser is quietly keeping a close eye on every site they visit and every ad they click on. Their browsing habits will be profiled, packaged up, and shared with countless advertisers for profit. Sometime this month, Chrome will give users an option to opt out of this experiment, according to Google’s blog post, but as of right now, their only option is to block all third-party cookies in the browser.

That is, if they even know these tests are happening in the first place. While I’ve written my fair share about FLoC up until this point, the loudest voices I’ve seen pipe up on the topic are either marketing nerds, policy nerds, or policy nerds who work in marketing. That might be because, aside from a few blog posts here and there, the only breadcrumbs Google has given to people looking to learn more about FLoC are inscrutable pages of code, an inscrutable GitHub repo, and inscrutable mailing lists. Even if Google had bothered asking for consent before enrolling a random sample of its Chrome user base into this trial, there’s a good chance those users wouldn’t know what they were consenting to.

(For the record, you can check whether you’ve been opted into this initial test using this handy tool from the Electronic Frontier Foundation.)

Since Google doesn’t have a good track record of being forthright about its privacy practices, we decided to write up the basics of this tech, the trial, and why FLoC’s promises aren’t actually all they’re cracked up to be.

“WTF is a FLoC?”

In Google’s own words, it’s a “privacy-preserving mechanism for interest-based ad selection.” In normal human words, it’s a way to track users across the web for ad-targeting purposes, in a way that’s more privacy-friendly than the cookies and code advertisers have relied on until now—at least, that’s what Google says.

“How’s it supposed to work?”

It’s a bit complicated. When someone floats from site to site across the web using a FLoC-powered browser, that browser uses an internal algorithm to suss out an appropriate “interest cohort” to lump that person into, and these cohorts get recalculated on a weekly basis. These cohorts, Google says, are made up of thousands of different users at any given time, making it nigh impossible for any sleazy adtech types to track and target your specific browsing history.

Just as an example here: I’m in the middle of refurbishing my apartment, which means I spend a good two hours a day clicking through sites for stores like West Elm, Target, IKEA, and the like. In this situation, my browser could (pretty accurately) label me as a home decor nerd, and lump me into a cohort with thousands of other people that also spend hours poring over couches.

Under FLoC, every cohort is given a name that’s a jumble of letters, numbers, or both, so let’s just call the home-decor cohort HGTV, after the legendary channel of the same name.
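
For the code-curious: in the version Chrome is testing, that jumble gets computed inside the browser by boiling your recent browsing history down with a locality-sensitive hash (Google’s trial uses a technique called SimHash), so people with similar histories end up with similar cohort IDs. Here’s a deliberately simplified TypeScript sketch of the idea; the hash, the bit width, and the weekly recalculation details are all stand-ins, not Google’s actual code.

```typescript
// Toy illustration only: hash each visited domain, then let the hashes
// "vote" on each bit of a small cohort ID, so similar browsing histories
// land on similar IDs. This is a stand-in, not Chrome's implementation.

function hashDomain(domain: string): number {
  // Cheap 32-bit FNV-1a string hash; Chrome uses something stronger.
  let h = 0x811c9dc5;
  for (const ch of domain) {
    h ^= ch.charCodeAt(0);
    h = Math.imul(h, 0x01000193);
  }
  return h >>> 0;
}

function cohortId(visitedDomains: string[], bits = 8): string {
  const votes: number[] = new Array(bits).fill(0);
  for (const domain of visitedDomains) {
    const h = hashDomain(domain);
    for (let i = 0; i < bits; i++) {
      // Each domain's hash votes +1 or -1 on each bit of the cohort ID.
      votes[i] += ((h >>> i) & 1) === 1 ? 1 : -1;
    }
  }
  let id = 0;
  for (let i = 0; i < bits; i++) {
    if (votes[i] > 0) id |= 1 << i; // keep the majority vote for each bit
  }
  return id.toString();
}

// A week of couch shopping might land me in, say, cohort "141".
console.log(cohortId(["westelm.com", "target.com", "ikea.com"]));
```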

The next time I visit a site for tips about, I don’t know, reupholstering my couch, that site can ask my browser which cohort I’m a part of. When it learns that I’m part of the HGTV cohort, the site can keep track of my behavior on-site and the couch ads I inevitably click on, then aggregate that data with data from other folks in the same cohort as they trickle in.
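
In the origin trial, that ask happens through a new JavaScript call Chrome exposes to pages: document.interestCohort(), which returns a promise that resolves to the visitor’s cohort ID. A site that wanted to use it might do something like this (the return shape shown is roughly what the trial exposes and could change):

```typescript
// Ask the browser which FLoC cohort this visitor belongs to.
// In the trial, document.interestCohort() resolves to something like
// { id: "141", version: "chrome.2.1" }.
async function getCohortId(): Promise<string | null> {
  const doc = document as any;
  // Feature-detect: browsers without FLoC (or with it turned off) won't have it.
  if (typeof doc.interestCohort !== "function") {
    return null;
  }
  try {
    const cohort = await doc.interestCohort();
    return cohort.id;
  } catch {
    // The call rejects when FLoC is unavailable, e.g. the user blocks
    // third-party cookies or the site opts out with the
    // `Permissions-Policy: interest-cohort=()` response header.
    return null;
  }
}

getCohortId().then((id) => {
  if (id) console.log(`Visitor is in cohort ${id}`);
});
```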

Every so often, that aggregated data about what the HGTV cohort is into (couch reupholstering! removable wallpaper! granite countertops!) gets uploaded to any ad networks an individual site might be working with.

Let’s just say the network in question is Google Ads since just about every site is using it. If I try browsing an ad-supported news site—like the one you’re on right now—after checking out that Couch Content, that news site will also ask my browser about the cohort I’m in (HGTV).

Once that’s settled, my cohort ID gets beamed to that site’s partnering ad networks, which naturally include Google’s. Based on the data this ad-serving system has already gleaned about this cohort (i.e., they could probably use a new couch), it reaches into its back catalog of ads from the roughly 7 million advertisers waiting to run. The ad platform finds an ad for a new sofa and plunks it on the news site, where I see it, immediately give up on the idea of reupholstering anything, and click.
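
On the ad network’s side, the whole decision boils down to a lookup keyed by cohort ID. The sketch below invents every name in it (cohortInterests, adInventory, the URLs); it’s only meant to show the shape of cohort-based ad selection, not any real ad server.

```typescript
// Hypothetical ad-network side: map what's been learned about each cohort
// to the next ad to serve. Every name and URL here is invented.
interface AdRequest {
  cohortId: string;     // e.g. "141", reported by the visitor's browser
  siteCategory: string; // e.g. "news"
}

const cohortInterests: Record<string, string[]> = {
  "141": ["sofas", "removable wallpaper", "granite countertops"],
};

const adInventory: Record<string, string> = {
  sofas: "https://ads.example/creative/sofa-sale.png",
  "removable wallpaper": "https://ads.example/creative/wallpaper.png",
};

function pickAd(req: AdRequest): string | null {
  // Look up the aggregated interests for this cohort and serve the first
  // interest we actually have a creative for.
  const interests = cohortInterests[req.cohortId] ?? [];
  for (const interest of interests) {
    const creative = adInventory[interest];
    if (creative) return creative;
  }
  return null;
}

// A news site's ad slot requests an ad on behalf of a cohort-141 visitor.
console.log(pickAd({ cohortId: "141", siteCategory: "news" }));
```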

“How is any of this different from the tracking we have now?”

The trackers that FLoC is meant to replace are known as “third-party cookies.” We have a pretty in-depth guide to the way this sort of tech works, but in a nutshell: these are snippets of code from adtech companies that websites can bake into the code underpinning their pages. Those bits of code monitor your on-site behavior—and sometimes other personal details—before the adtech org behind that cookie beams that data back to its own servers.
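
To make that concrete, here’s roughly what such an embedded snippet does once it loads. The adtech domain and endpoint below are made up, but the pattern of phoning home with the page URL while the tracker’s own cookie rides along on the request is the standard one.

```typescript
// Stripped-down idea of a third-party tracker script that a publisher's
// page loads from an adtech company's domain (both the domain and the
// endpoint below are invented for illustration).
function reportPageView(): void {
  const payload = {
    page: location.href,         // which article you're reading
    referrer: document.referrer, // where you came from
    ts: Date.now(),
  };
  // credentials: "include" sends the tracker's own cookie with the request
  // (assuming it was set with SameSite=None; Secure), which is what lets
  // the adtech company stitch your visits together across sites.
  fetch("https://tracker.adtech-example.com/collect", {
    method: "POST",
    credentials: "include",
    keepalive: true,
    body: JSON.stringify(payload),
  });
}

reportPageView();
```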

That data flow is one of the key differences between FLoC and the cookie hell we’re currently mired in. With FLoC, my thousands-strong cohort is the only thing an outside advertiser sees. Everything else, like the names of sites I’ve visited or details about couches I’ve clicked on in the past, is stored locally in the browser. With cookies, all of those details are beamed to an external server where the company in charge has pretty much free rein: it can pawn the data off to other adtech firms, merge it with data from other cookie companies, or, in some cases, hand it over to police.

This is why Google’s pitch sounds semi-appealing. Sure, you’re still being behaviorally profiled in what’s inarguably a kind of icky way, but at least you can’t be picked out of a lineup.

“There’s gotta be a catch here.”

The catch is that Google still has all that juicy user-level data because it controls Chrome. It’s also still free to keep doing what it’s always done with that data: sharing it with federal agencies, accidentally leaking it, and, y’know, just being Google.

“No way.”

Way.

“Isn’t that kind of… anti-competitive?”

It depends on who you ask. Competition authorities in the UK certainly think so, as do trade groups here in the US. It’s also been wrapped up into a congressional probe, at least one class action, and a massive multi-state antitrust case spearheaded by Texas Attorney General Ken Paxton. Their qualms with FLoC are pretty easy to understand. Google already controls about 30% of the US digital ad market, slightly more than Facebook, the other half of the so-called duopoly, which controls about 25% (for context, Microsoft controls about 4%).

While that dominance has netted Google billions upon billions of dollars per year, it has also recently brought multiple mounting antitrust investigations down on the company. And those investigations have pretty universally painted a picture of Google as a blatant autocrat of the ad-based economy, one that largely got away with abhorrent behavior because smaller rivals were too afraid, or unable, to speak up. This is why many of them are speaking up about FLoC now.

“But at least it’s good for privacy, right?”

Again, it depends who you ask! Google thinks so, but the EFF sure doesn’t. In March, the EFF put out a detailed piece breaking down some of the biggest gaps in FLoC’s privacy promises. If a particular website prompts you to give up some sort of first-party data—by having you sign up with your email or phone number, for example—your FLoC identifier isn’t really anonymous anymore.

Aside from that hiccup, the EFF points out that your FLoC cohort follows you everywhere you go across the web. That isn’t a big deal if my cohort is just “people who like to reupholster furniture,” but it gets really dicey if a cohort happens to inadvertently mold itself around a person’s mental health condition or their sexuality based on the sites that person browses. While Google has pledged to keep FLoC from creating cohorts based on these sorts of “sensitive categories,” the EFF again pointed out that Google’s approach is riddled with holes.

“Behavior correlates with demographics in unintuitive ways,” wrote EFF technologist Bennett Cyphers. “It’s highly likely that certain demographics are going to visit a different subset of the web than other demographics are, and that such behavior will not be captured by Google’s ‘sensitive sites’ framing.”

“And Google’s pitching this as a better alternative to cookies?”

I know, right?????????

“How do I get all this across to my uncle/parent/neighbor/estranged nephew that’s not tech-savvy, but wants to know what FLoC is all about?”

Just remind them that this is a privacy product being pushed by Google. Google. That’s all they need to know.
