    Tuesday, April 16, 2024

    Giving humans role in defining digital self

    Sascha Meinrath has been teaching his 4-year-old daughter to lie.

    When a website asks her favorite color, she says "plaid." If a game she's playing asks about her favorite pet, the answer is "giant squid." When queried about her best friend's name, it's "Beelzebub."

    He doesn't tell his daughter that these are lies; instead he portrays these fanciful answers as "telling stories," or building a "character" she can use to portray herself in online activities. It's a sort of gamification of the Meinrath family misinformation campaign.

    "The idea here is to prevent a 4-year-old from having a profile that's going to follow her for the rest of her life," Meinrath explains. "That scares me, because she cannot possibly understand the import of the data that she is providing."

    Meinrath is director of X-Lab, a nonprofit that "anticipates the disruptions and dystopian outcomes of tech policy decisions and aims to help humanity change course."

    He's telling this story at SXSW in Austin, Texas, the annual tech confab, whose attendees view large-scale, opaque data collection and surveillance as evil when practiced by government - but a totally legitimate business model when practiced by private companies.

    This audience is not terribly sympathetic to regulatory proposals to curb online data collection and behavioral targeting, or, as with the European Union's "right to be forgotten," allow users to expunge their online records altogether.

    But the solution Meinrath emphasizes is a different one: He's pushing for consumers to be able to easily correct the data that online trackers accumulate about them. This means someday, when his daughter is mature enough to understand the implications, she would get to amend the digital profile online data-trackers have assembled about her, so that employers, lenders, landlords and insurers who might possibly purchase information about her won't believe her best friend is really Satan.

    Meinrath worries not only about how much data companies are collecting; he also worries about their drawing the wrong conclusions from the information they have. What if, he posits, a retailer mistakenly categorizes a conservative Southerner as gay because a flawed, stereotype-inflected algorithm correlates his color preferences with homosexuality? To whom might that wrong assumption be passed along?

    Meinrath says he wastes a lot of time each month trying to correct faulty automated interpretations about him. He's a man named Sascha, after all, and corporations irksomely address him as "Mrs." - a lot. But he says more expensive inconveniences abound, including a monthly hold on his credit card: one recurring, authorized transaction always trips the fraud alert, even though he's repeatedly told his credit card company to expect the charge. He wants an easy means of telling vendors: Ignore what your algorithms say about who I am or what I should be purchasing; here's the right information.

    I certainly understand his frustration, given the number of misdirected ads flung my way. Ever since I changed my status on Facebook to "married," my newsfeed has been full of ads about morning sickness, baby names, "18 Pregnancy Announcements That Run The Gamut from Terrible to Terrific," and a quiz promising to help me decipher whether I'm gestating a girl or a boy.

    Thing is, I'm not pregnant. But Facebook advertisers are acting like a nosy mother anyway, making assumptions about my reproductive plans, presumably because statistics show that newlywed young women like me are often breeding.

    It's hard to say which feels more intrusive: when targeted ads get deeply personal things right, or when they get deeply personal things wrong. The impulse to correct others' mistakes about you is very strong, even when giving in to that impulse means divulging more information than is in your interest (as investigative journalists have long known).

    I'm not yet convinced Meinrath's solution to Panopticon-like corporate data collection is the best one. After all, what if the data profile online trackers construct about you turns out to be true, but still includes sensitive information you don't want shared with the highest bidder? What if you're a conservative Southerner who actually is gay, just as that hypothetical retailer deduced? I'd wager that's still no one's business but your own.

    When I asked Meinrath why he focuses on the right to correct data rather than limit its collection, he essentially said the horse is out of the barn. Privacy advocates have fought for Do Not Track policies for years now, and achievements thus far have been pretty toothless, while data trackers have gotten infinitely more sophisticated.

    Consumers' last best hope, he argues, is correctible data tracking.
