A blind and opaque reputelligent nosedive
August 2, 2020 2:46 AM

Data isn't just being collected from your phone. It's being used to score you. - "Operating in the shadows of the online marketplace, specialized tech companies you've likely never heard of are tapping vast troves of our personal data to generate secret 'surveillance scores' — digital mug shots of millions of Americans — that supposedly predict our future behavior. The firms sell their scoring services to major businesses across the U.S. economy. People with low scores can suffer harsh consequences."[1]
CoreLogic and TransUnion say that scores they peddle to landlords can predict whether a potential tenant will pay the rent on time, be able to “absorb rent increases,” or break a lease. Large employers use HireVue, a firm that generates an “employability” score about candidates by analyzing “tens of thousands of factors,” including a person’s facial expressions and voice intonations. Other employers use Cornerstone’s score, which considers where a job prospect lives and which web browser they use to judge how successful they will be at a job.

Brand-name retailers purchase “risk scores” from Retail Equation to help make judgments about whether consumers commit fraud when they return goods for refunds. Players in the gig economy use outside firms such as Sift to score consumers’ “overall trustworthiness.” Wireless customers predicted to be less profitable are sometimes forced to endure longer customer service hold times.

Auto insurers raise premiums based on scores calculated using information from smartphone apps that track driving styles. Large analytics firms monitor whether we are likely to take our medication based on our propensity to refill our prescriptions; pharmaceutical companies, health-care providers and insurance companies can use those scores to, among other things, “match the right patient investment level to the right patients.”

Surveillance scoring is the product of two trends. First is the rampant (and mostly unregulated) collection of every intimate detail about our lives, amassed by the nanosecond from smartphones to cars, toasters to toys. This fire hose of data — most of which we surrender voluntarily — includes our demographics, income, facial characteristics, the sound of our voice, our precise location, shopping history, medical conditions, genetic information, what we search for on the Internet, the websites we visit, when we read an email, what apps we use and how long we use them, and how often we sleep, exercise and the like.

The second trend driving these scores is the arrival of technologies able to instantaneously crunch this data: exponentially more powerful computers and high-speed communications systems such as 5G, which lead to the scoring algorithms that use artificial intelligence to rate all of us in some way.

The result: automated decisions, based on each consumer’s unique score, that are, as a practical matter, irreversible.

That’s because the entire process — the scores themselves, as well as the data upon which they are based — is concealed from us. It is mostly impossible to know when one has become the casualty of a score, let alone whether a score is inaccurate, outdated or the product of biased or discriminatory code programmed by a faceless software engineer. There is no appeal.
-IB's grading algorithm is a huge mess
-Edward Snowden: How Your Cell Phone Spies on You

meanwhile, in China...
People With Low Social Credit Scores Have Photo Shown At Movie Theaters - "The Chinese social scoring system has a new twist: if you have a low social credit score, then you are deemed by the government as untrustworthy. Untrustworthy individuals with low social credit scores are journalists who write unfavorable things about the government, or they are people who do not pay their bills. Christians also have low social credit scores and are deemed to be untrustworthy by the government."
Individuals who have a low social credit score are now publicly shamed. It has been reported that individuals with low social credit scores are appearing on IMAX movie screens before the beginning of the movie.

When the Marvel movie ‘The Avengers: Endgame’ was released in China, several individuals with low social credit scores appeared on the screen. This is becoming so common that it is called the “reel of shame.”

They are also appearing on large building screens.

Everything in China is reminding people of their social credit scores – when they go to the bank, get on a train, travel on a bus, go to the ATM or go to a large shopping center.
The Panopticon Is Already Here - "Xi Jinping is using artificial intelligence to enhance his government's totalitarian control—and he's exporting this technology to regimes around the globe."
Xi’s pronouncements on AI have a sinister edge... He wants to build an all-seeing digital system of social control, patrolled by precog algorithms that identify potential dissenters in real time...

China already has hundreds of millions of surveillance cameras in place. Xi’s government hopes to soon achieve full video coverage of key public areas. Much of the footage collected by China’s cameras is parsed by algorithms for security threats of one kind or another. In the near future, every person who enters a public space could be identified, instantly, by AI matching them to an ocean of personal data, including their every text communication, and their body’s one-of-a-kind protein-construction schema. In time, algorithms will be able to string together data points from a broad range of sources—travel records, friends and associates, reading habits, purchases—to predict political resistance before it happens. China’s government could soon achieve an unprecedented political stranglehold on more than 1 billion people...

A crude version of such a system is already in operation in China’s northwestern territory of Xinjiang, where more than 1 million Muslim Uighurs have been imprisoned, the largest internment of an ethnic-religious minority since the fall of the Third Reich. Once Xi perfects this system in Xinjiang, no technological limitations will prevent him from extending AI surveillance across China. He could also export it beyond the country’s borders, entrenching the power of a whole generation of autocrats.
The Global Implications of "Re-education" Technologies in Northwest China - "Finally, and perhaps most importantly in light of the global future of policing and carceral technologies, the U.S. government should introduce legislation, and work with partner nations, to universally ban the collection and use of 'passive' or involuntary biometric information and data surveillance."
  • @doctorow: "But the whole story isn't the walled prisons: it's the entire region, which has been turned into an open air prison where technology tracks and controls predominantly Muslim Turkic people while allowing Han people to go about their business largely unhindered."
  • @bcrypt: "wow, this is some incredible reverse-engineering from a screen recording of a drone video... 'showing 3-400 detainees handcuffed & blindfolded at a train station in Xinjiang [uploaded to YouTube] In this thread I'll share how I've verified that this video was filmed at 库尔勒西站 (41.8202, 86.0176) on or around August 18th.'"
  • @misszing: "I've heard so many awful stories about what's happening in the Uighur concentration camps but this has absolutely broken me (cw: rape)."
  • @VinceCoglianese: "In search of the next Yao Ming, the NBA opened a player development academy in Xinjiang -- where most athletes were Uighurs. The Chinese government ran the facilities and the coaches physically beat the players." (NBA re-evaluating training programme in China after abuse allegations)
U.S. imposes sanctions on Chinese company over abuse of Uighurs - "The Xinjiang Production and Construction Corps is a quasi-military group created in 1954. It was initially made up of demobilized soldiers who spent time in military training while developing farms on the region's arid land. Civilian members from eastern China later joined the corps, which now numbers 3.11 million people, or more than 12% of the region's population. It is almost entirely made up of Han Chinese in a region that is home to the Muslim Uighur people. Experts have said the group is like a 'state within a state' and has established new cities in the region with schools and universities and jurisdiction over police and courts." (Seems like a big deal: U.S. sanctions China's paramilitary in Xinjiang)

Exclusive: SenseTime eyes STAR market IPO after $1.5 billion fundraising - sources - "The company was among eight Chinese tech companies placed on the U.S. entity list in October amid trade tensions between Beijing and Washington. The U.S. alleges the companies have played a role in human rights abuses against Muslim minority groups in China."

What's happening in Xinjiang is genocide - "The new evidence shows that China is systematically using pregnancy checks, forced intrauterine devices, sterilization and even abortion to reduce the population of Uighurs and other Muslims in Xinjiang. Moreover, having too many children is being punished by incarceration in the camps."

Shocked Hong Kong in a new era under 'white knuckle' China grip - "The law has brought a chill and the pulling of pro-democracy books from library shelves, disqualifications of democrats from a city election and the arrests of three teenagers for Facebook posts deemed secessionist. New arms of China's state security apparatus have been set up, including a National Security Office in a leafy neighbourhood on Hong Kong island. In a few flare-ups of opposition to the law, protesters have been arrested for once legal banners and for shouting slogans now labelled subversive." (Exclusive: Global banks scrutinize their Hong Kong clients for pro-democracy ties)

China's Xi Sets His Sights on Taiwan After Subduing Hong Kong - "Now fears are growing that Xi wants to cement his place alongside Mao and Deng by conquering Taiwan, a prize that's eluded Communist Party leaders for decades."

The World's Highest and Fastest Cell Service Could Have Geopolitical Implications - "Now, data speeds in the 'death zone' on Everest, where the altitude is too high and the air is too thin to support life, are faster than in most American neighborhoods... But this comes at a cost. The internet in Tibet, like the rest of China, is censored by the government, and it’s rife with alternative facts designed to hide the history of Tibet's sovereignty before 1950."

India, Jio, and the Four Internets - "One of the more pernicious mistruths surrounding the debate about TikTok is that this will potentially lead to the splintering of the Internet; this completely erases the history of China's Great Firewall, started 23 years ago, which effectively cut China off from most Western services. That the U.S. may finally respond in kind is a reflection of reality, not the creation of a new one. What is new is the increased splintering in the non-China Internet: the U.S. model is still the default for most of the world, but the European Union and India are increasingly pursuing their own paths." (Chinese banks urged to switch away from SWIFT as U.S. sanctions loom)

Whose century? - "China under the control of the CCP is involved in a gigantic and novel social and political experiment enrolling one-sixth of humanity, a historic project that dwarfs that of democratic capitalism in the North Atlantic." (via)
One has to wonder whether the advocates of a new Cold War have taken the measure of the challenge posed by 21st-century China. For Americans, part of the appeal of allusions to Cold War 2.0 is that they think they know how the first one ended. Yet our certainty on that point is precisely what the rise of China ought to put in question. The simple fact is that the US did not prevail in the Cold War in Asia. Korea was divided by a stalemate. Vietnam was a humiliating failure. It was to find a way out of that debacle that Nixon and Kissinger turned to Beijing and inaugurated a new era of Sino-American relations. America’s ability to tilt the balance against the Soviet Union was linked to its success in playing the Chinese off against the Soviets. The Tiananmen Square massacre was not an incidental blot on the liberal landscape of 1989; it was the Communist Party of China’s answer to the Berlin-centred ‘end of history’ narrative.

The mistake in thinking that we are in a ‘new Cold War’ is in thinking of it as new. In putting a full stop after 1989 we prematurely declare a Western victory. From Beijing’s point of view, there was no end of history, but a continuity – not unbroken, needless to say, and requiring constant reinterpretation, as any live political tradition does, but a continuity nevertheless. Although American hawks have only a crude understanding of China’s ideology, on this particular matter they have grasped the right end of the stick. We have to take seriously the CCP’s sense of mission...

China’s regime is serious about maintaining and expanding its power and conceives of itself as having a world historic mission to rival anything in the history of the West – the question is how rapidly we can move to détente, meaning long-term co-existence with a regime radically different from our own, a long-term attitude of ‘live and let live,’ shorn of assumptions about eventual convergence and the inevitable historical triumph of the West’s economic, social and political system. It would be a long-term co-existence, in which, over time, the US may well find that it has become the junior partner or, at best, the leader of a coalition of smaller powers balancing the massive weight of China.
Special Report: Rite Aid deployed facial recognition systems in hundreds of U.S. stores - "In the hearts of New York and metro Los Angeles, Rite Aid deployed the technology in largely lower-income, non-white neighborhoods, according to a Reuters analysis. And for more than a year, the retailer used state-of-the-art facial recognition technology from a company with links to China and its authoritarian government." (How Reuters analyzed Rite Aid's use of facial recognition technology)

-Congress Needs to Act On Facial Recognition
-Big Tech companies want to act like governments
-Rapidly developing technologies challenge politics

Data Privacy Before and After a Pandemic - "Marietje Schaake, former EU Parliament Member and international policy director of Stanford's Cyber Policy Center, explains how Singapore and China used surveillance to track COVID-19 and what it could mean for the US."

also btw... Why are all these science-fiction shows so awful? - "I get it: We are all scared of phones, and bots, and the Algorithm. Yet by demonizing technology, these projects oddly exonerate the people behind that technology."[2]
posted by kliuless (33 comments total) 184 users marked this as a favorite
 
Mega love <3 kliuless!
posted by infini at 3:32 AM on August 2, 2020


Hard to "favourite" this; you have done an excellent job of curating and collating the evidence of the onrushing dystopia. Good job, I guess, but I had to stop reading because it's just too damn depressing.
posted by Meatbomb at 5:07 AM on August 2, 2020 [16 favorites]


And I thought credit scores were manipulative evil. This is capitalism's version of China's social credit scoring.
posted by Thorzdad at 5:14 AM on August 2, 2020 [6 favorites]


I was talking with a friend who is hardcore buying into the “China is the bad guy” thing, and I mean, yeah, they kind of are, in terms of the data mining and social credit scoring. It’s absolutely horrific. But here he is, going on about how TikTok is data mining unsuspecting users and acting like Amazon, Facebook/Instagram, and supermarket rewards cards haven’t been doing the same damn thing.

It feels like the only difference at this point is that in China, the state is running the surveillance, but in the States, it’s corporations utterly untethered from any regulatory oversight. Shitty and evil either way, but I’m not seeing “ban Facebook” get anywhere near the press that “TikTok is evil!” does.
posted by Ghidorah at 5:15 AM on August 2, 2020 [11 favorites]


So far capitalist tracking executes its genocide as a side effect of scoring and redlining. While unlikely, if it became more (obviously) profitable (likely due to regulating the externality costs better) to make scoring and using scores more fair or accurate (or just illegal as fraud, because they're very close), those markets might eventually do less direct harm by chasing other money elsewhere.

The Uighur genocide is conscious, overt, and has no regulatory authority that can affect it because it is the state's will.

They're both very bad. I'm not sure anybody needs to choose only one to fight.
posted by abulafa at 5:41 AM on August 2, 2020 [21 favorites]


Ghidorah - the difference between a police surveillance state and ad targeting, rewards programs, or lending decisions is a really big one.

One thing I found interesting was the extent of religious persecution underpinning the Uighur genocide. Not because of the actual content of the religion, but just because religion can serve as a way to motivate and organize groups together. I think of the importance of Black churches in driving the civil rights movement and I hope that the US doesn’t lose those sources of non-corporate, non-state power.

I know we don’t live up to our ideals, but it is great to live in a country founded on the principles of freedom of assembly, religion, and speech.
posted by The Ted at 5:52 AM on August 2, 2020 [10 favorites]


This is an excellent post. As one of the folks looking for a job in Pandemic USA, it’s also a pretty terrifying one.
posted by FallibleHuman at 6:22 AM on August 2, 2020 [2 favorites]


I could have sworn there was a Black Mirror episode that dealt with this.
posted by ricochet biscuit at 7:10 AM on August 2, 2020 [6 favorites]


On rewatching "Brazil", the oppressive bureaucracy seems fairly quaint and humanizing in comparison.
posted by RobotVoodooPower at 7:15 AM on August 2, 2020 [21 favorites]


the difference between a police surveillance state and ad targeting, rewards programs, or lending decisions is a really big one

Except, as the links point out, the companies making these systems and applying them in the States are applying police-state-level surveillance that can make one essentially unable to participate in society. Denying housing, access to banking or loans, tracking movement and interactions, and willingly turning said data over to police seeking to prosecute.

The difference is that China has a head start, and is actively pursuing genocidal goals, and I do not in any way intend to diminish that. To pretend that tech firms like, say, Palantir wouldn’t do their part to enable such agendas in America (as long as there was a profit to be had), or aren’t trying to set themselves up as vendors for the systems that would be required, is frankly naive. Past that, there seems to be next to no oversight or even an attempt to create a legal and ethical framework for the incredible power inherent in this level of data mining. As quoted above, we’re talking about data being used to determine whether or not a landlord should rent to someone. Whether a company should hire someone. Without any oversight, what chance do things like this have of *not* being as bad on race as the debacle of facial recognition trained primarily on white faces and failing terribly on non-white faces? What chance does any of this have of not inherently benefiting people who match the profile of those who created it, to the detriment of others?

None of this is a positive advancement for humanity, but to follow the current trend and act as if China is the only party doing anything evil with shit ignores how it’s being used at home.
posted by Ghidorah at 7:16 AM on August 2, 2020 [28 favorites]


soundtrack for the thread
posted by flabdablet at 8:29 AM on August 2, 2020 [1 favorite]


I've been spending the last few years learning to code. The more that I'm learning, the more I recognize the immense power of the digital realm, and the general ignorance of this fact. Or to put it another way, the more I learn about it, the more I'm afraid of it.

The tech isn't new, but the massive migration of global populations into the digital space is overwhelming in pace and volume. It's no coincidence that the highest-valued assets in the digital space are really just tools of data collection.
posted by iamck at 8:41 AM on August 2, 2020 [2 favorites]


Except, as the links point out, the companies making these systems and applying them in the States are applying police-state-level surveillance that can make one essentially unable to participate in society.

And they're also selling the data to actual police, who are totally able to make fair and effective use of it. /s
posted by klanawa at 8:42 AM on August 2, 2020 [8 favorites]


None of this is a positive advancement for humanity, but to follow the current trend and act as if China is the only party doing anything evil with shit ignores how it’s being used at home.
This is not a fair way to represent someone saying that there are different degrees of badness while not excusing either. If you want to effectively oppose either of these trends, calling your credibility into question like that is especially dangerous because everyone you’re opposing has huge PR operations which love to seize on unforced errors like that to discredit an entire movement (think of the current manufactured outrage around “cancel culture”).
posted by adamsc at 8:47 AM on August 2, 2020 [4 favorites]


None of this is a positive advancement for humanity, but to follow the current trend and act as if China is the only party doing anything evil with shit ignores how it’s being used at home.

It also ignores the fact that American tech companies (including Intel and Google) are making billions from selling their surveillance and related technologies to China.
posted by They sucked his brains out! at 8:55 AM on August 2, 2020 [7 favorites]


the more I learn about it, the more I'm afraid of it

Most of my working life was spent in computing, one way or another, from embedded systems to system administration. The more time I spent working in IT, the more selective and judicious I became about adopting it and the more resentment I felt against people who market it.

I was horrified when the camera phones first appeared and horrified by the advent of Facebook (they want me to do what with my contact list? Give it to them? Fuck off, not happening).

Going through the process of raising children in a world where existing as both victim and perpetrator of ubiquitous, unaccountable surveillance has become their default expectation has dulled that repeated horror into something very like despair and honed my resentment against marketing of all stripes into something very close to absolute contempt.

And the paperless office is still not a thing. The cake is a lie.
posted by flabdablet at 8:59 AM on August 2, 2020 [31 favorites]


At the very core of the worst ills of capitalism as currently practiced around the world is information asymmetry.

Ancap corporatists in libertarian drag speak of the wonders of the unfettered market, that bad actors will surely never commit bad acts because the market will punish them. But the simple fact of the matter is that an ever-increasing fraction of profit is gained not from rent seeking (though that always has and likely always will be a problem), but from information asymmetry. The market does not correct for problems the market doesn't know about, so hiding relevant information, and trafficking in information only shared with pre-screened "partners" results in unbelievable profit for those who gather that information.

Sqirl knew they were mixing mold back into their preserves, but you didn't. The people who made imported baby formula with melamine knew it was there, but you didn't. Marketers are happy to gather all sorts of information -- and make all sorts of tenuous or specious connections with that information -- and sell it to people who will decide whether to hire you, or whether to rent an apartment to you, or put you on hold interminably. Or withhold critical, life-saving care because you fit an unprofitable profile.

Unless and until the abuses of secretly collated and speciously connected data are utterly crushed by regulation that has real teeth, this will all only get worse. Unaccountable decisions made on information that will not be shared with you will become increasingly commonplace. And the fetishized, much-vaunted "market" won't do jack shit about it.
posted by tclark at 10:36 AM on August 2, 2020 [26 favorites]


Unless and until the abuses of secretly collated and speciously connected data are utterly crushed by regulation that has real teeth, this will all only get worse. Unaccountable decisions made on information that will not be shared with you will become increasingly commonplace. And the fetishized, much-vaunted "market" won't do jack shit about it.

This isn't really true, though.

The people who sold adulterated baby formula were, in fact, crushed by regulation with real teeth. Some were executed. Some got life prison sentences. Does anyone think that such criminality is now unthinkable as a result?

Teeth don't make for the good society. This is the mistake that people make all the time and the only guarantee it grants is more power to governing bodies. Not good governance.
posted by 2N2222 at 10:54 AM on August 2, 2020 [1 favorite]


I could have sworn there was a Black Mirror episode that dealt with this.

Yeah, it's not at all clear to me if dystopian science fiction serves more to warn us and help us think critically about these technologies or if it just normalizes them, desensitizes us, makes these changes feel inevitable.
posted by straight at 11:50 AM on August 2, 2020 [15 favorites]


I am happy to entertain any reportage about the Xinjiang region that does not rely on interpretation and commentary by Adrian Zenz. Zenz is an evangelical Christian employed by the Victims of Communism Memorial Foundation who has never been to Xinjiang and who has stated that he feels compelled by God to perform his political work. Zenz is also the author of Worthy to Escape: Why All Believers Will Not Be Raptured Before the Tribulation, in which he talks about the surety of damnation brought about by things like homosexuality and gender equality.

Surely, it should be easy to find reportage that doesn't rely on Zenz's unique touch, right?
posted by mobunited at 11:50 AM on August 2, 2020 [16 favorites]


And I thought credit scores were manipulative evil. This is capitalism's version of China's social credit scoring.

Yeah, "and people talk shit about China!" In both cases there is a score that is calculated from similar inputs, by people with the same education, getting paid enough to survive, with similar government access and usage rules, simply with differing levels of transparency in different parts of the processes and effects. Same shit, different narratives.
posted by rhizome at 12:35 PM on August 2, 2020 [2 favorites]


Unless and until the abuses of secretly collated and speciously connected data are utterly crushed by regulation that has real teeth, this will all only get worse. Unaccountable decisions made on information that will not be shared with you will become increasingly commonplace. And the fetishized, much-vaunted "market" won't do jack shit about it.

The problem is that data is not real, or is only real in certain contexts for certain kinds of data with regard to certain laws. If you exfiltrate data from a company, let's say Target, and you are caught, you will go to jail. If Target leaves a computer open and makes it easy for people to exfiltrate data about you, they will not be punished. Your data isn't real, theirs is, and the line between the two is commerce.

Do you use your personal data to make money? Then you might be able to go after people.

Point being, the law hasn't concerned itself with protecting your data, and it's in the interests of industry to preserve this state of affairs. What the industry (plus lobbyists, which means plus legislators) wants to avoid in any legislation is the creation of private rights of action, the ability for you to personally sue Target for letting your data out. The cards are stacked against individuals and the industries have been on this ball for 15-20-plus years, much like how gun companies have been ahead of the game in legislation and judicial precedents absolving them of liability in gun deaths (so far).

California has made baby steps in this regard in the form of the CCPA. A patchwork across the states will create and perpetuate confusion, so this may turn out to be an acceptable compromise over federal legislation that covers everybody.
posted by rhizome at 1:03 PM on August 2, 2020 [10 favorites]


I have a bunch of stuff to say about Sift. If it's too much of a derail from the meat of the post, Xinjiang and the associated horrors, mods please feel free to delete.

Story: in my last professional incarnation, I worked as a fraud specialist for a start-up. Without saying exactly what we did, I can say it was in the ecommerce and SaaS space. We had a major fraud problem, on both the customer and client sides. My position was actually created because it was getting out of control. Our clients, many of which were small businesses, were getting hosed by credit card scammers and nonpayers (kind of hard to explain what that means without naming our industry, but you can get an idea if you think about someone who promises to pay for something, such as an art commission, without needing to put down a deposit, and then refuses to pay when it's ready. They aren't traditional scammers, but they are high-risk customers in terms of potential wasted time, labor and opportunity cost.)

Our company had tried to roll our own fraud prevention tools but, bless our developers for trying, they suuuuucked. It was too hard a problem and we only had very blunt tools and not a lot of resources to put to improving them. Eventually I made a case for third-party solutions. We went with Sift.

I wasn't able to read the WSJ articles (paywall) but here's what I can tell you: Sift was a game-changer. After a few months, our fraud rates dwindled. Not disappeared - but went way down. Why did it take so long, though? For one thing, we had to teach the algorithm what "good" and "bad" accounts looked like. They got an anonymized data set to start training with, but that left out a ton of information. We had to steadily and accurately confirm when Sift got things right by delivering a high risk score, or correct it when it wrongly gave a good customer a high risk score.

Even after the algorithm had "learned" pretty well, trying to rely on scores alone would have been absurd for us. Sift can surface a breathtaking amount of information about users - think the kinds of detail you can see on a CDP platform like Evergage - BUT in a lot of cases, it can reveal diddly squat, or such scant or conflicting information that it wasn't remotely actionable. Sift "scores" ran from 1 to 100, with 100 being the most risky. What to do with a user who scores 40, or 80? Instead, we did manual reviews with high scores being only one factor. We individually reviewed users with all kinds of scores: keeping an eye on groups of connected accounts (sometimes legit, such as people living in the same household, sometimes super scammy); brand new users who bought a crapton of stuff; users who shared an unusual characteristic with one of our worst all-time fraudsters, etc.

Sift encourages its partners to use automation. But after several years, we only ever automated one thing, and that was to put a temporary hold on users who scored 95+. To be fair, these users were almost always - but not always! - fraudsters. We could have refined the automation based on rules (such as: does the user have a high score plus another high-risk characteristic?), but never did - it would have taken, in my opinion, a staggering set of rules to make the kind of nuanced decisions that a human could. That's because the scores are only one piece of the puzzle. In the wrong hands, you could make a lot of shitty and unfair decisions through automation.
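
To give a concrete (and entirely hypothetical) picture of what that kind of refinement would look like, here is a minimal Python sketch. It is not Sift's API and not anyone's production code; the field names, and every threshold except the 95+ hold, are invented for illustration.

```python
# Hypothetical sketch of a rule-based auto-hold. Not Sift's API, not real production code.
# Field names ("score", "risk_signals") and the 80-point review threshold are invented.

AUTO_HOLD_THRESHOLD = 95          # the one rule described above that was actually automated
REVIEW_THRESHOLD = 80             # illustrative only: a refinement that was never shipped
HIGH_RISK_SIGNALS = {"shared_device_fingerprint", "linked_to_known_fraudster"}

def decide(user: dict) -> str:
    """Return 'hold', 'review', or 'allow' for a scored user."""
    score = user["score"]                          # 1-100, higher = riskier
    signals = set(user.get("risk_signals", []))

    if score >= AUTO_HOLD_THRESHOLD:
        return "hold"                              # temporary hold; a human follows up
    if score >= REVIEW_THRESHOLD and signals & HIGH_RISK_SIGNALS:
        return "review"                            # high score plus another risk factor
    return "allow"

print(decide({"score": 97}))                                                  # hold
print(decide({"score": 85, "risk_signals": ["shared_device_fingerprint"]}))   # review
print(decide({"score": 85, "risk_signals": []}))                              # allow
```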

Manual reviews also involve looking at user data that has run through Sift's interface, which is where a lot of juicy information lives: accounts that share a device fingerprint, for example. Spoiler though: it's also extremely easy to make shitty and unfair decisions through manual reviews. I made a bunch! Some were made out of unintentional racism/cultural ignorance, for example, not realizing that many people in China use anonymous, numbers-only email accounts, and thinking that looked fishy. It didn't help that Sift had linked them all together through a "risk signal." ("Risk signals" highlight things two or more accounts have in common. If you make two accounts with practically the same email, like scammer1@fakemail.com and scammer2@fakemail.com, Sift will disregard the numbers and show them as connected. Normally this is very useful...not so much with the all-number email accounts.)
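
For anyone curious how that digit-stripping linkage can misfire, here is a toy sketch (purely illustrative; this is not Sift's actual implementation) of an email "risk signal" that strips numbers before comparing addresses:

```python
# Toy illustration of the email risk signal described above; not Sift's real code.
# Stripping digits correctly links scammer1@/scammer2@, but it also lumps together
# unrelated numbers-only addresses, which all normalize to the bare domain.

import re
from collections import defaultdict

def normalize(email):
    local, _, domain = email.partition("@")
    return re.sub(r"\d+", "", local) + "@" + domain

def link_by_email(emails):
    groups = defaultdict(list)
    for e in emails:
        groups[normalize(e)].append(e)
    # Only groups with two or more members surface as a "risk signal".
    return {key: members for key, members in groups.items() if len(members) > 1}

print(link_by_email([
    "scammer1@fakemail.com", "scammer2@fakemail.com",   # correctly linked
    "58293471@qq.com", "77104628@qq.com",               # falsely linked: both become "@qq.com"
    "alice@fakemail.com",                               # not linked to anything
]))
```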

We did improve in time. I erred on the side of watch-and-wait. I taught my team that scammers can come from literally anywhere - we had a couple legit great customers in Nigeria, and our all-time worst scammer was this infamous dude in Germany. I did my best and my team did their best, but at the end of the day there were lots of "best guesses," some of which were surely wrong.

I'm still glad we used Sift - I think it was the best tool for that job in that context, and overall I believe we did more good than harm. The space we were in was entirely for non-critical/luxury-type purchases. If Tiffany's wants to use it to figure out whether someone buying that diamond ring is for real or using your great aunt's stolen credit card? Fine, go for it, but please include training for your team in cultural competency so you don't make foolish or racist mistakes. That said, thinking about Sift, or similar tools, being used to determine access to housing, employment, insurance, transportation, or anything related to human rights/the ability to participate in a free society makes me shudder. Not only is it so, so easy to get that shit wrong, the concept itself is unconscionable.

On facial recognition software: if you are in Massachusetts please consider calling your legislator to support the ban on it!
posted by prewar lemonade at 1:53 PM on August 2, 2020 [28 favorites]


One of the things that some loan companies in China use for assessing a loan applicant's reliability is the level of battery charge on their phone when they make a loan application via their mobile app. Customers with low battery levels are considered less likely to pay back their loan.
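
For what it's worth, a signal like that would just be one more input to a scoring model. A deliberately toy sketch follows; the weights and feature names are invented, and no claim is made about any real lender's model:

```python
# Toy logistic "repayment" score with battery level as one invented feature.
# Weights and features are made up for illustration only.
import math

WEIGHTS = {
    "bias": -1.0,
    "battery_pct": 0.8,            # a fuller battery nudges the score up, per the anecdote
    "on_time_payment_rate": 2.0,
    "debt_to_income": -1.5,
}

def repayment_probability(features):
    z = WEIGHTS["bias"] + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

low  = repayment_probability({"battery_pct": 0.15, "on_time_payment_rate": 0.9, "debt_to_income": 0.4})
high = repayment_probability({"battery_pct": 0.95, "on_time_payment_rate": 0.9, "debt_to_income": 0.4})
print(round(low, 3), round(high, 3))   # the same applicant scores higher with a charged phone
```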
posted by L.P. Hatecraft at 6:03 PM on August 2, 2020 [4 favorites]


Sift encourages its partners to use automation.

Software marketing.

Hoik ptui.
posted by flabdablet at 7:42 PM on August 2, 2020


Customers with low battery levels are considered less likely to pay back their loan.

This is the kind of thing I imagine an American working in the loan industry hears and their eyes light up. Absent any realistic oversight, we’ll see the worst abuses of data tracking we hear about in China being used by private industry here in order to scrape more profit, and already in this thread, people are pointing out what's already in place.
posted by Ghidorah at 5:27 PM on August 3, 2020 [2 favorites]


At the very core of the worst ills of capitalism as currently practiced around the world is information asymmetry.

pretty sure this applies to all systems of power (aka hierarchies)
posted by philip-random at 10:31 PM on August 3, 2020 [3 favorites]


And worse, it is really anti-capitalist, total hypocrisy! All of that economics shit is based on frictionless markets and fully informed actors. Bullshit!
posted by Meatbomb at 11:26 PM on August 3, 2020


We need data privacy laws to shut this stuff down.
posted by interogative mood at 3:01 AM on August 4, 2020 [2 favorites]


We need data privacy laws to shut this stuff down.

There's a bill in the Senate right now to deal with some of these issues, but I have some bad news. This could change with amendments and whatnot, but it's going to require a lot of activism. No privacy law will ever have teeth without a private right of action.
posted by rhizome at 10:46 AM on August 4, 2020 [2 favorites]


I'm not on board with this unless they publish the leaderboards so I can lord myself over my inferiors.
posted by pwnguin at 12:35 PM on August 4, 2020 [1 favorite]


I think that already exists in the form of the interest rate on your mortgage or auto loan.
posted by rhizome at 12:36 PM on August 4, 2020


fwiw...
#techAsia: "Sheena Greitens at the University of Texas, Austin dismantles a recent Atlantic article on Chinese AI and surveillance in this expert Twitter thread."

also btw...
Democracy may demand a splintered internet - "Citizens must be able to choose how their data are governed."

China's new digital currency takes aim at Alibaba and Tencent - "The experimental digital currency is on trial in a number of Chinese cities and the PBoC intends to use it to simplify digital payments and interbank settlements."

Fed's Brainard lays out central bank's instant payment framework - "Brainard said the launch of the payments system is still expected in 2023 or 2024, but the Fed is prioritizing developing key aspects of the program first to get it off the ground as soon as possible."
The Future of Retail Payments in the United States[1]

The Federal Reserve, acting as Fiscal Agent for the U.S. Department of the Treasury, processed most of the CARES Act payments to households using direct deposit, prepaid debit cards, and checks, which can take several days between the time the funds are sent and the time recipients get access to their funds. By contrast, the ability to disburse funds via instant payments could have helped reduce the strain for those who needed the funds quickly in order to meet financial obligations. The same is true for other payments intended to provide immediate assistance, for example, in the wake of natural disasters.

In good times as well as bad, instant payments will enable millions of American households and small businesses to get instant access to funds, rather than waiting days for checks to clear. An instant payment infrastructure ensures the funds are available immediately, which could be especially important for households on fixed incomes or living paycheck to paycheck, when waiting days for the funds to be available to pay a bill can mean overdraft fees or late fees that can compound, or reliance on costly sources of credit. For small businesses, the ability to receive customer payments instantly could help them manage cash flows when working capital is tied up in materials or inventory. And for the 1 in 10 Americans who regularly work in the gig economy, getting immediate access to the payments for their work could help address cash-flow constraints when money is tight.
Japan's large banks to look at building settlement system for small payments - "The government called last month for a review of bank transfer fees, unchanged for more than four decades, in a broader push to improve digital payment systems. While everyday transactions in Japan are usually completed in notes and coins, authorities have been keen to promote cashless transactions to raise productivity and curb transmission of the coronavirus."
posted by kliuless at 10:26 PM on August 6, 2020

