Are you a robot?
October 22, 2021 9:33 AM

Quartz Weekly Obsession: CAPTCHA. A newsletter full of fascinating facts about CAPTCHA. "The demise of the CAPTCHA is mainly the result of rapid improvements in the field of AI. CAPTCHA’s research mission has succeeded so thoroughly that machines are now as good or better than humans at every task we’ve turned into a CAPTCHA test. We’re running out of challenges that humans are universally good at, but machines can’t handle." posted by carolr (20 comments total) 11 users marked this as a favorite
 
I long for a future in which intentional spam, noise, and misinformation are treated like pollution, with fines and forcible cleanup. Make the company take it down. If they refuse, make the hosting company take it down, with daily fines for non-compliance. If the hosting company is judgment-proof, make carriers block the hosting company.

We don't let random people erect giant billboards in the middle of main street, stuff our mailboxes with ads without even paying the post office for it, or replace pages in the yellow pages to redirect people to their shady businesses, yet our whole internet infrastructure offloads these externalities onto end users. Ad blockers, CAPTCHAs, and spam filters are symptoms of a fundamentally broken system.
posted by jedicus at 9:48 AM on October 22, 2021 [16 favorites]


"Passwords" as a whole also fall into this category.

"How should people authenticate themselves? By memorizing a string of characters which they then choose. It's important that that string be high-entropy, though, for obvious reasons. It's not like humans are bad at generating & memorizing high-entropy strings, right?"

"Also, no reuse. Everybody needs to have hundreds of high-entropy strings memorized, and you definitely don't want them to be predictable from each other."

There are historical reasons, of course, but it'd be hard to *pick* a more "easy for computers, difficult for humans" authentication concept.
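
To make the asymmetry concrete, here's a throwaway sketch (just Python's standard secrets module) of the machine's side of the bargain: minting a fresh high-entropy secret per site takes a few lines and a few microseconds.

```python
import math
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation  # ~94 symbols

def generate_password(length: int = 20) -> str:
    """Mint a random secret: trivial for a machine, hopeless to memorize by the hundreds."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

password = generate_password()
entropy_bits = len(password) * math.log2(len(ALPHABET))  # roughly 131 bits for 20 characters
print(password, f"~{entropy_bits:.0f} bits of entropy")
```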
posted by CrystalDave at 9:48 AM on October 22, 2021 [12 favorites]


jedicus: "I long for a future in which intentional spam, noise, and misinformation are treated like pollution, with fines and forcible cleanup."

ME TOO. I've often thought it shouldn't be THAT hard to trace spam, given enough will and resources.

If our officials put more effort into addressing things like this that affect everyone every day, the world would be a much better place.
posted by kristi at 9:59 AM on October 22, 2021 [2 favorites]


reality needs a metatalk.
posted by NoThisIsPatrick at 10:16 AM on October 22, 2021 [4 favorites]


"I long for a future in which intentional spam, noise, and misinformation are treated like pollution, with fines and forcible cleanup."

There's a bit in Daniel Suarez's first two novels where a shadowy force starts taking over the world. It doesn't win a lot of fans... until it publicly assassinates spammers. Then humanity applauds!
posted by doctornemo at 10:52 AM on October 22, 2021 [3 favorites]


"easy for computers, difficult for humans"

I think this will always be the default, unless we prioritize UX over computational ease. Most developers empathize better with efficiency for the computer (in the sense of understanding concepts like "big O" notation for algorithms) than with the users, who will always be a "problem".

I knew someone who had a small startup that provided just one service: bulk captcha-solving. Of course it was all off-shored, with the only AI involved being the bit that worked out how many humans to send each captcha to and how much to pay them to keep it worthwhile. It was fast enough to be offered as an API, and its biggest users were mail order companies claiming service-level refunds from national postal carriers. Most carriers had a delivery guarantee that allowed a rebate or refund if a certain standard wasn't met, but most of them hid the claim behind a web form with a captcha. If you were mailing thousands of packages a day, you'd need a whole department dedicated just to that form entry.

(The amounts are not insignificant: one client recouped a third of their total shipping costs through service level rebates.)

I hope that captcha-farming goes away, because it's an awful thing to enslave people to computers.
posted by scruss at 11:22 AM on October 22, 2021 [3 favorites]


Slightly disappointed there wasn't a CAPTCHA before accessing the article.
posted by Webbster at 11:40 AM on October 22, 2021 [2 favorites]


*but I am a robot*
posted by chavenet at 12:51 PM on October 22, 2021 [4 favorites]


There was a time when I was often accused of being a robot. Between a mostly flat affect and being seen to fix PCs by talking to them or threatening them, it wasn't an entirely ludicrous supposition. I'm not sure that being able to solve a CAPTCHA convinced anyone otherwise.
posted by wierdo at 2:55 PM on October 22, 2021


it'd be hard to *pick* a more "easy for computers, difficult for humans" authentication concept.

The problem is that computers can't securely store passwords, which opens up other risks. There will always be a secret-zero problem when it comes to storing them safely. We can mitigate this with password managers that keep secret zero in the user's brain, but it's very difficult to do that securely as a standalone implementation.
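
Roughly how that mitigation works, as a minimal standard-library sketch (a real manager would reach for something like Argon2 plus authenticated encryption, but the shape is the same):

```python
import hashlib
import os

def derive_vault_key(master_password: str, salt: bytes) -> bytes:
    """Stretch the one secret the user memorizes into the key that protects everything else."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        master_password.encode("utf-8"),
        salt,
        600_000,  # deliberately slow iteration count
    )

salt = os.urandom(16)  # stored next to the vault; it isn't secret
vault_key = derive_vault_key("correct horse battery staple", salt)
# vault_key encrypts the stored passwords; the master password (secret zero)
# lives only in the user's head and never touches disk.
```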

"easy for computers, difficult for humans"

I think this will always be the default, unless we prioritize UX over computational ease. Most developers empathize better with efficiency for the computer (in the sense of understanding concepts like "big O" notation for algorithms) than with the users, who will always be a "problem".


If you can come up with a two factor solution that users will embrace, you can make a lot of money. Twitter has been trying to get users to turn it on for years and as of last Dec. had a 2.3% adoption rate. They even offer three forms of 2FA to try to make it as frictionless as possible.

Best practices for passwords, at least for the past 15 or so years, have been to design explicitly against computational ease. The hash algorithms used for password storage and comparison are intentionally inefficient, so that while they perform acceptably in normal use, they perform poorly at scale if you're trying to crack them.
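
Something like this, as a sketch using Python's standard-library scrypt binding (the cost parameters here are illustrative, not a recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Store salt plus digest; scrypt is memory-hard on purpose, so every guess is expensive."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(
        password.encode("utf-8"),
        salt=salt,
        n=2**14, r=8, p=1,              # roughly 16 MB of RAM per hash, by design
        maxmem=64 * 1024 * 1024,
    )
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(
        password.encode("utf-8"),
        salt=salt, n=2**14, r=8, p=1, maxmem=64 * 1024 * 1024,
    )
    return hmac.compare_digest(candidate, digest)

# One login check is imperceptible to the user; billions of checks against a
# stolen database are what the cost parameters are there to punish.
salt, digest = hash_password("hunter2")
assert verify_password("hunter2", salt, digest)
```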

And I'd predict that relatively few developers know or consider big O notation at this point. We use computationally inefficient languages like Python, and the basic built-in data structures that are easy but often inefficient, because developer time and ease of reading and understanding code matter more, most of the time, than CPU cycles.
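
(Though when the gap does matter, it's easy to see with a throwaway timing sketch, standard library only:)

```python
import timeit

items_list = list(range(100_000))
items_set = set(items_list)

# Membership test: O(n) scan of a list vs. O(1) average hash lookup in a set.
list_time = timeit.timeit(lambda: 99_999 in items_list, number=1_000)
set_time = timeit.timeit(lambda: 99_999 in items_set, number=1_000)

print(f"list: {list_time:.3f}s  set: {set_time:.6f}s")
# Both are built in and easy to use; the difference only bites once the data
# and the number of lookups get large, which is when the CPU cycles start to
# cost more than the developer time.
```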
posted by Candleman at 3:25 PM on October 22, 2021 [1 favorite]


I think this will always be the default, unless we prioritize UX over computational ease. Most developers empathize better with efficiency for the computer (in the sense of understanding concepts like "big O" notation for algorithms) than with the users, who will always be a "problem".

I have noticed in creating stuff for users that one of the things they hate the most is waiting. If you increase the response time for a button from 0.1 seconds to 10 seconds, a lot of people will no longer like you. Some of them will say very angry things about you. In that way, big O and the rest of it are a part (though not the only part) of being friendly to users.
posted by clawsoon at 4:21 PM on October 22, 2021


This was illuminating. Thanks for posting.
posted by latkes at 4:24 PM on October 22, 2021


I have noticed lately the ‘I am not a robot’ checkboxes and wondered how that could not be the easiest thing a robot could fake, but the article might have answered that for me.
posted by MtDewd at 4:43 PM on October 22, 2021


Given that computers are now far superior to humans at solving CAPTCHAs, surely I can download a browser plug-in which tasks mine with the tedium of filling in the damn things?
posted by rongorongo at 5:10 AM on October 23, 2021 [1 favorite]


2020: > Twitter has been trying to get users to turn [2-factor authentication] on for years and as of last Dec. had a 2.3% adoption rate.

2019: Twitter admits it used two-factor phone numbers and emails for serving targeted ads
posted by kurumi at 11:35 AM on October 23, 2021 [7 favorites]


A-HAHHHAHHAH HAHHAHAHA
posted by lalochezia at 3:51 PM on October 23, 2021


it'd be hard to *pick* a more "easy for computers, difficult for humans" authentication concept ... and yet I suspect at the time, they were trying to make it hard for people to break into the computer, just never imagining that you might use another computer to break into the computer.

In fairness, where were you going to get another computer from?

At least, that's the only way it makes sense to me historically.
posted by pulposus at 12:17 AM on October 24, 2021


I made a CAPTCHA you can’t possibly pass just to annoy spammers. It’s the only deliberately dark design I’ve ever done, and I’d do it again. The button positions and labels change randomly. The instructions are impossible. It errors on purpose. I know spammers don’t work like this, but I love the idea of someone getting more and more frustrated trying to get through it.
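
Something in that spirit, as a toy sketch (not the real thing, just standard-library Python spitting out one round of an unwinnable form):

```python
import random

LABELS = ["Submit", "I am not a robot", "Continue", "Verify", "Cancel"]
INSTRUCTIONS = [
    "Click the button that is not a button.",
    "Select the third option from the left, counting from the right.",
    "To proceed, do not click anything.",
]

def render_unwinnable_captcha() -> str:
    """One round of the form: shuffled positions, shuffled labels, impossible instructions."""
    labels = random.sample(LABELS, k=3)  # different labels and order on every load
    buttons = "\n".join(
        f'  <button name="b{i}" style="margin-left:{random.randint(0, 300)}px">{label}</button>'
        for i, label in enumerate(labels)
    )
    return (
        f"<p>{random.choice(INSTRUCTIONS)}</p>\n"
        f'<form action="/verify">\n{buttons}\n</form>\n'
        "<!-- /verify rejects every submission, on purpose -->"
    )

print(render_unwinnable_captcha())
```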
posted by bigbigdog at 12:00 AM on October 25, 2021


I guess future CAPTCHAs will have to ask people to drive in the snow.

I’ve been assured by numerous people here and on arstechnica that that’s something machines will never be able to do.
posted by lastobelus at 7:16 PM on October 25, 2021 [1 favorite]




This thread has been archived and is closed to new comments