Dataism: Getting out of the 'job loop' and into the 'knowledge loop'
September 7, 2016 11:42 PM   Subscribe

From deities to data - "For thousands of years humans believed that authority came from the gods. Then, during the modern era, humanism gradually shifted authority from deities to people... Now, a fresh shift is taking place. Just as divine authority was legitimised by religious mythologies, and human authority was legitimised by humanist ideologies, so high-tech gurus and Silicon Valley prophets are creating a new universal narrative that legitimises the authority of algorithms and Big Data."

Privileging the right of information to circulate freely - "There's an emerging religion called Dataism, which venerates neither gods nor man - it worships data. From a Dataist perspective, we may interpret the entire human species as a single data-processing system, with individual humans serving as its chips. If so, we can also understand the whole of history as a process of improving the efficiency of this system... Like capitalism, Dataism too began as a neutral scientific theory, but is now mutating into a religion that claims to determine right and wrong... Just as capitalists believe that all good things depend on economic growth, so Dataists believe all good things - including economic growth - depend on the freedom of information."

Our unparalleled ability to control the world around us is turning us into something new - "We have achieved these triumphs by building ever more complex networks that treat human beings as units of information. Evolutionary science teaches us that, in one sense, we are nothing but data-processing machines: we too are algorithms. By manipulating the data we can exercise mastery over our fate."

Planet of the apps - "Many of the themes of his first book are reprised: the importance of the cognitive revolution and the power of collaboration in speeding the ascent of Man; the essential power of myths — such as religion and money — in sustaining our civilisations; and the inexcusable brutality with which our species treats other animals. But having run out of history to write about, Harari is forced to turn his face to the future... 'Forget economic growth, social reforms and political revolutions: in order to raise global happiness levels, we need to manipulate human biochemistry'... For the moment, the rise of populism, the rickety architecture of the European Union, the turmoil in the Middle East and the competing claims on the South China Sea will consume most politicians' attention. But at some time soon, our societies will collectively need to learn far more about these fast-developing technologies and think far more deeply about their potential use."

also btw...
  • Preparing for our Posthuman Future of Artificial Intelligence - "By exploring the recent books on the dilemmas of AI and Human Augmentation, how can we better prepare for (and understand) the posthuman future? By David Brin." (omni o)
  • The Man-Machine Myth - "Beliefs inspired by the cybernetic mythos have a quasi-theological character: They tend to be faith-based."
  • Unsettling thought of the day
  • Each technological age seems to have a "natural" system of government that's the most stable and common... Anyway, now we've entered a new technological age: the information age. What is the "natural" system of government for this age?

    An increasing number of countries now seem to be opting for a new sort of illiberal government - the style of Putin and the CCP. This new thing - call it Putinism - combines capitalism, a "deep state" of government surveillance, and social/cultural fragmentation.

    It's obviously way too early to tell, but there's an argument to be made that Putinism is the natural system of government now. New technology fragments the media, causing people to rally to sub-national identity groups instead of to the nation-state.

    The Putinist "deep state" commands the heights of power with universal surveillance, and allies with some rent-collecting corporations. Meanwhile, IF automation decreases labor's share of income and makes infantry obsolete, the worker/soldier class becomes less valuable.

    "People power" becomes weak because governments can suppress any rebellion with drones, surveillance, and other expensive weaponry. Workers can strike, but - huge hypothetical assumption alert! - they'll just be replaced, their bargaining power low due to automation.

    In sum: Powerful authoritarian governments, fragmented society, capitalism, "Hybrid warfare", and far less liberty.
  • The Totalitarian - "Putinist models seem to curtail personal freedom and self-expression. They chase away the innovation class. In the long run this makes them unable to keep up with more innovative, open societies. But innovative open societies are also fissiparous in the long run. They need a strong centralized, even authoritarian, core. To wit, the big democracies also have deep states, just ones that infringe on domestic public life less than the Putinists' do. Automation makes mass citizenry superfluous as soldiers, workers or taxpayers. The insiders' club is ever-shrinking. The steady state of the AI era is grim: one demigod and 10 billion corpses/brains-in-jars, depending on the humanism quotient of the one. The three pillars for this end state are strong AI, mind uploading/replication, and mature molecular nanotechnology."
  • Capitalism and Democracy: The Strain Is Showing - "Confidence in an enduring marriage between liberal democracy and global capitalism seems unwarranted."
  • So what might take its place? One possibility[:] ... a global plutocracy and so in effect the end of national democracies. As in the Roman empire, the forms of republics might endure but the reality would be gone.

    An opposite alternative would be the rise of illiberal democracies or outright plebiscitary dictatorships... [like] Russia and Turkey. Controlled national capitalism would then replace global capitalism. Something rather like that happened in the 1930s. It is not hard to identify western politicians who would love to go in exactly this direction.

    Meanwhile, those of us who wish to preserve both liberal democracy and global capitalism must confront serious questions. One is whether it makes sense to promote further international agreements that tightly constrain national regulatory discretion in the interests of existing corporations... Above all... economic policy must be orientated towards promoting the interests of the many not the few; in the first place would be the citizenry, to whom the politicians are accountable. If we fail to do this, the basis of our political order seems likely to founder. That would be good for no one. The marriage of liberal democracy with capitalism needs some nurturing. It must not be taken for granted.
  • G20 takes up global inequality challenge - "Even before the final communiqué is drafted for the annual G20 summit the leaders of the world's largest economies already seemed to agree on their most pressing priority: to find a way to sell the benefits of globalisation to an increasingly sceptical public. As they arrived in the Chinese city of Hangzhou over the weekend, many were on the defensive amid a welter of familiar complaints back home: frustratingly slow growth, rising social inequality and the scourge of corporate tax avoidance."
  • “Growth drivers from the previous round of technological progress are fading while a new technological and industrial revolution has yet to gain momentum,” Mr Xi said at the start of the G20, adding that the global economy was at a “critical juncture”.

    “Here at the G20 we will continue to pursue an agenda of inclusive and sustainable growth,” Mr Obama said, acknowledging that “the international order is under strain”.

    Mr Xi, whose country has arguably benefited more than any other from globalisation, struck a similarly cautious note in a weekend speech to business leaders. In China, he said, “we will make the pie bigger and make sure people get a fairer share of it”.

    He also recognised global inequity, noting that the global Gini coefficient — the standard measure of inequality — had raced past what he called its “alarm level” of 0.6 and now stood at 0.7. “We need to build a more inclusive world economy,” Mr Xi said.
  • G20 leaders urged to 'civilise capitalism' - "Chinese president Xi Jinping helped set the tone of this year's G20 meeting in a weekend address to business executives. 'Development is for the people, it should be pursued by the people and its outcomes should be shared by the people', Mr Xi said... Before the two-day meeting, the US government argued that a 'public bandwagon' was growing to ditch austerity in favour of fiscal policy support. 'Maybe the Germans are not absolutely cheering for it but there is a growing awareness that 'fiscal space' has to be used to a much greater extent', agreed Ángel Gurría, secretary-general of the Organisation for Economic Cooperation and Development."
  • Martin Wolf calls for basic income, land taxation & intellectual property reform: Enslave the robots and free the poor
  • The rise of intelligent machines is a moment in history. It will change many things, including our economy. But their potential is clear: they will make it possible for human beings to live far better lives. Whether they end up doing so depends on how the gains are produced and distributed. It is possible that the ultimate result will be a tiny minority of huge winners and a vast number of losers. But such an outcome would be a choice not a destiny. A form of techno-feudalism is unnecessary. Above all, technology itself does not dictate the outcomes. Economic and political institutions do. If the ones we have do not give the results we want, we must change them.
  • From the Job Loop to the Knowledge Loop (via Universal Basic Income) - "We work so we can buy stuff. The more we work, the more we can buy. And the more is available to buy, the more of an incentive there is to work. We have been led to believe that one cannot exist without the other. At the macro level we are obsessed with growth (or lack thereof) in consumption and employment. At the individual level we spend the bulk of our time awake working and much of the rest of it consuming."
  • I see it differently. The real lack of imagination is to think that we must be stuck in the job loop simply because we have been in it for a century and a half. This is to confuse the existing system with humanity’s purpose.

    Labor is not what humans are here for. Instead of the job loop we should be spending more of our time and attention in the knowledge loop [learn->create->share]... if we do not continue to generate knowledge we will all suffer a fate similar to previous human societies that have gone nearly extinct, such as the Easter Islanders. There are tremendous threats, eg climate change and infectious disease, and opportunities, eg machine learning and individualized medicine, ahead of us. Generating more knowledge is how we defend against the threats and seize the opportunities.
  • What's more scarce: money, or attention? - "Attention is now the scarce resource."
posted by kliuless (45 comments total) 99 users marked this as a favorite
 
kliuless, you are killing it with these big data posts. *settles in to read*
posted by cendawanita at 12:01 AM on September 8, 2016 [8 favorites]


The Onion predicts what will happen when Big Data 'fails':
“Unfortunately, late Monday evening, a major failure in our news feed program allowed a significant number of users to come into contact with concepts unfamiliar to them,” said CEO Mark Zuckerberg, appearing contrite as he emphasized to reporters that the issue had been resolved and that it was now safe to visit the social media site again without fear of encountering any opinions, notions, or perspectives not aligning with one’s existing worldview.
posted by DreamerFi at 12:09 AM on September 8, 2016 [10 favorites]


Modern AI techniques are good at some things, but it's important to remember that data can be biased just as easily as humans can. And you still have to understand problems in order to fix them: currently AI is more at the "perception" level than the "understanding" level. It can tell you there is a firetruck in the image, but not how it might have arrived or how to get it to leave.
posted by panic at 12:29 AM on September 8, 2016 [3 favorites]


Automation makes mass citizenry superfluous as soldiers, workers or taxpayers.

Well, there's somebody who has no clue about how either armies or wars work. Likewise, to blame automation for the state of work these days ignores the fact that there have been 35 years of an extensive and highly effective political and propaganda assault on labor and the rights of workers.

This makes me rather skeptical of anything he says.
posted by happyroach at 12:33 AM on September 8, 2016 [4 favorites]


Devices such as Amazon’s Kindle are able constantly to collect data on their users while they are reading books. Your Kindle can monitor which parts of a book you read quickly, and which slowly; on which page you took a break, and on which sentence you abandoned the book, never to pick it up again. If Kindle was to be upgraded with face recognition software and biometric sensors, it would know how each sentence influenced your heart rate and blood pressure. It would know what made you laugh, what made you sad, what made you angry. Soon, books will read you while you are reading them. And whereas you quickly forget most of what you read, computer programs need never forget. Such data should eventually enable Amazon to choose books for you with uncanny precision. It will also allow Amazon to know exactly who you are, and how to press your emotional buttons.

Woah
posted by gt2 at 1:45 AM on September 8, 2016 [5 favorites]


Condé Nast Has Started Using IBM's Watson to Find Influencers for Brands Tapping into AI for recruitment By Marty Swant (Adweek)

...
Through a new partnership announced today with IBM and the influencer platform Influential, brands advertising with the media company's properties—publications such as Vogue, Vanity Fair and The New Yorker—will be able to use big data to better know which social media celebrities might make for a good match for any given campaign.
...
It all comes down to better understanding the consumer experience, said Stephen Gold, vp of IBM Watson. Every reader has countless options for where to consume content. Watson considers 52 unique attributes for each person: Are they open-minded? Are they dutiful? Are they outgoing? The more Watson knows about an individual, the more it can help brands know what will resonate.

posted by sebastienbailard at 1:54 AM on September 8, 2016


I find the Dataism article to be a good example of how to ignore the data and just make something up that sounds ominous by reverse-scattergun selection of interesting tidbits. There are obviously massive social changes happening as a result of technology and, yes, lots of them are ominous, and we need to try and get a grip on what's going on and where it's heading and where we want to go instead. But coining the term 'Dataism' and drawing lots of random parallels with ancient religion does not seem like the most fruitful way to frame the current situation. Might be a good way to sell a book though.
posted by memebake at 2:44 AM on September 8, 2016 [10 favorites]


Mrs Segundus makes it impossible for Amazon to get any clear idea of what books I like. For one thing she uses my Kindle account, so that 40% of the books were actually bought by her. Quite a lot of those are ones even she doesn't really like, but has to have for one of her book clubs.

Then whenever I'm reading she interrupts me at random intervals so that where I pause, how long I take, and where I stop bear no relation to how interesting I'm finding the book.

Mind you, Amazon is pretty good at confusing itself and has always believed that I want to buy bicycle inner tubes, although I do not own a bike.

If Kindle was to be upgraded with face recognition software and biometric sensors, it would know how each sentence influenced your heart rate and blood pressure.


Oh yes, and if it was upgraded with caterpillar tracks and laser eyes it could be used as a deadly kill-bot.
posted by Segundus at 3:09 AM on September 8, 2016 [14 favorites]


Or to put it another way - the term Dataism bothers me because Data isn't really the problem. Making decisions based on data (i.e. things observed about the world) is a sensible thing to do. The problem is unfettered corporate control and lack of privacy protections (protections we need from both corporations and governments, as Snowden has shown). The article is from the FT so it doesn't have much to say about corporate control.
posted by memebake at 3:09 AM on September 8, 2016 [5 favorites]


Segundus, why would a device like this be so unlikely in the future? People wear fitbits all day.
posted by gt2 at 3:17 AM on September 8, 2016 [1 favorite]


Power will not devolve to data; rather, power will determine which data is gathered, which data is valued, and how data is interpreted. The idea that technology will determine power, rather than power determining technology, is a very old fallacy and an easy one to fall for.
posted by Pope Guilty at 4:06 AM on September 8, 2016 [1 favorite]


God.
posted by fairmettle at 4:16 AM on September 8, 2016 [1 favorite]


All the (optimistic and dystopian) promises of data rely on the assumption that the data will be trusted, that the collectors will obediently follow the data wherever it leads. But we know already that nobody does this. The data is collected and its implications are largely ignored in favor of whatever pet idea the current CEO or CMO has in mind. "The data" makes a rather incontrovertible case for a global climate catastrophe, but who listens? "The data" are pretty clear that consumer spending is predicated on wage growth, but you don't see the advertising class lobbying for wage increases on behalf of their customers.

Like the Bible, "data" is made to say whatever its holders already prefer to believe.
posted by overeducated_alligator at 4:19 AM on September 8, 2016 [20 favorites]


The conflation of data and religion is really off-putting and makes me think that a lot of the complaints are probably silly, since calling 'dataism' a new religion is really fucking silly
posted by MisantropicPainforest at 4:37 AM on September 8, 2016 [1 favorite]


Dataism is the belief that there is only One True Son of Noonian Soong.
posted by Faint of Butt at 4:40 AM on September 8, 2016 [6 favorites]


Segundus, why would a device like this be so unlikely in the future? People wear fitbits all day.

Battery life is a key selling point of e-ink devices.
posted by Leon at 5:31 AM on September 8, 2016 [2 favorites]


Meh. I've seen this before. When I was a young computer geek, people were confidently predicting that AI would lead to thinking computers by about 1980 or so. And I knew people who fervently believed that the goal of the human race was to create immortal silicon beings who would supplant us.

Big Data can make deep connections but also draw erroneous conclusions. For example: I live in Toronto, but my computer thinks I live in Ottawa and at least one website thinks that I live in Kitchener.
posted by tallmiddleagedgeek at 5:52 AM on September 8, 2016


Sorry - I should have said 1990, not 1980. I was thinking of the Fifth Generation Project.
posted by tallmiddleagedgeek at 6:02 AM on September 8, 2016


"The need to be observed and understood was once satisfied by God. Now we can implement the same functionality with data-mining algorithms."

"I am a prototype for a much larger system."
posted by AAALASTAIR at 6:02 AM on September 8, 2016


Human judgement is still useful. That's still part of the loop.
posted by Annika Cicada at 6:32 AM on September 8, 2016


"Figures don't lie, but liars sure do figure".
posted by mfoight at 6:32 AM on September 8, 2016


"Machine learning is like money laundering for bias." - Maciej Ceglowski
posted by mhoye at 7:10 AM on September 8, 2016 [10 favorites]


I mean, gosh darned, some people will believe anything (cough Trump, cough Hil), so add a bit of arcane science that gets a surprising amount right, and is it surprising that a cult of big data could arise?
posted by sammyo at 7:29 AM on September 8, 2016 [1 favorite]


"Figures don't lie

A lie unto itself.
posted by Pope Guilty at 7:43 AM on September 8, 2016 [1 favorite]


I have this notion of a pyramid model of data.

Data at the bottom; the next level up is information; above information is knowledge (and, controversially, above that is wisdom). Data isn't information, and information isn't knowledge; you need a transformative step.

But lately I've been thinking about data as ore. You can mine huge bucketloads of data, but then you need to smelt it into ingots of information and then forge those ingots into usable knowledge.

I like this metaphor because data by itself is voluminous, dirty and messy, and when you smelt it down you need to know what you're aiming for to get the information you want. Smelting ore for iron is different to smelting for aluminium. You do the processes differently, and you skim off different slag.
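The smelting metaphor maps neatly onto a toy data pipeline. A minimal sketch in Python (the sensor records and the cleaning rules below are invented purely for illustration):

```python
from collections import defaultdict
from statistics import mean

# "Ore": raw, messy event records -- voluminous, with slag mixed in.
raw_events = [
    {"sensor": "A", "temp": 21.0},
    {"sensor": "A", "temp": None},      # slag: missing reading
    {"sensor": "B", "temp": 19.5},
    {"sensor": "A", "temp": 22.0},
    {"sensor": "B", "temp": -999.0},    # slag: sentinel value
]

def smelt(events):
    """Refine raw data into information: skim off the slag, then aggregate."""
    by_sensor = defaultdict(list)
    for e in events:
        t = e["temp"]
        if t is not None and t > -100:  # skim the slag
            by_sensor[e["sensor"]].append(t)
    # "Ingots": one per-sensor average per sensor
    return {s: mean(ts) for s, ts in by_sensor.items()}

print(smelt(raw_events))  # {'A': 21.5, 'B': 19.5}
```

What you skim off depends on what you're aiming for, just as in the metaphor: a different question (say, counting dropped readings rather than averaging temperatures) would keep exactly the records this pipeline discards.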
posted by Just this guy, y'know at 7:45 AM on September 8, 2016 [3 favorites]


I have this notion of a pyramid model of data.

A pyramid schema, you say?
posted by mhoye at 7:50 AM on September 8, 2016 [7 favorites]


But lately I've been thinking about data as ore.

Excellent metaphor. Steel comes from ore, which then becomes surgeons' tools to help humanity.
posted by sammyo at 7:57 AM on September 8, 2016 [1 favorite]


But they do make other things with steel....
posted by thelonius at 8:01 AM on September 8, 2016 [1 favorite]


The "from deities to data" link is balderdash. It reads like someone pontificating on the singularity but desperately trying to make it relevant by slapping a bunch of stickers that say "big data" on the surface.

From the article: "This novel creed may be called “Dataism”. In its extreme form, proponents of the Dataist worldview perceive the entire universe as a flow of data, see organisms as little more than biochemical algorithms and believe that humanity’s cosmic vocation is to create an all-encompassing data-processing system — and then merge into it."

Yeah, this guy is just making shit up and his musings have very little in common with how data science is practiced in the real world, at least in my small corner of experience.
posted by kprincehouse at 8:19 AM on September 8, 2016 [2 favorites]


I thought that Capitalism was the dominant world religion?
posted by jchack at 8:26 AM on September 8, 2016


I don't agree that Dataism is the religion of Silicon Valley. The religion of Silicon Valley is simply good old Mammon-worship (no matter how strenuously they deny it). Consider how many Big Data companies deal with advertising, and how few with climate change.
posted by splitpeasoup at 8:43 AM on September 8, 2016 [4 favorites]


Oh yes, and if it was upgraded with caterpillar tracks and laser eyes it could be used as a deadly kill-bot.

YES! I WANT MY KILL-KINDLE NOW PLEASE!
posted by happyroach at 9:07 AM on September 8, 2016 [1 favorite]


The greatest trick mammon ever pulled was the time he made his hand invisible.
posted by mccarty.tim at 9:20 AM on September 8, 2016 [6 favorites]


proponents of the Dataist worldview

Yep, I've been reading about "data science" and working on the periphery for years, and this is the first time I've heard the phrase or really the specific sentiment. There are certainly a good number of inflated ideas about how powerful the technology actually is, but most would just say it's really powerful math, but still just math.
posted by sammyo at 10:12 AM on September 8, 2016


Much like rich economists who can pick stocks, show me the rich data scientists; and don't even get me started on statisticians (actuaries I let pass, as they're smart enough to protect their profession through many, many exams, and they actually do pretty well).

I think people would be surprised how dumb big data is. The Target pregnant-teen coupon thing never happened. Tracking your phone tells us you often stay at home at night, go to work, and sometimes have a meal or go to the gym afterwards. The astonishing analytic power of adverts means that the thing you viewed on Amazon follows you around. Young people watch cartoons. Men watch sports. Roads are busy in the morning before 9 and in the evening around 6. People buy patio furniture in summer and gardening tools when they retire. The apparent 90% of the world's data created in the last two years is mostly Unix log files and PDF whitepapers about the cloud/big data/IoT by vendors. It's convinced me that the best way of predicting the grand themes of the near future is to read soft sci-fi: William Gibson and J.G. Ballard, rather than what IBM's Watson has extrapolated from the past.
posted by Damienmce at 10:18 AM on September 8, 2016 [8 favorites]


Consider how many Big Data companies deal with advertising, and how few with climate change.

I've often wondered whether it could be used inversely somehow to nullify advertising: not block it, but invert it entirely so it has no effect. Not quite sure what the opposite is.
posted by Damienmce at 10:21 AM on September 8, 2016


1. kliuless you are like my all-time favorite metafilter poster.

2. I just finished the Weapons of Math Destruction book. I have a question with a little part and a bigger part. The little part first.

2A. I bought a new computer about four months ago, and I have been busy with a million other things and have not spent too much time using it; I haven't even gotten around to installing the privacy add-ons and the ad-block add-ons, so for the first time in years I have been assaulted with targeted advertisements in my browser. They are way, way off. They are precisely targeted at me in only one variable beyond the Sunday Newspaper Stuffing Ads--they class me as male, not female. Other than that they do not have a clue what products I consume. I'm sure the Feds know my birthday and my salary history and where I spend my money and that I am not a felon, but I really doubt they have a first clue about what target and weapon I would choose if I flipped out. So the little part of my question is: am I an outlier, or is their data this miserably noisy for a lot of people, or for most people, or for practically everybody, or for absolutely everybody?

2B. There is a question I have not seen addressed. It looks to me (I am not a Data Scientist, but I have real-world experience wrestling with terabyte-size data sets--I am sure the 2016 NSA management thinks that makes me a complete loser, but I am not) that most of these searching algorithms are looking in spaces that grow not by N, not by N^2, not even by e^N; they are usually combinatorial optimization problems, and these spaces grow by N!.

Is anybody arguing that the task purportedly being attempted by the NSA is a mathematical impossibility and this is a complete scam? Are they ever going to actually in fact stop a real crime or stop a real terrorist act? There is a teentsy weentsy little voice in the back of my mind that is yelling BULLSHIT like the Penn & Teller guy that yells.
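To put numbers on that factorial growth (a quick back-of-the-envelope check, not a claim about the NSA's actual algorithms): even at modest N, N! has already left e^N far behind, which is why brute-force search over orderings or pairings becomes infeasible so fast.

```python
import math

# Compare the growth rates the comment mentions: N^2, e^N, and N!.
for n in (5, 10, 20):
    print(f"N={n:2d}  N^2={n**2:<6d} e^N={math.exp(n):.3e}  N!={math.factorial(n):.3e}")
```

At N=20, e^N is on the order of 10^8 while N! is already on the order of 10^18; no amount of hardware scaling keeps up with that curve for long.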

It is rather bizarre to have a teentsy weentsy little voice yelling let me tell you.
posted by bukvich at 11:17 AM on September 8, 2016 [3 favorites]


ugh, responding to the pull quote alone because I'm still trying to get through the other big data post...

anyone who thinks that nations were led by gods and not generals needs to learn more history. Authority comes from Gold and Steel, and the god botherers are just another way to move those around.

You know what happened that led to Julius Caesar becoming emperor? He didn't ransack the republican virtues of the Roman Republic. No, those were destroyed before he got there, when Gaius Marius stood for consul SEVEN TIMES. He had a beef with Sulla. While Sulla was in Asia kicking Roman enemies' asses, Marius killed off all his supporters in Rome. Murdered. "Purged". Sulla won the war, took his (fucking huge) army back to Rome, and "did away" with the Marian partisans (Marius himself having died two months after his seventh election).

Two armies swept into Rome, wiping out the respective partisan political parties. And Rome was like "well, fuck that, one dude in charge."

Anyone who thinks people thought gods caused that needs to read more history.
posted by rebent at 11:58 AM on September 8, 2016 [2 favorites]


You know what happened that led to Julius Caesar becoming emperor? He didn't ransack the republican virtues of the Roman Republic. No, those were destroyed before he got there, when Gaius Marius stood for consul SEVEN TIMES.

Marius, though, was not the first. The Gracchi were the first: the populists brought to power on the back of the mob, whom they won to their side with free bread. The mobs existed because the poor had been driven off their lands by the patricians, who replaced them with the slaves the wars brought in. An endless supply of free labor was what really ended the Republic; useful to think about in our coming robots-and-algorithms future...

For instance: So Long to the Asian Sweatshop
In Asia, at least, the factors that made sweatshops an indelible part of industrialization are starting to give way to technology....Adidas Indonesia wants to reduce the proportion of manual labor in its cutting process to 30 percent. Hung Wah Garment Manufacturing in Cambodia has eliminated manual cutting outright.
And that's just the start. Three-D printing and other emerging technologies should allow manufacturers to meet customer specifications with unmatched quality, at speeds not previously imaginable in sweatshops, and with far less human labor. Even worse, for Asia's workers at least, is that Western companies can bring those same customizable technologies back home, and eliminate their overseas factories altogether.
Also, closer to the data-centric side, this article was also interesting. It's essentially about whether or not we ought to use the tools of Big Data when we are incapable of understanding how they work. Explaining why we are incapable is what makes the article a long read, so I won't delve into details here, but suffice it to say that we are already capable of designing tools that are better at discerning patterns than humans are, and therefore potentially of great use to us (one example from the article: determining which individuals in a group of patients are likely to develop complications and should stay in hospital for monitoring, while the rest can successfully recuperate at home). But we don't know how they come up with the answers, and since we don't know how they reason, it can be very, very difficult to pick up on the flaws in their reasoning...

Some of this talk of worshipping data is indeed a little overblown. Generals and politicians and the pulpits and missile silos they wield are still important. But I think Harari et al. are onto something... there's something that smacks of augury in some of this deep-learning stuff. We don't know why a green blotch on the side of the calf's liver means the king is doomed; it just does... We don't know why it picked patients A, Q, R and D to receive the experimental drug; it just did... but it's been right more often than the doctors have, and they're just guessing, too..... Double, double toil and trouble, fire burn and cauldron bubble. Listen up, Macbeth, DeepMind has some careers advice for you...
posted by Diablevert at 2:08 PM on September 8, 2016 [1 favorite]


MisantropicPainforest: The conflation of data and religion is really off-putting and makes me think that a lot of the complaints are probably silly, since calling 'dataism' a new religion is really fucking silly

Maybe the word "religion" is what's throwing you off. But it does look like the confluence of beliefs about the ability of future technology to save us from $problem, and the ability of industries to generate tribal loyalty in their workers. (It's commonplace here in Alberta, where people have an almost pathological loyalty to the oil industry; that and the whole Silicon Valley computing/financial complex and its adherents seem like straightforward examples.)

Damienmce: It's convinced me that the best way of predicting grand themes of the near future is to read soft sci-fi. William Gibson and JG Ballard rather than what IBM's Watson has extrapolated from the past.

And in fact Gibson and Sterling already called out this kind of technology worship (at least the kind we see in big IT bureaucracies and HR, themselves already rubbing their hands in glee at the prospect of Big Data) in The Difference Engine.
posted by sneebler at 2:48 PM on September 8, 2016


The author isn't pulling metaphysical ideas about biological creatures or ecosystems or the world or the whole universe being an information processing or computing system out of his ass either; there are lots of people who indulge in that kind of speculation.
posted by thelonius at 4:24 PM on September 8, 2016 [1 favorite]


Srsly. I invite you to consider the Macy Conferences and first wave cybernetics; maybe check out some N Katherine Hayles. Then into Stewart Brand. It's a belief at the heart of the Valley.
posted by dame at 8:02 PM on September 8, 2016


this article was also interesting. It's essentially about whether or not we ought to use the tools of Big Data when we are incapable of understanding how they work.

That we "can not understand ML" is partly just bad science writing. Folks understand the principles of neural nets; the specific details are very complex, but the core concepts are graspable. In this case "explaining to a client" may be an issue: the math behind the mashup of heavy statistics, linear algebra and gigantic sparse matrix computations does not make for easy PowerPoints. Hard, but there are folks who can unwind it.
posted by sammyo at 10:10 PM on September 8, 2016


That we "can not understand ML" is partly just bad science writing. Folks understand the principles of neural nets; the specific details are very complex, but the core concepts are graspable.

The author is a postdoc at Princeton's Neuroscience Institute, so I don't think this article is an instance of a dumb journalist unaware of the current state of the field. Understanding in principle how neural nets work is certainly possible. The fact that we struggle to grasp the specifics of how a given algorithm is making its decisions is the problem. The article discusses two different approaches being taken by scientists to sort of reverse engineer neural nets and figure out how they're making decisions. Both are limited and reveal interesting problems.
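One simple flavor of that reverse engineering can be sketched without any real neural net: permutation importance, a model-agnostic probe. The "black box" below is a stand-in of my own construction (it secretly uses only one input), not one of the article's methods:

```python
import numpy as np

rng = np.random.default_rng(3)

# A stand-in "black box": a fixed classifier that secretly uses only feature 1.
def black_box(X):
    return (X[:, 1] > 0).astype(float)

X = rng.normal(size=(500, 3))
y = black_box(X)  # labels the model gets exactly right, by construction

baseline = (black_box(X) == y).mean()

# Shuffle one input column at a time and watch accuracy drop: a crude probe of
# which inputs the predictions actually depend on, no internals required.
importance = []
for j in range(3):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importance.append(baseline - (black_box(Xp) == y).mean())
```

Probes like this say *which* inputs matter, but not *why*, which is exactly the limitation the article dwells on.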

In one effort, for example, they looked at a photo-recognition program and examined how it identified photos containing drapes. Turned out it wasn't looking for windows or shadows, but beds. Because in the dataset it had been trained on, drapes were highly correlated with bedrooms. Perhaps that example seems a trivial thing. But it shows the fragility of the models: would the network successfully identify pictures of drapes not in bedrooms? Would you ever know? Because if you searched for "pictures of drapes" you'd get a page full of them. Stuff like this may seem simple to correct for. But the very strength and potential of these programs is to be able to, say, train them up on a selection of, say, 1 million photos, then set them loose on the internet. The power and the wonder of them is that you're not only able to ask "find me pictures of drapes", you can ask for pictures of cats or polka dots or clowns or tractors. The whole point of using them in the first place is that it's pretty damn difficult for a human to parse and classify a collection of a million photos, and so it may not be possible for an outside observer to spot biases inherent in the training data.
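The drapes-via-beds failure mode is easy to reproduce in miniature. This is a toy version I've constructed, not the article's actual experiment: a classifier trained where a clean "bed" feature co-occurs with drapes 95% of the time, while the direct drapes signal is noisy, so the model leans on the proxy:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "photos": feature 0 is a weak, noisy drapes signal; feature 1 is a clean
# has-a-bed flag that, in training, co-occurs with drapes 95% of the time.
n = 1000
has_drapes = rng.random(n) < 0.5
has_bed = np.where(has_drapes, rng.random(n) < 0.95, rng.random(n) < 0.05)
X = np.column_stack([has_drapes, has_bed]).astype(float)
X[:, 0] += rng.normal(scale=2.0, size=n)  # drown the direct signal in noise
y = has_drapes.astype(float)

# Logistic regression by gradient descent.
w = np.zeros(2)
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w)))
    w -= 0.1 * (X.T @ (p - y)) / n

# Drapes with a bed vs. drapes without: confidence collapses when the bedroom
# proxy disappears, even though drapes are present in both cases.
with_bed = 1 / (1 + np.exp(-(np.array([1.0, 1.0]) @ w)))
without_bed = 1 / (1 + np.exp(-(np.array([1.0, 0.0]) @ w)))
```

On the biased training set the model looks accurate; only by testing the drapes-without-bed case, which the training data barely contains, does the flaw show up.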

Are such flaws such a big deal when it comes to improving Google image search? Maybe not. Are they a problem when you want to use neural nets to recommend patient treatment options? Criminal sentences? Award scholarships? It may be possible to design a program that, say, reduces patient care costs or recidivism rates without fully understanding why it does so. Is it ethical to use such a program?
posted by Diablevert at 8:13 AM on September 9, 2016 [3 favorites]





