Technology adoption swerve
August 16, 2017 9:38 PM

8 lessons from 20 years of Hype Cycles looks at how Gartner's predictions of upcoming tech panned out.
posted by a snickering nuthatch (51 comments total) 26 users marked this as a favorite
 
The mesh networks example is hilarious. It's like it's caught in a Pacific gyre, endlessly cycling through the trough of disillusionment. Honestly you might as well skip your Gartner subscription, hire an intern and maybe subscribe to Analog, which has about the same grounding in reality.
posted by GuyZero at 10:06 PM on August 16, 2017 [4 favorites]


Is he saying that HTML5 and BYOD are enjoying only "minor success"? That's not been my experience.
posted by sbutler at 11:14 PM on August 16, 2017 [4 favorites]


Lesson 1. We're terrible at making predictions. Especially about the future.

This is so true. I find my predictions about the future tend to be around 17 to 18% worse than my predictions of events that have already happened.
posted by gusottertrout at 11:25 PM on August 16, 2017 [16 favorites]


TL;DR: it's nearly impossible to predict which technologies will succeed or not, especially based on initial investor-seeking hype.

I do have a minor quibble: Podcasting AFAICT is a mature, commonly used technology. So why he kept it at stage 2 puzzles me. Then again, I may be biased due to doing most of my reading using podcasts.

Then again, did anybody ever predict the runaway success of YouTube videoblogging?
posted by happyroach at 11:33 PM on August 16, 2017 [4 favorites]


I love this SO MUCH! Thanks, OP.
posted by Bella Donna at 11:50 PM on August 16, 2017


If you ever find back issues of WIRED from the last century, grab them with both hands! Real everyone-will-have-a-personal-zeppelin style prognostics.
posted by adept256 at 11:51 PM on August 16, 2017 [4 favorites]


Wrong as a wrong thing about HTML5.
posted by GallonOfAlan at 11:57 PM on August 16, 2017 [1 favorite]


> So why he kept it at stage 2 puzzles me.

Because nobody made any money on it, I guess?
posted by Leon at 12:31 AM on August 17, 2017 [5 favorites]


The current bullshit blizzard about AI is a case in point.
posted by GallonOfAlan at 12:34 AM on August 17, 2017 [10 favorites]


When I look at these lists I am reminded of how, back in the early 1900s, electric cars looked like the obvious way forward: Baker Electric Cars were the largest electric car manufacturer in the world - they didn't need any hand cranking to start, they could do 22mph (not bad considering there were no paved roads) with a range of nearly 100 miles. Larger cities like New York were readily equipped with charging stations. Even Edison owned one - and surely somebody would want to take his name as an electric car brand?

So it is that the "slope of enlightenment" sometimes follows a gradient much shallower than we could ever realise. See also mice, graphical user interfaces, faxes, hypertext etc.
posted by rongorongo at 2:30 AM on August 17, 2017 [1 favorite]


Good point about electric cars, however AI is a different kettle of fish. We've been promised advanced computer intelligence since the 1960s; instead, I spend a good part of my time interacting with computers changing the word 'duck' back to what I was trying to say in the first place.
posted by The River Ivel at 2:53 AM on August 17, 2017 [16 favorites]


🗣️"Did you say 'Duck this pizza ship'?"🗣️
posted by rongorongo at 3:20 AM on August 17, 2017 [15 favorites]


Also self-driving cars. I can see increased use of driver assists making sense. But are people really going to entrust the safety of themselves and their passengers to the car computer at 120kph even if the tech problems are solved? Not even on a wide three-lane motorway if every other vehicle on it was also autonomous. What problem are we trying to solve with this?
posted by GallonOfAlan at 3:38 AM on August 17, 2017 [3 favorites]


Is there a version of this story that does not demand I subscribe to a "Linkedin News Feed" to read it?

Edit: I had to turn off the VPN. Now it loads normally.
posted by Kirth Gerson at 3:40 AM on August 17, 2017


Speech Recognition appears in the very first Hype Cycle in 1995, where it's climbing the Plateau of Productivity. In reality, speech recognition was far from mature in 1995. It's only - possibly - with the deep learning breakthroughs of the last two years that we have reached human-equivalent accuracy in speech recognition. Two decades later.

Dragon Systems certainly produced many convincing demos back in the day.
posted by fairmettle at 4:04 AM on August 17, 2017


But are people really going to entrust the safety of themselves and their passengers to the car computer at 120kph even if the tech problems are solved?

Can't see how it's any worse than what I trust now.
posted by thelonius at 4:16 AM on August 17, 2017 [12 favorites]


Surprised to see no mention of the "semantic web".
posted by thelonius at 4:18 AM on August 17, 2017 [5 favorites]


I'm living in autonomous car research ground-zero here in Pittsburgh; it's unusual to not see one when I'm driving through the city, and I seriously doubt that they'd be able to go 100% solo within the next fifteen years or so.
posted by octothorpe at 4:57 AM on August 17, 2017 [4 favorites]


Surprised to see no mention of the "semantic web".

I nearly did my entire Master's thesis on this. Instead I focused on blogging as a tool for political participation. 🔮
posted by ukdanae at 5:26 AM on August 17, 2017 [10 favorites]


Do these "experts" even know what the words mean? "Internet micropayments: alternatively known as ecash, epayments, cryptocurrency and in its latest incarnation - bitcoin..." -- Bitcoin was never planned to include micropayments; it solves a totally different problem. The "professional analyst" services often just string together words from a Google search, adding the "disruptive" catchphrase du jour.
posted by sammyo at 5:27 AM on August 17, 2017 [5 favorites]


Yeah, the bitcoin thing made me cringe.
posted by a snickering nuthatch at 5:45 AM on August 17, 2017


Octothorpe, do you have any info on the current status of the Uber vehicles? Are they ever fully autonomous outside of their special Strip District corridor?
posted by soren_lorensen at 5:47 AM on August 17, 2017


As far as I know, none of them are ever totally autonomous on the city streets. There's always a driver with their hands near the steering wheel prepared to take over. That last 5% or so of reliability is going to be a big hump to get over.
posted by octothorpe at 6:09 AM on August 17, 2017 [4 favorites]


AFAIK the Pittsburgh cars all have two Uber humans in them at all times.
posted by GallonOfAlan at 6:09 AM on August 17, 2017


Good point about electric cars, however AI is a different kettle of fish. We've been promised advanced computer intelligence since the 1960s; instead, I spend a good part of my time interacting with computers changing the word 'duck' back to what I was trying to say in the first place.

Counterpoint: Siri, Google Assistant, Alexa, Cortana. They're not perfect, but the fact that it's even possible to have a useful voice-only interface is a huge advancement. They are leaps and bounds over what used to be out there.
posted by leotrotsky at 6:26 AM on August 17, 2017 [2 favorites]


HTML5 was heralded as a potential revival of the semantic web. It failed to accomplish those goals.

Incremental improvements to JS engines (and subsequently the language itself) ended up having a much bigger impact.

(Which is to say, as unglamorous and obvious-sounding as it is, always bet on incremental improvements. It's a lot easier to bet on B when the path from A to B is apparent.)
posted by schmod at 6:57 AM on August 17, 2017 [4 favorites]


There are also technologies that "plateau" despite failing to fulfill their intended goals, and gradually fall out of favor.

XML is a prime example of this. The widespread adoption of XML was so poorly implemented in the first round that most developers became permanently disillusioned with the very concept, and put little effort into maintaining it, or using it in actually-appropriate contexts. [JSON and YAML are better-suited for many of the things that XML was used for, but not all of them. The current attempts to reimplement the worst parts of XML on top of JSON are misguided and painful to watch.]
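As a minimal sketch of that contrast (the record and field names below are made up for illustration, not taken from any real system): JSON maps directly onto the data structures most programs already use, while XML forces extra decisions about attributes, elements, and how to wrap a list.

# Illustrative only: the same made-up record as JSON and as XML,
# using nothing but the Python standard library.
import json
import xml.etree.ElementTree as ET

record = {"id": 42, "name": "hype cycle", "tags": ["gartner", "forecast"]}

# JSON: a one-liner, because the format mirrors dicts and lists directly.
print(json.dumps(record, indent=2))

# XML: we have to decide what becomes an attribute, what becomes an
# element, and how to represent the list of tags.
root = ET.Element("record", id=str(record["id"]))
ET.SubElement(root, "name").text = record["name"]
tags = ET.SubElement(root, "tags")
for tag in record["tags"]:
    ET.SubElement(tags, "tag").text = tag
print(ET.tostring(root, encoding="unicode"))

That mapping overhead is tolerable for documents, but it felt like pure friction for the config files and API payloads XML ended up being used for.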
posted by schmod at 7:01 AM on August 17, 2017 [5 favorites]


I went to a self-driving car meetup last night and, while incredibly interesting, it reminded me that there is "software" and then there is physics and chemistry. A huge change in the world of atoms is just really hard. The big one is the sensors: LIDAR is currently a high-grade optical device - the speaker brought one, a small $8,000 'puck'. The goal of the industry is to get to $1000 devices. One would think redundancy would be a good thing, so double the cost - and it had better have really good insurance, as they are super fragile. Oh, and there's about a year's backlog on orders; they're just hard to build. Oh, and it's optical tech, so a blob of mud or bird poop seriously degrades performance.

But what happens when a trucking company invests big and then charges 50% lower cost coast to coast with 30% faster delivery?
posted by sammyo at 7:16 AM on August 17, 2017 [1 favorite]


"Did you say 'Duck this pizza ship'?"

As far as my four-year-old nephew is concerned, Hollaback Girl is a song about a ship made out of bananas. I kind of like his version better.
posted by nebulawindphone at 7:31 AM on August 17, 2017 [5 favorites]


I suppose the semantic web goal did help promote the concept of web standards.
posted by thelonius at 7:41 AM on August 17, 2017 [1 favorite]


I've had similar thoughts as the author of this piece when looking at the technology predictions made in my own field, U.S. higher education. In particular, the Horizon Report has been published annually since 2004 (and I think it may have deeper roots) and it centers on one or more lists of technologies that are expected to take off in the near future. However, I haven't done much writing about this because (a) it's too easy to criticize or tear down without offering much that is constructive and (b) I'm not sure how many people really interpret this report as a concrete set of predictions instead of some kind of conversation starter about possibilities. If it is intended as a set of predictions then it has a terrible success rate!

That said, I have an immense skepticism of consultants, reporters, and anyone associated with "futurology."
posted by ElKevbo at 7:55 AM on August 17, 2017 [1 favorite]


I suppose the semantic web goal did help promote the concept of web standards.
That's what I was about to say; at this point it's more of a theme than a specific technology. The delay has been and always will be the long tail of old, difficult-to-repurpose content spread across thousands and millions of minimally maintained web sites or managed by small orgs that don't have the resources to refresh it.
posted by verb at 7:55 AM on August 17, 2017


Mesh networking isn't a good example to use here. There are now 64 million smart gas or electrical meters in operation, virtually all of which use mesh networking to communicate back to the utility. Here in California, where smart meters are ubiquitous, virtually everyone "uses" a mesh networked device.

Sure, the hype cycle may have thought we'd all use mesh networking to create neighborhood Internet networks instead of neighborhood power utility data networks, but it's pretty clear that mesh networking has, behind the scenes, snuck into the plateau of productivity.

It's sort of like bemoaning that "Desktop Linux for Business" never took off, when the vast majority of us have phones or TVs that run Linux, many of which have vastly reduced the need to rely on a desktop operating system, or business/consumer services that ultimately run with Linux on the back-end. The Linux hype was totally legitimate, it's just that the way it was implemented differed from fairly narrow expectations of the way people would interact with computing devices in the future.
posted by eschatfische at 7:58 AM on August 17, 2017 [22 favorites]


If you ever find back issues of WIRED from the last century, grab them with both hands! Real everyone-will-have-a-personal-zeppelin style prognostics.
You don't have to go that far back. The Obama-edited issue from last year feels like an artifact that fell through a wormhole from an alternate universe these days.
posted by rhamphorhynchus at 8:19 AM on August 17, 2017


The delay has been and always will be the long tail of old, difficult-to-repurpose content spread across thousands and millions of minimally maintained web sites or managed by small orgs that don't have the resources to refresh it.

It's not like people want to semanticize their web content, but just don't have the time and resources. They don't care.
posted by thelonius at 8:30 AM on August 17, 2017 [1 favorite]


We are swimming in AI, we just call the practical techniques "algorithms" and continue to do more and more advanced research.

That bugged me too. He's not tracking the technology, just the buzzwords. Sure, there's an uptick in speculation about general AIs, but there are tons of unglamorous narrow AIs that have been chugging along for decades, and the fact that there's increased media attention to artificial intelligence as a concept right now says nothing about the state of artificial intelligences in the real world.

Similarly, ambient technologies. That's more a design concept than a discrete technology really, and it didn't go away, just the buzzword. The concept is folded into lots of different types of technology, and is pervasive enough that the reason nobody talks about it is that it's unremarkable.

Technologies evolve. Most new technologies have roots in older ones, and a whole lot of them are mostly just old technologies with new names.
posted by ernielundquist at 8:32 AM on August 17, 2017 [5 favorites]


My day job involves "machine learning" and AI-adjacent technologies. From where I sit, we have either already achieved "true" AI, or we never will.

The closer we get to any defined target, the less it looks like AI, and the more it looks like math. At least to me. Lots of math. Advanced math. But still: math. So is AI "just" math? If you say it could be, maybe we've already achieved some levels of AI. If not, then perhaps we never will.
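As a minimal sketch of that point (the weights and inputs below are invented): strip away the framing and a single "neuron" is a weighted sum pushed through a squashing function - arithmetic, nothing more.

# Illustrative only: one "neuron" with made-up weights.
import math

def neuron(inputs, weights, bias):
    # weighted sum of the inputs...
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # ...squashed through a logistic function into a 0..1 score
    return 1.0 / (1.0 + math.exp(-z))

features = [1.0, 0.0, 3.0]   # hypothetical input features
weights = [0.8, -1.2, 0.4]   # hypothetical learned weights
print(neuron(features, weights, bias=-0.5))

Stack millions of these and fit the weights with more math (calculus, mostly) and you have a modern network - but it never stops being math.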

If we can produce "real" AI using math, which is the road we're on now, that certainly does boost determinism as a philosophy!
posted by pwinn at 8:33 AM on August 17, 2017 [4 favorites]


On a side note: yes, I am usually way skeptical of futurist storytelling, but Kevin Kelly's new-ish book, The Inevitable, is a decent read. I have enjoyed and respected his thinking and writing since he was at Whole Earth Review/Catalog, and he is up front about prediction issues re: his time at Wired in the early part of the book. Because he is dealing with broader trends rather than specific technologies, he has more leeway, but the ideas are strong enough to have made me reexamine my own sometimes old-fashioned thoughts on technology adoption. Plus, I loved Out of Control when it came out back in the 1990s - it opened up a whole world of thinking for me and is, in a way, a precursor to this new book.
posted by buffalo at 8:45 AM on August 17, 2017 [2 favorites]


Regarding AI, there is this recent piece from Backchannel (now at Wired) by Kelly (adapted from the first chapter of the book).
posted by buffalo at 8:51 AM on August 17, 2017 [1 favorite]


Desktop Linux for Business - he implies a notable part of why this didn't work was "VMware enabled people to run Linux as an app on Windows, never allowing it to gather footprint as an OS" and not "Linux geeks utterly failed to set aside their anti-Windows biases long enough to help people switch."

I've made Linux install discs and flash drives that failed. No idea why. Linux forums tend to have the attitude, "if you can't follow the conversations here, you shouldn't be using Linux anyway."

Linux has not been adopted for business because (1) there's little or no profit in it and (2) too many Linux geeks love believing they have the "elite" OS, and don't want the general public to use it. If sales staff and upper management can't navigate the OS, the company isn't going to adopt it as the standard, no matter how many of the programmers would prefer it.
posted by ErisLordFreedom at 9:31 AM on August 17, 2017 [5 favorites]


Incremental improvements to JS engines (and subsequently the language itself) ended up having a much bigger impact.

Javascript is, in fact, a wonderful example of an under-hyped technology that has become ubiquitous in the same mid-90s-to-the-present timeline covered by the author. I can remember playing about with Brendan Eich's first version - which had taken him 10 days to write - in 1995. Early demos showed that - if you could work around its manifold syntactical shortcomings and confusing-as-hell name - you could use it to change the background colour of a Netscape (only) web page, reload a frame, or even serve as a crappy-looking calculator. Good luck persuading the people at Gartner to tell everybody we should waste our time on that one...
posted by rongorongo at 9:57 AM on August 17, 2017 [3 favorites]


Linux has not been adopted for business because

Ah, hold on, Linux basically runs AWS, Google, and probably every other massive data center excepting Microsoft; IBM is probably mostly some version of Unix (AIX), other than legacy mainframe applications.

Oh, you mean "desktop applications"? Open a shell on a Mac or an Android and type ls, pwd, ps, cat, and gosh, it just works - that's Unix/Linux. Microsoft is releasing an option to run linux apps natively on Win10.

IoT: a huge percentage run a stripped-down Linux OS.

There are millions of Word/Excel workstations, BILLIONS of "Unix" boxen.

Linux won.
posted by sammyo at 10:23 AM on August 17, 2017


Microsoft is releasing an option to run linux apps natively on Win10
That's been out for a while. Last I checked, the network tools (eg ping) didn't work.
posted by rhamphorhynchus at 10:28 AM on August 17, 2017


Ah, hold on, Linux basically runs AWS, Google, and probably every other massive data center excepting Microsoft...

I was using "Linux" as shorthand for "Linux as the OS installed on standard office computers." Unix-based code is indeed all over the IoT and the back-end of many devices - but it's nowhere near competing with Mac/Windows for standard office functions. And that's not because it's weak as an OS (or it wouldn't be able to do all the things it does), but IMHO because a great many Linux geeks, possibly the majority, seem offended that people might want to use it for web browsing and emails and maybe playing some games, and not become superhackers.

The common reaction is, "well, why would you need Linux if that's all you want to do with a computer?" Enter: Chromebooks and Win10, loaded with base OS features designed to track your every move, online and off.

Linux as desktop OS - not a VM option, not source code for limited-purpose devices - could put cracks in the data-sale industry that underlies much of the internet. Could bring back the notion that computer users should actually know something about how computers work, at least in the way that most drivers know something about how cars work.

But it can't happen as long as the general reaction to "I've heard Linux has privacy features that Windows lacks" is "you don't need Linux; here's some scripts you can run in Python; they might work on Windows or you could just modify them."
posted by ErisLordFreedom at 11:21 AM on August 17, 2017 [2 favorites]


So I think one important thing to understand is "who is the Gartner hype cycle for?"

Sure, anyone could get a copy of one of their reports and get some info from it, but who is paying money for it - and believe me, Gartner charges a lot of money for these reports.

One, technology buyers. But they don't really care much about hype cycles. They look at magic quadrants to figure out which vendors to buy from for CRM or marketing automation or whatever. They really want Forrester Waves or whatever vendor selection bullshit is trendy.

Two, vendors. These are the people looking at these hype cycle charts. They're basically asking whether they should make a mesh networking product or whatever. So when we have this:

Yes, it’s true that many of the Hype Cycle’s one hit wonders survive today, enjoying minor success or mindshare: Crowdsourcing - 2013, HTML5 - 2012, BYOD - 2012, Podcasting - 2005.

These things came and went because no one was interested in buying Gartner reports about the market for these technologies. There's still no money in podcasting from the view of an app developer or technology vendor. If you're a journalist, sure, there's money, but they don't buy Gartner reports.

And Gartner aren't technical geniuses. They're all smart enough people, but they're not fundamentally inventing things. So when they mispredict WS services, authentication/identity services or "Tera-Architecture" it's not that surprising. They're just assessing the future of concrete things they can look at, but they're not fundamentally capable of assessing things that don't exist. Also they don't really try to predict synergistic factors, like how OAuth really took off once there was a public identity service that everyone actually used (Facebook as opposed to MSFT Passport).

But the question of "who are Gartner reports for" also helps explain why x86 virtualization, NoSQL, open-source, etc don't make it on the map - because no one is paying to understand that. VMWare invented the commercial market for virtualization and then there were open-source competitors. But having worked for a company in the VMWare ecosystem, there was simply nowhere for any other companies to go. VMWare made all the money and any add-on products were quickly subsumed into their product line. So why analyze a market when no one is going to pay for those reports?

Anyway, this whole thing is just as interesting to me as an analysis of why Gartner exists and what it really does versus why technology forecasting is doomed to failure.
posted by GuyZero at 11:56 AM on August 17, 2017 [11 favorites]


…computer users should actually know something about how computers work, at least in the way that most drivers know something about how cars work.

You vastly overestimate the mechanical knowledge of the average driver, methinks.
posted by Anticipation Of A New Lover's Arrival, The at 4:40 PM on August 17, 2017 [4 favorites]


The common reaction is, "well, why would you need Linux if that's all you want to do with a computer?" Enter: Chromebooks and Win10, loaded with base OS features designed to track your every move, online and off.

Chromebooks are desktop Linux machines. When you use a Chromebook to access Gmail, or Google Sheets, or visit most web sites, you're using Linux on the desktop to access a server running Linux. A whole lot of Linux.
posted by eschatfische at 5:47 PM on August 17, 2017 [1 favorite]


Ah, hold on, Linux basically runs AWS, Google, and probably every other massive data center excepting Microsoft; IBM is probably mostly some version of Unix (AIX), other than legacy mainframe applications.

IBM's cloud runs linux.

Only weird minicomputers run AIX these days. If you didn't decide to use AIX in the mid-80s then you're not using it today.
posted by GuyZero at 5:52 PM on August 17, 2017


rhamphorhynchus:
Microsoft is releasing an option to run linux apps natively on Win10
That's been out for a while. Last I checked, the network tools (eg ping) didn't work.

Last I checked (five minutes ago) it works perfectly.
posted by lhauser at 7:36 PM on August 17, 2017


Taking operating systems in their entirety across all devices, *nix is clearly dominant. Everywhere except the corporate and home desktop, and the number of vital applications in those arenas that only run on Windows is still vast.
posted by GallonOfAlan at 1:25 AM on August 18, 2017


As much as I made a post about the ubiquity of Unix/Linux, Microsoft continues to dominate significant markets. Look at just about any "down" public display (there's a Windows icon), the vast majority of point-of-sale terminals in every store and restaurant, the majority of machine control systems (often the only OS the control app runs on), most government systems (versions of Vista, because the replacement certification continues, and continues), and the ubiquitous humongous Excel spreadsheets that continue to grow and are a major force in RAM upgrades ;-)

MS is deeply ingrained in large areas that have significant forces requiring the use of that OS. It'll be legacy into the next century - emulated, probably, much longer.
posted by sammyo at 5:55 AM on August 19, 2017




This thread has been archived and is closed to new comments