Benchmark Testing "There's No Such Thing as Bad Publicity"
October 10, 2018 1:38 AM

Following the debut of Intel's new 9th generation CPUs, Intel published a set of benchmarks commissioned from third-party Principled Technologies - ten full days before the press embargo was to be lifted. A set of benchmarks that, upon closer inspection, are suspect at best. With mistakes ranging from poor memory settings on the AMD systems and different hardware per test system, to giving the Intel CPUs high-performance coolers while running the AMD CPUs on stock or unsuitable coolers, and actually disabling half of the AMD Ryzen CPU's cores, there are a lot of questions about the testing Intel released ahead of the press embargo date. Steve Burke of Gamers Nexus drove to Principled Technologies' offices and sat down with one of the co-founders to try to find some answers.
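For a rough sense of how much the core-disabling alone can skew things, here's a minimal sketch in Python - a generic CPU-bound toy workload, not the benchmark suite Principled Technologies actually used - timing the same job with all cores and then with half of them:

import os
import time
from multiprocessing import Pool

def burn(n):
    # CPU-bound busy work standing in for one benchmark work unit.
    total = 0
    for i in range(n):
        total += i * i
    return total

def run(workers, units=16, n=2_000_000):
    # Time `units` work items spread across `workers` processes.
    start = time.perf_counter()
    with Pool(workers) as pool:
        pool.map(burn, [n] * units)
    return time.perf_counter() - start

if __name__ == "__main__":
    cores = os.cpu_count()
    half_workers = max(1, cores // 2)
    full = run(cores)
    half = run(half_workers)
    print(f"{cores} workers: {full:.2f}s; {half_workers} workers: {half:.2f}s "
          f"({half / full:.2f}x the runtime)")

On any workload that actually scales across cores, the half-cores run comes out meaningfully slower - which is the point: the Ryzen parts were never given a chance to use what they have.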
posted by Punkey (41 comments total) 6 users marked this as a favorite
 
Benchmarks have always been suspect, tweaked for better sales numbers. But this sounds more blatant.

They shoulda called Roy. Roy has been benchmarking computers since the 1960s, and has seen all the tricks manufacturers pull.
posted by scruss at 3:32 AM on October 10, 2018 [7 favorites]


Well now, that's pretty freakin' shady, Intel. What is anyone supposed to do about it, though? It's hard for me to envision Intel suffering any consequences from this.

Installment n of the long-running hit series, "Never Believe a Marketer," I guess.
posted by Anticipation Of A New Lover's Arrival, The at 3:56 AM on October 10, 2018 [3 favorites]


Wellllllllll...

...there's actually rumours that have been going around that Intel's lost the plot and that between ARM, Apple's cores and AMD, Intel's more vulnerable than they've been in decades. At this critical moment Intel needed to prove that their most recent chips are an improvement over AMD's older Ryzen chips. If this report has been faked, that gives the rumours more credibility. You'll see manufacturers who were waiting on Intel's new chips start to get a little antsy. It might be a big deal.
posted by Merus at 4:09 AM on October 10, 2018 [5 favorites]


Weird, Intel is usually more sophisticated than this when they try to mislead their customers. Did they honestly think no one would notice these fake benchmarks? Dude, you are in the performance business, people are going to test every bit to make sure it does what you claim it does, especially since you have a long history of underhanded tactics.

Huge thanks to everyone involved in exposing Intel.
posted by Foci for Analysis at 5:16 AM on October 10, 2018


"Principled Technologies"... what's that rule about whenever a business uses a virtuous word in its name, it's the opposite of that?
posted by clawsoon at 5:38 AM on October 10, 2018 [4 favorites]


It's certainly fair to say that Intel surprised us all with the unexpected shift of its upcoming 28-core chip to the Xeon family, as well as the announcement of the X-series chips, too. And what's the deal with hyperthreading? ... Intel didn’t even mention what many enthusiasts wanted to know: why only the i9-9900K, out of all of Intel’s mainstream parts, boasts the hyperthreading feature.
Do any of these surprises have to do with the recent Spectre/etc. security bugs?
posted by clawsoon at 5:43 AM on October 10, 2018 [2 favorites]


What's bizarre to me about this is processors have basically stopped getting faster in general, at least compared to their previous rate of advance. My 6th gen Core NUC is perfectly fine. I guess marginal improvements in speed and power consumption make a difference in big data centers, but I also assume that the folks making those decisions are going to do their own tests before they buy 10,000 new boards or whatever. And isn't gaming all about the GPU?

Like literally who cares about these minor speed bumps anymore?
posted by dis_integration at 5:54 AM on October 10, 2018 [1 favorite]


Gee, Intel cheating on benchmarks?

This is like news from 1998...

That said, Intel has really been shitting the bed these last few years.
posted by Pogo_Fuzzybutt at 5:56 AM on October 10, 2018 [1 favorite]


I am certain their engineers have been screaming at the marketing department for years to be practical and honest about the limits of their technology, given the architectural choices they made decades ago. But Intel has built their entire business on being a performance leader, and the only directions for them to go are either 'nope, it actually kind of sucks' or 'hey, it's awesome (not technically lying)'.

Pretty sure their shareholders are going to eat them either way, but at least this way they won't get sued into oblivion. Just gradually fade into market obscurity. Where they fucking deserve to be.
posted by seanmpuckett at 6:04 AM on October 10, 2018 [1 favorite]


What's bizarre to me about this is processors have basically stopped getting faster in general, at least compared to their previous rate of advance.

I'm pretty techy and I stopped giving much of a shit about processors almost a decade ago. Most of the bottlenecks I experience these days are elsewhere: how big is my pocket? How long does the battery last? How fast is my own mind?
posted by srboisvert at 6:20 AM on October 10, 2018 [3 favorites]


Like literally who cares about these minor speed bumps anymore?

I built my PC with some middle-of-the-road components four years ago and I haven't found any reason to upgrade the CPU or even the GPU since then.
posted by octothorpe at 6:21 AM on October 10, 2018 [3 favorites]


Maybe they'll become vulnerable enough that a private equity firm can snap them up and help / suck the marrow from their bones.
posted by Mr.Encyclopedia at 6:22 AM on October 10, 2018


Do any of these surprises have to do with the recent Spectre/etc. security bugs?

My opinion as a semi-informed dabbler: probably not, but Intel is weird. Moving high-core-count CPUs to a higher-tier brand is most likely a marketing decision; they decided they could make more money selling those chips to the server/workstation market than to enthusiasts/gamers. Hyper-threading has traditionally been a concern for Spectre-style shared-cache side-channel attacks, but it doesn't seem to make sense for Intel to drop Hyper-threading in enthusiast CPUs for security reasons while retaining it in server CPUs, where it's far more likely to be attacked.
posted by skymt at 6:30 AM on October 10, 2018 [3 favorites]


I hear that the 10th gen chips will just be a FPGA with a post-it note attached saying "Make your own CPU, assholes!"
posted by RobotVoodooPower at 7:06 AM on October 10, 2018 [6 favorites]


Like literally who cares about these minor speed bumps anymore

Minor speed bumps are an essential part of the process if you ever want computing to get faster or more efficient. Continual bumps in efficiency across industries might literally save our animal-friendly planet.

How many of those minor speed bumps could be missed before someone small looks at moving to AMD for their new machine, or someone big, like Apple, Microsoft, etc., moves to a different chip architecture entirely?

I'm always a bit surprised by how people see technical incremental gains-- without the increments, you never gain anything outside of a truly 'Eureka!' moment. If a 2018 car is /slightly/ more efficient than the 2017 model, that's still a win. If on day two of your couch-to-5k program you can't run the five kilometres, that's no reason to stop. Incrementalism is progress.
posted by Static Vagabond at 7:11 AM on October 10, 2018 [11 favorites]


Sure, but in the context of CPUs specifically, it represents significant slowdown relative to how things were in the past, and to a lesser extent, how they still are with other computer components. Used to be that a five or ten year old CPU would not be comparable to a modern one. Now, a modern CPU is faster, but they're not incomparable. There's still a lot of hoo-hah for a new processor release, but very little reason to upgrade, which is not how it used to be. CPUs just seem to have stagnated, compared to the past, and other components.
posted by Dysk at 7:16 AM on October 10, 2018 [1 favorite]


It's like if a 2018 car is 2% more efficient than the 2017 model, but each model from 2015 back represented a 25% improvement year on year. It'd make sense to ask what happened to the efficiency gains.
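To put rough numbers on that analogy (illustrative rates, not actual CPU figures):

def compound(rate, years):
    # Overall improvement after `years` of compounding yearly gains.
    return (1 + rate) ** years

print(f"25%/year for 4 years: {compound(0.25, 4):.2f}x overall")  # ~2.44x
print(f" 2%/year for 4 years: {compound(0.02, 4):.2f}x overall")  # ~1.08x

A few years of the old pace more than doubled what you had; a few years of the new pace is barely noticeable.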
posted by Dysk at 7:18 AM on October 10, 2018


What's bizarre to me about this is processors have basically stopped getting faster in general, at least compared to their previous rate of advance.

You can blame the size of a silicon atom for that. Shrinking the chip's features, which is what allowed for higher clock rates, has run hard up against that size limit. That was the old way Intel and everyone else made processors faster.
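Back of the envelope, using silicon's lattice constant of about 0.543 nm (and noting that marketing node names no longer map exactly onto physical feature sizes, though the scale is right):

lattice_nm = 0.543  # silicon lattice constant, ~0.543 nm per unit cell
for node_nm in (90, 45, 14, 10):
    cells = node_nm / lattice_nm
    print(f"{node_nm:>2} nm feature is only about {cells:.0f} lattice cells across")

A 14 nm feature is only ~26 lattice cells wide; there just isn't much atom left to shrink into.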
posted by jmauro at 7:29 AM on October 10, 2018 [6 favorites]


You can blame the size of a silicon atom for that. Shrinking the chip's features, which is what allowed for higher clock rates, has run hard up against that size limit.

Relevant Computerphile video.
posted by Dysk at 7:35 AM on October 10, 2018 [1 favorite]


between ARM, Apple's cores and AMD, Intel's more vulnerable than they've been in decades

NVidia's ahead of them on the sockets set to grow in the future too - AI and automotive (incremental self-driving applications). While we in the peanut gallery have been rightly mocking the marketing behind these things for years already, the business does exist and is growing.
posted by MillMan at 7:45 AM on October 10, 2018


I agree-- we should ask where the gains are, and it's absolutely vital we ask if the gains are real; it's the 'who cares' aspect that frustrates me.

We're asking for world records year after year, like literally, we're asking for the best humanity has ever been able to do (at consumer scale at least) and then pointing back in the calendar saying 'pfft-- you only broke the world record by point-one seconds, why even bother, you broke it by 15 seconds in 1980'.
posted by Static Vagabond at 7:55 AM on October 10, 2018


You'll see manufacturers who were waiting on Intel's new chips start to get a little antsy. It might be a big deal

If any manufacturer pretends to suddenly get cold feet from this, it's because they want to extract better pricing from Intel, not because they are suddenly caught unawares. Dell, HP, Lenovo, Fujitsu, etc. have been getting engineering samples since the silicon was A0 (or whatever the first semi-stable spin was). They already know everything they want to know about it, and this isn't their first rodeo.
posted by Dr. Twist at 7:56 AM on October 10, 2018 [4 favorites]


it's the 'who cares' aspect that frustrates me.

Well, that's consumers talking - why spend a small fortune on a few percent more performance? Why should I, as a consumer, buy this chip? Why should I care about it having been released? If it offers some significant improvement, then that's your answer. But that simply isn't there. This isn't about the scientific breakthroughs, or pushing the boundaries of human achievement - it's about whether there is much cause for consumers who already own computers to care that there's a new CPU out. And there really isn't, not in the way that there used to be, when the difference between the new chip and whatever was in your box spoke for itself.
posted by Dysk at 8:03 AM on October 10, 2018 [2 favorites]


If you are of the small minority of consumers who even have a small awareness of what a processor is and does, then you certainly have the ability to make the purchasing call yourself-- no-one is suggesting that everyone upgrade, well-- aside from Intel's very hopeful marketing department.

But the majority of users who might buy a computer twice a decade, they'll have five small incremental cycles built into their purchase price-- if they purchase the top-end because they just have the money to burn, or a particular reason, cool-- but even if they just buy the middle-of-the-road option, they still end up with the benefits of this incremental progress.

So, tldr, you might personally not care about /this/ particular round of improvements, but when you're next out shopping-- you'll likely be glad they happened.
posted by Static Vagabond at 8:20 AM on October 10, 2018 [2 favorites]


The nice thing about performance being what it is right now, is that the longevity of computer equipment has gone up substantially. Much to manufacturers' chagrin, probably.

Most companies aren't doing the 2 or 3 year PC refresh cycles that they were doing (at least in the tech world) a decade ago. At least not that I've seen—some companies will get people new toys as a sort of fringe benefit, but the days of buying everyone a new computer as soon as the last one is done depreciating on paper are over.

I'm using a 2014 MacBook right now, and as far as I'm concerned it's perfectly fine (hell, it's markedly superior to anything Apple has produced since, but that's nothing to do with the processor and everything to do with their idiotic port-removal fetish). We'll have to see when they try to magically transform it into garbage via forced obsolescence, and also when people finally get pissed off enough about that to pass laws against it. The Europeans are already on it, IIRC. The technology, as usual, is only half the battle, and maybe not the hard half.
posted by Kadin2048 at 8:29 AM on October 10, 2018 [4 favorites]


Why should I, as a consumer, buy this chip?

Because it's included in the replacement computer that your vendor's service rep assures you is your only realistic option, being as how your last computer is no longer economical to repair since humidity turned its moisture sensing dots red, even though the only thing causing the fault you brought it in for is a bent pin on the internal screen connector.

As Moore's Law winds down on actual performance, watch it wind up on planned obsolescence. Unsustainable economic growth isn't going to sustain itself, you know. As a consumer, it's your duty to the economy never to jump off the upgrade treadmill.
posted by flabdablet at 8:30 AM on October 10, 2018 [6 favorites]


Spectre and Meltdown mitigations in software both have a cost, and they affect Intel disproportionately due to its architectural choices. The cost varies by workload; I've seen estimates ranging from 1% to 800%. Some of Intel's new processors contain hardware mitigations, but others continue to rely on software. This is bad news when you're chasing incremental improvements, and seems like unusually fertile soil for benchmark fudging.
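On Linux you can at least see which mitigations your kernel is applying; the status files below have been exposed under sysfs since kernel 4.15. A minimal sketch (Linux-only):

from pathlib import Path

# Each file reports one vulnerability's mitigation status, e.g.
# "Mitigation: PTI" for meltdown on affected Intel parts.
vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
for entry in sorted(vuln_dir.iterdir()):
    print(f"{entry.name}: {entry.read_text().strip()}")

Worth checking before and after a kernel update if you're trying to attribute a performance change.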
posted by joeyh at 8:57 AM on October 10, 2018 [1 favorite]


If you are of the small minority of consumers who even have a small awareness of what a processor is and does, then you certainly have the ability to make the purchasing call yourself

Yes, which is exactly what the "who cares?" sentiment represents - there are vanishingly few consumers who are going to be moved by this kind of minor incremental progress in consumer-grade chips. Which is in contrast to a lot of previous new-generation chip launches. Yes, you benefit from many cycles of incremental progress when you do get a new PC, but you just buy whatever is current when you're in the market. You don't need to follow the releases and stuff like you did in the past. So in that sense, "who cares?" is pretty sensible. It's nice and all that these minor incremental improvements are happening, but it's not relevant news like CPU launches used to be, when you timed your upgrade cycles around them.
posted by Dysk at 9:23 AM on October 10, 2018 [1 favorite]


Isn't part of the reason that we don't need more powerful computers at home because so much processing has moved to The Cloud, where they will be interested in every little advance?
posted by clawsoon at 9:56 AM on October 10, 2018


But the majority of users who might buy a computer twice a decade...

So from around 1986 to 2010-ish or so, most people needed to upgrade their existing computer every two or three years--typically around the same time as a new DOS or Windows release--in order to use the latest versions of basic day-to-day software (e.g. word processors, email clients, web browsers, etc.). Games were even worse; one or two years in and modern games' requirements had passed your computer by. This thing where your computer continues to be useful after a year or two is new and is not the normal way of using Wintel PCs.

(I was one of those weirdos who managed to go for five years without buying a new computer, mostly because I used Linux and did mostly non-CPU-intensive stuff on it, so I'd typically get into one or two new games every time I bought a new computer and then lose interest in gaming for a few years. But the cheapass refurbished APU-based PC I bought in 2013 can still play every game I throw at it, admittedly with occasional tweaking.)

This thing where slow computers are still useful is a relatively new stage of Intel's existence and I suspect the company hasn't really absorbed that fact yet.
posted by suetanvil at 10:08 AM on October 10, 2018 [6 favorites]


Apple uses Intel chips. The plan is to move away from Intel in 2020.

And I'm not so sure that they faked the tests so much as they cheated to ridiculous degrees. They claim that 64GB of RAM is some type of gaming standard. Even lay people probably raise an eyebrow at that, especially after seeing "8GB RAM" on the box of their PS4. They used the "stock" cooler on the AMD because that's what it ships with? What? So they're either con men or ridiculously ignorant. Not sure which I would choose to be.
posted by Brocktoon at 1:19 PM on October 10, 2018 [1 favorite]


AdoredTV has a video out on this too, which goes into additional detail on the issues.
posted by Leud at 2:11 PM on October 10, 2018


Time for Cyrix to make a surprise comeback, just like Valiant Comics did!
posted by turbid dahlia at 5:33 PM on October 10, 2018


A couple of weeks ago I was haunting the halls of Wal-Mart, shaking my chains and saying "Boo!" to passers-by as is my wont, but I was also there to look over the laptop models to gather information on what they had for a friend. I noticed that there were three tiers of memory, marked on the little info card by each with a graph:
                                            4 GIGABYTES: ========
                                            8 GIGABYTES: ================
4 GIGABYTES w/ 16 GIGABYTES INTEL OPTANE[TM] TECHNOLOGY: ========================
I read "Optane" in the same manner as "Advanced Memory Substitute," (the fact that it was always cheaper was a clue), so I looked it up and lo, it's not memory at all, but a flash caching system for hard drive access. Those fuckers.
posted by JHarris at 7:50 PM on October 10, 2018 [1 favorite]


Re Optane, I haven't actually used it, but it's not inherently a scam. A lot of computer operations are slow because of disk IO, so having a faster cache like that might actually speed things up for some workloads.

And fundamentally, OSes are equipped to use virtual memory and swap out to disk; the only problem is speed, so the Optane part could be used to speed that up.
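As a toy model of that kind of layer (hypothetical names, nothing Optane-specific - just an LRU read cache in front of a slow backing store):

from collections import OrderedDict

class ReadCache:
    # Tiny LRU read cache in front of a slow block device (toy model).
    def __init__(self, backing_read, capacity_blocks):
        self.backing_read = backing_read  # function: block_id -> data
        self.capacity = capacity_blocks
        self.blocks = OrderedDict()       # block_id -> data, in LRU order

    def read(self, block_id):
        if block_id in self.blocks:
            self.blocks.move_to_end(block_id)   # hit: served at cache speed
            return self.blocks[block_id]
        data = self.backing_read(block_id)      # miss: pay the slow-disk cost
        self.blocks[block_id] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)     # evict least recently used
        return data

Whether it helps depends entirely on your hit rate, which is the usual caveat with caches.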
posted by vogon_poet at 8:40 PM on October 10, 2018 [2 favorites]


between ARM, Apple's cores and AMD, Intel's more vulnerable than they've been in decades

NVidia's ahead of them on the sockets set to grow in the future too


fun fact: nvidia's CEO is AMD's CEO's uncle :)
posted by kliuless at 10:28 PM on October 10, 2018 [1 favorite]



4 GIGABYTES: ========
8 GIGABYTES: ================
4 GIGABYTES w/ 16 GIGABYTES INTEL OPTANE[TM] TECHNOLOGY: ========================


Optane is really cool; it's basically "slow-RAM speed" persistent storage. When your storage has to be plugged into the memory bus to get its full bandwidth, you know it's fast. But that benchmark is only valid if the memory is oversubscribed (i.e. it's paging to disk); otherwise the 8GB scenario would win.

New CPUs are getting less and less exciting; single-thread performance isn't really moving forward significantly anymore. And although multiple cores (which are exciting) contribute greatly to a system that doesn't feel sluggish when you're running an OS and multiple applications (and browser tabs) at the same time, that's a hard thing to put a number on in your marketing materials (i.e. what Intel's marketing people want). Games are the ubiquitous example of an application with an understandable performance metric (FPS) that stresses a multicore system, but even that is limited, since tuning is balanced towards less high-end systems on PC. Also, FPS as a metric for CPU performance is crap: you want to know total time spent working, and FPS usually includes a lot of idle time on multicore systems.
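For example (made-up frame times, but the shape is typical - a mostly smooth run with occasional stutters):

# 95 smooth frames and 5 big stutters, in milliseconds.
frames_ms = [10] * 95 + [100] * 5

avg_fps = 1000 * len(frames_ms) / sum(frames_ms)
p99_ms = sorted(frames_ms)[int(len(frames_ms) * 0.99) - 1]
print(f"average FPS: {avg_fps:.0f}")               # ~69 fps - looks fine
print(f"99th percentile frame time: {p99_ms} ms")  # 100 ms hitches - feels awful

The average looks respectable while the percentile number shows exactly the stutter you'd feel, which is why frame-time percentiles tell you more than FPS.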

And yeah.... Intel marketing are a bunch of clowns.
posted by WaterAndPixels at 10:05 AM on October 11, 2018 [1 favorite]


I have an 8 year old, 12-core desktop. Very few apps scale to that many cores, unless you're doing something like video compression. So you get this effect where the computer is not especially fast, particularly for games, but what you can do is play several games at once without noticeable slowdown. I used to play four instances of EVE Online at once, and it was a pretty good machine for that.
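That's basically Amdahl's law: once part of the work is serial, extra cores stop paying off fast. A quick sketch (80% parallel is just an illustrative figure):

# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallelizable fraction and n the core count.
def amdahl(p, n):
    return 1 / ((1 - p) + p / n)

for cores in (2, 4, 12):
    print(f"{cores:>2} cores at 80% parallel: {amdahl(0.8, cores):.2f}x")
# Even 12 cores only buy ~3.75x when 20% of the work is serial.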

(p.s. quad-boxing eve is not especially healthy)
posted by ryanrs at 12:03 AM on October 12, 2018


When I bought the above-mentioned desktop, I was expecting to hang onto it for a while. But I'm still surprised how well it has held up these 8 years. I upgraded to an SSD a few years ago. And it's due for a GPU upgrade. But there's zero urgency to upgrade the CPU. I could probably double the per-core performance with a new CPU, but meh. It hardly seems worth it.

tbh I quite like this your-computer-lasts-ten-years future
posted by ryanrs at 12:12 AM on October 12, 2018 [1 favorite]


I wonder if you can still run multiple WoW clients at once? Damn that is an old memory, having six Shamans hit you with Lavablast at once. Frustrating but fun to watch!
posted by Brocktoon at 6:53 PM on October 16, 2018


Yeah, I'd feel worse about the flattening of the performance curve if it were obvious in my day-to-day usage, and for the most part it isn't. I'm still struggling to justify upgrading my late 2012 iMac - everything on the market that isn't a top-of-the-line BYO or Pro model has a CPU that's arguably worse than my 2012-vintage i7. I'm sure I'd notice it in all the other little ways - retina screen and faster interfaces and whatnot, but is any of that worth another $1500? It amortizes well, I guess.

What definitely _is_ improving is performance per watt and related heat issues, which is still a win. What good is it having a hugely powerful CPU if you can't run it at full tilt for more than 5 minutes without it thermalling out or needing a set of howling fans and cold intake air?
posted by Kyol at 7:15 AM on October 18, 2018



