Using authenticity, fairness and transparency to beat the benchmarks
Customers don't care about benchmarks. Get used to it.

OK, let's qualify that. Customers don't care about your benchmarks. In today's corporate world, where consumers are bombarded with hundreds of advertisements a day, Brand X saying they're better than Brand Y doesn't mean anything any more. There's no trust.

So how do you engage with that kind of tit-for-tat, lowest-denominator advertising? And if savvy customers don't care about your benchmarks, what do they care about?

The answer is - disengage. Then, reengage on your own terms and serve your customer, not your competitor.

...
At worst, the people buying the technology will better understand the problems involved in comparing it. At best, you'll come up with a metric that makes sense to a lot of people and shifts the goalposts away from stupid SPEC numbers to real-world benchmark suites, designed by server-savvy people, to help server-savvy people.
Here's the issue: companies love their SPEC numbers. SPEC numbers are easy to put on a graph. They're also easy to manipulate, and anybody can make a SPEC number say anything. Companies that use SPEC continue to lend credibility to something which, in a world of authenticity and transparency, deserves none. To overcome an opponent playing on SPEC, disengage.
Then reengage - and do so with insane gusto. Savvy server buyers are their own masters. They believe that they understand, better than anyone, what their own needs are. They don't trust company-made benchmarks, although SPEC numbers are attention-grabbing. Do something that grabs their attention more transparently and more fairly, and leads to a more authentic result.

In this case, transparency means a blog. An Intel Server Performance Analysis blog. Give an employee or two a really simple brief - to take AMD and Intel chips, benchmark them, and publish the results. And the methodology. And photos of the rigs. And videos of the benchmarks in progress. And thoughts about problems with benchmarks. And notes about hardware quirks. And details of what they had for lunch.

Solicit suggestions from the server-savvy community in the Comments section of the blog, and implement them. Run a high-profile campaign asking for input from these guys. Don't aim for headline-grabbing single numbers from single benchmarks. Get your team to design workloads that accurately reflect real-world performance situations, and then get them to fairly benchmark the chips in these situations. If a commenter on the blog points out a discrepancy, or something unfair in the test setup, go back and re-do the test. Involve the savvy server community in the testing, and then you can give them back results that they are invested in.

Involve your buyers, your savvy server community, in your performance metrics and they will buy into your brand.

Disengage from inauthentic, attention-grabbing headlines and reengage with authentic ones. "Intel wins 20/30 tests of server performance designed by, and transparently conducted in association with, savvy server buyers in a community-wide participation exercise in authentic benchmarking" isn't quite as snappy as "15% in SPEC", but that's just down to the turn of phrase. Authentic beats inauthentic.
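The mechanics of that kind of blog post are simple enough to sketch in code. Below is a minimal, hypothetical Python harness - not anything from the article, and the names run_benchmark and sort_heavy_workload are invented for illustration - showing the spirit of the approach: publish every raw timing and the rig details alongside the result, so commenters can audit the methodology and demand a re-run, instead of pushing one headline number.

```python
import json
import platform
import statistics
import time

def run_benchmark(name, workload, runs=5):
    """Time a workload several times and record everything needed to
    reproduce the result, not just a single headline number."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        timings.append(time.perf_counter() - start)
    return {
        "benchmark": name,
        "runs": runs,
        # Raw per-run timings, so readers can check variance themselves.
        "timings_seconds": [round(t, 4) for t in timings],
        "median_seconds": round(statistics.median(timings), 4),
        # Rig details published with the numbers, so an unfair setup
        # is visible and challengeable in the Comments.
        "machine": platform.processor() or platform.machine(),
        "python": platform.python_version(),
    }

def sort_heavy_workload():
    """Stand-in for a community-designed, real-world workload."""
    data = list(range(200_000, 0, -1))
    data.sort()

if __name__ == "__main__":
    result = run_benchmark("sort-heavy", sort_heavy_workload)
    # Dump the full record for the blog post: methodology and raw
    # timings, not one attention-grabbing figure.
    print(json.dumps(result, indent=2))
```

If a commenter flags a problem with the setup, the fix is exactly what the article prescribes: change the workload or the rig, re-run the harness, and publish the new record next to the old one.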
Problem 2: Communicate the performance of desktop products to consumers

The fundamental problem with desktop processing, from a competitive point of view, ... transparent, are not fair and authentic, and thus will be found lacking by the customer.

As for the problem of undefinable performance - don't push them raw numbers, push them dreams, ...