Psychology of Instrumentation

Cary Millsap reread Knuth and found a fascinating quote:

“I’ve become convinced that all compilers written from now on should be designed to provide all programmers with feedback indicating what parts of their programs are costing the most; indeed, this feedback should be supplied automatically unless it has been specifically turned off.”

Knuth wrote this in 1974. It’s been 35 years, and compilers (and runtime environments such as the JVM) still do not provide this feedback automatically. I don’t believe the need for it has diminished.
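To make the point concrete: today, performance feedback is strictly opt-in. Here is a minimal sketch (standard-library Python, with hypothetical `slow_part`/`fast_part` functions standing in for a real program) showing the extra ceremony you must go through to learn where your program spends its time — nothing is reported unless you explicitly ask:

```python
# Profiling is opt-in: you must wrap your own code in a profiler and
# explicitly print the results. Standard library only (cProfile, pstats).
import cProfile
import io
import pstats

def slow_part():
    # Deliberately expensive loop, standing in for a program hotspot.
    total = 0
    for i in range(100_000):
        total += i * i
    return total

def fast_part():
    return sum(range(100))

def program():
    slow_part()
    fast_part()

profiler = cProfile.Profile()
profiler.enable()          # feedback is off unless we explicitly turn it on
program()
profiler.disable()

stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream)
stats.sort_stats("cumulative").print_stats(10)  # costliest calls first
print(stream.getvalue())
```

The output lists each function with its call count and cumulative time, which is exactly the "what parts are costing the most" feedback Knuth wanted compilers to supply by default.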

Since compilers do not provide performance feedback, there is a very large market for instrumentation vendors. They sell you a bit of code that you can integrate into your programs, or install on your J2EE application servers or on your database, and it will tell you exactly where your program is spending its time.

I’ve worked for an instrumentation vendor, and I’ve also talked to many of them as part of my production DBA role. Whenever an instrumentation vendor talks to a prospective customer, the first question is always: “What is the overhead?”. Not “How can it help me?”, “How much can I expect to improve my performance?” or “Is it easy to use?”. The answer to the first question, by the way, is always the same: 5-10%.
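For what it’s worth, you don’t have to take a vendor’s overhead figure on faith. A rough sketch of how you might measure it yourself, using only the Python standard library (the `workload` function is a made-up stand-in; real overhead varies a lot by workload, profiler, and machine):

```python
# Time the same workload with and without instrumentation to estimate
# the profiler's overhead empirically. Standard library only.
import cProfile
import time

def workload():
    # Call-heavy work; the profiler's per-call cost shows up clearly here.
    return sum(i * i for i in range(200_000))

def timed(fn):
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

plain = timed(workload)

profiler = cProfile.Profile()
profiler.enable()
instrumented = timed(workload)
profiler.disable()

overhead_pct = (instrumented - plain) / plain * 100
print(f"plain: {plain:.4f}s  instrumented: {instrumented:.4f}s  "
      f"overhead: {overhead_pct:.0f}%")
```

On call-heavy code like this, a tracing profiler’s overhead can be far above the quoted 5-10%; sampling profilers are typically much cheaper. Either way, measuring beats guessing.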

The reason that this is always the first question is that people are generally more concerned about losing something they currently have than about gaining something new. This tendency is called the “endowment effect”, and it is quite powerful. Salespeople will tell you that supply is limited when you hesitate about buying something, because they know that fear of loss (even if it is just losing the option to buy) is a more powerful motivator than any attractive feature of the product.

In our case, developers and DBAs are more worried about losing something they already have (the performance benefit of running non-instrumented code) than they are interested in the benefits of knowing where their product spends its time.

Suppose that all compilers and runtime environments had instrumentation enabled by default, with the option to turn it off. The same endowment effect would now work in our favor! No one would turn it off, because they would be more concerned about losing information they currently have (and have become used to having) than they would care about the performance benefit of turning the instrumentation off.

The pervasive lack of instrumentation in software products is more a result of psychological bias than of real technical concerns. Software vendors can work around this bias by building instrumentation in as a default in the tools involved in the development and deployment process. Just as Knuth recommended 35 years ago.

In the meantime, I hope that by being aware of this irrational psychological bias, developers, DBAs and production application owners can overcome it and make sure they have the data they need to monitor and improve their applications.


5 Comments on “Psychology of Instrumentation”

  1. Doug Burns says:

    Fantastic post, Chen. I’m seeing this in effect at my current site, where I haven’t heard any question about ASH, AWR or ADDM overhead for a long time, now that people can use the results ;-)

  2. Baron says:

    I’ve written about the need for more instrumentation in MySQL many many times, and Percona has actually built it. And what’s the response? “I’m afraid of the overhead.” Never mind that you can turn it on and off at will and it’s not as significant as it’s rumored to be (benchmarks?). See an alternative to the MySQL Query Analyzer for example.

  3. Log Buffer says:

    Chen Shapira responded with a Knuth-inspired item on the psychology of instrumentation, looking at how and why compilers, runtimes, and DBMSs continue to short-shrift their users on instrumentation.

    Log Buffer #134

  4. Chen,

    I believe the situation is probably a bit more complex – but maybe not as bad as you indicate. :-)

    The question after the overhead does actually make sense IMHO because – depending on the nature of the analysis you are trying to make – profiling tools can actually spoil the complete analysis (a bit like Heisenberg’s uncertainty principle).

    OTOH modern JVMs have a lot of instrumentation built in, plus they come with APIs where tool vendors can easily hook in to get information out.

    Nevertheless, the point about losing something vs. gaining something is very interesting indeed! Ooops, now I lost another five minutes of work time – too bad. :-)

    Cheers

    robert


