
We at Raima are often asked to provide database benchmark results that prospective customers can use to evaluate the performance of our database systems. We, of course, have several ready-made benchmarks that we can provide. However, like any other database vendor, the benchmarks we provide highlight our strengths. After all, it's natural to want to show off what you do best. With this in mind, we encourage people evaluating database solutions to use the following benchmark guidelines.

The Database Benchmark Rule of Thumb

The only truly meaningful benchmarks are those that are based on your own database application requirements!

Why?

  • There are no "standard" database benchmarks that mimic and stress a DBMS in the same manner that your application will. Your application is unique: the reason you are developing it is to fill a need that no existing solution meets, and you are looking to differentiate your product from competitors. It follows that your database requirements will be unique as well, so why rely on so-called standard benchmarks that measure the performance of features and functionality that are not even remotely relevant to what your application will do?
  • Most reputable DBMSs perform well under a group of common use cases and less well under other, more complicated ones. We have rarely found an application whose requirements are limited to just the most common use cases. Most applications will stress the DBMS in multiple different scenarios and under multiple different loads and conditions. It would be nice if a one-size-fits-all "solution" existed, but this isn't reality.
  • Benchmarks provided by a DBMS vendor usually focus on the usage scenarios that demonstrate that system's strengths. This is why you should never rely solely on benchmark results provided by any database vendor when deciding which DBMS is best for your application.

Suggested Guidelines

  • Define the use cases that will be core to your specific application requirements. The more specific you can be, the better the chance that you will truly represent your application's performance requirements. If you cannot articulate your application's requirements, hold off on any benchmarking until you can. The risk of making an incorrect database selection based on irrelevant benchmarks is too high.
  • Plan on making the investment to create a test database that closely represents the one your resulting application will need. This is not cheap, but it is truly a case of "pay me now or pay me later." Designing your database while writing your application can lead to many iterations of the design and rework of the application. Spending the time now to create a representative test database avoids costly project delays later.
  • Enlist the assistance of the DBMS vendors once you have clearly defined your database requirements and identified the relevant use cases. They can be a valuable resource in developing your own unique database benchmarks.
    • This will allow you to evaluate not only the system itself but the kind of support you can expect to receive from the DBMS vendor.
    • It also allows the vendor to show you ways to get more out of their products which you would not necessarily have discovered by doing a simple “out of the box” test.

If you follow these guidelines when evaluating database vendors, your projects will benefit.