Friday, May 2, 2014

MySQL Benchmark with mysqlslap

Benchmarking different MySQL queries against your database is a wise thing to do; that should go without saying. While we optimize queries the best we can using EXPLAIN (and EXPLAIN EXTENDED), taking the time to benchmark them should prove helpful as well.

This is a simple example of running mysqlslap.

For this example I loaded the WORLD database from MySQL. ( http://dev.mysql.com/doc/index-other.html )
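
If you have not loaded it before, pulling the dump in is quick. A rough sketch, assuming the download unpacks to world_innodb.sql (adjust the file name to match what you actually grabbed):

# mysql -e "CREATE DATABASE IF NOT EXISTS world"
# mysql world < world_innodb.sql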

I created a query that joins all three tables and put it into /tmp/test.sql. The explain plan is below.

root@localhost [world]> EXPLAIN EXTENDED SELECT C.Name as City, Y.Name as Country, L.Language,Y.Population FROM City C INNER JOIN Country Y ON C.CountryCode = Y.Code INNER JOIN CountryLanguage L ON C.CountryCode = L.CountryCode WHERE C.Name LIKE 'D%' AND Y.Continent='Europe' \G
*************************** 1. row ***************************
           id: 1
  select_type: SIMPLE
        table: C
         type: range
possible_keys: CountryCode,name_key
          key: name_key
      key_len: 5
          ref: NULL
         rows: 127
     filtered: 100.00
        Extra: Using where
*************************** 2. row ***************************
           id: 1
  select_type: SIMPLE
        table: Y
         type: eq_ref
possible_keys: PRIMARY
          key: PRIMARY
      key_len: 3
          ref: world.C.CountryCode
         rows: 1
     filtered: 100.00
        Extra: Using where
*************************** 3. row ***************************
           id: 1
  select_type: SIMPLE
        table: L
         type: ref
possible_keys: PRIMARY,CountryCode
          key: CountryCode
      key_len: 3
          ref: world.C.CountryCode
         rows: 2
     filtered: 100.00
        Extra: Using index
3 rows in set, 1 warning (0.00 sec)

root@localhost [world]> show warnings \G
*************************** 1. row ***************************
  Level: Note
   Code: 1003
Message: /* select#1 */ select `world`.`C`.`Name` AS `City`,`world`.`Y`.`Name` AS `Country`,`world`.`L`.`Language` AS `Language`,`world`.`Y`.`Population` AS `Population` from `world`.`City` `C` join `world`.`Country` `Y` join `world`.`CountryLanguage` `L` where ((`world`.`Y`.`Code` = `world`.`C`.`CountryCode`) and (`world`.`L`.`CountryCode` = `world`.`C`.`CountryCode`) and (`world`.`Y`.`Continent` = 'Europe') and (`world`.`C`.`Name` like 'D%'))


The mysqlslap tool has been around since MySQL 5.1.4. Now that I have my query, I can benchmark it against the database with the following command:

mysqlslap --concurrency=150 --iterations=50 --query=/tmp/test.sql --create-schema=world

One note: the query has to be very clean, as the tool errors out easily. For example, this multi-line version of the query threw an error:

SELECT C.Name as City, Y.Name as Country, L.Language,Y.Population
FROM City C
INNER JOIN Country Y ON C.CountryCode = Y.Code
INNER JOIN CountryLanguage L ON C.CountryCode = L.CountryCode
WHERE C.Name LIKE 'D%' AND Y.Continent='Europe'

While this one-line version worked just fine:

SELECT C.Name as City, Y.Name as Country, L.Language,Y.Population FROM City C INNER JOIN Country Y ON C.CountryCode = Y.Code INNER JOIN CountryLanguage L ON C.CountryCode = L.CountryCode WHERE C.Name LIKE 'D%' AND Y.Continent='Europe'
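
For what it is worth, one way to drop the working one-line version into the file is a plain echo. A sketch; the only real point is that the whole statement stays on a single line:

# echo "SELECT C.Name as City, Y.Name as Country, L.Language,Y.Population FROM City C INNER JOIN Country Y ON C.CountryCode = Y.Code INNER JOIN CountryLanguage L ON C.CountryCode = L.CountryCode WHERE C.Name LIKE 'D%' AND Y.Continent='Europe'" > /tmp/test.sql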


The tool will output the benchmark results for you:


Benchmark
    Average number of seconds to run all queries: 0.104 seconds
    Minimum number of seconds to run all queries: 0.096 seconds
    Maximum number of seconds to run all queries: 0.141 seconds
    Number of clients running queries: 150
    Average number of queries per client: 1


mysqlslap --help will give you numerous options to test your queries with.
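
A few of those options are worth calling out: --only-print shows the statements mysqlslap would run without touching the server, --number-of-queries caps how many statements get executed, and --csv writes the results out in comma-separated form. A sketch along the lines of the earlier run (the output file name is just an example):

# mysqlslap --concurrency=150 --iterations=50 --query=/tmp/test.sql --create-schema=world --number-of-queries=5000 --csv=/tmp/slap_results.csv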

You can also have mysqlslap do everything automatically:

# mysqlslap --auto-generate-sql
Benchmark
    Average number of seconds to run all queries: 0.243 seconds
    Minimum number of seconds to run all queries: 0.243 seconds
    Maximum number of seconds to run all queries: 0.243 seconds
    Number of clients running queries: 1
    Average number of queries per client: 0
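
The auto-generated test can be shaped as well: --auto-generate-sql-load-type picks the mix (read, write, key, update, or mixed), and --number-int-cols / --number-char-cols size the throwaway table it builds. A sketch:

# mysqlslap --auto-generate-sql --auto-generate-sql-load-type=mixed --number-int-cols=4 --number-char-cols=4 --concurrency=50 --iterations=10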


You can test inserts as well. For example, I created this table:

CREATE TABLE `foobar_table` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `time_recorded` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB ;


I then tested it with the following:

 # mysqlslap --concurrency=1150 --iterations=530  --query="use test;insert into foobar_table (id) values (null)" --delimiter=";"
mysqlslap: Could not create thread


OK, errors like that are not very helpful... But hopefully you can spot the problem: asking for 1150 concurrent clients is more threads than this system would let mysqlslap create.

Dialing the numbers back down, this time reading the statement from /tmp/test1.sql, ran without a problem:

# mysqlslap --concurrency=150 --iterations=30  --query=/tmp/test1.sql --create-schema=test --verbose
Benchmark
    Average number of seconds to run all queries: 0.260 seconds
    Minimum number of seconds to run all queries: 0.192 seconds
    Maximum number of seconds to run all queries: 0.476 seconds
    Number of clients running queries: 150
    Average number of queries per client: 1


For example, the following, passing the insert directly on the command line and scaling the run up, worked much better than the failed attempt:

# mysqlslap --concurrency=200 --iterations=1000  --query=" insert into foobar_table (id) values (null)"  --verbose --create-schema=test
Benchmark
    Average number of seconds to run all queries: 0.282 seconds
    Minimum number of seconds to run all queries: 0.217 seconds
    Maximum number of seconds to run all queries: 0.726 seconds
    Number of clients running queries: 200
    Average number of queries per client: 1


Just to prove that we are doing real inserts (200 clients across 1,000 iterations works out to roughly 200,000 inserts from that last run alone, which lines up with the count below once the earlier runs are included):

 root@localhost [test]> select count(id) from foobar_table;
+-----------+
| count(id) |
+-----------+
|    206091 |
+-----------+
1 row in set (0.13 sec)

I should also say that this is just a test database that I use for blog posts, so do not measure your production databases against these results.

I guess the point, after all of this: find a troublesome query, optimize it the best you can, and benchmark it. It is better to know your limits than to just guess at them.



@AnotherMySQLDBA