# io.js core benchmark tests

This folder contains benchmark tests to measure the performance of certain io.js APIs.

## Prerequisites

Most of the `http` benchmarks require `wrk` to be compiled beforehand. It can be built by running:

```
make wrk
```

## How to run tests

There are three ways to run benchmark tests:

1. Run all tests of a given type, for example, `buffers`:

```
iojs benchmark/common.js buffers
```

The above command will find all scripts under the `buffers` directory and require each of them as a module. When a test script is required, it creates an instance of `Benchmark` (a class defined in `common.js`). On the next tick, the `Benchmark` constructor iterates through the configuration object's property values and runs the test function with each combination of arguments in a spawned process. For example, `buffers/buffer-read.js` has the following configuration:

```js
var bench = common.createBenchmark(main, {
  noAssert: [false, true],
  buffer: ['fast', 'slow'],
  type: ['UInt8', 'UInt16LE', 'UInt16BE',
         'UInt32LE', 'UInt32BE',
         'Int8', 'Int16LE', 'Int16BE',
         'Int32LE', 'Int32BE',
         'FloatLE', 'FloatBE',
         'DoubleLE', 'DoubleBE'],
  millions: [1]
});
```

The runner takes one item from each property's array of values to build the arguments for each run of the `main` function; a sketch of this expansion appears at the end of this section. The `main` function will receive the `conf` object as follows:

* first run:

  ```js
  {
    noAssert: false,
    buffer: 'fast',
    type: 'UInt8',
    millions: 1
  }
  ```

* second run:

  ```js
  {
    noAssert: false,
    buffer: 'fast',
    type: 'UInt16LE',
    millions: 1
  }
  ```

...

In this case, the `main` function will run 2 * 2 * 14 * 1 = 56 times. The console output looks like the following:

```
buffers//buffer-read.js
buffers/buffer-read.js noAssert=false buffer=fast type=UInt8 millions=1: 271.83
buffers/buffer-read.js noAssert=false buffer=fast type=UInt16LE millions=1: 239.43
buffers/buffer-read.js noAssert=false buffer=fast type=UInt16BE millions=1: 244.57
...
```

2. Run an individual test, for example, `buffer-read.js`:

```
iojs benchmark/buffers/buffer-read.js
```

The output:

```
buffers/buffer-read.js noAssert=false buffer=fast type=UInt8 millions=1: 246.79
buffers/buffer-read.js noAssert=false buffer=fast type=UInt16LE millions=1: 240.11
buffers/buffer-read.js noAssert=false buffer=fast type=UInt16BE millions=1: 245.91
...
```

3. Run tests with options

This example will run only the first `type` of the `url` test, with one iteration. (Note: benchmarks require many iterations to be statistically accurate.)

```
iojs benchmark/url/url-parse.js type=one n=1
```

Output:

```
url/url-parse.js type=one n=1: 1663.74402
```
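
For illustration, the following is a minimal sketch, not the actual `common.js` implementation, of how a configuration object like the one in `buffers/buffer-read.js` can be expanded into one `conf` object per run; the `expandConfig` helper is a made-up name for this example:

```js
// Hypothetical helper (a sketch only, not common.js itself): build the
// cartesian product of all configuration property values.
function expandConfig(config) {
  var combinations = [{}];
  Object.keys(config).forEach(function(key) {
    var next = [];
    combinations.forEach(function(partial) {
      config[key].forEach(function(value) {
        var conf = {};
        Object.keys(partial).forEach(function(k) { conf[k] = partial[k]; });
        conf[key] = value;
        next.push(conf);
      });
    });
    combinations = next;
  });
  return combinations;
}

// For the buffer-read.js configuration this yields
// 2 (noAssert) * 2 (buffer) * 14 (type) * 1 (millions) = 56 conf objects;
// the runner executes main() once for each of them in a spawned process.
```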

## How to write a benchmark test

The benchmark tests are grouped by type. Each type corresponds to a subdirectory, such as `arrays`, `buffers`, or `fs`.

Let's add a benchmark test for the `Buffer.slice` function. First, create a file `buffers/buffer-slice.js`.

The code snippet:

```js
var common = require('../common.js'); // Load the test runner

var SlowBuffer = require('buffer').SlowBuffer;

// Buffers to slice: a regular Buffer and a SlowBuffer
var buf = new Buffer(1024);
var slowBuf = new SlowBuffer(1024);

// Create a benchmark test for function `main` and the configuration variants
var bench = common.createBenchmark(main, {
  type: ['fast', 'slow'], // Two types of buffer
  n: [512] // Number of times (each unit is 1024) to call the slice API
});

function main(conf) {
  // Read the parameters from the configuration
  var n = +conf.n;
  var b = conf.type === 'fast' ? buf : slowBuf;
  bench.start(); // Start benchmarking
  for (var i = 0; i < n * 1024; i++) {
    // Add your test here
    b.slice(10, 256);
  }
  bench.end(n); // End benchmarking
}
```
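
With `buffers/buffer-slice.js` in place, it can be run on its own, or it will be picked up automatically when the whole `buffers` group is run:

```
iojs benchmark/buffers/buffer-slice.js
iojs benchmark/common.js buffers
```

The score printed for each parameter combination is, roughly, the value passed to `bench.end()` (here `n`) divided by the wall-clock time elapsed since `bench.start()`, so higher numbers mean better throughput.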