Node-redis: Pipelining

Created on 13 Jan 2014  ·  9 Comments  ·  Source: NodeRedis/node-redis

Hi Guys,

How do I enable pipelining in node_redis programmatically?

I saw the documentation says it happens automatically in most cases, and I want to understand whether there is a way to force it from application code programmatically in node_redis.

[ec2-user@devops ~]$ redis-benchmark -n 100000 -t set,get -P 16 -q -h 10.0.1.10
SET: 199600.80 requests per second
GET: 193050.19 requests per second

[ec2-user@devops ~]$ redis-benchmark -n 100000 -t set,get -q -h 10.0.1.12
SET: 14098.41 requests per second
GET: 13743.82 requests per second

That's more than a 10x difference on the Redis node.

thanks,
Dmitry

Labels: Feature Request · fixed / done


All 9 comments

Is it done through client.multi & multi.exec commands?

Hi @saritasa -- the pipelining is automatic. It is always happening inside the client. You do not need to use multi or exec to enable it.

Hi Bryce,

  1. I'm confused, what's the purpose of "multi" functions inside node_redis then?
  2. What is the algorithm used to automatically pipeline the commands?

thanks,
Dmitry

Multi operations are from the Redis protocol spec; they have a transactional purpose outside of pipelining: http://redis.io/commands/multi
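For illustration, a minimal MULTI/EXEC sketch in the node_redis callback API (the key name is made up):

var redis = require('redis');
var client = redis.createClient();

// The queued commands are sent together and executed atomically when .exec()
// is called; the replies come back as an array, one entry per queued command.
client.multi()
    .incr('page:views')
    .expire('page:views', 60)
    .exec(function (err, replies) {
        if (err) throw err;
        console.log(replies); // e.g. [ 1, 1 ]
    });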

In terms of how pipelining works (http://redis.io/topics/pipelining), it is just a matter of the client writing commands to Redis as fast as you send them, without waiting for a response before writing the next one. The client keeps track of the commands it has sent and then realigns the replies with those commands once Redis responds.
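As a rough sketch of what that looks like from application code (made-up keys; it is the client, not your code, that decides how many commands end up in a single socket write):

var redis = require('redis');
var client = redis.createClient();

// These calls all happen in the same tick, so the client can write the
// commands to the socket back to back without waiting for any reply.
for (var i = 0; i < 1000; i++) {
    client.set('key:' + i, 'value:' + i);
}

// Replies are matched back to callbacks in the order the commands were sent.
client.get('key:999', function (err, val) {
    console.log(val); // 'value:999'
});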

Thanks Bryce, I understand how http://redis.io/topics/pipelining works. In a node.js application, how should the code be structured so that the requests are pipelined?

So in the example below, are hgetall and expire pipelined or not? One of the calls is inside a callback.


function getByID(table, id, next) {
    if (!success) return setTimeout(function () {
        getByID(table, id, next);
    }, 2000);

    var end = utils.getEndTime(),
        base = getBase(),
        client = getClient(PK2str(table, id)),
        key = getHashID(table, id);

    base.sismember(table, key, function (err, val) {
        if (err || !val) return next();

        client.hgetall(key, function (err, val) {
            if (err || !val) return next(new Error('Expired'), id);

            val = arrays_parse(table, val);
            next(null, val);
            client.expire(key, cfg.ttl.shards);

            if (!exports.silent) {
                profiler.log('cache', {
                    'table': table,
                    'id': key,
                    'method': 'getByID',
                    'data': val,
                    'time': end()
                });
            }
        });
    });
}

All commands are sent "pipelined", but these would never be sent in the same pipeline write frame, for one or two reasons:

  1. The callback for the sismember is what invokes the hgetall call -- therefore you are implicitly preventing these commands from being pipelined together for any given sismember call. The callback is not invoked until Redis replies.
  2. If base and client are different Redis clients, they could _NEVER_ share the same pipeline.

Commands need to be executed from the same context* to see any benefit from pipelining, e.g. similar to the PING example in the Redis pipelining docs:

// These commands will all be pipelined together:
client.ping()
client.ping()
client.ping()
client.ping(function () {
  // This would *NEVER* be in the same pipeline frame as the other four because it requires a reply to be received first
  client.ping()
})

* Or from separate contexts, if commands are sent faster than Redis is replying, but never from reply-dependent contexts.
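As a concrete sketch for the snippet above: expire only needs key, which is already known before the hgetall reply arrives, so the two commands can be issued from the same context and share a pipeline write. (This makes the EXPIRE unconditional, which is a harmless no-op if the key is missing.)

// Inside the sismember callback: issue HGETALL and EXPIRE in the same tick so
// the client can write both commands to the socket together.
client.hgetall(key, function (err, val) {
    if (err || !val) return next(new Error('Expired'), id);
    next(null, arrays_parse(table, val));
});
client.expire(key, cfg.ttl.shards);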

Got it now, thank you! This is a very valuable comment!

I think node_redis "automatic pipelining" is not the pipelining referred to by the official Redis docs.

Quoting http://redis.io/topics/pipelining:

This time is called RTT (Round Trip Time). It is very easy to see how this can affect the performances when a client needs to perform many requests in a row (for instance adding many elements to the same list, or populating a database with many keys). For instance if the RTT time is 250 milliseconds (in the case of a very slow link over the Internet), even if the server is able to process 100k requests per second, we'll be able to process at max four requests per second. If the interface used is a loopback interface, the RTT is much shorter (for instance my host reports 0,044 milliseconds pinging 127.0.0.1), but it is still a lot if you need to perform many writes in a row.

What the official Redis docs mean by pipelining is combining multiple commands and sending them in one write to counter the latency due to RTT, which is not addressed by node_redis. And "automatic pipelining" in node_redis is actually sending as many commands as possible at the same time by using Node's async programming model.

Here is a good article discussing this issue: http://informatikr.com/2012/redis-pipelining.html

@Vizwind @saritasa both .multi and .batch use pipelining in the sense discussed above from version 2.2 on.
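For reference, a minimal .batch sketch (node_redis 2.x callback API, made-up key names): the queued commands are written out together on .exec, but unlike .multi they are not wrapped in a MULTI/EXEC transaction.

var redis = require('redis');
var client = redis.createClient();

// The queued commands are sent to Redis in one pipelined write when .exec()
// is called; replies come back as an array in the same order.
client.batch()
    .set('foo', 'bar')
    .get('foo')
    .exec(function (err, replies) {
        if (err) throw err;
        console.log(replies); // [ 'OK', 'bar' ]
    });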

