I have some remote devices that communicate with a central Meteor application running in a Docker container on a Digital Ocean Ubuntu 14.04 droplet. Each device has its own channel, and the server is subscribed to all of the device channels plus a “telemetry” channel common to all devices. Everything works fine for a while (a few days to a few weeks), but then the subscribe callback on the server stops firing when a message is sent (the messages do show up in the PubNub debug console). The server can still publish normally, and the subscription works again after I restart the server. Here’s the relevant code snippet:

    pubnub = new PubNub({
        publishKey: "pub-key",
        subscribeKey: "sub-key"
    });
    pubnub.addListener({
        message: function (m) {
            if (m.channel === "telemetry") {
                // handle telemetry messages
            } else {
                // handle per-device channel messages
            }
        },
        status: function (s) {
            // Handle errors/status events here
        }
    });
    pubnub.subscribe({
        channels: chans // chans is a list of channels
    });
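As a side note, the v4 JS SDK reports subscribe-loop problems through status events rather than the message callback, so I’m also logging those. A minimal sketch (the category names come from the v4 SDK; which categories warrant a `reconnect()` is an assumption on my part, not an official recommendation):

```javascript
// Categories that suggest the subscribe loop may have dropped.
// Names are from the PubNub v4 JS SDK; treating exactly these as
// "reconnect-worthy" is my own guess.
var RECONNECT_CATEGORIES = [
    "PNNetworkDownCategory",
    "PNNetworkIssuesCategory",
    "PNTimeoutCategory"
];

function shouldReconnect(category) {
    return RECONNECT_CATEGORIES.indexOf(category) !== -1;
}

// Wiring it into the listener:
// pubnub.addListener({
//     status: function (s) {
//         console.log("status:", s.category);
//         if (shouldReconnect(s.category)) {
//             pubnub.reconnect(); // re-establish the subscribe loop
//         }
//     }
// });
```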

Is there some sort of default timeout that stops a subscription? If so, how can I disable it?

Looking through the reference I found the `presenceTimeout` value that I can pass to the initialization, but I’m not sure it’s relevant to my problem.


I’m running a test right now on my development server with `presenceTimeout` set to 0, and so far it’s working. This is a very hard problem to reproduce since it can take a long time to show up, so if anyone has any insight into it I’d appreciate it. I’ll update weekly on how the test is going.
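Concretely, the test just adds `presenceTimeout` to the constructor options (keys redacted; whether 0 is a meaningful value here is exactly what I’m testing):

```javascript
pubnub = new PubNub({
    publishKey: "pub-key",
    subscribeKey: "sub-key",
    presenceTimeout: 0 // the value under test
});
```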

Right now my temporary fix in production is a cron job that restarts the server every day, which is not really ideal for me.
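Since the devices publish regularly, a softer workaround than restarting the whole server might be a staleness watchdog that forces a client reconnect when no message has arrived for too long. A sketch (the threshold and the `reconnect()`-instead-of-restart approach are my own assumptions, not something PubNub recommends):

```javascript
// Track the last time any subscribed channel delivered a message.
var lastMessageAt = Date.now();

// Pure helper: has the subscription been silent for too long?
function isStale(lastAt, now, maxSilenceMs) {
    return now - lastAt > maxSilenceMs;
}

// In the message listener, record activity:
// message: function (m) { lastMessageAt = Date.now(); /* ... */ }

// Periodically check and reconnect instead of restarting the server.
// The 10-minute threshold is arbitrary; tune it to your traffic.
// setInterval(function () {
//     if (isStale(lastMessageAt, Date.now(), 10 * 60 * 1000)) {
//         pubnub.reconnect();
//         lastMessageAt = Date.now();
//     }
// }, 60 * 1000);
```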

Update 2/15/17

Did a test with the PubNub debug logging enabled. It stopped working again; below are the last few lines of the log.

[screenshot: last few lines of the PubNub debug log]

The heartbeats seem to be coming in normally. I rechecked the logs before I restarted the server, and another heartbeat was recorded at 6:44 UTC. The object after the 6:39 heartbeat is a message that was successfully published to one of my devices. Any ideas?

You can see the full log here. The logs from my own code aren’t formatted very well, so excuse the mess.