Memory leakage in ioredis package while storing data (increasing the heap memory of backend) #1965

Open
Abhay3333 opened this issue Mar 4, 2025 · 2 comments

@Abhay3333

ioredis appears to have a memory leak: stored data accumulates over time and the backend eventually crashes.

// Setup:
const Redis = require("ioredis");
require("dotenv").config();

const redisConfig = {
  host: process.env.REDIS_HOST,
  port: process.env.REDIS_PORT,
  retryStrategy: (times) => {
    const delay = Math.min(times * 50, 2000);
    return delay;
  },
};

// Create separate clients for pub/sub and regular operations
const redis = new Redis(redisConfig);
const publisher = new Redis(redisConfig);
const subscriber = new Redis(redisConfig);

// Event listeners for main Redis client
redis.on("connect", () => console.log("✅ Redis client connected"));
redis.on("error", (err) => console.error("❌ Redis client error:", err));

// Event listeners for publisher
publisher.on("connect", () => console.log("✅ Redis publisher connected"));
publisher.on("error", (err) => console.error("❌ Redis publisher error:", err));

// Event listeners for subscriber
subscriber.on("connect", () => console.log("✅ Redis subscriber connected"));
subscriber.on("error", (err) => console.error("❌ Redis subscriber error:", err));

module.exports = {
  redis,
  publisher,
  subscriber,
};

// Function:
const { redis, publisher } = require("../redis/redisSetup");
const fs = require("fs");
const path = require("path");

const storeTickData = async (token, data) => {
  try {
    const key = `tickDataNew:${token}`;

    const tickFields = Object.entries(data).map(([key, value]) => [
      key,
      typeof value === "object" ? JSON.stringify(value) : value,
    ]);

    await redis.hset(key, Object.fromEntries(tickFields));

    await redis.expire(key, 82800);
    return "success";
  } catch (error) {
    console.error(`❌ Error storing tick data: ${error.message}`);
    return "error";
  }
};
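[Editor's note] The serialization step in `storeTickData` can be isolated as a pure helper with no Redis dependency, which makes it easy to unit-test and to rule out as the source of retained memory. The helper name `toHashFields` is illustrative, not from the original report:

```javascript
// Flatten a tick object into HSET-compatible fields: nested objects are
// JSON-stringified, scalar values are passed through unchanged.
// Pure function: allocates only the returned object, holds no references.
const toHashFields = (data) =>
  Object.fromEntries(
    Object.entries(data).map(([field, value]) => [
      field,
      typeof value === "object" ? JSON.stringify(value) : value,
    ])
  );

console.log(toHashFields({ price: 101.5, depth: { bid: 101, ask: 102 } }));
```

Because this function retains nothing after it returns, any heap growth would have to come from the Redis client or the caller, not from this mapping step.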

@dmaier-redislabs
Contributor

Hi Abhay,

here are a few questions to understand the problem better:

  • Which version of ioredis shows this symptom?
  • It seems that your code above initializes a publisher and a subscriber, but it doesn't actually publish, subscribe to, or process any messages. Why is that?
  • What do you mean by 'the backend gets crashed'? Do you mean that you kill the Redis server?
  • How did you measure the memory consumption in order to identify the potential leak?

Could you please provide step-by-step instructions that help us reproduce the issue?

Regards,
David

@Abhay3333
Author

Thank you for responding.
The first code block above is my setup file, showing how I set up Redis; I am using "ioredis": "^5.4.2".

const tickFields = Object.entries(data).map(([key, value]) => [
  key,
  typeof value === "object" ? JSON.stringify(value) : value,
]);

await redis.hset(key, Object.fromEntries(tickFields));

await redis.expire(key, 82800);
return "success";

In the function above, when my data is large (around 3,000 to 4,000 fields) and I execute the function frequently, every second, the memory references never get cleared, so my heap memory keeps increasing without being garbage-collected. I tried the set method as well and it gives the same issue.

My backend eventually crashes after some time with an out-of-memory error.
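[Editor's note] To turn this description into a reproduction, a synthetic payload at the reported scale can be generated with a small helper. `makeTick` and its field names are hypothetical, chosen only to match the ~4,000-field, nested-object shape described above:

```javascript
// Build a tick object with `fieldCount` fields, each holding a nested
// object so it exercises the JSON.stringify branch of storeTickData.
function makeTick(fieldCount) {
  const tick = {};
  for (let i = 0; i < fieldCount; i++) {
    tick[`field${i}`] = { seq: i, ts: Date.now() };
  }
  return tick;
}

const tick = makeTick(4000);
console.log(Object.keys(tick).length); // 4000
```

Calling `storeTickData(token, makeTick(4000))` on a one-second interval, while sampling heap usage, would reproduce the reported workload.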
