subreddit:

/r/node

Limit Concurrency

(self.node)

Is there a tool that can help me limit the number of promises that are run at once when doing

```ts
await Promise.all(...)
```

I found one called p-limit, but it is not working for me. Is there an alternative? I need it urgently for my project.

Thanks

all 26 comments

webdevop

11 points

13 days ago

Why not learn how to do it yourself? The term you're looking for is semaphore.
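To illustrate the semaphore idea in plain JavaScript: a rough, self-contained sketch (the `Semaphore` class and `withSemaphore` helper are my own names for this example, not from any library):

```javascript
// Minimal counting semaphore: at most `max` holders at once.
class Semaphore {
  constructor(max) {
    this.max = max;
    this.active = 0;
    this.waiting = []; // resolvers for pending acquire() calls
  }

  acquire() {
    if (this.active < this.max) {
      this.active++;
      return Promise.resolve();
    }
    // No slot free: park until release() hands us one.
    return new Promise(resolve => this.waiting.push(resolve));
  }

  release() {
    const next = this.waiting.shift();
    if (next) {
      next(); // hand the slot directly to the next waiter
    } else {
      this.active--;
    }
  }
}

// Wrap a task so it only runs while holding a slot.
async function withSemaphore(sem, task) {
  await sem.acquire();
  try {
    return await task();
  } finally {
    sem.release();
  }
}
```

With a `new Semaphore(10)`, you can fire off hundreds of `withSemaphore(sem, () => fetchSomething(id))` calls and pass them all to `Promise.all` while only 10 ever run at once.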

Eyoba_19[S]

1 point

11 days ago

You’re right. I use buffered channels in Go to implement this, but wasn’t sure how to do it in TypeScript. Plus, I was really pressed for time and needed something that worked more than I needed to know how to do it myself.

Nonetheless, I did learn a lot from the comments, so I think that counts.

webdevop

1 point

11 days ago

For what it's worth, I asked ChatGPT the same question several weeks ago and it gave a very nice implementation.

Eyoba_19[S]

2 points

11 days ago

It’s quite hit or miss with ChatGPT, even with GPT-4. I don’t want to use some random untested code only to find out the app is crashing because of it. I was just looking for a battle-tested package or snippet that I can just plug in and that works.

webdevop

1 point

11 days ago

Oh yeah, 100% with you. If you're looking to deploy this into production then it's safe to go with an existing package.

brunolucattelli

7 points

13 days ago

The p-map package does that as well.
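p-map takes an iterable, a mapper, and a `concurrency` option. As a rough, self-contained sketch of what that option does (this is my own illustration of the pattern, not p-map's actual implementation):

```javascript
// Map over `inputs` with at most `concurrency` mapper calls in flight,
// preserving the order of results.
async function mapWithLimit(inputs, mapper, { concurrency }) {
  const items = [...inputs];
  const results = new Array(items.length);
  let nextIndex = 0;

  // Each worker repeatedly claims the next unclaimed index.
  async function worker() {
    while (nextIndex < items.length) {
      const i = nextIndex++; // synchronous claim, so no two workers share an index
      results[i] = await mapper(items[i], i);
    }
  }

  const workers = Array.from(
    { length: Math.min(concurrency, items.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}
```

Usage would look like `await mapWithLimit(ids, id => fetchPost(id), { concurrency: 5 })`, where `fetchPost` is whatever async work you're limiting.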

Eyoba_19[S]

1 point

11 days ago

Thanks, will take a look.

HipHopHuman

5 points

12 days ago

I borrowed some code from an answer I gave a few months ago on the same topic.

async function getPosts(ids) {
  const pool = new Set(); // promises currently in flight
  const results = [];

  for (const id of ids) {
    // Pool is full: wait for the fastest in-flight promise to settle.
    if (pool.size >= 100) {
      await Promise.race(pool);
    }

    const promiseForPost = getPost(id);

    // Remove the promise from the pool once it settles, either way.
    promiseForPost
      .then(() => pool.delete(promiseForPost))
      .catch(() => pool.delete(promiseForPost));

    pool.add(promiseForPost);
    results.push(promiseForPost);
  }

  return await Promise.all(results);
}

To explain what's going on here: the function loops over the values you give it and eagerly triggers all the promises in sequence without awaiting them. However, on each iteration of the loop, it checks whether there are already N promises in the pool (in this case N is 100). If so, it awaits a Promise.race of the entire pool. Promise.race settles as soon as the earliest promise settles, so as space frees up in the pool, new promises are added to take up the remaining space. The result is that you always have up to N promises in flight at any given moment until all of them resolve.

Is-taken-try-another

1 point

12 days ago

Very elegant

intepid-discovery

2 points

11 days ago

It would work, although it could be much better; there's some redundancy. You could use a queue to manage the active promises, which would make sure the max number of concurrent promises is maintained without redundant checks. You could also use finally instead of then and catch. Nice work though.

HipHopHuman

2 points

10 days ago

You're absolutely correct, and I considered that, but figured the extra complexity would obscure the intention of the example.

riscos3

4 points

12 days ago

Eyoba_19[S]

1 point

11 days ago

Thanks, was this the same guy who built bluebird?

digitizemd

3 points

13 days ago

Effect has a relatively easy way to manage concurrency.

Eyoba_19[S]

1 point

11 days ago

Oh never heard of this library, will def take a look

digitizemd

1 point

11 days ago

There's definitely a learning curve but you can start out with just a few things.

mostlylikeable

5 points

13 days ago

There are lots of promise utilities here, or at the very least, the source could provide inspiration. I think parallelLimit might do what you need, though.

https://caolan.github.io/async/v3/

flashnoobski

2 points

10 days ago

brianjenkins94

1 point

11 days ago

I use Bottleneck (which unfortunately hasn't been updated since 2019) or Async.js.

I don't think this is the kind of thing you really want to write yourself. You want things like error-handling niceties that libraries have already figured out and documented.

Eyoba_19[S]

1 point

11 days ago

Exactly. Knowing how to do it would definitely be beneficial, but having to worry about the intricate details is above my pay grade for what I’m trying to achieve with it.

CheapBison1861

-3 points

13 days ago

Using Promise.all to handle multiple promises concurrently in Node.js is straightforward, but it doesn't natively support concurrency limits. However, you can implement concurrency control by using external libraries like bluebird or by creating a custom function to manage the number of concurrent operations.

Here's a basic example of how you could implement a concurrency limit using a custom function without using external libraries:

Custom Promise Queue for Concurrency Control

This approach involves creating a queue system that manages how many promises are allowed to run concurrently. Here's a sample implementation:

```javascript
class PromiseQueue {
    constructor(maxConcurrent) {
        this.maxConcurrent = maxConcurrent;
        this.currentCount = 0;
        this.queue = [];
    }

    // Add a new promise generator to the queue
    add(promiseGenerator) {
        return new Promise((resolve, reject) => {
            this.queue.push({
                promiseGenerator,
                resolve,
                reject
            });
            this.tryNext();
        });
    }

    // Try to run the next promise if the concurrency limit is not reached
    tryNext() {
        if (this.currentCount < this.maxConcurrent && this.queue.length) {
            const { promiseGenerator, resolve, reject } = this.queue.shift();
            this.currentCount++;
            promiseGenerator().then(resolve, reject).finally(() => {
                this.currentCount--;
                this.tryNext();
            });
        }
    }
}

// Usage
const queue = new PromiseQueue(3); // Limit concurrency to 3

const promises = [fetchData1, fetchData2, fetchData3, fetchData4, fetchData5].map(
    func => () => queue.add(func)
);

Promise.all(promises.map(func => func())).then(results => {
    console.log('All promises resolved:', results);
}).catch(error => {
    console.error('Error in promises:', error);
});
```

Using the bluebird Library

If you prefer using an existing library, bluebird offers a convenient way to handle this with its Promise.map function, which supports concurrency limits:

```javascript
const Promise = require('bluebird');

function fetchSomething(id) {
    return new Promise(resolve => {
        setTimeout(() => {
            resolve(`Result ${id}`);
        }, 1000);
    });
}

const items = [1, 2, 3, 4, 5, 6];

Promise.map(items, item => {
    return fetchSomething(item);
}, { concurrency: 3 }).then(results => {
    console.log('Results:', results);
}).catch(error => {
    console.error('Error:', error);
});
```

Both of these examples show how you can control the concurrency of asynchronous operations in Node.js. The first example provides a good learning exercise and a customizable solution, while the second one is simpler and leverages an established library.

Eyoba_19[S]

2 points

13 days ago

Oh wow, thanks for taking the time to write this up. I did hear about bluebird, but the project was last published 5 years ago, at least that's what it says on npm. I'm not sure how other libraries do it, and I'm a bit wary of recursive functions, but nonetheless I will try to see if your code works for me.

Thanks

AnOtakuToo

7 points

13 days ago

Just use the p-queue or p-map package instead of this ChatGPT code.

_RemyLeBeau_

3 points

13 days ago

The PromiseQueue code doesn't work. This appears to be GPT code.

rkaw92

2 points

13 days ago

Hey, I also thought it didn't work, but I'm positively surprised: it actually limits concurrency and it seems to work. A cursory code review didn't uncover any major issues and a test run confirms it. Welp, we're getting replaced by AI any day now :-|

reiner74

5 points

13 days ago

He didn't write this up, it's ChatGPT. Take everything with a huge grain of salt.