The solution to the Firestore batched write limit

  • Kat Lynch
    Software Engineer

Are you getting Error: 3 INVALID_ARGUMENT: maximum 500 writes allowed per request from Firestore? We have a one-liner solution for you!

Firebase and Firestore enable building modern, ‘live’ web applications without having to think about schemas, migrations and infrastructure. While this makes it easier to focus on what's important - delivering user value - you might sometimes find yourself missing the ‘good old SQL transaction’.

Firestore offers two types of batch operations – transactions (which allow both reads and writes) and batched writes (for writes only). However, both of these have a hard limit of 500 writes per request. When you start building your app, you don't think you'll ever have to update more than that.

And then you find yourself staring at Error: 3 INVALID_ARGUMENT: maximum 500 writes allowed per request in your Sentry dashboard (because you are responsible and you are tracking your functions' errors). What should you do now?
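The manual workaround (a sketch of the general technique, not the library's code) is to split your writes into chunks of at most 500 and commit one batch per chunk. The `chunk` helper below is generic and works on any array:

```typescript
// Generic helper: split an array into slices of at most `size` elements.
function chunk<T>(items: T[], size: number): T[][] {
  const chunks: T[][] = []
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size))
  }
  return chunks
}

// Sketch of the manual workaround (assumes `fs` is admin.firestore()):
// for (const ids of chunk(allIds, 500)) {
//   const batch = fs.batch()
//   ids.forEach((id) => {
//     const ref = fs.collection('documents').doc(id)
//     batch.set(ref, { published: true }, { merge: true })
//   })
//   await batch.commit()
// }
```

This works, but it clutters every call site with chunking logic, which is exactly the boilerplate the library below hides for you.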

Introducing: BigBatch

The easy solution: use our firestore-big-batch library and replace all calls to fs.batch() with BigBatch:

npm install @qualdesk/firestore-big-batch --save or yarn add @qualdesk/firestore-big-batch

// Original code:
const fs = admin.firestore()
const batch = fs.batch()

ids.forEach((id) => {
  const ref = fs.collection('documents').doc(id)
  batch.set(ref, { published: true }, { merge: true })
})

await batch.commit()

// New code:
import { BigBatch } from '@qualdesk/firestore-big-batch' // <- add this

const fs = admin.firestore()
const batch = new BigBatch({ firestore: fs }) // <--------- change this

ids.forEach((id) => {
  const ref = fs.collection('documents').doc(id)
  batch.set(ref, { published: true }, { merge: true })
})

await batch.commit()

How it works

It is a very simple but powerful concept. All the BigBatch class does is keep track of how many operations you have added, and when you reach the limit (defined as 499, to leave a margin), it automatically creates another Firestore batch for you and adds it to an array of batches:

this.currentBatch.set(ref, data, options)
this.operationCounter++
// Roll over to a fresh batch just before hitting the 500-write limit
if (this.operationCounter === MAX_OPERATIONS_PER_FIRESTORE_BATCH) {
  this.currentBatch = this.firestore.batch()
  this.batchArray.push(this.currentBatch)
  this.operationCounter = 0
}
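Putting those pieces together, the rolling-batch idea can be sketched as a minimal class. The `BatchLike` and `FirestoreLike` interfaces below are illustrative stand-ins for the Firestore SDK types, and `MiniBigBatch` is a simplified sketch, not the library's actual internals:

```typescript
const MAX_OPERATIONS_PER_FIRESTORE_BATCH = 499

// Illustrative stand-ins for the Firestore SDK types
interface BatchLike {
  set(ref: unknown, data: unknown, options?: unknown): void
  commit(): Promise<unknown>
}
interface FirestoreLike {
  batch(): BatchLike
}

class MiniBigBatch {
  private batchArray: BatchLike[] = []
  private currentBatch: BatchLike
  private operationCounter = 0

  constructor(private firestore: FirestoreLike) {
    this.currentBatch = firestore.batch()
    this.batchArray.push(this.currentBatch)
  }

  set(ref: unknown, data: unknown, options?: unknown): void {
    this.currentBatch.set(ref, data, options)
    this.operationCounter++
    // Roll over to a fresh batch just before hitting the 500-write limit
    if (this.operationCounter === MAX_OPERATIONS_PER_FIRESTORE_BATCH) {
      this.currentBatch = this.firestore.batch()
      this.batchArray.push(this.currentBatch)
      this.operationCounter = 0
    }
  }

  commit(): Promise<unknown[]> {
    return Promise.all(this.batchArray.map((batch) => batch.commit()))
  }
}
```

For example, 1,000 set() calls end up spread across three underlying batches (499 + 499 + 2), all committed together by a single commit() call.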

Yes, that brings us to the biggest caveat of this library: it will create multiple Firestore batches if you have more than 499 operations in your BigBatch, and each of those batches commits independently, so you lose atomicity across them. Unfortunately, this is the closest we can get to making big transactions. If anything changes in Firestore, we will update the library accordingly.

When you call batch.commit(), it uses Promise.all() to combine the array of commit promises into a single promise that you can await in your code:

public commit() {
  const promises = this.batchArray.map((batch) => batch.commit())
  return Promise.all(promises)
}

Potential improvements

We know this library is not perfect, but it does the job for us at Qualdesk. There are a few improvements that we would like to introduce to it:

  • better error handling when batches fail (Promise.all() rejects on the first failure, even though other batches may already have committed)
  • see if we can support runTransaction
  • write tests!
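On the first point, one possible direction (our speculation, not something the library does today) is Promise.allSettled, which waits for every batch and reports each outcome instead of rejecting as soon as one commit fails. The `commitAllSettled` function and its signature are hypothetical:

```typescript
// Hypothetical commit variant: wait for every batch and report per-batch
// outcomes instead of failing fast. `batches` stands in for the internal
// batch array.
async function commitAllSettled(
  batches: { commit(): Promise<unknown> }[]
): Promise<{ succeeded: number; failed: Error[] }> {
  const results = await Promise.allSettled(batches.map((b) => b.commit()))
  const failed = results
    .filter((r): r is PromiseRejectedResult => r.status === 'rejected')
    .map((r) =>
      r.reason instanceof Error ? r.reason : new Error(String(r.reason))
    )
  return { succeeded: results.length - failed.length, failed }
}
```

A caller would then know exactly which batches committed and which need retrying, instead of getting a single opaque rejection.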

We hope you find it useful! If you want to help us with the library (whether it's the improvements mentioned above or anything else), all PRs are welcome at qualdesk/firestore-big-batch

If you have questions or comments about this post, please tweet us @qualdesk and let us know how you get on.