Why bulk processing matters in Salesforce Flows
Salesforce enforces governor limits that make it essential to design Flows that handle multiple records efficiently. Record-triggered Flows execute against batches of records (typically up to 200 per chunk) in a single transaction, so unbulkified logic can easily exhaust limits on SOQL queries, DML statements, and CPU time, causing failures for end users.
Key concepts
Keep these principles in mind when designing Flows for bulk record processing:
- Bulkification: Treat inputs as collections and avoid per-record queries or DML inside loops.
- Use Collection Variables: Gather records into collection variables and perform a single Create/Update/Delete operation.
- Minimize SOQL/DML: Use a single Get Records (when possible) and perform single collection-level DML operations.
- Consider Async: For very large volumes or long-running processing, use scheduled paths, Platform Events, or invoke Apex (Queueable/Batchable).
Design patterns and best practices
Follow these practical steps when building Flows that will run in bulk:
1) Use record-triggered flows correctly
Record-triggered Flows (before-save and after-save) run in bulk. Use before-save Flows for simple field updates on the triggering record: their changes are written as part of the save that is already in progress, so they consume no extra DML. Use after-save Flows when you need to create or update related records or call external services.
2) Avoid SOQL/DML inside loops
Instead of querying or performing DML inside a Loop element, collect changes into a collection variable and apply a single DML action after the loop. This keeps the number of SOQL queries and DML statements constant no matter how many records are in the batch.
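To make the shape concrete, here is the same collect-then-commit pattern sketched in anonymous Apex (the Contact scenario and default value are hypothetical; in Flow Builder the loop body would be Assignment elements and the final statement an Update Records element):

// One query up front, no DML inside the loop, one DML at the end.
List<Contact> batch = [SELECT Id, MailingCity FROM Contact LIMIT 200];
List<Contact> toUpdate = new List<Contact>();
for (Contact c : batch) {
    if (c.MailingCity == null) {
        c.MailingCity = 'Default City'; // compute inside the loop
        toUpdate.add(c);                // collect instead of updating here
    }
}
if (!toUpdate.isEmpty()) {
    update toUpdate; // one DML statement for the whole batch
}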
3) Use Assignment + Collections
Typical pattern (an Apex sketch of the same four steps follows the list):
- Get Records (if needed) to pull related data for the entire batch.
- Loop through the trigger collection once to compute values.
- Use Assignment elements to add modified records or new records to a collection variable.
- After the loop, perform a single Create Records / Update Records using the collection variable.
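Expressed in Apex terms, assuming a hypothetical requirement of stamping each Opportunity with its parent Account's name, the four steps look like this:

// The first query stands in for the Flow's triggering record collection
List<Opportunity> opps = [SELECT Id, AccountId, Description FROM Opportunity LIMIT 200];

// Step 1: a single Get Records equivalent pulls related data for the whole batch
Set<Id> accountIds = new Set<Id>();
for (Opportunity o : opps) {
    accountIds.add(o.AccountId);
}
Map<Id, Account> accountsById = new Map<Id, Account>(
    [SELECT Id, Name FROM Account WHERE Id IN :accountIds]
);

// Steps 2-3: loop once, compute values, collect changes (Assignment equivalent)
List<Opportunity> toUpdate = new List<Opportunity>();
for (Opportunity o : opps) {
    Account parent = accountsById.get(o.AccountId);
    if (parent != null) {
        o.Description = 'Parent account: ' + parent.Name;
        toUpdate.add(o);
    }
}

// Step 4: a single Update Records equivalent after the loop
if (!toUpdate.isEmpty()) {
    update toUpdate;
}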
4) Prefer Before-Save updates for simple field changes
Before-save Flows are the most efficient way to update the triggering record’s fields: they run before the record is written, so the changes ride along with the save already in progress instead of costing an extra DML operation. Use them whenever you only need to change fields on the record that triggered the Flow.
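Conceptually, a before-save Flow behaves like the Apex before trigger below: fields are assigned directly on the in-flight records, and no update statement appears anywhere (the trigger and the default Description value are hypothetical):

trigger AccountDefaults on Account (before insert, before update) {
    for (Account a : Trigger.new) {
        if (a.Description == null) {
            // Written as part of the original save; no extra DML consumed
            a.Description = 'Defaulted before save';
        }
    }
}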
5) Use Invocable Apex for complex bulk operations
When Flow elements can’t express complex logic, or record volumes are too large to process declaratively, create an invocable Apex method that accepts a list of records and performs bulk-safe operations (a single SOQL query, collection-level DML, error handling). Call that method from the Flow as an Action. For example:
public with sharing class FlowBulkProcessor {
    @InvocableMethod(label='Process Accounts in Bulk')
    public static void processAccounts(List<AccountWrapper> wrappers) {
        // Convert wrappers to a list of Accounts to update
        List<Account> accountsToUpdate = new List<Account>();
        for (AccountWrapper w : wrappers) {
            accountsToUpdate.add(new Account(Id = w.accountId, Some_Field__c = w.value));
        }
        if (!accountsToUpdate.isEmpty()) {
            update accountsToUpdate; // single DML for the batch
        }
    }

    public class AccountWrapper {
        @InvocableVariable
        public Id accountId;
        @InvocableVariable
        public String value;
    }
}
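In Flow Builder this class appears as an Action labeled 'Process Accounts in Bulk', and Flow passes the whole batch in as one list. A minimal anonymous-Apex smoke test might look like the following (it assumes at least one Account exists, and remember that Some_Field__c is a placeholder for a real custom field):

// Build one wrapper and run the invocable method directly
FlowBulkProcessor.AccountWrapper w = new FlowBulkProcessor.AccountWrapper();
w.accountId = [SELECT Id FROM Account LIMIT 1].Id;
w.value = 'Processed';
FlowBulkProcessor.processAccounts(new List<FlowBulkProcessor.AccountWrapper>{ w });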
6) Error handling and fault paths
Configure Fault paths on elements that perform DML or callouts. Capture errors in a collection and log them to a custom object or send notifications. That way, a problem with one record doesn’t silently sink the entire batch, and you can retry or repair the bad data later.
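A minimal sketch of an invocable logger a Fault path could call, passing {!$Flow.FaultMessage} as the input; the Flow_Error_Log__c object and its Message__c field are hypothetical stand-ins for whatever logging object your org uses:

public with sharing class FlowErrorLogger {
    @InvocableMethod(label='Log Flow Error')
    public static void log(List<String> faultMessages) {
        List<Flow_Error_Log__c> logs = new List<Flow_Error_Log__c>();
        for (String msg : faultMessages) {
            logs.add(new Flow_Error_Log__c(Message__c = msg));
        }
        insert logs; // a single DML even when many interviews fault at once
    }
}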
7) Use pagination / batch patterns for very large volumes
If a Flow must process very large datasets, consider:
- Using Scheduled Flows that process limited chunks of records per run.
- Calling Queueable or Batchable Apex from Flow to take advantage of asynchronous batch processing (a Queueable handoff is sketched after this list).
- Leveraging Platform Events to decouple and asynchronously process records.
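One way to wire up that handoff, as a sketch: an invocable method that does nothing but enqueue a Queueable, so the heavy work runs asynchronously with its own fresh set of governor limits (the class names and the Account query are hypothetical):

public with sharing class FlowAsyncHandoff {
    @InvocableMethod(label='Process Records Async')
    public static void enqueue(List<Id> recordIds) {
        // Keep the Flow's transaction light; defer the real work
        System.enqueueJob(new BulkJob(recordIds));
    }

    public class BulkJob implements Queueable {
        private List<Id> recordIds;
        public BulkJob(List<Id> recordIds) {
            this.recordIds = recordIds;
        }
        public void execute(QueueableContext ctx) {
            // Heavy processing runs here, in its own transaction
            List<Account> accounts = [SELECT Id FROM Account WHERE Id IN :recordIds];
            // ... compute changes and perform a single bulk update ...
        }
    }
}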
8) Testing and monitoring
Test the Flow with bulk data (e.g., load 200 records) to ensure it stays within governor limits. Use debug logs, Flow interviews, and Salesforce’s Flow error email notifications to monitor behavior in production.
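A sketch of such a bulk test, assuming a record-triggered Flow on Account; the commented-out assertion is a placeholder for whatever effect your Flow is supposed to produce:

@isTest
private class BulkFlowTest {
    @isTest
    static void flowHandles200Records() {
        List<Account> accounts = new List<Account>();
        for (Integer i = 0; i < 200; i++) {
            accounts.add(new Account(Name = 'Bulk Test ' + i));
        }
        Test.startTest();
        insert accounts; // one DML; the record-triggered Flow fires for all 200
        Test.stopTest();
        // Assert on the Flow's expected effect, e.g.:
        // System.assertEquals(200, [SELECT COUNT() FROM Account WHERE Description != null]);
    }
}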
Checklist for bulk-safe Flows
- Use before-save Flow for simple field updates on the same record.
- Gather records into collections; perform a single DML operation.
- Avoid queries and DML inside loops; use a single Get Records for related data.
- Use invocable Apex for complex, high-volume work.
- Implement Fault paths and logging for robust error handling.
- Consider async (Queueable/Batchable/Platform Events) for very large volumes.
Following these practices will help ensure your Flows are resilient, efficient, and safe to run in bulk across diverse Salesforce workloads.