Why bulk processing matters in Salesforce Flows
Salesforce processes records in batches: one save operation from the API, Data Loader, or bulk DML can carry hundreds of records through a single transaction. Flows that process records one by one can hit governor limits (SOQL, DML, CPU), perform poorly, and behave inconsistently in high-volume scenarios. Designing Flows for bulk record processing ensures reliability, better performance, and platform compliance.
Key principles
Follow these principles when building Flows intended to handle multiple records:
- Favor bulk operations — collect records in collections and perform single DML calls for the whole collection.
- Avoid performing SOQL or DML inside loops.
- Minimize CPU and memory work per transaction; offload complex logic to asynchronous processes or invocable Apex when needed.
- Use fault handling and limits-aware design (batching techniques, scheduled or autolaunched Flows).
Flow types and their bulk behavior
Not all Flows behave the same in bulk contexts:
- Record-Triggered Flows (after-save and before-save): Execute in bulk when multiple records change in one transaction (for example, data loader or API update). Before-save flows are best for fast, bulk-safe field updates because they avoid DML.
- Autolaunched Flows: Can be invoked from Apex or Process Builder in bulk; design them to accept and return collections (see the Apex invocation sketch after this list).
- Scheduled Flows: Run on a schedule; good for large-batch processing when combined with collection-based actions.
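As an example of the collection-based design for autolaunched Flows, the snippet below starts a single Flow interview for a whole set of records from Apex. This is a minimal sketch: the Flow name Update_Account_Flags and its collection input variable accountIds are assumptions, not real metadata.

/* Apex sketch: invoking an autolaunched Flow in bulk */
// Gather the Ids to process (query is illustrative).
List<Id> accountIds = new List<Id>();
for (Account acc : [SELECT Id FROM Account LIMIT 200]) {
    accountIds.add(acc.Id);
}
// Pass the whole collection into one Flow interview instead of
// starting one interview per record.
Map<String, Object> inputs = new Map<String, Object>{ 'accountIds' => accountIds };
Flow.Interview.Update_Account_Flags interview =
    new Flow.Interview.Update_Account_Flags(inputs);
interview.start();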
Practical techniques for bulk-safe Flows
Use these concrete techniques when designing a Flow to handle many records:
1) Use collection variables and single DML operations
Gather the records you must change into a collection variable (e.g., a Record Collection Variable) using Get Records or by aggregating records during the Flow. After all processing, use a single Update Records element to persist changes for the whole collection.
2) Avoid DML and queries inside Loops
Do not put Get Records, Create, Update, or Delete Records inside a Loop element. Instead:
- Use the Loop to build a collection of records to update via Assignment.
- After the Loop completes, call Update Records once with the collection.
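In Apex terms, techniques 1 and 2 describe the same collect-then-update shape. A minimal sketch, with an illustrative query and field assignment:

/* Apex sketch: collect in the loop, one DML after it */
List<Account> accountsToUpdate = new List<Account>();
for (Account acc : [SELECT Id, Rating FROM Account WHERE Rating = null LIMIT 200]) {
    acc.Rating = 'Warm';           // modify in memory inside the loop
    accountsToUpdate.add(acc);     // collect instead of updating here
}
update accountsToUpdate;           // single DML for the whole collection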
3) Use Before-Save Flows for fast field updates
When you only need to modify fields on the triggering record(s), prefer Before-Save Record-Triggered Flows. They allow you to change record values without issuing a separate DML — Salesforce applies changes during the save, which is much faster and more efficient for bulk updates.
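Conceptually, a before-save Flow works like an Apex before trigger: field values changed in memory are written as part of the save itself, with no extra update call. A rough analogy (trigger name and field value are illustrative):

/* Apex analogy: before-save assignment without DML */
trigger AccountBeforeSave on Account (before insert, before update) {
    for (Account acc : Trigger.new) {
        // The platform persists this change during the save; no update needed.
        acc.Description = 'Processed in bulk';
    }
}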
4) Use Get Records with filters instead of querying inside loops
When you need related records, query once (or a few times) with Get Records, using filters that return the full needed set. Store the results in collections and reference them through collection operations or helper lookups; Flow has no native map type, so simulate map-style lookups cautiously (for example, with a nested loop over a small collection).
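For readers who think in Apex, the structure a nested Flow loop approximates is a map keyed by parent Id. A minimal sketch using the Opportunity filter from the worked example below:

/* Apex sketch: the helper map a nested Flow loop approximates */
Map<Id, List<Opportunity>> oppsByAccount = new Map<Id, List<Opportunity>>();
for (Opportunity opp : [SELECT Id, AccountId FROM Opportunity
                        WHERE StageName NOT IN ('Closed Won', 'Closed Lost')]) {
    if (!oppsByAccount.containsKey(opp.AccountId)) {
        oppsByAccount.put(opp.AccountId, new List<Opportunity>());
    }
    oppsByAccount.get(opp.AccountId).add(opp); // group children by parent Id
}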
5) Batch large collections using Scheduled Paths or chunking
If you must process thousands of records, implement chunking: split the large set into manageable page sizes (for example, 200–2,000 depending on complexity) and process each chunk via Scheduled Flows or by invoking an autolaunched Flow multiple times. For extreme scale, use Batch Apex or Queueable Apex.
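For the Batch Apex route, here is a minimal sketch using the Has_Open_Opps__c field from the worked example later in this article (class name and chunk size are illustrative):

/* Apex sketch: Batch Apex for very large volumes */
public class AccountFlagBatch implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            'SELECT Id, Has_Open_Opps__c FROM Account');
    }
    public void execute(Database.BatchableContext bc, List<Account> scope) {
        // Each execute() call gets a fresh set of governor limits.
        Set<Id> withOpenOpps = new Set<Id>();
        for (Opportunity opp : [SELECT AccountId FROM Opportunity
                                WHERE AccountId IN :scope
                                AND StageName NOT IN ('Closed Won', 'Closed Lost')]) {
            withOpenOpps.add(opp.AccountId);
        }
        for (Account acc : scope) {
            acc.Has_Open_Opps__c = withOpenOpps.contains(acc.Id);
        }
        update scope; // one DML per chunk
    }
    public void finish(Database.BatchableContext bc) {}
}
// Database.executeBatch(new AccountFlagBatch(), 200); // 200 records per chunk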
6) Offload heavy processing to async (Platform Events, Queueable, or Batch Apex)
When heavy computations or many DML operations are required, call an invocable Apex (Queueable or Batch) from the Flow. This keeps the Flow lightweight and avoids synchronous governor limit exhaustion.
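A minimal invocable wrapper that defers the heavy work to a Queueable (class names are illustrative). Note that Flow bulkifies invocable calls, so the list parameter already carries the inputs from every interview in the transaction:

/* Apex sketch: invocable action that enqueues async work */
public class EnqueueHeavyWork {
    @InvocableMethod(label='Enqueue Heavy Work')
    public static void run(List<Id> recordIds) {
        // One queueable job for the whole bulk transaction.
        System.enqueueJob(new HeavyWorkJob(recordIds));
    }
    public class HeavyWorkJob implements Queueable {
        private List<Id> recordIds;
        public HeavyWorkJob(List<Id> recordIds) { this.recordIds = recordIds; }
        public void execute(QueueableContext ctx) {
            // Heavy computation or DML runs here, in its own transaction
            // with its own governor limits.
        }
    }
}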
7) Error handling and bulk-aware fault management
For bulk operations, add Fault paths to capture and log failures (create a custom object or send Platform Events). Capture the record Ids and error messages — avoid stopping the entire batch for a single record if possible.
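Keep in mind that a Flow DML element succeeds or fails as a unit for its whole collection. If you need row-level partial success, one option is an invocable Apex wrapper around Database.update with allOrNone set to false. A sketch (accountsToUpdate and the debug-log destination are illustrative):

/* Apex sketch: partial-success DML with per-row error capture */
List<Account> accountsToUpdate = [SELECT Id FROM Account LIMIT 200];
Database.SaveResult[] results = Database.update(accountsToUpdate, false);
for (Integer i = 0; i < results.size(); i++) {
    if (!results[i].isSuccess()) {
        // A failed row does not roll back the successful ones.
        System.debug(accountsToUpdate[i].Id + ' failed: '
            + results[i].getErrors()[0].getMessage());
    }
}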
Example: Bulk-update Accounts’ custom flag based on Opportunities
Overview: Update a custom checkbox Has_Open_Opps__c on Accounts if they have any Open Opportunities.
- Use a Scheduled Flow or an Autolaunched Flow invoked with a list of Account Ids.
- Get Records: query Opportunities with StageName NOT IN ('Closed Won','Closed Lost'). Get Records cannot group by AccountId, so fetch the Opportunities and handle duplicate AccountIds in the Flow.
- Build a Record Collection of Accounts to update via Loop and Assignment: set Has_Open_Opps__c = true for Accounts that appear in the Opportunity results; otherwise false.
- Outside the Loop: call Update Records once with the Account collection.
/* Pseudo-Flow logic */
Get Records: OpenOpps = SELECT Id, AccountId FROM Opportunity WHERE StageName NOT IN ('Closed Won','Closed Lost')
Get Records: Accounts = SELECT Id, Has_Open_Opps__c FROM Account WHERE Id IN (list of AccountIds to evaluate)
Loop over Accounts -> Decision: Account.Id appears in OpenOpps AccountIds? -> Assignment: set Account.Has_Open_Opps__c = true (otherwise false) -> Add Account to AccountsToUpdate collection
After Loop: Update Records AccountsToUpdate
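The same logic in Apex, for comparison; accountIdsToEvaluate stands in for whatever set of Ids the Flow receives:

/* Apex sketch: equivalent of the pseudo-flow above */
Set<Id> accountIdsToEvaluate =
    new Map<Id, Account>([SELECT Id FROM Account LIMIT 200]).keySet();
Set<Id> openOppAccountIds = new Set<Id>();
for (Opportunity opp : [SELECT AccountId FROM Opportunity
                        WHERE StageName NOT IN ('Closed Won', 'Closed Lost')]) {
    openOppAccountIds.add(opp.AccountId);
}
List<Account> accountsToUpdate = new List<Account>();
for (Account acc : [SELECT Id, Has_Open_Opps__c FROM Account
                    WHERE Id IN :accountIdsToEvaluate]) {
    Boolean hasOpen = openOppAccountIds.contains(acc.Id);
    if (acc.Has_Open_Opps__c != hasOpen) {
        acc.Has_Open_Opps__c = hasOpen; // flip the flag both ways
        accountsToUpdate.add(acc);      // only touch rows that changed
    }
}
update accountsToUpdate; // single DML after the loop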
Best practices checklist
- Prefer Before-Save Record-Triggered Flows for simple field updates.
- Never perform Get/Update/Create/Delete inside a Loop.
- Use collections and single DML statements (Update/Create/Delete Records on collection variables).
- Use Scheduled Flows or autolaunched Flows for large jobs; consider Batch Apex for very large volumes.
- Test Flows with bulk data (e.g., import 200+ records) and monitor debug logs and limits; see the test sketch after this list.
- Use invocable Apex for complex logic or when you need to guarantee bulk-safe patterns beyond Flow capabilities.
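Here is a minimal Apex test that exercises a record-triggered Flow with a full 200-record batch; the assertion depends on what your Flow sets, so it is left as a comment:

/* Apex sketch: bulk test for a record-triggered Flow */
@IsTest
private class BulkFlowTest {
    @IsTest
    static void processes200RecordsInOneTransaction() {
        List<Account> accts = new List<Account>();
        for (Integer i = 0; i < 200; i++) {
            accts.add(new Account(Name = 'Bulk Test ' + i));
        }
        Test.startTest();
        insert accts; // one DML fires the Flow for all 200 records
        Test.stopTest();
        // Re-query the Accounts and assert on the fields the Flow should set.
    }
}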
Conclusion
Handling bulk record processing in Flows is about respecting platform governor limits, minimizing per-record overhead, and designing with collections and single DML operations in mind. With the right patterns — before-save for quick updates, collection operations instead of per-record DML, and asynchronous offloading when needed — Flows can be reliable and performant in bulk scenarios.