Short answer
Yes — you can invoke a Batch Apex job from a trigger, but it’s usually not recommended to call Database.executeBatch directly from a trigger because of governor limits, concurrency limits, and potential performance and maintenance issues. Use aggregation patterns (Queueable, Platform Events, intermediate records, or Scheduled Apex) to safely call batch jobs from trigger-driven events.
Why calling Batch Apex directly from a trigger is risky
Triggers execute as part of a synchronous transaction and can fire frequently and for many records. Calling Database.executeBatch directly inside a trigger can lead to:
- Duplicate or excessive batch jobs when many DML events occur.
- Hitting the concurrent batch jobs limit (5 queued/active batch jobs per org).
- Hard-to-debug timing and transactional issues — the trigger transaction will complete while the batch runs asynchronously.
- Mixed asynchronous limits when you chain many async calls (Queueables, future, batch).
Important limits to remember
Key Apex limits to consider when triggering batch jobs:
- Maximum number of batch Apex jobs queued or active concurrently: 5
- Queueable jobs per transaction: 50 (enqueue limit)
- Bulk DML can fire a trigger many times in quick succession; if every invocation enqueues async work, you can flood the flex queue and burn through the 24-hour asynchronous execution allowance (250,000 executions, or 200 × the number of user licenses, whichever is greater)
Safe patterns to invoke Batch Apex from a trigger
Use one of these recommended patterns to avoid the pitfalls above:
1) Trigger → Queueable (one enqueue per transaction) → Database.executeBatch
Use a trigger handler that aggregates record IDs into a static set and enqueues one Queueable job per transaction. The Queueable then calls Database.executeBatch. This reduces the number of batch jobs and keeps trigger logic lightweight.
public class TriggerToBatchHandler {
    // IDs collected across trigger invocations within the same transaction.
    private static Set<Id> recordIds = new Set<Id>();
    private static Boolean jobEnqueued = false;

    public static void addIds(Map<Id, SObject> newMap) {
        recordIds.addAll(newMap.keySet());
    }

    public static void afterCommitEnqueue() {
        if (!jobEnqueued && !recordIds.isEmpty()) {
            // enqueue a single Queueable which will start the Batch
            System.enqueueJob(new BatchLauncher(new List<Id>(recordIds)));
            jobEnqueued = true;
        }
    }
}

public class BatchLauncher implements Queueable {
    private List<Id> ids;

    public BatchLauncher(List<Id> ids) {
        this.ids = ids;
    }

    public void execute(QueueableContext qc) {
        // Launch the batch from async context, outside the trigger transaction.
        Database.executeBatch(new MyBatch(ids));
    }
}

public class MyBatch implements Database.Batchable<SObject> {
    private List<Id> ids;

    public MyBatch(List<Id> ids) {
        this.ids = ids;
    }

    public Iterable<SObject> start(Database.BatchableContext bc) {
        // Inline SOQL over the buffered IDs; the resulting list is the batch scope.
        List<SObject> records = [SELECT Id, Name FROM Account WHERE Id IN :ids];
        return records;
    }

    public void execute(Database.BatchableContext bc, List<SObject> scope) {
        // batch logic
    }

    public void finish(Database.BatchableContext bc) {}
}
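For completeness, the trigger side of this pattern can be as small as the sketch below. The trigger name and object are illustrative; if you already use a trigger handler framework, call the handler from its after-insert/after-update hooks instead.

trigger AccountTrigger on Account (after insert, after update) {
    // Collect the IDs touched by this invocation of the trigger...
    TriggerToBatchHandler.addIds(Trigger.newMap);
    // ...and enqueue at most one BatchLauncher job for the whole transaction.
    TriggerToBatchHandler.afterCommitEnqueue();
}

Because jobEnqueued is a static flag, re-entrant trigger invocations within the same transaction still enqueue only one launcher job.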
2) Trigger → Buffer (custom object / Platform Cache) → Scheduled Apex / Periodic Batch
Instead of immediately firing a batch, write the changed record IDs into a small custom object (e.g., Batch_Request__c) or Platform Cache. Use Scheduled Apex or a small periodic batch to process accumulated requests at controlled intervals (e.g., every 5–15 minutes). This is robust for high-volume orgs.
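A minimal sketch of the buffering approach is below. It assumes a Batch_Request__c object with a single text field Record_Id__c (both names are placeholders), reuses the MyBatch class from pattern 1, and registers the scheduler from Setup or via System.schedule.

// Trigger side: buffer one request per changed record.
trigger AccountBufferTrigger on Account (after insert, after update) {
    List<Batch_Request__c> requests = new List<Batch_Request__c>();
    for (Id recordId : Trigger.newMap.keySet()) {
        requests.add(new Batch_Request__c(Record_Id__c = recordId));
    }
    insert requests;
}

// Scheduled job: drains the buffer at controlled intervals and starts one batch run.
public class BatchRequestScheduler implements Schedulable {
    public void execute(SchedulableContext sc) {
        List<Batch_Request__c> pending =
            [SELECT Id, Record_Id__c FROM Batch_Request__c LIMIT 2000];
        if (pending.isEmpty()) { return; }
        List<Id> ids = new List<Id>();
        for (Batch_Request__c req : pending) {
            ids.add((Id) req.Record_Id__c);
        }
        Database.executeBatch(new MyBatch(ids));
        delete pending; // or mark requests as processed in MyBatch.finish() for safer retries
    }
}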
3) Trigger → Platform Event → Subscriber invokes Batch
Publish a Platform Event from the trigger and handle it with a triggered subscriber (Apex or Flow). The subscriber can aggregate events and invoke batch processing, which separates concerns and avoids triggers directly enqueuing batches.
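Sketched below, assuming a platform event named Batch_Request__e with a text field Record_Id__c (both placeholders) and reusing the MyBatch class from pattern 1.

// Publisher: the record trigger publishes one event per changed record.
trigger AccountEventPublisher on Account (after insert, after update) {
    List<Batch_Request__e> events = new List<Batch_Request__e>();
    for (Id recordId : Trigger.newMap.keySet()) {
        events.add(new Batch_Request__e(Record_Id__c = recordId));
    }
    EventBus.publish(events);
}

// Subscriber: a trigger on the platform event aggregates the delivered events
// and starts at most one batch job per delivery.
trigger BatchRequestEventTrigger on Batch_Request__e (after insert) {
    List<Id> ids = new List<Id>();
    for (Batch_Request__e evt : Trigger.new) {
        ids.add((Id) evt.Record_Id__c);
    }
    Database.executeBatch(new MyBatch(ids));
}

Platform event triggers typically run asynchronously under the Automated Process user, which keeps the batch launch fully decoupled from the original DML transaction.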
When calling Database.executeBatch directly is acceptable
For low-volume triggers (rare updates, small orgs) you may call Database.executeBatch(new MyBatch(ids)) directly in the trigger’s after context. But still consider throttling and ensuring you won’t exceed the concurrent batch limit.
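If you do go this route, a simple throttle such as the sketch below (reusing MyBatch and assuming an Account trigger) helps you stay under the concurrent batch job limit; submissions beyond five are normally parked in the Apex flex queue in Holding status (up to 100 jobs).

trigger AccountDirectBatch on Account (after update) {
    // Count batch jobs that are currently queued or executing; the org runs at most 5 at once.
    Integer activeBatchJobs = [
        SELECT COUNT()
        FROM AsyncApexJob
        WHERE JobType = 'BatchApex'
        AND Status IN ('Queued', 'Preparing', 'Processing')
    ];
    if (activeBatchJobs < 5) {
        Database.executeBatch(new MyBatch(new List<Id>(Trigger.newMap.keySet())));
    }
    // Otherwise skip or buffer the IDs instead of piling more jobs into the flex queue.
}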
Practical checklist before invoking batch from trigger
- Confirm expected volume of trigger executions.
- Aggregate record IDs to avoid one batch per record.
- Use a Queueable or scheduled approach to control timing.
- Implement retry/backoff or deduplication logic (a deduplication sketch follows this list).
- Monitor async job queue and add org-level alerts when limits are near.
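For the deduplication item, one option is a small guard that checks AsyncApexJob before submitting another run of the same batch class (the class and method names here are illustrative):

public class BatchDedupeGuard {
    // True when no MyBatch run is already queued, held, preparing, or executing,
    // so callers can skip redundant submissions of the same job class.
    public static Boolean canStartMyBatch() {
        Integer inFlight = [
            SELECT COUNT()
            FROM AsyncApexJob
            WHERE JobType = 'BatchApex'
            AND Status IN ('Queued', 'Holding', 'Preparing', 'Processing')
            AND ApexClass.Name = 'MyBatch'
        ];
        return inFlight == 0;
    }
}

A caller would then wrap the launch in if (BatchDedupeGuard.canStartMyBatch()) { Database.executeBatch(new MyBatch(ids)); }.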
Summary
Yes — a trigger can start a Batch Apex job, but avoid calling batch directly in high-volume scenarios. Prefer aggregator patterns (Queueable launcher, buffered requests, platform events, or scheduled processing) to maintain scalability, reliability, and to respect Salesforce async limits.