Analyzing Scheduled Flow Execution Discrepancies
It is a common frustration when a Salesforce Scheduled Flow behaves perfectly during debug execution—confirming the correct run time and processing logic—but appears to execute silently or perform zero DML operations when run by the scheduler in production. If you receive no failure emails and the Flow Monitoring tool shows a successful run, the root cause is often related to the execution context, user permissions, or data access when running asynchronously via the scheduler.
Common Pitfalls for Scheduled Flow Success vs. Debug
When a Flow is run via the scheduler (as opposed to manually debugging), it runs under specific constraints. Verify the following areas, as they frequently cause successful logic in debug to fail silently in production:
1. User Context and Permissions
Schedule-triggered Flows run in system context (as the Automated Process user), and this cannot be changed for the standard scheduler trigger. However, issues can arise if the intended user context differs from the actual execution context.
- Running User Profile/Permission Sets: While the Flow runs in system context, confirm that the underlying Get or Update elements are not implicitly relying on record-level security (FLS or object permissions) that might be inconsistently applied or misinterpreted in the asynchronous environment. Note: system context usually bypasses typical user sharing rules, but FLS is still critical.
- Guest User/External Access: If the Flow interacts with records owned by users with restricted access profiles, ensure the system context correctly bridges these gaps (which it should, but verify user licensing if applicable).
2. Data Visibility and Scope
If the Flow executes but updates zero records, the Get Records element is likely returning an empty collection.
- SOQL Visibility: The query executed by the Get Records element in the scheduled context might be subtly different due to sharing rules or object visibility settings that behave differently between the logged-in debug session and the background scheduler.
- Filter Logic: Scrutinize the criteria used in your Get Records element. A common mistake is using a record variable or collection element that was populated successfully in debug but is empty when the scheduler fires (e.g., relying on a Process Builder/Workflow that hasn't executed yet, or a prior record creation that didn't occur).
Example: Debug vs. Scheduled Query Discrepancy
In debug, a variable $InputDate might be manually set to today. In the scheduled context, if the Flow expects this date to be populated by an external source or an initial value that doesn't translate correctly, the query fails:
// Hypothetical SOQL executed by Get Records element
SELECT Id, Status FROM Opportunity WHERE CloseDate = {!$Flow.CurrentDate}
If the Flow's configuration expects an environment variable that is null during scheduled execution, the resulting query returns no rows, and the subsequent Update element does nothing.
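Before touching the Flow itself, you can run the equivalent query in Anonymous Apex (Developer Console > Execute Anonymous) to confirm whether any rows exist to be found at all. This is a sketch assuming the Flow filters Opportunity on CloseDate, as in the example above:

```apex
// Run in Execute Anonymous to see what the scheduled query
// would actually match right now (assumes a CloseDate filter).
List<Opportunity> matches = [
    SELECT Id, StageName, CloseDate
    FROM Opportunity
    WHERE CloseDate = :Date.today()
];
System.debug('Records the Flow would find: ' + matches.size());
```

If this returns zero rows, the problem is the data or the filter criteria, not the Flow's update logic.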
3. Asynchronous Execution Limits and Transaction Boundaries
Scheduled Flows execute as a single, asynchronous transaction. They are subject to governor limits, but they do not inherently offer built-in retry logic upon transient errors unless configured explicitly.
- Data Volume: If the criteria return a massive set of records that triggers a query limit (e.g., returning too many rows), the Flow might fail to process the update loop correctly, even if the initial execution appears successful.
- DML Limits: If the Flow is processing many records in a loop, confirm the total rows modified by Update Records or Create Records elements stay well within the 10,000-row DML limit per transaction. Also keep DML elements outside of loops where possible, since each DML element execution counts against the separate 150-DML-statement limit.
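The Flow best practice of placing a single Update Records element after the loop mirrors the standard bulkification pattern in Apex. A minimal sketch (the StageName assignment is a hypothetical change for illustration):

```apex
// Bulkified pattern: collect changes inside the loop,
// then perform DML once after the loop completes.
List<Opportunity> toUpdate = new List<Opportunity>();
for (Opportunity opp : [SELECT Id, StageName FROM Opportunity
                        WHERE CloseDate = :Date.today()]) {
    opp.StageName = 'Closed Won'; // hypothetical field change
    toUpdate.add(opp);
}
update toUpdate; // one DML statement, up to 10,000 rows per transaction
```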
4. Scheduling Configuration Verification
Double-check the initial scheduling setup within Setup.
- Recurrence Pattern: Verify the start time, end time (if applicable), and frequency (e.g., daily, every N hours). Ensure the timezone offset used by the scheduler aligns with expectations.
- Active Status: Confirm the Scheduled Flow definition itself is marked as Active.
Debugging Strategy: Capture Execution Context
Since the production logs might be too sparse, modify the Flow temporarily to log the exact execution context when run by the scheduler:
- Add Logging: At the start of the Flow, add an Assignment element to capture key context variables into a global variable or a custom logging record.
- Log Query Results: After the Get Records element, use a Decision element to check whether the resulting collection is empty.
- Create Debug Record: If the collection is empty, create a new custom object record (or a temporary record) containing the time of execution, the user context (if accessible), and the criteria used in the query. Save this debug record.
- Inspect the Record: When the scheduled time passes, check the newly created debug record. This record will tell you definitively what data the Flow thought it was operating on.
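If you prefer a reusable logging mechanism over ad-hoc record creation, the same pattern can be wrapped in an invocable Apex action that the Flow calls. A minimal sketch, assuming a hypothetical custom object Flow_Debug_Log__c with a Message__c text field:

```apex
public with sharing class FlowDebugLogger {
    // Exposes a "Log Flow Context" action in Flow Builder.
    @InvocableMethod(label='Log Flow Context')
    public static void log(List<String> messages) {
        List<Flow_Debug_Log__c> logs = new List<Flow_Debug_Log__c>();
        for (String msg : messages) {
            logs.add(new Flow_Debug_Log__c(
                Message__c = 'Ran at ' + System.now() + ' as '
                           + UserInfo.getUserName() + ': ' + msg
            ));
        }
        insert logs;
    }
}
```

Calling this action at the start of the Flow and after the Get Records element captures both the execution context and the query outcome in one place.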
Key Takeaways
When a Scheduled Flow runs without errors but performs no DML, focus debugging efforts away from the Flow's internal logic and towards data ingress (what data it finds) and contextual constraints (user permissions and sharing rules specific to the asynchronous runner). Always log the output of your Get Records element immediately after execution to confirm if zero records were returned.