A Better Way to Debug a Screen Flow

Have you ever wished you could debug a screen flow using the same UI that you have when you are debugging a triggered flow?

Record Triggered Flow Debugger

You get to see the path taken through the flow, and you can choose to expose or hide the details behind each of the executed nodes in the flow.

This is much easier than what you get with a screen flow, which is an almost endless run-on list of everything that happened in the flow.

Screen Flow Debugger

When debugging a screen flow, you are constantly scrolling and searching, trying to find the exact information you need to debug your flow.


Wouldn’t something like this make debugging a screen flow much easier?

Screen Flow Debugging Improved

I’ve created a special subflow you can install in your orgs and use in your screen flows to give you this improved debugging experience. You simply insert this subflow into your screen flow where you want it to stop, then execute the flow from the system or directly from the Flow Builder Debug button. An error will be generated when the subflow is reached, and you will get an email with a link that, when clicked, will take you to Flow Builder in debug mode with the path outlined and each individual step available to review separately.


If you want to be able to execute your flow with or without triggering the debug error, just follow these steps.

  • Create a Boolean variable called vDebug and make it available for input
  • Add the “Debug – Force Flow Error – Subflow” subflow to your flow and pass in your vDebug variable. If the value of the variable is True, or you don’t pass in anything (Default = True), the error will be triggered. If you pass in a value of False (vDebug was not checked), the error will be bypassed and your flow will continue normally.
  • When debugging the screen flow from Flow Builder, check the checkbox for vDebug
  • When you run your flow, you will get an error message when the subflow is executed
  • You will also get an error email with a link you can click to be taken to your flow with the improved debugger UI

TIP: You can Cut/Paste the error node to move it around your flow to try out different debugging scenarios.


Installation

Production or Developer Version 1.2
Sandbox Version 1.2

Post Installation: Make sure the flow “Debug – Force Flow Error – Subflow” is Activated

Package Contents:

  • Custom Object – DebugForceError__c
    • Custom Field – ForceValidationError__c
    • Validation Rule – TriggerFlowError
    • Page Layout
  • Flow – Debug_Force_Flow_Error_Subflow

Notes

  • The screen flow version being debugged must be in Auto-Layout mode
  • The screen flow version being debugged must be Active
  • There is a limit to the size of the Flow Interview that may cause the email link to not be available. (I’m not sure yet how big that is)
  • You must be a user who receives flow error emails (see documentation)

Use Flow to get the running User’s Time Zone offset from GMT

A couple of years ago, I created a component to convert a Date value to a Datetime value in a Flow. Recently, Andy Engin Utkan figured out a way to use this component to overcome issues he was having when using a Display Text component in a Flow to show Datetime values in the correct time zone.

Salesforce does not provide a formula function to determine a User’s time zone. Admins have created very complex formulas trying to calculate an offset based on the User’s State or Country, but then they ran into issues trying to handle Daylight Saving Time adjustments as well.

Here’s an example presented by Eric Praud on Jen Lee’s “How I Solved This” Admin Podcast where he created a new custom object, added 9 custom fields to the User object and came up with this formula to get the hour of the day when converted to the User’s local time:

IF( OR(
ISBLANK( $User.Summertime_Start_Offset__c ),

CreatedDate< DATETIMEVALUE(DATE(YEAR(DATEVALUE(CreatedDate)),MONTH($User.Summertime_Start_Date__c),DAY($User.Summertime_Start_Date__c))
-(WEEKDAY(DATE(YEAR(DATEVALUE(CreatedDate)),MONTH($User.Summertime_Start_Date__c),DAY($User.Summertime_Start_Date__c)))-1)) + $User.Summertime_Start_Offset__c /24,

CreatedDate>=
DATETIMEVALUE(DATE(YEAR(DATEVALUE(CreatedDate)),MONTH($User.Wintertime_start_Date__c),DAY($User.Wintertime_start_Date__c))
-(WEEKDAY(DATE(YEAR(DATEVALUE(CreatedDate)),MONTH($User.Wintertime_start_Date__c),DAY($User.Wintertime_start_Date__c)))-1))+$User.Wintertime_Start_Offset__c/24

),

HOUR(TIMEVALUE(CreatedDate+$User.GMT_Offset__c /24))
+IF( AND($User.Southern_Hemisphere__c, NOT(ISBLANK( $User.Summertime_Start_Offset__c ))),1,0)
-IF(HOUR(TIMEVALUE(CreatedDate+ $User.GMT_Offset__c /24))
+IF( AND($User.Southern_Hemisphere__c, NOT(ISBLANK( $User.Summertime_Start_Offset__c ))),1,0)>23,24,0)

,
HOUR(TIMEVALUE(CreatedDate+(1+ $User.GMT_Offset__c )/24))
-IF( AND($User.Southern_Hemisphere__c, NOT(ISBLANK( $User.Summertime_Start_Offset__c ))),1,0)
-IF(HOUR(TIMEVALUE(CreatedDate+(1+ $User.GMT_Offset__c )/24))
-IF( AND($User.Southern_Hemisphere__c, NOT(ISBLANK( $User.Summertime_Start_Offset__c ))),1,0)>23,24,0)

)

Andy figured out that by taking the difference between the Datetime returned by my component (midnight GMT) and the Datetime returned by the Flow formula DATETIMEVALUE(Date), you get the GMT offset for that date based on the running User’s time zone.

Here’s a sample sub-flow I created that you could use to get the User’s GMT Offset for any date. You can call this sub-flow anywhere you need to get the offset to use in Datetime value calculations for the User’s time zone.

A Date Variable is created for Input and a Number Variable is created for Output.

A Formula is used to assign a default value of Today if no date value is passed into the flow.

The result of the fDate formula is passed into the Convert Date to Datetime Flow Action.

And finally, the difference between the two Datetime values is calculated, converted to hours and passed back to the calling Flow.

Here’s an example of how you could call this sub-flow from your flow.


Depending on the date, the difference between GMT (Greenwich Mean Time), also known as UTC (Coordinated Universal Time), and the User’s time could be different.

For Example:

From the second Sunday of March at 07:00 UTC until the last Sunday of March at 01:00 UTC, London is four hours ahead of New York.

From the last Sunday of March at 01:00 UTC until the last Sunday of October at 01:00 UTC, London is five hours ahead of New York.

From the last Sunday of October at 01:00 UTC until the first Sunday of November at 06:00 UTC, London is four hours ahead of New York.

From the first Sunday of November at 06:00 UTC until the second Sunday of March at 07:00 UTC, London is five hours ahead of New York.

So, for me in the Eastern Time Zone, running the Flow with a Date of 3/1/2022 returns a value of -5 for my offset from GMT.

However, because of Daylight Saving Time, running the Flow with a date of 3/15/2022 returns the correct GMT offset of -4 for that date.
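Outside of Salesforce, the same date-dependent offset can be sketched in Python with the standard zoneinfo module. This is a hypothetical illustration of what the sub-flow computes, not part of the component itself:

```python
from datetime import date, datetime
from zoneinfo import ZoneInfo

def gmt_offset_hours(d: date, tz_name: str) -> float:
    """Offset from GMT, in hours, for the given date in the given time zone."""
    local_midnight = datetime(d.year, d.month, d.day, tzinfo=ZoneInfo(tz_name))
    return local_midnight.utcoffset().total_seconds() / 3600

# Eastern Time: standard time on 3/1/2022, Daylight Saving Time on 3/15/2022
print(gmt_offset_hours(date(2022, 3, 1), "America/New_York"))   # -5.0
print(gmt_offset_hours(date(2022, 3, 15), "America/New_York"))  # -4.0
```

Because the offset is computed for a specific date, the Daylight Saving Time transition is handled automatically, which is exactly what the formula-based approaches struggle with.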


Get the Convert Date to Datetime Flow Action on UnofficialSF.
https://unofficialsf.com/convert-date-to-datetime-flow-action/

Find more from Eric Smith on his blog.
https://ericsplayground.wordpress.com/

Find out more from Andy Engin Utkan on his blog.
https://salesforcebreak.com/

How I use Batch Files to Package my Datatable Component for Release

Narender Singh (ForcePanda) recently wrote a blog post on how he uses SFDX Packaging Commands in VSCode to release component packages. Like Narender, I found it difficult to remember and reuse all of the CLI commands needed to create packaged versions of my components.

When I release a new update for my Datatable component, I need to generate a new unmanaged package with a link so users can install and upgrade the component. To create the new package I first need to create a new version by executing a command line command that looks something like this:

sfdx force:package:version:create -v lexhost -w 10 -x -c -n 3.2.1.0 -d force-app\

Then, to make the new package available, I issue another command similar to this:

sfdx force:package:version:promote -v lexhost --package "datatable@3.2.1-0"

To save all of this typing and to make sure I’m issuing the correct syntax for the commands each time, I created a series of Batch Command files and included them in a packaging directory in my Datatable source code project.

To create a new version, I use my CreateNewVersion.bat command file. This file takes an optional input where I can specify the new version number like this:

>CreateNewVersion 3.2.2

To execute the second command, I run another command file by entering:

>Promote 3.2.2

To avoid any typos with mismatched version numbers, I created another command file to ask for and store the version number that is then used by the other commands if no version number is provided on the command line.

Here are each of the Batch Command Files I created and use to publish updates for Datatable.

SetVersion.bat
--------------
@echo off
if "%1" neq "" goto skipprompt
set /p version="Set Version Number: "
goto exit
:skipprompt
set version=%1
:exit
rem Use CALL so control returns to this script after ShowVersion.bat runs
call ShowVersion
@echo on

ShowVersion.bat
---------------
@echo off
echo Version: %version%
@echo on

CreateNewVersion.bat
--------------------
@echo off
if "%1" neq "" set version=%1
@echo on
sfdx force:package:version:create -v lexhost -w 10 -x -c -n %version%.0 -d force-app\

Promote.bat
-----------
@echo off
if "%1" neq "" set version=%1
@echo on
sfdx force:package:version:promote -v lexhost --package "datatable@%version%-0"

Details.bat
-----------
@echo off
if "%1" neq "" set version=%1
@echo on
sfdx force:package:version:report -v lexhost --package "datatable@%version%-0"

Here’s an example of them in actual use when I released a recent Datatable update: (My entries shown in bold)

D:\esmit\Documents\VSCode\LightningFlowComponents\flow_screen_components\datatable>packaging\setversion
Set Version Number: 3.4.5
Version: 3.4.5


D:\esmit\Documents\VSCode\LightningFlowComponents\flow_screen_components\datatable>packaging\createnewversion 

D:\esmit\Documents\VSCode\LightningFlowComponents\flow_screen_components\datatable>sfdx force:package:version:create -v lexhost -w 10 -x -c -n 3.4.5.0 -d force-app\
Request in progress. Sleeping 30 seconds. Will wait a total of 600 more seconds before timing out. Current Status='Queued'
Request in progress. Sleeping 30 seconds. Will wait a total of 570 more seconds before timing out. Current Status='Verifying dependencies'
Request in progress. Sleeping 30 seconds. Will wait a total of 540 more seconds before timing out. Current Status='Verifying dependencies'
Request in progress. Sleeping 30 seconds. Will wait a total of 510 more seconds before timing out. Current Status='Verifying dependencies'
Request in progress. Sleeping 30 seconds. Will wait a total of 480 more seconds before timing out. Current Status='Verifying dependencies'
Request in progress. Sleeping 30 seconds. Will wait a total of 450 more seconds before timing out. Current Status='Verifying dependencies'
Request in progress. Sleeping 30 seconds. Will wait a total of 420 more seconds before timing out. Current Status='Verifying dependencies'
Request in progress. Sleeping 30 seconds. Will wait a total of 390 more seconds before timing out. Current Status='Verifying dependencies'
Request in progress. Sleeping 30 seconds. Will wait a total of 360 more seconds before timing out. Current Status='Verifying metadata'
Request in progress. Sleeping 30 seconds. Will wait a total of 330 more seconds before timing out. Current Status='Verifying metadata'
Request in progress. Sleeping 30 seconds. Will wait a total of 300 more seconds before timing out. Current Status='Verifying metadata'
Request in progress. Sleeping 30 seconds. Will wait a total of 270 more seconds before timing out. Current Status='Verifying metadata'
Request in progress. Sleeping 30 seconds. Will wait a total of 240 more seconds before timing out. Current Status='Verifying metadata'
Request in progress. Sleeping 30 seconds. Will wait a total of 210 more seconds before timing out. Current Status='Finalizing package version'
Request in progress. Sleeping 30 seconds. Will wait a total of 180 more seconds before timing out. Current Status='Finalizing package version'
sfdx-project.json has been updated.
Successfully created the package version [08c5G000000kAvYQAU]. Subscriber Package Version Id: 04t5G000003rUrOQAU
Package Installation URL: https://login.salesforce.com/packaging/installPackage.apexp?p0=04t5G000003rUrOQAU
As an alternative, you can use the "sfdx force:package:install" command.

D:\esmit\Documents\VSCode\LightningFlowComponents\flow_screen_components\datatable>packaging\Promote

D:\esmit\Documents\VSCode\LightningFlowComponents\flow_screen_components\datatable>sfdx force:package:version:promote -v lexhost --package "datatable@3.4.5-0"
Are you sure you want to release package version datatable@3.4.5-0? You can't undo this action. Release package (y/n)?: y
Successfully promoted the package version, ID: 04t5G000003rUrOQAU, to released. Starting in Winter ‘21, only unlocked package versions that have met the minimum 75% code coverage requirement can be promoted. Code coverage minimums aren’t enforced on org-dependent unlocked packages.

D:\esmit\Documents\VSCode\LightningFlowComponents\flow_screen_components\datatable>packaging\Details

D:\esmit\Documents\VSCode\LightningFlowComponents\flow_screen_components\datatable>sfdx force:package:version:report -v lexhost --package "datatable@3.4.5-0"
=== Package Version
Name                            Value
──────────────────────────────  ─────────────────────────────────────────────
Name                            Datatable v3
Subscriber Package Version Id   04t5G000003rUrOQAU
Package Id                      0Ho5G000000XZNaSAO
Version                         3.4.5.0
Description                     Datatable Flow Screen Component by Eric Smith
Branch
Tag
Released                        true
Validation Skipped              false
Ancestor                        N/A
Ancestor Version                N/A
Code Coverage                   92%
Code Coverage Met               true
Org-Dependent Unlocked Package  No
Release Version                 54.0
Build Duration in Seconds       426
Managed Metadata Removed        N/A
Created By                      0055G00000607zRQAQ

Let’s Revisit How to use both the Selected and the Edited records in a Datatable

A year ago, I showed you how you can use a Flow with a Loop and a special Apex action to update the Selected records from the Datatable component with the Edited records from the same Datatable.

Now I’m going to show you how you can get rid of the Loop and use a different action and a simple assignment to produce the same results.

My Datatable Flow Screen Component allows a user to both Select records and make Edits to records.  The component returns two separate collection variables.  One of them includes the original values of just the Selected records.  The other one includes just the Edited records whether they were selected or not.

Sometimes, you may want to process just the selected records in your flow, but include the edited values in those selected records.  This sample flow shows how you can create a selected record collection with edits.  The flow uses the Get Common and Uncommon Records action that is part of the Collection Actions from UnofficialSF.com to compare the collections returned by the Datatable and extract the common and unique records from them.  

Here’s a Datatable displaying all of the Product records from an Opportunity.

I’ll select 3 records and make edits to 2 of them.

My normal outputs from the Datatable include the original 3 selected records and a separate collection of just the edited records.

To create a record collection that combines these two requires just a couple extra steps in your flow.  The magic happens in the Get Common and Uncommon Records Flow Action.  This action takes two separate record collections, Source (Selected) and Target (Edited), along with a field from each (Id) that is used to match them up and returns 4 separate record collections.

  • Source Common – Records from the Source collection that are also in the Target collection
  • Source Unique – Records from the Source collection that are not in the Target collection
  • Target Common – Records from the Target collection that are also in the Source collection
  • Target Unique – Records from the Target collection that are not in the Source collection

First, I created a Record Collection variable for Opportunity Product that will be used to store the combined outputs from the Action.

I pass the outputs from the Datatable into the Get Common and Uncommon Records Action.

We are only going to use two of the output collections and they will be combined into a single collection of Selected records that include any of the Edits made to them.

  • Source Unique – All Selected records that are not also Edited records
  • Target Common – All Edited records that are also Selected records

An Assignment node is used to Add each of the desired collection outputs from the Action into our final record collection of selected and edited records.


As you can see in the Results Screen, the Selected Records are unique from the Edited Records, but the combined Selected Products with Edits includes all Selected with any Edits made to those records.
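The combine step is easy to picture outside of Flow as well. This Python snippet (purely illustrative; the names are mine, not the action’s API) matches records on the Id field the same way the Get Common and Uncommon Records action does, then appends Source Unique and Target Common into one collection:

```python
def combine_selected_with_edits(selected, edited, key="Id"):
    """Return the selected records, substituting the edited version where one exists."""
    edited_ids = {r[key] for r in edited}
    selected_ids = {r[key] for r in selected}
    source_unique = [r for r in selected if r[key] not in edited_ids]  # Selected, not edited
    target_common = [r for r in edited if r[key] in selected_ids]      # Edited and selected
    return source_unique + target_common

selected = [{"Id": "1", "Qty": 1}, {"Id": "2", "Qty": 1}, {"Id": "3", "Qty": 1}]
edited   = [{"Id": "2", "Qty": 5}, {"Id": "3", "Qty": 9}]
print(combine_selected_with_edits(selected, edited))
# [{'Id': '1', 'Qty': 1}, {'Id': '2', 'Qty': 5}, {'Id': '3', 'Qty': 9}]
```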

Validation Checker Flow Action


Created by Eric Smith


See https://unofficialsf.com/validation-checker-flow-action/ for the most up to date information on this component.


Don’t your users just love it when they see this on their screen?

The fun really starts when you get this email in your inbox.

You could certainly take the time and effort to add a bunch of decision elements and recreate all of your validation rules in your flow.  I bet it would be even more exciting trying to keep all of that maintained.  

Wouldn’t it be nice if your Flows could check for and trap validation rule failures, missing required fields and text overruns?

Well now they can with this simple Check Validation flow action.  

Before you attempt a Create Records or Update Records element in your Flow, add this simple Flow Action and pass it a single record or a collection of records.

This action will check your record(s) and let you know if there would be any Validation Rule errors, field size overruns or missing required field errors if you tried to create or update the record(s).

All this is done before you try to do something that would cause the Flow to fail with an unhandled fault.  You get to determine what to do next and what information you want to present to your user.


You can also use this action as part of your Fault Paths. Instead of just handling the fault without knowing exactly what it is, you can get the error(s) that caused the fault and act on them as you see fit.


In this Flow, a Toast Message will be displayed with any errors and the User will be given a choice of whether or not they want to Try Again or Exit.


You can specify which field from your record you want to include in the error messages to help identify which record(s) failed.

The action returns an isError boolean value along with a text message of the individual errors.  

Validation Rule Failures

Missing Required Field Error

Text Field Size Error

You can even specify an optional input attribute to use this action instead of the Create Records or Update Records to actually perform the inserts and updates if no errors are generated.  If you perform a successful insert of new records, the new record ID(s) will be returned by the action.

Take control of your automations and stop worrying about Flows crashing when User-provided values won’t pass Validation Rules or other input checks.


Restrictions

  • If there are multiple types of errors, only one type of error will be returned.
  • If any Text fields are over their size limit, only those errors will be returned.
  • If any Validation Rule fails, all Validation Rule failures will be returned.
  • If any Required Fields are missing, only those errors will be returned.
  • Fields over their size limit are handled first, followed by Validation Rules, followed by Required Fields.
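The precedence rules above can be sketched as a simple selection. This is an illustrative sketch of the documented behavior, not the action’s actual Apex:

```python
def select_errors(size_errors, validation_errors, required_errors):
    """Return only one category of errors, in the action's order of precedence."""
    if size_errors:           # field-size overruns are handled first
        return size_errors
    if validation_errors:     # then all Validation Rule failures
        return validation_errors
    return required_errors    # finally, missing required fields

print(select_errors([], ["Rule1 failed", "Rule2 failed"], ["Name is required"]))
# ['Rule1 failed', 'Rule2 failed']
```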

Attributes

INPUT

  • Input Record (SObject) – Any Standard or Custom SObject Record.
    Required: Provide either a Record or Record Collection, not both
  • Input Record Collection (SObject Collection) – Any Standard or Custom SObject Record Collection.
    Required: Provide either a Record or Record Collection, not both
  • Record Identifier Field API Name (String, Field API Name) – The record’s value for this field will be included as part of the error message.
    NOTE: The field must be included in the Record or Record Collection.
    Optional, Default: Id
  • If no errors, commit inserted & updated records? (Boolean) – Set to True if you want the action to upsert the record(s) if there are no errors.
    Optional, Default: False

OUTPUT

  • isError (Boolean) – True if there were any errors
  • errorMessages (String) – A single string that includes the error message for each failing record
  • firstInsertedId (String) – If the commit attribute is set to True and there are no errors, this will be the recordId of the first inserted record
  • insertedIdCollection (String Collection) – If the commit attribute is set to True and there are no errors, this will be a String collection of the recordIds of all of the inserted records

Installation

Production or Developer Version 1.1

Sandbox Version 1.1


View Source

Source Code

How to Use an Apex-Defined Object with the Datatable Flow Component


Updated 1/23/21 to reference the v3 version of the Datatable that utilizes a Custom Property Editor.

I’ve updated my Datatable Lightning Web Component for Flow Screens to support a User Defined (also known as an Apex-Defined) object.

See my Flow and Process Builder List View with Batch Delete App for an example.

To work with an Apex-Defined object in your Flow, you need to create an Apex Descriptor Class for the object.

SampleClassDescriptor.cls

// Apex-Defined Variable Sample Descriptor Class
public with sharing class SampleClassDescriptor {

    // @AuraEnabled annotation exposes the methods to Lightning Components and Flows
    @AuraEnabled
    public String field1;

    @AuraEnabled
    public String field2;

    @AuraEnabled
    public Boolean field3;

    @AuraEnabled
    public Integer field4;    

    // Define the structure of the Apex-Defined Variable
    public SampleClassDescriptor(
            String field1,
            String field2,
            Boolean field3,
            Integer field4
    ) {
        this.field1 = field1;
        this.field2 = field2;
        this.field3 = field3;
        this.field4 = field4;
    }

    // Required no-argument constructor
    public SampleClassDescriptor() {}
}

In your Flow, you can create and use Apex-Defined record and record collection variables by referencing your Apex Class.

All of the fields in your variable will be available to use in the Flow.

In this sample Flow, I am setting field values in individual records as seen above.

I then add each record to the Apex-Defined record collection variable.

The Datatable component expects a serialized string of the object’s records and fields like the text seen here.

[{"field1":"StringRec1Value1","field2":"StringRec1Value2","field3":false,"field4":10},
{"field1":"StringRec2Value1","field2":"StringRec2Value2","field3":true,"field4":20},
{"field1":"StringRec3Value1","field2":"StringRec3Value2","field3":true,"field4":30}]

Since you can create Apex Flow Actions to work with your Apex-Defined object, I created an action that converts an Apex-Defined record collection to a serialized string that can be passed to the Datatable component. The action will also convert a serialized string back to a record collection. This can be useful in a Flow where you want to act on a collection of selected or edited records that get passed back to the Flow by the Datatable component.
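The same round trip is easy to demonstrate outside of Apex. This Python sketch (illustrative only) serializes a record collection into the string format the Datatable expects and deserializes it back:

```python
import json

records = [
    {"field1": "StringRec1Value1", "field2": "StringRec1Value2", "field3": False, "field4": 10},
    {"field1": "StringRec2Value1", "field2": "StringRec2Value2", "field3": True, "field4": 20},
]

serialized = json.dumps(records)       # record collection -> string for the Datatable
round_trip = json.loads(serialized)    # string from the Datatable -> record collection

print(round_trip == records)  # True
```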

Special Note: Even without an Apex-Defined Class, you can build a String in your Flow formatted as above and use that to populate a datatable.

You can use this code as a template for your own Apex actions designed to work with a Flow.

TranslateApexDefinedRecords.cls

/** 
 * 
 *  Sample Apex Class Template to get data from a Flow, 
 *  Process the data, and Send data back to the Flow
 * 
 *  This example translates an Apex-Defined Variable 
 *  between a Collection of Object Records and a Serialized String
 * 
 *  Eric Smith - May 2020
 * 
 * 
**/ 

public with sharing class TranslateApexDefinedRecords {         // *** Apex Class Name ***

    // Attributes passed in from the Flow
    public class Requests {
    
        @InvocableVariable(label='Input Record String')
        public String inputString;

        @InvocableVariable(label='Input Record Collection')
        public List<SampleClassDescriptor> inputCollection;     // *** Apex-Defined Class Descriptor Name ***

    }

    // Attributes passed back to the Flow
    public class Results {

        @InvocableVariable
        public String outputString;

        @InvocableVariable
        public List<SampleClassDescriptor> outputCollection;    // *** Apex-Defined Class Descriptor Name ***
    }

    // Expose this Action to the Flow
    @InvocableMethod
    public static List<Results> translateADR(List<Requests> requestList) {

        // Instantiate the record collection
        List<SampleClassDescriptor> tcdList = new List<SampleClassDescriptor>();    // *** Apex-Defined Class Descriptor Name ***

        // Prepare the response to send back to the Flow
        List<Results> responseWrapper = new List<Results>();

        // Bulkify processing of multiple requests
        for (Requests req : requestList) {

            // Use a new response object for each request so every
            // entry in the response wrapper is distinct
            Results response = new Results();

            // Get Input Value(s)
            String inputString = req.inputString;
            tcdList = req.inputCollection;


// BEGIN APEX ACTION PROCESSING LOGIC

            // Convert Serialized String to Record Collection
            List<SampleClassDescriptor> collectionOutput = new List<SampleClassDescriptor>();   // *** Apex-Defined Class Descriptor Name ***
            if (inputString != null && inputString.length() > 0) {
                collectionOutput = (List<SampleClassDescriptor>)System.JSON.deserialize(inputString, List<SampleClassDescriptor>.class);    // *** Apex-Defined Class Descriptor Name ***
            }

            // Convert Record Collection to Serialized String
            String stringOutput = JSON.serialize(tcdList);

// END APEX ACTION PROCESSING LOGIC


            // Set Output Values
            response.outputString = stringOutput;
            response.outputCollection = collectionOutput;
            responseWrapper.add(response);

        }
        // Return values back to the Flow
        return responseWrapper;
    }
}

TranslateApexDefinedRecordsTest.cls

@isTest
public with sharing class TranslateApexDefinedRecordsTest {

    static testMethod void test() {

        List<SampleClassDescriptor> inputList = new List<SampleClassDescriptor>();

        TranslateApexDefinedRecords.Requests testRequest = new TranslateApexDefinedRecords.Requests();

        testRequest.inputString = '[{"field1":"value1","field2":"value2"},{"field1":"value31","field2":"value4"}]';
        testRequest.inputCollection = inputList;

        List<TranslateApexDefinedRecords.Requests> testRequestList = new List<TranslateApexDefinedRecords.Requests>();
        testRequestList.add(testRequest);

        List<TranslateApexDefinedRecords.Results> testResponseList = TranslateApexDefinedRecords.translateADR(testRequestList);
        system.debug('RESPONSE - '+testResponseList);
        system.assertEquals(2, testResponseList[0].outputCollection.size());
    }

}

When you configure the attributes for the Datatable in your Flow, you need to be aware of these settings:

  1. Start by checking Input data is Apex-Defined in the Advanced section
  2. In the Data Source section, enter your Datatable Record String
  3. Also in the Data Source section, enter a Datatable Record String for any Pre-Selected Rows
  4. There is no Column Wizard so you will have to list your Column Field names and other attributes manually.
  5. For Currency, Number and Percent fields the Column Scales attribute lets you specify the number of places to display after the decimal point. The default is 0.
  6. When you are using an SObject collection, the Datatable component gets information about all of the fields from the system. For a User Defined object, you need to specify the Column Type of data for each field. This can be left blank if all of the columns are text fields.
  7. You are required to provide the name of the datatable’s Key Field. All of the values in this field need to be unique in order for the datatable to function correctly.

There are separate output parameters for Selected and Edited User Defined objects as well.

The (User Defined) outputs will be serialized record strings rather than SObject collections. Be sure to reference the correct ones based on how you assigned the True or False value for the Input data is Apex-Defined attribute.


Get the Sample Flow and Source Code here:

https://github.com/ericrsmith35/Apex-Defined-Example

Datatable Now Includes a Custom Property Editor

See this in action during DreamTX!

Flow Builder Demos 12/17 (1PM, 2PM, 3PM Eastern Time)

The Datatable Flow Screen Component has come a long way from the original Aura component, which included separate attributes for 10 different Salesforce objects in a single component. Once Salesforce supported picking the desired Object in Flow Builder at configuration time, it was rebuilt from scratch as a Lightning Web Component. Now Datatable has been reimagined again with the addition of a Custom Property Editor that’s used by Flow Builder whenever a Datatable is added to a Flow Screen.


Custom Property Editors allow a developer to bypass the standard basic list of all component attributes in the Flow Builder and replace it with a Lightning Web Component that can present a logical and formatted interface for the user to configure the component. The CPE developed for the Datatable component takes this even further by including a button that launches a separate Flow that displays a special Datatable the user can interact with to configure their Datatable. I like to refer to this as my Custom Column Configuration Wizard.

Once installed, this component will appear as Datatable in the Flow Builder. DatatableV2 will still work with your existing Flows and can coexist with the new Datatable.

Here are a few examples showing how to build a Flow with a Datatable, how to configure the Datatable using the Custom Property Editor and how a user can interact with a Datatable. For complete documentation, visit the Datatable page.


Build a Flow with a Datatable


Configure a Datatable with the Custom Property Editor


Interact with a Datatable

Import and Export Flows between Salesforce Orgs

Latest Update: 3/19/23 Version 2.0.1

Have you ever wanted to copy or move a Flow or Process Builder from one org to another without having to create a Change Set or rebuild it from scratch?

How about seeing a great Flow that you or someone else has created and would like to share?

Install this Flow in your org if you would like to Export or Import Flows and Process Builders.

Instructions

Export a Flow

  1. Select Setup > Process Automation > Flows
  2. Open Import/Export Flows

  3. Run the Flow
  4. Select Export, choose your Flow and click Next

  5. You will see the export status while the Flow is being transferred
  6. A success message will display once the Flow has been exported
  7. The exported Flow is saved as a Salesforce File linked to your User record.  To see your Files, click the App Launcher (the waffle icon), type in Files, and select Files.

  8. To download and save your Flow file so it can be shared, select the drop-down arrow and choose Download.

Import a Flow

  1. Select Setup > Process Automation > Flows
  2. Select Import/Export Flows
  3. Run the Flow

  4. Select Import then click on Upload Files

  5. Pick the Flow file from your computer’s file dialog box.  Hint: The file names for Flows and Process Builders will end with .flow-meta.xml
  6. After the file has been uploaded, select Done then click Next

  7. You will see the status while the Flow is being imported and then deployed to the current org

  8. A success message will display once the Flow has been imported and deployed

NOTE: Objects, fields and referenced components must be available and compatible in the new org in order for the Flow to be deployed successfully.


Release Notes:

3/18/23 – Eric Smith – Version 2.0.1
Refactored to use the latest Flow Base Packs
Installation updated to use Unlocked Package links

1/2/21 – Eric Smith – Version 1.2
Refactored the Flow to use the Flow Base Packs

9/3/20 – Eric Smith – Version 1.1
Updated the Flow Base Components (v1.2.6) to resolve an issue where some Flows would generate an error on Import or Export

9/1/20 – Eric Smith – Version 1.0
Initial Release

Installation Instructions

Source Code

CONVERT DATE TO DATETIME FLOW ACTION

It’s fairly easy to extract the Date portion of a Datetime value in a Flow formula.

DATEVALUE({!datetimeValue})

If {!datetimeValue} = 7/22/2020 5:00 PM then the formula will return July 22, 2020.

It is a bit trickier to convert a Date value to a Datetime value using a formula. 

DATETIMEVALUE(TEXT({!dateValue}) + " 00:00:00")

If {!dateValue} = July 22, 2020 you would want the formula to return 7/22/2020 12:00 AM.  Instead, the text is interpreted as GMT, so the result I get in Portland, Maine is 7/21/2020 8:00 PM.  My local time is 4 hours behind GMT, so midnight GMT displays as 8:00 PM on the previous day.

I could find no easy way to do time-zone calculations in Flow, so I created a Flow Action that keeps everything in my current time zone.

With this action, my result for July 22, 2020 is 7/22/2020 12:00 AM.

If you want a timestamp of other than 12:00 AM, you can pass in the desired hour, minute and second values.
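Flow formulas have no code to show, but the GMT shift and the action's fix can be sketched in Python. This is illustrative only: the America/New_York zone stands in for my Eastern (Portland, Maine) time zone, and the function names are made up.

```python
from datetime import date, datetime, time, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

LOCAL_TZ = ZoneInfo("America/New_York")  # stand-in for the running user's zone

def datetimevalue_formula(d: date) -> datetime:
    # Mimics DATETIMEVALUE(TEXT({!dateValue}) + " 00:00:00"):
    # the text is parsed as midnight GMT, then displayed in local time.
    as_gmt = datetime.combine(d, time(0, 0, 0), tzinfo=timezone.utc)
    return as_gmt.astimezone(LOCAL_TZ)

def convert_date_to_datetime(d: date, hour=0, minute=0, second=0) -> datetime:
    # Mimics the Flow Action: build the timestamp directly in the
    # running user's time zone, so midnight stays midnight.
    return datetime.combine(d, time(hour, minute, second), tzinfo=LOCAL_TZ)

d = date(2020, 7, 22)
print(datetimevalue_formula(d))      # 2020-07-21 20:00:00-04:00  (shifted!)
print(convert_date_to_datetime(d))   # 2020-07-22 00:00:00-04:00  (as intended)
print(convert_date_to_datetime(d, 17, 30))  # 2020-07-22 17:30:00-04:00
```

During Eastern daylight time the offset is 4 hours, which matches the 8:00 PM result above; in winter the same formula would shift by 5 hours instead.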


Attributes

Attribute | Type | Notes
--- | --- | ---
Date Value | Date | The Date to be used for the Datetime value
Hour Value (Default = 0) | Integer | (Optional) The Hour value to be used for the Datetime value (0-23)
Minute Value (Default = 0) | Integer | (Optional) The Minute value to be used for the Datetime value (0-59)
Second Value (Default = 0) | Integer | (Optional) The Second value to be used for the Datetime value (0-59)

Use Case

Read about an example that uses this action in a Flow to get the Time Zone offset for the running User.


Install

Created by – Eric Smith – July 2020

Unmanaged v1.0 (Production/Developer)
Unmanaged v1.0 (Sandbox)

Source Code

Source

How to use both the Selected and the Edited records in a Datatable

See an updated version of this post here: Let’s Revisit How to use both the Selected and the Edited records in a Datatable

My Datatable Flow Screen Component allows a user to both Select records and make Edits to records.  The component returns two separate collection variables.  One of them includes the original values of just the selected records.  The other one includes just the edited records whether they were selected or not.

Sometimes, you may want to process just the selected records in your flow, but include the edited values in those selected records.  This sample flow shows how you can create a selected record collection with edits.  The flow loops through each of the selected records and if no edits were made to it, it gets added to the final collection.  If an edited version of the record is available, it gets added to the final collection instead.
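Flows are declarative, so there is no code behind this, but the combine logic the loop implements can be sketched in Python (the record Ids and fields below are invented for illustration):

```python
def combine_selected_and_edited(selected, edited):
    """For each selected record, use its edited version (matched on Id)
    if one exists; otherwise keep the original selected record."""
    edited_by_id = {rec["Id"]: rec for rec in edited}
    return [edited_by_id.get(rec["Id"], rec) for rec in selected]

# Three records selected, one of them edited, plus one edit on an
# unselected record, which is ignored.
selected = [
    {"Id": "001", "Quantity": 1},
    {"Id": "002", "Quantity": 2},
    {"Id": "003", "Quantity": 3},
]
edited = [
    {"Id": "002", "Quantity": 5},  # selected and edited: replaces the original
    {"Id": "004", "Quantity": 9},  # edited but not selected: dropped
]
final = combine_selected_and_edited(selected, edited)
print(final)  # record 002 carries Quantity 5; record 004 does not appear
```

The flow version does the same matching with a Loop over the selected collection and the "Find Records In Collection" action in place of the dictionary lookup.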

Here’s the Select & Edit Screen Datatable displaying all of the Product records from an Opportunity.

I’ll select 3 records and make edits to 2 of them.

My final results show the 3 selected records with the edited fields included.


Let’s go through the steps you can follow to combine the outputs from the Datatable component. In this section, I’ll show you how each of the nodes in the flow is configured.

I start with an Opportunity record with an Id that is passed into the flow.  Whenever possible, let the flow Automatically store all field values.


All of the Opportunity Products are added to a Collection (OpportunityLineItem).


Because I am displaying a similar Datatable multiple times in this flow, I store my attributes in variables so they can be reused for each Datatable.


The Datatable is displayed using the attributes assigned earlier.  Here, the user can select and edit records.


I am manually assigning the outputs from the Datatable to new collection variables to make the steps in this flow more readable.


This loop will take us through each of the records in the collection of Selected records where we will perform the next few steps on each record.  New in Summer ‘20, the flow will automatically create a loop variable for you.


In this step, I look for a record in the collection of Edited records with the same Id as the current Selected record in the loop.  The “Find Records In Collection” Flow Action is part of a group of very powerful actions you can include in your flows to act on record collections.  The entire group can be found and installed from here.


The output from the Flow Action will be the matching record from the collection of Edited records if found, otherwise it will return a null value.  This Decision checks to see if that record was found.


When an Edited record is found, it is added to the final collection variable that we will be using for the rest of our flow.


When there is no matching Edited record, we use the Selected record from the Loop instead.


Whichever record we chose is then added to the final collection variable.


After the end of the Loop, I display another Datatable with the values from the final collection variable.  This is just an example.  In your own flow you may be doing something here like adding updated products to a new renewal Opportunity or something else with the selected and edited records.


Links

DatatableV2
https://unofficialsf.com/datatablev2-lightning-web-component-for-flow-screens/

Collection Actions for Flow
https://unofficialsf.com/list-actions-for-flow/

Sample Flow (Install in Sandbox)
This includes DatatableV2 and FindRecordsInCollection
https://test.salesforce.com/packaging/installPackage.apexp?p0=04t2i0000005IWc