Salesforce Data Cloud: Hands-on

Time required: 45 minutes

Workshop Goal: This hands-on lab is designed to give you first-hand experience with the tools and configuration tasks in the Salesforce Data Cloud platform.

Workshop Use Case

Business Requirement Summary

How do I get a 360-degree view of my company's customers when information about them is spread across data repositories and Salesforce Orgs? For example, if a lead referral is created in one system about a customer who also resides in another system, I should be able to centrally orchestrate such processes across multiple systems for a unified view of my business.

The following note describes the detailed use case scenario & solution based on the above requirement.

A company has customers in two different systems (two different Orgs in our example). The customers in the different systems have different profile values (e.g., different first names, phone numbers), and each system has its own set of leads, accounts, and contacts.

In this lab, we ingest data from the two systems (i.e., the two Orgs) and harmonize the profiles by mapping them to one common data model in Data Cloud. We then resolve the different identities into one unified profile.

Next, we create Calculated Insights (i.e., analytics) on these customers and also segment them (i.e., create a group) into a set of customers whom we want to reach out to.

Data Cloud has native connections to Marketing Cloud and many other systems, so the segments we create can easily become available in many activation targets (i.e., marketing and ad systems, as well as external systems like Amazon and Google) for use in marketing communications. In our example, we store the segmented data in Data Cloud itself, where it can be used to trigger Platform Events or be retrieved through API calls for external use by any app.

Our Story (Use Case)

Steps in this section have been completed for you by the instructor. Participants are only expected to explore or view.

Create Customer 360 & Unify Related Data

In our workshop, we use Data Cloud to create a Unified Profile from multiple contact records in various Salesforce Orgs or other systems (e.g., websites, SAP, EDW).

Details of this unified profile can be pulled back into each of the connected systems (e.g., Salesforce Orgs) to enrich the contact record. In our example below we have pulled in the harmonized Person Name from the Unified Profile.

 

We can also pull in related information such as the various addresses and emails of this customer from different Orgs/systems so we have a complete view of the customer (contact).

 

A key benefit of harmonizing enterprise data is that all activities/engagements related to this customer in any system/org can now be related back to the contact records in all connected systems. We now have a full 360-degree view of the customer.

 

Let's see how Data Cloud helps us in achieving this view!

Module - 1: Setup (Configure)

Steps in this section have been completed for you by the instructor. Participants are only expected to explore or view.

Data Cloud has been provisioned in this instance of Salesforce and is available as an app in the Org. This Org is now known as the Home Org of Data Cloud.

Before you can turn on Data Cloud, you need to have the right permissions assigned to you (even if you are a sysadmin).

Data Cloud Permissions

Permission sets give you granular control over which users in your org can perform which activities in your Data Cloud instance.

When Data Cloud is provisioned in your Org, several out of the box (standard) permission sets become available. For a standard configuration, you do not need to configure custom permission sets.

Add users to permission sets. For example, assign the Data Cloud Admin permission set to the sysadmin who will manage the Data Cloud instance.

"Turn-On" Data Cloud

Turn on Data Cloud by clicking the 'Get Started' button in Data Cloud Setup.

This step installs the Standard Data Model Managed Package, creates the default data space and completes other configurations to get your Data Cloud instance ready for use.

It takes about 30 minutes for this step to complete after you press the 'Get Started' button.

Data Cloud App

To be able to view the Data Cloud app in the App Launcher you need at least one of the Data Cloud permission sets assigned to you (even if you have sysadmin profile).

The workshop participants have been assigned the 'Data Cloud Admin' permission set, which allows you to examine the Data Cloud configurations, plus the 'default' permission set, which gives you access to the default data space.

You are a sysadmin in the workshop environment, so please be careful! Please do not delete any of the mappings, calculated insights, data streams, or other elements in your Data Cloud instance.

Data Cloud App - Menu Tabs

Here's a quick look at the various Data Cloud tools available as tabs in the Data Cloud app.

Data Cloud App - Setup

The Data Cloud Setup button takes you to the pre-configuration page.

Connector Configuration

Following are the steps for configuring the CRM Connector which will then allow us to create Data Streams for ingesting data.

We click on the CRM Connector and then Connect. The CRM Connector provides a pre-built connection from Salesforce Orgs to Data Cloud. You can set up ingestion of records (i.e., create Data Streams) from either standard or custom objects.

Please also note that in the org(s) that you want to connect to Data Cloud, the connecting user needs to be assigned the Data Cloud Salesforce Connector permission set.

 

You will find that your 'Home Org' has already been connected automatically. Connect the other source orgs, and after connecting each source org, rename the 'Connector Name' so the names make sense to you and your team.

Install Data Bundle for the CRM Connector

The Data Bundle consists of pre-defined mappings for certain Salesforce objects (Account, Contact, and Lead for the Sales Bundle; Account, Contact, and Case for the Service Bundle).

Module - 2: Data Processing

Let's explore and familiarize ourselves with the Data Cloud app.

Steps in this section have been completed for you by the instructor. Participants are only expected to explore or view.

Ingestion: Create Data Stream

This step has been completed for you by the instructor. This is only an informational section for participants.

We create Data Streams to ingest the data from the Orgs that we connected in the previous steps. To create Data Streams we use the Sales Data Bundle for the CRM Connector that we installed in the previous steps. The Sales Data Bundle is a pre-built template that ingests three objects (Account, Contact, and Lead) from the connected Salesforce Org and includes mappings for these objects to the canonical data model in Data Cloud.

Workshop participants do not have the permission to create Data Streams.

However we have included the screenshots of all the screens that we went through to create the Data Streams.

 

 

 

You can also create Formula Fields while creating Data Streams. This feature lets you apply calculations and simple transformations quickly. In addition to formula fields, Data Cloud provides separate batch and streaming transform tools for more complex transformations.

 

In the above example we created a Data Stream using a pre-built connector. Data Cloud also supports both batch and streaming ingestion directly through APIs or with MuleSoft connectors. We have included an example of using Data Cloud Ingestion API at a later point in this guide.
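As a rough sketch of what such an API call looks like, the snippet below builds a streaming-ingestion POST request in Python. The tenant endpoint, connector name, and object name are hypothetical placeholders, and authentication (the token exchange with the Home Org) is omitted; consult the Ingestion API documentation for the exact contract.

```python
import json
from urllib.request import Request

# Hypothetical values -- substitute your tenant endpoint, ingestion
# connector API name, and object name from Data Cloud Setup.
TENANT_ENDPOINT = "https://example-tenant.c360a.salesforce.com"
SOURCE_API_NAME = "Workshop_Connector"
OBJECT_NAME = "lead_referrals"

def build_ingest_request(records, token):
    """Build a (hedged) streaming-ingestion POST for a list of record dicts."""
    url = f"{TENANT_ENDPOINT}/api/v1/ingest/sources/{SOURCE_API_NAME}/{OBJECT_NAME}"
    body = json.dumps({"data": records}).encode("utf-8")
    return Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

req = build_ingest_request(
    [{"id": "L-001", "first_name": "Robert", "email": "r.lombard@example.com"}],
    token="<access-token>",
)
print(req.full_url)
```

Sending the request (e.g., with urllib or the requests library) would then push the records into the Data Stream's Data Lake Object on the next ingest cycle.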

Harmonization: Data Stream Mappings

In our next step, harmonization refers to the process of modeling different data sources into one standardized data model available in Data Cloud. The standard Data Cloud data model can be extended by adding custom objects and custom fields.

Why do we need to harmonize data? In your different sources, your customer may be known by different labels/names. E.g., in an email marketing app the customer may be known as a 'subscriber,' while in the CRM app the same customer may be known as a 'contact.' In the harmonization step, they will both get mapped to the Individual object in Data Cloud's canonical data model.

The first step in harmonization is mapping. When you create a Data Stream, the source data lands in a set of objects called Data Lake Objects (DLOs). These DLOs are then mapped to the standard data model in Data Cloud, which consists of a set of objects called Data Model Objects (DMOs).

Let's click on one of the Data Streams and then click on Review Mappings on the right sidebar.

The harmonization process that we see above maps multiple source objects to a set of consistent objects and fields in the canonical data model.
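Conceptually, each mapping is a dictionary from source field names to canonical field names, applied per record. The source and target field names below are invented for illustration (real mappings use the ssot__ API names):

```python
# Illustrative per-source field mappings into a common Individual shape.
# Field names are made up for this sketch, not actual DLO/DMO API names.
MAPPINGS = {
    "email_subscriber": {"SubscriberKey": "Id", "SubFirstName": "FirstName"},
    "crm_contact": {"ContactId": "Id", "First_Name": "FirstName"},
}

def harmonize(source, record):
    """Map one source record to the canonical Individual field names."""
    field_map = MAPPINGS[source]
    return {target: record[src] for src, target in field_map.items()}

a = harmonize("email_subscriber", {"SubscriberKey": "S-1", "SubFirstName": "Bobby"})
b = harmonize("crm_contact", {"ContactId": "C-9", "First_Name": "Robert"})
print(a, b)
```

After this step, both the 'subscriber' and the 'contact' records share one shape, which is what makes the later matching and reconciliation steps possible.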

Data Cloud provides AI based auto-mapping suggestions during the Harmonization step.

This harmonized data can then be matched and resolved to unique identities in the next step of Identity Resolution.

Data Model: Review Data Cloud Objects

In the Data Cloud app, click on the Data Lake Objects tab. Then click the dropdown and select All Data Lake Objects.

The incoming data to Data Cloud from a Data Stream lands in a Data Lake Object (DLO). The fields in the DLO (e.g., field name, field type, field length) mirror the fields in your source table.

After we have created the mappings for the Data Stream, the data from the Data Lake Object is replicated to the Data Model Object (DMO) in Data Cloud.

Data Cloud provides a set of pre-built/standard DMOs, which constitute the canonical data model in Data Cloud. You can extend this canonical data model by creating custom DMOs or by adding fields to standard DMOs.

In the Data Cloud app, click on the Data Model Objects tab. Click on any Data Model Object (DMO) to review its properties. For example, the Account DMO shows that it belongs to the type Profile. You can also have DMOs of type Engagement, which store data coming in from websites (e.g., clicks, email opens). The third DMO type is Other, which holds data such as product information (e.g., store address, product models).

Data Explorer

Click on Data Explorer in the Data Cloud app. Put in the options as you see in the following screenshot.

The Individual DMO (data model object) contains all the individual profiles aggregated from the incoming data source objects (e.g., contacts, subscribers).

In the next step of Identity Resolution, we shall use match rules to merge individual profiles that belong to the same person into the Unified Individual data model object.

In the following step, with Data Cloud’s Identity Resolution, we build a 360-degree view of our customers by combining data from multiple data sources into a single Unified Profile.

Identity Resolution uses a combination of matching and reconciliation. Match Rules are used to link together multiple records into a unified customer profile. Reconciliation rules determine which profile attributes of Individuals will become part of the Unified Profile.

Identity Resolution: Match Rules

This step has been completed for you by the instructor. This is only an informational section for participants.

Many of the customers/contacts in the two different systems (e.g., the two Salesforce Orgs) belong to the same individual person. However the contacts have different profiles. For example in one Org the person has a contact name Bobby Lombard while in another Org, the person has the name Robert Lombard.

We want to match these different contacts into one unified profile. This will allow us then to aggregate all the leads and other engagement data under this unified profile for a 360 degree view of the customer.

We start by defining match rules. We choose First Name and Email as the identifiers for our match rule. However, you can make the match rule as extensive as you wish using and/or constructs. You can also include custom identifiers in your match rule, such as an MDM ID.

In our match ruleset, we leverage the fuzzy first-name matching method to match individuals whose first names vary across systems.
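To make the rule concrete, here is a toy version in Python: an exact (case-insensitive) email match combined with a fuzzy first-name match. Data Cloud's built-in fuzzy matching is far more sophisticated; the tiny nickname alias table below is only a stand-in for it.

```python
# Tiny stand-in for Data Cloud's built-in fuzzy first-name matching:
# a nickname alias table plus an exact (case-insensitive) email match.
NICKNAMES = {"bobby": "robert", "bob": "robert", "liz": "elizabeth"}

def canonical_first_name(name):
    """Normalize a first name to a canonical form via the alias table."""
    n = name.strip().lower()
    return NICKNAMES.get(n, n)

def is_match(profile_a, profile_b):
    """Toy match rule: fuzzy first name AND exact email."""
    return (
        canonical_first_name(profile_a["first_name"])
        == canonical_first_name(profile_b["first_name"])
        and profile_a["email"].lower() == profile_b["email"].lower()
    )

org1 = {"first_name": "Bobby", "email": "r.lombard@example.com"}
org2 = {"first_name": "Robert", "email": "R.Lombard@example.com"}
print(is_match(org1, org2))  # True
```

With this rule, the Bobby Lombard and Robert Lombard contacts from the two Orgs link into one unified profile.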

We went through the following screens in this step.

 

 

 

 

 

Identity Resolution: Reconciliation

In the Data Cloud app, click on the Identity Resolution tab, and then click on the "Individual" ruleset name.

In this step you can view the default reconciliation rules that we used.

While the match rules from the previous steps link data together into a unified customer profile, reconciliation rules determine the logic for data selection. For example, if the contact has two different first names in two Orgs, a reconciliation rule determines which one to actually use in the unified individual profile.

Examples of reconciliation rules are Last Updated, Most Frequent, and Source Priority.
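These three rules can be sketched as small selection functions over the candidate field values. The record shape below is invented for illustration:

```python
from collections import Counter

# Each candidate carries a field value plus metadata the rules select on.
candidates = [
    {"value": "Bobby",  "last_updated": "2023-11-02", "source": "Org2"},
    {"value": "Robert", "last_updated": "2024-01-10", "source": "Org1"},
    {"value": "Robert", "last_updated": "2023-05-20", "source": "Org2"},
]

def last_updated(cands):
    # Pick the most recently modified value (ISO dates sort lexically).
    return max(cands, key=lambda c: c["last_updated"])["value"]

def most_frequent(cands):
    # Pick the value that occurs most often across sources.
    return Counter(c["value"] for c in cands).most_common(1)[0][0]

def source_priority(cands, priority):
    # Pick the value from the highest-priority source present.
    return min(cands, key=lambda c: priority.index(c["source"]))["value"]

print(last_updated(candidates))                       # Robert
print(most_frequent(candidates))                      # Robert
print(source_priority(candidates, ["Org2", "Org1"]))  # Bobby
```

Note how the same candidate set can yield different unified values depending on the rule chosen, which is why the admin configures the rule per field.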

A Data Cloud admin would have the ability to change these reconciliation rules as desired for each field.

Profile Explorer

Profile Explorer lets you examine a unified profile for an account or individual - including objects related to the unified profile, related engagements and insights (i.e., Calculated Insights defined on Unified Profiles).

Click on Profile Explorer in the Data Cloud app. Put in the options to filter the unified profiles as you see in the following screenshot (i.e., Unified Individual, First Name, Robert). Please note that Profile Explorer is case sensitive (e.g., Robert vs. robert).

Click on View. You now see that two different contacts with two different first names have been merged into one unified individual profile using the rulesets and reconciliation rules that we had chosen.

Congratulations! You have now completed the key foundational steps for Data Cloud. In the next sections we examine some of the ways of leveraging Data Cloud in your business applications.

Module - 3: Take Action

Participants in this section are invited to do the hands-on labs.

When creating any Data Cloud elements (e.g., calculated insights, activations), please prefix those elements with the first part of your username.

Segmentation & Activation

Let's now create a segment to identify target customers for our new marketing campaign.

Segment Creation - Step I : Calculated Insight
Calculated Insights and Streaming Insights are analytics tools natively available in Data Cloud. Calculated Insights perform complex calculations on data aggregated in batches, while Streaming Insights work on events happening in near-real time. A Calculated Insight lets you define metrics on data in Data Cloud using either SQL or a point-and-click builder. These metrics can then be used in other parts of Data Cloud, e.g., as an attribute while creating marketing segments. Let's create a Calculated Insight.

First we create a calculated insight to identify customers who have purchased a product from the company, i.e., who have a sales order.

 

Replace the placeholder code with the following SQL.

SELECT
    SUM(ssot__SalesOrder__dlm.ssot__TotalAmount__c) AS customer_spend__c,
    ssot__Individual__dlm.ssot__PersonName__c AS Name__c,
    ssot__Individual__dlm.ssot__Id__c AS custid__c
FROM ssot__SalesOrder__dlm
JOIN ssot__Individual__dlm
    ON ssot__SalesOrder__dlm.ssot__BillToContactId__c = ssot__Individual__dlm.ssot__Id__c
GROUP BY custid__c, Name__c

Gotcha: If the above code gives a syntax error, check your environment for an existing Calculated Insight (CI) called 'Customer Spend' and use the code from that CI, as the field names in your environment may be slightly different.

 

 

Please note: We have used a simple Calculated Insight (CI) function in our workshop example so it's easier to understand the syntax and the steps for creating the CI. For your projects, you can create much more complex Calculated Insights, as shown in the Calculated Insight examples from Salesforce.

Segment Creation - Step II: Use Attributes From Profile and Calculated Insight

We use attributes from both the Individual DMO and the Calculated Insight we had created to create a marketing segment.

You need to drag the fields from the left sidebar to the right, not just click on them.

Please note - In our example we are using the Individual DMO for simplicity. In real-life scenarios you may want to review segmentation best practices to determine the best entity to segment on.

 

 

 

 

 

 

Segment Creation - Step III: Activation

In this section, we first create an activation target. An activation target could be Marketing Cloud, an ad platform, Data Cloud itself, or other systems such as AWS.

For our workshop, we choose the Data Platform (i.e., Data Cloud) itself as the activation target. Selecting Data Cloud as the activation target saves the segment data into a DMO in Data Cloud. This data can then be used for customer engagement by pulling it via APIs or Data Actions into any connected system, such as Salesforce Orgs.

 

 

 

 

 

 

Now we have an activation target. Let's create an activation from the segment data targeting our activation target.

 

 

 

 

 

After activation, we need to publish the segment again. After both the segment and the activation are published, you can view the result of your activation (i.e., the list of segment members in the DMO you specified in the activation) using the Data Explorer tool.

Please note: In real-life scenarios you would include additional attributes during segmentation and activation, such as email address, home address, or SMS number, so that the marketing communication can be sent to the list of customers in your segment.

 

In the above example, we have chosen Data Cloud itself as the activation target. This means the segment data is stored in a custom Data Model Object (DMO) that gets created in Data Cloud. Data from this custom DMO can then be extracted via APIs into any desired app for use as a contact list for a marketing campaign.
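As an illustration of that extraction step, the sketch below turns a query result over the segment DMO into a contact list. The response shape shown is a simplified, hypothetical stand-in for what a real Data Cloud query API response contains (actual responses include additional metadata fields).

```python
import json

# Hypothetical, simplified response body from querying the segment DMO;
# real API responses carry extra metadata omitted here.
response_body = json.dumps({
    "data": [
        ["Robert Lombard", "r.lombard@example.com", 1250.0],
        ["Jane Doe", "jane@example.com", 980.0],
    ],
    "rowCount": 2,
})

def rows_to_contacts(body, columns):
    """Zip positional result rows into dicts keyed by column name."""
    payload = json.loads(body)
    return [dict(zip(columns, row)) for row in payload["data"]]

contacts = rows_to_contacts(response_body, ["name", "email", "spend"])
print(contacts[0]["email"])  # r.lombard@example.com
```

A marketing app would consume a list like this directly as its send list.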

In addition, Data Cloud supports out-of-the-box integration with Salesforce Marketing Cloud as an activation target. Other out-of-the-box activation targets include SFTP, GCS, and Azure Storage, as well as ad platforms such as the Google and Meta advertising platforms.

You have now completed many of the key ways of interacting with Salesforce Data Cloud. Congratulations!

 

Module - 4: OPTIONAL

This is an advanced, optional section. If you prefer to skip it, we have included a complete set of screenshots of each step that you can review without actually doing the steps in a live environment.

Visualize Data Cloud Data with Tableau Cloud

Tableau Analytics on Data Cloud data

Tableau Cloud provides a native connector for Data Cloud, offers point-and-click analysis to discover hidden insights, and lets you take action with AI-powered insights in the flow of work.

How to Connect Tableau Cloud with Data Cloud

Let's sign up for a 15-day free Tableau Cloud trial.

 

 

 

 

 

 

Now that you have the connection from Tableau Cloud to Salesforce Data Cloud you can start visualizing Data Cloud data.

The following guide provides the basics of data visualization in Tableau: Data Visualization in Tableau Cloud

Data Actions with Platform Events & Flow

Since only a limited set of Data Action Targets can be defined in a Data Cloud instance, participants may need to create Data Actions on a common/shared set of Data Action Targets. Alternatively, please ask the facilitators for access to a separate org if the currently configured Data Action Targets in your Data Cloud instance are not suitable for your needs.

Create Data Action and Monitor Payload in Target Org

We consider a very common scenario of orchestrating and monitoring business processes for leads across the different systems in an enterprise.

The following is a rough high level sketch of the steps for passing leads from Org 2 (source) to Org 1 (destination) via Data Cloud.

 

We can create a Data Action on a DMO or CI and receive the payload using a Flow in the target org. The Flow can be triggered by a special Platform Event (Data Object Data Change Event), and the contents of the payload can be mapped to fields using Apex or the Flow Mapper action pack available from the UnofficialSF site.

In our case we are going to create a Data Action on Lead DMO and our target would be Home Org for simplicity.

We start by connecting the target org explicitly (even if it is the Home Org). This step has already been configured for all participants.

 

We define the Data Action Target. This step has already been configured for all participants.

 

 

 

We define the Data Action.

 

 

 

 

 

 

We use the Streaming Monitor app in the target Org to monitor incoming Platform Events. Data Object Data Change Event is a standard Platform Event that is designed to receive changes from Data Actions.

 

 

 

We trigger the Data Action by creating a lead in a connected source system.

Please ask your facilitator to create a lead in the connected source system, as participants may not have access to the connected source orgs.

 

 

We refresh the data stream in Data Cloud to ingest the newly created lead from the source system.

 

 

We can use the free Streaming Monitor app from AppExchange to check for incoming Platform Events.

In the Streaming Monitor app, we see the new event notification, which also shows the JSON payload information about the new lead. Each participant's Data Cloud Org has the Streaming Monitor app installed.

 

 

Please note that the notifications for Platform Events via Streaming Monitor may be delayed. You may also need to ensure your browser is not blocking notifications.

While the Streaming Monitor app is a useful tool for monitoring incoming Platform Events, in real life we may want to process the JSON payload in the target application, as shown in the next sub-section.

Salesforce Flow to Process Data Action Payload

In the target org - we go to Setup -> Flow, and create a flow that will get triggered by a standard platform event called Data Object Data Change Event.

Data Object Data Change Event is the Platform Event that our Data Action will update.

The following screen shows the flow that we use. In the Flow, we use Data Object Data Change Event to retrieve the payload generated by Data Action.

The payload from the Data Object Data Change Event is in the form of a JSON string. We need to extract the values corresponding to the keys that we need.

In our flow, we use the Data Mapper tool to convert JSON to Flow Variables, and then insert the value of the variables in a Salesforce Object in the target org (e.g., Lead Object in target org).

You could also use an invocable Apex program as an alternative to the Flow DataMapper.
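Whichever tool you use, the extraction logic amounts to parsing the JSON and picking out named keys. Here is that logic sketched in Python, with a simplified, invented payload shape standing in for the real event structure (the actual event carries additional envelope fields):

```python
import json

# Simplified stand-in for a Data Object Data Change Event payload; the
# real event includes extra envelope fields (schema, event UUID, etc.).
payload = json.dumps({
    "ActionDeveloperName": "Lead_Created_Action",
    "PayloadCurrentValue": {
        "ssot__FirstName__c": "Robert",
        "ssot__LastName__c": "Lombard",
        "ssot__Email__c": "r.lombard@example.com",
    },
})

def extract_lead_fields(raw):
    """Parse the event JSON and keep only the keys the target Lead needs."""
    values = json.loads(raw)["PayloadCurrentValue"]
    return {
        "FirstName": values["ssot__FirstName__c"],
        "LastName": values["ssot__LastName__c"],
        "Email": values["ssot__Email__c"],
    }

lead = extract_lead_fields(payload)
print(lead["LastName"])  # Lombard
```

The resulting dictionary corresponds to the Flow variables that get inserted into the Lead object in the target org.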

 

 

 

 

 

 

 

 

 

You can view the above flow in Setup in your home org.

In a real-life implementation, you can use similar wiring to create complex multi-system orchestrations, e.g., send leads from Data Cloud to destination orgs based on certain conditions, or trigger data actions to report the status of leads back to the org in which the lead originated (e.g., if a lead that was sent from one org to another is converted to an opportunity).

Data Transformation

There are two Data Transformation tools - batch and streaming. In this workshop we walk through batch data transformation.

In our example we focus on introducing the Batch Data Transform tool and the salient aspects of a transform process, such as matching the primary and foreign keys. For an in-depth explanation of all the features in the data transformation and data preparation tools, please refer to the Help & Training documentation.

Transform to Combine Data from Two Objects

In this transformation example, we combine two DMOs (Individual and Contact Email) and output the result into a third, custom DMO.
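Logically, this transform is an inner join of the two DMOs on the individual's ID (the primary key on one side, the foreign key on the other). A sketch with invented field names:

```python
# Toy batch transform: join Individual rows to their Contact Email rows
# on the individual ID, producing rows for a third, custom DMO.
individuals = [
    {"Id": "I-1", "Name": "Robert Lombard"},
    {"Id": "I-2", "Name": "Jane Doe"},
]
contact_emails = [
    {"PartyId": "I-1", "Email": "r.lombard@example.com"},
    {"PartyId": "I-2", "Email": "jane@example.com"},
]

def join_individual_email(inds, emails):
    """Inner-join on primary key Id = foreign key PartyId."""
    by_id = {e["PartyId"]: e["Email"] for e in emails}
    return [
        {"Id": i["Id"], "Name": i["Name"], "Email": by_id[i["Id"]]}
        for i in inds
        if i["Id"] in by_id
    ]

combined = join_individual_email(individuals, contact_emails)
print(combined[0])
```

The Batch Data Transform builder expresses the same join visually; getting the key columns matched correctly is the main thing to watch for.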

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

Hope you enjoyed the hands-on. Thank you!

Updated Jan 22, 2024. Faizi Fakhruddin