Salesforce Async SOQL

In this blog, I am going to explain Salesforce Async SOQL. Async SOQL is designed for long-running queries over large data sets: the query executes in the background and its results are written into a target object, much like a pipeline. Async SOQL is a method for running SOQL queries in the background over Salesforce entity data, including sObjects, big objects, and external objects (accessed via Lightning Connect). It is implemented as a RESTful API that lets you run queries in the familiar SOQL syntax. You can run multiple queries in the background and monitor their completion status. The results of each query are deposited into an object you specify, which can be an sObject or a big object. As a result, you can subset, join, and create more complex queries that are not subject to timeout limits. For example, let's say you want to analyze years and years of opportunity history data. The results could help you identify which current and future opportunities are most likely to close and give you a better picture of your forecast by reducing billions of rows of data to a manageable size.

A Sort of Pipeline

Async SOQL acts like a sort of data pipeline: a pipeline is a set of data processing elements connected in series, where the output of one element is the input of the next. The comparison below shows the differences between standard SOQL and Async SOQL.

Async SOQL Versus SOQL
Use standard SOQL when:
• You want to display the results in the UI without having the user wait for results.
• You want results returned immediately for manipulation within a block of Apex code.
• You know that the query will return a small amount of data.
Use Async SOQL when:
• You are querying against millions of records.
• You want to ensure that your query completes.
• You do not need to do aggregate queries or filtering outside of the index.

Run Async SOQL

To run an Async SOQL query, make a POST REST API call to the /services/data/v41.0/async-queries endpoint with a JSON-encoded request body.

{
"query": "SELECT Amount, CloseDate, StageName FROM Opportunity",
"operation": "insert",
"targetObject": "Sales_OrderBO__b",
"targetFieldMap": {"Amount": "Actual_Cost__c",
"StageName": "Saleschannel__c",
"CloseDate": "Expired_Date__c"
}
}

The response of an Async SOQL query returns the jobId, the SOQL statement, the details of the field mapping, and the target object. It also shows the query status, which can have one of three values: Running, Complete, or Failed. You can use the jobId to track the status of your Async SOQL query.
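To show how this might be wired up from Apex, here is a minimal sketch (not an official client) that submits the request body above and then checks the job status. It assumes the org's own URL is registered as a Remote Site Setting so the callout is allowed, and that a job's status can be read with a GET on the same async-queries endpoint followed by the jobId; the class and method names are just placeholders.

public class AsyncSoqlClient {

    // Submit the Async SOQL request shown above and return the jobId from the response.
    public static String submitJob() {
        HttpRequest req = new HttpRequest();
        req.setEndpoint(URL.getSalesforceBaseUrl().toExternalForm()
            + '/services/data/v41.0/async-queries');
        req.setMethod('POST');
        req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(new Map<String, Object>{
            'query' => 'SELECT Amount, CloseDate, StageName FROM Opportunity',
            'operation' => 'insert',
            'targetObject' => 'Sales_OrderBO__b',
            'targetFieldMap' => new Map<String, String>{
                'Amount' => 'Actual_Cost__c',
                'StageName' => 'Saleschannel__c',
                'CloseDate' => 'Expired_Date__c'
            }
        }));
        HttpResponse res = new Http().send(req);
        Map<String, Object> body = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
        return (String) body.get('jobId');
    }

    // Read the job status (Running, Complete, or Failed) for a previously submitted job.
    // Assumption: status is exposed at async-queries/{jobId}, mirroring the POST endpoint above.
    public static String getStatus(String jobId) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint(URL.getSalesforceBaseUrl().toExternalForm()
            + '/services/data/v41.0/async-queries/' + jobId);
        req.setMethod('GET');
        req.setHeader('Authorization', 'Bearer ' + UserInfo.getSessionId());
        HttpResponse res = new Http().send(req);
        Map<String, Object> body = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
        return (String) body.get('status');
    }
}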

A Few Standard Use Cases from the Salesforce Docs

1. Data Archiving
To archive records for long-term retention in a big object, Async SOQL can move any number of records from an sObject into a big object. The data archive process can be achieved via the following programmatic flow.
1. Define source sObject records.
2. Define target BigObject(s).
3. Define sObject to BigObject field mappings.
4. Use Async SOQL to copy records from sObject to BigObject storage.
5. Design and orchestrate the delete process, driven by the parent record IDs, via the APIs (a sketch of this step follows the example response below).

Create a new job by sending a POST request to the async-queries endpoint shown earlier. The request body identifies the object to be queried, the target object, and the field mapping. This example archives all cases whose status is Closed.

Example request body

{
"query": "SELECT id, CaseNumber, ClosedDate, ContactId, CreatedById, Description
FROM Case WHERE Status = 'Closed'",
"targetObject": "ArchivedClosedCases__b",
"targetFieldMap": {"id": "originalCaseId__c",
"CaseNumber": "caseNumber__c",
"ClosedDate": "closedDate__c",
"ContactId": "contactId__c",
"CreatedById": "createdById__c",
"Description": "Description__c"
}
}

Example response body

{
"jobId": "08PB000000003NS",
"query": "SELECT id, CaseNumber, ClosedDate, ContactId, CreatedById, Description
FROM Case WHERE Status = 'Closed'",
"status": "Complete",
"targetFieldMap": {"id": "originalCaseId__c",
"CaseNumber": "caseNumber__c",
"ClosedDate": "closedDate__c",
"ContactId": "contactId__c",
"CreatedById": "createdById__c",
"Description": "Description__c"
},
"targetObject": "ArchivedClosedCases__b"
}
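Once the job status comes back as Complete, step 5 of the flow (removing the archived source records) still has to be orchestrated separately. The following batch class is only a rough sketch of that step, assuming you are comfortable deleting the closed Cases after verifying the copied rows in ArchivedClosedCases__b; the class name and batch size are placeholders.

// Rough sketch of step 5: delete the source Cases after they have been archived.
// Run this only after the Async SOQL job above has completed and the archive is verified.
global class DeleteArchivedCasesBatch implements Database.Batchable<SObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Same filter that was used in the archive query.
        return Database.getQueryLocator('SELECT Id FROM Case WHERE Status = \'Closed\'');
    }

    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        delete scope;
        // Optionally remove the records from the Recycle Bin as well.
        Database.emptyRecycleBin(scope);
    }

    global void finish(Database.BatchableContext bc) {}
}

// Usage: Database.executeBatch(new DeleteArchivedCasesBatch(), 200);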

 

2. Customer 360° and Filtering

In this use case, customers load a variety of customer engagement data from external sources into Salesforce big objects and then process the data to enrich customer profiles in Salesforce. The idea is to store customer transactions and interactions such as point-of-sale data, orders, and line items in big objects, then process and correlate that data with your core CRM data. In the following example, we want to analyze the customer data stored in the Rider record of a car-sharing service. The source big object, Rider_Record__b, has a lookup relationship with the Contact object, allowing for an enriched view of the contact's car riding history.

Example request body

{
"query": "SELECT End_Location_Lat__c, End_Location_Lon__c, End_Time__c,
Start_Location_Lat__c, Start_Location_Lon__c, Start_Time__c,
Uber_Type__c, Rider__r.FirstName, Rider__r.LastName,
Rider__r.Email
FROM Rider_Record__b WHERE Star_Rating__c = '5'",
"targetObject": "Rider_Reduced__b",
"targetFieldMap": {"End_Location_Lat__c":"End_Lat__c",
"End_Location_Lon__c":"End_Long__c",
"Start_Location_Lat__c": "Start_Lat__c",
"Start_Location_Lon__c": "Start_Long__c",
"End_Time__c": "End_Time__c",
"Start_Time__c": "Start_Time__c",
"Uber_Type__c": "Uber_Type__c",
"Rider__r.FirstName": "First_Name__c",
"Rider__r.LastName": "Last_Name__c",
"Rider__r.Email": "Rider_Email__c"
}
}

Example response body

{
"jobId": "08PB000000000NA",
"query": "SELECT End_Location_Lat__c, End_Location_Lon__c, End_Time__c,
Start_Location_Lat__c, Start_Location_Lon__c, Start_Time__c,
Uber_Type__c, Rider__r.FirstName, Rider__r.LastName,
Rider__r.Email
FROM Rider_Record__b WHERE Star_Rating__c = '5'",
"status": "Complete",
"targetFieldMap": {"End_Location_Lat__c":"End_Lat__c",
"End_Location_Lon__c":"End_Long__c",
"Start_Location_Lat__c": "Start_Lat__c",
"Start_Location_Lon__c": "Start_Long__c",
"End_Time__c": "End_Time__c",
"Start_Time__c": "Start_Time__c",
"Uber_Type__c": "Uber_Type__c",
"Rider__r.FirstName": "First_Name__c",
"Rider__r.LastName": "Last_Name__c",
"Rider__r.Email": "Rider_Email__c"
},
"targetObject": "Rider_Reduced__b"
}

 

3. Field Audit Trail

You can use Async SOQL to query archived fields, stored in the FieldHistoryArchive object. You can use the WHERE clause to filter the query by specifying comparison expressions for the FieldHistoryType, ParentId, and CreatedDate fields,
as long as you specify them in that order.

Example request body

{
"query": "SELECT ParentId, FieldHistoryType, Field, Id, NewValue, OldValue
FROM FieldHistoryArchive WHERE FieldHistoryType = 'Account'
AND CreatedDate > LAST_MONTH",
"targetObject": "ArchivedAccounts__b",
"targetFieldMap": {"ParentId": "ParentId__c",
"FieldHistoryType": "FieldHistoryType__c",
"Field": "Field__c",
"Id": "Id__c",
"NewValue": "NewValue__c",
"OldValue": "OldValue__c"
}
}

Example response body

 

{
"jobId": "07PB000000006PN",
"query": "SELECT ParentId, FieldHistoryType, Field, Id, NewValue, OldValue
FROM FieldHistoryArchive WHERE FieldHistoryType = 'Account'
AND CreatedDate > LAST_MONTH",
"status": "Complete",
"targetFieldMap": {"ParentId": "ParentId__c",
"FieldHistoryType": "FieldHistoryType__c",
"Field": "Field__c",
"Id": "Id__c",
"NewValue": "NewValue__c",
"OldValue": "OldValue__c"
},
"targetObject": "ArchivedAccounts__b"
}

 

4. Event Monitoring

Login Forensics and Data Leakage Detection, both currently in pilot, enable you to track who is accessing confidential and sensitive data in your Salesforce org. You can view information about individual events or track trends in events to swiftly identify unusual behavior and safeguard your company's data. This is particularly useful for compliance with regulatory and audit requirements. For example, let's say you want to know everyone who viewed the contact record of your company's CEO. The key to this query is the CEO's contact record Id. Let's say the CEO's name is Jane Doe, and her Id is 003D000000QYVZ5.

Example request body

{
"query": "SELECT Soql, SourceIp, Username, EventTime FROM ApiEvent
WHERE RecordInfo Like '%003D000000QYVZ5%'",
"targetObject": "QueryEvents__c",
"targetFieldMap": {"Soql":"QueryString__c","SourceIp":"IPAddress__c",
"Username":"User__c", "EventTime":"EventTime__c",
"UserAgent":"UserAgent__c"
}
}

 

Example response body

{
"jobId": "05PB000000001PQ",
"query": "SELECT Soql, SourceIp, Username, EventTime
FROM ApiEvent WHERE RecordInfo Like '%003D000000QYVZ5%'",
"status": "Complete",
"targetFieldMap": {"Soql":"QueryString__c","SourceIp":"IPAddress__c",
"Username":"User__c", "EventTime":"EventTime__c",
"UserAgent":"UserAgent__c"
},
"targetObject": "QueryEvents__c"
}

 

Salesforce Big Objects

Introduction

Salesforce big objects let you store and manage massive amounts of data on the Salesforce platform, such as large volumes of historical data. Big objects capture data for use within Force.com and are accessible to clients and external systems via a standard set of APIs. Big objects are built to provide consistent performance whether there are 1 million records, 100 million, or even 1 billion records. This scale is what gives big objects their power and what defines the features that are provided.

Two Types of Big Objects 

• Standard big objects: objects defined by Salesforce and included in Salesforce products. FieldHistoryArchive is a standard big object that stores data as part of the Field Audit Trail product.
• Custom big objects: new objects that you create to store information unique to your org.

How to Create a Big Object?

You need to define big objects using metadata, not through the standard Salesforce UI. Here we are creating a Sales Order big object to store past historical orders; you can download the metadata file here. After downloading the zip file, go to Workbench and deploy it as shown below.
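Since the downloadable file isn't reproduced here, the following is a minimal sketch of what the Sales_OrderBO__b definition (the .object file inside the deployment zip, which also needs a package.xml) might look like. The field types, lengths, relationship name, and the index definition are my assumptions about the object used later in this post, so adjust them to your own data model.

<?xml version="1.0" encoding="UTF-8"?>
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
    <deploymentStatus>Deployed</deploymentStatus>
    <fields>
        <fullName>Actual_Cost__c</fullName>
        <label>Actual Cost</label>
        <precision>16</precision>
        <required>false</required>
        <scale>2</scale>
        <type>Number</type>
    </fields>
    <fields>
        <fullName>Saleschannel__c</fullName>
        <label>Sales Channel</label>
        <length>255</length>
        <required>false</required>
        <type>Text</type>
    </fields>
    <fields>
        <fullName>Created_From__c</fullName>
        <label>Created From</label>
        <length>255</length>
        <required>false</required>
        <type>Text</type>
    </fields>
    <fields>
        <fullName>Expired_Date__c</fullName>
        <label>Expired Date</label>
        <required>true</required>
        <type>DateTime</type>
    </fields>
    <fields>
        <fullName>SalesorderBORel__c</fullName>
        <label>Sales Order</label>
        <referenceTo>Sales_Order__c</referenceTo>
        <relationshipName>Sales_Order_BOs</relationshipName>
        <required>true</required>
        <type>Lookup</type>
    </fields>
    <indexes>
        <fullName>SalesOrderBOIndex</fullName>
        <label>Sales Order BO Index</label>
        <fields>
            <name>SalesorderBORel__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
        <fields>
            <name>Expired_Date__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
    </indexes>
    <label>Sales Order BO</label>
    <pluralLabel>Sales Orders BO</pluralLabel>
</CustomObject>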

Once the deployment succeeds, you can see the big object created in Salesforce, as shown below.

Considerations

  • Big objects support only object and field permissions.
  • You must use the Metadata API to define a big object or add a field to a custom big object; this can't be done from the UI.
  • SOQL relationship queries are based on a lookup field from a big object to a standard or custom object in the SELECT field list.
  • Big objects support custom Lightning and Visualforce components rather than standard UI elements (home pages, detail pages, list views, and so on).
  • You can create up to 100 big objects per org. The limits for big object fields are similar to the limits on custom objects and depend on your org's license type.
  • Big objects don't support transactions that include both big objects and standard or custom objects.
  • A big object must define an index with at least one field (see the query sketch after this list).
  • To support the scale of data in a big object, features like triggers, flows, processes, and Salesforce1 are not available.
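As a quick illustration of the indexing consideration above, a synchronous SOQL query against the Sales_OrderBO__b big object would normally filter on its index fields, starting from the leftmost one. The sketch below assumes the index defined in the earlier metadata sketch, with the SalesorderBORel__c lookup as its leading field.

// Sketch: query big object rows for one parent sales order, filtering on the
// assumed leading index field (SalesorderBORel__c).
Id parentOrderId = [SELECT Id FROM Sales_Order__c LIMIT 1].Id;

List<Sales_OrderBO__b> archivedOrders = [
    SELECT Saleschannel__c, Created_From__c, Expired_Date__c, Actual_Cost__c
    FROM Sales_OrderBO__b
    WHERE SalesorderBORel__c = :parentOrderId
];
System.debug('Archived rows for this order: ' + archivedOrders.size());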

Own Storage Space

It is good to know that Salesforce big objects come with their own storage space.

Access Control

You can control the Read and Create permissions on a big object from the profile.

Insert Data into a Big Object

You can insert data into a big object by using the API, or you can use a CSV file to load data into a custom big object. Here is a single-record insert from Workbench; you can also load data from a CSV file.

Using Apex to insert the data 

You can create and update custom big object records in Apex using the insertImmediate method, as shown below. In this code, I am inserting ten records for the sales order big object. After inserting the records, we will create a Visualforce page to show the big object records in a related list on the custom object.

List<Sales_OrderBO__b> orderList = new List<Sales_OrderBO__b>();
// Build ten sample sales order records.
for (Integer i = 0; i < 10; i++) {
    Sales_OrderBO__b sOrder = new Sales_OrderBO__b();
    sOrder.Actual_Cost__c = 122.3;
    sOrder.Saleschannel__c = 'Sales Channel';
    sOrder.Created_From__c = 'Internal';
    sOrder.Expired_Date__c = System.now();
    sOrder.SalesorderBORel__c = 'a0B63000002qHXP'; // parent Sales_Order__c record Id
    orderList.add(sOrder);
}

// insertImmediate writes big object records synchronously and returns a SaveResult per record.
Database.SaveResult[] sRes = Database.insertImmediate(orderList);


Using the REST API

You can also insert data into a big object by using a REST API POST call, as shown below.
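The Workbench REST Explorer screenshot isn't reproduced here, but based on the description above the call would be a POST to the standard sObject REST endpoint with a JSON body along these lines. Treat the endpoint path, API version, and sample field values as illustrative assumptions.

POST /services/data/v41.0/sobjects/Sales_OrderBO__b

{
"Actual_Cost__c": 122.3,
"Saleschannel__c": "Sales Channel",
"Created_From__c": "Internal",
"Expired_Date__c": "2017-12-01T00:00:00.000Z",
"SalesorderBORel__c": "a0B63000002qHXP"
}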

You can expose the data by using a Visualforce page or a Lightning component. Here are the Apex class and the page.

<apex:page standardController="Sales_Order__c" extensions="BigObjectData">
    <apex:form>
        
        <apex:pageBlock>
            <apex:pageBlockSection title="Big Objects" columns="1">
                <apex:pageBlockTable value="{!boData}" var="bo">
                    <apex:column headerValue="Actual Cost">
                        <apex:outputField value="{!bo.Actual_Cost__c}"/>
                    </apex:column>
                    <apex:column headerValue="Sales Channel">
                        <apex:outputField value="{!bo.Saleschannel__c}"/>
                    </apex:column>
                  
                      <apex:column headerValue="Created from ">
                        <apex:outputField value="{!bo.Created_From__c}"/>
                    </apex:column>
                  
                  
                </apex:pageBlockTable>
            </apex:pageBlockSection>
            
        </apex:pageBlock>
    </apex:form>
</apex:page>
public class BigObjectData {

    // Big object rows displayed in the related list on the Visualforce page.
    public List<Sales_OrderBO__b> boData { get; set; }

    public BigObjectData(ApexPages.StandardController con) {
        boData = fetchData();
    }

    public List<Sales_OrderBO__b> fetchData() {
        return [SELECT Id, Saleschannel__c, Created_From__c, SalesorderBORel__c,
                       Expired_Date__c, Actual_Cost__c
                FROM Sales_OrderBO__b];
    }
}

 

 

Salesforce Skinny Tables

Introduction:

Many customers have large amounts of data in their Salesforce tables, and query processing time is, at least partly, a function of how long it takes to scan that data, especially when the scan involves relationships. One way of improving performance is to set up a skinny table. Incremental processing helps with these very large tables, but most operations simply run faster against skinny tables.

What Is a Skinny Table?

A skinny table is a custom table in the Force.com platform that contains a subset of fields from a standard or custom base Salesforce object. Force.com can have multiple skinny tables, if needed, and maintains them under the hood.

 

Skinny Table to Speed up Account Queries

Do we need a Skinny table? 

Before deciding whether we need a skinny table, let's first understand how Salesforce designs its standard database. The image below shows how Salesforce organizes the Account table under the hood.

From the user's view, the Account object is a single table. Internally, however, Salesforce stores an Account in two different tables: one for the standard fields and one for the custom fields, partitioned rather than kept in a single table. When an end user queries Account, Salesforce joins both tables to return the data, which adds overhead if you have a huge number of records. In this case, you can contact Salesforce to create a skinny table, which combines the frequently used fields and returns more rows per fetch.

When to Use?

Skinny tables are most useful with tables containing millions of records. They can be created on custom objects, and on Account, Contact, Opportunity, Lead, and Case objects. In addition, they can enhance performance for reports, list views, and SOQL. The skinny table allows you to reorganize selected columns and rows of data into a separate table.

Any Considerations?

  • Contact Salesforce Customer Support to create skinny tables.
  • Skinny tables are used to fetch frequently used fields and to avoid joins.
  • Skinny tables are most useful with tables containing millions of records.
  • Skinny tables can contain these types of fields: Checkbox, Date, Date and time, Email, Number, Percent, Phone, Picklist, Picklist (multi-select), Text, Text area, Text area (long), URL.
  • Skinny tables do not include soft-deleted rows (that is, records in the Recycle Bin with isDeleted = true).
  • The Force.com platform automatically synchronizes the rows between the base object and the skinny table, so the data is always kept current.

Any Impact?

  1. Skinny tables can contain a maximum of 100 columns.
  2. Skinny tables cannot contain fields from other objects.
  3. Skinny tables are copied to your Full sandbox organizations as of the Summer '15 release.
  4. Skinny tables are custom tables in the underlying Force.com database. They don't have the dynamic metadata flexibility you find in the base object. If you alter a field type (for example, change a number field to a text field), the skinny table becomes invalid, and you must contact salesforce.com Customer Support to create a new skinny table.