05 January 2016

Using Asynchronous SOQL with Event Monitoring

Working with BigData requires re-thinking how you work on the Salesforce platform. Since you can assume that you're working with billions of records instead of thousands or even millions, common tasks like aggregate functions (e.g. SELECT count() FROM LoginEvent) or complex queries (e.g. SELECT username FROM ApiEvent WHERE Soql LIKE '%SSN__c%') are difficult to perform quickly. By quickly, I mean that a SOQL query typically needs to complete within two minutes or it will time out.

The advantage of working with BigData on the Salesforce platform is that use cases involving billions of records are now achievable, including:

  • long term adoption metrics
  • audit
  • performance monitoring
  • archiving

As a result, it's better to think of BigData as a data lake, where massive amounts of data can be processed at scale to meet any of the above use cases.

In the more concrete case of user login events, which we now store in the Salesforce equivalent of BigData called a BigObject, it becomes important to re-think how you can work with the data at scale. This is especially important since not all capabilities that we've come to expect on the Salesforce platform, like operational reports or workflow, may be possible with a BigObject. It's a trade-off: scale in exchange for some platform capabilities.

For instance, rather than querying a BigObject like LoginEvent with synchronous SOQL via the API, as you would for a real-time application, it's better to use a new convention called asynchronous SOQL, currently in pilot, to extract targeted sets of data for analytic apps.

Asynchronous SOQL is similar to the Bulk API in that it's job-based and invoked using the API. But where the Bulk API provides full CRUD capabilities and is designed primarily for working with data off-platform, asynchronous SOQL provides a query-and-insert capability: it retrieves a set of data and inserts it into a structured object. Currently, this means moving data between BigObjects and custom objects on the platform.
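To make the query-and-insert flow concrete, here's a minimal Python sketch of building an asynchronous SOQL job request. The endpoint path, the payload keys (query, targetObject, targetFieldMap), and the LoginEventSubset__c target object are assumptions for illustration, not a definitive contract; check the pilot documentation for the exact shape in your org.

```python
import json

# Hypothetical REST endpoint for submitting an Async SOQL job (assumption).
ASYNC_QUERY_PATH = "/services/data/v35.0/async-queries/"

def build_async_soql_job(query, target_object, target_field_map):
    """Build the JSON body for an Async SOQL job: run `query` against a
    BigObject and insert the results into `target_object`, mapping
    source fields to target fields via `target_field_map`."""
    return {
        "query": query,
        "targetObject": target_object,
        "targetFieldMap": target_field_map,
    }

job = build_async_soql_job(
    query="SELECT UserId, EventTime FROM LoginEvent",
    target_object="LoginEventSubset__c",  # hypothetical custom object
    target_field_map={
        "UserId": "UserId__c",            # hypothetical target fields
        "EventTime": "EventTime__c",
    },
)

# You would POST this body (with an OAuth bearer token) to
# <instance URL> + ASYNC_QUERY_PATH, then poll the returned job ID.
print(json.dumps(job, indent=2))
```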

This doesn't just mean BigObject data but really any use case where there is a significant amount of data, including (but not limited to):

  • LoginHistory
  • SetupAuditTrail
  • AccountHistory (or any other field history object including Field Audit Trail)
  • Profile or PermissionSet
  • AccountShare (or any other sharing object)

For instance, you may want to create a subset of data hourly, daily, or weekly in order to have the full transactionality of the Salesforce platform at your fingertips. Using this design pattern, we can use scheduled and batch Apex to migrate subsets of data from a BigObject into custom objects, for instance extracting the last week's worth of LoginEvents in order to report on them.
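The date-windowing part of that pattern can be sketched in a few lines: compute the trailing seven-day window and compose the query string you'd hand to an Async SOQL job. The field names on LoginEvent (EventTime, UserId, LoginUrl) are illustrative assumptions; substitute the fields available in your org.

```python
from datetime import datetime, timedelta

def login_events_last_week(now):
    """Compose a SOQL query string for the trailing seven days of
    LoginEvent records, using the ISO 8601 datetime format SOQL expects.
    Field names are illustrative, not a documented schema."""
    fmt = "%Y-%m-%dT%H:%M:%S.000Z"
    start = now - timedelta(days=7)
    return (
        "SELECT EventTime, UserId, LoginUrl FROM LoginEvent "
        f"WHERE EventTime >= {start.strftime(fmt)} "
        f"AND EventTime < {now.strftime(fmt)}"
    )

# A scheduled job would call this with the current UTC time each week.
print(login_events_last_week(datetime(2016, 1, 5)))
```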

As a result, it's now possible to report on the data using operational reports, explore it with Wave for advanced exploratory capabilities, or utilize workflow for generating tasks and alerts.

Asynchronous SOQL provides a new way of interacting with large sets of data on the Salesforce platform. For Event Monitoring use cases, it brings us closer to the kinds of analytic apps that enable CISOs and VPs of IT to understand the state and health of their organization.

To try this out, contact your Account Executive to have the Async SOQL pilot enabled in your org. Then download the sample code and configurations in my GitHub asyncSOQL repository to get started with Event Monitoring use cases like Login Forensics, Data Leakage, or Apex Limit Events.


  1. This is precisely what OLAP/Analytical reporting is for. Can't the use cases move there? Salesforce can feed off the aggregated data.

  1. I'm not sure I understand your question. Thanks for clarifying.

