07 July 2016

Get Your Event Monitoring Wave App

Hey there! Salesforce Shield and Event Monitoring have expanded from the Event Log Files API to built-in, out-of-the-box data visualization with the Event Monitoring Wave App, now Generally Available (GA)! Big thanks to Adam for all the heavy lifting on the Admin Analytics pilot (the app's former name).



If you missed the announcement in June, here's what you need to know:
  • Event Monitoring supports 32 different Salesforce event types, and it can be quite a job to integrate the data flow, figure out which events to subscribe to and visualize, and build custom dashboards
  • Event Monitoring customers and partners now have access to the Event Monitoring Wave App, with 15 built-in dashboards covering the core use cases: 1. Security, 2. Application Development and Performance monitoring, and 3. Salesforce Use and Adoption
  • The Event Monitoring Wave App integrates with the Event Log Files API, providing immediate value out of the box simply by turning Event Monitoring on for your org, plus a point-and-click interface to slice and dice the data and customize the dashboards your own way (see the sketch after this list for what the underlying API integration looks like)
  • The Event Monitoring Wave App is licensed for 10 users with a 50 million row limit, and a configuration wizard lets you select which datasets to include and for how long (the default is 7 days), depending on your org's data volume
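To put that in perspective, here's roughly what the underlying Event Log Files API integration looks like if you were to wire it up yourself: a minimal Python sketch that queries the EventLogFile sObject and downloads a log file's CSV content. The instance URL, token, API version, and the CSV columns printed at the end are placeholder assumptions; the Wave App does the equivalent for you out of the box.

import csv
import io
import requests

# Placeholder instance and OAuth access token; use your own auth flow.
INSTANCE = "https://yourInstance.salesforce.com"
TOKEN = "00D...access_token"
HEADERS = {"Authorization": "Bearer " + TOKEN}
API = INSTANCE + "/services/data/v37.0"

# Find yesterday's Login event log files via the EventLogFile sObject.
soql = ("SELECT Id, EventType, LogDate, LogFileLength "
        "FROM EventLogFile "
        "WHERE EventType = 'Login' AND LogDate = Yesterday")
result = requests.get(API + "/query", headers=HEADERS, params={"q": soql}).json()

for record in result.get("records", []):
    # Download the CSV content of each log file from its LogFile blob field.
    log = requests.get(API + "/sobjects/EventLogFile/" + record["Id"] + "/LogFile",
                       headers=HEADERS)
    for row in csv.DictReader(io.StringIO(log.text)):
        print(row.get("USER_ID"), row.get("SOURCE_IP"))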


Security and compliance are very strong drivers for Event Monitoring customers, and we have spent most of our time building security- and compliance-related dashboards. We hope you enjoy them; here's a quick walkthrough of each:
  • My Trust: inspired by trust.salesforce.com, My Trust is a single place to view the health of your Salesforce app, active users, total transactions, average and max page time and end user page time. Drill down to different event types and compare daily trends.
  • Report Downloads: see the percentage of viewed reports that resulted in exports, as well as report export trends by user agent and IP, which can be filtered down to inactive users to surface suspicious or large-volume data export activity
  • REST API: analyze who is using the API, for example with Data Loader, to manage or move large data sets, and identify possible REST API hot spots used by managed packages
  • Login As: understand admin behavior when logging in as end users and identify possible abuse: who logged in as whom, from where, and what pages were accessed
  • User Logins: see login trends per user and who is using the application the most, identify IP addresses with shared logins as a sign of suspicious use, and understand which browsers are being used and the average time spent logged in
  • Setup Audit Trail: identify what admins are doing in Setup and keep track of the most common audit changes and their types
  • Files: get visibility into which files are being downloaded by different roles over a period of time, to help identify the top files as well as resources that are barely being used

Application Development and Performance is also a very important area to monitor continuously, so you stay in the know about application health and understand whether some reports are taking a long time to produce or whether certain Apex jobs should be scheduled differently to avoid hitting governor limits. Here's what we've built for Salesforce developers:


  • Apex Execution: prioritize which Apex classes to fix to improve overall performance by comparing Apex performance, CPU time, and SOQL and DML interactions based on total DB time
  • Reports: see report usage trends across users and profiles, identify the most-used reports, and get visibility into how long they take to load
  • API: see API trends per object and overall API performance over a given period of time, including average CPU time per API call
  • Dashboards: get visibility into dashboard usage trends over time and the performance of those dashboards, so you can prioritize troubleshooting


Last but definitely not least, understanding adoption and user engagement for the Salesforce application is key: what are my users doing, how and when are they accessing the application, and what are the top resources or click paths? These insights are valuable for line-of-business owners, executives, IT teams, and developers alike:

  • Lightning SFX: provides visibility into who is using the new Lightning user interface and how it's performing; see how many total user interactions took place and what the average and max end user page time (EPT) look like
  • Page Views (URI): see which pages users are clicking the most and how much time they spend, on average, on those pages. Drill down to a specific user and the pages they are accessing, or drill down to an actual page and see who is using it
  • Visualforce Requests: see the most used Visualforce pages and prioritize troubleshooting based on performance (e.g., sorting by runtime quickly shows the slowest pages) or track AppExchange adoption
  • Wave Adoption: you've pushed out the Event Monitoring Wave App, or Sales or Service Wave, and you want to know whether your users are actually using it; identify details at the user level, how many interactions they have with Wave dashboards, and which ones they are customizing

We hope you enjoy the app and find these built-in visualizations useful. You can assign your 10 permission set licenses as viewers or as editors/managers. If you require more users or are nearing the 50 million row limit, get in touch with your Account Executive to expand with Wave Platform.

If you are an existing Event Monitoring customer and haven't yet tried out the Event Monitoring Wave App, please follow these instructions to get set up. If you're a new customer interested in learning more about Event Monitoring and the Event Monitoring Wave App, get in touch with your Salesforce Account Executive to get started.

For anything else, please leave questions or comments here or reach out on Twitter to @salomaa. Thanks, and sunny summertime greetings from San Francisco!


-Jari


08 June 2016

New in Summer '16 with Event Monitoring and Transaction Security

Event Monitoring with Transaction Security has expanded its policy options from API-based report export events to also cover UI-triggered report exports. You may have seen the updated Apex class for the Data Export Policy for Leads. This means any type of report export for specific resources, like Leads, Accounts, or Opportunities, can be tied to a specific trigger so that the appropriate security condition is applied according to your security policies. In the stock policy, the condition fires for exports of more than 2,000 records or exports taking more than 1,000 ms, either of which would indicate a large data download. You're free to customize the resource and condition in your own policy.

global class DataLoaderLeadExportCondition implements TxnSecurity.PolicyCondition {
    public boolean evaluate(TxnSecurity.Event e){
        // The event data is a Map<String, String>.
        // We need to call the valueOf() method on appropriate data types to use them in our logic.
        Integer numberOfRecords = Integer.valueOf(e.data.get('NumberOfRecords'));
        Long executionTimeMillis = Long.valueOf(e.data.get('ExecutionTime'));
        String entityName = e.data.get('EntityName');

        // Trigger the policy only if the export is on Leads and we are downloading
        // more than 2,000 records or it took more than 1 second (1,000 ms).
        if('Lead'.equals(entityName)){
            if(numberOfRecords > 2000 || executionTimeMillis > 1000){
                return true;
            }
        }

        // For everything else don't trigger the policy.
        return false;
    }
}

This helps security teams stay in the know with real-time actions on large data export events and shield against unwanted data loss.

You can use various triggers, such as time, geolocation, IP, or profile, to customize the report export criteria. Simply choose the Data Export event from the dropdown menu, select Account, Case, Contact, Lead, or Opportunity as the resource name, and apply your desired action: in-app notification, email notification, two-factor authentication, or block.

Please see the following short demo on applying a policy condition to report exports of Accounts.




01 June 2016

Login Forensics: Login History plus for auditing user logins

The Salesforce App Cloud platform has important auditing capabilities built in to ensure that you can focus on what's most important: your business. One of these foundational audit tools is Login History. The Login History audit trail enables administrators to download the last six months of logins to the Force.com platform, either via a CSV download link in the setup user interface or via the API. With Login History, you can track login successes and failures by user, IP, application, API, or browser, to name a few key attributes. In addition, Event Monitoring provides access to Login log lines. As you can tell, we consider Login an important event to keep track of!
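As a quick illustration of the Login History side of that, here's a minimal Python sketch pulling the last week of logins through the REST API. LoginHistory and the selected fields are the standard sObject and field names; the instance URL, token, and API version are placeholder assumptions.

import requests

# Placeholder instance and OAuth access token.
INSTANCE = "https://yourInstance.salesforce.com"
TOKEN = "00D...access_token"
HEADERS = {"Authorization": "Bearer " + TOKEN}

# Last 7 days of logins: who, from where, with what client, and whether it succeeded.
soql = ("SELECT UserId, LoginTime, SourceIp, Application, Browser, Status "
        "FROM LoginHistory "
        "WHERE LoginTime = LAST_N_DAYS:7 "
        "ORDER BY LoginTime DESC")
resp = requests.get(INSTANCE + "/services/data/v37.0/query",
                    headers=HEADERS, params={"q": soql})
for row in resp.json().get("records", []):
    print(row["LoginTime"], row["UserId"], row["SourceIp"], row["Status"])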

We're proud to announce the general availability of a premium add-on service on top of our Event Monitoring product line that goes beyond both the Login log line and Login History by tracking login information for ten years!

Here's a breakdown of how the three compare:

                            | Login History              | Login Forensics                        | Login Log Line
Data Duration until Deleted | 6 months                   | 10 years                               | 30 days
Storage                     | Oracle                     | HBase                                  | Oracle
Access                      | Setup UI, API              | API only                               | API Download only
Permissions                 | Manage Users               | View Platform Events                   | View Event Log Files
Extensibility               | No                         | Yes, via Additional Information        | No
Packaging                   | Included with every org    | Included with Event Monitoring add-on  | Included with Event Monitoring add-on
Login Data Captured         | Login success and failure  | Login success and failure              | Login successes only
Name of sObject or File     | LoginHistory               | LoginEvent                             | Login Event Type

How is it possible to store this critical data for so long? Salesforce recently adopted an open-source NoSQL database called HBase. HBase is the same database that we use to store up to 10 years of Field Audit Trail data.



Who cares? Well, I do. As does anyone who wants to maintain an audit trail of login information either for regulatory reasons or to track down anomalous login activity. For instance, imagine that a user always logs in from the same IP address, or during the same login hours, or using the same Chrome browser on Windows. Well, wouldn't it be strange if all of a sudden those behaviors changed over the course of a day, a week, a month, a year, or even a decade? 

All of this is possible with SOQL because of the HBase rowkeys we've defined. An HBase rowkey defines how we index these objects for fast queries. Imagine querying a billion rows of LoginEvent records from the past decade in less than 120 seconds! That's fast and furious query performance.

The LoginEvent object, which stores the raw login data, has a rowkey consisting only of EventDate (in a descending sort) and the unique record Id. And the PlatformEventMetric object, which stores the hourly roll-up metrics, has a rowkey consisting of EventType and then EventTime (in a descending sort). These simple rowkeys enable fast responses using standard SOQL. You just need to know the time frame you want to query and, in the case of metrics, which metric you're after.

SELECT Application, Browser, EventDate, Id, LoginUrl, UserId
FROM LoginEvent
WHERE EventDate > Yesterday
LIMIT 10

This works because EventDate is the first field in the rowkey, and the sort is fast because the rows are stored in descending order. This is powerful for querying, in near real-time, the last ten login events that happened.

It’s also powerful for integration. You can create a polling app that queries every minute for the raw events, and every hour for the metrics, to easily pull in the login data captured since the last query.
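Here's a minimal sketch of such a polling loop against the raw LoginEvent records, using the same fields as the query above. The instance URL, token, starting timestamp, and one-minute interval are illustrative assumptions.

import time
from datetime import datetime
import requests

# Placeholder instance and OAuth access token.
INSTANCE = "https://yourInstance.salesforce.com"
TOKEN = "00D...access_token"
HEADERS = {"Authorization": "Bearer " + TOKEN}
QUERY_URL = INSTANCE + "/services/data/v37.0/query"

# High-water mark as a SOQL datetime literal; start wherever makes sense for you.
last_seen = "2016-06-01T00:00:00Z"

while True:
    soql = ("SELECT Id, EventDate, UserId, LoginUrl, Application, Browser "
            "FROM LoginEvent "
            "WHERE EventDate > " + last_seen + " "
            "ORDER BY EventDate ASC")
    records = requests.get(QUERY_URL, headers=HEADERS,
                           params={"q": soql}).json().get("records", [])
    for event in records:
        print(event["EventDate"], event["UserId"], event["LoginUrl"])
    if records:
        # REST returns timestamps like 2016-06-01T12:34:56.000+0000; normalize the
        # newest one back into a SOQL-friendly literal for the next poll.
        newest = datetime.strptime(records[-1]["EventDate"],
                                   "%Y-%m-%dT%H:%M:%S.%f%z")
        last_seen = newest.strftime("%Y-%m-%dT%H:%M:%SZ")
    time.sleep(60)   # poll raw events roughly once a minute, metrics once an hour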

Alternatively, you can use the Asynchronous SOQL solution outlined in my previous blog post: Using Asynchronous SOQL with Event Monitoring.

Events are captured in near real-time. What does that mean? Well, there can be a minor delay between when the event occurred and when you can query it. If you want, you can self-monitor the near real-time nature of our events: take the average difference between the EventDate and CreatedDate fields and you'll see how quickly your events have been captured.

Near Real Time Example
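If you'd rather compute that number yourself than eyeball a dashboard, here's a rough sketch that averages the lag between EventDate and CreatedDate over recent LoginEvent records; the instance URL, token, and sample size are placeholder assumptions.

from datetime import datetime
import requests

# Placeholder instance and OAuth access token.
INSTANCE = "https://yourInstance.salesforce.com"
TOKEN = "00D...access_token"
HEADERS = {"Authorization": "Bearer " + TOKEN}

soql = ("SELECT EventDate, CreatedDate FROM LoginEvent "
        "WHERE EventDate > Yesterday LIMIT 200")
rows = requests.get(INSTANCE + "/services/data/v37.0/query",
                    headers=HEADERS, params={"q": soql}).json().get("records", [])

fmt = "%Y-%m-%dT%H:%M:%S.%f%z"
lags = [(datetime.strptime(r["CreatedDate"], fmt) -
         datetime.strptime(r["EventDate"], fmt)).total_seconds()
        for r in rows]
if lags:
    print("average capture lag: %.1f seconds over %d events"
          % (sum(lags) / len(lags), len(lags)))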

There's even the ability to introduce your own metadata into the login flow to further fingerprint a user's login profile and identify anomalies in the login process. We call it Additional Info. It lets you introduce your own data through an HTTP header, whether via the browser, a proxy, or API authentication. For instance, you might register a header name (e.g. "x-sfdc-addinfo-correlationid") and value (e.g. "d18c5a3f-4fba-47bd-bbf8-6bb9a1786624"). Then, when you look at your login events, you just need to look for any logins that do not carry this identifier and investigate them further.
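As one hedged sketch of the API authentication route: assuming you've registered the header in Setup and your integration logs in through the OAuth username-password token endpoint, passing the extra header might look like the snippet below. The header name and value are the examples from above; the connected app credentials and the exact login flow that carries the header are assumptions to verify for your own setup.

import requests

# Placeholder connected app credentials; the x-sfdc-addinfo-* header is the
# registered Additional Info header from the example above.
payload = {
    "grant_type": "password",
    "client_id": "YOUR_CONSUMER_KEY",
    "client_secret": "YOUR_CONSUMER_SECRET",
    "username": "integration.user@example.com",
    "password": "password_plus_security_token",
}
headers = {"x-sfdc-addinfo-correlationid": "d18c5a3f-4fba-47bd-bbf8-6bb9a1786624"}

resp = requests.post("https://login.salesforce.com/services/oauth2/token",
                     data=payload, headers=headers)
print(resp.json().get("access_token", "login failed"))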



Finally, there's a transaction dye that's important to the process. Every Login Event can be traced back to a single Login History Id. This is useful for a couple of reasons. First, Login History connects to Login Geo, which captures geographical information like the latitude and longitude of your users. As a result, you can use the composite API to orchestrate unrelated queries and generate the location of every user on a mapping service like Google Maps. Second, with each subsequent activity where the user interacts with data, like looking at accounts, you'll be able to track each interaction back to a single login on both the Login Event and Login History objects, for example when tracking down which records were viewed from an API query (see the Data Leakage blog post where this is explained). And after six months, when Login History is deleted, you'll continue to be able to track every interaction back to a single login for another nine and a half years. So even if you log in via your iPhone, your Nexus tablet, your Chrome browser on your Mac, and Salesforce for Outlook, we'll be able to separate each set of transactions and link them back to a single login for the next ten years.
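Here's a rough sketch of batching those queries together with the composite batch resource; the instance URL, token, API version, and the use of composite/batch (rather than another composite flavor) are assumptions, and the LoginHistoryId and LoginGeoId lookups are the links described above.

import urllib.parse
import requests

# Placeholder instance and OAuth access token.
INSTANCE = "https://yourInstance.salesforce.com"
TOKEN = "00D...access_token"
HEADERS = {"Authorization": "Bearer " + TOKEN, "Content-Type": "application/json"}

def sub_query(soql):
    # Build a relative query sub-request URL for the composite batch resource.
    return "v37.0/query?q=" + urllib.parse.quote(soql)

body = {
    "batchRequests": [
        {"method": "GET",
         "url": sub_query("SELECT Id, EventDate, UserId, LoginHistoryId FROM LoginEvent "
                          "WHERE EventDate > Yesterday LIMIT 50")},
        {"method": "GET",
         "url": sub_query("SELECT Id, UserId, LoginTime, LoginGeoId FROM LoginHistory "
                          "WHERE LoginTime = LAST_N_DAYS:1 LIMIT 50")},
    ]
}

resp = requests.post(INSTANCE + "/services/data/v37.0/composite/batch",
                     headers=HEADERS, json=body)
for result in resp.json().get("results", []):
    print(result["statusCode"], len(result["result"].get("records", [])))

From there you can join the two result sets on LoginHistoryId client-side, then look up each LoginGeoId for the coordinates to feed to your mapping service of choice.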



All of the screenshots in this post can be recreated using the sample code found in my GitHub repo.

Login Forensics ushers in a new age of storing near real-time, system-generated user activity on the Salesforce platform.