17 April 2015

Clone This User

Have you ever been at lunch and received an emergency email from your CEO to create a new user so they can be up and running immediately?

In this connected, real-time world, there's no reason why you have to leave a perfectly good lunch to go online and create a user when you can create one from your mobile phone in two minutes or less.

I wrote about cloning users last year. It's the fastest path to creating a new user and follows a simple premise: new users are similar to existing users. Therefore, existing users can be used as templates for creating new users.

Arkus made this incredibly easy with Clone This User, a mobile app built on the Salesforce1 platform that takes the fewest number of inputs to clone a user so they can be up and running immediately.

This app came up in several conversations just the other day.

The first use case was around ensuring the newly onboarded user had access to Wave, which requires a permission set license. When cloning a user with Clone This User, both the permission set licenses and permission sets for Wave came over from the template user without having to think about it. The template becomes a black box that doesn't require the administrator to think about the minutiae of permissions a new user needs.

The second use case was around a custom user provisioning app built on top of the API. The custom app asked the administrator cloning the user to pick and choose additional permission sets above and beyond what the template user already had. This somewhat defeats the idea of a clone, since now the administrator has to go back to the difficult task of figuring out all the permissions a user needs to get up and running. This created additional decision points for the administrator during what should have been a time-saving workflow of cloning a user.
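For the curious, the heart of a clone can be sketched against the standard REST API: copy the template user's PermissionSetAssignment records to the new user so no per-permission decisions are needed. The sketch below is an illustration only, using just the Python standard library; the instance URL, token, and Ids are placeholders, and a real script would also filter out the assignment row owned by the user's profile, which can't be inserted directly.

```python
import json
import urllib.parse
import urllib.request

# Placeholder values -- substitute your own instance and session token.
INSTANCE = "https://na1.salesforce.com"
TOKEN = "YOUR_ACCESS_TOKEN"

def build_clone_payloads(new_user_id, assignments):
    """Turn the template user's PermissionSetAssignment records into
    insert payloads for the newly created user."""
    return [{"AssigneeId": new_user_id,
             "PermissionSetId": a["PermissionSetId"]}
            for a in assignments]

def clone_permission_sets(template_user_id, new_user_id):
    # Query every permission set the template user carries.
    q = ("SELECT PermissionSetId FROM PermissionSetAssignment "
         f"WHERE AssigneeId = '{template_user_id}'")
    url = (f"{INSTANCE}/services/data/v33.0/query?"
           + urllib.parse.urlencode({"q": q}))
    req = urllib.request.Request(url)
    req.add_header("Authorization", "Bearer " + TOKEN)
    with urllib.request.urlopen(req) as resp:
        records = json.load(resp)["records"]
    # Assign each one to the clone -- no per-permission decisions needed.
    for payload in build_clone_payloads(new_user_id, records):
        post = urllib.request.Request(
            f"{INSTANCE}/services/data/v33.0/sobjects/PermissionSetAssignment",
            data=json.dumps(payload).encode(),
            headers={"Authorization": "Bearer " + TOKEN,
                     "Content-Type": "application/json"})
        urllib.request.urlopen(post)
```

The point of the template-as-black-box idea shows up in the loop: the script never inspects what the permission sets grant, it just copies the assignments wholesale.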

Cloning users with Clone This User is a great way to simplify the workflow of creating new users, even when you're at lunch.

09 April 2015

The Power of Compression

I had an interesting customer case come up last week. The customer was using the Splunk App for Salesforce. But unfortunately, they kept getting an error when trying to download Event Log Files.

We finally discovered that there is a ten minute timeout on API requests by default. That means an API call must complete within ten minutes or an error will occur. Because we were trying to download a large amount of content in California while the data center was based in Chicago, the combination of network latency and file size pushed the requests past the ten minute timeout.

Upon further investigation, we determined that we weren't compressing the CSV file content over the network and were instead downloading everything in an uncompressed format. However, we also determined that there is a whitelist of content types for which you can request compression during download, including:
  • text/html
  • application/json
  • text/css
  • text/javascript
  • text/xml
  • application/javascript
  • application/x-javascript
  • application/vnd.edgemart
As a result, even when requesting text/csv content in a compressed format, we were still delivering it uncompressed. We have since released a fix to allow compression when you request text/csv file content using the API.

It's important to know that this is an optional configuration. Nothing changes from current functionality while downloading Event Log Files. Your scripts that worked previously should still work. However, if you add the compression flag to your API request or header, we'll transmit your Event Log File in a compressed format enabling quicker delivery.

How much quicker?

In some initial testing, downloading NA1 files from California with a modified Python download script, I was able to download files on average 65% faster with this option.
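If you want to reproduce that comparison against your own org, a rough harness is to time the same file downloaded both ways. This is a sketch only: the URL and token passed to timed_fetch are placeholders for your own instance, and your numbers will vary with region and file size.

```python
import time
import urllib.request

def timed_fetch(url, token, compressed):
    """Download url once and return elapsed seconds."""
    req = urllib.request.Request(url)
    req.add_header("Authorization", "Bearer " + token)
    if compressed:
        # Ask the server to gzip the body in transit.
        req.add_header("Accept-encoding", "gzip")
    start = time.time()
    with urllib.request.urlopen(req) as resp:
        resp.read()
    return time.time() - start

def percent_faster(plain_seconds, gzip_seconds):
    """How much faster the compressed download was, as a whole percent."""
    return round(100 * (plain_seconds - gzip_seconds) / plain_seconds)
```

Running timed_fetch twice against the same Event Log File, once with compressed=False and once with compressed=True, and feeding the timings to percent_faster gives you the same kind of figure quoted above.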


To request compression with cURL, it's as easy as adding the --compressed flag:

curl --compressed "https://na8.soma.salesforce.com/services/data/v33.0/sobjects/EventLogFile/0ATD00000000HnXOB3/LogFile" -H "Authorization: Bearer ${access_token}" -H "X-PrettyPrint:1" -o "compressedELF.csv"

Otherwise, you can typically request compression via a header. For example, with Python:

request.add_header('Accept-encoding', 'gzip')
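Putting that header into a complete script looks something like the sketch below (Python 3, standard library only; the instance URL, record Id, and token are placeholders you would substitute). The one subtlety is checking Content-Encoding on the response, so you only gunzip when the server actually compressed the body.

```python
import gzip
import urllib.request

def maybe_decompress(body, content_encoding):
    """Gunzip the response body only if the server compressed it in transit."""
    if content_encoding == 'gzip':
        return gzip.decompress(body)
    return body

def download_log_file(instance, log_file_id, access_token, out_path):
    # Endpoint shape follows the EventLogFile REST URI shown above.
    url = (f"{instance}/services/data/v33.0/sobjects/EventLogFile/"
           f"{log_file_id}/LogFile")
    request = urllib.request.Request(url)
    request.add_header('Authorization', 'Bearer ' + access_token)
    # Ask the server to gzip the CSV in transit.
    request.add_header('Accept-encoding', 'gzip')
    with urllib.request.urlopen(request) as response:
        body = maybe_decompress(response.read(),
                                response.headers.get('Content-Encoding'))
    with open(out_path, 'wb') as f:
        f.write(body)
```

Because the decompression step is conditional, the same script keeps working even against a server that ignores the header and returns the file uncompressed.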

So whether your files are large or you are in a different region from the data center, it's worth downloading your files using compression.



23 March 2015

Event Monitoring Quick Start Guide or How to get from zero to dashboard in 10 minutes or less


So you are trying Event Monitoring out or perhaps you've purchased the add-on. Now what?

That was exactly what happened the other day when a customer asked me where the check box in setup was to get started with Event Monitoring.

Unfortunately, he stumbled upon two critical points when getting started with Event Monitoring:
  1. It's an API only feature. There is no check box in setup.
  2. It's just log data. What you do with that data is up to you whether you store it for a long time or you analyze it with a dashboard.
But it dawned on me that what the customer really needed was a quick start guide - from zero to dashboard in ten minutes or less.

Below is a set of steps to help get started quickly regardless of whether you are trying it out or you are ready to implement.

Steps to getting started


1. What should you expect to get for your edition?

  • Enterprise, Unlimited, Performance Edition: Login/Logout log file types for free with 1 day data retention
  • Enterprise, Unlimited, Performance Edition: all 28 log file types for the add-on price with 30 day data retention
  • Developer Edition: all 28 log file types for free with 1 day data retention
That means that pretty much any organization that has access to our API has some form of Event Log Files already provisioned, even if it's only the Login and Logout log file types. Knowing what edition you have is important because it's what determines what you can access using the API. However, if you don't have a production organization and still want to try it out, sign up for Developer Edition which is free.

Troubleshooting tip: You won't see any Event Log File records for the first 24 hours from when it is provisioned. So if you don't see any records at first, try again tomorrow.

2.  What permissions do you need to access Event Log Files?

You need at least the following permissions:
  • API Enabled
  • View Event Log Files (However, any user with View All Data automatically gets access as well)
To add them, go to Setup > Administer > Manage Users > Permission Sets to create a new permission set with these two permissions.

To assign it to your user, click the Manage Assignments button in the permission set you just created, then click Add Assignments to find and assign your user.

Troubleshooting tip: if you don't have the ability to create, edit, and assign a permission set, talk with a system administrator who does. 

3. Now you have access, but how do you actually access Event Log Files?

Remember, there is no check box in setup. You have to use the API.

I typically try new things out in the API using the workbench which works on any platform and provides access to a Swiss Army knife of great API features.


Once you log in to workbench, the first thing to verify is that you have EventLogFile data. Go to queries > SOQL Query and select EventLogFile from the object drop down.

To verify that you have some data, run the following query:

SELECT count() FROM EventLogFile

Troubleshooting tips:
  • if you can't log in, you don't have API access and need to go back to step 2.
  • if you don't see EventLogFile from the object drop down, you don't have access and need to go back to step 2.
  • if you get 0 records returned from the query, you don't have any data yet and you should plan on trying again tomorrow.

4. Now that you have data, how do you view it?

While in workbench, go to utilities > REST Explorer and enter the following query into the text box:

/services/data/v33.0/query?q=SELECT+Id+,+EventType+,+LogDate+,+LogFileLength+,+LogFile+FROM+EventLogFile+ORDER+BY+CreatedDate+DESC+Nulls+Last


You should get some records back. Expand one of the records and click on the LogFile link attribute.


Copy everything in the double quotes and paste it into a text editor like Notepad or Sublime. If you save that content with a '.csv' file extension, you now have a CSV file with your log data.

Troubleshooting tips:
  • if workbench times out, try a smaller file. Login As and Report Export files tend to render in workbench; URI, Visualforce, Apex, and API files tend to be too large.

5. Now you've seen the data, how do you download it when there's a lot of it?

That's where an automation script works great. I've written several blog posts on getting started with writing an automation script.


You can also use a middleware provider like Cast Iron or Informatica. Just make sure they can deserialize base64 (Blob) content and/or handle CSV data.

Troubleshooting tips:
  • if you only have a Windows machine, I recommend using the Python script since the other scripts are optimized for Linux and Mac operating systems.
  • if you run into some problems, comment on this blog post and I'll try to help out.
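As a starting point, a minimal version of such a script can be sketched with just the Python standard library: query the EventLogFile records, then fetch each record's LogFile content and save it as a CSV. The instance URL and token below are placeholders, and a production script would add paging and error handling.

```python
import json
import urllib.parse
import urllib.request

# Placeholder values -- substitute your own instance and session token.
INSTANCE = "https://na1.salesforce.com"
TOKEN = "YOUR_ACCESS_TOKEN"

def api_get(url, params=None):
    """GET a REST resource with the bearer token attached."""
    if params:
        url += "?" + urllib.parse.urlencode(params)
    req = urllib.request.Request(url)
    req.add_header("Authorization", "Bearer " + TOKEN)
    return urllib.request.urlopen(req)

def log_file_name(record):
    """Derive a local file name like 'API-2015-03-23.csv'."""
    return f"{record['EventType']}-{record['LogDate'][:10]}.csv"

def download_all_event_log_files():
    q = ("SELECT Id, EventType, LogDate FROM EventLogFile "
         "ORDER BY LogDate DESC")
    with api_get(f"{INSTANCE}/services/data/v33.0/query", {"q": q}) as resp:
        records = json.load(resp)["records"]
    for rec in records:
        # The LogFile field serves the raw CSV content over REST.
        path = (f"{INSTANCE}/services/data/v33.0/sobjects/EventLogFile/"
                f"{rec['Id']}/LogFile")
        with api_get(path) as body, open(log_file_name(rec), "wb") as out:
            out.write(body.read())
```

This is the same query-then-fetch pattern from steps 3 and 4, just automated so you can pull down every file type in one run and schedule it daily.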

6. Now you have the data locally, but how do you make it look good?

To make it look good, you need a visualization layer. Event Monitoring doesn't come with one by default; however, there are a number of add-on tools that you can use. Below are some great ISVs (independent software vendors) who already build on top of Event Monitoring:
  • Salesforce Analytics Cloud and the Wave Platform
  • Splunk
  • New Relic
  • Fairwarning
  • ezCloudAudit
  • SkyHigh Networks
  • Cloudlock
That doesn't mean you can't use other tools like Tableau or Qlik.  As long as they can handle CSV data, they can visualize Event Monitoring data.

In case you don't have access to any of these, I recommend a free service like Plot.ly or Raw.

In the Visualizing Identity Fraud Using Login History blog post, I discussed the Raw app from Density Designs.

To use Raw, just paste the CSV data you downloaded in step 4 and pick from one of their great visualizations like a Circular Dendrogram.


When you map your dimensions, pick USER_ID and URI to get a sense of who is downloading which reports.


And finally visualize your data.


Event Monitoring enables organizations to have self-service access to the app logs for a variety of use cases. What are you going to do with the data once you have it?