24 November 2014

Downloading Event Log Files using a Script

Event Monitoring, new in the Winter '15 release, enables use cases like adoption, user audits, troubleshooting, and performance profiling through an easy-to-download, file-based API for extracting Salesforce app log data.

The most important part is making it easy to download the data so that you can integrate it with your analytics platform.

To help make it easy, I created a simple bash shell script that downloads these CSV (comma-separated value) files to your local drive. It works best on Mac and Linux but can be made to work on Windows with a little elbow grease. You can try these scripts out at http://bit.ly/elfScripts. The scripts require a separate JSON processor called jq to parse the JSON returned by the REST API.

It's not difficult to build these scripts using other languages such as Ruby, Perl, or Python. What's important is the data flow.

I prompt the user to enter their username and password (which is masked as it's typed). These credentials can just as easily be stored in environment variables or encrypted so that you can automate the download on a daily basis using the cron or launchd schedulers.
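For unattended runs, the prompts can be replaced with environment variables. A hypothetical sketch (SFDC_USERNAME, SFDC_PASSWORD, and the file names in the crontab example are names I made up, not anything the script defines):

```shell
# Hypothetical non-interactive variant: read the credentials from environment
# variables (SFDC_USERNAME and SFDC_PASSWORD are invented names) so the
# script can run unattended.
username="${SFDC_USERNAME:-}"
password="${SFDC_PASSWORD:-}"
if [ -z "${username}" ] || [ -z "${password}" ]; then
  echo "Set SFDC_USERNAME and SFDC_PASSWORD before running" >&2
fi

# Example crontab entry to run a download script every night at 01:30:
# 30 1 * * * . $HOME/.elf_credentials && $HOME/bin/elfDownload.sh
```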

#!/bin/bash
# Bash script to download EventLogFiles
# Prerequisite: download jq (http://stedolan.github.io/jq/) to parse JSON

#prompt the user to enter their username (or hard-code it here for testing)
read -p "Please enter username (and press ENTER): " username

#prompt the user to enter their password 
read -s -p "Please enter password (and press ENTER): " password

#prompt the user to enter their instance end-point 
echo 
read -p "Please enter instance (e.g. na1) for the loginURL (and press ENTER): " instance

#prompt the user to enter the date for the logs they want to download
read -p "Please enter logdate (e.g. Yesterday, Last_Week, Last_n_Days:5) (and press ENTER): " day

Once we have the credentials, we can log in using OAuth and get the access token.

#set access_token for OAuth flow 
#change client_id and client_secret to your own connected app - http://bit.ly/clientId
access_token=`curl https://${instance}.salesforce.com/services/oauth2/token -d "grant_type=password" -d "client_id=3MVG99OxTyEMCQ3hSja25qIUWtJCt6fADLrtDeTQA12.liLd5pGQXzLy9qjrph.UIv2UkJWtwt3TnxQ4KhuD" -d "client_secret=3427913731283473942" -d "username=${username}" -d "password=${password}" -H "X-PrettyPrint:1" | jq -r '.access_token'`
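One failure mode worth guarding against: when the login fails, the token endpoint returns a JSON error body with no access_token field, so jq -r '.access_token' prints the literal string "null". A small guard of my own (not part of the original script) catches that:

```shell
# A guard I'd add: treat an empty or "null" token as a failed login rather
# than letting every later curl call fail with a 401.
check_token() {
  if [ -z "$1" ] || [ "$1" = "null" ]; then
    echo "Login failed - check username, password, and connected app keys" >&2
    return 1
  fi
}
# usage, right after the curl above: check_token "${access_token}" || exit 1
```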

Then we can query the event log files to get the Ids necessary to download the files and store the event type and log date in order to properly name the download directory and files.

#set elfs to the result of ELF query
elfs=`curl "https://${instance}.salesforce.com/services/data/v31.0/query?q=Select+Id+,+EventType+,+LogDate+From+EventLogFile+Where+LogDate+=+${day}" -H "Authorization: Bearer ${access_token}" -H "X-PrettyPrint:1"`

Using jq, we can parse the Id, event type, and log date in order to create the directory and file names.

#set the three variables to the array of Ids, EventTypes, and LogDates which will be used when downloading the files into your directory
ids=( $(echo ${elfs} | jq -r ".records[].Id") )
eventTypes=( $(echo ${elfs} | jq -r ".records[].EventType") )
logDates=( $(echo ${elfs} | jq -r ".records[].LogDate" | sed 's/T.*//') )
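The sed call strips everything from the 'T' onward, leaving just the date portion of LogDate. The same trim can be checked in isolation with a POSIX parameter expansion, no sed needed:

```shell
# LogDate comes back as a full ISO timestamp; we only want the date part
# for directory and file names.
logDate="2014-11-23T00:00:00.000+0000"
day_only="${logDate%%T*}"
echo "${day_only}"   # 2014-11-23
```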

Once the query returns, we create directories to store the files: one for the raw data and one for data with a converted timestamp that our analytics platform will understand better.

Then we iterate through each result, downloading the file and naming it EventType-LogDate so that we can easily refer back to it later. I also transform the TIMESTAMP field to make it easier to import into an analytics platform like Project Wave from Salesforce Analytics Cloud.

#loop through the array of results and download each file with the following naming convention: EventType-LogDate.csv
for i in "${!ids[@]}"; do
    
    #make directories to store the files by date, separating raw data from
    #converted timezone data (-p so repeat dates across event types don't error)
    mkdir -p "${logDates[$i]}-raw"
    mkdir -p "${logDates[$i]}-tz"

    #download files into the logDate-raw directory
    curl "https://${instance}.salesforce.com/services/data/v31.0/sobjects/EventLogFile/${ids[$i]}/LogFile" -H "Authorization: Bearer ${access_token}" -H "X-PrettyPrint:1" -o "${logDates[$i]}-raw/${eventTypes[$i]}-${logDates[$i]}.csv" 

    #convert files into the logDate-tz directory for Salesforce Analytics
    #(field 2 is the raw TIMESTAMP; rewrite it as ISO 8601)
    awk -F ',' '{ if (NR==1) printf("%s\n",$0); else { for (i=1;i<=NF;i++) { if (i>1) printf(","); if (i==2) printf "\"%s-%s-%sT%s:%s:%sZ\"", substr($2,2,4),substr($2,6,2),substr($2,8,2),substr($2,10,2),substr($2,12,2),substr($2,14,2); else printf("%s",$i); if (i==NF) printf("\n") } } }' "${logDates[$i]}-raw/${eventTypes[$i]}-${logDates[$i]}.csv" > "${logDates[$i]}-tz/${eventTypes[$i]}-${logDates[$i]}.csv"

done
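The timestamp rewrite that the awk step performs can be checked on a single value. The raw TIMESTAMP field arrives quoted, e.g. "20130715233322.670", and gets rewritten as an ISO 8601 value:

```shell
# Worked example of the substr() arithmetic: skip the leading quote, then
# slice out year, month, day, hour, minute, and second.
ts='"20130715233322.670"'
iso=$(echo "$ts" | awk '{ printf "%s-%s-%sT%s:%s:%sZ", substr($0,2,4), substr($0,6,2), substr($0,8,2), substr($0,10,2), substr($0,12,2), substr($0,14,2) }')
echo "$iso"   # 2013-07-15T23:33:22Z
```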

Downloading event log files is quick and efficient. You can find these scripts at http://bit.ly/elfScripts. Give them a try!

10 November 2014

Event Log File Field Lexicon

Event Log Files, new in the Winter '15 release, enable adoption, troubleshooting, and auditing use cases through an easy-to-download, file-based API for extracting Salesforce app log data.

It's an extremely rich data source, originally created by Salesforce developers to better understand the operational health of the overall service and better support our customers.

Extending access to these log files provides our customers the ability to support themselves using some of the same tools we've used to support them.

Most fields in the log files are self-describing like CLIENT_IP or TIMESTAMP. However, some of the log file fields can be difficult to understand without a lexicon.

There are a couple of reasons for this. One is that some fields are derived: the data is encoded as an enumerated value or an acronym that's defined in a separate place in the code.

A lot of the time this is done because single characters or numeric codes take up less storage space, which matters when you're storing terabytes of log files every day.

But this leaves us with a problem: what in the world does the data actually mean?

For instance, rather than store 'Partner' for the API_TYPE in the API log file, we store a simple code of 'P'.

Another example is when the code is spelled out and still needs interpretation. For instance, VersionRenditionDownload for the TRANSACTION_TYPE in the ContentTransfer log file simply means someone previewed a file in the app instead of downloading it (which is actually VersionDownloadAction or VersionDownloadApi).


All of this means we need a lexicon to map codes to possible values or examples so that we understand the data we're downloading.
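One way to put such a lexicon to work is a small lookup function in the download scripts. This is a sketch of my own, decoding the API_TYPE codes listed later in this post:

```shell
# Map single-character API_TYPE codes from the API event log file to
# readable names while post-processing a downloaded CSV.
decode_api_type() {
  case "$1" in
    X) echo "XmlRPC" ;;
    O) echo "Old SOAP" ;;
    E) echo "SOAP Enterprise" ;;
    P) echo "SOAP Partner" ;;
    M) echo "SOAP Metadata" ;;
    I) echo "SOAP Cross Instance" ;;
    S) echo "SOAP Apex" ;;
    D) echo "Apex Class" ;;
    R) echo "REST API" ;;
    T) echo "SOAP Tooling" ;;
    *) echo "Unknown ($1)" ;;
  esac
}
decode_api_type P   # SOAP Partner
```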

Below are some example fields to help make sense of the data from Event Log Files.

Common Log File Fields
These are log fields you'll see across many different log files and typically address who, what, when, where, and how.

Field Name | Description | Possible Values or Examples
CLIENT_IP | The IP address of the client using Salesforce services. | e.g. 192.168.0.1
EVENT_TYPE | The type of event, such as content sharing. | e.g. URI
ORGANIZATION_ID | The 15-character ID of the organization. | e.g. 00DB00000000mZw
REQUEST_ID | The unique ID of a single transaction. | e.g. 3nWgxWbDKWWDIk0FKfF5DV
REQUEST_STATUS | The status of the request for a page view or user interface action. | S: Success; F: Failure; U: Uninitialized
TIMESTAMP | The access time of Salesforce services, in UTC. | e.g. 20130715233322.670, which equals 2013-07-15T23:33:22.670+0000
URI | The URI of the page receiving the request. | e.g. /home/home.jsp
USER_ID | The 15-character ID of the user using Salesforce services, whether through the UI or the API. | e.g. 005B00000018C2g

Log File Specific Fields
These are log fields that are unique to one or two log files and typically represent a type, operation, or other enumerated value.

EventType (File Type) | Field Name | Description | Possible Values or Examples
APEX_CALLOUT_EVENT | METHOD | The HTTP method of the callout. | e.g. GET, POST, PUT, DELETE
APEX_CALLOUT_EVENT | TYPE | The type of callout. | e.g. REST, AJAX
APEX_TRIGGER_EVENT | TRIGGER_TYPE | The type of this trigger. | AfterInsert; AfterUpdate; BeforeInsert; BeforeUpdate
API_EVENT | METHOD_NAME | The API method that is invoked. | e.g. query(), insert(), upsert(), delete()
API_EVENT | API_TYPE | The type of API invoked. | X: XmlRPC; O: Old SOAP; E: SOAP Enterprise; P: SOAP Partner; M: SOAP Metadata; I: SOAP Cross Instance; S: SOAP Apex; D: Apex Class; R: REST API; T: SOAP Tooling
ASYNC_REPORT_EVENT | DISPLAY_TYPE | The report display type, indicating the run mode of the report. | D: Dashboard; S: Show Details; H: Hide Details
ASYNC_REPORT_EVENT | RENDERING_TYPE | The report rendering type, describing the format of the report output. | W: Web (HTML); E: Email; P: Printable; X: Excel; C: CSV (comma-separated values); J: JSON (JavaScript Object Notation)
CONTENT_DOCUMENT_LINK_EVENT | SHARING_OPERATION | The type of sharing operation on the document. | e.g. INSERT, UPDATE, or DELETE
CONTENT_DOCUMENT_LINK_EVENT | SHARING_PERMISSION | What permissions the document was shared with. | V: Viewer; C: Collaborator; I: Inferred, that is, the sharing permissions were inferred from a relationship between the viewer and document (for example, a document's owner has a sharing permission to the document itself, or the viewer has sharing permissions to a content collection the document is part of, rather than explicit permissions to the document directly)
CONTENT_TRANSFER_EVENT | TRANSACTION_TYPE | The operation performed. | VersionDownloadAction and VersionDownloadApi represent downloads via the user interface and API respectively; VersionRenditionDownload represents a file preview action; saveVersion represents a file being uploaded
DASHBOARD_EVENT | DASHBOARD_TYPE | The type of dashboard. | R: Run as Running User; C: Run as Context User; S: Run as Specific User
LOGOUT_EVENT | USER_INITIATED_LOGOUT | The user type used when logging out. | 1 if the user intentionally logged out by clicking the Logout link; 0 if they were logged out by a timeout or other implicit logout action
MDAPI_OPERATION_EVENT | OPERATION | The operation being performed. | e.g. DEPLOY, RETRIEVE, LIST, DESCRIBE
PACKAGE_INSTALL_EVENT | OPERATION_TYPE | The type of package operation. | INSTALL; UPGRADE; EXPORT; UNINSTALL; VALIDATE_PACKAGE; INIT_EXPORT_PKG_CONTROLLER
REPORT_EVENT | DISPLAY_TYPE | The report display type, indicating the run mode of the report. | D: Dashboard; S: Show Details; H: Hide Details
REPORT_EVENT | RENDERING_TYPE | The report rendering type, describing the format of the report output. | W: Web (HTML); E: Email; P: Printable; X: Excel; C: CSV (comma-separated values); J: JSON (JavaScript Object Notation)
REST_API_EVENT | METHOD | The HTTP method of the request. | e.g. GET, POST, PUT, DELETE
SITES_EVENT | HTTP_METHOD | The HTTP method of the request. | e.g. GET, POST, PUT, DELETE
SITES_EVENT | REQUEST_TYPE | The request type. | page: a normal request for a page; content_UI: a content request for a page originated in the user interface; content_apex: a content request initiated by an Apex call; PDF_UI: a request for a page in PDF format through the user interface; PDF_apex: a request for PDF format by an Apex call (usually a Web Service call)
UI_TRACKING_EVENT | CONNECTION_TYPE | Method used by the mobile device to connect to the web. | e.g. CDMA1x, CDMA, EDGE, EVDO0, EVDOA, EVDOB, GPRS, HSDPA, HSUPA, HRPD, LTE, OFFLINE, WIFI
VISUALFORCE_EVENT | REQUEST_TYPE | The request type. | page: a normal request for a page; content_UI: a content request for a page originated in the user interface; content_apex: a content request initiated by an Apex call; PDF_UI: a request for a page in PDF format through the user interface; PDF_apex: a request for PDF format by an Apex call (usually a Web Service call)

03 November 2014

Hadoop and Pig come to the Salesforce Platform with Data Pipelines


Event Log File data is big - really, really big. This isn't your everyday CRM data, where you may have hundreds of thousands of records or even a few million here and there. One organization I work with generates approximately twenty million rows of event data per day using Event Log Files. That's approximately 600 million rows per month, or 3.6 billion every half year.
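Those volumes are easy to sanity-check with shell arithmetic:

```shell
# Back-of-the-envelope check of the numbers above (30-day month assumed).
per_day=20000000
per_month=$((per_day * 30))        # 600000000
per_half_year=$((per_month * 6))   # 3600000000
echo "${per_month} ${per_half_year}"
```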

Because the size of the data does matter, we need tools that can orchestrate and process this data for a variety of use cases. For instance, one best practice when working with Event Log Files is to de-normalize Ids into Name fields. Rather than reporting on the most adopted reports by Id, it's better to show the most adopted reports by Name.

There are many ways to handle this operation off the platform. On the platform, however, there has really only been one way to handle it in the past: Batch Apex.

In pilot with the Winter '15 release (page 198 of the release notes), Data Pipelines provides a way for users to upload data into Hadoop and run Pig scripts against it. The advantage is that it handles many different data sources, including sObjects, Chatter files, and external objects, at scale.

I worked with Prashant Kommireddi on the following scripts which help me understand which reports users are viewing:

1. Export user Ids and Names using SOQL into userMap.csv (Id,Name) which I upload to chatter files
-- 069B0000000NBbN = userMap file stored in chatter
user = LOAD 'force://chatter/069B0000000NBbN' using gridforce.hadoop.pig.loadstore.func.ForceStorage() as (Id, Name);
-- loop through user map to reduce Id from 18 to 15 characters to match the log lines
subUser = foreach user generate SUBSTRING(Id, 0, 15) as Id, Name;
-- storing into FFX to retrieve in next step
STORE subUser INTO 'ffx://userMap15.csv' using gridforce.hadoop.pig.loadstore.func.ForceStorage();
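The SUBSTRING(Id, 0, 15) call keeps the case-sensitive 15-character prefix of an 18-character Salesforce Id, which is the form that appears in the log lines. The same trim in shell (the sample Id is made up from the 15-character user Id used elsewhere in this blog plus a hypothetical 3-character suffix):

```shell
# Trim an 18-character Salesforce Id down to its 15-character prefix so it
# matches the Ids recorded in the event log files.
id18="005B00000018C2gIAE"
id15=$(echo "${id18}" | cut -c1-15)
echo "${id15}"   # 005B00000018C2g
```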

2. Export report Ids and Names using SOQL into reportMap.csv (Id,Name) which I upload to chatter files
-- 069B0000000NBbD = reportMap file stored in chatter
report = LOAD 'force://chatter/069B0000000NBbD' using gridforce.hadoop.pig.loadstore.func.ForceStorage() as (Id, Name);
-- loop through report map to reduce Id from 18 to 15 characters to match the log lines
subReport = foreach report generate SUBSTRING(Id, 0, 15) as Id, Name;
-- storing into FFX to retrieve in next step
STORE subReport INTO 'ffx://reportMap15.csv' using gridforce.hadoop.pig.loadstore.func.ForceStorage();

3. createReportExport - Upload ReportExport.csv to chatter files and run script to combine all three
-- Step 1: load users and store 15 char id
userMap = LOAD 'ffx://userMap15.csv' using gridforce.hadoop.pig.loadstore.func.ForceStorage() as (Id, Name);
-- Step 2: load reports and store 15 char id
reportMap = LOAD 'ffx://reportMap15.csv' using gridforce.hadoop.pig.loadstore.func.ForceStorage() as (Id, Name);
-- Step 3: load full schema from report export elf csv file
elf = LOAD 'force://chatter/069B0000000NB1r' using gridforce.hadoop.pig.loadstore.func.ForceStorage() as (EVENT_TYPE, TIMESTAMP, REQUEST_ID, ORGANIZATION_ID, USER_ID, RUN_TIME, CLIENT_IP, URI, CLIENT_INFO, REPORT_DESCRIPTION);
-- Step 4: remove '/' from URI field to create an Id map
cELF = foreach elf generate EVENT_TYPE, TIMESTAMP, REQUEST_ID, ORGANIZATION_ID, USER_ID, RUN_TIME, CLIENT_IP, SUBSTRING(URI, 1, 16) as URI, CLIENT_INFO, REPORT_DESCRIPTION;
-- Step 5: join all three files by the common user Id field
joinUserCELF = join userMap by Id, cELF by USER_ID;
joinReportMapELF = join reportMap by Id, cELF by URI;
finalJoin = join joinUserCELF by cELF::USER_ID, joinReportMapELF by cELF::USER_ID;
-- Step 6: generate output based on the expected column positions
elfPrunedOutput = foreach finalJoin generate $0, $1, $2, $3, $4, $5, $7, $8, $10, $11, $12, $13;
-- Step 7: store output in CSV
STORE elfPrunedOutput INTO 'force://chatter/reportExportMaster.csv' using gridforce.hadoop.pig.loadstore.func.ForceStorage();
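As a loose analogy only (this is not how Data Pipelines executes, and the one-line sample files and names below are made up), the denormalizing joins in the script behave like the coreutils join command, matching rows on a shared Id column:

```shell
# Tiny local illustration of joining a name map against log rows on USER_ID.
printf '005B00000018C2g,Adam\n' > users.csv           # Id,Name map
printf '005B00000018C2g,00OB0000001dWP3\n' > logs.csv # USER_ID,report Id
join -t, users.csv logs.csv   # 005B00000018C2g,Adam,00OB0000001dWP3
```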

By combining the power of data pipelines, I can transform a Wave platform report that shows raw report Ids into one that shows human-readable report Names. (The original post included before-and-after report screenshots here.)

To learn more about data pipelines and using Hadoop at Salesforce, download the Data Pipelines Implementation Guide (Pilot) and talk with your account executive about getting into the pilot.