24 November 2014

Downloading Event Log Files using a Script

Event Monitoring, new in the Winter '15 release, enables use cases like adoption, user audit, troubleshooting, and performance profiling through an easy-to-download, file-based API for extracting Salesforce app log data.

The most important part is making it easy to download the data so that you can integrate it with your analytics platform.

To help make it easy, I created a simple bash shell script that downloads these CSV (comma-separated value) files to your local drive. It works best on Mac and Linux but can be made to work on Windows with a little elbow grease. You can try these scripts out at http://bit.ly/elfScripts. The scripts require a separate command-line tool called jq to parse the JSON returned by the REST API.

It's not difficult to build these scripts using other languages such as Ruby, Perl, or Python. What's important is the data flow.

I prompt the user to enter their username and password (which is masked). This information can just as easily be stored in environment variables or encrypted so that you can automate the download on a daily basis with a scheduler like cron or launchd.

# Bash script to download EventLogFiles
# Pre-requisite: download - http://stedolan.github.io/jq/ to parse JSON

#prompt the user to enter their username or uncomment #username line for testing purposes
read -p "Please enter username (and press ENTER): " username

#prompt the user to enter their password 
read -s -p "Please enter password (and press ENTER): " password

#prompt the user to enter their instance end-point 
read -p "Please enter instance (e.g. na1) for the loginURL (and press ENTER): " instance

#prompt the user to enter the date for the logs they want to download
read -p "Please enter logdate (e.g. Yesterday, Last_Week, Last_n_Days:5) (and press ENTER): " day
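If you'd rather run the script unattended from cron, the prompts above can fall back to environment variables. Here's a minimal sketch; SF_USERNAME, SF_PASSWORD, and SF_INSTANCE are hypothetical variable names of my choosing, not anything the Salesforce API requires:

```shell
#sketch: use an environment variable when set, otherwise prompt interactively
#SF_USERNAME, SF_PASSWORD, and SF_INSTANCE are hypothetical names
prompt_or_env() {
    # $1 = value of the environment variable, $2 = prompt text
    if [ -n "$1" ]; then
        printf '%s' "$1"
    else
        read -p "$2" reply
        printf '%s' "$reply"
    fi
}

#replaces the read lines above, e.g.:
#username=$(prompt_or_env "$SF_USERNAME" "Please enter username (and press ENTER): ")
```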

Once we have the credentials, we can log in using OAuth and get the access token.

#set access_token for OAuth flow 
#change client_id and client_secret to your own connected app - http://bit.ly/clientId
access_token=`curl https://${instance}.salesforce.com/services/oauth2/token -d "grant_type=password" -d "client_id=3MVG99OxTyEMCQ3hSja25qIUWtJCt6fADLrtDeTQA12.liLd5pGQXzLy9qjrph.UIv2UkJWtwt3TnxQ4KhuD" -d "client_secret=3427913731283473942" -d "username=${username}" -d "password=${password}" -H "X-PrettyPrint:1" | jq -r '.access_token'`
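One thing worth guarding against: if the login fails (bad password, missing security token, wrong instance), the token endpoint returns a JSON error document instead of a token, and jq leaves access_token set to the string "null". A small, hypothetical check you could add before continuing:

```shell
#sketch: stop early if login failed; on failure the token endpoint returns
#{"error": "...", "error_description": "..."} and no access_token, so
#jq -r '.access_token' yields the string "null"
check_token() {
    if [ -z "$1" ] || [ "$1" = "null" ]; then
        echo "Login failed - check username, password, security token, and instance" >&2
        return 1
    fi
    return 0
}

#usage after the curl call above:
#check_token "${access_token}" || exit 1
```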

Then we can query the event log files to get the Ids necessary to download the files and store the event type and log date in order to properly name the download directory and files.

#set elfs to the result of ELF query
elfs=`curl "https://${instance}.salesforce.com/services/data/v31.0/query?q=Select+Id+,+EventType+,+LogDate+From+EventLogFile+Where+LogDate+=+${day}" -H "Authorization: Bearer ${access_token}" -H "X-PrettyPrint:1"`
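Hand-encoding the SOQL query with `+` characters is easy to get wrong. An alternative sketch that lets curl do the encoding (`-G` sends `--data-urlencode` pairs as query-string parameters; both are standard curl options):

```shell
#sketch: build the SOQL query as a plain string and let curl URL-encode it
build_soql() {
    printf 'Select Id, EventType, LogDate From EventLogFile Where LogDate = %s' "$1"
}

#elfs=`curl -G "https://${instance}.salesforce.com/services/data/v31.0/query" \
#     --data-urlencode "q=$(build_soql ${day})" \
#     -H "Authorization: Bearer ${access_token}" -H "X-PrettyPrint:1"`
```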

Using jq, we can parse the Id, EventType, and LogDate fields in order to create the directory and file names.

#set the three variables to the array of Ids, EventTypes, and LogDates which will be used when downloading the files into your directory
ids=( $(echo ${elfs} | jq -r ".records[].Id") )
eventTypes=( $(echo ${elfs} | jq -r ".records[].EventType") )
logDates=( $(echo ${elfs} | jq -r ".records[].LogDate" | sed 's/T.*//') )

We create directories to store the files: one for the raw data, and one for copies with the timestamp converted to something our analytics platform will understand better.

Then we can iterate through each download, naming each file EventType-LogDate so that we can easily refer back to it later on. I also transform the TIMESTAMP field to make it easier to import into an analytics platform like Project Wave from Salesforce Analytics Cloud.
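For example, a raw TIMESTAMP value of 20140918012255.715 becomes "2014-09-18T01:22:55Z". The slicing that the awk command in the loop below performs can be sketched as a standalone shell function:

```shell
#sketch: re-slice a raw EventLogFile TIMESTAMP (YYYYMMDDhhmmss.SSS) into
#ISO 8601 so analytics tools recognize it as a date
to_iso8601() {
    t=$1
    printf '%s-%s-%sT%s:%s:%sZ' \
        "$(printf '%s' "$t" | cut -c1-4)" \
        "$(printf '%s' "$t" | cut -c5-6)" \
        "$(printf '%s' "$t" | cut -c7-8)" \
        "$(printf '%s' "$t" | cut -c9-10)" \
        "$(printf '%s' "$t" | cut -c11-12)" \
        "$(printf '%s' "$t" | cut -c13-14)"
}
#to_iso8601 20140918012255.715 prints 2014-09-18T01:22:55Z
```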

#loop through the array of results and download each file with the following naming convention: EventType-LogDate.csv
for i in "${!ids[@]}"; do
    #make directories (one per log date) to store the files, separating raw
    #data from converted-timezone data; -p avoids errors when a directory
    #already exists because several files share the same log date
    mkdir -p "${logDates[$i]}-raw"
    mkdir -p "${logDates[$i]}-tz"

    #download files into the logDate-raw directory
    curl "https://${instance}.salesforce.com/services/data/v31.0/sobjects/EventLogFile/${ids[$i]}/LogFile" -H "Authorization: Bearer ${access_token}" -H "X-PrettyPrint:1" -o "${logDates[$i]}-raw/${eventTypes[$i]}-${logDates[$i]}.csv" 

    #convert files into the logDate-tz directory for Salesforce Analytics
    awk -F ','  '{ if(NR==1) printf("%s\n",$0); else{ for(i=1;i<=NF;i++) { if(i>1&& i<=NF) printf("%s",","); if(i == 2) printf "\"%s-%s-%sT%s:%s:%sZ\"", substr($2,2,4),substr($2,6,2),substr($2,8,2),substr($2,10,2),substr($2,12,2),substr($2,14,2); else printf ("%s",$i);  if(i==NF) printf("\n")}}}' "${logDates[$i]}-raw/${eventTypes[$i]}-${logDates[$i]}.csv" > "${logDates[$i]}-tz/${eventTypes[$i]}-${logDates[$i]}.csv"
done

Downloading event log files is quick and efficient. You can try these scripts out at http://bit.ly/elfScripts. Give it a try!
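If the credentials come from environment variables as mentioned above, the whole thing can run on a schedule. A hypothetical crontab entry (the script path, credentials, and log file location are all placeholders):

```
#hypothetical crontab entry: download yesterday's logs every day at 01:30
#(crontab supports simple VAR=value lines before the schedule entries)
SF_USERNAME=integration@example.com
SF_PASSWORD=replace-me
SF_INSTANCE=na1
30 1 * * * /path/to/download_elf.sh >> /var/log/elf-download.log 2>&1
```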


  1. Replies
    1. Thanks Ryan for following the blog!

  2. Any updates to this?

    The sample script in this blog works perfectly against my Developer account but not production.

    The result of the token request is:
    "error" : "invalid_grant",
    "error_description" : "authentication failure"

    I've been round and round. It MUST be something simple I missed on the Salesforce setup side.

    1. Can you check to see if a security token is needed for your user?

    2. You can also get a sample script or download files via http://Salesforce-elf.herokuapp.com

    3. Hello Adam,
      I have a problem with your scripts on Cygwin.
      1. The {access_token} variable gets the correct value, but when it's used in the curl query it doesn't show any results :(
      I mean this part:
      elfs=`curl https://${instance}.salesforce.com/services/data/v32.0/query?q=Select+Id+,+EventType+,+LogDate+From+EventLogFile+Where+LogDate+=+${day} -H "Authorization: Bearer ${access_token}" -H "X-PrettyPrint:1"`

      2. If I assign a fixed value to it, the script starts the file download but I get the following message

      .csv.-03-27.csv to URI
      .csver of input files: 1 merged to output file: URI
      .csv.-03-27.csv to VisualforceRequest
      .csver of input files: 1 merged to output file: VisualforceRequest
      .csv.-03-27.csv to WaveChange
      .csver of input files: 1 merged to output file: WaveChange

      and no files are downloaded.
      I've tried using the script on Ubuntu 16.04 with a similar result :(

      Please help.

    4. Hi Jakub,

      Thanks for reading the blog! I'm sorry you're having this issue.

      A way to simplify this is to bootstrap the bash script via the web browser app on Heroku: http://Salesforce-elf.herokuapp.com, which will auto-generate the bash script and may help to troubleshoot what's going on here.

      Something odd is definitely happening in your run, because the script shouldn't be merging files at all. That's another reason to simplify: try downloading just one day and one file type first before trying to download all the files at once.

  3. Is there a way to do this without having Admin access to SF? I could run the SOQL in Workbench but downloading the results one by one is really painful.


  8. Thanks, it really works.


  11. First, thanks for giving the step-by-step example of extracting the event log files.

    However, I have a concern with the code. If you have, say, 10K events per user across 500 users in an org, you are looking at making at least 5M API calls. Will that not be a concern for the client?

    I tried querying the EventLogFile object in Workbench and it produced a LogFile field with a base64-encoded value, which can be decoded and stored as a file. When I tried doing the same with the SOAP API, I failed; I got LogFile URLs instead. I would appreciate it if you could provide some pointers.

    1. Thank you for your time and consideration.

      Ketan Benegal
