Thursday, October 4, 2012

The Joy of Salesforce Governor Limits

Working within the governor limits in Salesforce can be a real balancing act. I understand why they're there; it's a shared environment and you can't have developers writing code that bogs down the servers for everyone else. Especially because, let's face it, there are a lot of really shitty developers out there. But that doesn't mean I don't curse their name when I see those dreaded LimitException errors.

I started writing Apex as soon as it came out, back when the governor limits were a lot more stringent and nifty tricks for avoiding them, like future methods, didn't exist yet. Nowadays, I have a number of clients processing large amounts of data and/or implementing very complex business processes on their data, so I've come up with quite a few tips and tricks for dealing with the limits that I thought I'd share.

Bulk Proofing Code and Future Methods

I'm lumping these together because you don't really need me to explain them to you. Unlike many other bloggers, I'm not interested in proving how clever I am by regurgitating information already published much more eloquently by Salesforce themselves. If you want more info on those, Google is your friend. If you can't figure out how to bulk-proof your code, you probably have no business writing code; this is shared-environment coding 101 (and not a bad thing for developers working within their own environments to know either).
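
For the record, bulk-proofing just means doing your queries and DML on whole collections instead of record-by-record, so a trigger behaves the same whether it gets 1 record or 200. A minimal sketch; the trigger, the Region__c field, and the whole scenario are invented for illustration:

```apex
//Bulk-proofing 101: one query for the whole Trigger.new list,
//never a query or DML statement inside the loop
trigger SetContactRegion on Contact (before insert) {
    //Gather all the parent Account Ids first
    Set<Id> accountIds = new Set<Id>();
    for (Contact c : Trigger.new){
        if (c.AccountId != null) accountIds.add(c.AccountId);
    }
    //A single query covers every record in the batch
    Map<Id, Account> accounts = new Map<Id, Account>(
        [SELECT Region__c FROM Account WHERE Id IN :accountIds]);
    //Set the field in memory; before-insert triggers need no extra DML
    for (Contact c : Trigger.new){
        if (accounts.containsKey(c.AccountId)){
            c.Region__c = accounts.get(c.AccountId).Region__c;
        }
    }
}
```

The same code with the SOQL query inside the loop would blow through the query limit on any decent-sized data import.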

Two drawbacks of future methods are worth noting if you're considering implementing them:
  • Your future method may be generating records or updating fields that your users expect to see instantly, and it can be confusing for them when they don't. Generally it happens fast enough that all they have to do is refresh the page, but asynchronous logic is a little abstract to explain to a non-developer (and even to a lot of developers).
  • If your future method hits an error, either your users will never know it or you'll have to write in logic that emails you when an error is hit, which again is kind of confusing to an end user. Be prepared to fix the errors those emails report, and remember to go back and clean up any data affected by the fact that your method didn't fire correctly.
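
As a sketch of that second point, here's one way to wrap a future method so that failures get emailed instead of disappearing silently. I'm borrowing the syncClientStatistics name from the chunking example below; the body and the admin address are placeholders, not real logic:

```apex
public class clsAttendance {
    @future
    public static void syncClientStatistics(Set<Id> attendanceIds, Set<Id> otherIds){
        try {
            //...the real statistics-syncing logic would go here...
        } catch (Exception e){
            //The user who kicked this off will never see the error,
            //so email it to an admin who can fix the data afterwards
            Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
            mail.setToAddresses(new String[]{ 'admin@example.com' });
            mail.setSubject('syncClientStatistics hit an error');
            mail.setPlainTextBody(e.getMessage() + '\n' + e.getStackTraceString());
            Messaging.sendEmail(new Messaging.SingleEmailMessage[]{ mail });
        }
    }
}
```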

Chunking Future Methods

I've had scenarios where DML operations on multiple records, set off either by Apex or by data imports, fire a future method and then hit governor limits within that method. Each transaction can set off up to 10 future methods, so if governor limits are an issue in your future method, consider breaking whatever you're sending into the method into manageable chunks. Here's an example of trigger logic that does exactly that:

//Breaking this into chunks of 30 to avoid governor limits
Map<Integer, Set<Id>> mapSetStats = new Map<Integer, Set<Id>>();
for (Attendance__c a : Trigger.new){
    if ((Trigger.isInsert && a.Status__c == 'Attended')
     || (Trigger.isUpdate
         && a.Status__c != Trigger.oldMap.get(a.Id).Status__c
         && Trigger.oldMap.get(a.Id).Status__c == 'Attended')){
        //For the first qualifying record, or when the last Set in the
        //Map already holds 30 Ids, add a new Set to the Map
        if (mapSetStats.isEmpty() || mapSetStats.get(mapSetStats.size()).size() == 30){
            mapSetStats.put(mapSetStats.size() + 1, new Set<Id>());
        }
        mapSetStats.get(mapSetStats.size()).add(a.Id);
    }
}
//Sync Monthly/Annual Client Statistics, one future call per chunk:
for (Integer i : mapSetStats.keySet()){
    clsAttendance.syncClientStatistics(mapSetStats.get(i), new Set<Id>());
}

In this example, "mapSetStats" is populated with individual Sets of Ids, each no larger than 30 records. I happen to know that users will never update more than 300 records at a time (10 future calls * 30-record batches), so I can get away with this. If chunking isn't feasible for any reason, then you'll need another solution, such as....

Business Logic in Scheduled Code

I've used this not only for getting around governor limits but also for limits on the other side of external web service callouts. Here's the basic idea:
  • Instead of sending records to a future method for processing, find a way to gather the records that require processing in a query. In some scenarios, I've created a checkbox field specifically for this purpose and set it to true in a trigger, but sometimes the very nature of what you need to accomplish (e.g. a blank field that needs to be filled in) means you can identify those records in a query without explicitly marking them.
  • Create a Batchable class which queries for the records marked in the step above. For details on this, see Salesforce's article on Using Batch Apex.
  • Create a Schedulable class which kicks off your Batchable class on a schedule; the logic itself runs in the Batchable's execute method. Again, you don't need me to figure this out; see Apex Scheduler.
  • When you use Database.executeBatch to run the Batchable class, you can pass in an optional second parameter to limit the number of records handed to each execute call; you just need to figure out the largest number of records you can process at once without hitting the limits.
  • If you did need to explicitly mark records for processing, like my checkbox example above, make sure you un-mark them when the processing is complete. Otherwise, the same records will just keep being updated in an endless loop.
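The steps above might look something like this as a sketch; the Needs_Processing__c checkbox, the object, and the class names are all made up for illustration:

```apex
//Batchable: queries the marked records, processes them in small scopes,
//and un-marks them when done so they don't loop forever
global class ProcessMarkedRecords implements Database.Batchable<sObject> {
    global Database.QueryLocator start(Database.BatchableContext bc){
        return Database.getQueryLocator(
            'SELECT Id FROM Attendance__c WHERE Needs_Processing__c = true');
    }
    global void execute(Database.BatchableContext bc, List<Attendance__c> scope){
        //...business logic on this chunk goes here...
        for (Attendance__c a : scope){
            a.Needs_Processing__c = false; //un-mark when processing completes
        }
        update scope;
    }
    global void finish(Database.BatchableContext bc){}
}

//Schedulable: kicks off the batch; the second argument to executeBatch
//caps how many records each execute() call receives
global class ProcessMarkedRecordsSchedule implements Schedulable {
    global void execute(SchedulableContext sc){
        Database.executeBatch(new ProcessMarkedRecords(), 30);
    }
}
```

You'd schedule it from Setup or from anonymous Apex with something like System.schedule('Nightly stats sync', '0 0 2 * * ?', new ProcessMarkedRecordsSchedule());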
As with future methods, users will need to understand that there will be a delay in the updates that happen as a result of scheduled code. I've been lucky enough that in the cases where I've had to use scheduled code, users haven't had an issue with that. If your users do have a problem with it, remind them that the fancy, magical updates they're looking for were probably absolutely impossible for them to accomplish just a few years ago, so they'll have to shut up and deal. Depending on your relationship with them, you might need to rephrase that.
