Friday, October 5, 2012

Zoo Life- The End

My stint as a docent at the Prospect Park Zoo came to an end a few months back and I thought I'd take a moment to talk about why.  Partly just for the irony of it; you'll see what I mean.

Here's how the story began, but not why I ultimately quit.  I was part of the animal handling program there.  This had been going on for a number of years; docents who had gone through training and been certified were able to present animals to the public during tours and scheduled presentations on weekends.  Not all of the zoo animals, mind you- only ones the animal keepers had designated as tractable based on species and the individual, like chinchillas, chickens and some snake and lizard species.  It was a rewarding program for the docents and a nice chance for the public to get up-close-and-personal with some of the animals, including a chance to touch them.

Rather suddenly, the program was ended.  To my knowledge, there was no unfortunate incident that brought this on, and that's the kind of rumor that spreads pretty quickly.  This seemed more like a policy decision from the managerial level, probably from someone new to the zoo who was worried about liability.  That's a guess, though; we were just told that the program was no more.  I was disappointed, but also recognized that I was just a volunteer there and as such didn't have any say in policy.  Some of the other docents took it a lot harder; there were meetings (including tearful pleas), emails and petitions on the web to appeal the end of the program.  Frankly, it was all a little melodramatic for my tastes.

At this point, though, I'm still a loyal volunteer and staying out of the fray.  I even received a call from the NY Post (blech) on the matter and had nothing but good things to say about the zoo.  Until the day I was informed that a new condition of being a volunteer was that I had to sign an agreement saying I would never make any negative public statements about the zoo (via media, Facebook, email or however).  Excuse me?  I had no problem with restrictions on what I could say while I was at the zoo acting as a representative, but trying to restrict my free speech as an individual outside of the zoo?  That was unacceptable to me.  If I were an employee, it would be illegal to require that of me; we have whistleblower protection laws in this country for that.  If they did ask their employees to sign the same agreement, I hope the illegality of it comes back to bite them in the ass later.

I don't like it when people try to inappropriately assert authority over me, or anyone.  People who do so are generally bad human beings even beyond their pathetic little power grabs.  Being the CEO of my own company has made me rather sensitive to it.  So I told them I wouldn't sign, and why, and left the zoo sniffling on my final day of volunteering.  I really did enjoy my time there until some asshole who didn't even have the courage to identify themselves came along and ruined it for me.  God I hate having principles sometimes.

Thursday, October 4, 2012

The Joy of Salesforce Governor Limits

Working within the governor limits in Salesforce can be a real balancing act.  I understand why they're there; it's a shared environment and you can't have developers writing code that bogs down the servers for everyone else.  Especially because, let's face it, there are a lot of really shitty developers out there.  But that doesn't mean I don't curse their names when I see those dreaded LimitException errors.

I started writing Apex as soon as it came out; back when the governor limits were a lot more stringent and nifty tricks for avoiding them like future methods didn't exist yet.  Nowadays, I have a number of clients processing large amounts of data and/or implementing very complex business processes on their data, so I've come up with quite a few tips and tricks on dealing with the limits that I thought I'd share.

Bulk Proofing Code and Future Methods

I'm lumping these together because you don't really need me to explain them to you.  Unlike many other bloggers, I'm not interested in proving how clever I am by regurgitating information already published much more eloquently by Salesforce themselves, so I'll just refer you to their documentation.
If you want more info on those, Google is your friend.  If you can't figure out how to bulk-proof your code, you probably have no business writing code; this is shared-environment coding 101 (and not a bad thing for developers working in their own environments to know either).
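That said, here's what bulk-proofing looks like in practice- a minimal sketch with a hypothetical trigger and a made-up rollup field (Contact_Count__c).  The whole point is one query and one DML statement for the entire batch, never one per record-

//Hypothetical example: roll up a count of new Contacts onto their parent Accounts.
//The trigger name and the Contact_Count__c field are illustrative only.
trigger ContactAccountSync on Contact (after insert) {
    //Collect the parent Ids from every record in the batch first
    Set<Id> accountIds = new Set<Id>();
    for (Contact c : Trigger.new){
        if (c.AccountId != null){
            accountIds.add(c.AccountId);
        }
    }
    if (!accountIds.isEmpty()){
        //One query for all parent Accounts, not one query per Contact
        Map<Id, Account> accounts = new Map<Id, Account>(
            [SELECT Id, Contact_Count__c FROM Account WHERE Id IN :accountIds]);
        for (Contact c : Trigger.new){
            Account acct = accounts.get(c.AccountId);
            if (acct != null){
                acct.Contact_Count__c =
                    (acct.Contact_Count__c == null ? 0 : acct.Contact_Count__c) + 1;
            }
        }
        //One DML statement for all modified Accounts, not one per Contact
        update accounts.values();
    }
}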

Two drawbacks of Future methods that are worth noting if you're considering implementing them-
  • Your future method may be generating records or updating fields that your users are expecting to see instantly, and it can be confusing for them when they don't see those changes right away.  Generally it happens fast enough that all they have to do is refresh the page, but asynchronous logic is a little abstract to explain to a non-developer (and to a lot of developers, for that matter).
  • If your future method hits an error, either your user will never know it or you'll have to write in logic that emails someone when an error is hit- but again, that's kind of confusing to an end user.  Be prepared to fix the errors that arrive in your inbox from future methods, and remember to go back and clean up any data affected by the method not completing correctly.  A sketch of the error-email pattern follows this list.
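For that second point, the pattern is nothing fancy- a try/catch around the future method's body and an email to someone who can act on it.  A minimal sketch, with a hypothetical class, method and admin address-

public class clsRollupHelper {
    //Hypothetical future method- the processing itself is just a placeholder.
    //The user who triggered this will never see an exception, so email an admin instead.
    @future
    public static void recalcRollups(Set<Id> accountIds){
        try {
            List<Account> accts = [SELECT Id FROM Account WHERE Id IN :accountIds];
            //...real rollup logic would go here...
            update accts;
        } catch (Exception e) {
            Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
            mail.setToAddresses(new String[] { 'admin@example.com' }); //assumption- whoever should be notified
            mail.setSubject('Future method failed: recalcRollups');
            mail.setPlainTextBody(e.getMessage() + '\n' + e.getStackTraceString());
            Messaging.sendEmail(new Messaging.SingleEmailMessage[] { mail });
        }
    }
}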

Chunking Future Methods

I've had scenarios where DML operations on multiple records, set off either by Apex or by data imports, fire a future method and then hit governor limits within that method.  Each transaction can set off up to 10 future calls, so if governor limits are an issue inside your future method, consider breaking whatever you're sending into it into manageable chunks.  Here's an example of trigger logic that does exactly that-

//Breaking this into chunks of 30 to avoid governor limits
Map<Integer, Set<Id>> mapSetStats = new Map<Integer, Set<Id>>();
for (Attendance__c a : Trigger.new){
    //The Trigger.isUpdate guard keeps Trigger.oldMap (which is null on insert) from being dereferenced
    if ((Trigger.isInsert && a.Status__c == 'Attended')
     || (Trigger.isUpdate
         && a.Status__c != Trigger.oldMap.get(a.Id).Status__c
         && Trigger.oldMap.get(a.Id).Status__c == 'Attended')
    ){
        //For the first qualifying record, or when the last Set in the Map already holds 30 Ids, add a new Set to the Map
        if (mapSetStats.size() == 0 || mapSetStats.get(mapSetStats.size()).size() == 30){
            mapSetStats.put(mapSetStats.size() + 1, new Set<Id>());
        }
        mapSetStats.get(mapSetStats.size()).add(a.Id);
    }
}
//Sync Monthly/Annual Client Statistics:
if (mapSetStats.size() > 0){
    for (Integer i : mapSetStats.keySet()){
        clsAttendance.syncClientStatistics(mapSetStats.get(i), new Set<Id>());
    }
}
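
The future method on the receiving end isn't shown above, so here's a minimal sketch of what it might look like.  The body is a guess- the real syncClientStatistics does the statistics rollup- but the @future annotation and the Set<Id> parameters are what matter for the chunking pattern-

public class clsAttendance {
    //Sketch only- each call receives at most 30 attendance Ids thanks to the chunking in the trigger
    @future
    public static void syncClientStatistics(Set<Id> attendanceIds, Set<Id> clientIds){
        List<Attendance__c> records =
            [SELECT Id, Status__c FROM Attendance__c WHERE Id IN :attendanceIds];
        //...the Monthly/Annual Client Statistics rollup logic goes here...
    }
}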

In the trigger logic, "mapSetStats" is populated with individual Sets of Ids, each no larger than 30 records.  I happen to know that users will never update more than 300 records at a time (10 future calls * 30-record chunks), so I can get away with this.  If chunking isn't feasible for any reason, then you'll need another solution, such as....

Business Logic in Scheduled Code

I've used this not only for getting around governor limits but also for limits on the other side of external web service callouts. Here's the basic idea-
  • Instead of sending records to a future method for processing, find a way to gather the records that need processing with a query.  In some scenarios I've created a checkbox field specifically for this purpose and set it to true in a trigger, but sometimes the very nature of what you need to accomplish (e.g. a blank field that needs to be filled in) means you can identify those records in a query without explicitly marking them.
  • Create a Batchable class which queries for the records that have been marked in the step above.  For details on this, see Salesforce's article on Using Batch Apex.
  • Create a Schedulable class which kicks off your Batchable class on whatever schedule you need; the processing logic itself lives in the Batchable's execute method.  Again, you don't need me to figure this out, see Apex Scheduler.
  • When you call Database.executeBatch on your Batchable class, you can pass in an optional second parameter (the scope) to cap the number of records handed to each execute() call- you just need to figure out the largest number of records you can process at once without hitting the limits.
  • If you did need to explicitly mark records for processing, like my checkbox example above, make sure you un-mark them when the processing is complete.  Otherwise, the same records will just keep being updated in an endless loop.  There's a sketch of the whole pattern after this list.
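
Putting those bullets together, here's a minimal sketch of the whole pattern with hypothetical object, field and class names.  The Batchable queries for marked records, processes them in whatever scope size you picked, and un-marks them when it's done; the Schedulable just kicks the batch off-

//Batchable- gathers records flagged with a (hypothetical) Needs_Processing__c checkbox
//and processes them in chunks of whatever scope size was passed to Database.executeBatch
public class BatchProcessMarkedAccounts implements Database.Batchable<sObject> {
    public Database.QueryLocator start(Database.BatchableContext bc){
        return Database.getQueryLocator(
            [SELECT Id, Needs_Processing__c FROM Account WHERE Needs_Processing__c = true]);
    }
    public void execute(Database.BatchableContext bc, List<Account> scope){
        for (Account a : scope){
            //...the business logic that was blowing governor limits goes here...
            a.Needs_Processing__c = false; //un-mark so the same records aren't picked up forever
        }
        update scope;
    }
    public void finish(Database.BatchableContext bc){}
}

//Schedulable- just kicks off the batch.  The 50 is the scope size, i.e. the largest
//number of records each execute() call can handle without hitting the limits.
public class ScheduleProcessMarkedAccounts implements Schedulable {
    public void execute(SchedulableContext sc){
        Database.executeBatch(new BatchProcessMarkedAccounts(), 50);
    }
}

You'd then turn it on from anonymous Apex with something like System.schedule('Process marked Accounts', '0 0 * * * ?', new ScheduleProcessMarkedAccounts()); which runs it at the top of every hour.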
As with Future methods, users will need to understand that there will be a delay in the updates that happen as a result of scheduled code.  I've been lucky enough that in the cases where I have had to use scheduled code, users haven't had an issue with that.  If your users do have a problem with it, remind them that the fancy, magical updates they're looking for were probably absolutely impossible for them to accomplish just a few years ago so they'll have to shut up and deal.  Depending on your relationship with them, you might need to re-phrase that.