
Thursday, October 4, 2012

The Joy of Salesforce Governor Limits

Working within the governor limits in Salesforce can be a real balancing act.  I understand why they're there; it's a shared environment and you can't have developers writing code that bogs down the servers for everyone else. Especially because, let's face it, there are a lot of really shitty developers out there. But that doesn't mean I don't curse their name when I see those dreaded LimitException errors.

I started writing Apex as soon as it came out, back when the governor limits were a lot more stringent and nifty tricks for avoiding them, like future methods, didn't exist yet.  Nowadays, I have a number of clients processing large amounts of data and/or implementing very complex business processes on their data, so I've come up with quite a few tips and tricks for dealing with the limits that I thought I'd share.

Bulk Proofing Code and Future Methods

I'm lumping these together because you don't really need me to explain them to you.  Unlike many other bloggers, I'm not interested in proving how clever I am by regurgitating information already published much more eloquently by Salesforce themselves, so I'll just refer you to them.
If you want more info on those, Google is your friend.  If you can't figure out how to bulk-proof your code, you probably have no business writing code; this is shared-environment coding 101 (and it's not bad knowledge for developers working within their own environments either).

Two drawbacks of Future methods that are worth noting if you're considering implementing them-
  • Your future method may be generating records or updating fields that your users are expecting to see instantly, and it can be confusing for them if they don't see them right away. Generally it happens fast enough that all they have to do is refresh the page, but asynchronous logic is a little abstract to explain to a non-developer (and even to a lot of developers).
  • If your future method hits an error, either your user will never know it or you'll have to write in logic that emails them when an error is hit- but again, that's kind of confusing to an end-user.  Be prepared to fix errors from future methods that arrive by email, and remember to go back and clean up any data affected by the fact that your method didn't fire correctly (a rough sketch of this email-on-error pattern follows this list).
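
The class and method names in this sketch are hypothetical (this isn't code from any of my orgs); the point is just that the catch block is the only place your user will ever hear about the failure.

//Hypothetical example class- adapt to your own processing
public class clsAttendanceSketch {
    @future
    public static void syncStatsAsync(Set<Id> recordIds){
        try {
            //... the actual processing would go here ...
        } catch (Exception e) {
            //Email the running user, since they'll never see this failure otherwise
            Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
            mail.setTargetObjectId(UserInfo.getUserId());
            mail.setSaveAsActivity(false);
            mail.setSubject('Future method syncStatsAsync failed');
            mail.setPlainTextBody('syncStatsAsync failed: ' + e.getMessage());
            Messaging.sendEmail(new Messaging.SingleEmailMessage[]{ mail });
        }
    }
}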

Chunking Future Methods

I've had scenarios where I have DML operations on multiple records, either set off by Apex or data imports, that fire a future method and then hit governor limits within that method.  Each time you run code, you can set off up to 10 future methods, so if governor limits are an issue in your future method, consider breaking whatever you're sending into the method into manageable chunks.  Here's an example of some trigger logic that does exactly that:

//Breaking this into chunks of 30 to avoid governor limits
Map<Integer, Set<Id>> mapSetStats = new Map<Integer, Set<Id>>();
for (Attendance__c a : Trigger.new){
    if ((a.Status__c == 'Attended' && Trigger.isInsert)
     || (Trigger.isUpdate
         && a.Status__c != Trigger.oldMap.get(a.Id).Status__c
         && Trigger.oldMap.get(a.Id).Status__c == 'Attended')
    ){
        //for the first attended record, or when the last Set in the Map has 30 ids, add a new Set to the map
        if (mapSetStats.size() == 0 || mapSetStats.get(mapSetStats.size()).size() == 30){
            mapSetStats.put(mapSetStats.size() + 1, new Set<Id>());
        }
        mapSetStats.get(mapSetStats.size()).add(a.Id);
    }
}
//Sync Monthly/Annual Client Statistics:
if (mapSetStats.size() > 0){
    for (Integer i : mapSetStats.keySet()){
        clsAttendance.syncClientStatistics(mapSetStats.get(i), new Set<Id>());
    }
}

In this example, "mapSetStats" is populated with individual Sets of Ids, each of which is no larger than 30 records.  I happen to know that users will never update more than 300 records at a time (10 Future calls * 30 record batches) so I can get away with this.  If chunking isn't feasible for any reason, then you'll need another solution, such as....
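
For reference, the receiving end of each chunk might look something like the sketch below. This is not my actual clsAttendance class- the query and field usage are just illustrative- but it shows the relevant constraints: the method is annotated @future, its parameters are collections of primitives (a requirement for future methods), and each invocation gets its own fresh set of governor limits.

public class clsAttendance {
    @future
    public static void syncClientStatistics(Set<Id> attendanceIds, Set<Id> clientIds){
        //Each chunk of up to 30 Ids arrives in its own future call with its own limits
        List<Attendance__c> atts = [Select Id, Status__c from Attendance__c
                                    where Id in :attendanceIds];
        //... roll these up into the Monthly/Annual Client Statistics records ...
    }
}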

Business Logic in Scheduled Code

I've used this not only for getting around governor limits but also for limits on the other side of external web service callouts. Here's the basic idea-
  • Instead of sending records to a Future method for processing, find a way to gather the records that require processing in a query.  In some scenarios, I've created a checkbox field specifically for this purpose and set it to true in a trigger, but sometimes the very nature of what you need to accomplish (e.g. a blank field that needs to be filled in) means that you can identify those records in a query without explicitly marking them.
  • Create a Batchable class which queries for the records that have been marked in the step above.  For details on this, see Salesforce's article on Using Batch Apex.  
  • Create a Schedulable class which kicks off your Batchable class on a schedule.  Again, you don't need me for this; see Apex Scheduler.
  • When you call Database.executeBatch on your Batchable class, you can pass in an optional second parameter to set the maximum number of records handed to each batch- you just need to figure out the largest number of records you can process at once without hitting the limits.
  • If you did need to explicitly mark records for processing, like my checkbox example above, make sure you un-mark them when the processing is complete.  Otherwise, the same records will just keep being updated in an endless loop.  (A rough skeleton of this whole pattern follows this list.)
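
The object and checkbox field below (My_Object__c, Needs_Processing__c) are made up- swap in whatever you're actually processing- and in a real org the two classes would live in separate class files:

//Batchable: picks up flagged records and un-flags them when it's done (hypothetical names)
global class ProcessFlaggedRecordsBatch implements Database.Batchable<SObject>{
    global Database.QueryLocator start(Database.BatchableContext bc){
        return Database.getQueryLocator(
            'Select Id, Needs_Processing__c from My_Object__c where Needs_Processing__c = true');
    }

    global void execute(Database.BatchableContext bc, List<SObject> scope){
        List<My_Object__c> recs = (List<My_Object__c>) scope;
        for (My_Object__c rec : recs){
            //... do the actual processing here ...
            rec.Needs_Processing__c = false; //un-mark so the next run doesn't pick it up again
        }
        update recs;
    }

    global void finish(Database.BatchableContext bc){}
}

//Schedulable: kicks off the batch, 50 records per execute call in this example
global class ProcessFlaggedRecordsSchedule implements Schedulable{
    global void execute(SchedulableContext sc){
        Database.executeBatch(new ProcessFlaggedRecordsBatch(), 50);
    }
}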
As with Future methods, users will need to understand that there will be a delay in the updates that happen as a result of scheduled code.  I've been lucky enough that in the cases where I have had to use scheduled code, users haven't had an issue with that.  If your users do have a problem with it, remind them that the fancy, magical updates they're looking for were probably absolutely impossible for them to accomplish just a few years ago so they'll have to shut up and deal.  Depending on your relationship with them, you might need to re-phrase that.

Sunday, September 2, 2012

Using Third-Party Generated Certificates in HTTPRequest Calls

I've been doing various integrations with external web services lately and have learned some things along the way. The latest integration I did required two-way authentication.  The Salesforce documentation makes some assumptions about these integrations-
  1. That the service you're integrating with provides a parseable WSDL.
  2. That the service you're integrating with allows you to upload a signed certificate that was generated from Salesforce for the two-way integration.
And if both of those hold true, the Salesforce documentation (and various related blog posts) will be sufficient for you.  However-
  1. Not all services provide a WSDL, and even those that do might include XML elements that are not supported by Salesforce.  In this case, you're on your own for writing an Apex class to interact with the web service.
  2. Some services provide a WSDL that Salesforce can parse, but you'll find that you need to edit the resulting Apex class to include properties that were excluded.  Though I didn't dig in to see if this was the fault of the WSDL file itself or the parser, I suspect the latter.  How to go about editing that generated Apex class is worth its own post.
  3. Some services provide you with a client certificate file that you need to include in your request- you don't have the option of generating one in Salesforce and getting it signed.
None of these are insurmountable, but they'll take a lot more leg-work on your part, and in the case of having to provide a third-party generated client certificate in an HTTP request, I found myself having to piece together how to do so from various forum posts.  So I'll focus on that today.

So an external service (in my case, the credit card processor First Data) provides you with a client certificate file and a password for that file.  Now what?  In Salesforce's documentation for the HTTPRequest class you'll notice a method called "setClientCertificate" with a note that this method is deprecated and you should really use "setClientCertificateName".  But if you've been provided with the client certificate, "setClientCertificate" is what you'll need.  Here's how to get it to work-
  • Upload the client certificate as a Static Resource
  • In Apex, query for that file and base64 encode the body of the file into a string variable.
  • In your HTTPRequest variable, use the setClientCertificate method with the base64 encoded certificate as the first argument, and the certificate password as the second argument.
Oh sure, it's easy when you know how to do it.  And a note that alternatively, you can base64 encode the certificate yourself and use the resulting string in setClientCertificate, but the above seemed a little more elegant.  The actual code to accomplish this-


String cert;

for (StaticResource sr : [Select Id, Body from StaticResource where Name =: <certfilename>]){
    cert = EncodingUtil.base64Encode(sr.Body);
}
HttpRequest hReq = new HttpRequest();
hReq.setEndpoint(<url>);
hReq.setMethod('POST');
hReq.setClientCertificate(cert, <certpassword>);

If you were lucky enough to have a WSDL that Salesforce was able to parse, you'll be setting the "clientCert_x" variable of the stub to the base64 encoded string and the "clientCertPasswd_x" variable to the password.
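
Something like this (the stub class and port names here are made up; the _x properties are what the WSDL-generated class exposes):

//Hypothetical WSDL-generated stub class
wsPaymentService.ServicePort stub = new wsPaymentService.ServicePort();
stub.clientCert_x = cert;               //the base64-encoded certificate from above
stub.clientCertPasswd_x = <certpassword>;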

Once again, hope this saves someone some time!

Saturday, September 1, 2012

Encrypted Fields in Apex- one gotcha

I've been doing some work integrating Salesforce with a credit card processor and hit upon an issue with encrypted fields in Apex that I haven't found documented anywhere.  It may actually be entirely intentional, but without documentation it's confusing, and I did see a couple other people hitting the same problem.  According to Salesforce documentation, when you work with an encrypted field in Apex, the code will always see the unmasked value of the field.  But there's an exception: if you pass an SObject into a method, you'll find that you retrieve the masked value of the field.   I've only tested with static methods so far, and it might be specific to those.

A more concrete example: you have an Opportunity that has already been saved with a value in the Credit Card Number field, which is encrypted.  You pass that Opportunity record into the following method.


public static void EncryptedExample(Opportunity opp){
    String strCC = opp.Credit_Card_Number__c;
    System.debug('Can the code see the digit 9 in this field? '
                 + strCC.contains('9'));
    //No, it can't!
}

This is only an issue if the SObject you're passing in has been saved already.  If you pass in an SObject that either hasn't been inserted or if the value of the encrypted field has been updated but not committed to the database, then that field hasn't been encrypted and masked yet and the code therefore sees the actual value.

The solution?  You'll have to query for the record within your method and then all works as expected-


public static void EncryptedExample(Id idOpp){
    Opportunity opp = [Select Amount, Payment_Type__c, Deposit__c, Credit_Card_Number__c,
                       Credit_Card_Exp__c from Opportunity where Id = :idOpp];
    String strCC = opp.Credit_Card_Number__c;
    System.debug('Can the code see the digit 9 in this field? '
                 + strCC.contains('9'));
    //Yes it can!
}
Hope this helps someone; I know I spent a couple of hours baffled (at first I thought it was a mistake in my web service callout).

Thursday, August 9, 2012

If you say "social media" just one more time!

As a tangentially related aside to the following rant- I'm really sick of the high percentage of "experts" I see on TV brought on to talk to me about technology, though they have zero background actually working in technology themselves.  And I'm talking about working in a capacity that creates technology, not just that they've used technology to blog or tweet before, or that they had an idea that they then hired actual technical people to implement for them.

It's too much with the social media.  Social media is just one small part of a larger culture of technology, and frankly not as earth-shaking as our "experts" would have us believe.  And if you don't believe that, go check the price of Facebook's stock.  It hardly matters when you read this; that will likely still hold true. And this is a company that gets non-stop, 24-hour, free advertising on television, radio, web sites, business cards... I could go on for awhile here.  But you still can't gain any traction in the market?

The only thing that surprised me about the Facebook stock plummet was how quickly it happened. I figured the bubble would hold for at least a couple months after the IPO. But despite the hype around the launch, corporations, and in particular publicly-owned corporations, are about profit and the Instagram purchase was just one of many cracks in the facade of Facebook's potential profitability.  Which reminds me, you should really check out The State of the Web, Spring 2012 on The Oatmeal.

But I digress. The whole Facebook thing has been nauseating and really worth an entry of its own, but at the heart of the problem is the over-inflated sense of importance of social media in general.  Social media has its place and people have found ingenious ways to make use of it.  But it's not the end-all, be-all of the evolution of human technology.  And let's not ignore how many relationships, and I'm sure even lives, have been destroyed with the help of social media.

Are you an organization struggling to figure out how to employ social media?  Maybe you don't have to; maybe there are other tools you should be investing in altogether.  Social media has become the hammer, and now everything is a nail.  As a consultant offering software development services, I've gotten some strange requests related to social media.  The funniest one was an organization that provides services to people involved in the criminal justice system asking if I could set up Facebook accounts for all of their clients, accounts that the organization could control and access.  Um, no, and more importantly, why on earth would you want to do that?

Oh yeah, and social media is not new.  I'm not sure how people have been convinced of this, but I'm sure everyone involved in the early development of Bulletin Board Systems in the late 1970s and early 80s cringes when they hear how "revolutionary" this all is.


Monday, February 6, 2012

You know it's bad when you forget about chocolate

A week ago I bought myself a dark chocolate bar.  It was very much an impulse buy one morning in the midst of the Client Cutover From Hell.  I slipped that bar of chocolate into my purse... and today I remembered it's there.  That chocolate bar that, a week ago, I decided might give me some brief chemical reprieve.  And now as I sit at the tail-end of the Client Cutover From Hell, I celebrate with my chocolate.  It really tells you more about this week than much else I could say.  And yet I will.

I have a few gripes after going through this process.  These are things I have long been sick of but now that they've become the cause of much sleep deprivation, I therefore declare war on....
  • Crap software companies that over-charge non-profits with promises of technology like you haven't seen since the mid-90s.  I'm looking at you, Convio Luminate!  This I think is worth a post in and of itself.
  • Crap IT support who don't have any clue what they're doing. Some of my difficulties ended up being the result of having to reverse engineer what turned out to be a completely asinine domain configuration by this organization's IT support consultant.  They've already expressed dissatisfaction with this guy, and I'm going to recommend that they drop him altogether.
  • Visualforce reRenders.  I used to think it was me, that there were just nuances to reRendering in Visualforce that I was missing.  And while I'm sure that is the case at times, I'm now also quite sure that the way reRender works in Visualforce can be just plain inconsistent.  I discovered in this project that pages I wrote months ago that were working perfectly fine now had to be re-written because reRenders which previously worked no longer did.  My fellow developer and contractor has found the same thing and has taken to using a third-party Javascript library to do the reRenders in many scenarios.  I'm with her (yup, another female developer!)

Sunday, January 8, 2012

Visualforce to Word


I've been having lots of fights with Microsoft applications lately.  Perhaps as a web developer, this goes without saying.  I just won a fight getting a Visualforce page to display correctly when converted to a Word file.  I was generating a file of certificates with each certificate on a separate page and a very specific layout that had to be followed. Thought I'd share a few tips on what I learned.

Basics
First off, how to get a Visualforce page to generate as a Word file.  In the "page" tag, include the following:

<apex:page contentType="application/msword#FileName.doc" showHeader="false" standardStylesheets="false">

Page Formatting
  • Save a Word doc as HTML
    Not sure how to get the font, border, layout, etc. specified in Visualforce to display correctly in Word?  Do it in Word and then save it as an HTML file.  The file it generates is a bit of a beast, but if you scroll down, you'll find <style> tags with Word-specific attributes you can use in <style> tags in Visualforce. You can also copy and paste paragraphs from the html into your Visualforce, though I wouldn't recommend trying to paste the whole file; there's just too much crap (very technical term) in there, and Salesforce won't even let you save the html tags the way Word writes them.

    For my document, I grabbed these styles to create the following in Visualforce; note the orientation style to set the page as landscape:
    <style>
            @page Section1{
                size:11.0in 8.5in;
                mso-page-orientation:landscape;
                margin:.25in .25in .25in .25in ;
                mso-header-margin:.5in;
                mso-footer-margin:.6in;
            }
            div.Section1
            {
                 page:Section1;
            }
            body {
                font-family:"Gotham-Book";
            }
    </style>
    For the style above to work, you'll need to surround your content in Visualforce with: 
<div class="Section1">your content</div>
  • Use "in" (inches) for widths to layout the page
    Word is designed to render documents for printing, so widths should be specified in inches (and I'm guessing centimeters for everyone that's abandoned our archaic measuring system).  Just a note that "px" does work in Word (in specific contexts) but if you're trying to fit something to a page like a border that goes around everything, save yourself the trouble and just do it in inches.
  • Use "pt" for font sizes
    Setting the font size using style="font-size:12pt;" in VF is the equivalent of selecting 12 for the font size in Word.
  • Margin instead of Padding
    Padding wasn't doing the trick to space out lines of text, but margin works.  Here's a sample:

  • <p style="font-size:22.0pt;font-family:Gotham-Medium;margin-top:12pt;margin-bottom:.0001pt;">
                    Some text here
    </p>
  • Page breaks
    Wherever you want a page break, insert this:

    <br clear="all" style="mso-special-character:line-break;page-break-before:always;"/>

    Note that I tried using "page-break-before:always;" as the style for a div and that was a no go, Word wants it in a break or paragraph tag.
Including Images
  • Don't use a secure URL (https) as the image URL
    I came across forum posts where people were able to successfully use the secure Salesforce URL (i.e. https://na2.salesforce.com/...) to link to images in Word, so my guess is that this varies based on the version of Word you have.  I have Word 2007 and the images would not display until I switched to a non-secure URL, which I did using Salesforce Sites.  It's probably a good idea to do this even if the secure URL works for you because it may not work for others.  
  • Use the full URL
    If your image is stored in Salesforce Documents, you can generate that URL like this:
    http://yourdomain.force.com/servlet/servlet.ImageServer?id=015D0000000Dpwc&oid=00DD0000000FJbG
    yourdomain = Your Salesforce Sites domain
    id = The Id of the Document
    oid = Your Organization Id, which you can find in Setup under Company Information

    You can also store the image as a Static Resource and link to it that way.
Exclude "Visualforce" from your web searches
Still need help formatting your Visualforce page for Word? When it comes to figuring out formatting in Word, Google (as is often the case) is your friend and no doubt how you arrived here, but don't limit your searches by including "Visualforce" in the search terms. People use html to generate Word files in other languages, and some of the best tips I found were on PHP and ASP forums.

Tuesday, January 3, 2012

Salesforce general sentiments


Right now my business is based almost exclusively on developing on the Salesforce.com platform, but I'm not one of those Marc Benioff idolizing, Salesforce can do no wrong type of people (those people are annoying).  I think Salesforce has done a good job of creating an extensible CRM that allows for non-technical people to customize it up to a point, and developers to customize it beyond that point.

For awhile, the supposed plan for Salesforce was to become a true platform- they handle the storage and maintenance and provide the framework on which everyone could build or install pre-built applications.  Included in that pitch was the statement that they wouldn't build enhancements specific to industry verticals or job functions; that would be left to the customers and third-party developers.  Goodbye Salesforce.com, hello Force.com.  I liked this plan.

This is clearly no longer the plan.  Salesforce continues to build functionality for sales and other verticals, and has gone the Microsofty route of acquiring companies who have built Salesforce solutions and incorporating those solutions into the platform.  This plan I don't like, for the same reason I never cared for Microsoft doing it.  Trying to incorporate software into a platform that was not specifically engineered to be incorporated creates complexities (i.e. opportunities for bugs) and inefficiencies that could have been avoided by engineering something from the ground up to actually be a part of the platform.  It's a big part (though not the only reason) of why Windows is the giant bloated piece of crap that it is today.  Microsoft has managed to prosper despite this by having a strong foothold in the market, particularly the corporate sector of the market, and because computers have progressed to be able to handle the bloatiness of Windows and still function.

As of late, we've seen some instability in the Salesforce sandbox environment. There was that week this past fall where one of the sandboxes was dead to the world, and just last night the entire sandbox environment, in North America anyway, was hiccuping.  Do I know for sure that these glitches were the result of the way they've expanded the platform?  Absolutely not.  But they were no real surprise to me when they hit and I suspect we'll see more; it's as hard for Salesforce to avoid as a painfully long boot-up time is for MS Windows.  Be careful there Icarus, your wings are looking a little gooey.

Friday, December 9, 2011

Simile Timeline in Salesforce, Revisited

The Simile Timeline is a javascript widget that's available as open source to create interactive timelines.  If you go to the main page for the Simile Timeline, you'll see how you can drag the timeline back and forth through time and click on items to see a detailed description; it's pretty slick.

Back in another life, I had implemented this timeline in Salesforce to provide a graphical display of a Contact's history on their detail page: Events, Tasks, (Job) Opportunities, Interviews, Jobs.  I did it by customizing the free Timeline S-control mashup that utilizes that Simile Timeline widget.  It provided a nice graphical narrative that you could scroll forward or backward in time through, and clicking an individual item on the Timeline would display details on and a link to the record.  This is the screenshot of that application from the appExchange:

That application is still available, but in its original S-control form (S-controls having since been superseded by Visualforce).  Salesforce still supports S-controls, so you can use it if you're fine with the default configuration it comes with.  If you're going to need to customize the information that's displayed on the timeline, for instance showing custom object child records to Account, Contact, Case or Opportunity, or values of custom fields in the pop-up details, I'd recommend just doing this in Visualforce, which is what I've just done for a client.  Anyone with a bit of javascript and Apex savvy can easily implement this themselves.

I'm considering creating a nifty little package that anyone can install to implement it with no code, probably with an admin screen to let Sys Admins select the Timeline settings such as the intervals of time displayed, and a query interface to display custom object records or fields without requiring any code.  Interested?  Leave me a comment.

Sunday, December 4, 2011

Why won't this damn thing reRender properly...

Since Visualforce came into being, the aspect of it I've had the most trouble with is the reRender, or more specifically, getting reRenders to behave the way I want them to.  This is partly because much of the coding I did before Salesforce was behind the scenes logic and not web interface design, but it's partly because reRenders can be a real pain in the ass.  Along the way, I've figured out some tips and thought I'd share.

actionRegion
You will drive yourself insane with reRenders if you don't know how to use actionRegions. actionRegions are to submitting what reRenders are to rerendering.  Perhaps a little more explanation and a demo are in order.

When you perform any action on your VF page, by default the entire form is submitted.  You may not actually need everything submitted though; you may just need a portion of your variables submitted, or maybe none at all.  Submitting everything needlessly means that not only are you slowing down the performance of your page, but the page is attempting to validate what's being submitted before it sends anything back to the controller.  So if you have any required inputFields, for instance, those fields will need to be filled in before the form submits at all.  And if you don't know that's what's happening, either because your form has no <apex:pageMessages/> or it's not being reRendered, it will appear as if your action simply isn't working, for no apparent reason.

Let's use this simple page and controller as an example, first without an actionRegion (incorrect):
Page before adding actionRegion (the wrong way)
<apex:page standardController="Account" extensions="extTest">
    <apex:form >
        <apex:pageBlock>
            <apex:pageBlockButtons >
                <apex:commandButton value="Show Billing Street" reRender="opBilling">
                    <apex:param value="true" assignTo="{!ShowBilling}"/>
                </apex:commandButton>
            </apex:pageBlockButtons>
           
            <apex:pageBlockSection >
                <apex:inputField value="{!Account.Name}"/>
            </apex:pageBlockSection>
           
            <apex:outputPanel id="opBilling">
                <apex:pageBlockSection rendered="{!ShowBilling}" id="pbMailing">
                    <apex:inputField value="{!Account.BillingStreet}"/>
                </apex:pageBlockSection>
            </apex:outputPanel>
        </apex:pageBlock>
    </apex:form>
</apex:page>
 Extension:
public with sharing class extTest {
    public Boolean ShowBilling {get;set;}
   
    public extTest(ApexPages.StandardController controller) {
        ShowBilling = false;
    }

}
The idea is, the page opens showing only the Account Name field. Clicking the Show Billing Street button will update the ShowBilling variable to true and reRender the opBilling outputPanel so the Billing Street field is displayed.  As it's written above however, it will not work if the Account Name field isn't filled in first.  That Name field is required, so the command button tries to submit the whole form and it fails validation without a Name value entered.  Since only the opBilling outputPanel is being reRendered, you won't even see an error message, since pageMessages isn't being reRendered (Debug logs are your friend when troubleshooting such issues).

One way to fix this is by updating the page with an actionRegion around the commandButton like so:
pageBlockButtons after adding actionRegion (one right way)
<apex:pageBlockButtons >
                <apex:actionRegion >
                    <apex:commandButton value="Show Billing Street" reRender="opBilling">
                        <apex:param value="true" assignTo="{!ShowBilling}"/>
                    </apex:commandButton>
                </apex:actionRegion>
 </apex:pageBlockButtons>
Now with actionRegion around that button, only the values inside of the actionRegion are submitted, so BillingStreet will be rendered with or without a value in the Account Name field.

Immediate
For the sake of keeping the example above simple, the button didn't do anything other than updating ShowBilling using the param tag and reRendering opBilling.  In reality, you'd only need actionRegion when you need some portion of the variables on the page submitted to the controller/extension to perform logic in a method.  If you don't need to submit any variables, then you can use the "immediate" property of the tag that's performing an action.  Let's see this with the example above:
pageBlockButtons after adding immediate (another right way)
<apex:pageBlockButtons >
                <apex:commandButton value="Show Billing Street" immediate="true" reRender="opBilling">
                    <apex:param value="true" assignTo="{!ShowBilling}"/>
                </apex:commandButton>
</apex:pageBlockButtons>
Make sure you know what immediate means before using it. I've seen people set immediate to true to avoid issues with required fields not realizing that their controller/extension was not receiving any updated variable values from their page, thereby screwing up their logic and leaving them flummoxed.

Choosing what to reRender
I'll probably expand on this in a separate post because there are a lot of scenarios to consider, but I'll use the example above to give one basic tip. 
You may have noticed in my example that I've surrounded the pageBlockSection that contains BillingStreet with an outputPanel, and the reRender acts on that outputPanel.  This may seem like an extra component- why not just reRender the pageBlockSection itself and skip that outputPanel altogether?

What I've learned is that if your action changes whether a component is rendered, you can't reRender that component directly- while its rendered attribute is false, the component isn't output to the page at all, so there's nothing for the reRender to target.  Instead, reRender a parent of that component that is always rendered; outputPanels are a useful means of doing so.  To see this in action, try updating the example above so the button attempts to reRender the pageBlockSection directly:
pageBlockButtons with reRender of the pageBlockSection (wrong way)
<apex:pageBlockButtons >
                <apex:commandButton value="Show Billing Street" immediate="true" reRender="pbMailing">
                    <apex:param value="true" assignTo="{!ShowBilling}"/>
                </apex:commandButton>
  </apex:pageBlockButtons>
If you try this example with the commandButton set to reRender the pageBlockSection, it just doesn't work.  So either you need that outputPanel surrounding the pageBlockSection, or alternatively you could move the rendered="{!ShowBilling}" attribute to the inputField instead, but in this example, that would mean you're rendering an empty pageBlockSection.

As above, there are more scenarios to explore here, but I'll leave it at this for now and hopefully this is helpful to people frustrated with the nuances of reRendering.

Wednesday, June 10, 2009

To Visualforce or not to Visualforce

Visualforce is a wonderful thing. User interface requests that used to be impossible or at the least very complex to implement in Salesforce are now quite do-able. The end users at my organization look at my latest Visualforce creations with eyes wide and mouths agape, dreaming of the new world of possibilities open to them. But with great power comes great responsibility, and I feel it my duty to caution people of the disadvantages of Visualforce.

There are two major down-sides to replacing standard Salesforce page layouts with Visualforce pages to be aware of going in. These don't apply to VF pages created to supplement the standard SF pages, which have no down-side that I can think of and I'll soon post about some rather nifty pages my developer-in-training and I have come up with. The first is the loss of on-demand configuration. The days of calling your SF admin, requesting a new field and having it magically appear 5 minutes later are over once you become dependent on VF for creating/editing records. Gone also are the days of non-technical power users being able to go in and quickly add or change fields. Adding new and changing existing fields, as in olden times (e.g. 10 years ago), will require a developer to make the changes in the Visualforce page, possibly in a related controller/extension as well, in the development environment, test the changes and then deploy those changes to production. If any development is already ongoing in that page, you may find yourself having to wait for that development to be completed before the developer is willing to make and deploy the change you want. Companies with larger Salesforce implementations are no doubt employing development best practices and scheduling releases of new code into production. This also applies to changes in the VF page layout; there is no dragging and dropping of elements on a VF page around as you please (not yet, anyway!).

The second down-side is that end users will become detached from the design of Salesforce objects, which is going to make report creation more difficult. As much as my users have complained that they've been forced to enter data for 3 different objects into 3 separate forms, it has at least forced them to gain a certain understanding of how all of those objects are related. That understanding is invaluable when creating SF reports. Users that are presented with a single, wondrous VF page that allows them to create and edit records across two or more objects may not even realize which objects they're interacting with, and therefore won't know where to start to create a report on that information.

The point is, Visualforce pages will make users more dependent on developers. Often, the benefits of a slick VF page far outweigh that cost, just be sure to understand that it is a cost when deciding whether or not to implement a new VF page.

Tuesday, May 26, 2009

Ding Dong, the Lotus Witch is Dead

I've known for awhile that I'd be migrating us off of Lotus Notes and into Gmail, or more specifically, Google Apps education edition, which is free for non-profits with 7 GBs of space per user. Why pay for Lotus and a Blackberry Enterprise Server and deal with all of the headaches associated with both, when Gmail more than meets our needs for free? I know there are some die-hard Lotus defenders out there, and if any happen to come across this blog, I'm really not interested in any pro-Lotus rants. All the security you've convinced yourself is there because of the mind-numbingly complex interface, it ain't there. If you've built some complex application using it, more power to you, but for anyone looking for just basic email functionality, Lotus is a poor choice.

But I didn't write this just to pick on Lotus, I thought I'd share a bit about the migration. If you talk to anyone at Google, they'll assure you that it's easy to migrate from Lotus to Gmail, that the users can do it themselves. But you know nothing's easy with Lotus. It took a lot of trial and error, but I eventually figured out how to transfer emails. What I did was to create an Email Account in Outlook that pulls from the Domino server using IMAP (don't use POP, or you'll only get emails marked as new). Connecting to Domino using IMAP didn't become available until some version of Release 6, so your mail files will all need to be that version or later. The password to connect to IMAP is the Internet Password. Once you've created the account in Outlook, right-click on the root of the IMAP mail folder and subscribe to all of the folders. I had to create a new folder in Lotus called Sent Mail and copy everything from their Sent mail there to get it via the IMAP connection. It ends up in Sent Mail in Google. Then you can use the Google Email Uploader to transfer everything from Lotus via Outlook into Gmail.

Google Apps does provide a function to import using IMAP, but I could not get it functioning with Lotus. I kept getting an unknown error message in Google, which didn't give me much to go on, and I tried tweaking every variable that I thought might affect it. If you can get that to work, definitely go that route instead. The Outlook route works, and for our organization, with only ~90 users, it was feasible to go around one by one and transfer that way, but any org with more users will probably not want to go this route. One thing we did which eased the transition was to set up users' Google accounts to pull their new email from Lotus using POP3, which for some reason did work from Google to Lotus, and then transfer their historical mail with the Outlook trick. That way we were able to start moving people into Google even before our email was actually re-directed there. By the time we officially switched our MX records, a lot of people were already migrated and I was feeling pretty confident about the change.

Well that's my overview of the transition process. Not everyone will even have Outlook to use like this, and if so, you might want to try Thunderbird instead, which I believe is freeware. If anyone is considering or in the process of moving to Google and looking for more details, feel free to contact me and I'll help if I can.

Wednesday, February 11, 2009

Re-directing Salesforce users to another page

I did this awhile back and know that other organizations have similar requirements. This is an S-Control, so I'll need to re-do it in Visualforce at some point, but it works for now. What my client wanted to do was this: on saving a record, re-direct the user to a record other than the one that was just saved. In some cases, this can be accomplished by overriding the Edit button to pass the record ID you want the user to land on into the retURL parameter. But that wasn't an option here, because they wanted the user to go to a record that was being created by Apex code on save. So, I created the S-Control below and pass the S-Control's ID to the retURL parameter. It obviously needs to be tweaked for use in any other Salesforce instance, but this should give anyone with a similar requirement a good starting point:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<html>
<head>

<script type="text/javascript" src="/js/functions.js"></script>
<script src="/soap/ajax/12.0/connection.js"></script>

<script type="text/javascript">
function initPage() {
    var newid = "{!$Request.newid}";
    var strSQL;
    var strRe;

    //Redirecting after matching child case to AP case
    if (newid.substring(0,3) == "a0V"){
        strSQL = "Select Id, Child_Case__c from Match_Child_Case__c where Id = '" + newid + "'";
    }

    //Redirecting after creating child/case
    if (newid.substring(0,3) == "003"){
        strSQL = "Select Id from Child_Case__c where Contact__c = '" + newid + "'";
    }

    if (strSQL != null){
        var result = sforce.connection.query(strSQL);
        var records = result.getArray("records");

        var i = 0;
        while (i < records.length) {
            var rec = records[i];

            if (newid.substring(0,3) == "a0V"){
                strRe = rec.Child_Case__c;
            }

            if (newid.substring(0,3) == "003"){
                strRe = rec.Id + "/e";
            }
            i++;
        }
    }

    if (strRe != null){
        window.parent.location.href = "/" + strRe;
    } else {
        //window.parent.location.href = "/" + newid;
    }
}
</script>

</head>

<body onLoad="javascript:initPage();">

</body>
</html>

Thursday, January 15, 2009

Google Maps in Visualforce

So as I go to write this, I see a project up on the Salesforce code share site for Google Maps and Earth in Visualforce. Perhaps this will still be useful to someone though, so I'll forge on. My organization tracks interviews (as a custom object) that we send our participants on and generally our Job Developers generate directions from Hopstop.com to give to them. This seemed like a good candidate for what I thought was a quick Visualforce page. I went down a few avenues trying to accomplish this, but I'll start with the path I ultimately chose and then review the options I gave up on.

What I ended up using was Google Maps for mobile devices. Why the mobile version? I wanted the maps and route in a frame because I was also including information from Salesforce about the interview. The full site was unwieldy in the frame and didn't print properly. And also because I found the parameter options for the mobile version that could easily be passed via a URL to generate the route. So here's the code that I ended up with; you'll notice some if statements in the parameters because the address used can come from two different places, but the concept should still come through:

<apex:iframe id="hopstop" height="1400px" width="600px" frameborder="true" src="http://www.hopstop.com/pda?address1=32+Broadway&county1=Manhattan&address2={!if(IsNull(Interview__c.Opportunity__r.Address__c),Interview__c.Opportunity__r.Account.BillingStreet, Interview__c.Opportunity__r.Address__c)}&county2={!if(IsNull(Interview__c.Opportunity__r.Borough__c),Interview__c.Opportunity__r.Account.Borough__c, Interview__c.Opportunity__r.Borough__c)}&language=en&mode=a&transfer_priority=1&day=1&time=50400&city1=newyork&city2=newyork&submitted=y&sid=&sub_action=" >

That frame is inside of a table with details about the interview. We're always using the same starting point since people are leaving from our office. The format for the mobile device works well for printing, the only unfortunate aspect is that the height of the frame couldn't be dynamic (100% set it to the height of the table, not the content inside the frame) so it can potentially have unnecessary space at the end.

So this ended up being pretty simple, but now for an overview of the other attempts I made just in case anyone else considers trying them. I started out playing with Hopstop.com. I was able to find some parameters that I could pass in the URL that would set the To and From fields, but didn't actually execute the search. It was also too much having the Hopstop header, sidebar, ads and everything else included on the page that was returned, even if I could get it to execute.

Next attempt, the Hopstop API. To use the API, you have to sign up and get a key which will only work with 1 IP address. That ruled out using the API in Apex, since Salesforce could make the calls from a number of IPs and there would have been legal issues anyway. So I considered making the calls in Javascript on the Visualforce page, since it would always be called from our office here from 1 IP address. The issue there was that the API returns XML files, and Firefox (our primary browser here) has security restrictions regarding XML file transfers being called from a page on one domain to another. There's a way around this, but it involves not only changing settings in Firefox, but updating (or inserting) a config file on the user's machine. Well, we're using Salesforce so that we have a web-based application, so I wanted to avoid that method. It also just didn't seem like the best idea to be making people's browsers less secure just to display a route.

Google API, same issues as Hopstop. So I finally stumbled on the Google Mobile wiki and went with that. As is often the case when first playing with new technologies, I spent a great deal of time coming up with a rather simple solution.

Friday, November 21, 2008

CEO wins Appy Award at Dreamforce 2008

So it occurs to me that I shouldn't be so humble and should announce the fact that my organization, Center for Employment Opportunities, received the "Power of Us" Appy award at Dreamforce 2008, the annual Salesforce conference. Having done the bulk of the customization in Salesforce for CEO, this was a proud moment for me, and it was great getting some recognition for the organization as well. CEO is just starting to provide Salesforce consulting services, so recognition from Salesforce that we've created a best-in-class solution goes a long way in proving to our potential clients that we have the skills and experience to help them.

For my die-hard fans, you can watch the Appy award presentation here, it's about 23 minutes in.

Friday, November 14, 2008

Salesforce and Crystal Reports

And specifically how they don't play nice together. I've used Crystal Reports for many years now, and as far as reporting tools for relational databases go, I've always been happy with it. The problem is, Salesforce isn't truly a relational database, and while it's great for data input, it has never been well optimized for data output. Crystal has attempted to create a data connector for Salesforce, and I've been able to do a lot with it, but it's still fraught with problems. I've also come to have issues with the company itself, which is now SAP, but I'll get to that.

The problems. First off, while they give the impression that you can build a report on Salesforce data the same as you would on any other database, that's not entirely true. Like I said, Salesforce isn't really relational (it's relational-ish, perhaps?) so the standard methods of relating tables and defining filters in the Select Expert don't always produce what you're expecting. I haven't fully put my finger on what works and what doesn't, and I'm disinclined to do much in the way of QA for Crystal, but I often find that filters I've defined in the Select Expert have to essentially be replicated by suppressing groups or detail sections of records that shouldn't have been returned in the first place based on my criteria. Pay very close attention to results returned where your Select Expert criteria contain a lot of "and"s and "or"s; that's where I hit the most problems.

Then there are the issues with efficiency. Anyone who has developed reports on a relational database knows that complex reports can take a long time to run, hence the rise of OLAP (which I really need to learn more about). Well, if Salesforce is your data source, then expect the time to run a report to increase exponentially. When you include multiple Salesforce objects and define filters on what data to return in the Select Expert, you often end up pulling in a lot more data than you really need because of the limitations of Salesforce's SoQL language and Crystal has to then process what should be displayed. The workaround to this is to write Command statements in SoQL to pull the data from Salesforce wherever possible instead of pointing CR to multiple Salesforce objects. But part of the advantage of a tool like CR is supposed to be the GUI interface, so it's frustrating always having to write long, complex SoQL statements for reports that shouldn't really be that complicated.
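
To make that concrete, a Command here is just a SoQL statement you write yourself instead of relating objects in the designer- something along these lines (standard objects and fields, purely illustrative), which pushes the relationship traversal and the date filter to Salesforce rather than pulling everything into Crystal first:

Select Id, Name, Account.Name, StageName, CloseDate
from Opportunity
where CloseDate = THIS_MONTH and StageName = 'Closed Won'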

Then there are the limitations of Salesforce itself that impact Crystal. With any other database, you'd have the ability to write SQL statements to do aggregate functions on the data (max, min, count, etc.) and then let that processing happen on the database server itself and just return the results. But SoQL isn't that robust. A common request I get is to see the last time a participant (Contact) met with a JD, or the first time they worked transitional work after graduating from Life Skills class. I have 2 options to accomplish this. I can create a roll-up summary field in Salesforce and then use that field to pull in details about the record associated with that roll-up summary field. So in my first example, I create a roll-up summary field in the Contact record for the maximum Event date where the Event Record Type = "JD Meeting". Then if I want more information than just that date, like who the meeting was with, I have to relate the Contact table with the Event table on both the Contact Id and that roll-up summary date field. Kind of a long way to get not very far.

But even that roundabout solution for aggregates is not always an option, because the roll-up summary fields don't allow you to filter data based on another field. So in my second example, it's not enough to get the first time someone attended transitional work ever, I need to see the first time they attended transitional work after they've graduated from class, and our participants sometimes attend class multiple times over many years. I can't create a roll-up summary field for that, so my only option is to pull every single transitional work attendance record into the report, group it by participant, filter it for dates that are greater than their last class graduation (which has to happen in Crystal, and not in Salesforce, for the same reason as the roll-up summary limitation) and then only display the first record, hiding all the rest. That's a lot of data being pulled into Crystal, when I ultimately only need about 1% of it.

So I realize all of that was a bit wonky, but anyone who's done reporting should get the gist of it.

Now I have a number of reports designed in Crystal that can't run on crystalreports.com because they take so long to run that they time out. So despite this great on-demand system we have, I still have users calling me to run reports periodically, which is just what we were trying to get away from. Further complicating this, crystalreports.com has had unexpected outages during business hours 3 times in the last 6 months, and users running operational reports (like payroll) who can't wait for that crap call me then too. And ever since SAP took over Crystal, there's no more phone or email support; even something as catastrophic as their entire system being down has to be posted on a forum, and then you just sit back and hope one of their techies is paying attention. (They claim they have email support, but don't believe it; the support people just email you and direct you to that forum). Very annoying, very unreliable, very unprofessional.

So what now? I'm still trying to figure that out. At the recent Dreamforce conference, I explored many reporting solutions. Everyone's answer is similar though- yank all of the data out of Salesforce into a data warehouse, either on your servers or theirs, and then do the reports on that. So I'm looking into the solutions that include data warehousing as part of an on-demand service so we can keep moving in the direction of cloud computing. They're all a bit young though, and I've now been burned by 2 supposed Salesforce reporting solutions (Crystal and Jasper Reports) so I'm wary about moving too quickly. I'm hoping to just hold things together long enough to let some of the bigger players test out available solutions and get them past the 1.0 stage. More to come on this I'm sure.

Wednesday, November 12, 2008

Scheduling Code in Salesforce

Update 12/15/11: It's worth noting to anyone stumbling onto this page that Salesforce has since implemented functionality internally to schedule Apex.  There are some scenarios where the code below may still make sense to implement, but this was originally a workaround for Salesforce's lack of scheduling for code, so be sure to research Scheduled Apex before deciding if this is useful to you.  This does still make sense in cases where the code to be scheduled isn't run on a regular, periodic basis but instead should be triggered based on criteria in your data.
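
For anyone landing here post-update, the built-in version boils down to something like this (the class name and cron expression are just examples):

global class NightlyJob implements Schedulable {
    global void execute(SchedulableContext sc){
        //kick off whatever processing you would otherwise have wired up
        //through the Scheduled Job workaround described below
    }
}

//Run once (e.g. from Execute Anonymous) to schedule it for 1:00 AM every night:
//System.schedule('Nightly job', '0 0 1 * * ?', new NightlyJob());
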
-----
 
First off, credit where credit is due. My initial inspiration for this was work done by Steve Anderson that he shared on his blog gokubi.com, which is a great resource. What this allows you to do is both schedule code to run at particular times, and what I added on to his work is the ability to break processes into chunks that won't hit the governor limits so that large processes can be scheduled as well. I believe SF is working on providing this functionality, but until then, feel free to make use of the code below.

The anecdotal explanation: I created a custom object called Scheduled Jobs which I use to set off a workflow rule. The workflow rule is triggered by a checkbox field called "Schedule". One of the fields in Scheduled Jobs is "Next Run Datetime" and there's a workflow rule that is set to run 0 hours after that datetime. When it runs, all it does is update a "Run Job" field, which sets off the Apex code. Each time a job is run, in addition to whatever code I want run, I have Apex insert a new Scheduled Job record with the Next Run Datetime and Schedule = true, so the process starts over again. Since I have a few different jobs running this way, the Scheduled Job includes fields to indicate which method to run and the time it should be set off (I store it as military time and convert it to a datetime in the code). I even have a job which deletes old Scheduled Job records so the object doesn't get too cluttered.

The code, minus some irrelevant bits, is below. The intMagicNum is something I had to include because some of the methods I run have to be run in chunks, otherwise they hit Apex governor limits. So, for instance, I have jobs that send out emails but I can only send out 10 emails at a time. Each time it's run, the method tracks which users have been emailed and only emails people that haven't received it yet that day. So when it's done, it returns how many emails it actually sent, and if that number = 10, the job is scheduled to run again immediately (there's generally a 15 minute delay) until everyone has been emailed (less than 10 returned).

Code:
trigger InsertUpdateScheduledJobAfter on Scheduled_Job__c (after insert, after update) {
Integer intReturn;
Integer intMagicNum;
List<Scheduled_Job__c> sjList = new List<Scheduled_Job__c>();
Datetime dte;

for (Scheduled_Job__c sj : Trigger.new){
 if (sj.Run_Job__c == true){
 
  //Create Site Log records for 7 business days in the future, if needed
  if (sj.Method_To_Execute__c == 'CreateSiteLogs'){
   intReturn = clsScheduledJobs.CreateSiteLogs(null);
   //This job is run in batches of 1 site invoice (the max that can be processed
   //before hitting the governor limit, sadly), so less than 1 means it's done
   intMagicNum = 1;
  }
 
  //Send out Job Loss Report
  if (sj.Method_To_Execute__c == 'JobLossReport'){
   intReturn = clsScheduledJobs.JobLossReport(null);
   //Single email limited to 10 at a time
   intMagicNum = 10;
  }
    
  if (sj.Method_To_Execute__c == 'ScheduledJobCleanup'){
   intReturn = clsScheduledJobs.ScheduledJobCleanup();
   //This job always returns 0, shouldn't need to be re-run anytime soon
   intMagicNum = 1;
  }
 
  Scheduled_Job__c new_sj = new Scheduled_Job__c(
   Method_To_Execute__c = sj.Method_To_Execute__c,
   Name = sj.Name,
   Run_Job__c = false,
   Schedule__c = true,
   Time_to_Start__c = sj.Time_to_Start__c);
 
  //If the number returned by the method is less than the Magic Number,
  //create a Scheduled Job record to run it again tomorrow night
  if (intReturn < intMagicNum){
   //Tomorrow, 12am
   dte = Datetime.newInstance(System.today().year(), System.today().month(), System.today().day()).addDays(1);
  
   //Add on hour/minutes from Time To Start field
   dte = dte.addhours(Integer.valueOf(sj.Time_to_Start__c.substring(0,2)));
   dte = dte.addminutes(Integer.valueOf(sj.Time_to_Start__c.substring(2,4)));
  
   //Sometimes if there are multiple scheduled jobs around the same time,
   //SF processes them in bulk and the governor limits are hit, so make sure
   //they're all spaced out at least 15 minutes
   sjList = [Select Next_Run_Datetime__c from Scheduled_Job__c
      where Next_Run_Datetime__c >=: dte and 
      Next_Run_Datetime__c <: dte.addMinutes(30) and 
      Schedule__c = true and Run_Job__c = false
      order by Next_Run_Datetime__c DESC
      LIMIT 1];
   dte = (sjList.size() > 0 ? sjList[0].Next_Run_Datetime__c.addMinutes(15) : dte);
   //Reset Scheduled Job record to midnight of tomorrow
   new_sj.Next_Run_Datetime__c = dte;
  
  //Otherwise, there are more records to process, so create a record
  //to run it again immediately
  } else {
   //Make sure this is offset from other jobs so they don't run in bulk
   sjList = [Select Next_Run_Datetime__c from Scheduled_Job__c
      where Next_Run_Datetime__c >=: System.now().addMinutes(-15) and
      Next_Run_Datetime__c <: System.now().addMinutes(30) and 
      Schedule__c = true and Run_Job__c = false
      order by Next_Run_Datetime__c DESC
      LIMIT 1];
   new_sj.Next_Run_Datetime__c = (sjList.size() > 0 ? sjList[0].Next_Run_Datetime__c.addMinutes(15) : System.now());
  
  }
 
  insert new_sj;
 }

}
}

Prelude

So I thought to myself, what self-respecting geek in this day and age doesn't have their own blog? Why, it's precisely the self-respecting geek that doesn't have a blog. Come now, cynical voice in my head, there are plenty of respectable blogs out there; we can't condemn all news shows just because of Fox News, can we?

And that's how I'd like to set the tone for this blog, schizophrenia. In the spirit of open source, I'd like this to be a place where I share tips, tricks and snippets of code that others might find useful. Much of my work these days is in Salesforce, so much of my ranting will be on that topic, but I've also just adopted a bouncing baby network so I'll be getting into that as well. Actually, it's more of a coughing, hacking old man of a network, but in either case, it needs a lot of attention, TLC and diapers.

Enjoy and let me know what's useful, what's not and what you'd like to see more of.