Salesforce integration is challenging if you don’t know exactly what you’re getting into. It’s an enormously valuable project, but diving in without proper understanding and planning invites unexpected failure. And failure will surface, at multiple levels, if you don’t keep the following ‘gotchas’ in mind from the beginning of your Salesforce integration initiative:
If you don’t develop collaborative goals that identify specific business process needs at the outset, don’t be surprised by failure down the road. To avoid this gotcha, end-state objectives must clearly establish which applications will be integrated as part of which business process before the Salesforce integration project gets underway.
It needs to be abundantly clear to everyone at the outset which system will store the master records. That way, if a data mismatch arises between applications, the final call on correct data is made from the established system of record. This clarity on systems of record is also needed to set up the right data validation rules when the Salesforce integration flows are executed.
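The system-of-record rule can be sketched in a few lines. This is a minimal illustration, not a Salesforce API: the record shapes and the “erp” / “salesforce” labels are assumptions made for the example.

```python
# Sketch: resolving a field mismatch by deferring to the designated
# system of record. Record shapes and system labels are illustrative.

SYSTEM_OF_RECORD = "erp"  # agreed upon before the integration starts

def resolve_conflict(records_by_system: dict, field: str):
    """Return the authoritative value for `field` when systems disagree."""
    values = {sys: rec.get(field) for sys, rec in records_by_system.items()}
    if len(set(values.values())) == 1:
        return next(iter(values.values()))  # all systems agree
    # On a mismatch, the system of record wins.
    return values[SYSTEM_OF_RECORD]

records = {
    "erp":        {"AccountName": "Acme Corp",        "Phone": "555-0100"},
    "salesforce": {"AccountName": "Acme Corporation", "Phone": "555-0100"},
}
print(resolve_conflict(records, "AccountName"))  # → Acme Corp
print(resolve_conflict(records, "Phone"))        # → 555-0100
```

The point is that the tie-breaking rule is fixed up front, so validation logic never has to guess which copy of the data to trust.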
Another complication that can halt efficient Salesforce integration is failing to remove obsolete data and clean up systems before integrating. As the saying goes: ‘garbage in, garbage out.’ If you feed old records and unnecessary information into other applications, you will have data integrity and duplication problems.
Another issue is the mistaken assumption that importing and exporting data from Salesforce is the same thing as real-time integration. Because a batch export only captures a snapshot of information at a moment in time, no matter how often data is imported and exported as files, some applications will remain out of sync with real-time changes. Being aware of this distinction is essential to keeping data accurate and consistent.
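The staleness problem can be demonstrated in miniature. Here the “source system” is just a dict and the “export” a deep copy; both are assumptions made purely to illustrate why a file-based snapshot drifts from live data.

```python
# Sketch: a batch export is only a point-in-time snapshot.
import copy

source = {"001": {"Stage": "Prospecting"}}  # illustrative live record

snapshot = copy.deepcopy(source)        # nightly batch export to a file

source["001"]["Stage"] = "Closed Won"   # real-time change after the export

# Any application loading the exported file now works from stale data.
print(snapshot["001"]["Stage"])  # → Prospecting
print(source["001"]["Stage"])    # → Closed Won
```

No export frequency closes this gap entirely; only event-driven or API-based synchronization keeps downstream systems current between batches.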
Perhaps the most unexpected ‘gotcha’ Salesforce integrators run into is the breakdown of relationships between records caused by improperly matched external IDs. The solution is to take the time to understand how external and unique IDs work so they can be properly matched to objects in Salesforce.
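One common pattern is to resolve parent records by their external ID before loading child records, so no lookup relationship silently breaks. The object and field names below (Account, External_Id__c) follow Salesforce naming conventions, but the data structures are illustrative assumptions, not API calls.

```python
# Sketch: matching parent Accounts by external ID before loading Contacts,
# and quarantining any child record whose parent cannot be found.

accounts_in_sf = [
    {"Id": "001A", "External_Id__c": "ERP-1001"},
    {"Id": "001B", "External_Id__c": "ERP-1002"},
]
# Index Salesforce Ids by the shared external key.
id_by_external = {a["External_Id__c"]: a["Id"] for a in accounts_in_sf}

incoming_contacts = [
    {"LastName": "Smith", "AccountExternalId": "ERP-1002"},
    {"LastName": "Jones", "AccountExternalId": "ERP-9999"},  # no match
]

ready, orphans = [], []
for c in incoming_contacts:
    sf_id = id_by_external.get(c["AccountExternalId"])
    if sf_id:
        ready.append({**c, "AccountId": sf_id})
    else:
        orphans.append(c)  # fix the external ID before loading

print(len(ready), len(orphans))  # → 1 1
```

Quarantining unmatched records, rather than loading them anyway, is what preserves the relationships the paragraph above warns about.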
It is crucial to address data validation and quality issues from the beginning of a Salesforce integration project. There should be a clear strategy for handling data management issues, such as whether to merge or reject duplicate records and how to map data fields correctly for type conflicts and custom fields.
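A merge-versus-reject policy can be made explicit in code. The field names and the “first record wins, later records fill blanks” merge rule below are assumptions for illustration; a real project would encode whatever strategy the team agreed on.

```python
# Sketch: a configurable duplicate-handling policy keyed on email.

def dedupe(records, key="Email", policy="merge"):
    seen, rejected = {}, []
    for rec in records:
        k = rec[key].strip().lower()   # normalise before matching
        if k not in seen:
            seen[k] = dict(rec)
        elif policy == "merge":
            # Later duplicates fill in blanks but never overwrite values.
            for field, value in rec.items():
                if not seen[k].get(field):
                    seen[k][field] = value
        else:  # policy == "reject"
            rejected.append(rec)
    return list(seen.values()), rejected

recs = [
    {"Email": "a@x.com", "Phone": ""},
    {"Email": "A@x.com", "Phone": "555-0100"},  # duplicate, extra data
]
merged, _ = dedupe(recs)
print(merged)  # → [{'Email': 'a@x.com', 'Phone': '555-0100'}]
```

Writing the policy down as a function, rather than leaving it to ad-hoc decisions mid-load, is the strategy the paragraph calls for.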
Integrators frequently do not realize that Salesforce APIs impose governor limits that determine how much data can be accessed in a single call and in one day. As a result, they run into maddening run-time errors in the middle of a process. There is no way around these limits, so it is vital to understand them and design the Salesforce integration solution to stay within them.
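Designing within the limits usually means batching. The sketch below chunks records so each call carries at most 200 records, a cap that matches many Salesforce create/update calls; the daily call budget shown is an illustrative assumption, since real allocations depend on your edition and licenses.

```python
# Sketch: batching records to respect a per-call record cap and a
# daily API call budget. The budget figure is made up for illustration.

PER_CALL_CAP = 200          # common cap for Salesforce create/update calls
DAILY_CALL_BUDGET = 15_000  # illustrative; check your org's allocation

def plan_batches(records, cap=PER_CALL_CAP):
    """Split records into chunks so each API call stays under the cap."""
    return [records[i:i + cap] for i in range(0, len(records), cap)]

records = [{"n": i} for i in range(450)]
batches = plan_batches(records)
print(len(batches))               # → 3 calls instead of 450
print([len(b) for b in batches])  # → [200, 200, 50]

# Fail fast at planning time instead of mid-process at run time.
assert len(batches) <= DAILY_CALL_BUDGET
```

Checking the plan against the budget before the run starts is what turns a mid-process run-time error into a design-time decision.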
Too often integration projects fail because testing receives less attention than the implementation itself. Quality assurance testing by IT staff is imperative to ensure that errors are handled correctly, and user acceptance testing is crucial for business users to verify that the correct information is being loaded into the right places.
In general, all of these ‘gotchas’ can be avoided with a clearly developed strategy that anticipates these unexpected hang-ups. With strong, collaborative leadership, Salesforce integration can work smoothly and deliver huge value across the enterprise.
For further reading, check out:
7 Alcoholic Drinks To Imbibe As Your Sharepoint Integration Project Fails
CIOs, This Is How To Enrage Your CMO: Tell Her Marketo Integration Will Take Another Year
The Mad Max Approach to SaaS Integration