Salesforce is a terrific tool, and it continues to grow in capabilities as well as popularity. However, companies don’t use Salesforce in a vacuum; they need to integrate it with their other systems to maximize value. Here are some pitfalls to beware of in Salesforce integration projects.
Speed bumps and pitfalls
Incomplete discovery
A Salesforce integration project, even if it seems small, needs to follow the same best practices as any other project. You can’t scrimp on discovery or scope definition.
- What are the project’s goals? Define them clearly and specifically.
- Whose work will change as a result of the project? Is the integration project likely to produce a domino effect, either good or bad?
- What are the possible ways to integrate the target applications with Salesforce? Check the pros and cons of each approach if there’s more than one possibility.
- Which resources do Salesforce and the integration candidates use now? After the integration project is completed, which resources will be used differently?
Poorly documented scope
The goal of a well-written scope is to ensure that everyone affected by the project understands what the end product will be. Be sure to:
- Write user stories, and include end users as you write. Check with IT support and anyone else whose work will be affected.
- Specify which applications the project will integrate. Define the “how” of all the integrations. Make sure to identify any third-party tool requirements.
- Document how the new integration will affect end-user workflow, data flow, hardware and network resources, and software license levels.
Granted, the project team may not know all of these details at first. If that’s the case, be sure to build in regular checkpoints for the team to fill information gaps and fine-tune the implementation details. If any new or expanded information changes the user experience or system maintenance needs, send an update to anyone affected.
Poor fit with users’ overall workflows
Make sure the end result doesn’t make anyone’s job harder. This principle might seem obvious, but it’s surprisingly easy to overlook, and overlooking it is a costly mistake. Bad user experiences lead to poor adoption and compliance. Poor compliance leads to poor data quality. Poor data quality and inefficient human workflows completely defeat the purpose of the project! The whole point of a Salesforce integration project, or any data integration project, is to improve data quality and efficiency.
This pitfall goes right back to discovery and scope definition. Listen to end users as they describe their tasks and procedures, both inside and outside Salesforce. Make sure they understand what their new procedures will look like. As far as possible, sell the end users on the end result.
“Dirty” backfill data
Data integration generally includes a one-time backfill operation as well as an ongoing import schedule. These imports aren’t necessarily identical.
It would be easy to assume that backfill data follows the same model as current and future data. Bad idea. Depending on the age of the data, that may not be true. Your third-party applications may have changed their data models over time. Alternatively, the organization may have changed its own data requirements, sometimes as simply as “I’m tired of seeing blank due dates on tasks! Let’s require due dates, even if the software doesn’t enforce them.”
For live data connections, the initial sync operation will take care of backfilling old data automatically. In that case, you simply need to make sure that the old data is clean before you connect to it. That data cleanup may be a tall order, but it is definitely worth the effort.
The point here is that Salesforce integration projects must identify all inbound data conditions. Backfill data may need to be handled differently from live data. Just make sure that your source data is clean before you integrate it.
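To make that concrete, here’s a minimal pre-import check in Python. The file name, column names, and date format are all stand-ins for your own export; the point is to flag records that break today’s rules before they ever reach Salesforce.

```python
import csv
from datetime import datetime

# Hypothetical task backfill file. Column names and date format are
# assumptions for illustration; adapt them to your own export.
REQUIRED_FIELDS = ["subject", "due_date"]
DATE_FORMAT = "%Y-%m-%d"

def validate_backfill(path):
    """Split a backfill CSV into importable rows and rows needing cleanup."""
    clean, rejected = [], []
    with open(path, newline="", encoding="utf-8") as source:
        for row in csv.DictReader(source):
            problems = [field for field in REQUIRED_FIELDS
                        if not (row.get(field) or "").strip()]
            if "due_date" not in problems:
                try:
                    # Enforce today's rule ("due dates are required and
                    # well-formed") on yesterday's data, too.
                    datetime.strptime(row["due_date"], DATE_FORMAT)
                except ValueError:
                    problems.append("due_date (bad format)")
            if problems:
                rejected.append((row, problems))
            else:
                clean.append(row)
    return clean, rejected

clean, rejected = validate_backfill("legacy_tasks.csv")
print(f"{len(clean)} rows ready to import; {len(rejected)} rows need cleanup")
```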
Inadequate quality assurance and user acceptance testing
You defined user stories during discovery and scoping, didn’t you? Good! Now use those stories to create user acceptance testing (UAT) scripts.
Before you deliver your project to staging, step through those testing scripts in the development environment. Once the system passes internal quality assurance testing, it’s ready for staging and UAT.
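You can even automate part of that internal pass. Here’s a sketch of one scripted check derived from a hypothetical user story (“when an order closes in the ERP, it appears on the account in Salesforce”). The credentials, object names, and query are placeholders, and it assumes the simple-salesforce Python library running against a sandbox:

```python
import pytest
from simple_salesforce import Salesforce  # pip install simple-salesforce

@pytest.fixture
def sf():
    # domain="test" targets a sandbox; never run QA scripts in production.
    return Salesforce(username="qa.user@example.com.dev", password="...",
                      security_token="...", domain="test")

def test_closed_erp_order_appears_in_salesforce(sf):
    # Assumes an earlier test step pushed a known order through the ERP.
    result = sf.query(
        "SELECT Id FROM Opportunity "
        "WHERE Account.Name = 'UAT Test Account' "
        "AND StageName = 'Closed Won'"
    )
    assert result["totalSize"] >= 1, "Synced order never arrived in Salesforce"
```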
Work with your UAT team to make sure they understand the new design. If possible, walk through a couple of test scripts with them. Once testers are comfortable, let them finish the process in peace. Afterward, reach out to them for feedback.
Laissez-faire data management
Expect the unexpected. Even the most careful users can make mistakes, and even the best data transfer system can experience glitches. Build in good data management practices for the ongoing data transfer processes. Decide how to handle duplicates and incomplete transfers. Provide tools for administrative users to manage conflicts. Set up periodic validation reports to ensure good data quality going forward.
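For example, a scheduled script can surface duplicates for an administrative user to review each week. Here’s a sketch using the simple-salesforce library; the credentials, object, and field are illustrative:

```python
from simple_salesforce import Salesforce  # pip install simple-salesforce

# Placeholder credentials; prefer a proper auth flow (e.g., JWT) in production.
sf = Salesforce(username="admin@example.com", password="...",
                security_token="...")

# Weekly validation report: flag Contacts that share an email address.
duplicates = sf.query(
    "SELECT Email, COUNT(Id) dupes "
    "FROM Contact WHERE Email != null "
    "GROUP BY Email HAVING COUNT(Id) > 1"
)
for record in duplicates["records"]:
    print(f"{record['Email']}: {record['dupes']} matching records")
```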
Exceeding Salesforce data limits
Many of these hazards apply to data integration in general. One pitfall that’s specific to Salesforce relates to its data transfer limits. Even if your project uses third-party or AppExchange tools, it’s important to know your data transfer scale. How many records will you import per day? Per hour? Per batch? Should you use the Bulk API, or are your data sets small enough for the SOAP API?
Many tools automatically create jobs and batches sized for the API in use, and some can even choose the API based on how much data is ready for transfer. That helps, but it’s not enough. Double-check the time-based data transfer limits as well as your organization’s license limits on total data storage. Make sure there are no surprises!
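As a sketch of that decision, here’s how you might route a load between the Bulk API and the standard synchronous API with the simple-salesforce Python library. The 2,000-record threshold is a common rule of thumb, not a Salesforce mandate; check your org’s current limits before settling on one.

```python
from simple_salesforce import Salesforce  # pip install simple-salesforce

sf = Salesforce(username="integration@example.com", password="...",
                security_token="...")

# Illustrative cutoff: above it, the asynchronous Bulk API consumes far
# fewer API calls; below it, synchronous calls are simpler to debug.
BULK_THRESHOLD = 2_000

def load_contacts(records):
    if len(records) >= BULK_THRESHOLD:
        # Bulk API: Salesforce processes the batches asynchronously.
        return sf.bulk.Contact.insert(records, batch_size=10_000)
    # Small data sets: one synchronous create per record.
    return [sf.Contact.create(record) for record in records]
```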
Prepped for success
Well-designed data integrations save a lot of time and money. Boost your odds of a successful integration and avoid the traps by doing your homework. There are many Salesforce and third-party tools to make Salesforce integration projects easier. Use them wisely, and reap the benefits.