Designing a scalable data model
As Salesforce implementations grow in size and complexity, so does the volume of data. Because Salesforce is a multi-tenant platform, it scales the underlying infrastructure automatically, but as data volume grows, the processing time for certain operations still increases.
Typically, two areas are affected by the choice of data architecture or configuration on the Salesforce platform:
- Loading or updating large numbers of records, whether directly through the UI or via one or more integrations.
- Extracting data, whether through reports, other views into the data, or queries.
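When loading large volumes through an integration, a common pattern is to split the payload into batches sized for the Salesforce Bulk API. The following is a minimal Python sketch, not Salesforce-specific code: `chunk_records` is a hypothetical helper name, and the 10,000-record default reflects the maximum records per batch in Bulk API v1 (your org's effective limits may differ).

```python
from typing import Dict, Iterator, List

def chunk_records(records: List[Dict], batch_size: int = 10_000) -> Iterator[List[Dict]]:
    """Yield successive batches of at most batch_size records.

    10,000 is the per-batch record limit for Salesforce Bulk API v1;
    adjust batch_size to match the API version and limits in your org.
    """
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

# Example: 25,000 hypothetical Account rows split into three batches,
# which an integration would then submit as separate Bulk API batches.
rows = [{"Name": f"Account {i}"} for i in range(25_000)]
batches = list(chunk_records(rows))
```

Submitting work in right-sized batches keeps each server-side transaction within governor limits and makes partial failures easier to retry.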
Optimizing the data model generally involves the following:
- Only hosting data that truly needs to reside on the Salesforce platform based on business purpose and intent
- Deferring or temporarily disabling sharing change processing and other business rule logic when performing certain data operations
- Choosing the best (most efficient) operation...