Secrets of a successful Data Cloud Implementation

What is Data Cloud

Salesforce Data Cloud aggregates customer data from external sources, acting as a dynamic Customer Data Platform. It provides a 360-degree view of customer relationships – a promise the industry has pursued for the last 30 years. This unified customer profile enables actionable insights and data-driven decision-making directly within the Salesforce core platform, using clicks, not code.

That capability comes with added complexity, and it demands a more rigorous, intentional, and architected approach to implementation if you are to achieve your business objectives. A structured approach is essential, with roughly 80% of the project dedicated to planning and 20% to configuration. The benefits are significant: improved decision-making, greater operational efficiency, and the ability to personalize insights and actions. Data Cloud is also the foundation for AI agents (Agentforce), which can drive customer satisfaction, enhance the user experience, and reduce operational costs.

Customer360

The promise of a Customer Data Platform (CDP) – Customer 360 – has always been Salesforce's strategic goal. For most organizations, customer data is siloed across multiple systems, not just Salesforce, and is often stored in inconsistent formats. Data Cloud provides a single platform to create a unified customer profile and a single view of customer relationships. It harmonizes data from external sources – unifying, managing, and cleansing it – and makes it accessible to both low-code and pro-code tools, such as Flow and Apex, to drive insight and action.

Data Cloud, the Customer 360 platform, and Agentforce are fully integrated. That integration is made possible by the metadata framework that underpins Salesforce: metadata drives configuration and connects Data Cloud with the core platform, enabling real-time insights and actions. It also simplifies the implementation process and supports compliance with regulatory requirements.

"Agentforce understands your business. You've taught Salesforce your business with metadata – with custom fields. It knows that there is a 'High Value' custom field. It interpreted that." – John Kucera, SVP Product Management, Salesforce

Integrating Data Cloud

What makes Data Cloud powerful is its ability to integrate different external sources, harmonize the data, and make it available to the core platform – and surface it to AI via Agentforce. This capability is central to everything Salesforce does, providing real-time data for decision-making. Data can be copied into Data Cloud, or it can stay where it is and be accessed through Zero Copy for data lakes such as Snowflake, Google, Microsoft, or AWS.
Zero Copy means the data remains in its original location but is still accessible in real time, making Data Cloud a true real-time data platform. For many external sources, a subset of their data is copied into Data Cloud through standard connectors. This flexibility minimizes manual effort and supports informed, data-driven decisions.

"Data Cloud is the heartbeat of the Salesforce Platform." – Rahul Auradkar, EVP & GM of Unified Data Services & Einstein, Salesforce

Extending the Metadata Framework

At a high level, Data Cloud uses Data Lake Objects (DLOs) to store data in fields; for Zero Copy external sources, the DLO is a reference to the data's location. Data Model Objects (DMOs) are virtual objects related to DLOs that harmonize and normalize the data into a unified customer profile. Over 100 standard DMOs are available, which you can extend with custom fields, or you can create new DMOs. When building Flows or writing Apex, DMO fields appear as metadata, which simplifies integration and reduces manual effort during implementation.

Consumption pricing

The pricing model for Data Cloud is based on the consumption of credits. Credits are consumed when performing actions inside Data Cloud, such as data ingestion, harmonization, or activation. This makes Data Cloud accessible to organizations of any size or industry, but a clear use case and a mindful data architecture are essential to avoid inadvertently running up costs. Calculating Key Performance Indicators and fine-tuning segmentation are key steps in planning and implementation.

Implementation secrets

Just because Data Cloud is available to every organization doesn't mean it's always the right fit. Consider your business objectives carefully: you may not have an immediate business need, the in-house skills, or the resources to engage a consulting firm. Your organization might have technical debt that must be addressed first. The tooling and expertise around Data Cloud will only strengthen over time, so by addressing that complexity, improving metadata documentation, and raising data quality now, you lay the groundwork for future Data Cloud projects.

A successful Data Cloud implementation is contingent on:

Clear use case

Successful Data Cloud projects require a clear use case with a solid ROI, because consumption-based pricing can escalate if not carefully managed. Aggregating data from multiple external sources and restructuring it to create a unified customer profile is complex, but the results deliver deeper insight and better-informed decisions. The process requires an understanding of the Data Cloud architecture and meticulous planning. A well-articulated need backed by a clear business case is essential, and it should be supported by key stakeholders across the business units involved. A critical output of this process is a detailed cost-benefit analysis, which guides decision-making and ensures a positive ROI.
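To make the cost side of that analysis concrete, it helps to rough out credit consumption before anything is built. The sketch below is purely illustrative: the row volumes, credit rates, credit price, and benefit figure are invented placeholders for a hypothetical use case, not Salesforce pricing, and real rates vary by service type and contract.

```python
# Illustrative back-of-the-envelope cost model for one Data Cloud use case.
# Every number here is a placeholder assumption, not published pricing.

MONTHLY_ROWS_INGESTED = 2_000_000      # rows/month across all sources (assumed)
MONTHLY_ROWS_UNIFIED = 1_500_000       # rows through identity resolution (assumed)
MONTHLY_ROWS_ACTIVATED = 500_000       # rows evaluated by segmentation/activation (assumed)

CREDITS_PER_MILLION_ROWS = {           # placeholder rates per service type
    "ingestion": 2_000,
    "unification": 100_000,
    "segmentation_activation": 20_000,
}
CREDIT_PRICE_USD = 0.01                # assumed contract price per credit
EXPECTED_MONTHLY_BENEFIT_USD = 40_000  # value attributed to the use case (assumed)


def credits(rows: int, rate_per_million: float) -> float:
    """Credits consumed for a given row volume at a per-million-row rate."""
    return rows / 1_000_000 * rate_per_million


monthly_credits = (
    credits(MONTHLY_ROWS_INGESTED, CREDITS_PER_MILLION_ROWS["ingestion"])
    + credits(MONTHLY_ROWS_UNIFIED, CREDITS_PER_MILLION_ROWS["unification"])
    + credits(MONTHLY_ROWS_ACTIVATED, CREDITS_PER_MILLION_ROWS["segmentation_activation"])
)
monthly_cost = monthly_credits * CREDIT_PRICE_USD

print(f"Estimated monthly credits: {monthly_credits:,.0f}")
print(f"Estimated monthly cost:    ${monthly_cost:,.2f}")
print(f"Benefit / cost ratio:      {EXPECTED_MONTHLY_BENEFIT_USD / monthly_cost:.1f}x")
```

Even a crude model like this forces the conversation about which data really needs to flow into Data Cloud, which is where most of the cost is decided.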
80% Planning, 20% Building

Unlike typical Salesforce implementations, Data Cloud requires a shift in mindset. Many decisions are one-way and difficult to reverse, which demands a meticulously structured approach to planning. This reduces the risk of missteps that consume unnecessary credits and cause frustration.

"Do not start by clicking buttons. Planning and design is more important than implementation." – Andres Perez, Senior Manager, Trailhead Academy Solution Architect Lead, Salesforce

Executive cross-departmental sponsorship

Data Cloud projects require strong executive sponsorship, particularly because they are cross-departmental, involve consumption pricing, and touch data quality and compliance. Establishing a Center of Excellence helps ensure that best practices are followed and keeps the project aligned with regulatory requirements and Key Performance Indicators. This structure also helps manage the complexity of integrating Data Cloud with existing management processes.

Document, document, document

Documentation is vital for a successful Data Cloud implementation. It supports collaboration across business units and future-proofs Data Cloud projects. The documentation includes compliance requirements, use cases, process maps, architecture diagrams, metadata dictionaries and descriptions, and user stories. Good documentation ensures consistency across future use cases, accelerates time to value, and reduces risk.

The implementation approach for success

Repeatable success

Our experience, and advice from Salesforce, emphasize that a successful Data Cloud implementation requires meticulous planning: 80% of the project is planning – the Use Case, Analysis, and Design phases. The deliverables of these phases are architecture and design documentation and a business case. Each document builds on those created in the earlier phases, as understanding of the business objectives, compliance requirements, and data architecture is refined, which may require earlier documents to be updated.

The documents are interrelated and should be linked to make navigation and cross-referencing easier. If Elements.cloud is used as the documentation platform, they can all be connected, forming the heart of your Change Intelligence platform. This keeps compliance and management documentation accessible for impact analysis on future use cases, and the documents can be version-controlled.

Each subsequent use case builds on the earlier implementations. The power of Data Cloud is that data made available via a DMO can be reused. For example, by establishing and validating a primary email address for an individual from multiple external sources within Data Cloud, you create a unified customer profile; reusing that DMO in the next use case avoids duplicating effort.
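To make that example concrete, the snippet below sketches the kind of rule you are effectively asking Data Cloud to apply when establishing a primary email. It is a conceptual illustration only: the record shapes, source names, and selection rule are assumptions, and it is not how Salesforce's identity resolution engine is actually implemented.

```python
# Conceptual sketch of establishing a primary email from multiple sources.
# Not Salesforce's identity resolution engine – just the idea, in plain Python.
import re
from datetime import date

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

# Candidate emails for one individual, pulled from different source systems (assumed shape).
candidates = [
    {"email": "j.smith@example.com", "source": "billing", "last_verified": date(2024, 11, 2)},
    {"email": "jsmith@oldisp.net", "source": "legacy_crm", "last_verified": date(2021, 3, 15)},
    {"email": "not-an-email", "source": "web_forms", "last_verified": date(2024, 10, 1)},
]


def primary_email(records):
    """Pick the most recently verified, syntactically valid address as primary."""
    valid = [r for r in records if EMAIL_RE.match(r["email"])]
    return max(valid, key=lambda r: r["last_verified"])["email"] if valid else None


print(primary_email(candidates))  # -> j.smith@example.com
```

Once a rule like this has been agreed, documented, and implemented once, every later use case reuses the same validated primary email via the DMO rather than re-deriving it.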
To reuse DMO objects and fields with confidence, they need to be well-documented. That is why you need a metadata dictionary that is always up to date – and not just for Salesforce and Data Cloud, but for all the applications involved. The dictionary should reflect changes in data structures, automation, and analytics tools as those applications evolve.

Below is a schematic showing the documentation created and updated during the Data Cloud implementation. It is all held inside Elements.cloud so that there is a single source of version-controlled, connected documentation. This speeds up the implementation and ensures consistency and standardization.

Use Case Phase

In the Use Case phase, you evaluate the potential use cases that were brainstormed in the Setup phase. You need to weigh the impact and potential benefit to the business against the technical complexity, and select the easiest and simplest use case to implement first. It is captured as a Requirement, along with a Use Case ERD and a Use Case E2E (end-to-end) process diagram. These are the inputs to the Analysis phase.

Analysis phase

The key outputs of the Analysis phase are a data architecture and a business case to move to the Design phase. You need to perform enough analysis to build a viable business case. Because Data Cloud pricing is consumption-based, it is critical that you can estimate the data flows, volumes, and credit usage in order to build a cost model.

The Analysis phase will be iterative, and there will inevitably be tradeoffs between costs, risks, and benefits. As you understand the use case requirements in more detail, you can confirm the data sources that are required and cost any enhancements needed to support the use case. For example, additional fields may be required for segmentation, or new data export functions may need to be built. High-level architectural decisions will affect how the use case is delivered. There are also data-related issues that will inform the data architecture and design: quality, governance, privacy and security, regulatory compliance, and data volumes.

Do not underestimate the architectural and security implications of implementing Data Cloud. First, is the data you need accessible, or do permissions need to be opened up? When data is aggregated, it can change the level of security and privacy required. For example, when customer personal details are stored in Salesforce, their medical records are in another system, and their claims and payments are in a third system, each has its own level of regulatory compliance; once this data is aggregated, the required level of compliance is suddenly higher. You may need to reconsider and re-architect which data you aggregate, or ensure the right level of security is in place.

Once you have identified the data sources, you can also assess the quality of the data and kick off a separate project to review and improve it. This should be initiated as soon as the data sources are confirmed, so that it is complete before the first data ingestion into Data Cloud.
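What that data quality review looks like depends on the sources, but even a lightweight profile of each confirmed extract catches the problems that are expensive to discover after ingestion. The sketch below is a minimal example: the file name, field names, and checks are assumptions to be replaced with your own.

```python
# Minimal pre-ingestion data quality profile for one source extract.
# The file, field names, and checks are placeholders for illustration.
import csv
import re
from collections import Counter

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")


def profile(path: str, key_field: str = "customer_number", email_field: str = "email") -> dict:
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    total = len(rows)
    missing_key = sum(1 for r in rows if not (r.get(key_field) or "").strip())
    bad_email = sum(1 for r in rows if not EMAIL_RE.match(r.get(email_field) or ""))
    duplicate_keys = sum(c - 1 for c in Counter(r.get(key_field) for r in rows).values() if c > 1)
    return {
        "rows": total,
        "missing_key_pct": 100 * missing_key / total if total else 0.0,
        "invalid_email_pct": 100 * bad_email / total if total else 0.0,
        "duplicate_keys": duplicate_keys,
    }


if __name__ == "__main__":
    print(profile("crm_extract.csv"))  # hypothetical extract file
```

The thresholds you set against a profile like this (for example, a maximum percentage of invalid emails) become the acceptance criteria for the data quality project before the first ingestion.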
You know you have reached the end of the Analysis phase when you can get sign-off on the business case to move to the next phase: Design.

Design phase

The analysis established feasibility and set the overall data architecture. The Design phase takes that to a more detailed level, with the output being a set of user stories for the changes that need to be made to Data Cloud, core Salesforce, and the 3rd party applications. This can also include work to develop new data extract utilities and intermediary applications to harmonize data outside Data Cloud.

This phase develops a detailed data architecture and governance strategy, taking the data from the source systems through the different steps of harmonization and filtering to arrive at the final data sets that are exposed in Salesforce as DMOs. This can be a complex, multi-step process for more sophisticated use cases, or where the data from the source systems is in very different formats or is difficult to harmonize – for example, where one system uses the customer's SSN as the core identifier, other systems use one or more email addresses, and Salesforce uses a unique customer number.

As the data structures are understood in more detail, some of the decisions made in the Analysis phase may be revisited. This can change the data volumes flowing into Data Cloud and the amount of manipulation required, which in turn changes the consumption of credits and the ROI of the use case. It may be discovered that the data needed from a source system has to be restructured or aggregated before Data Cloud can use it, or that the data volumes are so great that the data needs to be aggregated or filtered in an existing or new intermediary application, such as AWS, to make the use case cost-effective. The driver could equally be the data architecture, or security and regulatory compliance implications.

The existing Context DFD is updated with the changes, and a new document is created in the Design phase: the Detailed DFD/ERD. This is the pivotal document of the phase, as it shows the DSOs, DLOs, and DMOs, plus any intermediary applications. The Detailed DFD/ERD can be relatively simple, with one set of DSOs, DLOs, and DMOs; for complex use cases there could be multiple sets of DMOs to transform the data. These are designed in this phase down to field level and should be shown on the Detailed DFD/ERD, linked back to "proposed" metadata items created in the metadata dictionaries. The Salesforce core metadata is automatically generated in the metadata dictionary using the Metadata API. The metadata dictionaries for the source systems can document all of their metadata or just the metadata that will be used in the Data Cloud implementation; at a minimum, you need the metadata used in the implementation so that you can document the dependencies with Data Cloud for future impact analysis.

The DSOs, DLOs, and DMOs should not be created in Data Cloud yet, because you need to be completely certain that they are correct, and you can only be sure of that at the end of the Design phase. If you create them in Data Cloud now and they later prove to be incorrect, you will have to unpick it all and start again. That is extremely wasteful in terms of time and consumption credits, and it saps morale, causes frustration, and de-motivates the team.
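To give a feel for the level of detail the Design phase pins down on paper first, the sketch below shows the kind of field-level, source-to-DMO mapping that the Detailed DFD/ERD and the "proposed" metadata items capture before anything is built. The object names, field names, and transforms are placeholders loosely modelled on the standard data model, not exact Data Cloud metadata.

```python
# Hypothetical field-level mapping spec: which source field feeds which DLO
# field, and which DMO field it is ultimately exposed as. All names and
# transforms below are illustrative placeholders.
from dataclasses import dataclass


@dataclass(frozen=True)
class FieldMapping:
    source_system: str
    source_field: str
    dlo_field: str
    dmo_field: str
    transform: str = "none"


PROPOSED_MAPPINGS = [
    FieldMapping("claims_system", "ssn", "Claims_DLO.ssn", "Individual.PartyId", "hash"),
    FieldMapping("web_forms", "email_address", "WebForm_DLO.email", "ContactPointEmail.EmailAddress", "lowercase"),
    FieldMapping("salesforce", "Customer_No__c", "Account_DLO.customer_no", "Individual.ExternalRecordId"),
]

for m in PROPOSED_MAPPINGS:
    print(f"{m.source_system}.{m.source_field} -> {m.dlo_field} -> {m.dmo_field} ({m.transform})")
```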
Making changes to proposed metadata in the metadata dictionary and the related documentation during the Design phase is quick and easy. But to validate easily that all the changes are correct, it all needs to be connected; you can see the relationships in the image above. Trying to manage it all in a series of text documents and spreadsheets quickly becomes unmanageable for all but the most trivial use cases.

The Detailed DFD/ERD shows the Data Cloud DSOs, DLOs, and DMOs, plus the external source systems and intermediary apps – and, crucially, how they relate to each other: the data flows. In the Build phase, this is exactly what you will be doing in Data Cloud – creating them and linking them.

The first sub-phase of Design is architecting and designing the data structures. The second sub-phase is identifying the work that needs to be done to deliver the Data Cloud implementation. Both sub-phases are required whether it is a brand-new Data Cloud implementation or a new use case building on an existing one.

Creating the list of work items is relatively straightforward: you go through every data item and data flow on the Detailed DFD/ERD and create a user story. If you are using Elements, you can create the user story for every data item and data flow from within the Detailed DFD/ERD diagram, and they are automatically linked. This means you can see all the user stories listed in a table, with links back to the Detailed DFD/ERD diagram and the metadata dictionary; in this view, any metadata conflicts are automatically flagged. You can also allocate the user stories to releases to phase the work, and you can see all the user stories that relate to a particular data item from either the Detailed DFD/ERD diagram or the data item in the metadata dictionary.

You are now looking at data volume at a field level; in the Analysis phase it was at a data source level, to get ballpark estimates. You can track the data volumes at this more detailed level for each of the data sources on the Detailed DFD/ERD, and you will need to revisit and revise the consumption costs. This is fed back into the business case to ensure that there is still a positive ROI.

You know you have reached the end of the Design phase when you can get sign-off on the updated business case to move to the next phase: Build.

Build phase

This is where your data consultant skills come to the fore. During the Build phase, you implement each of the user stories developed during the Design phase. The user stories can be synced from Elements.cloud to a ticketing system such as Jira so that they can be actioned from there; the Elements Jira integration and extension make all the Elements content visible inside Jira.

The metadata objects and fields (DSOs, DLOs, DMOs) are built in Data Cloud and, through the Metadata API, are synced into the Salesforce metadata dictionary in Elements, so all the dependencies are visible on the dependency tree. Because you created them as "proposed" in the Design phase, they are already documented in the metadata dictionary – now the proposed metadata is connected to the implemented metadata.
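The value of that connection is being able to check the build against the design rather than against memory. The sketch below illustrates the idea with hard-coded placeholder field lists; in practice the "proposed" side would come from your design documentation and the "implemented" side from a metadata export or sync.

```python
# Illustrative reconciliation of proposed vs implemented DMO fields.
# Both dictionaries are hard-coded placeholders for the sake of the example.

proposed_dmo_fields = {
    "UnifiedIndividual": {"FirstName", "LastName", "PrimaryEmail", "CustomerNumber"},
}

implemented_dmo_fields = {
    "UnifiedIndividual": {"FirstName", "LastName", "PrimaryEmail"},
}

for dmo, proposed in proposed_dmo_fields.items():
    built = implemented_dmo_fields.get(dmo, set())
    missing = sorted(proposed - built)
    unplanned = sorted(built - proposed)
    if missing:
        print(f"{dmo}: designed but not built: {missing}")
    if unplanned:
        print(f"{dmo}: built but not in the design: {unplanned}")
```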
There may also be changes to existing Salesforce metadata, and these should be documented as they are made. Finally, there may be 3rd party apps, utilities, or data structures that need to be built. For example, you could use AWS S3 as an intermediary data aggregation platform to simplify what is ingested into Data Cloud: you would need to configure S3 and build the utilities to get the data into S3, while Data Cloud has the connectors to get the data out of it.

The advice here is to work through from source data to outcome in order – left to right. Test that you are getting the expected data at each step before you start building the next step; that way you can confirm you are going to get the results you need. As we've said (multiple times), mistakes are easy to make but time-consuming and complicated to rectify.

If you need to change the design to make it work, you must update the documentation – the Context DFD and the Detailed DFD/ERD – including the data volume estimates that go into the cost planning. When you implement the next use case, you will struggle in analysis and design if the Data Cloud configuration does not match the documentation.

Monitor phase

Once you have implemented the first use case, you monitor it. You are monitoring to confirm that you are getting the data you expected and that the activations are working correctly. You also need to watch the recurring costs to ensure they are in line with your planning assumptions; there are plenty of stories of customers being presented with unexpected monthly bills for consumption.

You started the project with a requirement and a series of assumptions. After implementing the use case, check back with the end users to establish whether the ROI was met, or whether fine-tuning is needed before you rush into the next use case. This should be a formal checkpoint.

If one of the activations is Agentforce, more detailed monitoring is required. The data is presented to the LLM via a prompt in Agentforce, and the result of that prompt will "drift" over time as the LLM evolves. The result – which could be a draft email, a service recommendation, or a data update – could be getting better with time, or it could be degrading. The results need external review because their quality cannot be judged automatically. For example, if Agentforce is generating a draft email for a service agent, you need to know how much the agent has to change the email before it can be sent to the customer – does the agent even use the draft? You need these insights so that you can fine-tune the data in Data Cloud, the prompt, or even the choice of LLM.
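One simple way to get that insight is to log each AI draft alongside the email the agent actually sent and track how similar they are over time. The sketch below uses a plain text-similarity ratio as a rough proxy; the sample texts and the idea of comparing reporting periods are illustrative assumptions rather than a built-in Agentforce metric, and the (draft, sent) pairs would have to be captured by your own logging.

```python
# Illustrative drift check: how heavily do agents edit the AI-drafted email?
# A falling similarity ratio over time is a signal to revisit the data in
# Data Cloud, the prompt, or the choice of LLM.
from difflib import SequenceMatcher
from statistics import mean


def similarity(draft: str, sent: str) -> float:
    """1.0 means the draft was sent unchanged; 0.0 means it was fully rewritten."""
    return SequenceMatcher(None, draft, sent).ratio()


# Hypothetical (draft, sent) pairs captured from two reporting periods.
november = [("Dear Ms Jones, your claim has been approved.",
             "Dear Ms Jones, your claim has been approved.")]
december = [("Dear Ms Jones, your claim has been approved.",
             "Dear Ms Jones, I'm sorry for the delay. Your claim is approved.")]

print(f"November avg similarity: {mean(similarity(d, s) for d, s in november):.2f}")
print(f"December avg similarity: {mean(similarity(d, s) for d, s in december):.2f}")
```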
Data Cloud + Elements.cloud

Throughout this article, we've assumed you would use Elements.cloud for creating and managing all the documentation during the project. If you are not using Elements.cloud, you will need diagramming tools, documents, spreadsheets, and ticketing systems to keep track of what needs to be built. For all but the most trivial Data Cloud implementations, that approach becomes unworkable: the administrative effort of cross-referencing, and the mistakes caused by issues falling through the gaps, increase the risk of the project.

To make it easier to implement Data Cloud using Elements.cloud, we've provided:

A documented, proven implementation approach: built as a multi-level UPN process map with content linked to activities. That content includes training, standards, and knowledge articles.

Standard diagrams available as templates: the Detailed DFD/ERD diagram may be simple for the first implementation, but it can be very complex for later use cases that need more sophisticated data harmonization and unification.

Data Cloud metadata in the metadata dictionary: Data Spaces, DSOs, DLOs, and DMOs down to field level, plus other Data Cloud metadata such as calculated insights, are synced to the metadata dictionary. The dependencies between metadata in Data Cloud, Salesforce, and 3rd party apps are visualized in Dependency Trees.

With Elements.cloud you can:

Deliver a successful implementation with a proven approach
Improve planning with connected documentation
Reduce the risk of changes with metadata dependency analysis
Accelerate your Data Cloud implementation with templates
Future-proof the next use case implementations with connected documentation

Take a peek into our own Data Cloud journey to see how the approach outlined here streamlines implementation and prevents costly rework.

Final Word

Salesforce Data Cloud aggregates customer data from external sources, creating unified customer profiles that provide a 360-degree view of customer interactions. It enables actionable insights and real-time data analysis from within the Salesforce platform, using clicks, not code. With the support of Elements.cloud, planning and documentation become more streamlined, allowing for an implementation that delivers your business objectives.

Data Cloud delivers substantial benefits, making the investment in a structured approach worthwhile. It provides the foundation for AI agents (Agentforce) that can enhance customer experiences, streamline business processes, and support strategic growth.

Ian Gotts, Founder & CEO
Published: 27th December 2024