Reflecting on the Elements IMPACT events

What is IMPACT?

The IMPACT events started with a virtual Summit, followed by a series of in-person events in Europe and North America. Several common themes emerged across the events that encapsulate the current state of the Salesforce ecosystem.

The power of aggregation

The events enable professionals from across the ecosystem to come together to share Salesforce implementation best practices and to discuss how a Change Intelligence Platform can support them. Change Intelligence is an emerging technology category that is complementary to DevOps and Data Governance.

As the category evolves, there are thousands of evangelists. They are happy to talk about the benefits before the category becomes mainstream, but they feel like lone voices. A comment I heard recently was, “I’m going to be on this island on my own for a while”. So the IMPACT events enable evangelists to “find their tribe” and realize that they are not alone. It is also valuable for us to understand the support that they need.

You’d expect these evangelists to be in SMBs or niche SIs, but we have evangelists in Fortune 500 companies and the major SIs. This is not surprising: Change Intelligence is not completely new, high-risk, and unproven.

Change Intelligence is the confluence of two established approaches, and the power comes when you put them together. The two areas are:

  • process-led change: requirements, mapping processes, drawing architecture diagrams, writing user stories
  • metadata management: app metadata dictionaries, metadata documentation, dependencies, change logs

When you connect the data from these different approaches in context, you get intelligence that helps you drive change: Change Intelligence.
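
To make the idea concrete, here is a minimal sketch in Python of connecting the two sides. Everything in it (the MetadataItem class, the field and flow names, the user story) is hypothetical and invented purely for illustration; it is not how any Change Intelligence product is built.

    # Minimal sketch: link process-led artifacts (user stories) to metadata
    # items so the impact of a change can be traced in context. All names
    # and data below are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class MetadataItem:
        api_name: str
        item_type: str                      # e.g. "CustomField", "Flow"
        used_by_stories: list = field(default_factory=list)

    # Metadata management side: an app metadata dictionary
    dictionary = {
        "Opportunity.Discount__c": MetadataItem("Opportunity.Discount__c", "CustomField"),
        "Approve_Discount_Flow": MetadataItem("Approve_Discount_Flow", "Flow"),
    }

    # Process-led side: user stories mapped to the metadata they touch
    stories = {
        "US-101: Sales rep requests a discount": [
            "Opportunity.Discount__c",
            "Approve_Discount_Flow",
        ],
    }

    # Connect the two in context
    for story, touched in stories.items():
        for name in touched:
            dictionary[name].used_by_stories.append(story)

    # The "intelligence": metadata with no linked story is a tech-debt
    # candidate; linked stories show the business impact of changing it.
    for item in dictionary.values():
        links = item.used_by_stories or ["no linked story: review as tech debt"]
        print(item.api_name, "->", links)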

Staggering state of tech debt

The Change Intelligence Research Series analyzed the org metadata synced by Elements.cloud, which syncs and analyzes 50,000 orgs per month, over 1.3 billion metadata items in total. A sample of that data was anonymized for the reports, which can be downloaded here.

And whilst there is a recognition that tech debt is fairly widespread, no one was prepared for the hard evidence:

  • 50% of all custom objects are never used
  • 41% of custom fields are never populated
  • 150 fields on the Opportunity page
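
The kind of analysis behind figures like these can be sketched in a few lines. This is not how Elements.cloud computes its statistics; the CSV layout and the column names (api_name, populated_count) are invented assumptions.

    # Hypothetical sketch: flag never-populated custom fields from an
    # exported usage report. The CSV columns are invented for illustration.
    import csv

    def never_populated_fields(path: str) -> list[str]:
        """Return custom fields whose populated-record count is zero."""
        flagged = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                is_custom = row["api_name"].endswith("__c")  # custom-field suffix
                if is_custom and int(row["populated_count"]) == 0:
                    flagged.append(row["api_name"])
        return flagged

    # Example: fields = never_populated_fields("field_usage_export.csv")
    # len(fields) / total custom fields gives a 41%-style statistic.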

This level of tech debt undermines the ability to change Salesforce quickly without dramatically increasing the risk of any change breaking it. The industry analyst Forrester called this the “Salesforce @scale dilemma”. Forrester published the report in 2017, and the feeling is that the problem has only gotten worse since.

The complexity of scale crushes Salesforce’s responsiveness. As Salesforce use grows, innovation slows and flexibility evaporates.

Forrester, Salesforce @scale dilemma

This concept is explored further in this blog post, which digs into the Forrester report and its implications for the Salesforce ecosystem.

Cause of tech debt and org complexity

Whilst there can be specific instances where metadata is created and never used, it was agreed that there is one overarching reason: a lack of rigorous up-front business analysis. This analysis is needed to bottom out the true requirements before building.

One message was loud and clear. The promise of rapid development was heard by business users and senior management, and they put enormous pressure on the implementation teams to “just start building”. It is incredibly difficult to justify the cost/benefit of taking longer to do the analysis, even though the adoption and results will be better.

The Change Intelligence Research Series data can help change the narrative. It can provide the data to show the impact of insufficient analysis. That can help drive a change in the rigor of the implementation approach with more emphasis on up-front planning and analysis. Here is a recent blog on where to look to build a business case for managing technical debt.

Spoiler alert: implementing Data Cloud requires a planning-first, data-driven, methodical approach. So now is the time to adopt a more structured approach to business analysis.

Metadata, metadata, metadata

From our perspective, there was another discussion that surprised us: very few orgs had any formal way of tracking, documenting, and analyzing the impact of changing metadata. Salesforce is now talking about metadata. It is on their TDX, World Tour, and Dreamforce slide decks. Marc Benioff had mentioned it only five times on earnings calls since 2018; in a recent earnings call, he called it out 26 times. So what is behind the sudden interest?

Metadata is what allows apps to be easily configured by every customer for their differing requirements. Apps need to be configured and extended by non-technical teams, and metadata is what makes that possible. The metadata approach used by Data Cloud is part of what makes it so compelling. Back in 2023, Parker Harris called out the need for a metadata dictionary at a True To The Core session and gave Elements.cloud a shout-out.
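
A metadata dictionary does not have to be complicated to be useful. Below is a minimal sketch of what one entry might track; the DictionaryEntry class and its fields are illustrative assumptions, not a prescribed schema or the Elements.cloud data model.

    # One possible shape for a metadata dictionary entry, covering the
    # tracking, documentation, and dependency concerns discussed above.
    # Field names are illustrative assumptions, not a prescribed schema.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class DictionaryEntry:
        api_name: str                  # e.g. "Opportunity.Region__c"
        metadata_type: str             # e.g. "CustomField"
        description: str               # why it exists, in business terms
        owner: str                     # who to ask before changing it
        dependencies: list[str] = field(default_factory=list)
        change_log: list[tuple[date, str]] = field(default_factory=list)

    entry = DictionaryEntry(
        api_name="Opportunity.Region__c",
        metadata_type="CustomField",
        description="Sales region used for territory reporting",
        owner="RevOps",
        dependencies=["Region_Assignment_Flow", "Territory_Report"],
    )
    # Impact analysis starts here: dependencies show what a change touches,
    # and the change log shows what changed and when.
    entry.change_log.append((date(2024, 5, 1), "Picklist values updated"))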

Data Cloud – the future of Salesforce

One common area of discussion was Data Cloud. Data Cloud is the future of Salesforce, backed by a huge level of development, marketing, and training investment. It is not an add-on, but core to Salesforce’s long-term strategy. It completes the Customer 360 story: being able to aggregate data from third-party systems and take action within Salesforce is extremely compelling. Those actions could be populating list views, initiating automation, enhancing the data context for AI, or triggering custom code. Speculating further, eventually all CRM data could reside in Data Cloud, and the CRM as currently known and understood becomes simply the UI into that data.

A limited understanding of the power of Data Cloud, combined with the (incorrect) view that it is expensive and not viable for SMBs, has dampened interest. But when the Elements.cloud implementation approach was presented, it changed the entire conversation.

Demonstrating a clear use case is the starting point. Then, understanding the data sources and flows means that the data volumes can be used to calculate the costs. That enables the ROI to be validated before any design work. Once the data design is completed, you have a clear list of the work to be done, tracked as user stories. The costs can be validated again to ensure there is still a positive ROI. At that point you are 80% of the way there; the final 20% is implementing the changes. This is a very different approach from typical Salesforce projects, where there is limited analysis and you iterate your way to the correct solution. That iteration is wasteful even for core Salesforce, as it generates rework and tech debt. With Data Cloud, it simply will not work.
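
To illustrate the volume-based costing step, here is a back-of-the-envelope sketch. The sources, record volumes, unit cost, and expected benefit are all invented numbers, not Salesforce pricing; the point is only that known data volumes let you estimate cost and sanity-check the ROI before any design or build work starts.

    # Hypothetical Data Cloud cost estimate. The rates and volumes are
    # invented for illustration; they are not Salesforce pricing.
    # Known data volumes -> cost estimate -> ROI check before design.
    SOURCES = {                      # records ingested per month, per source
        "ERP orders": 2_000_000,
        "Web analytics": 15_000_000,
        "Support tickets": 300_000,
    }
    COST_PER_MILLION_RECORDS = 120.0     # invented unit cost, in dollars
    EXPECTED_MONTHLY_BENEFIT = 6_000.0   # e.g. value of reduced churn

    monthly_records = sum(SOURCES.values())
    monthly_cost = (monthly_records / 1_000_000) * COST_PER_MILLION_RECORDS
    roi = (EXPECTED_MONTHLY_BENEFIT - monthly_cost) / monthly_cost

    print(f"Records/month: {monthly_records:,}")
    print(f"Estimated cost/month: ${monthly_cost:,.2f}")
    print(f"ROI: {roi:.0%}")        # negative ROI => rethink the use case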

What is clear is that more education is required on the use cases, implementation approach, and pricing. Elements.cloud has documented its implementation approach to make it repeatable, with an emphasis on analysis, design, and documentation. Here is a short video by Adrian King (Elements.cloud CTO) talking through the implementation approach.

Bringing it all together

Data Cloud draws the previous themes together. To have any chance of success with Data Cloud, you need to perform rigorous business and data analysis: 80% of the project is planning, and only 20% is Data Cloud configuration. It is iterative, with each use case building on the prior one, so metadata and architecture documentation is critical. A central part of the documentation strategy is metadata dictionaries for Salesforce, including Data Cloud, and for the external applications connected to Data Cloud.
