Using AI to evaluate metadata descriptions
Salesforce needs/reads your descriptions
With the launch of Einstein 1 and its powerful AI capabilities, there is suddenly a more pressing reason to fill out metadata descriptions. Einstein Copilot uses the descriptions of core platform and Data Cloud metadata to perform its magic. In the compelling demos at TDX and the World Tours, you can see Copilot making sense of a request from just a few sentences. It goes out and looks for the fields that hold the data it needs – your descriptions tell it where to look. It also looks for the Flows or Apex it can run to take action, and chains those actions together.
If you are going to start documenting, don’t limit yourself to just what Einstein needs. Good documentation accelerates time to value for future changes. It reduces the time needed for impact analysis and helps you identify metadata to remove, thereby cleaning up technical debt. AI like ElementsGPT can read it, helping you architect and design better solutions.
For all of those of you who have been fastidiously documenting…. You are done. And for the slackers out there, you have another reason to add descriptions. You give a description to an action, just like you would introduce it to a colleague. That teaches Einstein. Let’s give it up for documentation. I love it.
John Kucera, SVP Product Management, Salesforce
@TDX Keynote
Why, not What
In a recent article, “Metadata descriptions. Can AI write them for you?”, I argued that it can’t. This is because the description for metadata needs to explain “Why” it was created: the background and the decisions that were made. “What” the metadata does is less useful when determining how and when to use it.
Also, you need to write the descriptions so that AI can understand them. AI doesn’t understand your organization-specific acronyms and abbreviations. It is not good at making assumptions. It was not there when you made architectural or design decisions. So it doesn’t understand why. We wrote a standard called MDD (Metadata Description Definition) to help you understand how to structure your descriptions.
Evaluating current descriptions
Maybe someone has been diligent and has added descriptions. And if they have, how useful are those descriptions? Are they “What” or “Why”? Before you start a huge amount of work documenting metadata, it would be good to understand the current state: “metadata description discovery”.
What you need is a view of how many metadata items have descriptions and, where they do, whether those descriptions are “What” (and therefore of questionable value) or “Why”. Just doing this sounds like a huge analysis job. So why not get AI to help us? If it can’t write the descriptions, it can at least help us estimate the work.
It can.
BTW, certain metadata types – standard object, standard field, page layout, list view, Apex – do not have description fields in Salesforce. But if you have an Elements.cloud metadata dictionary, EVERY metadata item can have a description, including all the picklist values for picklist fields.
Approach
This prompt looks at metadata descriptions and, where a description exists, evaluates whether it is “What” or “Why”. The prompt starts by defining the scope of the evaluation. The scope is the placeholder XXX and needs to be replaced – for example, ‘all custom fields in the Opportunity object’.
BTW, there is no point in evaluating standard fields because you cannot update their descriptions. We can also ignore the metadata types that do not have description fields, and all metadata from managed packages, because you cannot update them.
Below is the prompt for evaluating the quality of your descriptions. You can use it inside Elements.cloud in the metadata dictionary org copilot. If you don’t have Elements.cloud, you will have to work out how to export all your metadata descriptions into a spreadsheet and then use this prompt in ChatGPT.
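If you are taking the manual route, here is a rough sketch of one way to do the export in Python. It is only a sketch under my own assumptions: it uses the simple-salesforce library, queries the Tooling API FieldDefinition object for one object’s field labels, API names and descriptions, reads credentials from environment variables, and writes a CSV you can paste into ChatGPT alongside the prompt. Verify the object and field names against your org and API version before relying on it.

```python
# Rough sketch: export field descriptions for one object to a CSV so the
# evaluation prompt can be run against them in ChatGPT.
# Assumptions: the simple-salesforce library, the Tooling API FieldDefinition
# object exposing Label, QualifiedApiName and Description, and credentials
# supplied via environment variables. Check these against your API version.
import csv
import os
import urllib.parse

from simple_salesforce import Salesforce

sf = Salesforce(
    username=os.environ["SF_USERNAME"],
    password=os.environ["SF_PASSWORD"],
    security_token=os.environ["SF_TOKEN"],
)

# The object matching the scope in the prompt, e.g. 'all custom fields in the Opportunity object'
OBJECT = "Opportunity"

soql = (
    "SELECT Label, QualifiedApiName, Description "
    "FROM FieldDefinition "
    f"WHERE EntityDefinition.QualifiedApiName = '{OBJECT}'"
)

# Run the query against the Tooling API via simple-salesforce's generic helper
result = sf.toolingexecute(f"query/?q={urllib.parse.quote(soql)}")

with open(f"{OBJECT.lower()}_descriptions.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Field name", "API name", "Standard/Custom", "Description"])
    for rec in result["records"]:
        api_name = rec["QualifiedApiName"]
        writer.writerow([
            rec["Label"],
            api_name,
            "Custom" if api_name.endswith("__c") else "Standard",
            rec.get("Description") or "",
        ])
```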
Prompt
‘The scope of the evaluation is XXX.
You are an AI trained to evaluate text descriptions. Your task is to determine whether the metadata description explains “what the metadata is” or “why the metadata was created”.
What the field is: This refers to the content or data type of the field. For example, it describes the kind of information the field holds or its properties.
Why the field was created: This refers to the purpose or reason behind the creation of the field. It explains the motivation or need that led to the field’s inclusion in the application. Such descriptions often contain the following words; each example shows how the word appears in a sentence:
- Why: The reason why this field is important
- Shows: This field shows the date the licences were issued to the customer
- Used to: This is used on a renewal opportunity to show if a discount has been applied
- For: This feature is for tracking user activity.
- Allows: This option allows users to customize their settings.
- Enables: This function enables the processing of large datasets.
- Helps: This software helps in managing project timelines.
- Designed to: This application is designed to streamline workflow.
- Serves to: This feature serves to enhance security.
- Aims to: This module aims to improve user experience.
- Facilitates: This platform facilitates communication between team members.
- Provides: This tool provides a way to automate tasks.
- Intended for: This program is intended for data analysis.
- Function: The function of this feature is to organize files.
- Purpose: The purpose of this tool is to simplify scheduling.
- Utilized for: This system is utilized for real-time monitoring.
- Allows for: This mechanism allows for quick adjustments.
Here are the steps to follow:
- For each metadata item, read the description. Ignore the following metadata types because they do not have a description: Standard Object, Standard Field, Page Layout, List View, Apex. Ignore all metadata from managed packages.
- Determine if the description focuses on the nature and content of the field or on the purpose and rationale for its creation.
- Clearly state whether the description explains “what the field is” or “why the field was created”.
- The status of each field is one of the following three: Why, What, No description. Export a CSV with one row per field. The columns are Field name, API name, Standard/Custom, Status, Description.
- Give the distribution of statuses for the evaluated descriptions, by metadata type, split by standard and custom. Distribution % should be to 2 decimal places.’
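As an aside, if you later want to sanity-check the AI’s verdicts, the keyword heuristic described in the prompt can be approximated in a few lines of Python. The sketch below rests on my own assumptions: it reads a CSV with a Description column (the file and column names are hypothetical), classifies a description as “Why” when it contains one of the purpose words listed in the prompt, “What” otherwise, and “No description” when it is empty. It is a cheap first pass, not a replacement for the AI evaluation.

```python
# Minimal sketch of the What/Why heuristic the prompt describes, useful as a
# cheap first pass or to sanity-check the AI's classification.
# Assumes a CSV with a "Description" column; adjust names to match your export.
import csv
from collections import Counter

# Purpose words taken from the list in the prompt above
WHY_MARKERS = [
    "why", "shows", "used to", "for ", "allows", "enables", "helps",
    "designed to", "serves to", "aims to", "facilitates", "provides",
    "intended for", "function", "purpose", "utilized for", "allows for",
]

def classify(description: str) -> str:
    """Return 'No description', 'Why' or 'What' for a single description."""
    text = (description or "").strip().lower()
    if not text:
        return "No description"
    return "Why" if any(marker in text for marker in WHY_MARKERS) else "What"

with open("opportunity_descriptions.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Count statuses and print the distribution to 2 decimal places
statuses = Counter(classify(row["Description"]) for row in rows)
total = sum(statuses.values())
for status, count in statuses.items():
    print(f"{status}: {count} ({count / total:.2%})")
```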
Results
The output is a CSV file plus a distribution summary. You can now take these away for your own detailed evaluation.
Here is a chart based on the distribution data, created by taking the distribution CSV, putting it into a spreadsheet, and removing the metadata types with no descriptions at all, i.e. where ‘No description’ = 100%.
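If you prefer code to a spreadsheet, a similar chart can be produced straight from the distribution data. This is a minimal sketch assuming pandas and matplotlib, with hypothetical file and column names (Metadata type, Why, What, No description); it drops the metadata types where ‘No description’ = 100% before plotting, just as described above.

```python
# Sketch: chart the distribution CSV instead of building it in a spreadsheet.
# Column names here are assumptions; rename them to match your export.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("description_distribution.csv")

# Drop metadata types with no descriptions at all ('No description' = 100%)
df = df[df["No description"] < 100]

ax = df.set_index("Metadata type")[["Why", "What", "No description"]].plot(
    kind="barh", stacked=True, figsize=(8, 5)
)
ax.set_xlabel("% of metadata items")
ax.set_title("Description quality by metadata type")
plt.tight_layout()
plt.savefig("description_quality.png")
```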
Final word
Descriptions have always been important, but now they also help AI make better decisions. It is now critical that descriptions outline why a field was created. AI cannot write the descriptions for you, but it can help you evaluate the current state of your org’s descriptions. This should help you document the critical metadata and estimate the effort required.
But you may need more than descriptions. A metadata dictionary is the heart of your org documentation. AI can make it beat a little faster.
With a Change Intelligence Platform, updating your descriptions becomes significantly easier and quicker. Plus, you can add descriptions for every metadata item. Elements.cloud is the single platform to create all your documentation. Connect with our team today to discover how Elements can help you architect and design better solutions.
Ian Gotts
Founder & CEO
Published: 24th May 2024