You may have noticed that as the Salesforce schema has continued to expand, the Apex SObject describe API layer has become more complex to use in performant and scalable ways. In this post I’m hoping to share some tips on best practices, as well as some of the more nuanced details about what we’ve been up to under the hood to improve the status quo. As always, the dev community was up to the same thing, and came up with some interesting tricks independently.
In Spring ‘20 the Apex team spent a substantial amount of effort bringing safe lazy loading of describe attributes into the language, to minimize the up-front cost of building Schema.DescribeSObjectResult instances - especially if you don’t need to read every attribute for your use case. Internally, Schema.DescribeSObjectResult instances now keep a private copy of the context they were generated in - what namespace, API version, with or without sharing, and a number of other aspects that can have an impact on what’s accessible. For the sake of consistency with non-lazy Schema.DescribeSObjectResult instances, where all the attributes were calculated in a single security context, we have to restore the same context in which the describe was generated before calculating each attribute on demand.
On that note, we did discover that unless the “Use Improved Schema Caching” critical update is enabled, this could lead to functional changes in edge cases, especially when mixing and matching API versions in the same request. So taking advantage of this in Spring ‘20 requires at least one of two things to be true:
- The “Use Improved Schema Caching” critical update is enabled in the org. This applies deferred mode as the default retroactively across all API versions.
- The Apex code generating the describe is using the new overload, passing the SObjectDescribeOptions.DEFERRED enum value to the getDescribe() call to specifically request lazy loading of properties on use.
For ISV partners specifically, where it’s challenging (at best) to ask customers to enable a critical update across the board, I’d strongly recommend making use of the new method overload in performance-sensitive code paths to ensure your customers see minimal overhead.
In other words, if you’re working on a managed package and want to ensure your customers are taking advantage of this deferred mode, you’d want your code to look something like:

```apex
Schema.DescribeSObjectResult dsr = Account.SObjectType.getDescribe(SObjectDescribeOptions.DEFERRED);
```
Deferred mode is especially useful when you aren’t going to need the getChildRelationships() data for a specific SObject type from the describe. As the schema of an organization grows in complexity, pre-calculating this data was observed to be one of the most substantial CPU time costs, since it in turn requires running a non-trivial volume of field-level getDescribe() calls to fully populate.
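To get a feel for the difference in your own org, a rough anonymous-Apex timing sketch like the one below can compare the two modes. The specific numbers will vary widely with schema complexity, and the measurement approach here (Limits.getCpuTime() deltas) is just an illustrative assumption, not an official benchmark.

```apex
// Rough timing sketch (run as anonymous Apex): compare FULL vs DEFERRED.
// CPU time deltas will vary widely with the org's schema complexity.
Integer start = Limits.getCpuTime();
Schema.DescribeSObjectResult full =
    Account.SObjectType.getDescribe(SObjectDescribeOptions.FULL);
System.debug('FULL describe took ' + (Limits.getCpuTime() - start) + ' ms CPU');

start = Limits.getCpuTime();
Schema.DescribeSObjectResult lazy =
    Account.SObjectType.getDescribe(SObjectDescribeOptions.DEFERRED);
System.debug('DEFERRED describe took ' + (Limits.getCpuTime() - start) + ' ms CPU');

// With DEFERRED, the child relationship cost is only paid here, on first use:
start = Limits.getCpuTime();
List<Schema.ChildRelationship> rels = lazy.getChildRelationships();
System.debug('getChildRelationships took ' + (Limits.getCpuTime() - start) + ' ms CPU');
```

Note that describe results are cached per-transaction, so a second FULL describe of the same type in the same request will be cheap regardless of mode.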
As a best-case example for DEFERRED:
> Actual results from a customer’s org:
> ---> takes 700 ms
> ---> takes 1 ms
> If only describeSObjects() supported DEFERRED...
> — Zach McElrath (@zachelrath), March 9, 2020
Take a look at the Apex Developer Guide’s Spring ‘20 topic on Schema.SObjectType.getDescribe(options), which has a full truth table that describes the default behavior in more depth.
When to Not be Lazy
While being lazy (especially for describe objects) is, amusingly, generally a good quality, there are situations where it may not be exactly what you want.
One particular case we discovered was Einstein-related picklist fields. These have a fairly unique and complex implementation that automatically expands and contracts the valid scope of picklist entries based on which SObject types the current user context has access to. While lazy describe objects make every effort to restore the security context fully, sometimes that’s not good enough to guarantee 100% accuracy for this type of field if the valid picklist values are interrogated in a different context. This is very much a case where you’d want to either ensure the same API version across all the code making use of the same describe instance, or else use the FULL describe mode to avoid any potential quirks.
To force the FULL mode, your code would end up looking like:

```apex
Schema.DescribeSObjectResult dsr = Account.SObjectType.getDescribe(SObjectDescribeOptions.FULL);
```
Consider the Metadata Catalog
It is worth calling out that describes are no longer the only way to inspect the schema, and in many cases may not be the optimal tool. If you haven’t ever made use of the metadata catalog family of standard objects, they’re exceptionally well suited for getting specific bits of metadata about fields or objects without needing to search through the whole global describe for a match. For describe-like use cases, you’ll want to pay special attention to EntityParticle. Despite their docs living in the Tooling API guide, the same SObjects and fields are available in the data API, which means you can query them in Apex without needing anything special like callouts.
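As a sketch of what that looks like in practice, here’s a plain-SOQL-in-Apex lookup of a single field’s metadata via EntityParticle. Field names follow the Tooling API docs; the exact filter support can vary by API version, so treat this as illustrative.

```apex
// Sketch: fetch one field's metadata directly, instead of building the
// entire Account field describe map just to inspect a single field.
List<EntityParticle> particles = [
    SELECT QualifiedApiName, Label, DataType, IsNillable
    FROM EntityParticle
    WHERE EntityDefinition.QualifiedApiName = 'Account'
      AND QualifiedApiName = 'Industry'
];
for (EntityParticle p : particles) {
    System.debug(p.QualifiedApiName + ' (' + p.DataType + '): ' + p.Label);
}
```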
For bonus fun, there are aspects of entities exposed via the Metadata Catalog that aren’t in the Apex describe API, like ExternalSharingModel, which is the best way to detect which sharing model any standard or custom object is using.
In other words, a query like

```sql
SELECT NamespacePrefix, DeveloperName, ExternalSharingModel, InternalSharingModel FROM EntityDefinition
```

gives you back a wealth of info that isn’t available via the describe API at all. And if you further refine it, say with a WHERE NamespacePrefix = 'mine' clause, you can get back details about the org schema much more efficiently than looping through Schema.getGlobalDescribe() results, calling getDescribe() on each, and then checking the namespace prefix. Any time you’re looking for a needle in the haystack of org schema, you should consider Metadata Catalog queries before describes.
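As a sketch, the equivalent Apex for that namespace-filtered query might look like the following, with 'mine' standing in for your own namespace prefix:

```apex
// Sketch: one query replaces looping over Schema.getGlobalDescribe().
// 'mine' is a placeholder namespace prefix.
List<EntityDefinition> defs = [
    SELECT QualifiedApiName, ExternalSharingModel, InternalSharingModel
    FROM EntityDefinition
    WHERE NamespacePrefix = 'mine'
    LIMIT 200 // EntityDefinition doesn't support queryMore, so page explicitly
];
for (EntityDefinition ed : defs) {
    System.debug(ed.QualifiedApiName + ' internal sharing: ' + ed.InternalSharingModel);
}
```

Be aware that EntityDefinition has query restrictions (no queryMore, limited filter/sort support), so explicit LIMIT/OFFSET paging is the safe pattern for larger schemas.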
Take a close look at the Tooling API documentation, since there are a number of hidden gems, like the Publisher field, which can be used with the “isSalesforce” value to distinguish standard from custom objects, i.e.
```shell
sfdx force:data:soql:query -q "SELECT NamespacePrefix, QualifiedApiName, ExternalSharingModel, InternalSharingModel, Publisher.isSalesforce FROM EntityDefinition WHERE DeveloperName = 'Case'"
NAMESPACEPREFIX  QUALIFIEDAPINAME  EXTERNALSHARINGMODEL  INTERNALSHARINGMODEL  PUBLISHER.ISSALESFORCE
───────────────  ────────────────  ────────────────────  ────────────────────  ──────────────────────
null             Case              Private               ReadWriteTransfer     true
null             Case__c           Private               ReadWrite
Total number of records retrieved: 2
```
Do note that in this example DeveloperName is not the full API name of the SObject type, which is why we get back both the custom and standard “Case” objects when filtering on DeveloperName, and why we select the QualifiedApiName field, which does distinguish between them via the __c suffix. Be sure to read the documentation about metadata catalog types carefully before you rely on specific behavior!
If you have any feedback, please let us know either on Twitter or in the comments. I’m especially interested in feedback about the viability of the metadata catalog - let us know about performance, functional, or any other kinds of concerns you have upon closer inspection.