We recorded a podcast about Explain Query in which we talked about the usefulness of this tool. Here is a written breakdown of that podcast for this specific ACS tool for Adobe Experience Manager. Remember, tools like this should only be used in your non-production lanes.
What is it?
From the ACS GitHub page about Explain Query: “Explain Query is a tool that explains how Oak is executing a query. For any given query, Oak attempts to figure out the best way to execute based on the repositories’ defined Oak indexes (under /oak:index). Depending on the query, different indexes may be chosen by Oak. Understanding how Oak is executing a query is the first step to optimizing the query.”
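Oak also exposes this capability at the query level: prefixing a JCR-SQL2 statement with `explain` asks Oak for the execution plan instead of the results, which is essentially what the tool surfaces for you. A minimal sketch (the node type and path below are illustrative assumptions, not from the original article):

```sql
/* Ordinary JCR-SQL2 query: returns matching nodes */
SELECT * FROM [cq:Page] AS p WHERE ISDESCENDANTNODE(p, '/content/mysite')

/* Prefixed with "explain": returns the plan Oak would use,
   including which index under /oak:index it selected */
explain SELECT * FROM [cq:Page] AS p WHERE ISDESCENDANTNODE(p, '/content/mysite')
```

If the plan shows a traversal (no index chosen), that query is a candidate for a new or expanded index.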
How it works
Perhaps the biggest change in the Adobe Experience Manager 6.x line is the transition to Jackrabbit Oak, which brings with it a whole new way of storing data. As a result of the new storage engine, you will likely find yourself occasionally revisiting how your content is indexed so that JCR queries can bring back the correct results in a reasonable amount of time.
Traditional relational databases have long included a utility to help you determine where you need to improve or expand your indexes, but JCR users have largely been left to their own devices. Whether you are migrating to the 6.x line, implementing a new feature in an existing implementation, or simply watching performance tank because some part of the content tree grew in an unexpected way, you now have a tool you can use to determine the correct indexing for your content.
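Once Explain Query shows that a query is traversing instead of using an index, the usual fix is to add a definition under /oak:index. A minimal sketch of a property index, expressed as a CRX `.content.xml` fragment (the index name and property name here are hypothetical assumptions for illustration):

```xml
<!-- Hypothetical file: /oak:index/myProperty/.content.xml -->
<!-- A simple Oak property index on the "myProperty" property -->
<jcr:root xmlns:jcr="http://www.jcp.org/jcr/1.0"
    jcr:primaryType="oak:QueryIndexDefinition"
    type="property"
    propertyNames="{Name}[myProperty]"
    reindex="{Boolean}true"/>
```

Setting `reindex` to `true` tells Oak to build the index contents on the next async/commit cycle; after that, queries constrained on `myProperty` can be served from the index rather than a repository traversal.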
Example Application
Axis41 recently worked with a customer whose previous implementer had created an AEM 5.x implementation driven by multiple levels of indirect queries; a given component would perform a query for some node properties, use those properties to build a string that was fed again into the query engine, which would return another set of node properties, and so on, ad nauseam. When we deployed this same code to Adobe Experience Manager 6.1, some components took several minutes each to render, bringing the total render time for the site's front page to 18-20 minutes. By running those queries through the Explain Query tool, we were able to define a set of indexes that brought much more reasonable response times (while, on a parallel work stream, we rewrote the implementation to be more directly manageable).