
Designing Queries in GraphQL

Alex Weinle · GraphQL Developer and AWS Architect · 2 min read

So you've managed to send off your first few esignature requests using our Quick Start Guide, and you can see there's a lot more in the Legalesign API you'd like to use. But how do you get at it?

Your first stop should be to read this quick summary of how to use the GraphiQL Explorer here.

But that's not the end of the story. As well as being a huge help for application and plugin developers, GraphQL can encourage an I-want-it-all attitude. The temptation is to fetch everything in a single request, but a few moments' thought will surface some common problems:

  • Is the data likely to become stale?
  • Am I crippling my application's performance with an unnecessarily long start-up time?

If you're using technology similar to TanStack Query (and to some degree TanStack Router), you'll understand the benefit of breaking your queries up into usable chunks that can be cached, optimistically updated or invalidated. Here are a few golden guidelines that we came up with through painful experience.

Avoid Query Depth on Lists

Avoid list queries with a tree depth (the depth of child objects below the main list item) greater than 2. On single-item queries, the child depth can be far greater without causing significant trouble. Sometimes it is worth calculating aggregate metrics on the parent to avoid another resolver being called. So if the important thing to know about a Batch is the number of documents in it (which could be thousands), but not the specifics of each document, you can use the documentCount field, which is included for just that reason.
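
To make that concrete, here is a rough sketch of the shallow version next to the deep one. Only documentCount comes from the point above; the other field names (batches, documents, recipients and so on) are illustrative placeholders, so check the GraphiQL Explorer for the real schema.

```ts
// A shallow list query: the aggregate lives on the parent, so the
// documents themselves are never fetched. documentCount is the field
// mentioned above; the other names are placeholders for this sketch.
const BATCH_LIST = /* GraphQL */ `
  query BatchList {
    batches {
      id
      name
      documentCount
    }
  }
`;

// The shape to avoid on a list: every batch fans out a sub-list of
// documents, and every document fans out another sub-list below that,
// so resolver calls multiply with the size of the batch list.
const BATCH_LIST_TOO_DEEP = /* GraphQL */ `
  query BatchListDeep {
    batches {
      id
      documents {       # a sub-list under every batch
        id
        recipients {    # a level deeper still: the depth the guideline warns against
          email
        }
      }
    }
  }
`;
```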

Isolate Data Types

Keep object types in separate queries unless they are slow-changing, global data (Group retention settings, for example). It doesn't become obvious why this is helpful until you need to mark data as stale or update the cache (then you'll thank me).
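
As a rough illustration with TanStack Query, the sketch below keeps slow-changing group settings and fast-changing document data in separate queries with their own keys, so one can be invalidated without refetching the other. The endpoint, query keys and field names are assumptions for the example, not the real client setup.

```ts
import { useQuery, useQueryClient } from "@tanstack/react-query";

// Hypothetical helper: POST a GraphQL document to the API.
// The endpoint and auth handling are placeholders for this sketch.
async function gql<T>(query: string): Promise<T> {
  const res = await fetch("/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  const { data, errors } = await res.json();
  if (errors) throw new Error(errors[0].message);
  return data;
}

// Slow-changing, global data: cache it for a long time and rarely touch it.
function useGroupSettings() {
  return useQuery({
    queryKey: ["groupSettings"],
    queryFn: () => gql(`{ group { retentionSettings } }`), // illustrative fields
    staleTime: 60 * 60 * 1000, // an hour is plenty for settings
  });
}

// Fast-changing data lives in its own query with its own key...
function useDocuments() {
  return useQuery({
    queryKey: ["documents"],
    queryFn: () => gql(`{ documents { id status } }`), // illustrative fields
  });
}

// ...so after sending a new signature request you can invalidate just
// the documents, without also refetching the group settings.
function useRefreshDocuments() {
  const queryClient = useQueryClient();
  return () => queryClient.invalidateQueries({ queryKey: ["documents"] });
}
```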

Test Query Changes

Sometimes it isn't obvious that a quick alteration to a query comes with a penalty. We found several cases of previously speedy queries that evolved into giant slugs through generations of alterations. Set a conceptual limit on query time for your integration or application, and check against it whenever a query changes.
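
One way to hold that line is a timing check in an integration test. The sketch below is illustrative only: the endpoint, the query and the 1500 ms budget are placeholder values to adapt to your own application.

```ts
// Time a GraphQL request so a test can fail when a query grows slow.
async function timeQuery(query: string): Promise<number> {
  const start = performance.now();
  const res = await fetch("/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  await res.json();
  return performance.now() - start;
}

// An arbitrary budget for this sketch; pick one that suits your integration.
const QUERY_BUDGET_MS = 1500;

async function main() {
  const elapsed = await timeQuery(`{ documents { id status } }`);
  if (elapsed > QUERY_BUDGET_MS) {
    throw new Error(
      `Query took ${elapsed.toFixed(0)} ms, budget is ${QUERY_BUDGET_MS} ms`
    );
  }
  console.log(`Query OK in ${elapsed.toFixed(0)} ms`);
}

main();
```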