We will begin by creating a stand-alone proxy service. This service wraps Shopify's GraphQL API in REST endpoints that a Syllable Agent Tool can call. If you would like to use the built-in Syllable Shopify Tools, you can skip this step.

Why not link the Syllable Agent Tools directly to the Shopify APIs? Retrieving the data this agent needs requires Shopify's GraphQL API, but Syllable Tools call REST HTTP endpoints, so Shopify's GraphQL API cannot be wired to a Syllable Tool directly. Additionally, several proxy-service endpoints require extra business logic to prepare the GraphQL queries and parse the responses (iterating through lists, for example). This logic is easy to implement in code but difficult for an LLM to perform consistently and reliably, so keeping it in a separate service produces the best overall results.
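To make that concrete, Shopify's GraphQL responses nest each record inside edges/node wrappers that the proxy must walk before handing anything to the agent. Here is a minimal sketch of that kind of parsing; the field names are illustrative and not taken from the tutorial repository:

```python
def flatten_orders(graphql_data: dict) -> list[dict]:
    """Collapse Shopify's nested edges/node wrappers into a flat list
    that is easy for the agent's LLM to read. Field names are illustrative."""
    orders = []
    for edge in graphql_data.get("orders", {}).get("edges", []):
        node = edge.get("node", {})
        orders.append({
            "order_number": node.get("name"),
            "fulfillment_status": node.get("displayFulfillmentStatus"),
            "items": [
                item["node"].get("title")
                for item in node.get("lineItems", {}).get("edges", [])
            ],
        })
    return orders
```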

How and where you build your proxy service is up to you. For this tutorial, we will build a simple Python Flask application hosted on AWS App Runner. To get started quickly, clone this git repository for the completed proxy-service code, then follow the AWS Getting Started Guide to set up the repo as a service on App Runner. Again, you can host this service however you want (or build a new one entirely).
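For reference, a stripped-down version of such a Flask app might look like the sketch below. This is only a skeleton under assumptions about the route name and port; the cloned repository contains the complete service:

```python
# app.py -- minimal sketch; the cloned repository contains the full service.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health")
def health():
    # Simple liveness endpoint used to confirm the deployment is reachable.
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    # App Runner forwards traffic to the port the service listens on;
    # confirm this matches the port configured for your App Runner service.
    app.run(host="0.0.0.0", port=8080)
```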

The provided Shopify proxy service contains endpoints for the following:

  • GET /shopify/order-by-number
  • GET /shopify/order-by-confirmation-number-and-email
  • GET /shopify/products
  • GET /shopify/get-product-url

Each of these endpoints (except /get-product-url) performs a GraphQL query against the Shopify API. All of them return a concise response that is simple for the Syllable Agent's LLM to interpret.
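As an illustration, here is a sketch of how /shopify/order-by-number might be implemented. The query parameter name, environment variable names, API version, and returned fields are assumptions; the actual repository code may differ:

```python
import os

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

# Assumed environment variables; the tutorial repository may name these differently.
SHOPIFY_STORE = os.environ["SHOPIFY_STORE"]          # e.g. "my-store"
SHOPIFY_TOKEN = os.environ["SHOPIFY_ACCESS_TOKEN"]   # Admin API access token
API_VERSION = "2024-07"

GRAPHQL_URL = f"https://{SHOPIFY_STORE}.myshopify.com/admin/api/{API_VERSION}/graphql.json"

# GraphQL query for a single order; the selected fields are illustrative.
ORDER_QUERY = """
query($search: String!) {
  orders(first: 1, query: $search) {
    edges {
      node {
        name
        createdAt
        displayFulfillmentStatus
        lineItems(first: 10) {
          edges { node { title quantity } }
        }
      }
    }
  }
}
"""

@app.route("/shopify/order-by-number")
def order_by_number():
    # The query parameter name is an assumption for this sketch.
    order_number = request.args.get("order_number", "")
    # Prepare and send the GraphQL request on behalf of the agent.
    resp = requests.post(
        GRAPHQL_URL,
        json={"query": ORDER_QUERY, "variables": {"search": f"name:{order_number}"}},
        headers={"X-Shopify-Access-Token": SHOPIFY_TOKEN},
        timeout=10,
    )
    resp.raise_for_status()
    edges = resp.json()["data"]["orders"]["edges"]
    if not edges:
        return jsonify({"found": False}), 404
    node = edges[0]["node"]
    # Return a concise, flat summary rather than the raw GraphQL payload.
    return jsonify({
        "found": True,
        "order_number": node["name"],
        "created_at": node["createdAt"],
        "fulfillment_status": node["displayFulfillmentStatus"],
        "items": [
            {"title": e["node"]["title"], "quantity": e["node"]["quantity"]}
            for e in node["lineItems"]["edges"]
        ],
    })
```

A Syllable Tool can then retrieve an order with a plain GET request such as /shopify/order-by-number?order_number=1001 and receive only the handful of fields shown above, rather than the raw GraphQL response.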

Once your service is up and running, perform a health check using Postman or curl.
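For example, assuming the service exposes a /health route like the sketch above (the exact path in the repository may differ):

```bash
# Replace the hostname with your App Runner service URL.
curl https://<your-apprunner-service-url>/health
# Expected response: {"status": "ok"}
```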

Next, click “Create a prompt” to continue the tutorial.