
- Prompt name: The name of the prompt is used to reference it elsewhere in Console, so you should pick something that’s easily identifiable. Enter “Weather prompt” here.
- Prompt description (optional): This description is displayed for extra context on the prompt list screen you just saw. For this prompt you can enter “Prompt for a weather agent” here.
- Model provider: This is the provider of the LLM that the prompt, and therefore any agent built on it, will use. Select “OpenAI” here.
- Model and Version: This is the specific model that the prompt will use. Select “GPT-4.1” and the April version (or the latest available model and version) here.
- Seed (optional): This is an integer value that controls reproducibility: reusing the same seed with the same inputs should produce the same (or very similar) output. You can skip this step.
- Temperature (optional): This is a decimal value between 0 and 2 that controls randomness: higher values produce more varied output, lower values more focused output. You can skip this step. (The sketch below shows what seed and temperature correspond to in a direct API call.)
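
  Seed and temperature map onto standard chat-completion parameters. Purely as a point of reference (this is not how Console itself calls the model), here is a minimal sketch of a direct request using the official `openai` Python SDK; it assumes `OPENAI_API_KEY` is set in your environment, and the model name, messages, and values are placeholders:

  ```python
  # Illustration of what seed and temperature control, using the
  # official `openai` Python SDK (pip install openai).
  from openai import OpenAI

  client = OpenAI()  # reads OPENAI_API_KEY from the environment

  response = client.chat.completions.create(
      model="gpt-4.1",
      messages=[
          {"role": "system", "content": "You are a weather agent."},
          {"role": "user", "content": "What should I wear in Seattle today?"},
      ],
      seed=42,          # same seed + same inputs -> best-effort reproducible output
      temperature=0.2,  # 0 to 2; lower is more focused, higher is more varied
  )

  print(response.choices[0].message.content)
  ```
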
- Tools: Here we give the prompt access to tools. Select “hangup” for now, since it’s a standard tool that every prompt should be able to use; we’ll come back to this field later, after we’ve created our custom tools.
- Content: This is the actual text of the prompt that will be sent to the LLM at the beginning of the conversation. Enter the following here: