Amazon Bedrock Prompt Management
Accelerate prompt engineering and easily share prompts for use in generative AI applications
Amazon Bedrock Prompt Management simplifies the creation, evaluation, versioning, and sharing of prompts to help developers and prompt engineers get the best responses from foundation models (FMs) for their use cases.
Create and iterate on your prompts faster
To create a prompt, you can use the Prompt Builder to experiment with multiple FMs, model configurations, system instructions, user/assistant messages, and tool configurations. As you iterate to find the best fit, you can create up to three variations of the prompt for side-by-side comparison. The variations may include different FMs, model parameters, and prompt messages. The comparison window allows you to inspect the outputs side-by-side to choose the best variant.
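The same variant structure can also be built programmatically. Below is a minimal sketch of a create-prompt request with two variants, assuming the bedrock-agent API shapes as I understand them; the prompt name, model IDs, parameter values, and the `{{topic}}` variable are illustrative placeholders, and the actual call is shown commented out so the sketch runs without AWS credentials.

```python
# Sketch: a prompt with two variants for side-by-side comparison.
# Model IDs, names, and parameter values are example placeholders.

def build_variant(name, model_id, temperature, template_text):
    """Build one TEXT prompt variant with its own model and inference settings."""
    return {
        "name": name,
        "modelId": model_id,
        "templateType": "TEXT",
        "templateConfiguration": {
            "text": {
                "text": template_text,
                "inputVariables": [{"name": "topic"}],
            }
        },
        "inferenceConfiguration": {
            "text": {"temperature": temperature, "maxTokens": 512}
        },
    }

request = {
    "name": "product-summary",            # placeholder prompt name
    "defaultVariant": "concise",
    "variants": [
        build_variant("concise", "anthropic.claude-3-haiku-20240307-v1:0", 0.2,
                      "Summarize {{topic}} in two sentences."),
        build_variant("detailed", "anthropic.claude-3-sonnet-20240229-v1:0", 0.7,
                      "Write a detailed overview of {{topic}}."),
    ],
}

# With AWS credentials configured, this request would create the prompt:
# client = boto3.client("bedrock-agent")
# response = client.create_prompt(**request)
```

Each variant carries its own model, template, and inference configuration, which is what makes the side-by-side comparison in the console meaningful.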
Test and share prompts seamlessly
Prompts are executed automatically when you test them, so you don't have to write additional code to deploy a prompt for testing. You can instantly see the output and focus on refining the prompt to meet your business needs. To share a prompt for use in downstream applications, simply create a version and make an API call to retrieve or execute it.
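The version-and-retrieve flow can be sketched as two small API calls. The prompt identifier below is a placeholder, and the argument shapes follow the bedrock-agent service as I understand it; the network calls are commented out so the sketch runs without credentials.

```python
# Sketch: versioning a prompt and retrieving it for a downstream application.

PROMPT_ID = "PROMPT1234"  # placeholder prompt identifier

def version_request(prompt_id, description):
    """Arguments for a create_prompt_version call (snapshots the DRAFT)."""
    return {"promptIdentifier": prompt_id, "description": description}

def retrieve_request(prompt_id, version):
    """Arguments for a get_prompt call, pinned to a specific version."""
    return {"promptIdentifier": prompt_id, "promptVersion": version}

create_args = version_request(PROMPT_ID, "First stable version")
get_args = retrieve_request(PROMPT_ID, "1")

# With AWS credentials configured:
# client = boto3.client("bedrock-agent")
# version = client.create_prompt_version(**create_args)
# prompt = client.get_prompt(**get_args)
```

Pinning downstream applications to a numbered version keeps them stable while you continue to iterate on the draft.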
Collaborate on prompt creation
Prompt Management is available in Amazon Bedrock Studio, an SSO-enabled web interface that provides the easiest way for developers across an organization to experiment and collaborate on prompt engineering. You can work directly with your teammates to create, evaluate, and share the right prompts for your use case.
Run prompts easily using the Bedrock API
The Amazon Bedrock Runtime APIs Converse and InvokeModel support executing a prompt by its identifier. Simply provide the prompt identifier as a parameter to a Converse or InvokeModel API call. Bedrock retrieves the necessary prompt information, including the foundation model, model configuration, system instructions, user/assistant messages, and tool configuration, executes the prompt, and returns the model response.
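A minimal sketch of this call, assuming the Converse API accepts a prompt ARN in place of a model ID and fills template variables via `promptVariables`: the ARN, account ID, and variable name below are placeholders, and the runtime call is commented out so the sketch runs without credentials.

```python
# Sketch: executing a stored prompt through the Converse runtime API by
# passing the prompt's ARN (optionally suffixed with a version) as modelId.

PROMPT_ARN = "arn:aws:bedrock:us-east-1:111122223333:prompt/PROMPT1234:1"  # placeholder

converse_args = {
    "modelId": PROMPT_ARN,  # a prompt ARN instead of a model ID
    # Values for the template's input variables, supplied at run time:
    "promptVariables": {"topic": {"text": "serverless architectures"}},
}

# With AWS credentials configured:
# runtime = boto3.client("bedrock-runtime")
# response = runtime.converse(**converse_args)
# print(response["output"]["message"]["content"][0]["text"])
```

Because the model, messages, and tool configuration are resolved from the stored prompt, the caller only supplies the identifier and the variable values.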
Store enterprise metadata with your prompts
Prompt Management on Bedrock also lets you store custom metadata, such as author, team, and department, with your prompts to meet your enterprise prompt management needs.
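One way to attach such metadata is as key/value entries on a prompt variant. The field names below follow the bedrock-agent create-prompt shape as I understand it, so verify them against the current SDK documentation; the author, team, and department values are placeholders, and the call is commented out so the sketch runs without credentials.

```python
# Sketch: attaching enterprise metadata (author, team, department) to a prompt.

def metadata_entries(**fields):
    """Convert keyword fields into a list of key/value metadata entries."""
    return [{"key": k, "value": v} for k, v in fields.items()]

variant = {
    "name": "default",
    "templateType": "TEXT",
    "templateConfiguration": {"text": {"text": "Summarize {{doc}}."}},
    "metadata": metadata_entries(
        author="jane.doe",          # placeholder values
        team="search-platform",
        department="engineering",
    ),
}

# With AWS credentials configured:
# client = boto3.client("bedrock-agent")
# client.create_prompt(name="doc-summary", variants=[variant])
```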
Create prompts for Bedrock Agents
While creating a prompt, you can select a Bedrock Agent in your AWS account as the target generative AI resource to run the prompt. This lets you get more value from your agents by storing user prompts that elicit the best responses from them.
Getting started
To get started, navigate to the Amazon Bedrock console or Amazon Bedrock Studio and create a new prompt, or start with an existing prompt in Prompt Management. Using the prompt builder, select a model to invoke FM inference, set model parameters, write the prompt message, and quickly evaluate the output.