Integrations
Composable Prompts integrates with leading GenAI model and inference providers. Users can assemble multiple models from different providers into a synthetic LLM environment, either to load-balance requests across models or to run them in parallel (multi-head execution) with an LLM selecting the best response.
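The idea above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the product's actual API: the class name, method names, and the stub "models" are all assumptions made for the example.

```python
import random
from typing import Callable, Dict, List

# A "model" here is just a function from prompt to completion,
# standing in for a real provider client (OpenAI, Bedrock, etc.).
Model = Callable[[str], str]

class SyntheticLLMEnvironment:
    """Fronts several provider models behind a single interface (sketch)."""

    def __init__(self, models: Dict[str, Model]):
        self.models = models

    def complete_balanced(self, prompt: str) -> str:
        # Load balancing: route each request to one randomly chosen model.
        name = random.choice(list(self.models))
        return self.models[name](prompt)

    def complete_multi_head(self, prompt: str,
                            select: Callable[[List[str]], str]) -> str:
        # Multi-head execution: fan the prompt out to every model, then
        # let a selector (in practice, often an LLM judge) pick one answer.
        answers = [model(prompt) for model in self.models.values()]
        return select(answers)

# Stub backends standing in for real provider integrations.
env = SyntheticLLMEnvironment({
    "openai": lambda p: f"[openai] {p}",
    "bedrock": lambda p: f"[bedrock] {p}",
})

one = env.complete_balanced("Hello")
# A trivial selector (longest answer) stands in for LLM-mediated selection.
best = env.complete_multi_head("Hello", select=lambda xs: max(xs, key=len))
```

In a real deployment the selector would itself be an LLM call that scores or ranks the candidate answers; the load-balancing policy could likewise be weighted by latency or cost rather than uniform random choice.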
The OpenAI integration provides access to models such as GPT-3.5 and GPT-4.
The Bedrock integration supports Anthropic's Claude, Cohere, Meta's Llama 2, AI21's models, and Amazon Titan.
Google's Vertex AI is a managed machine learning (ML) platform that provides access to Google's foundation models for AI-powered applications.
Groq provides extremely fast inference for computationally intensive applications.
Run and fine-tune open-source models and deploy custom models at scale.
Together AI offers one of the fastest inference stacks available for open-source models.
Mistral AI is a fully integrated model provider that gives access to models such as Mistral 7B and Mixtral 8x7B.
The Hugging Face integration supports Inference Endpoints, making it easy to deploy Transformers and Diffusion models.