Setup
Set up the CopilotKit Backend Endpoint:
Currently, all you need is an endpoint that serves as a proxy to an OpenAI-compatible LLM endpoint. See below for examples using NextJS edge functions, Portkey Gateway, and Flask.

Manually specified endpoint: nodeJS (typescript)
Make sure to install the `ai` and `openai` packages:
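For example, with npm (substitute your package manager of choice):

```bash
npm install ai openai
```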
`/api/copilotkit/chat/route.ts` (NextJS)
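A minimal sketch of this route handler, assuming the NextJS App Router, the Vercel `ai` package's `OpenAIStream`/`StreamingTextResponse` helpers, and the `openai` v4 SDK. The model name is a placeholder; adapt it and the error handling to your setup:

```typescript
// /api/copilotkit/chat/route.ts
// Proxy sketch: forwards the request body to OpenAI and streams the reply back.
import OpenAI from "openai";
import { OpenAIStream, StreamingTextResponse } from "ai";

// Run on the edge runtime for low-latency streaming.
export const runtime = "edge";

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

export async function POST(req: Request): Promise<Response> {
  // CopilotKit forwards chat-completion parameters (messages, functions, etc.) in the body.
  const forwardedProps = await req.json();

  const response = await openai.chat.completions.create({
    model: "gpt-4", // placeholder: any chat model you have access to
    ...forwardedProps,
    stream: true,
  });

  // Convert the OpenAI stream into a streaming HTTP response.
  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}
```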
Portkey endpoint for connecting to 20+ LLMs: nodeJS (typescript)
Portkey serves as a unified gateway to access over 20 LLMs, such as OpenAI, Azure OpenAI, Anthropic, Google, and more. It also enhances your app's reliability with features like load balancing across models or keys, automated fallbacks in case of failures, and more. Make sure to install the `ai` and `portkey-ai` packages:
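For example, with npm:

```bash
npm install ai portkey-ai
```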
`/api/copilotkit/chat/route.ts` (NextJS)
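A sketch of the same proxy route going through Portkey, assuming the `portkey-ai` Node SDK and a Portkey virtual key for your provider; the environment variable names, model, and the cast into `OpenAIStream` (which relies on Portkey's chunks being OpenAI-compatible) are assumptions to adapt:

```typescript
// /api/copilotkit/chat/route.ts
// Proxy sketch that routes chat completions through the Portkey gateway.
import Portkey from "portkey-ai";
import { OpenAIStream, StreamingTextResponse } from "ai";

export const runtime = "edge";

const portkey = new Portkey({
  apiKey: process.env.PORTKEY_API_KEY,
  // A Portkey "virtual key" identifies the underlying provider/key to route to.
  virtualKey: process.env.PORTKEY_OPENAI_VIRTUAL_KEY,
});

export async function POST(req: Request): Promise<Response> {
  const forwardedProps = await req.json();

  // Portkey's SDK mirrors the OpenAI chat-completions interface.
  const response = await portkey.chat.completions.create({
    model: "gpt-4", // placeholder model
    ...forwardedProps,
    stream: true,
  });

  // The streamed chunks are OpenAI-compatible, so the same helpers apply.
  const stream = OpenAIStream(response as any);
  return new StreamingTextResponse(stream);
}
```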
Manually specified endpoint: flask (python)
Make sure to install the `Flask` and `openai` packages:
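For example, with pip:

```bash
pip install Flask openai
```

A minimal sketch of the equivalent Flask proxy, assuming the `openai` v1 Python SDK and server-sent-event streaming back to the client; the route path and model are placeholders:

```python
# app.py
# Proxy sketch: forwards the request body to OpenAI and streams chunks back as SSE.
import os

from flask import Flask, Response, request
from openai import OpenAI

app = Flask(__name__)
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])


@app.route("/api/copilotkit/chat", methods=["POST"])
def chat():
    # CopilotKit forwards chat-completion parameters (messages, functions, etc.) in the body.
    forwarded_props = request.get_json() or {}

    # Strip keys we set explicitly to avoid duplicate-keyword errors.
    forwarded_props.pop("stream", None)
    model = forwarded_props.pop("model", "gpt-4")  # placeholder model

    stream = client.chat.completions.create(
        model=model,
        stream=True,
        **forwarded_props,
    )

    def generate():
        # Re-emit each chunk in OpenAI's server-sent-event format.
        for chunk in stream:
            yield f"data: {chunk.model_dump_json()}\n\n"
        yield "data: [DONE]\n\n"

    return Response(generate(), mimetype="text/event-stream")
```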