Inference Client
The Mecha Agent Inference Client is a package that provides a chat UI you can embed in your web apps so visitors can interact and chat with your agents.
In this quick tutorial, I’ll walk you through integrating the Inference Client into your existing web app so your visitors can engage with your AI agent.
Here is how the Inference Client looks in dark and light themes 👇
Installation
We currently support only the Next.js and SvelteKit frameworks, so your web app must be built with one of them to add the Inference Client.
- CLI setup
- Manual setup
With our CLI tool, you can add the Mecha Agent Inference Client to your project with a single command 👇
- Next.js
- Sveltekit
npx @mecha_agent_inference_client/cli nextjs -ts
npx @mecha_agent_inference_client/cli sveltekit -ts
Remove the -ts option if your project uses plain JavaScript.
You can also use pnpm dlx, yarn dlx, or bunx to run the @mecha_agent_inference_client/cli CLI tool.
After running the CLI tool, you should see that a new package has been added to your dependencies and a new API route handler file has been created.
To set up the Mecha Agent Inference Client manually, first install the right package for your framework.
- Next.js
- Sveltekit
npm install @mecha_agent_inference_client/nextjs
npm install @mecha_agent_inference_client/sveltekit
Or use pnpm install, yarn add, bun add, or deno add npm:..., depending on which package manager you use.
Then create the API route handler file for the backend side of the Mecha Agent Inference Client in your project:
- Next.js
- Sveltekit
// app/api/mecha-agent/route.ts (or route.js)
import { handler } from "@mecha_agent_inference_client/nextjs";

// Next.js route files may only export HTTP method handlers,
// so the handler itself is kept un-exported.
const routeHandler = handler({
  agentId: process.env.AGENT_ID as string,
  apiKey: process.env.MECHA_AGENT_API_KEY as string,
});

export { routeHandler as GET, routeHandler as POST };
The API route handler file must be placed at app/api/mecha-agent/route.(ts|js) so it is accessible via HTTP requests from the frontend side of the inference client, like this: 👇
fetch("/api/mecha-agent")
// routes/api/mecha-agent/+server.(ts or .js)
import { MECHA_AGENT_API_KEY, AGENT_ID } from "$env/static/private";
import { handler } from "@mecha_agent_inference_client/sveltekit/server";
export const fallback = handler({
agentId: AGENT_ID,
apiKey: MECHA_AGENT_API_KEY,
})
The API route handler file must be placed at routes/api/mecha-agent/+server.(ts|js) so it is accessible via HTTP requests from the frontend side of the inference client, like this: 👇
fetch("/api/mecha-agent")
Environment variables
Now you need to obtain and set the MECHA_AGENT_API_KEY and AGENT_ID environment variables so that the Mecha Agent Inference Client works as expected.
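The next two sections walk through obtaining each value. Your .env file will end up with both variables set (placeholder values shown here; replace them with your real key and ID):

```env
# .env — placeholders, not real credentials
MECHA_AGENT_API_KEY=your-api-key-here
AGENT_ID=your-agent-id-here
```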
Get an API Key
To get an API Key, go to your account in the Mecha Agent App and navigate to the /api-keys page. Create an API Key with inference and read permissions, then click the "Copy" icon beside the key in the API keys table to copy it. Finally, put the key in your .env file as the MECHA_AGENT_API_KEY environment variable:
MECHA_AGENT_API_KEY=the API Key here
The API Key must have the inference and read permissions; otherwise, the Mecha Agent Inference Client will not work as expected.
Get your agent's ID
To get your agent's ID, go to the /agents page and click on your agent to open its full page. Click the "Copy" icon beside your agent's name to copy its ID, then paste it in your .env file as the AGENT_ID environment variable:
AGENT_ID=agent's id here
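If you want to fail fast when either variable is missing, a small startup check can help. Note that readInferenceClientEnv below is a hypothetical helper (not part of the package), sketched in TypeScript:

```typescript
// Hypothetical helper (not part of @mecha_agent_inference_client):
// throw at startup when a required variable is missing, instead of
// getting opaque errors on the first chat request.
export function readInferenceClientEnv(
  env: Record<string, string | undefined> = process.env
): { apiKey: string; agentId: string } {
  const apiKey = env.MECHA_AGENT_API_KEY;
  const agentId = env.AGENT_ID;
  if (!apiKey || !agentId) {
    throw new Error(
      "Missing MECHA_AGENT_API_KEY or AGENT_ID - add them to your .env file"
    );
  }
  return { apiKey, agentId };
}
```

You could then pass the returned values to handler({ ... }) instead of casting each variable with as string.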
Usage
Now you are ready to use the Inference Client in your app! Just import the MechaAgentChat component and place it wherever you want in your project's UI.
- Next.js
- Sveltekit
import { MechaAgentChat } from "@mecha_agent_inference_client/nextjs";
<MechaAgentChat /> // Put this somewhere
import { MechaAgentChat } from "@mecha_agent_inference_client/sveltekit";
<MechaAgentChat /> // Put this somewhere
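As a rough sketch, here is where the pieces from this tutorial land in a Next.js App Router project (file names other than the route handler path are illustrative; the SvelteKit layout is analogous, with routes/api/mecha-agent/+server.ts for the handler):

```
app/
├── api/
│   └── mecha-agent/
│       └── route.ts      ← backend route handler (created earlier)
└── page.tsx              ← any page that renders <MechaAgentChat />
```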