Close up of a yellow tulip in bloom.

Using LangChain With NextJS 13

The latest version of NextJS has a new way of handling routing: the app directory and React Server Components. I’m still learning how it works, and last week LangChain added support for Vercel, Cloudflare, and other runtimes. So I figured I’d play with both at once.

Last week, I wrote a little about why LangChain is so interesting to me. It’s a tool for composing AI-powered applications with LLMs, such as OpenAI’s GPT models or open-source models from Hugging Face.

Getting Started

First, create a new NextJS app with the experimental router flag:

npx create-next-app@latest --experimental-app

Then install LangChain:

npm install -S langchain

Create a .env file and set your OpenAI API key as OPENAI_API_KEY.
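For reference, the file looks like this (the values are placeholders; the GITHUB_ACCESS_TOKEN line is only needed for the GitHub loader used later in this post):

```shell
# .env — values shown are placeholders
OPENAI_API_KEY=your-openai-api-key
GITHUB_ACCESS_TOKEN=your-github-token
```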

Modify Hello API Endpoint To Handle Prompts

By default, app/api/hello/route.ts has a GET endpoint. Let’s make it handle a prompt. First, get the prompt from the request, or return an error:

import { NextResponse } from "next/server";

export async function GET(request: Request) {
  // Get the prompt from the URL query args
  const prompt = new URL(request.url).searchParams.get("prompt");
  if (!prompt) {
    return NextResponse.json({
      message: "Missing prompt"
    }, {
      status: 400,
    });
  }
}
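If you want to sanity-check that query-string parsing outside of Next.js, the same URL and searchParams logic runs in plain Node. A quick sketch (the URLs and helper name are mine, just for illustration):

```typescript
// Same extraction the route handler performs
const getPrompt = (requestUrl: string): string | null =>
  new URL(requestUrl).searchParams.get("prompt");

console.log(getPrompt("http://localhost:3000/api/hello?prompt=hi")); // "hi"
console.log(getPrompt("http://localhost:3000/api/hello")); // null
```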

Ok, now, let’s actually use ChatGPT to return generated text and return a response:

import { NextResponse } from "next/server";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage } from "langchain/schema";

const chat = new ChatOpenAI({
  temperature: 0,
  openAIApiKey: process.env.OPENAI_API_KEY,
});

export async function GET(request: Request) {
  // Get the prompt from the URL query args
  const prompt = new URL(request.url).searchParams.get("prompt");
  if (!prompt) {
    return NextResponse.json({
      message: "Missing prompt"
    }, {
      status: 400,
    });
  }

  // Ask the chat model
  try {
    const response = await chat.call([
      new HumanChatMessage(prompt),
    ]);
    return NextResponse.json({ text: response.text });
  } catch (error) {
    console.log(error);
    return NextResponse.json({ error: String(error) }, { status: 500 });
  }
}
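To try the endpoint, request it with a prompt query parameter. A quick way to build that URL with safe encoding (the helper name is mine; the origin assumes the default dev server):

```typescript
// Build a request URL for the hello endpoint, encoding the prompt safely
const promptUrl = (origin: string, prompt: string): string => {
  const url = new URL("/api/hello", origin);
  url.searchParams.set("prompt", prompt);
  return url.toString();
};

console.log(promptUrl("http://localhost:3000", "hello"));
// http://localhost:3000/api/hello?prompt=hello
```

With the dev server running (npm run dev), fetching that URL should return JSON like {"text": "..."}.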

Question And Answer Chains

The last example uses the ChatGPT API, because it is cheap, via LangChain’s chat model. You can also use other LLMs. But if that’s all you need to do, LangChain is overkill; use the OpenAI npm package instead.

A great use for LangChain is analyzing multiple documents and asking questions about them. Here is something I’m playing with: loading files from GitHub, using the GitHub loader, and asking questions about them:

import { NextResponse } from "next/server";
import { GithubRepoLoader } from "langchain/document_loaders/web/github";
import { OpenAI } from "langchain/llms/openai";
import { loadQAStuffChain } from "langchain/chains";
import { Document } from "langchain/document";

const ask = async ({ docs, question }: {
  docs: Document[],
  question: string
}) => {
  const llm = new OpenAI({
    openAIApiKey: process.env.OPENAI_API_KEY,
  });
  const chain = loadQAStuffChain(llm);
  return chain.call({
    input_documents: docs,
    question,
  });
};

export async function GET(request: Request) {
  const url = new URL(request.url);
  const question = url.searchParams.get("question");
  const owner = url.searchParams.get("owner");
  const repo = url.searchParams.get("repo");
  const branch = url.searchParams.get("branch") ?? "main";
  if (!question || !owner || !repo) {
    return NextResponse.json({
      message: "Missing question, owner, or repo"
    }, {
      status: 400,
    });
  }

  // Load the repo's files as documents
  const loader = new GithubRepoLoader(
    `https://github.com/${owner}/${repo}`,
    { branch, recursive: false, unknown: "warn", accessToken: process.env.GITHUB_ACCESS_TOKEN }
  );
  const docs = await loader.load();

  try {
    const response = await ask({ docs, question });
    return NextResponse.json({ text: response.text });
  } catch (error) {
    console.log(error);
    return NextResponse.json({ error: String(error) }, { status: 500 });
  }
}
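For context on what loadQAStuffChain does: a “stuff” chain roughly concatenates every document into one prompt and asks the question once, which only works while everything fits in the model’s context window. A simplified sketch (the prompt wording is illustrative, not LangChain’s actual template):

```typescript
type Doc = { pageContent: string };

// Roughly what a "stuff" QA chain does: stuff all docs into one prompt
const buildStuffPrompt = (docs: Doc[], question: string): string => {
  const context = docs.map((d) => d.pageContent).join("\n\n");
  return `Use the following context to answer the question.\n\n${context}\n\nQuestion: ${question}\nAnswer:`;
};

const prompt = buildStuffPrompt(
  [
    { pageContent: "LangChain composes LLM apps." },
    { pageContent: "NextJS 13 has an app directory." },
  ],
  "What does LangChain do?"
);
console.log(prompt.includes("LangChain composes LLM apps.")); // true
```

loadQAMapReduceChain, by contrast, answers against each document separately and then combines the answers, which scales better to large document sets.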

What Do You Want To Build With AI?

What do you want to build with AI? I’m working on rebuilding the API server, and maybe the UI, for the AI-assisted writing plugin. It’s open source and still very basic. Send me a DM on WordPress Slack, Twitter, or Mastodon if you want to play along.

CC0-licensed photo by me from the WordPress Photo Directory.

New eBook

Refactoring WordPress Plugins

The PHP Parts

For experienced WordPress developers who want to improve the PHP in their plugins, adopt modern best practices, and embrace test-driven development.