Hello there,
I'm trying to use `gpt-tokenizer` inside a Next.js edge function and I get the following error:

```
Attempted import error: './index.js' does not contain a default export (imported as 'cjs').
```
When using the `require` version of the import, I get the following error instead:

```
{ error: [TypeError: Cannot read properties of undefined (reading 'encodeChatGenerator')] }
```
Here is the utility function that I'm using in the edge route:
```ts
import { type Message } from "ai";
import { encodeChat } from "gpt-tokenizer";
// const api = require("gpt-tokenizer/cjs/encoding/cl100k_base");
// const { encodeChat } = require("gpt-tokenizer");

// Their interface is missing the `function` role
interface ChatMessage extends Omit<Message, "role"> {
  role: "system" | "user" | "assistant" | undefined;
}

export function getTokenUsage(
  messages: Omit<Message, "id">[],
  model?:
    | "gpt-4"
    | "gpt-4-32k"
    | "gpt-4-0314"
    | "gpt-4-32k-0314"
    | "gpt-3.5-turbo"
    | "gpt-3.5-turbo-0301"
    | "text-davinci-003"
    | "text-davinci-002"
    | "text-davinci-001"
    | "text-curie-001"
    | "text-babbage-001"
    | "text-ada-001"
    | "davinci"
    | "curie"
    | "babbage"
    | "ada"
    | "code-davinci-002"
    | "code-davinci-001"
    | "code-cushman-002"
    | "code-cushman-001"
    | "davinci-codex"
    | "cushman-codex"
    | "text-davinci-edit-001"
    | "code-davinci-edit-001"
    | "text-embedding-ada-002"
    | "text-similarity-davinci-001"
    | "text-similarity-curie-001"
    | "text-similarity-babbage-001"
    | "text-similarity-ada-001"
    | "text-search-davinci-doc-001"
    | "text-search-curie-doc-001"
    | "text-search-babbage-doc-001"
    | "text-search-ada-doc-001"
    | "code-search-babbage-code-001"
    | "code-search-ada-code-001"
    | undefined,
  tokenLimit?: number
) {
  const chatTokens = encodeChat(messages as ChatMessage[], model);
  console.log({ chatTokens });
  console.log({ length: chatTokens.length });
  return chatTokens.length;
}
```