PromptLayer OpenAI
Danger: This module has been deprecated and is no longer supported. The documentation below will not work in versions 0.2.0 or later.
LangChain integrates with PromptLayer for logging and debugging prompts and responses. To add support for PromptLayer:
- Create a PromptLayer account here: https://promptlayer.com.
- Create an API token and pass it either as the `promptLayerApiKey` argument to the `PromptLayerOpenAI` constructor or in the `PROMPTLAYER_API_KEY` environment variable.
```typescript
import { PromptLayerOpenAI } from "langchain/llms/openai";

const model = new PromptLayerOpenAI({
  temperature: 0.9,
  apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
  promptLayerApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.PROMPTLAYER_API_KEY
});

const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);
```
Azure PromptLayerOpenAI
LangChain also integrates with PromptLayer for Azure-hosted OpenAI instances:
```typescript
import { PromptLayerOpenAI } from "langchain/llms/openai";

const model = new PromptLayerOpenAI({
  temperature: 0.9,
  azureOpenAIApiKey: "YOUR-AOAI-API-KEY", // In Node.js defaults to process.env.AZURE_OPENAI_API_KEY
  azureOpenAIApiInstanceName: "YOUR-AOAI-INSTANCE-NAME", // In Node.js defaults to process.env.AZURE_OPENAI_API_INSTANCE_NAME
  azureOpenAIApiDeploymentName: "YOUR-AOAI-DEPLOYMENT-NAME", // In Node.js defaults to process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME
  azureOpenAIApiCompletionsDeploymentName:
    "YOUR-AOAI-COMPLETIONS-DEPLOYMENT-NAME", // In Node.js defaults to process.env.AZURE_OPENAI_API_COMPLETIONS_DEPLOYMENT_NAME
  azureOpenAIApiEmbeddingsDeploymentName:
    "YOUR-AOAI-EMBEDDINGS-DEPLOYMENT-NAME", // In Node.js defaults to process.env.AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME
  azureOpenAIApiVersion: "YOUR-AOAI-API-VERSION", // In Node.js defaults to process.env.AZURE_OPENAI_API_VERSION
  azureOpenAIBasePath: "YOUR-AZURE-OPENAI-BASE-PATH", // In Node.js defaults to process.env.AZURE_OPENAI_BASE_PATH
  promptLayerApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.PROMPTLAYER_API_KEY
});

const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);
```
The request and the response will be logged in the PromptLayer dashboard.
Note: In streaming mode PromptLayer will not log the response.
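To illustrate the streaming caveat, the sketch below streams a completion with the same `PromptLayerOpenAI` class; the chunks still reach your application, but PromptLayer will not record the final response. The API keys and prompt are placeholders, and the example assumes a pre-0.2.0 version of LangChain where this module still exists.

```typescript
import { PromptLayerOpenAI } from "langchain/llms/openai";

const model = new PromptLayerOpenAI({
  temperature: 0.9,
  apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
  promptLayerApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.PROMPTLAYER_API_KEY
});

// Stream the completion token-by-token. The chunks are delivered
// to the caller as usual, but in this mode PromptLayer does not
// log the assembled response.
const stream = await model.stream("Tell me a joke about colorful socks.");
for await (const chunk of stream) {
  process.stdout.write(chunk);
}
```

If you need the response logged in the PromptLayer dashboard, use `invoke` instead of `stream`.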
Related
- LLM conceptual guide
- LLM how-to guides