
PromptLayer

PromptLayer is the first platform that lets you track, manage, and share your GPT prompt engineering. It acts as middleware between your code and OpenAI's Python library.

PromptLayer records all of your OpenAI API requests, allowing you to search and explore your request history in the PromptLayer dashboard.

This example shows how to connect to [PromptLayer](https://www.promptlayer.com) and start logging your OpenAI requests.

Another example is available [here](https://python.langchain.com/en/latest/ecosystem/promptlayer.html).

Install PromptLayer

The promptlayer package is required to use PromptLayer with OpenAI. Install promptlayer using pip.

!pip install promptlayer

Imports

import os
from langchain.llms import PromptLayerOpenAI
import promptlayer

Set the Environment API Key

You can create a PromptLayer API Key at www.promptlayer.com by clicking the settings cog in the navbar.

Set it as an environment variable called PROMPTLAYER_API_KEY.

You also need an OpenAI API key, set as an environment variable called OPENAI_API_KEY.

from getpass import getpass

PROMPTLAYER_API_KEY = getpass()
os.environ["PROMPTLAYER_API_KEY"] = PROMPTLAYER_API_KEY

OPENAI_API_KEY = getpass()
os.environ["OPENAI_API_KEY"] = OPENAI_API_KEY

Use the PromptLayerOpenAI LLM like normal

You can optionally pass in pl_tags to track your requests with PromptLayer’s tagging feature.

llm = PromptLayerOpenAI(pl_tags=["langchain"])
llm("I am a cat and I want")

The above request should now appear on your PromptLayer dashboard.
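Because PromptLayerOpenAI is a standard LangChain LLM, it can also be dropped into chains. The sketch below is illustrative only: the prompt text, tag names, and input are assumptions, not part of the PromptLayer docs.

from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Illustrative template and tags -- replace with your own
prompt = PromptTemplate(
    input_variables=["topic"],
    template="Write a one-line joke about {topic}.",
)
llm = PromptLayerOpenAI(pl_tags=["langchain", "jokes"])
chain = LLMChain(llm=llm, prompt=prompt)
chain.run(topic="cats")

Requests made through the chain are logged and tagged just like direct calls to the LLM.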

Using PromptLayer Track

If you would like to use any of the PromptLayer tracking features, you need to pass the argument return_pl_id when instantiating the PromptLayer LLM so that you can get the request id.

llm = PromptLayerOpenAI(return_pl_id=True)
llm_results = llm.generate(["Tell me a joke"])

for res in llm_results.generations:
    # With return_pl_id=True, each generation carries the PromptLayer request id
    pl_request_id = res[0].generation_info["pl_request_id"]
    # Attach a score to the tracked request
    promptlayer.track.score(request_id=pl_request_id, score=100)

This lets you track the performance of your model in the PromptLayer dashboard. If you are using a prompt template, you can also attach the template to a request, which makes it possible to compare the performance of different templates and models in the PromptLayer dashboard.
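For example, a request can be scored and linked to a registered template in one pass. The sketch below assumes a prompt named "joke-template" with a {topic} input variable exists in your PromptLayer Prompt Registry, and that your installed promptlayer SDK exposes promptlayer.track.prompt; adjust the names and calls to the API version you are using.

llm = PromptLayerOpenAI(return_pl_id=True)
llm_results = llm.generate(["Tell me a joke about cats"])

for res in llm_results.generations:
    pl_request_id = res[0].generation_info["pl_request_id"]
    # Assumed registry name and input variables -- replace with your own
    promptlayer.track.prompt(
        request_id=pl_request_id,
        prompt_name="joke-template",
        prompt_input_variables={"topic": "cats"},
    )
    promptlayer.track.score(request_id=pl_request_id, score=100)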