Ernie Bot 文心一言
author: Jael
Description
An LLM operator that generates an answer from the prompt in the input messages using a large language model or service. This operator is built on Ernie Bot (文心一言) from Baidu. Please note that you will need an Ernie API key and Secret key to access the service.
Code Example
Write a pipeline with explicit input/output name specifications:
from towhee import pipe, ops

# Replace ERNIE_API_KEY and ERNIE_SECRET_KEY with your own credentials.
p = (
    pipe.input('messages')
        .map('messages', 'answer',
             ops.LLM.Ernie(api_key=ERNIE_API_KEY, secret_key=ERNIE_SECRET_KEY))
        .output('answer')
)

messages = [
    {'question': 'What is Zilliz Cloud?', 'answer': 'Zilliz Cloud is a fully managed vector search service.'},
    {'question': 'How does it relate to Milvus?'}
]
answer = p(messages).get()[0]
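In a multi-turn chat, the returned answer is typically written back into the history before the next call, so the model sees the full conversation. A minimal sketch of that loop, where the `ask` function below is a hypothetical stand-in for the pipeline call `p(messages).get()[0]` and its return value is canned for illustration:

```python
def ask(messages):
    # Hypothetical stand-in for the Towhee pipeline call; a real call
    # would send the message history to the Ernie service.
    return 'Zilliz Cloud is built on top of Milvus.'

history = [{'question': 'What is Zilliz Cloud?',
            'answer': 'Zilliz Cloud is a fully managed vector search service.'}]
# Add the new question without an answer yet.
history.append({'question': 'How does it relate to Milvus?'})
answer = ask(history)
# Store the answer so the next turn carries the full conversation.
history[-1]['answer'] = answer
```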
Factory Constructor
Create the operator via the following factory method:
LLM.Ernie(api_key: str, secret_key: str)
Parameters:
api_key: str=None
The Ernie API key as a string; defaults to None. If None, the environment variable ERNIE_API_KEY is used.
secret_key: str=None
The Ernie Secret key as a string; defaults to None. If None, the environment variable ERNIE_SECRET_KEY is used.
**kwargs
Other model parameters supported by the Ernie service, such as temperature.
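The key-resolution fallback described above can be sketched as follows. This is a minimal illustration of the documented behavior, not the operator's actual source; `resolve_key` is a hypothetical helper:

```python
import os

def resolve_key(explicit_key, env_name):
    # Use the explicitly passed key if given; otherwise fall back to
    # the named environment variable, as the parameter docs describe.
    if explicit_key is not None:
        return explicit_key
    return os.environ.get(env_name)

os.environ['ERNIE_API_KEY'] = 'demo-api-key'
resolve_key(None, 'ERNIE_API_KEY')        # falls back to the env var
resolve_key('explicit', 'ERNIE_API_KEY')  # explicit key wins
```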
Interface
The operator takes a list of messages as input and returns the generated answer as a string.
__call__(messages)
Parameters:
messages: list
A list of messages to set up the chat. Must be a list of dictionaries whose keys are "question" and "answer". For example, [{"question": "a past question?", "answer": "a past answer."}, {"question": "current question?"}]. It also accepts the original Ernie message format, such as [{"role": "user", "content": "a question?"}, {"role": "assistant", "content": "an answer."}].
Returns:
answer: str
The next answer generated by role "assistant".
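To make the two accepted message formats above concrete, here is a small sketch that converts the question/answer form into Ernie's native role/content form. Assumption: `to_role_format` is a hypothetical illustration of the mapping, not a helper exposed by the operator:

```python
def to_role_format(messages):
    """Convert [{'question': ..., 'answer': ...}] pairs into the
    role/content message format that Ernie Bot natively accepts."""
    converted = []
    for msg in messages:
        if 'role' in msg:
            # Already in Ernie's native format; pass through unchanged.
            converted.append(msg)
            continue
        if 'question' in msg:
            converted.append({'role': 'user', 'content': msg['question']})
        if 'answer' in msg:
            converted.append({'role': 'assistant', 'content': msg['answer']})
    return converted

history = [
    {'question': 'a past question?', 'answer': 'a past answer.'},
    {'question': 'current question?'},
]
to_role_format(history)
```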