LLM: Ernie Bot (文心一言)
author: Jael
Description
An LLM operator generates an answer from a prompt in messages using a large language model or service. This operator is implemented with Ernie Bot from Baidu. Please note that you will need an Ernie API key and Secret key to access the service.
Code Example
Use the default model to continue the conversation from the given messages.
Write a pipeline with explicit input/output name specifications:
from towhee import pipe, ops

# Replace ERNIE_API_KEY / ERNIE_SECRET_KEY with your own credentials.
p = (
    pipe.input('messages')
        .map('messages', 'answer', ops.LLM.Ernie(api_key=ERNIE_API_KEY, secret_key=ERNIE_SECRET_KEY))
        .output('answer')
)

messages = [
    {'question': 'What is Zilliz Cloud?', 'answer': 'Zilliz Cloud is a fully managed vector search service.'},
    {'question': 'What is its relationship to Milvus?'}
]
answer = p(messages).get()[0]
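Alternatively, since both keys default to environment variables (see the factory parameters below), they can be omitted from the constructor. A minimal sketch, assuming ERNIE_API_KEY and ERNIE_SECRET_KEY are already exported in your shell:

import os
from towhee import pipe, ops

# Assumes the credentials were exported beforehand, e.g.
#   export ERNIE_API_KEY=...
#   export ERNIE_SECRET_KEY=...
assert 'ERNIE_API_KEY' in os.environ and 'ERNIE_SECRET_KEY' in os.environ

p = (
    pipe.input('messages')
        .map('messages', 'answer', ops.LLM.Ernie())
        .output('answer')
)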
Factory Constructor
Create the operator via the following factory method:
LLM.Ernie(api_key: str = None, secret_key: str = None, **kwargs)
Parameters:

api_key: str=None
The Ernie API key in string, defaults to None. If None, it will use the environment variable ERNIE_API_KEY.

secret_key: str=None
The Ernie Secret key in string, defaults to None. If None, it will use the environment variable ERNIE_SECRET_KEY.

**kwargs
Other Ernie model parameters such as temperature, etc.
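For example, a sketch passing a generation parameter through **kwargs (temperature here is an assumed pass-through to the Ernie service; check the service documentation for supported parameters and values):

from towhee import pipe, ops

# temperature is forwarded via **kwargs to the underlying Ernie call.
p = (
    pipe.input('messages')
        .map('messages', 'answer', ops.LLM.Ernie(temperature=0.5))
        .output('answer')
)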
Interface
The operator takes a list of messages as input and returns the generated answer as a string.
__call__(messages)
Parameters:
messages: list
A list of messages to set up the chat. Must be a list of dictionaries with the keys "question" and "answer", for example [{"question": "a past question?", "answer": "a past answer."}, {"question": "current question?"}]. It also accepts the original Ernie message format, such as [{"role": "user", "content": "a question?"}, {"role": "assistant", "content": "an answer."}].
Returns:
answer: str
The next answer generated by the "assistant" role.
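A small sketch illustrating the two accepted, equivalent message formats (the content here is made up for illustration):

# Question/answer pair format: the last dict carries only the new question.
qa_messages = [
    {'question': 'What is Towhee?', 'answer': 'Towhee is an open-source ML pipeline framework.'},
    {'question': 'What does the LLM.Ernie operator do?'},
]

# The original Ernie format with explicit roles.
role_messages = [
    {'role': 'user', 'content': 'What is Towhee?'},
    {'role': 'assistant', 'content': 'Towhee is an open-source ML pipeline framework.'},
    {'role': 'user', 'content': 'What does the LLM.Ernie operator do?'},
]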