# ERNIE Bot (文心一言)
*author: Jael*
<br />
## Description
An LLM operator that generates an answer for the prompt carried in messages using a large language model or service.
This operator is implemented with ERNIE Bot from [Baidu](https://cloud.baidu.com/wenxin.html).
Please note that you will need an [Ernie API key & Secret key](https://ai.baidu.com/ai-doc/REFERENCE/Lkru0zoz4) to access the service.
<br />
## Code Example
Use the default model to continue the conversation from the given messages.
*Write a pipeline with explicit input/output name specifications:*
```python
from towhee import pipe, ops

p = (
    pipe.input('messages')
        .map('messages', 'answer', ops.LLM.Ernie(api_key=ERNIE_API_KEY, secret_key=ERNIE_SECRET_KEY))
        .output('messages', 'answer')
)

messages = [
    {'question': 'Zilliz Cloud 是什么?',  # "What is Zilliz Cloud?"
     'answer': 'Zilliz Cloud 是一种全托管的向量检索服务。'},  # "Zilliz Cloud is a fully managed vector search service."
    {'question': '它和 Milvus 的关系是什么?'}  # "What is its relationship with Milvus?"
]
# get() returns values in the order of the output schema, so the answer is at index 1.
answer = p(messages).get()[1]
```
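To continue the chat, one option is to fold the returned answer back into the message history before the next call. A minimal sketch (the follow-up question is illustrative):
```python
# Record the generated answer for the last question, then append a new question.
messages[-1]['answer'] = answer
messages.append({'question': 'How do I get started with it?'})
next_answer = p(messages).get()[1]
```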
<br />
## Factory Constructor
Create the operator via the following factory method:
***LLM.Ernie(api_key: str = None, secret_key: str = None, \*\*kwargs)***
**Parameters:**
***api_key***: *str=None*
The Ernie API key as a string. Defaults to None; if None, the operator reads the environment variable `ERNIE_API_KEY`.
***secret_key***: *str=None*
The Ernie Secret key as a string. Defaults to None; if None, the operator reads the environment variable `ERNIE_SECRET_KEY`.
***\*\*kwargs***
Other Ernie model parameters such as temperature, etc.
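For example, if the keys are already available as environment variables, the operator can be constructed without passing them explicitly. A minimal sketch, assuming extra keyword arguments such as `temperature` are forwarded to the Ernie service as described above (the key values are placeholders):
```python
import os
from towhee import ops

# Placeholder keys; in practice export these in your shell or deployment environment.
os.environ['ERNIE_API_KEY'] = 'your-ernie-api-key'
os.environ['ERNIE_SECRET_KEY'] = 'your-ernie-secret-key'

# api_key / secret_key default to None, so the environment variables above are used.
ernie_op = ops.LLM.Ernie(temperature=0.5)
```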
<br />
## Interface
The operator takes a list of messages as input and returns the generated answer as a string.
***\_\_call\_\_(messages)***
**Parameters:**
***messages***: *list*
A list of messages to set up the chat.
It must be a list of dictionaries with the keys "question" and "answer". For example, [{"question": "a past question?", "answer": "a past answer."}, {"question": "current question?"}].
It also accepts the original Ernie message format, e.g. [{"role": "user", "content": "a question?"}, {"role": "assistant", "content": "an answer."}].
**Returns**:
*answer: str*
The next answer generated by role "assistant".
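As an illustration, a single direct call with each accepted message format; a sketch assuming the operator is invoked on its own outside a pipeline, with placeholder keys and questions:
```python
from towhee import ops

ernie_op = ops.LLM.Ernie(api_key='your-ernie-api-key', secret_key='your-ernie-secret-key')

# Question/answer history plus the current question.
answer = ernie_op([
    {'question': 'What is Zilliz Cloud?', 'answer': 'Zilliz Cloud is a fully managed vector search service.'},
    {'question': 'What is its relationship with Milvus?'}
])

# The original Ernie role/content format is accepted as well.
answer = ernie_op([
    {'role': 'user', 'content': 'What is Zilliz Cloud?'},
    {'role': 'assistant', 'content': 'Zilliz Cloud is a fully managed vector search service.'},
    {'role': 'user', 'content': 'What is its relationship with Milvus?'}
])
```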
<br />