# Ernie Bot 文心一言

*author: Jael*

<br />

## Description

An LLM operator generates an answer from a prompt or list of messages using a large language model or service.
This operator is implemented with Ernie Bot from [Baidu](https://cloud.baidu.com/wenxin.html).
Please note that you will need an [Ernie API key & Secret key](https://ai.baidu.com/ai-doc/REFERENCE/Lkru0zoz4) to access the service.


<br />

## Code Example

*Write a pipeline with explicit input/output name specifications:*

```python
from towhee import pipe, ops

p = (
    pipe.input('messages')
        .map('messages', 'answer', ops.LLM.Ernie(api_key=ERNIE_API_KEY, secret_key=ERNIE_SECRET_KEY))
        .output('answer')
)

messages = [
    {'question': 'What is Zilliz Cloud?', 'answer': 'Zilliz Cloud is a fully managed vector search service.'},
    {'question': 'What is its relationship to Milvus?'}
]
answer = p(messages).get()[0]
```

<br />

## Factory Constructor

Create the operator via the following factory method:

***LLM.Ernie(api_key: str, secret_key: str)***

**Parameters:**


***api_key***: *str=None*

The Ernie API key as a string; defaults to None. If None, the operator uses the environment variable `ERNIE_API_KEY`.

***secret_key***: *str=None*

The Ernie Secret key as a string; defaults to None. If None, the operator uses the environment variable `ERNIE_SECRET_KEY`.
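The fallback from explicit arguments to environment variables can be sketched as follows. This is a minimal illustration of the behavior described above; `resolve_credentials` is a hypothetical helper, not part of the operator's API:

```python
import os

def resolve_credentials(api_key=None, secret_key=None):
    """Fall back to environment variables when keys are not passed explicitly."""
    api_key = api_key or os.environ.get('ERNIE_API_KEY')
    secret_key = secret_key or os.environ.get('ERNIE_SECRET_KEY')
    if not api_key or not secret_key:
        raise ValueError('Ernie API key and Secret key are required.')
    return api_key, secret_key

# Explicit arguments take precedence over the environment.
os.environ['ERNIE_API_KEY'] = 'env-api-key'
os.environ['ERNIE_SECRET_KEY'] = 'env-secret-key'
print(resolve_credentials('my-api-key', 'my-secret-key'))  # ('my-api-key', 'my-secret-key')
print(resolve_credentials())                               # ('env-api-key', 'env-secret-key')
```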

***\*\*kwargs***

Other model parameters, such as `temperature`, passed through to the Ernie service.

<br />

## Interface

The operator takes a list of messages as input and returns the generated answer as a string.

***\_\_call\_\_(messages)***

**Parameters:**

***messages***: *list*

A list of messages to set up the chat.
Must be a list of dictionaries with keys chosen from "question" and "answer", for example `[{"question": "a past question?", "answer": "a past answer."}, {"question": "current question?"}]`.
It also accepts the original Ernie message format, such as `[{"role": "user", "content": "a question?"}, {"role": "assistant", "content": "an answer."}]`.
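The mapping between the two accepted formats can be sketched as follows. This is an illustrative helper for the question/answer style; `to_ernie_messages` is hypothetical and not part of the operator's API:

```python
def to_ernie_messages(qa_messages):
    """Convert [{'question': ..., 'answer': ...}, ...] into Ernie's role/content format."""
    messages = []
    for item in qa_messages:
        # Each past turn contributes a user question and, if present, an assistant answer.
        if 'question' in item:
            messages.append({'role': 'user', 'content': item['question']})
        if 'answer' in item:
            messages.append({'role': 'assistant', 'content': item['answer']})
    return messages

history = [
    {'question': 'What is Zilliz Cloud?', 'answer': 'A fully managed vector search service.'},
    {'question': 'What is its relationship to Milvus?'}
]
print(to_ernie_messages(history))
```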

**Returns**:

*answer: str*

The next answer generated by the "assistant" role.

<br />