*Write a [retrieval-augmented generation pipeline](https://towhee.io/tasks/detail/pipeline/retrieval-augmented-generation) with explicit inputs/outputs name specifications:*
```python
from towhee import pipe, ops
temp = '''{question}
Input:
{context}
'''
docs = ['You can install Towhee via the command `pip install towhee`.']
history = [
    ('What is Towhee?', 'Towhee is an open-source machine learning project that helps you encode your unstructured data into embeddings.')
]
p = (
    pipe.input('question', 'docs', 'history')
        .map(('question', 'docs', 'history'), 'prompt', ops.prompt.template(temp, ['question', 'context']))
        .map('prompt', 'answer', ops.LLM.Dolly())
        .output('answer')
)
answer = p('How do I install Towhee?', docs, history).get()[0]
```

<br/>

## Factory Constructor

Create the operator via the following factory method:
***LLM.Dolly(model_name: str, \*\*kwargs)***
**Parameters:**
***model_name***: *str*
The model name as a string; defaults to `'databricks/dolly-v2-12b'`. Supported model names:
- databricks/dolly-v2-12b
- databricks/dolly-v2-7b
- databricks/dolly-v2-3b
- databricks/dolly-v1-6b
***\*\*kwargs***
Other Dolly model parameters such as `device_map`.
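Because extra keyword arguments are forwarded to the model, a thin wrapper can validate the model name against the supported list above before anything is loaded. This is a sketch only; `make_dolly` is a hypothetical helper, not part of Towhee's API:

```python
def make_dolly(model_name='databricks/dolly-v2-12b', **kwargs):
    # Hypothetical helper: check the name against the documented model list
    # before constructing the operator, so typos fail fast and cheaply.
    supported = {
        'databricks/dolly-v2-12b',
        'databricks/dolly-v2-7b',
        'databricks/dolly-v2-3b',
        'databricks/dolly-v1-6b',
    }
    if model_name not in supported:
        raise ValueError(f'unsupported Dolly model: {model_name}')
    from towhee import ops
    # kwargs such as device_map are passed through to the operator.
    return ops.LLM.Dolly(model_name, **kwargs)
```

For example, `make_dolly('databricks/dolly-v2-3b', device_map='auto')` would build the smallest v2 model spread across available devices, while a misspelled name raises before any weights are downloaded.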
<br/>
## Interface
The operator takes a list of messages as input.
It returns the answer in JSON.
***\_\_call\_\_(messages)***
**Parameters:**
***messages***: *list*
A list of messages to set up the chat.
Must be a list of dictionaries whose keys are drawn from "system", "question", and "answer". For example: `[{"question": "a past question?", "answer": "a past answer."}, {"question": "current question?"}]`
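The message format above can be built with a small helper (hypothetical, not part of the operator's API) that converts pipeline-style `(question, answer)` history tuples into the expected dictionaries:

```python
def build_messages(history, question, system=None):
    """Convert (question, answer) history tuples into the message-dict
    format described above, then append the current question."""
    messages = []
    if system is not None:
        # Optional system prompt goes first.
        messages.append({'system': system})
    for past_q, past_a in history:
        messages.append({'question': past_q, 'answer': past_a})
    # The current turn has a question but no answer yet.
    messages.append({'question': question})
    return messages

history = [('What is Towhee?',
            'Towhee is an open-source machine learning project that helps you '
            'encode your unstructured data into embeddings.')]
msgs = build_messages(history, 'How do I install Towhee?')
```

The resulting `msgs` list can then be passed directly as the `messages` argument.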