*Write a [retrieval-augmented generation pipeline](https://towhee.io/tasks/detail/pipeline/retrieval-augmented-generation) with explicit input/output name specifications:*
```python
from towhee import pipe, ops
temp = '''Use the following pieces of context to answer the question at the end.
If you don't know the answer, just say that you don't know, don't try to make up an answer.
{context}
Question: {question}
Helpful Answer:
'''
docs = ['You can install towhee via command `pip install towhee`.']
history = [
    ('What is Towhee?', 'Towhee is an open-source machine learning pipeline that helps you encode your unstructured data into embeddings.')
]
```
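Independent of the pipeline wiring, the prompt template above is filled with plain `str.format`. A minimal sketch using the `temp` and `docs` values from the example (the question string is illustrative):

```python
# The prompt template from the example above; {context} and {question}
# are substituted with str.format.
temp = '''Use the following pieces of context to answer the question at the end.
If you don't know the answer, just say that you don't know, don't try to make up an answer.
{context}
Question: {question}
Helpful Answer:
'''
docs = ['You can install towhee via command `pip install towhee`.']

prompt = temp.format(context='\n'.join(docs), question='How do I install Towhee?')
print(prompt)
```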
***model_name***: *str*

The model name as a string, defaults to 'gpt-3.5-turbo'. Supported model names:
- gpt-3.5-turbo
- gpt-3.5-turbo-0301
***api_key***: *str=None*
The OpenAI API key in string, defaults to None.
***\*\*kwargs***
Other OpenAI parameters such as max_tokens, stream, temperature, etc.
<br/>
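Putting the settings above together, here is a sketch of how the documented parameters combine. The `openai_config` helper is hypothetical (not part of Towhee or OpenAI); it only illustrates that `model_name` and `api_key` are named parameters while extras such as `max_tokens`, `stream`, and `temperature` pass through as keyword arguments:

```python
# Hypothetical helper, for illustration only: collects the documented
# settings into one dict. model_name defaults to 'gpt-3.5-turbo',
# api_key defaults to None, and **kwargs carries any other OpenAI
# parameters (max_tokens, stream, temperature, ...).
def openai_config(model_name='gpt-3.5-turbo', api_key=None, **kwargs):
    cfg = {'model_name': model_name, 'api_key': api_key}
    cfg.update(kwargs)
    return cfg

cfg = openai_config(max_tokens=256, temperature=0.2)
```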
## Interface
The operator takes a list of chat messages as input.
It returns the answer in JSON.

***\_\_call\_\_(messages)***
**Parameters:**
***messages***: *list*
A list of messages to set up chat.
Must be a list of dictionaries whose keys are drawn from "system", "question", and "answer". For example: [{"question": "a past question?", "answer": "a past answer."}, {"question": "current question?"}]
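A short sketch of assembling a messages list in the documented shape. The `build_messages` helper is hypothetical (not part of the operator); it only shows that past turns carry both a "question" and an "answer" key while the current turn carries a "question" alone:

```python
# Hypothetical helper, for illustration only: builds the messages list
# described above. Keys come from "system", "question", "answer".
def build_messages(history, question, system=None):
    messages = []
    if system is not None:
        messages.append({'system': system})
    for past_q, past_a in history:
        messages.append({'question': past_q, 'answer': past_a})
    messages.append({'question': question})  # current turn has no answer yet
    return messages

messages = build_messages(
    history=[('What is Towhee?',
              'Towhee is an open-source machine learning pipeline.')],
    question='How do I install it?',
)
```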