# OpenAI Chat Completion
*author: Jael*
<br />
## Description
An LLM operator that generates an answer from a prompt given as a list of messages, using a large language model or service.
This operator is implemented with the Chat Completion method from [OpenAI](https://platform.openai.com/docs/guides/chat).
Please note that you need an [OpenAI API key](https://platform.openai.com/account/api-keys) to access OpenAI.
<br />
## Code Example
Use the default model to continue the conversation from the given messages.
*Write a pipeline with explicit input/output name specifications:*
```python
import os

from towhee import pipe, ops

# Read the API key from the environment rather than hard-coding it.
p = (
    pipe.input('messages')
        .map('messages', 'answer', ops.LLM.OpenAI(api_key=os.environ['OPENAI_API_KEY']))
        .output('messages', 'answer')
)

messages = [
    {'question': 'Who won the world series in 2020?', 'answer': 'The Los Angeles Dodgers won the World Series in 2020.'},
    {'question': 'Where was it played?'}
]
# get() returns the outputs in declared order: [messages, answer].
answer = p(messages).get()[1]
```
<br />
## Factory Constructor
Create the operator via the following factory method:
***LLM.OpenAI(model_name: str, api_key: str)***
**Parameters:**
***model_name***: *str*
The model name as a string, defaults to 'gpt-3.5-turbo'. Supported model names:
- gpt-3.5-turbo
- gpt-3.5-turbo-0301
***api_key***: *str=None*
The OpenAI API key as a string, defaults to None.
***\*\*kwargs***
Other OpenAI parameters such as max_tokens, stream, temperature, etc.
<br />
## Interface
The operator takes a list of chat messages as input.
It returns the generated answer as a string.
***\_\_call\_\_(messages)***
**Parameters:**
***messages***: *list*
A list of messages to set up the chat.
Must be a list of dictionaries with keys from "system", "question", and "answer". For example, [{"question": "a past question?", "answer": "a past answer."}, {"question": "current question?"}]
**Returns**:
*answer: str*
The next answer generated by role "assistant".
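For illustration, a message list in the (system/question/answer) form above maps naturally onto OpenAI's role-based chat format. The sketch below shows one plausible conversion; the helper name `to_openai_messages` is hypothetical and not part of the operator's public API:

```python
def to_openai_messages(messages):
    """Flatten [{'system': ...}, {'question': ..., 'answer': ...}] dicts
    into OpenAI-style [{'role': ..., 'content': ...}] entries.

    Hypothetical helper for illustration only, not the operator's API.
    """
    out = []
    for m in messages:
        if 'system' in m:
            out.append({'role': 'system', 'content': m['system']})
        if 'question' in m:
            out.append({'role': 'user', 'content': m['question']})
        if 'answer' in m:
            out.append({'role': 'assistant', 'content': m['answer']})
    return out

history = [
    {'question': 'Who won the world series in 2020?',
     'answer': 'The Los Angeles Dodgers won the World Series in 2020.'},
    {'question': 'Where was it played?'},
]
print(to_openai_messages(history))
```

Each past (question, answer) pair becomes a user turn followed by an assistant turn, and the trailing question without an answer is the current user prompt the model will complete.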
<br />