Llama-2
Update example codes

Signed-off-by: Jael Gu <mengjia.gu@zilliz.com>
main
Jael Gu 1 year ago
commit 570486a3ca
1 changed file, 17 lines changed: README.md


@@ -25,9 +25,13 @@ Use the default model to continue the conversation from given messages.
 ```python
 from towhee import ops
-chat = ops.LLM.Llama_2('llama-2-13b-chat', max_tokens=2048)
+chat = ops.LLM.Llama_2('llama-2-13b-chat', max_tokens=512)
-message = [{"question": "Building a website can be done in 10 simple steps:"}]
+message = [
+    {'system': 'You are a very helpful assistant.'},
+    {'question': 'Who won the world series in 2020?', 'answer': 'The Los Angeles Dodgers won the World Series in 2020.'},
+    {'question': 'Where was it played?'}
+]
 answer = chat(message)
 ```
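The new `message` format in this hunk is a multi-turn chat history with an optional system entry. As a hypothetical, towhee-free sketch, such a list can be flattened into a single prompt string; the `[INST]`/`<<SYS>>` tags below follow the commonly documented Llama-2 chat template, and the actual towhee operator may format messages differently:

```python
# Hypothetical prompt assembly for a Llama-2 style chat message list.
# Assumption: the operator consumes dicts with 'system', 'question',
# and 'answer' keys as shown in the diff above.
def flatten_messages(messages):
    system = ''
    turns = []
    for m in messages:
        if 'system' in m:
            # System prompt is wrapped in <<SYS>> tags inside the first turn.
            system = f"<<SYS>>\n{m['system']}\n<</SYS>>\n\n"
        else:
            turns.append((m['question'], m.get('answer')))
    prompt = ''
    for i, (q, a) in enumerate(turns):
        prefix = system if i == 0 else ''
        prompt += f"[INST] {prefix}{q} [/INST]"
        if a is not None:
            prompt += f" {a} "
    return prompt

message = [
    {'system': 'You are a very helpful assistant.'},
    {'question': 'Who won the world series in 2020?', 'answer': 'The Los Angeles Dodgers won the World Series in 2020.'},
    {'question': 'Where was it played?'}
]
print(flatten_messages(message))
```

Completed turns keep their answers inline, while the final unanswered question leaves the prompt open for the model to continue.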
@@ -39,13 +43,14 @@ from towhee import pipe, ops
 p = (
     pipe.input('question', 'docs', 'history')
         .map(('question', 'docs', 'history'), 'prompt', ops.prompt.question_answer())
-        .map('prompt', 'answer', ops.LLM.Llama_2('llama-2-7b-chat', stop='</s>'))
+        .map('prompt', 'answer', ops.LLM.Llama_2('llama-2-7b-chat'))
         .output('answer')
 )
-history=[('Who won the world series in 2020?', 'The Los Angeles Dodgers won the World Series in 2020.')]
-question = 'Where was it played?'
-answer = p(question, [], history).get()[0]
+history=[('What is Towhee?', 'Towhee is a cutting-edge framework designed to streamline the processing of unstructured data through the use of Large Language Model (LLM) based pipeline orchestration.')]
+knowledge = ['You can install towhee via `pip install towhee`.']
+question = 'How to install it?'
+answer = p(question, knowledge, history).get()[0]
 ```
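This hunk's pipeline feeds `question`, `docs`, and `history` through a prompt builder before the LLM. A rough, towhee-free sketch of what such a builder might produce (the template below is an assumption for illustration; `ops.prompt.question_answer` may use a different format):

```python
# Hypothetical stand-in for ops.prompt.question_answer(): builds a
# retrieval-augmented prompt from a question, supporting docs, and
# prior (question, answer) turns. The real operator's template may differ.
def question_answer_prompt(question, docs, history):
    lines = []
    if docs:
        lines.append('Use the following context to answer the question.')
        lines.extend(docs)
    for q, a in history:
        lines.append(f'Q: {q}')
        lines.append(f'A: {a}')
    # Leave the final answer slot open for the model to fill.
    lines.append(f'Q: {question}')
    lines.append('A:')
    return '\n'.join(lines)

history = [('What is Towhee?', 'Towhee is a framework for LLM-based pipeline orchestration.')]
knowledge = ['You can install towhee via `pip install towhee`.']
prompt = question_answer_prompt('How to install it?', knowledge, history)
print(prompt)
```

The open `A:` at the end is what lets the model answer the new question while the earlier turns supply conversational context.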
<br />
