# Zhipu AI
*author: Jael*
## Description 描述
This operator is implemented with [ChatGLM services from Zhipu AI](https://open.bigmodel.cn).
It directly returns the original response as a dictionary without parsing.
Please note that you will need an [API Key](https://open.bigmodel.cn/login?redirect=%2Fusercenter%2Fapikeys) to access the service.
## Code Example 代码示例
*Write a pipeline with explicit input/output name specifications:*
```python
from towhee import pipe, ops

# ZHIPUAI_API_KEY: your Zhipu AI API key
p = (
    pipe.input('messages')
        .map('messages', 'response', ops.LLM.ZhipuAI(
            api_key=ZHIPUAI_API_KEY,
            model_name='chatglm_130b',  # or 'chatglm_6b'
            temperature=0.5,
            max_tokens=50,
        ))
        .output('response')
)

messages = [
    {'system': '你是一个资深的软件工程师,善于回答关于科技项目的问题。'},
    {'question': 'Zilliz Cloud 是什么?', 'answer': 'Zilliz Cloud 是一种全托管的向量检索服务。'},
    {'question': '它和 Milvus 的关系是什么?'}
]

response = p(messages).get()[0]
answer = response['choices'][0]['content']
token_usage = response['usage']
```
## Factory Constructor 接口说明
Create the operator via the following factory method:
***LLM.ZhipuAI(api_key: str, model_name: str, \*\*kwargs)***
**Parameters:**
***api_key***: *str=None*
The Zhipu AI API key as a string; defaults to None. If None, the operator reads the environment variable `ZHIPUAI_API_KEY`.
***model_name***: *str='chatglm_130b'*
The model used by the Zhipu AI service; defaults to 'chatglm_130b'. Visit the Zhipu AI documentation for supported models.
***\*\*kwargs***
Other ChatGLM generation parameters, such as `temperature` and `max_tokens`.
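The default-key behavior described above can be sketched as follows; `resolve_api_key` is a hypothetical helper for illustration, not part of the operator's public API:

```python
import os

def resolve_api_key(api_key=None):
    # Use the explicit key when given; otherwise fall back to
    # the ZHIPUAI_API_KEY environment variable
    if api_key is not None:
        return api_key
    return os.environ.get('ZHIPUAI_API_KEY')
```

With this fallback, `ops.LLM.ZhipuAI()` can be constructed without an `api_key` argument as long as `ZHIPUAI_API_KEY` is set in the environment.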
## Interface 使用说明
The operator takes a list of messages as input and returns the original response as a dictionary.
***\_\_call\_\_(messages)***
**Parameters:**
***messages***: *list*
A list of messages to set up chat.
Must be a list of dictionaries with keys chosen from "system", "question", and "answer". For example, [{"question": "a past question?", "answer": "a past answer."}, {"question": "current question?"}].
It also accepts the original ChatGLM message format, such as [{"role": "user", "content": "a question?"}, {"role": "assistant", "content": "an answer."}].
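The relation between the two accepted message formats can be illustrated with a small converter; `to_chatglm_messages` is a hypothetical helper written for this sketch, not a function exposed by the operator:

```python
def to_chatglm_messages(messages):
    """Convert the simplified system/question/answer format to ChatGLM
    role/content pairs. Dictionaries already in role/content form pass
    through unchanged."""
    out = []
    for m in messages:
        if 'role' in m and 'content' in m:
            out.append(m)  # already in ChatGLM format
            continue
        if 'system' in m:
            out.append({'role': 'system', 'content': m['system']})
        if 'question' in m:
            out.append({'role': 'user', 'content': m['question']})
        if 'answer' in m:
            out.append({'role': 'assistant', 'content': m['answer']})
    return out
```

For example, `[{"question": "q?", "answer": "a."}, {"question": "now?"}]` maps to a user message, an assistant message, and a final user message.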
**Returns**:
*response: dict*
The original LLM response as a dictionary, including the generated answer and token usage.
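To show which fields the code example above relies on, here is an illustrative response shape; the field values are hypothetical and the actual service response may contain additional fields:

```python
# Illustrative response dictionary, mirroring the fields used in the
# code example above (values are made up for demonstration)
response = {
    'choices': [{'role': 'assistant', 'content': 'Zilliz Cloud is a fully managed vector search service.'}],
    'usage': {'prompt_tokens': 30, 'completion_tokens': 12, 'total_tokens': 42},
}

answer = response['choices'][0]['content']
token_usage = response['usage']
```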