
Text Embedding with Transformers

author: Jael Gu

Description

A text embedding operator implemented with pretrained models from Huggingface Transformers.

from towhee import ops

# Create the operator with a pretrained Huggingface model and encode a sentence.
text_encoder = ops.text_embedding.transformers("bert-base-cased")
text_embedding = text_encoder("Hello, world.")
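
Under the hood, an operator like this wraps a pretrained tokenizer and model from Huggingface Transformers. The snippet below is a minimal sketch of that pattern (assuming the transformers and torch packages are installed), not the operator's exact implementation:

import torch
from transformers import AutoModel, AutoTokenizer

# Load a pretrained tokenizer and model from Huggingface Transformers.
tokenizer = AutoTokenizer.from_pretrained('bert-base-cased')
model = AutoModel.from_pretrained('bert-base-cased')

# Tokenize the input text and run a forward pass without gradients.
inputs = tokenizer('Hello, world.', return_tensors='pt')
with torch.no_grad():
    outputs = model(**inputs)

# Take the last hidden states of the tokens as the embedding (an illustrative choice).
embedding = outputs.last_hidden_state.squeeze(0).numpy()
print(embedding.shape)  # (num_tokens, hidden_size), e.g. (6, 768) for BERT base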

Factory Constructor

Create the operator via the following factory method:

ops.text_embedding.transformers(model_name)
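
The model_name argument selects which pretrained Huggingface checkpoint to load. A brief sketch, assuming the checkpoint below is available to the operator (it is only an illustration; substitute any Transformers model the operator supports):

from towhee import ops

# Build the operator with an explicit model_name keyword argument
# ('distilbert-base-uncased' is a hypothetical choice for illustration).
text_encoder = ops.text_embedding.transformers(model_name='distilbert-base-uncased')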

Interface

A text embedding operator takes a sentence, paragraph, or document as a string input and outputs an embedding vector as a numpy.ndarray that captures the input's core semantic elements.

Parameters:

text: str

The input text as a string.

Returns: numpy.ndarray

The text embedding extracted by the model.

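As a quick sanity check, the returned value can be inspected like any NumPy array. The exact shape depends on the chosen model and on whether token-level or pooled embeddings are produced, so the shapes in the comments below are only indicative:

from towhee import ops

text_encoder = ops.text_embedding.transformers('bert-base-cased')
vec = text_encoder('Hello, world.')

print(type(vec))   # <class 'numpy.ndarray'>
print(vec.shape)   # model-dependent, e.g. (num_tokens, 768) for BERT base
print(vec.dtype)   # typically float32
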
Code Example

Use the pretrained BERT model ('bert-base-cased') to generate a text embedding for the sentence "Hello, world.".

Write the pipeline in simplified style:

import towhee

towhee.dc(['Hello, world.']) \
    .text_embedding.transformers('bert-base-cased') \
    .show()

Write the same pipeline with explicit input/output name specifications:

import towhee

towhee.dc['text'](['Hello, world.']) \
    .text_embedding.transformers['text', 'vec']('bert-base-cased') \
    .select['vec']() \
    .show()