diff --git a/README.md b/README.md
index 705363f..a9b6a85 100644
--- a/README.md
+++ b/README.md
@@ -16,7 +16,7 @@
 operation, which scales quadratically with the sequence length. To address this
 we introduce the Longformer with an attention mechanism that scales linearly with
 sequence length, making it easy to process documents of thousands of tokens or longer[2].
 
-## Reference
+### References
 
 [1].https://huggingface.co/docs/transformers/v4.16.2/en/model_doc/longformer#transformers.LongformerConfig
 
@@ -34,7 +34,7 @@
 from towhee import dc
 
 dc.stream(["Hello, world."])
-    .text_embedding.longformer(model_name=c"allenai/longformer-base-4096")
+    .text_embedding.longformer(model_name="allenai/longformer-base-4096")
     .show()
 ```