From 2b2da727e38d19f179ca743198faab0bcd668c13 Mon Sep 17 00:00:00 2001
From: Jael Gu
Date: Thu, 19 Sep 2024 15:24:42 +0800
Subject: [PATCH] Update README

Signed-off-by: Jael Gu
---
 README.md | 4 +---
 1 file changed, 1 insertion(+), 3 deletions(-)

diff --git a/README.md b/README.md
index 3395d9f..2a19b42 100644
--- a/README.md
+++ b/README.md
@@ -82,9 +82,7 @@ and then return translated text in string.
 
 # More Resources
 
-- [What is a Transformer Model? An Engineer's Guide](https://zilliz.com/glossary/transformer-models): A transformer model is a neural network architecture. It's proficient in converting a particular type of input into a distinct output. Its core strength lies in its ability to handle inputs and outputs of different sequence length. It does this through encoding the input into a matrix with predefined dimensions and then combining that with another attention matrix to decode. This transformation unfolds through a sequence of collaborative layers, which deconstruct words into their corresponding numerical representations.
-
-At its heart, a transformer model is a bridge between disparate linguistic structures, employing sophisticated neural network configurations to decode and manipulate human language input. An example of a transformer model is GPT-3, which ingests human language and generates text output.
+- [What is a Transformer Model? An Engineer's Guide](https://zilliz.com/glossary/transformer-models): A transformer model is a neural network architecture. It's proficient in converting a particular type of input into a distinct output. Its core strength lies in its ability to handle inputs and outputs of different sequence length. It does this through encoding the input into a matrix with predefined dimensions and then combining that with another attention matrix to decode. This transformation unfolds through a sequence of collaborative layers, which deconstruct words into their corresponding numerical representations. At its heart, a transformer model is a bridge between disparate linguistic structures, employing sophisticated neural network configurations to decode and manipulate human language input. An example of a transformer model is GPT-3, which ingests human language and generates text output.
 - [Experiment with 5 Chunking Strategies via LangChain for LLM - Zilliz blog](https://zilliz.com/blog/experimenting-with-different-chunking-strategies-via-langchain): Explore the complexities of text chunking in retrieval augmented generation applications and learn how different chunking strategies impact the same piece of data.
 - [Massive Text Embedding Benchmark (MTEB)](https://zilliz.com/glossary/massive-text-embedding-benchmark-(mteb)): A standardized way to evaluate text embedding models across a range of tasks and languages, leading to better text embedding models for your app
 - [About Lance Martin | Zilliz](https://zilliz.com/authors/Lance_Martin): Software / ML at LangChain