
Commit f962459

feat: Updated LlamaIndex EF adapter example
1 parent 49d9842 commit f962459

File tree

1 file changed: +24 -15 lines changed


docs/integrations/llamaindex/embeddings.md (+24 -15)
@@ -4,36 +4,45 @@
 
 Chroma and LlamaIndex both offer embedding functions which are wrappers on top of popular embedding models.
 
-Unfortunately Chroma and LI's embedding functions are not compatible with each other. Below we offer an adapters to convert LI embedding function to Chroma one.
+Unfortunately Chroma and LI's embedding functions are not compatible with each other. Below we offer an adapters to
+convert LI embedding function to Chroma one.
 
 ```python
-from llama_index.embeddings.base import BaseEmbedding
-from chromadb.api.types import EmbeddingFunction
+from llama_index.core.schema import TextNode
+from llama_index.core.base.embeddings.base import BaseEmbedding
+from chromadb import EmbeddingFunction, Documents, Embeddings
+
 
 class LlamaIndexEmbeddingAdapter(EmbeddingFunction):
-  def __init__(self,ef:BaseEmbedding):
-    self.ef = ef
+    def __init__(self, ef: BaseEmbedding):
+        self.ef = ef
 
-  def __call__(self, input: Documents) -> Embeddings:
-    return [node.embedding for node in self.ef(input)]
+    def __call__(self, input: Documents) -> Embeddings:
+        return [node.embedding for node in self.ef([TextNode(text=doc) for doc in input])]
 
 ```
 
+!!! warn "Text modality"
+
+    The above adapter assumes that the input documents are text. If you are using a different modality,
+    you will need to modify the adapter accordingly.
+
 An example of how to use the above with LlamaIndex:
 
-> Note: Make sure you have `OPENAI_API_KEY` as env var.
+!!! note "Prerequisites for example"
+
+    Run `pip install llama-index chromadb llama-index-embeddings-fastembed fastembed`
 
 ```python
-from llama_index.embeddings import OpenAIEmbedding
-from llama_index import ServiceContext, set_global_service_context
 import chromadb
+from llama_index.embeddings.fastembed import FastEmbedEmbedding
 
-embed_model = OpenAIEmbedding(embed_batch_size=10)
+# make sure to include the above adapter and imports
+embed_model = FastEmbedEmbedding(model_name="BAAI/bge-small-en-v1.5")
 
 client = chromadb.Client()
 
-col = client.get_or_create_collection("test_collection",embedding_function=LlamaIndexEmbeddingAdapter(embed_model))
+col = client.get_or_create_collection("test_collection", embedding_function=LlamaIndexEmbeddingAdapter(embed_model))
 
-col.add(ids=["1"],documents=["this is a test document"])
-# your embeddings should be of 1536 dimensions (OpenAI's ADA model)
-```
+col.add(ids=["1"], documents=["this is a test document"])
+```
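As a quick sanity check of the updated example (not part of this commit), the collection can be queried right after the `col.add(...)` call; Chroma routes the query text through the same `LlamaIndexEmbeddingAdapter`. A minimal sketch, assuming the adapter, `embed_model`, and `col` from the diff above are already defined; the query text and printed expectations are illustrative:

```python
# Minimal sketch (not part of the commit): verify the adapter end to end.
# Assumes `col` and the LlamaIndexEmbeddingAdapter from the example above exist.
results = col.query(query_texts=["a test document"], n_results=1)
print(results["ids"])  # should contain the id added above, e.g. [["1"]]

# Inspect the stored embedding; BAAI/bge-small-en-v1.5 should produce 384-dimensional vectors.
stored = col.get(ids=["1"], include=["embeddings"])
print(len(stored["embeddings"][0]))  # expected: 384
```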

0 commit comments
