[Configure your OpenAI API key](https://cocoindex.io/docs/ai/llm#openai). Alternatively, CocoIndex has native support for Gemini, Ollama, and LiteLLM, so you can choose your favorite LLM provider and run completely on-premises.
The GraphDB interface in CocoIndex is standardized: you only need to **switch the configuration**, with no additional code changes. CocoIndex supports exporting to Kuzu through its API server. You can bring up a Kuzu API server locally by running:
``` sh
KUZU_DB_DIR=$HOME/.kuzudb
KUZU_PORT=8123
docker run -d --name kuzu -p ${KUZU_PORT}:8000 -v ${KUZU_DB_DIR}:/database kuzudb/api-server:latest
```
Then add the Kuzu connection spec to your CocoIndex flow.
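As a rough sketch of what that wiring can look like (the auth-entry name, server URL, labels, and collector names below are placeholders for illustration — check the CocoIndex target documentation for the exact spec names in your installed version):

```python
import cocoindex

# Register the Kuzu API server as a reusable connection spec.
# "KuzuConnection" and the URL are assumptions for this example.
kuzu_conn = cocoindex.add_auth_entry(
    "KuzuConnection",
    cocoindex.targets.KuzuConnection(api_server_url="http://localhost:8123"),
)

# Inside a flow definition, point an export at Kuzu instead of Neo4j —
# only the target spec changes; the rest of the flow stays the same:
#
# document_node.export(
#     "document_node",
#     cocoindex.targets.Kuzu(
#         connection=kuzu_conn,
#         mapping=cocoindex.targets.Nodes(label="Document"),
#     ),
#     primary_key_fields=["filename"],
# )
```

Because the GraphDB interface is standardized, swapping this spec back to a Neo4j connection is the only change needed to switch targets.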
If you are building multiple CocoIndex flows from different projects to Neo4j, we recommend that you

- bring up a separate container for each flow if you are on the Community Edition, or
- set up different databases within one container if you are on the Enterprise Edition.

This way, you can clean up the data for each flow independently.

If you need to clean up data within the same database, you can do it manually by running `cocoindex drop <APP_TARGET>` from the project you want to clean up.