A collection of basic examples for using Ollama with client-side JavaScript. An Express server proxies requests to Ollama. The browser could call Ollama directly (e.g. from p5.js), but routing through the server avoids CORS issues and provides a foundation for plugging in other services and cloud-based LLMs.
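As a minimal sketch of that proxying idea (the `/api/chat` route, `public` directory, and port are assumptions for illustration, not necessarily this repo's exact code):

```js
// server.js: a minimal Express proxy that forwards chat requests to Ollama.
// Route, static directory, and port are illustrative assumptions.
import express from 'express';

const app = express();
app.use(express.json());
app.use(express.static('public')); // serve the client-side sketches

// Forward the JSON body to Ollama and stream the reply back, so the
// browser only ever talks to this server and CORS never comes up.
app.post('/api/chat', async (req, res) => {
  const upstream = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(req.body),
  });
  res.status(upstream.status);
  for await (const chunk of upstream.body) res.write(chunk);
  res.end();
});

app.listen(3000, () => console.log('Listening on http://localhost:3000'));
```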
- Install dependencies: `npm install`
- Make sure Ollama is running on your machine (http://localhost:11434); a quick health check is sketched after this list.
- Start the server: `npm start`
- Open your browser to http://localhost:3000
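To confirm Ollama is reachable before starting the server, a quick check can be run with Node (the plain-text "Ollama is running" reply is current Ollama behavior, noted here as an assumption):

```js
// check-ollama.mjs: verify the local Ollama server is up.
// Ollama answers GET / with a short plain-text status.
try {
  const res = await fetch('http://localhost:11434');
  console.log(res.ok ? await res.text() : `Unexpected status: ${res.status}`);
} catch {
  console.error('Ollama is not reachable on http://localhost:11434');
}
```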
- [Ollama](https://ollama.com) - Run LLMs locally
- [Ollama API Documentation](https://github.com/ollama/ollama/blob/main/docs/api.md)
- `1-chat` - streaming chatbot interface (a client-side streaming sketch follows this list)
- `2-code-generator` - generate and run p5.js sketches from text descriptions
- `3-vision` - image description of canvas drawings
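For a feel of what the chat example involves on the client side, here is a sketch of a streaming request through the proxy. The `/api/chat` route and `llama3.2` model name are assumptions; Ollama's streaming responses are newline-delimited JSON objects:

```js
// Browser side: send a prompt through the proxy and stream the reply.
// '/api/chat' and 'llama3.2' are illustrative assumptions.
async function streamChat(prompt, onToken) {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'llama3.2',
      messages: [{ role: 'user', content: prompt }],
      stream: true,
    }),
  });
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffered = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    // Each complete line is one JSON object; keep any partial line.
    const lines = buffered.split('\n');
    buffered = lines.pop();
    for (const line of lines) {
      if (line.trim()) onToken(JSON.parse(line).message.content);
    }
  }
}

// Usage: append tokens to the page as they arrive.
streamChat('Why is the sky blue?', (t) => (document.body.textContent += t));
```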