Instill AI changelog

Local Full-Stack AI with Ollama Component


The Ollama Component is a key addition to 🔮 Instill Core, extending our full-stack AI platform to your local machine! Self-hosting 🔮 Instill Core is ideal for secure, efficient, and flexible AI development and deployment.

With this integration, you can access and utilize AI models served by a local Ollama instance (a rough sketch of the underlying calls follows the list) for tasks such as:

  • Text generation chat

  • Text embedding
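Both tasks map onto Ollama's local HTTP API. The snippet below is a minimal sketch of what these calls look like against a plain Ollama server, not the Instill component configuration itself; the model name `llama3` and the default port `11434` are assumptions you should swap for your own setup.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama port; adjust for your setup


def post_json(path: str, payload: dict) -> dict:
    """Send a JSON POST request to the local Ollama server and decode the reply."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Text generation chat: one-shot chat completion (model name is an example)
chat = post_json("/api/chat", {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Summarize what Instill Core does."}],
    "stream": False,
})
print(chat["message"]["content"])

# Text embedding: turn a string into a vector
emb = post_json("/api/embeddings", {
    "model": "llama3",
    "prompt": "Instill Core is a full-stack AI platform.",
})
print(len(emb["embedding"]), "dimensions")
```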

For setup, refer to our deployment documentation and the Ollama server tutorial.

Note: You may need to adjust your Docker network settings so the 🔮 Instill Core containers can reach the Ollama server; a quick connectivity check is sketched below.
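One way to verify connectivity is to hit Ollama's model-listing endpoint from the environment where 🔮 Instill Core runs. The host names below are assumptions about a typical setup: `host.docker.internal` works on Docker Desktop (macOS/Windows), while on Linux you would usually join both containers to a shared Docker network and address the Ollama container by name.

```python
import json
import urllib.request

# Candidate base URLs for reaching Ollama from the Instill Core environment.
# These are assumptions, not fixed values:
#   - host.docker.internal: Docker Desktop (macOS/Windows)
#   - ollama: an Ollama container named "ollama" on a shared Docker network
#   - localhost: Ollama running directly on the same host
CANDIDATES = [
    "http://host.docker.internal:11434",
    "http://ollama:11434",
    "http://localhost:11434",
]

for base in CANDIDATES:
    try:
        with urllib.request.urlopen(f"{base}/api/tags", timeout=3) as resp:
            models = json.loads(resp.read()).get("models", [])
            print(f"Reached Ollama at {base}; models: {[m['name'] for m in models]}")
            break
    except OSError as exc:  # urllib network errors subclass OSError
        print(f"Could not reach Ollama at {base}: {exc}")
```

Whichever base URL succeeds is the one to use when configuring the Ollama Component's endpoint.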

More at: Instill AI