Docker with Conversational AI: Build, Share, and Run
Learn the essentials of packaging your own applications and running powerful tools with Docker
For anyone working with conversational AI, Docker is a powerful tool for streamlining deployment, testing, and collaboration. It lets you share your applications with others and run applications that others have built. This six-part series walks through the essentials of Docker, from initial setup to sharing and running complex applications.
Introduction to Docker
Set up your local environment and learn the Docker fundamentals by working with a simple API application.
Part 1: Using Docker to work with conversational AI applications
Part 2: Creating your own Docker images for conversational AI applications
Part 3: Adding the secrets to make your dockerized conversational app run
Part 4: Using Docker Compose to set up and run multiple services
Practical Applications
Run powerful applications others have built to experiment with conversational AI locally.
Part 5: Local, offline chatting with LibreChat and Ollama
Part 6: Experimenting with generative workflows using Flowise, Ollama, and LangFuse
We hope you enjoy the series.