LocalAI Hosting
Run LLMs, image generation, and audio generation on your own vServer – installed with 1 click, hosted in Germany, and without sending any data to external services.
What is LocalAI?
LocalAI is a drop-in replacement for the OpenAI API that runs entirely locally. Use LLMs, image generation, and audio generation without sending data to external services.
Features & Benefits
OpenAI-compatible API
Existing OpenAI clients and SDKs keep working – just point the base URL at your own server (see the example below this list).
LLM, TTS, image generation
Text generation, text-to-speech, and image generation from a single local API.
GPU & CPU support
Runs on plain CPUs and uses GPU acceleration when available.
No cloud dependency
All inference happens on your vServer – no data ever leaves it.
Model gallery
Browse and install ready-to-use models directly from the built-in gallery.
Container-optimized
Ships as container images, keeping installation and updates simple.
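Because the API is OpenAI-compatible, existing tooling keeps working once it points at your server. The following is a minimal sketch, assuming the official OpenAI Python SDK, LocalAI on its default port 8080, and placeholder values for the server address and model name that you would replace with your own:

    from openai import OpenAI

    # Point the standard OpenAI client at your own LocalAI instance instead of api.openai.com.
    client = OpenAI(
        base_url="http://your-server.example:8080/v1",  # assumed address; LocalAI listens on 8080 by default
        api_key="not-needed",  # LocalAI needs no key by default, but the SDK expects a non-empty string
    )

    response = client.chat.completions.create(
        model="llama-3.2-1b-instruct",  # placeholder; use any model you installed from the gallery
        messages=[{"role": "user", "content": "Say hello from my own server."}],
    )
    print(response.choices[0].message.content)

The same request against api.openai.com differs only in the base URL and API key – that is the whole point of the drop-in compatibility.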
3 Steps to LocalAI
From ordering to a running installation in just a few minutes
Order vServer
Choose a suitable vServer plan. For LocalAI we recommend at least the XE (6 vCores, 32 GB RAM). Your server is ready in 60 seconds.
Install LocalAI with 1-Click
In the customer center, simply select LocalAI as a template. The installation runs fully automatically.
Configure & Get Started
After installation, access LocalAI directly through your browser or via its API and set everything up to your liking. A quick check that the API is responding is shown below.
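As a first sanity check after the installation, you can list the models your instance currently serves. This is a sketch under the same assumptions as the example above (OpenAI Python SDK, default port 8080, your own server address filled in):

    from openai import OpenAI

    client = OpenAI(base_url="http://your-server.example:8080/v1", api_key="not-needed")

    # /v1/models is part of the OpenAI-compatible API and lists every model LocalAI currently serves.
    for model in client.models.list():
        print(model.id)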
Recommended vServer for LocalAI
All plans with 1-click LocalAI installation, root access and unlimited traffic
Virtual NVMe XS
For Testing
- vCores: 2
- ECC RAM: 2 GB
- NVMe SSD: 75 GB
- Traffic: Flatrate
- DDoS Protection
- 1-Click LocalAI
Virtual NVMe XB
Standard
- vCores: 4
- ECC RAM: 8 GB
- NVMe SSD: 150 GB
- Traffic: Flatrate
- DDoS Protection
- 1-Click LocalAI
Virtual NVMe XP
Power User
- vCores: 4
- ECC RAM: 16 GB
- NVMe SSD: 256 GB
- Traffic: Flatrate
- DDoS Protection
- 1-Click LocalAI
Virtual NVMe XE
Recommended
- vCores: 6
- ECC RAM: 32 GB
- NVMe SSD: 512 GB
- Traffic: Flatrate
- DDoS Protection
- 1-Click LocalAI
Frequently Asked Questions
Everything you need to know about LocalAI hosting
What is LocalAI?
LocalAI is an open-source, drop-in replacement for the OpenAI API that runs entirely on your own server. Use LLMs, image generation, and audio generation without sending data to external services.
Which vServer plan do I need for LocalAI?
For LocalAI we recommend at least the XE (6 vCores, 32 GB RAM). This plan provides enough resources for smooth operation. For higher usage or more users, we recommend upgrading to a larger plan.
How do I install LocalAI?
With our 1-click installation, LocalAI is automatically set up on your vServer. After ordering, simply select LocalAI as a template in the customer center. All dependencies are automatically installed and configured.
Is LocalAI free?
Yes, LocalAI is open source and licensed under the MIT license. The software itself is completely free. You only pay for the vServer running LocalAI – starting from just €2.95/month.
Is my data hosted in Germany?
Yes! LocalAI runs exclusively on your own vServer in our datacenter in Stuttgart. Your data never leaves Germany and is stored fully GDPR compliant. We operate our own infrastructure – no third-party providers, no external hardware.
Can I switch to a larger plan later?
Yes, upgrading to a larger vServer is possible at any time. Simply contact our support and we'll take care of it – without data loss and without interruption.
Discover more 1-click apps
LocalAI is just one of many apps you can install with 1 click on your vServer.
View all apps →
Our Promise
What sets us apart from other providers
Own Datacenter
We operate our entire infrastructure ourselves in Stuttgart – no resellers, no external hardware.
Cancel Monthly
No long contract terms. Cancel monthly – fair and flexible.
30-Day Money Back
Not satisfied? Get your money back within 30 days – no questions asked.
100% Germany
Your data never leaves Germany. GDPR compliant and under German law.
Green IT – Hosting with Responsibility
Climate protection and our environment matter to us. That's why we operate our entire infrastructure exclusively with electricity from renewable energy sources. We even generate part of it ourselves with our own solar power plant directly at the datacenter.
