Llamatik

Kotlin. LLMs. On your terms.

About

llamatik-icon-logo.png

Kotlin-first llama.cpp integration for on-device and remote LLM inference.

  • Kotlin Multiplatform: shared code across Android, iOS, and desktop
  • Offline inference via llama.cpp (compiled with Kotlin/Native bindings)
  • Remote inference via optional HTTP client (e.g. llamatik-server)
  • Embeddings and text generation support
  • Works with GGUF models (e.g. Mistral, Phi, LLaMA)
  • Lightweight and dependency-free runtime
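To give a feel for how these pieces fit together, here is a brief sketch of loading a GGUF model and running generation and embeddings. The identifiers (`Llamatik`, `loadModel`, `generate`, `embed`) are illustrative placeholders, not Llamatik's actual API surface — check the project's GitHub repository for the real interface.

```kotlin
// Hypothetical usage sketch; the names below are illustrative, not Llamatik's real API.
fun main() {
    // Load a GGUF model from local storage for fully offline inference.
    val model = Llamatik.loadModel("models/mistral-7b-instruct.Q4_K_M.gguf")

    // Text generation: prompt in, completion out.
    val reply = model.generate(prompt = "Explain Kotlin Multiplatform in one sentence.")
    println(reply)

    // Embeddings: a float vector suitable for semantic search or RAG.
    val vector: FloatArray = model.embed("on-device inference")

    model.close()
}
```

The same shared Kotlin code would target Android, iOS, and desktop; the optional HTTP client (e.g. llamatik-server) would swap the local backend for a remote one without changing call sites.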

Latest Post

Nov 25, 2025

“Introducing the Llamatik App: Offline AI Chat on Your Mobile Device”

🚀 We’re excited to announce the launch of the Llamatik App, a fully offline AI chatbot designed to showcase the power and versatility of the Llamatik Kotlin Multiplatform library. Whether you’re a developer curious about on-device AI or a user who wants a private, fast, and modern chatbot experience, the Llamatik App brings the full potential of lightweight LLMs directly to your mobile device.

Read more
All Posts

Contact

Have a question about Llamatik? Want to contribute, collaborate, or integrate llama.cpp into your project using Kotlin? We’d love to hear from you.

💬 General inquiries: hello@multiplatformkickstarter.com

🛠️ Technical support: support@multiplatformkickstarter.com

🧑‍💻 Interested in contributing? Check out our GitHub repository

📰 For announcements, follow us on Twitter or the Multiplatform Kickstarter Twitter account

We’re always excited to connect with developers, researchers, and companies who believe in open, offline-friendly AI.

