Llamatik

Kotlin. LLMs. On your terms.

About


Kotlin-first llama.cpp integration for on-device and remote LLM inference.

  • Kotlin Multiplatform: shared code across Android, iOS, and desktop
  • Offline inference via llama.cpp, exposed through Kotlin/Native bindings
  • Remote inference via optional HTTP client (e.g. llamatik-server)
  • Embeddings and text generation support
  • Works with GGUF models (e.g. Mistral, Phi, LLaMA)
  • Lightweight and dependency-free runtime
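
To give a feel for how the features above could come together in shared KMP code, here is a minimal sketch. All names in it (`InferenceEngine`, `generate`, `embed`, `summarize`) are illustrative assumptions for this example, not Llamatik's documented API; see the GitHub repository for the real surface.

```kotlin
// Hypothetical sketch of an offline-inference abstraction in common
// KMP code. One implementation could wrap the llama.cpp Kotlin/Native
// bindings for on-device inference; another could use the optional
// HTTP client to talk to a remote llamatik-server instance.
interface InferenceEngine {
    // Generate a text completion for the given prompt.
    fun generate(prompt: String, maxTokens: Int = 128): String

    // Compute an embedding vector for the given text.
    fun embed(text: String): FloatArray
}

// Application code stays identical whether inference runs locally
// against a GGUF model or remotely over HTTP.
fun summarize(engine: InferenceEngine, article: String): String =
    engine.generate("Summarize in one sentence:\n$article", maxTokens = 64)
```

The point of the interface is the on-device/remote split the feature list describes: the shared module depends only on the abstraction, and each platform or deployment picks the backend.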

Latest Post

Jul 22, 2025

“Introducing Llamatik: Offline LLMs for Kotlin Multiplatform”

We’re thrilled to introduce Llamatik, an open-source Kotlin Multiplatform library that brings local Large Language Models (LLMs) to Android, iOS, desktop, and beyond using the power of llama.cpp. Llamatik makes it simple and efficient to integrate offline, on-device inference and embeddings into your KMP apps, whether you’re building an AI assistant, a RAG chatbot, or an edge intelligence tool. Read more
All Posts

Contact

Have a question about Llamatik? Want to contribute, collaborate, or integrate llama.cpp into your project using Kotlin? We’d love to hear from you.

💬 General inquiries: hello@multiplatformkickstarter.com

🛠️ Technical support: support@multiplatformkickstarter.com

🧑‍💻 Interested in contributing? Check out our GitHub repository

📰 For announcements, follow Llamatik on Twitter or the Multiplatform Kickstarter Twitter account

We’re always excited to connect with developers, researchers, and companies who believe in open, offline-friendly AI.

