Introducing Bleta-Logjike: Advanced Reasoning for Albanian Language AI

MARCH 23, 2025 • 8 MIN READ
By Dialogo Team
We're proud to introduce Bleta-Logjike, an important step forward for Albanian-capable large language models. Fine-tuned by the Dialogo team, Bleta builds on the Gemma 3 architecture, transformed through specialized training on multi-step reasoning and math datasets. The model is particularly strong at logical reasoning and step-by-step thinking, making it well suited to complex problem-solving in the Albanian language.

Try Bleta Today
Experience Bleta immediately through our interactive notebook or download the model:
- ▶️ Run in Browser: Google Colab Notebook
- 📥 Full Model: klei1/bleta-logjike-27b
- 🖥️ GGUF Format: klei1/bleta-logjike-27b-gguf
Building Upon Previous Success
Bleta represents the evolution of our ongoing efforts to improve open-source Albanian language models. Following our successful release of Bleta-DeepSeek-R1 last year—our first Albanian model with enhanced logical reasoning capabilities—this new multilingual 27B parameter model marks a significant leap forward.
The new Bleta model maintains the Albanian language capabilities of its predecessor while substantially expanding its logical reasoning strengths. This allows it to perform complex reasoning tasks in Albanian with remarkable clarity and precision.
Revolutionary Logical Reasoning Training
What makes Bleta truly special is our efficient training approach. Drawing inspiration from the methodology behind DeepSeek's renowned reasoning models, we employed GRPO (Group Relative Policy Optimization) to fine-tune Bleta's thinking processes.
This sophisticated training process involved:
- Starting with a Strong Foundation: We built upon Google DeepMind's impressive Gemma 3 27B, which already exhibits excellent Albanian language capabilities and strong overall benchmark performance
- Step-by-Step Reasoning: Training the model to break down complex problems into logical steps before reaching conclusions
- Format Control: Teaching the model to clearly demarcate its reasoning process from its final answers
- Reward-Based Learning: Using sophisticated reward mechanisms to reinforce accurate logical deductions and problem-solving approaches
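The format-control and reward-based steps above can be sketched with a pair of simple reward functions of the kind commonly used in GRPO fine-tuning. The `<reasoning>`/`<answer>` tag names and the reward values below are illustrative assumptions, not confirmed details of Bleta's actual training setup:

```python
import re

# Hypothetical output format: tag names are assumptions for illustration.
FORMAT = re.compile(
    r"<reasoning>(.+?)</reasoning>\s*<answer>(.+?)</answer>", re.DOTALL
)

def format_reward(completion: str) -> float:
    """Reward completions that cleanly separate reasoning from the answer."""
    return 1.0 if FORMAT.search(completion) else 0.0

def correctness_reward(completion: str, expected: str) -> float:
    """Reward completions whose final answer matches the reference."""
    match = FORMAT.search(completion)
    if match is None:
        return 0.0
    return 2.0 if match.group(2).strip() == expected else 0.0

out = ("<reasoning>x + y = 10 dhe x - y = 4, pra 2x = 14 dhe x = 7."
       "</reasoning><answer>7</answer>")
print(format_reward(out), correctness_reward(out, "7"))
```

In GRPO, rewards like these are computed over a group of sampled completions per prompt, and the policy is updated toward completions that score above the group average.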
Bleta in Action: Logical Reasoning Examples

Step-by-step Thinking Process
Bleta breaks down complex problems into logical steps before reaching conclusions, making the reasoning process transparent and understandable. This example demonstrates how the model resolves multi-step logical challenges.
Technical Specifications
27 Billion Parameters
With 27B parameters, Bleta has the capacity to understand complex linguistic patterns and perform sophisticated reasoning in Albanian.
1024 Token Context
The model supports extended reasoning chains with a 1024 token context window, ideal for processing complex problems.
Efficient Quantization
Available in 4-bit and 8-bit quantized formats, allowing deployment on consumer-grade hardware while maintaining performance.
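As a rough sanity check on what "consumer-grade hardware" means here, the weight footprint at each quantization level can be estimated with back-of-envelope arithmetic (weights only, ignoring the KV cache, runtime overhead, and the per-block scale metadata that real GGUF quant formats add):

```python
PARAMS = 27e9  # 27 billion parameters

def weight_size_gb(bits_per_param: float) -> float:
    """Approximate in-memory size of the model weights alone, in GB."""
    return PARAMS * bits_per_param / 8 / 1e9

for bits in (4, 8, 16):
    print(f"{bits}-bit: ~{weight_size_gb(bits):.1f} GB")
```

So the 4-bit variant needs roughly 13.5 GB for weights versus about 54 GB at full 16-bit precision, which is the difference between fitting on a single consumer GPU (with CPU offload) and requiring datacenter hardware.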
Compatible Integration
Works with standard LLM interfaces and libraries, including llama.cpp, LM Studio, and other popular frameworks.
```shell
# Using llama.cpp with GGUF format
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make
./main -m path/to/bleta-logjike-27b-finetune.Q8_0.gguf -n 1024 \
  -p "Cila është zgjidhja e problemit të mëposhtëm? Nëse x + y = 10 dhe x - y = 4, sa është vlera e x?"
```
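The sample prompt asks for x given x + y = 10 and x - y = 4 ("What is the solution to the following problem?"). The expected answer the model should reason its way to can be verified in a few lines of Python (a check on the example itself, not part of the model pipeline):

```python
# Solve x + y = 10, x - y = 4 by elimination:
# adding the two equations gives 2x = 14, so x = 7, and y = 10 - x = 3.
x = (10 + 4) / 2
y = 10 - x
assert x + y == 10 and x - y == 4
print(f"x = {x:g}, y = {y:g}")  # x = 7, y = 3
```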
Supporting Albania's EU Integration Journey
Beyond its logical reasoning capabilities, future models in the Bleta line could prove valuable in supporting Albania's EU integration process. The current release is an 8-bit quantized version of the 27B parameter model; with more Albanian-language datasets, greater computing power, and improved training algorithms, we expect future Bleta releases to process and analyze complex texts even more effectively.
The Bleta line of fine-tuned Albanian-capable LLMs can assist in the challenging process of transposing the EU acquis, the massive body of EU legislation comprising over 100,000 pages, into Albanian legal frameworks. Its logical reasoning skills can help identify inconsistencies, compare regulatory approaches, and suggest potential harmonization strategies.
Privacy and Security
The Bleta models can run entirely locally, enabling government institutions and educational organizations to process sensitive information without external data transmission. This makes it ideal for confidential work and secure applications across various sectors, including the EU integration process.
Real-World Applications
Education
Enhancing critical thinking skills with step-by-step problem-solving assistance in Albanian schools and universities.
Legal Analysis
Helping legal professionals analyze complex regulations and identify potential issues or inconsistencies.
Research
Supporting Albanian researchers with literature review, hypothesis testing, and methodology development.
Government
Assisting in policy analysis and development, including EU integration documentation and compliance checking.
Business
Enabling Albanian businesses to conduct market analysis, strategic planning, and business intelligence in their native language.
Healthcare
Supporting medical professionals with complex diagnostic reasoning and medical literature review in Albanian.
A Vision for Albania's AI Future
Bleta represents more than a technological advancement: it is our latest effort to bring recent developments in open-source LLMs to Albanian speakers, businesses, and government. By enriching a benchmark-leading open-source model with sophisticated logical reasoning capabilities, we're opening new possibilities across the education, research, government, and business sectors.
Our roadmap includes further enhancing the Bleta line of fine-tuned LLMs in reasoning abilities, Albanian language skills, and real-world usefulness. As we rapidly move into the age of AI agents, we hope that our efforts will enable Albanian speakers, students, businesses, and government institutions to fully benefit from productivity gains without being tied to proprietary providers.
Get Started Today
Access Bleta through any of these options:
- 🔬 Experiment in our interactive Colab notebook
- ⬇️ Download the full model from Hugging Face
- 🖥️ Use the GGUF version for local deployment
This groundbreaking project was developed by the Dialogo team as part of ongoing efforts to advance Albanian language technology. We welcome collaboration with researchers, educators, legal professionals, and government institutions interested in leveraging Bleta's powerful logical reasoning capabilities.