Repository: localai
License: apache-2.0

Jan-Nano is a compact 4-billion parameter language model specifically designed and trained for deep research tasks. This model has been optimized to work seamlessly with Model Context Protocol (MCP) servers, enabling efficient integration with various research tools and data sources.
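Since Jan-Nano is served through LocalAI, it can be queried over the OpenAI-compatible chat completions endpoint. The sketch below builds such a request payload; the model name (`jan-nano`), the server URL, and the system prompt are illustrative assumptions, not details from the model card.

```python
import json

# Hypothetical example: building a request for Jan-Nano behind a LocalAI
# server's OpenAI-compatible /v1/chat/completions endpoint. The model
# name and URL below are assumptions; adjust them to your deployment.
BASE_URL = "http://localhost:8080/v1/chat/completions"

def build_research_request(question: str, model: str = "jan-nano") -> dict:
    """Build a chat-completion payload for a deep-research query."""
    return {
        "model": model,
        "messages": [
            {
                "role": "system",
                "content": "You are a research assistant with access to MCP tools.",
            },
            {"role": "user", "content": question},
        ],
        "temperature": 0.7,
    }

payload = build_research_request("Summarize recent work on context extension.")
print(json.dumps(payload, indent=2))
```

The payload can then be POSTed to `BASE_URL`; MCP tool wiring itself happens on the server side and is out of scope for this sketch.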
Repository: localai
License: apache-2.0

Jan-Nano-128k represents a significant advancement in compact language models for research applications. Building upon the success of Jan-Nano, this enhanced version features a native 128k context window that enables deeper, more comprehensive research capabilities without the performance degradation typically associated with context extension methods.

Key Improvements:
- **Research Deeper:** The extended context allows for processing entire research papers, lengthy documents, and complex multi-turn conversations.
- **Native 128k Window:** Built from the ground up to handle long contexts efficiently, maintaining performance across the full context range.
- **Enhanced Performance:** Unlike models relying on traditional context extension methods, Jan-Nano-128k shows improved performance with longer contexts.

This model maintains full compatibility with Model Context Protocol (MCP) servers while dramatically expanding the scope of research tasks it can handle in a single session.
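To actually use the full native window under LocalAI, the model definition must declare it. The YAML below is a minimal sketch of such a definition; the file names and the model name are illustrative assumptions, while `context_size` and `parameters.model` are standard LocalAI config fields.

```yaml
# Hypothetical LocalAI model definition for Jan-Nano-128k.
# The name and .gguf file name are assumptions; the 131072 value
# corresponds to the advertised native 128k context window.
name: jan-nano-128k
context_size: 131072
parameters:
  model: jan-nano-128k.gguf
```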
Repository: localai
License: apache-2.0
This is an uncensored version of Menlo/Jan-nano created with abliteration (see remove-refusals-with-transformers to learn more about it). This is a crude, proof-of-concept implementation for removing refusals from an LLM without using TransformerLens. Ablation was performed using a new, faster method that yields better results.
Repository: localai
License: gemma
This model is a fine-tuned version of google/gemma-3-12b-pt. As it is intended solely for text generation, we have extracted and utilized only the Gemma3ForCausalLM component from the original architecture. Unlike our previous EEVE models, this model does not feature an expanded tokenizer.

Base Model: google/gemma-3-12b-pt

This model is a 12-billion-parameter, decoder-only language model built on the Gemma3 architecture and fine-tuned by Yanolja NEXT. It is specifically designed to translate structured data (JSON format) while preserving the original data structure.

The model was trained on a multilingual dataset covering the following languages equally: Arabic, Bulgarian, Chinese, Czech, Danish, Dutch, English, Finnish, French, German, Greek, Gujarati, Hebrew, Hindi, Hungarian, Indonesian, Italian, Japanese, Korean, Persian, Polish, Portuguese, Romanian, Russian, Slovak, Spanish, Swedish, Tagalog, Thai, Turkish, Ukrainian, and Vietnamese. While optimized for these languages, it may also perform effectively on other languages supported by the base Gemma3 model.
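The core promise of this model is translating JSON values while leaving the structure untouched. The sketch below builds such a translation prompt and shows one simple way to check structure preservation on the result; the prompt wording and the sample data are illustrative assumptions (consult the model card for the exact chat template).

```python
import json

# Hypothetical structure-preserving translation workflow. The prompt
# text and sample JSON are assumptions for illustration only.
source = {
    "title": "Ocean View Hotel",
    "description": "A quiet hotel near the beach.",
}

prompt = (
    "Translate the values of the following JSON object into Korean. "
    "Keep every key and the overall structure unchanged.\n"
    + json.dumps(source, ensure_ascii=False)
)

def structure_preserved(original: dict, translated: dict) -> bool:
    """Check that the translated object keeps exactly the original keys."""
    return set(original) == set(translated)

# Simulated model output (placeholder values, illustrative only):
translated = {"title": "...", "description": "..."}
print(structure_preserved(source, translated))  # prints True
```

A check like `structure_preserved` is a cheap post-processing guard: it catches the most common failure mode (dropped or renamed keys) before the translated object is written back into a catalog or CMS.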
Repository: localai
License: gemma

**YanoljaNEXT-Rosetta-27B-2511**

*A multilingual, structure-preserving translation model built on Gemma3*

This 27-billion-parameter language model, developed by Yanolja NEXT, is fine-tuned from **Google's Gemma3-27B** to excel at translating structured data (JSON, YAML, XML) while preserving the original format. It supports **32 languages**, including English, Chinese, Korean, Japanese, German, French, Spanish, and more, with balanced training across all languages.

Designed specifically for **high-accuracy, structured translation tasks**, such as localizing product catalogs, translating travel content, or handling technical documentation, the model ensures output remains syntactically valid and semantically precise. It achieves top-tier performance on English-to-Korean translation (chrF++ score: **37.21**) and is optimized for efficient inference. The model is released under the **Gemma license**, making it suitable for research and commercial use with proper attribution.

**Use Case:** Ideal for developers and localization teams needing reliable, format-aware translation in multilingual applications.

**Base Model:** `google/gemma-3-27b-pt`
**License:** Gemma (via Google)
**Repository:** [yanolja/YanoljaNEXT-Rosetta-27B-2511](https://huggingface.co/yanolja/YanoljaNEXT-Rosetta-27B-2511)