The Model

Rabbit is a fine-tuned 3.8B-parameter language model built specifically for organizational memory. Small enough for on-premises deployment. Specialized enough to outperform general-purpose models on memory tasks.

Base model: Phi-3.5 Mini Instruct
Parameters: 3.8 billion
Training examples: 82,314
Quantization: 4-bit (2.2 GB)
Signals: 12 specialized tasks
Fine-tuning: LoRA (r=16, alpha=16)
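As a rough sanity check on the quantized size, a back-of-the-envelope calculation (treating all 3.8B parameters as 4-bit weights; attributing the gap to 2.2 GB to quantization overhead is our assumption here, not a published breakdown):

```python
# Rough size estimate for a 4-bit quantized 3.8B-parameter model.
params = 3.8e9
bits_per_param = 4

raw_gb = params * bits_per_param / 8 / 1e9  # pure weight storage in GB
print(f"raw 4-bit weights: {raw_gb:.1f} GB")  # 1.9 GB

# The shipped 2.2 GB artifact is larger; the difference (assumed here)
# would cover quantization scales/zero-points and any layers kept at
# higher precision, a common pattern in 4-bit schemes.
overhead_gb = 2.2 - raw_gb
print(f"implied overhead: {overhead_gb:.1f} GB")
```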

Architecture decisions

We chose Phi-3.5 Mini as our base for three reasons: it excels at structured output and instruction following, it is commercially usable under the MIT license, and at 3.8B parameters it fits on a single T4 GPU with room for embedding and reranking models alongside it.

All 12 signals share the same weights. Signal routing happens through task-specific system prompts and prefix tokens at inference time. This means a single deployment serves all capabilities with no additional model loading.
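A minimal sketch of that routing pattern, with shared weights and per-task prompts. The signal names, system prompts, and prefix tokens below are illustrative placeholders, not Rabbit's actual prompts:

```python
# Illustrative prompt-based routing: one set of model weights serves
# every signal; only the system prompt and prefix token change.
# All names and prompt text here are hypothetical examples.
SIGNALS = {
    "summarize": {
        "system": "You condense meeting notes into decisions and owners.",
        "prefix": "<|summarize|>",
    },
    "entity_link": {
        "system": "You link mentions to known people and projects.",
        "prefix": "<|entity_link|>",
    },
}

def build_prompt(signal: str, user_text: str) -> list:
    """Assemble chat messages for one signal at inference time.

    The deployed model never changes between signals; routing is
    entirely in the prompt the caller constructs.
    """
    cfg = SIGNALS[signal]
    return [
        {"role": "system", "content": cfg["system"]},
        {"role": "user", "content": cfg["prefix"] + " " + user_text},
    ]

messages = build_prompt("summarize", "Raw notes from Tuesday's sync...")
```

Because routing lives in the prompt rather than in separate checkpoints, adding a thirteenth signal is a prompt change, not a deployment change.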