feat: index Reader LM 1.5B + BGE-M3 for OSINT pipeline

jinaai/reader-lm-1.5b (safetensors, 1 shard, 3.1 GB): local
HTML→Markdown model; no Jina API needed. bgz7 → palette → O(1) HTML
structure recognition.

CompendiumLabs/bge-m3-gguf (GGUF F16, ~1.2 GB): multilingual embedding
model; replaces DeepNSM for non-English. bgz7 → palette → O(1) semantic
similarity.

Together: Reader LM reads the web, BGE-M3 embeds it, AriGraph stores it
as SPO triplets, and AutocompleteCache routes it at 17K tok/sec.

https://claude.ai/code/session_01M3at4EuHVvQ8S95mSnKgtK
#72
+88
−491
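
The read → embed → store pipeline described above can be sketched end to end. This is a minimal illustration, not the pipeline's actual code: the `TripletStore` class is hypothetical, the (subject, predicate, object) schema is an assumption about how AriGraph-style memory might be laid out, and BGE-M3 is swapped for a tiny bag-of-words embedder so the snippet runs standalone without downloading the 1.2 GB GGUF file.

```python
import math
from collections import Counter

def embed(text: str) -> dict[str, float]:
    # Stand-in for BGE-M3: a unit-normalized sparse bag-of-words vector.
    # Real code would request dense embeddings from the GGUF model via
    # llama.cpp's embedding mode instead.
    counts = Counter(text.lower().split())
    norm = math.sqrt(sum(c * c for c in counts.values())) or 1.0
    return {tok: c / norm for tok, c in counts.items()}

def sim(a: dict[str, float], b: dict[str, float]) -> float:
    # Cosine similarity; inputs are already unit-normalized.
    return sum(w * b.get(tok, 0.0) for tok, w in a.items())

class TripletStore:
    """AriGraph-style memory keyed on (subject, predicate, object).
    This layout is an assumption for illustration, not AriGraph's
    actual schema."""

    def __init__(self) -> None:
        self.triplets: dict[tuple[str, str, str], dict[str, float]] = {}

    def add(self, s: str, p: str, o: str) -> None:
        self.triplets[(s, p, o)] = embed(f"{s} {p} {o}")

    def query(self, text: str, k: int = 1) -> list[tuple[str, str, str]]:
        q = embed(text)
        ranked = sorted(self.triplets, key=lambda t: -sim(q, self.triplets[t]))
        return ranked[:k]

# In the real pipeline, Reader LM's Markdown output would be chunked
# into facts; the two triplets below are toy examples.
store = TripletStore()
store.add("reader-lm", "converts", "html to markdown")
store.add("bge-m3", "embeds", "multilingual text")
hits = store.query("multilingual embedding")
# → [("bge-m3", "embeds", "multilingual text")]
```

The sparse-dict embedding keeps the sketch deterministic and dependency-free; swapping `embed` for real BGE-M3 vectors changes nothing else in the store or query path.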