When we introduced the **GLiNER bi-encoder** in 2024, it enabled efficient zero-shot NER across hundreds of entity types. But that was just the beginning. Our bigger goal was always clear: **linking text to millions of entities dynamically, without retraining**.
In other words: **true entity linking at scale** ⚡
This unlocks powerful applications:
▪️ More precise search with real-world entity disambiguation
▪️ Knowledge graph construction across diverse document collections
▪️ Wikification: turning raw text into richly linked, navigable knowledge
After nearly two years of research + engineering, this vision is now real.
We’re excited to release **GLinker** — a **production-ready**, zero-shot entity linking system powered by our novel **GLiNER bi-encoder**. It efficiently detects entity spans of any length and matches them directly to entity descriptions — **no retraining required**.
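To make the bi-encoder idea concrete, here is a minimal sketch of zero-shot span detection plus description matching. It uses the `gliner` package's `predict_entities` API for the detection step; the encoder model, the example entity descriptions, and the sentence-transformers matching step are illustrative assumptions, not the GLinker interface itself.

```python
# Conceptual sketch of bi-encoder entity linking: detect spans with a GLiNER
# model, then match each span against precomputed entity-description embeddings.
# Model names, descriptions, and the matching step below are illustrative only.
import numpy as np
from gliner import GLiNER
from sentence_transformers import SentenceTransformer

# 1) Zero-shot span detection with a GLiNER model (gliner package API).
ner = GLiNER.from_pretrained("urchade/gliner_base")
text = "Apple unveiled the Vision Pro headset in Cupertino."
spans = ner.predict_entities(
    text, labels=["company", "product", "location"], threshold=0.5
)

# 2) Encode candidate entity descriptions once; reuse them for any document.
encoder = SentenceTransformer("all-MiniLM-L6-v2")
entity_descriptions = {
    "Apple Inc.": "American technology company that designs consumer electronics.",
    "Apple (fruit)": "Edible fruit produced by the apple tree (Malus domestica).",
    "Cupertino, California": "City in Santa Clara County, California, USA.",
}
names = list(entity_descriptions)
desc_emb = encoder.encode(list(entity_descriptions.values()), normalize_embeddings=True)

# 3) Link each detected span to its closest description by cosine similarity.
for span in spans:
    span_emb = encoder.encode(span["text"], normalize_embeddings=True)
    scores = desc_emb @ span_emb  # cosine similarity (embeddings are normalized)
    best = int(np.argmax(scores))
    print(f'{span["text"]} ({span["label"]}) -> {names[best]} ({scores[best]:.2f})')
```

Because entity descriptions are embedded once and reused, the candidate set can grow to millions of entities without retraining either encoder; that precomputation is what makes linking at this scale practical.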
We’re Knowledgator, the team behind open-source NLP models like GLiNER, GLiClass, and many others used for zero-shot text classification and information extraction.
If you’ve explored them on Hugging Face or used our frameworks from GitHub, we’d love your input:
🧩 Which of our models, like GLiNER or the zero-shot classifiers, do you find helpful in your practical workflows?
🧩 How have setup, performance, and accuracy been for you?
🧩 Anything confusing, buggy, or missing that would make your workflow smoother?
Your feedback helps us improve speed, clarity, and stability for everyone in the open-source community.