Wednesday, May 13, 2026

AntAngelMed: The Largest Open-Source Medical LLM — 103B Parameters, Only 6.1B Active


Note that only a fraction of the parameters are active at inference time!

"A team of Chinese researchers has released AntAngelMed, a 103B-parameter open-source medical language model built on a 1/32 activation-ratio MoE architecture — meaning only 6.1B parameters are active at inference time. You get frontier-scale medical knowledge capacity at a fraction of the compute cost. It is the largest open-source medical LLM released to date, built on top of Ling-flash-2.0 and available on GitHub now."

Inside: Mira Murati's real-time AI → Gemini-powered cursor → 300M beats 27B → world's largest medical LLM drops open-source

