Good news! Does this model also cover traditional Chinese medicine? 😊
Note that only a fraction of the parameters is active at any given time!
"A team of Chinese researchers has released AntAngelMed, a 103B-parameter open-source medical language model built on a 1/32 activation-ratio MoE architecture — meaning only 6.1B parameters are active at inference time. You get frontier-scale medical knowledge capacity at a fraction of the compute cost. It is the largest open-source medical LLM released to date, built on top of Ling-flash-2.0 and available on GitHub now."