Trinity Large: An open 400B sparse MoE model
created: Jan. 28, 2026, 12:57 a.m. | updated: Jan. 29, 2026, 3:23 a.m.
At the time, Trinity Nano Preview and Trinity Mini had just been released, and Trinity Large had started training.
A less advanced version of this curation approach worked well for smaller models like Trinity Nano and Trinity Mini, but we wanted to shoot for the moon with Trinity Large.
So DatologyAI delivered a number of curation advancements specifically for Trinity Large.
It excels at creative writing, storytelling, role-play, chat scenarios, and real-time voice assistance, areas where typical reasoning models usually fall short.
If you're already using one of those for coding, Trinity Large should show up as an option.