MapleOS Forums

Join discussions with fellow MapleOS users and the core team. Share experiences, get help, and discover new ways to use the platform.

LLMs · 4 replies

Cohere Support at CanXP AI and why we are excited by it!

Started Sep 22, 2025

Jess B
Original post · Sep 22, 2025

Hey all,

We're pretty excited about supporting Cohere's Command A model.

If you missed our press release, here it is: CanXP AI Adds Support for Cohere’s Command A Model. A Match Made in Canada.

At a 256K context length, this is the biggest model we've supported to date. It's capable of maintaining coherence across extremely long and complex inputs.

It's also one of the world's greenest models: it runs on just 2 GPUs and delivers ~150% more throughput.

For us at CanXP AI this model brings us closer to our goals of homegrown AI strength and data sovereignty & trust.

If you're a paid subscriber, you have immediate access to try Command A today! Enjoy.

Expect to see more great Canadian models supported soon!
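If you want to kick the tires from a script, here's a minimal sketch using Cohere's own Python SDK. To be clear about assumptions: `command-a-03-2025` is Cohere's public model identifier, the payload shape follows Cohere's v2 chat API, and how CanXP AI actually exposes the model (endpoint, auth, model name) may well differ from this.

```python
import os


def build_chat_request(prompt: str) -> dict:
    """Build a chat payload in the shape Cohere's v2 chat API expects.

    The model name is Cohere's public Command A identifier; the CanXP AI
    platform may use a different one.
    """
    return {
        "model": "command-a-03-2025",
        "messages": [{"role": "user", "content": prompt}],
    }


request = build_chat_request("Summarize the attached 200-page report.")

# Only hit the API if a key is configured. The client comes from
# `pip install cohere`; the call below follows its v2 interface.
if os.environ.get("COHERE_API_KEY"):
    import cohere

    co = cohere.ClientV2(api_key=os.environ["COHERE_API_KEY"])
    response = co.chat(**request)
    print(response.message.content[0].text)
```

The payload-building step is split out so you can inspect (or swap) the request before sending it anywhere.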

Tim Bradsfield
Reply · Sep 22, 2025

This was a wise model to choose to support! Very happy to see it announced so quickly. When Vince and I chatted, he briefly mentioned he was considering it; I had no idea Command A was coming so soon.

To put this in perspective for folks:

Llama 3.2 supports a 128K context length!

Command A doubles that with a 256K context length!

This means Command A can handle far more substantial work in a single pass on larger datasets, papers, and documents.
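To make those numbers concrete, here's a back-of-the-envelope sketch. The ~0.75 words-per-token ratio and 500 words-per-page figure are rough rules of thumb for English text, not measured numbers for Command A's tokenizer:

```python
def approx_capacity(context_tokens: int,
                    words_per_token: float = 0.75,
                    words_per_page: int = 500) -> tuple[int, int]:
    """Roughly convert a token budget into words and printed pages."""
    words = int(context_tokens * words_per_token)
    pages = words // words_per_page
    return words, pages


for name, tokens in [("Llama 3.2", 128_000), ("Command A", 256_000)]:
    words, pages = approx_capacity(tokens)
    print(f"{name}: ~{words:,} words, roughly {pages} pages")
```

By that estimate, 256K tokens is on the order of 190K words — several book-length documents in one prompt.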

For Canadians, it's also very Canadian: it speaks Canadian French, and it was trained on Canadian materials.

Ken G
Reply · Sep 22, 2025

@Tim is 256K the entire context length, or does the system prompt get extra room on top of that?

I've never used Cohere's newest model, so this should be interesting. Command R+ felt more like a minimal GPT-3, so I'm curious what Command A will feel like.

Tim Bradsfield
Reply · Sep 22, 2025

I believe 256K is the entire context length. Try it out.
