
Cirrascale and Ai2 Accelerate AI Innovation with the Availability of the OLMo, Molmo, and Tülu Models on the Cirrascale Inference Platform

Organizations can now rapidly commercialize Ai2’s state-of-the-art open models with instant scalability and production-ready endpoints

PARIS, July 07, 2025 (GLOBE NEWSWIRE) -- Cirrascale Cloud Services, the leading provider of innovative cloud and managed solutions for training, inference, and inference-as-a-service, today announced the availability of the Allen Institute for Artificial Intelligence's (Ai2) OLMo, Molmo, and Tülu models on the Cirrascale Inference Platform. OLMo 2 delivers language understanding in compact 7B, 13B, and 32B versions; Molmo extends those gains to vision with an open multimodal family; and Tülu 3 offers a fully transparent post-training pipeline that achieves state-of-the-art results. These permissively licensed models let enterprises fine-tune with proprietary data, avoid vendor lock-in, and deploy lighter, cost-efficient workloads without sacrificing quality.

Building and maintaining the infrastructure to support AI model deployment at scale is complex, requiring specialized technical expertise that many businesses simply do not have. The Cirrascale Inference Platform now gives organizations easy access to Ai2's state-of-the-art open-source models, allowing for experimentation, testing, and production-scale use without the need to build their own infrastructure.

“Since we launched our family of truly open models last year, the AI community has been asking for API access. Today, in partnership with Cirrascale, we’re excited to deliver just that – an API to enable scalable, flexible, and cost-efficient integration,” said Sophie Lebrecht, COO for Ai2. “Our fully open models have already changed the way the AI community thinks about language models, and API access makes it that much easier for them to get into the hands of builders, developers, and researchers everywhere. We can’t wait to see what people will build with our OLMo, Molmo, and Tülu models now that they are so easily accessible through the Cirrascale Inference Platform.”

With this launch, Cirrascale is the first to offer commercial endpoints for Ai2's fully open models. OLMo is an advanced open language model released with full transparency: open weights, training data, and code under the Apache 2.0 license, empowering the AI community to advance open research. Molmo, Ai2’s multimodal family, delivers high-performance image, text, and speech capabilities that outperform proprietary models at a fraction of the size, thanks to its focus on high-quality, efficiently curated data. Tülu is a leading instruction-following model family, offering fully open-source data, code, and recipes designed to serve as a comprehensive guide for modern post-training techniques. Applying the Tülu post-training recipe to Llama 3.1 405B yields performance competitive with or superior to both DeepSeek V3 and GPT-4o, while surpassing prior open-weight post-trained models of the same size, including Llama 3.1 405B Instruct, on many standard benchmarks.

“Our new Inference Platform is designed for two core audiences: developers building differentiated models who need an endpoint offering in order to commercialize quickly, and enterprise customers with customized or fine-tuned models looking to deploy them at scale,” said Dave Driggers, CEO and Co-Founder, Cirrascale Cloud Services. “Including Ai2’s OLMo, Molmo, and Tülu models illustrates our ability to bring leading models to life in real-world applications without the friction of standing up hardware or figuring out accelerator compatibility.”

The availability of these Ai2 models adds to the already impressive list of benefits of the Cirrascale Inference Platform:

  • Instant Deployment: Turn any model into a live, scalable endpoint — no infrastructure setup required.
  • Multi-Model Support: Bring your own model or use pre-integrated ones like OLMo, Molmo, and Tülu.
  • Accelerator Optimization: Cirrascale auto-selects and configures the best AI accelerators and hardware for your model, enabling faster innovation.
  • Simplified Management and Scaling: Lets enterprises keep low-volume models on-premises while offloading more demanding workloads to the platform.
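For developers, the "instant deployment" model above typically reduces to a single authenticated HTTP call against a hosted endpoint. The sketch below is purely illustrative: the base URL, model identifier (`olmo-2-13b`), and OpenAI-compatible chat-completion schema are assumptions for the sake of example, not published details of the Cirrascale API.

```python
import json

# Hypothetical values -- not documented Cirrascale endpoints or model IDs.
API_BASE = "https://inference.example.com/v1"  # assumed base URL

def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble a chat-completion style request body for a hosted model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_chat_request(
    "olmo-2-13b",  # assumed model identifier
    "Summarize the benefits of open-weight models in one sentence.",
)
print(json.dumps(payload, indent=2))

# Sending the request would then be an ordinary authenticated POST, e.g.:
#   requests.post(f"{API_BASE}/chat/completions",
#                 headers={"Authorization": f"Bearer {API_KEY}"},
#                 json=payload)
```

The point of the pattern is that nothing model- or hardware-specific leaks into application code: swapping OLMo for Molmo or Tülu is a one-line change to the model name, with accelerator selection handled by the platform.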

Cirrascale will showcase its offerings, including the Inference Platform, on July 8 and 9 at the 2025 RAISE Summit at Carrousel du Louvre, Paris. Visit Cirrascale at booth No. 11 to discover firsthand how the platform accelerates AI adoption for both enterprises and model creators, and to learn how to deploy your own differentiated models or leverage the latest from Ai2.

For more information on the Cirrascale Inference Platform, visit https://www.cirrascale.com/inference.

About Cirrascale Cloud Services
Cirrascale Cloud Services is a leading cloud and managed services provider dedicated to deploying tailored, state-of-the-art compute resources and high-speed storage solutions at scale. Our AI Innovation Cloud and Inference Platform services are purpose-built to enable clients to scale their training and inferencing workloads for generative AI, large language models, and high-performance computing. To learn more about Cirrascale Cloud Services and its unique cloud offerings, please visit https://cirrascale.com or call (888) 942-3800.

Contact Information:
BOCA Marketing Agency for Cirrascale
Email: cirrascale@bocamarketing.com

Mike LaPan
Cirrascale Cloud Services
(888) 942-3800
info@cirrascale.com


