AI networking startup Upscale builds open-standard infrastructure using Switch Abstraction Interface, Ultra Ethernet protocols, and unified SONiC operating system to scale GPU clusters with vendor-agnostic hardware freedom

September 19, 2025 // by Finnovate

Startup Upscale AI is targeting an AI networking infrastructure market already valued at more than $20 billion annually, and it is quietly confident it can take a sizable bite out of that segment. According to Chief Executive Barun Kar, AI networks require a full-stack redesign: specialized auxiliary processing units, or XPUs; ultra-low-latency interconnects; and a more power-efficient operating system that can scale to support enormous clusters of thousands of graphics processing units.

Upscale AI is racing to build exactly that. One of its core components is a new kind of AI Network Fabric that it says is designed to enhance the performance of XPU clusters. XPUs matter because they handle the specialized infrastructure tasks of AI workloads, leaving GPUs free to be used exclusively for computation. The startup has also built an all-new unified network operating system based on open standards, namely the Switch Abstraction Interface and Software for Open Networking in the Cloud, known together as SAI/SONiC, which lets infrastructure scale through in-service network upgrades that maximize uptime. Another key development is Upscale AI's AI networking rack platform, which gives network operators the freedom to choose networking hardware from any vendor.

Although Upscale AI does not share any numbers, it is confident that its redesigned networks will deliver “breakthrough performance” for AI training, inference and edge AI deployments, and its long list of financial backers suggests that many believe its claims. The startup’s network brings together a host of open standards, giving data center operators complete freedom of choice. In addition to SAI/SONiC, it is built on standards such as Ultra Accelerator Link and Ultra Ethernet.
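For context on why a SAI/SONiC foundation makes the hardware vendor-agnostic: a SONiC-style network operating system never calls a switch vendor's ASIC SDK directly; it programs the chip only through the Switch Abstraction Interface, a standard C API for which each vendor ships its own implementation library. The sketch below illustrates that pattern using the publicly documented SAI calls (sai_api_initialize, sai_api_query, create_switch). It is a simplified illustration, not Upscale AI's code, and it assumes the Open Compute Project SAI headers plus a vendor-supplied libsai at link time; exact type and field names can vary slightly between SAI versions.

```c
/*
 * Minimal sketch of the vendor-agnostic SAI pattern used by SONiC-style
 * network operating systems. Requires the OCP SAI headers (sai.h) and a
 * vendor libsai implementation at link time; illustrative only.
 */
#include <stdbool.h>
#include <stdio.h>
#include <sai.h>

/* SAI asks the host NOS for platform/profile settings through this service
 * table. A real NOS wires these callbacks to its config store; stubbed here. */
static const char *profile_get_value(sai_switch_profile_id_t id, const char *key)
{
    (void)id; (void)key;
    return NULL;                       /* no profile settings provided */
}

static int profile_get_next_value(sai_switch_profile_id_t id,
                                  const char **key, const char **value)
{
    (void)id; (void)key; (void)value;
    return -1;                         /* no more key/value pairs */
}

int main(void)
{
    sai_service_method_table_t services = {
        .profile_get_value      = profile_get_value,
        .profile_get_next_value = profile_get_next_value,
    };

    /* Initialize whichever vendor's SAI library happens to be linked in. */
    if (sai_api_initialize(0, &services) != SAI_STATUS_SUCCESS) {
        fprintf(stderr, "sai_api_initialize failed\n");
        return 1;
    }

    /* Ask the library for its switch method table. The NOS only ever talks
     * to this standard interface, never to a vendor-specific SDK. */
    sai_switch_api_t *switch_api = NULL;
    if (sai_api_query(SAI_API_SWITCH, (void **)&switch_api) != SAI_STATUS_SUCCESS) {
        fprintf(stderr, "sai_api_query failed\n");
        return 1;
    }

    /* Create (initialize) the switch object; subsequent port, route and
     * queue programming is done through object IDs obtained the same way. */
    sai_attribute_t attr = {
        .id             = SAI_SWITCH_ATTR_INIT_SWITCH,
        .value.booldata = true,
    };
    sai_object_id_t switch_id = SAI_NULL_OBJECT_ID;
    if (switch_api->create_switch(&switch_id, 1, &attr) != SAI_STATUS_SUCCESS) {
        fprintf(stderr, "create_switch failed\n");
        return 1;
    }

    printf("switch initialized, object id 0x%lx\n", (unsigned long)switch_id);
    return 0;
}
```

Because every supported ASIC exposes the same method tables, the operating system image and its upgrade workflow stay identical across vendors, which is the property that enables the hardware freedom and in-service upgrades described above.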

Read Article

Category: Additional Reading


