In brief

  • Gradient raised $10M from Pantera, Multicoin, and HSG to introduce Lattica and Parallax—protocols designed to run AI models across decentralized devices instead of centralized servers.
  • The system leverages untapped computing power from phones, laptops, and IoT devices, using Solana to coordinate data and payments.
  • The team says this approach slashes costs, keeps user data local, and pushes back against AI monopolies—but critics say latency and complexity could be roadblocks.

Gradient Network closed a $10 million seed funding round to build what it calls a decentralized AI infrastructure stack, with venture firms Pantera Capital and Multicoin Capital leading the investment alongside HSG (formerly Sequoia Capital China).

The Singapore-based startup plans to use the funds to develop two core protocols—Lattica and Parallax—that would allow artificial intelligence models to run across a distributed network of devices rather than in centralized data centers. The company said both protocols will debut this week.

"We believe intelligence should be a public good, not a corporate asset," Eric Yang, co-founder of Gradient Network, said in an announcement shared with Decrypt. "This round gives us the momentum to build infrastructure that brings decentralization to the heart of AI."

The timing arrives as AI companies face mounting criticism over data privacy and the concentration of computational power among a handful of tech giants. Gradient's approach would tap unused processing power from smartphones, computers, and other devices to create, in effect, a global crowdsourced supercomputer.

Lattica functions as a peer-to-peer data communication protocol, similar to Bitcoin or BitTorrent—think of it as the plumbing that moves information between devices without going through central servers. The company said its network of "Sentry Nodes" has already facilitated over 1.6 billion connections across more than 190 regions.

Decentralizing AI

Parallax tackles the problem of how to run massive AI models without massive data centers. The protocol splits large language models into smaller pieces that can run simultaneously across multiple devices. Instead of sending data to OpenAI's or Amazon's servers for processing, Parallax would let the computation happen on a network of participating devices, keeping user data local.
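Gradient hasn't published Parallax's internals, but the idea of splitting a model's layers into shards that different devices run in sequence can be illustrated with a toy sketch. Everything here—the `Device` class, the `split_layers` helper, the stand-in "layers"—is hypothetical and for illustration only, not Gradient's actual implementation:

```python
# Conceptual sketch (not Gradient's code): pipeline-style model sharding.
# A model's layers are partitioned into contiguous shards, each held by a
# participating "device"; an input hops device to device, so no single
# party holds the whole model or sees anything beyond its own shard.

def split_layers(layers, num_devices):
    """Partition a list of layers into roughly equal contiguous shards."""
    shard_size = -(-len(layers) // num_devices)  # ceiling division
    return [layers[i:i + shard_size] for i in range(0, len(layers), shard_size)]

class Device:
    """Stands in for a phone or laptop contributing compute to the network."""
    def __init__(self, name, shard):
        self.name = name
        self.shard = shard

    def forward(self, x):
        # Run only this device's slice of the model.
        for layer in self.shard:
            x = layer(x)
        return x

# A toy "model": each layer is just a function on a number.
layers = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3, lambda x: x * x]

devices = [Device(f"device-{i}", shard)
           for i, shard in enumerate(split_layers(layers, 2))]

# Inference: activations are passed along the pipeline of devices.
x = 5
for device in devices:
    x = device.forward(x)
print(x)  # ((5 + 1) * 2 - 3) ** 2 = 81
```

A real system would add what this sketch omits: serializing tensors between machines, scheduling shards onto volatile consumer devices, and verifying that each participant computed its piece honestly—the coordination problems critics point to below.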

To be sure, critics have raised concerns that coordinating tasks across thousands of devices introduces complexity and that network latency remains a challenge for decentralized systems.

But the company says its distributed approach and technology could slash costs compared to traditional cloud computing while addressing privacy concerns. When AI models run on centralized servers, user queries and data get transmitted to and processed by those servers. Gradient's system would process data closer to where it's generated.

Gradient Network operates on Solana's blockchain, chosen for its high transaction speeds and low costs compared to other networks. The blockchain handles the coordination and payment mechanisms for devices contributing computing power to the network.

The startup joins a growing field of companies attempting to decentralize AI infrastructure. Competitors include SingularityNET, which focuses on creating a marketplace for AI services; the Superintelligence Alliance network; and various projects building on different blockchains. Bittensor and Gensyn have pursued similar distributed computing models, though with different technical approaches.

Gradient said it will release additional protocols beyond Lattica and Parallax, though it hasn't specified what these might include. The company also mentioned plans to publish research papers and open channels for developers to contribute to the project.

Edited by James Rubin
