Storj buys Valdi for cloud compute to bolster storage offers

The acquisition brings Storj's distributed storage offerings together with Valdi's distributed compute services for high-performance workloads, including generative AI.

Distributed storage vendor Storj will now offer GPU compute cycles on demand, claiming it will lower the barrier to entry for generative AI training compared with hyperscaler offerings.

Storj acquired cloud computing vendor Valdi on Tuesday, adding a new division of capabilities to its portfolio. Storj did not disclose details of the acquisition, including price.

Purchasing compute on demand can be useful for organizations that want to experiment with generative AI without having to buy their own GPUs or commit to a single provider, according to Ray Lucchesi, founder and president of Silverton Consulting.

Having access to lower-cost storage could also draw more customers to the service, he said. Moving data in and out of cloud compute can be costly and time-consuming to manage.

"GPUs are becoming more and more critical to enterprise workloads," Lucchesi said. "There's a time cost [and] some sticking power to the storage that isn't there with the GPUs."

Distributed colleagues

Valdi has offered Storj's distributed storage to its customers, according to Nikhil Jain, co-founder and CEO at Valdi.

The two businesses eventually found enough overlap in customers and partnerships that the acquisition made sense. Executives for both companies said they plan to more tightly integrate their technologies to develop an alternative public cloud for compute and storage.

"One of the visions of our combined company now is the user interface experience and making ourselves an alternative to a hyperscaler," Jain said.

Valdi, founded in 2022, has built a network of partners to provide access to unused GPU computing resources in data centers around the world for high-performance workloads. Valdi customers can use the servers as needed without a contract and pay by credit card. Partners are compensated by Valdi for offering up unused hardware.

Storj operates under a similar model by using available storage media in data centers to provide distributed S3-compatible object storage. Hardware owners are compensated from the service cost, similar to Valdi.

Storj's model, however, enables the general public to resell excess storage space, in contrast to the formal partnerships Valdi uses. Storj's technology shards customer data across several data centers with built-in redundancy, so the data can be reassembled even if some shards are unavailable. Customers can pay in local currency or cryptocurrency. The company has also expanded into other services, such as a developer gateway.
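The idea behind redundant sharding can be sketched in a few lines. This is an illustrative toy, not Storj's actual scheme: it splits data into k shards plus a single XOR parity shard, tolerating the loss of any one shard, whereas production systems typically use stronger erasure codes (such as Reed-Solomon) that tolerate multiple losses. The function names here are hypothetical.

```python
from functools import reduce

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def shard(data: bytes, k: int) -> list:
    """Split data into k equal-size shards plus one XOR parity shard."""
    size = -(-len(data) // k)                    # ceiling division
    padded = data.ljust(size * k, b"\0")         # pad so shards divide evenly
    shards = [padded[i * size:(i + 1) * size] for i in range(k)]
    return shards + [reduce(_xor, shards)]       # last entry is the parity shard

def reassemble(parts: list, k: int, orig_len: int) -> bytes:
    """Rebuild the original bytes; tolerates at most one lost shard (None)."""
    parts = list(parts)
    if None in parts:
        m = parts.index(None)
        survivors = [s for i, s in enumerate(parts) if i != m]
        parts[m] = reduce(_xor, survivors)       # XOR of survivors recovers the loss
    return b"".join(parts[:k])[:orig_len]

data = b"hello distributed storage"
pieces = shard(data, 4)   # 4 data shards + 1 parity, e.g. one per data center
pieces[1] = None          # simulate an unreachable data center
assert reassemble(pieces, 4, len(data)) == data
```

Because the parity shard is the XOR of all data shards, XORing the surviving shards together regenerates whichever single shard was lost.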

Advantages to the distributed model include worldwide access to generative AI technology in underserved regions, such as Africa and the Middle East, as well as reducing environmental effects by using existing data center resources, according to Storj and Valdi executives.

Performance concerns

Storj has worked to expand its use cases and public image from its early days as a cryptocurrency project, according to Mitch Lewis, an analyst at Futurum Group.


The company's decentralized storage services can offer another form of data security and privacy while competing on price with hyperscalers, Lewis said. Valdi might compete with the hyperscalers on compute services, but it has few direct competitors; the Render Network is one.

"Their big advantage is cost-efficiency and scalability," Lewis said. "Storj has done the best job at bridging from being an interesting crypto project to being an actual technology provider and storage company."

Both Lucchesi and Lewis said that, given its distributed nature, Storj needs to show how well its technology performs when pushing data to GPUs for AI training.

Less demanding AI activities, such as inferencing, which relies on an already trained model, might not suffer as much. Both said Storj would need to support storage technologies that feed data to GPUs faster, such as Nvidia's GPUDirect software.

"If you're doing AI work, you probably want to have your GPU and data in the same place," Lucchesi said.

Tim McCarthy is a news writer for TechTarget Editorial covering cloud and data storage.
