Quick take:
- Clem Delangue, cofounder of AI platform HuggingFace, was among those who joined as angel investors.
- The funding was structured as equity with warrants for future tokens if Pluralis decides to launch its native token.
- Although the company has no product in the market yet, it has assembled a research team to build decentralised AI infrastructure.
Pluralis, a startup aiming to disrupt an artificial intelligence sector currently dominated by the likes of OpenAI by building a decentralised version of Amazon Web Services, has raised $7.6 million across pre-seed and seed rounds.
CoinFund and Union Square Ventures led the rounds with participation from Topology, Variant, Eden Block, and Bodhi Ventures, while Balaji Srinivasan and Clem Delangue, cofounder of AI platform HuggingFace, joined as angel investors.
The fundraising was structured as equity with warrants for future tokens if Pluralis launches its native token.
Commenting on the announcement, Alexander Long, founder and CEO of Pluralis, told Fortune the company does not yet have a product in the market but has assembled a team to research solutions to current AI challenges, including the cost of computing power, by developing a ‘decentralised version of Amazon Web Services.’
“I raised the round on, ‘This team is the right team to try and tackle this problem’,” Long said. “No one else is trying. We think we can do it.”
Before setting out to form his new venture, Long, who holds a doctorate in computer science, worked at Amazon for more than three years as an AI engineer. He has now assembled a team of seven computer scientists, all with doctorates or stints as postdoctoral researchers, in a bid to build powerful AI algorithms through a decentralised network of servers.
Pluralis is not the first Web3 startup to try to lower the cost of computing power for AI companies. However, Long claims his company’s approach is different, pointing to a limiting factor in current approaches to decentralising compute.
According to Long, current decentralised approaches require each computer to download the entire model, so small servers put a ceiling on the size, and therefore the power, of the model.
His idea is to attempt to train a portion of a model, rather than the entire model. “If you can make the problem precise enough, it often leads to immediate ways you can start to solve it,” he said.
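The contrast Long draws can be sketched in a few lines of Python: a toy pipeline in which each node holds only one layer of a model and passes activations along, so no single machine ever needs the whole model in memory. The names and structure here are illustrative assumptions, not Pluralis’s actual design.

```python
# Hypothetical sketch: each node trains/serves only its shard of a model
# (here, a single multiplicative "layer"), so model size is not capped by
# any one server's capacity. Illustrative only, not Pluralis's API.

class Node:
    """One server in the network, holding a single layer's weight."""
    def __init__(self, weight):
        self.weight = weight  # this node's portion of the model

    def forward(self, x):
        # Apply only this node's layer to the incoming activation.
        return x * self.weight

def pipeline_forward(nodes, x):
    # Activations flow from node to node; no node sees the full model.
    for node in nodes:
        x = node.forward(x)
    return x

# A three-node pipeline standing in for a three-layer model split across servers.
network = [Node(2.0), Node(0.5), Node(3.0)]
print(pipeline_forward(network, 4.0))  # 4 * 2 * 0.5 * 3 = 12.0
```

In the whole-model approach Long criticises, every server would need to store all three weights; in the sharded sketch, each stores one, which is the property that removes the size ceiling.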