OpenAI’s Altman, Ethereum’s Buterin Outline Competing Visions for AI’s Future

By Shayaan

This week, two of the tech sector’s most influential voices offered contrasting views on the development of artificial intelligence, highlighting the growing tension between innovation and security.

In a Sunday evening blog post reflecting on his company’s journey, OpenAI CEO Sam Altman announced that the company has tripled its user base to more than 300 million weekly active users as it races toward artificial general intelligence (AGI).

“We are now confident that we know how to build AGI as we have traditionally understood it,” Altman said, claiming that AI agents could “join the workforce” by 2025 and “substantially increase the output of companies.”

Altman says OpenAI’s ambitions go beyond AI agents and AGI, with the company beginning to work toward “superintelligence in the true sense of the word.”

A timetable for delivering AGI or superintelligence remains unclear. OpenAI did not immediately respond to a request for comment.

But hours earlier on Sunday, Ethereum co-creator Vitalik Buterin suggested using blockchain technology to create global failsafe mechanisms for advanced AI systems, including a “soft pause” capability that could temporarily limit industrial-scale AI operations if warning signs arise.

Crypto-based security for AI safety

Buterin’s proposal centers on “d/acc,” or decentralized/defensive acceleration. In its simplest sense, d/acc is a variation on e/acc, or effective accelerationism, a philosophical movement embraced by high-profile Silicon Valley figures like a16z’s Marc Andreessen.

Buterin’s d/acc also supports technological progress, but it prioritizes developments that increase safety and human freedom of choice. Unlike effective accelerationism (e/acc), which takes a “growth at any cost” approach, d/acc focuses on building defensive capabilities first.


“D/acc is an extension of the underlying values of crypto (decentralization, resistance to censorship, open world economy and society) to other areas of technology,” Buterin wrote.

Reflecting on the progress d/acc has made over the past year, Buterin described how a more cautious approach to AGI and superintelligent systems could be implemented using existing crypto mechanisms such as zero-knowledge proofs.

Under Buterin’s proposal, industrial-scale AI hardware would need weekly approval from three international groups to keep running.

“The signatures would be device-independent (if desired, we could even require a zero-knowledge proof that they were published on a blockchain), so it would be all-or-nothing: there would be no practical way to authorize one device to pass through without authorizing all other devices,” Buterin explained.

The system would work like a master switch: either all approved computers operate or none do, preventing anyone from enforcing the pause selectively.
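To make the all-or-nothing idea concrete, here is a minimal sketch of the kind of check each machine could run, assuming three hypothetical signing bodies and a placeholder signature check in place of real cryptography. It omits the blockchain publication and zero-knowledge-proof layer Buterin mentions, and none of the names (SIGNING_BODIES, may_run, and so on) come from his proposal.

```python
# Illustrative sketch only: a simplified version of the "soft pause" check
# described above. All names and the signature scheme are hypothetical.

from dataclasses import dataclass

# Hypothetical public keys of the three international signing bodies.
SIGNING_BODIES = {"body_a": "pk_a", "body_b": "pk_b", "body_c": "pk_c"}


@dataclass
class Approval:
    body: str        # which signing body issued the approval
    week: int        # week number the approval covers
    signature: str   # signature over the week identifier


def verify_signature(public_key: str, week: int, signature: str) -> bool:
    # Placeholder: a real implementation would verify a cryptographic
    # signature (or a proof that it was published on a blockchain).
    return signature == f"signed:{public_key}:{week}"


def may_run(current_week: int, approvals: list[Approval]) -> bool:
    """Return True only if all three bodies approved the current week.

    Because every compliant device checks the same weekly approvals,
    authorization is all-or-nothing: either every machine keeps running,
    or none do.
    """
    approved = {
        a.body
        for a in approvals
        if a.week == current_week
        and a.body in SIGNING_BODIES
        and verify_signature(SIGNING_BODIES[a.body], current_week, a.signature)
    }
    return approved == set(SIGNING_BODIES)


if __name__ == "__main__":
    week = 2702
    approvals = [
        Approval("body_a", week, "signed:pk_a:2702"),
        Approval("body_b", week, "signed:pk_b:2702"),
        Approval("body_c", week, "signed:pk_c:2702"),
    ]
    print("hardware may run:", may_run(week, approvals))       # True
    print("missing one body:", may_run(week, approvals[:2]))   # False -> soft pause
```

In this toy version, a single missing weekly approval halts every compliant device equally, which is the property Buterin emphasizes: no party can selectively pause a rival while keeping its own hardware running.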

“Until such a critical moment occurs, just having the ability to soft pause would do little harm to developers,” Buterin noted, describing the system as a form of insurance against catastrophic scenarios.

In any case, OpenAI’s explosive growth since 2023, from 100 million to 300 million weekly users in just two years, shows how rapidly AI adoption is progressing.

Reflecting on OpenAI’s evolution from an independent research lab into a major technology company, Altman acknowledged the challenges of building “an entire company, almost from scratch, around this new technology.”

The proposals reflect wider industry debates about managing AI development. Proponents have previously argued that implementing a global surveillance system would require unprecedented collaboration between major AI developers, governments and the crypto sector.


“A year of ‘war mode’ can easily be worth a hundred years of work under conditions of complacency,” Buterin wrote. “If we have to restrict people, it seems better to restrict everyone equally and do the hard work of actually working together to organize that, rather than one party trying to dominate all the others.”

Edited by Sebastian Sinclair
