If Meta CEO Mark Zuckerberg has any say in it, Artificial General Intelligence (AGI) may be just around the corner. The Facebook founder announced on Instagram that he is investing more than $10 billion in computing infrastructure to develop AGI – AI that can match or surpass humans in cognitively demanding tasks.
“Today I’m bringing together Meta’s two AI research efforts to support our long-term goals of building general intelligence, responsibly open-sourcing it, and making it available and useful to everyone in our daily lives,” Zuckerberg said in a message recorded on Jan. 18. “It is clear that the next generation of services requires building full general intelligence – building the best AI assistants, AI for creators, AI for businesses and much more. That requires advances in every area of AI.”
Unlike today’s artificial intelligence (AI) systems, which are highly specialized and cannot grasp nuance and context as well as humans can, an AGI system would be able to solve problems across a wide range of environments, according to a 2019 essay published in the journal EMBO Reports. It would therefore mimic key features of human intelligence, particularly learning and flexibility.
Achieving AGI could also mark a point of no return for the human race. Google CEO Sundar Pichai said as recently as 2018 that the impact of AI research is “more profound than electricity or fire.” Last year, dozens of experts and prominent figures – including OpenAI CEO Sam Altman and Microsoft founder Bill Gates – signed a statement emphasizing humanity’s collective need to mitigate the “risk of extinction from AI” alongside other societal-scale risks such as pandemics and nuclear war. That said, many scientists believe humanity may never build AGI.
But Zuckerberg announced in an Instagram Reel that the company is purchasing 350,000 Nvidia H100 graphics processing units (GPUs) – some of the most powerful graphics cards in the world, and critical to training today’s best AI models. This will more than double Meta’s total computing power for AI training, with Meta aiming for compute equivalent to 600,000 H100 GPUs in total.
Nvidia’s H100 is a newer version of the A100 graphics card that OpenAI used to train ChatGPT. The best available knowledge, based on unverified leaks, suggests that OpenAI used around 25,000 Nvidia A100 GPUs to train the chatbot – although other estimates put the number lower.
Zuckerberg said this “absolutely massive amount of infrastructure” will be ready by the end of the year. His company is currently training “Llama 3,” Meta’s answer to ChatGPT and Google’s Gemini, and has teased a roadmap that extends to future AGI systems.
Source: www.livescience.com