
Meta To Build Open-Source Artificial General Intelligence For All, Zuckerberg Says



Meta CEO Mark Zuckerberg announced Thursday on Threads that he is focusing Meta on building full general intelligence, also known as artificial general intelligence, and then releasing it as open-source software for everyone.

“It’s become clearer that the next generation of services requires building full general intelligence,” he said in a personal video. “Building the best AI assistants, AIs for creators, AIs for business and more—that needs advances in every area of AI, from reasoning to planning to coding to memory and other cognitive abilities.”

To support this effort, Zuckerberg said that Meta will have amassed an enormous pool of compute in its data centers by the end of 2024: 350,000 Nvidia H100 GPUs, or roughly 600,000 H100 equivalents once its other GPUs are counted.

Only Microsoft is reported to be ordering H100s at a comparable scale, and with orders this large, H100 delivery times are stretching out to as long as a year.

Each Nvidia H100, announced in 2022, packs 80 billion transistors, runs up to six times faster than its predecessors and offers memory bandwidth of up to three terabytes per second. A cluster of just 4,600 H100s forms Nvidia's Eos supercomputer; Meta's planned capacity will be the equivalent of roughly 130 times that.

Do the math, and Meta's AI fleet will contain about 4.8e+16 transistors. That's 48,000,000,000,000,000, or 48 quadrillion.
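For readers who want to check those figures, here is a quick back-of-the-envelope sketch in Python. It assumes the numbers cited above: 600,000 H100-equivalent GPUs, 80 billion transistors per H100, and Eos built from roughly 4,600 H100s.

```python
# Back-of-the-envelope check of the figures cited in this article.
# Assumptions: 600,000 H100-equivalent GPUs, 80 billion transistors
# per H100, and Nvidia's Eos supercomputer at roughly 4,600 H100s.
h100_equivalents = 600_000
transistors_per_h100 = 80_000_000_000  # 80 billion
eos_h100s = 4_600

total_transistors = h100_equivalents * transistors_per_h100
print(f"Total transistors: {total_transistors:.1e}")            # 4.8e+16
print(f"Multiple of Eos:   {h100_equivalents / eos_h100s:.0f}x")  # ~130x
```

Both results line up with the article's claims: 4.8e+16 transistors and a fleet around 130 times the size of Eos.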

With that massive amount of compute power, Zuckerberg says, Meta will continue training Llama 3, along with an “exciting roadmap of future models we’re going to be training responsibly and safely, too.”

Llama 3 is a generative AI text model that some say could challenge or even surpass OpenAI’s GPT-4, currently the gold standard for generative AI models. Meta made Llama 2, its predecessor, openly available, and it sounds like something similar will happen with Llama 3.

Zuckerberg thinks AI and the metaverse are intimately connected, and he says that smart glasses will be the way most people experience AI and the metaverse together.

“A lot of us are going to talk to AI throughout the day,” he said. “I think a lot of us are going to do that using glasses, because glasses are the ideal form factor for letting an AI see what you see and hear what you hear.”

He also referenced Meta’s Ray-Ban Meta glasses, which he said are “off to a very strong start.”


