Artificial intelligence researchers at Meta Platforms have been in panic mode. In recent days, leaders of some of the company’s AI teams openly worried that a new conversational AI made by a Chinese hedge fund meant Meta was falling behind in the AI race.
Meta’s chief AI scientist, Yann LeCun, argues that DeepSeek’s success with its R1 model says more about the value of open source than about Chinese competition. He asserts that open-source AI is the future, as the Chinese company’s openly available models challenge ChatGPT and Llama and reshape the AI race.
Zuckerberg anticipates that Meta’s AI assistant, available across its services including Facebook and Instagram, will serve more than 1 billion people in 2025, up from approximately 600 million monthly active users in 2024.
A trove of newly released documents reveals Meta’s plans to use the book piracy site LibGen to train its AI models.
Executives and researchers leading Meta’s AI efforts obsessed over beating OpenAI’s GPT-4 model while developing Llama 3, according to internal messages.
DeepSeek’s real achievement lies in its ability to develop a cutting-edge AI model while spending a fraction of what its US counterparts have spent. OpenAI’s development of GPT-4, by comparison, reportedly cost upwards of $100 million.
Meta Platforms will invest up to $65 billion in 2025 to expand its AI infrastructure, a significant increase from the $38-40 billion it spent in 2024.
When Chinese quant hedge fund founder Liang Wenfeng went into AI research, he took 10,000 Nvidia chips and assembled a team of young, ambitious talent. Two years later, DeepSeek exploded on the scene.
Mark Zuckerberg said this year will be a "defining" year for AI, announcing plans to spend $60-65 billion in capital expenditures.