
The week in AI: Mark Zuckerberg says AI will do the work of Meta's midlevel engineers in 2025

Plus: The best from Consumer Electronics Show 2025


Welcome to The Dispatch! We are the newsletter that keeps you informed about AI. Each Thursday, we aggregate the major developments in artificial intelligence - we pass along the news, useful resources, tools and services; we highlight the top research in the field as well as exciting developments in open source. Even if you aren’t a machine learning engineer, we’ll keep you in touch with the most important developments in AI.

NEWS & OPINION

-------------------------

Zuck got the headline, but our top breakdown this week is a bit broader in scope - the AI news this week was flooded with high-profile job-disruption commentary and research. The World Economic Forum released its Future of Jobs Report 2025, which sparked headlines ranging from alarmist (“41% of companies worldwide plan to reduce workforces by 2030 due to AI” and “39% of skills obsolete by 2030”) to utopian (“In five years, 170 million jobs will be created worldwide by artificial intelligence”).

What the Future of Jobs Report does claim is that AI will create more jobs than it will destroy in the next five years (if the report was going to make headlines, that probably should have been the one!). So there’s that.

But Zuckerberg himself made the biggest waves on the world’s most popular podcast when he told host Joe Rogan, “… in 2025, we at Meta, as well as the other companies that are basically working on this, are going to have an AI that can effectively be a sort of midlevel engineer that you have at your company that can write code.” The podcast appearance was followed almost immediately by Bloomberg picking up an internal memo in which Zuckerberg notified Meta employees that the company would be cutting 5% of its workforce - the “lowest level performers.” This all comes on the heels of Meta’s makeover shifting the company dramatically to the right overall.

It’s not just Meta touting AI’s still-burgeoning prowess while simultaneously cutting staff. Take Replit, for example, whose CEO casually noted this week that Replit “doesn’t care about professional coders anymore.” They’ve been at the forefront of providing AI-powered coding tools from the beginning, allowing users to generate code with just a few prompts in natural language. Their Agent product symbolizes a future where one can create apps with natural language only (zero coding knowledge required), and the product has already been wildly successful - successful enough, in fact, that the company changed their entire business model to accommodate Agent and the idea that anyone will soon be able to build anything with these tools.

And Replit, a small startup in the midst of success, made its layoffs not as a result of a labor failure but of a labor success, with AI-driven efficiency at the heart of it. Even CEO Amjad Masad noted how unusual it was to cut staff while the company was doing great financially, and with so much work ahead. But it turned out to be a good decision: they had already built the tools they knew would let them do more with less, much like the AI "mid-level engineer" that Zuckerberg envisions. These aren’t stories about struggling companies cutting labor to survive; they’re about a shift in how work is going to be done - and who, or what, employers deem worth compensating.

Whether or not you believe the hype from some of these tech-bro CEOs, change does appear to be coming even faster than many closest to AI could have guessed, as Professor Ethan Mollick notes:

“Whatever their incentives, the researchers and engineers inside AI labs appear genuinely convinced they're witnessing the emergence of something unprecedented. Their certainty alone wouldn't matter - except that increasingly public benchmarks and demonstrations are beginning to hint at why they might believe we're approaching a fundamental shift in AI capabilities. The water, as it were, seems to be rising faster than expected.”

-------------------------

Consumer Electronics Show 2025 began last week and ended on Saturday. With the "AI in everything" train having already left the station last year, CES this year focused more on proving how seamlessly and un-annoyingly AI can be integrated into our lives - but still in almost everything (even the water). Here’s the rundown and a few standout products:

  • Nvidia CEO Jensen Huang delivered the keynote and announced the company’s new book-sized AI supercomputer, Digits, which can run models of up to 200B parameters locally for $3,000 (and two units can be linked easily, enough to run Llama 3.1 405B).

  • Samsung’s “AI for All” - Samsung is embedding AI across its entire product range, from phones and TVs to appliances and PCs, aiming to make AI a seamless part of everyday life. Several of Samsung’s new Smart TVs will have Microsoft Copilot built in, and the company also teased a potential AI partnership with Google. AI is also being infused into Samsung’s laundry appliances, art frames, home security equipment, and other devices within its SmartThings ecosystem.

  • LG’s "Affectionate Intelligence" - The new ‘affectionate’ moniker focuses on leveraging AI, including LG’s FURON AI agent, to understand and respond to customer needs in real-time across various spaces (home, vehicle, commercial). It uses generative AI, spatial sensing, and lifestyle pattern analysis to personalize experiences, proactively coordinate devices and services, and even predict user needs.

  • Halliday Smart Glasses - One of the most hyped items at CES this year should help set a new standard for AI smart glasses that don’t look terrible. Sporting a pint-sized display that projects info on the lenses instead of using waveguide tech, it follows on the heels of Ray-Ban Meta’s success by simply looking like traditional eyewear rather than bulky geek-wear.

  • Hisense 116-inch TriChroma LED TV - This giant TV uses an AI engine to analyze each frame in real time, auto-adjusting brightness, color, and contrast for optimal picture quality and smoother transitions between scenes.

  • JBL Tour ONE M3 Headphones - These headphones use an advanced AI algorithm in their 4-microphone call system to ensure crystal-clear calls, even in noisy environments. They also feature JBL's True Adaptive Noise Cancelling 2.0, which uses 8 mics to scan the external environment and make real-time adjustments to block out distracting background noise.

  • Withings OMNIA Concept/AI mirror - This is a central health hub that uses AI to gather and interpret data from Withings devices (mirror, scales, blood pressure monitors, etc.) and third-party apps, offering a 360-degree view of your health. It provides personalized insights and connects users to remote care teams for expert analysis, using AI to help decode health metrics and make them more actionable.

  • Nuwa Pen - This ink pen uses three tiny cameras to capture handwriting on paper and create a digital copy in an app. The app uses an integrated large language model to allow users to search their notes by content, ask questions about what they've written, and manipulate them on an "infinite canvas," bridging the gap between analog note-taking and digital organization.

There were over 4,000 companies at CES 2025 - check out CNET, The Verge, or WIRED for more comprehensive coverage.
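The memory math behind the Digits claims above is worth a quick sanity check. The sketch below assumes 4-bit (0.5 byte per parameter) quantized weights, ignores KV-cache and activation overhead, and assumes each unit carries roughly 128 GB of unified memory - treat those figures as illustrative assumptions, not confirmed specs:

```python
def weight_memory_gb(params_billions, bytes_per_param=0.5):
    """Approximate weight storage in GB for a quantized model.

    bytes_per_param=0.5 corresponds to 4-bit quantization.
    Ignores KV cache and activation memory.
    """
    return params_billions * 1e9 * bytes_per_param / 1e9

# A 200B-parameter model at 4-bit needs ~100 GB of weights,
# which would fit within a single ~128 GB unit.
print(weight_memory_gb(200))   # 100.0

# Llama 3.1 405B at 4-bit needs ~202.5 GB - hence two linked units.
print(weight_memory_gb(405))   # 202.5
```

That rough arithmetic is consistent with Nvidia's pitch: one box for models up to ~200B parameters, two boxes linked for Llama 3.1 405B.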

MORE IN AI THIS WEEK

Your daily AI dose

Mindstream is the HubSpot Media Network’s hottest new property. Stay on top of AI, learn how to apply it… and actually enjoy reading. Imagine that.

Our small team of actual humans spends their whole day creating a newsletter that’s loved by over 150,000 readers. Why not give us a try?

TRENDING AI TOOLS, APPS & SERVICES

  • North, from Cohere: new AI workspace for enterprises combines LLMs and automation for air-gapped environments

  • Answers by Reddit: ask AI to find answers to your questions from discussions on Reddit (for Reddit’s first, beta AI product, it’s surprisingly good at distinguishing what comes from Reddit threads and the news versus the LLM itself)

  • Captions: AI video editing app now has a free plan

  • Trellis AI: let swarms of agents automate your most manual PDF tasks

  • Ambient Agents by LangChain: human-in-the-loop agents to scale ourselves (demo/open-source email assistant)

  • Minduck Discovery: search the world with your mind-map AI

  • Create.xyz: turn your words into sites, tools, apps and products, now with Stripe integration

  • Liveblocks: AI copilots ready to drop into your app

  • AI reading club: an experiment to integrate LLMs in reading


VIDEOS, SOCIAL MEDIA & PODCASTS

  • Simon Willison on the Latent Space Podcast: Things we learned about LLMs in 2024 [Podcast]

  • Google Research Unveils "Transformers 2.0" aka TITANS - is infinite memory for AI coming? [YouTube]

  • 17 weird new tech products at CES 2025 you need right now... [YouTube]

  • Introducing Ray2 from Luma Labs, a new frontier in generative video models [X]

  • What if GPT-5 is already real? What is it being used for? [X]

  • Replit CEO on AI breakthroughs: ‘We don’t care about professional coders anymore’ [Reddit]

  • In Eisenhower's farewell address, he warned of the military-industrial complex. In Biden's farewell address, he warned of the tech-industrial complex, and said AI is the most consequential technology of our time which could cure cancer or pose a risk to humanity [Reddit]

TECHNICAL NEWS, DEVELOPMENT, RESEARCH & OPEN SOURCE

  • Google’s Titans research: towards giving AI almost infinite memory by making it more human

  • Microsoft’s AutoGen v0.4: Reimagining the foundation of agentic AI for scale, extensibility, and robustness

  • Codestral 2501 by Mistral: coding assistant is now #1 on Copilot Arena by LMSYS (together with DeepSeek and Claude) - 256k context window, proficient in 80 coding languages

  • LlamaV-o1: a new multimodal model that excels in step-by-step visual reasoning and achieves SOTA open-source performance

  • Transformer²: self-adaptive LLMs

That’s all for this week! We’ll see you next Thursday.