When I wrote about bees this spring, I noted that this country’s honeybee population was buzzing at an all-time high.
Now it seems a different species of bee is threatening Facebook CEO Mark Zuckerberg's plans to run Meta's artificial intelligence (AI) operations on nuclear power, a carbon-free power source capable of supplying their energy needs.
Plans to build a new AI data center near an unspecified nuclear plant, which would have been the first nuclear-powered facility of its kind, hit a roadblock after the pollinators were discovered on the land, according to a Financial Times report on Monday. Zuckerberg confirmed the project's setback at an internal company meeting last week. Surveyors reportedly stumbled upon the endangered bees during an environmental review of the area, forcing regulators to suspend the project. Details on the bee species and the precise project location remain under wraps.
There is no information currently available on exactly which bee species is killing Meta's AI buzz. The Endangered Species Act of 1973 currently protects only one endangered bee species in the continental US: the Rusty Patched Bumblebee.
According to a map from the US Fish and Wildlife Service, there are only about 471 Rusty Patched Bumblebees left, and most of them are in Minnesota, Wisconsin, Illinois, and around the Virginia-West Virginia border.

Meta's website currently lists nearly two dozen data centers worldwide, with the majority concentrated in the US. A map shows 26 data centers either completed or under construction, in addition to 75 solar power locations, 21 wind power locations, and 25 "Water Restoration" projects.
It's important to note that AI models, particularly large language models, require substantial energy for both training and inference.
GPT-3, OpenAI's 175-billion-parameter model, reportedly used 1,287 MWh to train, while DeepMind's 280-billion-parameter model used 1,066 MWh. That is roughly 100 times the energy the average US household uses in a year. However, training a model is a one-time fixed energy cost, which can be amortized over the lifetime of the model.

Where energy costs can really add up is in the use of the model over its lifetime. Running a query through an AI model is called inference: the query is passed through the model's parameters to produce an output. Depending on how long the model stays in service, which can be years, and how many queries it fields per day, the vast bulk of an AI model's energy usage will come from inference.
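To see how lifetime inference can dwarf the one-time training cost, here is a rough back-of-the-envelope sketch in Python. Only the 1,287 MWh training figure comes from the reporting above; the average-household usage, per-query energy, traffic, and service-lifetime numbers are illustrative assumptions, not reported values.

```python
# Back-of-the-envelope comparison of training vs. lifetime inference energy.
# Assumed figures are marked; they are illustrative, not reported numbers.

TRAINING_MWH = 1287            # reported GPT-3 training energy
HOUSEHOLD_MWH_PER_YEAR = 10.5  # assumed average US household usage

WH_PER_QUERY = 3.0             # assumed energy per inference query (Wh)
QUERIES_PER_DAY = 10_000_000   # assumed daily query volume
YEARS_IN_SERVICE = 3           # assumed model lifetime

def inference_mwh(wh_per_query: float, queries_per_day: int, years: int) -> float:
    """Total inference energy over the model's lifetime, in MWh."""
    total_wh = wh_per_query * queries_per_day * 365 * years
    return total_wh / 1_000_000  # Wh -> MWh

lifetime_inference = inference_mwh(WH_PER_QUERY, QUERIES_PER_DAY, YEARS_IN_SERVICE)

print(f"Training: {TRAINING_MWH} MWh "
      f"(~{TRAINING_MWH / HOUSEHOLD_MWH_PER_YEAR:.0f} household-years)")
print(f"Lifetime inference: {lifetime_inference:,.0f} MWh "
      f"(~{lifetime_inference / TRAINING_MWH:.0f}x the training cost)")
```

Under these assumed figures, inference over a few years of heavy use exceeds the fixed training cost many times over, which is why sustained query traffic, not training, dominates a deployed model's energy footprint.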
Some estimates have AI accounting for 3% to 4% of global power demand by 2030.
I find it fascinating that Zuckerberg was forced to abandon the plans to fuel his AI center with nuclear power. The government seems to have no issue allowing massive offshore wind farms to be constructed, despite objections by citizens and reasonable concerns about the impact on whales and other marine life.
Hopefully, in 2025, more reasonable people with a better-balanced sense of priorities will be assuming office. Maybe Zuckerberg can revisit the issue then.