Since the first paper examining how this technology affects the environment was published three years ago, a movement has grown among researchers to measure and report the energy consumed and emissions generated by their own work. Having accurate numbers is an important step toward making changes, but actually gathering those numbers can be difficult.
“You can’t improve what you can’t measure,” says Jesse Dodge, a research scientist at the Allen Institute for AI in Seattle. “The first step for us, if we want to make progress on reducing emissions, is to get a good measurement.”
To that end, the Allen Institute recently collaborated with Microsoft, the AI company Hugging Face, and three universities to create a tool that measures the electricity use of any machine-learning program running on Azure, Microsoft’s cloud service. With it, Azure users building new models can see the total electricity consumed by graphics processing units (GPUs), the computer chips specialized for running machine-learning calculations, at every stage of their project, from selecting a model to training it and putting it to use. Microsoft is the first major cloud provider to give users this kind of visibility into the energy impact of their machine-learning programs.
Although tools that measure the energy consumption and emissions of machine-learning programs running on local servers already exist, those tools break down when researchers use cloud services from companies such as Microsoft, Amazon, and Google. The cloud services do not give users direct visibility into the GPU, CPU, and memory resources their workloads consume, and existing tools such as Carbontracker, Experiment Tracker, EnergyVis, and CodeCarbon rely on those values to produce accurate estimates.
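For a local training run, where the hardware is visible to the tool, measurement can be as simple as wrapping the training loop with a tracker. The sketch below uses CodeCarbon's EmissionsTracker as one example of this class of tools; the training function is a placeholder, and the estimate is only as good as the tracker's access to local GPU, CPU, and memory usage, which is exactly the information cloud providers have not exposed.

```python
# A minimal sketch of local measurement with CodeCarbon.
# `train_model` is a placeholder for an actual training loop.
from codecarbon import EmissionsTracker

def train_model():
    # Placeholder: real code would build and fit a model here.
    pass

tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent

print(f"Estimated emissions: {emissions_kg:.4f} kg CO2eq")
```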
The new Azure tool, which debuted in October, currently reports energy use, not emissions. So Dodge and other researchers worked out how to map that energy use onto emissions, and they presented a companion paper on the work at FAccT, a major computer science conference, in late June. The researchers used a service called WattTime to estimate emissions based on the zip codes of the cloud servers running 11 different machine-learning models.
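The conversion behind such an estimate is simple in principle: multiply the measured energy by the carbon intensity of the grid serving the data center at the time the job ran. The sketch below illustrates that arithmetic with made-up intensity values; it is not the paper's methodology or the WattTime API, just the general idea.

```python
# Illustrative only: energy-to-emissions conversion with hypothetical
# carbon-intensity numbers, not data from WattTime or the Azure tool.

# Grid carbon intensity in grams of CO2-equivalent per kWh, by region.
# Real values vary hour by hour with the mix of sources on the grid.
CARBON_INTENSITY_G_PER_KWH = {
    "hydro-heavy-region": 30.0,   # hypothetical, mostly renewables
    "coal-heavy-region": 820.0,   # hypothetical, mostly fossil fuels
}

def estimate_emissions_kg(energy_kwh: float, region: str) -> float:
    """Emissions (kg CO2eq) = energy (kWh) * intensity (g/kWh) / 1000."""
    return energy_kwh * CARBON_INTENSITY_G_PER_KWH[region] / 1000.0

# The same 100 kWh training run produces very different emissions
# depending on where (and implicitly when) it runs.
for region in CARBON_INTENSITY_G_PER_KWH:
    print(region, estimate_emissions_kg(100.0, region), "kg CO2eq")
```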
They found that emissions can be significantly reduced if researchers use servers in specific geographic locations and at certain times of day. Emissions from training small machine-learning models can be cut by up to 80% if training starts when more renewable electricity is available on the grid, while emissions from large models can be reduced by more than 20% if the training job is paused when renewable electricity is scarce and resumed when it is more plentiful.
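The pause-and-resume idea can be sketched as a simple scheduling loop: periodically check the grid's current carbon intensity and only let the job run while it is below a threshold. Everything here is hypothetical, including get_grid_carbon_intensity and the job's run/checkpoint hooks, which stand in for whatever intensity feed and checkpointing mechanism a real training system would use.

```python
import time

# Hypothetical threshold: only train while the grid is below this
# carbon intensity (g CO2eq per kWh). Tuning it trades emissions
# against how long the job takes to finish.
INTENSITY_THRESHOLD = 200.0
CHECK_INTERVAL_SECONDS = 15 * 60

def get_grid_carbon_intensity() -> float:
    """Hypothetical stand-in for a live carbon-intensity feed."""
    raise NotImplementedError

def run_training_with_pauses(job):
    """Run `job` in chunks, pausing whenever the grid is dirty.

    `job` is assumed to expose run_for(seconds), checkpoint(), and
    is_done(); these are placeholders for a real training harness.
    """
    while not job.is_done():
        if get_grid_carbon_intensity() < INTENSITY_THRESHOLD:
            job.run_for(CHECK_INTERVAL_SECONDS)
            job.checkpoint()  # so a later pause loses no work
        else:
            time.sleep(CHECK_INTERVAL_SECONDS)  # wait for cleaner power
```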