4 paths to sustainable AI

Regulators, investors, customers, and even employees are pushing companies to minimize the climate impact of their AI initiatives. Everything from geothermal data centers to more efficient graphics processing units (GPUs) can help. But AI users must also get over the urge to use the biggest, baddest AI models to solve every problem if they truly want to fight climate change.

Concerns that AI contributes to global warming stem from estimates that the GPUs used to develop and run AI models consume four times as much energy as those serving conventional cloud applications, and that AI could be on track to use as much electricity as Ireland.

In response, regulators in Europe and the US are moving to require large users of AI to report on its environmental impact. Credit rating agencies and customers are paying closer attention to environmental, social, and governance (ESG) issues such as carbon emissions, says Faith Taylor, VP of global sustainability and ESG officer at global infrastructure services provider Kyndryl. In addition, she says, “Employees, especially the younger generation, say they’re not going to work at a company that doesn’t have certain environmental goals. We see it as a recruiting and retention factor.”

As sustainable efforts become a greater priority, here are four ways companies are succeeding in streamlining their AI efforts.

Use more efficient processes and architectures

Boris Gamazaychikov, senior manager of emissions reduction at SaaS provider Salesforce, recommends using specialized AI models to reduce the power needed to train them. “Is it necessary for a model that can also write a sonnet to write code for us?” he asks. “Our theory is no. Our approach has been to create specific models for specific use cases rather than one general-purpose model.”

He also recommends tapping the open-source community for pre-trained models that can be adapted to various tasks. As one example, he cites Meta’s Llama-2, from which he says more than 13,000 variants have been created. “All those 13,000 new models didn’t require any pre-training,” he says. “Think about how much compute and carbon that saved.” Salesforce’s AI Research team has also developed methods such as maximum parallelism, he adds, which split up compute-intensive tasks efficiently to reduce energy use and carbon emissions.
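
The mechanics behind that saving are simple: a fine-tuned variant reuses the pretrained weights and adjusts only a small amount on top. As a rough illustration (not Salesforce’s actual pipeline), here is a minimal PyTorch sketch in which a stand-in pretrained backbone is frozen and only a small task head is trained, so the expensive pre-training compute is never repeated:

```python
# Minimal sketch of fine-tuning vs. pre-training (illustrative only).
import torch
import torch.nn as nn

# Stand-in for a backbone whose weights were already pre-trained elsewhere.
backbone = nn.Sequential(nn.Linear(768, 768), nn.ReLU())
head = nn.Linear(768, 2)  # small task-specific layer, e.g. a classifier

for p in backbone.parameters():
    p.requires_grad = False  # freeze the backbone: no gradients, far less compute

opt = torch.optim.AdamW(head.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

x, y = torch.randn(32, 768), torch.randint(0, 2, (32,))  # dummy task data

for _ in range(3):  # a few cheap fine-tuning steps instead of full pre-training
    opt.zero_grad()
    loss = loss_fn(head(backbone(x)), y)
    loss.backward()  # parameter gradients flow only into the small head
    opt.step()
```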

Rather than training a model on all the training data at once, Salesforce trains it in multiple “epochs,” slightly modifying a portion of the data in each one based on the results of the earlier training. This reduces power consumption, he says.
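
The article doesn’t detail the exact scheme, but one plausible reading is a loop that re-selects part of the data after each epoch based on where the model still struggles. A toy NumPy sketch of that idea:

```python
# Toy sketch: each epoch focuses compute on the examples the model still
# gets wrong (one plausible reading of the approach described above).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))
y = (X[:, 0] > 0).astype(float)
w = np.zeros(8)  # simple logistic model standing in for the real network

def per_example_loss(w, X, y):
    p = 1 / (1 + np.exp(-X @ w))
    return -(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

idx = np.arange(len(X))  # the first epoch sees everything
for epoch in range(3):
    Xe, ye = X[idx], y[idx]
    p = 1 / (1 + np.exp(-Xe @ w))
    w -= 0.5 * Xe.T @ (p - ye) / len(Xe)  # one gradient step per epoch (toy)
    # Next epoch: keep only the half of the data with the highest loss,
    # so later passes spend power where the model still needs it.
    idx = np.argsort(per_example_loss(w, X, y))[-len(X) // 2:]
```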

Some hyperscalers offer tools and advice on making AI more sustainable. Amazon Web Services, for example, provides tips on using serverless technologies to eliminate idle resources, along with data management tools and datasets. AWS also offers models that reduce data processing and storage, and tools to “right size” infrastructure for AI applications. Used properly, such tools can minimize the compute resources AI needs, and thus its environmental impact.
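
To make the serverless point concrete, here is a hedged sketch of the pattern AWS describes: an AWS Lambda-style handler that loads a model once per container at cold start, so compute is consumed only while a request is in flight and nothing sits idle between calls (the model here is a hypothetical stand-in):

```python
# Serverless inference sketch: pay for compute only per request.
import json

_model = None  # cached across invocations of a warm container

def _load_model():
    # Hypothetical stand-in for loading real weights from, e.g., S3.
    return lambda text: {"label": "positive" if "good" in text else "negative"}

def handler(event, context):
    global _model
    if _model is None:      # cold start: pay the loading cost once
        _model = _load_model()
    text = json.loads(event["body"])["text"]
    return {"statusCode": 200, "body": json.dumps(_model(text))}
```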

Use less data

Reducing the size of the dataset used to train a model is one of the most effective ways to minimize energy use and carbon emissions involved in AI. “You can reduce the size of many AI models by an order of magnitude, and only lose two to three percent of your accuracy,” says Professor Amanda Stent, director of Colby College’s Davis Institute for Artificial Intelligence. “These techniques are well known but not as well used as they could be because people are enamored with the idea of size.” There’s also the matter of all the attention massive models have received in the press.
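
One of those well-known techniques is post-training quantization, which stores weights as 8-bit integers instead of 32-bit floats, shrinking a model roughly 4x with typically small accuracy loss. A minimal PyTorch sketch, using a toy model rather than any production one:

```python
# Dynamic post-training quantization: ~4x smaller weights, no retraining.
import io
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10))

quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8  # quantize only the Linear layers
)

def serialized_mb(m):
    # Measure model size by serializing its weights to an in-memory buffer.
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes / 1e6

print(f"fp32: {serialized_mb(model):.2f} MB, int8: {serialized_mb(quantized):.2f} MB")
```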

Gamazaychikov says the latest version of Salesforce’s CodeGen model, which allows users to generate executable code using natural language, performs just as well as models twice its size. As a rough rule of thumb, he says, about a 50% drop in size means about an equivalent drop in carbon emissions.

At video and music streaming service Plex, head of data science Scott Weston cuts the size of his training data by focusing on a specific need. “We don’t just want to find users who are going to subscribe or leave the platform, but those who should subscribe and how to make sure they do,” he says. Model training is simpler because the dataset is more focused and confined to the specific business problem it’s trying to solve, he adds. “The environment wins because we’re not using all this extra computing to train the models,” he says.

Weston uses uplift modeling, running a series of A/B tests to determine how potential customers respond to different offers, and then uses the results of those tests to build the model. The size of the data sets is limited by business concerns. “We’re careful when conducting sizable tests as we don’t want to interrupt the regular communications flow with our customers.”
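
A common formulation of uplift modeling (not necessarily Plex’s exact method) is the two-model approach: fit one model on the treated group from the A/B test and one on the control group, then score each user by the difference the offer is predicted to make. A brief scikit-learn sketch with synthetic data:

```python
# Two-model uplift sketch: target users whose behavior the offer changes.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 5))
treated = rng.integers(0, 2, 2000).astype(bool)  # A/B test assignment
# Toy outcome: the offer only helps users with a positive second feature.
y = (X[:, 0] + 0.8 * treated * (X[:, 1] > 0) > 0.5).astype(int)

m_t = LogisticRegression().fit(X[treated], y[treated])    # treated model
m_c = LogisticRegression().fit(X[~treated], y[~treated])  # control model

# Uplift = P(subscribe | offer) - P(subscribe | no offer).
uplift = m_t.predict_proba(X)[:, 1] - m_c.predict_proba(X)[:, 1]
top_targets = np.argsort(uplift)[-100:]  # the users most worth contacting
```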

Use renewable energy

Hosting AI operations at a data center that uses renewable power is a straightforward path to reduce carbon emissions, but it’s not without tradeoffs.

Online translation service Deepl runs its AI functions from four co-location facilities: two in Iceland, one in Sweden, and one in Finland. The Icelandic data center uses 100% renewably generated geothermal and hydroelectric power. The cold climate also eliminates 40% or more of the total data center power needed to cool the servers, because operators can open the windows rather than run air conditioners, says Deepl’s director of engineering Guido Simon. Cost is another major benefit, he says, with prices of five cents per kilowatt-hour compared to about 30 cents or more in Germany.
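
The price gap compounds quickly at data-center scale. A back-of-the-envelope sketch (the rack power figure is a hypothetical assumption, not Deepl’s actual load):

```python
# Rough annual electricity cost for one GPU rack at the two quoted rates.
rack_kw = 40                    # assumed continuous draw of one GPU rack
hours_per_year = 24 * 365
for label, per_kwh in [("Iceland", 0.05), ("Germany", 0.30)]:
    cost = rack_kw * hours_per_year * per_kwh
    print(f"{label}: {cost:,.0f} per rack-year")  # ~17,520 vs ~105,120
```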

The network latency between the user and a sustainable data center can be an issue for time-sensitive applications, says Stent, but only in the inference stage, where the application provides answers to the user, rather than the preliminary training phase.

Deepl, with headquarters in Cologne, Germany, found it could run both training and inference from its remote co-location facilities. “We’re looking at roughly 20 milliseconds more latency compared to a data center closer to us,” says Simon. “During the inference process, making the initial connection to the AI engine might take 10 round trips, resulting in roughly a 200 to 300 millisecond delay due to distance, but you can optimize the application to reduce that initial time.”

The speed of the internet connection to the remote site can, of course, mitigate latency issues. Verne Global Iceland, one of Deepl’s Icelandic providers, claims to be the interconnect site for all submarine cable systems to and from Iceland, with redundant, high-capacity fiber connectivity to Europe and the US.

Another consideration, says Stent, is whether a “renewable” data center is running the latest and most efficient GPUs, or tensor processing units (TPUs). If not, it might end up using more power than in a conventionally powered, but more modern, data center. That isn’t an issue for Deepl, though, because it houses its own “super state-of-the-art” servers in its co-location facilities, says Simon.

Don’t use AI at all

While AI generates buzz among employees and customers, it might be overkill if other approaches are easier to implement, and have less impact on the environment. “Always ask if AI/ML is right for your workload,” recommends AWS in its sustainability guidelines. “There’s no need to use computationally intensive AI when a simpler, more sustainable approach might succeed just as well. For example, using ML to route IoT messages may be unwarranted; you can express the logic with a rules engine.”
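
AWS’s IoT example is easy to picture: a handful of explicit rules can route messages with essentially zero compute. A minimal sketch (topic names here are hypothetical):

```python
# Routing IoT messages with a plain rules engine instead of an ML model.
def route(message: dict) -> str:
    rules = [
        (lambda m: m.get("type") == "alarm",     "alerts/critical"),
        (lambda m: m.get("battery", 100) < 15,   "maintenance/battery"),
        (lambda m: m.get("type") == "telemetry", "telemetry/ingest"),
    ]
    for predicate, topic in rules:
        if predicate(message):  # first matching rule wins
            return topic
    return "unrouted"

print(route({"type": "telemetry", "battery": 72}))  # -> telemetry/ingest
```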

Environmental considerations aside, Plex also isn’t able to throw millions of dollars of compute into training the largest models. “It’s all about being scrappy and making sure you think through everything and not just throw dollars at the problem,” says Weston.

Online gaming company Mino Games uses DataGPT, which integrates analytics, a caching database, and extract, transform, and load (ETL) processes to speed queries, such as which new features to offer players. Data analytics lead Diego Cáceres urges caution about when to use AI. “Phrase the business problem carefully and determine whether simple math is good enough,” he says.

Ongoing challenges

Besides the cost of implementing sustainable AI, determining which workloads are consuming power within a distributed, cloud-based environment is itself a problem, says Yugal Joshi, a partner at consulting firm Everest Group. As a result, he says, most companies focus first on business results from AI, and only then on sustainability.

Another challenge, says Salesforce’s Gamazaychikov, is getting information from developers about the carbon footprint of their foundational AI models. With added regulation from sources such as the European Union and the U.S. Securities and Exchange Commission, “if companies don’t disclose the numbers already, they’ll have to start doing so soon,” he says.

Yet another is the lure of dramatic AI-powered breakthroughs, whatever the cost to the environment.

“Some companies say ‘I want to be sustainable,’ but they also want to be known for the excellence of their AI, and their employees want to do something transformational,” says Colby College’s Stent. “Until financial pressures force their AI efforts to become more efficient,” she says, “something else will drive them away from sustainability.”
