AI is transforming everything, including lubricants
Unless you have been living under a rock for the past year or so, you’re likely familiar with the meteoric rise of artificial intelligence (AI). Many analysts designated 2023 as the breakout year for AI, which seemingly appeared out of nowhere. That perception, of course, belies the years of hard toil by machine learning engineers and data scientists to develop AI solutions.
The demo release of ChatGPT in November 2022 caused a sensation, sparking widespread discussion about the merits of generative AI and its potential impact on multiple sectors. ChatGPT reached 100 million users just a couple of months after going viral on social media.
Generative AI refers to a class of AI systems designed to generate new content. It can create original and diverse outputs without specific programming for each task. AI stands at the forefront of technological evolution, with the potential to revolutionise the way we live, work, and interact with our environment.
AI is propelling us into a new era of innovation and efficiency, from enhanced problem-solving capabilities to unprecedented automation. A recent UBS report anticipates 15x growth in AI revenue from 2022 to 2027, with revenues reaching an estimated USD420 billion. However, concerns over the far-reaching consequences of the technology are dampening the willingness to undertake more AI initiatives.
At the recent Conference of the Parties (COP) meeting in Dubai, United Arab Emirates, AI was one of the stars of the show. The annual meeting of the parties to the United Nations Framework Convention on Climate Change (UNFCCC) is a forum to negotiate and agree on action to tackle climate change. There is a strong belief that rapidly advancing AI technology, with its capacity to process extensive volumes of information, will play a central role in accelerating efforts to mitigate climate change.
At the COP28 Climate Innovation Forum, global technology leaders convened for a series of engaging discussions on cutting-edge solutions, prominently featuring the integration of AI. Google’s Chief Sustainability Officer, Kate Brandt, outlined one instance in which the Chilean government is piloting the use of AI for more efficient grid management. The United Nations (UN) also outlined a partnership with Microsoft that will use AI to monitor each country’s adherence to fossil fuel pledges and evaluate actual emissions performance.
A Boston Consulting Group (BCG) report, commissioned by Google, suggests that AI has the potential to unlock insights that could mitigate 5% to 10% of global greenhouse gas (GHG) emissions by 2030. Certainly, AI holds tremendous potential to expedite the discovery and design of low-emission energy technologies. However, at this stage, numbers are still “pie in the sky.”
Some observers believe, however, that the enormous amount of energy required to power the AI revolution could make matters worse, potentially sending emissions soaring. A study published by Alex de Vries, founder of Digiconomist, a research company dedicated to exposing the unintended consequences of digital trends, proposed that in one scenario AI servers could require 85 to 134 terawatt hours (TWh) of electricity annually. This is akin to the annual electricity consumption of entire countries, such as Argentina, the Netherlands and Sweden.

While AI undeniably wields immense power, scaling it presents notable challenges. The evolution of IT workloads in data centres, especially with the surge in AI applications, has sharply heightened the demand for power, storage and efficient cooling. The substantial computational processing power required for AI generates significant heat, with sharp spikes in heat load a distinctive feature. Highly effective thermal management strategies are critical to sustain optimal functioning.
There is “hope” that the benefits of AI can outrun the incremental energy used to power it, but that is by no means certain. In a recent interview, Brad Smith, the president of Microsoft, stressed the importance of improving the sustainability of the company’s data centres in response to the huge incremental energy demands of AI, as well as the need for greater renewable energy. The data centre sector is at a pivotal crossroads as businesses navigate increasing requirements for AI capabilities while concurrently working to reduce energy consumption, costs and greenhouse gas emissions.
Today, 40% of a data centre’s total energy consumption can be required for cooling, according to Chris Lockett, VP – Electrification and Castrol Product Innovation at bp. Cooling is essential for maintaining the optimal performance of servers and other computing hardware. Power Usage Effectiveness (PUE), the ratio of a facility’s total energy consumption to the energy consumed by its IT equipment, is a key metric to consider when choosing cooling technology. A 30% efficiency benefit can be achieved by transitioning from air cooling to single-phase liquid immersion cooling, says Joseph Star, business development manager at ExxonMobil Product Solutions, in a recent episode of F+L Webcast.
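To make the PUE arithmetic concrete, here is a minimal sketch of how a cut in cooling energy flows through to the metric. All figures are hypothetical examples chosen for illustration; only the 40% cooling share and 30% efficiency benefit echo the estimates quoted above, and the 10% "other overheads" assumption is entirely invented.

```python
# Illustrative Power Usage Effectiveness (PUE) arithmetic.
# PUE = total facility power / IT equipment power; an ideal facility scores 1.0.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Ratio of total facility power draw to power delivered to IT equipment."""
    return total_facility_kw / it_equipment_kw

# Hypothetical air-cooled data centre with 1,000 kW of IT load.
# If cooling is 40% of total consumption and other overheads 10% (assumed),
# IT equipment accounts for the remaining 50% of the total.
it_load = 1000.0
total_air = it_load / 0.5                    # 2,000 kW total facility draw
print(f"Air-cooled PUE: {pue(total_air, it_load):.2f}")   # 2.00

# Apply a 30% cut in cooling energy from moving to single-phase
# liquid immersion cooling (the benefit quoted in the article):
cooling_air = 0.4 * total_air                # 800 kW spent on cooling
cooling_liquid = cooling_air * 0.7           # 560 kW after the 30% saving
total_liquid = total_air - (cooling_air - cooling_liquid)
print(f"Liquid-cooled PUE: {pue(total_liquid, it_load):.2f}")  # 1.76
```

The sketch shows why cooling is the main lever on PUE: the IT load in the denominator is fixed, so every kilowatt saved on cooling comes straight off the ratio.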
Two innovative cooling techniques that have gained popularity for their efficiency and effectiveness are liquid cooling and immersion cooling. While both methods use liquids to remove heat, they differ significantly in their approach and implementation.
Liquid cooling involves circulating a coolant through pipes or tubes that are in direct contact with the heat-generating components, such as central processing units (CPUs), graphics processing units (GPUs) or server chassis. The coolant absorbs heat from these components and then passes through a heat exchanger or radiator to dissipate the heat away from the hardware.
Immersion cooling involves submerging electronic components or entire servers in a non-conductive liquid coolant. The liquid absorbs the heat generated by the components, and natural convection or pumps circulate the liquid to a heat exchanger, where the heat is then expelled from the system.
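The heat balance behind both approaches can be sketched with the standard relation Q = ṁ·c_p·ΔT: the heat a circulating coolant carries away equals mass flow times specific heat times temperature rise. The fluid properties and rack load below are rough, assumed values for a generic hydrocarbon coolant, not data from any vendor named in this article.

```python
# Minimal heat-balance sketch for a circulating liquid coolant.
# Q = m_dot * c_p * delta_T  =>  m_dot = Q / (c_p * delta_T)

def required_flow_kg_s(heat_load_w: float,
                       cp_j_per_kg_k: float,
                       delta_t_k: float) -> float:
    """Coolant mass flow needed to absorb heat_load_w with a delta_t_k rise."""
    return heat_load_w / (cp_j_per_kg_k * delta_t_k)

heat_load = 100_000.0   # 100 kW rack (hypothetical)
cp = 2_000.0            # J/(kg.K), typical order for hydrocarbon fluids (assumed)
delta_t = 10.0          # K temperature rise across the tank (assumed)

flow = required_flow_kg_s(heat_load, cp, delta_t)
print(f"Required coolant flow: {flow:.1f} kg/s")  # 5.0 kg/s
```

The same relation explains the industry's preference for low-viscosity, high-specific-heat fluids: a larger c_p means less flow, and hence less pumping energy, for the same heat load.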
There are two types of immersion cooling:
Single-phase immersion cooling: The coolant remains in a liquid state, absorbing heat and being circulated without changing phase. Single-phase immersion cooling typically uses hydrocarbon fluids such as polyalphaolefin or synthetic ester base oils.
Two-phase immersion cooling: The coolant absorbs heat and changes from a liquid to a gas upon reaching a certain temperature. The gaseous coolant then condenses back into a liquid after passing through a condenser. While more energy efficient, two-phase solutions typically rely on hazardous per- and polyfluoroalkyl substances (PFAS), commonly called “forever chemicals.” Increasing regulation of PFAS, together with the high global warming potential of some fluids, is expected to significantly constrain the ability to leverage two-phase solutions in the future.
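A rough calculation illustrates why two-phase cooling is more energy efficient per kilogram of fluid: boiling absorbs the latent heat of vaporisation on top of the sensible heat a single-phase fluid can carry. All property values below are assumed, order-of-magnitude figures for a generic fluorinated fluid, not specifications of any real product.

```python
# Sensible heat (single-phase) vs latent heat (two-phase) per kg of coolant.
# All property values are rough, assumed figures for illustration only.

def sensible_heat_j_per_kg(cp: float, delta_t: float) -> float:
    """Heat absorbed per kg by warming the liquid without a phase change."""
    return cp * delta_t

def phase_change_heat_j_per_kg(latent: float) -> float:
    """Heat absorbed per kg by boiling the liquid at constant temperature."""
    return latent

cp = 1_100.0        # J/(kg.K), assumed specific heat of a fluorinated fluid
delta_t = 15.0      # K allowable temperature rise (assumed)
latent = 100_000.0  # J/kg latent heat of vaporisation (assumed)

single = sensible_heat_j_per_kg(cp, delta_t)   # 16,500 J/kg
two = phase_change_heat_j_per_kg(latent)       # 100,000 J/kg
print(f"Two-phase absorbs ~{two / single:.0f}x more heat per kg")
```

Under these assumed numbers the two-phase fluid moves several times more heat per kilogram, which is the efficiency advantage the PFAS regulations now put at risk.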
Single-phase immersion cooling represents 8% of the data centre market today. This is expected to grow to 18% by 2032, says Star, driven by the need for more efficient cooling solutions as data centre density increases and energy efficiency becomes a higher priority.
At the Open Compute Project (OCP) Global Summit in San Jose, California, U.S.A., in October 2023, participants shared the expectation that the entire liquid cooling value chain could grow from USD10 billion to USD20 billion by 2032. Immersion cooling represents a fundamentally distinct approach compared to traditional air cooling. Advances in high-performance computing (HPC), and indeed generative AI, are pushing the biggest names in the lubricant industry to get involved in this market.
At the OCP Global Summit, ExxonMobil unveiled a comprehensive suite of synthetic and non-synthetic fluids for single-phase immersion cooling. During the podcast, Star outlined the key characteristics of immersion cooling fluids, including dielectric performance and fluid viscosity. The lower the viscosity, the better, as less energy is required to circulate the fluid, says Star. With more than 200 different components in contact with the immersion cooling fluid, material compatibility is also a vitally important consideration from a base oil and additive standpoint, he says. ExxonMobil claims a 40% reduction in the total cost of ownership of IT equipment is achievable compared to air cooling, alongside enhanced energy efficiency through a lower PUE, a common metric in the data centre industry.
Shell Lubricants also announced a range of single-phase immersion cooling fluids, designed to efficiently cool computer components and reduce energy consumption, in October 2023. The fluids are made from natural gas using Shell’s gas-to-liquids (GTL) process. In December 2023, a collaboration between India’s Infosys, a global leader in digital services and consulting, and Shell New Energies UK Ltd was announced, which aims to foster the adoption of immersion cooling services in data centres. The integrated offering incorporates Shell Immersion Cooling Fluid and Infosys’ AI-first services, solutions, and platforms, including Infosys Topaz, which employs generative AI technologies.
Castrol, another of the world’s leading lubricant brands, has worked with immersion cooling frontrunner Submer since 2021, with Castrol’s ON Immersion Cooling Fluid fully approved for use across Submer’s portfolio of products in early 2023. In July 2023, a collaboration with Hypertec, a global provider of customised IT solutions and services and a specialist in immersion-cooled server design, was announced. Castrol and Hypertec will leverage Castrol’s existing collaboration with Submer.