Immersion cooling: a short comparison of EV battery thermal management approaches
In a recent article, we covered the phenomenon (and problem) of thermal runaway in batteries, particularly lithium-ion batteries, and how engineers are pressed to find effective ways of mitigating the heat that triggers it.
In this article, we pick up where we left off and explain one of the most novel battery thermal management technologies: immersion cooling, a technique with big technical challenges and even bigger promise for ultra-fast charging.
The current state of the battery cooling art
To understand immersion cooling’s role in the evolution of EV technology, let’s first talk about what immersion cooling is not. There are a handful of industry-standard cooling methods that, while less effective, are both easier and cheaper to implement than immersion cooling.
The first and simplest method is passive air cooling. Air is simply allowed to pass over the battery and carry away heat. When the vehicle is moving and proper attention is given to routing the surrounding airflow, this method can be reasonably effective. However, the battery sees its most intense thermal loading during charging (regenerative braking aside), when the vehicle isn’t moving and no air is rushing past.
So, to meet the ever-increasing charging demands of today’s consumer vehicles, passively air-cooled systems are becoming more and more obsolete. Moving up the ladder of effectiveness, we come to actively air-cooled systems. Still employing the advantageous qualities of air as a coolant, these systems achieve higher cooling rates because the airflow over the hot battery is now actively driven by a fan or pump (open or closed configurations, respectively). More airflow means more heat can be absorbed and carried away by the larger mass of moving air.
A short terminology aside
When engineers and scientists talk about thermal management problems like this, the most relevant metric they refer to is the “overall heat transfer coefficient”. While its physical derivation is the topic of a week’s worth of college-level heat transfer theory, the layman may regard it as a measure of how effectively heat in a particular scenario gets from where it is to where you want it to be.
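To make the coefficient concrete, here is a minimal sketch of the textbook relation behind it, Q = U · A · ΔT (heat flow equals coefficient times area times temperature difference). The battery surface area and temperature difference below are purely hypothetical numbers chosen to show the units at work:

```python
# Illustrative sketch of Q = U * A * dT (Newton's law of cooling).
# All specific values here are hypothetical, not from a real battery pack.

def heat_flow_watts(u_w_per_m2k: float, area_m2: float, delta_t_k: float) -> float:
    """Heat transfer rate in watts for coefficient U, area A, and temp difference dT."""
    return u_w_per_m2k * area_m2 * delta_t_k

# A hypothetical 2 m^2 battery surface running 20 K hotter than its coolant,
# with a modest air-cooling coefficient of 20 W/m^2K:
q_air = heat_flow_watts(20.0, 2.0, 20.0)
print(f"Heat removed: {q_air:.0f} W")  # 800 W
```

The takeaway: for a fixed surface area and temperature difference, the heat you can remove scales directly with the coefficient, which is why the comparisons below all quote it.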
Back to air…
Air, however, while non-corrosive, non-toxic, abundant, and (basically) free, is relatively bad at transferring heat. It’s why double-paned windows are so effective and why down jackets are so warm. The simple explanation for the insulating effects of air is that it is a gas: gases are usually 2-3 orders of magnitude (100x-1000x) less dense than liquids and solids at standard conditions and thus conduct heat very poorly. Typical values for engineered heat transfer systems with air as the coolant lie in the range of 5-30 W/m²K (if the units mean nothing to you, that’s okay — just know we are going to compare values with the same units).
So what about liquids?
Because of their higher density and higher thermal conductivity, liquids are typically (simplifying here…) much more effective at absorbing heat than gases. In fact, liquids have been used since the 1940s in the automotive industry to cool engines and other components. Most ubiquitous is the example of the classic car radiator: a system that pumps liquid coolant throughout hot engine compartments and then cools that liquid with air passing over an array of tiny metal fins.
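A quick back-of-the-envelope calculation shows just how lopsided the comparison is. The product of density and specific heat (ρ · cp) tells you how much heat a given volume of fluid soaks up per degree of temperature rise; the property values below are approximate room-temperature figures:

```python
# Rough comparison of volumetric heat capacity (rho * cp) for air vs. water.
# Property values are approximate, at roughly room temperature and pressure.

air   = {"rho": 1.2,    "cp": 1005.0}   # density kg/m^3, specific heat J/(kg*K)
water = {"rho": 1000.0, "cp": 4186.0}

def volumetric_heat_capacity(fluid: dict) -> float:
    """Joules absorbed by 1 m^3 of fluid for a 1 K temperature rise."""
    return fluid["rho"] * fluid["cp"]

ratio = volumetric_heat_capacity(water) / volumetric_heat_capacity(air)
print(f"Water absorbs roughly {ratio:.0f}x more heat per unit volume than air")
```

Per unit volume, water holds on the order of a few thousand times more heat than air for the same temperature rise, which is the physical reason liquid cooling dominates once thermal loads get serious. (EV coolants aren’t water, as we’ll see, but the volumetric advantage of liquids generally holds.)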
Liquid-cooled battery systems have a similar arrangement. Some simpler systems aim to cool large plates that sandwich batteries (think of two hamburger buns being used to cool a hot patty inside). Systems like these are known as cold plate systems. For comparison, such cold plate configurations achieve an overall heat transfer coefficient in the range of 50-100 W/m²K, markedly better than air-cooled systems alone.
Immersion cooling: the next step
Up to now, we have mentioned systems that are relatively indirect in their approach. That is, the thermal pathway from the hot thing (the actual battery cells themselves) to the cold thing (air, liquid coolant, etc.) is chock-full of obstacles. In classical EV battery design approaches, the individual cells are connected and then housed in a package, in many cases including barriers specially designed to prevent fires in one cell or group of cells from spreading to neighboring cells. Heat must first flow through this jungle of housing and fireproof barriers (usually designed first with weight and structural robustness in mind, not thermal conductivity) before it sees either the cold plate or air.
The innovation of immersion cooling drastically simplifies the path that heat must take to exit the battery. How? As the name implies, the individual cells themselves are in direct contact with the liquid coolant in an immersion-cooled system. The cells are still held in place with some sort of housing, but the coolant is pumped through small sleeves next to the individual cells.
What kind of requirements does this mean for the coolant? Does it have to be non-flammable in the case of a thermal runaway event? Yes. More importantly, is it allowed to conduct electricity? Nope. So water won’t work? Nope. Is sealing the entire system an engineering headache? Yes.
But there’s a bigger picture here, because the extra effort expended on the design and implementation of such a system is rewarded with a much better overall heat transfer coefficient. In fact, some immersion-cooled systems achieve around 150 W/m²K. This is why immersion-cooled batteries are receiving such attention in today’s fast-evolving EV market: they allow ultra-fast charging. The thermal demands of such an intense charging event require an effective system for dispatching that heat, and immersion cooling fits the bill.
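Putting the three approaches side by side makes the payoff obvious. The sketch below plugs the upper ends of the coefficient ranges quoted in this article into Q = U · A · ΔT, for the same hypothetical battery surface and temperature difference (both numbers are illustrative, not from a real pack):

```python
# Comparing heat removal for the coefficient ranges quoted in this article,
# using the upper end of each range. Area and temperature difference are
# hypothetical, chosen only so the three methods can be compared directly.

U = {"active air": 30.0, "cold plate": 100.0, "immersion": 150.0}  # W/m^2K
AREA_M2 = 2.0     # hypothetical battery surface area
DELTA_T_K = 20.0  # hypothetical cell-to-coolant temperature difference

for name, u in U.items():
    q = u * AREA_M2 * DELTA_T_K  # Q = U * A * dT
    print(f"{name:>10}: {q:6.0f} W")
```

With everything else held equal, the immersion-cooled case removes five times the heat of the best air-cooled case (150 vs. 30 W/m²K), which is exactly the headroom that ultra-fast charging demands.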