Thermoelectrics Session, MRS Memorial
It’s really an honor to be here. I never had the privilege of actually co-authoring a paper with Millie, but I still managed to find myself, somehow, getting advice from Millie in many different aspects — not only in the academic world, but also in the government and the Department of Energy. So it is really a privilege to be here.
Let me begin with a personal story about Millie.
Millie has inspired a lot of people and, in my case, I have two daughters — and Millie inspired both of them. Both of them aspire to be engineers. And I was talking to my daughter, one day, and she was planning to go to a women in STEM program at CalTech, and at that program, they required every participant to pick a role model.
Now, I knew about the program but I didn’t know who she’d chosen. And I found out that — perhaps because of the dinnertime conversations that we’ve had within the family — my daughter had chosen Millie as her role model.
I sent an email to Millie about it:
Attached is a picture of a name tag that my younger daughter is wearing. She will be a senior in high school and wants to do EE in college. We are at CalTech because she wanted to attend a ‘Women in STEM’ event today.
Two months ago when she was registering she was asked to choose a role model. She had picked you and when I realized that this morning I thought I should send you this picture.
Thanks for inspiring Anjali.
So I sent Millie the picture, and, within a couple hours, I got an email back.
“Thank you for your inspiring message,” Millie wrote. Well, she's the one who was inspiring… but I digress. Millie continued: “I’m thrilled to have been chosen as a role model, and I’m happy to hear that she's interested in EE for study. CalTech is to be commended for having a women in science program as a way to improve the climate for women at CalTech. Keep well and keep up the good work. Best regards to you and family. Millie.”
I’ll never forget that. Hopefully, my daughters will continue to be inspired by Millie, as they get older. She was such a powerful role model.
I’m going to talk about heat engines. This subject has been presented many times, and Millie’s work really catalyzed it, inspiring not only some of the folks in Boston but our group as well. When I was at UC Berkeley and Lawrence Berkeley Lab, we had a whole program in which we worked on many aspects of this, and part of that was summarized in a ZT plot. There are several versions of that plot; this is ours.
This plot shows the new materials found with good ZT values, when they were discovered, and the papers in which they were published. I think you can find Jos and Gang and myself on here, as well as many others. But the interesting thing is to look at the arrow. See that arrow, pointing at 1993? That was when Millie and Lyndon Hicks’ paper was published. You can see the way in which that paper catalyzed this whole field.
What is also known, but is not often connected, is Millie’s interest in hydrogen. Michelle Buchanan mentioned, yesterday, that Millie was involved in a study on the Hydrogen Economy, and that they co-authored a paper on hydrogen together.
Now, heat engines and the hydrogen economy — these may sound like completely disparate topics, but what I’m going to do is to try to connect them. I’m going to ask, “Could we use heat engines and hydrogen together, in some way that makes sense?”
Here’s a little background on heat engines. I won’t go through most of this slide, because it’s all pretty well known. Most people learn this in undergraduate thermodynamics. But I will just go over a few key things.
First, what is a heat engine?
A heat engine is a way to convert thermal energy (and chemical energy) into mechanical energy. It sounds complicated, but it really isn’t. The steam engine is a heat engine. Everyone knows how steam engines work, right?
Now, at the moment, the heat that drives our heat engines mostly comes from fuels that pour CO2 into the atmosphere. So if we can create a heat engine that runs off of hydrogen, this could be very advantageous to us.
Entropy is S, by the way. Change in entropy is ΔS.
The only other thing I’ll say about this slide is the following: remember that to get a heat engine, you need an entropy carrier, and you need excitations with entropy.
The entropy can take different forms. If you have a liquid–vapor system, you have the configurational entropy of the gas molecules (which are excited). If you have thermoelectrics, as we just heard, you can have the configurational entropy of the electrons in k-space. Or you can have, as we just heard, spin entropy (spin excitations as well). If you have the electrocaloric effect or the magnetocaloric effect, you have the orientational entropy of the dipoles (electric dipoles and magnetic dipoles). And the other kind of entropy you can have is the entropy associated with redox reactions.
The first kind of entropy — liquid–vapor entropy, which is today’s way of cooling and power generation — is the entropy we’re going to be dealing with, since we’re talking about hydrogen. The theoretical limit of ΔS — or at least what we thought was the theoretical limit — is set by the liquid–vapor phase transition. That’s where the molecule goes from the liquid phase to the vapor phase: water into steam. At that point, the entropy change, expressed per unit charge, works out to just slightly over 1 mV/K. We always just assumed this was the limit!
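That "slightly over 1 mV/K" figure can be checked with a quick back-of-the-envelope calculation. The numbers below are my own standard values (not from the talk), and I'm assuming one electron transferred per water molecule:

```python
# Entropy of vaporization of water, expressed per unit charge, lands
# just above 1 mV/K. Standard textbook values; one electron assumed
# per molecule.

F = 96485.0        # Faraday constant, C/mol
dH_vap = 40650.0   # enthalpy of vaporization of water at 100 C, J/mol
T_boil = 373.15    # boiling point of water, K

dS_vap = dH_vap / T_boil       # entropy of vaporization, J/(mol*K)
alpha = dS_vap / F * 1000.0    # entropy per unit charge, mV/K

print(f"dS_vap = {dS_vap:.1f} J/(mol*K), alpha = {alpha:.2f} mV/K")
# -> dS_vap = 108.9 J/(mol*K), alpha = 1.13 mV/K
```

So the familiar liquid-to-vapor transition of water sets a scale of about 1.1 mV/K, which is the "limit" referred to above.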
You can look at the thermoelectric side. You can see that S and σ, in typical bulk semiconductors, go in opposite directions. It’s very hard to decouple S, σ, and κ. I call it the ‘Bermuda Triangle of thermoelectrics’: once you get in, it’s very hard to get out. I think a lot of people are trying to figure out how to maneuver through that landscape of S, σ, and κ.
And, of course, there are the electrocaloric effects of dipoles.
What we started doing, after having spent about 10 to 15 years in solid state thermoelectrics, was to ask ourselves, “Can we look at liquids and reactions?”
We did, and what we found was very interesting. On the slide above, you can see some redox reactions. These are typically used in flow batteries, and the redox reactions have enthalpy and entropy like any reaction.
But, it turns out, if you look at the V2+/V3+ vanadium redox reaction, which is on the left-hand side, and you then combine it with the ferricyanide/ferrocyanide redox reaction, the two couples have coefficients of 1.7 mV/K and −1.4 mV/K — and when we combine them, you get about 3 mV/K. This is absolutely staggering, because we thought the limit for a single water molecule going from the liquid phase to the vapor phase was about 1 mV/K. Now, we’re finding out it’s 3! We still don’t know why.
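The arithmetic behind "about 3 mV/K" is just that a full cell's temperature coefficient is the difference of its two half-cell coefficients, so opposite signs add up in magnitude:

```python
# The two half-cell temperature coefficients quoted above (mV/K).
# Which coefficient belongs to which couple doesn't change the arithmetic.
alpha_half_1 = 1.7    # mV/K, one redox couple (as quoted in the talk)
alpha_half_2 = -1.4   # mV/K, the other couple

# Cell coefficient = difference of half-cell coefficients
alpha_cell = alpha_half_1 - alpha_half_2
print(alpha_cell)  # about 3.1 mV/K -- the "3 mV/K" figure above
```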
What we think is happening is that an ion typically has a solvation shell around it, with many water molecules. When the ion goes through an oxidation-level change because of electron transfer, more than one water molecule is released from that shell. And it is because of that amplification that we think we’re seeing a higher limit in the change in entropy (ΔS) than you’d normally find for one water molecule going from the liquid phase to the vapor phase.
This is something quite interesting, and we are finding that it happens not only with ferricyanide and vanadium, but also with many other kinds of redox batteries.
So now, you can look at all the flow-battery literature and use it for direct energy conversion between heat and electricity. We have tried that and, as a result, we have now constructed a little system that allows us to generate more power than we previously thought possible. We are getting efficiencies on the order of 20% of Carnot, which is not bad for a first shot.
Now, if we put the new limit for ΔS into the theory, the theory suggests that we should be able to get about 40-50% of Carnot.
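To make the "percent of Carnot" language concrete, here is a quick sketch. The temperatures below are my own hypothetical numbers for a low-grade heat source; they are not values from the talk:

```python
# What "20% of Carnot" means in absolute terms for a hypothetical
# low-grade heat source (temperatures are assumptions, not from the talk).
T_hot = 333.15    # 60 C, in kelvin
T_cold = 293.15   # 20 C, in kelvin

eta_carnot = 1.0 - T_cold / T_hot   # Carnot limit for these temperatures
eta_device = 0.20 * eta_carnot      # "20% of Carnot"

print(f"Carnot limit: {eta_carnot:.1%}, device: {eta_device:.1%}")
# -> Carnot limit: 12.0%, device: 2.4%
```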
But that’s not the interesting part. In typical solid-state thermoelectrics, the entropy (S), the electrons, and the phonons are all coupled together. In our system, they’re not! We’re using redox reactions instead, and we’ve discovered that we can decouple all these things: the entropy depends on the redox species; the electrical conductance depends on charge transport (ion transport) through a membrane, which is very well understood; and the thermal transport is handled by heat exchangers.
This was one of the ideas that I thought might be very interesting, but which was going in a different direction from the vast majority of research on this subject.
The second thing I’d like to talk about is hydrogen. This was a topic that was provoked by Ernest J. Moniz, a former Secretary of Energy. We wrote a report for him, right at the end of his term, on how to do carbon management.
I remember that, at the time, he asked us all a very pertinent question: “What is the R&D that we should be doing, today, that will have a gigaton-scale effect on CO2 in the future?”
We put a task force together to think about exactly that. One of the recommendations we made at the end was to remember that hydrogen is going to be very important in the future. Why? Because, if you want to do something with CO2, you need hydrogen as a reductant. I won’t go into the details of the report except to say that hydrogen has more applications than just fueling a fuel-cell car. You could use hydrogen to turn CO2 into some kind of hydrocarbon, which you can then burn, releasing the CO2 again. Essentially, instead of digging new fuel sources out of the ground, we can recycle the CO2 and hydrocarbons we already have.
So let’s look at hydrogen. How do you produce hydrogen? Well, if you have cheap electricity, you can do it electrochemically. That requires a lot of research in catalysis, and all that research is going on right now — especially in terms of trying to understand the oxygen evolution reaction. I won’t go into that in more detail, but the important thing to know is that the industry today is almost exclusively thermochemical.
The question we asked is, “Can we use the thermochemical means to produce hydrogen?”
If you focus on the science and the science works out, industry can adopt it very quickly, because we already know how to build thermochemical plants and processes.
Therefore, we decided to look at the thermochemical process as being like a heat engine that requires two steps. First, you heat a metal oxide, which drives oxygen out of the material and creates oxygen vacancies inside it. Second, you expose the material to water; the material grabs oxygen from the water to fill those vacancies, splitting the H2O, and hydrogen comes out.
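The two steps described above can be written schematically. The generic oxide MO_x and the nonstoichiometry δ are illustrative notation, not a specific material:

```latex
% Generic two-step thermochemical water-splitting cycle
\begin{align}
\mathrm{MO}_x &\longrightarrow \mathrm{MO}_{x-\delta} + \tfrac{\delta}{2}\,\mathrm{O}_2
  && \text{(thermal reduction, high } T\text{)} \\
\mathrm{MO}_{x-\delta} + \delta\,\mathrm{H_2O} &\longrightarrow \mathrm{MO}_x + \delta\,\mathrm{H_2}
  && \text{(re-oxidation by steam, lower } T\text{)}
\end{align}
```

Summing the two steps, the oxide cancels and the net reaction is simply H2O → H2 + ½O2, driven by heat.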
This is the way to split water. The state-of-the-art materials used in this process are typically cerium oxide (ceria) and ferrites — nothing exotic. The real problem is the temperature required: about 1500°C! Yikes! As you can imagine, that is far too high for today’s chemical industry.
Our big challenge, therefore, has been to reduce the temperature required so that it’s less than 1,000°C. I'll give you a glimpse of how we are approaching this.
Why is this difficult? How can we reduce it to less than 1,000°C?
The answer to why it’s difficult is that it all comes down to thermodynamics and entropy.
If you look at the two reactions we’re using to split the water, for each of those reactions the Gibbs free energy, ΔG = ΔH − TΔS, has to be less than zero, otherwise the reaction wouldn’t happen. If you plot ΔH against ΔS for these two reactions, what you find looks a little like the red line (labeled “TR”, for thermal reduction) on the graph above. That red line is for the high-temperature reduction reaction.
I also put a dashed line at 1500°C, which is the temperature our materials require.
So anything below the solid red line is thermodynamically feasible, because ΔG is less than zero.
For the water-splitting reaction, you can see the blue line in the graph above. The water-splitting reaction is the lower limit on temperature. Anything above that blue line is thermodynamically feasible.
Therefore, if you want a material that does both reactions, you are stuck with a thermodynamic sweet spot, which is the pink shaded triangle on the slide above.
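The sweet-spot argument can be sketched numerically. In the sketch below, the water-splitting values (dH_w, dS_w) are rough gas-phase textbook numbers, and the candidate (dH, dS) pairs are purely illustrative; none of this is measured data from the talk:

```python
# A candidate oxide's oxygen-release enthalpy dH (J per mol O) and
# entropy dS (J/(mol*K) per mol O) must give dG <= 0 for BOTH steps.

def in_sweet_spot(dH, dS, T_red, T_ox, dH_w=242e3, dS_w=44.0):
    """True if (dH, dS) makes both cycle steps thermodynamically feasible."""
    # Step 1, thermal reduction at T_red:  dG = dH - T_red*dS <= 0
    reduction_ok = dH - T_red * dS <= 0.0
    # Step 2, re-oxidation by steam at T_ox. This step is water splitting
    # plus the reverse of step 1, so:
    #   dG = (dH_w - dH) - T_ox*(dS_w - dS) <= 0
    oxidation_ok = (dH_w - dH) - T_ox * (dS_w - dS) <= 0.0
    return reduction_ok and oxidation_ok

# A large-dS candidate fits at 1500 C reduction / 800 C oxidation;
# a smaller-dS candidate does not:
print(in_sweet_spot(520e3, 300.0, 1773.0, 1073.0))  # True
print(in_sweet_spot(450e3, 250.0, 1773.0, 1073.0))  # False
```

This also shows why a large ΔS on oxygen release is the key: it is the only knob that widens the window between the two constraints as the temperatures come down.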
Now, if you go to 1500°C (the dashed line), it’s a big triangle. If you go to 1100°C, it’s a smaller triangle. If you try to go below 1000°C, it’s an even smaller triangle.
So how do you get inside the triangle?
You can see that cerium oxide (the black line with dots at the top) starts inside the triangle and then goes outside. The ferrites (the bottom lines with dots) also start inside the triangle and then go outside.
So how do you really get inside that triangle, and how do you find materials which have a large ΔS when oxygen is removed? That’s the trick.
We’ve struggled with this for a long time. In fact, one of Jos Heremans’ students came to my lab as a postdoc and figured out that the way to really manipulate ΔS here is through phase transitions.
The question is: can we control solid–solid phase transitions?
There’s a paper that came out of North Carolina State University on what are called ‘Entropy-Stabilized Oxides.’ Entropy-stabilized metals are well known: these are the high-entropy alloys, where the entropy of mixing stabilizes the alloy. What was not as well known is that the same effect happens in oxides as well.
If you take magnesium oxide, copper oxide, cobalt oxide, and other oxides in a four- or five-component mixture, put them all together and heat them, the cations mix on the oxygen lattice. And the entropy of mixing does, eventually, stabilize a single phase, which is the rock-salt phase, at high temperature.
If you go down to a lower temperature, you have spinel structures, perovskites — all kinds of structures! But when you go to high temperatures, they all stabilize.
And it is the entropy of mixing all of these cations that stabilizes it.
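The entropy of mixing at work here can be computed directly from the standard ideal-mixing formula, −R Σ xᵢ ln xᵢ; the sketch below is mine, not a calculation from the talk:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def mixing_entropy(fractions):
    """Ideal configurational entropy of mixing, -R * sum(x ln x), J/(mol*K)."""
    return -R * sum(x * math.log(x) for x in fractions if x > 0)

# Five equimolar cations sharing one sublattice gives R*ln(5):
print(mixing_entropy([0.2] * 5))  # about 13.4 J/(mol*K)
```

Going from two components (R ln 2 ≈ 5.8 J/(mol·K)) to five (R ln 5 ≈ 13.4 J/(mol·K)) more than doubles the stabilizing −TΔS term, which is why four- or five-component mixtures are used.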
We believe this can be used for water-splitting. And, indeed, when we tried it in the lab, we did find that we’d reduced our temperature to 1100°C.
Look at the graph above. On the rightmost side, you have CeO2 — which, in reality, doesn’t work. That’s why the bar for CeO2 is so tiny. Then we keep going through different materials.
You can see where I’ve marked the thermodynamic limits, in purple. So the cobalt ferrite and nickel ferrite barely work at 1100°C.
Overall, we are finding our entropy-stabilized oxides are producing much higher levels of hydrogen. We have even gotten down to 1000°C now, which is a big deal in terms of making it compatible with industrial scaling. But the science is still not completely understood.
So let me summarize. The question we are asking is: can we produce hydrogen at less than 1000°C? And we are seeing some signs that perhaps we can.
But it really depends on understanding the roles of all the other cations (Mg, Ni, Co), as well as the crystal structure and the oxidation state of Fe2+/Fe3+ in tetrahedral and octahedral sites. How does that all work? What are the kinetics of all this?
And, finally, can it reduce CO2 to CO? If it works for water-splitting, it should work for CO2 as well, and that would be of great importance in our fight against climate change.
Thank you all very much for listening.