Posted on

Machine learning makes smarter batteries

Renewable energy sources such as wind and solar are naturally intermittent. To balance their supply and demand, the batteries of electric vehicles, for example, can be charged and act as an energy buffer for the power grid. Cars spend most of their time idle and could feed their electricity back into the grid during that time. While this is still a dream of the future, the commercialization of electric and hybrid vehicles is already creating a growing demand for long-lasting batteries, both for driving and for grid buffering. Consequently, methods for evaluating the state of a battery will become increasingly important.

The long duration of battery health tests is a problem that hinders the rapid development of new batteries. Better battery life forecasting methods are therefore urgently needed but are extremely difficult to develop. Now, Severson and her colleagues report in the journal Nature Energy that machine learning can help to predict battery life by creating computer models. The published algorithms use data from early-stage charge and discharge cycles.

Battery health is normally described by a figure of merit, the state of health. It quantifies the ability of the battery to store energy relative to its original state: 100% when the battery is new, decreasing over time. In this respect it resembles the state of charge of a battery. Estimating the state of charge is, in turn, important to ensure safe and correct use. However, there is no consensus in industry or science as to what exactly a battery’s state of health is or how it should be determined.

The state of health of a battery reflects two signs of aging: progressive capacity decline and impedance increase (a measure of the battery’s internal electrical resistance). Estimates of the state of charge of a battery must therefore take into account both the drop in capacity and the increase in impedance.
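To make this figure of merit concrete, here is a minimal sketch (with made-up numbers) of how a capacity-based state of health and the accompanying impedance growth could be computed:

```python
# Minimal sketch (hypothetical numbers): capacity-based state of health (SOH)
# with an additional check on impedance growth.

def state_of_health(measured_capacity_ah: float, nominal_capacity_ah: float) -> float:
    """Return SOH in percent: remaining capacity relative to a new cell."""
    return 100.0 * measured_capacity_ah / nominal_capacity_ah

def impedance_growth(measured_mohm: float, initial_mohm: float) -> float:
    """Return relative impedance increase in percent."""
    return 100.0 * (measured_mohm - initial_mohm) / initial_mohm

if __name__ == "__main__":
    soh = state_of_health(measured_capacity_ah=0.95, nominal_capacity_ah=1.1)  # ~86 %
    r_growth = impedance_growth(measured_mohm=42.0, initial_mohm=35.0)         # ~20 %
    print(f"SOH: {soh:.1f} %, impedance increase: {r_growth:.1f} %")
```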

Lithium-ion batteries, however, are complex systems in which both capacity fade and impedance increase are caused by multiple interacting processes. Most of these processes cannot be studied independently since they often occur simultaneously. The state of health can therefore not be determined from a single direct measurement. Conventional health assessment methods include examining the interactions between the electrodes of a battery. Since such methods often intervene directly in the battery system, they render the battery useless, which is hardly desirable.

A battery’s health status can also be determined in less invasive ways, for example using adaptive models and experimental techniques. Adaptive models learn from recorded battery performance data and adjust themselves accordingly. They are useful if system-specific battery information is not available, and they are suitable for diagnosing aging processes. The main problem, however, is that they must be trained with experimental data before they can be used to determine the current capacity of a battery.

Experimental techniques are used to evaluate specific physical processes and failure mechanisms, which allows the rate of future capacity loss to be estimated. Unfortunately, these methods cannot detect intermittent errors. Alternative techniques use the rate of voltage or capacity change (rather than raw voltage and current data). To accelerate the development of battery technology, further methods are needed that can accurately predict battery lifetime.
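One such derived signal is the differential capacity dQ/dV. The short sketch below uses a made-up charge curve as a stand-in for measured data; peaks in such curves are commonly tracked because their shifts and shrinkage indicate aging:

```python
import numpy as np

# Minimal sketch (hypothetical data): differential capacity dQ/dV computed from
# one charge curve, a derived signal often used instead of raw voltage/current.
voltage_v = np.linspace(3.0, 4.2, 200)                          # stand-in voltage samples
capacity_ah = 1.1 / (1.0 + np.exp(-10.0 * (voltage_v - 3.7)))   # stand-in cumulative capacity

dq_dv = np.gradient(capacity_ah, voltage_v)                     # numerical derivative dQ/dV
peak_v = voltage_v[np.argmax(dq_dv)]
print(f"dQ/dV peaks at {peak_v:.2f} V")                         # peak changes track aging
```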

Severson and her colleagues have created a comprehensive data set that includes the performance data of 124 commercial lithium-ion batteries during their charge and discharge cycles. The authors used a variety of rapid-charging conditions with identical discharge conditions, which produced a wide spread of battery lifetimes, ranging from 150 to 2,300 cycles.

The researchers then used machine learning algorithms to analyze the data, creating models that can reliably predict battery life. Their models use data from only the first 100 cycles of each experimentally characterized battery, at which point the cells do not yet show clear signs of capacity fade. The best model predicted battery lifetime with about 91% accuracy. Using only the first five cycles, batteries could be classified into categories with short (<550 cycles) or long lifetimes.
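To illustrate the idea of early-cycle classification, here is a purely illustrative sketch: one hand-crafted feature per cell (for instance, a summary statistic of how the discharge curve changes between two early cycles) feeds a simple classifier that separates short-lived from long-lived cells. The data, the feature, and the model are hypothetical stand-ins, not the authors’ published pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Purely illustrative sketch (synthetic data): classify cells as short- vs
# long-lived from one early-cycle feature.
rng = np.random.default_rng(0)

n_cells = 40
feature = rng.normal(loc=-4.0, scale=1.0, size=n_cells)          # stand-in early-cycle feature
cycle_life = 10 ** (3.0 - 0.3 * (feature + 4.0)) + rng.normal(0.0, 50.0, n_cells)
labels = (cycle_life < 550).astype(int)                           # 1 = short-lived (<550 cycles)

clf = LogisticRegression().fit(feature.reshape(-1, 1), labels)
print("training accuracy:", clf.score(feature.reshape(-1, 1), labels))
```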

The researchers’ work shows that data-driven modeling with machine learning makes it possible to forecast the state of health of lithium-ion batteries. The models can identify aging processes that are not otherwise apparent in capacity data during early cycles. Accordingly, the new approach complements previous predictive models. At Frontis Energy, we also see the opportunity to combine such data with models that predict the behavior of other complex dynamic systems.

(Photo: Wikipedia)

Posted on

Melting ice sheet in Greenland contributes 25% to sea level rise

Recently we reported on the loss of snow cover in Europe. Not only is the snow gone in many parts of Europe; Greenland’s ice cover is melting as well. The Greenland ice sheet contributes 25% to global sea-level rise, making it the largest contributor from the cryosphere. The increased mass loss of Greenland ice during the 21st century is mainly due to increased surface-water runoff, of which ~93% comes directly from the small ablation zone of the ice sheet (~22% of the ice surface). As the snow melts in summer, bare glacier ice becomes exposed in this ablation zone. Bare ice is darker and less porous than snow: it absorbs more than twice as much solar radiation while retaining less meltwater. Bare ice produces a large proportion (~78%) of Greenland’s total runoff into the sea, even though only a small area of the ice is exposed in summer. Accurately capturing the reduced albedo and the full extent of bare ice in climate models is critical to determining Greenland’s present and future runoff contribution to sea-level rise.

The mass loss of the Greenland ice sheet has recently increased due to the accelerated melting of its surface. As this melting is critically affected by surface albedo, understanding the processes and potential feedbacks regarding albedo is required for accurately forecasting mass loss. The resulting radiative variability of the ablation zone melted the ice five times faster than hydrological and biological processes, which also darken the ice sheet. Because the ice-sheet surface is flatter at higher elevations, variations in the snow line have an even greater impact on melt in a warmer climate. As a result of these fluctuations, the mapped bare-ice area in the summer of 2012, the record year of snowmelt, was the largest observed, covering 300,050 km2; that is, bare ice accounted for 16% of the ice surface. The smallest extent of bare ice, 184,660 km2, was observed in 2006. This corresponded to 10% of the ice surface, i.e. almost 40% less area than in 2012. However, the observed variation was high and the observation period too short for a robust trend assessment.
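A quick back-of-the-envelope check of the “almost 40%” figure from the two mapped extents cited above:

```python
# Arithmetic check of the mapped bare-ice extents cited in the text.
area_2012_km2 = 300_050   # record melt year 2012, ~16% of the ice-sheet surface
area_2006_km2 = 184_660   # smallest mapped extent (2006), ~10% of the surface

shrinkage = (area_2012_km2 - area_2006_km2) / area_2012_km2
print(f"2006 bare-ice area was {shrinkage:.0%} smaller than in 2012")  # ~38 %
```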

Current climate models are too inaccurate at predicting sea-level rise during flood years, leading to uncertainty in estimates of Greenland’s contribution to global sea-level rise. To understand the factors that influence melting, Jonathan Ryan of Brown University, Providence, Rhode Island, and his colleagues investigated Greenland’s snow line. At altitudes below the snow line, the darker ice is not covered by snow. This snow line moves up or down with Greenland’s seasons. The researchers mapped these movements between 2001 and 2017 using satellite images. The average height of the snow line at the end of summer varied between 1,330 m in 2009 and 1,650 m in 2012. These fluctuations of the snow line are the most important factor determining how much solar energy the ice sheet absorbs, and modelers must consider this effect to improve their predictions. Knowing how much and how fast the Greenland ice melts will help us take better protective measures. At Frontis Energy, we think that the best protection against sea-level rise is the avoidance and recycling of CO2.

(Photo: Wikipedia)

Posted on

Economic losses caused by flooding due to global warming

In Europe, floods are linked to strong fluctuations of atmospheric pressure, known as the North Atlantic Oscillation. Stefan Zanardo and his colleagues at Risk Management Solutions, London, UK, analyzed historical records of severe floods in Europe since 1870 and compared them with the patterns of atmospheric pressure at the time of the floods. When the North Atlantic Oscillation is in a positive state, a depression over Iceland drives wind and storms across northern Europe. In its negative state, by contrast, it makes southern Europe wetter than usual. Floods normally occur in northern Europe, and they cause the most damage when the North Atlantic Oscillation is positive in winter. If enough rain has already fallen to saturate the soil, the conditions for a high flood risk are met. Air pressure patterns over Europe may change with global warming, and public administrations should take this into account when assessing flood risk in a region, the researchers say.

This is important because flooding in Europe often causes loss of life, significant property damage, and business interruptions. Global warming will further worsen this situation, and the distribution of risk will change as well. The frequent occurrence of catastrophic flooding in recent years has sparked strong interest in this problem in both the public and private sectors. The public sector has been working to improve early warning systems, which indeed have economic benefits. In addition, various risk-mitigation strategies have been implemented in European countries. These include flood protection, measures to raise risk awareness, and risk transfer through better dissemination of flood insurance. The fight against the root cause, namely global warming, however, still lags far behind what is needed.

Correlations between large-scale climate patterns, in particular the North Atlantic Oscillation, and extreme events in the water cycle on the European continent have long been described in the literature. With flooding becoming more severe and more frequent, and with alarming global warming scenarios, concerns over future flood-related economic losses have moved into the focus of public attention. Although it is known that climatic patterns also control meteorological events, it is not always clear whether this link affects the frequency and severity of flooding and the associated economic losses. In their study, the researchers relate the North Atlantic Oscillation to economic flood losses.

The researchers used recent data from flood databases as well as catastrophe models to establish this relation. The models allowed them to quantify the economic losses ultimately caused by the North Atlantic Oscillation. These losses vary widely between the countries within the influence of the North Atlantic Oscillation. The study shows that the North Atlantic Oscillation is a good predictor of average losses in the long term. Based on its predictability, the researchers argue that, in particular, the temporal variations of flood risk caused by climate oscillations can be forecast. This can help counter catastrophic flood events early on, so that flood damage can be minimized or even avoided. As scientists improve their predictions of the North Atlantic Oscillation, society will be better prepared for future flooding.
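As a minimal illustration of how a climate index can be related to losses, the sketch below correlates a winter NAO index with annual flood losses; both series are made-up stand-ins, not the study’s data:

```python
import numpy as np

# Minimal sketch (made-up data): correlate a winter NAO index with annual
# flood losses to gauge long-term predictability of average losses.
rng = np.random.default_rng(1)

years = np.arange(1990, 2020)
nao_winter = rng.normal(0.0, 1.0, size=years.size)                   # stand-in NAO index
losses_meur = 200 + 80 * nao_winter + rng.normal(0, 60, years.size)  # stand-in losses (M EUR)

r = np.corrcoef(nao_winter, losses_meur)[0, 1]
slope, intercept = np.polyfit(nao_winter, losses_meur, 1)
print(f"correlation r = {r:.2f}; expected loss ~ {intercept:.0f} + {slope:.0f} x NAO (M EUR)")
```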

(Photo: Wikipedia, Stefan Penninger, Sweden)

Posted on

Producing liquid bio-electrically engineered fuels from CO2

At Frontis Energy we have spent much thought on how to recycle CO2. While high-value products such as polymers for medical applications are more profitable, customer demand for such products is too low to recycle CO2 in the volumes required to bring our atmosphere back to pre-industrial levels. Biofuel, for example from field crops or algae, has long been thought to be the solution. Unfortunately, such crops require too much arable land. On top of the land use, the underlying biochemical pathways are too complex for the human mind to grasp unaided. We therefore propose a different way to reach the target of decarbonizing our planet quickly. The procedure begins with a desired target fuel and suggests a microbial consortium to produce this fuel. In a second step, the consortium is examined in a bio-electrical system (BES).

CO2 can be used for liquid fuel production via multiple pathways. The end products, long-chain alcohols, can either be used directly as fuel or be reduced to hydrocarbons. Shown are examples of high-level BEEF pathways using CO2 and electricity as input and methane, acetate, or butanol as output. Subsequent processes are: 1, aerobic methane oxidation; 2, direct use of methane; 3, heterotrophic phototrophs; 4, acetone-butanol fermentation; 5, heterotrophs; 6, direct use of butanol; 7, further processing by yeasts.

Today’s atmospheric CO2 imbalance is a consequence of fossil carbon combustion. This reality requires quick and pragmatic solutions if further CO2 accumulation is to be prevented. Direct air capture of CO2 is moving closer to economic feasibility, avoiding the use of arable land to grow fuel crops. Producing combustible fuel from CO2 is the most promising intermediate solution because such fuel integrates seamlessly into existing urban infrastructure. Biofuels have been explored intensively in recent years, in particular within the emerging field of synthetic biology. However tempting the application of genetically modified organisms (GMOs) appears, non-GMO technology is easier and faster to implement because the required microbial strains already exist.

Avoiding GMOs, CO2 can be used in BES to produce C1 fuels like methane and precursors like formic acid or syngas, as well as C1+ compounds like acetate, 2-oxobutyrate, butyrate, ethanol, and butanol. At the same time, BES integrate well into urban infrastructure without the need for arable land. However, except for methane, none of these fuels are readily combustible in their pure form. While electromethane is a commercially available alternative to fossil natural gas, its volumetric energy density of 40-80 MJ/m3 is far lower than that of gasoline at 35-45 GJ/m3. This, the necessary technical modifications, and the psychological barrier of fueling with a gas make methane hard to sell to motorists.

To produce liquid fuel, carbon chains need to be elongated, with alcohols or, better, hydrocarbons as final products. To this end, syngas (CO + H2) is theoretically a viable option via the Fischer-Tropsch process. In reality, syngas precursors are either fossil fuels (e.g. coal, natural gas, methanol) or biomass. While the former is obviously not CO2-neutral, the latter competes for arable land. The direct conversion of CO2 and electrolytic H2 to C1+ fuels, in turn, is catalyzed by electroactive microbes in the dark (see title figure), avoiding competition with food crops for sun-lit land. Unfortunately, little research has been undertaken beyond proof of concept with a few electroactive strains. In stark contrast, a plethora of metabolic studies in non-BES systems is available. These studies often propose the use of GMOs or complex organic substrates as precursors. We propose to systematically identify metabolic strategies for liquid bio-electrically engineered fuel (BEEF) production. The fastest approach should start by screening metabolic databases using established methods of metabolic modeling, followed by high-throughput hypothesis testing in BES. Since H2 is the intermediate in bio-electrosynthesis, the most efficient strategy is to focus on CO2 and H2 as direct precursors with as few intermediate steps as possible. Scalability and energy efficiency, that is, economic feasibility, are pivotal.
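For the volumetric comparison above, a rough back-of-the-envelope calculation (taking ~40 MJ/m3 for methane at ambient pressure and ~35 GJ/m3 for gasoline, as in the text) shows the gap is about three orders of magnitude:

```python
# Rough volumetric energy-density comparison using the figures from the text.
methane_mj_per_m3 = 40.0        # ~40-80 MJ/m3 (uncompressed methane)
gasoline_mj_per_m3 = 35_000.0   # ~35-45 GJ/m3 = 35,000-45,000 MJ/m3

ratio = gasoline_mj_per_m3 / methane_mj_per_m3
print(f"Gasoline stores roughly {ratio:.0f}x more energy per unit volume "
      f"than uncompressed methane.")   # ~875x, i.e. about three orders of magnitude
```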

First, an electrotrophic acetogen produces acetate, which is then used by heterotrophic algae in a consecutive step.

The biggest obstacle for BEEF production is the lack of knowledge about pathways that use CO2 and electrolytic H2. This gap exists despite metabolic databases like KEGG and, more recently, KBase, making metabolic design and adequate BEEF strain selection a guessing game rather than an educated approach. Nonetheless, metabolic tools have been used to model fuel production in single-cell yeasts and various prokaryotes. In spite of their shortcomings, metabolic databases have also been employed to model species interactions, for example in a photo-heterotroph consortium, using software like ModelSEED / KBase (http://modelseed.org/), RAVEN / KEGG, and COBRA. A first systematic attempt at BEEF cultures producing acetate demonstrated the usability of KBase for BES. This was a bottom-up study which mapped existing genomes onto existing BEEF consortia. The same tool can also be employed in a top-down approach, starting with the desired fuel to find the required organisms. Some possible BEEF organisms are the following.
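To give an idea of what such a top-down in-silico screen could look like, the sketch below uses the COBRApy toolbox to load a genome-scale model and test whether flux toward a target fuel is possible with CO2 and H2 as the only carbon and electron inputs. The model file and the exchange-reaction identifiers are placeholders that would have to match the actual model drawn from KBase or ModelSEED:

```python
import cobra

# Hedged sketch of a top-down in-silico screen: can a candidate organism route
# CO2 + H2 to a target fuel? File name and exchange-reaction IDs are placeholders.
model = cobra.io.read_sbml_model("candidate_acetogen.xml")   # placeholder model file

# Allow only inorganic uptake plus CO2 and H2; close every other exchange.
allowed_uptake = {"EX_co2_e": -10.0, "EX_h2_e": -20.0,        # mmol/gDW/h
                  "EX_h2o_e": -1000.0, "EX_h_e": -1000.0,
                  "EX_pi_e": -10.0, "EX_nh4_e": -10.0}
for exchange in model.exchanges:
    exchange.lower_bound = allowed_uptake.get(exchange.id, 0.0)

# Maximize secretion of the target fuel (here: the acetate exchange reaction).
model.objective = "EX_ac_e"
solution = model.optimize()
print(f"maximum acetate flux: {solution.objective_value:.2f} mmol/gDW/h")
```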

Possible pathways for BEEF production involving Clostridium, heterotrophic phototrophs (3), and further processing by yeasts (7)

Yeasts are among the microorganisms with the greatest potential for liquid biofuel production. Baker’s yeast (Saccharomyces cerevisiae) is the most prominent example. While known for ethanol fermentation, yeasts also produce fusel oils such as butane-, phenyl-, and amyl-derived aldehydes and alcohols. Unlike ethanol, which is formed via sugar fermentation, fusel oil is synthesized in branched-off amino acid pathways followed by aldehyde reduction. Many enzymes involved in the reduction of aldehydes have been identified, with alcohol dehydrogenases being the most commonly observed. The corresponding reduction reactions require the reduced cofactor NADH, but it is not known whether H2 produced at the cathodes of BES can be involved.
Clostridia, for example Clostridium acetobutylicum and C. carboxidivorans, can produce alcohols like butanol, isopropanol, and hexanol, and ketones like acetone from complex substrates (starch, whey, cellulose, etc.) or from syngas. Clostridial metabolism was elucidated some time ago and differs from that of yeast. It does not necessarily require complex precursors for NAD+ reduction, and it has been shown that H2, CO, and cathodes can donate electrons for alcohol production. CO2 and H2 were used in a GMO clostridium to produce high titers of isobutanol. Typical representatives for acetate production from CO2 and H2 are C. ljungdahlii, C. aceticum, and Butyribacterium methylotrophicum. Sporomusa sphaeroides produces acetate in BES. Clostridia also dominated mixed-culture BES converting CO2 to butyrate. They are therefore prime targets for low-cost biofuel production. Alcohols in clostridia are produced from acetyl-CoA. This reaction is reversible, allowing acetate to serve as substrate for biofuel production with extracellular energy supply. Energy conservation, that is ATP synthesis, can then be achieved via ethanol electron bifurcation or H2 oxidation through respiration. While possible in anaerobic clostridia, it is hitherto unknown whether electron bifurcation or respiration are linked to alcohol or ketone synthesis.
Phototrophs like Botryococcus produce C1+ biofuels as well. They synthesize a number of different hydrocarbons, including high-value alkanes and alkenes as well as terpenes. However, high titers have been achieved only by means of genetic engineering, which is economically not feasible in many countries due to regulatory constraints. Moreover, aldehyde dehydration/deformylation to alkanes or alkenes requires molecular oxygen to be present. The olefin pathway of Synechococcus also depends on molecular oxygen, with cytochrome P450 involved in fatty acid decarboxylation. The presence of molecular oxygen affects BES performance due to immediate product degradation and unwanted cathodic oxygen reduction. In contrast, our own preliminary experiments (see title photo) and a corrosion experiment show that algae can live in the dark using electrons from a cathode. While the enzymes involved in the production of some algal biofuels are known (such as those for olefin synthesis and aldehyde deformylation), it is not known whether these pathways are connected to H2 utilization (perhaps via ferredoxins). Such a connection would be a promising indicator for the possibility of growing hydrocarbon-producing cyanobacteria on the cathodes of BES and should be examined in future research.
At Frontis Energy we believe that a number of other microorganisms show potential for BEEF production, and these deserve further investigation. To avoid GMOs, BES-compatible co-cultures must be identified via in silico metabolic reconstruction from existing databases. Possible inter-species intermediates are unknown but are a prerequisite for successful BES operation. Finally, a techno-economic assessment of BEEF production, with and without carbon taxes, and compared with chemical methods, will direct future research.

Posted on

You Can Have the Pie and Eat It

In Paris, humanity has set itself the goal of limiting global warming to 1.5 °C. Most people believe that this will be accompanied by significant sacrifices in quality of life. That is one reason why climate protection is simply rejected by many people, sometimes to the point of outright denial. At Frontis Energy, we think we can protect the climate and live better. The latest study published in Nature Energy by a research group around Arnulf Grubler of the International Institute for Applied Systems Analysis in Laxenburg, Austria, shows that we have good reasons to think so.

The team used computer models to explore the potential of technological trends to reduce energy consumption. Among other things, the researchers assumed that the use of shared car services will increase and that fossil fuels will give way to solar energy and other forms of renewable energy. Their results show that global energy consumption would decrease by about 40%, regardless of population, income, and economic growth. Air pollution and the demand for biofuels would also decrease, which would improve health and food supplies.

In contrast to many previous assessments, the group’s findings suggest that humans can limit the temperature rise to 1.5 °C above preindustrial levels without resorting to drastic strategies to extract CO2 from the atmosphere later in the century.

Now, one can argue about whether shared car services reduce quality of life. Nevertheless, we think that individual mobility can be maintained while protecting our climate. CO2 recovery for fuel production (CO2 recycling, that is) is one such possibility. Power-to-gas technology is the most advanced form of CO2 recycling and should certainly be considered in future studies. An example of such an assessment of power-to-gas technology was published by a Swiss research group headed by Frédéric Meylan, who found that the carbon footprint can be neutralized with conventional technology after just a few cycles.

(Picture: Pieter Bruegel the Elder, The Land of Cockaigne, Wikipedia)

Posted on

Mapping Waste-to-Energy

Most readers of our blog know that waste can be easily converted into energy, such as in biogas plants. Biogas, biohydrogen, and biodiesel are biofuels because they are biologically produced by microorganisms or plants. Biofuel facilities are found worldwide. However, nobody knows exactly where these biofuel plants are located and where they can be operated most economically. This knowledge gap hampers market access for biofuel producers.

At least for the United States − the largest market for biofuels − there is now a map. A research team from the Pacific Northwest National Laboratory (PNNL) and the National Renewable Energy Laboratory (NREL) published a detailed analysis of the potential for waste-to-energy in the US in the journal Renewable and Sustainable Energy Reviews.

The group focused on liquid biofuels that can be recovered from sewage sludge using the Fischer-Tropsch process. The industrial process was originally developed in Germany in the 1920s for coal liquefaction and later used there at industrial scale, but it can also be used to liquefy other organic materials, such as biomass. The resulting oil is similar to petroleum but contains small amounts of oxygen and water. A side effect is that nutrients such as phosphate can be recovered.
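As a rough illustration of the underlying chemistry, Fischer-Tropsch synthesis builds alkanes from syngas approximately according to n CO + (2n+1) H2 → CnH2n+2 + n H2O. The short sketch below merely evaluates the required H2:CO ratio for a few chain lengths:

```python
# Fischer-Tropsch alkane synthesis: n CO + (2n+1) H2 -> CnH(2n+2) + n H2O.
# The required H2:CO ratio approaches 2 for long hydrocarbon chains.
def h2_per_co(n_carbon: int) -> float:
    """Moles of H2 needed per mole of CO for an alkane with n_carbon atoms."""
    return (2 * n_carbon + 1) / n_carbon

for n in (1, 4, 8, 16):
    print(f"C{n} alkane: H2/CO = {h2_per_co(n):.2f}")
```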

The research group coupled the best available information on these organic wastes from their database with computer models to estimate the quantities and the best geographical distribution of the potential production of liquid biofuel. The results suggest that the United States could produce more than 20 billion liters of liquid biofuel per year.

The group also found that the potential for liquid biofuel from the sewage sludge of public wastewater treatment plants is 4 billion liters per year. This resource is widespread throughout the country − with a high density of sites on the east coast as well as in the largest cities. Animal manure has a potential of 10 billion liters of liquid biofuel per year, with the largest untapped resources in the Midwest.

The potential for liquid biofuel from food waste also follows population density. For metropolitan areas such as Los Angeles, Seattle, Las Vegas, and New York, the researchers estimate that such waste could yield more than 3 billion liters per year. However, food leftovers also had the lowest conversion efficiency. This is also the biggest criticism of the Fischer-Tropsch process: plants producing significant quantities of liquid fuel are much larger than conventional refineries, consume a lot of energy, and produce more CO2 than they save.

Better processes for biomass liquefaction and more efficient use of biomass still remain a challenge for industry and science.

(Photo: Wikipedia)

Posted on

The Photosynthetic CO2 Race − Plants vs. Algae

Algae store CO2 but also release it. Some of us may know that. However, it was so far unknown that algae may release additional CO2 because of global warming. That is what Chao Song and his colleagues at the University of Georgia in Athens, GA, found out.

As they report in the journal Nature Geoscience, the metabolism of algae and other microbes in large streams is accelerated by higher water temperatures. This could lead to some rivers releasing more CO2 than they do now, which could in turn further accelerate global warming. Although photosynthesis in algae would also accelerate, plants along the river banks would grow even faster. Decomposition of that plant material would promptly release the CO2 it had fixed. With the extra nutrients from the plants, competing microorganisms would overgrow the river algae, or the algae would degrade the plant material themselves.

To calculate the net CO2 effect, the scientists monitored temperature, dissolved oxygen, and other parameters in 70 rivers worldwide and then fed their data into computer models. These models suggest that, over time, accelerated photosynthesis in some rivers may not keep pace with plant growth. The resulting net increase of 24% in the CO2 released from rivers could mean an additional global temperature increase of 1 °C.

However, the computer models still lack some data. For example, sedimentation rates are not taken into account. In addition, plants do not grow on all river banks; many rivers pass through only sparsely vegetated land. As always, more research is needed to get better answers.

(Photo: Wikipedia)