Posted on

Alternating catholyte flow improves microbial electrosynthesis start-up

Microbial electrolysis is a technology that uses living microorganisms as electro-catalysts in electrolysis cells. The technology can be used for wastewater treatment. Earlier, we proposed that microbial electrolysis be used to decentralize wastewater treatment and biogas production. Since the process converts CO2 into organic compounds using electricity, it can also be used for CO2 valorization. Besides methane, such electrolysis cells produce compounds such as acetic acid (vinegar), caproic acid, and others. The process is then called microbial electrosynthesis.

However, the main problem with microbial electrolysis and electrosynthesis is the long start-up time. The start-up time is the time the microorganisms need to form a biofilm on the electrode surface and to start producing the desired products. It can range from several weeks to several months, depending on the operating conditions and the type of microorganisms. This long start-up time limits the feasibility and scalability of microbial electrosynthesis, as well as its economic and environmental benefits.

Now, scientists at Wageningen University in the Netherlands presented new research aimed at reducing the start-up time of microbial electrosynthesis. Using a novel technique that alternates the direction of the catholyte flow through a three-dimensional electrode, they were able to cut the start-up time to only ten days, a reduction of 50% compared with a conventional flow-through electrode. They hypothesized that this technique enhances mass transfer and biofilm formation, and thus accelerates CO2 reduction and product formation.


The alternating electrolyte flow also reduced the power consumption to 136 kWh per kg of hydrogen. After 60 days, the local hydrogen concentration at the cathode peaked at 600 μM, indicating better mass transport and thus a more active biofilm. The researchers attributed the improved mass transport to the alternating catholyte flow, which distributed the hydrogen more evenly over the cathode layers. In addition, they think that alternating the flow refreshed potential “dead zones” in the cathode chamber.
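To put the reported power consumption into perspective, a quick back-of-the-envelope calculation (our own, not from the study) compares it with the well-known energy content of hydrogen:

```python
# Back-of-the-envelope check (our own numbers, not from the study):
# compare the reported power consumption with the energy content of H2.
H2_LHV_KWH_PER_KG = 33.3   # lower heating value of hydrogen, kWh/kg
H2_HHV_KWH_PER_KG = 39.4   # higher heating value of hydrogen, kWh/kg
power_consumption = 136.0  # reported consumption, kWh per kg of H2

print(f"LHV efficiency: {H2_LHV_KWH_PER_KG / power_consumption:.1%}")  # ~24%
print(f"HHV efficiency: {H2_HHV_KWH_PER_KG / power_consumption:.1%}")  # ~29%
# Note that hydrogen is only an intermediate here: the electrons
# ultimately end up in acetate and other electrosynthesis products.
```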

The pH in the catholyte was 5.8–6.8, within the optimal range for electrosynthetic microorganisms. Production of short- and medium-chain fatty acids was linked to the presence of microorganisms identified as Peptococcaceae and Clostridium sensu stricto 12 species. Hydrogenotrophic methanogenesis was also observed and was linked to Methanobrevibacter, a typical member of microbial electrolysis cells in which higher intermediate hydrogen concentrations drive electrosynthesis at the cathode.

However, the technique has limitations, such as the energy efficiency, the product selectivity, and the scalability of microbial electrosynthesis. Such limitations are typical for bench-top experiments. We are therefore looking forward to seeing an industrial application of this new method.


Posted on

Nanostructured membranes improve the gas separation of carbon dioxide

To reduce greenhouse gas emissions, various technologies are in development that require the separation of mixed gases, such as CO2 and methane or CO2 and nitrogen (CO2/CH4 and CO2/N2). Compared with other separation technologies, polymer membranes are good candidates for industrial use. This is due to their low operating costs, high energy efficiency, and simple scalability.

The gas permeability and selectivity, as well as the cost of these polymer membranes, are the crucial criteria for their industrial use. These criteria are influenced by molecular ordering processes at the nano- and micrometer level during polymerization. However, in most common membranes, the processes regulating molecular order do not occur at these levels, and hence there is little control over them during manufacturing. Not much is known about materials with self-organizing properties and their influence on molecular order and gas separation.

Chemists at the Technical University of Eindhoven in the Netherlands examined the effects of the layer distance within the membrane and of its halogenation on gas separation, and published their results in the MDPI journal Membranes. They focused on the separation of helium, CO2, and nitrogen. For their investigation, the researchers used liquid crystal membranes. Liquid crystal molecules can align in various nanostructures. These structures vary depending on the manufacturing process and can therefore be controlled. As a result, liquid crystal membranes are ideal for investigating the influence of nanostructures on gas separation.

A frequently used manufacturing method is to initiate the self-organization of the reactive liquid crystal molecules in a cell with spacers. This helps to better control the membrane thickness and alignment, and ultimately the molecular orientation. Cross-linking the liquid crystal molecules into a final network, which fixes the nanostructures, is required to achieve mechanical strength. Highly ordered crystal membranes (i.e. not liquid crystals), for example, have a lower gas permeability, but they are also characterized by a higher selectivity for helium and CO2 compared with nitrogen.

A lamellar morphology and the flow direction of the gas also have a great influence on the selectivity and permeability of the membrane. It is also known that halogen atoms such as chlorine or fluorine improve CO2 permeability and selectivity by affecting both gas solubility and diffusion.
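In the solution-diffusion picture commonly used for dense membranes, permeability is the product of a diffusion coefficient and a solubility coefficient, and ideal selectivity is the ratio of two permeabilities. The following is a minimal sketch of that relation; all coefficient values are made-up placeholders for intuition, not data from the paper:

```python
# Solution-diffusion model for dense membranes: P = D * S.
# Ideal selectivity for gas A over gas B is the ratio of permeabilities.
# All coefficients below are illustrative placeholders, not measured values.

def permeability_barrer(D_cm2_s: float, S_cc_cc_cmHg: float) -> float:
    """Permeability in Barrer (1 Barrer = 1e-10 cm3(STP)*cm / (cm2*s*cmHg))."""
    return D_cm2_s * S_cc_cc_cmHg / 1e-10

P_co2 = permeability_barrer(D_cm2_s=2.0e-8, S_cc_cc_cmHg=1.5e-2)  # hypothetical CO2
P_n2  = permeability_barrer(D_cm2_s=1.0e-8, S_cc_cc_cmHg=5.0e-4)  # hypothetical N2

print(f"CO2 permeability: {P_co2:.0f} Barrer")
print(f"Ideal CO2/N2 selectivity: {P_co2 / P_n2:.0f}")
# A halogen that raises CO2 solubility, or a larger layer distance that
# raises diffusion, both increase P_co2 and can shift the selectivity.
```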

In the presented experiments, all liquid crystal membranes had similar chemical compositions but differently halogenated alkyl chains, and were aligned. CO2 sorption and overall gas permeation were better when the layers were further apart, while the gas solubility itself had no impact. This was confirmed by the increased gas diffusion coefficients, which were also determined in the experiments.

Bulky halogens had only a limited influence on gas permeability and selectivity. The CO2 permeability of all halogenated liquid crystal membranes increased due to slightly higher CO2 solubility and diffusion coefficients, which led to improved selectivity for CO2. The layer distance in particular was a crucial factor that directly influenced the diffusion coefficient. The researchers recommended that future investigations focus on improving separation performance, for example by reducing the membrane thickness.

At Frontis Energy, we are looking forward to a commercial product that can separate CO2 from gas mixtures, such as biogas, effectively and cheaply.

Photo: Pixabay / SD-Pictures

Posted on

Humidity-resistant composite membranes for gas separation

Hydrogen (H2) is a lightweight alternative fuel with a high energy density. However, its environmental impact and life cycle efficiency are determined by how it is produced. Today, hydrogen is mainly produced either by coal gasification or by steam reforming of natural gas, both of which release carbon dioxide (CO2) in their last step. Usually, this CO2 is released into the environment, and the hydrogen produced by these processes is called black/brown or grey hydrogen. To improve its carbon footprint, CO2 capture is necessary; the hydrogen is then called blue hydrogen. To obtain zero-emission green hydrogen, however, electrolysis of water using renewable energy is necessary. During the electrolysis process, hydrogen and oxygen are produced at two electrodes (more about hydrogen production and its utilization as a fuel can be found in our latest DIY FC manual, available for download).

Climate-related economic pressure for more efficient gas separation processes

The produced hydrogen is not pure in any of the mentioned instances. For example, the steam methane reforming reaction yields byproduct gases such as carbon monoxide, CO2, water, nitrogen, and methane.

Typically, the CO2 content of such hydrogen gas is up to 50%, and it contributes to the greenhouse effect caused by burning fossil fuels. Currently, around 80% of CO2 emissions come from fossil fuels. It has been predicted that the concentration of CO2 in the atmosphere will increase to 570 ppm by 2100, raising the global mean temperature by about 1.9°C.
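As a sanity check (our own, not from the cited prediction), the quoted figure is roughly consistent with the standard logarithmic approximation of CO2-driven warming, assuming a transient climate response of about 1.8°C per CO2 doubling and a pre-industrial baseline of 280 ppm:

```python
import math

# Logarithmic approximation of CO2-driven warming: dT = TCR * log2(C / C0).
# TCR and the baseline are assumptions for this sanity check only.
TCR = 1.8   # transient climate response, deg C per CO2 doubling (assumed)
C0 = 280.0  # pre-industrial CO2 concentration, ppm (assumed baseline)
C = 570.0   # projected CO2 concentration in 2100, ppm

dT = TCR * math.log2(C / C0)
print(f"Estimated warming: {dT:.1f} deg C")  # ~1.8 deg C, close to the quoted 1.9
```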

Traditional gas separation processes such as cryogenic distillation and pressure swing adsorption have certain disadvantages, for example their high energy consumption. Developing high-quality, low-cost technologies for gas separation is therefore an important intermediate step toward producing cheap hydrogen while reducing CO2 emissions.

Application of 2D materials to gas separation

Finding low-cost alternatives such as membrane-based separation methods for hydrogen-CO2 separation is a potentially lucrative field of research, and it is therefore not surprising that numerous publications have investigated the matter. The membrane materials for gas separation range from polymeric membranes and nano-porous materials to metal-organic frameworks and zeolite membranes. The goal is to reach a good balance between selectivity and permeance, the two key parameters for hydrogen purification and CO2 capture processes.

A study published in the journal Nature Energy by researchers at national research institutes in Japan offered a material platform as an advanced solution for the separation of hydrogen from humid gas mixtures, such as those generated by fossil fuel sources or water electrolysis. The authors showed that the incorporation of positively charged nanodiamonds into graphene oxide (GO/ND+) results in humidity-repelling, high-performance membranes. The performance of the GO/ND+ laminates excels particularly in hydrogen separation compared with traditional membrane materials.

Strategy and performance of new membrane materials

Graphene oxide laminates are considered step-change materials for hydrogen-CO2 separation because they are ultra-permeable (triple-digit permeance) and ultra-selective. Still, graphene oxide films lose their attractive separation properties and stability under humid conditions.

After lamination, graphene oxide sheets carry an overall negative charge and can disintegrate due to electrostatic repulsion when exposed to water. The strategy to overcome this obstacle was based on the charge compensation principle. That is, the authors incorporated positively and negatively charged fillers as stabilizing agents, and tested different loadings as well as graphene oxide flake sizes. The membranes prepared this way were then tested for stability in dry and humid conditions while separating hydrogen from either CO2 or oxygen.

The GO/ND+ composite membranes retained up to 90% of their hydrogen-CO2 selectivity over several exposure cycles and under aggressive humidity tests. A GO30ND+ membrane with 30% positively charged nanodiamond particles exhibited exceptional hydrogen permeance of more than 3,700 gas permeation units (GPU) combined with high hydrogen-CO2 selectivity. Interestingly, the incorporation of negatively charged nanodiamond particles had no stabilizing effect. The researchers attributed this mostly to the generation of macro-scale voids in the ND− systems, resulting in a loss of selectivity. This phenomenon is commonly observed in polymer-based nanocomposite membranes with poor interfacial interactions.
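For readers unfamiliar with the unit: one GPU equals 3.35 × 10⁻¹⁰ mol m⁻² s⁻¹ Pa⁻¹, and the ideal selectivity is simply the ratio of two permeances. A minimal sketch follows; the CO2 permeance below is a made-up illustration, not a value from the study:

```python
# Unit conversion and ideal selectivity for membrane permeances.
# 1 GPU = 3.35e-10 mol / (m2 * s * Pa)
GPU_TO_SI = 3.35e-10

h2_permeance_gpu = 3700.0  # reported for the GO30ND+ membrane
co2_permeance_gpu = 25.0   # hypothetical value, for illustration only

h2_si = h2_permeance_gpu * GPU_TO_SI
selectivity = h2_permeance_gpu / co2_permeance_gpu

print(f"H2 permeance: {h2_si:.2e} mol m^-2 s^-1 Pa^-1")
print(f"Ideal H2/CO2 selectivity: {selectivity:.0f}")
```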

The gas separation properties of the composite membranes were also investigated using an equimolar hydrogen-CO2 feed mixture. Under these conditions, the hydrogen permeance of the GO30ND+ membrane decreased by 6% and its hydrogen-CO2 selectivity by 13%.

A stability test exposing the membranes to wet and dry feeds of the equimolar hydrogen-CO2 and hydrogen-oxygen mixtures showed that the GO/ND+ membrane properties were reversible. Graphene oxide-only membranes, on the other hand, could not survive a single complete exposure cycle and became fully permeable to both gases. The researchers explained the advantages of GO/ND+ membranes over graphene oxide-only membranes by changes in the pore architecture, such as its dimensions and tortuosity, which could be further improved by optimizing the nanodiamond loading. This results in better permeability without any notable loss of selectivity.

X-ray diffraction analysis showed that the incorporation of nanodiamonds has two major effects on the membrane microstructure: it increases the overall pore volume and reduces the average lateral size. Both make the membrane structure more accessible for molecular transport.

Nevertheless, this relatively new class of humidity-resistant membranes still needs more optimization to compete with current industrial separation processes.

Image: Pixabay / seagul

Posted on

CO2-neutral traffic

Fossil fuels have made tremendous social and economic advances possible. This becomes clear, among other things, when looking at the increase in road traffic. Around 90 million vehicles were produced in 2019; in 2000 it was 60 million. It is assumed that the number of vehicles produced will grow to 120 million by 2030. The increase in road mobility undoubtedly has a positive impact on social mobility and economic growth. However, this also makes the traffic increase a self-accelerating process. Economic growth in the BRICS countries (Brazil, Russia, India, China, and South Africa) is particularly crucial in this regard. At the same time, it is expected that the proportion of electric vehicles, including hybrids, will also increase sharply. Whether this is realistic, given the limited lithium reserves, can again be doubted.

In 2010, more than 1 billion cars were registered worldwide. With an annual increase of around 3%, the number had already reached 1.3 billion by 2019. These cars emit around 6.0 billion tons of CO2 annually (out of a total of 33 billion tons worldwide), making them the largest growing source of CO2. Energy-related CO2 emissions are generally continuing to rise, although this increase was briefly interrupted by the global health crisis of 2020. On top of that, another 20 to 30% of emissions come from the production of fuels and the manufacture and disposal of vehicles.

Life cycle analyses of vehicles with different drive concepts are the subject of many studies. When it comes to CO2 emissions, the energy source is crucial. Two main developments are discussed today: the electrification of the propulsion system (i.e. fully and partially electrified vehicles) and the electrification of fuels (i.e. hydrogen and synthetic fuels).

In the manufacture of synthetic fuels, water is split into oxygen and hydrogen by electrolysis with renewable electricity. Due to the temporary oversupply of renewable electricity, this energy is particularly cheap. The hydrogen can then be used in hydrogen vehicles propelled by fuel cells. Alternatively, CO2 can be converted into hydrocarbons with the hydrogen, which are then used in conventional combustion engines in a climate-neutral manner. The advantage of fuel cell vehicles is their high efficiency and the low cost of electrolysis. The disadvantage is the lack of a hydrogen infrastructure: converting from hydrocarbons to hydrogen would cost trillions. The cheaper alternative would be synthetic hydrocarbons. However, their development is still in its infancy, and synthetic fuels cannot yet be produced on a large scale.

Hydrogen and synthetic fuels are a necessary addition to electromobility, especially for long-distance and freight transport. The widespread view that the low efficiency of internal combustion engines makes these fuels uninteresting ignores the possibility of using them to store and transport energy, and to make air and shipping traffic climate-neutral. Comparing the CO2 emissions of electric motors and electrified fuels makes clear that both mainly depend on the CO2 load of the electricity used.

Synthetic fuel sources

The production of synthetic fuel requires renewable electricity, water, and CO2. The technical processes are known, but the first large-scale industrial plants are only in the planning phase. Pilot projects such as that of the Canadian company Carbon Engineering, however, have shown the technical feasibility of scaling. The generation costs depend mainly on the size of the plant and the electricity price, which results from the local conditions, the structure of the electricity market, and the share of renewable electricity.

The decentralized production of these fuels brings not only climate neutrality but also geopolitical gains. Since CO2 and renewable energy – in contrast to lithium – are generally accessible resources, users of this technology become independent of energy imports. At Frontis Energy we think these are strong arguments in favor of synthetic fuels.

Posted on

Accelerated deforestation in the EU

Forests are vital to our society. In the EU, forests make up around 38% of the total land area. They are important carbon sinks, as they absorb around 10% of the EU's greenhouse gas emissions. Efforts to conserve them are a key part of EU climate targets. However, the increasing demand for forest products poses challenges for sustainable forest management.

According to a report recently published in the renowned science magazine Nature, the EU's deforested area has increased by 49%, and with it the loss of biomass (69%). This large-scale deforestation reduces the continent's carbon absorption capacity and accelerates climate change.

The authors of the report analyzed a series of very detailed satellite data and show that deforestation occurred primarily on the Iberian Peninsula, in the Baltic States, and in Scandinavia. The deforested area increased by 49% between 2016 and 2018. Satellite images also show that the average area of harvested land across Europe has increased by 34%, with potential implications for biodiversity, soil erosion, and water regulation.

The accelerating deforestation could thwart the EU's strategy to combat climate change, which aims in particular to protect forests in the coming years, the experts warn in their study. The increasing use of forests makes it challenging to maintain the existing balance between the demand for wood and the need to preserve these ecosystems, which are key for the environment. Typically, industries such as bioenergy or the paper industry are the driving forces behind deforestation.

The greatest acceleration in deforestation was recorded in Sweden and Finland, which together account for more than 50% of the increase in deforestation in Europe. Next in line are Spain, Poland, France, Latvia, Portugal, and Estonia, which together account for 6 to 30% of the increase, the study said.

Experts suggest linking deforestation and carbon emissions in model calculations before setting new climate targets. The increase in forest harvest is the result of the recent expansion of global wood markets, as evidenced by economic indicators for forestry, timber bioenergy, and international trade. If such high harvest rates continue, the EU's vision of forest-based climate mitigation after 2020 could be compromised. The additional carbon losses from forests would then require additional emission reductions in other sectors to achieve climate neutrality.

At Frontis Energy, we find the competition between bioenergy and this important carbon sink particularly disturbing, as both are strategies to mitigate global warming.

(Photo: Picography / Pixabay)

Posted on

Framework for a global carbon budget

Over the past decade, numerous studies have shown that global warming is roughly proportional to the total amount of CO2 emitted into our atmosphere. In this way, one can estimate our remaining carbon budget: the total amount of man-made carbon dioxide that can still be released into the atmosphere before reaching a set global temperature limit. The nations of the world agreed on this limit in the 2015 Paris Agreement: it should not exceed 1.5°C, and in any case stay well below 2.0°C. However, diverging estimates have been made for the remaining carbon budget, which has a negative impact on policy-making. Now, an international research group of renowned climate experts has published a framework for the calculation of the global CO2 budget in Nature. The researchers suggest that applying this framework should help to reconcile the differing estimates of the carbon budget, and thereby reduce uncertainties in research and policy.

Since the fifth report of the Intergovernmental Panel on Climate Change (IPCC), the concept of a carbon budget has become more important as an instrument for guiding climate policy. Over the past decade, a series of studies has clarified why the increase in the global average temperature is roughly proportional to the total amount of CO2 emissions caused by human activity since the Industrial Revolution. In the framework, the research group cites numerous published documents that provide evidence for the linearity of this correlation. This literature has allowed scientists to define the linear relationship between warming and CO2 emissions as the transient climate response to cumulative CO2 emissions (TCRE). The linearity is an appealing concept given the complexity of the Earth's response to our CO2 emissions. Additional processes that affect future warming have been included in recent models, among them, for example, the thawing of the Arctic permafrost. These additional processes increase the uncertainty of current climate models. In addition, global warming is not just caused by CO2 emissions. Other greenhouse gases, such as methane, fluorinated gases, or nitrous oxide, as well as aerosols and their precursors, affect global temperatures. This further complicates the relationship between future CO2 emissions and warming.

In the case of global warming caused by CO2, every tonne contributes to warming, whether it is emitted in the future, now, or in the last century. This means that global CO2 emissions must be reduced to zero and then remain zero. It also means that the more we emit in the coming years, the faster we have to reduce our emissions later. At zero emissions, warming would stabilize, but not disappear; it may also reverse. An overdraft of the carbon budget would have to be compensated by removing CO2 later. One way of removing CO2 from the atmosphere would be a technology called direct air capture, which we reported on earlier. Ultimately, this will probably be the only way left, as carbon-neutral renewable energy sources make up only 5% of our energy mix. Establishing a global carbon budget further highlights the urgency of our clean energy transition. Unfortunately, there is a large divergence when it comes to the amount of CO2 remaining in our carbon budget. In their framework, the researchers cite numerous studies on carbon budgets to maintain our 1.5°C target. Counting from 2018, these range from 0 to 1,000 gigatons of CO2. For the 2.0°C target, our carbon budget ranges from around 700 gigatons to nearly 2,000 gigatons of remaining CO2 emissions. The aim of the researchers is to limit this uncertainty by establishing a budget framework. The central element is the equation for calculating the remaining carbon budget:

Blim = (Tlim − Thist − TnonCO2 − TZEC) / TCRE − EEsfb

The budget of the remaining CO2 emissions (Blim) for a specific temperature limit (Tlim) is a function of five terms that represent aspects of the geophysical and human-environment systems: the historical man-made warming (Thist), the non-CO2 contribution to the future temperature increase (TnonCO2), the zero-emission commitment (TZEC), the TCRE, and an adjustment for possible unrepresented Earth system feedbacks (EEsfb).
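As a minimal numerical sketch of how the framework combines these terms (all input values below are illustrative assumptions, not the authors' estimates):

```python
# Remaining carbon budget: Blim = (Tlim - Thist - TnonCO2 - TZEC) / TCRE - EEsfb
# All inputs are illustrative assumptions, not values from the framework paper.

def remaining_budget(t_lim, t_hist, t_nonco2, t_zec, tcre, e_esfb):
    """Remaining CO2 budget in GtCO2 for a given temperature limit."""
    return (t_lim - t_hist - t_nonco2 - t_zec) / tcre - e_esfb

budget = remaining_budget(
    t_lim=1.5,     # temperature limit, deg C
    t_hist=1.1,    # historical man-made warming, deg C (assumed)
    t_nonco2=0.1,  # future non-CO2 warming, deg C (assumed)
    t_zec=0.0,     # zero-emission commitment, deg C (assumed)
    tcre=0.45e-3,  # deg C per GtCO2, i.e. 0.45 deg C per 1,000 GtCO2 (assumed)
    e_esfb=100.0,  # unrepresented Earth system feedbacks, GtCO2 (assumed)
)
print(f"Remaining budget: {budget:.0f} GtCO2")  # ~567 GtCO2 for these inputs
```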


Term | Key choices or uncertainties | Type | Level of understanding
--- | --- | --- | ---
Temperature limit (Tlim) | Choice of temperature metrics used to express global warming, choice of the pre-industrial reference, and consistency with global climate targets | Choice | Medium to high
Historical man-made warming (Thist) | Incomplete data and methods for estimating the man-made component; see also Tlim | Choice and uncertainty | Medium to high
Non-CO2 contribution to future global warming (TnonCO2) | The level of non-CO2 contributions coinciding with global net-zero CO2 emissions; depends on policy choices, but also on the uncertainty of their implementation | Choice and uncertainty | Medium
Non-CO2 contribution to future global warming (TnonCO2) | Climate reaction to non-CO2 forcers, such as aerosols and methane | Uncertainty | Low to medium
Zero-emission commitment (TZEC) | The extent of the decadal zero-emission commitment at near-zero annual carbon emissions | Uncertainty | Low
Transient climate response to cumulative emissions of CO2 (TCRE) | TCRE uncertainty, linearity, and the cumulative CO2 emissions that affect temperature metrics of the TCRE estimate | Uncertainty | Low to medium
Transient climate response to cumulative emissions of CO2 (TCRE) | Uncertainty of the TCRE linearity, value, and distribution beyond peak warming, which is affected by cumulative CO2 emission reductions | Uncertainty | Low
Unrepresented Earth system feedback mechanisms (EEsfb) | Impact of permafrost thawing and duration, as well as methane release from wetlands, on geomodels and feedbacks | Uncertainty | Very low

In the CO2 budget, the unrepresented Earth system feedbacks (EEsfb) are arguably the greatest uncertainty. These feedback processes are typically associated with the thawing of permafrost and the associated long-term release of CO2 and CH4. However, other feedbacks have been identified as well. These include, for example, variations of CO2 uptake by vegetation and the associated nitrogen availability. Further feedback processes involve changes in surface albedo, cloud cover, or fire conditions.

It remains a challenge to adequately characterize the uncertainties surrounding these estimates of our carbon budget. In some cases, the reason for the uncertainty is inaccurate knowledge of the underlying processes or inaccurate measurements. In other cases, terminology is used inconsistently. For better comparability and flexibility, the researchers propose to routinely report global surface air temperature values, which give robust data for models and model runs over selected time periods. More detailed comparisons between published carbon budget estimates are currently difficult because the original data behind publications are often missing. The researchers therefore propose that these data be provided along with future publications.

Breaking down the carbon budget into its individual factors makes it possible to identify a number of promising pathways for future research. One area of research that might advance the field is to look more closely at the TCRE; future research is expected to narrow down the range of its uncertainties. Another promising area of research is the study of the correlation between individual factors and their associated uncertainties, for example between the uncertainties in Thist and TnonCO2. This could be achieved by developing methods that allow a more reliable estimate of historical human-induced warming. It is also clear that less complex climate models are useful to further reduce the uncertainties of climate models, and hence of the carbon budget. Currently, each factor of the framework presented by the researchers has its own uncertainties, and there is no method to formally combine them.

At Frontis Energy, too, we think that progress in these areas would improve our understanding of the estimates of our carbon budget. A systematic understanding of the carbon budget is crucial for effectively addressing the challenges of global warming.

Posted on

Hydropower

Hydropower is electricity generated by the movement of water.

In the late 19th century, hydropower became an industrially efficient method of generating electricity. Water falling from high altitudes, e.g. mountain streams or rivers, as well as strong currents, are the best candidates for generating electricity from hydropower. This electricity is a considerable global energy source. It is generated by water driving a turbine; when this turbine is connected to an electric generator, the mechanical energy is converted into electrical energy. Niagara Falls and the Hoover Dam are two examples of electricity produced in this way.
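The recoverable power follows directly from the mass flow of the water and the height it falls. A minimal sketch of this textbook relation (flow rate, head, and efficiency below are assumed example values, not data for any specific plant):

```python
# Hydropower: P = eta * rho * g * Q * h
# (efficiency * water density * gravity * flow rate * head)

RHO = 1000.0  # density of water, kg/m3
G = 9.81      # gravitational acceleration, m/s2

def hydro_power_mw(flow_m3_s: float, head_m: float, efficiency: float) -> float:
    """Electrical power output in megawatts."""
    return efficiency * RHO * G * flow_m3_s * head_m / 1e6

# Assumed example: 100 m3/s through a 50 m head at 90% overall efficiency.
print(f"{hydro_power_mw(flow_m3_s=100.0, head_m=50.0, efficiency=0.9):.1f} MW")
# ~44.1 MW
```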

Hydropower provides about 20% of the world’s electricity.

Hydropower has recently gained popularity. The World Bank called it a workable solution to keep up with growing energy needs while avoiding CO2 emissions.

(Photo: Wikipedia)

Posted on

Ammonia energy storage #3

As a loyal reader of our blog, you will certainly remember our previous publications on ammonia energy storage. There, we described possible ways to extract ammonia from the air, as well as the recovery of its energy in the form of methane (patent pending WO2019/079908A1). Since global food production requires large amounts of ammonia fertilizers, the technology for its extraction from air is already very mature. It is essentially based on the Haber-Bosch process, which was industrialized at the beginning of the last century. In this process, atmospheric nitrogen (N2) is reduced to ammonia (NH3). Despite the simplicity of the molecules involved, the cleavage of the strong nitrogen−nitrogen bond in N2 and the formation of the nitrogen−hydrogen bonds pose a major challenge for catalytic chemists. The reaction usually takes place under harsh conditions and requires a lot of energy, i.e. high reaction temperatures, high pressures, and complicated combinations of reagents, which are also often expensive and energy-intensive to manufacture.

Now, a research group led by Yuya Ashida has published an article in the renowned journal Nature in which they show that a samarium compound in aqueous solution, combined with a molybdenum catalyst, can form ammonia from atmospheric nitrogen. The work opens up new possibilities in the search for ammonia synthesis under ambient conditions. Under such conditions, less energy is required to produce ammonia, resulting in a higher energy efficiency for energy storage. In today's Haber-Bosch process, air and hydrogen gas are combined over an iron catalyst. The resulting global ammonia production ranges from 250 to 300 tonnes per minute, delivering fertilizers that feed nearly 60% of the world's population (The Alchemy of Air, available at Amazon).

Comparison of different approaches to produce ammonia. Top: In the industrial Haber-Bosch synthesis of ammonia (NH3), nitrogen gas (N2) reacts with hydrogen molecules (H2), typically in the presence of an iron catalyst. The process requires high temperatures and pressures but is thermodynamically ideal, because only little energy is wasted on side reactions. Center: Nitrogenase enzymes catalyze the reaction of nitrogen with six electrons (e−) and six protons (H+) under ambient conditions to form ammonia. However, two additional electrons and protons form one molecule of H2. The conversion of ATP (the biological energy “currency”) into ADP drives the reaction. This reaction has a high chemical overpotential: it consumes much more energy than is needed for the actual ammonia-forming reaction. Bottom: In the new reaction proposed by Ashida and colleagues, a mixture of water and samarium diiodide (SmI2) is converted to ammonia using nitrogen under ambient conditions in the presence of a molybdenum catalyst. SmI2 weakens the O−H bonds of the water and generates hydrogen atoms, which then react with atmospheric nitrogen.

On an industrial scale, ammonia is synthesized at temperatures exceeding 400°C and pressures of approximately 400 atmospheres, conditions often referred to as “harsh”. During the early days of the Haber-Bosch development, these harsh conditions were difficult to control, and fatal accidents were not uncommon. This has motivated many chemists to find “milder” alternatives, which always meant searching for new catalysts to lower operating temperatures and pressures. New catalysts would ultimately reduce the capital investment in the construction of new fertilizer plants. Since ammonia synthesis is one of the largest producers of carbon dioxide, this would also reduce the associated emissions.

Like many other chemists before them, the authors were inspired by nature. Nitrogenase enzymes carry out the biological conversion of atmospheric nitrogen into ammonia, a process called nitrogen fixation. On today's Earth, this process is the source of nitrogen atoms in amino acids and nucleotides, the elemental building blocks of life. In contrast to the Haber-Bosch process, nitrogenases do not use hydrogen gas as a source of hydrogen atoms. Instead, they transfer protons (hydrogen ions, H+) and electrons (e−) to each nitrogen atom to form N−H bonds. Although nitrogenases fix nitrogen at ambient temperature, they use eight protons and eight electrons per molecule of N2. This is remarkable because the stoichiometry of the reaction requires only six of each. The excess provides the necessary thermodynamic drive for nitrogen fixation, but it also means that nitrogenases have a high chemical overpotential: they consume much more energy than would actually be needed for nitrogen fixation.
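For reference, the commonly cited textbook stoichiometry of the nitrogenase reaction, which makes the two “wasted” hydrogen equivalents explicit as one molecule of H2, is:

N2 + 8 H+ + 8 e− + 16 ATP → 2 NH3 + H2 + 16 ADP + 16 Pi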

The now published reaction is not the first attempt to mimic the nitrogenase reaction. In the past, metal complexes were combined with proton and electron sources to convert atmospheric nitrogen into ammonia. The same researchers had previously developed molybdenum complexes that catalyze nitrogen fixation in this way, producing 230 ammonia molecules per molybdenum complex. The associated overpotentials were significant at almost 1,300 kJ per mole of nitrogen. In reality, however, the Haber-Bosch process is not that energy-intensive, given the right catalyst is used.
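To make such overpotentials more tangible, one can convert them into energy per kilogram of ammonia. A quick back-of-the-envelope conversion (our own arithmetic, using the 1,300 kJ/mol figure above and the roughly 600 kJ/mol of the new process quoted further below):

```python
# Convert a chemical overpotential per mole of N2 into kWh per kg of NH3.
# Each mole of N2 yields two moles of NH3 (molar mass ~17 g/mol).
KJ_PER_KWH = 3600.0
NH3_MOLAR_MASS_KG = 0.017

def overpotential_kwh_per_kg_nh3(kj_per_mol_n2: float) -> float:
    kj_per_mol_nh3 = kj_per_mol_n2 / 2.0
    return kj_per_mol_nh3 / NH3_MOLAR_MASS_KG / KJ_PER_KWH

print(f"{overpotential_kwh_per_kg_nh3(1300):.1f} kWh/kg")  # earlier Mo complexes, ~10.6
print(f"{overpotential_kwh_per_kg_nh3(600):.1f} kWh/kg")   # new SmI2/Mo process, ~4.9
```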

The challenge for catalysis researchers is to combine the best of the biological and industrial approaches to nitrogen fixation, so that the process proceeds at ambient temperatures and pressures while the catalyst reduces the chemical overpotential far enough that the construction of new fertilizer plants no longer requires such high capital investments. This is a major challenge, as no combination of acids (which serve as proton sources) and reducing agents (the electron sources) is available that fixes nitrogen at the thermodynamic level of hydrogen gas and is still reactive enough to form N−H bonds at room temperature. In the now described pathway with molybdenum and samarium, the researchers adopted a strategy in which the proton and electron sources are no longer used separately. This is a fundamentally new approach to catalytic ammonia synthesis. It makes use of a phenomenon known as coordination-induced bond weakening, which here is based on the interaction of samarium diiodide (SmI2) and water.

Water is stable because of its strong oxygen−hydrogen (O−H) bonds. However, when the oxygen atom of water coordinates to SmI2, it exposes its lone electron pair and its O−H bonds are weakened. As a result, the mixture becomes a readily available source of hydrogen atoms, that is, of protons and electrons. The researchers around Yuya Ashida use this mixture with a molybdenum catalyst to fix nitrogen. SmI2-water mixtures are particularly suitable for this type of catalysis because considerable coordination-induced bond weakening had previously been measured in them, and had been exploited, among other things, for the production of carbon-hydrogen bonds.

The extension of this idea to catalytic ammonia synthesis is remarkable for two reasons. First, the molybdenum catalyst facilitates ammonia synthesis in aqueous solution. This is amazing because molybdenum complexes in water are usually degraded. Second, the use of coordination-induced bond weakening provides a new method for nitrogen fixation at ambient conditions that avoids potentially hazardous, fire-prone combinations of proton and electron sources. The authors' approach also works when ethylene glycol (HOCH2CH2OH) is used instead of water, extending the candidates for proton and electron sources by an additional precursor.

Ashida and colleagues propose a catalytic cycle for their process in which the molybdenum catalyst initially coordinates to nitrogen and cleaves the N−N bond to form a molybdenum nitrido complex, which contains a molybdenum-nitrogen triple bond. The SmI2-water mixture then delivers hydrogen atoms to this complex, eventually producing ammonia. The formation of N−H bonds at molybdenum nitrido complexes represents a significant thermodynamic challenge, since the N−H bonds are also weakened by the molybdenum. Nevertheless, this disadvantage is offset by the reduction of the chemical overpotential. The SmI2 not only facilitates the transfer of hydrogen atoms, but also keeps the metal in a reduced form, preventing undesired molybdenum oxide formation in aqueous solution.

The new process still has significant operational hurdles to overcome before it can be used on an industrial scale. For example, SmI2 is used in large quantities, which generates a lot of waste. Separating ammonia from aqueous solutions is energy-intensive. However, if the process were used for energy storage in combination with our recovery method, the separation from the aqueous solution would no longer be necessary. Finally, there is still a chemical overpotential of about 600 kJ/mol. Future research should focus on finding alternatives to SmI2, for example based on metals that are more abundant than samarium and also promote coordination-induced bond weakening. As Fritz Haber and Carl Bosch experienced themselves, the newly developed method will probably take some time before it becomes available on an industrial scale.

(Photo: Wikipedia)

Posted on

Melting ice sheets in Greenland contribute 25% to sea level rise

Recently we reported on the loss of snow cover in Europe. The snow is not only gone in many parts of Europe; Greenland's ice cover is also melting. The Greenland ice sheet contributes 25% to global sea-level rise, making it the largest contribution of the cryosphere. The increased mass loss of Greenland ice during the 21st century is mainly due to increased surface water runoff, of which ~93% comes directly from the small ablation zone of the ice sheet (~22% of the ice surface). As the snow melts in the summer, bare glacier ice is exposed in this ablation zone. Bare ice is darker and less porous than snow: it absorbs more than twice the solar radiation while also holding back less meltwater. Bare ice produces a large proportion (~78%) of Greenland's total outflow into the sea, although only a small area of the ice is exposed in summer. Accurately capturing the reduced albedo and the full extent of bare ice in climate models is critical to determining Greenland's present and future runoff contribution to sea-level rise.

The mass loss of the Greenland ice sheet has recently increased due to the accelerated melting of its surface. As this melting is critically affected by surface albedo, understanding the processes and potential feedbacks regarding the albedo is required to accurately forecast mass loss. The radiation variability of the ablation zone caused the ice layer to melt five times faster than hydrological and biological processes, which also darken the ice sheet. Variations in the snow line due to the shallower ice layer at higher altitudes have an even greater impact on melt when the climate is warmer. As a result of these fluctuations, the mapped bare-ice surface was largest during the summer of 2012, the record year of snowmelt, with an area of 300,050 km2, i.e. 16% of the ice surface. The smallest extent of bare ice, 184,660 km2, was observed in 2006. This corresponded to 10% of the ice surface, almost 40% less area than in 2012. However, the observed variation was high and the observation period too short for a solid trend assessment.

Current climate models are too inaccurate in predicting the sea-level rise during flood years, leading to uncertainty in the estimation of Greenland's contribution to global sea-level rise. To understand the factors that influence melting, Jonathan Ryan of Brown University, Providence, Rhode Island, and his colleagues investigated Greenland's snow line. At altitudes below the snow line, the darker ice is not covered by snow. This snow line moves up or down with Greenland's seasons. The researchers mapped these movements between 2001 and 2017 using satellite images. The average elevation of the snow line at the end of the summer ranged from 1,330 m in 2009 to 1,650 m in 2012. These fluctuations in the snow line are the most important factor for how much solar energy the ice sheet absorbs, and modelers must consider this effect to improve their predictions. Knowing how much and how fast the Greenland ice melts will help us take better protective measures. At Frontis Energy, we think that the best protection against sea-level rise is the prevention and recycling of CO2.

(Photo: Wikipedia)

Posted on

Economic losses caused by flooding due to global warming

In Europe, floods are linked to large fluctuations of atmospheric pressure, also known as the North Atlantic Oscillation. Stefan Zanardo and his colleagues at Risk Management Solutions, London, UK, analyzed historical records of severe flooding in Europe since 1870 and compared them with the patterns of atmospheric pressure at the time of the floods. When the North Atlantic Oscillation is in a positive state, a depression over Iceland drives wind and storms throughout northern Europe. In a negative state, by contrast, it makes southern Europe wetter than usual. Normally, floods occur in northern Europe, and they cause the most damage when the North Atlantic Oscillation is positive in winter. If enough rain has already fallen to saturate the soil, the conditions for a high flood risk are met. Air pressure patterns over Europe may change with global warming, and public administrations should take this into account when assessing flood risk in a region, the researchers say.

This is important because flooding in Europe often causes loss of life, significant property damage, and business interruptions. Global warming will further worsen this situation, and the risk distribution will change as well. The frequent occurrence of catastrophic flooding in recent years has sparked strong interest in this problem in both the public and private sectors. The public sector has been working to improve early warning systems, which indeed have economic benefits. In addition, various risk-mitigating strategies have been implemented in European countries, including flood protection, measures to increase risk awareness, and risk transfer through better dissemination of flood insurance. The fight against the root cause, global warming that is, still lags far behind what is needed.

Correlations between large-scale climate patterns, in particular the North Atlantic Oscillation, and extreme events in the water cycle on the European continent have long been described in the literature. With more severe and more frequent flooding and alarming global warming scenarios, concerns over future flood-related economic losses have moved into the focus of public attention. Although it is known that climatic patterns also control meteorological events, it is not always clear whether this link affects the frequency and severity of flooding and the associated economic losses. In their study, the researchers relate the North Atlantic Oscillation to economic flood losses.

To establish this relation, the researchers used recent data from flood databases as well as disaster models. The models allowed them to quantify the economic losses that are ultimately caused by the North Atlantic Oscillation. These losses vary widely between the countries within the influence of the North Atlantic Oscillation. The study shows that the North Atlantic Oscillation predicts average losses well in the long term. Based on this predictability, the researchers argue that the temporal variations of flood risk caused by climate oscillations, in particular, can be forecast. This can help to counter catastrophic flood events early on, so that flood damage can be minimized or even avoided. As scientists improve their predictions of the North Atlantic Oscillation, society will be better prepared for future flooding.

(Photo: Wikipedia, Stefan Penninger, Sweden)