
Hydropower

Hydropower is electricity generated by the movement of water.

In the late 19th century, hydropower became an industrially efficient method of generating electricity. Water falling from high altitudes, for example in mountain streams and rivers, as well as strong currents are the best candidates for hydroelectric generation. Today, this electricity is a considerable global energy source. It is generated by water entering a turbine and setting it in rotation. When the turbine is connected to an electric generator, the mechanical energy is converted into electrical energy. Niagara Falls and the Hoover Dam are two well-known sites where electricity is produced this way.
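The electrical power that can be extracted from falling water follows directly from the flow rate and the height difference (head). As a minimal sketch, the flow rate, head, and efficiency below are assumed example values, not data from Niagara or the Hoover Dam:

```python
# Rough estimate of hydroelectric power: P = eta * rho * g * Q * H
# All input numbers are assumptions for illustration, not measured plant data.
rho = 1000.0   # density of water in kg/m^3
g = 9.81       # gravitational acceleration in m/s^2
Q = 500.0      # assumed flow rate through the turbines in m^3/s
H = 50.0       # assumed head (height the water falls) in m
eta = 0.9      # assumed combined turbine/generator efficiency

P = eta * rho * g * Q * H   # power in watts
print(f"Electrical power: {P/1e6:.0f} MW")  # ~221 MW for these assumptions
```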

Hydropower provides about 20% of the world’s electricity.

Hydropower has recently gained popularity. The World Bank called it a workable solution to keep up with growing energy needs while avoiding CO2 emissions.

(Photo: Wikipedia)


Ammonia energy storage #3

As a loyal reader of our blog, you will certainly remember our previous publications on ammonia energy storage. There, we described possible ways to extract ammonia from the air, as well as the recovery of its energy in the form of methane (patent pending WO2019/079908A1). Since global food production requires large amounts of ammonia fertilizers, the technology for extracting ammonia from air is already very mature. It is essentially based on the Haber-Bosch process, which was industrialized at the beginning of the last century. During this process, atmospheric nitrogen (N2) is reduced to ammonia (NH3). Despite the simplicity of the molecules involved, the cleavage of the strong nitrogen−nitrogen bond in N2 and the formation of the new nitrogen−hydrogen bonds pose a major challenge for catalytic chemists. The reaction usually takes place under harsh conditions and requires a lot of energy, i.e. high reaction temperatures, high pressures, and complicated combinations of reagents, which are themselves often expensive and energy-intensive to manufacture.

Now, a research group led by Yuya Ashida has published an article in the renowned journal Nature, in which they show that a samarium compound in aqueous solution, combined with a molybdenum catalyst, can form ammonia from atmospheric nitrogen. The work opens up new possibilities in the search for ways to synthesize ammonia under ambient conditions. Under such conditions, less energy is required to produce ammonia, resulting in higher energy efficiency for energy storage. In today’s Haber-Bosch process, air and hydrogen gas are combined over an iron catalyst. The resulting global ammonia production ranges from 250 to 300 tonnes per minute, delivering fertilizers that feed nearly 60% of the world’s population (The Alchemy of Air, available at Amazon).

Comparison of different approaches to produce ammonia. Top: In the industrial Haber-Bosch synthesis of ammonia (NH3), nitrogen gas (N2) reacts with hydrogen molecules (H2), typically in the presence of an iron catalyst. The process requires high temperatures and pressures but is thermodynamically almost ideal, because little energy is wasted on side reactions. Center: Nitrogenase enzymes catalyze the reaction of nitrogen with six electrons (e−) and six protons (H+) under ambient conditions to form ammonia. However, two additional electrons and protons form one molecule of H2. The conversion of ATP (the biological energy “currency”) into ADP drives the reaction. This reaction has a high chemical overpotential; it consumes much more energy than is needed for the actual ammonia-forming reaction. Bottom: In the new reaction reported by Ashida and colleagues, a mixture of water and samarium diiodide (SmI2) converts nitrogen into ammonia under ambient conditions in the presence of a molybdenum catalyst. SmI2 weakens the O−H bonds of the water and generates hydrogen atoms, which then react with atmospheric nitrogen.

On an industrial scale, ammonia is synthesized at temperatures exceeding 400°C and pressures of approximately 400 atmospheres. These conditions are often referred to as “harsh”, and in the early years of Haber-Bosch development they were difficult to control; fatal accidents were not uncommon. This has motivated many chemists to look for “milder” alternatives, which ultimately means searching for new catalysts that lower operating temperatures and pressures. Such catalysts would also reduce the capital investment needed to build new fertilizer plants. Since ammonia synthesis is one of the largest industrial producers of carbon dioxide, they would reduce the associated emissions as well.

Like many other chemists before them, the authors took their inspiration from nature. Nitrogenase enzymes carry out the biological conversion of atmospheric nitrogen into ammonia, a process called nitrogen fixation. On today’s Earth, this process is the source of the nitrogen atoms in amino acids and nucleotides, the elementary building blocks of life. In contrast to the Haber-Bosch process, nitrogenases do not use hydrogen gas as a source of hydrogen atoms. Instead, they transfer protons (hydrogen ions, H+) and electrons (e−) to each nitrogen atom to form N−H bonds. Although nitrogenases fix nitrogen at ambient temperature, they use eight protons and electrons per molecule of N2, even though the stoichiometry of the reaction requires only six of each. This excess provides the necessary thermodynamic driving force for nitrogen fixation, but it also means that nitrogenases have a high chemical overpotential: they consume much more energy than would actually be needed for nitrogen fixation.
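The excess of hydrogen equivalents can be quantified directly from the stoichiometry described above; a minimal arithmetic sketch:

```python
# Electron/proton bookkeeping for nitrogenase, using the stoichiometry from the text.
needed = 6    # electrons (and protons) strictly required to reduce one N2 to two NH3
used = 8      # electrons (and protons) actually consumed; the extra two end up in H2
excess = (used - needed) / needed
print(f"Hydrogen equivalents diverted into H2: {excess:.0%}")  # 33% overshoot
```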

The newly published reaction is not the first attempt to mimic the nitrogenase reaction. In the past, metal complexes were combined with proton and electron sources to convert atmospheric nitrogen into ammonia. The same researchers had previously developed eight molybdenum complexes that catalyze nitrogen fixation in this way, producing 230 ammonia molecules per molybdenum complex. The associated overpotentials were significant at almost 1,300 kJ per mole of nitrogen. By comparison, the Haber-Bosch process is not nearly as energy-intensive, provided the right catalyst is used.

The challenge for catalysis researchers is to combine the best of the biological and industrial approaches to nitrogen fixation so that the process proceeds at ambient temperatures and pressures. At the same time, the catalyst must reduce the chemical overpotential far enough that building new fertilizer plants no longer requires such high capital investments. This is a major challenge, because no known combination of acids (which serve as the proton source) and reducing agents (the electron sources) is both at the thermodynamic level of hydrogen gas and reactive enough to form N−H bonds at room temperature. In the pathway now described, using molybdenum and samarium, the researchers adopted a strategy in which the proton and electron sources are no longer separate species. This is a fundamentally new approach to catalytic ammonia synthesis. It makes use of a phenomenon known as coordination-induced bond weakening, which here is based on the interaction of samarium diiodide (SmI2) with water.

Water is stable because of its strong oxygen-hydrogen (O−H) bonds. However, when the oxygen atom of water coordinates to SmI2, it exposes its lone electron pair and its O−H bonds are weakened. The resulting mixture thereby becomes a readily available source of hydrogen atoms, that is, of protons and electrons. The researchers around Yuya Ashida use this mixture together with a molybdenum catalyst to fix nitrogen. SmI2-water mixtures are particularly suitable for this type of catalysis: considerable coordination-induced bond weakening had previously been measured in them and was exploited, among other things, to form carbon-hydrogen bonds.

The extension of this idea to catalytic ammonia synthesis is remarkable for two reasons. First, the molybdenum catalyst facilitates ammonia synthesis in aqueous solution, which is surprising because molybdenum complexes in water are usually degraded. Second, the use of coordination-induced bond weakening provides a new method for nitrogen fixation under ambient conditions. It also avoids combinations of proton and electron sources that pose a fire hazard. The authors’ approach works just as well when ethylene glycol (HOCH2CH2OH) is used instead of water, which adds another candidate to the list of possible proton and electron sources.

Ashida and colleagues propose a catalytic cycle for their process in which the molybdenum catalyst first coordinates nitrogen and cleaves the N−N bond to form a molybdenum nitrido complex, which contains a molybdenum-nitrogen triple bond. The SmI2-water mixture then delivers hydrogen atoms to this complex, eventually producing ammonia. The formation of N−H bonds at molybdenum nitrido complexes is a significant thermodynamic challenge because the resulting N−H bonds are also weakened by the molybdenum. Nevertheless, this disadvantage is offset by the reduction of the chemical overpotential. The SmI2 not only facilitates the transfer of hydrogen atoms, it also keeps the metal in a reduced form and thus prevents the formation of undesired molybdenum oxides in aqueous solution.

The new process still has significant operational hurdles to overcome before it can be used on an industrial scale. For example, SmI2 is used in large quantities, which generates a lot of waste, and separating ammonia from aqueous solutions is energy-intensive. However, if the process were used for energy storage in combination with our recovery method, the separation from the aqueous solution could be omitted. Finally, there is still a chemical overpotential of about 600 kJ/mol. Future research should therefore focus on finding alternatives to SmI2, for example based on metals that are more abundant than samarium and that also promote coordination-induced bond weakening. As Fritz Haber and Carl Bosch experienced themselves, a newly developed method usually needs some time to mature before it becomes available on an industrial scale.

(Photo: Wikipedia)


Melting ice sheets in Greenland contribute 25% to sea level rise

Recently we reported on the loss of snow cover in Europe. Not only is the snow gone in many parts of Europe, Greenland’s ice cover is melting as well. The Greenland ice sheet contributes 25% to global sea-level rise, making it the largest contribution of the cryosphere. The increased mass loss of Greenland ice during the 21st century is mainly due to increased surface water runoff, of which ~93% comes directly from the small ablation zone of the ice sheet (~22% of the ice surface). As the snow melts in summer, bare glacier ice becomes exposed in this ablation zone. Bare ice is darker and less porous than snow; it absorbs more than twice the solar radiation while also retaining less meltwater. Bare ice produces a large proportion (~78%) of Greenland’s total runoff into the sea, even though only a small part of the ice surface is exposed in summer. Accurately capturing the reduced albedo and the full extent of bare ice in climate models is critical to determining Greenland’s present and future runoff contribution to sea-level rise.
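The statement that bare ice absorbs more than twice the solar radiation of snow follows directly from typical albedo values. The albedos below are rough literature ranges, not numbers from the study:

```python
# Absorbed fraction of incoming solar radiation = 1 - albedo.
# Albedo values are typical literature ranges (assumptions), not study data.
albedo_snow = 0.8      # fresh to aged snow, roughly 0.7-0.85
albedo_bare_ice = 0.4  # bare glacier ice, roughly 0.3-0.5

absorbed_snow = 1 - albedo_snow
absorbed_ice = 1 - albedo_bare_ice
ratio = absorbed_ice / absorbed_snow
print(f"Bare ice absorbs {ratio:.1f}x as much solar radiation as snow")
# ~3x for these values; "more than twice" holds across the typical ranges
```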

The mass loss of the Greenland ice sheet has recently increased due to accelerated melting of its surface. As this melting is critically affected by surface albedo, understanding the processes and potential feedbacks regarding the albedo is required for accurate forecasts of mass loss. The variability in radiation absorbed by the ablation zone amplified melting five times more than the hydrological and biological processes that also darken the ice sheet. Variations in the snow line, caused by the shallower ice layer at higher altitudes, have an even greater impact on melt in a warmer climate. As a result of these fluctuations, the mapped bare-ice area during the summer of 2012, the record year of snowmelt, was the largest at 300,050 km2; that is, bare ice accounted for 16% of the ice surface. The smallest extent of bare ice, 184,660 km2, was observed in 2006. This corresponded to 10% of the ice surface, i.e. almost 40% less area than in 2012. However, the observed variation was high and the observation period too short for a robust trend assessment.

Current climate models predict the sea-level rise during years of heavy melt too inaccurately, leading to uncertainty in estimates of Greenland’s contribution to global sea-level rise. To understand the factors that influence melting, Jonathan Ryan of Brown University, Providence, Rhode Island, and his colleagues investigated Greenland’s snow line. Below the snow line, the darker ice is not covered by snow; this line moves up and down with Greenland’s seasons. The researchers mapped these movements between 2001 and 2017 using satellite images. The average elevation of the snow line at the end of summer varied between 1,330 m in 2009 and 1,650 m in 2012. These fluctuations of the snow line are the most important factor in how much solar energy the ice sheet absorbs, and modelers must account for this effect to improve their predictions. Knowing how much and how fast the Greenland ice melts will help us take better protective measures. At Frontis Energy, we think that the best protection against sea-level rise is the prevention and recycling of CO2.

(Photo: Wikipedia)


Economic losses caused by flooding due to global warming

In Europe, floods are linked to large fluctuations in atmospheric pressure, a pattern known as the North Atlantic Oscillation. Stefan Zanardo and his colleagues at Risk Management Solutions, London, UK, analyzed historical records of severe floods in Europe since 1870 and compared the patterns of atmospheric pressure at the times of the floods. When the North Atlantic Oscillation is in a positive state, a depression over Iceland drives wind and storms across northern Europe; in a negative state, it makes southern Europe wetter than usual. Floods typically occur in northern Europe and cause the most damage when the North Atlantic Oscillation was positive during the winter. If enough rain has already fallen to saturate the soil, the conditions for high flood risk are met. Air pressure patterns over Europe may change with global warming, and public administrations should take this into account when assessing flood risk in a region, the researchers say.

This is important because flooding in Europe often causes loss of life, significant property damage, and business interruptions, and global warming will worsen this situation further. The distribution of risk will change as well. The frequent occurrence of catastrophic flooding in recent years has sparked strong interest in this problem in both the public and private sectors. The public sector has been working to improve early warning systems, which indeed have economic benefits. In addition, various risk-mitigating strategies have been implemented in European countries, including flood protection, measures to raise risk awareness, and risk transfer through wider adoption of flood insurance. The fight against the root cause, namely global warming, still lags far behind what is needed.

Correlations between large-scale climate patterns, in particular the North Atlantic Oscillation, and extreme events in the water cycle on the European continent have long been described in the literature. With more severe and more frequent flooding, and in view of alarming global warming scenarios, concerns over future flood-related economic losses have moved into the focus of public attention. Although it is known that such climatic patterns control meteorological events, it is not always clear whether this link affects the frequency and severity of flooding and the associated economic losses. In their study, the researchers relate the North Atlantic Oscillation to economic flood losses.

The researchers used recent data from flood databases as well as catastrophe models to establish this relation. The models allowed them to quantify the economic losses that are ultimately caused by the North Atlantic Oscillation. These losses vary widely between the countries under the influence of the North Atlantic Oscillation. The study shows that the North Atlantic Oscillation is a good predictor of average losses in the long term. Based on this predictability, the researchers argue that the temporal variations of flood risk caused by climate oscillations can be forecast. This can help to counter catastrophic flood events early on, so that flood damage can be minimized or even avoided. As scientists improve their predictions of the North Atlantic Oscillation, society will be better prepared for future flooding.

(Photo: Wikipedia, Stefan Penninger, Sweden)


Producing liquid bio-electrically engineered fuels from CO2

At Frontis Energy we have spent much thought on how to recycle CO2. While high-value products such as polymers for medical applications are more profitable, customer demand for such products is far too low to recycle CO2 in the volumes required to bring our atmosphere back to pre-industrial levels. Biofuels, for example from field crops or algae, have long been thought to be the solution. Unfortunately, they require too much arable land. On top of their land use, the biochemical pathways involved are too complex to be designed by intuition alone. We therefore propose a different way to reach the target of decarbonizing our planet quickly. The procedure begins with a desired target fuel and suggests a microbial consortium to produce this fuel. In a second step, the consortium is examined in a bio-electrical system (BES).

CO2 can be used for liquid fuel production via multiple pathways. The end products, long-chain alcohols, can be used either directly as fuel or reduced to hydrocarbons. Shown are examples of high-level BEEF pathways using CO2 and electricity as input and methane, acetate, or butanol as output. Subsequent process steps are: 1, aerobic methane oxidation; 2, direct use of methane; 3, heterotrophic phototrophs; 4, acetone-butanol fermentation; 5, heterotrophs; 6, direct use of butanol; 7, further processing by yeasts.

Today’s atmospheric CO2 imbalance is a consequence of fossil carbon combustion. This reality requires quick and pragmatic solutions if further CO2 accumulation is to be prevented. Direct air capture of CO2 is moving closer to economic feasibility, avoiding the use of arable land to grow fuel crops. Producing combustible fuel from CO2 is the most promising intermediate solution because such fuel integrates seamlessly into existing urban infrastructure. Biofuels have been explored intensively in recent years, in particular within the emerging field of synthetic biology. However tempting the application of genetically modified organisms (GMOs) appears, non-GMO technology is easier and faster to implement because the required microbial strains already exist. Avoiding GMOs, CO2 can be used in BES to produce C1 fuels like methane and precursors like formic acid or syngas, as well as C1+ compounds like acetate, 2-oxobutyrate, butyrate, ethanol, and butanol. At the same time, BES integrate well into urban infrastructure without the need for arable land.

However, except for methane, none of these fuels is readily combustible in its pure form. While electromethane is a commercially available alternative to fossil natural gas, its volumetric energy density of 40-80 MJ/m3 is far lower than that of gasoline with 35-45 GJ/m3. This, the necessary technical modifications, and the psychological barrier of fueling with a gas make methane hard to sell to motorists. To produce liquid fuel, carbon chains need to be elongated, with alcohols or, better, hydrocarbons as final products. To this end, syngas (CO + H2) is theoretically a viable option via the Fischer-Tropsch process. In reality, syngas precursors are either fossil fuels (e.g. coal, natural gas, methanol) or biomass. While the former is obviously not CO2-neutral, the latter competes for arable land. The direct conversion of CO2 and electrolytic H2 into C1+ fuels, in turn, is catalyzed by electroactive microbes in the dark (see title figure), avoiding competition with food crops for sunlit land. Unfortunately, little research has been undertaken beyond proof of concept with a few electroactive strains. In stark contrast, a plethora of metabolic studies in non-BES is available, but these studies often propose the use of GMOs or complex organic substrates as precursors.

We propose to systematically identify metabolic strategies for the production of liquid bio-electrically engineered fuel (BEEF). The fastest approach starts by screening metabolic databases using established methods of metabolic modeling, followed by high-throughput hypothesis testing in BES. Since H2 is the intermediate of bio-electrosynthesis, the most efficient strategy is to focus on CO2 and H2 as direct precursors with as few intermediate steps as possible. Scalability and energy efficiency, that is, economic feasibility, are pivotal.
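For a sense of scale, the gap between the two volumetric energy densities quoted above can be computed directly; this is only a rough comparison using the figures just mentioned:

```python
# Volumetric energy densities as quoted in the text (order of magnitude).
methane_MJ_per_m3 = 40        # electromethane near ambient pressure, lower end of 40-80 MJ/m^3
gasoline_MJ_per_m3 = 35_000   # gasoline, lower end of 35-45 GJ/m^3

ratio = gasoline_MJ_per_m3 / methane_MJ_per_m3
print(f"Gasoline stores roughly {ratio:.0f}x more energy per volume than methane gas")
# ~875x at these figures; compression or liquefaction narrows, but does not close, the gap
```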

First, an electrotrophic acetogen produces acetate, which is then used by heterotrophic algae in a subsequent step.

The biggest obstacle to BEEF production is the lack of knowledge about pathways that use CO2 and electrolytic H2. This gap exists despite metabolic databases like KEGG and, more recently, KBase, making metabolic design and the selection of adequate BEEF strains a guessing game rather than an educated approach. Nonetheless, metabolic tools have been used to model fuel production in single-cell yeasts and various prokaryotes. Despite their shortcomings, metabolic databases have also been employed to model species interactions, for example in a photo-heterotroph consortium, using software like ModelSEED / KBase (http://modelseed.org/), RAVEN / KEGG, and COBRA. A first systematic attempt at BEEF cultures producing acetate demonstrated the usability of KBase for BES. That work was a bottom-up study which mapped existing genomes onto existing BEEF consortia. The same tools can also be employed in a top-down approach, starting with the desired fuel to find the required organisms. Some possible BEEF organisms are the following.

Possible pathways for BEEF production involving Clostridia, heterotrophic phototrophs (3), and further processing by yeasts (7); numbering as in the figure above.

Yeasts are among the microorganisms with the greatest potential for liquid biofuel production. Baker’s yeast (Saccharomyces cerevisiae) is the most prominent example. While known for ethanol fermentation, yeasts also produce fusel oils such as butane-, phenyl-, and amyl-derived aldehydes and alcohols. Unlike ethanol, which is formed via sugar fermentation, fusel oil is synthesized in branched-off amino acid pathways followed by aldehyde reduction. Many enzymes involved in the reduction of aldehydes have been identified, with alcohol dehydrogenases being the most commonly observed. The corresponding reduction reactions require NADH, but it is not known whether H2 produced at the cathodes of BES can be involved.
Clostridia, for example Clostridium acetobutylicum and C. carboxidivorans, can produce alcohols like butanol, isopropanol, and hexanol, and ketones like acetone, from complex substrates (starch, whey, cellulose, etc.) or from syngas. Clostridial metabolism was elucidated some time ago and differs from that of yeast: it does not necessarily require complex precursors for NAD+ reduction, and it has been shown that H2, CO, and cathodes can donate electrons for alcohol production. CO2 and H2 were used in a GMO clostridium to produce high titers of isobutanol. Typical representatives for acetate production from CO2 and H2 are C. ljungdahlii, C. aceticum, and Butyribacterium methylotrophicum. Sporomusa sphaeroides produces acetate in BES, and clostridia also dominated mixed-culture BES converting CO2 to butyrate. They are therefore prime targets for low-cost biofuel production. Alcohols in clostridia are produced from acetyl-CoA. This reaction is reversible, allowing acetate to serve as a substrate for biofuel production with extracellular energy supply. Energy conservation, that is ATP synthesis, can then be achieved from ethanol via electron bifurcation or from H2 oxidation via respiration. While this is possible in anaerobic clostridia, it is hitherto unknown whether electron bifurcation or respiration are linked to alcohol or ketone synthesis.
Phototrophs like Botryococcus produce C1+ biofuels as well. They synthesize a number of different hydrocarbons, including high-value alkanes, alkenes, and terpenes. However, high titers were achieved only by means of genetic engineering, which is economically not feasible in many countries due to regulatory constraints. Moreover, aldehyde dehydration/deformylation to alkanes or alkenes requires molecular oxygen to be present. The olefin pathway of Synechococcus also depends on molecular oxygen, with a cytochrome P450 involved in fatty acid decarboxylation. The presence of molecular oxygen impairs BES performance through immediate product degradation and unwanted cathodic oxygen reduction. In contrast, our own preliminary experiments (see title photo) and a corrosion experiment show that algae can live in the dark using electrons from a cathode. While the enzymes involved in the production of some algal biofuels are known (such as those for olefin synthesis and aldehyde deformylation), it is not known whether these pathways are connected to H2 utilization (perhaps via ferredoxins). Such a connection would be a promising indicator that hydrocarbon-producing cyanobacteria could be grown on the cathodes of BES, and it should be examined in future research.
At Frontis Energy we believe that a number of other microorganisms show potential for BEEF production, and these deserve further investigation. To avoid GMOs, BES-compatible co-cultures must be identified via in silico metabolic reconstruction from existing databases. Possible inter-species intermediates are still unknown, but they are a prerequisite for successful BES operation. Finally, a techno-economic assessment of BEEF production, with and without carbon taxes and in comparison with chemical methods, will direct future research.
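One way to begin such an in silico reconstruction is flux balance analysis. The sketch below uses the COBRApy toolbox (one of the COBRA tools mentioned above) on a deliberately tiny, hypothetical model; the reaction, metabolite names, and bounds are placeholders for illustration, not entries from KEGG or KBase:

```python
# Minimal flux balance analysis sketch for an acetogen growing on CO2 and cathodic H2.
# Hypothetical toy model; all identifiers and bounds are illustrative assumptions.
from cobra import Model, Reaction, Metabolite

model = Model("beef_sketch")

co2 = Metabolite("co2_c", formula="CO2", name="CO2", compartment="c")
h2 = Metabolite("h2_c", formula="H2", name="H2", compartment="c")
ac = Metabolite("ac_c", name="acetate", compartment="c")

ex_co2 = Reaction("EX_co2")          # CO2 source (e.g. direct air capture)
ex_co2.add_metabolites({co2: 1.0})
ex_co2.bounds = (0, 1000)

ex_h2 = Reaction("EX_h2")            # H2 source (cathode); assumed supply limit
ex_h2.add_metabolites({h2: 1.0})
ex_h2.bounds = (0, 10)

wl = Reaction("WL_lumped")           # lumped Wood-Ljungdahl: 2 CO2 + 4 H2 -> acetate (water omitted)
wl.add_metabolites({co2: -2.0, h2: -4.0, ac: 1.0})

dm_ac = Reaction("DM_ac")            # acetate sink, used as the objective
dm_ac.add_metabolites({ac: -1.0})

model.add_reactions([ex_co2, ex_h2, wl, dm_ac])
model.objective = dm_ac

solution = model.optimize()          # maximize acetate production under the H2 limit
print(f"Max acetate flux: {solution.objective_value:.1f} (arbitrary flux units)")
```

In a real top-down workflow, the toy reactions would be replaced by genome-scale models pulled from KBase or ModelSEED, and the H2 uptake bound would come from measured cathodic current densities.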


White Christmas, going … gone

In Germany, we seem to remember White Christmas from fairy tales only. Now there is also scientific evidence that the winter snow cover in Europe is thinning. Thanks to global warming, the decline in snow cover has accelerated.

A research group around Dr. Fontrodona Bach of the Royal Netherlands Meteorological Institute in De Bilt analyzed six decades of snow cover and climate data from thousands of weather stations across Europe. The researchers found that, with the exception of a few extremely cold local spots, the mean snow depth has been decreasing by 12% per decade since 1951. They recently published their results in the journal Geophysical Research Letters. The amount of “extreme” snow cover that affects local infrastructure has declined more slowly.

The observed decline, which accelerated after the 1980s, results from a combination of rising temperatures and the impact of climate change on precipitation. The shrinking snow cover can reduce the availability of fresh water during the spring melt, the authors note.

(Photo: Doris Wulf)


Ammonia energy storage #1

The ancient, arid landscapes of Australia are hardly fertile soil for huge forests or farmland. But the sun shines there more than in almost any other country, and strong winds hit the south and west coasts. All in all, Australia has a renewable energy potential of 25 terawatts, one of the highest in the world and about four times the world’s installed power generation capacity. The low population density allows only little domestic energy storage, and electricity export is difficult because of the country’s isolated location.

So far, we thought the cheapest way to store large amounts of energy was power-to-gas. But there is another way to produce carbon-free fuel: ammonia. Nitrogen gas and water are enough to make it. Converting renewable electricity into this energy-rich gas, which can easily be cooled and converted into a liquid fuel, produces a formidable carrier for hydrogen. Either ammonia or hydrogen can be used in fuel cells.
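Schematically, and assuming water electrolysis as the hydrogen source, only two well-known reactions are needed to go from renewable electricity to ammonia:

2 H2O → 2 H2 + O2 (electrolysis, driven by wind or solar power)

N2 + 3 H2 → 2 NH3 (ammonia synthesis)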

The volumetric energy density of ammonia is almost twice as high as that of liquid hydrogen, and at the same time ammonia can be transported and stored more easily. Researchers around the world are pursuing the same vision of an “ammonia economy”, and in Australia, which has long been exporting coal and natural gas, this is particularly attractive. This year, Australia’s Renewable Energy Agency is providing 20 million Australian dollars in funding.

Last year, an international consortium announced plans to build a $10 billion combined wind and solar plant. Although most of the project’s 9 gigawatts of electricity would be exported through a submarine cable, part of this energy could be used to produce ammonia for long-haul transport. The process could eventually replace the Haber-Bosch process.

Such ammonia factories are cities of pipes and tanks and are usually situated where natural gas is available. In the Western Australian Pilbara Desert, where ferruginous rocks meet the ocean, there is such an ammonia city. It is one of the largest and most modern ammonia plants in the world. But at its core, it still relies on the same steel reactors that follow the 100-year-old ammonia recipe.

Until 1909, nitrogen-fixing bacteria produced most of the ammonia on Earth. In that year, the German scientist Fritz Haber discovered a reaction that could split the strong chemical bond of nitrogen (N2) with the aid of iron catalysts (magnetite) and subsequently bind the atoms to hydrogen to form ammonia. In large, narrow steel reactors, the reaction takes place at 250 times atmospheric pressure. The process was first industrialized by the German chemist Carl Bosch at BASF and has become more efficient over time: about 60% of the energy put in is stored in the ammonia bonds. Today, a single plant produces and delivers up to 1 million tons of ammonia per year.
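The 60% figure can be checked with simple arithmetic. The heating value of ammonia used below is a standard literature number, not one taken from this post:

```python
# Back-of-the-envelope check of the ~60% energy efficiency of Haber-Bosch.
# The lower heating value of ammonia is a literature value (assumption here).
lhv_nh3 = 18.6      # MJ per kg of ammonia (lower heating value)
efficiency = 0.60   # fraction of input energy stored in the ammonia bonds (from the text)

energy_input = lhv_nh3 / efficiency   # MJ per kg, numerically equal to GJ per tonne
print(f"Implied energy input: about {energy_input:.0f} GJ per tonne of NH3")
# ~31 GJ/t, roughly the range often cited for modern ammonia plants
```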

Most of it is used as fertilizer. Plants need nitrogen to build proteins and DNA, and ammonia delivers it in a bioavailable form. It is estimated that at least half of the nitrogen in the human body today stems from synthetic ammonia.

Haber-Bosch enabled a green revolution, but the process is anything but green. It requires hydrogen gas (H2), which is obtained from natural gas or coal using pressurized, heated steam. Carbon dioxide (CO2) remains behind and accounts for about half of the emissions. The second feedstock, N2, is captured from the air. But the pressure needed to combine hydrogen and nitrogen in the reactors is energy-intensive, which again means more CO2. The emissions add up: global ammonia production consumes about 2% of the world’s energy and produces 1% of our CO2 emissions.

Our microbial electrolysis reactors convert ammonia directly into methane gas, without the detour via hydrogen. The patent-pending process is particularly suitable for removing ammonia from wastewater. Microbes living in the wastewater directly oxidize the ammonia dissolved in it and feed the released electrons into an electric circuit. The electricity can be collected directly, but it is more economical to use it to produce methane gas from CO2. With our technology, part of the CO2 is returned to the carbon cycle and contaminated wastewater is purified:

8 NH3 + 3 CO2 → 4 N2 + 3 CH4 + 6 H2O
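A quick element count confirms that the overall equation is balanced; this is purely an arithmetic check and says nothing about the microbial mechanism:

```python
# Element balance check for: 8 NH3 + 3 CO2 -> 4 N2 + 3 CH4 + 6 H2O
from collections import Counter

reactants = {"NH3": 8, "CO2": 3}
products = {"N2": 4, "CH4": 3, "H2O": 6}
composition = {
    "NH3": {"N": 1, "H": 3},
    "CO2": {"C": 1, "O": 2},
    "N2":  {"N": 2},
    "CH4": {"C": 1, "H": 4},
    "H2O": {"H": 2, "O": 1},
}

def count_atoms(side):
    total = Counter()
    for species, n in side.items():
        for element, k in composition[species].items():
            total[element] += n * k
    return total

print(count_atoms(reactants))  # Counter({'H': 24, 'N': 8, 'O': 6, 'C': 3})
print(count_atoms(products))   # identical counts, so the equation is balanced
```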



Fresh CO2 − Now Even Cheaper!

Hurry up while stocks last, you may want to add. Carbon dioxide (CO2) is a waste product from the combustion of fossil fuels such as oil, gas, and coal. It is almost worthless because it finds little use. However, technologies such as power-to-gas and the electrosynthesis of methanol can convert CO2 directly into valuable, albeit cheap, products. This increases the commercial interest in CO2, and ultimately filtering it from the air becomes economically attractive. In other words, filtering CO2 from the air is now more than just an expensive strategy to fight global warming. A detailed economic analysis recently published in the journal Joule suggests that this filter technology could soon become a viable reality.

The study was published by engineers of Carbon Engineering in Calgary, Canada. Since 2015, the company has been operating a pilot plant for CO2 extraction in British Columbia. This plant, based on a concept called Direct Air Capture (DAC), formed the foundation for the presented economic analysis, which includes cost figures from suppliers of all major components. According to the study, the cost of extracting a ton of CO2 from the air ranges from $94 to $232, depending on a variety of design options. The previous comprehensive analysis of DAC, published by the American Physical Society in 2011, had estimated $600 per tonne.

In addition to Carbon Engineering, the Swiss company Climeworks is also working on DAC in Zurich. There, the company has launched a commercial pilot plant that can capture 900 tonnes of CO2 from the atmosphere every year for use in greenhouses. Climeworks has also opened a second plant in Iceland that can capture 50 tonnes of CO2 per year and bury it in subterranean basalt formations. According to Daniel Egger of Climeworks, capturing a ton of CO2 at their Swiss site costs about $600. He expects this number to fall below $100 per ton over the next five to ten years.

Technically, the CO2 is captured in an alkaline solution of potassium hydroxide, which reacts with the CO2 to form potassium carbonate. Further processing yields solid calcium carbonate, which releases the CO2 again when heated. The CO2 can then be stored underground or used to make synthetic, CO2-neutral fuels. With this approach, Carbon Engineering has reduced the cost of its filtration plant to $94 per ton of CO2.
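In simplified textbook form (not necessarily the company's exact flowsheet), the capture loop consists of four reactions:

2 KOH + CO2 → K2CO3 + H2O (air contactor)

K2CO3 + Ca(OH)2 → CaCO3 + 2 KOH (regeneration of the hydroxide)

CaCO3 → CaO + CO2 (calcination, releases concentrated CO2)

CaO + H2O → Ca(OH)2 (slaking, closes the loop)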

CO2-neutral fuel, from carbon dioxide captured from the air and electrolytic hydrogen.

Assuming, however, that the CO2 is sequestered in rock, a price of $100 per ton would translate into roughly 20 cents per liter of gasoline. Ultimately, the economics of CO2 extraction depend on factors that vary by location, including the price of energy and whether a company can access government subsidies or a carbon trading market. The cost per ton of DAC-CO2 is therefore likely to remain above the real market price of CO2 in the near future. For example, emission certificates in the European Union’s trading system are around €16 per tonne of CO2. If CO2 extraction technology gained a foothold in markets where carbon can be sold at the DAC price, DAC would of course become economical. Conversion into useful products such as plastics or fuels could also help to recover the DAC premium. Alberta seems a good location because its oil is of low quality and comes with high production costs. Moreover, the scale of a DAC plant suggests that Canada, given the size of the country, is well suited. Albertans may want to reconsider their business model.
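The conversion from a CO2 price to a gasoline surcharge is simple arithmetic. The emission factor used below is a standard literature value, not a number from the Joule study:

```python
# Surcharge on gasoline if its CO2 emissions are offset by DAC and sequestration.
# The emission factor is a typical literature value (assumption), not from the study.
co2_price_per_ton = 100.0       # USD per tonne of CO2
co2_per_liter_gasoline = 2.3    # kg CO2 emitted per liter of gasoline burned

surcharge = co2_price_per_ton / 1000.0 * co2_per_liter_gasoline
print(f"Surcharge: ${surcharge:.2f} per liter of gasoline")  # about $0.23 per liter
```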

At Frontis Energy, we are excited about this prospect. CO2 is accessible everywhere, and DAC can help us convert it into methane gas. Power-to-gas is perfect for this. However, something still has to happen: $100 per ton is already good (compared with $600), but to place a product like methane on the market economically, it should be more like $10 per ton, as the rough estimate below the figure illustrates:

CO2 economy of power-to-gas with electrolytic hydrogen. Cal, California, EOR, enhanced oil recovery.
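The following sketch shows why we aim for $10 per ton; the stoichiometry is fixed by the Sabatier reaction, while the wholesale gas price used for comparison is an assumed figure, not market data:

```python
# Rough CO2 feedstock cost for power-to-gas methane (Sabatier: CO2 + 4 H2 -> CH4 + 2 H2O).
# The wholesale gas price range in the comment is an assumption, not market data.
co2_per_kg_ch4 = 44.0 / 16.0     # kg CO2 needed per kg CH4 (molar masses, ~2.75)
for co2_price in (100.0, 10.0):  # USD per tonne of CO2
    feedstock_cost = co2_price / 1000.0 * co2_per_kg_ch4
    print(f"CO2 at ${co2_price:.0f}/t -> ${feedstock_cost:.2f} per kg CH4")
# Compared with an assumed wholesale natural gas price of roughly $0.15-0.25 per kg:
# at $100/t the CO2 feedstock alone already rivals the price of the final product,
# at $10/t it drops to a few cents and leaves room for electrolysis and capital costs.
```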

Sure, we always complain, but we cannot wait to see how the price of DAC continues to fall, and we wish Carbon Engineering and Climeworks all the best. Keep it up!

(Photos: Carbon Engineering)


A Brief Account of Wind Energy in the United States, Canada, and the European Union

Wind energy refers to the conversion of energy captured from the wind into electrical or mechanical energy. Wind turbines produce electrical energy, while windmills produce mechanical energy. Other forms of wind energy conversion are wind pumps, which use wind energy to pump water, and sails, which drive sailboats.
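The power a turbine can capture scales with the swept rotor area and the cube of the wind speed. In the minimal sketch below, the rotor diameter, wind speed, and power coefficient are assumed example values:

```python
# Power captured by a wind turbine: P = 0.5 * Cp * rho * A * v^3
# Rotor diameter, wind speed and power coefficient are assumed example values.
import math

rho = 1.225   # air density in kg/m^3
d = 100.0     # rotor diameter in m (assumed)
v = 10.0      # wind speed in m/s (assumed)
cp = 0.45     # power coefficient (the Betz limit is ~0.59)

area = math.pi * (d / 2) ** 2          # swept rotor area in m^2
power = 0.5 * cp * rho * area * v ** 3 # power in watts
print(f"Captured power: {power/1e6:.1f} MW")  # ~2.2 MW for these assumptions
```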

The cheapest US energy prices by source and county. Source: Energy Institute, University of Texas at Austin

Since its first use on sailboats, wind energy has become widespread. Windmills have been used for more than 2,000 years as a source of mechanical energy, and the Scotsman James Blyth was the first to demonstrate the conversion of wind energy into electricity. As wind is a renewable resource, electrical energy generated by wind turbines is a clean and sustainable form of energy. Wind energy is often also cheaper than natural gas, for example throughout the entire American Midwest, as shown by the Energy Institute of the University of Texas at Austin. It is therefore not surprising that wind energy is one of the fastest growing markets in the renewable energy sector worldwide. In 2015, 38% of all renewable energy in the United States and the European Union was generated by wind turbines.

Wind and solar energy production in the US and Canada in 2015. Sources: EIA, Statistics Canada

More efficient than single wind turbines are wind parks, where clusters of large turbines generate electrical power continuously. There are two kinds of wind parks: on-shore and off-shore. Off-shore wind parks are often more expensive but do not occupy valuable farmland, as is often the case for on-shore wind parks. However, wind parks on farmland can be a valuable source of extra income for farmers.

Wind and solar energy production in the European Union and the Euro-zone in 2015. WSH is the fraction of renewable energy in the European energy market. “Hydro” is the fraction of hydropower. Source: Eurostat

You Can Have the Pie and Eat It

In Paris, humanity set itself the goal of limiting global warming to 1.5 °C. Most people believe that this will require significant sacrifices in quality of life, which is one reason why climate protection is rejected by many, sometimes to the point of outright denial. At Frontis Energy, we think we can protect the climate and live better at the same time. The latest study, published in Nature Energy by a research group around Arnulf Grubler of the International Institute for Applied Systems Analysis in Laxenburg, Austria, now shows that we have good reason for this optimism.

The team used computer models to explore the potential of technological trends to reduce energy consumption. Among other things, the researchers assumed that the use of shared car services will increase and that fossil fuels will give way to solar energy and other forms of renewable energy. Their results show that global energy consumption could decrease by about 40% despite growth in population, income, and economic activity. Air pollution and the demand for biofuels would also decrease, which would improve health and food supplies.

In contrast to many previous assessments, the group’s findings suggest that humans can limit the temperature rise to 1.5 °C above preindustrial levels without resorting to drastic strategies to extract CO2 from the atmosphere later in the century.

Now, one can argue whether shared car services reduce quality of life. Nevertheless, we think that individual mobility can be maintained while protecting our climate. CO2 recovery for the production of fuels, that is, CO2 recycling, is one such possibility. Power-to-gas is the most advanced form of CO2 recycling and should certainly be considered in future studies. An example of such an assessment of power-to-gas technology was published by a Swiss research group headed by Frédéric Meylan, who found that the carbon footprint can be neutralized with conventional technology after just a few cycles.

(Picture: Pieter Bruegel the Elder, The Land of Cockaigne, Wikipedia)