Scientists Find Strange Critters Under a Half Mile of Ice

But because the researchers couldn’t collect specimens, they can’t yet say what exactly these sponges and other critters could be eating. Some sponges filter organic detritus from the water, whereas others are carnivorous, feasting on tiny animals. “That would be sort of your headline of the year,” says Christopher Mah, a marine biologist at the Smithsonian, who wasn’t involved in the research. “Killer Sponges, Living in the Dark, Cold Recesses of Antarctica, Where No Life Can Survive.”

And Griffiths and his team also can’t yet say if mobile creatures like fish and crustaceans also live around the rock—the camera didn’t glimpse any—so it’s not clear if the sessile animals face some kind of predation. “Are they all eating the same food source?” asks Griffiths. “Or are some of them kind of getting nutrients from each other? Or are there more mobile animals around somehow providing food for this community?” These are all questions only another expedition can answer.

It does appear that sedimentation around the rock isn’t very heavy, meaning the animals aren’t in danger of being buried. “It’s kind of a Goldilocks-type thing going on,” says Griffiths of the rock’s apparently fortuitous location, “where it’s got just enough food coming in, and it’s got nothing that wants to eat them—as far as we can tell—and it’s not getting buried by too much sediment.” (In the sediment surrounding the rock, the researchers also noticed ripples that are typically formed by currents, thus bolstering the theory that food is being carried here from afar.)

It’s also not clear how these stationary animals got there in the first place. “Was it something very local, where they kind of hopped from local boulder to local boulder?” asks Griffiths. Alternatively, perhaps their parents lived on a rock hundreds of miles away—where the ice shelf ends and more typical marine ecosystems begin—and released their sperm and eggs to travel in the currents.

Because Griffiths and his colleagues don’t have specimens, they also can’t say how old these animals are. Antarctic sponges have been known to live for thousands of years, so it’s possible that this is a truly ancient ecosystem. Perhaps the rock was seeded with life long ago, but currents have also refreshed it with additional life over the millennia.

The researchers also can’t say whether this rock is an aberration, or if such ecosystems are actually common under the ice. Maybe the geologists didn’t just get extremely lucky when they dropped their camera onto the rock—maybe these animal communities are a regular feature of the seafloor beneath Antarctica’s ice shelves. There’d certainly be a lot of room for such ecosystems: These floating ice shelves stretch for 560,000 square miles. Yet, through previous boreholes, scientists have only explored an area underneath them equal to the size of a tennis court. So it may well be that they’re out there in numbers, and we just haven’t found them yet.
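
For a rough sense of that mismatch, here is a quick back-of-the-envelope comparison; the ice-shelf figure comes from the paragraph above, while the tennis-court area (a standard doubles court, roughly 261 square meters) is an outside assumption used only for scale.

```python
# Back-of-the-envelope scale comparison: explored area vs. total ice-shelf area.
SQ_MILE_IN_M2 = 1609.344 ** 2                  # square meters in one square mile
ice_shelf_area_m2 = 560_000 * SQ_MILE_IN_M2    # ~1.45e12 m^2 of floating ice shelf
tennis_court_m2 = 23.77 * 10.97                # ~261 m^2, a standard doubles court

fraction_explored = tennis_court_m2 / ice_shelf_area_m2
print(f"Explored fraction: {fraction_explored:.1e}")        # ~1.8e-10
print(f"Roughly 1 part in {1 / fraction_explored:,.0f}")    # ~5.6 billion
```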

And we may be running out of time to do so. This rock may be locked away under a half mile of ice, but that ice is increasingly imperiled on a warming planet. “There is a potential that some of these big ice shelves in the future could collapse,” says Griffiths, “and we could lose a unique ecosystem.”



Scientists Can Literally Become Allergic to Their Research

This story originally appeared on Undark and is part of the Climate Desk collaboration.

Bryan Fry’s heart was pounding as he stepped back from the snake enclosure and examined the bite marks on his hand. He had just been bitten by a death adder, one of Australia’s most venomous snakes. Its neurotoxin-laced bite could cause vomiting, paralysis and—as the name suggests—death.

Fry, at the time a graduate student, had kept snakes for years. Oddly, the neurotoxins weren’t his biggest worry; the nearby hospital would have the antivenom he needed, and, although data is limited, people who receive treatment generally survive. Anaphylactic shock, on the other hand, might kill him within minutes.

“Anaphylactic shock is the single worst feeling you can possibly imagine,” recalled Fry, now a biologist at the University of Queensland in Australia. “It is just insane. Every cell in your body is screaming out in mortal terror.”

Fry, who had spent his life admiring and eventually studying venomous snakes, had become deathly allergic to them.

While most cases are not so extreme, anecdotal reports and expert analysis suggest that it is far from rare for scientists, students, and laboratory technicians to develop allergies to the organisms they study. Perversely, some allergy researchers say, it is the researchers’ passion for their subjects—the close observation, the long hours of work each day, and the years of commitment to a research project—that puts them at such high risk.

“It is true that some things cause allergies more often than others, but the biggest factor is the frequency of the interaction with the study organism,” said John Carlson, a physician and researcher at Tulane University who specializes in insect and dust mite allergies. “You probably have about a 30 percent chance of developing an allergy to whatever it is that you study.” While data is limited, that estimate is in line with research on occupational allergies, which studies suggest occur in as many as 44 percent of people who work with laboratory rodents, around 40 percent of veterinarians, and 25 to 60 percent of people who work with insects.

Federal guidelines suggest that laboratories have “well-designed air-handling systems” and that workers don appropriate personal protective equipment, or PPE, to reduce the risk of developing an allergy. However, interviews with researchers and experts suggest that there may be little awareness of—or adherence to—guidelines like these. For scientists working with less common species and those engaged in fieldwork, information on what exactly constitutes appropriate PPE may be very limited.

Many researchers, perhaps especially those who do fieldwork, are used to being uncomfortable in service of their work, Carlson points out. “I think that a lot of researchers are so interested in the process of the research,” he said, “that they aren’t really considering the long-term effects that it could have on them.”

In general, allergies develop when the immune system overreacts to a substance that is usually harmless, or relatively harmless. The immune system monitors the body for potentially dangerous invaders like bacteria, fungi, and viruses. Sometimes, for reasons that are not well understood, the immune system identifies something benign, like pollen or animal dander, as dangerous. A person who has become sensitized in this way produces antibodies, proteins that mark the perceived intruder so the immune system can recognize it later.

When that person comes into contact with the substance again, the antibodies flag it as an invader. As part of the response, immune cells release compounds like histamine, which irritate and inflame the surrounding tissues, resulting in allergy symptoms.


US Cities Are Way Underreporting Their Carbon Footprints

Having each city build an individual SRI, or self-reported inventory, is like developing a national weather forecasting system by asking each county to characterize its local weather, then gathering all those systems into one cohesive model. “Well, that wouldn’t make any sense when you’re doing weather forecasting,” says Gurney. “In the same way, a greenhouse gas emissions system shouldn’t be every single entity doing this redundantly themselves.”

Instead, Gurney argues that the Vulcan system can shoulder the burden of calculating carbon levels for cities across the US. He and his colleagues have been developing the system for 15 years, incorporating two dozen data sets to thoroughly quantify sources of emissions across the country in fine detail. Vulcan looks at traffic, census, and air quality data, and takes an inventory of the emissions from all the power plants in the US. In some cities, like Los Angeles, the model is so detailed that it can discern how emissions vary block by block. The team has been able to confirm Vulcan’s modeling of emissions with atmospheric measures of CO2 across the US.

And in their new research, they found that city self-reports are often out of step when compared to Vulcan’s outputs. Their study found that some places, like Flagstaff and Palo Alto, significantly overreported their emissions (by about 60 percent and 40 percent, respectively). Others, like the California city of Torrance, underreported by over 100 percent, according to the study. (The team adjusted Vulcan’s output for each city, by the way. If one left out industrial fuel use, for instance, so would Vulcan, in order to better square the results. This means that Vulcan would also come up with somewhat of an underestimation, compared to a complete report.)
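
To make the percentages concrete, here is a minimal sketch of how such a discrepancy can be expressed; the numbers are invented, and the choice to measure the gap relative to the self-reported value is an assumption for illustration rather than the study’s exact method.

```python
# Toy example: comparing a city's self-reported inventory (SRI) with a model
# estimate. Figures are hypothetical; only the arithmetic is the point.
def discrepancy_pct(self_report: float, model_estimate: float) -> float:
    """Positive = overreporting, negative = underreporting,
    expressed relative to the self-reported value."""
    return (self_report - model_estimate) / self_report * 100

# Hypothetical annual CO2 totals, in million metric tons:
print(f"{discrepancy_pct(1.6, 1.0):+.1f}%")   # +37.5% -> the city overreported
print(f"{discrepancy_pct(1.0, 2.1):+.1f}%")   # -110.0% -> "underreported by over 100 percent"
```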

So why is it important to resolve these discrepancies? For one thing, city agencies can end up sinking considerable time and resources into mitigating emissions—like by creating more public transport and green spaces, or making the built environment friendlier for pedestrians—so they should have access to the most accurate data possible to figure out what to fund. And local data constantly changes as a city naturally transforms over time, so policymakers can find themselves tasked with making decisions based on a recent SRI report that’s already out of date.

Vulcan, by contrast, is consistently updated every two or three years with new data across the board, which can characterize a city’s growth over time. “We’re suggesting that this needs to be done in an ongoing fashion,” says Gurney. (He says governments can reach out to his team to start digging into Vulcan’s data on their cities.)

Might Vulcan, then, become a sort of standardized platform for American cities to more accurately measure their emissions? “I think that’s plausible, for sure,” says Louisiana State University environmental scientist Brian Snyder, who wasn’t involved in the work. “And I think it’d be a vast improvement upon what they’re doing right now.”

What cities are best equipped to transform at the moment, Snyder argues, is transportation. “If you want to reduce your emissions from transport, you have to know what your emissions are to begin with,” Snyder says. “And one of the nice things that Vulcan does is it sort of shows you—very specifically to the grid space—where they at least think those emissions are coming from.” This could help city agencies figure out where to bolster public transportation, for instance.

But there’s only so much a city can do to reduce emissions in the short term, Snyder adds. “A lot of stuff has been baked into the cake for 100 years.” Oil refineries remain a scar on the landscape for many cities, for instance.


9 Adventurers Died Mysteriously. A New Theory Explains Why

The researchers modeled how such winds could have built up snow above the tent, and how long it would have taken to reach a critical load that would cause the top slab to slip off the weaker layer below, now that its structural integrity was compromised by the cut. “This was how the loading was increasing,” says Gaume. “Because there was no other way—there was no snowfall on that night.” Sometime after midnight, enough weight had built above the weak layer that it suddenly collapsed, sending the slab into the tent. It would have been a relatively small avalanche—maybe 16 feet by 16 feet—which the researchers simulated with inspiration from the Disney snow model. It would have been enough to fill the hole the campers had dug into the snow, but not enough that the rescue team would be able to find clear signs of an avalanche 26 days later.

Here we see the disturbance caused not by an imaginary snowman, but by the combination of the cut above the tent and the snow deposited by wind.

Video: Gaume et al.

An avalanche doesn’t need to be large, though, to cause grave damage to the human body. Typically, hikers who get caught up in one are likely to just suffocate. But in this case, none of the nine victims died of suffocation, and some had severe chest and head trauma.

This, too, can be explained by the dynamics of the slab avalanche and the downward winds. While it wasn’t snowing at the time of the incident, the katabatic winds would have produced a much more dangerous kind of deposit above the tent. “The wind was eroding and transporting the snow, which was made of very small crystals,” says Gaume. “And then when it deposits, [the crystals] are highly compacted.” This could have created a dense slab of snow that weighed perhaps 25 pounds per cubic foot. And even more unfortunate for our adventurers, they’d laid their skis out as a floor for their tent, creating a hard substrate for the snow to crush them against.
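
A quick unit conversion puts that density figure in human terms; this is a sketch using standard conversion factors, and the one-meter block matches the chunk of snow described below.

```python
# Converting the quoted slab density (~25 lb per cubic foot) to metric, and
# weighing a block one meter on a side.
LB_TO_KG = 0.453592
FT_TO_M = 0.3048

density_kg_m3 = 25 * LB_TO_KG / FT_TO_M ** 3    # roughly 400 kg per cubic meter
block_mass_kg = density_kg_m3 * 1.0 ** 3        # a 1 m x 1 m x 1 m block
print(f"~{density_kg_m3:.0f} kg/m^3; a one-meter cube weighs ~{block_mass_kg:.0f} kg")
```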

Gaume and Puzrin went even further by modeling what this trauma could have looked like. To calibrate their simulation, they used data from old automotive industry crash tests done using human cadavers, rather than dummies. (To be fair, it was the 1970s, which was a … different time.) They then modeled the release of simulated snow blocks of different sizes onto a digital model of a human body, and compared that to the crash test results. “What we saw is that it would not be fatal, but it would create moderate-to-serious injuries,” says Gaume. (Below, you can see the damage a chunk of snow a meter across could do.)

Because the snow deposited by the wind would have been very dense, even a small avalanche could have caused serious injury to the campers in the tent.

Video: Gaume et al.

From this, they concluded that the mountaineers survived the initial crush of snow, cutting their way out of the tent, although some of them were seriously injured. But if they’d escaped a relatively small avalanche, why would they flee over half a mile away, instead of sticking around to dig out their supplies, especially their boots? Investigators found the group had actually stashed another set of supplies in the forest, so perhaps they’d set out for them in a panic. “You start to cut the tent from the inside to get out,” says Gaume. “You see there was an avalanche, and then you might be afraid of a second avalanche. And so they may have decided that the best option would probably be to go to the forest, make a fire, and try to find the supply.”


A New Project Maps the Pacific Coast’s Critical Kelp Forests

This story originally appeared in Canada’s National Observer and is part of the Climate Desk collaboration.

An ambitious project to map and monitor sea kelp forests along the entire British Columbia coast is afoot, and scientists are using seemingly disparate tools—both ancient and modern—to do it.

Researchers are using centuries-old British sea charts and advanced technology, such as camera drones and satellite images, to trace shifts in the abundance and distribution of kelp beds over time, said geographer Maycira Costa.

Like rainforests, BC’s canopy-forming kelp beds are critical and extensive ecosystems that shelter and feed a host of marine life, including juvenile salmon and marine mammals such as seals and otters, said Costa. “We’re trying to combine efforts to understand how these areas have been changing,” she said, adding that climate change in particular is a big concern, “and what we can do to minimize those changes because they’re such an important habitat.”

There is a lack of overall data on kelp beds along the coast, said Costa, who heads the Spectral lab at the University of Victoria, which specializes in using remotely sensed imagery to monitor change in marine environments. Some individual kelp beds in BC have been studied, but not consistently over time or across the wider coast, leaving a poor understanding of what’s going on with the giant algae populations so critical to the marine ecosystem, Costa said. “It’s one thing to look at kelp beds for just one year, but the important part is looking at several years of data,” said Costa, noting that kelp bed growth or loss can be quite dynamic over short periods of time.

Establishing a widespread picture of where and why kelp is diminishing or growing is critical to determining management or conservation policy and even the commercial harvest of these marine forests, she said.

Mapping the Future of Kelp With Old Sea Charts

But, curiously, to establish a baseline measurement of kelp on the coast, Costa’s high-tech research team relied on antiquated marine maps for the job. Using information from British admiralty charts from 1858 to 1956, the team created the first historical digital map of BC’s coastal kelp beds.

Considered navigational hazards, large kelp beds were carefully notated on British charts, which turned out to be an unusual but valuable source of information about coastal habitat in the 19th century, said Costa. A total of 137 charts were scanned, and the coordinates and kelp beds were transferred to digital maps once the scale and quality of the data had been verified, according to the study.

The chart data suggests the most concentrated kelp beds are around the north and west coasts of Vancouver Island, in the Johnstone Strait, in northern waters, and around northwestern Haida Gwaii.

Vast Quantity of Satellite Images

The next step to map the distribution of kelp on the coast over time is compiling satellite data from 2005 to the present, along with available scientific and government data from kelp inventories from the 1970s to 1990s, Costa said. “You wouldn’t believe the amount of data we have [to analyze],” she said. “For the BC coast, we have almost 6,000 satellite images. The amount of time spent processing data, it’s almost surreal.”

The project is looking at both bull and giant kelp, with help from the Hakai Institute and funding from Fisheries and Oceans Canada (DFO), the Canadian Hydrographic Service, and the Pacific Salmon Foundation, Costa said.

A kelp map of the Salish Sea, which stretches along the inside passage of Vancouver Island, is expected to be complete by mid-2021, she said, adding that maps of BC’s central and north coasts will follow.


The US Rejoins the Paris Climate Accord. Will It Matter?

With the stroke of a pen from his new desk in the Oval Office, President Joe Biden pulled the US back into the Paris climate accord on Wednesday, an international agreement that experts say is vital to getting the world’s nations to slow the emissions of planet-warming greenhouse gases. The executive order—the third of 17 executive orders or actions issued on his first day in office—means that US officials now will begin calculating a new target for the nation’s overall carbon emissions by the year 2030.

That target, in turn, will require federal, state, and corporate decisionmakers to set new standards for factories, cars, and power plants to use cleaner energy to meet that goal—while likely offering both incentives and penalties to reduce overall energy use by all US residents.

If that wasn’t enough climate action, Biden also signed an order canceling the controversial Keystone XL Pipeline, which would have carried crude oil from Canada to the Gulf of Mexico; the production, refining, and burning of that petroleum would create carbon dioxide emissions equivalent to those of 35.5 million cars per year. Another executive order signed Wednesday directs federal agencies to block former president Donald Trump’s weakening of federal rules that limited emissions of methane, a powerful greenhouse gas, from oil and gas drilling operations, to revise vehicle fuel economy and emissions standards, and to update appliance and building efficiency standards.

Along with his dogs Major and Champ, Biden is bringing with him to the White House a big team of climate change experts, including new senior climate advisers in the Departments of State, Treasury, and Transportation, as well as in the National Security Council and Office of the Vice President. Former Environmental Protection Agency administrator Gina McCarthy is being tapped to head a new White House office on climate policy; former secretary of state John Kerry will be Biden’s new international climate envoy; and David Hayes, a former deputy interior secretary, was named Biden’s special assistant on climate policy, The New York Times reported.

Experts say these first-day moves will set the US on a better path to fight climate change at home and abroad. “The Paris announcement is really important because it puts the US back in the global conversation,” says Jake Schmidt, managing director for the international program at the Natural Resources Defense Council. “It means Biden can also use the influence of the US to drive other countries to act more aggressively on climate change. We’ve been making the case that we need to have a climate-first foreign policy.”

That approach might work in negotiations with countries like Mexico or Brazil, two nations whose current populist leaders have blocked investment in renewable energy (Mexico) and boosted deforestation (Brazil), Schmidt says. If either nation wants to secure trade agreements with the US, Biden might require them to make climate progress in return. Meanwhile, smaller nations are looking to Biden’s election as a return to normalcy and hopefully progress on climate change, especially in countries that are feeling the heat from rising sea levels and increasing tropical storms.

But experts also warn that there are plenty of hurdles ahead. Trump’s four years were marked with disdain for science, the weakening of environmental regulations, and outright denial of the perils of climate change. In fact, one of Trump’s own early executive actions was announcing that the US would withdraw from the Paris agreement, which the US had joined in 2016 under then-president Barack Obama. (The withdrawal process began in 2019 and became official on November 4, 2020—the day after Trump lost his reelection bid.)


The Ongoing Collapse of the World’s Aquifers

But scientists haven’t modeled global risks of subsidence—until now. To build their model, Sneed and her colleagues scoured the existing literature on land subsidence in 200 locations worldwide. They considered geological factors such as high clay content, as well as topography, since subsidence is more likely to happen on flat land. They factored in population and economic growth, data on water use, and climate variables.

The researchers found that, planet-wide, subsidence could threaten 4.6 million square miles of land in the next two decades. While that’s just 8 percent of Earth’s land, humanity tends to build big cities in coastal areas, which are prone to subsidence. So they estimate that, in the end, 1.6 billion people could be affected. The modeling further found that, worldwide, subsidence threatens areas that together account for $8.19 trillion in gross domestic product, or 12 percent of global GDP.
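
As a quick sanity check on those percentages, the arithmetic below uses two outside assumptions: Earth’s total land area of roughly 57.5 million square miles, and the global GDP baseline of about $68 trillion implied by the study’s own 12 percent figure.

```python
# Sanity-checking the shares quoted above against rough global totals.
threatened_sq_mi = 4.6e6          # from the study, via the paragraph above
earth_land_sq_mi = 57.5e6         # assumption: Earth's total land area
exposed_gdp_trillion = 8.19       # from the study
global_gdp_trillion = 68.0        # assumption implied by the 12 percent figure

print(f"Land share: {threatened_sq_mi / earth_land_sq_mi:.0%}")         # ~8%
print(f"GDP share:  {exposed_gdp_trillion / global_gdp_trillion:.0%}")  # ~12%
```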

True, gradual subsidence isn’t as destructive as a sudden earthquake or volcanic eruption. “But it will cause these indirect effects or impacts that, in the long term, can produce either damages to structures or infrastructure, or increase floodable areas in these river basins or coastal areas,” says geoscientist Gerardo Herrera-García of the Geological and Mining Institute of Spain, lead author on the paper.

Subsidence is uniquely sensitive to climate change—at least indirectly. On a warmer planet, droughts are longer and more intense. “This is very important,” says Herrera-García. “Because no matter the amount of annual rainfall you have, the most important issue is that you have a prolonged drought period.” Dry reservoirs will lead cities to pump even more water out of their aquifers, and once an aquifer’s structure collapses, its plate-like clay grains compacting into tight stacks, there’s no going back. For the 1.6 billion people potentially affected by subsidence—and that’s just by the year 2040—the consequences could be dire, leading to both water shortages and the flooding of low-lying land.

“It’s definitely very startling results,” says USGS coastal geologist Patrick Barnard, who studies subsidence but wasn’t involved in this new work. “Especially coastal megacities—most of the megacities are, in fact, coastal. So it really highlights the issue in relation to coastal flooding.” And urban populations are booming: According to the United Nations, nearly 70 percent of humans will live in cities by 2050, up from 50 percent currently.

Humanity has tended to construct its cities where rivers empty into the sea, where the conditions for subsidence are ideal. Long ago, these rivers deposited sediments loaded with clay, which humans then built upon. “The areas that are at high risk are in those kinds of settings near the outlets of river deltas, and where you have low-lying, flat sedimentary basins near coasts,” says University of California, Berkeley geophysicist Roland Burgmann, who studies subsidence but wasn’t involved in this new work. But you can actually find this problem inland, too, for instance in Mexico City, which is built on top of the sediments of a former lake, and is accordingly suffering from subsidence.

Cities built on landfill are also sinking as that material settles. In the Bay Area megalopolis, for instance, some areas are sinking as much as a third of an inch a year. Modeling estimates from researchers at Arizona State University and UC Berkeley hold that by the end of the century, as much as 165 square miles of the Bay Area could be inundated as land sinks and the sea rises.
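
The arithmetic behind those Bay Area figures is straightforward; treating the sink rate as steady and counting roughly 80 years to the end of the century (both simplifying assumptions), the fastest-subsiding spots would drop by about two-thirds of a meter even before any sea-level rise is added.

```python
# Rough projection for the fastest-sinking Bay Area spots, assuming the
# current rate holds steady (a simplification) over ~80 years.
INCH_TO_CM = 2.54
rate_in_per_yr = 1 / 3            # "a third of an inch a year"
years = 80                        # assumption: roughly the rest of the century

total_sink_cm = rate_in_per_yr * years * INCH_TO_CM
print(f"~{total_sink_cm:.0f} cm (~{total_sink_cm / 100:.2f} m) of local sinking")
```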

Subsidence gets even trickier because its effects can vary dramatically over short distances, depending on factors like local clay composition or which side of an earthquake fault the land happens to be on. So this new global study is great for determining risk on a large scale, but scientists will still have to investigate subsidence with a finer focus.


The Autonomous Saildrone Surveyor Preps for Its Sea Voyage

If you happen to be crossing the San Francisco Bay or Golden Gate bridges this week, look for a massive surfboard with a red sail on top cruising slowly across the water. Don’t flinch if you don’t see anyone on board. It’s actually an autonomous research vessel known as the Saildrone Surveyor and it’s being steered remotely from shore.

The 72-foot-long vessel is launching this week into the bay from its dock at a former naval base in Alameda, California. It is designed to spend months at sea mapping the seafloor with powerful sonar devices, while simultaneously scanning the ocean surface for genetic material to identify fish and other marine organisms swimming below.

The carbon-fiber-composite and stainless-steel-hulled vessel will navigate by itself, following a preprogrammed route to collect and transmit oceanographic data back to Saildrone headquarters via satellite link. The data will then become available to government and academic scientists studying the ocean. In time, its designers hope, the solar-powered Surveyor might replace existing oceanographic research ships, which are far more expensive to operate and leave a substantial carbon footprint.

“Our goal is to understand our planet,” says Richard Jenkins, founder and CEO of Saildrone, the California firm that has spent the past 15 years designing previous versions of vessels that are about a third as big as Surveyor. “There are many reasons why you need seafloor information, from knowing where to place telecommunications and transoceanic cables, to safety of navigation, or looking for submerged seismic faults that cause tsunamis.”

Another use is for building out new energy infrastructure: Developers of wind farms need to know the underlying geological conditions before sinking the structures into the seafloor. “There are also economic needs as we transition to renewable energies. Wind farms require substantial mapping to build the wind turbines,” Jenkins says.

After completing sea trials during the next few weeks, the Surveyor’s first mission will be to sail from San Francisco to Hawaii. Along the way, it will map unexplored regions near a series of seamounts where fish and other marine life congregate. The Hawaii trip will also serve as a shakedown cruise for its new sensor package, which includes two multibeam sonars that emit multiple sound waves from a device under the ship. These sound waves then reflect off both the ocean floor and things in the water column, like bubbles or fish. As the sound waves bounce back to the ship, the multibeam echosounder receives the waves, interprets the data, and creates visualizations of the entire three-dimensional space beneath the ship. The Surveyor’s multibeam sonars can reach 7,000 meters deep (about 23,000 feet), which would cover the depth of most of the world’s oceans. There’s also a device called an acoustic Doppler current profiler that can detect the speed and direction of water currents down to 1,000 meters (3,280 feet).
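
To get a feel for what mapping at that depth involves, here is the basic echo-sounding arithmetic, assuming a typical speed of sound in seawater of about 1,500 meters per second; that value is an outside assumption, and real systems account for how sound speed varies with temperature, salinity, and pressure.

```python
# One round trip of a sonar "ping" at the Surveyor's maximum rated depth.
SOUND_SPEED_M_S = 1500      # assumption: typical speed of sound in seawater
depth_m = 7000              # the quoted maximum mapping depth

round_trip_s = 2 * depth_m / SOUND_SPEED_M_S
print(f"An echo from {depth_m} m returns after ~{round_trip_s:.1f} seconds")  # ~9.3 s
```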

Scientists have been diving to the seafloor in crewed submersibles for decades, vessels like the newly retooled Alvin that can carry three people down to 6,500 meters. While crewed vehicles allow researchers to get close to hydrothermal vents, erupting underwater volcanoes or unusual deepwater marine habitats, they can only stay down for a few hours and are much more expensive to operate than a drone like Saildrone Surveyor.

Oceanographers want to understand ocean current circulation to get a better idea of how heat and carbon are being absorbed from the atmosphere and then distributed throughout the ocean, Jenkins says. More than 90 percent of the heat trapped by carbon emissions is absorbed by the oceans, making their warmth an undeniable signal of the accelerating crisis. Researchers want to improve their estimates of the global heat and carbon budget—where heat and carbon are both stored and released—to better measure how fast the atmosphere and ocean are changing and what effects might be felt in the future. Saildrone Surveyor will be collecting current and temperature data with its onboard sensors.


The Plan to Build a Global Network of Floating Power Stations

The SL1 is meant to be attached to submersible sensor-laden research robots known as profiling floats. These devices collect data during short trips as far as a mile beneath the surface. When they emerge from the depths, they beam that information to a satellite. Today, there are thousands of profiling floats drifting through Earth’s oceans as part of an international program called Argo. They remain the best tool scientists have for remotely studying the upper ocean, but their life span and data collection are severely limited by their power sources.

All the floats in the Argo fleet are powered by lithium-ion batteries, which are typically good for only about five years or a few hundred dives. That reliance on batteries limits how often they can dive; a typical float profiles only once every 10 days. And after its battery dies, a float is usually abandoned, because the cost of retrieving it is higher than the cost of the device itself. Still, a float can cost as much as a new car, which makes each one an expensive piece of jetsam.

“Anything we put in the ocean is limited by its battery,” says Steve Jayne, a senior scientist at Woods Hole Oceanographic Institution, who isn’t involved with Seatrec. “If you had unlimited energy available to you, you might be able to profile every day instead of every 10 days.”
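
A rough look at the numbers in the two paragraphs above, treating the five-year lifetime and the dive cadences as the only constraints (a simplification):

```python
# How many profiles a five-year lifetime supports at different dive cadences.
DAYS_PER_YEAR = 365.25
battery_life_years = 5

dives_current = battery_life_years * DAYS_PER_YEAR / 10   # one dive every 10 days
dives_daily = battery_life_years * DAYS_PER_YEAR / 1      # one dive every day

print(f"~{dives_current:.0f} dives at the current cadence")      # ~183 over five years
print(f"~{dives_daily:,.0f} dives if profiling every day")       # ~1,826, far beyond "a few hundred"
```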

Seatrec’s ocean generator doesn’t produce a lot of energy—each charging cycle tops it up with about half the energy of a single AA alkaline battery—but that’s more than enough to meet the needs of the low-powered sensors typically found on profiling floats. For applications that require more power, Chao says, it’s possible to increase the size of the generator or simply daisy-chain smaller ones together. The floats are also designed to work in any ocean environment, whether they are trapped among Arctic ice floes or diving among sharks in the tropics. All it takes to adapt them to different regions is tweaking the chemistry of their waxy guts so they solidify and melt at the correct temperatures.
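
For a rough sense of scale, the sketch below assumes a usable energy of about 2.5 watt-hours for an AA alkaline cell and a hypothetical 50-milliwatt sensor load; both figures are assumptions, not Seatrec’s specifications.

```python
# Ballpark for the energy delivered per charging cycle and what it could run.
AA_ENERGY_WH = 2.5                   # assumption: usable energy of an AA alkaline cell
cycle_energy_wh = AA_ENERGY_WH / 2   # "about half the energy of a single AA"

sensor_draw_w = 0.05                 # assumption: a 50 mW low-power sensor package
hours_powered = cycle_energy_wh / sensor_draw_w
print(f"~{cycle_energy_wh:.2f} Wh per cycle, enough for ~{hours_powered:.0f} h at 50 mW")
```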

Chao hopes that Seatrec’s ocean generator will deliver on a promise first conceptualized in the 1980s by the renowned oceanographers Douglas Webb and Henry Stommel. They envisioned a globe-spanning fleet of missile-shaped underwater research robots called Slocum gliders that would explore the oceans with the same dexterity, autonomy, and longevity that we’ve come to expect from the robots that NASA sends to explore other planets. Like Seatrec’s SL1, these gliders would be powered by underwater temperature differences.

Although Webb, Stommel, and their collaborators made progress toward bringing a global fleet of Slocums into existence, their vision is still a work in progress, says Matt Palanza, a program engineer at the Woods Hole Oceanographic Institution’s Ocean Observatory Initiative who previously worked with Webb. Palanza’s team at the Ocean Observatory oversees the largest civilian fleet of Slocum gliders in the world—50 in total—and he says the reason there aren’t thousands patrolling the world’s oceans is simply a lack of funding. “The technology is there and continuously being developed,” he says.

Chao and the team at Seatrec believe that extending the vehicles’ life spans with limitless clean energy could drastically increase the size of the ocean research fleets. But the company isn’t the first to work on the technology. In 2003, Webb built a prototype thermal glider that used temperature differences to control its ascent and descent in the ocean, but still relied on batteries for its electronics. In 2008, a team led by researchers at Woods Hole successfully deployed a different glider prototype in the Caribbean that used ocean temperature differences to power an electric propulsion system. The following year, Chao and a team of researchers from NASA and the Scripps Institution of Oceanography rolled out Solo-Trec, the world’s first profiling float powered completely by electricity generated from temperature differences.


What Would It Take to Run a City on 100 Percent Clean Energy?

This story originally appeared on Grist and is part of the Climate Desk collaboration.

In 2014, Burlington, Vermont, the birthplace of Ben and Jerry’s ice cream and the stomping grounds of Senator Bernie Sanders, announced that it had reached an energy milestone. The city of 42,000, which hugs the shore of Lake Champlain, produced enough power from renewable sources to cover all its electricity needs. Burlington, the city government proclaimed, was one of America’s first “renewable cities.”

Since then, Burlington has been joined by Georgetown, Texas; Aspen, Colorado; and a few other small towns across the country. And though some cities have a head start—Burlington benefits from a huge amount of hydroelectric power and ample wood for biomass burning—many that rely on fossil fuels for power are joining in. Today, more than 170 cities and towns across the U.S. have promised to shift their power supply from coal and natural gas to solar, wind, and hydropower. St. Louis, which currently gets only 11 percent of its power from renewables, says that it will run purely on renewables by 2035; coal-dependent Denver has promised to do the same by 2030.
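
To see how steep the St. Louis timeline is, here is the compound-growth arithmetic, assuming a smooth ramp from roughly 11 percent today to 100 percent by 2035 and using 2020 as the starting year (an assumption for illustration).

```python
# Implied annual growth in the renewable share for the St. Louis pledge.
start_share, end_share = 0.11, 1.00
years = 2035 - 2020                     # assumption: counting from 2020

annual_growth = (end_share / start_share) ** (1 / years) - 1
print(f"The renewable share would need to grow ~{annual_growth:.0%} per year")  # ~16%
```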

“Cities are setting these goals and striving to go from a very small percentage of renewables to 100 percent on an extremely ambitious timeline,” said Lacey Shaver, city renewable energy manager at the World Resources Institute, via email. “It’s an exciting time for city energy work.”

But are 100 percent renewable cities actually … 100 percent renewable? The reality is a bit complicated—and it shows the challenges of true, “deep” decarbonization of electricity in the United States.

First, shifting to clean electricity doesn’t mean that a city zeroes out its carbon footprint—residents could still be driving gas-guzzling cars or heating their homes with natural gas. Even the claims of running on “clean” electricity come with caveats: What cities actually mean is that they purchase enough electricity from wind, solar, or other clean sources to balance out the power that they use over the course of the year. For places filled with renewables, like Vermont, that’s not such a big deal. But in other areas, a city might not be running entirely on renewable electricity in real time. Even when the sun isn’t shining and the wind isn’t blowing, electrons still need to be flowing through the grid to keep the lights on. And at the moment, a lot of that more consistent energy comes from non-renewable sources, mainly natural gas and coal.

“There’s really no city that operates as an island in electricity,” said Joshua Rhodes, a research associate at the University of Texas at Austin. “You’re going to be connected to a larger grid.” There’s no such thing as “fossil fuel electrons” and “renewable electrons”—all power mixes together once it reaches the grid. That means even a 100 percent renewable town might, from time to time, be sourcing its electricity from fossil fuels. Because of this, Rhodes says that goals to run purely on renewables are more like accounting mechanisms than a pure description of a city’s energy sources.
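
A minimal sketch of what “accounting mechanism” means in practice: annual net matching can report 100 percent renewable even when some hours are actually served by fossil generation. The hourly numbers below are invented purely for illustration.

```python
# Annual net accounting vs. hour-by-hour matching for a hypothetical city.
hourly_demand_mwh    = [100, 100, 100, 100]   # four example hours of demand
hourly_renewable_mwh = [160,  20, 180,  40]   # wind/solar generated in those hours

annual_match = sum(hourly_renewable_mwh) / sum(hourly_demand_mwh)
hourly_match = sum(min(d, r) for d, r in zip(hourly_demand_mwh, hourly_renewable_mwh)) / sum(hourly_demand_mwh)

print(f"Annual accounting: {annual_match:.0%} renewable")   # 100%
print(f"Hour-by-hour:      {hourly_match:.0%} renewable")   # 65%
```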

At the moment, this isn’t a big problem: Most cities have a long way to go even to get to that stage. The U.S. electricity grid is still over 60 percent powered by fossil fuels, and most cities get only around 15 percent of their power from renewables. When municipal governments buy renewable energy—even if they are still hooked into the larger grid—they add to the demand for wind and solar installations. But in the long run, experts say that this strategy is not going to get the country entirely off fossil fuels.
