What’s in Wildfire Smoke, and How Dangerous Is It?

By the time it reaches the East Coast, a California smoke plume will have changed in a number of ways: Because it’s spent so much time aloft, the bigger particles have fallen out, but new particles will have formed. And because the cloud has spawned ozone, “it can be extremely impactful if you already have some health condition, let’s say asthma,” says Tarik Benmarhnia, a climate change epidemiologist at UC San Diego’s Scripps Institution of Oceanography and School of Medicine.

The solid stuff in wildfire smoke may also contain nasties. “Some particulate matter has more heavy metals than others,” says Mary Prunicki, director of air pollution and health research at Stanford University’s Sean N. Parker Center for Allergy Research. “Lead for example, or cadmium. There’s also other types of cancer-causing toxins. There’s things like PAHs—polyaromatic hydrocarbons,” which are found in fossil fuels. Keep in mind that when a wildfire tears through a residential neighborhood, it’s burning through the synthetic materials that make up homes, cars, and everything else in the built environment. “I think a lot of times we don’t know, when we’re talking about residential areas burning, how much more toxic it is to human health,” Prunicki adds.

With this handy map, you can actually see a forecast of where the smoke will end up. On the left side of the map, click “Vertically Integrated Smoke” to see what’s loading the East Coast atmosphere right now. (Red indicates high levels, blue means low.) The “Surface Smoke” option shows what you’d actually be breathing. As you can see, a plume of bad air is snaking all the way to the Midwest, though at the moment not much of it is reaching the ground along the Eastern Seaboard. Which is not to say that it won’t—the weather could change and push the stuff down to ground level, at which point air quality will suffer.

Meanwhile, the West Coast has its own ozone problems because smoke has been recirculating through the region. “It’s staying in the same place, and you’re getting the same pollution from yesterday,” says Buchholz. The more time that goes by, “the more this ozone can be produced with sunlight.”

It’s not helping matters that the West Coast has been suffering extreme heat as these fires have burned—indeed, this and other consequences of climate change are supercharging blazes, because hotter temperatures and drier brush are making wildfires burn more intensely. That heat leads to the formation of yet more ozone at ground level. Hot air rises, so the fiercer the wildfire, the higher it propels smoke into the atmosphere to be carried across the US.

Wherever the smoke lands, we know it won’t be good for human health. “There’s a lot of literature in air pollution research showing associations with PM 2.5 and different types of diseases, in addition to shortened life expectancy, throughout the world,” Prunicki says. Her own research confirms that wildfire smoke specifically leads to inflammation in the lungs. She and her colleagues studied teenagers in Fresno, California, which suffers from bad air quality in general, but also endures blasts of wildfire smoke from forested areas to the east. “We looked at a group that was exposed to a wildfire versus not, and there was an increase in some of the systemic inflammatory biomarkers,” Prunicki says. “So we know that the smoke itself will cause systemic inflammation.” This is unhealthy for anyone, much more so for people with asthma or other respiratory issues.

Prunicki has also found that wildfire smoke causes an immune gene to be turned down, specifically one that produces what are known as T regulatory cells. “And T regulatory cells are needed to kind of have a healthy immune system,” Prunicki says. “It’s a good type of immune cell, not an inflammatory type of immune cell.”

Antarctic Glaciers Are Growing Unstable Above and Below Water

That study, published today in the Proceedings of the National Academy of Sciences, concludes that understanding how the ice field ruptures as it moves across the bedrock is vital to understanding when this collapse might occur. In addition to identifying the weak points in the glacier, Lhermitte and colleagues created a computer model to predict how such cracking and buckling could affect other Antarctic glaciers in the future.

Lhermitte says the goal of this model was not to predict the exact date when Thwaites will collapse. That’s next to impossible right now, because there are too many other unknown factors to consider, such as the pace of climate change that is warming both the air and water temperature around the glaciers, as well as the movement of ocean currents around Antarctica. (A 2014 study published in the journal Science by University of Washington scientists used satellite data and numerical modeling to predict that the West Antarctic Ice Sheet, including Thwaites, may collapse in 200 to 1,000 years.)

Instead, Lhermitte’s model is an attempt to incorporate ice sheet damage into similar global climate models that predict both sea level rise and the future of Antarctica’s glaciers. “The understanding of how much and how fast these glaciers are going to change is still unknown,” Lhermitte says. “We don’t know all the process. What we have done with this study is look at this damage, the tearing apart of these ice shelves, and what their potential contribution to sea level rise could be.”

Predicting glacier ice movement is difficult because ice behaves as both a solid and a liquid, says Penn State University professor of geosciences Richard Alley, who was not affiliated with any of these studies. Alley says the study about how glaciers fracture is both new and important because it gives more insight into how fast they might collapse. In an email to WIRED, Alley compared the science of studying how Antarctic glaciers move to the process of engineering a bridge.

“You do NOT want your bridge to break, and you do not want to need to predict exactly the conditions that will make it break, so you design with a large safety margin. We can’t ‘design’ Thwaites, so we face these large uncertainties. Quantifying parts of that is important, although remembering that this is still fracture mechanics, and it still might surprise us, one way or the other,” Alley wrote.

Lhermitte thinks his study results mean that Antarctic glaciers need to be closely watched in the coming years for any signs of rapid change that might lead to an environmental catastrophe. “They are these large sleeping giants,” Lhermitte says about Thwaites and Pine Island glaciers. “We start to be curious if they will stay sleeping or awake with large consequences, with sea level rise.”


Asbestos Removal Is a Hard Job, but Covid-19 Makes It Harder

This story originally appeared on Grist and is part of the Climate Desk collaboration.

It was just before dawn as seven bulky men in T-shirts and sweatpants gathered in front of a towering glass building on Lexington Avenue in New York City. Marcelo Crespo, a 41-year-old with gleaming green eyes and a goatee, beckoned the group over to a white company van, handing each man a pile of protective gear: face masks and respirators, full-body coveralls, shoe covers, hard hats, and masking tape.

Clutching their bundles, the men entered through the back door of the building, taking the utility elevator up 32 floors to the roof. The day before, they had sealed up the workspace like an enormous Ziploc bag, covering a large section of the roof with protective plastic structures to shield it from the open air. Before passing through the clear sheeting, Crespo rattled the scaffolding, checking its stability. He traced a sign of the cross on his chest and whispered a prayer that God keep them all safe. Warning signs plastered the makeshift walls, boxes, and equipment. Caution. Danger. Authorized personnel only.

It could have been a scene from the movie Outbreak, but the job took place several months before the Covid-19 pandemic gripped Manhattan. With every breath, the men were still risking serious health problems—even death—as a result of the microscopic particles of asbestos swirling in the air.

Asbestos abatement workers were deemed essential long before the pandemic. Property owners are legally required to call abatement teams in to remove asbestos any time there’s construction, renovation, or retrofitting. Across the United States, during the coronavirus pandemic, some asbestos jobs have even accelerated as several cities are taking advantage of the closures of public spaces to schedule renovations. And there’s a lot more of that on the post-coronavirus horizon: New York City’s Climate Mobilization Act, which was passed last spring, includes a mandate that the city’s biggest buildings reduce their overall emissions by 40 percent by 2030 and 80 percent by 2050 by installing new windows, insulation, and other retrofits to become more energy efficient.

But while the timing makes sense for cities, it’s not so great for abatement workers, whose occupational risks make them especially vulnerable to serious complications of Covid-19.

Judging from its physical properties alone, asbestos is useful stuff: The naturally occurring mineral’s long, fibrous crystals absorb sound and resist fire, heat, and electricity. In ancient Greek, the word for “asbestos” means “inextinguishable.” By the late 19th century, businesses in Europe and North America were competing for rights to mine it. Asbestos turned up everywhere: in concrete, bricks, pipes, flooring, roofing, and couches. It was used as insulation in schools, hospitals, and theaters. Asbestos was used as snow on movie sets in the 1930s, blanketing Dorothy in The Wizard of Oz.

As it grew in popularity, doctors noticed that relatively young asbestos miners were short of breath, suffering from a condition called pulmonary fibrosis. When asbestos fibers become airborne, the small, needle-like filaments can enter the body through the lungs and skin, accumulating in internal organs and building up scar tissue over decades. By the time symptoms show up, people might already have permanent lung disease, genetic damage, or cancerous growths.

In the US, around 39,000 workers die every year from asbestos-related diseases. About 3,000 of these deaths are from mesothelioma, a malignant form of cancer linked to asbestos exposure. And it doesn’t take much: “Mesothelioma can occur at relatively low levels of exposure,” said Victor Roggli, a professor of pathology at Duke University.

The Bay Area Just Turned Orange. All Eyes Are on PurpleAir

There are a few very good reasons why the BAAQMD’s devices sacrifice speed for accuracy, and why they rely on a system calibrated to what environmental agencies across the nation are using. “When the EPA, the federal government, makes decisions about air quality on a national level, they can say with some level of confidence that the network in New York is giving you the same type of information as a network in the Bay Area,” says Michael Flagg, principal air quality specialist at the BAAQMD.

This data has to hold up in court when, say, the government needs to prove a company is polluting a given area. Accordingly, the feds have strict policies in place for these AQI-testing machines. “They have to meet certain EPA siting requirements: They have to be greater than 10 meters away from trees. They have to have unobstructed airflow,” says Flagg. “And also the regulatory data undergoes rigorous quality assurance and quality control to ensure the data is accurate.”

PurpleAir’s sensors don’t have to meet these strict rules. People can put them anywhere, including places an air quality expert would know to avoid. Owners might be placing them near chimneys, for instance, throwing off the readings for wildfire smoke. But what PurpleAir might lack in accuracy, it makes up in sheer numbers: AirNow.gov’s map shows one monitor in San Francisco, while PurpleAir’s map shows dozens of monitors within a square mile of my apartment. If one monitor is showing a wildly aberrant AQI reading, and all the others nearby are in general agreement, you get a kind of accuracy by way of averages—and you’re getting it in real time.

“This network is designed to know what the quality is right now,” says Dybwad, of PurpleAir. “And also by virtue of how many there are, you can then say, ‘Look, this one over here is reading, let’s say, green, and I don’t believe that because all of these others are reading orange.’ So just by sheer numbers, it becomes very persuasive in terms of the fact that they all agree.”
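
The consensus logic Dybwad describes, trusting the neighborhood's agreement over any single monitor, can be sketched as a simple outlier check. This is purely illustrative: the median-based rule and the 50 percent tolerance are my assumptions, not PurpleAir's actual algorithm.

```python
from statistics import median

def consensus_reading(readings, tolerance=0.5):
    """Combine AQI readings from nearby monitors, discarding outliers.

    readings: AQI values reported by monitors in one neighborhood.
    tolerance: fractional deviation from the local median beyond which
               a sensor is treated as aberrant (an assumed threshold).
    """
    m = median(readings)
    trusted = [r for r in readings if abs(r - m) <= tolerance * m]
    return sum(trusted) / len(trusted)

# One monitor reads "green" (AQI 40) while its neighbors all read
# "orange" (around 150), so the lone green reading is discarded.
print(round(consensus_reading([152, 148, 160, 40, 155])))  # -> 154
```

With dozens of monitors per square mile, even a crude rule like this converges on a trustworthy neighborhood value in real time.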

And just because PurpleAir’s monitors aren’t as accurate as BAAQMD’s doesn’t mean the agency’s staffers scoff at the data. Quite the opposite, in fact. “The regulatory monitoring network is kind of the backbone of our decisionmaking, and we do that because we can trust the data are accurate,” says Flagg. “And with PurpleAir, we use that data in a qualitative sense. It can be really good at understanding if concentrations are increasing rapidly or decreasing, or if one area is experiencing poor air quality compared to a different area, and things like that. What PurpleAir can be good for is looking at the spatial distribution of smoke during a wildfire, like we’re experiencing now.”

All that data may also be useful in another way, says Adrienne Heinz, a research psychologist at the National Center for PTSD: It’s oddly compelling. For me and many others hunkered down in the orange gloom, relentlessly updating our PurpleAir and AirNow.gov maps offers a way to grasp at some kind of certainty—any kind of certainty—as the Bay Area suffers through this historic collision of disasters. “The more that you can put data into the hands of users, it can be comforting,” says Heinz, who studies the effects of disasters like wildfires and the Covid-19 pandemic. “Obviously, there’s a threshold, right? Like checking PurpleAir 20 times a day, that’s not helpful. But anything that can put it in the hands of consumers and citizens helps us all come together to make more informed decisions.” So people can, for instance, time their forays outdoors for when the air quality improves.

Your Beloved Blue Jeans Are Polluting the Ocean—Big Time

The researchers looked at sediment samples from several habitats, including the deep Arctic Ocean, shallow suburban lakes around Toronto, and Lakes Huron and Ontario. The mean number of microfibers they found per kilogram of dry sediment in each group was, respectively, 1,930, 2,490, and 780. Of those microfibers, 22 to 51 percent were anthropogenically modified cellulose, and of that fraction, 41 to 57 percent were indigo denim microfibers. In other words, that’s a lot of denim in the environment. “I think what’s interesting is that a majority of these fibers that we were finding were these anthropogenic cellulose fibers, even in the deep ocean sediments,” says Athey. “And that shows that they are sufficiently persistent to accumulate in these remote regions.”
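
Those nested percentages are easier to grasp as arithmetic. A rough midpoint estimate looks like this (illustrative only; the study reports ranges, and collapsing them to single midpoint values is my assumption):

```python
def denim_fibers_per_kg(total_fibers, cellulose_frac, denim_frac):
    """Estimate indigo denim microfibers per kg of dry sediment.

    total_fibers: mean microfibers found per kg of sediment.
    cellulose_frac: share that was anthropogenically modified cellulose.
    denim_frac: share of that cellulose identified as indigo denim.
    """
    return total_fibers * cellulose_frac * denim_frac

# Midpoints of the reported ranges (22-51% cellulose, 41-57% denim):
cellulose_mid = (0.22 + 0.51) / 2   # 0.365
denim_mid = (0.41 + 0.57) / 2       # 0.49

for site, fibers in [("Arctic", 1930), ("Toronto lakes", 2490), ("Great Lakes", 780)]:
    print(site, round(denim_fibers_per_kg(fibers, cellulose_mid, denim_mid)))
```

Run with those midpoints, even the remote Arctic samples work out to a few hundred denim fibers in every kilogram of sediment.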

To be sure that they were characterizing the denim fibers correctly, the scientists ran a separate experiment in the lab, washing three different kinds of blue denim made from 99 or 100 percent cotton: used jeans, new regular jeans, and new mildly distressed jeans. (That meant no more than three holes and some fraying.) They captured their washing machine’s effluent and counted up the fibers.

Consistent with similar studies from other groups, they found that the new jeans shed more fibers than used jeans—which makes sense, as old jeans have long since shed the loose fibers left over from the manufacturing process. But weirdly, they didn’t find a significant difference between the regular new jeans and the mildly distressed new jeans, which you might assume would shed more, given the fraying. “If you have an extremely distressed pair of jeans, they might release a bit more,” says Athey. “But then it could also be the type of material.” Past studies have looked at more synthetic clothes, which probably shed differently than pure cotton. Regardless, Athey and her colleagues landed on a startling figure: A single pair of jeans may release 56,000 microfibers per wash.

The researchers also collected effluent from two wastewater treatment plants, which filter out some, but not all, microfibers before pumping the water into Lake Ontario. (Treatment plants elsewhere pump their effluent out to sea instead.) This landed them at an even more startling figure: Those two plants alone could be unloading a billion indigo denim microfibers per day into the lake. That’s in keeping with the country’s washing habits, as about half of the Canadian population wears jeans almost every day and the average Canadian washes their jeans after just two wears.

Wastewater plants actually do a decent job of sequestering microfibers in the solid “sludge” of human waste, which is processed into “biosolids” that farmers often use as fertilizer. Unfortunately, packing the microfibers into fertilizer may well be giving them another pathway to get into the sea. As the fertilizer dries on the fields, the wind might pick up the blue jean fibers, and any number of synthetic ones, and deposit them in the ocean for scientists to later find in the sediment. Studies have already shown that microfibers can fly hundreds if not thousands of miles, landing in formerly pristine habitats like the Arctic.

Overall, the problem is that wastewater facilities weren’t designed to capture all these microfibers. They’re catching between 83 and 99 percent of them, but even letting a few percent through is a veritable torrent, given their volume. “The thing is, there’s so many people on the planet—there’s just too many of us,” says University of Toronto environmental scientist Miriam Diamond, coauthor on the paper. “And I think what’s astonishing is how many of us wear jeans. It’s not an indictment of jeans—I want to be really clear that we’re not coming down on jeans. It’s just a really potent example of human impact.”
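
It is easy to underestimate what "letting a few percent through" means at this scale. Working backward from the study's billion-fiber figure across the reported capture range gives a sense of the volumes involved (a back-of-the-envelope sketch, not the paper's method):

```python
def fibers_entering_plant(released_per_day, capture_rate):
    """Infer how many microfibers arrive at a treatment plant daily,
    given what escapes downstream and the fraction captured in sludge."""
    return released_per_day / (1 - capture_rate)

released = 1_000_000_000  # ~1 billion denim fibers/day reaching Lake Ontario

# At 83% capture, ~6 billion fibers arrive daily; at 99% capture,
# the same escaped billion implies ~100 billion arriving.
for rate in (0.83, 0.99):
    arriving = fibers_entering_plant(released, rate)
    print(f"capture {rate:.0%}: about {arriving:.1e} fibers arrive per day")
```

The better the plants are at capture, the larger the implied inflow behind any given escaped amount, which is why even a 99 percent capture rate still leaves a torrent.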

A Beautiful Yet Grim Map Shows How Wildfire Smoke Spreads

Now, in the map menu, click on “Vertically Integrated Smoke.” Instead of measuring smoke around 8 meters off the ground, it’s modeling what a 25-kilometer-high column of air looks like in a given place in the US. (Think of it as the smoke you can see in the sky, versus the near-surface smoke, which is the stuff you’re actually breathing.) As you can see, on this scale, smoke now covers most of the country.
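
The distinction between the two map layers is essentially an integral: near-surface smoke samples the lowest model layer, while vertically integrated smoke sums concentration times thickness over every layer of the 25-kilometer column. A toy illustration, with invented concentrations and layer depths rather than HRRR's actual grid:

```python
# Each (concentration, thickness) pair is one model layer: smoke density
# in micrograms per cubic meter and layer depth in meters. Values invented.
layers = [
    (5.0, 100),     # near the surface: relatively clean air
    (80.0, 1000),   # an elevated plume riding above the boundary layer
    (20.0, 5000),   # thinner smoke spread through the mid-column
    (0.5, 18900),   # mostly clear air the rest of the way up to 25 km
]

near_surface = layers[0][0]  # what you would actually breathe

# Column loading: sum of concentration x depth over the whole column,
# converted from ug/m^2 to mg/m^2.
column_loading = sum(c * dz for c, dz in layers) / 1000

print(f"near-surface: {near_surface} ug/m^3")
print(f"column loading: {column_loading:.1f} mg/m^2")
```

A column can carry a heavy smoke load, dominated here by the elevated plume, while the air at nose height stays relatively clean, which is exactly the East Coast pattern the map shows.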

To map this out, HRRR considers the infrared intensity of those fires and projects how much smoke a fire is producing. That smoke starts off in the eddies of what atmospheric scientists call the boundary layer. “It’s the layer through which you feel the bumps when you land in an airplane in the late afternoon anywhere in the country,” says Benjamin. “But then some of that air gets mixed further up above the boundary layer, and then it encounters stronger horizontal winds.” This transports the smoke from west to east.

And as that smoky air moves across the nation, Benjamin adds, “it gets mixed down from that turbulent mixing that takes place in the daytime. And that’s how you get some of that to show back up now near the surface.”

Click back to the “Near Surface Smoke” option, and you can see that only a small fraction of those smoke particles are actually falling out of the atmosphere and reaching the ground on the Eastern Seaboard. So unlike Bay Area residents, you’ve got relatively little to worry about, at least for now, if you’re in New York or Philadelphia. “It’s orders of magnitude difference,” Benjamin tells me, the unfortunate Californian. “You’re getting creamed.”

But even though HRRR is still experimental, it has quickly become a critical tool for meteorologists and atmospheric scientists, because no one has been able to forecast smoke plumes like this before. Previously, researchers could only look at satellite images to see where smoke was at a given moment. “This is really the first resource that was out there that tells you something about where the smoke you see comes from, really, and what the forecast is,” says atmospheric scientist Joost de Gouw of the Cooperative Institute for Research in Environmental Sciences at the University of Colorado, Boulder.

That’s helped de Gouw plan his experiments, in which he takes atmospheric measurements from aircraft to study how smoke changes chemically as it makes its way through the air. If he knows where the smoke is heading, he knows where to take samples. “Most people, when they think about smoke, they think about smoke particles,” de Gouw says. “But also a lot of gases come along with the smoke, and a lot of those gases are highly reactive—they change on a timescale of hours.”

Are Radioactive Diamond Batteries a Cure for Nuclear Waste?

In the summer of 2018, a hobby drone dropped a small package near the lip of Stromboli, a volcano off the coast of Sicily that has been erupting almost constantly for the past century. As one of the most active volcanoes on the planet, Stromboli is a source of fascination for geologists, but collecting data near the roiling vent is fraught with peril. So a team of researchers from the University of Bristol built a robot volcanologist and used a drone to ferry it to the top of the volcano, where it could passively monitor its every quake and quiver until it was inevitably destroyed by an eruption. The robot was a softball-sized sensor pod powered by microdoses of nuclear energy from a radioactive battery the size of a square of chocolate. The researchers called their creation a dragon egg.

Dragon eggs can help scientists study violent natural processes in unprecedented detail, but for Tom Scott, a materials scientist at Bristol, volcanoes were just the beginning. For the past few years, Scott and a small group of collaborators have been developing a souped-up version of the dragon egg’s nuclear battery that can last for thousands of years without ever being charged or replaced. Unlike the batteries in most modern electronics, which generate electricity from chemical reactions, the Bristol battery collects particles spit out by radioactive diamonds that can be made from reformed nuclear waste.

Earlier this month, Scott and his collaborator, a chemist at Bristol named Neil Fox, created a company called Arkenlight to commercialize their nuclear diamond battery. Although the fingernail-sized battery is still in a prototyping phase, it’s already showing improvements in efficiency and power density compared to existing nuclear batteries. Once Scott and the Arkenlight team have refined their design, they’ll set up a pilot facility to mass produce them. The company plans for its first commercial nuclear batteries to hit the market by 2024—just don’t expect to find them in your laptop.

Conventional chemical or “galvanic” batteries, like the lithium-ion cells in a smartphone or the alkaline batteries in a remote, are great at putting out a lot of power for a short amount of time. A lithium-ion battery can only operate for a few hours without a recharge, and after a few years it will have lost a substantial fraction of its charge capacity. Nuclear batteries or betavoltaic cells, by comparison, are all about producing tiny amounts of power for a long time. They don’t put out enough juice to power a smartphone, but depending on the nuclear material they use, they can provide a steady drip of electricity to small devices for millennia.

“Can we power an electric vehicle? The answer is no,” says Morgan Boardman, Arkenlight’s CEO. To power something that energy hungry, he says, means “the mass of the battery would be significantly greater than the mass of the vehicle.” Instead, the company is looking at applications where it is either impossible or impractical to regularly change a battery, such as sensors in remote or hazardous locations at nuclear waste repositories or on satellites. Boardman also sees applications that are closer to home, like using the company’s nuclear batteries for pacemakers or wearables. He envisions a future in which people keep their batteries and swap out devices, rather than the other way around. “You’ll be replacing the fire alarm long before you replace the battery,” Boardman says.

Unsurprisingly, perhaps, many people don’t relish the idea of having something radioactive anywhere near them. But the health risks from betavoltaics are comparable to the health risks of exit signs, which use a radioactive material called tritium to achieve their signature glow. Unlike gamma rays or other more dangerous types of radiation, beta particles can be stopped in their tracks by just a few millimeters of shielding. “Usually just the wall of the battery is sufficient to stop any emissions,” says Lance Hubbard, a materials scientist at Pacific Northwest National Laboratory who is not affiliated with Arkenlight. “The inside is hardly radioactive at all, and that makes them very safe for people.” And, he adds, when the nuclear battery runs out of power, it decays to a stable state, which means no leftover nuclear waste.

The Biblical Flood That Will Drown California

In their model, 25 days of relentless rains overwhelm the Central Valley’s flood-control infrastructure. Then large swaths of the northern part of the Central Valley go under as much as 20 feet of water. The southern part, the San Joaquin Valley, gets off lighter; but a miles-wide band of floodwater collects in the lowest-elevation regions, ballooning out to encompass the expanse that was once the Tulare Lake bottom and stretching to the valley’s southern extreme. Most metropolitan parts of the Bay Area escape severe damage, but swaths of Los Angeles and Orange Counties experience “extensive flooding.”

As Jones stressed to me in our conversation, the ARkStorm scenario is a cautious approximation; a megastorm that matches 1862 or its relatively recent antecedents could plausibly bury the entire Central Valley underwater, northern tip to southern. As the report puts it: “Six megastorms that were more severe than 1861–1862 have occurred in California during the last 1800 years, and there is no reason to believe similar storms won’t occur again.”

A 21st-century megastorm would fall on a region quite different from gold rush–era California. For one thing, it’s much more populous. While the ARkStorm reckoning did not estimate a death toll, it warned of a “substantial loss of life” because “flood depths in some areas could realistically be on the order of 10–20 feet.”

Then there’s the transformation of farming since then. The 1862 storm drowned an estimated 200,000 head of cattle, about a quarter of the state’s entire herd. Today, the Central Valley houses nearly 4 million beef and dairy cows. While cattle continue to be an important part of the region’s farming mix, they no longer dominate it. Today the valley is increasingly given over to intensive almond, pistachio, and grape plantations, representing billions of dollars of investments in crops that take years to establish, are expected to flourish for decades, and could be wiped out by a flood.

Apart from economic losses, “the evolution of a modern society creates new risks from natural disasters,” Jones told me. She cited electric power grids, which didn’t exist in mid-19th-century California. A hundred years ago, when electrification was taking off, extended power outages caused inconveniences. Now, loss of electricity can mean death for vulnerable populations (think hospitals, nursing homes, and prisons). Another example is the intensification of farming. When a few hundred thousand cattle roamed the sparsely populated Central Valley in 1861, their drowning posed relatively limited biohazard risks, although, according to one contemporary account, in post-flood Sacramento, there were a “good many drowned hogs and cattle lying around loose in the streets.”

Today, however, several million cows are packed into massive feedlots in the southern Central Valley, their waste often concentrated in open-air liquid manure lagoons, ready to be swept away and blended into a fecal slurry. Low-lying Tulare County houses nearly 500,000 dairy cows, with 258 operations holding on average 1,800 cattle each. Mature modern dairy cows are massive creatures, weighing around 1,500 pounds each and standing nearly 5 feet tall at the front shoulder. Imagine trying to quickly move such beasts by the thousands out of the path of a flood—and the consequences of failing to do so.

A massive flood could severely pollute soil and groundwater in the Central Valley, and not just from rotting livestock carcasses and millions of tons of concentrated manure. In a 2015 paper, a team of USGS researchers tried to sum up the myriad toxic substances that would be stirred up and spread around by massive storms and floods. The cities of 160 years ago could not boast municipal wastewater facilities, which filter pathogens and pollutants in human sewage, nor municipal dumps, which concentrate often-toxic garbage. In the region’s teeming 21st-century urban areas, those vital sanitation services would become major threats. The report projects that a toxic soup of “petroleum, mercury, asbestos, persistent organic pollutants, molds, and soil-borne or sewage-borne pathogens” would spread across much of the valley, as would concentrated animal manure, fertilizer, pesticides, and other industrial chemicals.

Why Hurricane Laura’s Storm Surge Could Be ‘Unsurvivable’

Having strengthened with astonishing speed into a Category 4 storm Wednesday, Hurricane Laura will make landfall in Texas and Louisiana sometime early Thursday morning. With the landfall comes a dreaded storm surge—a rise in water level generated by a storm—that scientists say could spread seawater up to 30 miles inland, an inundation the National Hurricane Center just called “unsurvivable.”

The surge will be particularly dangerous along the coast, but it will remain a threat as the water moves inland. “You have very large currents, very large and dangerous waves pretty far inland along the immediate coastline,” says Brian Zachry, Joint Hurricane Testbed director at the National Hurricane Center. “And if you’re talking about a surge of 15 to 20 feet with very large waves, you just can’t survive that.”

“Even if you go inland,” Zachry adds, “as water gets over the tops of banks of rivers and other estuaries and such, that water can also have some velocity to it. As you see in flash flooding from rainfall, you can get swept away in that.”

For context, 2005’s Hurricane Katrina, which peaked as a Category 5 storm, had an 18- to 23-foot storm surge. “This storm looks like it will be comparable as far as the levels of storm surge that we’re seeing,” says Mike Chesterfield, a meteorologist at the Weather Channel.

The size of a hurricane’s storm surge depends on a number of factors, “which makes the prediction of storm surge difficult until close to landfall,” writes Katie Peek, a coastal research scientist at Western Carolina University, in an email to WIRED. This includes wind speeds, how fast the storm itself is moving, and atmospheric pressure. “Where a storm makes landfall is also important, as shallower waters offshore and the shape of the coast play a part as well,” Peek writes. “In the case of Laura, the storm is moving through warm, shallow waters and projected to make landfall near an embayment (the shoreline is concave like a bowl) which can cause the waters to further ‘pile up’ along the shore.”

And it isn’t just the fact that the hurricane’s winds are pushing water horizontally onto shore—the storm actually lifts the water vertically. “In the center of a hurricane, you get incredibly low pressures, which actually allows a little bubble to form underneath the hurricane,” says Chesterfield. “The winds come and pick up that water and just pile it up on land. It’s a smaller factor when compared to wind, but it definitely does play a role.”

Not helping matters is the fact that warm water—which is particularly warm in the Gulf of Mexico right now—physically expands, taking up more space than cold water. And this storm could arrive during high tide, which might also add a bit to the surge.

That could mean a veritable wall of water barreling inland, overwhelming anything in its path. “Storm surge itself is and does remain the deadliest aspect of hurricanes,” says Chesterfield. “If you put yourself in a situation where there’s even 10 feet of storm surge, chances of you getting out in one piece are fairly small. But when you get up to 20 feet, there is no home structure, anyway, that’s going to keep you safe.”

This is particularly problematic where Laura could hit—in low-lying parts of Louisiana like the small cities of Houma and Morgan City. And across the state’s coast, inlets and river channels can carry the water farther inland. “You’re on the swamp, essentially,” says Jeremy Porter, head of research and development at First Street Foundation, which analyzes flood risk in the United States. Small cities are not well suited to fend off a storm surge like this. “They just don’t have the infrastructure, because they’re less populated,” Porter adds. “So there’s risk in having a lot of population, but there’s also risk of not, because you don’t have the tax base to build the infrastructure to actually protect yourself from these types of events.”

A California Wildfire Nearly Destroyed the Historic Lick Observatory

(WIRED reached out to Cal Fire, but they were not able to provide comment before press time due to California’s ongoing wildfires.)

Crews are still monitoring the area around the observatory, given that this group of wildfires, known as the SCU Lightning Complex, continues to blacken the region. As of August 26, the complex is only 25 percent contained, and it has so far chewed through 365,000 acres, or 570 square miles. But it appears the biggest threat to the historic Lick Observatory has passed.
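The acreage math checks out: at 640 acres to the square mile, a quick sanity check (the figures are from the containment report above) gives:

```python
# Convert the SCU Lightning Complex's burned area to square miles.
# 1 square mile = 640 acres.
acres_burned = 365_000
square_miles = acres_burned / 640
print(round(square_miles))  # roughly 570
```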

Completed in 1888 with a bequest from real estate mogul James Lick, then California's richest man, Lick was the world's first mountaintop observatory. "If you go to Paris, the observatory's in the middle of Paris. If you go to Bologna, the observatory's in the middle of Bologna," says Claire Max, director of the University of California Observatories, which runs Lick. "And so this was the first observatory that really took advantage of being at a high site, where there's both less pollution and less light pollution, and clearer air."

You might be thinking, “Well, wait a minute. That was 130 years ago—now the observatory is plopped right above one of the most densely populated areas of the US!” How could Lick compete with the likes of the Hubble Space Telescope, which orbits far above both light pollution and smog? And it’s true that earthbound telescopes have some disadvantages. “For telescopes on the ground, light traveled through literally billions of light years of undisturbed space, and then in the last hundred kilometers, it gets blurred out by turbulence in the air,” says Max.

[Image: burning forests seen from the observatory's cameras. Courtesy of Lick Cameras]

The solution is lasers. Lick's Shane telescope wields an adaptive optics system, which fires a laser beam into the atmosphere to create an artificial guide star. "Adaptive optics measures the turbulence hundreds or thousands of times a second, and then changes the shape of a special mirror, called the deformable mirror, to take away the blurring," says Max.

The Shane telescope also has a larger mirror than Hubble: 3 meters compared to 2.4 meters. So if there happens to be a bright enough star near your viewing target, Max says, "you can get the same kind of spatial resolution that Hubble gets, because our telescope is more or less the same size, and we're taking away the blurring of the atmosphere with adaptive optics."
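Max's point about mirror size follows from the diffraction limit: once adaptive optics strips away atmospheric blurring, a telescope's best possible angular resolution scales with wavelength divided by aperture diameter. A minimal sketch (the 550-nanometer observing wavelength is an illustrative assumption, not a figure from the article):

```python
# Diffraction-limited angular resolution: theta = 1.22 * wavelength / aperture.
WAVELENGTH_M = 550e-9       # visible light, ~550 nm (assumed for illustration)
RAD_TO_ARCSEC = 206_265     # arcseconds per radian

def resolution_arcsec(aperture_m):
    """Best-case angular resolution, in arcseconds, for a given mirror diameter."""
    return 1.22 * WAVELENGTH_M / aperture_m * RAD_TO_ARCSEC

shane = resolution_arcsec(3.0)    # Lick's Shane telescope
hubble = resolution_arcsec(2.4)   # Hubble Space Telescope
# Shane's larger mirror gives a slightly finer limit: about 0.046 arcseconds
# versus about 0.058 for Hubble at this wavelength.
```

The bigger mirror actually gives Shane a marginally sharper theoretical limit than Hubble; the catch, as Max notes, is that hitting it requires adaptive optics and a bright enough reference star.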

Not only can the Lick Observatory get astronomers great images—even with all the light pollution and disturbance of the atmosphere above—it’s also more accessible for researchers. “It’s a cutthroat competition to get observing time on Hubble,” Max says. “I don’t know what the fraction of success is, but it’s very low. And then if you do get success, you don’t get very much time to do your observations, because everybody else is chomping at the bit.”
