Via New York Times, a look at how scientists hope to use technology to help save whales from ship collisions:
Fran washed ashore in August, some 25 miles south of the Golden Gate Bridge. The beloved and much-photographed female humpback whale had a broken neck, most likely the result of being hit by a ship.
This latest instance of oceanic roadkill increased the tally of whales killed by ships near San Francisco this year to four. The true death toll is likely to be much higher as whale carcasses often sink to the sea floor.
Scientists and conservationists are trying to drive that number to zero. On Wednesday, Whale Safe, an A.I.-powered detection system, began operating around San Francisco Bay. Its goal is to warn large ships in the area’s waters when whales are nearby.
About 25 miles out to sea from the Golden Gate on Monday afternoon, a yellow buoy bobbed not far from the great white shark hunting grounds of the Farallon Islands. On a boat close by called the Nova, Douglas McCauley, director of the Benioff Ocean Initiative at the University of California, Santa Barbara, donned a wet suit and snorkel and jumped into the brine to give the buoy some T.L.C. before its big day. The buoy, tethered to an underwater microphone, is an integral part of Whale Safe.
Researchers estimate more than 80 endangered blue, humpback and fin whales are killed by ships each year along the West Coast. With increasing global marine traffic, the problems created by thousands of massive ships crisscrossing waters that teem with ocean giants are expected to only worsen. Near San Francisco in particular, climate change has been shifting the whales’ food closer to shore, placing the whales in harm’s way more often, according to Kathi George, field operations manager for the Marine Mammal Center in Sausalito, Calif.
That’s why Dr. McCauley and a network of collaborators developed Whale Safe with funding from Marc Benioff, founder of Salesforce, and his wife, Lynne. Whale Safe, which has been operating in the Santa Barbara Channel since 2020, provides near-real-time data on the presence of whales and sends out alerts to mariners, shipping companies and anyone else who signs up. The hope is that if ship captains get an alert saying there are lots of whales in the area, they might be more likely to shift course or slow their approach to port — a tactic that research suggests makes deadly collisions less likely.
“The near-real-time aspect of Whale Safe’s alerts and being able to have an idea of where whales are 24 hours a day is really unique and gives us a lot more information to share with ships coming in and out of the Bay,” said Maria Brown, superintendent of the Cordell Bank and Greater Farallones National Marine Sanctuaries for the National Oceanic and Atmospheric Administration.
Expanding Whale Safe from the Southern California shipping lanes to San Francisco will cover the two busiest hubs in California and two epicenters of whale mortality from ship strikes.
In 2021, the first full year of Whale Safe’s operation in the Santa Barbara Channel, there were no recorded whale-ship interactions in the area, which Dr. McCauley called an encouraging sign.
Whale Safe also uses publicly available location data transmitted by ships to determine whether they slow down to 10 knots during trips through the whales’ feeding grounds, something NOAA has been asking large ships to do during whale season (usually May to November off California) since 2014. Whale Safe processes the information on vessel speed and assigns shipping companies a letter grade.
Maersk, one of the world’s largest shipping companies, earned a “B” for slowing down 79 percent of the time in the Santa Barbara Channel. But ships operated by Matson, a major player in Pacific shipping, slowed only 16 percent of the time and received an “F.”
A spokesperson for Matson said the company had long instructed its ships to participate in NOAA’s voluntary speed reduction programs “to the greatest extent possible, given our operational requirements. A large percentage of our vessels have been averaging less than 12 knots.”
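The article doesn't publish Whale Safe's grading cutoffs, but the mechanics are simple enough to sketch. Below is a minimal Python illustration of how a cooperation rate might be computed from transit data and mapped to a letter grade; the thresholds are assumptions chosen to be consistent with the two examples above, not Whale Safe's actual scale.

```python
# Hypothetical sketch of speed-grade scoring. Thresholds are illustrative
# assumptions (consistent with 79% -> "B" and 16% -> "F" above), not
# Whale Safe's published cutoffs.
from dataclasses import dataclass

@dataclass
class Transit:
    operator: str
    avg_speed_knots: float  # average speed through the speed-reduction zone

def cooperation_rate(transits, limit_knots=10.0):
    """Fraction of an operator's transits at or below the voluntary limit."""
    slowed = sum(1 for t in transits if t.avg_speed_knots <= limit_knots)
    return slowed / len(transits)

def letter_grade(rate):
    for cutoff, grade in [(0.9, "A"), (0.75, "B"), (0.6, "C"), (0.45, "D")]:
        if rate >= cutoff:
            return grade
    return "F"

# e.g., letter_grade(0.79) == "B"; letter_grade(0.16) == "F"
```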
On Monday afternoon at the buoy, Dr. McCauley used a kitchen scrubber and a plastic putty knife to scrape away algae and checked that various instruments were intact. The device's underwater microphone was positioned some 280 feet beneath his flippers, listening for whales from the sea bottom and attached to its floating counterpart's communications array with a beefy rubber-clad cable. This high-tech buoy was developed by Mark Baumgartner of the Woods Hole Oceanographic Institution in Massachusetts, and his team is using the same technology to listen for critically endangered North Atlantic right whales along the East Coast.
Whale Safe uses three data streams: the buoy listens for and identifies the songs of blue, fin and humpback whales with an algorithm and beams its findings to a satellite; a mathematical model informed by present and past oceanographic and biological data predicts where blue whales are most likely to be; and citizen scientists and trained observers report whale sightings via an app called Whale Alert.
Whale Safe’s platform integrates these data sources and alerts ships to their likelihood of encountering whales that day.
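The article doesn't detail how the three streams are weighted, but a toy fusion step might look like the following sketch. The weights, caps, and level names are all assumptions for illustration, not the project's actual method.

```python
# Illustrative fusion of Whale Safe's three data streams into a daily
# whale-presence level. Weights, caps, and level names are assumptions.

def daily_whale_presence(acoustic_detections: int,
                         model_probability: float,
                         sightings: int) -> str:
    score = min(acoustic_detections, 5) / 5.0   # buoy song detections, capped
    score += model_probability                  # 0..1 from blue-whale habitat model
    score += min(sightings, 10) / 10.0          # Whale Alert app reports, capped
    if score >= 2.0:
        return "very high"
    if score >= 1.2:
        return "high"
    if score >= 0.5:
        return "medium"
    return "low"
```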
In 2019, before the system's Santa Barbara launch, 46 percent of vessels slowed down in the Southern California voluntary speed reduction zones; in 2022, that figure has risen to 60 percent. But those increases can also be credited to a financial incentive program called Protecting Blue Whales and Blue Skies that pays shipping companies that slow down for whales, as well as to more than a decade of outreach from NOAA officials like Ms. Brown to shipping companies.
In the San Francisco area, cooperation rates with NOAA’s speed limits have been hovering around 62 percent for the last three years, and the hope is that Whale Safe can help get them higher.
“We are looking to industry to rise to the occasion voluntarily,” Ms. Brown said. “If they can’t do that, our council has asked us to consider making these speed limits mandatory like they are on the East Coast where they have 80 percent compliance.”
The response from shipping companies has been encouraging, Dr. McCauley said, with some of the world’s largest outfits asking for more information about the good or bad grade they received and on how to get Whale Safe’s alerts to their fleets most efficiently.
CMA CGM, the world’s third largest container shipping company, has created an automated pipeline to disseminate Whale Safe’s alerts straight to ship captains near the Santa Barbara Channel.
Whale Safe’s team is also working with Hyundai Heavy Industries, the world’s largest shipbuilder, to bring the system’s data directly into the navigation systems of newly built ships, said Callie Steffen, a scientist at the Benioff Ocean Initiative and Whale Safe’s project leader.
Now that the system is switched on in two locations, Dr. McCauley said the immediate goal was to continue outreach with companies and try to reduce whale fatalities from ship strikes to zero in the places where Whale Safe is operating. Ms. Steffen and others aim to expand Whale Safe’s ship-speed monitoring to all areas of designated whale concern in the United States and Canada on both coasts.
On Monday, fog erased the horizon as the Nova motored away from the buoy. When the fog broke, the sea ahead of the boat erupted with whale spouts and leaping sea lions. The boat cut its engines, and Dr. McCauley whipped out a camera with a long lens to try to identify some of the nine humpbacks researchers spotted.
The air took on the fishy, primordial odor of whale breath as everyone on board marveled at the wildness on display. Then the radio crackled: Vessel Traffic Services, which manages ship movement in and out of the Bay, said the Nova needed to exit the shipping lane because a large vessel was coming through. The scientists radioed back that the big ship needed to be warned it was headed into an area where whales had been sighted.
While the Nova headed back to San Francisco, Dr. McCauley said that as he was framing up the feeding humpback whales for his photographs, he couldn’t help but think of the recently deceased Fran.
“That should have been her,” he said, with a slight catch in his voice.
Via Wired, an article on how, in some of the world's most inaccessible places, tiny satellites are watching and listening for signs of destruction:
Fishing boats kept washing up in Japan with dead North Koreans on board. Dozens were documented every year, but they spiked in 2017, with more than 100 boats found on the northern coasts of Japan. No one could explain the appearance of these ghost ships. Why were there so many?
An answer arrived in 2020. Using a swarm of satellites orbiting Earth, a nonprofit organization called Global Fishing Watch in Washington, DC, found that China was fishing illegally in North Korean waters, “in contravention of Chinese and North Korean laws, as well as UN sanctions on North Korea,” says Paul Woods, the organization’s cofounder and chief innovation officer. As a result, North Korean fishermen were having to travel further afield, as far as Russia, something their small ships weren’t suited for. “They couldn’t get back,” says Woods. China, caught out, promptly halted its activities.
The alarming discovery was made possible by the DC-based firm Spire Global, which operates more than 100 small satellites in Earth orbit. These are designed to pick up the radio pings sent out by boats across the globe, signals primarily used by vessels to avoid one another at sea. Listening for them is also a useful way to track illegal maritime activity.
“The way they move when they’re fishing is distinct,” says Woods of the boats. “We can predict what kind of fishing gear they’re using by their speed, direction, and the way they turn.” Of the 60,000 vessels that emit such pings, Woods says 5,000 have been found conducting illegal activities thanks to Spire, including fishing at restricted times or offloading hauls of protected fish to other vessels to avoid checks at ports.
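The collision-avoidance pings Woods describes are AIS (automatic identification system) broadcasts. As a rough illustration of the movement features he mentions, a sketch like the following could summarize a vessel track before handing it to a trained gear-type classifier; the feature set is an assumption, not Global Fishing Watch's actual pipeline.

```python
# Sketch: summarizing an AIS track into the movement features the article
# describes (speed, turning behavior). Feature choices are illustrative.
import statistics

def track_features(points):
    """points: list of (timestamp_s, lat, lon, speed_knots, course_deg)."""
    speeds = [p[3] for p in points]
    # Smallest absolute change between consecutive headings, in degrees.
    turns = [abs((b[4] - a[4] + 180) % 360 - 180)
             for a, b in zip(points, points[1:])]
    return {
        "mean_speed": statistics.mean(speeds),
        "speed_stdev": statistics.pstdev(speeds),
        "mean_turn": statistics.mean(turns) if turns else 0.0,
        "frac_slow": sum(s < 4.0 for s in speeds) / len(speeds),
    }

# A classifier trained on labeled tracks would take these features and
# predict gear type ("trawler", "longliner", "squid jigger", ...).
```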
Satellite constellations like Spire’s have seen huge growth in recent years, and novel uses like this are becoming more common. Where once satellites would be large, bulky machines costing tens of millions of dollars, technological advances mean smaller, toaster-sized ones can now be launched at a fraction of the cost. Flying these together in groups, or constellations, to conduct unique assignments has become an affordable prospect. “It’s now economically viable to deploy many, many more satellites,” says Joel Spark, cofounder and a general manager at Spire.
Before 2018, no constellations of more than 100 active satellites had ever been launched into Earth orbit, says Jonathan McDowell, a satellite expert at the Harvard-Smithsonian Center for Astrophysics in the US. Now there are three, with nearly 20 more constellations in the process of being launched and some 200 more in development. It is a “boom in constellations,” says McDowell.
The reasons for flying constellations are numerous. The most notorious is to beam the internet to remote locations, made famous by SpaceX’s Starlink mega-constellation. This vast swarm of 3,000 satellites accounts for nearly half of all those in orbit, and it will swell further to 12,000 or more. Others, like Amazon, have plans for vast space internet constellations of their own. Many are worried about launching so many satellites into orbit, significantly raising the risk of collisions and producing dangerous space junk.
Smaller satellite constellations have their problems too. Many of their satellites lack the ability to maneuver, for example, to avoid a collision. “I’m a little uncomfortable with it,” says McDowell, although their small size means most fall back into our atmosphere within a few years, naturally clearing the skies. For now we can cope, but stricter regulation will be needed in the future as more are launched.
Satellite constellations can encompass the globe, providing valuable data that single satellites cannot. Some can track illegal methane emissions, others can provide useful communications networks, and others still can provide constant imagery of our planet’s surface. “I definitely did not expect the diversity of use cases,” says Sara Spangelo, cofounder and CEO of Swarm Technologies in California, whose own constellation of 160 satellites allows small packets of data to be sent between devices around the globe, even from remote locations, creating a worldwide internet of things.
One organization—Rainforest Connection, based in Texas—has found a particularly novel way of using Swarm's satellites: tracking illegal logging and poaching in more than 32 countries. In areas where loggers or poachers might operate, Rainforest places solar-powered acoustic sensors called Guardians high in treetops, designed to blend in with the tree from the ground. If the sensors pick up the sound of illegal activity such as chain saws or gunshots up to 1.5 kilometers away (assessed by software on board the Guardians), they send a signal to one of Swarm's satellites overhead, which relays the information back to a ground station.
This allows Rainforest Connection to alert law enforcement or locals to illegal activity, from villages in Sumatra to lands that are home to Indigenous tribes in Brazil. “In countries like Brazil and Malaysia, deforestation contributes to over 70 percent of their total greenhouse gas emissions,” says Bourhan Yassin, Rainforest’s CEO. “It’s a very large problem.”
Prior to working with Swarm, Rainforest relied on cellular networks to transmit data. While quicker, that limited its monitoring to regions near populated areas. “With Swarm, we can put the devices anywhere we want,” says Yassin. “It’s doubled up the capability we can do.”
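Rainforest Connection's firmware isn't described in detail in the article, but the detect-and-uplink loop it outlines can be sketched as below. The labels, confidence threshold, and packet format are assumptions, and the capture/classify/send functions are hypothetical stand-ins for the device's real interfaces.

```python
# Illustrative Guardian-style loop: classify audio on-device, uplink only
# tiny alert packets (satellite links like Swarm's carry small payloads).
# All names and thresholds here are hypothetical.
import json
import time

def run_guardian(capture_audio, classify, send_via_satellite, device_id):
    while True:
        clip = capture_audio(seconds=10)      # from the treetop microphone
        label, confidence = classify(clip)    # e.g., an on-device sound classifier
        if label in ("chainsaw", "gunshot") and confidence > 0.9:
            packet = json.dumps({
                "id": device_id,
                "t": int(time.time()),
                "event": label,
                "conf": round(confidence, 2),
            }).encode()
            send_via_satellite(packet)        # relayed to a ground station
```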
Gai Jorayev at University College London’s Institute of Archaeology, meanwhile, is using imagery from a constellation of more than 200 satellites run by the California firm Planet Labs to track Russia’s shelling of archaeological sites in Ukraine. Planet’s satellites take images of the entire Earth every day. This has enabled Jorayev, working with the Global Heritage Fund in California, to find that more than 165 sites have been damaged or destroyed by Russian shelling.
“Almost everywhere I look, I’m surprised by the levels of damage,” says Jorayev. “I did not expect it at this scale. The damage is very, very bad.”
Planet has provided its imagery free of charge to Jorayev and his team. “I’m exceptionally grateful,” says Jorayev. The hope is that Russia can be held accountable for its actions in future. That, however, “is a long process,” he says.
These are just a handful of ways satellite constellations are being used today: Spire says it has more than 700 customers, Planet also 700, and Swarm about 300. Concerns about collisions and the satellites’ potential to create space junk are well founded, but if we can find ways to adequately supervise these constellations, there are many ways they can prove useful.
“There are important roles that large constellations can play,” says McDowell. “It’s a question of managing it, and not having it be a free-for-all.”
Via BBC, an article on how space tech is helping to tackle deforestation:
Conservationist Leonidas Nzigiyimpa says “you can’t manage what you don’t know”.
He adds: “In order to improve the situation of forests, we need to use new technology.”
Mr Nzigiyimpa is the chief warden of five protected forestry areas in the small central African country of Burundi.
For the past two decades, he and his team have been working with local communities to protect and manage the forest. His face lights up when he describes the fresh smell and beauty of the areas. “It’s pure nature,” he says.
In carrying out his work, Mr Nzigiyimpa has to consider a range of factors, from monitoring the impact of human actions and economies, to tracking biodiversity and the impact of climate change, plus staff numbers and budgets.
To help him track and record all of this, he now uses the latest version of a free piece of software called the Integrated Management Effectiveness Tool.
The tool was developed specifically for such environmental work by a project called Biopama (Biodiversity and Protected Areas Management Programme). This is supported by both the European Union and the 79-member Organisation of African, Caribbean and Pacific States.
“So, we use this kind of tool to train the managers of the site to use it to collect good data, and to analyse this data, in order to take good decisions,” says Mr Nzigiyimpa.
Tracking and protecting the world’s forests is not just important for the local communities and economies most directly affected. Deforestation contributes to climate change so restoring forests could help combat it.
Some 10 million hectares (25 million acres) of the world’s forests are lost every year, according to the United Nations.
This deforestation accounts for 20% of all the world’s carbon dioxide emissions, according to the World Wildlife Fund, which adds that “by reducing forest loss, we can reduce carbon emissions and fight climate change”.
To try to restore forests and other natural habitats around the world, the United Nations last year launched the UN Decade on Ecosystem Restoration. This has seen countries, companies and other organisations promise action towards preventing, halting and reversing the degradation of ecosystems worldwide.
“But just saying that we’re going to restore, it’s not enough,” says Yelena Finegold, forestry officer at the Food and Agriculture Organization (FAO) of the United Nations. “There’s the need for responsible planning of how that ecosystem restoration is going to happen, followed by actions on the ground enabled by investments in restoration, and monitoring systems in place to track that ecosystem restoration.”
This increased focus on managing forests has given rise to new digital tools to gather, sort and use data better.
One of these is the FAO’s own Framework for Ecosystem Monitoring (Ferm) website. The site was launched last year, and uses satellite imagery to highlight changes to forests around the world. The maps and data are accessible to any internet users, be they a scientist, government official, business, or member of the public.
A key data source for Ferm is US space agency Nasa, and its Global Ecosystem Dynamics Investigation system. Known as Gedi for short, this acronym is pronounced like the word Jedi from the Star Wars films. And continuing the theme of that movie series, its tagline is “may the forest be with you”.
The tech itself is certainly very sci-fi turned real life. “We shoot laser beams at trees from the International Space Station,” says Laura Duncanson, who helps to lead the Gedi project from the University of Maryland's Department of Geographical Sciences.
“We use the reflected energy to map forests in 3D, including their height, canopy density, and carbon content,” adds Dr Duncanson, who is a leading expert in remote sensing. “This is an exciting new technology because for decades we have been able to observe deforestation from space, but now with Gedi we can assign the carbon emissions associated with forest loss [for greater accuracy].”
Maps and data are also provided to Ferm by the US business Planet Labs, which operates more than 200 camera-equipped satellites. These take some 350 million photos of Earth's surface on a daily basis, each covering an area of one sq km.
Planet Labs can also be directly hired by governments and businesses around the world. In addition to monitoring forests, its cameras can be used to check everything from droughts to agriculture and energy projects, and to monitor key infrastructure, such as ports.
Remi D’Annunzio, a fellow FAO forestry officer, says that all the available imagery from space “has tremendously changed the way we monitor forests, because it has produced extremely repeatable observations and extremely frequent revisits of places”.
He adds: “Basically, now, with all these publicly available satellites combined, we can get a full snapshot of the Earth every four to five days.”
One example of how this near-real-time monitoring via Ferm is being used: pilot schemes in Vietnam and Laos that are trying to tackle illegal logging. Rangers and community workers on the ground are sent alerts on their mobile phones when new deforestation is spotted.
“Now, what we’re really trying to do is not just understand the volume of forests being lost, but where is it specifically being lost in this district or that, so that we can monitor loss, and even prevent it in near real-time, from getting worse,” says FAO forestry officer, Akiko Inoguchi.
Via IEEE Spectrum, an article on the use of sensors to measure the tiny climate-change signals in the deep ocean:
In the puzzle of climate change, Earth’s oceans are an immense and crucial piece. The oceans act as an enormous reservoir of both heat and carbon dioxide, the most abundant greenhouse gas. But gathering accurate and sufficient data about the oceans to feed climate and weather models has been a huge technical challenge.
Over the years, though, a basic picture of ocean heating patterns has emerged. The sun’s infrared, visible-light, and ultraviolet radiation warms the oceans, with the heat absorbed particularly in Earth’s lower latitudes and in the eastern areas of the vast ocean basins. Thanks to wind-driven currents and large-scale patterns of circulation, the heat is generally driven westward and toward the poles, being lost as it escapes to the atmosphere and space.
This heat loss comes mainly from a combination of evaporation and reradiation into space. This oceanic heat movement helps make Earth habitable by smoothing out local and seasonal temperature extremes. But the transport of heat in the oceans and its eventual loss upward are affected by many factors, such as the ability of the currents and wind to mix and churn, driving heat down into the ocean. The upshot is that no model of climate change can be accurate unless it accounts for these complicating processes in a detailed way. And that’s a fiendish challenge, not least because Earth’s five great oceans occupy 140 million square miles, or 71 percent of the planet’s surface.
Providing such detail is the purpose of the Argo program, run by an international consortium involving 30 nations. The group operates a global fleet of some 4,000 undersea robotic craft scattered throughout the world’s oceans. The vessels are called “floats,” though they spend nearly all of their time underwater, diving thousands of meters while making measurements of temperature and salinity. Drifting with ocean currents, the floats surface every 10 days or so to transmit their information to data centers in Brest, France, and Monterey, Calif. The data is then made available to researchers and weather forecasters all over the world.
The Argo system, which produces more than 100,000 salinity and temperature profiles per year, is a huge improvement over traditional methods, which depended on measurements made from ships or with buoys. The remarkable technology of these floats and the systems technology that was created to operate them as a network was recognized this past May with the IEEE Corporate Innovation Award, at the 2022 Vision, Innovation, and Challenges Summit. Now, as Argo unveils an ambitious proposal to increase the number of floats to 4,700 and increase their capabilities, IEEE Spectrum spoke with Susan Wijffels, senior scientist at the Woods Hole Oceanographic Institution on Cape Cod, Mass., and cochair of the Argo steering committee.
Why do we need a vast network like Argo to help us understand how Earth’s climate is changing?
Susan Wijffels: Well, the reason is that the ocean is a key player in Earth’s climate system. So, we know that, for instance, our average climate is really, really dependent on the ocean. But actually, how the climate varies and changes, beyond about a two-to-three-week time scale, is highly controlled by the ocean. And so, in a way, you can think that the future of climate—the future of Earth—is going to be determined partly by what we do, but also by how the ocean responds.
Aren’t satellites already making these kinds of measurements?
Wijffels: The satellite observing system, a wonderful constellation of satellites run by many nations, is very important. But they only measure the very, very top of the ocean. They penetrate a couple of meters at the most. Most are only really seeing what’s happening in the upper few millimeters of the ocean. And yet, the ocean itself is very deep, 5, 6 kilometers deep, around the world. And it’s what’s happening in the deep ocean that is critical, because things are changing in the ocean. It’s getting warmer, but not uniformly warm. There’s a rich structure to that warming, and that all matters for what’s going to happen in the future.
How was this sort of oceanographic data collected historically, before Argo?
Wijffels: Before Argo, the main way we had of getting subsurface information, particularly things like salinity, was to measure it from ships, which you can imagine is quite expensive. These are research vessels that are very expensive to operate, and you need to have teams of scientists aboard. They’re running very sensitive instrumentation. And they would simply prepare a package and lower it down the side into the ocean. And to do a 2,000-meter profile, it would maybe take a couple of hours. To go to the seafloor, it can take 6 hours or so.
The ships really are wonderful. We need them to measure all kinds of things. But to get the global coverage we’re talking about, it’s just prohibitive. In fact, there are not enough research vessels in the world to do this. And so, that’s why we needed to try and exploit robotics to solve this problem.
Pick a typical Argo float and tell us something about it, a day in the life of an Argo float or a week in the life. How deep is this float typically, and how often does it transmit data?
Wijffels: They spend 90 percent of their time at 1,000 meters below the surface of the ocean—an environment where it’s dark and it’s cold. A float will drift there for about nine and a half days. Then it will make itself a little bit smaller in volume, which increases its density relative to the seawater around it. That allows it to then sink down to 2,000 meters. Once there, it will halt its downward trajectory and switch on its sensor package. Once it has collected the intended complement of data, it expands, lowering its density. As the then lighter-than-water automaton floats back up toward the surface, it takes a series of measurements in a single column. And then, once it reaches the sea surface, it transmits that profile back to us via a satellite system. And we also get a location for that profile through the global positioning system satellite network. Most Argo floats at sea right now are measuring temperature and salinity at a pretty high accuracy level.
How big is a typical data transmission, and where does it go?
Wijffels: The data is not very big at all. It’s highly compressed. It’s only about 20 or 30 kilobytes, and it goes through the Iridium network now for most of the float array. That data then comes ashore from the satellite system to your national data centers. It gets encoded and checked, and then it gets sent out immediately. It gets logged onto the Internet at a global data assembly center, but it also gets sent immediately to all the operational forecasting centers in the world. So the data is shared freely, within 24 hours, with everyone that wants to get hold of it.
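Put together, the cycle Wijffels describes can be summarized in a schematic simulation. The depths, sampling step, and function names below are toy stand-ins for illustration, not real float firmware.

```python
# Schematic of the ~10-day Argo cycle: park at 1,000 m, deep-dive to
# 2,000 m, profile on ascent, transmit at the surface. Toy numbers only.

def simulate_cycle(read_temp, read_salinity, transmit):
    # ... drift at the 1,000 m park depth for ~9.5 days, mostly idle ...
    # Pumping oil out shrinks the float, raising its density: it sinks to 2,000 m.
    depth = 2000
    profile = []
    # Pumping oil back in expands the float: it rises, sampling as it goes.
    while depth > 0:
        profile.append((depth, read_temp(depth), read_salinity(depth)))
        depth -= 10                      # sample every 10 m on the way up
    transmit(profile)                    # ~20-30 kB via Iridium, plus a GPS fix

# Toy usage: simulate_cycle(lambda d: 4.0, lambda d: 35.0, print)
```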
[Visualization: some 3,800 of Argo's floats, shown as colored dots scattered across the globe. Source: Argo Program]
You have 4,000 of these floats now spread throughout the world. Is that enough to do what your scientists need to do?
Wijffels: Currently, the 4,000 we have is a legacy of our first design of Argo, which was conceived in 1998. And at that time, our floats couldn’t operate in the sea-ice zones and couldn’t operate very well in enclosed seas. And so, originally, we designed the global array to be 3,000 floats; that was to kind of track what I think of as the slow background changes. These are changes happening across 1,000 kilometers in around three months—sort of the slow manifold of what’s happening to subsurface ocean temperature and salinity.
So, that’s what that design is for. But now, we have successfully piloted floats in the polar oceans and the seasonal sea-ice zones. So we know we can operate them there. And we also know now that there are some special areas like the equatorial oceans where we might need higher densities [of floats]. And so, we have a new design. And for that new design, we need to get about 4,700 operating floats into the water.
But we’re just starting now to really go to governments and ask them to provide the funds to expand the fleet. And part of the new design calls for floats to go deeper. Most of our floats in operation right now go only as deep as about 2,000 meters. But we now can build floats that can withstand the oceans’ rigors down to depths of 6,000 meters. And so, we want to build and sustain an array of about 1,200 deep-profiling floats, with an additional 1,000 of the newly built units capable of tracking ocean biogeochemistry. But this is new. These are big, new missions for the Argo infrastructure that we’re just starting to try and build up. We’ve done a lot of the piloting work; we’ve done a lot of the preparation. But now, we need to find sustained funding to implement that.
[Photo: A new generation of deep-diving Argo floats can reach a depth of 6,000 meters; a spherical glass housing protects the electronics inside from the enormous pressure at that depth. Source: MRV Systems/Argo Program]
What is the cost of a typical float?
Wijffels: A typical core float, which just measures temperature and salinity and operates to 2,000 meters, costs between $20,000 and $30,000, depending on the country. But they each last five to seven years. And so, the cost per profile that we get, which is what really matters for us, is very low—particularly compared with other methods [of acquiring the same data].
What kind of insights can we get from tracking heat and salinity and how they’re changing across Earth’s oceans?
Wijffels: There are so many things I could talk about, so many amazing discoveries that have come from the Argo data stream. There’s more than a paper a day that comes out using Argo. And that’s probably a conservative view. But I mean, one of the most important things we need to measure is how the ocean is warming. So, as the Earth system warms, most of that extra heat is actually being trapped in the ocean. Now, it’s a good thing that that heat is taken up and sequestered by the ocean, because it makes the rate of surface temperature change slower. But as it takes up that heat, the ocean expands. So, that’s actually driving sea-level rise. The ocean is pumping heat into the polar regions, which is causing both sea-ice and ice-sheet melt. And we know it’s starting to change regional weather patterns as well. With all that in mind, tracking where that heat is, and how the ocean circulation is moving it around, is really, really important for understanding both what’s happening now to our climate system and what’s going to happen to it in the future.
What has Argo’s data told us about how ocean temperatures have changed over the past 20 years? Are there certain oceans getting warmer? Are there certain parts of oceans getting warmer and others getting colder?
Wijffels: The signal in the deep ocean is very small. It’s a fraction, a hundredth of a degree, really. But we have very high precision instruments on Argo. The warming signal came out very quickly in the Argo data sets when averaged across the global ocean. If you measure in a specific place, say a time series at a site, there’s a lot of noise there because the ocean circulation is turbulent, and it can move heat around from place to place. So, any given year, the ocean can be warm, and then it can be cool…that’s just a kind of a lateral shifting of the signal.
But when you measure globally and monitor the global average over time, the warming signal becomes very, very apparent. And so, as we’ve seen from past data—and Argo reinforces this—the oceans are warming faster at the surface than at their depths. And that’s because the ocean takes a while to draw the heat down. We see the Southern Hemisphere warming faster than the Northern Hemisphere. And there’s a lot of work that’s going on around that. The discrepancy is partly due to things like aerosol pollution in the Northern Hemisphere’s atmosphere, which actually has a cooling effect on our climate.
But some of it has to do with how the winds are changing. Which brings me to another really amazing thing about Argo: We’ve had a lot of discussion in our community about hiatuses or slowdowns of global warming. And that’s because of the surface temperature, which is the metric that a lot of people use. The oceans have a big effect on the global average surface temperature estimates because the oceans comprise the majority of Earth’s surface area. And we see that the surface temperature can peak when there’s a big El Niño–Southern Oscillation event. That’s because, in the Pacific, a whole bunch of heat from the subsurface [about 200 or 300 meters below the surface] suddenly becomes exposed to the surface. [Editor’s note: The El Niño–Southern Oscillation is a recurring, large-scale variation in sea-surface temperatures and wind patterns over the tropical eastern Pacific Ocean.]
What we see is this kind of chaotic natural phenomena, such as the El Niño–Southern Oscillation. It just transfers heat vertically in the ocean. And if you measure vertically through the El Niño or the tropical Pacific, that all cancels out. And so, the actual change in the amount of heat in the ocean doesn’t see those hiatuses that appear in surface measurements. It’s just a staircase. And we can see the clear impact of the greenhouse-gas effect in the ocean. When we measure from the surface all the way down, and we measure globally, it’s very clear.
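The cancellation Wijffels describes is easy to see with a toy column of water: vertical redistribution (an El Niño) changes the surface value but not the column total, while genuinely added heat shifts the total. The numbers below are invented purely to illustrate the arithmetic.

```python
# Toy depth column: El Nino-style vertical redistribution preserves the
# column-integrated heat; greenhouse warming does not. Numbers invented.

column = [20.0, 15.0, 8.0, 4.0]       # layer temperatures, surface to deep

def heat_index(temps):
    return sum(temps)                  # stand-in for the vertical integral

el_nino = [21.5, 13.5, 8.0, 4.0]       # heat moved from ~200-300 m to surface
assert heat_index(el_nino) == heat_index(column)   # column total unchanged

warmed = [t + 0.01 for t in column]    # a real, small addition of heat
assert heat_index(warmed) > heat_index(column)     # trend survives integration
```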
Argo was obviously designed and established for research into climate change, but so many large scientific instruments turn out to be useful for scientific questions other than the ones they were designed for. Is that the case with Argo?
Wijffels: Absolutely. Climate change is just one of the questions Argo was designed to address. It’s really being used now to study nearly all aspects of the ocean, from ocean mixing to just mapping out what the deep circulation, the currents in the deep ocean, look like. We now have very detailed maps of the surface of the ocean from the satellites we talked about, but understanding what the currents are in the deep ocean is actually very, very difficult. This is particularly true of the slow currents, not the turbulence, which is everywhere in the ocean like it is in the atmosphere. But now, we can do that using Argo because Argo gives us a map of the sort of pressure field. And from the pressure field, we can infer the currents. We have discovered through Argo new current systems that we knew nothing about. People are using this knowledge to study the ocean eddy field and how it moves heat around the ocean.
People have also made lots of discoveries about salinity; how salinity affects ocean currents and how it is reflecting what’s happening in our atmosphere. There’s just been a revolution in our ability to make discoveries and understand how the ocean works.
As you pointed out earlier, the signal from the deep ocean is very subtle, and it’s a very small signal. So, naturally, that would prompt an engineer to ask, “How accurate are these measurements, and how do you know that they’re that accurate?”
Wijffels: So, at the inception of the program, we put a lot of resources into a really good data-management and quality-assurance system. That’s the Argo Data Management system, which broke new ground for oceanography. Part of that innovation is that we have, in every nation that deploys floats, expert teams that look at the data. When the data is about a year old, they look at that data, and they assess it in the context of nearby ship data, which is usually the gold standard in terms of accuracy. And when a float is deployed, we know its sensors have been freshly calibrated. So, if we compare a freshly calibrated float’s profile with an old one that might be six or seven years old, we can make important comparisons. What’s more, some of the satellites that Argo is designed to work with also give us the ability to check whether the float sensors are working properly.
And through the history of Argo, we have had issues. But we’ve tackled them head on. We have had issues that originated in the factories producing the sensors. Sometimes, we’ve halted deployments for years while we waited for a particular problem to be fixed. Furthermore, we try and be as vigilant as we can and use whatever information we have around every float record to ensure that it makes sense. We want to make sure that there’s not a big bias, and that our measurements are accurate.
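As a schematic of the ship-comparison step, one could difference a float profile against a nearby, freshly calibrated shipboard profile on a shared depth grid. Real Argo delayed-mode quality control is far more elaborate than this toy check.

```python
# Toy version of the float-vs-ship salinity check described above. Real
# delayed-mode QC models sensor drift statistically; this just averages.

def salinity_offset(float_profile, ship_profile):
    """Profiles: lists of (depth_m, salinity_psu) on the same depth grid."""
    diffs = [f_sal - s_sal
             for (_, f_sal), (_, s_sal) in zip(float_profile, ship_profile)]
    return sum(diffs) / len(diffs)     # positive -> float reading biased salty

# A persistent offset well beyond sensor tolerance would flag the float's
# data for adjustment.
```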
You mentioned earlier there’s a new generation of floats capable of diving to an astounding 6,000 meters. I imagine that as new technology becomes available, your scientists and engineers are looking at this and incorporating it. Tell us how advances in technology are improving your program.
Wijffels: [There are] three big, new things that we want to do with Argo and that we’ve proven we can do now through regional pilots. The first one, as you mentioned, is to go deep. And so that meant reengineering the float itself so that it could withstand and operate under really high pressure. And there are two strategies to that. One is to stay with an aluminum hull but make it thicker. Floats with that design can go to about 4,000 meters. The other strategy was to move to a glass housing. So the float goes from a metal cylinder to a glass sphere. And glass spheres have been used in ocean science for a long time because they’re extremely pressure resistant. So, glass floats can go to those really deep depths, right to the seafloor of most of the global ocean.
The game changer is a set of sensors that are sensitive and accurate enough to measure the tiny climate-change signals that we’re looking for in the deep ocean. And so that requires an extra level of care in building those sensors and a higher level of calibration. And so we’re working with sensor manufacturers to develop and prove calibration methods with tighter tolerances and ways of building these sensors with greater reliability. And as we prove that out, we go to sea on research vessels, we take the same sensors that were in our shipboard systems, and compare them with the ones that we’re deploying on the profiling floats. So, we have to go through a whole development cycle to prove that these work before we certify them for global implementation.
Let’s talk about batteries. Are batteries ultimately the limit on a float’s lifetime? I mean, I imagine you can’t recharge a battery that’s 2,000 meters down.
Wijffels: You’re absolutely right. Batteries are one of the key limitations for floats right now as regards their lifetime, and what they’re capable of. If there were a leap in battery technology, we could do a lot more with the floats. We could maybe collect data profiles faster. We could add many more extra sensors.
So, battery power and energy management is a big, important aspect of what we do. In fact, the way that we task the floats has been a problem, particularly with lithium batteries, because the floats spend about 90 percent of their time sitting in the cold and not doing very much. During their drift phase, we sometimes turn them on to take some measurements. But still, they don’t do very much. They don’t use their buoyancy engines. This is the engine that changes the volume of the float.
And what we’ve learned is that these batteries can passivate. And so, we might think we’ve loaded a certain amount of energy onto the float, but we never achieve the rated power level because of this passivation problem. But we’ve found different kinds of batteries that really sidestep that passivation problem. So, yes, batteries have been one thing that we’ve had to figure out so that energy is not a limiting factor in float operation.
Via Fast Company, an article on a new tool from Google that shows how the planet is changing in near real time:
The planet changes quickly: More than half a million acres are burning in New Mexico. A megadrought is shrinking Lake Mead. The Alps are turning from white to green. Development continues to expand, from cities to massive solar farms. All of these changes impact the Earth’s climate and biodiversity. But in the past, such changes have been difficult to track in detail as they’re happening.
A new tool from Google Earth Engine and the nonprofit World Resources Institute pulls from satellite data to build detailed maps in near real time. Called Dynamic World, it zooms in on the planet in 10-by-10-meter squares from satellite images collected every two to five days. The program uses artificial intelligence to classify each pixel based on nine categories that range from bare ground to trees, crops, and buildings.
Researchers, nonprofits, and other users can “explore and track and monitor changes in these terrestrial ecosystems over time,” says Tanya Birch, senior program manager for Google Earth Outreach. As the tool was being built last year, Birch used it in the days after the Caldor Fire, a wildfire that burned more than 200,000 acres in California. The pixels in satellite images quickly changed from being classified as “trees” to “shrub and scrub.”
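Dynamic World is published as a public Earth Engine image collection, so a change like the one Birch describes can be pulled with a few lines of the Earth Engine Python API. The dataset ID and band name below are real; the coordinates are approximate, and the snippet is a minimal sketch rather than a full workflow.

```python
# Minimal Earth Engine sketch: most-likely Dynamic World class per 10 m
# pixel around the Caldor Fire area, before vs. after the burn.
import ee

ee.Initialize()

region = ee.Geometry.Point([-120.5, 38.65]).buffer(10_000)  # approximate

def mode_label(start, end):
    # 'label' holds the most-likely class (0-8: water, trees, grass,
    # flooded_vegetation, crops, shrub_and_scrub, built, bare, snow_and_ice).
    return (ee.ImageCollection("GOOGLE/DYNAMICWORLD/V1")
              .filterBounds(region)
              .filterDate(start, end)
              .select("label")
              .mode()
              .clip(region))

before = mode_label("2021-06-01", "2021-08-01")   # pre-fire composite
after = mode_label("2021-10-01", "2021-12-01")    # post-fire composite
```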
Scientists used to rely on statistical tables that were sometimes released only every five years, says Fred Stolle, deputy director of the World Resources Institute’s Forests Program. “That’s clearly not good enough anymore,” he says. “We’re changing so fast, and the impact is so fast, that satellites are now the way to go.”
Researchers and planners already use satellite data in some applications—the World Resources Institute, for example, previously worked with Google to build Global Forest Watch, a tool that can track deforestation using satellite images. But the new data is much more detailed; now it’s sometimes possible to see if one or two trees are cut down in a tropical forest, even when a larger area is intact, Stolle says.
In cities, planners could use the data to easily see which neighborhoods don’t have enough green space. Researchers studying smallholder farms in Africa could use it to see the impacts of drought and when crops are being harvested. Because the data is continuously updated, it’s also possible to watch the seasons change throughout the year across the entire planet. The data goes back five years, and using the new tool, anyone can enter date ranges to see how a location has changed over time.
“I encourage people to dive into it and explore,” Birch says. “There’s a lot of depth and a lot of richness in Dynamic World. . . . I feel like this is really pushing the frontier of mapmaking powered by AI in an incredibly novel way.”
Via Wired, a report on some of the hurdles that stand in the way of ambitious plans to use imagery to help feed people, reduce poverty, and protect the planet:
For the past three decades, geologist Carlos Souza has worked at the Brazil-based nonprofit Imazon, exploring ways he and the teams he coordinates can use applied science to protect the Amazon rainforest. For much of that time, satellite imagery has been a big part of his job.
In the early 2000s, Souza and colleagues came to understand that 90 percent of deforestation occurs within 5 kilometers of newly created roads. While satellites have long been able to track road expansion, the old way of doing things required people to label those findings by hand, amassing what would eventually become training data. Those years of labor paid off last fall with the release of an AI system that Imazon says reveals 13 times more roadway than the previous method, with an accuracy rate of between 70 and 90 percent.
Proponents of satellite imagery and machine learning have ambitious plans to solve big problems at scale. The technology can play a role in anti-poverty campaigns, protect the environment, help billions of people obtain street addresses, and increase crop yields in the face of intensifying climate change. A UNESCO report published this spring highlights 100 AI models with the potential to transform the world for the better. But despite recent advances in deep learning and the quality of satellite imagery, as well as the record number of satellites expected to enter orbit over the next few years, ambitious efforts to use AI to solve big problems at scale still encounter traditional hurdles, like government bureaucracy or a lack of political will or resources.
Stopping deforestation, for instance, requires more than spotting the problem from space. A Brazilian federal government program helped reduce deforestation from 2004 to 2012 by 80 percent compared to previous years, but then federal support waned. In keeping with an election promise, President Jair Bolsonaro weakened enforcement and encouraged opening the rainforest to industry and cattle ranch settlers. As a result, deforestation in the Amazon reached the highest levels seen in more than a decade.
Other AI-focused conservation groups have run into similar issues. Global Fishing Watch uses machine learning models to identify vessels that turn off GPS systems to avoid detection; they’re able to predict the type of ship, the kind of fishing gear it carries, and where it’s heading. Ideally that information helps authorities around the world target illegal fishing and inform decisions to board boats for inspection at sea, but policing large swaths of the ocean is difficult. Global Fishing Watch’s tech spotted hundreds of boats engaged in illegal squid fishing in 2020, data that head of research David Kroodsma credits with increasing cooperation between China and South Korea, but it didn’t lead to any particular prosecution. Enforcement in ports, he says, is “key to making deterrence scalable and affordable.”
Back on land, the consulting company Capgemini is working with The Nature Conservancy, a nonprofit environmental group, to track trails in the Mojave Desert and protect endangered animal habitats from human activity. In a pilot program last year, the initiative mapped trails created by off-road vehicles in hundreds of square miles of satellite imagery in Clark County, Nevada, to create an AI model that can automatically identify newly created roads. Based on that work, The Nature Conservancy intends to expand the project to monitor the entirety of the desert, which stretches more than 47,000 square miles across four US states.
However, as in the Amazon, identifying problem areas only gets you so far if there aren’t enough resources to act on those findings. The Nature Conservancy uses its AI model to inform conversations with land managers about potential threats to wildlife or biodiversity. Conservation enforcement in the Mojave Desert is overseen by the US Bureau of Land Management, which only has about 270 rangers and special agents on duty.
In northern Europe, the company Iceye got its start monitoring ice buildup in the waters near Finland with microsatellites and machine learning. But in the past two years, the company has begun predicting flood damage using microwave wavelength imagery that can see through clouds at any time of day. The biggest challenge now, says Iceye’s VP of analytics, Shay Strong, isn’t engineering spacecraft, data processing, or refining machine learning models that have become commonplace. It’s dealing with institutions stuck in centuries-old ways of doing things.
“We can more or less understand where things are going to happen, we can acquire imagery, we can produce an analysis. But the piece we have the biggest challenge with now is still working with insurance companies or governments,” she says.
“It’s that next step of local coordination and implementation that it takes to come up with action,” says Hamed Alemohammad, chief data scientist at the nonprofit Radiant Earth Foundation, which uses satellite imagery to tackle sustainable development goals like ending poverty and hunger. “That’s where I think the industry needs to put more emphasis and effort. It’s not just about a fancy blog post and deep learning model.”
It’s often not only about getting policymakers on board. In a 2020 analysis, a cross-section of academic, government, and industry researchers highlighted the fact that the African continent has a majority of the world’s uncultivated arable land and is expected to account for a large part of global population growth in the coming decades. Satellite imagery and machine learning could reduce reliance on food imports and turn Africa into a breadbasket for the world. But, they said, lasting change will necessitate a buildup of professional talent with technical knowledge and government support so Africans can make technology to meet the continent’s needs instead of importing solutions from elsewhere. “The path from satellite images to public policy decisions is not straightforward,” they wrote.
Labaly Toure is a coauthor of that paper and head of the geospatial department at an agricultural university in Senegal. In that capacity and as founder of Geomatica, a company providing automated satellite imagery solutions for farmers in West Africa, he’s seen satellite imagery and machine learning help decision-makers recognize how the flow of salt can impact irrigation and influence crop yields. He’s also seen it help settle questions of how long a family has been on a farm and assist with land management issues.
Sometimes free satellite images from services like NASA's Landsat or the European Space Agency's Sentinel program suffice, but some projects require high-resolution photos from commercial providers, and cost can present a challenge.
“If decision-makers know [the value] it can be easy, but if they don’t know, it’s not always easy,” Toure said.
Back in Brazil, in the absence of federal support, Imazon is now forging ties with more policymakers at the state level. “Right now, there’s no evidence the federal government will lead conservation or deforestation efforts in the Amazon,” says Souza. In October 2022, Imazon signed cooperation agreements with public prosecutors gathering evidence of environmental crimes in four Brazilian states on the border of the Amazon rainforest to share information that can help prioritize enforcement resources.
When you prosecute people who deforest protected lands, the damage has already been done. Now Imazon wants to use AI to stop deforestation before it happens, interweaving that road-detection model with one designed to predict which communities bordering the Amazon are at the highest risk of deforestation within the next year.
Deforestation continued at historic rates in early 2022, but Souza is hopeful that through work with nonprofit partners, Imazon can expand its deforestation AI to the other seven South American countries that touch the Amazon rainforest.
And Brazil will hold a presidential election this fall. The current leader in the polls, former president Luiz Inácio Lula da Silva, is expected to strengthen enforcement agencies weakened by Bolsonaro and to reestablish the Amazon Fund for foreign reforestation investments. Lula’s environmental plan isn’t expected out for a few months, but environmental ministers from his previous term in office predict he will make reforestation a cornerstone of his platform.