Archive for the ‘Open Source’ Category

Will Open Source Data and AI Help The Oceans Survive?

Via USA Today, a look at how open-source data and AI can help the world’s oceans survive a record-breaking year of heat: Approximately one in four marine creatures lives in coral reefs. Commonly mistaken for plants, corals are critical animals that provide aquatic species with the food, shelter, and breeding grounds necessary for sustaining biodiversity. As […]

Read More »



Tidal: Alphabet X’s New Effort to Protect The Oceans

Via MIT Technology Review, a look at a previously unreported Alphabet X program to use cameras, computer vision, and machine learning to track the carbon stored in the biomass of the oceans:

In late September, Bianca Bahman snorkeled above a seagrass meadow off the western coast of Flores, a scorpion-shaped volcanic island in eastern Indonesia. As she flutter-kicked over the green seabed, Bahman steered an underwater camera suspended on a pair of small pontoons.

The stereoscopic camera captures high-resolution footage from two slightly different angles, creating a three-dimensional map of the ribbon-shaped leaves sprouting from the seafloor.
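The underlying geometry is standard stereo vision: depth follows from the disparity between matching points in the two views. A minimal sketch of that relationship, using made-up camera parameters rather than Tidal’s actual calibration:

```python
# Rough illustration of stereo depth estimation: for a calibrated stereo rig,
# a point's depth Z follows from the disparity d between its positions in the
# two images, Z = f * B / d, where f is the focal length (in pixels) and B is
# the baseline between the lenses. The numbers below are placeholders, not
# Tidal's actual camera parameters.

FOCAL_LENGTH_PX = 1400.0   # assumed focal length, in pixels
BASELINE_M = 0.12          # assumed distance between the two lenses, in meters

def depth_from_disparity(disparity_px: float) -> float:
    """Return depth in meters for a matched feature with the given disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

# A seagrass blade tip matched 84 pixels apart in the left/right frames
# would sit roughly 2 meters from the camera with these parameters.
print(round(depth_from_disparity(84.0), 2))  # -> 2.0
```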

Bahman is a project manager for Tidal, whose team wants to use these cameras, along with computer vision and machine learning, to get a better understanding of life beneath the oceans. Tidal has used the same camera system to monitor fish in aquafarms off the coast of Norway for several years.

Now, MIT Technology Review can report, Tidal hopes its system can help preserve and restore the world’s seagrass beds, accelerating efforts to harness the oceans to suck up and store away far more carbon dioxide.

Tidal is a project within Alphabet’s X division, the so-called moonshot factory. Its mission is to improve our understanding of underwater ecosystems in order to inform and incentivize efforts to protect the oceans amid mounting threats from pollution, overfishing, ocean acidification, and global warming.

Its tools “can unlock areas that are desperately needed in the ocean world,” Bahman says.

Studies suggest the oceans could pull down a sizable share of the billions of additional tons of carbon dioxide that may need to be scrubbed from the atmosphere each year to keep temperatures in check by midcentury. But making that happen will require restoring coastal ecosystems, growing more seaweed, adding nutrients to stimulate plankton growth, or similar interventions.

Tidal decided to focus initially on seagrass because it’s a fast-growing plant that’s particularly effective at absorbing carbon dioxide from shallow waters. These coastal meadows might be able to suck up much more if communities, companies, or nonprofits take steps to expand them.

But scientists have only a rudimentary understanding of how much carbon seagrass sequesters, and how big a role the plant plays in regulating the climate. Without that knowledge and affordable ways to verify that restoration efforts actually store away more carbon, it will be tricky to track climate progress and build credible carbon credit marketplaces that would pay for such practices.

Tidal hopes to crack the problem by developing models and algorithms that translate the three-dimensional maps of seagrass it captures into reliable estimates of the carbon held below. If it works, automated versions of Tidal’s data-harvesting technology could provide that missing verification tool. This could help kick-start and lend credibility to marine-based carbon credit projects and markets, helping to restore ocean ecosystems and slow climate change.
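Tidal has not published those models, but a deliberately simplified sketch shows the general shape of such a translation: mapped canopy area and height become an above-ground biomass estimate, which a carbon fraction converts into stored carbon. Every constant below is an illustrative placeholder, and, as the article notes later, most seagrass carbon actually sits in the sediment rather than the canopy, which is exactly what the field validation is meant to probe.

```python
# Deliberately simplified sketch of turning a 3D seagrass map into a carbon
# estimate: mapped canopy area and mean canopy height -> above-ground dry
# biomass -> carbon. Every constant here is an illustrative placeholder;
# Tidal's actual models and calibration are not public.

BIOMASS_PER_M3 = 0.25    # assumed dry biomass per cubic meter of canopy (kg/m^3)
CARBON_FRACTION = 0.35   # assumed carbon fraction of dry seagrass biomass

def estimate_canopy_carbon_kg(area_m2: float, mean_height_m: float) -> float:
    """Estimate carbon (kg) held in the above-ground canopy of a mapped meadow patch."""
    canopy_volume_m3 = area_m2 * mean_height_m
    dry_biomass_kg = canopy_volume_m3 * BIOMASS_PER_M3
    return dry_biomass_kg * CARBON_FRACTION

# One hectare of meadow with a 0.4 m canopy, under these made-up factors:
print(round(estimate_canopy_carbon_kg(10_000, 0.4), 1))  # -> 350.0 kg of carbon
```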

The team envisions creating autonomous versions of its tools, possibly in the form of swimming robots equipped with its cameras, that can remotely monitor coastlines and estimate the growth or loss of biomass.

“If we can quantify and measure these systems, we can then drive investment to protect and conserve them,” says Neil Davé, the general manager of Tidal.

Still, some scientists are skeptical that Tidal’s technology will be able to accurately estimate shifting carbon levels in distant corners of the globe, among other challenges. Indeed, nature-based carbon credits have faced growing criticism: studies and reporting find that such efforts can overestimate climate benefits, create environmental risks, or present environmental justice concerns.

Davé acknowledges that they don’t know how well it will work yet. But he says that’s precisely what the Tidal team went to Indonesia, along with a group of Australian scientists, to try to find out.

Google launched what was then called Google X in early 2010, with a mandate to go after big, hard, even zany ideas that could produce the next Google.

This research division took over the self-driving-car project now known as Waymo. It developed the Google Brain machine-learning tools that power YouTube recommendations, Google Translate, and numerous other core products of its parent company. And it gave the world the Google Glass augmented-reality headset (whether the world wanted it or not). There were even short-lived flirtations with things like space elevators and teleportation.

X pursued climate-related projects from the start, but has had a very mixed track record in this area to date.

It acquired Makani, an effort to capture wind energy from large, looping kites, but Alphabet shut it down in 2020. It also pursued a project to produce carbon-neutral fuels from seawater, dubbed Foghorn, but abandoned the effort after finding it’d be too hard to match the cost of gasoline.

The two official climate “graduates” still operating are Malta, a spinout that relies on molten salt to store energy for the grid in the form of heat, and Dandelion Energy, which taps into geothermal energy to heat and cool homes. Both, however, remain relatively small and are still striving to gain traction in their respective markets.

After 12 years, X has yet to deliver a breakout success in climate or clean tech. The question is whether shifting strategies at X, and the current crop of climate-related efforts like Tidal, will improve that track record.

Astro Teller, the head of X, told MIT Technology Review that the division “pushed hard on radical innovation” at first. But it has since gradually turned up the “rigor dials” in lots of ways, he says, focusing more on the feasibility of the ideas it pursued.

The earlier X climate efforts were generally high-risk, hardware-heavy projects that directly addressed energy technologies and climate emissions, producing electricity, fuels, and storage in novel ways.

There are some clear differences in the climate projects that X is publicly known to be pursuing now. The two aside from Tidal are Mineral, which is using solar-panel-equipped robots and machine learning to improve agricultural practices, and Tapestry, which is developing ways to simulate, predict, and optimize the management of electricity grids.

With Tidal, Mineral, and Tapestry, X is creating tools to ensure that industries can do more to address environmental dangers and that ecosystems can survive in a hotter, harsher world. It’s also leaning heavily into its parent company’s areas of strength, drawing on Alphabet’s robotics expertise as well as its ability to derive insights from massive amounts of data using artificial intelligence.

Such efforts might seem less transformative than, say, flying wind turbines—less moonshot, more enabling technology.

But while Teller allows that this new thinking may “be changing the character of the things that you see at X today,” he pushes back against the suggestion that the problems it’s pursuing aren’t as hard, big, or important as in the past.

“I don’t know that Tidal has to apologize for some sort of scope problem,” he says.

“Humanity needs the oceans and is killing off the oceans,” he adds. “We have to find a way to get more value from the ocean for humanity, while simultaneously regenerating the oceans instead of continuing to deplete them. And that’s just not going to happen unless we find a way to get automation into the oceans.”

A better protein source
Tidal, founded in 2018, grew out of informal conversations at X about the mounting threats to the oceans and the lack of knowledge required to address them, Davé says.

“The goal was overly simplistic: save the oceans, save the world,” he says. “But it was based on the understanding that the oceans are critical to humanity, but probably the most neglected or misused resource we have.”

They decided to begin by focusing on a single application: aquaculture, which relies on land-based tanks, sheltered bays, or open ocean pens to raise fish, shellfish, seaweed, and more. Today, these practices produce just over half the fish consumed by humans. But the more they’re used, the more they might ease the commercial pressures to overfish, the emissions from fishing fleets, and the environmental impact of trawling.

Tidal believed it could provide tools that would allow aquafarmers to monitor their fish in a more affordable way, spot signs of problems earlier, and optimize their processes to ensure better health and faster growth, at lower cost.

The researchers developed and tested a variety of prototypes for underwater camera systems. They also began training computer vision software, which can identify objects and attributes within footage. To get it started, they used goldfish in a kiddie pool.

For the last five years, they’ve been stress-testing their tools in the harsh conditions of the North Sea, through a partnership with the Norwegian seafood company Mowi.

During a Zoom call, Davé pulled up a black-and-white video of the chaos that ensues at feeding time, when salmon compete to gobble up the food dropped into the pen. It’s impossible for the naked eye to draw much meaning from the scene. But the computer vision software tags each fish with tiny colored boxes as it identifies individuals swimming through the frame, or captures them opening their mouths to feed.

Davé says fish farms can use that data in real time, even in an automated way. For instance, they might stop dropping food into the pen when the fish cease feeding.
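A hedged sketch of that kind of closed loop: count detections per frame and cut off the feed once activity tapers. The detector here is only a stub standing in for a trained computer-vision model, and the threshold is arbitrary.

```python
# Sketch of automated feeding control: count fish detected as feeding in each
# frame and stop dispensing pellets once activity drops below a threshold.
# detect_feeding_fish is a placeholder; a real system would run a trained
# vision model on every frame of the pen footage.

from typing import List, Tuple

Box = Tuple[int, int, int, int]  # x, y, width, height of a detected fish

def detect_feeding_fish(frame) -> List[Box]:
    """Placeholder for a model that returns boxes for fish with open mouths."""
    return []  # stub: no detections

def feeding_controller(frames, stop_threshold: int = 5) -> int:
    """Return the index of the frame at which feeding should stop."""
    for i, frame in enumerate(frames):
        active = len(detect_feeding_fish(frame))
        if active < stop_threshold:
            return i  # feeding activity has tapered off; stop dropping food
    return len(frames)

# With the stub detector, feeding stops at the very first frame:
print(feeding_controller(frames=[object()] * 10))  # -> 0
```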

The cameras and software can perceive other important information as well, including how much the fish weigh, whether they have reached sexual maturity, and whether they show any signs of health problems. They can detect spinal deformities, bacterial infections, and the presence of parasites known as sea lice, which are often too tiny for the human eye to see.

“We knew from the early days that aquaculture would be us getting our feet wet, so to speak,” says Grace Young, Tidal’s scientific lead. “We knew it would be a stepping stone into working on other hard problems.”

Confident that it’s created one viable commercial application, Tidal is now turning its attention to gathering information about natural ocean ecosystems.

“Now is a big moment for us,” she adds, “because we’re able to see how the tools that we built can apply and make a difference in other ocean industries.”

Restoring our coasts
Seagrasses form thick meadows that can run thousands of miles along shallow coastlines, covering up to about 0.2% of the world’s ocean floors. They provide nutrients and habitat to marine populations, filter pollution, and protect coastlines.

The plants are photosynthetic, producing the food they need from sunlight, water, and carbon dioxide dissolved in ocean waters. They store carbon in their biomass and deliver it into the seabed sediments. They also help capture and bury the carbon in other organic matter that floats past.

Globally, seagrass beds may sequester as much as 8.5 billion tons of organic carbon in seafloor sediments and, to a much, much smaller degree, in their biomass. On the high end, these meadows draw down and store away about 110 million additional tons each year.

But estimates of the total range and carbon uptake rates of seagrass vary widely. A key reason is that there is no cheap and easy way to map the planet’s extensive coastlines. Only about 60% of seagrass meadows have been surveyed in US waters, with “varying degrees of accuracy because of difficulties in remote sensing of underwater habitat,” according to a National Academies study.

The seagrass meadows along Waecicu Beach in Labuan Bajo, Indonesia.

Whatever their full expanse, though, we know they are shrinking. Development, overfishing, and pollution are all destroying coastal ecosystems, which also include carbon-sucking habitats like mangrove forests and salt marshes. Draining and excavating these shallow biological communities releases hundreds of millions of tons of carbon dioxide each year. Meanwhile, climate change itself is making ocean waters warmer, more acidic, and deeper, placing greater strains on many of the species.

Nations could help halt or reverse these trends by converting developed shorelines back into natural ones, actively managing and restoring wetlands and seagrass meadows, or planting them in new areas where they may do better as ocean levels rise.

Such work, however, would be wildly expensive. The question is who would pay for it, particularly if it comes at the expense of lucrative coastal development.

The main possibility is that companies or governments could create market incentives to support preservation and restoration by awarding credits for the additional carbon that seagrass, mangroves, and salt marshes take up and store away. Tens of billions of dollars’ worth of carbon credits are likely to be traded in voluntary markets in the coming decades, by some estimates.

The carbon market registry Verra has already developed a methodology for calculating the carbon credits earned through such work. At least one seagrass project has applied to earn credits: a long-running effort by the Nature Conservancy’s Virginia chapter to plant eelgrass around the Virginia Barrier Islands.

But some marine scientists and carbon market experts argue that there need to be more rigorous ways to ensure that these efforts are removing as much carbon as they claim. Otherwise, we risk allowing people or businesses to buy and sell carbon credits without meaningfully helping the climate.

Diving in
Tidal began exploring whether its tools could be used for seagrass late last year, as a growing body of studies underscored the need for carbon removal and highlighted the potential role of ocean-based approaches.

“We started to double-click and read a lot of studies,” Davé says. “And found out, ‘Wow, we do have some technology we’ve developed that could be applicable here.’”

The team eventually held a series of conversations with researchers at the Commonwealth Scientific and Industrial Research Organisation (CSIRO), an Australian government science agency that has long used drones, satellites, acoustic positioning systems, and other equipment to survey coral reefs, mangrove forests, and seagrass meadows across the Indo-Pacific.

Seagrass is particularly difficult to map on large scales because in satellite images it’s difficult to distinguish from other dark spots in shallow waters, says Andy Steven, a marine scientist who oversees coastal research efforts at CSIRO.

“The world needs to move to being able to map and then measure change on a far more frequent basis,” Steven says. “I see the Tidal technology being part of an arsenal of methods that help us rapidly survey, process, and deliver information to decision makers on the time frames that are needed. It is addressing a really fundamental issue.”

CSIRO agreed to help Tidal test how well its system works. They collaborated on an earlier field trial off the coast of Fiji this summer and on the subsequent experiment this September in Indonesia. The latter country’s thousands of islands boast one of the world’s largest and most diverse expanses of seagrass meadows.

For the first effort, Tidal opted to couple its software with an off-the-shelf autonomous underwater vehicle equipped with a basic camera. The hope was that if the researchers could scan meadows using standard hardware, their general approach would be more widely accessible.

It didn’t work. The seagrass was taller and the tides were lower than expected. The thruster and rudder quickly got clogged up with seaweed, forcing the team to stop every few minutes, Bahman says.

After a brainstorming whiteboard session, the Tidal team decided to take its own camera system, turn it face down, and put it on a float that could be pulled along by a boat. The so-called Hammersled is equipped with fins to keep it moving straight and a set of ropes and cleats that allow the researchers to dip the camera deeper into the water.


The system worked well enough during a few tests in a large pool in the middle of Alphabet’s campus in Sunnyvale, California, where team members pulled it by hand over patches of plastic seagrass on the bottom.

The bigger test, however, is whether Tidal can translate its maps into an accurate estimate of the carbon seagrass holds and buries in the seafloor.

‘We’ve got it’
After Steven and his colleagues arrived in Labuan Bajo, on the western tip of Flores, they rented a 14-cabin liveaboard, the Sea Safari VII, and began sailing around the islands. They launched surveillance drones from the deck to search for promising seagrass beds to study, prioritizing sites with many different species to help train Tidal’s models and algorithms for the wide variability that occurs in the natural world.

Once the CSIRO researchers selected, measured, tagged, filmed, and photographed their 100-meter transects, the Tidal team passed through.

They used a little Indonesian fishing boat to pull along the Hammersled. Bahman, software engineer Hector Yee, and other staffers took turns jumping into the water with goggles and flippers to clasp a pontoon and keep the camera pointed straight as they crisscrossed the test area.

Once the process was complete, the CSIRO researchers used spades, peat borers, and other tools to pull up the seagrass and deep sediments from one-meter-square study plots.

Back on the main island, the Australian scientists used makeshift ovens, including some created from hair dryers, to dry out the plant materials and sediments. Then they ground them up and deposited them into hundreds of plastic bags, carefully marked to denote different locations and depths.

In the months to come, they’ll analyze the carbon content in each batch at their labs in Adelaide, determining the total amount in each plot.

“If our algorithm takes a look at the data we gathered before they took the core samples and comes up with the same answer, then we’ve got it,” says Terry Smith, a solutions engineer with Tidal.

Open questions
Not everyone, however, is convinced that seagrass is a particularly promising path for carbon removal, or one whose climate benefits we’ll be able to accurately assess.

Among the suite of approaches to carbon removal that the National Academies has explored in its studies, those focusing on coastal ecosystems rank near the bottom in terms of the potential to scale them up. That’s largely because these ecosystems can only exist as narrow bands along shorelines, and there’s considerable competition with human activity.

“We need to do everything we can to preserve seagrass,” says Isaac Santos, a professor of marine biogeochemistry at the University of Gothenburg in Sweden, because of the valuable roles these plants play in protecting coasts, marine biodiversity, and more.

“But on the big question—Are they going to save us from climate change?—the answer is straightforward: No,” he says. “They don’t have enough area to sequester enough carbon to make a big impact.”

Accurately determining the net carbon and climate impact from seagrass restoration is also problematic, as studies have highlighted.

Carbon sequestration varies dramatically in these coastal meadows, depending on the location, the season, the mix of species, and how much gets gobbled up by fish and other marine creatures. The carbon in seafloor sediments can also leak into the surrounding waters, where some is dissolved and effectively remains in the ocean for millennia, and some may escape back out into the atmosphere. In addition, coastal ecosystems produce methane and nitrous oxide, potent greenhouse gases that would need to be factored into any estimate of overall climate impact.

Finally, the vast, vast majority of the carbon in seagrass beds is buried in the seafloor, not in the plant material that Tidal intends to measure.

“And we also know that the correlation between biomass and sediment carbon is not straightforward,” Santos said in an email. “Hence, any approach based on biomass only will require all sorts of validations,” to ensure that it actually provides reliable estimates of stored carbon.

An essay in The Conversation late last month highlighted another concern: environmental justice. The authors, Sonja Klinsky of Arizona State University and Terre Satterfield of the University of British Columbia, stressed that the local communities most affected by such projects should have considerable say in them. Some coastal towns may not want to turn their active harbor back into, say, a salt marsh.

“Much of the global population lives near the ocean,” they wrote, and some interventions “might impinge on places that support jobs and communities” and provide significant amounts of food.

Unlocking the secrets
Addressing the scientific questions will require better understanding of coastline ecosystems. CSIRO’s Steven says he hopes that Tidal’s technology will provide easier ways to conduct the necessary studies. “It’s absolutely a challenge,” he says. “But you’ve got to start somewhere.”

As for the environmental justice concerns, Tidal stresses that these nature-based approaches to carbon removal potentially provide multiple benefits to natural ecosystems and local communities. They could, for instance, help to sustain fishery populations. Tidal is also working with CSIRO to train local communities in Fiji and Indonesia, including university students, to help them participate directly in carbon markets.

“Ultimately, our vision is to provide these communities with tools to be able to manage, protect, and repopulate these local systems locally,” Davé said in an email.

So what’s next for Tidal?

It will still take months for the Australian team to complete its analysis of the seagrass and sediments. Whatever they find, the teams plan to continue conducting field experiments to refine the models and algorithms and make sure they provide accurate carbon estimates across a variety of seagrass types in different regions and conditions.

For instance, Tidal may look to partner with other research groups focused on the Bahamas, another major seagrass region.

If it does ultimately work well, Tidal believes, its suite of tools could also support other ocean-based approaches to carbon removal, including growing more seaweed and restoring mangrove forests.

Davé says he can envision a variety of potential business models, including providing carbon measurement, reporting, and verification as a service to offset registries or organizations carrying out restoration work. They might also create autonomous robotic systems that plant seagrass with little human involvement.

Even if the systems don’t provide reliable enough carbon estimates, Tidal believes its efforts will still aid scientific efforts to understand crucial ocean ecosystems, and support international efforts to protect them. That could include monitoring the well-being of coral reefs, which are gravely threatened by warming waters, Davé says.

It may not sound like a moonshot in the way that X originally conceived of the concept. It’s certainly no space elevator.

But by building tools that a variety of organizations could use in a variety of ways to unlock the secrets of Earth’s critical and fragile ecosystems, Tidal may be demonstrating a new way to take on really hard problems.


Read More »



Can a Map of the Ocean Floor Be Crowdsourced?

Via BBC, an article on the potential for crowdsourcing to help map the ocean floor:

Tucked inside a federal government building in the American Rockies is the world’s best collection of seafloor maps. Occasionally a hard drive arrives in the mail, filled with new bathymetric – or seafloor – charts collected by survey vessels and research ships cruising the seas. The world’s largest public map of Earth’s oceans grows just a little bit more.

Cloaked in ocean, the seafloor has resisted human exploration for centuries. Folklore and myths told of it as the domain of terrifying sea monsters, gods, goddesses and lost underwater cities. Victorian-era sailors believed that there was no ocean floor at all, just an infinite abyss where the bodies of drowned sailors came to rest in watery purgatory.

Throughout the last century, modern scientific techniques and sonar have dispelled the stories and revealed a little-understood seascape of crusted brine lakes, steaming volcanoes, and vast undulating underwater plains. We have only just begun to map, much less explore, this enormous subsea world.

One organisation wants to change this – and quickly. In 2023, Seabed 2030 announced that its latest map of the entire seafloor is nearly 25% complete. The data to make the world’s first publicly available map is stored at the International Hydrographic Organization (IHO)’s Data Centre for Digital Bathymetry (DCDB) in a government building in Boulder, Colorado.

So far, the DCDB holds over 40 compressed terabytes of seafloor data. The biggest contributor is the US academic fleet: 17 research vessels owned by American universities which constantly circle the globe studying the deep ocean. Other contributors include the National Oceanic and Atmospheric Administration (NOAA) fleet, the Geological Survey of Ireland, and Germany’s Federal Maritime and Hydrographic Agency. The biggest users are scientists all over the world who rely on the data to conduct research.

Seabed 2030 has made extraordinary progress by asking countries and corporations to share maps with the DCDB. But unfortunately, the map is not growing quickly enough. Between 2016 and 2021, the map leapfrogged from 6% to 20%. Since then, the pace has slowed. In 2022, it reached just 23.3% complete; in 2023, 24.9%. The ocean mappers came up with a new plan: crowdsourcing.

“Crowdsourced bathymetry came about a few years ago when the IHO was saying: ‘At this rate, we’re never going to map the whole darn ocean; we need to start looking outside the box,'” says Jennifer Jencks, the director of the DCDB and the chair of a crowdsourced-bathymetry working group at the IHO.

By attaching a data logger to a boat’s echosounder, any vessel can build a simple map of the seafloor. This is crucial in developing coastal and island nations. Tion Uriam, the head of the Hydrographic Unit at the Republic of Kiribati’s Ministry of Communications, Transport and Tourism Development, recently received two data loggers that he’s planning to install on local ferries. “It’s a win to be part of that initiative,” he says. “Just to put us on the map and raise our hands [to say] we want to be part of a global effort. Our contribution might be small – but it’s a contribution.”
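The measurement itself is simple: the echosounder times the round trip of a sonar ping to the seabed, and depth is half that travel time multiplied by the speed of sound in seawater. A minimal sketch of what a logger would record and compute, using a typical sound speed rather than a survey-corrected one:

```python
# Minimal sketch of crowdsourced bathymetry from an echosounder log: the
# sounder pings the seabed and measures the two-way travel time of the echo;
# depth is half the travel time times the speed of sound in seawater. The
# speed used here (~1500 m/s) is a typical value; real surveys correct it for
# temperature and salinity.

SOUND_SPEED_MS = 1500.0     # approximate speed of sound in seawater, m/s
METERS_PER_FATHOM = 1.8288  # for comparison with older charts marked in fathoms

def depth_from_ping(two_way_travel_s: float) -> float:
    """Depth in meters from an echosounder's two-way travel time."""
    return SOUND_SPEED_MS * two_way_travel_s / 2.0

depth_m = depth_from_ping(0.08)               # an 80 ms round trip
print(round(depth_m, 1))                      # -> 60.0 meters
print(round(depth_m / METERS_PER_FATHOM, 1))  # -> 32.8 fathoms
```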

Kiribati is a Pacific island nation of about 130,000 people spread across 33 coral atolls, only 20 of which are inhabited. British charts published in the 1950s and 1960s have been the most accurate maps to date; the United Kingdom and United States claimed various islands as protectorates or territories, mining them for phosphate or using them as whaling stations. The other British maps in use are old and inaccurate; some date back to the late Victorian age or list depth measurements in fathoms, which most countries moved on from years ago (the US only retired it in 2022).

That isn’t so unusual in the Pacific, according to marine geologist Kevin Mackay, who oversees Seabed 2030’s South and West Pacific Regional Centre at New Zealand’s National Institute of Water and Atmospheric Research (Niwa) in the capital Wellington. “The big problem in the Pacific is the relic of the colonial system. So, in the Pacific, who looks after the mapping? It’s the Americans through their territories, or the UK through their territories, or the French and their islands, even though they’re now officially independent.” Kiribati gained independence in 1979, but there’s been little progress on surveying since then. In 2020, the World Bank funded a $42m (£34.1m) project to improve maritime infrastructure in the outer islands. A portion of that will go toward seabed mapping.

Kiribati is one of the least developed countries in the world, and most i-Kiribati (the name for Kiribati’s inhabitants) live in the capital, South Tarawa: a 17 sq km (6.5 sq mile) crescent-shaped atoll with a population density equal to Tokyo’s. More people are crowding into the capital in search of a modern life, while the rest live on remote islands where poverty and unemployment are high, amenities are poor and the long-term future is uncertain because of rising sea levels and severe tropical storms.

Improved charts could boost trade, transit and tourism on the outer islands. They could help communities plan for tsunamis, storm surges and rising shorelines. Many islands lack basic tide gauges, and so visiting ships time their arrival for high tide. In his meetings with government ministers, Uriam tries to stress the economic benefits of improving nautical charts in Kiribati.

However, there’s a roadblock when it comes to sharing maps with the DCDB archive back in Boulder. Around a third of the IHO’s 98 member states allow crowdsourcing inside territorial waters. However, the Pacific island nations of Kiribati, the Independent State of Samoa and the Cook Islands, which all recently received data loggers from Seabed 2030, are not among them. Until the governments give their blessing, the new crowdsourced maps will remain under wraps.

Despite Seabed 2030’s publicly stated scientific goal, the military or commercial value of nautical charts will always be a barrier to achieving complete coverage of the world map. “Sea charts, by their very nature, were destined to be removed from the academic realm and from general circulation,” wrote the map historian Lloyd Brown in his book The Story of Maps. “They were much more than an aid to navigation; they were in effect, the key to empire, the way to wealth.”

In a world where only a quarter of the seafloor is charted, there’s still an advantage in knowing more than your rivals. Niwa’s Mackay experienced this himself on a scientific-mapping expedition. He received a call from a military he chooses not to name and “they said ‘you need to destroy that data because there was military value in what you’re mapping, because it’s a place where submarines like to hide’,” he recalls. “Obviously, we ignore them because we’re [mapping] for science, we don’t care. But the military, they find lots of value in bathymetry that, as a scientist, we don’t even think about.”

For some nations, it’s also suspicious that the DCDB is based in the United States, which has the world’s most powerful military. “We have seen concerns as well, that the DCDB is hosted by the United States. Not everyone loves that,” says Jencks. She tries to assuage these concerns by stressing that the DCDB was endorsed by all IHO member states back when it was created in 1990.

In Kiribati, the challenges are less political, more practical, according to Uriam. His position as the head of the Hydrographic Unit only became permanent about a year ago. He used to work in the fisheries department and he knows just how hard it is to share data across departments, let alone with outsiders. There are also hurdles around storing data and hiring people with the right expertise to manage them. Another concern: foreign research vessels have mapped some of Kiribati’s territorial waters before and neglected to share data with the country’s government.

With just over six years left until the deadline, Seabed 2030 faces serious challenges in finishing the first public map of the seafloor: the staggering size of the ocean, the depths, and the hostile offshore working environment, where ocean mappers are constantly contending with wind, waves, and the corrosive effects of salt water. Then there’s the cost of mapping remote international waters, where no country has a responsibility to map.

However, all these challenges seem small compared to the work of uniting countries behind a collective goal, particularly ones as diverse as the US and the Republic of Kiribati. The differences help explain why the goal of finishing a complete map of the seafloor may remain out of reach for many decades to come.


Read More »



The Race to Save the World’s DNA

Via The New Yorker, a look at a scientific rescue mission which aims to analyze every plant, animal, and fungus before it’s too late:

Four years ago, a few hundred miles off the coast of West Africa, a crane lifted a bulbous yellow submarine from the research vessel Poseidon and lowered it into the Atlantic. Inside the sub, Karen Osborn, a zoologist at the Smithsonian Institution who was swaddled in warm clothes, tried to ward off nausea. During half an hour of safety checks, Osborn watched water slosh across the submarine’s round window, washing-machine style. Then the crew gave the all-clear and the vessel descended. In the waters of Cape Verde, a volcanic archipelago that is famous for its marine life, Osborn felt the seasickness dissipate. She pressed her face against the glass, peering out at sea creatures until her forehead bruised. “You’re just completely mesmerized by getting to look at these animals in their natural habitat,” she told me.

Osborn was on a mission to find several elusive species, including a bioluminescent worm called Poeobius, and to sequence their genes for a global database of DNA. “We need the genome to figure out how these things are related to each other,” she explained. “Once we have that tree, we can start asking interesting questions about how those animals evolved, how they’ve changed through time, how they’ve adapted to their habitats.” Eventually, such genomes could inspire profound innovations, from new crops to medical cures. Osborn was starting to worry, however: she had already made several trips in the submarine and had not seen a single Poeobius. Each worm measures just a few centimetres in length and feeds on marine snow, or organic detritus that falls from the surface. Because it is yellow on one end, like a cigarette, it is sometimes called the butt worm.

As the pilot steered into deeper waters, Osborn operated a suction hose at the end of a robotic arm. Whenever she spotted organisms that she wanted to sample—crustaceans, sea butterflies, jellies—she’d suck them through a tube and into a collection box that was filled with seawater. She started to wish that the submarine had a rest room on board. Then, a few hundred metres down, she finally saw a group of Poeobius. “Oh, that’s what we want!” she remembers exclaiming. “Go! Go get that!” The pilot slowly turned the sub and Osborn sucked up the worms.

Back on the ship, even before using the rest room, Osborn deposited her boxes in an onboard laboratory. “It’s always exciting to climb out and go look at all the samplers, and take them into the lab and see what animals you’ve gotten,” she told me. She placed one of the Poeobius worms under a microscope, anesthetized it, sliced off a bit of gelatinous tissue, and placed it into a vial, which contained a liquid that would protect the DNA from deterioration. (The butt worm did not survive.) Back at the Smithsonian, a team would extract the genetic material and sequence it. It would soon become a new branch on a growing tree of life.

The evolution of life on Earth—a process that has spanned billions of years and innumerable strands of DNA—could be considered the biggest experiment in history. It has given rise to amoebas and dinosaurs; fireflies and flytraps; even mammals that look like ducks and fish that look like horses. These species have solved countless ecological problems, finding novel ways to eat, evade, defend, compete, and multiply. Their genomes contain information that humans could use to reconstruct the origins of life, develop new foods and medicines and materials, and even save species that are dying out. But we are also losing much of the data; humans are one of the main causes of an ongoing mass extinction. More than forty thousand animal, fungal, and plant species are considered threatened—and those are just the ones we know about.

Osborn is part of a group of scientists who are mounting a kind of scientific salvage mission. It is known as the Earth BioGenome Project, or E.B.P., and its goal is to sequence a genome from every plant, animal, and fungus on the planet, as well as from many single-celled organisms, such as algae, retrieving the results of life’s grand experiment before it’s too late. “This is a completely wonderful and insane goal,” Hank Greely, a Stanford law professor who works with the E.B.P., told me. The effort, described by its organizers as a “moonshot for biology,” will likely cost billions of dollars—yet it does not currently have any direct funding, and depends instead on the volunteer work of scientists who do. Researchers will need to scour oceans, deserts, and rain forests to collect samples before species die out. And, as new species are discovered, the task of sequencing all of them will only grow. “That’s a heavy aspiration that will probably never be entirely achieved,” Greely, who is seventy-one, told me. “It’s like, when you’re my age, planting a young oak tree in your yard. You’re not going to live to see that be a mature oak, but your hope is somebody will.”

For hundreds of years, biologists have roamed the globe in an epic effort to collect and categorize the life on Earth. In the seventeen-hundreds, after traversing Sweden to document its flora and fauna, Carl Linnaeus helped create the system that scientists still use to classify and name species, from Homo sapiens to Poeobius meseres. In 1831, Charles Darwin set out aboard H.M.S. Beagle to collect living and fossilized specimens, which inspired his theory of natural selection. The discovery of DNA, in the nineteenth century, offered a new way to classify species: by comparing their genetic material. DNA’s four building blocks—adenine (A), thymine (T), guanine (G), and cytosine (C)—encode profound differences between organisms. By studying their sequence, we might come to speak life’s language.

Scientists didn’t even begin to sequence a DNA molecule until 1968. In 1977, they sequenced the roughly five thousand base pairs in a virus that invades bacteria. And, in 1990, the Human Genome Project started the thirteen-year process of sequencing almost all of the three billion base pairs in our DNA. Its organizers called the endeavor “one of the most ambitious scientific undertakings of all time, even compared to splitting the atom or going to the moon.” Since then, researchers have been filling in gaps and improving the quality of their sequences, in part by using a new format known as a telomere-to-telomere, or T2T, genome. The first T2T human genome was sequenced only last year, but already scientists with the Earth BioGenome Project are talking about repeating this process for every known eukaryotic species. (Eukaryotes are organisms whose cells have nuclei.)

Because the E.B.P. does not have its own funding, it does not sample or sequence species on its own. Instead, it’s a network of networks; its organizers set ethical and scientific standards for more than fifty projects, including the Darwin Tree of Life, Vertebrate Genomes Project, the African BioGenome Project, and the Butterfly Genome Project. This way, “when we get to the end of the project, it’s not the Tower of Babel,” Harris Lewin, an evolutionary biologist at the University of California, Davis, who chairs the E.B.P. executive council, told me. “You know—your genomes are produced this way, and mine are produced that way, and they’re of different quality, so that, when you compare them, you get different results.”

By 2025, the participants hope to assemble about nine thousand sequences, one from every known family of eukaryotes. By 2029, they aim to have one sequence from every genus—a hundred and eighty thousand in all. After the third and final phase, which could be completed a decade from now, they aim to have sequenced all 1.8 million species that scientists have documented so far. (Roughly eighty per cent of eukaryotic species are still undiscovered.) This database of genomes, including annotations and metadata, will require close to an exabyte of data, or as much as two hundred million DVDs. The amount of information involved is more than “astronomical,” Lewin said; it’s “genomical.” He compared the project to the Webb Space Telescope, which received about ten billion dollars of government funding. Given how much these projects change the way that humans see the world, Lewin said, “the cost is really not that much.”
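The DVD comparison holds up as rough arithmetic: one exabyte divided by the roughly 4.7 GB a single-layer disc holds lands at around two hundred million discs.

```python
# Back-of-the-envelope check on the storage figure above: one exabyte divided
# by the capacity of a single-layer DVD (~4.7 GB) is on the order of two
# hundred million discs.

EXABYTE_BYTES = 10**18
DVD_BYTES = 4.7 * 10**9  # approximate single-layer DVD capacity, in bytes

print(round(EXABYTE_BYTES / DVD_BYTES / 1e6))  # -> 213 (million DVDs)
```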

Natural-history museums already have some of the samples needed to outline a genetic tree of life. The Smithsonian, for instance, has about fifty million biological samples. But, because DNA degrades quickly, it’s difficult to extract a high-quality sequence from, say, a frog in formaldehyde or an old taxidermy parrot. For this reason, the E.B.P. usually restricts itself to recent samples, which are often frozen. It relies on the Global Genome Biodiversity Network to keep track of who has what; another database, called Genomes on a Tree, tracks which species have been sequenced already, and whether they meet exacting standards. Scientists such as Osborn will have to find the rest—and their jobs will only become more difficult as the low-hanging fruit is plucked.

After Osborn collected her butt worms, she had to transport them to her colleagues at the Smithsonian. This process can be more difficult than it sounds. Many researchers keep their samples intact by packing them with dry ice or liquid nitrogen in the field; airport-security workers sometimes flag these packages as suspicious, leading to delays that can spoil the DNA and waste an expedition. Osborn, for her part, checked a large insulated box on the flight from Cape Verde, and then waited a few hours in Newark for Fish and Wildlife officials to approve it for entry. As it turned out, her samples came from an entirely new species of Poeobius; a paper announcing the discovery is forthcoming.

The first stop in the journey from sample to sequence is a genetics laboratory such as the Vertebrate Genome Lab, at the Rockefeller University, on the eastern shore of Manhattan. On a drizzly day last May, I visited the V.G.L. to see how scientists turn a bit of animal tissue into a string of billions of letters. Olivier Fedrigo, a bespectacled geneticist who was then the lab’s director, led me down a hallway decorated with photos of species that had been sequenced there: a snake, a swan, a shark. It was a kind of trophy wall on which inclusion signified not death but a kind of immortality.

Researchers extract DNA from animal tissue in a biosafety-level-two room, which requires goggles, gloves, coats, and special ventilation to protect people and samples. Nivesh Jain, a scientist who works there, told me that he minces the tissue and places it in a lysis buffer—a chemical that breaks open cells—and then uses one of two methods to get the DNA out. The first is a type of microscopic magnetic bead, which is treated with chemicals that help it stick to genetic material; magnets hold the beads and their attached DNA in place while Jain washes everything else away. The second is a glass wafer called a Nanobind disk, which similarly sticks to DNA while Jain removes the rest of the sample. When we met, Jain was standing at a lab bench, checking the concentration of DNA in a vial. The vial would then go to another room, where Jennifer Balacco, the lab-operation lead, would pipette pieces of extracted DNA into little plastic tubes. Special enzymes attach short, recognizable pieces of DNA, called adapters, to the animal DNA, which readies them for the sequencer.

Finally, the samples travel into refrigerator-size PacBio sequencing machines, which, in this case, were labelled with nicknames from “Star Trek.” Enzymes latch onto the adapters and traverse the strands, attaching a color-coded molecule to every building block of DNA. The machine detects the colors and “reads” the sequence that they represent.

It’s not enough to sequence DNA in pieces: scientists must figure out how each fragment connects to make a genome. Genomes tend to be bundled up in complicated shapes. A technique called Hi-C mapping “helps you to sort out the puzzle pieces,” Fedrigo told me. The resulting map of folded DNA is crowded with colorful squiggles. At some computers down the hall from the sequencers, the maps help another team of researchers assemble sequence fragments into a full T2T genome. Nadolina Brajuka, a bioinformatician, was assembling an Asian-elephant genome. “I can physically use key and mouse controls and pick pieces of the genome up and move them around,” she said. The last step is for a “data wrangler” on the team to upload the raw-sequence data file, the final genome assembly, and background information about the sample—including where, when, and how it was collected, and a photo of the species—to a public server called GenomeArk.

One goal of the E.B.P. is to compare and contrast large numbers of genomes, revealing how they are related. Benedict Paten, a computational biologist at the University of California, Santa Cruz, has developed software to align genomes and determine which genes correspond to one another. “It’s a really rich and difficult problem,” he told me, “because genomes evolve by a bunch of really complicated processes.” For a 2020 Nature paper, Paten and several collaborators used powerful computers to align more than a trillion As, Ts, Gs, and Cs and create a tree of six hundred bird and mammal species. On a typical home computer, such an undertaking could have taken more than a million hours. “If you wanted to do it for all plants and animals, it’s just a vast computational challenge,” Paten told me.
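To see why, consider that even the textbook dynamic-programming method for aligning just two sequences, Needleman-Wunsch, costs time and memory proportional to the product of their lengths. Production genome aligners rely on far more sophisticated heuristics; the toy version below only illustrates the quadratic core that makes trillion-letter comparisons so demanding.

```python
# Toy Needleman-Wunsch global alignment: the score table has one cell per pair
# of positions, so aligning two sequences of lengths n and m touches n * m
# cells. That quadratic cost is why aligning whole genomes, with billions of
# bases each, requires heuristics and heavyweight compute.

def needleman_wunsch(a: str, b: str, match=1, mismatch=-1, gap=-1) -> int:
    """Return the optimal global alignment score of two DNA strings."""
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        score[i][0] = i * gap
    for j in range(1, cols):
        score[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[-1][-1]

# Two tiny example strings; real genomes run to billions of bases.
print(needleman_wunsch("GATTACA", "GCATGCT"))
```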

During my trip to the Rockefeller University, I visited Erich Jarvis, a well-dressed neurogeneticist who leads the Vertebrate Genomes Project, and asked him to show me the kinds of experiments that the E.B.P. will unlock. Jarvis, the son of two musicians, grew up in Harlem and originally trained as a dancer; today he studies the genes that help animals learn to imitate sounds.

We walked through Jarvis’s expansive laboratory toward a scientist who was peering through a microscope at a bird embryo. In this early stage of development, the scientist explained, it was possible to inject the embryo with cells that contain modified DNA. When the so-called transgenic bird hatched, the lab would be able to study whether the foreign genes affected its ability to learn songs.

A nearby room was filled with caged birds and mice; speakers played sounds while cameras and microphones recorded how animals responded. I bent down to look at a zebra finch, which was chirping away. A surprisingly small number of animals have been shown to imitate sounds, Jarvis told me: songbirds, hummingbirds, parrots, dolphins, whales, seals, bats, elephants, and humans. Figuring out what these animals have in common could help us understand the genetic roots of spoken language. This kind of research, Jarvis went on, is possible only with high-quality complete DNA sequences.

“We humans would benefit so much from nature’s experiment,” Jarvis said. Some species are resistant to SARS-CoV-2. Some, including parrots and elephants, rarely get cancer. Some crops produce more food than others. “We’re going to lose that information if we don’t do something about it soon,” he said. The E.B.P. could also empower scientists to study the health of ecosystems. A researcher with access to full genomes can sample some pond water and figure out which species are living there. Such studies could help humans reverse the harms of agriculture, urbanization, and climate change—and fulfill what Jarvis called a “moral duty” to save fellow-species.

The Earth BioGenome Project “is going to blow the door wide open on conservation genomics,” Bridget Baumgartner, who works for an organization called Revive & Restore, told me. Her project, Wild Genomes, is trying to use DNA for the management of endangered species. In Bolivia, scientists are sequencing jaguars to determine which population individual jaguars came from, and also to track illegal wildlife trafficking. In the Mojave Desert, researchers are comparing the genomes of trees that survive in different temperatures, so they’ll know which individuals of that species could be planted in other places as the climate changes. And, in the archipelago of Indonesia, binturongs have been rescued from smugglers and returned to their specific island of origin, which can be determined through DNA. The other part of Revive & Restore aims for the de-extinction of lost species such as the passenger pigeon, with help from the genomes of living animals. Much of the funding for this work originally came from wealthy Bay Area tech investors—“not the typical conservation funder,” Ryan Phelan, Revive & Restore’s executive director and co-founder, said—but increasingly comes from governments.

Right now, the sequencing process is so cumbersome that scientists can’t hope to repeat it a million-plus times in the coming decade. To achieve the necessary pace of hundreds of genomes a day, they will need to automate much of it, perhaps with robots that can prepare samples and improved algorithms that can assemble genomes—though the bottleneck, Lewin stressed, is still the sampling. Of course, all of this will require funding. There’s little precedent for a government project that touches so many scientific fields, Lewin told me. “In the U.S., if you can eat it, U.S.D.A. will fund it. If it’ll kill you, N.I.H. will fund it. If it’s good for energy production, the Department of Energy will fund it. And, if you have some interesting scientific questions, the National Science Foundation will fund it. But there’s no agency that owns it all.” For that reason, Lewin said, the E.B.P.’s organizers are less focussed on assembling a patchwork of grants than finding what he called “a visionary philanthropist.”

Sooner or later, a global database of genomes will have profound practical implications. Some creatures can regrow their limbs; others do not appear to die unless they suffer an injury. If the basis for such traits can be pinpointed in genes, humans might be able to borrow them, perhaps by using gene therapies. “Evolution has already done nearly every experiment, right?” Lewin told me. “There are organisms that’ll eat oil spills, there are organisms that’ll eat heavy metals. I mean, it’s incredible.” But, when genomes inspire new products, to whom will they belong? This question makes the E.B.P. not only a scientific project but a political one.

In the nineties, scientists from the Human Genome Project argued that DNA sequences should be in the public domain, meaning that anyone, anywhere, would be able to use them. “That has been an animating principle for genomics for the past, like, thirty years,” Jacob Sherkow, a professor at the University of Illinois College of Law, told me. More recently, views have changed. “ ‘Public domain’ is a deceptive term used to deny Indigenous peoples rights from things important to them,” Ben Te Aika, an expert on the traditional knowledge of the Māori people, in New Zealand, told me. “It would be more honest to say ‘domain of the élites.’ ” In the two-thousands, many observers worried that wealthy nations would exploit biological samples without compensating the countries that they come from. This concern helped inspire the Nagoya Protocol, a piece of international legislation that encourages “benefit sharing,” and instructs countries to agree on terms before biological samples are shared. More than a hundred countries have ratified it. (The U.S. is not one of them.)

Te Aika told me that, after centuries of European colonialism, his community has been reasserting its mana, or traditional authority, over native species. He argues that the Māori people should have the opportunity to benefit from any scientific samples that are gathered in New Zealand. With a colleague from Ireland, Ann Mc Cartney, Te Aika has co-authored papers in support of data sovereignty, or the right of local and Indigenous people “to control data from and about their communities, land, species, and waters.” They described the E.B.P. as “an opportunity to leave no one behind.” The scientific collaboration that Te Aika works for, Genomics Aotearoa, is not affiliated with the E.B.P. and has adopted an unusual structure: its data is accessible only to researchers who apply and are invited to travel to New Zealand. Outside scientists may see such restrictions as a kind of red tape, Te Aika said, but “ ‘red tape’ can become necessary when self-regulating systems fail.”

Several scientists told me that the Nagoya Protocol is already outdated. “Benefit sharing in the Nagoya Protocol is getting more strict and confusing,” in part because of debates about how to interpret it, Jarvis said. Currently, he argued, the protocol is discouraging scientists from developing products at all—an outcome that, in his view, helps no one. One argument for commercializing genomes is that “then you can get financial benefit going back to the people that are the caretakers of the land where the animal came from,” he said. “Something has to change.”

The most complex debate, Sherkow told me, is about whether a digital DNA sequence counts as a biological sample. If not, the Nagoya Protocol wouldn’t apply to the strings of letters stored in the E.B.P., and, as Sherkow put it, “It’s everyone for themselves.” Any scientist, company, or country could download a sequence and use it for their own ends, without consulting or compensating the community that the sequence originated from. But, if the sequence is a sample, then genomes will be governed by Nagoya, and many difficult questions will follow. How should the benefits of a discovery or product be shared? Are they owed to the country that the sequence came from, or someone else, such as an Indigenous group? Communities need an opportunity to voice their own priorities: some may want to build capacity for their own research, and others may want compensation or simply credit for their contributions to a discovery. Some of the scientists I spoke to felt that new international laws would need to be written to answer these questions.

The E.B.P. has formed an Ethical, Legal, and Social Issues Committee to work through such challenges. Sherkow described its work as a balancing act: “What’s best for science? What’s best for the world? What’s best for the particular country that we’re taking samples from?” Greely, who chairs the committee, said that it also develops best practices in other areas: interactions with local communities, the humane treatment of animals being sampled, whether to sample in countries ruled by “nasty regimes,” authorship on papers, and even risks of bioterrorism. He added that he was stunned to learn how many international treaties affect biological resources—treaties on food and agriculture, migratory species, whaling, the law of the sea, and more. “A lot of the hangups are not scientific or even engineering hangups,” Sherkow told me. “The biggest hangup to sequencing all the world’s non-human eukaryotes is humans.”

The quest to document life spans scientific disciplines, continents, and generations. Darwin first drew a tree of life in his notebook around 1837; nearly two hundred years later, the E.B.P. could finish some of what he started. Last May, Mark Blaxter, an evolutionary biologist in the U.K. who contributes to the project and is the director of the Darwin Tree of Life, sat down in the grass in his back yard, cracked open a beer, signed on to Zoom from his laptop, and told me about the new era of biology that he foresees. Periodically, Blaxter, who has long white hair and a graying beard, interrupted himself to identify the creepies that were crawling around him: ladybug, bee, pill bug. “There’s two species of ant on this piece of grass,” he observed. “Only one of them’s biting me, though.”

Charlotte Wright, a twenty-five-year-old doctoral student who likes catching bugs, was drinking a beer with Blaxter that day. Wright studies moths, which, along with butterflies, make up a tenth of all known eukaryotic species. They, too, are mysterious. Human genomes typically have twenty-three pairs of chromosomes; Lepidoptera can have anywhere from five to two hundred and twenty-six. “That gives them the greatest range in chromosome number of any group of organisms on Earth,” Wright said. “They’re completely bonkers.” Because it’s difficult for animals with different numbers of chromosomes to produce offspring, studying chromosome evolution can shed light on how one species diverges into many—one of biology’s fundamental questions.

Blaxter watched a bee fly into his house. Then he reflected on the many drugs that have come from the natural world over the years. Aspirin was first derived from willow bark, which had been used to relieve pain since ancient times. “We think that by sequencing, for example, fungi, there’s going to be a huge new pharmacopoeia opened up,” he told me. “Think about the transformative effect that the human genome had on our understanding of human biology and medicine and disease and health. We want that to be available for everyone.”

When Blaxter became a biologist, in the eighties, scientists had not even begun to sequence the human genome. Back then, “biodiversity” was still a new term; humans were only starting to grasp just how many species were vanishing forever, and how much our activities were transforming the planet and its climate. Blaxter, who is sixty-three, seemed conscious that he might not live long enough to see all the impacts of the genomic revolution. “I’m on my way out,” he said. “I’m the old generation, right?” Wright’s generation would inherit unprecedented challenges, but she would also build on an unprecedented foundation of knowledge about the natural world. “Charlotte’s going to be one of the first generation of genome natives,” Blaxter told me. “What we want to do with this project is to change the way biology is done forever.”


Read More »



USVs Could Deter IUU Fishing

Via the US Naval Institute, a report on how unmanned saildrones, deployed primarily for maritime security at present, can also support conservation efforts:

In the opening scenes of Top Gun: Maverick, Admiral Chester Cain tells Maverick, “These planes you’ve been testing, Captain, one day, sooner than later, they won’t need pilots at all . . . the future is coming, and you’re not in it.” The Coast Guard faces a similar reckoning. Autonomous technology is an attractive solution to many maritime security challenges. Autonomous oceangoing vessels, for example, could be a critical force multiplier in combating illegal, unreported, and unregulated (IUU) fishing. As the Coast Guard continues to increase its support of a free and open Indo-Pacific, it must expedite the deployment of autonomous technology to build its capacity to monitor, detect, and deter IUU fishing.

USVs as Deterrence
The presence of Coast Guard assets deters illegal fishing to some extent. This likely would be the case whether the asset were a cutter or an autonomous USV patrolling the high seas. If bad actors know a Coast Guard asset is in the area, they are more likely to check their practices and location before setting fishing lines.

When a Coast Guard vessel is unavailable or unable to operate in a region for an extended period, autonomous USVs could be used to observe, detect, and deter IUU fishing. In addition, the data collected by the USVs could support global transparency efforts and supply allies with critical information within the maritime domain. Following President Joe Biden’s recent announcement expanding the Pacific Remote Islands Marine National Monument, USVs could provide a persistent presence in waters far from any supporting Coast Guard asset.

Commercially Available
USVs already have proved to be a viable tool for maritime domain awareness. The Coast Guard conducted a 30-day proof-of-concept in 2020, testing three different autonomous uncrewed surface vehicles. Yet, three years later, these “low-cost maritime domain awareness” solutions have yet to see active use by the Coast Guard. As the pilot study report noted, USVs could be useful in identifying fishing vessel activity and supporting search and rescue. Further, reports from USVs could allow the Coast Guard to adaptively deploy cutter assets to areas of concentrated fishing effort.
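
As a rough illustration of how such reports might feed adaptive deployment, the sketch below bins hypothetical vessel-detection positions into a coarse latitude/longitude grid and ranks the busiest cells. The coordinates, cell size, and workflow are assumptions made for illustration, not an actual Coast Guard tool.

```python
from collections import Counter

# Hypothetical USV detection reports: (latitude, longitude) of sighted fishing vessels
detections = [
    (5.12, 160.33), (5.15, 160.41), (5.90, 161.02),
    (5.14, 160.38), (6.45, 159.80), (5.11, 160.35),
]

def grid_cell(lat: float, lon: float, cell_deg: float = 0.5) -> tuple:
    """Snap a position to the lower-left corner of a cell_deg x cell_deg grid cell."""
    return (round(lat // cell_deg * cell_deg, 2),
            round(lon // cell_deg * cell_deg, 2))

# Count detections per grid cell and list the cells with the most activity,
# i.e. candidate areas for sending a cutter
effort = Counter(grid_cell(lat, lon) for lat, lon in detections)
for cell, count in effort.most_common(3):
    print(f"cell starting at {cell}: {count} detections")
```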

USVs are already supporting maritime domain awareness in other regions. The Saildrone has undertaken both maritime security and scientific missions. For example, the National Oceanic and Atmospheric Administration (NOAA) tasked three Saildrones to sail more than 6,000 nautical miles collecting fisheries data, which in turn supported the Alaska Pollock Stock Assessment. In another 2021 partnership with NOAA, five Saildrones sailed into the eye of a hurricane. The U.S. Fifth Fleet has deployed Saildrones across the Arabian Gulf and has a goal of deploying 100 more by the end of summer 2023. In addition, the Fourth Fleet is preparing to deploy USVs to counter transnational criminal organizations and Chinese IUU fishing in both the Atlantic and the Pacific Oceans off Central and South America.

NOAA and the Navy have integrated and successfully deployed Saildrone at scale, further demonstrating the applicability and utility of the technology. A USV program could be implemented immediately using the infrastructure and standard operating procedures established by Fifth Fleet. As Coast Guard Commandant Admiral Linda Fagan stated, “Tomorrow looks different. So will we. We will be a more adaptive and connected Coast Guard that generates sustained readiness, resilience, and capability—in new ways—to enhance our Nation’s maritime safety, security, and prosperity.” The Coast Guard must be innovative and able to adapt to the changing maritime landscape. Building capacity by deploying advanced technology in a public-private partnership would greatly advance the service, inspire its workforce, and change the game in maritime security.

Lack of Resources
The growing demand for assets in the Indo-Pacific has exacerbated the Coast Guard’s workforce shortage. The Indo-Pacific region covers more than 65 percent of the global maritime waters and 56 percent of the global ocean capture fisheries. With thousands of fishing and shipping vessels roaming the high seas, the Coast Guard requires additional support and ways to increase its presence. With current resource allocations, the service has little chance of covering this area of responsibility effectively to protect biodiversity and curb illegal fishing.

Autonomous USVs could help fill the void. Seagoing USVs require less manning and less support, and they can provide the presence needed to deter illicit activity, offering a solution to current and future manpower challenges.

The Human Element
While autonomous USVs are useful, they cannot replace a human in every situation. Manned ships and crews still will be needed to represent the United States, the Coast Guard, and democracy; a USV cannot provide the same relationship. However, this should not be seen as a shortfall, but rather as a capability that must be strengthened. A USV can augment the mission and serve as a force multiplier. Imagine if, instead of sending one fast response cutter (FRC) 2,000 miles by itself, the Coast Guard sent an FRC and four Saildrones, expanding its coverage and presence in the region.

The Gray Area of Regulatory Framework
Another challenge is the recognition of and regulatory framework for autonomous vehicles on the high seas. Consider the seizures of U.S. unmanned vehicles by Iran in the Red Sea in 2022 and by China in the South China Sea in 2016. Following the 2016 incident, the Pentagon responded: “It is ours. It is clearly marked; we’d like to have it back and [would] like this to never happen again.” But the legal framework is not clear on what protections and authorities a USV is granted.

Questions for the future include how to treat these situations and what policy framework is needed. If a Chinese distant-water fishing vessel in the Indo-Pacific rams and sinks a Coast Guard USV, what legal repercussions should be pursued? The contingencies and legal responses will need to be clear, concise, and well thought out, but this should not deter the Coast Guard from moving forward. Questions of autonomous-vehicle management are already being addressed on land in both the private and public sectors, and those policies will help inform policies for the maritime environment. However, challenges will remain inside national jurisdictions and on the high seas. A similar challenge will play out in space in the coming years in terms of jurisdiction, responsibility, and legal authorities.

Looking Forward
There is no foreseeable future in which the Coast Guard would be better off without autonomous vehicles to support its Indo-Pacific strategy. As Admiral Thomas H. Collins stated in his 2004 essay, “Change and Continuity—The U.S. Coast Guard Today”: “Adapting to change is one of the most difficult tasks we face as individuals or as an organization, but with change comes new opportunities. We must inspire a culture of innovation . . . in all mission areas so as to enhance productivity and reduce workload—all the while driving towards quality outcomes.” Adopting this technology is not a question of when, but how fast.

While USVs are not a panacea for all maritime security problems, they could increase the Coast Guard’s presence and deter illegal fishing. Getting eyes on the water could bring new opportunities for the service to better respond to the changing threats within the Indo-Pacific area of responsibility.


Read More »



How Africa’s Largest IoT Conservation Network Supports Wildlife Protection

Via Fast Company, a look at how an IoT conservation network supports wildlife protection in Kenya by leveraging cloud-based sensors and networks to collect, monitor, and analyze environmental data in real time:

Northern Rangelands Trust (NRT) and Connected Conservation Foundation are protecting the most vulnerable animals and natural resources in Kenya with Africa’s largest landscape-wide Internet of Things conservation network.

The project aims to enhance wildlife and natural resource conservation by leveraging cloud-based sensors and networks to collect, monitor, and analyze environmental data in real time.

This massive undertaking will contribute critical digital infrastructure to help Kenyan partners measure and achieve the global biodiversity targets set out at COP 15, the 2022 UN biodiversity conference, to conserve and manage at least 30% of the world’s natural habitats by 2030.

The data, combined with analytics and conservation tools, is geared toward effectively protecting and managing wildlife, ensuring peace, and improving the livelihoods of the people of northern Kenya, says Samuel Lekimaroro, wildlife protection manager at Northern Rangelands Trust, a Kenyan conservation organization that works to protect and restore the Northern Rangelands of Kenya.

NRT’S IOT CONSERVATION NETWORK: THE FIRST IN KENYA
The NRT’s Internet of Things (IoT) conservation network, the first of its kind in Kenya, is made possible by the Connected Conservation Foundation, which has brought together a coalition of private- and public-sector partners including Cisco, Actility, 51 Degrees, and EarthRanger.

The IoT network and high-bandwidth communication backbone currently covers about 7.4 million acres of wilderness in Kenya—a figure that includes 22 of NRT’s community-led conservancies and 4 private reserves, with plans to bring more on board and cover more of the region, says Sophie Maxwell, executive director of the Connected Conservation Foundation. More than 190 new sensors have been deployed across the parks, with more scheduled in the next few weeks, bringing the total to 250.

For this project, LoRaWAN network management is handled by Actility’s ThingPark platform. Cisco builds the LoRaWAN gateways, or base stations, while Actility manages those base stations and the end devices, collecting data from the sensors and delivering it to application servers.

“What we provide is the core network that connects the base station and the end devices,” says Alper Yegin, chief technology officer at Actility, a provider of low-power networks that play a vital role in IoT infrastructure. “Then on the technical side, there are also sensors coming from various device makers as well as NRT and Connected Conservation.”

NRT and Connected Conservation manage the parks, identify the use cases, and bring all the technologies together. As such, NRT and the Connected Conservation Foundation are the users of this deployment and Cisco is the technology provider, according to Yegin. “We’re providing an innovative solution to manage gateways, integrate sensors, and monitor network operations in real time,” he says.
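
To make the gateway-to-application flow a bit more concrete, here is a minimal sketch of an application-side uplink receiver, assuming the network server pushes each sensor uplink to it as JSON over HTTP. The port, field names, and payload layout are invented for illustration and are not ThingPark’s or EarthRanger’s actual interfaces.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class UplinkHandler(BaseHTTPRequestHandler):
    """Receives hypothetical LoRaWAN uplinks pushed by a network server."""

    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        uplink = json.loads(body)  # e.g. {"devEUI": "...", "payload_hex": "01f3004a"}

        # Decode an invented 4-byte payload: 2 bytes latitude offset, 2 bytes longitude offset
        raw = bytes.fromhex(uplink["payload_hex"])
        lat_off = int.from_bytes(raw[0:2], "big", signed=True) / 1000.0
        lon_off = int.from_bytes(raw[2:4], "big", signed=True) / 1000.0

        # In a real deployment this record would be forwarded to a visualization
        # platform such as EarthRanger; here we simply log it.
        print(f"device {uplink['devEUI']}: offset ({lat_off}, {lon_off})")
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), UplinkHandler).serve_forever()
```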

REVOLUTIONIZING CONSERVATION PROGRAMS
The capabilities of this IoT technology are revolutionizing the way conservation programs operate, offering long-lasting, cost-effective, and secure sensors to combat poaching and protect endangered species, Yegin says.

The LoRaWAN IoT sensors are well suited to deployment in the wildlife parks: tracking animals, equipment, vehicles, and weather conditions, as well as monitoring the working condition of machinery, Yegin says. And since the sensors have very low power consumption, once they’re placed on the animals, they can last for nearly 10 years, sometimes more, he says.
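
That decade-scale battery life is plausible on simple arithmetic. The numbers below are assumptions chosen for illustration (battery capacity, sleep and transmit currents, reporting rate), not specifications of the trackers described here.

```python
# Illustrative battery-life estimate for a low-power LoRaWAN tracker.
# All figures are assumptions for illustration, not the specs of NRT's sensors.
battery_mah = 2600          # assumed battery capacity (one lithium primary cell)
sleep_current_ma = 0.01     # assumed sleep current (10 microamps)
tx_current_ma = 45          # assumed current draw while transmitting
tx_seconds = 2.0            # assumed air time per uplink
uplinks_per_day = 24        # assumed reporting rate (hourly positions)

tx_mah_per_day = tx_current_ma * tx_seconds * uplinks_per_day / 3600
sleep_mah_per_day = sleep_current_ma * 24
years = battery_mah / (tx_mah_per_day + sleep_mah_per_day) / 365
print(f"~{years:.1f} years on these assumptions")  # roughly 8-9 years
```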

“The other special thing with this technology is that it uses unlicensed band, meaning one does not have to acquire a very expensive and limited license from the government,” Yegin says. “So it’s pretty much like Wi-Fi today—anyone can put up Wi-Fi and the same is true for LoRaWAN. As such, this also drives the cost down, which is essential in such wide-area deployments.”

Wildlife protection is a perfect use case for LPWAN IoT, given the vast territories to monitor, the necessity for long-lasting, low-cost sensors, and the requirement for secure technology to combat poaching, he says.

The various sensors, which include rhino, lion, cheetah, and leopard trackers, livestock trackers, and ranger and vehicle trackers, provide critical data that is then visualized in EarthRanger for analysis and insights so that NRT can take any necessary conservation actions. EarthRanger is a tool that collects, integrates, and displays all historical and real-time data available from a protected area to enable organizations to make better decisions about how to manage those areas.

For example, data from the ranger, vehicle, and wildlife sensors enable rangers to monitor and respond to rhino threats to prevent poaching, share information on sick or vulnerable animals, boost conservation management strategies, and redeploy security measures between conservancies.
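
As a purely illustrative example of the kind of rule an operations room could run on such data, the sketch below raises an alert when an untracked vehicle position comes within a few kilometres of a tagged rhino. The coordinates, the 5 km threshold, and the alert handling are assumptions, not NRT’s or EarthRanger’s actual logic.

```python
from math import radians, sin, cos, asin, sqrt

def km_between(a: tuple, b: tuple) -> float:
    """Great-circle (haversine) distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

# Hypothetical latest positions from rhino tags and one untracked vehicle sighting
rhinos = {"rhino_07": (0.612, 37.451), "rhino_11": (0.598, 37.480)}
vehicle = (0.605, 37.455)

ALERT_KM = 5.0  # assumed proximity threshold
for name, pos in rhinos.items():
    distance = km_between(pos, vehicle)
    if distance < ALERT_KM:
        print(f"ALERT: vehicle within {distance:.1f} km of {name}; notify nearest ranger team")
```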

INCREASE IN ENDANGERED BLACK RHINO POPULATION
Black rhinos are still critically endangered animals because of the demand for rhino horns on the international black market. Kenya, however, is one of the few places in the world where black rhino populations are increasing due to the success of these conservation efforts, Maxwell says.

Consequently, it’s crucial to establish safe and connected rangelands for these endangered species to roam, according to Maxwell.

Having the tags on the rhinos enables the NRT to remove the fences and create larger connected habitats for the rhinos to roam, she says. And it has helped boost the black rhino numbers in Kenya by 10%.

“The technology is what we call a reserve-area network solution—and that is connectivity, communications, and sensors that bring real-time data back into an operations room,” she says. “That data is then visualized on the map through a range of software and that enables people to track the movement of people, the movement of wildlife, and the movement of the ranger teams. And that all happens in real time.”

The battery-powered, LoRaWAN-enabled sensors communicate via a long-range, ultra-low-data-rate connection, resulting in longer battery life. Additionally, LoRa sensors cost a fraction of what satellite tracking tags do, transforming how conservation programs operate: being able to deploy many more sensors means capturing more data, enabling NRT to demonstrate the effectiveness of its conservation efforts, which Maxwell says is really valuable.

“Previously, NRT and our member conservancies used an analog system, and we were unable to observe what was happening in the landscape in terms of wildlife trends, asset monitoring, and security patrol coordination,” Lekimaroro says. “We were only communicating via radio between the conservancies and the Joint Operations and Communications Center (JOCC).”

Through Connected Conservation, NRT is now able to successfully protect and monitor wildlife, coordinate field patrols, and support the government and communities in peace efforts from an informed position, drawing on data assessed in EarthRanger, according to Lekimaroro.

“All field patrol teams can be monitored and supported by the team in headquarters, which is the central location, from the JOCC,” he says. “Through technology, intra- and inter-conservancy communications have improved, allowing for more efficient surveillance, wildlife protection, and monitoring operations.”

The increased data transmission into the centralized JOCC system has helped the NRT assess patrol efforts, wildlife trends, patterns, and data generation for management decision making, Lekimaroro says.


Read More »


ABOUT
Networked Nature
New technical innovations such as location-tracking devices, GPS and satellite communications, remote sensors, laser-imaging technologies, light detection and ranging (LIDAR) sensing, high-resolution satellite imagery, digital mapping, advanced statistical analytical software and even biotechnology and synthetic biology are revolutionizing conservation in two key ways: first, by revealing the state of our world in unprecedented detail; and, second, by making available more data to more people in more places. The mission of this blog is to track these technical innovations that may give conservation the chance – for the first time – to keep up with, and even get ahead of, the planet’s most intractable environmental challenges. It will also examine the unintended consequences and moral hazards that the use of these new tools may cause.