Beyond Open Data Access: Committing to Data Accessibility

Over the past two decades, the digital revolution has nudged the ocean science research community toward adopting a culture of open data access. The concept of making data Findable, Accessible, Interoperable, and Reusable (FAIR) has been broadly accepted, though not fully realized. FAIR principles have been readily adopted by some communities and disciplines, while others lag. Despite technical solutions that can ease the burden of data management, cultural inertia and limited resources slow our collective progress. Delivering on the promise of the data revolution in the ocean domain requires a cultural transformation that prioritizes data stewardship at the outset of our offshore activities and a commitment to data accessibility.

The ocean is vast, under-sampled, and largely unexplored. Acquiring data is complicated and costly, especially below the sea surface. What is most costly, however, is trying to do it alone. Large international calls to action, including the UN Decade of Ocean Science for Sustainable Development and The Nippon Foundation-GEBCO Seabed 2030 Project, emphasize the importance of sharing ocean data across sectors and across political boundaries.

Much like the global ocean itself, the current ocean science data landscape is complex and challenging. Successfully navigating and utilizing online data resources requires specialist knowledge. This includes familiarity with data systems and their interfaces and an understanding of the data types and file formats that are preserved and available. Despite the adoption and use of data-sharing standards, the rapidly growing multi-sector (academic, government, industry) data landscape is poorly connected and far from integrated.

A further complication is that there are unknown amounts of global ocean data that exist but are not in public archives. These offline data resources (aka "dark" data) are at risk of being lost due to decaying media, discontinued formats, and the dissipation of corporate knowledge. A recent GEBCO webinar series aimed at improving data access identified barriers to sharing these data, including financial limitations, inadequate human resources, lack of tools, and cultural inertia. In short, we do not sufficiently value and prioritize data preservation and sharing.

A Solid Foundation

Thanks to the growing recognition of data-sharing policies, and a network of passionate data champions around the world, publicly available data holdings are increasing at rates never before seen. For example, projects such as the Rolling Deck to Repository (R2R) Program have developed holistic and efficient solutions for documenting and preserving underway data from research vessels in the US Academic Research Fleet. This shared digital catalog of unprocessed ocean data is a tremendous community resource that is the foundation of an emerging data ecosystem.

Assembling these foundational data into high-quality products often requires significant effort utilizing specialized tools and knowledge. Data are acquired by many individuals, groups, and organizations, all of whom employ different approaches, practices, and workflows. Documenting and preserving heterogeneous data products can be challenging due to differences in format, resolution, and quality requirements, all of which are dictated by use cases. Curating, assembling, and integrating these products increases return on investments and has the potential to deliver efficiency by minimizing redundant data processing efforts.

Figure: A web application that provides access to global multibeam sonar data holdings preserved in the International Hydrographic Organization Data Center for Digital Bathymetry (IHO DCDB). (Image credit: IHO)

Some data stewardship projects, led by scientists and technical experts with disciplinary expertise, are designed to address these complexities by focusing on complementary steps in the data stewardship continuum. The Marine Geoscience Data System (MGDS), for example, is an online catalog and repository of freely available, primarily academic-generated data products that describe the seafloor and subseafloor. Its partner system, the Global Multi-Resolution Topography (GMRT) Synthesis, assembles and curates elevation data into a seamless compilation that can be used to generate custom maps, grids, and values, enabling easy web-based quantitative access to elevation data for a broad user community. The third component of this data ecosystem is GeoMapApp, an openly available tool for data discovery, visualization, and analysis across these and many complementary data sets. Together, these projects fortify foundational data preservation while increasing data accessibility for specialists and non-specialists alike.
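To illustrate what "web-based quantitative access" can look like in practice, the sketch below builds a request to a GMRT-style point-query web service that returns the elevation at a single longitude/latitude. The endpoint URL and parameter names (`PointServer`, `longitude`, `latitude`, `format`) are assumptions modeled on GMRT's published web services and may differ from the current API; treat this as a minimal example of programmatic access, not a definitive client.

```python
# Minimal sketch of querying a GMRT-style elevation web service.
# NOTE: the endpoint and parameter names are assumptions based on
# GMRT's documented web services and may not match the live API.
from urllib.parse import urlencode
from urllib.request import urlopen

GMRT_POINT_URL = "https://www.gmrt.org/services/PointServer"  # assumed endpoint


def build_elevation_query(longitude: float, latitude: float) -> str:
    """Build a request URL asking for the elevation at one point."""
    params = urlencode({
        "longitude": longitude,
        "latitude": latitude,
        "format": "text",  # plain-text response: elevation in meters
    })
    return f"{GMRT_POINT_URL}?{params}"


def fetch_elevation(longitude: float, latitude: float, timeout: float = 10.0) -> float:
    """Fetch the elevation (meters; negative below sea level) at a point."""
    with urlopen(build_elevation_query(longitude, latitude), timeout=timeout) as resp:
        return float(resp.read().decode().strip())


# Example usage (requires network access):
# depth = fetch_elevation(-45.0, 26.0)  # a point near the Mid-Atlantic Ridge
```

This kind of lightweight service is what lets non-specialists pull quantitative values into scripts and GIS tools without downloading or processing raw sonar files themselves.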

Changing Data Culture

Truly embracing a culture of data sharing that prioritizes accessibility has enormous transformative potential, but there are several friction points that stand in our way. Although scientists consider data to be objective and impersonal, the reality is that the data we acquire, prepare, analyze, and interpret are deeply personal. We take pride in what we produce, and we have concerns about being credited for our work. Across multiple sectors, there are widespread concerns about data misuse and/or misinterpretation. There is tension between the increasing recognition of data policies, the legacy culture of treating data as proprietary, and a leaky system for ensuring data policy compliance. As we continue to explore and observe the ocean and work to accelerate climate solutions, we must do more to accelerate the pace of culture change related to data sharing.

Figure: GeoMapApp provides pre-assembled data that can be discovered and analyzed to enable transdisciplinary research by specialists and non-specialists. (Image credit: GeoMapApp)

We are in the midst of a rapid expansion of data acquisition in the offshore environment. More platforms and sensors, operated by more groups and organizations, are acquiring more data types more frequently. These data are processed and transformed into data products that describe temporally and spatially limited snapshots of the ocean environment. Plans to rapidly expand offshore infrastructure that both relies on and can deliver increasing volumes of observational data mean that a wave of data is coming. For robust decision-making and management of the offshore environment, it is essential that observation and data infrastructure be co-developed and aligned with the multi-user framework of the future.

We need a master plan to leverage observational data to generate information products and syntheses that are accessible to diverse multi-sector user communities. Our master plan needs to strike the right balance between centralized and distributed efforts that unite data and prioritize equitable access to ocean data and information. The rich digital ecosystem that we can create will accelerate our ability to understand and predict the ocean environment, support responsible stewardship of ocean resources, and improve our collective ability to adapt to our changing planet.

This feature appeared in Environment, Coastal & Offshore (ECO) Magazine's 2023 Deep Dive II special edition, Marine Environmental Research.
