New techniques in sediment core analysis: an introduction

Rothwell, R.G.; Rack, F.R.. 2006 New techniques in sediment core analysis: an introduction. In: Rothwell, R.G., (ed.) New techniques in sediment core analysis. London, UK, Geological Society of London, 1-29, 266pp. (Geological Society Special Publication, 267).



Marine sediment cores are the fundamental data source for information on seabed character, depositional history and environmental change. They provide raw data for a wide range of research, including studies of global climate change, palaeoceanography, slope stability, oil exploration, pollution assessment and control, and sea-floor surveys for laying cables, pipelines and siting sea-floor structures. During the last three decades, a varied suite of new technologies has been developed to analyse cores, often non-destructively, producing high-quality, closely spaced, co-located downcore measurements that characterize sediment physical properties, geochemistry and composition in unprecedented detail. Distributions of a variety of palaeoenvironmentally significant proxies can now be logged at decadal and, in some cases, even annual or subannual scales, allowing detailed insights into the history of climate and associated environmental change. These advances have had a profound effect on many aspects of the Earth Sciences, particularly palaeoceanography. In this paper, we review recent advances in analytical and logging technology and their application to the analysis of sediment cores. We also discuss developments in providing access to core data and associated datasets, and in data-mining technology, which allow new and legacy datasets to be integrated and interpreted within the wider context of sea-floor studies. Despite the great advances in this field, however, challenges remain, particularly in establishing standard measurement and calibration methodologies and in developing data analysis methods. New data visualization tools and techniques are needed to optimize the interpretation process and maximize scientific value, and enhanced collaborative environments and tools are needed to capitalize on our capability to analyse and interpret large, multi-parameter datasets. Sophisticated yet simple-to-use searchable Internet databases, with universal access and secure long-term funding, together with data products supporting user-defined data-mining queries and display, so far pioneered in the USA and Australia, provide robust models for efficient and effective core data stewardship.

Item Type: Publication - Book Section
ISBN: 1862392102
Date made live: 23 Nov 2006 (UTC)
