An evaluation of a self-calibrating infrared radiometer for measuring sea surface temperature

Thomas, J. P.; Knight, R. J.; Roscoe, H. K.; Turner, J. ORCID: https://orcid.org/0000-0002-6111-5122; Symon, C.. 1995 An evaluation of a self-calibrating infrared radiometer for measuring sea surface temperature. Journal of Atmospheric and Oceanic Technology, 12 (2). 301-316. https://doi.org/10.1175/1520-0426(1995)012<0301:AEOASC>2.0.CO;2

Full text not available from this repository.

Abstract/Summary

Satellite radiometer measurements of global sea surface temperature (SST) with an accuracy of 0.3 K are required for climate change monitoring. To validate that this accuracy can be achieved, in situ measurements of sea surface radiance must be made during satellite overpasses. In the past decade, attempts have been made to design self-calibrating infrared radiometers for measuring SST from research ships, and some commercially manufactured models are now available. The British Antarctic Survey deployed one such radiometer on board the Royal Research Ship Bransfield between October 1991 and May 1992. Its purpose was to measure SST within the Along Track Scanning Radiometer (ATSR) swath when the ERS-1 satellite passed over the ship. The ship radiometer was claimed to have an accuracy of ±0.1 K, but this had not been verified under realistic measurement conditions. An evaluation of the radiometer's accuracy was therefore carried out during a voyage from the British Isles to Antarctica. At intervals throughout the voyage, the temperature of well-stirred seawater in a tank on the deck of the ship was measured using both the radiometer and thermometers accurate to ±0.1 K. These measurements revealed that the radiometer values of water temperature were more than 1.5 K warmer than the values given by the thermometers. The cause of this offset was thought to be incorrect calibration of platinum resistance thermometers within the instrument, and an empirical correction was derived. When the correction was applied, the rms difference between the thermometer and the radiometer measurements of the temperature of the seawater in the tank was 0.1 K using the radiometer's 11-µm channel. The rms difference using the 12-µm channel was 0.14 K, which was larger because of an unidentified beat signal that affected this channel. These results therefore showed that this radiometer was capable of making SST measurements accurate enough to validate the ATSR SST and also to carry out useful investigations of the ocean skin effect.
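
The abstract describes deriving an empirical offset correction from tank comparisons and then quoting the rms difference between corrected radiometer readings and reference thermometers. The following is a minimal illustrative sketch of that kind of calculation; the numbers and variable names are invented for demonstration and are not data from the paper.

```python
# Illustrative sketch (not from the paper): derive an empirical offset
# correction from tank-comparison measurements and compute the rms
# difference between corrected radiometer and thermometer temperatures.
import numpy as np

# Hypothetical tank-comparison readings in kelvin (synthetic values).
thermometer_K = np.array([285.20, 286.05, 287.10, 284.90, 285.75])
radiometer_K  = np.array([286.85, 287.60, 288.70, 286.45, 287.30])  # ~1.5 K warm

# Empirical correction: mean offset of the radiometer relative to the
# reference thermometers over the comparison measurements.
offset = np.mean(radiometer_K - thermometer_K)
corrected = radiometer_K - offset

# Root-mean-square difference after the correction is applied.
rms = np.sqrt(np.mean((corrected - thermometer_K) ** 2))
print(f"offset = {offset:.2f} K, rms after correction = {rms:.3f} K")
```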

Item Type: Publication - Article
Digital Object Identifier (DOI): https://doi.org/10.1175/1520-0426(1995)012<0301:AEOASC>2.0.CO;2
Programmes: BAS Programmes > Pre 2000 programme
ISSN: 0739-0572
Date made live: 16 Jan 2017 13:20 UTC
URI: https://nora.nerc.ac.uk/id/eprint/515858
