Observing System Simulation Experiments for NPOESS

Michiko Masutani, John C. Woollen, John C. Derber, Stephen J. Lord (PI)

NOAA/NWS/NCEP/EMC, Camp Springs, MD

Joseph Terry, Robert Atlas

NASA/GSFC/DAO, Greenbelt, MD

Sidney A. Wood, Steven Greco, G. David Emmitt

Simpson Weather Associates, Charlottesville, VA

Thomas J. Kleespies

NOAA/NESDIS, Camp Springs, MD

http://www.emc.noaa.gov:8000/research/osse

OSSE:

Observing System Simulation Experiments

NPOESS:

National Polar-orbiting Operational Environmental Satellite System
Scheduled to be launched in 2007-2010
Objective:
To test the impact of the new instruments using simulated data
To make recommendations for configuration based on quantitative
NWP impact of NPOESS instruments


Advantages of OSSE at NCEP:
Prepare for operational use of new data
Database
Evaluate operational computing and storage requirements
Development of the data assimilation system
Gain knowledge of the new instruments
Data can be used very soon after launch
Neutral with respect to instrument development


Instrument:
DWL (Doppler Wind Lidar)
Cross-track Infrared Sounder (CrIS)
Conical-scanning Microwave Imager/Sounder (CMIS)
Advanced Technology Microwave Sounder (ATMS)


The Nature run was provided by ECMWF

Data assimilation systems used in the OSSE tests:
NCEP: operational center
NASA/DAO: research center
NRL: second operational center



Simulation of NPOESS candidate instrument data
DWL: Simpson Weather Associates and others
TOVS radiance: NESDIS
Conventional data: NASA/DAO


Advised by: Drs. E. Kalnay, W. Baker, and R. Daley, review panel

Supported by: Dr. Steven Mango (IPO)






Major tasks
The Nature run is processed and made available to the U.S. OSSE community
Perform sensitivity tests on the simulated data set for existing instruments with the OPER97 system with TOVS radiances
Evaluation of the Nature run
Produce a report on cloud evaluation and adjustment
Database upgraded for line-of-sight (LOS) winds
NCEP data assimilation (SSI) upgraded to use LOS winds (see the LOS projection sketch after this list)
Repeat the sensitivity tests on the simulated data set for existing instruments with the OPER99 system with TOVS 1B data
Set up the OSSE environment
Simulate observations:
DWL winds
Conventional observations
TOVS radiance data
Evaluation of OSSE results
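
For reference, the following is a minimal Python sketch of how a horizontal wind can be projected onto a lidar line of sight to form an LOS wind value. The function name, the azimuth convention, and the neglect of the vertical wind contribution are illustrative assumptions; this is not the actual SWA/NCEP simulation code.

import numpy as np

def los_wind(u, v, azimuth_deg):
    # Project a horizontal wind (u, v) onto a lidar line of sight.
    # u, v        : eastward / northward wind components (m/s)
    # azimuth_deg : viewing azimuth, degrees clockwise from north
    # Returns the horizontal LOS wind (m/s); positive = flow along the view direction.
    # NOTE: sign conventions and the vertical-wind term vary between lidar
    # simulators; this is an assumed, simplified convention.
    az = np.deg2rad(azimuth_deg)
    return u * np.sin(az) + v * np.cos(az)

# Example: a 10 m/s westerly viewed along an azimuth of 90 degrees (due east)
print(los_wind(10.0, 0.0, 90.0))   # prints 10.0 (approximately)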

Contribution to above tasks
NCEP
Steve Lord, Michiko Masutani, John Derber, Jack Woollen, Ken Campana, S.-K. Yang, Russ Treadon, Mark Iredell, Bert Katz, Bill Collins, Dennis Keyser, Yuejian Zhu, Wan-Shu Wu, Song-You Hong, Zoltan Toth, Bob Kistler, Wesley Ebisuzaki, Jim Purser, Dave Parrish
NASA/DAO
Bob Atlas, Joe Terry, Genia Brin
Simpson Weather Associates (SWA)
Dave Emmitt, Sid Wood, Steven Greco
NESDIS
Tom Kleespies, Jim Yoe, L. Stowe, Andrew Heidinger
ECMWF
Christian Jakob, Roger Saunders, Keith Fielding, Tony Hollingsworth
Nature run (True Atmosphere for the OSSE)


ECMWF reanalysis model at T213 resolution (640 x 320 grid) with 31 levels
Free forecast integration from 5 February 1993 to 7 March 1993

Evaluation
The nature run was found to be representative of the actual atmosphere in many ways.
However, it has excessive low-level cloud over land and a lack of low-level cloud over the ocean.


Adjustment of low level cloud


Add stratus and stratocumulus in regions of rising motion, with amounts based on climatological values.
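
A minimal Python sketch of this type of adjustment is given below. The vertical-velocity threshold, the choice of raising the model cloud fraction toward a climatological value in rising-motion regions, and all variable names are illustrative assumptions, not the procedure actually applied to the nature run.

import numpy as np

def adjust_low_cloud(cloud_frac, omega, clim_frac, omega_thresh=0.0):
    # Increase low-level cloud fraction where there is rising motion.
    # cloud_frac   : model low-cloud fraction (0-1) on a lat/lon grid
    # omega        : pressure vertical velocity (Pa/s); negative = rising motion
    # clim_frac    : climatological low-cloud fraction (0-1) on the same grid
    # omega_thresh : assumed threshold defining "rising motion"
    rising = omega < omega_thresh
    # Where the air is rising and the model carries less cloud than climatology,
    # raise the cloud fraction toward the climatological value.
    adjusted = np.where(rising, np.maximum(cloud_frac, clim_frac), cloud_frac)
    return np.clip(adjusted, 0.0, 1.0)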


ISCCP, RTNEPH, CLAVR, and Warren cloud climatologies are used for evaluation.


Data withdrawal experiments with actual data

Table and results of data withdrawal experiments


These experiments need to be repeated with simulated data and must produce similar results in order for the OSSE to be considered reliable.


Evaluate the impact of TOVS radiances, RAOB temperatures, and RAOB winds
NCEP operational data assimilation system, March 1999 version
T62, 28-level model with TOVS 1B data


TOVS radiances have a wider impact than RAOB temperatures
RAOB winds have the largest impact in the Northern Hemisphere
More detailed analysis is required before drawing conclusions




Calibration for OSSE


Testing over a short period (February 6-7)


Simulated DWL data will be tested with real observations to examine analysis sensitivity




Evaluation of OSSE results


Standard anomaly correlation by wavenumber (see the sketch after this list)
Diagnostics of cyclones and jets
Comparison of extreme events
Data rejection statistics
Cost-benefit in aviation applications (flight planning)
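
A minimal Python sketch of an anomaly correlation evaluated per zonal wavenumber is given below. The FFT-based decomposition and the omission of latitude weighting are simplifying assumptions for illustration only, not the verification code used in the project.

import numpy as np

def anomaly_correlation_by_wavenumber(fcst, anal, clim):
    # Anomaly correlation between a forecast and the verifying analysis,
    # computed separately for each zonal wavenumber.
    # fcst, anal, clim : 2-D fields (lat, lon) on the same regular grid.
    # Latitude weighting is omitted here for brevity (an assumption).
    fa = np.fft.rfft(fcst - clim, axis=1)   # zonal spectrum of forecast anomaly
    aa = np.fft.rfft(anal - clim, axis=1)   # zonal spectrum of analysis anomaly
    # Correlate the complex wavenumber-k coefficients over latitude.
    num = np.sum(fa * np.conj(aa), axis=0).real
    den = np.sqrt(np.sum(np.abs(fa) ** 2, axis=0) * np.sum(np.abs(aa) ** 2, axis=0))
    return num / np.maximum(den, 1e-12)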


Plan


A. Perform sensitivity tests on the simulated data set for existing instruments.

B. Perform sensitivity tests on the simulated data set for future instruments.


B.1 Test the impact of data density
B.2 Adaptive observing strategies

C. Work on other potential instruments
Cross-track Infrared Sounder (CrIS)
Conical-scanning Microwave Imager/Sounder (CMIS)
Advanced Technology Microwave Sounder (ATMS)


D. Other seasons and Nature runs from other models
A summer case needs to be tested