Wednesday, June 19, 2013

NaN in 1D?!!

1-D Offline Simulation

Case: run2 (1994 5-day mean dynamical fields. dt=8760s)

Wow, this is kind of rare ---> NaN was found in "tracer.stat" for stations 15 and 25, but NOT 8.

After checking all the tracers in the "ptrc_T 1m" file, I noticed that the NaN is associated only with DIC and Alkalini; everything else ran without any problem.

This may imply that the problem is not the time step, but rather how these two tracers are computed in the model.
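To pin down where the NaN first appears, it helps to scan the stat output for the first non-finite value. A minimal sketch, assuming a plain-text file with one line per time step ("step val1 val2 ...") ---> the real tracer.stat layout may differ, so the parsing would need adjusting:

```python
import math

def first_nan_step(lines):
    """Return (step, column) of the first NaN in stat-style output, or None.

    Assumes each line is 'step  val1  val2 ...'; column is 1-based over
    the value fields.
    """
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        step = int(parts[0])
        for col, tok in enumerate(parts[1:], start=1):
            if math.isnan(float(tok)):
                return step, col
    return None

# usage: first_nan_step(open("tracer.stat"))
sample = ["5611 2.1e-3 3.3e-2", "5612 NaN 3.3e-2"]
print(first_nan_step(sample))  # (5612, 1)
```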

Case: run3 (only for station 15. same as run2 except dt=17520s)

...Still, NaN at 5612, at exactly the same time as in run2.

---> It turns out that the offline dynamics for grid_T were not rebuilt properly, perhaps because it is a huge file (~25 GB), so there may have been some errors during the rebuild (more about rebuild: http://www.nemo-ocean.eu/Using-NEMO/FAQ#eztoc1028_10_28).
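A bad rebuild like this tends to show up as NaN or wildly unphysical values in the stitched field, so it is worth sanity-checking a few vertical columns before launching a run. A minimal sketch of the check logic on a plain list standing in for one temperature column (reading the actual grid_T NetCDF file is omitted here, and the bounds are rough assumed ocean temperature limits):

```python
import math

T_MIN, T_MAX = -3.0, 40.0  # rough physical bounds for ocean temperature (degC)

def bad_levels(profile):
    """Return indices of vertical levels whose value is NaN or out of bounds."""
    return [k for k, t in enumerate(profile)
            if math.isnan(t) or not (T_MIN <= t <= T_MAX)]

# e.g. a 5-level column with a corrupted level 3:
print(bad_levels([18.2, 15.1, 9.7, float("nan"), 2.3]))  # [3]
```

Running this over the columns around a station would flag the rebuild damage without waiting 5612 time steps for the NaN to surface.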

Case: run4 (only for station 15. same as run2 except jpizoom=297, shifted a few points to the west, where the vertical structure is not affected by the rebuild errors)

0-D Configuration

The purpose is to run PISCES in a zero-dimensional (box) configuration to study the model's sensitivity without physical effects. I am modifying the model code to implement this. Here's what I have touched so far:

nemogcm.F90

By default, the number of modeled vertical levels equals the number of vertical levels in the input data. I changed the code so that the former is 1:

      jpk = jpkdta                                             ! third dim
!ModB
#if defined key_box
      jpk = 1 ! 0-D box model (Hakase)
#endif
!ModE

Here, I defined jpk = 1, with a new cpp key called "key_box".
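For the key to take effect it also has to be added to the configuration's cpp key list at compile time; in my NEMO version that is the cpp_*.fcm file under the configuration directory (the exact file name and existing keys depend on the configuration, so this is just a sketch):

```
# CONFIG/<MY_CONFIG>/cpp_<MY_CONFIG>.fcm -- append the new key to the existing list
bld::tool::fppkeys <existing keys> key_box
```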

---> This did not work: at least 3 vertical levels seem to be required, analogous to the 3 x 3 horizontal grid in the 1-D configuration (3 x 3 x z-levels). But I haven't tested with jpk = 3 yet.

dtadyn.F90

This file reads and interpolates offline dynamics. There are up to 19 sets of data:

   INTEGER  , PARAMETER ::   jpfld = 19     ! maximum number of files to read
   INTEGER  , SAVE      ::   jf_tem         ! index of temperature
   INTEGER  , SAVE      ::   jf_sal         ! index of salinity
   INTEGER  , SAVE      ::   jf_uwd         ! index of u-velocity
   INTEGER  , SAVE      ::   jf_vwd         ! index of v-velocity
   INTEGER  , SAVE      ::   jf_wwd         ! index of w-velocity
   INTEGER  , SAVE      ::   jf_avt         ! index of Kz
   INTEGER  , SAVE      ::   jf_mld         ! index of mixed layer depth
   INTEGER  , SAVE      ::   jf_emp         ! index of water flux
   INTEGER  , SAVE      ::   jf_qsr         ! index of solar radiation
   INTEGER  , SAVE      ::   jf_wnd         ! index of wind speed
   INTEGER  , SAVE      ::   jf_ice         ! index of sea ice cover
   INTEGER  , SAVE      ::   jf_ubl         ! index of u-bbl coef
   INTEGER  , SAVE      ::   jf_vbl         ! index of v-bbl coef
   INTEGER  , SAVE      ::   jf_ahu         ! index of u-diffusivity coef
   INTEGER  , SAVE      ::   jf_ahv         ! index of v-diffusivity coef 
   INTEGER  , SAVE      ::   jf_ahw         ! index of w-diffusivity coef
   INTEGER  , SAVE      ::   jf_eiu         ! index of u-eiv
   INTEGER  , SAVE      ::   jf_eiv         ! index of v-eiv
   INTEGER  , SAVE      ::   jf_eiw         ! index of w-eiv

The goal is to hold these fields constant in time, so that PISCES runs independently of time-varying physical forcing. Since this configuration is spatially independent, I used ORCA2_LIM_PISCES as the reference configuration. To learn about regional (customized) domains, it might be a good idea to take a look at the code in the GYRE_LOBSTER configuration.
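One way to freeze the forcing, in the same !ModB/!ModE style as the nemogcm.F90 change above, would be to call the update routine only on the first time step so that the arrays read then are reused afterwards. This is only a sketch: whether holding the arrays after a single call is enough depends on what dta_dyn does internally (record swapping and time interpolation), so it is a starting point rather than a tested fix:

```fortran
!ModB
#if defined key_box
      IF( kt == nit000 )   CALL dta_dyn( kt )   ! 0-D box: read the dynamics once and hold them fixed
#else
      CALL dta_dyn( kt )                        ! default: time-varying offline dynamics
#endif
!ModE
```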
