The sub-domain functionality, also improperly called the zoom option (improperly because it is not associated with a change in model resolution), is a simple function that allows a simulation to be run over a sub-domain of an already defined configuration (i.e. without defining a new mesh, initial state or forcings). This option can be useful for testing the user settings of surface boundary conditions, or the initial ocean state of a very large ocean model configuration, while requiring only a small amount of computer memory. It can also be used to easily test specific physics in a sub-domain (for example, see [Madec et al., 1996] for a test over the Arctic or Antarctic ocean of the sea-ice/ocean coupling used in the global ocean version of OPA, using a sub-domain). In the standard model, this option does not include any specific treatment of the ocean boundaries of the sub-domain: they are considered as artificial vertical walls. Nevertheless, it is quite easy to add a restoring term toward a climatology in the vicinity of such boundaries (see §5.6).
In order to easily define a sub-domain over which the computation can be performed, the dimensions of all input arrays (ocean mesh, bathymetry, forcing, initial state, ...) are defined as jpidta, jpjdta and jpkdta (in the namcfg namelist), while the computational domain is defined through jpiglo, jpjglo and jpk (namcfg namelist). When running the model over the whole domain, the user sets jpiglo=jpidta, jpjglo=jpjdta and jpk=jpkdta. When running the model over a sub-domain, the user has to provide the size of the sub-domain (jpiglo, jpjglo and jpk) and the indices of the south-western corner of the sub-domain, jpizoom and jpjzoom, in the namcfg namelist (Fig. 15.2).
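As an illustration only (the input-array sizes below are those of the ORCA2 grid, but the sub-domain size and the jpizoom/jpjzoom values are hypothetical and do not correspond to a distributed reference configuration), a namcfg extract for such a sub-domain run might look like:
!-----------------------------------------------------------------------
&namcfg        !   parameters of the configuration (illustrative zoom example)
!-----------------------------------------------------------------------
   cp_cfg      =  "orca"   ! name of the configuration
   jp_cfg      =       2   ! resolution of the configuration
   jpidta      =     182   ! 1st lateral dimension of the input arrays
   jpjdta      =     149   ! 2nd lateral dimension of the input arrays
   jpkdta      =      31   ! number of levels of the input arrays
   jpiglo      =      50   ! 1st dimension of the computational (sub-)domain ( <= jpidta )
   jpjglo      =      40   ! 2nd dimension of the computational (sub-)domain ( <= jpjdta )
   jpizoom     =      41   ! i-index of the south-western corner of the sub-domain
   jpjzoom     =      10   ! j-index of the south-western corner of the sub-domain
   jperio      =       0   ! lateral cond. type (the sub-domain boundaries are closed)
The sub-domain must of course fit inside the input arrays, i.e. jpizoom+jpiglo-1 <= jpidta and jpjzoom+jpjglo-1 <= jpjdta.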
Note that a third set of dimensions exists, jpi, jpj and jpk, which is actually used to perform the computation. It is set by default to jpi=jpiglo and jpj=jpjglo, except for massively parallel computing, where the computational domain is laid out on local processor memories following a 2D horizontal splitting.
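For illustration only (the exact expressions live in the reference code, e.g. par_oce.F90, and may differ between releases; the names jpni, jpnj, jpreci and jprecj are not introduced in this section and are used here as assumptions), the local dimensions for a jpni x jpnj processor layout with a one-row overlap halo are of the form:
! illustrative sketch of the 2D horizontal splitting, not the authoritative NEMO code
jpi = ( jpiglo - 2*jpreci + (jpni-1) ) / jpni + 2*jpreci   ! e.g. jpiglo=362, jpni=4, jpreci=1 -> jpi=92
jpj = ( jpjglo - 2*jprecj + (jpnj-1) ) / jpnj + 2*jprecj   ! e.g. jpjglo=292, jpnj=4, jprecj=1 -> jpj=75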
The extended grids for use with the under-shelf ice cavities will result in redundant rows
around Antarctica if the ice cavities are not active. A simple mechanism for subsetting
input files associated with the extended domains has been implemented to avoid the need to
maintain different sets of input fields for use with or without active ice cavities. The
existing 'zoom' options are overly complex for this task and are marked for deletion anyway.
This alternative subsetting operates for the j-direction only and works by optionally
looking for and using a global file attribute (named: open_ocean_jstart) to
determine the starting j-row for input. The use of this option is best explained with an
example: Consider an ORCA1 configuration using the extended grid bathymetry and coordinate
files:
eORCA1_bathymetry_v2.nc
eORCA1_coordinates.nc
Add the new attribute to any input files requiring a j-row offset, i.e.:
ncatted -a open_ocean_jstart,global,a,d,41 eORCA1_coordinates.nc
ncatted -a open_ocean_jstart,global,a,d,41 eORCA1_bathymetry_v2.nc
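The presence of the attribute can then be checked with a standard netCDF inspection tool; for example (ncdump -h lists the file header, including global attributes, and the grep filter is merely a convenience):
ncdump -h eORCA1_bathymetry_v2.nc | grep open_ocean_jstart
ncdump -h eORCA1_coordinates.nc   | grep open_ocean_jstart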
Add the logical switch to namcfg in the configuration namelist and set true:
!-----------------------------------------------------------------------
&namcfg        !   parameters of the configuration
!-----------------------------------------------------------------------
   cp_cfg       =  "orca"  ! name of the configuration
   jp_cfg       =       1  ! resolution of the configuration
   jpidta       =     362  ! 1st lateral dimension ( >= jpi )
   jpjdta       =     292  ! 2nd    "         "    ( >= jpj )
   jpkdta       =      75  ! number of levels      ( >= jpk )
   jpiglo       =     362  ! 1st dimension of global domain --> i = jpidta
   jpjglo       =     292  ! 2nd    -          -            --> j = jpjdta
   jperio       =       6  ! lateral cond. type (between 0 and 6)
   ln_use_jattr = .true.   ! use (T) the file attribute: open_ocean_jstart, if present
Note that the j-size of the global domain is (the extended j-size minus open_ocean_jstart + 1); with the values used here, 292 = 332 - 41 + 1. This size must match the j-size of all input datasets other than the bathymetry and coordinates files, which are currently the only ones subsetted in this way. The option can, however, be extended to any global, 2D or 3D netCDF input field by adding the
lrowattr=ln_use_jattr
optional argument to the relevant iom_get call.
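As a hedged sketch only (the unit inum, the jpdom_data flag, the variable name 'Bathymetry' and the target array bathy are illustrative assumptions, not a copy of the reference source; only lrowattr and ln_use_jattr are taken from the text above), such a read could look like:
! illustrative call: pass the namelist switch so that the read applies the
! open_ocean_jstart offset whenever the attribute is present in the file
CALL iom_open ( 'eORCA1_bathymetry_v2.nc', inum )
CALL iom_get  ( inum, jpdom_data, 'Bathymetry', bathy, lrowattr = ln_use_jattr )
CALL iom_close( inum )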