Home > fvcom_prepro > get_HYCOM_forcing.m

get_HYCOM_forcing

PURPOSE ^

Get mean flow, temperature, salinity, surface elevation and denstiy data

SYNOPSIS ^

function data = get_HYCOM_forcing(Mobj, modelTime, varargin)

DESCRIPTION ^

 Get mean flow, temperature, salinity, surface elevation and denstiy data
 from HYCOM model outputs through their OPeNDAP server.

 data = get_HYCOM_forcing(Mobj, modelTime)

 DESCRIPTION:
   Using OPeNDAP, extract the necessary parameters to create an FVCOM
   forcing file.

 INPUT:
   Mobj - MATLAB mesh object with the following fields:
           - have_lonlat - boolean indicating whether lat/long values are
           present in Mobj.
           - lon, lat - longitude and latitudes of the model grid nodes.
   modelTime - Modified Julian Date start and end times
   varlist - [optional] cell array of variables to download. Use HYCOM
       names (e.g. ssh, salinity, temperature, u, v). If omitted,
       downloads salinity, temperature and ssh only.
   three - [optional] boolean. Set to true to download the 3 hourly data
       for the period 1992/10/2 to 2008/09/18 (inclusive). Defaults to
       false and downloads only the daily data.

 OUTPUT:
   data - struct of the data necessary to force FVCOM. These can be
   interpolated onto the unstructured grid in Mobj using grid2fvcom.m.

 The parameters (and the corresponding field names returned) which are
 obtained from the HYCOM data are:
     - time [MT]
     - temperature [temperature]
     - salinity [salinity]
     - u mean flow component [u]
     - v mean flow component [v]
     - daily mean sea surface height [ssh]

 EXAMPLE USAGE:
   To download the default parameters (temperature, salinity and ssh):

       modeltime = [55561, 55591]; % time period in Modified Julian Days
       hycom = get_HYCOM_forcing(Mobj, modeltime);

   To download only sea surface height:

       modeltime = [55561, 55591]; % time period in Modified Julian Days
       hycom = get_HYCOM_forcing(Mobj, modeltime, {'ssh'})

 Author(s)
   Pierre Cazenave (Plymouth Marine Laboratory)

 Revision history:
   2013-01-31 First version.
   2013-08-19 Make some more progress in getting this working. Mainly
   change the way the dates are handled to form the relevant URL for
   downloading the data.
   2013-09-03 More incremetal changes to get the function working. At the
   moment, I'm ignoring the old OPeNDAP toolbox for reading the data from
   the HYCOM OPeNDAP server.
   2013-09-05 It's working! Also add a data.time variable with the time
   stamps from the HYCOM data.
   2013-12-02 Add sea surface height to the list of variables that can be
   downloaded.
   2013-12-09 Add ability to specify particular variables to download.
   2013-12-12 Fix the handling of the variable input list of field names.
   2015-05-21 Add support for the Global Reanalysis data which extends
   coverage back to 1992 (previously limited to 2008 with the Global
   Analysis data). The Global Analysis data is used from 2008 onwards even
   though the reanalysis exists up to 2012.
   2016-01-04 Add support for the three hourly output data for the 19.0
   and 19.1 experiments.
   2016-07-06 Add new data sets from the HYCOM server (post-2014).
   2017-01-26 Use ncread to download the data to automatically apply the
   scale and offset values from the server data. At some point (either in
   HYCOM's server upgrades or MATLAB updates), values being returned for
   2005 were raw values (not scaled and offset), which is obviously wrong.

==========================================================================

CROSS-REFERENCE INFORMATION ^

This function calls: This function is called by:

SUBFUNCTIONS ^

SOURCE CODE ^

0001 function data = get_HYCOM_forcing(Mobj, modelTime, varargin)
0002 % Get mean flow, temperature, salinity, surface elevation and denstiy data
0003 % from HYCOM model outputs through their OPeNDAP server.
0004 %
0005 % data = get_HYCOM_forcing(Mobj, modelTime)
0006 %
0007 % DESCRIPTION:
0008 %   Using OPeNDAP, extract the necessary parameters to create an FVCOM
0009 %   forcing file.
0010 %
0011 % INPUT:
0012 %   Mobj - MATLAB mesh object with the following fields:
0013 %           - have_lonlat - boolean indicating whether lat/long values are
0014 %           present in Mobj.
0015 %           - lon, lat - longitude and latitudes of the model grid nodes.
0016 %   modelTime - Modified Julian Date start and end times
0017 %   varlist - [optional] cell array of variables to download. Use HYCOM
0018 %       names (e.g. ssh, salinity, temperature, u, v). If omitted,
0019 %       downloads salinity, temperature and ssh only.
0020 %   three - [optional] boolean. Set to true to download the 3 hourly data
0021 %       for the period 1992/10/2 to 2008/09/18 (inclusive). Defaults to
0022 %       false and downloads only the daily data.
0023 %
0024 % OUTPUT:
0025 %   data - struct of the data necessary to force FVCOM. These can be
0026 %   interpolated onto the unstructured grid in Mobj using grid2fvcom.m.
0027 %
0028 % The parameters (and the corresponding field names returned) which are
0029 % obtained from the HYCOM data are:
0030 %     - time [MT]
0031 %     - temperature [temperature]
0032 %     - salinity [salinity]
0033 %     - u mean flow component [u]
0034 %     - v mean flow component [v]
0035 %     - daily mean sea surface height [ssh]
0036 %
0037 % EXAMPLE USAGE:
0038 %   To download the default parameters (temperature, salinity and ssh):
0039 %
0040 %       modeltime = [55561, 55591]; % time period in Modified Julian Days
0041 %       hycom = get_HYCOM_forcing(Mobj, modeltime);
0042 %
0043 %   To download only sea surface height:
0044 %
0045 %       modeltime = [55561, 55591]; % time period in Modified Julian Days
0046 %       hycom = get_HYCOM_forcing(Mobj, modeltime, {'ssh'})
0047 %
0048 % Author(s)
0049 %   Pierre Cazenave (Plymouth Marine Laboratory)
0050 %
0051 % Revision history:
0052 %   2013-01-31 First version.
0053 %   2013-08-19 Make some more progress in getting this working. Mainly
0054 %   change the way the dates are handled to form the relevant URL for
0055 %   downloading the data.
0056 %   2013-09-03 More incremetal changes to get the function working. At the
0057 %   moment, I'm ignoring the old OPeNDAP toolbox for reading the data from
0058 %   the HYCOM OPeNDAP server.
0059 %   2013-09-05 It's working! Also add a data.time variable with the time
0060 %   stamps from the HYCOM data.
0061 %   2013-12-02 Add sea surface height to the list of variables that can be
0062 %   downloaded.
0063 %   2013-12-09 Add ability to specify particular variables to download.
0064 %   2013-12-12 Fix the handling of the variable input list of field names.
0065 %   2015-05-21 Add support for the Global Reanalysis data which extends
0066 %   coverage back to 1992 (previously limited to 2008 with the Global
0067 %   Analysis data). The Global Analysis data is used from 2008 onwards even
0068 %   though the reanalysis exists up to 2012.
0069 %   2016-01-04 Add support for the three hourly output data for the 19.0
0070 %   and 19.1 experiments.
0071 %   2016-07-06 Add new data sets from the HYCOM server (post-2014).
0072 %   2017-01-26 Use ncread to download the data to automatically apply the
0073 %   scale and offset values from the server data. At some point (either in
0074 %   HYCOM's server upgrades or MATLAB updates), values being returned for
0075 %   2005 were raw values (not scaled and offset), which is obviously wrong.
0076 %
0077 %==========================================================================
0078 
0079 subname = 'get_HYCOM_forcing';
0080 
0081 global ftbverbose;
0082 if ftbverbose
0083     fprintf('\nbegin : %s\n', subname)
0084 end
0085 
0086 % For checking whether to use the third-party OPeNDAP toolbox or native
0087 % MATLAB tools. OPeNDAP support is included in MATLAB version 7.14 onwards.
0088 % Although this check is made, I haven't actually written the code for the
0089 % third-party toolbox. If you need it, I be pleased if you wrote an
0090 % equivalent to the native version.
0091 v714date = datenum(2012, 3, 1);
0092 currdate = ver('MATLAB');
0093 currdate = datenum(currdate.Date);
0094 
0095 if datenum(currdate) < v714date
0096     error(['This version of MATLAB does not have native OPeNDAP ', ...
0097         'support and this function relies on that support. If you ', ...
0098         'require this function, you will have to add the ', ...
0099         'functionality with the third-party OPeNDAP toolbox.'])
0100 end
0101 
0102 % Check if we've been given a cell array of variables and set the varlist
0103 % to that, otherwise default to temperature, salinity and sea surface
0104 % height. For the three option, assume we want daily data unless three is
0105 % true, in which case set the threehourly parameter to be the string
0106 % required for the THREDDS URL (/3hrly), otherwise default to daily data.
0107 threehourly = '';
0108 if nargin == 2 || nargin > 3
0109     varlist = {'temperature', 'salinity', 'ssh'};
0110 end
0111 if nargin > 2
0112     % To maintain backwards compatibility, assume if we've been given a
0113     % cell array as the third option, that's the variables to download.
0114     % Otherwise, iterate through the argument as key-parameter pairs.
0115     if nargin == 3
0116         if iscell(varargin{1})
0117             varlist = varargin{1};
0118             assert(iscell(varargin{1}), [...
0119                 'List of variables to extract must', ...
0120                 ' be a cell array: {''var1'', ''var2''}'])
0121         end
0122     else
0123         for v = 1:2:length(varargin)
0124             switch varargin{v}
0125                 case 'three'
0126                     if varargin{v + 1}
0127                         threehourly = '/3hrly';
0128                     end
0129                     if modelTime(1) >= greg2mjulian(2008, 09, 19, 0, 0, 0)
0130                         error(['The three hourly output is not ', ...
0131                                 'configured for the period beyond ', ...
0132                                 '2008/09/18 in this function. Please ', ...
0133                                 'disable three hourly downloads, or ', ...
0134                                 'edit this function to suit your needs.'])
0135                     end
0136             end
0137         end
0138     end
0139 end
0140 
0141 % Get the extent of the model domain (in spherical).
0142 if ~Mobj.have_lonlat
0143     error('Need spherical coordinates to extract the forcing data')
0144 else
0145     % Add a buffer of two grid cells in latitude and two in longitude to
0146     % make sure the model domain is fully covered by the extracted data.
0147     [dx, dy] = deal(1/12, 1/12); % HYCOM resolution in degrees
0148     % West, east, south, north
0149     extents = [min(Mobj.lon(:)) - (2 * dx), max(Mobj.lon(:)) + (2 * dx), ...
0150         min(Mobj.lat(:)) - (2 * dy), max(Mobj.lat(:)) + (2 * dy)];
0151 end
0152 
0153 % List of URL suffixes so we can dynamically build the URL for each time
0154 % step.
0155 suffix.MT           = {'time', 'MT'};
0156 suffix.Longitude    = {'lon', 'Longitude'};
0157 suffix.Latitude     = {'lat', 'Latitude'};
0158 suffix.Depth        = {'depth', 'Depth'};
0159 suffix.temperature  = {'water_temp', 'temperature'};
0160 suffix.salinity     = {'salinity', 'salinity'};
0161 suffix.u            = {'water_u', 'u'};
0162 suffix.v            = {'water_v', 'v'};
0163 suffix.ssh          = {'surf_el', 'ssh'};
0164 
0165 % Get the URL to use for the first time step.
0166 url = get_url(modelTime(1), threehourly);
0167 
0168 if modelTime(1) < greg2mjulian(2008, 09, 19, 0, 0, 0)
0169     name_index = 1;
0170 elseif modelTime(1) >= greg2mjulian(2008, 09, 19, 0, 0, 0)
0171     name_index = 2;
0172 end
0173 hycom.MT            = [url, suffix.MT{name_index}];          % time [1D]
0174 hycom.Longitude     = [url, suffix.Longitude{name_index}];   % [2D]
0175 hycom.Latitude      = [url, suffix.Latitude{name_index}];    % [2D]
0176 hycom.Depth         = [url, suffix.Depth{name_index}];       % water depth [2D]
0177 hycom.temperature   = [url, suffix.temperature{name_index}]; % [4D]
0178 hycom.salinity      = [url, suffix.salinity{name_index}];    % [4D]
0179 hycom.ssh           = [url, suffix.ssh{name_index}];         % sea surface height [3D]
0180 hycom.u             = [url, suffix.u{name_index}];           % mean flow [4D]
0181 hycom.v             = [url, suffix.v{name_index}];           % mean flow [4D]
0182 
0183 % Load the depth data (1D vector).
0184 ncid = netcdf.open(hycom.Depth, 'NOWRITE');
0185 try
0186     varid = netcdf.inqVarID(ncid, 'Depth');
0187 catch
0188     varid = netcdf.inqVarID(ncid, 'depth');
0189 end
0190 
0191 % HYCOM has fixed depths, so the array which returned here is just
0192 % a list of those depths. We need them to interpolate the vertical
0193 % structure onto the FVCOM sigma levels.
0194 data.Depth.data = netcdf.getVar(ncid, varid, 'double');
0195 
0196 netcdf.close(ncid)
0197 
0198 % Get the number of vertical levels.
0199 nz = length(data.Depth.data);
0200 
0201 % Set a time increment is we've got the three-hourly flag passed to the
0202 % function. At some point, this may have to be a string comparison such
0203 % that we can use different outputs (if the HYCOM folks add them), but for
0204 % now, we just assume non-empty strings means three-hourly data. This will
0205 % also break horribly if we request data from both daily and three-hourly
0206 % data sets.
0207 if ~isempty(threehourly)
0208     time_increment = 3/24;
0209 else
0210     time_increment = 1;
0211 end
0212 times = modelTime(1):time_increment:modelTime(2);
0213 nt = length(times);
0214 
0215 % Before we go off downloading data, check the variables we've been asked
0216 % for are actually valid HYCOM names.
0217 for vv = 1:length(varlist)
0218     if iscell(varlist) && ~isfield(hycom, varlist{vv})
0219         error('Variable %s is not a valid HYCOM variable name.', varlist{vv})
0220     end
0221 end
0222 
0223 % Initialise the value of url so we can compare it each loop and only
0224 % reopen the connection to the server if we've crossed into a different
0225 % data set.
0226 url = get_url(modelTime(1), threehourly);
0227 
0228 data.time = [];
0229 c = 1; % counter for the tmjd cell array.
0230 for tt = 1:nt
0231 
0232     % So we can use either the Reanalysis (pre-2008) or Analysis
0233     % (post-2008) data, we need to build the request struct based on the
0234     % current time.
0235 
0236     % Set up a struct of the HYCOM data sets in which we're interested.
0237     if times(tt) < greg2mjulian(2008, 09, 19, 0, 0, 0)
0238         name_index = 1;
0239     elseif times(tt) >= greg2mjulian(2008, 09, 19, 0, 0, 0)
0240         name_index = 2;
0241     end
0242     hycom.MT            = [url, suffix.MT{name_index}];          % time [1D]
0243     hycom.Longitude     = [url, suffix.Longitude{name_index}];   % [2D]
0244     hycom.Latitude      = [url, suffix.Latitude{name_index}];    % [2D]
0245     hycom.Depth         = [url, suffix.Depth{name_index}];       % water depth [2D]
0246     hycom.temperature   = [url, suffix.temperature{name_index}]; % [4D]
0247     hycom.salinity      = [url, suffix.salinity{name_index}];    % [4D]
0248     hycom.ssh           = [url, suffix.ssh{name_index}];         % sea surface height [3D]
0249     hycom.u             = [url, suffix.u{name_index}];           % mean flow [4D]
0250     hycom.v             = [url, suffix.v{name_index}];           % mean flow [4D]
0251 
0252     oldurl = url;
0253     url = get_url(times(tt), threehourly);
0254     % Only reopen the connection if the two URLs differ.
0255     if ~strcmpi(oldurl, url) || tt == 1
0256         if times(tt) < greg2mjulian(2008, 09, 19, 0, 0, 0)
0257             hycom.MT = [url, suffix.MT{1}];
0258         elseif times(tt) >= greg2mjulian(2008, 09, 19, 0, 0, 0)
0259             hycom.MT = [url, suffix.MT{2}];
0260         end
0261         ncid = netcdf.open(hycom.MT, 'NOWRITE');
0262         try
0263             varid = netcdf.inqVarID(ncid, 'MT');
0264         catch
0265             varid = netcdf.inqVarID(ncid, 'time');
0266         end
0267 
0268         % Add the new data to the cell array. This should build up an
0269         % array of unique time series. We can then use these to query
0270         % for the time indices for each time step later.
0271         data.MT.data{c} = netcdf.getVar(ncid, varid, 'double');
0272 
0273         netcdf.close(ncid)
0274 
0275         % Convert to Gregorian date and then to Modified Julian Days. The
0276         % Global Reanalysis stores time as hours since 2000-01-01, the
0277         % Global Analysis as days since 1900-12-31.
0278         if times(tt) < greg2mjulian(2008, 09, 19, 0, 0, 0)
0279             t{c} = datevec((data.MT.data{c} / 24) + datenum(2000, 1, 1, 0, 0, 0));
0280         elseif times(tt) >= greg2mjulian(2008, 09, 19, 0, 0, 0)
0281             t{c} = datevec(data.MT.data{c} + datenum(1900, 12, 31, 0, 0, 0));
0282         end
0283         tmjd{c} = greg2mjulian(t{c}(:,1), t{c}(:,2), t{c}(:,3), t{c}(:,4), t{c}(:,5), t{c}(:,6));
0284 
0285         c = c + 1;
0286     end
0287 end
0288 
0289 % Open the relevant spatial variables on the remote server and download the
0290 % spatial data required to subset the HYCOM data.
0291 ncid = netcdf.open(hycom.Longitude, 'NOWRITE');
0292 try
0293     varid = netcdf.inqVarID(ncid, suffix.Longitude{1});
0294 catch
0295     varid = netcdf.inqVarID(ncid, suffix.Longitude{2});
0296 end
0297 
0298 data.X.data = netcdf.getVar(ncid, varid, 'double');
0299 
0300 netcdf.close(ncid)
0301 
0302 % Make the longitude values in the range -180 to 180 (to match the
0303 % model inputs).
0304 data.X.data = mod(data.X.data, 360);
0305 data.X.data(data.X.data > 180) = data.X.data(data.X.data > 180) - 360;
0306 
0307 ncid = netcdf.open(hycom.Latitude, 'NOWRITE');
0308 try
0309     varid = netcdf.inqVarID(ncid, suffix.Latitude{1});
0310 catch
0311     varid = netcdf.inqVarID(ncid, suffix.Latitude{2});
0312 end
0313 
0314 data.Y.data = netcdf.getVar(ncid, varid, 'double');
0315 
0316 netcdf.close(ncid)
0317 
0318 % If the spatial data are vectors, turn them in to matrices here.
0319 if isvector(data.X.data) && isvector(data.Y.data)
0320     [data.X.data, data.Y.data] = meshgrid(data.X.data, data.Y.data);
0321     % Orient the arrays as required for the calls to the remote server.
0322     data.X.data = data.X.data';
0323     data.Y.data = data.Y.data';
0324 end
0325 
0326 % Create indices of the size of the arrays.
0327 data.X.idx = repmat(1:size(data.X.data, 1), [size(data.X.data, 2), 1])';
0328 data.Y.idx = repmat(1:size(data.Y.data, 2), [size(data.Y.data, 1), 1]);
0329 %data.Y.idx = 1:size(data.Y.data, 2) - 1;
0330 
0331 % Find the indices which cover the model domain then find the extremes to
0332 % request only a subset from the OPeNDAP server.
0333 idx = data.X.data > extents(1) & data.X.data < extents(2) & data.Y.data > extents(3) & data.Y.data < extents(4);
0334 xrange = [min(data.X.idx(idx)), max(data.X.idx(idx))];
0335 yrange = [min(data.Y.idx(idx)), max(data.Y.idx(idx))];
0336 
0337 data.lon = data.X.data(xrange(1):xrange(2), yrange(1):yrange(2));
0338 data.lat = data.Y.data(xrange(1):xrange(2), yrange(1):yrange(2));
0339 
0340 % Clear out the full lon/lat arrays.
0341 % data = rmfield(data, {'X', 'Y'});
0342 
0343 % Now get the variables for which we've been asked.
0344 fields = varlist;
0345 
0346 for aa = 1:length(fields)
0347     % Store the downloaded data in a struct. Assume the spatial
0348     % data is identical to that in data.lon and data.lat.
0349     data.(fields{aa}).data = [];
0350 
0351     ncid = netcdf.open(hycom.(fields{aa}));
0352     varid = netcdf.inqVarID(ncid, suffix.(fields{aa}){name_index});
0353 
0354     % If you don't know what it contains, start by using the
0355     % 'netcdf.inq' and ncinfo operations:
0356     %[numdims, numvars, numglobalatts, unlimdimid] = netcdf.inq(ncid);
0357     %ncid_info = ncinfo(hycom.(fields{aa}));
0358 
0359     % Typically these data are 4D, with dimensions of:
0360     %   - x (X)
0361     %   - y (Y)
0362     %   - depth (Depth)
0363     %   - time (MT)
0364     % except in the case of sea surface height, where we lose
0365     % the depth dimension. For all other variables, we want all
0366     % depths but only a subset in time and space.
0367 
0368     % Since the HYCOM OPeNDAP server is so spectacularly slow,
0369     % extract a day's data at a time and stick them together
0370     % here. If nothing else, this at least means I can give
0371     % some indication of progress, rather than just wait for
0372     % something to eventually happen.
0373     nx = (xrange(2) - xrange(1)) + 1;
0374     ny = (yrange(2) - yrange(1)) + 1;
0375 
0376     % Preallocate the output so we don't append to an array
0377     % (which is slow). Sea surface height is 3D only (all the
0378     % other variables are 4D). So, it needs its own little
0379     % check all to itself.
0380     if strcmpi(fields{aa}, 'ssh') == 1
0381         was_zeta = true; % set boolean for surface elevation
0382         data.(fields{aa}).data = nan(nx, ny, nt);
0383     else
0384         was_zeta = false;
0385         data.(fields{aa}).data = nan(nx, ny, nz, nt);
0386     end
0387 
0388     c = 0; % counter for iterating through tmjd.
0389 
0390     for tt = 1:nt
0391         if ftbverbose
0392             fprintf('%s: time %i of %i... ', fields{aa}, tt, nt)
0393         end
0394 
0395         % Get the current url value for this time step. This
0396         % approach means we can grab data which straddles a
0397         % boundary between HYCOM outputs. Only reopen the
0398         % connection if the url value has changed. At this
0399         % point we also need to get ourselves a new time index
0400         % using modelTime and the cell array tmjd.
0401         oldurl = url;
0402         url = get_url(times(tt), threehourly);
0403 
0404         if ~strcmpi(oldurl, url) || tt == 1
0405             if times(tt) < greg2mjulian(2008, 09, 19, 0, 0, 0)
0406                 hycom.(fields{aa}) = [url, suffix.(fields{aa}){1}];
0407             elseif times(tt) >= greg2mjulian(2008, 09, 19, 0, 0, 0)
0408                 hycom.(fields{aa}) = [url, suffix.(fields{aa}){2}];
0409             end
0410             c = c + 1;
0411         end
0412 
0413         % Find the time index closest to the current model
0414         % time.
0415         [~, ts] = min(abs(tmjd{c} - times(tt)));
0416 
0417         if was_zeta
0418             % netCDF starts at zero, hence -1.
0419             start = [xrange(1), yrange(1), ts] - 1;
0420             count = [nx, ny, 1];
0421             data.(fields{aa}).data(:, :, tt) = ncread(url, suffix.(fields{aa}){name_index}, start + 1, count);
0422         else
0423             % netCDF starts at zero, hence -1.
0424             start = [xrange(1), yrange(1), 1, ts] - 1;
0425             count = [nx, ny, nz, 1];
0426             data.(fields{aa}).data(:, :, :, tt) = ncread(url, suffix.(fields{aa}){name_index}, start + 1, count);
0427         end
0428 
0429         % Build an array of the HYCOM times. Only do so once so
0430         % we don't end up appending it multiple times.
0431         if length(data.time) < nt
0432             data.time = [data.time; tmjd{c}(ts)];
0433         end
0434 
0435         if ftbverbose; fprintf('done.\n'); end
0436     end
0437     netcdf.close(ncid);
0438 end
0439 
0440 if ftbverbose
0441     fprintf('end   : %s\n', subname)
0442 end
0443 
0444 function url = get_url(time, threehourly)
0445 % Child function to find the relevant URL to use for a given time step.
0446 %
0447 % url = get_url(time, threehourly);
0448 %
0449 % INPUT:
0450 %   time - Modified Julian Day
0451 %   threehourly - additional string to append for optional three hourly
0452 %   outputs for the 1992/10/02 to 2008/09/18 period. Leave empty for daily
0453 %   data.
0454 %
0455 % OUTPUT:
0456 %   url - string of the approprate URL for the date supplied in time.
0457 %
0458 
0459 [t1, t2, t3, t4, t5, t6] = datevec(datestr(now));
0460 
0461 if time < greg2mjulian(1992, 10, 2, 0, 0, 0)
0462     error('No HYCOM data available prior to 1992-10-02. Select a start date from 1992-10-02 onwards.')
0463 elseif time >= greg2mjulian(1992, 10, 2, 0, 0, 0) && time < greg2mjulian(1995, 7, 31, 0, 0, 0)
0464     warning('Using the HYCOM Global Reanalysis data for dates up to 2008/09/16, thereafter the Global Analysis.')
0465     url = sprintf('http://tds.hycom.org/thredds/dodsC/GLBu0.08/expt_19.0%s?', threehourly);
0466 elseif time >= greg2mjulian(1995, 7, 31, 0, 0, 0) && time < greg2mjulian(2008, 09, 19, 0, 0, 0)
0467     warning('Using the HYCOM Global Reanalysis data for dates up to 2008/09/16, thereafter the Global Analysis.')
0468     url = sprintf('http://tds.hycom.org/thredds/dodsC/GLBu0.08/expt_19.1%s?', threehourly);
0469 elseif time >= greg2mjulian(2008, 9, 19, 0, 0, 0) && time < greg2mjulian(2009, 5, 7, 0, 0, 0)
0470     url = 'http://tds.hycom.org/thredds/dodsC/GLBa0.08/expt_90.6?';
0471 elseif time >= greg2mjulian(2009, 5, 7, 0, 0, 0) && time < greg2mjulian(2011, 1, 3, 0, 0, 0)
0472     url = 'http://tds.hycom.org/thredds/dodsC/GLBa0.08/expt_90.8?';
0473 elseif time >= greg2mjulian(2011, 1, 3, 0, 0, 0) && time < greg2mjulian(2013, 8, 21, 0, 0, 0)
0474     url = 'http://tds.hycom.org/thredds/dodsC/GLBa0.08/expt_90.9?';
0475 elseif time >= greg2mjulian(2013, 8, 21, 0, 0, 0) && time < greg2mjulian(2014, 4, 21, 0, 0, 0)
0476     url = 'http://tds.hycom.org/thredds/dodsC/GLBa0.08/expt_91.0?';
0477 elseif time >= greg2mjulian(2014, 4, 21, 0, 0, 0) && time < greg2mjulian(2016, 4, 18, 0, 0, 0)
0478     url = 'http://tds.hycom.org/thredds/dodsC/GLBa0.08/expt_91.1?';
0479 elseif time >= greg2mjulian(2016, 4, 18, 0, 0, 0) && time <= greg2mjulian(t1, t2, t3, t4, t5, t6)
0480     url = 'http://tds.hycom.org/thredds/dodsC/GLBa0.08/expt_91.2?';
0481 elseif time > greg2mjulian(t1, t2, t3, t4, t5, t6)
0482     error('Given date is in the future.')
0483 else
0484     error('Date is outside of the known spacetime continuum. See help TARDIS.')
0485 end
0486

Generated on Wed 20-Feb-2019 16:06:01 by m2html © 2005