Visit the new DAL package TWiki
page: https://twiki.cern.ch/twiki/bin/view/Atlas/DaqHltDal
Schema Changes
- Add a new SW_PackageVariable
class to describe expandable environment variables defined by the SW
packages (the variables are linked via the new AddProcessEnvironment
relationship). The class defines the variable name and a suffix that is
appended to the installation path of the SW package it is linked with.
When the same package variable is linked with several packages, the
resulting values are concatenated using a colon (':') separator.
- Add a NumberOfCores
attribute to the ComputerParameters
class to describe the number of processor cores. This attribute will be
used to choose a reasonable number of template HLT tasks depending on
the capabilities of the node where they run.
- Add DBLookup technology
type to TriggerDBConnection.
- Add a DBConnection base
class for the description of any type of database connection
parameters. TriggerDBConnection now extends it.
- Add a DBConnections
relationship to the TriggerConfiguration
class to link all DBConnection objects needed by trigger applications.
The TriggerDBConnection is still linked explicitly via a dedicated
relationship (for backward compatibility).
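The colon-separated concatenation rule for SW_PackageVariable can be sketched as a small helper. This is a hypothetical illustration: build_variable_value, its parameters, and the path/suffix layout are assumptions for the sketch, not the actual DAL implementation.

```cpp
#include <string>
#include <vector>

// Hypothetical helper: compute the value of a package variable that is
// linked with several SW packages. For each package, the variable's
// suffix is appended to the package installation path; the per-package
// values are then joined with a ':' separator.
std::string build_variable_value(const std::vector<std::string>& install_paths,
                                 const std::string& suffix)
{
    std::string value;
    for (const std::string& path : install_paths) {
        if (!value.empty())
            value += ':';          // colon separator between packages
        value += path + suffix;    // installation path + suffix
    }
    return value;
}
```

For example, a variable with suffix "/lib" linked with two packages installed under /sw/pkgA and /sw/pkgB would get the value "/sw/pkgA/lib:/sw/pkgB/lib".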
Generation of Segment-Wide Process Environment by Infrastructure
Applications
An infrastructure application may provide a service used by all
applications running inside its segment (e.g. rdb_server, is_server,
dbproxy). It is necessary to pass the name of this server to all its
clients. The changes described below allow this process to be automated
using process environment variables generated by a DAL algorithm.
Schema changes: add
attributes SegmentProcEnvVarName,
SegmentProcEnvVarParentName and
SegmentProcEnvVarValue to the InfrastructureApplication class.
If the value of the SegmentProcEnvVarName
attribute is non-empty, a process environment variable whose name is
equal to that value is created for every application of the segment the
infrastructure application belongs to (use case: pass the name of a
service to all clients in the segment). The value of this variable
depends on the SegmentProcEnvVarValue attribute: it can be either the
application ID or the name of the host the infrastructure application
runs on.
If the value of the SegmentProcEnvVarParentName
attribute is non-empty, a process environment variable whose name is
equal to that value is generated for all applications of the segment.
Its value is equal to the value set for the variable corresponding to
SegmentProcEnvVarName in the parent segment (use case: pass the name of
a top-level service to the service of this segment).
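The selection of the generated variable's value can be sketched as follows. This is an illustration only: segment_env_var_value and the "appId" token check are assumptions modelling the behaviour described above, not the actual DAL algorithm code.

```cpp
#include <string>

// Hypothetical sketch: derive the value of the segment-wide variable
// from the SegmentProcEnvVarValue attribute. It is either the
// infrastructure application's ID or the name of the host the
// application runs on.
std::string segment_env_var_value(const std::string& mode,
                                  const std::string& app_id,
                                  const std::string& host_name)
{
    // "appId" selects the application ID; anything else selects the host
    return (mode == "appId") ? app_id : host_name;
}
```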
Example: segment rdb_server
- set the rdb_server's SegmentProcEnvVarName=TDAQ_DB_NAME and
SegmentProcEnvVarValue=appId
- run the rdb server with the option "-a TDAQ_DB_NAME";
the "-a" option tells the server to take its name from the environment
variable TDAQ_DB_NAME. Do not use the "-d XYZ" command line option and
do not create any process environment to pass the name of the
rdb_server to the segment's applications, as was done before!
- as a result of the above, the rdb_server will run with a unique
application ID, and this ID will be passed via the TDAQ_DB_NAME process
environment variable to all applications of this segment
C++ Algorithms
- Test for circular dependencies between most DAL objects and throw an
exception if one is found (e.g. a segment A contains a segment B that
in turn contains segment A, etc.; previously this crashed the DAL
algorithms). Reminder: it is
necessary to catch the daq::config::Exception and
daq::core::AlgorithmError exceptions (or at least ers::Issue) when a
DAL algorithm without an explicit throw() specification is used.
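The recommended catch pattern can be illustrated with a self-contained sketch. Issue, AlgorithmError and run_dal_algorithm below are minimal stand-ins for ers::Issue, daq::core::AlgorithmError and a real DAL algorithm call; they are not the actual TDAQ classes.

```cpp
#include <stdexcept>
#include <string>

// Stand-in exception hierarchy: in TDAQ, the algorithm error derives
// (via the issue base class) from std::exception.
struct Issue : std::runtime_error {
    using std::runtime_error::runtime_error;
};
struct AlgorithmError : Issue {
    using Issue::Issue;
};

// Stand-in for a DAL algorithm that detects a circular dependency
// (e.g. segment A contains segment B, which contains segment A).
void run_dal_algorithm()
{
    throw AlgorithmError("circular dependency between segments A and B");
}

// Catch the specific algorithm error first, then the base issue class,
// mirroring the advice to catch daq::core::AlgorithmError or at least
// ers::Issue around DAL algorithm calls.
std::string call_algorithm_safely()
{
    try {
        run_dal_algorithm();
        return "ok";
    }
    catch (const AlgorithmError& ex) {
        return std::string("algorithm error: ") + ex.what();
    }
    catch (const Issue& ex) {
        return std::string("issue: ") + ex.what();
    }
}
```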
Java Algorithms
- add explicit declarations of the exceptions thrown by DAL algorithms
Add Utility to Generate Authentication and DBLookup Files for
DBProxy Clients
See "dal_create_db_connection_files -h" for command line options.
This utility creates the authentication and dblookup files required by
the DbProxy clients (PT and L2PU). It uses the information stored in
the DBConnection objects in
the databases (linked through the
TriggerConfiguration object).
If the server name in such an object is
specified as "LOCAL_HOST", the fully qualified name of the machine on
which the utility is run will be used.
This utility is meant to be run on all HLT nodes before the PT and L2PU
applications configure themselves.