Running ACIS Extract for timing analysis and point source photometry

P. Broos and L. Townsley

photometry_procedure.txt $Revision: 1304 $ $Date: 2022-02-14 13:36:12 -0700 (Mon, 14 Feb 2022) $

Tested on CIAO 4.12.

THIS PROCEDURE MUST BE EXECUTED IN A csh OR tcsh SHELL!



We assume that you have completed all the setup steps in validation_procedure.txt, and that you still have the "screen" session built in validation_procedure.txt, with a window for each ObsID.  



*****
BEWARE OF OVERFLOWING IDL'S COMMAND LINE BUFFER!
If characters are sent to the IDL prompt too quickly then its command line buffer will overflow and drop some of those characters.  This problem can easily occur if you paste a large block of IDL commands into a terminal window where IDL is running.

In this recipe, blocks of IDL commands are sent to IDL processes in two distinct ways:

1. When the same commands need to be sent to several IDL sessions (e.g. to perform a task for each ObsID), we use features of the "screen" utility to "paste" those commands into several screen windows.  The "screen" session is configured to slow down those "paste" operations, to reduce the chance of buffer overflow.

2. When commands need to be sent to just one IDL session, this recipe directs you to directly "paste" those commands in the appropriate screen window, using your computer's normal cut-and-paste operations.  In this case, the best way to prevent overflow of the IDL command buffer is to configure your terminal application's "paste" command to slow down the rate that characters are delivered.  

For example, the Mac terminal application iTerm2 has a menu item called "Paste Slowly" for just this situation.  If you have iTerm2, then the most convenient approach is to navigate to iTerm2->Preferences->Keys->Global Shortcut Keys.  There, you can bind your normal "paste" key (e.g. CMD-V) to the menu item "Paste Slowly".

Other terminal applications may have similar options to slow down pasting.  If that is not available, you may find that you have to cut-and-paste IDL commands in smaller chunks.
****
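"Paste Slowly" simply delivers the text in small chunks with a pause between them.  A minimal Python sketch of the idea (purely illustrative; not part of this recipe):

```python
import time

def slow_paste(text, write, chunk_size=16, delay=0.05):
    """Deliver `text` to `write` in small chunks, pausing between chunks
    so a slow consumer (e.g. IDL's command line buffer) is not overrun."""
    for start in range(0, len(text), chunk_size):
        write(text[start:start + chunk_size])
        time.sleep(delay)

# Example: collect the chunks in a list instead of writing to a terminal.
chunks = []
slow_paste("print, 'hello'\n" * 4, chunks.append, chunk_size=16, delay=0)
```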




=============================================================================
Record in your notes (photometry_notes.txt) the version of this procedure that you are executing, shown below:
  Starting photometry_procedure.txt $Revision: 1304 $ $Date: 2022-02-14 13:36:12 -0700 (Mon, 14 Feb 2022) $
=============================================================================


=============================================================================
Check and record the CIAO and CALDB versions your machine is using.  

  ciaover
  check_ciao_caldb --latest
=============================================================================


#################################################################################
############# REVIEW THE CALIBRATION STATUS OF YOUR DATA ##################
#################################################################################
Between the time you started the validation procedure and now, the CXC may have updated several calibration products, particularly the model of the optical-blocking filter (OBF) transmission, and the time-dependent gain (TGAIN) files for the ACIS detectors.  A new version of CIAO may have been released, and slight changes in the FITS headers of Chandra data products may have been introduced.  Now is the time to incorporate such changes into your data analysis.

NOTE THAT UPDATES TO CIAO OR CALDB WHILE YOU WERE IN THE VALIDATION PROCEDURE HAVE PROBABLY NOT BEEN PROPAGATED TO YOUR DATA PRODUCTS!  Many data products are re-used (not recalculated in every pass) by the validation procedure, for efficiency.

You should review the calibration status of your data before proceeding, because this should be your last extraction of the catalog.  See calibration_patch_procedure.txt.



#################################################################################
#################### START BUILDING ADDITIONAL EXPOSURE MAPS ####################
#################################################################################
Exposure maps at additional energies will be needed in the diffuse recipe.  Start building these now (in a new screen session).

  cd  <target>/data/
  setenv TARGET <target>
  
  unset correct
  setenv OBS_LIST ""
  foreach dir (`'ls' -d pointing_*/obsid_* |sort --field-separator=_ --key=3g`)
    if (! -d $dir) continue
    set obs=`basename $dir | sed -e "s/obsid_//"`
    setenv OBS_LIST "$OBS_LIST $obs"
    end
  echo $OBS_LIST  
  
  setenv SCREEN_ARCH ''
  if (`uname -m` == arm64)  setenv SCREEN_ARCH 'arch -arm64 tcsh'

  setenv SCREEN_NAME diffuse_emaps_${TARGET}
  setenv PROMPT              "%m::${TARGET} %% "
  screen -S $SCREEN_NAME -d -m -t  ${TARGET}  $SCREEN_ARCH
  screen -S $SCREEN_NAME -X defslowpaste 10

  screen -S $SCREEN_NAME -X setenv PROMPT "%m::${TARGET} (logs) %% "
  screen -S $SCREEN_NAME -X screen -t logs  $SCREEN_ARCH
  sleep 1
  foreach obs ($OBS_LIST)
    set dir=$PWD/pointing_*/obsid_${obs}
    screen -S $SCREEN_NAME -X chdir   $dir
    screen -S $SCREEN_NAME -X setenv  OBS     ${obs}
    screen -S $SCREEN_NAME -X setenv  PROMPT "${obs} %% "
    screen -S $SCREEN_NAME -X screen  -t   obs${obs}  $SCREEN_ARCH
    sleep 0.5
  end
  screen -S $SCREEN_NAME -X chdir   $PWD
  screen -S $SCREEN_NAME -r
  sleep 4
  screen -S $SCREEN_NAME -X select 0
  screen -S $SCREEN_NAME -X at \# slowpaste 10
  screen -S $SCREEN_NAME -X at \# stuff 'unset correct^M'
  screen -S $SCREEN_NAME -X at \# stuff 'pwd^M'
  screen -S $SCREEN_NAME -p 0 -X stuff 'echo $OBS_LIST | wc -w^M'
  screen -S $SCREEN_NAME -X windowlist -b
 
 
Double-check that you have a screen window for every ObsID.  There should also be a screen window called 'logs' plus the usual top-level $TARGET window #0.  Use ^a" to see the list of screen windows.  The last window index should be ONE LARGER than the number of ObsIDs (which was printed by the "echo" command above).  


From screen window #0 (named $TARGET) run:
        
cat > block.txt << \STOP
  
  idl -queue |& tee -a ae_make_emap.log 
    file_delete, "../../waiting_for_"+getenv("OBS"), /ALLOW_NONEXISTENT 
\STOP
  screen -S $SCREEN_NAME                -X readreg A block.txt

cat > block.txt << \STOP
    .run L1_2_L2_emaps.pro
    .run ae
    .run
    file_delete, "ae_finished", /ALLOW_NONEXISTENT 
    semaphore = wait_for_cpu()
    ; Discard any existing instrument maps, which may be miscalibrated if very old.
    L1_2_L2_emaps, OUTPUT_DIR="AE", REUSE_ASPHIST=1, REUSE_INSTMAP=0, /INCLUDE_FI_AND_BI_PRODUCTS, /INCLUDE_DIFFUSE_MAPS 
    file_move, "ae_lock", "ae_finished", /OVERWRITE 
    exit, /NO_CONFIRM 
    end  
\STOP
  screen -S $SCREEN_NAME                -X readreg B block.txt


  foreach obs ($OBS_LIST)
    set marker=waiting_for_${obs}; touch ${marker}
    screen -S $SCREEN_NAME -p obs${obs} -X   paste A
    sleep 1
    while ( -e ${marker} )
      echo "Launching IDL ..."
      sleep 1
    end
    printf "\nLaunching ObsID ${obs}\n"
    screen -S $SCREEN_NAME -p obs${obs} -X   paste B
    sleep 1
  end
  touch .marker
  while 1
    sleep 30
    printf '\nThe following L1_2_L2 runs are still making progress:\n'
    printf '  %s\n' `find pointing*/obsid*/ae_make_emap.log -newer .marker`
    touch .marker
    if (! `ls pointing*/obsid*/ae_lock | wc -l`) break
  end
 
  printf "\n\n"
  egrep -i "WARNING|ERROR|INFORMATION|DEBUG|halted|Stop|BADPIX_FILE" `ls -1tr pointing*/obsid*/ae_make_emap.log` | egrep -v "stowed_error|Merged|previously computed|Stop-|TSTOP|stop=sky|debug level|Program caused arithmetic error"


When all exposure maps are made and you have reviewed the log file grep above, you can kill this screen session with ^a\ as usual.

WHILE THOSE EXPOSURE MAPS ARE BEING BUILT, CONTINUE ON WITH THIS PROCEDURE


NOTES:
We take the conservative approach of discarding instrument maps (REUSE_INSTMAP=0), in case the observer is running this procedure to "update" the calibration of a target that was validated long ago.  That task could be done via calibration_patch_procedure.txt.




#################################################################################
############# PREPARE FOR HIGH-PRECISION TIMING ANALYSIS (OPTIONAL) ##################
#################################################################################
FOR OLD OBSERVATIONS that were recently downloaded, you don't need to run this section; the archive already contained the final ephemeris when you downloaded the data.

FOR NEW OBSERVATIONS, if you care about high-precision timing analysis, then download the final ephemeris (Level 1) for each ObsID to improve the barycentric correction made to the event timestamps.  This is only necessary if the CXC has updated the ephemeris since you downloaded the data.

IN THE VALIDATION SCREEN SESSION (not the emap screen session), from screen window #0 (named $TARGET), run:

  set extract_dir=`echo $cwd | sed -E "s|(.*extract)(.*)|\1|"`
  cd $extract_dir/.. ; pwd

  chmod    u+w pointing_*/obsid_*/archive/primary/

  foreach dir (pointing_*/obsid_*)
    if (! -d $dir) continue
    set obs=`basename $dir | sed -e "s/obsid_//"`
    printf "\nDownloading ephemeris for ObsID %s\n" $obs

    pushd /tmp/;  download_chandra_obsid -q $obs eph1;  popd 
    rsync -a /tmp/$obs/primary/orbit*eph1.fits.gz .

    gunzip      orbit*eph1.fits.gz
    set    file=orbit*eph1.fits

    'rm'                                         ${dir}/acis.eph1
    mv -f ${file} ${dir}/archive/primary/
    ln -s                archive/primary/${file}  ${dir}/acis.eph1

    'ls' -lL ${dir}/acis.eph1
  end

  chmod    u-w pointing_*/obsid_*/archive/primary/
  ls -lL pointing_*/obsid_*/acis.eph1
  

And then revise AE obsXXXX directories to point to those Level 1 ephemerides.

  set data_dir=`echo $cwd | sed -E "s|(.*data)(.*)|\1|"`
  cd $data_dir/extract/ ; pwd

  idl                               
    .run ae
    run_command, file_which('setup_AE_for_obsid_data.csh')+' "../pointing_*/obsid_*" AE'
    exit

  ls -ltr obs[0-9]*/obs.eph
  cd $data_dir/extract/point_sources.noindex/ ; pwd

***PAT -- explain acis.eph1 vs. acis.eph0 symlinks.  Consider having code skip download when not needed. ***





#################################################################################
############# CUSTOM EXTRACTION APERTURES ##################
#################################################################################
This point in the procedure, before the final extraction (below), is a good time to consider whether you wish to draw any custom extraction apertures.  Situations where a custom aperture may be scientifically beneficial include:


* AE's fixed default apertures (90% PSF fraction) are a compromise between wanting to extract all the signal from a source and wanting to minimize background in that extraction.  However, the PSF of a bright source may remain well above the background outside the 90% aperture.  Enlarging the aperture may increase the signal-to-noise ratio of the extraction.

* The same situation can arise for apertures that have been reduced below 90% to avoid overlapping with a neighbor's aperture.

* A heavily-piled extraction may benefit from extracting more of the wings of the PSF, where pile-up is greatly reduced.

* When pile-up cannot be successfully modeled using a normal extraction aperture, excluding the most heavily-piled data (the core of the PSF) may allow pile-up modeling to succeed.

* In principle, reducing the apertures for weak sources could improve the signal-to-noise ratio of their extractions.
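The aperture-size trade-offs above can be illustrated with a toy calculation (Python, with invented numbers, a circular-Gaussian PSF, and a background that scales with aperture area -- none of which is a realistic model of ACIS):

```python
import math

def snr(total_src_counts, psf_fraction, bkg_per_area, sigma=1.0):
    """Toy SNR for a circular aperture enclosing `psf_fraction` of a
    2-D Gaussian PSF, with background proportional to aperture area."""
    r2  = -2.0 * sigma**2 * math.log(1.0 - psf_fraction)  # radius^2 enclosing that fraction
    src = psf_fraction * total_src_counts
    bkg = bkg_per_area * math.pi * r2
    return src / math.sqrt(src + bkg)

fractions = [0.90, 0.92, 0.94, 0.96, 0.98]
best = lambda S: max(fractions, key=lambda f: snr(S, f, bkg_per_area=1.0))

# A bright source gains from a larger aperture; a weak one does not.
print(best(10000.0))  # the largest fraction wins
print(best(20.0))     # the smallest fraction wins
```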


Obviously, whether your time and money are well-spent on fine-tuning aperture sizes for a particular source is a management question, for which I can give no advice.


===============================================================================
RISKS PRODUCED BY CUSTOM APERTURES

**********
ANY CUSTOM APERTURE YOU DEFINE can adversely affect nearby sources.  AE will do its best to reduce the apertures of nearby sources to avoid overlapping apertures, but if overlap cannot be avoided then extractions of those neighbors (and possibly the extractions you just hand-crafted) WILL BE DISCARDED by the MERGE stage.
**********

**********
ANY CUSTOM APERTURE YOU DEFINE will remain stationary, even if you adopt a new position for the source.
DO NOT MESS WITH CUSTOM APERTURES UNTIL THE CATALOG IS STABLE.
**********




===============================================================================
IF YOU DO DECIDE TO DESIGN SOME CUSTOM APERTURES FOR A SOURCE, REMEMBER THAT IT HAS INDEPENDENT APERTURES IN EACH OBSID.  PSF FRACTIONS AND CROWDING CONDITIONS FOR THAT SOURCE MAY VARY AMONG ITS OBSIDS.


**** A NOTE ABOUT VISUAL ASSESSMENT OF CUSTOM APERTURES ****
A custom aperture is typically designed by visually reviewing the single-ObsID event list near the source, with the goals of maximizing signal-to-noise in the extracted spectrum, minimizing contamination from neighboring sources, and (when applicable) reducing pile-up effects.

To make good visual judgements, the displayed single-ObsID event list should be filtered appropriately.  When the /SHOW stage of acis_extract is used, appropriate filtering (on the "energy" and "status" columns) is applied automatically.

However, if you directly display event lists, then manual filtering may be appropriate in some cases.

  * ../obsXXXX/spectral.evt and <sourcename>/<obsname>/neighborhood.evt are preferred.  Note that they are not band-limited, so applying the filter [energy=500:7000] would be a slight improvement.

  * ../obsXXXX/validation.evt is NOT appropriate for bright/piled sources, because it is "heavily cleaned".  It is also not band-limited.
  
  * target.evt is NOT appropriate for designing custom apertures.  It combines all ObsIDs and it is "heavily cleaned" (inappropriate for piled sources).




An outline of a procedure for modifying apertures for multiple sources is shown below.  
This procedure assumes that modified_aperture.srclist contains the list of sources that need custom apertures.  
Recall that source apertures are stored in the <sourcename>/<obsname>/extract.reg files.  


1. First, it's often useful to have AE generate apertures at a variety of PSF fractions, so that you don't have to guess the PSF fraction that will be best for each source.
In each ObsID window, run:

  idl -queue |& tee -a modified_aperture.${OBS}.log
    .run ae
    .run
    ae_make_catalog, getenv("OBS"), EVTFILE="spectral.evt", /REGION_ONLY, SRCLIST_FILENAME='modified_aperture.srclist', EXTRACTION_NAME='92pct', MINIMUM_PSF_FRACTION=0.92, NOMINAL_PSF_FRACTION=0.92, REUSE_PSF=0, PSF_MODEL_COUNTS=replicate(1e6,5)

    ae_make_catalog, getenv("OBS"), EVTFILE="spectral.evt", /REGION_ONLY, SRCLIST_FILENAME='modified_aperture.srclist', EXTRACTION_NAME='94pct', MINIMUM_PSF_FRACTION=0.94, NOMINAL_PSF_FRACTION=0.94

    ae_make_catalog, getenv("OBS"), EVTFILE="spectral.evt", /REGION_ONLY, SRCLIST_FILENAME='modified_aperture.srclist', EXTRACTION_NAME='96pct', MINIMUM_PSF_FRACTION=0.96, NOMINAL_PSF_FRACTION=0.96

    ae_make_catalog, getenv("OBS"), EVTFILE="spectral.evt", /REGION_ONLY, SRCLIST_FILENAME='modified_aperture.srclist', EXTRACTION_NAME='98pct', MINIMUM_PSF_FRACTION=0.98, NOMINAL_PSF_FRACTION=0.98

    exit,/NO_CONFIRM
    end



2. Visually review each aperture size built above on top of the ObsID data, and make notes about what size is best for each extraction of each source.  If you're lucky, the SAME aperture size will be adequate for all extractions of a source.

  idl
  acis_extract, /SHOW, COLLATE='tables/??.collated', SRCLIST='modified_aperture.srclist', /INCLUDE_PRUNED_OBSIDS, /OMIT_BKG_REGIONS, EXTRACTION_NAME='98pct'

  acis_extract, /SHOW, COLLATE='tables/??.collated', SRCLIST='modified_aperture.srclist', /INCLUDE_PRUNED_OBSIDS, /OMIT_BKG_REGIONS, EXTRACTION_NAME='96pct'

  acis_extract, /SHOW, COLLATE='tables/??.collated', SRCLIST='modified_aperture.srclist', /INCLUDE_PRUNED_OBSIDS, /OMIT_BKG_REGIONS, EXTRACTION_NAME='94pct'

  acis_extract, /SHOW, COLLATE='tables/??.collated', SRCLIST='modified_aperture.srclist', /INCLUDE_PRUNED_OBSIDS, /OMIT_BKG_REGIONS, EXTRACTION_NAME='92pct'



3. For each extraction of each source needing custom apertures, COPY the desired aperture to <sourcename>/<obsname>/extract.reg.  This copying can be done by hand, or with shell scripts similar to those below.

  foreach src (`egrep -v '^[[:space:]]*(;|$)' label.txt | egrep 'p1_610|p1_995|c73' | cut -d " " -f1`)
    printf "\n\nSOURCE %s\n" $src
    foreach obs_dir ($src/[0-9]*_?I)
      chmod    u+w $obs_dir/extract.reg
      'rsync' -a $obs_dir/98pct/extract.reg $obs_dir/
      chmod -v a-w $obs_dir/extract.reg
    end
  end

  foreach src (`egrep -v '^[[:space:]]*(;|$)' label.txt | egrep 'p1_752|p1_698' | cut -d " " -f1`)  
    printf "\n\nSOURCE %s\n" $src
    foreach obs_dir ($src/[0-9]*_?I)
      chmod    u+w $obs_dir/extract.reg
      'rsync' -a $obs_dir/94pct/extract.reg $obs_dir/
      chmod -v a-w $obs_dir/extract.reg
    end
  end

  ...



4. If you prefer to hand-edit apertures (e.g. to avoid light from a nearby neighbor), then keep in mind a few principles.

A. Each extraction has an independent aperture, stored in <sourcename>/<obsname>/extract.reg files.


B. The aperture in each extract.reg file must be one or more POLYGONs, in the PHYSICAL (sky) coordinate system for that ObsID.
   Annular apertures of the following form are allowed:
     polygon(...)    <- the "outer_frac" (e.g. 99%) polygon from annulus000:???/extract.reg
    -polygon(...)    <- the "inner_frac" polygon
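For illustration only, an annular extract.reg of that form might look like the following.  (The coordinates are invented; real apertures are written by AE in the ObsID's PHYSICAL coordinate system.)

```
# Region file format: DS9 version 4.1
physical
polygon(4096.5,4512.8,4101.2,4515.0,4099.9,4521.4,4092.3,4519.1)
-polygon(4097.1,4515.5,4099.8,4516.6,4099.2,4518.9,4095.7,4518.0)
```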


C. Each aperture must remain within the footprint of the corresponding PSF image, so that the "PSF fraction" can be calculated.  With care, the PSF_FOOTPRINT_MULTIPLIER option to the CONSTRUCT_REGIONS stage can be used to enlarge the PSF footprint of individual sources.

  obsdir      = '../obsXXXXX/'
  evtfile     = obsdir + 'spectral.evt'
  emapfile    = obsdir + 'obs.emap'
  aspect_fn   = obsdir + 'obs.asol'

  acis_extract, '<source name>', 'ObsID name', /CONSTRUCT_REGIONS, PSF_FOOTPRINT_MULTIPLIER=2,$
                evtfile, EMAP_FILENAME=emapfile, ASPECT_FN=aspect_fn,$
                REUSE_PSF=0, /REGION_ONLY


D. If the SAME aperture is appropriate for all extractions of a source, then you can draw the aperture once (e.g. in <sourcename>/hand_drawn_aperture.reg) in celestial coordinates, and then use an AE tool to assign that region as the aperture for all ObsIDs, e.g.
  idl
    .run ae
    ae_set_region, '053842.90-690604.9', '053842.90-690604.9/hand_drawn_aperture.reg'

or for specific ObsIDs:

    ae_set_region, '053842.90-690604.9', '053842.90-690604.9/hand_drawn_aperture.reg', '16192'


E. If you need to hand-draw each ObsID's aperture, then the shell script below can launch a ds9 session for each ObsID.

  cd to the source directory, e.g.  asl p1_610

  foreach obs ([0-9]*)
    pushd $obs
    chmod u+w extract.reg
    set fn=/tmp/${obs}.evt
    dmcopy "neighborhood.evt[energy>500,energy<7000]" $fn clob+
    ds9 -title "${obs}: PSF model & lightly-cleaned neighborhood" source.psf $fn -region load all ../../../obs${obs}/extract.reg  -region load all extract.reg -zoom to fit &
    popd
  end

In each ds9 session, hand-edit the aperture in the neighborhood.evt frame:
  * The aperture must remain within the PSF image footprint.
  * The aperture should avoid neighbors, as desired.
Remove all other regions in that frame; save your aperture to extract.reg IN PHYSICAL COORDINATES!!!!








===============================================================================
WRITE-PROTECT THE APERTURES YOU HAVE BUILT!
FREEZE THE POSITIONS OF THE SOURCES WITH CUSTOM APERTURES!

If you do not write-protect your apertures, AE will overwrite them! 
If you do not freeze the source positions, future processing (e.g. during a patch procedure) could quietly move these sources, producing inconsistent source positions and apertures. 


Assuming that modified_aperture.srclist records every source with custom apertures, run the following:

  foreach src (`egrep -hv '^[[:space:]]*(;|$)' modified_aperture.srclist  | cut -d " " -f1`)  
    chmod -v a-w $src/[0-9]*/extract.reg
  end

  idl
    .run ae
    ae_poke_source_property, SRCLIST_FILENAME='modified_aperture.srclist', KEYWORD='IMMORTAL', VALUE='T', COMMENT='custom apertures'


  
  
===============================================================================
AFTER THE NEXT EXTRACTION, VERIFY THAT YOUR CUSTOM APERTURES HAVE NOT DAMAGED NEIGHBORING SOURCES.


Collate the modified region files for each ObsID.
  idl
    .run
    foreach obsname, strtrim(strsplit(getenv('OBS_LIST'), /EXTRACT), 2) do begin
      obsdir      = '../obs' + obsname + '/'
      acis_extract, 'modified_aperture.srclist', obsname, REGION_FILE=obsdir+'modified_aperture.reg', REGION_TAG='modified aperture', COLLATED_FILENAME='/dev/null'
    endforeach
    exit, /NO_CONFIRM
    end

VISUALLY REVIEW ALL CUSTOM APERTURES

  acis_extract, COLLATED_FILENAME='tables/most_valid_merge.collated', SRCLIST_FILENAME='modified_aperture.srclist', /SHOW_REGIONS, /OMIT_BKG_REGIONS, /INCLUDE_PRUNED_OBSIDS, OBSID_REGION_GLOB_PATTERN='../obs*/modified_aperture.reg'

Navigate to each nearby neighbor, and verify that its aperture does not significantly overlap any of your custom apertures.




#################################################################################
############# PRELIMINARY CUSTOM MASKING OF DIFFUSE STRUCTURES #############
#################################################################################
From any screen window cd'd to point_sources.noindex, run:

  if (! -e ../diffuse_sources.noindex                   ) mkdir ../diffuse_sources.noindex
  if (! -e ../diffuse_sources.noindex/extra_maskfile.reg) touch ../diffuse_sources.noindex/extra_maskfile.reg 
  ~/bin/acis_lkt4 ../diffuse_sources.noindex/ ../diffuse_sources.noindex/extra_maskfile.reg
  
  ds9 -bin factor 4 ../target.evt -scalelims 0 10  -zoom to 1 -region ../diffuse_sources.noindex/extra_maskfile.reg  &

If you have already made a custom extra_maskfile.reg (e.g. from a previous analysis of this target) it will appear in the ds9 window generated above.  If not, the code above will generate an empty file of that name and no region will appear in the ds9 session above.

The "extraction" section following this one will also execute our first attempt to build masked data products for diffuse analysis.  (We do this now because we have CPU resources available during this procedure, while point sources are being merged and reviewed.)

The "ae_better_masking" tool will mask the point sources.  Don't worry about bright/piled point sources right now -- the code will try to take care of them.  If it does an inadequate job this first time through, you can customize the masking regions later in diffuse_procedure.txt.

Additionally, you may want to apply extra custom mask regions to remove certain structures from your diffuse analysis; examples include the confused cores of massive star clusters (where there are so many unresolved stars that they will dominate any truly diffuse emission), bright SNRs or PWNe, etc.  If you want to identify such structures now, then in the ds9 session launched above, draw custom mask regions (polygons, circles, boxes), and save them to ../diffuse_sources.noindex/extra_maskfile.reg in CELESTIAL COORDINATES USING SEXAGESIMAL FORMAT.  Later in diffuse_procedure.txt you will have a chance to revise extra_maskfile.reg and re-mask the field.




#################################################################################
############# REMOVE TRASH REMAINING IN EXTRACTION DIRECTORIES ##################
#################################################################################

As sources are repositioned in the validation recipe, the set of ObsIDs that must be extracted for a source can change, leaving orphaned extractions (identified by the absence of an obs.stats file).  We remove these now.

The validation recipe produces some merges that have no long-term value.  We remove these now.
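That orphan test (an extraction directory holding an obs.psffrac file but no obs.stats file) can be sketched in Python; this sketch is illustrative only, not part of the recipe:

```python
from pathlib import Path
import tempfile

def find_orphans(root):
    """Extraction directories that were started (obs.psffrac exists)
    but never completed (obs.stats is absent)."""
    return sorted(p.parent for p in Path(root).rglob("obs.psffrac")
                  if not (p.parent / "obs.stats").exists())

# Tiny demonstration with a throw-away directory tree.
root = Path(tempfile.mkdtemp())
(root / "src1/obs1").mkdir(parents=True)
(root / "src1/obs1/obs.psffrac").touch()
(root / "src1/obs1/obs.stats").touch()    # complete -> not an orphan
(root / "src1/obs2").mkdir(parents=True)
(root / "src1/obs2/obs.psffrac").touch()  # no obs.stats -> orphan
print(find_orphans(root))
```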


From screen window #0 (named $TARGET), run:

  idl -queue |& tee -a trash_removal.log
    .run
    ;; Create a unique trash directory.
    tempdir = temporary_directory( VERBOSE=0, SESSION_NAME=session_name)
    tempdir = 'trash.'+getenv('TARGET')+'.'+session_name+'/'  

    readcol, 'all.srclist', sourcename, FORMAT='A', COMMENT=';'

    ; Trim whitespace and remove blank lines.
    sourcename = strtrim(sourcename,2)
    sourcename = sourcename[where(sourcename NE '', num_sources)]

    extraction_list = file_dirname(file_search(sourcename, 'obs.psffrac'), /MARK)

    ind_orphaned = where( ~file_test(extraction_list + 'obs.stats'), num_orphaned)
    if (num_orphaned GT 0) then begin
      orphaned_list =                        extraction_list(ind_orphaned)
      destination   = tempdir + file_dirname( extraction_list(ind_orphaned) )
      
      orphan_modtime = file_modtime(orphaned_list+'obs.psffrac')
      orphan_date    = strarr(num_orphaned)
      for ii=0,num_orphaned-1 do orphan_date[ii] = strmid(systime(0, orphan_modtime[ii]),4)
      
      print, num_orphaned, tempdir, F='(%"\nMoving the following %d obsolete extractions to %s")'
      forprint, SUBSET=sort(orphan_modtime), orphaned_list, orphan_date, F='(%"  %-30s was extracted on %s")'
      
      file_mkdir, destination
      file_move, orphaned_list, destination, /REQUIRE_DIRECTORY
    endif
    exit, /NO_CONFIRM
    end

  mv trash.${TARGET}.* /tmp/
  find [0-9]* -maxdepth 1 -type d  \( -name theta_00-05 -or -name trash \)  -print0 -prune  | xargs -0 -n1000 rm -rv


#################################################################################
############# EXTRACT SOURCES; MASK POINT SOURCES; MERGE EXTRACTIONS ##################
#################################################################################


Construct PSFs at 5 energies; build src spectra, ARFs, and RMFs; build lightcurves; rebuild backgrounds.
Do NOT use the /REUSE_ARF option!  We want the ARFs to be rebuilt, using the current CALDB and instrument maps.
This takes many hours.

{REFERENCE NOTE: The code below builds the file neighborhood.evt TWICE, which is a waste of computing.  It's tempting to use /REUSE_NEIGHBORHOOD in either the ae_make_catalog or ae_standard_extraction call, but I am worried that there may be rare situations where that would allow obsolete event data to affect the extraction.  PB}


=================================================================================================
First we launch IDL sessions to extract the catalog from each ObsID. 
From screen window #0 (named $TARGET), run:

cat > block.txt << \STOP
   
   nice idl -queue |& tee -a ae_extraction_and_masking.${OBS}.log 
     file_delete, "../obs"+getenv("OBS")+'/ae_finished', /ALLOW_NONEXISTENT 
     file_delete, "waiting_for_"+getenv("OBS")         , /ALLOW_NONEXISTENT 
\STOP
  screen -S $SCREEN_NAME                -X readreg A block.txt

  touch CHECK_FOR_PILEUP SKIP_RESIDUALS
  'rm'  VALIDATION_MODE   >& /dev/null
  'rm'  DISCARD_PSF       >& /dev/null
  'rm'  REUSE_NEIGHBORHOOD   >& /dev/null

cat > block.txt << \STOP
    .run ae 
    .run 
    par = ae_get_target_parameters()
    obsdir = "../obs"+getenv("OBS")+"/" 
    if keyword_set(par.BACKGROUND_MODEL_FILENAME) then $
      par.BACKGROUND_MODEL_FILENAME = obsdir + par.BACKGROUND_MODEL_FILENAME
    extract_and_pileup_block1, par, BACKGROUND_MODEL_FILENAME=par.background_model_filename
    file_move, obsdir+"ae_lock", obsdir+"ae_finished", /OVERWRITE 

    ; Mask the validation.evt data for EACH ObsID.
    semaphore = wait_for_cpu(LOW_PRIORITY=strmatch(obsdir,'*_BI'))

    ae_better_masking, getenv("OBS"), EVTFILE_BASENAME="validation.evt", REUSE_MODELS=0, /SKIP_PHOTOMETRY_STAGE, /SKIP_EXTRACT_BACKGROUNDS, /SKIP_RESIDUALS, PLOT=0, EXTRA_MASKFILE="../diffuse_sources.noindex/extra_maskfile.reg" 

    file_move, obsdir+"ae_lock", obsdir+"ae_finished", /OVERWRITE 
    exit, /NO_CONFIRM 
    end 
\STOP
  screen -S $SCREEN_NAME                -X readreg B block.txt


  foreach obs ($OBS_LIST)
    set marker=waiting_for_${obs}; touch ${marker}
    screen -S $SCREEN_NAME -p obs${obs} -X   paste A
    sleep 1
    while ( -e ${marker} )
      echo "Launching IDL ..."
      sleep 1
    end
    printf "\nLaunching ObsID ${obs}\n"
    screen -S $SCREEN_NAME -p obs${obs} -X   paste B
    sleep 2
  end


***DON'T STOP HERE -- KEEP READING...***


=================================================================================================
Launch an IDL session that will:
  * manage the adjustment of each source's BACKSCAL range 
  * merge the extractions 
  * make grouped spectra for SNR>=4 sources

From screen window #1, run:

  idl -queue |& tee -a merge.log
    .run ae
    .run
    ;; ---------------------------------------------------------------------
    ;; Adjust BACKSCAL ranges
    manage_extract_and_pileup_block1


    ;; ---------------------------------------------------------------------
    ; VERIFY THAT EVERY EXTRACTION IS FRESH AND COMPLETE        
    verify_extractions


    ;; ---------------------------------------------------------------------
    ;; ALL-INCLUSIVE MERGE
    acis_extract, 'all.srclist', MERGE_NAME='all_inclusive', /MERGE_OBSERVATIONS, MERGE_FOR_PHOTOMETRY=0, OVERLAP_LIMIT=0.10, MIN_QUALITY=0.5, PAGE_LONG_DIM=18

    ; Identify source extractions suspected to contain afterglow events; sort by afterglow fraction.
    ae_afterglow_report, 'all.srclist', MERGE_NAME='all_inclusive', BAND_NUM=0, /SORT_BY_FRACTION, VERBOSE=0

    if logical_true( file_lines('agr.srclist') ) then begin
      ; Re-run tool on just the suspect sources to produce a cleaner report.
      ae_afterglow_report, 'agr.srclist', MERGE_NAME='all_inclusive', BAND_NUM=0, suspect_name, Pb_revised, ag_count, ag_fraction
      save, suspect_name, Pb_revised, ag_count, ag_fraction, FILE='agr_all_inclusive.sav'
    endif

    acis_extract, 'all.srclist', MERGE_NAME='all_inclusive', COLLATED_FILENAME='tables/all_inclusive.collated'


    ;; ---------------------------------------------------------------------
    ;; PHOTOMETRY MERGE
    acis_extract, 'all.srclist', MERGE_NAME='photometry', /MERGE_OBSERVATIONS, /MERGE_FOR_PHOTOMETRY, /SKIP_TIMING, OVERLAP_LIMIT=0.10, MIN_QUALITY=0.5

    ; Identify source extractions suspected to contain afterglow events; sort by afterglow fraction.
    ae_afterglow_report, 'all.srclist', MERGE_NAME='photometry', BAND_NUM=0, /SORT_BY_FRACTION, VERBOSE=0

    if logical_true( file_lines('agr.srclist') ) then begin
      ; Re-run tool on just the suspect sources to produce a cleaner report.
      ae_afterglow_report, 'agr.srclist', MERGE_NAME='photometry', BAND_NUM=0, suspect_name, Pb_revised, ag_count, ag_fraction
      save, suspect_name, Pb_revised, ag_count, ag_fraction, FILE='agr_photometry.sav'
    endif

    acis_extract, 'all.srclist', MERGE_NAME='photometry', COLLATED_FILENAME='tables/photometry.collated'

    ;; BUILD GROUPED SPECTRA (for high SNR sources) TO SUPPORT BY-HAND FITTING
    if ~file_test('xspec_scripts') then file_link, /VERBOSE, file_which('xspec_scripts'), '.'

    bt = mrdfits('tables/photometry.collated', 1)

    band_full = 0
    if ~almost_equal(bt.ENERG_LO[band_full], 0.5, DATA_RANGE=range) then print, band_full, range, F='(%"\nWARNING: for Full Band (#%d),  %0.2f <= ENERG_LO <= %0.2f; ENERG_LO should be 0.5 keV.\n")'
    if ~almost_equal(bt.ENERG_HI[band_full], 8.0, DATA_RANGE=range) then print, band_full, range, F='(%"\nWARNING: for Full Band (#%d),  %0.2f <= ENERG_HI <= %0.2f; ENERG_HI should be 8.0 keV.\n")'

    SRC_SIGNIF = bt.SRC_SIGNIF[band_full]
    
    significance_threshold = 4
    fit_ind    = where(SRC_SIGNIF GE significance_threshold, count)
    print, count, F='(%"\nGrouping %d spectra ...")'

    if (count GT 0) then begin
      forprint, TEXTOUT='grouped_spectra.srclist', bt.CATALOG_NAME, bt.LABEL, SRC_SIGNIF, SUBSET=fit_ind, /NoCOMMENT, F='(%"%s ; (%s) SNR = %0.1f")'

      acis_extract, 'grouped_spectra.srclist', MERGE_NAME='photometry', /FIT_SPECTRA, MODEL_FILENAME='xspec_scripts/nofit.xcm', SNR_RANGE=[1,3], NUM_GROUPS_RANGE=[2+8,250], CHANNEL_RANGE=[35,548]  ; Spectral range is 0.5:8 keV.
    endif
    exit, /NO_CONFIRM
    end


This gives some warnings about "no model saved by XSPEC" and "no plot produced by XSPEC" -- that's OK, because we never actually run XSPEC.



-------------------------------------------------------------------------------------------
PILE-UP CORRECTION

The extraction script for each ObsID will report piled extractions, and will pause while you go off and compute pileup-corrected photometry using the procedure pileup_reconstruction.txt.  You will then resume the extraction script, which will carry on with computing background spectra.

In each ObsID's screen window, the ae_pileup_screening tool prints a table of likely-piled extractions, and prepares the files needed for pile-up reconstruction.  If any of the rows (extractions) in that table end with "STATUS=0", then you should stop and contact Patrick Broos!!!

Go execute the pile-up modeling procedure pileup_reconstruction.txt for EVERY EXTRACTION EXPECTED TO BE PILED (RATE_3x3 > 0.05 ct/frame), so that you will have a pile-up-corrected spectrum on hand, ready for spectral fitting.  If you still have a pile-up screen session from the validation recipe, the safe thing to do is to close it out now and start fresh.  (You can close an entire screen session with the command "^a\".)

For complicated multi-ObsID cases, I save notes on these tests in check_for_pileup.txt, listing all sources showing >=0.05 ct/frame in each ObsID. 


**** 
NOTE that these pile-up reconstructions (which take place in EPOCH_XXXX/ directories) are NOT propagated into the merges built later in this procedure.  In other words, we are collating (and probably publishing) piled photometry for the bright sources.  That's not as bad as it sounds, since broad-band photometry is a very poor way to describe these bright sources; instead their spectra should be carefully fit with models that account for pile-up.  

Complex pile-up reconstruction for scientific analyses is outside the scope of this recipe.  In some cases, you may want to combine similar ObsIDs and reconstruct that merged spectrum.  In other cases, you may want to time-slice one ObsID and reconstruct the spectrum from each slice.
**** 


There is, of course, no magic threshold on RATE_3x3 where pile-up "begins".  We have observed that the pile-up tool has produced an event rate correction of 1.11 for an extraction where RATE_3x3 was 0.075 ct/frame.##  Thus, we recommend that you consider pile-up correction (using the procedure pileup_reconstruction.txt) for extractions with RATE_3x3 > 0.05 (ct/frame).  

## The extraction mentioned above was in NGC 3603:
111509.34-611602.0 (p1_3368 ) in ObsID 0633 (THETA=0.3; PSF_FRAC=0.66):     291 (ct)   0.075 (ct/frame)  
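To make the RATE_3x3 screening threshold concrete, here is a minimal sketch (in Python, outside the recipe's IDL/csh flow) of the counts-to-rate conversion.  The 3.2 s frame time is an assumption that holds only for full-frame ACIS observations; subarray modes use shorter frame times, so substitute the actual frame time (TIMEDEL) from your event file.

```python
# Hypothetical helper illustrating the per-frame count rate behind the
# "RATE_3x3 > 0.05 ct/frame" pile-up screening threshold.
# ASSUMPTION: full-frame ACIS frame time of 3.2 s; not part of the recipe.

def counts_per_frame(src_counts, exposure_s, frame_time_s=3.2):
    """Observed counts divided by the number of CCD frames in the exposure."""
    n_frames = exposure_s / frame_time_s
    return src_counts / n_frames

# The NGC 3603 example above: 291 counts at 0.075 ct/frame implies
# roughly 291/0.075 = 3880 frames, i.e. ~12.4 ks of exposure.
rate = counts_per_frame(291, 3880 * 3.2)
print("%.3f ct/frame -> %s" %
      (rate, "consider pile-up correction" if rate > 0.05 else "OK"))
```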



-------------------------------------------------------------------------------------------
TWO KINDS OF MERGES; AFTERGLOW REPORTING

The all_inclusive merge is necessary for timing analysis, and it produces unbiased photometry.
However, it can mix on-axis and far off-axis extractions, harming SNR in both photometry and in spectral fitting.  
So, we also run a photometry-optimized merge (as described in the AE manual and in Broos2010), which we will use for reporting photometry and for spectral fitting.


We do NOT remove any existing merge directories before running the merges, because those directories could contain valuable work (timing analysis and/or spectral fitting).


Afterglow screening in the validation recipe involved whatever merge produced the most significant detection.  Here, we want our 'all_inclusive' and 'photometry' merges to report (in the keyword AG_FRAC) how much afterglow contamination may be in those merges.  As always, AG_FRAC is garbage (false positives) for bright sources.



=================================================================================================
CHECK LOGS FOR PROBLEMS
This is your last chance to notice unexpected error/warning messages that could signal something has gone very wrong.


***STOP HERE UNTIL ALL THE DANCING IS DONE.***

Note that masking of the ObsID data may finish AFTER the "merge.log" session above finishes; we don't need those masking results until the diffuse procedure, so you may continue with the photometry procedure while that masking runs.  However, you cannot begin the diffuse work until the masking finishes!



  egrep -i "WARNING|ERROR|DEBUG|halted|Stop" ae_extraction_and_masking.*.log | egrep -v "DISPLAY variable|LS_COLORS|no in-band data|No HDUNAME|Program caused arithmetic error|error=|ARF was computed to be zero|has no rows|spans multiple|not observed|spectra will be reused|sources were not extracted|sources not in this observation|ran out of candidate|one emap pixel|(GRATTYPE|CCD_ID)' in supplied header|Background spectrum|BACKGROUND_IMBALANCE_THRESHOLD|Zero length|cannot intersect|finite width|subset of full catalog|saved from previous session|adopted region|StopTime|severely crowded|brute force|single extraction has excessive OVERLAP|reserve region|reached hard limits|changed the standard deviations|max kernel radius|axBary: HDU|-debug" |more

  grep -c 'Spawning ds9 to perform coordinate conversions on EXTRA_MASKFILE' ae_extraction_and_masking.*.log

  egrep -i "WARNING|ERROR|DEBUG|halted|Stop" merge*.log | egrep -v "DISPLAY variable|LS_COLORS|arithmetic error|no in-band|No HDUNAME|different value|off-axis angle range|reading FILTER|fitsio|Zero length|cannot intersect|finite width|excessive OVERLAP|almost_equal|xspec_run.log|C-statistic|unignored groups"
  
  
  
If you built custom extraction apertures ("CUSTOM EXTRACTION APERTURES" above), then verify that none of the custom extractions have been discarded in the all-inclusive (e.g. due to "excessive overlap").
  idl
    .run
    acis_extract, 'modified_aperture.srclist', MERGE_NAME='all_inclusive', COLLATED_FILENAME='tables/modified_aperture.collated'
    
    bt = mrdfits('tables/modified_aperture.collated',1)
    forprint, bt.LABEL, bt.MERGFRAC, F='(%"%10s MERGFRAC = %0.2f")'
    exit, /NO_CONFIRM
    end




-------------------------------------------------------------------------------------------
BACKSCAL Range Adjustment

The ae_adjust_backscal_range tool (called by the manage_extract_and_pileup_block1 code) makes 20 attempts to adjust the "BACKSCAL range" of each source.  Often, the BACKSCAL range for a **few** sources is unstable.  If this occurs for more than 1% of the catalog, the code prints the following warning:

  WARNING: BACKSCAL range adjustments have not converged for more than XX% of sources.

You should investigate such a warning.  The shell commands below will show, for each source not converged, the evolution of the BACKSCAL goal (the middle number inside the square brackets).

  foreach src ( `cat rerun.srclist | cut -f1 -d ' '` )
    grep "$src" merge*.log | grep BKSCL_GL
    printf '\n\n'
    end




#################################################################################
############# VISUAL REVIEW ##################
#################################################################################

REVIEW SINGLE-ObsID EXTRACTION RESULTS
=====================================
In each ObsID's screen window, execute the following /PLOT stages.

Each plot is automatically saved as a Postscript file; the filename is shown in the IDL plot window.  You are free to rescale them (or switch from linear to log on either or both axes) to make them more informative, and then re-save (File->Print).  You may want to record information about them in photometry_notes.txt.  

The number of points in all these plots should be the number of sources in the catalog; this allows you to middle-click on a point to recover the index of that source, in case you need to investigate outliers.  When no data are available for a source (e.g. when the source was not extracted in the ObsID you're plotting), its point appears at a "null" location, typically (0,0). 

So in each ObsID's screen window:

  idl  
    .run
    obsdir = "../obs"+getenv("OBS")+"/" 
    collated_filename = obsdir + "all.collated"

    acis_extract, "", getenv("OBS"), /CONSTRUCT_REGIONS,   /PLOT, OUTPUT_DIR=obsdir, COLLATED_FILENAME=collated_filename
    acis_extract, "", getenv("OBS"), /EXTRACT_SPECTRA,     /PLOT, OUTPUT_DIR=obsdir, COLLATED_FILENAME=collated_filename
    acis_extract, "", getenv("OBS"), /EXTRACT_BACKGROUNDS, /PLOT, OUTPUT_DIR=obsdir, COLLATED_FILENAME=collated_filename
    end

    exit, /NO_CONFIRM

On a Mac, use Mission Control (formerly Exposé) to see all the plots at the same time; press the space bar to magnify the plot under the mouse!


The following plots show characteristics of the background extraction.

  * The 'Background Surface Brightness' vs. 'Off-axis Angle' plot (saved as bkgd_sb_vs_theta.ps) serves as a sanity check on the scaling of the background extractions.  Any correlation with off-axis angle should have an astrophysical explanation.  Outliers above the main locus should be sources suffering significant background from the wings of a neighboring source.  

  * The 'Background Surface Brightness' vs. 'Flux in Source Aperture' plot (saved as bkgd_sb_vs_flux.ps) reminds us that the background algorithms do not do a perfect job of excluding power from the point source itself---in this plot you may see a small correlation between the background surface brightness and the flux of the corresponding source.

  * The 'Scaling of Bkg Extraction' histogram (saved as backscal_hist.ps) shows the relative size of the bkg region and src aperture (BACKSCAL).  You may want to display this as a log-log plot.  BACKSCAL will often range from <10 to >1000.

  * The 'Counts in Background Region' histogram (saved as bkgd_counts_hist.ps) may show a peak at 5 counts, because in the struggle among ObsIDs to influence the *single* bkgd scaling range that all ObsIDs must respect, each ObsID will vote to increase the scaling range upper-limit if it cannot find 5 or more counts.  It will likely show another peak around 100 counts, the target for merged photometry.  You may need to change binsize.  


The following plots show characteristics of the source extraction.

  * The 'Mean ARF' vs. 'Off-axis Angle' plot (saved as meanarf_vs_theta.ps) depicts a main locus governed by vignetting, and outliers below the main locus corresponding to reduced apertures.  

  * In the 'Fraction of Source Counts on Primary CCD' plot (saved as multiple_ccds.ps), outliers < 1 represent the rare source whose dither pattern touches multiple CCDs.  If you care, you can note which sources are affected (middle-click to obtain a source's zero-based index in the catalog).

  * The 'PSF Fraction' plot (saved as psffrac.ps) reminds us that the Chandra PSF is energy-dependent, i.e. that the extraction aperture corresponds to a very different PSF fraction at low (red) and high (blue) energies.  

  * For a target with little overlap among ObsIDs, the 'Catalog/Data Offset' vs. 'Off-axis Angle' plot (saved as dataoffset_vs_theta.ps) reminds us that the position estimates made by AE (e.g. RECON and CORR positions) differ significantly from simple mean data estimates at large off-axis angles.  For a target with overlapping ObsIDs, this plot is not very useful because the "data" position is calculated using only the single ObsID you are examining but the "catalog" position often comes from multi-ObsID merged data.


The following plots show characteristics of the PSF images and extraction apertures.  They are largely educational.  However, experienced AE users are encouraged to continue scanning these plots because an atypical instance of plots like these may be our first indication of something wrong with a data set, with CIAO, or with the AE software.  


  * The 'PSF pixel size' vs. 'Off-axis Angle' plot (saved as psf_pixsize_vs_theta.ps) reminds us that AE chooses larger pixels for its PSF images as we move off-axis, where lower resolution is needed to reasonably sample the growing PSF.  

  * The 'CROPFRAC' vs. 'Off-axis Angle' plot (saved as cropfrac_vs_theta.ps) reminds us that the footprints of AE's PSF images are "small" (in order to conserve disk space and CPU time), and thus that a few percent of the PSF power falls outside of the PSF image.  We believe that higher-than-typical values are a flaw arising from our use of MARX to generate PSFs; when a source dithers across a chip gap its core sees a lot of dead time that its far wings do not see, increasing the fraction of MARX events lying outside the PSF image.  

  * The 'PSF Fraction' vs. 'Off-axis Angle' plot (saved as psffrac_vs_theta.ps) helps you see the level of crowding in your catalog.  Reduced apertures are seen as outliers below the main locus.  

  * The 'Offset between catalog position and PSF centroid' plot (saved as psfoffset_vs_theta.ps) reminds us that the centroid of the Chandra PSF is a poor estimator for the source position off-axis.  In the plot, we believe that outliers above the main locus arise from PSF images that are truncated by the edge of the field.  

  * The 'Area of Source Aperture' vs. 'Off-axis Angle' plot (saved as aperturearea_vs_theta.ps) reminds us that the areas of (and thus the backgrounds in) off-axis extraction regions are ~100 times larger than on-axis regions.  Outliers below the main locus are simply reduced apertures.   



REVIEW MULTI-ObsID EXTRACTION RESULTS
=====================================
Run this review step even if you only have a single ObsID in this pointing.
When only one ObsID has been extracted, a few plots are uninformative.

  idl
    acis_extract, '', /MERGE_OBSERVATIONS, /PLOT, COLLATED_FILENAME='tables/all_inclusive.collated'


  * The 'Flux1' vs. 'Flux2' plot (saved as flux2_vs_flux1.ps) reminds us that AE estimates photon flux in two ways.

  * The 'SCAL_MAX/SCAL_MIN' plot (saved as backscal_vs_theta.ps) shows the ratio between the largest and smallest BACKSCAL values among the merged ObsIDs.  AE tries to keep this ratio < 2, for reasons discussed in the manual.  Sources observed at radically different off-axis angles may not be able to achieve that goal.  Uncrowded sources are usually able to achieve the same BACKSCAL value in every extraction (a ratio of ~1 in this plot).  

  * The 'log Net Counts' histogram (saved as netcts_vs_theta.ps) is best viewed with log scaling of the Y-axis, so that a powerlaw distribution will appear as a line.

  * The 'log p-value, no-variability hypothesis' plot (saved as merge_ks_vs_flux.ps) gives you a quick idea of how many sources have exhibited variability (small p-values), and how many detected counts those sources have.  This plot may help you prioritize further timing studies.  

  * The 'Background Surface Brightness' vs. 'Off-axis Angle' plot (saved as bkgd_sb_vs_theta.ps) serves as a sanity check on the scaling of the background extractions.  At launch, correlations with off-axis angle would be seen only when there was astrophysical variation of background across the field.  However, now that the OBF contamination has a strong off-axis variation, you will see an apparent rise in Background Surface Brightness beyond ~6' off-axis (because the 1 keV emap used in this photon flux calculation does a poor job of describing the spatial distribution of particle background events).

  * The 'Background Surface Brightness' vs. 'Source Flux' plot (saved as bkgd_sb_vs_flux.ps) reminds us that the background algorithms do not do a perfect job of excluding power from the point source itself---in this plot you may see a small correlation between the background surface brightness and the flux of the corresponding source.

  * In the 'Mean ARF value' histogram (saved as meanarf.ps), the low tail is caused by reduced apertures, dithering over chip gaps, and vignetting. 

  * The 'Scaling of Bkg Extraction' histogram (saved as backscal_composite_hist.ps) shows the relative size of the merged bkg region and src aperture (BACKSCAL).  You may want to display this as a log-log plot.  BACKSCAL will often range from <10 to >1000.

   * In the 'Counts in Background Region' histogram (saved as bkg_counts_composite_hist.ps), the outliers below the peak near 100 represent sources for which our recipe and background algorithms were not able to achieve the goal of 100 counts in the merged background. 

  exit, /NO_CONFIRM



(experimental) CHECK NORMALIZATION OF PARTICLE BACKGROUNDS
==============================================================

If the scaling of the background spectra is correct, then there should be no significant flux (positive or negative) in the 9:12 keV band, where only particles (not X-rays) generate ACIS events.  

The histogram of "photometry significance" made by the code below shows the distribution of AE's SRC_SIGNIF statistic (= NET_CNTS / NET_CNTS_SIGMA_UP = SNR) in the 9:12 keV band over all the extractions in the photometry merge.  This statistic is a direct measure of the consistency between photometry in that band and zero.  If the background subtraction was perfect then this distribution should be Normal, with a standard deviation of 1.0.  

The code below also plots the ratio of the 9:12 keV signal (NET_CNTS) divided by the 9:12 keV gross counts (SRC_CNTS), with error bars.  Again, consistency with zero is the goal for this ratio.  In this plot, individual sources can be identified with the middle mouse button.

Usually, only sources far off-axis with very large apertures have enough observed counts in this band to make a useful NET_CNTS measurement.
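As a numerical companion to the histogram described above, the "Normal with standard deviation 1.0" expectation can be checked directly.  The sketch below (Python, outside the recipe's IDL flow; the function name is hypothetical) summarizes a sample of SRC_SIGNIF values and tests whether the sample mean is consistent with zero at 3 sigma, using the fact that the mean of n unit-variance draws has standard deviation 1/sqrt(n).

```python
# Hypothetical cross-check of the 9:12 keV SRC_SIGNIF distribution:
# for perfect background subtraction it should be ~Normal(0, 1).
import math
import statistics

def significance_summary(src_signif):
    """Return (mean, stddev, mean_consistent_with_zero) for SRC_SIGNIF values."""
    n = len(src_signif)
    mean = statistics.fmean(src_signif)
    std = statistics.stdev(src_signif)
    # 3-sigma test on the sample mean, assuming unit-variance draws.
    consistent = abs(mean) < 3.0 / math.sqrt(n)
    return mean, std, consistent
```

If the returned stddev is far from 1.0, or the mean fails the consistency test, that is the same signal as a skewed or offset histogram in the IDL plot.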

RUN THIS ONCE, not for each ObsID.

  idl -queue |& tee check_scaling.log
    obsname = ''
    .run
    bt = mrdfits('tables/photometry.collated', 1)
    scale_band = 16

    if ~almost_equal(bt.ENERG_LO[scale_band], 9.0, DATA_RANGE=range) then print, scale_band, range, F='(%"\nWARNING: for Scale Band (#%d),  %0.2f <= ENERG_LO <= %0.2f; ENERG_LO should be 9.0 keV.\n")'
    if ~almost_equal(bt.ENERG_HI[scale_band], 12.0, DATA_RANGE=range) then print, scale_band, range, F='(%"\nWARNING: for Scale Band (#%d),  %0.2f <= ENERG_HI <= %0.2f; ENERG_HI should be 12.0 keV.\n")'

    SRC_SIGNIF = bt.SRC_SIGNIF[scale_band]
    SRC_CNTS   = bt.SRC_CNTS  [scale_band]
    NET_CNTS   = bt.NET_CNTS  [scale_band]
    nc_error   = NET_CNTS / SRC_SIGNIF
    
    if (n_elements(bt) GT 1) then begin
      dataset_1d , id1, SRC_SIGNIF, DATASET=obsname, DENSITY_TITLE='scale_band (9:12 keV)', XTIT='photometry significance (9:12 keV)', PS_CONFIG={filename:'scale_signif_dist.ps'}, /PRINT
      
      function_1d, id2, indgen(n_elements(bt)), NET_CNTS/SRC_CNTS, Y_ERROR=nc_error/SRC_CNTS, DATASET=obsname, NSKIP_ERRORS=1, PSYM=1, LINE=6, XTIT='source number', YTIT='NET_CNTS/SRC_CNTS (9-12 keV)', TITLE='Fractional Error in background subtraction (9:12 keV)'

      function_1d, id2, indgen(n_elements(bt)), intarr(n_elements(bt)), COLOR='red', DATASET='perfect normalization', PS_CONFIG={filename:'scale_signif.ps'}, /PRINT
    endif  
    
    forprint, indgen(n_elements(bt)), bt.LABEL, SRC_SIGNIF, F='(%"  source #%3d (%s): SRC_SIGNIF = %5.1f")'
    end
    
    exit, /NO_CONFIRM

  egrep -i "WARNING|ERROR|DEBUG|halted|Stop" check_scaling.log | egrep -v "DISPLAY variable|LS_COLORS|arithmetic error|no in-band|No HDUNAME|keyword not found|FILTER|nc_error|print,"


If 3-sigma deviations from perfect normalization are found, you should run the code above using single-ObsID merges (by changing "obsname" to each ObsID name); that should identify which ObsIDs are having the most significant scaling problem.


Currently, every energy channel of a background spectrum is scaled by the same factor.  That scaling factor is calculated for X-rays at the monoenergy of the emap.  On-axis, that scaling factor is appropriate for X-rays at other energies, and is appropriate for instrumental background events (caused by particles).  That's because on-axis, over the small sizes of background regions, the **shape** of the emap is very similar at all energies, and the shape of the particle background density is similar to the emap.

However, this scaling strategy can introduce energy-dependent scaling biases off-axis, for two reasons.  First, the *shape* of the emap is very energy-dependent near the OBF frame, due to the OBF contamination.  Second, off-axis the shape of any emap (which describes the response to X-rays) is different from the shape of the particle background density (which does not care about the OBF).

Perfect background subtraction off-axis would require separate modeling of the X-ray and particle backgrounds, and energy-dependent scaling of the X-ray background.

Poor background subtraction in the 9:12 keV band can also occur when the ae_better_backgrounds tool is forced to adjust the nominal background scaling (BACKCORR != 1), as described in the AE manual.  For example, BACKCORR << 1 causes an under-subtraction of particle background (and often a positive residual in the last spectrum group).





REVIEW EXTREME RESULTS FROM THE "BETTER BACKGROUNDS" ALGORITHM
==============================================================
The code below uses the SHOW stage of AE to review a handful of sources with the most extreme backgrounds.  See recipe.txt for explanations.

  idl
    acis_extract, COLLATED_FILENAME='tables/all_inclusive.collated', SRCLIST_FILENAME='bkg_review.srclist', /SHOW_REGIONS

    exit, /NO_CONFIRM


REVIEW SOURCES SUSPECTED TO BE VARIABLE
==============================================================
  idl -queue |& tee variable.log
    .r ae
    .run
    run_command
    bt = mrdfits('tables/all_inclusive.collated', 1)
    output_dir = ''
    possible_threshold = 0.05  ; p-value threshold for "possible variability"
    definite_threshold = 0.005 ; p-value threshold for "definite variability"
    
    CATALOG_NAME = strtrim(bt.CATALOG_NAME,2)
    LABEL        = strtrim(bt.LABEL       ,2)

    NUM_OBS  = bt.NUM_OBS

    ; Calculate p-value for reported PROB_KS = min( P1,P2,...Pn), where n = N_KS.
    pval_for_PROB_KS = 1D - (1D - bt.PROB_KS)^bt.N_KS
    pval_for_PROB_KS[where(/NULL, ~finite(bt.PROB_KS))] = !VALUES.F_NAN

    MERGE_KS = bt.MERGE_KS
    MERG_CHI = bt.MERG_CHI
    NET_CNTS = bt.NET_CNTS[0]
    region_tag = strarr(n_elements(bt))
    
    min_pvalue = min(DIM=2, [[pval_for_PROB_KS],[MERGE_KS],[MERG_CHI]], /NAN)  > 1e-20

    xtit = 'p-value for variability indices (pval_for_PROB_KS < MERGE_KS < MERG_CHI)'
    pwo = string(alog10(definite_threshold), alog10(possible_threshold),$
                 F='(%"SET_BIG_MARKER=[%f,0],SET_SMALL_MARKER=[%f,0]")')
    dataset_1d, id10, alog10(min_pvalue), DISTRIBUTION_TITLE='Variability', DENSITY_TITLE='Variability', COLOR='red', XTIT=xtit, PLOT_WINDOW_OPTIONS=pwo, PS_CONFIG={filename:output_dir+'variability_distribution.ps'}, /PRINT

    dataset_2d, id3, NAN=[0,0], PSY=1, alog10(min_pvalue), NET_CNTS, COLOR='red', TITLE=tit, XTIT=xtit, YTIT='Net Counts', PLOT_WINDOW_OPTIONS=pwo, PS_CONFIG={filename:output_dir+'variability_vs_netcnts.ps'}, /PRINT


    is_variable_single    = (pval_for_PROB_KS LT definite_threshold) ; NaN values produce FALSE.
    is_variable_merge_chi = (MERG_CHI         LT definite_threshold) ; NaN values produce FALSE.
    is_variable_merge_ks  = (MERGE_KS         LT definite_threshold) ; NaN values produce FALSE.

    is_variable           = is_variable_single OR is_variable_merge_chi OR is_variable_merge_ks
    possibly_variable     = ~is_variable AND (min_pvalue LT possible_threshold)

    
    print, total(/INTEGER, is_variable), n_elements(bt), definite_threshold, total(/INTEGER, possibly_variable), definite_threshold, possible_threshold, F='(%"\nWe have evidence that %d out of %d sources are \"definitely variable\" (p < %0.3f), and %d sources are \"possibly variable\" (%0.3f < p < %0.2f).")'
    
    openw, unit, 'variable.srclist', /GET_LUN
    !TEXTUNIT = unit
    printf, unit, '; "DEFINITELY" VARIABLE'

    ; ------------------------------------
    ind = where(is_variable_single, count)
    if (count GT 0) then begin
      forprint, TEXTOUT=5, /NoCOM, SUBSET=ind, CATALOG_NAME, LABEL, pval_for_PROB_KS, F='(%"%s ; (%s) ; pval_for_PROB_KS = %5.3f")'
      region_tag[ind] = 'definitely variable'
    end
    print, total(/INTEGER, finite(pval_for_PROB_KS)), count, definite_threshold, F='(%"\nVariability within each ObsID was assessed for %d sources; %d are identified as \"definitely variable\" (pval_for_PROB_KS < %0.3f).")'


    ; ------------------------------------
    ind = where(~is_variable_single AND (NUM_OBS GT 1) AND is_variable_merge_chi, count)
    if (count GT 0) then begin
      forprint, TEXTOUT=5, /NoCOM, SUBSET=ind, CATALOG_NAME, LABEL, pval_for_PROB_KS, MERG_CHI, F='(%"%s ; (%s) ; pval_for_PROB_KS = %5.3f, MERG_CHI = %5.3f")'
      region_tag[ind] = 'definitely variable'
    end

    print, total(/INTEGER, (NUM_OBS GT 1) AND finite(MERG_CHI)), count, definite_threshold, F='(%"\nVariability among ObsIDs was assessed for %d sources (by comparing ObsID-averaged photon fluxes); %d more are identified as \"definitely variable\" sources (MERG_CHI < %0.3f).")'


    ; ------------------------------------
    ind = where(~is_variable_single AND (NUM_OBS GT 1) AND ~is_variable_merge_chi AND is_variable_merge_ks, count)
    if (count GT 0) then begin
      forprint, TEXTOUT=5, /NoCOM, SUBSET=ind, CATALOG_NAME, LABEL, pval_for_PROB_KS, MERG_CHI, MERGE_KS, F='(%"%s ; (%s) ; pval_for_PROB_KS = %5.3f, MERG_CHI = %5.3f, MERGE_KS = %5.3f")'
      region_tag[ind] = 'definitely variable'
    end

    print, total(/INTEGER, (NUM_OBS GT 1) AND finite(MERGE_KS)), count, definite_threshold, F='(%"\nVariability among ObsIDs was assessed for %d sources (by concatenating the observations); %d more are identified as \"definitely variable\" sources (MERGE_KS < %0.3f).")'


    ; ------------------------------------
    printf, unit, '; "POSSIBLY" VARIABLE'
    ind = where(possibly_variable, count)
    if (count GT 0) then begin
      forprint, TEXTOUT=5, /NoCOM, SUBSET=ind, CATALOG_NAME, LABEL, pval_for_PROB_KS, MERGE_KS, MERG_CHI, F='(%"%s ; (%s) ; pval_for_PROB_KS = %5.3f, MERGE_KS = %5.3f, MERG_CHI = %5.3f")'
      region_tag[ind] = 'possibly variable'
    end

    free_lun, unit


    ; ------------------------------------
    ; Report the 20 brightest variable sources.
    ind = where(is_variable, count)
    if (count GT 0) then begin
      ind = ind[ reverse(sort(NET_CNTS[ind])) ]
      print, '20 brightest variable sources:'
      forprint, /NoCOM, SUBSET=ind[0:(count<20)-1], CATALOG_NAME, LABEL, NET_CNTS, pval_for_PROB_KS, MERGE_KS, MERG_CHI, F='(%"%s ; (%10s) ; NET_CNTS=%5d  pval_for_PROB_KS = %5.3f, MERGE_KS = %5.3f, MERG_CHI = %5.3f")'
    end


    ; ------------------------------------
    ; Build ds9 regions showing variability class.
    ind = where( region_tag NE '' ) 
    acis_extract, CATALOG_NAME[ind], MERGE_NAME='all_inclusive', COLLATED_FILENAME='/dev/null', REGION_FILE='temp.reg', REGION_TAG=region_tag[ind], VERBOSE=0

    run_command, 'egrep -v "data|fraction" temp.reg | sed -e "/definitely/s/DodgerBlue/red/" -e "/possibly/s/DodgerBlue/cyan/" > variable.reg'
    end
    
    exit, /NO_CONFIRM
  ds9 -title 'variable sources' ../target.evt -region variable.reg &
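The pval_for_PROB_KS calculation used in the script above follows from order statistics: if P1..Pn are independent p-values that are Uniform(0,1) under the null hypothesis, then Prob( min(Pi) <= p ) = 1 - (1 - p)^n.  A minimal sketch (Python, outside the recipe's IDL flow; the function name is hypothetical):

```python
# p-value for AE's reported PROB_KS = min(P1..Pn), where n = N_KS.
# Under the no-variability null, each Pi is Uniform(0,1), so
# Prob(min(Pi) <= p) = 1 - (1 - p)**n.

def pval_for_min(prob_ks, n_ks):
    """Chance of seeing a minimum p-value this small among n_ks independent tests."""
    return 1.0 - (1.0 - prob_ks) ** n_ks
```

Note that for n_ks = 1 this reduces to the raw p-value, and for larger n_ks a given minimum becomes less surprising, which is why the correction is needed before thresholding on "definite" variability.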



#################################################################################
############# COLLATING RESULTS ##################
#################################################################################
From here on, photometry doesn't need the validation screen session anymore, so I co-opt it for diffuse analysis.  If you want to join me, spin off to diffuse_procedure.txt and do the following photometry steps in a different shell, cd'd to <target>/data/extract/point_sources.noindex/ ...

BUILD A FINAL COLLATION OF THE AE DATA PRODUCTS BY COMBINING VARIOUS COLLATIONS
=====================================================================
We must combine validation and position quantities (pass_final_catalog/catalog.fits), timing quantities (tables/all_inclusive.collated), and photometry/fitting quantities (tables/photometry.collated).


Verify that the same source list, sorted the same way, has been used in all those collations.
Combine information from each into a final collation.

  clear
  dmlist    "pass_final_catalog/catalog.fits[cols CATALOG_NAME]" data > temp1
  dmlist      "tables/all_inclusive.collated[cols CATALOG_NAME]" data > temp2
  dmlist         "tables/photometry.collated[cols CATALOG_NAME]" data > temp3
  diff temp1 temp2
  diff temp2 temp3
  diff temp3 temp1
  'rm' temp1 temp2 temp3

  dmpaste pass_final_catalog/catalog.fits \
         "tables/all_inclusive.collated[cols PROB_KS,N_KS,MERGE_KS,MERG_CHI]" /tmp/temp.collated clob+
         
  dmpaste /tmp/temp.collated \
         "tables/photometry.collated[cols -CATALOG_NAME,-LABEL,-RA,-DEC,-ERR_RA,-ERR_DEC,-POSNTYPE,-ERX_DATA,-ERY_DATA,-ERR_DATA,-PROB_KS,-N_KS,-MERGE_KS,-MERG_CHI]"\
          tables/catalog_and_photometry.collation clob+ 

  dmlist tables/catalog_and_photometry.collation cols
  dmlist tables/catalog_and_photometry.collation blo
  


COMPUTE MORE SOPHISTICATED UNCERTAINTIES ON NET_COUNTS
CONSTRUCT A MORE CONVENIENT TABLE OF X-RAY PROPERTIES, SUITABLE FOR PUBLICATION.
BUILD LaTeX TABLES FOR USE BY COLLABORATORS (OR POSSIBLY FOR PUBLICATION)
======================================================
This tool will produce both FITS and ASCII tables. 
This runs "aprates," which takes many hours.

  pushd tables
  idl -queue |& tee flatten.log
    .run ae
    .run
    ae_flatten_collation, COLLATED_FILENAME='catalog_and_photometry.collation', FLAT_TABLE_FILENAME='xray_properties.fits', DISTANCE=0, SORT=0

    title = string( getenv("TARGET"), F='(%"Target %s is ready to generate a bunch of plots.")')
    TimedMessage, tm_id, 'Press OK when it is convenient to see plots.', TITLE=title, QUIT_LABEL='Close', PRESSED_ID=trash
    
    ; This tool builds aimpoints.reg in point_sources.noindex/tables/
    ; Supply local path to table templates.
    hmsfr_tables, 'xray_properties.fits', file_which('/hmsfr_tables.tex')

    ; This tool builds MedianEnergy_slice.reg in point_sources.noindex/
    cd, '..'
    ae_summarize_catalog, 'tables/catalog_and_photometry.collation'
    end
;Review plots that show how much ObsIDs moved in the end.

  exit
  

This code ends with a series of plots summarizing the source properties of the catalog.  These are worth reviewing.  It also brings up a ds9 session with the target exposure map and source regions overlaid, color-coded by median energy.  I save this as point_sources.noindex/target_summary.bck for inclusion in the appropriate MOXC paper.


  egrep -i "WARNING|ERROR|DEBUG|halted|Stop" flatten.log | egrep -v "arithmetic error|Large number of counts|Maximum number of iterations|source and/or background counts are 0"



Build the LaTeX source properties table, then open it.  (Recall that lualatex only works on Cochise and Hiawatha.)

  lualatex -jobname=target_observing_log "\includeonly{./observing_log}\input{xray_properties}"  
  lualatex -jobname=target_observing_log "\includeonly{./observing_log}\input{xray_properties}"  
  lualatex -jobname=xray_properties      "\includeonly{./src_properties}\input{xray_properties}"  
  lualatex -jobname=xray_properties      "\includeonly{./src_properties}\input{xray_properties}"  
  open xray_properties.pdf
  popd
  
These PDF files are formatted to be reviewed on a screen, not to be printed.  

***Pat -- I don't think the above plots are written to files.  Do you want me to suggest filenames?  These plots need descriptions, like the ones above have now.***


======================================================
MAKE OTHER SUMMARY PLOTS (optional)

This is an area that needs some work.  Lots of summary plots are made by the various /PLOT stages of AE shown earlier in this recipe.  We started a tool (ae_summarize_catalog above) to do the same thing---there's probably a lot of redundancy between those two.  Both use the collation file produced by AE, not the flattened table (xray_properties.fits) that we distribute/publish.

Until we have a more consistent plotting strategy, you can also make plots from xray_properties.fits by hand, with TopCat, fv, or using TARA in IDL as shown below.

In tables/ directory:
  idl
    .run
    bt = mrdfits('xray_properties.fits', 1)

    dt = 'my target'
    dataset_1d, id0, bt.SrcCounts_t         , DENSITY_TITLE=dt, BINSIZE=5   , XTIT='# of extracted counts, 0.5-8 keV'
    dataset_1d, id1, bt.NetCounts_t         , DENSITY_TITLE=dt, BINSIZE=5   , XTIT='# of net counts, 0.5-8 keV'
    dataset_1d, id2, alog10(bt.EnergyFlux_t), DENSITY_TITLE=dt, BINSIZE=0.1 , XTIT='log PhotonFlux*MedianEnergy (erg /cm**2 /s)'

    ; Calculate p-value for reported PROB_KS = min( P1,P2,...Pn), where n = N_KS.
    pval_for_PROB_KS = 1D - (1D - bt.ProbKS_single)^bt.N_KS_single
    pval_for_PROB_KS[where(/NULL, ~finite(bt.ProbKS_single))] = !VALUES.F_NAN

    dataset_1d, id9, pval_for_PROB_KS       , DENSITY_TITLE=dt,               XTIT='p-value for single-ObsId no-variability hypothesis'

    dataset_1d,id10, bt.ProbKS_merge        , DENSITY_TITLE=dt,               XTIT='p-value for multi-ObsId no-variability hypothesis'  
    dataset_1d,id11, bt.MedianEnergy_t      , DENSITY_TITLE=dt, BINSIZE=0.2 , XTIT='median energy, 0.5-8.0 keV'
    SNR = bt.NetCounts_t/(bt.NetCounts_Hi_t-bt.NetCounts_t)
    dataset_1d,id12, SNR                    , DENSITY_TITLE=dt, BINSIZE=1   , XTIT='signal-to-noise ratio'
    end
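The PROB_KS correction above is the standard minimum-of-n p-value adjustment: if PROB_KS is the smallest of n independent KS p-values, then the chance of seeing a minimum that small under the no-variability hypothesis is 1 - (1 - P)^n.  A quick numeric sanity check in plain shell/awk, using made-up values P = 0.01 and n = 5:

```shell
# Evaluate pval = 1 - (1 - P)^n for illustrative values P=0.01, n=5.
# For small P this is close to (but slightly less than) n*P = 0.05.
pval=$(awk 'BEGIN { P = 0.01; n = 5; printf "%.6f", 1 - (1 - P)^n }')
echo "$pval"
```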


#################################################################################
############# ENABLE WRITE PERMISSION FOR COLLABORATORS ##################
#################################################################################
In point_sources.noindex/

  find [0-9]* -maxdepth 1 -type d  \( -name photometry -or -name 'EPOCH_*' \) -user $USER -not -perm -g+w  -print0 -prune  | xargs -0  chmod -v g+w
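The find invocation above combines several predicates (-user, -perm -g+w, -prune, -print0).  Its core behavior (select only directories that are not yet group-writable, then grant g+w) can be tried on a throwaway directory; this is a simplified sketch that omits the -user and -prune tests:

```shell
# Build two toy directories, one group-writable and one not.
d=$(mktemp -d)
mkdir "$d/photometry" "$d/other"
chmod g+w "$d/other"
chmod g-w "$d/photometry"

# Select directories lacking group write and fix them, as in the recipe.
find "$d" -maxdepth 1 -type d -not -perm -g+w -print0 | xargs -0 chmod g+w

# 'photometry' should now be group-writable.
fixed=$(find "$d/photometry" -maxdepth 0 -perm -g+w)
rm -rf "$d"
```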




#################################################################################
############# RELEASING DATA PRODUCTS TO COLLABORATORS  (optional, not yet tested) ##################
#################################################################################
Permanent archiving of data products will typically occur when a target is published, by posting tarballs on Zenodo and referencing the corresponding DOI in the paper.

Impermanent releases of data products to collaborators will typically be done using Dropbox.com.  For clarity and ease of browsing, a target on Dropbox will have a directory tree similar to that on our disks.  However, since Dropbox DOES NOT PRESERVE SYMLINKS, each source directory we release will be a tarball on Dropbox (rather than a directory tree).

Below, we use environment variables to store the absolute path to the local Dropbox folder, the target location, and some parameter files for rsync.

  setenv TARGET_ROOT   /bulk/hiawatha1/targets
  setenv DEST_ROOT     /tmp
  setenv DROPBOX_ROOT '/Volumes/cochise_TimeMachine_14TB/DropboxAstronomy'
  setenv RSYNC_FILTER  /bulk/cochise1/psb6/TARA/code/rsync_filter_lists

CLOSE THE DROPBOX APPLICATION, SO THAT IT WON'T START UPLOADING FILES UNTIL YOU ARE READY.


=====================================================================
Release the most important data products as follows.
 
  setenv TARGET       t-rex
  
  set dest="$DROPBOX_ROOT/$TARGET/"
  mkdir -p "$dest"
  ls       "$dest"
  
  cd $TARGET_ROOT/$TARGET/data/
  rsync -av --update --include-from=$RSYNC_FILTER/target.txt                                   . "$dest"
  gzip -v $dest/*.evt $dest/*.emap
  rsync -av --update --include-from=$RSYNC_FILTER/catalog.txt                                  . "$dest"
  rsync -av --update --include-from=$RSYNC_FILTER/diffuse_fullband.txt                         . "$dest"
  rsync -av --update --include-from=$RSYNC_FILTER/diffuse_hardband.txt                         . "$dest"
  rsync -av --update --include-from=$RSYNC_FILTER/diffuse_softband.txt                         . "$dest"

  
  
=====================================================================
Review the sources you are releasing.
Prepare data products not built by standard recipes, as desired. 

  cd $TARGET_ROOT/$TARGET/data/extract/point_sources.noindex/

List the source(s) you are releasing in the file release.srclist
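The release.srclist file follows the srclist conventions used elsewhere in this recipe: one source name per line (the first whitespace-delimited field), with ';' starting a comment and blank lines ignored.  The egrep/cut pipeline that the release loops in this procedure use to parse such files can be checked on a toy file (hypothetical source name):

```shell
# A toy srclist: a comment line, a blank line, and one active source.
tmp=$(mktemp)
printf '; candidate rejected after review\n\n053749.06-690508.1 ; notes\n' > "$tmp"

# Skip comments/blanks, keep the first field (the source name).
names=$(egrep -hv '^[[:space:]]*(;|$)' "$tmp" | cut -d " " -f1)
echo "$names"
rm -f "$tmp"
```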


-------------------------------------------------------------  
Review the source's apertures and backgrounds.                                                                              

  idl
    acis_extract, COLLATED_FILENAME='tables/all_inclusive.collated', SRCLIST_FILENAME='release.srclist', /SHOW_REGIONS

    acis_extract, COLLATED_FILENAME='tables/all_inclusive.collated', SRCLIST_FILENAME='release.srclist', /SHOW_REGIONS, /OMIT_BKG_REGIONS
    
If the source is especially interesting or crowded, print the ds9 window showing extractions WITHOUT background regions to <sourcename>/extractions.ps

See CUSTOM EXTRACTION APERTURES section of this procedure if you need hand-designed apertures.



-------------------------------------------------------------  
For sources with lots of counts, build single-ObsID data products and lightcurve plots.

  idl -queue |& tee merge_epochs.log
    .run
    foreach obsname, strsplit(getenv("OBS_LIST")," ", /EXT) do begin
    
      acis_extract, 'release.srclist', obsname, MERGE_NAME='EPOCH_'+obsname, /MERGE_OBSERVATIONS, EBAND_LO=0.5, EBAND_HI=8.0
    
    endforeach
    end


-------------------------------------------------------------  
Build ObsID-averaged lightcurves (if AE lightcurve is interesting).

  foreach sourcename (`egrep -hv '^[[:space:]]*(;|$)' release.srclist  | cut -d " " -f1`)
    pushd $sourcename/
    pwd
    idl -queue |& tee -a ObsID_photometry.txt
    
    ds9 -title "$sourcename" -lock bin yes -lock frame wcs -lock scale yes -lock colorbar yes -bin factor 0.25 all_inclusive/neighborhood.evt -region load all_inclusive/extract.reg all_inclusive/source.evt -region centroids.reg -zoom to fit 
    popd
  end

In each IDL session that starts above, paste:
    .run ae_timing
    ae_plot_ObsID_photometry

IF the source has a known period much longer than the ObsID lengths, then supply that period with the DAYS_PER_PERIOD parameter.
IF the phase of that periodicity is known, supply "T0" with the TIME_AT_PHASE0 parameter.  Below is an example calculation of TIME_AT_PHASE0 from a published T0 value in MJD:

    t0_MJD     = 56022.4D ; (days)
    
    ; Convert MJD to Chandra mission time.
    t0_Chandra = (t0_MJD - 50814.0D)*3600D*24D ; (s)
    
    ae_plot_ObsID_photometry, DAYS_PER_PERIOD=158.760D, TIME_AT_PHASE0=t0_Chandra
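The epoch arithmetic can also be sanity-checked outside IDL: Chandra mission time counts seconds from MJD 50814.0 (1998.0 TT), so T0 = (MJD - 50814.0)*86400.  A shell/awk check for the example T0 above (not to be pasted into the IDL session):

```shell
# Convert the example T0 (MJD 56022.4) to Chandra mission time (seconds
# since MJD 50814.0), mirroring the IDL arithmetic above.
t0=$(awk 'BEGIN { printf "%.1f", (56022.4 - 50814.0) * 86400 }')
echo "$t0"
```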



-------------------------------------------------------------  
(optional) Build ASCII lightcurves (e.g. for Liz Bartlett)

  * Binsize = frame time
  * Remove bins with off-nominal exposure.
  
For EXAMPLE:
  set sourcename='053749.06-690508.1'

  pushd $sourcename/
  foreach dir ([0-9]*)
    dmextract "$dir/source.evt[bin time=::3.241040]" $dir/bary_3s.lc opt=ltc1 clob+ verb=1
    dmcopy "$dir/bary_3s.lc[cols TIME,COUNTS,EXPOSURE][EXPOSURE>3.1]"   "$dir/bary_3s.lc.txt[opt kernel=text/simple]" clob+ verb=1
  end


-------------------------------------------------------------  
(optional) SEARCH TIMESTAMPS FOR PERIODS

For EXAMPLE:
  set sourcename='053749.06-690508.1'

  pushd $sourcename/all_inclusive/
  idl -queue |& tee -a ae_timing.log 
    .run ae_timing
    ae_fold_lightcurve, OVERSAMPLING_FACTOR=20, MIN_FREQUENCY=1/(2000*86400D), MAX_FREQUENCY=1/(0.1*86400D), trials, most_significant_trial, /TIME_IN_DAYS
    
  ls -ltr */bary_3s.lc.txt
  popd 

See more extensive notes in /bulk/cochise1/targets/t-rex/data/extract_optimized/point_sources.noindex/timing_notes.txt



=====================================================================
Release data products for your list of POINT SOURCES as follows.  
Source directory trees can be large---we exclude some data products to save space on Dropbox.

  set dest="$DEST_ROOT/$TARGET/extract/point_sources.noindex/"
  mkdir -p "$dest"
  ls       "$dest"

  cd $TARGET_ROOT/$TARGET/data/extract/point_sources.noindex/

  rsync -av --update /bulk/cochise1/psb6/TARA/doc/AE_users_guide/standardized_output_directory_tree.pdf "$dest"/..
  rsync -av --update release.srclist xspec.srclist "$dest"

-------------------------------------------------------------  
Release EPOCH data products for the piled sources ...

  foreach sourcename (`cut -f1 -d ';' ../obs*/possibly_piled.srclist | sort | uniq`)
    printf "%s\n" $sourcename
    rsync -av --update --include-from=$RSYNC_FILTER/point_EPOCH_spectra.txt         $sourcename "$dest"
  end
  
-------------------------------------------------------------  
Release XSPEC products for sources that we've tried to fit ...

  foreach sourcename (`egrep -hv '^[[:space:]]*(;|$)' xspec.srclist  | cut -d " " -f1`)
    printf "%s\n" $sourcename
    rsync -av --update --include-from=$RSYNC_FILTER/point_spectra.txt     $sourcename "$dest"
  end

-------------------------------------------------------------  
Release timing data products for all sources ...
  
  foreach sourcename (`egrep -hv '^[[:space:]]*(;|$)' release.srclist  | cut -d " " -f1`)
    printf "%s\n" $sourcename
    rsync -a --include-from=$RSYNC_FILTER/point_timing.txt      $sourcename "$dest"
  end
  
-------------------------------------------------------------  
Find and repair dangling symlinks.

  cd $TARGET_ROOT/$TARGET/data/
  set dest="$DEST_ROOT/$TARGET/"
  
  foreach file (`cd "$dest"; find . -type l ! -exec test -e {} \; -print`)
    printf "%s\n" $file
    'rsync' -aL --relative --no-implied-dirs  "$file" "$dest"
  end
  printf "\nRemaining dangling symlinks:\n"
  find "$dest" -type l ! -exec test -e {} \; -print
  echo $TARGET
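The find test used above (-type l ! -exec test -e {} \; -print) reports symlinks whose targets do not resolve.  A self-contained illustration on a toy directory (hypothetical filenames):

```shell
# One good symlink and one dangling symlink.
d=$(mktemp -d)
touch "$d/real.evt"
ln -s real.evt    "$d/good_link"
ln -s missing.evt "$d/bad_link"

# Print only symlinks whose target does not exist.
dangling=$(cd "$d" && find . -type l ! -exec test -e {} \; -print)
echo "$dangling"
rm -rf "$d"
```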

  
-------------------------------------------------------------  
Review data products being released for each point source.
Build source tarballs in Dropbox directory.

  cd "$DEST_ROOT/$TARGET/extract/point_sources.noindex/"
  
  set dropbox_dest="$DROPBOX_ROOT/$TARGET/extract/point_sources.noindex/"
  mkdir -p "$dropbox_dest"

  foreach sourcename (`egrep -hv '^[[:space:]]*(;|$)' release.srclist  | cut -d " " -f1`)
    pushd $sourcename
    printf "\n\nRELEASING %s\n" $sourcename
    
    open extractions.ps lightcurve.ps  median_energy.ps  flux_vs_emedian.ps all_inclusive/*.sequenced_lc.ps
    
    open all_inclusive/spectral_models/best_model/ldata.ps

    dmlist all_inclusive/source.stats head |egrep 'MERGPRUN|MERGQUAL|MERGNUM|MERGFRAC|MERGBIAS'
    dmlist all_inclusive/source.evt block
    popd
    printf "\nClose 'xeyes' to release this source.\n"
    xeyes
    
    tar -czvf "$dropbox_dest/${sourcename}.tar.gz"    $sourcename
    pwd
  end
  du -s $dropbox_dest/*






TO BE REVIEWED FOR CONSISTENCY WITH POINT SOURCE RELEASE PROCEDURE ABOVE .....  
  
=====================================================================
Release data products for specific DIFFUSE SOURCES as follows.  
Source directory trees can be large---we exclude some data products to save space on Dropbox.

-------------------------------------------------------------  
Configuration, e.g.
  set dest="$DEST_ROOT/$TARGET/extract/diffuse_sources.noindex/"
  mkdir -p "$dest"
  ls       "$dest"
       
  cd /bulk/hiawatha1/targets/$TARGET/data/extract/diffuse_sources.noindex/

List the source(s) you are releasing in the file release.srclist


-------------------------------------------------------------  
Review data products being released for each diffuse source.
Build source tarballs in Dropbox directory.

Execute as many of the rsync commands below as desired, trading off the collaborator's interests, upload time, and Dropbox storage limitations.

  foreach sourcename (`egrep -hv '^[[:space:]]*(;|$)' release.srclist  | cut -d " " -f1`)
    'rm' -rf /tmp/release/
    
    ds9 -title $sourcename  $sourcename/source.evt -region $sourcename/extract.reg -zoom to fit &
    
    open $sourcename/spectral_models/grp*_plot_gross_and_background_spectra/ldata.ps
    
    if ( -e $sourcename/handfit.ps) open $sourcename/handfit.ps

    rsync -av --update --include-from=$RSYNC_FILTER/diffuse_spectra.txt     $sourcename /tmp/release/
    
    
    pushd /tmp/release/$sourcename
    printf "\n\nRELEASING %s\n" $sourcename
    dmlist source.stats head |egrep 'MERGNUM|MERGFRAC|SRC_CNTS|SRC_AREA'
    popd
    printf "\nClose 'xeyes' to release this source.\n"
    xeyes
    
    tar -czvf "$dest/${sourcename}.tar.gz" -C /tmp/release/      $sourcename
    pwd
  end









=====================================================================
Launch Dropbox application, and send invitations to collaborators.
  open -a DropBox
  open https://www.dropbox.com/home



#################################################################################
#################### THINK CAREFULLY ABOUT YOUR BACKUP SYSTEMS #################### 
#################################################################################
At the top of the validation procedure, you may have chosen to exclude this target's AE directory tree from your backup systems (e.g. TimeMachine, CrashPlan, rsync, etc.), to avoid creating incremental backups of preliminary AE data products that have very little long-term value.  Now that you've completed the major AE processing, you should re-enable backup of this directory tree.


(Pat and Leisa only) Copy target to archive disk, where it will be backed up.
-------------------------------------------------------
  idl
    relocate_target,  /CHECKIN, /BACKUP, getenv('TARGET'), EXCLUDE_PATTERN=['instmap']
    
    exit



=============================================================================
Moving On
=============================================================================
That's it!  You have fully extracted your X-ray catalog!  

You can now move on to diffuse_procedure.txt if you want -- keep your screen session in place to do this.

Otherwise, terminate the screen session:

1. In screen window #0, run the following to paste the 'exit' command into every window, which will cleanly terminate any ssh connections you have to other computers (avoiding zombie processes):
  screen -S $SCREEN_NAME -X at \# stuff 'exit^M'

2. If that doesn't end the screen session, then type the screen command "^a\".


If you forgot to "exit" the ssh connections as instructed above, and instead simply killed the screen session with ^a\, the remote shells (which are running "caffeinate" to keep the machine from sleeping) will NOT DIE.  In that case, the easiest way to kill them (and recover from your mistake) is to ssh to each machine again and run
  killall caffeinate


The next phase of your data reduction could be fitting some point source spectra (procedure spectral_fitting_procedure.txt), or could be timing analysis (no procedure written yet), or could be masking these point sources from the data and then some diffuse analysis (procedure diffuse_procedure.txt).






#################################################################################
#################### NOTES ON OPTIONAL ANALYSIS STEPS #################### 
#################################################################################
The material below is a collection of notes on analysis tasks that occasionally may be needed, not a procedure that should be strictly followed for all targets.

To figure out how many sources made it into the final catalog:
   dmlist target/data/xray_properties.fits block
   
or if that symlink doesn't exist,
  dmlist target/data/extract/point_sources.noindex/tables/xray_properties.fits block






IDENTIFY AND VISUALIZE SOURCES OF PARTICULAR INTEREST
=========================================================================
First, define a workspace where you can create notes, source lists, region files, FITS tables, etc. without interfering with standard data products.  One sensible location is a subdirectory of point_sources.noindex/tables/

  cd point_sources.noindex/
  mkdir tables/sources_of_interest/
  cd    tables/sources_of_interest/



Draw a region of interest, and trim the source catalog to that region.
  touch roi.reg
  ds9 -bin factor 4 ../../../target.evt  -region roi.reg &

Draw one or more POLYGON, CIRCLE, or BOX regions representing the region of interest.  Save to roi.reg in SEXAGESIMAL celestial coordinates.

  dmcopy "../xray_properties.fits[(RAdeg,DEdeg)=region(roi.reg)]" xray_properties_roi.fits clob+



Identify the N sources with the largest photon fluxes, show their lightcurves, and display their locations in ds9.
  idl
    .run
    N = 20

    cd, CURRENT=workspace_dir
    workspace_dir += '/'
    pushd, '../../'
    srclist_fn        = workspace_dir+'xspec.srclist'
    collated_filename = workspace_dir+'xspec.collated'
    region_fn         = workspace_dir+'xspec.reg'

    ; Select brightest sources in ROI.
    xspec = mrdfits(workspace_dir+'xray_properties_roi.fits',1)
    
    ind_sort = reverse(sort(xspec.PhotonFlux_t))
    
    xspec = xspec[ ind_sort[0:N-1] ]
    
    ; List selected sources and useful properties in ASCII table.
    forprint, TEXTOUT=srclist_fn, /NoCOM, $
              xspec.NAME, xspec.LABEL, xspec.PhotonFlux_t, xspec.NetCounts_t, $
              F='(%"%s ; (%s) ; PhotonFlux_t = %8.2g, NetCounts_t = %5d")'

    ; Build XSPEC region file.
    acis_extract, srclist_fn, REGION_FILE='temp.reg', REGION_TAG='XSPEC sources', COLLATED_FILENAME='/dev/null'

    cmd = string(region_fn, F='(%"sed -e ''s/DodgerBlue/Red/'' temp.reg > %s")')
    run_command, cmd
    popd
    
    ; Build "mugshots" document with lightcurves.
    plot_name = file_search( '../../' + strtrim(xspec.NAME,2) + '/all_inclusive/*.sequenced_lc.ps' )

    ind = where(~file_test(plot_name), count)
    if (count GT 0) then begin
      print, 'The following plots are missing:'
      forprint, plot_name[ind]
      plot_name[ind] = file_which('no_model.eps')
    endif

    cmd = file_which('plot_spectra.pl')+' --doc_name=lightcurves --angle=0 --num_per_page=10 ' + strjoin(plot_name, ' ')
    run_command, cmd 
    exit, /NO_CONFIRM
    end

  ~/bin/acis_lkt4 -R .
  ds9 -bin factor 4 ../../../target.evt  -region ../../polygons.reg  -region xspec.reg  -region roi.reg &


If you decide a source in xspec.srclist is uninteresting, hide it with the comment character ';'.



FIT SPECTRAL MODELS TO THE INTERESTING SOURCES
=========================================================================
From tables/sources_of_interest/
  idl -queue |& tee -a fit_spectra.log
    .run
    cd, CURRENT=workspace_dir
    workspace_dir += '/'
    pushd, '../../'
    srclist_fn        = workspace_dir+'xspec.srclist'
    collated_filename = workspace_dir+'xspec.collated'
    region_fn         = workspace_dir+'xspec.reg'

    ; Update collation and region file to match the source list.
    ; Below we need column SRC_SIGNIF from collation.
    acis_extract, srclist_fn, MERGE_NAME='photometry', REGION_FILE='temp.reg', REGION_TAG='XSPEC sources', COLLATED_FILENAME=collated_filename, HDUNAME='BEST_MDL'

    cmd = string(region_fn, F='(%"sed -e ''s/DodgerBlue/Red/'' temp.reg > %s")')
    run_command, cmd

    ; 1T Models for Galactic massive or T-tauri stars.
    xspec = mrdfits(collated_filename,1)
    foreach model_filename, 'xspec_scripts/compact/'+['T:Solar,Solar.xcm','T:Solar,TT.xcm'] do begin

      acis_extract, srclist_fn, MERGE_NAME='photometry', /FIT_SPECTRA,$
                  CSTAT_EXPRESSION='bt.SRC_SIGNIF LT 4',$
                  CHANNEL_RANGE=[35,548],$                       ; 0.5:8 keV.
                  MODEL_FILENAME=model_filename
     endforeach
    popd
    exit, /NO_CONFIRM
    end




REVIEW MODELS AND DECLARE BEST MODEL
From tables/sources_of_interest/
  idl
    .run ae
    .run
    run_command, 'cat ../../polygons.reg  roi.reg  xspec.reg > review.reg'
    cd, CURRENT=workspace_dir
    workspace_dir += '/'
    pushd, '../../'
    srclist_fn        = workspace_dir+'xspec.srclist'
    collated_filename = workspace_dir+'xspec.collated'
    region_fn         = workspace_dir+'xspec.reg'

    distance=0.0
    read, 'distance in pc: ', distance

    ae_spectra_viewer, collated_filename, DISTANCE=distance, DISPLAY_FILE='../target.evt', REGION_FILENAME=workspace_dir+'review.reg'

    ; Collate declared best models.
    acis_extract, srclist_fn, MERGE_NAME='photometry', REGION_FILE='temp.reg', REGION_TAG='XSPEC sources', COLLATED_FILENAME=collated_filename, HDUNAME='BEST_MDL'

    cmd = string(region_fn, F='(%"sed -e ''s/DodgerBlue/Red/'' temp.reg > %s")')
    run_command, cmd
    popd
    exit, /NO_CONFIRM
    end


WHEN FITTING IS COMPLETE, RUN FINAL COLLATION AND BUILD SUMMARY DATA PRODUCTS
From tables/sources_of_interest/
  idl
    .run
    cd, CURRENT=workspace_dir
    workspace_dir += '/'
    pushd, '../../'
    srclist_fn        = workspace_dir+'xspec.srclist'
    collated_filename = workspace_dir+'xspec.collated'
    region_fn         = workspace_dir+'xspec.reg'

    ; Collate declared best models.
    acis_extract, srclist_fn, MERGE_NAME='photometry', REGION_FILE='temp.reg', REGION_TAG='XSPEC sources', COLLATED_FILENAME=collated_filename, HDUNAME='BEST_MDL'

    cmd = string(region_fn, F='(%"sed -e ''s/DodgerBlue/Red/'' temp.reg > %s")')
    run_command, cmd

    xspec = mrdfits(collated_filename,1)
    popd

    ; Create document showing best model for all the sources, laid out 12 to a page.
    xspec = xspec[ where(~strmatch(xspec.MODEL, 'no_fit*')) ]

    plot_name = '../../' + strtrim(xspec.CATALOG_NAME,2) + '/' +$
                           strtrim(xspec.MERGE_NAME  ,2) + '/spectral_models/' +$
                           strtrim(xspec.MODEL       ,2) + '/ldata.ps'

    ind = where(~file_test(plot_name), count)
    if (count GT 0) then begin
      print, 'The following plots are missing:'
      forprint, plot_name[ind]
      plot_name[ind] = file_which('no_model.eps')
    endif

    cmd = file_which('plot_spectra.pl')+' --doc_name=xspec_fits ' + strjoin(plot_name, ' ')
    run_command, cmd 
    exit, /NO_CONFIRM
    end

The code above calls a Perl script, plot_spectra.pl, that builds a LaTeX document (spectra.tex) presenting the XSPEC plots 12 to a page.  LaTeX is run, and the PostScript document spectra.ps is produced.

From tables/sources_of_interest/
  idl -queue |& tee flatten.log
    .run ae
    .run
    collated_filename = 'xspec.collated'
    flat_table_fn     = 'xspec.fits'

    distance=0.0
    read, 'distance in pc: ', distance

    ae_flatten_collation, COLLATED_FILENAME=collated_filename, FLAT_TABLE_FILENAME=flat_table_fn, SORT=0, FAST=0, ABUNDANCES_TO_REPORT='', DISTANCE=distance

    ; For LMC, add AV_TO_NH_FACTOR=6.5e21 to call below.
    hmsfr_tables, flat_table_fn, file_which('/hmsfr_tables.tex'), THERMAL_MODEL_PATTERN='T|apec'
    end

  lualatex xray_properties
  lualatex xray_properties
  lualatex -jobname=xray_properties "\includeonly{./src_properties,./thermal_spectroscopy,./powerlaw_spectroscopy}\input{xray_properties}"
  open xray_properties.pdf









BUILD CARTOON IMAGES DEPICTING THE POINT SOURCES
================================================
Starting from the point_sources.noindex directory, you need a template image.  If you've gone through diffuse_procedure.txt and made adaptively-smoothed diffuse images, then fullfield_template.img is a good template:

  idl
    acis_extract, 'all.srclist', /MERGE_OBSERVATIONS, /PLOT, COLLATED_FILENAME='tables/photometry.collated', CARTOON_TEMPLATE='../adaptive_smoothing/fullfield_template.img', CARTOON_FWHM=1
  exit

The FWHM given here is in sky pixels.  I generally like the gaussians to be quite narrow, so I use a FWHM of 1 sky pixel.  If you want to try other sizes, move the default output file cartoon_sky.img to a different name so it doesn't get overwritten:

  mv cartoon_sky.img cartoon_sky1.img


We typically use these cartoons in conjunction with a Spitzer 8um image in green and full-band diffuse X-ray emission in red:

  ds9c -blue cartoon_sky.img -green /bulk/hiawatha1/targets/MOXC/targets/M17/spitzer/GLM_01513-064_mosaic_I4.fits  -red ../adaptive_smoothing/full_band/sig015/tophat/iarray.diffuse_fill5.flux  &



INVESTIGATE WHETHER ANY BRIGHT SOURCES ARE EXTENDED.
==============================================================
  popd
  idl
    .r ae
    .run
    bt = mrdfits('tables/all_inclusive.collated', 1, /SILENT)
    ; Confirm we're using the "full" band:
    band_full = 0
    print, 'Using the energy band ', bt[0].ENERG_LO[band_full], bt[0].ENERG_HI[band_full]
    forprint, TEXTOUT='bright.srclist', bt.CATALOG_NAME, SUBSET=where(bt.NET_CNTS[band_full] GE 30, count), /NoCOMMENT
    print, count, F='(%"%d bright sources to analyze ...\n")'
    if (count EQ 0) then stop

    ae_radial_profile, report, MERGE_NAME='photometry', SRCLIST_FILENAME='bright.srclist', /PLOT

    forprint, report.SOURCENAME, report.LABEL, report.PROB_KS_R < report.PROB_KS_DX < report.PROB_KS_DY, F='(%"%s (%8s): min Pks = %6.4f")'
    end

***PAT -- the above code is not working.  5 Sept 2014, lkt.***

By DEFAULT the radial profile is calculated within a circular aperture, which may include only the core of the PSF; use SRC_RADIUS to change the radius of this analysis aperture.

Omitting /PLOT will suppress the plots and user interaction.
The optional keyword ENERGY_RANGE specifies which event energies are analyzed, e.g. ENERGY_RANGE=[0.5,2] (keV).
The output variable, "report" above, is a structure array containing three Kolmogorov-Smirnov statistics comparing the PSF and events.



