Welcome to legacypipe’s documentation!

Contents:

Running the legacypipe code

Preparing a LEGACY_SURVEY_DIR directory

The set of input images, calibration files, and output products, along with metadata, all live in a single directory. The path to that directory should be placed in an environment variable called LEGACY_SURVEY_DIR. An example directory is in the GitHub repository:

http://github.com/legacysurvey/legacypipe-dir

so to start out, you will probably want to clone that repository and then modify it for your purposes:

git clone http://github.com/legacysurvey/legacypipe-dir survey-dir
export LEGACY_SURVEY_DIR=$(pwd)/survey-dir

You need not (indeed, probably don’t want to) be in that directory while running the code; typically it is easier to cd into the legacypipe/py directory and run the python code from there.

git clone http://github.com/legacysurvey/legacypipe legacypipe
cd legacypipe/py

Preparing the list of images to process

To run the legacypipe code on your own data, the first step is to produce a survey-ccds table. This is a FITS binary table listing metadata about the images to be processed. For DECaLS, MzLS, and BASS, this file is created by the decstat or mosstat code and then massaged by the legacypipe/merge-zeropoints.py code.

The CCDs table must contain the following fields:

  • camera – string, see below
  • image_filename – string, path relative to $LEGACY_SURVEY_DIR/images
  • image_hdu – integer, FITS extension containing pixels
  • expnum – integer exposure number. Together with ccdname, this identifies a CCD.
  • ccdname – string
  • filter – string of length 1
  • exptime – float, exposure time in seconds
  • seeing – float, seeing FWHM in arcseconds
  • zpt – float, zeropoint (average for the exposure)
  • crpix1 – float, WCS reference pixel X
  • crpix2 – float, WCS reference pixel Y
  • crval1 – float, WCS reference RA
  • crval2 – float, WCS reference Dec
  • cd1_1 – float, WCS transformation matrix
  • cd1_2 – float, WCS transformation matrix
  • cd2_1 – float, WCS transformation matrix
  • cd2_2 – float, WCS transformation matrix
  • width – integer, image width in pixels
  • height – integer, image height in pixels
  • ra – float, image center RA
  • dec – float, image center Dec

Each row of the table should correspond to one contiguous chunk of pixels contained in one HDU of a FITS file, described by a single astrometric (World Coordinate System) and photometric solution.

This so-called “CCDs table” file must be placed in your $LEGACY_SURVEY_DIR directory. The pipeline will look for all files named “survey-ccds-*.fits.gz” and read each one to determine which input images exist. Any field that does not exist in a table will be filled in with zeroes.
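
As an illustration, here is a minimal sketch (not part of legacypipe itself) of how such a one-row CCDs table could be built with the fits_table class from astrometry.net, which legacypipe uses internally; every value below is a placeholder to be filled from your own image headers.

import numpy as np
from astrometry.util.fits import fits_table

ccds = fits_table()
# One row per contiguous chunk of pixels (one CCD of one exposure).
ccds.camera = np.array(['decam'])
ccds.image_filename = np.array(['decam/my-exposure.fits.fz'])  # relative to $LEGACY_SURVEY_DIR/images
ccds.image_hdu = np.array([1], dtype=np.int32)
ccds.expnum = np.array([123456], dtype=np.int64)
ccds.ccdname = np.array(['N4'])
ccds.filter = np.array(['g'])
ccds.exptime = np.array([90.], dtype=np.float32)
ccds.seeing = np.array([1.3], dtype=np.float32)   # FWHM in arcsec
ccds.zpt = np.array([25.0], dtype=np.float32)
# WCS terms, copied from the image header.
ccds.crpix1 = np.array([1024.5], dtype=np.float32)
ccds.crpix2 = np.array([2048.5], dtype=np.float32)
ccds.crval1 = np.array([150.0])
ccds.crval2 = np.array([2.2])
ccds.cd1_1 = np.array([-7.3e-5], dtype=np.float32)
ccds.cd1_2 = np.array([0.], dtype=np.float32)
ccds.cd2_1 = np.array([0.], dtype=np.float32)
ccds.cd2_2 = np.array([7.3e-5], dtype=np.float32)
ccds.width = np.array([2046], dtype=np.int16)
ccds.height = np.array([4094], dtype=np.int16)
ccds.ra = np.array([150.0])
ccds.dec = np.array([2.2])
ccds.writeto('survey-ccds-mydata.fits')
# gzip survey-ccds-mydata.fits and place it in $LEGACY_SURVEY_DIR.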

The camera field has a special meaning and purpose in the code: it is a string such as “decam”, “mosaic”, “90prime” that tells the code which python class to use to interpret the image. There is a dictionary (in the LegacySurveyData class) that holds this mapping. If you need to add a new camera to this mapping, you might want to either add it directly to the LegacySurveyData class, or create a small wrapper script that adds your new class and then calls the main runbrick code; for an example of this, please see legacypipe/runcosmos.py.

Calibrating the images

Some of the camera-specific image-handling classes perform additional calibration of each image – for example, producing a PSF model or a sky model. These calibration results are saved in files within the calib directory of $LEGACY_SURVEY_DIR. Each camera-specific class defines which calibration processes it wants to run. The legacypipe code can either run these calibration processes on demand, when an image is about to be read, or they can be run as a pre-processing step.

(more on this; queue-calibs)

Bricks

A brick is a region of the sky defined by an RA,Dec box. The list of bricks to be processed is contained in a FITS table in the $LEGACY_SURVEY_DIR directory, survey-bricks.fits.gz. You can certainly define your own bricks if desired. Bricks are named like RRRR[pm]DDD, where RRRR is a zero-padded 4-digit string of the RA times 10 ('%04i' % (ra*10) in python), [pm] is 'p' or 'm' for the sign of the Dec, and DDD is a zero-padded 3-digit string of the absolute Dec times 10; see the sketch below.
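
As a quick illustration of the naming convention (the survey-bricks table is the authoritative source; this is just a sketch):

def brickname_for(ra, dec):
    # RA times 10, zero-padded to 4 digits; 'p'/'m' for the sign of Dec;
    # absolute Dec times 10, zero-padded to 3 digits.
    return '%04i%s%03i' % (int(ra * 10), 'p' if dec >= 0 else 'm', int(abs(dec) * 10))

print(brickname_for(209.05, -6.5))   # '2090m065'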

Running the pipeline

The main script is legacypipe/runbrick.py. It takes many command-line arguments, but at the very least you will need:

  • --brick <brickname>: determines which part of the sky to run.

By default, our bricks are 3600 x 3600 pixels, with a nominal pixel scale of 0.262 arcseconds per pixel. These can be adjusted with:

  • --width <W> --height <H> --pixscale <p>, where W and H are in pixels, and p is in arcsec/pixel.

If you are using our default bricks, you should ensure that the brick size is at least 0.25 degrees plus some padding.

It is also possible to run a “custom” brick at a given RA,Dec center:

  • --radec <ra> <dec>, where <ra> and <dec> are in degrees.

By default, output products are written to the current directory; to change that:

  • --outdir <d>

You can also set the survey directory explicitly, instead of relying on the $LEGACY_SURVEY_DIR environment variable:

  • --survey-dir <d>
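
Putting these options together, a typical invocation might look like one of the following (brick name, region, and output directory are placeholders):

python legacypipe/runbrick.py --brick 2090m065 --outdir out
python legacypipe/runbrick.py --radec 150.1 2.2 --width 1000 --height 1000 --pixscale 0.262 --outdir out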

The code uses the stages framework, which allows saving the state of a computation between stages of processing. State is saved in python “pickle” files. There are dependencies between stages, so if a computation is resumed later, the relevant pickle files can be read and the computation continued from where it left off. The stages in the runbrick code, and their prerequisites, are listed in the prereqs dictionary in the legacypipe/runbrick.py code. There are some flags to control the stage behavior:

  • --stage <s>: which stage(s), plus their prerequisites, to run. Stages include:

    • tims: reads input images
    • mask_junk: eliminates satellite trails
    • image_coadds: early coadds
    • srcs: detects sources
    • fitblobs: fits sources
    • coadds: produces coadds, including models and residuals
    • wise_forced: WISE forced photometry
    • writecat: writes output tractor table
  • --force-all: ignore all pickle files and run all required stages

  • --force <s>: force a single stage to run, even if its pickle file exists

  • --no-write: do not write out pickle files

  • --pickle <s>: set the pickle filename pattern. The format is somewhat odd because the pattern goes through two rounds of string substitution. The default is pickles/runbrick-%(brick)s-%%(stage)s.pickle (which you must put in single quotes on the command line to avoid strange shell behavior). This is a python string-formatting string: the brick name is substituted first and the stage name later, so the % of the stage placeholder is escaped as %%. See the example below.
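
To make the two rounds of substitution concrete, here is what happens to the default pattern (plain python string formatting, not pipeline code):

pat = 'pickles/runbrick-%(brick)s-%%(stage)s.pickle'
pat = pat % dict(brick='2090m065')   # 'pickles/runbrick-2090m065-%(stage)s.pickle'
fn = pat % dict(stage='tims')        # 'pickles/runbrick-2090m065-tims.pickle'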

legacypipe.runbrick module

Main “pipeline” script for the Legacy Survey (DECaLS, MzLS, BASS) data reductions.

For calling from other scripts, see the run_brick() function below.

Or for much more fine-grained control, see the individual stage functions: stage_tims(), stage_mask_junk(), stage_srcs(), stage_fitblobs(), stage_coadds(), stage_wise_forced(), stage_writecat().

To see the code we run on each “blob” of pixels, see oneblob.py:

  • one_blob()
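
For example, a minimal sketch of driving the pipeline from your own script via run_brick() (the survey directory, output directory, and brick name are placeholders):

from legacypipe.survey import LegacySurveyData
from legacypipe.runbrick import run_brick

survey = LegacySurveyData(survey_dir='/path/to/survey-dir', output_dir='out')
# Run all stages up to the final catalog, ignoring any existing pickles.
run_brick('2090m065', survey, pixscale=0.262, width=3600, height=3600,
          stages=['writecat'], forceall=True, write_pickles=False)
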
legacypipe.runbrick.collapse_unwise_bitmask(bitmask, band)[source]

Converts WISE mask bits (in the unWISE data products) into the more compact codes reported in the tractor files as WISEMASK_W[12], and the “maskbits” WISE extensions.

Output bits:

  • 2^0 = bright star core and wings
  • 2^1 = PSF-based diffraction spike
  • 2^2 = optical ghost
  • 2^3 = first latent
  • 2^4 = second latent
  • 2^5 = AllWISE-like circular halo
  • 2^6 = bright star saturation
  • 2^7 = geometric diffraction spike
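
For example, to test a couple of these bits against a WISEMASK_W1 column (a sketch; the array here is a placeholder):

import numpy as np

wisemask_w1 = np.array([0, 1, 4, 5], dtype=np.uint8)   # placeholder mask values
bright_core = (wisemask_w1 & (1 << 0)) != 0            # 2^0: bright star core and wings
ghost = (wisemask_w1 & (1 << 2)) != 0                  # 2^2: optical ghost
print(bright_core, ghost)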

legacypipe.runbrick.read_gaia(targetwcs)[source]

margin in degrees

legacypipe.runbrick.read_star_clusters(targetwcs)[source]

Code to regenerate the NGC-star-clusters.fits catalog:

wget https://raw.githubusercontent.com/mattiaverga/OpenNGC/master/NGC.csv

import os
import numpy as np
import numpy.ma as ma
from astropy.io import ascii
from astrometry.util.starutil_numpy import hmsstring2ra, dmsstring2dec
import desimodel.io
import desimodel.footprint

names = ('name', 'type', 'ra_hms', 'dec_dms', 'const', 'majax', 'minax',
         'pa', 'bmag', 'vmag', 'jmag', 'hmag', 'kmag', 'sbrightn', 'hubble',
         'cstarumag', 'cstarbmag', 'cstarvmag', 'messier', 'ngc', 'ic',
         'cstarnames', 'identifiers', 'commonnames', 'nednotes', 'ongcnotes')

NGC = ascii.read('NGC.csv', delimiter=';', names=names)

objtype = np.char.strip(ma.getdata(NGC['type']))
keeptype = ('PN', 'OCl', 'GCl', 'Cl+N')
keep = np.zeros(len(NGC), dtype=bool)
for otype in keeptype:
    ww = [otype == tt for tt in objtype]
    keep = np.logical_or(keep, ww)

clusters = NGC[keep]

ra, dec = [], []
for _ra, _dec in zip(ma.getdata(clusters['ra_hms']), ma.getdata(clusters['dec_dms'])):
    ra.append(hmsstring2ra(_ra.replace('h', ':').replace('m', ':').replace('s', '')))
    dec.append(dmsstring2dec(_dec.replace('d', ':').replace('m', ':').replace('s', '')))
clusters['ra'] = ra
clusters['dec'] = dec

tiles = desimodel.io.load_tiles(onlydesi=True)
indesi = desimodel.footprint.is_point_in_desi(tiles, ma.getdata(clusters['ra']),
                                              ma.getdata(clusters['dec']))
print(np.sum(indesi))

clusters.write('NGC-star-clusters.fits', overwrite=True)

legacypipe.runbrick.run_brick(brick, survey, radec=None, pixscale=0.262, width=3600, height=3600, zoom=None, bands=None, allbands=None, depth_cut=False, nblobs=None, blob=None, blobxy=None, blobradec=None, blobid=None, max_blobsize=None, nsigma=6, simul_opt=False, wise=True, lanczos=True, early_coadds=False, blob_image=False, do_calibs=True, write_metrics=True, gaussPsf=False, pixPsf=False, hybridPsf=False, normalizePsf=False, apodize=False, rgb_kwargs=None, rex=False, splinesky=True, subsky=True, constant_invvar=False, gaia_stars=False, min_mjd=None, max_mjd=None, unwise_coadds=False, bail_out=False, ceres=True, wise_ceres=True, unwise_dir=None, unwise_tr_dir=None, threads=None, plots=False, plots2=False, coadd_bw=False, plot_base=None, plot_number=0, record_event=None, pickle_pat='pickles/runbrick-%(brick)s-%%(stage)s.pickle', stages=['writecat'], force=[], forceall=False, write_pickles=True, checkpoint_filename=None, checkpoint_period=None, prereqs_update=None, stagefunc=None)[source]

Run the full Legacy Survey data reduction pipeline.

The pipeline is built out of “stages” that run in sequence. By default, this function will cache the result of each stage in a (large) pickle file. If you re-run, it will read from the prerequisite pickle file rather than re-running the prerequisite stage. This can yield faster debugging times, but you almost certainly want to turn it off (with write_pickles=False, forceall=True) in production.

Parameters:
brick : string

Brick name such as ‘2090m065’. Can be None if radec is given.

survey : a LegacySurveyData object (see legacypipe.survey.LegacySurveyData)

In charge of the list of bricks and CCDs to be handled, and of where output files should be written.

radec : tuple of floats (ra,dec)

RA,Dec center of the custom region to run.

pixscale : float

Brick pixel scale, in arcsec/pixel. Default = 0.262

width, height : integers

Brick size in pixels. Default of 3600 pixels (with the default pixel scale of 0.262) leads to a slight overlap between bricks.

zoom : list of four integers

Pixel coordinates [xlo,xhi, ylo,yhi] of the brick subimage to run.

bands : string

Filter (band) names to include; default is “grz”.

Raises:
RunbrickError

If an invalid brick name is given.

NothingToDoError

If no CCDs, or no photometric CCDs, overlap the given brick or region.

Notes

You must specify the region of sky to work on, via one of:

  • brick: string, brick name such as ‘2090m065’
  • radec: tuple of floats; RA,Dec center of the custom region to run

If radec is given, brick should be None. If brick is given, that brick’s RA,Dec center will be looked up in the survey-bricks.fits file.

You can also change the size of the region to reduce:

  • pixscale: float, brick pixel scale, in arcsec/pixel.
  • width and height: integers; brick size in pixels. 3600 pixels (with the default pixel scale of 0.262) leads to a slight overlap between bricks.
  • zoom: list of four integers, [xlo,xhi, ylo,yhi] of the brick subimage to run.

If you want to measure only a subset of the astronomical objects, you can use:

  • nblobs: None or int; for debugging purposes, only fit the
    first N blobs.
  • blob: int; for debugging purposes, start with this blob index.
  • blobxy: list of (x,y) integer tuples; only run the blobs containing these pixels.
  • blobradec: list of (RA,Dec) tuples; only run the blobs containing these coordinates.

Other options:

  • max_blobsize: int; ignore blobs with more than this many pixels
  • nsigma: float; detection threshold in sigmas.
  • simul_opt: boolean; during fitting, if a blob contains multiple sources, run a step of fitting the sources simultaneously?
  • wise: boolean; run WISE forced photometry?
  • early_coadds: boolean; generate the early coadds?
  • do_calibs: boolean; run the calibration preprocessing steps?
  • write_metrics: boolean; write out a variety of useful metrics
  • gaussPsf: boolean; use a simpler single-component Gaussian PSF model?
  • pixPsf: boolean; use the pixelized PsfEx PSF model and FFT convolution?
  • hybridPsf: boolean; use combo pixelized PsfEx + Gaussian approx model
  • normalizePsf: boolean; make PsfEx model have unit flux
  • splinesky: boolean; use the splined sky model (default is constant)?
  • subsky: boolean; subtract the sky model when reading in tims (tractor images)?
  • ceres: boolean; use Ceres Solver when possible?
  • wise_ceres: boolean; use Ceres Solver for unWISE forced photometry?
  • unwise_dir: string; where to look for unWISE coadd files. This may be a colon-separated list of directories to search in order.
  • unwise_tr_dir: string; where to look for time-resolved unWISE coadd files. This may be a colon-separated list of directories to search in order.
  • threads: integer; how many CPU cores to use

Plotting options:

  • coadd_bw: boolean; if only one band is available, make B&W coadds?
  • plots: boolean; make a bunch of plots?
  • plots2: boolean; make a bunch more plots?
  • plot_base: string, default brick-BRICK, the plot filename prefix.
  • plot_number: integer, default 0, starting number for plot filenames.

Options regarding the “stages”:

  • pickle_pat: string; filename for ‘pickle’ files
  • stages: list of strings; stages (functions stage_*) to run.
  • force: list of strings; prerequisite stages that will be run even if pickle files exist.
  • forceall: boolean; run all stages, ignoring all pickle files.
  • write_pickles: boolean; write pickle files after each stage?
legacypipe.runbrick.stage_coadds(survey=None, bands=None, version_header=None, targetwcs=None, tims=None, ps=None, brickname=None, ccds=None, custom_brick=False, T=None, cat=None, pixscale=None, plots=False, coadd_bw=False, brick=None, W=None, H=None, lanczos=True, saturated_pix=None, brightblobmask=None, bailout_mask=None, mp=None, record_event=None, **kwargs)[source]

After the stage_fitblobs fitting stage, we have all the source model fits, and we can create coadds of the images, model, and residuals. We also perform aperture photometry in this stage.

legacypipe.runbrick.stage_fitblobs(T=None, brickname=None, brickid=None, brick=None, version_header=None, blobsrcs=None, blobslices=None, blobs=None, cat=None, targetwcs=None, W=None, H=None, bands=None, ps=None, tims=None, survey=None, plots=False, plots2=False, nblobs=None, blob0=None, blobxy=None, blobradec=None, blobid=None, max_blobsize=None, simul_opt=False, use_ceres=True, mp=None, checkpoint_filename=None, checkpoint_period=600, write_pickle_filename=None, write_metrics=True, get_all_models=False, refstars=None, rex=False, bailout=False, record_event=None, custom_brick=False, **kwargs)[source]

This is where the actual source fitting happens. The one_blob function is called for each “blob” of pixels with the sources contained within that blob.

legacypipe.runbrick.stage_mask_junk(tims=None, targetwcs=None, W=None, H=None, bands=None, mp=None, nsigma=None, plots=None, ps=None, record_event=None, **kwargs)[source]

This pipeline stage tries to detect artifacts in the individual exposures, by running a detection step and removing blobs with large axis ratio (long, thin objects, often satellite trails).

legacypipe.runbrick.stage_srcs(targetrd=None, pixscale=None, targetwcs=None, W=None, H=None, bands=None, ps=None, tims=None, plots=False, plots2=False, brickname=None, mp=None, nsigma=None, survey=None, brick=None, gaia_stars=False, star_clusters=True, record_event=None, **kwargs)[source]

In this stage we run SED-matched detection to find objects in the images. For each object detected, a tractor source object is created, initially a tractor.PointSource. In this stage, the sources are also split into “blobs” of overlapping pixels. Each of these blobs will be processed independently.

legacypipe.runbrick.stage_tims(W=3600, H=3600, pixscale=0.262, brickname=None, survey=None, ra=None, dec=None, plots=False, ps=None, target_extent=None, program_name='runbrick.py', bands=['g', 'r', 'z'], do_calibs=True, splinesky=True, subsky=True, gaussPsf=False, pixPsf=False, hybridPsf=False, normalizePsf=False, apodize=False, constant_invvar=False, depth_cut=True, read_image_pixels=True, min_mjd=None, max_mjd=None, mp=None, record_event=None, unwise_dir=None, unwise_tr_dir=None, **kwargs)[source]

This is the first stage in the pipeline. It determines which CCD images overlap the brick or region of interest, runs calibrations for those images if necessary, and then reads the images, creating tractor.Image (“tractor image” or “tim”) objects for them.

PSF options:

  • gaussPsf: boolean. Single-component circular Gaussian, with width set from the header FWHM value. Useful for quick debugging.
  • pixPsf: boolean. Pixelized PsfEx model.
  • hybridPsf: boolean. Hybrid Pixelized PsfEx / Gaussian approx model.

Sky:

  • splinesky: boolean. Use SplineSky model, rather than ConstantSky?
  • subsky: boolean. Subtract sky model from tims?
legacypipe.runbrick.stage_wise_forced(survey=None, cat=None, T=None, targetwcs=None, W=None, H=None, pixscale=None, brickname=None, unwise_dir=None, unwise_tr_dir=None, brick=None, wise_ceres=True, unwise_coadds=False, version_header=None, mp=None, record_event=None, **kwargs)[source]

After the model fits are finished, we can perform forced photometry of the unWISE coadds.

legacypipe.runbrick.stage_writecat(survey=None, version_header=None, T=None, WISE=None, WISE_T=None, maskbits=None, maskbits_header=None, wise_mask_maps=None, AP=None, apertures_arcsec=None, cat=None, pixscale=None, targetwcs=None, W=None, H=None, bands=None, ps=None, plots=False, brickname=None, brickid=None, brick=None, invvars=None, gaia_stars=False, allbands='ugrizY', record_event=None, **kwargs)[source]

Final stage in the pipeline: format results for the output catalog.

legacypipe.detection module

legacypipe.detection.run_sed_matched_filters(SEDs, bands, detmaps, detivs, omit_xy, targetwcs, nsigma=5, saturated_pix=None, plots=False, ps=None, mp=None)[source]

Runs a given set of SED-matched filters.

Parameters:
SEDs : list of (name, sed) tuples

The SEDs to run. The sed values are lists of the same length as bands.

bands : list of string

The band names of detmaps and detivs.

detmaps : numpy array, float

Detection maps for each of the listed bands.

detivs : numpy array, float

Inverse-variances of the detmaps.

omit_xy : None, or (xx,yy,rr) tuple

Existing sources to avoid: x, y, radius.

targetwcs : WCS object

WCS object to use to convert pixel values into RA,Decs for the returned Tractor PointSource objects.

nsigma : float, optional

Detection threshold

saturated_pix : None or list of numpy arrays, booleans

Passed through to sed_matched_detection. A map of pixels that are always considered “hot” when determining whether a new source touches hot pixels of an existing source.

plots : boolean, optional

Create plots?

ps : PlotSequence object

Create plots?

mp : multiproc object

Multiprocessing

Returns:
Tnew : fits_table

Table of new sources detected

newcat : list of PointSource objects

Newly detected objects, with positions and fluxes, as Tractor PointSource objects.

hot : numpy array of bool

“Hot pixels” containing sources.

See also

sed_matched_detection
run a single SED-matched filter.
legacypipe.detection.sed_matched_detection(sedname, sed, detmaps, detivs, bands, xomit, yomit, romit, nsigma=5.0, saturated_pix=None, saddle=2.0, cutonaper=True, ps=None)[source]

Runs a single SED-matched detection filter.

Avoids creating sources close to existing sources.

Parameters:
sedname : string

Name of this SED; only used for plots.

sed : list of floats

The SED – a list of floats, one per band, of this SED.

detmaps : list of numpy arrays

The per-band detection maps. These must all be the same size, the brick image size.

detivs : list of numpy arrays

The inverse-variance maps associated with detmaps.

bands : list of strings

The band names of the detmaps and detivs images.

xomit, yomit, romit : iterables (lists or numpy arrays) of int

Previously known sources that are to be avoided; x,y +- radius

nsigma : float, optional

Detection threshold.

saturated_pix : None or list of numpy arrays, boolean

A map of pixels that are always considered “hot” when determining whether a new source touches hot pixels of an existing source.

saddle : float, optional

Saddle-point depth from existing sources down to new sources.

cutonaper : bool, optional

Apply a cut that the source’s detection strength must be greater than nsigma above the 16th percentile of the detection strength in an annulus (from 10 to 20 pixels) around the source.

ps : PlotSequence object, optional

Create plots?

Returns:
hotblobs : numpy array of bool

A map of the blobs yielding sources in this SED.

px, py : numpy array of int

The new sources found.

aper : numpy array of float

The detection strength in the annulus around the source, if cutonaper is set; else -1.

peakval : numpy array of float

The detection strength.

See also

sed_matched_filters
creates the (sedname, sed) pairs used here
run_sed_matched_filters
calls this method
legacypipe.detection.sed_matched_filters(bands)[source]

Determines which SED-matched filters to run based on the available bands.

Returns:
SEDs : list of (name, sed) tuples
legacypipe.detection.segment_and_group_sources(image, T, name=None, ps=None, plots=False)[source]

image: binary image that defines the “blobs”

T: source table; only the “.ibx” and “.iby” elements are used (integer x,y pixel positions). Note: the “.blob” field is added to the table.

name: for debugging only

Returns: (blobs, blobsrcs, blobslices)

blobs: image; values of -1 mean no blob, otherwise integer blob indices

blobsrcs: list of numpy arrays of integers, the elements of T within each blob

blobslices: list of slice objects giving the blob bounding boxes

legacypipe.catalog module

legacypipe.catalog.read_fits_catalog(T, hdr=None, invvars=False, bands='grz', allbands=None, ellipseClass=EllipseE, unpackShape=True, fluxPrefix='')[source]

This is currently a weird hybrid of dynamic and hard-coded.

Return list of tractor Sources.

If invvars=True, return sources,invvars where invvars is a list matching sources.getParams()

If ellipseClass is set, assume that type for galaxy shapes; if None, read the type from the header.

If unpackShape is True and ellipseClass is EllipseE, read the catalog entries “shapeexp_r”, “shapeexp_e1”, “shapeexp_e2” rather than “shapeExp”, and similarly for “dev”.
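
A usage sketch (the catalog filename is a placeholder):

from astrometry.util.fits import fits_table
from legacypipe.catalog import read_fits_catalog

T = fits_table('tractor-2090m065.fits')
srcs = read_fits_catalog(T, bands='grz')
print(len(srcs), 'sources; first:', srcs[0])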

legacypipe.survey module

class legacypipe.survey.BrickDuck(ra, dec, brickname)[source]

Bases: object

A little duck-typing class when running on a custom RA,Dec center rather than a brick center.

class legacypipe.survey.LegacySurveyData(survey_dir=None, cache_dir=None, output_dir=None, version=None, ccds=None, verbose=False)[source]

Bases: object

A class describing the contents of a LEGACY_SURVEY_DIR directory – tables of CCDs and of bricks, and calibration data. Methods for dealing with the CCDs and bricks tables.

This class is also responsible for creating LegacySurveyImage objects (eg, DecamImage objects), which then allow data to be read from disk.

Methods

add_hashcode(fn, hashcode) Callback to be called in the write_output routine.
bricks_touching_radec_box(bricks, ralo, …) Returns an index vector of the bricks that touch the given RA,Dec box.
ccds_touching_wcs(wcs, **kwargs) Returns a table of the CCDs touching the given wcs region.
drop_cache() Clears all cached data contained in this object.
filter_ccds_files(fns) When reading the list of CCDs, we find all files named survey-ccds-*.fits.gz, then filter that list using this function.
find_ccds([expnum, ccdname, camera]) Returns a table of CCDs matching the given expnum (exposure number, integer), ccdname (string), and camera (string), if given.
find_file(filetype[, brick, brickpre, band, …]) Returns the filename of a Legacy Survey file.
get_annotated_ccds() Returns the annotated table of CCDs.
get_brick(brickid) Returns a brick (as one row in a table) by brickid (integer).
get_brick_by_name(brickname) Returns a brick (as one row in a table) by name (string).
get_bricks() Returns a table of bricks.
get_bricks_near(ra, dec, radius) Returns a set of bricks near the given RA,Dec and radius (all in degrees).
get_bricks_readonly() Returns a read-only (shared) copy of the table of bricks.
get_calib_dir() Returns the directory containing calibration data.
get_ccds(**kwargs) Returns the table of CCDs.
get_ccds_readonly() Returns a shared copy of the table of CCDs.
get_image_dir() Returns the directory containing image data.
get_image_object(t, **kwargs) Returns a DecamImage or similar object for one row of the CCDs table.
get_se_dir() Returns the directory containing SourceExtractor config files, used during calibration.
get_survey_dir() Returns the base LEGACY_SURVEY_DIR directory.
read_intermediate_catalog(brick, **kwargs) Reads the intermediate tractor catalog for the given brickname.
tims_touching_wcs(targetwcs, mp[, bands]) Creates tractor.Image objects for CCDs touching the given targetwcs region.
try_expnum_kdtree(expnum) Speeds up searches by exposure number using a kd-tree built from the ‘expnum’ column; see below.
write_output(filetype[, hashsum]) Returns a context manager for writing an output file; see below for usage.
ccds_for_fitting  
check_cache  
cleanup_ccds_table  
debug  
filter_ccd_kd_files  
get_approx_wcs  
get_ccd_kdtrees  
get_compression_string  
image_class_for_camera  
index_of_band  
sed_matched_filters  
add_hashcode(fn, hashcode)[source]

Callback to be called in the write_output routine.

bricks_touching_radec_box(bricks, ralo, rahi, declo, dechi)[source]

Returns an index vector of the bricks that touch the given RA,Dec box.

ccds_touching_wcs(wcs, **kwargs)[source]

Returns a table of the CCDs touching the given wcs region.

drop_cache()[source]

Clears all cached data contained in this object. Useful for pickling / multiprocessing.

filter_ccds_files(fns)[source]

When reading the list of CCDs, we find all files named survey-ccds-*.fits.gz, then filter that list using this function.

find_ccds(expnum=None, ccdname=None, camera=None)[source]

Returns a table of CCDs matching the given expnum (exposure number, integer), ccdname (string), and camera (string), if given.

find_file(filetype, brick=None, brickpre=None, band='%(band)s', output=False, **kwargs)[source]

Returns the filename of a Legacy Survey file.

filetype : string, type of file to find, including:

  • “tractor” – Tractor catalogs
  • “depth” – PSF depth maps
  • “galdepth” – Canonical galaxy depth maps
  • “nexp” – number-of-exposure maps

brick : string, brick name such as “0001p000”

output: True if we are about to write this file; will use self.outdir as the base directory rather than self.survey_dir.

Returns: path to the specified file (whether or not it exists).
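
For example (a sketch; the survey directory and brick name are placeholders):

from legacypipe.survey import LegacySurveyData

survey = LegacySurveyData(survey_dir='/path/to/survey-dir')
fn = survey.find_file('tractor', brick='2090m065')
print(fn)   # the expected path, whether or not the file exists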

get_annotated_ccds()[source]

Returns the annotated table of CCDs.

get_brick(brickid)[source]

Returns a brick (as one row in a table) by brickid (integer).

get_brick_by_name(brickname)[source]

Returns a brick (as one row in a table) by name (string).

get_bricks()[source]

Returns a table of bricks. The caller owns the table.

For read-only purposes, see get_bricks_readonly(), which uses a cached version.

get_bricks_near(ra, dec, radius)[source]

Returns a set of bricks near the given RA,Dec and radius (all in degrees).

get_bricks_readonly()[source]

Returns a read-only (shared) copy of the table of bricks.

get_calib_dir()[source]

Returns the directory containing calibration data.

get_ccds(**kwargs)[source]

Returns the table of CCDs.

get_ccds_readonly()[source]

Returns a shared copy of the table of CCDs.

get_image_dir()[source]

Returns the directory containing image data.

get_image_object(t, **kwargs)[source]

Returns a DecamImage or similar object for one row of the CCDs table.

get_se_dir()[source]

Returns the directory containing SourceExtractor config files, used during calibration.

get_survey_dir()[source]

Returns the base LEGACY_SURVEY_DIR directory.

read_intermediate_catalog(brick, **kwargs)[source]

Reads the intermediate tractor catalog for the given brickname.

kwargs: passed to self.find_file()

Returns (T, hdr, primhdr)

tims_touching_wcs(targetwcs, mp, bands=None, **kwargs)[source]

Creates tractor.Image objects for CCDs touching the given targetwcs region.

mp: multiprocessing object

kwargs are passed to LegacySurveyImage.get_tractor_image() and may include:

  • gaussPsf
  • pixPsf
try_expnum_kdtree(expnum)[source]

By creating a kd-tree from the ‘expnum’ column, searches for exposure numbers can be sped up:

import numpy as np
from astrometry.libkd.spherematch import *
from astrometry.util.fits import fits_table
T = fits_table('/global/cscratch1/sd/dstn/dr7-depthcut/survey-ccds-dr7.kd.fits',
               columns=['expnum'])
ekd = tree_build(np.atleast_2d(T.expnum.copy()).T.astype(float),
                 nleaf=60, bbox=False, split=True)
ekd.set_name('expnum')
ekd.write('ekd.fits')

> fitsgetext -i $CSCRATCH/dr7-depthcut/survey-ccds-dr7.kd.fits -o dr7-%02i -a -M
> fitsgetext -i ekd.fits -o ekd-%02i -a -M
> cat dr7-0* ekd-0[123456] > $CSCRATCH/dr7-depthcut+/survey-ccds-dr7.kd.fits

write_output(filetype, hashsum=True, **kwargs)[source]

Returns a context manager for writing an output file; use like:

with survey.write_output('ccds', brick=brickname) as out:
    ccds.writeto(out.fn, primheader=primhdr)

For FITS output, out.fits is a fitsio.FITS object. The file contents will actually be written in memory, and then a sha256sum computed before the file contents are written out to the real disk file. The ‘out.fn’ member variable is NOT set.

with survey.write_output('ccds', brick=brickname) as out:
    ccds.writeto(None, fits_object=out.fits, primheader=primhdr)

On entry, this:

  • calls self.find_file() to determine which filename to write to
  • ensures the output directory exists
  • appends “.tmp” to the filename

On exit, this:

  • moves the “.tmp” file to the final filename (to make the write atomic)
  • computes the sha256sum

legacypipe.survey.brick_catalog_for_radec_box(ralo, rahi, declo, dechi, survey, catpattern, bricks=None)[source]

Merges multiple Tractor brick catalogs to cover an RA,Dec bounding-box.

No cleverness with RA wrap-around; assumes ralo < rahi.

survey: LegacySurveyData object

bricks: table of bricks, eg from LegacySurveyData.get_bricks()

catpattern: filename pattern of catalog files to read, eg “pipebrick-cats/tractor-phot-%06i.fits”
legacypipe.survey.bricks_touching_wcs(targetwcs, survey=None, B=None, margin=20)[source]

Finds LegacySurvey bricks touching a given WCS header object.

Parameters:
targetwcs : astrometry.util.Tan object or similar

The region of sky to search

survey : legacypipe.survey.LegacySurveyData object

From which the brick table will be retrieved

B : FITS table

The table of brick objects to search

margin : int

Margin in pixels around the outside of the WCS

Returns:
A table (subset of B, if given) containing the bricks touching the
given WCS region + margin.
legacypipe.survey.ccd_map_image(valmap, empty=0.0)[source]

valmap: { ‘N7’ : 1., ‘N8’ : 17.8 }

Returns: a numpy image (shape (12,14)) with values mapped to their CCD locations.

legacypipe.survey.ccds_touching_wcs(targetwcs, ccds, ccdrad=None, polygons=True)[source]

targetwcs: WCS object describing the region of interest

ccds: fits_table object of CCDs

ccdrad: radius of CCDs, in degrees. If None (the default), compute from the CCDs table. (0.17 for DECam)

Returns: index array I of CCDs within range.

legacypipe.survey.get_git_version(dir=None)[source]

Runs ‘git describe’ in the current directory (or given dir) and returns the result as a string.

Parameters:
dir : string

If non-None, “cd” to the given directory before running ‘git describe’

Returns:
Git version string
legacypipe.survey.get_rgb(imgs, bands, mnmx=None, arcsinh=None, scales=None, clip=True)[source]

Given a list of images in the given bands, returns a scaled RGB image.

imgs: a list of numpy arrays, all the same size, in nanomaggies

bands: a list of strings, eg, ['g','r','z']

mnmx: (min,max), values that will become black/white after scaling. Default is (-3,10).

arcsinh: use nonlinear scaling as in SDSS

scales

Returns a (H,W,3) numpy array with values between 0 and 1.
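
A small sketch combining get_rgb() with imsave_jpeg() (documented below); the per-band images here are synthetic placeholders:

import numpy as np
from legacypipe.survey import get_rgb, imsave_jpeg

imgs = [np.random.normal(scale=0.3, size=(200, 200)) for band in 'grz']
rgb = get_rgb(imgs, ['g', 'r', 'z'])        # (H,W,3) array with values in [0,1]
imsave_jpeg('rgb.jpg', rgb, origin='lower')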

legacypipe.survey.get_version_header(program_name, survey_dir, git_version=None)[source]

Creates a fitsio header describing a DECaLS data product.

legacypipe.survey.imsave_jpeg(jpegfn, img, **kwargs)[source]

Saves an image in JPEG format. Some matplotlib installations (notably at NERSC) don’t support JPEG, so we write to PNG and then convert to JPEG using the venerable netpbm tools.

jpegfn: JPEG filename

img: image, in the typical matplotlib formats (see plt.imsave)

legacypipe.survey.radec_at_mjd(ra, dec, ref_year, pmra, pmdec, parallax, mjd)[source]

Units (matching Gaia DR1/DR2):

  • pmra, pmdec are in mas/yr; pmra is an angular speed (ie, includes a cos(dec) factor)
  • parallax is in mas

NOTE: does not broadcast completely correctly – all params vectors or all motion params vector + scalar mjd work fine. Other combos: not certain.

Returns RA,Dec

legacypipe.survey.switch_to_soft_ellipses(cat)[source]

Converts our softened-ellipticity EllipseESoft parameters into normal EllipseE ellipses.

cat: an iterable of tractor Sources, which will be modified
in-place.
legacypipe.survey.wcs_for_brick(b, W=3600, H=3600, pixscale=0.262)[source]

Returns an astrometry.net style Tan WCS object for a given brick object.

b: row from the survey-bricks.fits file

W,H: size in pixels

pixscale: pixel scale in arcsec/pixel

Returns: Tan wcs object
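
For example (a sketch; the survey directory is a placeholder):

from legacypipe.survey import LegacySurveyData, wcs_for_brick

survey = LegacySurveyData(survey_dir='/path/to/survey-dir')
brick = survey.get_brick_by_name('2090m065')
wcs = wcs_for_brick(brick, W=3600, H=3600, pixscale=0.262)
print(wcs.radec_center())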

legacypipe.utils module

exception legacypipe.utils.NothingToDoError[source]

Bases: legacypipe.utils.RunbrickError

exception legacypipe.utils.RunbrickError[source]

Bases: exceptions.RuntimeError

legacypipe.oneblob module

class legacypipe.oneblob.SourceModels[source]

Bases: object

This class maintains a list of the model patches for a set of sources in a set of images.

Methods

add(i, tims) Adds the models for source i back into the tims.
create(tims, srcs[, subtract]) Note that this modifies the tims if subtract=True.
model_masks  
restore_images  
save_images  
update_and_subtract  
add(i, tims)[source]

Adds the models for source i back into the tims.

create(tims, srcs, subtract=False)[source]

Note that this modifies the tims if subtract=True.

legacypipe.oneblob.one_blob(X)[source]

Fits sources contained within a “blob” of pixels.

legacypipe.image module

class legacypipe.image.LegacySurveyImage(survey, ccd)[source]

Bases: object

A base class containing common code for the images we handle.

You probably shouldn’t need to directly instantiate this class, but rather use the recipe described in the __init__ method.

Objects of this class represent the metadata we have on an image, and are used to handle some of the details of going from an entry in the CCDs table to a tractor Image object.

Attributes:
shape

Returns the full shape of the image, (H,W).

Methods

funpack_files(imgfn, maskfn, hdu, todelete) Source Extractor can’t handle .fz files, so unpack them.
get_cacheable_filename_variables() These are names of self.X variables that are filenames that could be cached.
get_good_image_slice(extent[, get_extent]) extent = None or extent = [x0,x1,y0,y1]
get_good_image_subregion() Returns x0,x1,y0,y1 of the good region of this chip, or None if no cut should be applied to that edge; returns (None,None,None,None) if the whole chip is good.
get_image_extent([wcs, slc, radecpoly]) Returns x0,x1,y0,y1,slc
get_image_shape() Returns image shape H,W.
get_sky_sig1([splinesky]) Returns the per-pixel noise estimate, which (for historical reasons) is stored in the sky model.
get_tractor_image([slc, radecpoly, …]) Returns a tractor.Image (“tim”) object for this image.
read_dq(**kwargs) Reads the Data Quality (DQ) mask image.
read_image(**kwargs) Reads the image file from disk.
read_image_header(**kwargs) Reads the FITS image header from self.imgfn HDU self.hdu.
read_image_primary_header(**kwargs) Reads the FITS primary (HDU 0) header from self.imgfn.
read_invvar(**kwargs) Reads the inverse-variance (weight) map image.
read_invvar_clipped([clip, clipThresh]) A function that can optionally be called by subclassers for read_invvar, clipping fpack artifacts to zero.
read_primary_header(fn) Reads the FITS primary header (HDU 0) from the given filename.
read_sky_model([splinesky, slc]) Reads the sky model, returning a Tractor Sky object.
remap_dq(dq, header) Called by get_tractor_image() to map the results from read_dq into a bitmask.
remap_invvar_shotnoise(invvar, primhdr, img, dq)
run_calibs([psfex, sky, se, fcopy, …]) Run calibration pre-processing steps.
check_for_cached_files  
check_image_header  
compute_filenames  
galaxy_norm  
get_sig1  
get_tractor_wcs  
get_wcs  
psf_norm  
read_psf_model  
remap_dq_cp_codes  
remap_invvar  
run_psfex  
run_se  
run_sky  
funpack_files(imgfn, maskfn, hdu, todelete)[source]

Source Extractor can’t handle .fz files, so unpack them.

get_cacheable_filename_variables()[source]

These are names of self.X variables that are filenames that could be cached.

get_good_image_slice(extent, get_extent=False)[source]

extent = None or extent = [x0,x1,y0,y1]

If get_extent = True, returns the new [x0,x1,y0,y1] extent.

Returns a new pair of slices, or extent if the whole image is good.

get_good_image_subregion()[source]

Returns x0,x1,y0,y1 of the good region of this chip, or None if no cut should be applied to that edge; returns (None,None,None,None) if the whole chip is good.

This cut is applied in addition to any masking in the mask or invvar map.

get_image_extent(wcs=None, slc=None, radecpoly=None)[source]

Returns x0,x1,y0,y1,slc

get_image_shape()[source]

Returns image shape H,W.

get_sky_sig1(splinesky=False)[source]

Returns the per-pixel noise estimate, which (for historical reasons) is stored in the sky model. NOTE that this is in image pixel counts, NOT calibrated nanomaggies.

get_tractor_image(slc=None, radecpoly=None, gaussPsf=False, pixPsf=False, hybridPsf=False, normalizePsf=False, splinesky=False, apodize=False, nanomaggies=True, subsky=True, tiny=10, dq=True, invvar=True, pixels=True, no_remap_invvar=False, constant_invvar=False)[source]

Returns a tractor.Image (“tim”) object for this image.

Options describing a subimage to return:

  • slc: y,x slice objects
  • radecpoly: numpy array, shape (N,2), RA,Dec polygon describing
    bounding box to select.

Options determining the PSF model to use:

  • gaussPsf: single circular Gaussian PSF based on header FWHM value.
  • pixPsf: pixelized PsfEx model.
  • hybridPsf: combo pixelized PsfEx + Gaussian approx.

Options determining the sky model to use:

  • splinesky: median filter chunks of the image, then spline those.

Options determining the units of the image:

  • nanomaggies: convert the image to be in units of NanoMaggies; tim.zpscale contains the scale value the image was divided by.
  • subsky: instantiate and subtract the initial sky model, leaving a constant zero sky model?
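
A sketch of going from a row of the CCDs table to a tim (the survey directory, exposure number, and CCD name are placeholders):

from legacypipe.survey import LegacySurveyData

survey = LegacySurveyData(survey_dir='/path/to/survey-dir')
ccds = survey.find_ccds(expnum=123456, ccdname='N4')
im = survey.get_image_object(ccds[0])
tim = im.get_tractor_image(pixPsf=True, splinesky=True)
print(tim.shape)
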
read_dq(**kwargs)[source]

Reads the Data Quality (DQ) mask image.

read_image(**kwargs)[source]

Reads the image file from disk.

The image is read from FITS file self.imgfn HDU self.hdu.

Parameters:
slice : slice, optional

2-dimensional slice of the subimage to read.

header : boolean, optional

Return the image header also, as tuple (image, header) ?

Returns:
image : numpy array

The image pixels.

(image, header) : (numpy array, fitsio header)

If header = True.

read_image_header(**kwargs)[source]

Reads the FITS image header from self.imgfn HDU self.hdu.

Returns:
header : fitsio header

The FITS header

read_image_primary_header(**kwargs)[source]

Reads the FITS primary (HDU 0) header from self.imgfn.

Returns:
primary_header : fitsio header

The FITS header

read_invvar(**kwargs)[source]

Reads the inverse-variance (weight) map image.

read_invvar_clipped(clip=True, clipThresh=0.01, **kwargs)[source]

A function that can optionally be called by subclassers for read_invvar, clipping fpack artifacts to zero.

read_primary_header(fn)[source]

Reads the FITS primary header (HDU 0) from the given filename. This is just a faster version of fitsio.read_header(fn).

read_sky_model(splinesky=False, slc=None, **kwargs)[source]

Reads the sky model, returning a Tractor Sky object.

remap_dq(dq, header)[source]

Called by get_tractor_image() to map the results from read_dq into a bitmask.

run_calibs(psfex=True, sky=True, se=False, fcopy=False, use_mask=True, force=False, git_version=None, splinesky=False)[source]

Run calibration pre-processing steps.

shape

Returns the full shape of the image, (H,W).

legacypipe.image.remap_dq_cp_codes(dq)[source]

Some versions of the CP use integer codes, not bit masks. This converts them.

legacypipe.decam module

class legacypipe.decam.DecamImage(survey, t)[source]

Bases: legacypipe.image.LegacySurveyImage

A LegacySurveyImage subclass to handle images from the Dark Energy Camera, DECam, on the Blanco telescope.

Attributes:
glowmjd
shape

Returns the full shape of the image, (H,W).

Methods

funpack_files(imgfn, maskfn, hdu, todelete) Source Extractor can’t handle .fz files, so unpack them.
get_cacheable_filename_variables() These are names of self.X variables that are filenames that could be cached.
get_good_image_slice(extent[, get_extent]) extent = None or extent = [x0,x1,y0,y1]
get_image_extent([wcs, slc, radecpoly]) Returns x0,x1,y0,y1,slc
get_image_shape() Returns image shape H,W.
get_sky_sig1([splinesky]) Returns the per-pixel noise estimate, which (for historical reasons) is stored in the sky model.
get_tractor_image([slc, radecpoly, …]) Returns a tractor.Image (“tim”) object for this image.
read_dq(**kwargs) Reads the Data Quality (DQ) mask image.
read_image(**kwargs) Reads the image file from disk.
read_image_header(**kwargs) Reads the FITS image header from self.imgfn HDU self.hdu.
read_image_primary_header(**kwargs) Reads the FITS primary (HDU 0) header from self.imgfn.
read_invvar_clipped([clip, clipThresh]) A function that can optionally be called by subclassers for read_invvar, clipping fpack artifacts to zero.
read_primary_header(fn) Reads the FITS primary header (HDU 0) from the given filename.
read_sky_model([splinesky, slc]) Reads the sky model, returning a Tractor Sky object.
remap_dq(dq, hdr) Called by get_tractor_image() to map the results from read_dq into a bitmask.
remap_invvar_shotnoise(invvar, primhdr, img, dq)
run_calibs([psfex, sky, se, fcopy, …]) Run calibration pre-processing steps.
check_for_cached_files  
check_image_header  
compute_filenames  
galaxy_norm  
get_good_image_subregion  
get_sig1  
get_tractor_wcs  
get_wcs  
glowmjd  
psf_norm  
read_invvar  
read_psf_model  
remap_dq_cp_codes  
remap_invvar  
run_psfex  
run_se  
run_sky  
get_good_image_subregion()[source]

Returns x0,x1,y0,y1 of the good region of this chip, or None if no cut should be applied to that edge; returns (None,None,None,None) if the whole chip is good.

This cut is applied in addition to any masking in the mask or invvar map.

read_invvar(**kwargs)[source]

Reads the inverse-variance (weight) map image.

remap_dq(dq, hdr)[source]

Called by get_tractor_image() to map the results from read_dq into a bitmask.

legacypipe.bok

class legacypipe.bok.BokImage(survey, t)[source]

Bases: legacypipe.image.LegacySurveyImage

Class for handling images from the 90prime camera processed by the NOAO Community Pipeline.

Attributes:
shape

Returns the full shape of the image, (H,W).

Methods

funpack_files(imgfn, maskfn, hdu, todelete) Source Extractor can’t handle .fz files, so unpack them.
get_cacheable_filename_variables() These are names of self.X variables that are filenames that could be cached.
get_good_image_slice(extent[, get_extent]) extent = None or extent = [x0,x1,y0,y1]
get_good_image_subregion() Returns x0,x1,y0,y1 of the good region of this chip, or None if no cut should be applied to that edge; returns (None,None,None,None) if the whole chip is good.
get_image_extent([wcs, slc, radecpoly]) Returns x0,x1,y0,y1,slc
get_image_shape() Returns image shape H,W.
get_sky_sig1([splinesky]) Returns the per-pixel noise estimate, which (for historical reasons) is stored in the sky model.
get_tractor_image([slc, radecpoly, …]) Returns a tractor.Image (“tim”) object for this image.
read_image(**kwargs) Reads the image file from disk.
read_image_header(**kwargs) Reads the FITS image header from self.imgfn HDU self.hdu.
read_image_primary_header(**kwargs) Reads the FITS primary (HDU 0) header from self.imgfn.
read_invvar_clipped([clip, clipThresh]) A function that can optionally be called by subclassers for read_invvar, clipping fpack artifacts to zero.
read_primary_header(fn) Reads the FITS primary header (HDU 0) from the given filename.
read_sky_model([splinesky, slc]) Reads the sky model, returning a Tractor Sky object.
remap_dq(dq, header) Called by get_tractor_image() to map the results from read_dq into a bitmask.
remap_invvar_shotnoise(invvar, primhdr, img, dq)
run_calibs([psfex, sky, se, fcopy, …]) Run calibration pre-processing steps.
check_for_cached_files  
check_image_header  
compute_filenames  
galaxy_norm  
get_sig1  
get_tractor_wcs  
get_wcs  
psf_norm  
read_dq  
read_invvar  
read_psf_model  
remap_dq_cp_codes  
remap_invvar  
run_psfex  
run_se  
run_sky  
read_dq(slice=None, header=False, **kwargs)[source]

Reads the Data Quality (DQ) mask image.

read_invvar(**kwargs)[source]

Reads the inverse-variance (weight) map image.

legacypipe.ptf

class legacypipe.ptf.PtfImage(survey, t)[source]

Bases: legacypipe.image.LegacySurveyImage

A LegacySurveyImage subclass to handle images from the Palomar Transient Factory (PTF) camera.

Attributes:
shape

Returns the full shape of the image, (H,W).

Methods

funpack_files(imgfn, maskfn, hdu, todelete) Source Extractor can’t handle .fz files, so unpack them.
get_cacheable_filename_variables() These are names of self.X variables that are filenames that could be cached.
get_good_image_slice(extent[, get_extent]) extent = None or extent = [x0,x1,y0,y1]
get_image_extent([wcs, slc, radecpoly]) Returns x0,x1,y0,y1,slc
get_image_shape() Returns image shape H,W.
get_sky_sig1([splinesky]) Returns the per-pixel noise estimate, which (for historical reasons) is stored in the sky model.
get_tractor_image([slc, radecpoly, …]) Returns a tractor.Image (“tim”) object for this image.
get_wcs()
read_image(**kwargs) returns tuple of img,hdr
read_image_header(**kwargs) Reads the FITS image header from self.imgfn HDU self.hdu.
read_image_primary_header(**kwargs) Reads the FITS primary (HDU 0) header from self.imgfn.
read_invvar_clipped([clip, clipThresh]) A function that can optionally be called by subclassers for read_invvar, clipping fpack artifacts to zero.
read_primary_header(fn) Reads the FITS primary header (HDU 0) from the given filename.
read_pv_wcs() extract wcs from fits header directly
remap_dq(dq, header) Called by get_tractor_image() to map the results from read_dq into a bitmask.
remap_invvar_shotnoise(invvar, primhdr, img, dq)
check_for_cached_files  
check_image_header  
compute_filenames  
galaxy_norm  
get_good_image_subregion  
get_sig1  
get_tractor_wcs  
psf_norm  
read_dq  
read_invvar  
read_psf_model  
read_sky_model  
remap_dq_cp_codes  
remap_invvar  
run_calibs  
run_psfex  
run_se  
run_sky  
get_good_image_subregion()[source]

Returns x0,x1,y0,y1 of the good region of this chip, or None if no cut should be applied to that edge; returns (None,None,None,None) if the whole chip is good.

This cut is applied in addition to any masking in the mask or invvar map.

get_tractor_image(slc=None, radecpoly=None, gaussPsf=False, const2psf=False, pixPsf=False, splinesky=False, nanomaggies=True, subsky=True, tiny=5, dq=True, invvar=True, pixels=True)[source]

Returns a tractor.Image (“tim”) object for this image.

Options describing a subimage to return:

  • slc: y,x slice objects
  • radecpoly: numpy array, shape (N,2), RA,Dec polygon describing bounding box to select.

Options determining the PSF model to use:

  • gaussPsf: single circular Gaussian PSF based on header FWHM value.
  • const2psf: 2-component general Gaussian fit to the PsfEx model at the image center.
  • pixPsf: pixelized PsfEx model at image center.

Options determining the sky model to use:

  • splinesky: median filter chunks of the image, then spline those.

Options determining the units of the image:

  • nanomaggies: convert the image to be in units of NanoMaggies; tim.zpscale contains the scale value the image was divided by.
  • subsky: instantiate and subtract the initial sky model, leaving a constant zero sky model?
read_dq(**kwargs)[source]

Reads the Data Quality (DQ) mask image.

read_image(**kwargs)[source]

returns tuple of img,hdr

read_invvar(clip=False, clipThresh=0.2, **kwargs)[source]

Reads the inverse-variance (weight) map image.

read_pv_wcs()[source]

extract wcs from fits header directly

read_sky_model(**kwargs)[source]

Reads the sky model, returning a Tractor Sky object.

run_calibs(psfex=True, sky=True, funpack=False, git_version=None, force=False, **kwargs)[source]

Run calibration pre-processing steps.

legacypipe.ptf.read_dq(dqfn, hdu)[source]

Returns the bit mask which Tractor calls the “data quality” image.

PTF DMASK BIT DEFINITIONS:

  • BIT00 = 0 / AIRCRAFT/SATELLITE TRACK
  • BIT01 = 1 / OBJECT (detected by SExtractor)
  • BIT02 = 2 / HIGH DARK-CURRENT
  • BIT03 = 3 / RESERVED FOR FUTURE USE
  • BIT04 = 4 / NOISY
  • BIT05 = 5 / GHOST
  • BIT06 = 6 / CCD BLEED
  • BIT07 = 7 / RAD HIT
  • BIT08 = 8 / SATURATED
  • BIT09 = 9 / DEAD/BAD
  • BIT10 = 10 / NAN (not a number)
  • BIT11 = 11 / DIRTY (10-sigma below coarse local median)
  • BIT12 = 12 / HALO
  • BIT13 = 13 / RESERVED FOR FUTURE USE
  • BIT14 = 14 / RESERVED FOR FUTURE USE
  • BIT15 = 15 / RESERVED FOR FUTURE USE
  • INFOBITS = 0 / Database infobits (2^2 and 2^3 excluded)

legacypipe.ptf.read_image(imgfn, hdu)[source]

return gain*pixel DN as numpy array

legacypipe.mosaic

class legacypipe.mosaic.MosaicImage(survey, t)[source]

Bases: legacypipe.image.LegacySurveyImage

Class for handling images from the Mosaic3 camera processed by the NOAO Community Pipeline.

Attributes:
shape

Returns the full shape of the image, (H,W).

Methods

funpack_files(imgfn, maskfn, hdu, todelete) Source Extractor can’t handle .fz files, so unpack them.
get_cacheable_filename_variables() These are names of self.X variables that are filenames that could be cached.
get_good_image_slice(extent[, get_extent]) extent = None or extent = [x0,x1,y0,y1]
get_good_image_subregion() Returns x0,x1,y0,y1 of the good region of this chip, or None if no cut should be applied to that edge; returns (None,None,None,None) if the whole chip is good.
get_image_extent([wcs, slc, radecpoly]) Returns x0,x1,y0,y1,slc
get_image_shape() Returns image shape H,W.
get_sky_sig1([splinesky]) Returns the per-pixel noise estimate, which (for historical reasons) is stored in the sky model.
get_tractor_image([slc, radecpoly, …]) Returns a tractor.Image (“tim”) object for this image.
read_dq(**kwargs) Reads the Data Quality (DQ) mask image.
read_image(**kwargs) Reads the image file from disk.
read_image_header(**kwargs) Reads the FITS image header from self.imgfn HDU self.hdu.
read_image_primary_header(**kwargs) Reads the FITS primary (HDU 0) header from self.imgfn.
read_invvar(**kwargs) Reads the inverse-variance (weight) map image.
read_invvar_clipped([clip, clipThresh]) A function that can optionally be called by subclassers for read_invvar, clipping fpack artifacts to zero.
read_primary_header(fn) Reads the FITS primary header (HDU 0) from the given filename.
read_sky_model([splinesky, slc]) Reads the sky model, returning a Tractor Sky object.
remap_dq(dq, header) Called by get_tractor_image() to map the results from read_dq into a bitmask.
remap_invvar_shotnoise(invvar, primhdr, img, dq)
run_calibs([psfex, sky, se, fcopy, …]) Run calibration pre-processing steps.
check_for_cached_files  
check_image_header  
compute_filenames  
galaxy_norm  
get_sig1  
get_tractor_wcs  
get_wcs  
psf_norm  
read_psf_model  
remap_dq_cp_codes  
remap_invvar  
run_psfex  
run_se  
run_sky  
