2. Maps

2.1. Prepare the data

The following procedure is explained in Getting Started.

You start by creating a file events.txt that lists the photon data files you downloaded. Note that in this case the Fermi LAT data server generated nine files, each only about 1 MB (megabyte) in size, because there are only a few events above 10 GeV:

$ ls -1 *_PH??.fits > events.txt
$ du -hs *_PH??.fits
968K  L1309071835220B976F4330_PH00.fits
1004K L1309071835220B976F4330_PH01.fits
1.1M  L1309071835220B976F4330_PH02.fits
1.2M  L1309071835220B976F4330_PH03.fits
1.0M  L1309071835220B976F4330_PH04.fits
1.1M  L1309071835220B976F4330_PH05.fits
1.0M  L1309071835220B976F4330_PH06.fits
840K  L1309071835220B976F4330_PH07.fits
808K  L1309071835220B976F4330_PH08.fits

Now run the following commands in sequence. gtselect takes just a few seconds, gtmktime a few minutes, and gtltcube a few hours ... so we suggest you copy the file gtltcube.fits from the solutions folder so that you can continue quickly.

$ gtselect infile=@events.txt outfile=gtselect.fits \
  ra=INDEF dec=INDEF rad=INDEF tmin=INDEF tmax=INDEF \
  emin=10000 emax=1000000 zmax=100 evclass=2

$ gtmktime scfile=../../spacecraft.fits evfile=gtselect.fits \
  filter="DATA_QUAL==1&&LAT_CONFIG==1&&ABS(ROCK_ANGLE)<52" \
  roicut=yes outfile=gtmktime.fits

$ gtltcube evfile=gtmktime.fits scfile=../../spacecraft.fits \
  outfile=gtltcube.fits dcostheta=0.025 binsz=1

2.2. Make a counts cube and image

Run gtbin to make a counts image:

$ gtbin
This is gtbin version ScienceTools-v9r31p1-fssc-20130410
Type of output file (CCUBE|CMAP|LC|PHA1|PHA2|HEALPIX) [] CMAP
Event data file name[] gtmktime.fits
Output file name[] counts_image.fits
Spacecraft data file name[] ../../spacecraft.fits
Size of the X axis in pixels[] 700
Size of the Y axis in pixels[] 700
Image scale (in degrees/pixel)[] 0.1
Coordinate system (CEL - celestial, GAL -galactic) (CEL|GAL) [] GAL
First coordinate of image center in degrees (RA or galactic l)[] 0
Second coordinate of image center in degrees (DEC or galactic b)[] 0
Rotation angle of image axis, in degrees[] 0
Projection method e.g. AIT|ARC|CAR|GLS|MER|NCP|SIN|STG|TAN:[] TAN

Use ds9 to look at it:

$ ds9 -cmap b -scale sqrt counts_image.fits
[Image: ds9 view of the counts image (galactic_center/ds9_gc.png)]

2.3. Compute an exposure cube

Use gtexpcube2 to compute a binned exposure map from the livetime cube gtltcube.fits produced above.
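A plausible invocation is sketched below; the output file name is our choice, and the irfs value is an assumption that must match the event class selected with gtselect (evclass=2, i.e. Pass 7 source class):

```shell
$ gtexpcube2 infile=gtltcube.fits cmap=counts_image.fits \
    outfile=exposure_cube.fits irfs=P7SOURCE_V6
```

Passing the counts image as cmap makes the exposure map use the same sky geometry as the counts image.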

2.4. Compute a diffuse model image

Use gtmodel to compute a model counts image for the Galactic and isotropic diffuse emission. First link the diffuse model files into the working directory:

$ ln -s $FERMI_DIR/refdata/fermi/galdiffuse/gal_2yearp7v6_v0.fits .
$ ln -s $FERMI_DIR/refdata/fermi/galdiffuse/iso_p7v6source.txt .
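gtmodel works from source maps, which gtsrcmaps computes from an XML source model. A sketch of the sequence, assuming an XML model file diffuse_model.xml that references the two linked diffuse files, a counts cube counts_cube.fits (made with gtbin using CCUBE instead of CMAP), and the exposure cube from the previous step (all three names are our assumptions):

```shell
$ gtsrcmaps scfile=../../spacecraft.fits expcube=gtltcube.fits \
    cmap=counts_cube.fits srcmdl=diffuse_model.xml \
    bexpmap=exposure_cube.fits outfile=srcmaps.fits irfs=P7SOURCE_V6
$ gtmodel srcmaps=srcmaps.fits srcmdl=diffuse_model.xml \
    outfile=model.fits irfs=P7SOURCE_V6 expcube=gtltcube.fits \
    bexpmap=exposure_cube.fits
```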

2.5. Compute an excess and significance image

The following Python script does this: it tophat-correlates the counts and model images, then applies the Li & Ma formula to compute a significance image.

"""Compute correlated source excess and significance maps.

Christoph Deil, 2013-09-12
"""
import numpy as np
from numpy import sign, sqrt, log
from scipy.ndimage import convolve
import pyfits

def correlate_image(image, radius):
    """Correlate image with circular mask of a given radius.
    
    This is also called "tophat correlation" and it means that
    the value of a given pixel in the output image is the
    sum of all pixel values in the input image within a given circular radius.
    
    https://gammapy.readthedocs.org/en/latest/_generated/gammapy.image.utils.tophat_correlate.html
    """
    radius = int(radius)
    y, x = np.mgrid[-radius: radius + 1, -radius: radius + 1]
    structure = x ** 2 + y ** 2 <= radius ** 2
    return convolve(image, structure, mode='constant')

def significance_lima(n_observed, mu_background):
    """Compute Li & Ma significance.
    
    https://gammapy.readthedocs.org/en/latest/_generated/gammapy.stats.poisson.significance.html
    """
    term_a = sign(n_observed - mu_background) * sqrt(2)
    term_b = sqrt(n_observed * log(n_observed / mu_background) - n_observed + mu_background)
    return term_a * term_b


if __name__ == '__main__':
    counts = pyfits.getdata('counts.fits')
    model = pyfits.getdata('model.fits')

    radius = 5  # correlation circle radius in pixels (0.5 deg at 0.1 deg / pixel)
    correlated_counts = correlate_image(counts, radius)
    correlated_model = correlate_image(model, radius)
    
    excess = correlated_counts - correlated_model
    significance = significance_lima(correlated_counts, correlated_model)
    
    header = pyfits.getheader('counts.fits')
    pyfits.writeto('excess.fits', excess, header, clobber=True)
    pyfits.writeto('significance.fits', significance, header, clobber=True)
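As a sanity check on the statistics: for a large background and a small excess, the Li & Ma expression used above reduces to the familiar excess / sqrt(background) estimate. A self-contained sketch (a standalone reimplementation of significance_lima using only the standard library):

```python
from math import log, sqrt, copysign

def significance_lima(n_observed, mu_background):
    """Likelihood-ratio significance for a perfectly known background."""
    term = n_observed * log(n_observed / mu_background) - n_observed + mu_background
    return copysign(sqrt(2.0 * term), n_observed - mu_background)

# Large background, small excess: 80 excess counts on a background of 10000.
# The naive estimate excess / sqrt(background) gives 80 / 100 = 0.8.
print(round(significance_lima(10080, 10000), 3))  # → 0.799
```

A deficit (n_observed below mu_background) comes out with a negative sign, which is why the excess and significance maps can be inspected together in ds9.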
