4. Compute an excess and significance image with Python

We would like to compute correlated excess (= counts - background) and significance images of the sources detected by the Fermi LAT above 10 GeV in the inner part of the Galactic plane, similar to the image from the Fermi publication that we showed previously.

Note

What is statistical significance and how can I compute it?

The statistical significance of an excess quantifies how unlikely it is that the observed counts are just a chance (Poisson) fluctuation of the expected background. It is usually expressed in Gaussian standard deviations ("sigma"); in gamma-ray astronomy roughly 5 sigma is the conventional threshold for claiming a detection. The script below computes it per pixel with the Li & Ma formula. See also http://en.wikipedia.org/wiki/Statistical_significance
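
As a small worked example (the numbers here are made up for illustration, not taken from the tutorial data): with n = 25 observed counts where the background model predicts mu = 10, the formula used in the script below gives roughly a 4 sigma excess.

from numpy import sign, sqrt, log

# Hypothetical example values: 25 observed counts, 10 expected background counts
n_observed, mu_background = 25.0, 10.0
significance = sign(n_observed - mu_background) * sqrt(2) * \
    sqrt(n_observed * log(n_observed / mu_background) - n_observed + mu_background)
print(significance)  # prints approximately 3.98, i.e. a ~4 sigma excess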

This functionality is not readily available as a command line Fermi Science Tool.

It would be possible to do it using the FTOOLs fgauss and ftimgcalc.

Instead of using these command line FTOOLs let’s use a Python script make_source_images.py:

"""Compute correlated source excess and significance maps.

Christoph Deil, 2013-09-12
"""
import numpy as np
from numpy import sign, sqrt, log
from scipy.ndimage import convolve
import pyfits

def correlate_image(image, radius):
    """Correlate image with circular mask of a given radius.
    
    This is also called "tophat correlation" and it means that
    the value of a given pixel in the output image is the
    sum of all pixel values in the input image within a given circular radius.
    
    https://gammapy.readthedocs.org/en/latest/_generated/gammapy.image.utils.tophat_correlate.html
    """
    radius = int(radius)
    y, x = np.mgrid[-radius: radius + 1, -radius: radius + 1]
    structure = x ** 2 + y ** 2 <= radius ** 2
    return convolve(image, structure, mode='constant')

def significance_lima(n_observed, mu_background):
    """Compute Li & Ma significance.
    
    https://gammapy.readthedocs.org/en/latest/_generated/gammapy.stats.poisson.significance.html
    """
    term_a = sign(n_observed - mu_background) * sqrt(2)
    term_b = sqrt(n_observed * log(n_observed / mu_background) - n_observed + mu_background)
    return term_a * term_b


if __name__ == '__main__':
    print('Reading count_image.fits')
    counts = pyfits.getdata('count_image.fits')
    print('Reading model_image.fits')
    model = pyfits.getdata('model_image.fits')

    radius = 5  # correlation circle radius in pixels
    correlated_counts = correlate_image(counts, radius)
    correlated_model = correlate_image(model, radius)
    
    excess = correlated_counts - correlated_model
    significance = significance_lima(correlated_counts, correlated_model)
    
    header = pyfits.getheader('count_image.fits')
    print('Writing excess.fits')
    pyfits.writeto('excess.fits', excess, header, clobber=True)
    print('Writing significance.fits')
    pyfits.writeto('significance.fits', significance, header, clobber=True)

The script reads the counts image (count_image.fits) and the background model image (model_image.fits) produced in the previous step, correlates both with a circular (tophat) mask of radius 5 pixels, computes the correlated excess = correlated counts - correlated model, and turns the correlated counts and model into a Li & Ma significance for each pixel. Both result images are written to excess.fits and significance.fits, re-using the FITS header (and thus the sky coordinate system) of the counts image.
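
To get a feeling for what the tophat correlation does, here is a small toy example (the array and radius are made up for illustration; it assumes make_source_images.py is in the current directory so that correlate_image can be imported):

import numpy as np
from make_source_images import correlate_image

# Toy image: a single pixel of value 1 in the centre of a 5 x 5 array
image = np.zeros((5, 5))
image[2, 2] = 1.0

# With radius=1 the circular mask covers the centre pixel and its four
# direct neighbours, so exactly those five output pixels get the value 1
print(correlate_image(image, radius=1))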

Run it by typing:

$ python make_source_images.py
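
If count_image.fits and model_image.fits from the previous step are in the working directory, the script's own print statements should show something like:

Reading count_image.fits
Reading model_image.fits
Writing excess.fits
Writing significance.fits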

Note

Exercise: Open up the excess.fits and significance.fits images and see if the values roughly make sense.
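
One way to do a quick sanity check from Python (a sketch, assuming the two output files from the script above are in the current directory):

import numpy as np
import pyfits

excess = pyfits.getdata('excess.fits')
significance = pyfits.getdata('significance.fits')

# Pixels with zero counts can produce NaN in the significance formula, so use
# the NaN-ignoring min / max. Most of the map should scatter around zero, with
# large positive significance only at the positions of real sources.
print('Excess       min / max:', np.nanmin(excess), np.nanmax(excess))
print('Significance min / max:', np.nanmin(significance), np.nanmax(significance))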
