
C.4 Applying Systematics

The principle for applying systematics in the Feldman-Cousins procedure is simple: in each fake experiment, pick a random value for each systematic uncertainty and incorporate these shifts into the pseudo-experiment. For systematics like normalization this is very simple no matter how the pseudo-experiments are produced: simply scale the prediction before fluctuating or sampling (essentially, scaling the number of events drawn). For others it can be more complicated; they are described in Table C.1.
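As a concrete illustration, the normalization case can be sketched as follows. This is a minimal Python sketch, not the analysis code: the 4% width, the rng object, and the function and variable names are assumptions made for illustration only. It draws one random normalization shift per pseudo-experiment and applies it both ways, by scaling the prediction before Poisson-fluctuating it (old method) and by scaling the mean number of events drawn (new method).

import numpy as np

rng = np.random.default_rng()

def pseudo_experiment_normalization(prediction, mean_events_to_draw, sigma_norm=0.04):
    # Pick one random value of the normalization systematic for this fake experiment.
    shift = 1.0 + rng.normal(0.0, sigma_norm)

    # Old (fluctuated-histogram) method: scale the prediction, then fluctuate each bin.
    fluctuated_hist = rng.poisson(prediction * shift)

    # New (event-drawing) method: scale the number of events drawn for all samples.
    n_to_draw = rng.poisson(mean_events_to_draw * shift)

    return fluctuated_hist, n_to_draw

In the real procedure the same drawn shift is, of course, used consistently throughout a single pseudo-experiment.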

Since the normalization systematic was easy to implement in both methods, it was used as a cross-check to make sure the systematics were being applied properly. The resulting surfaces (four in all: the new and old methods, each with and without normalization) are shown in Figure C.3. Transitions were used as a test since the 1-dimensional surfaces are easier to compare.

1 Reconstructed energy is used for the neutral currents, since this defines their behavior in the detectors, and the true energy can vary wildly depending on how much energy is taken away by the exiting neutrino.

Systematic | Size | ND Method | FD Method
Normalization | 4% | – | Scale the number of events drawn for all samples.
Backgrounds | 50% | Scale the weight of each background event. | Scale the number of events drawn for the background samples.
Track Energy Range | 2% | Modify the energy component before adding to the reconstructed energy histograms. | Same as ND.
Overall Shower Energy | 10% | Modify the energy component before adding to the reconstructed energy histograms. | Same as ND.
Track Energy Curvature | 4% | – | FD only.
Relative Shower Energy | 3.3% | – | FD only.
NuMuBar Cross-Section | 1σ | Scale the weight of each event in true energy as shown in Figure C.4. | Use the weights in true energy shown in Figure C.4 to scale acceptance.
SKZP | 1σ | Scale the weight of each event using its associated error. | Use each event's error to scale its acceptance (described below).
Decay Pipe | 37% | Scale the weight of decay pipe events. | Scale the acceptance of decay pipe events.

Table C.1: Table of systematics and how they are implemented.
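For the weight-based rows of Table C.1 (the NuMuBar cross-section and SKZP systematics), the Near Detector application amounts to reweighting each event in true energy. The sketch below shows this under stated assumptions: error_curve stands in for the parameterized error of Figure C.4, n_sigma is the random shift drawn for the pseudo-experiment, and the Event class is a hypothetical stand-in for a Monte Carlo event.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Event:
    true_energy: float  # GeV
    weight: float

def reweight_in_true_energy(events: List[Event],
                            error_curve: Callable[[float], float],
                            n_sigma: float) -> None:
    # Scale the weight of each ND event by the fractional error at its
    # true energy (cf. Figure C.4) times the randomly drawn shift.
    for ev in events:
        ev.weight *= 1.0 + n_sigma * error_curve(ev.true_energy)

In the Far Detector, where events are drawn unweighted, the same weights are instead used to scale the acceptance, as described below.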

As you can see, the systematics had a clear effect, and the new and old methods are consistent with each other both with and without systematics.

Systematics that would be applied by changing the weights of events in a high-statistics histogram (like the Near Detector) need some care in their application to the Far Detector, where only a small number of unweighted events are used. It is easier when the events being weighted can be treated as a group (i.e., the background scales); the total number of those events can simply be scaled. When the scaling needs to happen on an event-by-event basis, care must be taken to account for the two separate effects of the systematic: changing the likelihood of having that particular event and changing the total number of events. If an event is drawn that needs to be weighted, two rejection probabilities are calculated: one unmodified, and one systematically shifted up or down. The event is rejected based on the shifted probability. If the event would have been accepted otherwise but is rejected because of the systematic, the total number of events drawn for this pseudo-experiment is reduced by one. Conversely, if an event that would have been rejected is accepted because of this systematic, the number of events drawn is increased by one.
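A minimal sketch of this accept/reject bookkeeping is given below. It is illustrative only: the function name and arguments are invented, and using the same random number for the nominal and shifted decisions is an assumption made so that "would have been accepted otherwise" can be checked directly.

import random

def process_drawn_event(p_reject_nominal, p_reject_shifted, n_events_to_draw):
    # Decide whether one drawn FD event is kept, using the systematically
    # shifted rejection probability, and adjust the target number of events
    # drawn whenever the shifted and nominal decisions disagree.
    u = random.random()
    rejected_nominal = u < p_reject_nominal
    rejected_shifted = u < p_reject_shifted

    if not rejected_nominal and rejected_shifted:
        # Would have been accepted, but the systematic rejects it:
        # draw one fewer event in total for this pseudo-experiment.
        n_events_to_draw -= 1
    elif rejected_nominal and not rejected_shifted:
        # Would have been rejected, but the systematic accepts it:
        # draw one more event in total.
        n_events_to_draw += 1

    accepted = not rejected_shifted  # the decision itself uses the shifted probability
    return accepted, n_events_to_draw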

For the decay pipe, there was an option to either do the acceptance scaling event-by-event or treat it as a separate sample like the backgrounds. The 'Separate Sample' method was not chosen for production because it required modifying much of the matrix method chain and thus introduced numerous possibilities for small bugs, but it was implemented temporarily as a cross-check.

Figure C.5 shows the predictions from the two decay pipe methods compared to each other. It is clear that they give essentially identical spectra.


[Figure C.3: Feldman-Cousins surfaces as a function of transition probability for the new and old methods, with and without the normalization systematic, and the ratio panels Normalization/No Systematics and New/Old.]

Figure C.3: A comparison of the two Feldman-Cousins methods (old: fluctuated histogram, new: event drawing), both with and without a normalization systematic applied. As you can see in the lower-left ratio plot (blue), applying the normalization systematic has an observable effect. And, looking at both lower ratio plots, it is clear that the new and old Feldman-Cousins methods are giving consistent results.

[Figure C.4: antineutrino cross-section error as a function of true energy (GeV); parameterized error compared to fake data errors summed in quadrature.]

Figure C.4: The curve used to parameterize the antineutrino cross-section uncertainty as a function of energy. It is compared to the 1σ errors on the combined neutrino and antineutrino cross-section components as well as the antineutrino cross-section ratio components.

[Figure C.5: FD events/GeV as a function of reconstructed energy (GeV) for the Separate Sample and Signal Modify methods, with the Production/Cross-Check ratio.]

Figure C.5: The two methods for applying the decay pipe systematic compared to each other. Here, the decay pipe systematic has been forced to be exactly +37% (not varied) so its effects can be investigated. They clearly give results consistent to within 2%.