We present a new tree boosting algorithm designed for the measurement of parameters in the context of effective field theory (EFT). To construct the algorithm, we interpret the optimized loss function of a traditional decision tree as the maximal Fisher information in Poisson counting experiments. We promote this interpretation to general EFT predictions and develop a suitable boosting method. The resulting “Boosted Information Tree” algorithm approximates the score, the derivative of the log-likelihood function with respect to the parameter. It thus provides a sufficient statistic in the vicinity of a reference point in parameter space where the estimator is trained. The training exploits per-event information on likelihood ratios for different theory parameter values available in the simulated EFT data sets.

Program summary
Program Title: BIT (Boosted Information Trees)
CPC Library link to program files: https://doi.org/10.17632/9fjyb5hyxt.1
Developer's repository link: https://github.com/HephyAnalysisSW/BIT
Licensing provisions: GPLv3
Programming language: Python 2 and Python 3
Nature of problem: Providing a discriminator for parameter estimation in the context of the standard model effective field theory.
Solution method: A tree-based algorithm exploits “augmented” information of the simulated training data set to regress the score function and thereby provides a sufficient test statistic of an EFT parameter.
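For clarity, the score regressed by the algorithm can be written as follows; this is a minimal notational sketch assuming a single EFT parameter θ with reference point θ₀ and per-event probability density p(x|θ) (the symbols θ, θ₀, and p are illustrative choices, not taken from the abstract):

\[
  t(x) \;=\; \left.\frac{\partial}{\partial\theta}\,\log p(x \mid \theta)\right|_{\theta=\theta_{0}}
\]

In this notation, regressing t(x) captures the leading dependence of the per-event likelihood ratio on θ near θ₀, which is the sense in which the abstract describes the estimator as a sufficient statistic in the vicinity of the reference point.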