How to run TMVA

TMVA is a tool for running a multivariate analysis on a ROOT tree. It is included in ROOT from release 5.34/11 onwards. More information can be found in the TMVA Users Guide.

When using the neural network method MLP, you might need a newer ROOT version to have a larger buffer for the XML reader, for example:

. /afs/cern.ch/sw/lcg/app/releases/ROOT/5.34.26/x86_64-slc6-gcc48-opt/root/bin/thisroot.sh
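Once ROOT is set up, a training run is typically driven by a short macro. The following is a minimal sketch for the 5.34-era TMVA Factory API; the file names, tree name, and variable names (signal.root, background.root, tree, var1, var2) are placeholders to be replaced with your own.

```cpp
// Minimal TMVA classification macro (sketch; input names are placeholders).
#include "TFile.h"
#include "TTree.h"
#include "TMVA/Factory.h"
#include "TMVA/Types.h"

void train() {
    // Output file that will hold the training results and evaluation histograms.
    TFile* out = TFile::Open("TMVA.root", "RECREATE");
    TMVA::Factory factory("TMVAClassification", out, "AnalysisType=Classification");

    // Input variables used for the discrimination (placeholders).
    factory.AddVariable("var1", 'F');
    factory.AddVariable("var2", 'F');

    // Signal and background input trees (placeholders), each with event weight 1.0.
    TFile* sig = TFile::Open("signal.root");
    TFile* bkg = TFile::Open("background.root");
    factory.AddSignalTree((TTree*)sig->Get("tree"), 1.0);
    factory.AddBackgroundTree((TTree*)bkg->Get("tree"), 1.0);

    // Split the events randomly into training and test samples.
    factory.PrepareTrainingAndTestTree("", "SplitMode=Random:NormMode=NumEvents");

    // Book one or more MVA methods, then train, test, and evaluate them.
    factory.BookMethod(TMVA::Types::kBDT, "BDT", "NTrees=800:MaxDepth=3");
    factory.TrainAllMethods();
    factory.TestAllMethods();
    factory.EvaluateAllMethods();
    out->Close();
}
```

Running `root -l train.C` then executes the full training and writes the weight files used later for application.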

Parameters to tune

The parameters and options of the MVA method can be tuned away from the default settings for better performance; see the reference page.

For the BDT, the important parameters are the learning rate, the number of boost steps, and the maximal tree depth:

  • AdaBoostBeta=0.5: learning rate, smaller (~0.1) is better, but takes longer
  • nTrees=800: number of boost steps, too large mainly costs time and can cause overtraining
  • MaxDepth=3: maximum tree depth, ~2-5 depending on the interaction between the variables
  • nCuts=20: grid points in variable range to find the optimal cut in node splitting
  • MinNodeSize=5%: minimum fraction of training events required in a leaf node
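Put together, booking a BDT with the parameters above might look like the following sketch (the `factory` object comes from a training macro as in the TMVA examples; `BoostType=AdaBoost` is an assumption added here so that `AdaBoostBeta` applies):

```cpp
// Sketch: book a BDT with the parameters discussed above.
// `factory` is a TMVA::Factory set up elsewhere in the macro.
factory.BookMethod(TMVA::Types::kBDT, "BDT",
    "!H:!V:BoostType=AdaBoost:AdaBoostBeta=0.5:"
    "NTrees=800:MaxDepth=3:nCuts=20:MinNodeSize=5%");
```

The options are passed as a single colon-separated string; `!H:!V` just suppresses the help text and verbose output.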

Important MLP parameters to tune are the number of neurons in each hidden layer, the learning rate, and the activation function.

  • HiddenLayers=N,N-1: number of nodes in each hidden layer for N variables
    • N = one hidden layer with N nodes
    • N,N = two hidden layers
    • N+2,N = two hidden layers, with N+2 nodes in the first
  • LearningRate=0.02: learning rate of the training
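A corresponding MLP booking might look like the sketch below (again assuming a `factory` object from the training macro; `NeuronType=tanh` and `NCycles=500` are illustrative assumptions for the activation function and number of training cycles, not values from this page):

```cpp
// Sketch: book an MLP with the parameters discussed above.
// In the HiddenLayers string, "N" stands for the number of input variables.
factory.BookMethod(TMVA::Types::kMLP, "MLP",
    "!H:!V:NeuronType=tanh:NCycles=500:"
    "HiddenLayers=N,N-1:LearningRate=0.02");
```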

Tutorial and examples

mva/mva.1470475063.txt.gz · Last modified: 2016/08/06 11:17 by iwn