XGBoost: Meaning of feature importance score

Created on 30 May 2015  ·  4 comments  ·  Source: dmlc/xgboost

Do they have interpretable semantics? How are they calculated? Does higher mean better?

To clarify, I'm using cls.booster().get_fscore() to get the scores.
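For context, a minimal sketch of how those scores are pulled (toy, made-up data; newer xgboost releases expose the booster via `get_booster()` rather than `booster()` as in the question):

```python
import numpy as np
from xgboost import XGBClassifier

# Made-up toy data just to show the call; any trained model works the same way.
X = np.random.rand(200, 5)
y = np.random.randint(0, 2, 200)

cls = XGBClassifier(n_estimators=20).fit(X, y)

# Newer xgboost versions expose the booster via get_booster();
# older ones used cls.booster() as in the question.
scores = cls.get_booster().get_fscore()
print(scores)  # e.g. {'f0': 14, 'f3': 9, ...}: feature name -> number of splits
```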


All 4 comments

Also, get_fscore() returns fewer features than the number of features in the training data. I have 98 features, but get_fscore() returns scores for only 71 of them.

The higher the better. get_fscore returns the number of occurrences of each feature across the trees of the ensemble, i.e. how many times it was used to split.
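One rough way to check this, assuming the sketch above and the default feature names (f0, f1, ...) that xgboost assigns when training from a plain NumPy array, is to count split nodes in the text dump of the trees:

```python
from collections import Counter
import re

booster = cls.get_booster()          # "cls" from the sketch above
counts = Counter()
for tree in booster.get_dump():      # one plain-text dump per tree
    # split nodes look like "0:[f3<0.52] yes=1,no=2,missing=1"
    counts.update(re.findall(r'\[(f\d+)<', tree))

print(dict(counts))                  # should agree with booster.get_fscore()
```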

Does it use their levels in the tree as weights?

Also, do you have an explanation for the situation in my second question?

Thanks.

That means those features were never selected for any split in the trees.
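If you want every feature listed anyway, a small sketch (again assuming the default f0..fN names and the objects from the first example) is to fill the missing ones in with an explicit zero:

```python
# "cls" and X are from the first sketch; X.shape[1] gives the feature count.
fscore = cls.get_booster().get_fscore()
all_scores = {f'f{i}': fscore.get(f'f{i}', 0) for i in range(X.shape[1])}
print(all_scores)  # every feature listed; unused features get an explicit 0
```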
