[R] SOS Boosting

From: Kuhn, Max <Max.Kuhn_at_pfizer.com>
Date: Wed 13 Jul 2005 - 02:03:31 EST


>Hi,
>
>I am trying to implement the AdaBoost.M1 algorithm as described in
>"The Elements of Statistical Learning", p. 301.
>I don't use Dettling's "boost" package because:
> - I don't understand the difference between LogitBoost and L2Boost
> - I'd like to use larger trees than stumps.
>

It also doesn't have a predict function, which is why I don't use it much.

>By setting the weights option to (1/n, 1/n, ..., 1/n) in the rpart or
>tree function, the tree obtained is trivial (just a root, no splits),
>whereas without weights, or with every weight > 1, the trees are fine.
>
>So here is my question: how are weights taken into account when
>searching for the optimal tree?
>Has anyone implemented a boosting algorithm?

Check out the gbm package. It is fairly close to MART as described in the reference you mentioned. To get AdaBoost with stumps, look at the arguments

  distribution = "adaboost"
  interaction.depth = 1

For more information, see ?gbm (if you have it installed) or the file gbm.pdf in the doc directory of the package.
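For reference, the AdaBoost.M1 procedure the poster cites (ESL, Algorithm 10.1) is straightforward to sketch. The following is a minimal, language-agnostic illustration in Python/NumPy using decision stumps as the weak learner; it is not the gbm or rpart implementation, just a sketch of how the observation weights enter the fit (via the weighted misclassification error) and how mistakes get upweighted each round:

```python
import numpy as np

def fit_stump(X, y, w):
    """Weighted decision stump: threshold one feature, predict +/-1.
    Returns the (feature, threshold, polarity) minimizing weighted error."""
    best, best_err = (0, 0.0, 1), np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] <= t, s, -s)
                err = np.sum(w * (pred != y))
                if err < best_err:
                    best, best_err = (j, t, s), err
    return best

def stump_predict(stump, X):
    j, t, s = stump
    return np.where(X[:, j] <= t, s, -s)

def adaboost_m1(X, y, M=20):
    """AdaBoost.M1 (ESL Algorithm 10.1) with stumps; y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                   # step 1: uniform weights
    stumps, alphas = [], []
    for _ in range(M):
        stump = fit_stump(X, y, w)            # step 2a: fit with weights w
        miss = stump_predict(stump, X) != y
        err = np.sum(w * miss) / np.sum(w)    # step 2b: weighted error
        if err <= 0 or err >= 0.5:            # weak learner must beat chance
            break
        alpha = np.log((1 - err) / err)       # step 2c: classifier weight
        w *= np.exp(alpha * miss)             # step 2d: upweight mistakes
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Step 3: sign of the alpha-weighted vote of the weak classifiers."""
    score = sum(a * stump_predict(s, X) for s, a in zip(stumps, alphas))
    return np.sign(score)
```

Note that a single stump cannot separate an interval-shaped class (e.g. y = +1 only for middle values of x), but a few boosted rounds can, which is the point of the reweighting.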

Max

>
>Regards,
>
>Olivier Celhay - Student - Paris, France




R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
Received on Wed Jul 13 02:07:45 2005

This archive was generated by hypermail 2.1.8 : Fri 03 Mar 2006 - 03:33:34 EST