From: Rick Ram <r.ramyar_at_gmail.com>

Date: Fri 29 Jul 2005 - 05:44:34 EST

Sorry guys, resending this - none of my posts have gone through because HTML emails were not being delivered... sending this as plain text now!

On 28/07/05, Rick Ram <r.ramyar@gmail.com> wrote:

> Hi all,

>
> I have not looked at this CUSUM SQUARED issue since the emails at the
> beginning of the year but am looking at it again. For those who are
> interested, the following paper gives critical values for n > 60 in addition
> to the ones in Durbin 1969.
>
> Edgerton, David & Wells, Curt, 1994. "Critical Values for the Cusumsq
> Statistic in Medium and Large Sized Samples," Oxford Bulletin of Economics
> and Statistics, Blackwell Publishing, vol. 56(3), pages 355-65.
>
>
> All the best,
>
> R.
>
> On 12/01/05, Achim Zeileis <Achim.Zeileis@wu-wien.ac.at> wrote:
> > On Tue, 11 Jan 2005 19:33:41 +0000 Rick Ram wrote:
> >
> > > Groundwork for the choice of break method in my specific application
> > > has already been done - otherwise I would need to reinvent the wheel
> > > (make a horribly detailed comparison of the performance of break
> > > approaches in the context of modelling post-break).
> > >
> > > If it interests you, Pesaran & Timmermann 2002 compared CUSUM squared,
> > > Bai-Perron and a time-varying approach to detect a single previous
> > > break in reverse-ordered financial time series so as to update a
> > > forecasting model.
> >
> > Yes, I know that paper. And if I recall correctly they are mainly
> > interested in modelling the time period after the last break. For this,
> > the reverse ordered recursive CUSUM approach works because they
> > essentially look back in time to see when their predictions break down.
> > And for their application looking for variance changes also makes sense.
> > The approach is surely valid and sound in this context... but it might be
> > possible to do something better (but I would have to look much closer at
> > the particular application to have an idea what might be a way to go).
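A minimal sketch of this reverse-ordered idea, using simulated data rather than the model Pesaran & Timmermann actually study (the sample size, break point and regression below are purely illustrative):

## sketch: the error variance doubles late in the sample; the sample is then
## reversed so that the CUSUM-of-squares path starts at the most recent
## observation and "looks back" towards the break
library(strucchange)

set.seed(1)
n <- 120
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n, sd = rep(c(1, 2), c(80, 40)))
d <- data.frame(y = y, x = x)

## reverse the observation order, then proceed as for the usual CUSUM of
## squares: squared recursive residuals, cumulated and scaled to [0, 1]
d.rev <- d[n:1, ]
w2 <- recresid(y ~ x, data = d.rev)^2
sr <- cumsum(c(0, w2)) / sum(w2)

plot(sr, type = "l", main = "Reverse-ordered CUSUM of Squares")
lines(seq(0, 1, length.out = length(sr)), col = grey(0.5))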
> >
> > > This works "fine", i.e. the plot looks correct. The problem is how to
> > > appropriately normalise these to rescale them to what the CUSUM
> > > squared procedure expects (this looks to be a different and more
> > > complicated procedure than the normalisation used for the basic
> > > CUSUM). I am from an IT background and am slightly illiterate in
> > > terms of math notation... guidance from anyone would be appreciated.
> >
> > I just had a brief glance at BDE75, page 154, Section 2.4. If I
> > haven't missed anything important on reading it very quickly, you just
> > need to do something like the following (a reproducible example, based
> > on data from strucchange, using a notation similar to BDE's):
> >
> > ## load GermanM1 data and model
> > library(strucchange)
> > data(GermanM1)
> > M1.model <- dm ~ dy2 + dR + dR1 + dp + ecm.res + season
> >
> > ## compute squared recursive residuals
> > w2 <- recresid(M1.model, data = GermanM1)^2
> > ## compute CUSUM of squares process
> > sr <- ts(cumsum(c(0, w2))/sum(w2), end = end(GermanM1$dm), freq = 12)
> > ## the border (r-k)/(T-k)
> > border <- ts(seq(0, 1, length = length(sr)),
> >              start = start(sr), freq = 12)
> >
> > ## nice plot
> > plot(sr, xaxs = "i", yaxs = "i", main = "CUSUM of Squares")
> > lines(border, col = grey(0.5))
> > lines(0.4 + border, col = grey(0.5))
> > lines(-0.4 + border, col = grey(0.5))
> >
> > Instead of 0.4 you would have to use the appropriate critical values
> > from Durbin (1969) if my reading of the paper is correct.
> >
> > hth,
> > Z
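To make the substitution concrete, the computation above can be wrapped so that a tabulated critical value is passed in explicitly; the value c0 = 0.35 below is a made-up placeholder, not a figure taken from Durbin (1969) or Edgerton & Wells (1994):

## sketch: same CUSUM-of-squares construction as above, with the boundary
## offset supplied by the caller; c0 = 0.35 is a PLACEHOLDER - look up the
## real critical value for your sample size and significance level
library(strucchange)

cusum_sq_plot <- function(formula, data, c0) {
  w2 <- recresid(formula, data = data)^2        # squared recursive residuals
  sr <- cumsum(c(0, w2)) / sum(w2)              # CUSUM of squares process
  border <- seq(0, 1, length.out = length(sr))  # the (r - k)/(T - k) line
  plot(sr, type = "l", main = "CUSUM of Squares")
  lines(border, col = grey(0.5))
  lines(border + c0, col = grey(0.5))           # upper boundary
  lines(border - c0, col = grey(0.5))           # lower boundary
  invisible(sr)
}

data(GermanM1)
cusum_sq_plot(dm ~ dy2 + dR + dR1 + dp + ecm.res + season, GermanM1, c0 = 0.35)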
> >
> > > Does anyone know if this represents some commonly performed type of
> > > normalisation that exists in another function?
> > >
> > > I will hunt out the 1969 paper for the critical values, but prior to
> > > doing this I am a bit confused as to how they will be
> > > implemented/interpreted... the CUSUM squared plot does/should run
> > > diagonally up from left to right and there are two straight lines that
> > > one would put around this from the critical values. Hence, a
> > > different interpretation/implementation of confidence levels than in
> > > other contexts. I realise this is not just an R thing but a problem
> > > with my theoretical background.
> > >
> > >
> > > Thanks for the detailed reply!
> > >
> > > Rick.
> > >
> > >
> > > >
> > > > But depending on the model and hypothesis you want to test, a
> > > > technique other than CUSUM of squares might be more appropriate and
> > > > is also available in strucchange.
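One such alternative, shown only as a hedged illustration (an OLS-based CUSUM process is just one example of what strucchange offers, applied here to the same GermanM1 model used above):

## sketch: OLS-based CUSUM test on the GermanM1 model
library(strucchange)
data(GermanM1)

ocus <- efp(dm ~ dy2 + dR + dR1 + dp + ecm.res + season,
            data = GermanM1, type = "OLS-CUSUM")
plot(ocus)    # empirical fluctuation process with boundary lines
sctest(ocus)  # corresponding significance test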
> > >
> > > >
> > > > hth,
> > > > Z
> > > >
> > > > > Any help or pointers about where to look would be more than
> > > > > appreciated! Hopefully I have just missed something obvious in
> > > > > the package...
> > > > >
> > > > > Many thanks,
> > > > >
> > > > > Rick R.
> > > > >
> > > > > ______________________________________________
> > > > > R-help@stat.math.ethz.ch mailing list
> > > > > https://stat.ethz.ch/mailman/listinfo/r-help
> > > > > PLEASE do read the posting guide!
> > > > > http://www.R-project.org/posting-guide.html
> > > > >
> > > >
> > >
> >
>

R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

Received on Fri Jul 29 05:50:04 2005
