Re: [R] MCA in R

From: John Fox <jfox_at_mcmaster.ca>
Date: Fri, 13 Jun 2008 08:31:51 -0400

Dear Brian,

> -----Original Message-----
> From: Prof Brian Ripley [mailto:ripley_at_stats.ox.ac.uk]
> Sent: June-13-08 8:13 AM
> To: John Fox
> Cc: 'K. Elo'; r-help_at_r-project.org
> Subject: Re: [R] MCA in R
>
> Although John Fox naturally mentions his Anova function, I would like to
> point out that drop1() (and MASS::dropterm) also does the tests of Type-II
> ANOVA of which John says 'more tediously do these tests directly'.

It's true that for an additive model (such as Kimmo's), drop1() and Anova() produce the same sums of squares, but for a model in which some terms are marginal to others, drop1() produces tests only for the high-order terms. One could specify scope = ~ . to drop1(), but that produces so-called "type-III" tests. Perhaps there's some convenient way around this of which I'm unaware.
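To make the contrast concrete, here is a small base-R sketch. The data are simulated purely for illustration (the factors A and B and the response y are invented, not taken from Kimmo's project):

```r
## Simulated example (illustrative only): balanced two-factor data
set.seed(1)
dat <- data.frame(A = gl(2, 20), B = gl(4, 5, 40), y = rnorm(40))

## Additive model: drop1() tests every term, and each test matches a
## full-vs-reduced model comparison done "by hand" with anova()
m <- lm(y ~ A + B, data = dat)
drop1(m, test = "F")
anova(update(m, . ~ . - A), m)   # same F test for A as drop1() reports

## With an interaction present, drop1() tests only the
## highest-order term A:B and is silent about the main effects
m2 <- lm(y ~ A * B, data = dat)
drop1(m2, test = "F")
```

In the additive case the two F statistics for A agree exactly; once the interaction is in the model, drop1() reports a test only for A:B.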

>
> It seems a lot easier to teach newcomers about drop1() than to introduce
> the SAS terminology and then say (to quote ?Anova)
>
> 'the definitions used here do not correspond precisely to those
> employed by SAS'
>
> (I would welcome a description of the precise differences on the Anova
> help page.)

As I recall, the differences are for "type-III" tests, where in Anova() these are dependent upon contrast coding.
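The dependence can be seen directly in base R, without making any claim about SAS's exact conventions (again with simulated data, invented for illustration): the coefficient-level test for a main effect in the presence of its interaction -- a "type-III"-style test -- changes with the contrast coding, while a model-comparison (type-II) test does not.

```r
## Simulated data (illustrative only)
set.seed(2)
dat <- data.frame(A = gl(2, 20), B = gl(4, 5, 40), y = rnorm(40))

## Same interaction model under two contrast codings
m.trt <- lm(y ~ A * B, data = dat)   # default contr.treatment
m.sum <- lm(y ~ A * B, data = dat,
            contrasts = list(A = contr.sum, B = contr.sum))

## Coefficient-level tests for A differ between codings:
summary(m.trt)$coefficients["A2", "t value"]   # simple effect of A at B = 1
summary(m.sum)$coefficients["A1", "t value"]   # main effect of A

## whereas a type-II model comparison for A is invariant to the coding:
m.add.trt <- lm(y ~ A + B, data = dat)
m.add.sum <- lm(y ~ A + B, data = dat,
                contrasts = list(A = contr.sum, B = contr.sum))
drop1(m.add.trt, test = "F")
drop1(m.add.sum, test = "F")   # identical F statistics
</code>
```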

Regards,
 John

>
>
> On Fri, 13 Jun 2008, John Fox wrote:
>
> > Dear Kimmo,
> >
> >> -----Original Message-----
> >> From: r-help-bounces_at_r-project.org [mailto:r-help-bounces_at_r-project.org]
> >> On Behalf Of K. Elo
> >> Sent: June-13-08 1:43 AM
> >> To: r-help_at_r-project.org
> >> Subject: Re: [R] MCA in R
> >>
> >> Dear John,
> >>
> >> thanks for your quick reply.
> >>
> >>> John Fox wrote:
> >>> Dear Kimmo,
> >>>
> >>> MCA is a rather old name (introduced, I think, in the 1960s by
> >>> Sonquist and Morgan in the OSIRIS package) for a linear model
> >>> consisting entirely of factors and with only additive effects --
> >>> i.e., an ANOVA model with no interactions.
> >>
> >> It is true that MCA is an old name, but the technique itself is still
> >> robust, I think. The problem I am facing is that I have a research
> >> project where I try to find out which factors affect measured knowledge
> >> of a specific issue. As predictors I have formal education, interest,
> >> gender and consumption of different media (TV, newspapers etc.). Now,
> >> these are correlated predictors and running e.g. a simple anova
> >> (anova(lm(...)) as you suggested) won't - if I have understood correctly
> >> - consider the problem of correlated predictors. MCA would do this.
> >
> > That's because anova() calculates sequential ("type-I") sums of squares;
> > if you use the Anova() function in the car package, for example, you'll
> > get so-called type-II sums of squares -- for each factor after the
> > others. You could also more tediously do these tests directly using the
> > anova() function, by contrasting alternative models: the full model and
> > the model deleting each factor in turn.
> >
> >>
> >> A colleague of mine has run anova and MCA in SPSS and the results
> >> differ significantly.
> >
> > Yes, see above.
> >
> >> Because I am more familiar with R, I just hoped that this
> >> marvelous statistical package could handle MCA, too :)
> >>
> >>> Typically, the results of
> >>> an MCA are reported using "adjusted means." You could compute these
> >>> manually, or via the effects package.
> >>
> >> Well, I am interested in the eta and beta values, too.
> >
> > Aren't the eta values just the square roots of the R^2's from the
> > individual one-way ANOVAs? I don't remember how the betas are defined,
> > but do recall that they are a peculiar attempt to define standardized
> > partial regression coefficients for factors that combine all of the
> > levels.
> >
> >> I have tried to
> >> use the effects package but my attempts with all.effects resulted in
> >> errors. I have to figure out what's going wrong here :)
> >
> > If you tell me what you did, ideally including an example that I can
> > reproduce, I can probably tell you what's wrong.
> >
> > Regards,
> > John
> >
> >>
> >> Kind regards,
> >> Kimmo Elo
> >>
> >> --
> >> University of Turku, Finland
> >> Dep. of political science
> >>
> >> ______________________________________________
> >> R-help_at_r-project.org mailing list
> >> https://stat.ethz.ch/mailman/listinfo/r-help
> >> PLEASE do read the posting guide
> >> http://www.R-project.org/posting-guide.html
> >> and provide commented, minimal, self-contained, reproducible code.
> >
>
> --
> Brian D. Ripley,                  ripley_at_stats.ox.ac.uk
> Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
> University of Oxford,             Tel: +44 1865 272861 (self)
> 1 South Parks Road,                    +44 1865 272866 (PA)
> Oxford OX1 3TG, UK                Fax: +44 1865 272595



Received on Fri 13 Jun 2008 - 15:26:13 GMT