[R] test regression against given slope for reduced major axis regression (RMA)

From: Patrick Drechsler <patrick_at_pdrechsler.de>
Date: Tue 11 Jul 2006 - 11:10:21 EST


Hi,

For testing whether the slope of experimental data differs from a given slope I am using the function
"test_regression_against_slope" (see below).

I am now confronted with the problem that my data require a model II regression (also called reduced major axis regression (RMA) or geometric mean regression). For this I use the function "modelII" (see below).

What would be a good way of adapting
"test_regression_against_slope" for use with RMA regression?

The question I am trying to answer is: "Does the slope acquired from experimental data differ significantly from theoretical predictions?"
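
One direction I have been considering (only a sketch, resting on the assumption that the standard error of the RMA slope equals that of the ordinary least-squares slope, as argued by McArdle 1988) is to apply the same t-test with n-2 degrees of freedom to the RMA slope. The helper "rma_slope_test" below is hypothetical, untested code:

--8<------------------------schnipp------------------------->8---
rma_slope_test <- function(x, y, slope_2) {
  ## sketch only: assumes SE(RMA slope) == SE(OLS slope) (McArdle 1988)
  n   <- length(x)
  ssx <- sum((x - mean(x))^2)
  ssy <- sum((y - mean(y))^2)
  r   <- cor(x, y)
  ## RMA slope: sign of the correlation, magnitude sqrt(SSy/SSx)
  slope_rma <- sign(r) * sqrt(ssy / ssx)
  ## standard error (equal to the OLS slope SE under the assumption above):
  sb <- abs(slope_rma) * sqrt((1 - r^2) / (n - 2))
  ts <- (slope_rma - slope_2) / sb
  pvalue <- 2 * pt(abs(ts), df = n - 2, lower.tail = FALSE)
  tval <- qt(0.975, df = n - 2)
  list(slope_rma = slope_rma,
       pvalue = pvalue,
       lowerlimit95 = slope_rma - tval * sb,
       upperlimit95 = slope_rma + tval * sb)
}
--8<------------------------schnapp------------------------->8---

Is this statistically sound, or does an RMA slope call for a different test?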

Any feedback on this is highly appreciated, and if you need more information, do not hesitate to ask.

Kind Regards

Patrick

*test_regression_against_slope*

--8<------------------------schnipp------------------------->8---
test_regression_against_slope <- function(x,y,slope_2) {
### TEST_REGRESSION_AGAINST_SLOPE tests the slope of a fitted regression
### against a given slope.
###
### INPUT:
###
### x and y: raw data
### slope_2: the slope you would like to test against (e.g. 1/3)
###
### OUTPUT:
###
### pvalue: the P-value...
### upperlimit95 and lowerlimit95 give the 95 percent confidence
### interval (two-tailed).
###
### see Sokal and Rohlf, p. 465/471
  

  n <- length(x)
  mydf <- n-2

  ## squared deviations from the means (y2 is computed but not used below):
  x2 <- (x-mean(x))^2
  y2 <- (y-mean(y))^2

  ## regression (pedestrian solution):
  xy <- (x-mean(x))*(y-mean(y))
  slope1 <- sum(xy)/sum(x2)
  intercept_a <- mean(y) - slope1 * mean(x)

  ## model data y_hat:
  y_hat <- intercept_a + slope1 * x
  ## squared residuals of the model data:
  y_hat2 <- (y - y_hat)^2

  s2yx <- sum(y_hat2) / (n-2)

  sb <- sqrt(s2yx/sum(x2))
  ts <- (slope1 - slope_2) / sb
  pvalue <- 2*(pt(abs(ts), df=mydf, lower.tail=FALSE))

  ## 0.95 for a one-tailed, 0.975 for a two-tailed t-distribution with
  ## alpha <- 5%:
  tval <- qt(.975, df=mydf)
  ts2 <- tval*sb

  lowerlimit95 <- slope1 - ts2
  upperlimit95 <- slope1 + ts2

  list(pvalue = pvalue,
       lowerlimit95 = lowerlimit95,
       upperlimit95 = upperlimit95)

}
--8<------------------------schnapp------------------------->8---
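
A hypothetical call with made-up data (testing against a slope of 1/3):

--8<------------------------schnipp------------------------->8---
## made-up example data that roughly follow a slope of 1/3:
set.seed(1)
x <- 1:20
y <- x/3 + rnorm(20, sd = 0.5)
test_regression_against_slope(x, y, 1/3)
--8<------------------------schnapp------------------------->8---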




*modelII*

--8<------------------------schnipp------------------------->8---
modelII <- function(XjArray,YjArray){
### ============================================================
###
### Purpose:
###
### Calculates model II regression parameters. Also called "reduced
### major axis regression" (Prentice 1987) or "geometric mean
### regression" (Webb et al. 1981).
###
### Input:
###
### Two one-dimensional vectors XjArray and YjArray containing the X
### and Y data, e.g.:
###
### XjArray <- c(0, 0.9, 1.8, 2.6, 3.3, 4.4, 5.2, 6.1, 6.5, 7.4)
### YjArray <- c(5.9, 5.4, 4.4, 4.6, 3.5, 3.7, 2.8, 2.8, 2.4, 1.5)
###
### Output:
###
### A list with the following components:
###
### betacoeff (the RMA slope), Intercept,
### sdbeta and sdintercept (their standard deviations),
### meanxj and meanyj (the means of X and Y),
### corrCoeff (the Pearson correlation coefficient)
###
### ============================================================

  sumXjYj <- 0
  sumXj <- 0
  sumYj <- 0
  n <- length(XjArray)

  sumXjSquared <- 0
  sumYjSquared <- 0
  covariancexy <- 0

  for(i in 1:n){
    sumXjYj <- sumXjYj + XjArray[i] * YjArray[i]
    sumXj <- sumXj + XjArray[i]
    sumYj <- sumYj + YjArray[i]
    sumXjSquared <- sumXjSquared + XjArray[i]^2
    sumYjSquared <- sumYjSquared + YjArray[i]^2
  }

  ## Mean of X and Y vectors
  meanyj <- sumYj / n
  meanxj <- sumXj / n   

  ## create the covariance:
  for(i in 1:n){
    covariancexy <- covariancexy + ((XjArray[i] - meanxj) * (YjArray[i] - meanyj))
  }
  covariancexy <- covariancexy / n

  ## variances of X and Y, and corrected sums of squares:
  varXj <- (n * sumXjSquared - sumXj^2) / (n * (n - 1))
  varYj <- (n * sumYjSquared - sumYj^2) / (n * (n - 1))
  sdxij <- sumXjSquared - (sumXj^2 / n)
  sdxik <- sumYjSquared - (sumYj^2 / n)

  ## slope: sign() returns the sign of the covariance with magnitude 1
  betacoeff <- sign(covariancexy) * ((varYj^0.5) / (varXj^0.5))
  ## intercept:
  Intercept <- meanyj - meanxj * betacoeff

  ## Pearson product-moment correlation coefficient:
  if (varYj == 0 | varXj == 0){
    corrCoeff <- 0
  } else {
    corrCoeff <- (sumXjYj - ((sumXj * sumYj) / n)) / ((sdxij * sdxik)^0.5)
  }

  ## sample variances of the slope and intercept:
  variancebeta <- (varYj / varXj) * ((1 - (corrCoeff^2)) / n)
  varianceintercept <- (varYj / n) * (1 - corrCoeff) *
    (2 + ((meanxj^2) * ((1 + corrCoeff) / varXj)))
  sdbeta <- variancebeta^0.5
  sdintercept <- varianceintercept^0.5

  list(betacoeff = betacoeff,       # <- slope
       Intercept = Intercept,
       sdbeta = sdbeta,             # standard deviation of the slope
       sdintercept = sdintercept,   # standard deviation of the intercept
       meanxj = meanxj,
       meanyj = meanyj,
       corrCoeff = corrCoeff)       # <- Pearson correlation coefficient
}

--8<------------------------schnapp------------------------->8---
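
For reference, a call to "modelII" using the example vectors from its header (output not shown):

--8<------------------------schnipp------------------------->8---
XjArray <- c(0, 0.9, 1.8, 2.6, 3.3, 4.4, 5.2, 6.1, 6.5, 7.4)
YjArray <- c(5.9, 5.4, 4.4, 4.6, 3.5, 3.7, 2.8, 2.8, 2.4, 1.5)
modelII(XjArray, YjArray)
--8<------------------------schnapp------------------------->8---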

-- 
Snoopy (on being house-trained with a rolled-up newspaper): 
It does tend however to give one a rather distorted view of the press!
