version 1.7.0.0 (5.44 KB) by
Antonio Trujillo-Ortiz

Geometric Mean Regression (Reduced Major Axis Regression).

Model II regression should be used when the two variables in the regression equation are random and subject to error, i.e. not controlled by the researcher. Model I regression using ordinary least squares underestimates the slope of the linear relationship between the variables when they both contain error. According to Sokal and Rohlf (1995), the subject of Model II regression is one on which research and controversy are continuing and definitive recommendations are difficult to make.

GMREGRESS is a Model II procedure. It standardizes the variables before the slope is computed: each of the two variables is transformed to have a mean of zero and a standard deviation of one. The resulting slope is the geometric mean of the slope of the regression of Y on X and the reciprocal of the slope of the regression of X on Y. Ricker (1973) coined this term and gave an extensive review of Model II regression.
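The slope computation can be illustrated with a short numerical sketch. This is plain Python for illustration (gmregress itself is MATLAB); the data are made up and `rma_slope_intercept` is a hypothetical helper name, not part of the submission.

```python
from statistics import mean, stdev

def rma_slope_intercept(x, y):
    """Geometric mean (reduced major axis) fit: |slope| = s_y / s_x,
    with the sign taken from the correlation coefficient."""
    n = len(x)
    mx, my = mean(x), mean(y)
    sx, sy = stdev(x), stdev(y)
    # Pearson correlation coefficient (sample)
    r = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / ((n - 1) * sx * sy)
    b = (1 if r >= 0 else -1) * sy / sx
    a = my - b * mx          # the fitted line passes through the centroid
    return a, b, r

# Made-up data
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.9]
a, b, r = rma_slope_intercept(x, y)

# Equivalently, |b| is the geometric mean of the Y-on-X OLS slope
# (r*sy/sx) and the reciprocal of the X-on-Y OLS slope (sy/(r*sx)):
# sqrt((r*sy/sx) * (sy/(r*sx))) = sy/sx.
```

The sign correction (taking the sign of r) matches the fix discussed in the comments below.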

[B,BINTR,BINTJM] = GMREGRESS(X,Y,ALPHA) returns the vector B of regression coefficients in the linear Model II, and the matrices BINTR and BINTJM of the corresponding confidence intervals for B, computed by the Ricker (1973) procedure and by the Jolicoeur and Mosimann (1968)/McArdle (1988) procedure, respectively.

GMREGRESS treats NaNs in X or Y as missing values, and removes them.

Syntax: function [b,bintr,bintjm] = gmregress(x,y,alpha)

Antonio Trujillo-Ortiz (2021). gmregress (https://www.mathworks.com/matlabcentral/fileexchange/27918-gmregress), MATLAB Central File Exchange.

Created with MATLAB R14. Compatible with any release.


Antonio Trujillo-Ortiz: Hi Michael. I have to tell you that I think there is no fitting of this model when the intercept is fixed at zero. We must bear in mind that reduced major axis regression describes the symmetric relationship between two variables; it is not meant for predicting x from y, or y from x (McArdle, 2003; Smith, 2009). On the other hand, I retired some years ago and have withdrawn from all academic activity. Good luck in your inquiry, Antonio Trujillo-Ortiz.

McArdle, B.H. Lines, models, and errors: Regression in the field. Limnology and Oceanography. 2003, 48(3):1363-1366

Smith, R.J. Use and Misuse of the Reduced Major Axis for Line-Fitting. American Journal of Physical Anthropology. 2009, 140(3):476-486

Michael Döring: Hello,

is there a way to force the function to fit a special intercept (e.g. 0)?

Antonio Trujillo-Ortiz: Hello Liang Zhang. I do not know how familiar you are with this Type II regression model [Geometric Mean Regression (Reduced Major Axis Regression)]. I have to tell you that the calculation of the coefficient of determination in this model is different from the Type I model with which you intend to estimate it. I recommend you review the statistical procedure in the literature cited in the MATLAB gmregress file. On the other hand, it has now been three years since I retired from all academic activity. Good luck in your inquiry. Antonio Trujillo-Ortiz.

Liang Zhang: Dear Antonio Trujillo-Ortiz,

I want to know how to calculate R^2, because the R^2 value I calculate in the R language is different from the one I get in MATLAB. (The method I use for calculating R^2 is R^2 = 1 - SSE/TSS.) Can you solve the problem?
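The discrepancy described above can be reproduced with a short sketch (Python for illustration, with made-up data): applying R^2 = 1 - SSE/TSS to the vertical residuals about the RMA line gives a different number from the squared correlation r^2. In fact, for an RMA line with positive slope through the centroid, 1 - SSE/TSS works out to 2r - 1, which equals r^2 only when r = 1.

```python
from statistics import mean, stdev

# Made-up data, deliberately noisy so r is well below 1
x = [1, 2, 3, 4, 5, 6]
y = [2, 1, 4, 3, 7, 5]

n = len(x)
mx, my = mean(x), mean(y)
sx, sy = stdev(x), stdev(y)
r = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / ((n - 1) * sx * sy)

# RMA line: slope sy/sx (r > 0 here), passing through the centroid
b = sy / sx
a = my - b * mx

sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
tss = sum((yi - my) ** 2 for yi in y)

r2_corr = r * r          # squared correlation
r2_sse = 1 - sse / tss   # "1 - SSE/TSS" about the RMA line
# For a positive-slope RMA fit, r2_sse equals 2r - 1, not r^2,
# so the two conventions disagree whenever |r| < 1.
```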

Alison Hill: Thanks so much for writing this function - it has been very useful!

Antonio Trujillo-Ortiz: Hi Aish,

Thanks for your interest in our m-functions. In the MATLAB environment, go to the folder where you have saved this function and enter 'type gmregress'. There you will find the mathematical algorithm for the CIs, which explains how they are calculated.

In this m-function, two procedures were implemented to estimate the intercept and slope, as well as their confidence intervals (CIs), for a Model II regression: the Ricker procedure and the Jolicoeur and Mosimann procedure. The first uses the t-distribution and the second the F-distribution.

Recall that to perform a hypothesis test you can use either (1) a comparison of the observed statistic with its critical value, or (2) CIs, the latter for a two-tailed test. In the case of CIs, if the hypothesized intercept or slope value falls within the CI, the result can be considered non-significant; otherwise (outside the CI) it is significant.

In a two-tailed hypothesis test, if the result is not significant, it can be said, without further calculation, that the p-value is greater than or equal to the alpha value (the Type I, or experimentwise, error rate: the probability of rejecting Ho when Ho is true).

Yours,

Prof. A. Trujillo-Ortiz
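A rough sketch of a Ricker-style t-based interval for the slope (Python for illustration; the standard-error form s_b = |b|*sqrt((1 - r^2)/(n - 2)) and the hard-coded t quantile are my assumptions for this sketch, not code taken from gmregress):

```python
from math import sqrt
from statistics import mean, stdev

# Made-up data
x = [1, 2, 3, 4, 5, 6]
y = [2, 1, 4, 3, 7, 5]

n = len(x)
mx, my = mean(x), mean(y)
sx, sy = stdev(x), stdev(y)
r = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / ((n - 1) * sx * sy)

b = (1 if r >= 0 else -1) * sy / sx      # RMA slope

# Standard error of the RMA slope (same form as the OLS slope SE)
se_b = abs(b) * sqrt((1 - r * r) / (n - 2))

# Two-tailed 95% CI; t(0.975, df = 4) = 2.776 is hard-coded because the
# Python standard library has no t-distribution quantile function
t_crit = 2.776
lo, hi = b - t_crit * se_b, b + t_crit * se_b
# If a hypothesized slope (e.g. 0) falls outside [lo, hi], it would be
# judged significant at alpha = 0.05; inside, non-significant.
```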

Aishwarya: How do we calculate the p-value of the estimate for the slope? How do we find out whether the calculated slope is significant or not?

Antonio Trujillo-Ortiz: Hi Ainundil,

This is parameter estimation for a particular regression model. As such, it is important to know the confidence intervals (CIs) of the regression parameters obtained from the sample estimators. So the alpha value is simply the significance level used for our CIs (P = 1 - alpha).

Best,

Antonio Trujillo-Ortiz

P.S. You may supply three input arguments: x, y and the alpha value. If you input only x and y, the file automatically takes alpha as 0.05 by default.

Ainundil: Works great,

but you do not explain what alpha is. I assume it is for the confidence interval. However, it does not give the same confidence intervals as in Ricker (1973, table 6). That, or I am not sure how to enter the alpha parameter.

Thanks for the function

Antonio Trujillo-Ortiz: The slope sign bug was corrected thanks to the valuable suggestions given by Holger Goerlitz and Joel E. Cohen. Yes, negative slopes are now always negative!

Antonio Trujillo-Ortiz

Holger Goerlitz: Thank you very much, very well done and works great.

A quick comparison with the rma.m by Edward T. Peltzer (http://web.ics.purdue.edu/~braile/eas309/rma.m) gives identical results.

Except for one error, I believe: gmregress always returns a positive slope, even for data with a negative trend. After correcting the slope (compare to rma.m) with:

si = r/abs(r); % sign of correlation coefficient

b = si*b;

negative slopes come out negative.

Thanks for the function again,

Holger

Pete: Well documented. Appears to work as advertised.

I would invite somebody to compare and contrast this script with the scripts by Edward T. Peltzer hosted at: http://www.mbari.org/staff/etp3/regress.htm

(which encouragingly produce identical slopes)