# [Rd] Optimization in R

From: Andrew Clausen <clausen_at_econ.upenn.edu>
Date: Sat, 04 Aug 2007 01:12:31 -0400

Hi all,

I've been working on improving R's optim() command, which does general-purpose unconstrained optimization. Obviously, this is important for many statistical computations, such as maximum likelihood, method of moments, etc. I have focused my efforts on the BFGS method, mainly because it best matches my current projects.

Here's a quick summary of what I've done:

My own implementation has several advantages over optim()'s implementation (which you can see in the vmmin() function in the R sources):

• the linesearch algorithm (Moré-Thuente) quickly finds a region of interest to zoom into. Moreover, it strikes a much better balance between finding a point that adequately improves upon the old point and not wasting too much time searching for a much better one. (Unlike optim(), it uses the standard Wolfe conditions with weak parameters.)
• the linesearch algorithm uses interpolation, so it finds an acceptable point more quickly.
• it implements "box" constraints.
• the code is easier to understand and modify, partly because it's written in R.

Of course, this comes at the (slight?) overhead cost of being written in R.
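For concreteness, here is a minimal sketch of a line search enforcing the weak Wolfe conditions (sufficient decrease plus curvature) with the common parameters c1 = 1e-4 and c2 = 0.9. This is my own illustrative bracketing routine, not the Moré-Thuente algorithm itself, and `wolfe_linesearch` is a hypothetical name:

```r
# Sketch of a line search satisfying the weak Wolfe conditions.
# f: objective, grad: gradient, x: current point, d: descent direction.
wolfe_linesearch <- function(f, grad, x, d, c1 = 1e-4, c2 = 0.9,
                             alpha = 1, max_iter = 50) {
  f0 <- f(x)
  g0 <- sum(grad(x) * d)  # directional derivative at alpha = 0
  lo <- 0; hi <- Inf
  for (i in seq_len(max_iter)) {
    fx <- f(x + alpha * d)
    if (fx > f0 + c1 * alpha * g0) {
      hi <- alpha                          # sufficient decrease failed: shrink
    } else if (sum(grad(x + alpha * d) * d) < c2 * g0) {
      lo <- alpha                          # curvature condition failed: grow
    } else {
      return(alpha)                        # both Wolfe conditions hold
    }
    alpha <- if (is.finite(hi)) (lo + hi) / 2 else 2 * lo
  }
  alpha
}
```

The Moré-Thuente method additionally uses polynomial interpolation to pick trial steps inside the bracket, which is what lets it accept a point in so few function evaluations.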

The test suite takes the first few functions from the paper

```
Moré, Garbow, and Hillstrom, "Testing Unconstrained
Optimization Software", ACM Trans. Math. Softw. 7:1 (March 1981)
```

The test results appear below, where "*" means "computed the right solution", and "!" means "got stuck".

```
test                    optim           clausen         gsl
--------------------------------------------------------------
bard                                                    !
beale
brown-scaled
freudenstein-roth
gaussian                                                *
helical-valley          *               *
jennrich-sampson                                        *
meyer                                                   *
powell-scaled                           *
rosenbrock                              *
```

The table indicates that all three implementations of BFGS failed to compute the right answer in most cases. I suppose this means they are all quite deficient. Of course, this doesn't imply that they perform badly on real statistics problems -- but in my limited experience with my crude econometric models, they do. Indeed, that's why I started investigating in the first place.
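For reference, each table cell reports the outcome of a run like the one below: a BFGS implementation applied to one of the Moré-Garbow-Hillstrom functions from its standard starting point. This snippet is my own sketch using optim(), not the actual test harness:

```r
# Rosenbrock function and its gradient, one of the test functions above.
rosenbrock <- function(p) {
  x <- p[1]; y <- p[2]
  100 * (y - x^2)^2 + (1 - x)^2
}
rosenbrock_grad <- function(p) {
  x <- p[1]; y <- p[2]
  c(-400 * x * (y - x^2) - 2 * (1 - x),
    200 * (y - x^2))
}
# Standard starting point (-1.2, 1); the true minimum is at (1, 1).
res <- optim(c(-1.2, 1), rosenbrock, rosenbrock_grad, method = "BFGS")
```

Whether res$par lands near (1, 1) depends on the implementation and its stopping rules -- which is exactly what the table is probing.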

For what it's worth, I think:

• the optimization algorithms should be written in R -- the overhead is small compared to the cost of evaluating likelihood functions anyway, and is easily repaid by the better algorithms that become possible.
• it would be useful to keep a repository of interesting optimization problems relevant to R users. Then R developers can evaluate "improvements".
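Such a repository could be as simple as a list of (function, starting point, known solution) triples plus a checker. The `problem` and `check` names below are hypothetical sketches of mine, not an existing interface:

```r
# A candidate format for one entry in a shared test-problem repository.
problem <- list(
  name     = "rosenbrock",
  f        = function(p) 100 * (p[2] - p[1]^2)^2 + (1 - p[1])^2,
  start    = c(-1.2, 1),
  solution = c(1, 1)
)

# Run a given optim() method on a problem and report whether it
# recovered the known solution to within a tolerance.
check <- function(problem, method = "BFGS", tol = 1e-4) {
  res <- optim(problem$start, problem$f, method = method)
  max(abs(res$par - problem$solution)) < tol
}
```

With entries in this shape, an R developer could rerun every stored problem after changing an optimizer and see at a glance which cells of a table like the one above flip.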

Cheers,
Andrew

R-devel_at_r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel Received on Sat 04 Aug 2007 - 05:22:00 GMT

Archive maintained by Robert King, hosted by the discipline of statistics at the University of Newcastle, Australia.
Archive generated by hypermail 2.2.0, at Wed 08 Aug 2007 - 04:37:59 GMT.
