Re: [Rd] Master's project to coerce linux nvidia drivers to run generalised linear models

From: Marc Schwartz (via MN) <mschwartz_at_mn.rr.com>
Date: Mon 23 Jan 2006 - 21:47:38 GMT

On Mon, 2006-01-23 at 15:24 -0500, Oliver LYTTELTON wrote:
>
> Hi,
>
> I am working with a friend on a master's project. Our laboratory does a
> lot of statistical analysis using the R stats package, and we also have
> a lot of under-utilised NVIDIA cards sitting in the back of our
> networked Linux machines. Our idea is to coerce the Linux NVIDIA driver
> to run some of our statistical analysis for us. Our first thought was
> to code up a version of glm() specifically to run on the NVIDIA
> cards...
>
> Thinking that this might be of use to the broader community, we thought
> we might ask for feedback before starting.
>
> Any ideas...
>
> Thanks,
>
> Olly

Well, I'll bite.

My first reaction to this was: why?

Then I did some Googling and found the following article:

http://www.apcmag.com/apc/v3.nsf/0/5F125BA4653309A3CA25705A0005AD27

And also noted the GPU Gems 2 site here:

http://developer.nvidia.com/object/gpu_gems_2_home.html

So, my newfound perspective is: why not?
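
For what it's worth, the numerical core of glm() is iteratively
reweighted least squares (IRLS): each iteration reduces to one dense
weighted least-squares solve, and that linear algebra is exactly the
kind of kernel you would try to push onto the card. Here is a minimal
plain-R sketch of that loop for logistic regression, just to make the
offload target concrete (irls_logit is a made-up name, not anything in
the stats package; this is ordinary CPU R, shown only to identify the
step a GPU port would replace):

## Minimal IRLS for a logistic-regression GLM (canonical logit link).
## The crossprod()/solve() step inside the loop is the dense linear
## algebra that a GPU implementation would take over.
irls_logit <- function(X, y, tol = 1e-8, maxit = 25) {
  beta <- rep(0, ncol(X))
  for (it in seq_len(maxit)) {
    eta <- drop(X %*% beta)       # linear predictor
    mu  <- plogis(eta)            # inverse link (logit)
    w   <- mu * (1 - mu)          # IRLS weights
    z   <- eta + (y - mu) / w     # working response
    ## Weighted least squares: solve t(X) W X beta = t(X) W z
    beta_new <- drop(solve(crossprod(X, w * X), crossprod(X, w * z)))
    done <- max(abs(beta_new - beta)) < tol
    beta <- beta_new
    if (done) break
  }
  beta
}

## Quick sanity check against glm() on simulated data:
set.seed(1)
X <- cbind(1, matrix(rnorm(200), ncol = 2))
y <- rbinom(100, 1, plogis(drop(X %*% c(-0.5, 1, 2))))
irls_logit(X, y)
coef(glm(y ~ X - 1, family = binomial()))

Getting agreement with glm()'s coefficients on examples like that seems
like a sensible first milestone before worrying about the GPU side at
all.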

Best wishes for success, especially since I have a certain affinity for McGill...

HTH, Marc Schwartz



R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
