Re: [Rd] Master's project to coerce linux nvidia drivers to run generalised linear models

From: Marc Schwartz (via MN) <>
Date: Mon 23 Jan 2006 - 21:47:38 GMT

On Mon, 2006-01-23 at 15:24 -0500, Oliver LYTTELTON wrote:
> Hi,
> I am working with a friend on a master's project. Our laboratory does a
> lot of statistical analysis using the R stats package and we also have a
> lot of under-utilised nvidia cards sitting in the back of our networked
> linux machines. Our idea is to coerce the linux nvidia driver to run
> some of our statistical analysis for us. Our first thought was to
> specifically code up a version of glm() to run on the nvidia cards...
> Thinking that this might be of use to the broader community, we thought
> we might ask for feedback before starting.
> Any ideas...
> Thanks,
> Olly

Well, I'll bite.

My first reaction to this was, why?

Then I did some Googling and found the following article:

And also noted the GPU Gems 2 site here:

So, my newfound perspective is, why not?
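For anyone wondering where the GPU would actually help: glm() fits models by iteratively reweighted least squares (IRLS), and the dense weighted least-squares solve inside each iteration is the linear-algebra kernel a GPU port would offload. A minimal sketch of that loop, written in Python/NumPy rather than R purely for illustration (the function name and Poisson/log-link choice are my own, not anything from the thread):

```python
import numpy as np

def irls_poisson(X, y, n_iter=25, tol=1e-8):
    """Fit a Poisson GLM (log link) by iteratively reweighted least squares.

    The weighted normal-equations solve inside the loop is the dense
    linear-algebra step a GPU implementation of glm() would target.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta            # linear predictor
        mu = np.exp(eta)          # inverse link (log link)
        W = mu                    # working weights: (dmu/deta)^2 / Var(y) = mu
        z = eta + (y - mu) / mu   # working response
        # Weighted normal equations: (X' W X) beta = X' W z
        XtW = X.T * W
        beta_new = np.linalg.solve(XtW @ X, XtW @ z)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```

The two matrix products and the solve are the expensive parts for wide design matrices, so that is the natural cut point between host code and GPU code.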

Best wishes for success, especially since I have a certain affinity for McGill...

HTH,

Marc Schwartz

Received on Tue Jan 24 08:53:19 2006

This archive was generated by hypermail 2.1.8 : Tue 24 Jan 2006 - 04:04:19 GMT