[R] Quick partitioning

From: Anna Oganyan <aoganyan_at_niss.org>
Date: Fri 26 Aug 2005 - 07:55:46 EST

I am quite new to R, and I have one problem: I have large d-dimensional data sets (d = 2, 3, 6, 10). I would like to divide the d-dim space into n (n may be 10, but preferably larger, for example 20) equally sized bins per dimension, giving a grid of d-dim hypercubes, and count how many data points fall in each cube. Is there any way to do this quickly, I mean in a reasonable time? My goal is to get a rough idea of the underlying densities of these data sets and compare them. Thanks a lot!
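One possible approach (a sketch, not from the original thread): rescale each coordinate to [0, 1], compute an integer bin index per dimension with `floor`, combine the per-dimension indices into a single cell id, and count with `tabulate`. The variable names and the toy data below are illustrative. Note that a dense count vector has n^d entries, so for d = 10 and n = 20 you would need a sparse representation (e.g. `table` on a string key, shown at the end) rather than `tabulate`.

```r
set.seed(1)
d <- 3; n <- 10
x <- matrix(runif(5000 * d), ncol = d)   # toy data, already in [0, 1]^d

# Rescale each column to [0, 1] (harmless here; needed for real data).
x <- apply(x, 2, function(v) (v - min(v)) / (max(v) - min(v)))

# Bin index along each axis, in 1..n; pmin guards the boundary value 1.
idx <- pmin(floor(x * n) + 1, n)

# Collapse the d per-axis indices into one cell id in 1..n^d
# (mixed-radix encoding, base n), then count points per cell.
cell <- as.vector((idx - 1) %*% n^(0:(d - 1))) + 1
counts <- tabulate(cell, nbins = n^d)

# Sparse alternative for large n^d: count only occupied cells.
counts_sparse <- table(apply(idx, 1, paste, collapse = "-"))
```

`counts` then holds one entry per hypercube, and `sum(counts)` equals the number of data points, so the vector can be normalized into a crude histogram density estimate for comparing data sets.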

R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
Received on Fri Aug 26 07:59:43 2005

This archive was generated by hypermail 2.1.8 : Fri 03 Mar 2006 - 03:39:56 EST