Re: [R] Quick partitioning

From: Prof Brian Ripley <ripley_at_stats.ox.ac.uk>
Date: Fri 26 Aug 2005 - 16:23:21 EST

On Thu, 25 Aug 2005, Anna Oganyan wrote:

> Hello,
> I am quite new to R, and I have one problem:
> I have large d-dimensional data sets (d=2, 3, 6, 10). I would like to
> divide the d-dim space into n (n may be 10, but better some larger
> number, for example 20) equally sized d-dim hypercubes and count how
> many data points are in each cube. Is there any way to do it quickly, I
> mean - in a reasonable time? Actually, I want to get some rough idea
> of underlying densities of these data and compare them.
> Thanks a lot!
> Anna

How do you divide a 10-D space into 10 hypercubes? You would need to leave at least some of the dimensions undivided.

The general idea is easy: apply cut() to each dimension, so your dimensions become factors, then table() will produce the counts. That will be quick enough for millions of points.
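A minimal sketch of that idea, assuming the data sit in a matrix `x` with one column per dimension (the names `x`, `n`, and `f` below are illustrative, not from the original message). Note that the table has n^d cells, which is why n = 20 bins in d = 10 dimensions is impractical (20^10 cells):

```r
## Sketch of the cut()/table() approach for counting points per hypercube.
## Assumes data in a matrix 'x', one column per dimension.
set.seed(1)
x <- matrix(runif(3000), ncol = 3)   # 1000 points in d = 3 dimensions
n <- 5                               # bins per dimension (5^3 = 125 cells)

## cut() each column into n equal-width intervals, giving a list of factors
f <- lapply(seq_len(ncol(x)), function(j) cut(x[, j], breaks = n))

## table() on the list of factors gives the d-way contingency table of counts
counts <- table(f)

## every point falls in exactly one cell
stopifnot(sum(counts) == nrow(x))
```

For a rough density estimate, `counts / nrow(x)` gives the empirical probability mass in each cell; `as.data.frame(counts)` flattens the d-way table if only the non-empty cells are of interest.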

-- 
Brian D. Ripley,                  ripley@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
