[DGD] Re: Clones and very large arrays
Erwin Harte
harte at is-here.com
Sat Apr 3 16:47:33 CEST 2004
On Fri, Apr 02, 2004 at 11:59:12PM +0000, Robert Forshaw wrote:
> What are the problems with having very large arrays? In my lib, every
> master has an array containing a list of its clones. I expect the number of
> clones for one particular object to be in the thousands, and potentially,
> tens to hundreds of thousands. Are there any issues to be wary of with an
> array of this size? Will it be CPU intensive to delete a single element in
> an array of that size? (i.e. clones -= ({
> a_clone_that_has_just_been_destructed }) when sizeof(clones) > 100000).
It's not the CPU usage that will get you, but the fact that DGD has a
limit on array size lower than that.
From dgd/src/config.c:
{ "array_size", INT_CONST, FALSE, FALSE,
1, USHRT_MAX / 2 },
Assuming that an unsigned short is a 2-byte type, as it is on my
computer, that means array_size can be anywhere between 1 and
USHRT_MAX / 2 = 32767.
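For reference, the tunable is set in the config file that DGD is
started with; the line would look roughly like this (the value shown
is simply the maximum the cap above allows):

    array_size = 32767;    /* upper bound imposed by config.c */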
> If this kind of use of arrays is deprecated, how should I keep a
> record of clones?
Michael gave you the solution that others (myself included) have been
using with a good amount of success. You may want to wrap it into a
LWO so you don't have to see the ugly details, but that's about it. :)
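The details of that approach aren't quoted here, but the usual idea is
to spread the clones over a mapping of smaller arrays, so that no
single array ever gets near array_size and removing one clone only
rewrites one small chunk. A rough sketch in LPC, with made-up names
(CHUNK, add_clone, remove_clone) that aren't from this thread and may
differ from what Michael actually posted:

# define CHUNK 1000              /* keep each array well below array_size */

private mapping clones;          /* ([ int chunk : ({ object ... }) ]) */

/* extract the clone number from an object name like "/obj/thing#1234" */
private int clone_number(object obj)
{
    string base;
    int num;

    if (sscanf(object_name(obj), "%s#%d", base, num) == 2) {
        return num;
    }
    return 0;                    /* master or non-clone: falls into chunk 0 */
}

void add_clone(object obj)
{
    int chunk;

    if (clones == nil) {
        clones = ([ ]);
    }
    chunk = clone_number(obj) / CHUNK;
    if (clones[chunk] == nil) {
        clones[chunk] = ({ });
    }
    clones[chunk] += ({ obj });
}

/* call this before the clone is actually destructed */
void remove_clone(object obj)
{
    int chunk;

    chunk = clone_number(obj) / CHUNK;
    if (clones != nil && clones[chunk] != nil) {
        clones[chunk] -= ({ obj });    /* only one small array is rewritten */
        if (sizeof(clones[chunk]) == 0) {
            clones[chunk] = nil;       /* drop empty chunks from the mapping */
        }
    }
}

Since each chunk can only ever hold CHUNK clones, none of the arrays
can hit the array_size limit, and removing a destructed clone never
means rewriting a 100,000-element array. Wrapping it in an LWO, as
mentioned above, just hides add_clone/remove_clone behind a nicer
interface.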
Hope that helps,
Erwin.
--
Erwin Harte <harte at is-here.com>