On Wed, 16 Jan 2002, Frank Heckenbach wrote:
Toby Ewing wrote:
I'm working with some large arrays (2^27 elements), and am about to run into memory limitations. Currently I'm using a ByteCard for each element, but in expanding my program's capabilities, I want to add a second array of cardinals. The values in this second array need a range larger than 2^16 (65535), but I don't have enough memory for a full 32-bit cardinal per element. Sounds like 24-bit cardinals would be appropriate...
.. how much speed will I lose in memory access?
Given this simple test program:
program speed;

var
  ba: array[1..64000000] of integer;
  i: integer;

begin
  for i := 1 to 64000000 do
    ba[i] := 123;
  writeln('sizeof array = ', sizeof(ba))
end.
Running it with "time speed" gave this result:
sizeof array = 256000000
real 5.114s
user 1.810s
sys  3.300s
Changing ba to a packed array[1..64000000] of integer(24) gives this:
sizeof array = 256000000
real 18.407s
user 16.040s
sys   2.370s
I have 384 MB of memory, so the 256 MB array didn't trigger virtual-memory swapping.
I think the factor-of-3 slowdown shows the array really is being packed, even though the reported size is wrong: a packed array of 64,000,000 24-bit integers should occupy 192,000,000 bytes, not 256,000,000.
Hope this helps, Russ