Hi, textfile can handle bigger files than msgfile, but there seems to be a limit, too. I tried to open a >250MB file, and Pd crashed. marius.
On Sat, 27 Oct 2007, marius schebella wrote:
> textfile can handle bigger files than msgfile, but there seems to be a limit, too. I tried to open a >250MB file, and Pd crashed. marius.
If you measure bytes using an int32, the limit is 2GB, but if you measure it in bits, the limit is 256MB. [textfile] reads the whole file into a buffer as big as the file. I didn't find anything that counts memory in bits, so perhaps it's just a coincidence and the bug was something else, but I did fix another 256MB-limit bug recently by adding parentheses in something like:

num_entries * bits_per_entry / 8

When crossing the limit, the numbers become negative, and trying to allocate a negative amount causes the allocator to either abort the process, return NULL, or corrupt memory...
_ _ __ ___ _____ ________ _____________ _____________________ ...
| Mathieu Bouchard - tél:+1.514.383.3801, Montréal QC Canada
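[A minimal C sketch of the overflow described above; the variable names come from Mathieu's example, and the parenthesized form is an assumption about what "adding parentheses" actually looked like:]

    #include <stdio.h>

    int main(void)
    {
        /* 300MB worth of 8-bit entries: fine as a byte count, but the
         * intermediate bit count exceeds INT_MAX (2^31 - 1 bits = 256MB). */
        int num_entries = 300 * 1024 * 1024;
        int bits_per_entry = 8;

        /* Signed overflow is technically undefined in C; on common
         * machines the product wraps around and goes negative. */
        int broken = num_entries * bits_per_entry / 8;

        /* Dividing first keeps every intermediate value in range
         * (an assumed reconstruction of the parenthesized fix). */
        int fixed = num_entries * (bits_per_entry / 8);

        printf("broken: %d\nfixed: %d\n", broken, fixed);
        return 0;
    }

Passing the negative result through an unsigned size_t parameter turns it into a gigantic allocation request, which is why the failure mode can be anything from a NULL return to a killed process.]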
I tried something different, which also did not work: split the file in three parts, each around 100MB, and load them into 3 separate textfiles. When I try to load the second file, I get an error saying "pd: resizebytes() failed -- out of memory". If I try to load the 3rd file after that, my system freezes completely, and I have to reboot. This will be used for an installation, running day and night, so I am using [clear( and will load the files at some time during the night (it takes about ten seconds to load a file, so I guess I can do that around 1pm). 'If you are not cheating, you are not doing it right.' (I learned that from the last blender camp.) marius.
Mathieu Bouchard wrote:
> If you measure bytes using an int32, the limit is 2GB, but if you measure it in bits, the limit is 256MB. [...] When crossing the limit, the numbers become negative, and trying to allocate a negative amount causes the allocator to either abort the process, return NULL, or corrupt memory...
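[As an aside, a minimal sketch -- not Pd's actual code -- of a file reader that fails cleanly when its buffer cannot grow; the error string is the one marius quotes above, and realloc() leaves the old block valid on failure, so the caller can still free it:]

    #include <stdio.h>
    #include <stdlib.h>

    /* Grow a buffer the way a resizebytes()-style wrapper might,
     * but report failure instead of crashing. */
    static void *xresizebytes(void *p, size_t newsize)
    {
        void *ret = realloc(p, newsize);
        if (!ret)
            fprintf(stderr, "pd: resizebytes() failed -- out of memory\n");
        return ret;
    }

    /* Read a whole file into one doubling buffer; on allocation
     * failure, free everything and return NULL. */
    static char *read_whole_file(const char *path, size_t *lenp)
    {
        FILE *f = fopen(path, "rb");
        size_t cap = 1u << 20, len = 0;     /* start at 1MB */
        char *buf = f ? malloc(cap) : NULL, *tmp;
        while (f && buf)
        {
            len += fread(buf + len, 1, cap - len, f);
            if (len < cap)
                break;                      /* hit EOF or a read error */
            tmp = xresizebytes(buf, cap *= 2);
            if (!tmp)
            {
                free(buf);                  /* give up cleanly */
                buf = NULL;
            }
            else buf = tmp;
        }
        if (f) fclose(f);
        if (buf && lenp) *lenp = len;
        return buf;
    }

[Even with checks like these, three separate ~100MB buffers can fail on a 32-bit system once the address space is fragmented, which would be consistent with the second load failing while the first succeeds.]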
On Sat, 27 Oct 2007, marius schebella wrote:
> I tried something different, which also did not work: split the file in three parts, each around 100MB, and load them into 3 separate textfiles. When I try to load the second file, I get an error saying "pd: resizebytes() failed -- out of memory". If I try to load the 3rd file after that, my system freezes completely, and I have to reboot.
Is Pd somehow allocating memory in a no-swap zone? That way you can run out of memory much more quickly, but it requires root permissions.

You may also be running out of memory in general. If you don't have a swap file, then all your memory is no-swap all of the time.

On 32-bit Linux there's a per-process limit of somewhere between 1GB and 3.5GB. I don't know exactly how much, and you'd probably hit an actual allocation failure well before the absolute max, but someone told me he has allocated 2GB, so it certainly isn't below 1GB.

So I'm really puzzled. Does your system have anything special about RAM? Any special limits? (Settings that say "max 256MB per process" and such.)
_ _ __ ___ _____ ________ _____________ _____________________ ...
| Mathieu Bouchard - tél:+1.514.383.3801, Montréal QC Canada
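[One way to check for such per-process caps -- my suggestion; nothing in the thread names a tool -- is POSIX getrlimit(), which reads the same limits the shell's ulimit command reports:]

    #include <stdio.h>
    #include <sys/resource.h>

    /* Print the caps that make malloc() fail early: RLIMIT_AS is the
     * total address-space limit, RLIMIT_DATA the data-segment limit.
     * RLIM_INFINITY prints as a huge number and means "no special
     * limit is set". */
    static void show(const char *name, int resource)
    {
        struct rlimit rl;
        if (getrlimit(resource, &rl) == 0)
            printf("%s: soft=%llu hard=%llu\n", name,
                (unsigned long long)rl.rlim_cur,
                (unsigned long long)rl.rlim_max);
    }

    int main(void)
    {
        show("RLIMIT_AS", RLIMIT_AS);
        show("RLIMIT_DATA", RLIMIT_DATA);
        return 0;
    }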
On Oct 27, 2007, at 2:44 PM, Mathieu Bouchard wrote:
> On Sat, 27 Oct 2007, marius schebella wrote:
>> textfile can handle bigger files than msgfile, but there seems to be a limit, too. I tried to open a >250MB file, and Pd crashed. marius.
> If you measure bytes using an int32, the limit is 2GB, but if you measure it in bits, the limit is 256MB. [...] When crossing the limit, the numbers become negative, and trying to allocate a negative amount causes the allocator to either abort the process, return NULL, or corrupt memory...
If this is a bug in Pd, it would be great if you submitted a patch, or at least posted more info here.
.hc
kill your television
On Sat, 27 Oct 2007, Hans-Christoph Steiner wrote:
> On Oct 27, 2007, at 2:44 PM, Mathieu Bouchard wrote:
>> When crossing the limit, the numbers become negative, and trying to allocate a negative amount causes the allocator to either abort the process, return NULL, or corrupt memory...
> If this is a bug in Pd, it would be great if you submitted a patch, or at least posted more info here.
I haven't found the bug, if there is one, but I did look for it for a while.
I only talked about a similar-looking bug that I had before, which in the end might not be related.
_ _ __ ___ _____ ________ _____________ _____________________ ...
| Mathieu Bouchard - tél:+1.514.383.3801, Montréal QC Canada
Is that file ASCII or Unicode? If it's Unicode, you might make it smaller by using an ASCII encoding.
On Sat, Oct 27, 2007 at 02:22:08PM -0400, marius schebella wrote:
> Hi, textfile can handle bigger files than msgfile, but there seems to be a limit, too. I tried to open a >250MB file, and Pd crashed. marius.
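[If the file does turn out to be UTF-16, a conversion along these lines would roughly halve it, since UTF-16 spends two bytes on every ASCII character. A minimal POSIX iconv(3) sketch; the encoding names are assumptions, and real data may need a target like ASCII//TRANSLIT or UTF-8:]

    #include <stdio.h>
    #include <iconv.h>

    /* Convert a tiny UTF-16LE buffer to plain ASCII with iconv(3).
     * Characters outside ASCII make iconv() fail, hence the checks.
     * glibc has iconv built in; some systems need -liconv to link. */
    int main(void)
    {
        char in[] = { 'h', 0, 'i', 0 };       /* "hi" in UTF-16LE */
        char out[16];
        char *inp = in, *outp = out;
        size_t inleft = sizeof in, outleft = sizeof out;

        iconv_t cd = iconv_open("ASCII", "UTF-16LE");
        if (cd == (iconv_t)-1) { perror("iconv_open"); return 1; }
        if (iconv(cd, &inp, &inleft, &outp, &outleft) == (size_t)-1)
            perror("iconv");
        iconv_close(cd);

        fwrite(out, 1, sizeof out - outleft, stdout);  /* prints "hi" */
        putchar('\n');
        return 0;
    }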