OF 0.9.3 ofPixels max size allocation

I have an issue with ofPixels: on both Win10 and Lubuntu
the maximum size I can allocate is 8191x8191. Windows reports
GL_MAX_TEXTURE_SIZE to be 16384. Is there a special reason for this limit? I assume that ofPixels is not bound to any GPU texture memory at all and gets completely allocated on the heap, right?
I can allocate much larger chunks of memory than 8191x8191x4 bytes with malloc on the heap without issues.
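
For reference, this is roughly the kind of raw check I mean (a minimal standalone sketch, not openFrameworks code; 16384x16384x4 is just an example size well above 8191x8191x4):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main()
    {
        size_t bytes = (size_t)16384 * 16384 * 4; // 1 GiB, well above 8191x8191x4
        void *buf = malloc(bytes);
        if (buf != NULL)
        {
            memset(buf, 0, bytes); // touch the pages so they are really committed
            printf("ok: %zu bytes allocated\n", bytes);
            free(buf);
        }
        else
        {
            printf("malloc failed\n");
        }
        return 0;
    }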

You certainly should be able to do this, as ofPixels should only be limited by your available RAM/Swap memory. Can you post some sample code? What errors are you getting exactly?

On Windows 10 I get no console error at all, just a crash (release, 64-bit).
On Lubuntu 64-bit I get:

terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc

// this lets me allocate all available memory
    static void testMaxMemory()
    {
        // check the process limits first with: ulimit -a

        size_t oneHundredMiB = 100 * 1048576; // 100 * 1024 * 1024 bytes, https://en.wikipedia.org/wiki/Mebibyte
        size_t maxMemBytes = 0;
        void *memPointer = NULL;
        do
        {
            if (memPointer != NULL)
            {
                printf("Max Tested Memory = %zu bytes\n", maxMemBytes);
                printf("Max Tested Memory = %zu KiB\n", maxMemBytes / 1024);
                printf("Max Tested Memory = %zu MiB\n", maxMemBytes / 1024 / 1024);
                printf("Max Tested Memory = %zu GiB\n", maxMemBytes / 1024 / 1024 / 1024);
                printf("\n");
                memset(memPointer, 0, maxMemBytes); // touch the pages so they are actually committed
                free(memPointer);
            }
            maxMemBytes += oneHundredMiB;
            memPointer = malloc(maxMemBytes); // try a block 100 MiB larger each round
        } while (memPointer != NULL);
        printf("Max Usable Memory approx = %zu bytes\n", maxMemBytes - oneHundredMiB);
        printf("result may differ between 32-bit and 64-bit systems or builds.\n");
    }

// this crashes at 8192 px
    ofPixels p;
    size_t dim = 8180;
    for (size_t i = 0; i < 100000; i++)
    {
        p.allocate(dim, dim, OF_PIXELS_RGBA);
        ofLogNotice("test pixels") << "ok: allocated pixels: " << dim << "x" << dim;
        dim++;
    }
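
If it helps to pinpoint where it gives up without the whole app crashing, a variant that catches std::bad_alloc (which is what the Linux run shows being thrown) might look like this. This is only a sketch and assumes the exception propagates out of allocate(); on Windows the release-mode crash may not surface as a catchable exception at all.

    // sketch: same loop, but report the first size that fails instead of aborting
    // (std::bad_alloc is declared in <new>)
    ofPixels p;
    size_t dim = 8180;
    for (size_t i = 0; i < 100000; i++)
    {
        try
        {
            p.allocate(dim, dim, OF_PIXELS_RGBA);
            ofLogNotice("test pixels") << "ok: allocated pixels: " << dim << "x" << dim;
        }
        catch (const std::bad_alloc &)
        {
            ofLogError("test pixels") << "failed to allocate " << dim << "x" << dim;
            break;
        }
        dim++;
    }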

For reference, I ran a slightly modified version of your code on OSX / Ubuntu.

I am using the master branch, and I believe there have been some fixes in ofPixels since 0.9.3 that had to do with memory allocation.

Anyway, here are my tests / results.

    ofPixels p;
    for (size_t dim = 0; dim < 100000; dim++)
    {
        p.allocate(dim, dim, OF_PIXELS_RGBA);
        ofLogNotice("test pixels") << "ok: allocated pixels: " << dim << "x" << dim;
        dim++; // together with the loop header's dim++, this advances dim by 2 per iteration (hence the even sizes below)
    }
[notice ] test pixels: ok: allocated pixels: 99992x99992
[notice ] test pixels: ok: allocated pixels: 99994x99994
[notice ] test pixels: ok: allocated pixels: 99996x99996
[notice ] test pixels: ok: allocated pixels: 99998x99998

It completed successfully.

I then tested it on Ubuntu 16.04 and got this:

[notice ] test pixels: ok: allocated pixels: 27670x27670
[notice ] test pixels: ok: allocated pixels: 27672x27672
[notice ] test pixels: ok: allocated pixels: 27674x27674
[notice ] test pixels: ok: allocated pixels: 27676x27676
terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
Aborted (core dumped)
/home/dev/openFrameworks/libs/openFrameworksCompiled/project/makefileCommon/compile.project.mk:169: recipe for target 'run' failed
make: *** [run] Error 134

which makes sense, because I gave my virtual machine only about 3 GB of memory (the next size in the sequence, 27678x27678x4 bytes, is already about 3 GB).

So I'm not sure; my advice would be to test it with the master branch on your machine.

We recently changed ofPixels to work internally with size_t instead of int, which lets master support much larger sizes than 0.9.3, where the maximum total size would be 2^16 = 65536.
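
As an illustration of why that matters (just a sketch, not the actual ofPixels internals): the byte count of a large RGBA image overflows what a 32-bit int can hold long before it becomes a problem for size_t on a 64-bit build.

    // illustration only, not the ofPixels implementation:
    // total byte count of a large RGBA image vs. what a 32-bit signed int can hold
    size_t w = 50000, h = 50000, channels = 4;
    size_t totalBytes = w * h * channels;  // 10,000,000,000 bytes
    size_t intMax = 2147483647;            // largest 32-bit signed int value
    cout << "total bytes: " << totalBytes << endl;
    cout << "fits in a 32-bit int? " << (totalBytes <= intMax ? "yes" : "no") << endl; // prints "no"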

Thank you Christopher, thank you Arturo

I confirm that it works now on Linux 64-bit. Great!