ofTexture GPU usage the same despite different graphics cards

Hi,

I’ve been stuck on this question for a while now, and any insight would be appreciated. The goal is to determine how many HD image sequences (120 frames at 30 fps each) a computer can play back simultaneously. The computer I want to use for the project (a Hackintosh) has a good graphics card (an Nvidia GTX 1060 6GB), so naturally I want to use ofTextures.

I wrote an ofTexture test to check the capabilities (a simplified sketch of it is included after the results below). Here are the findings — spoiler alert: they are the same on both machines!

On the Hackintosh (Nvidia GTX 1060 6GB): 20 videos at 20 fps
On the 2013 MBP (Nvidia GT 750M): 20 videos at 20 fps
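
For reference, the test is roughly the sketch below (simplified, not the exact code; the folder name, file naming scheme, and grid layout are placeholders). Each sequence is uploaded once into ofTextures in setup(), and draw() just draws the current frame of every sequence while I watch the frame rate:

```cpp
// ofApp.h — simplified sketch of the test (openFrameworks)
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    void setup(){
        ofSetFrameRate(30);
        // Load each "video" as 120 HD frames, uploaded once into GPU textures.
        // "frames/frame_0000.png" is a placeholder naming scheme.
        for(int v = 0; v < numVideos; v++){
            std::vector<ofTexture> seq;
            for(int i = 0; i < 120; i++){
                ofTexture tex;
                ofLoadImage(tex, "frames/frame_" + ofToString(i, 4, '0') + ".png");
                seq.push_back(tex);
            }
            videos.push_back(seq);
        }
    }

    void update(){
        frame = (frame + 1) % 120;
    }

    void draw(){
        // Draw every sequence's current frame in a grid and watch the frame rate.
        int cols = 5;
        int rows = (numVideos + cols - 1) / cols;
        float w = ofGetWidth() / float(cols);
        float h = ofGetHeight() / float(rows);
        for(int v = 0; v < numVideos; v++){
            videos[v][frame].draw((v % cols) * w, (v / cols) * h, w, h);
        }
        ofDrawBitmapStringHighlight("fps: " + ofToString(ofGetFrameRate(), 1), 20, 20);
    }

    std::vector<std::vector<ofTexture>> videos;
    int numVideos = 20;
    int frame = 0;
};
```

Since all the frames are uploaded in setup(), playback itself should only be drawing textures that are already resident on the GPU.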

It is strange that both machines can apparently only hold a similar amount of texture data before the frame rate drops. I’ve run benchmark tests, and the Hackintosh should perform about five times better.

Writing this out made me realize it is probably something to do with the maximum amount of data in the array…
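
One thing I want to sanity-check is whether the frames even fit in VRAM. Back-of-the-envelope, assuming uncompressed 8-bit RGBA at 1920×1080, one frame is about 8.3 MB, so 120 frames are roughly 1 GB per sequence and 20 sequences come to about 20 GB, far more than the 6 GB on the GTX 1060. A quick way to ask the driver (assuming the GL_NVX_gpu_memory_info extension is actually exposed on these machines, which I haven’t confirmed) would be something like:

```cpp
#include "ofMain.h"

// Enum values from the GL_NVX_gpu_memory_info extension spec, in case the
// GL headers on this platform don't define them.
#ifndef GL_GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX
#define GL_GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX 0x9047
#endif
#ifndef GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX
#define GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX 0x9049
#endif

// Call from ofApp::setup() after loading the textures (needs a live GL context).
void logVramUsage(){
    GLint totalKb = 0, availableKb = 0;
    glGetIntegerv(GL_GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX, &totalKb);
    glGetIntegerv(GL_GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX, &availableKb);
    ofLogNotice("vram") << "used ~" << (totalKb - availableKb) / 1024 << " MB of "
                        << totalKb / 1024 << " MB";
}
```

If the reported available memory doesn’t drop by anywhere near the expected amount, the driver is presumably paging textures out to system RAM, which could explain why both machines hit the same ceiling.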