This is needed for loading TensorFlow.js and the MobileNet model (I placed it at the end of template.html):
<!-- Load TensorFlow.js. This is required to use MobileNet. -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs/dist/tf.min.js"> </script>
<!-- Load the MobileNet model. -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/mobilenet@1.0.0"> </script>
This is needed to classify any texture:
ofFbo fbo;
ofPixels pixels;
fbo.allocate(texture.getWidth(), texture.getHeight(), GL_RGBA);
fbo.begin();
texture.draw(0, 0);
fbo.end();
fbo.readToPixels(pixels);
ofSaveImage(pixels, "screenshot.jpg");
EM_ASM(
    var content = FS.readFile("/data/screenshot.jpg");
    FS.unlink("/data/screenshot.jpg");
    // Use the matching MIME type so the browser decodes the JPEG.
    var blob = new Blob([content], {type: "image/jpeg"});
    var img = new Image();
    var url = URL.createObjectURL(blob);
    // Classify only after the image has been decoded.
    img.onload = function() {
        URL.revokeObjectURL(url);
        // Load the model.
        mobilenet.load().then(model => {
            // Classify the image.
            model.classify(img).then(predictions => {
                console.log('Predictions: ');
                console.log(predictions);
            });
        });
    };
    img.src = url;
);
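Since MobileNet's classify() also accepts a canvas element directly, the JPEG round trip through the virtual filesystem could probably be skipped by handing it the WebGL canvas itself. A sketch, not tested inside OF; the function name and the canvas id "canvas" (the Emscripten default) are my assumptions, and note that reading a WebGL canvas may require preserveDrawingBuffer or calling this right after drawing:

```javascript
// Hypothetical sketch: classify the Emscripten WebGL canvas directly,
// skipping the readToPixels()/ofSaveImage() round trip. Assumes the global
// `mobilenet` object from the script tags above and the canvas id "canvas".
function classifyOFCanvas(canvasId) {
    return mobilenet.load().then(function (model) {
        // classify() accepts img, video and canvas elements as input.
        return model.classify(document.getElementById(canvasId));
    });
}

// Usage (e.g. triggered from C++ via EM_ASM):
// classifyOFCanvas("canvas").then(function (predictions) {
//     console.log(predictions);
// });
```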
Is it possible to load

<!-- Load TensorFlow.js. This is required to use MobileNet. -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs/dist/tf.min.js"> </script>
<!-- Load the MobileNet model. -->
<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/mobilenet@1.0.0"> </script>

in ofSetup instead of template.html (I tried, but without success)?
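One possible workaround, as a sketch only (not tested against ofxEmscripten): inject the script tags from JavaScript at runtime, e.g. kicked off from ofSetup via emscripten_run_script(). The helper name and the sequential loading are my own assumptions; only the two CDN URLs come from the post above.

```javascript
// Hypothetical helper: inject the TensorFlow.js and MobileNet script tags at
// runtime instead of hard-coding them in template.html.
function loadMobileNetScripts(onReady) {
    var urls = [
        "https://cdn.jsdelivr.net/npm/@tensorflow/tfjs/dist/tf.min.js",
        "https://cdn.jsdelivr.net/npm/@tensorflow-models/mobilenet@1.0.0"
    ];
    // Load sequentially: the MobileNet script expects the tf global to exist.
    function loadNext(i) {
        if (i >= urls.length) {
            if (onReady) onReady();
            return;
        }
        var s = document.createElement("script");
        s.src = urls[i];
        s.onload = function () { loadNext(i + 1); };
        document.head.appendChild(s);
    }
    loadNext(0);
}
```

From C++ this could then be triggered with something like emscripten_run_script("loadMobileNetScripts()"), assuming the helper has been defined on the page (or inside the same EM_ASM block).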
Here is an object recognition example: https://object.handmadeproductions.de/
I had to edit library_html5video.js to access the video directly; I am sure there is a better way:
Edit: While it is quite fast on desktop, it is slow on mobile devices (30 vs. 3 fps). Maybe it is because of the use of pixels (I need to try grabber.setUsePixels(false))…
I tested a little more. Actually, my RTX 3090 runs at about 30% utilization for 3D acceleration to reach 30 fps; maybe that is the reason why my mobile phone only reaches about 3 fps…
These are really fun! Both the object recognition and the landmarks run at 30 fps on an M1 Air (7-core GPU), at maybe 65-85% of GPU capacity at that rate. Nice!
I guess I am doing something wrong with this example (body segmentation), because it sometimes crashes and is not fast: https://body.handmadeproductions.de/
Edit: I changed the model to BodyPix; now it seems to run well…
Also, my attempts are all very hacky. It would be nice to put the TensorFlow.js stuff into a kind of ofxEmscripten addon, so that it is at least possible to use it without editing the OF source code…
Can anyone confirm that all of the examples run well on desktop but not on mobile? I always get only around 2-3 fps with my (mid-range Android) phone…
It runs great on macOS Ventura 13.3.1.
On my iPhone 12 mini, in both Safari and DuckDuckGo, it renders one frame of video and then the video does not update; the fps is between 8 and 20, depending on the example.
I wonder if it is possible to port Stable Diffusion to TensorFlow.js…
Here is an attempt (not mine): edhyah/stable-diffusion-tensorflow-js at main
In theory it could work, but honestly I have no idea…