Stream webcam over HTTP

Hi all
I’m working on a headless application (using ofAppNoWindow) that does some image analysis and serves the result data to a node.js webserver running in parallel, which clients can connect to via HTTP to see a dashboard in their web browser.
Now I also want to serve the camera stream to the clients. How would I best go about this?
Any help is greatly appreciated.

I’ve managed to encode an image to base64 so it can be sent over HTTP - though I haven’t actually sent it yet (I still need to do that via a POST request). If the base64 encoding sounds like something that would be of interest, I can share the code.


ofxHTTP has a streaming video server. It streams MJPEG, which browsers can display natively. Here is the server example.


Hi all, so I have an HTTP server running now (using cpp-httplib) and I can return strings on request.

The images I have are cv::Mats containing only the luminosity plane:

// camera is a VideoGrabber
cv::Mat frame = ofxCv::toCv(camera.getPixels().getPlane(0));

How can I convert that luminosity plane into a format I can send as raw bytes, to be interpreted as a JPEG client-side?

For future reference, this is how I solved it:

// frame is a cv::Mat member variable
void lsHttpServer::createServer() {
	server.Get("/frame", [&](const httplib::Request &request,
	                         httplib::Response &response) {
		std::vector<uchar> buffer;
		cv::imencode(".jpg", frame, buffer);
		std::string str(buffer.begin(), buffer.end());
		// image/jpeg (rather than text/plain) lets the browser render it directly
		response.set_content(str, "image/jpeg");
	});
	server.listen("localhost", 1234);
}