Sending Data as Bytes formatted in Hex over TCP

So I am trying to get the dataset I am sending as small as possible.
The format is Microcontroller Pixel R G B.
I am able to send it using a string and sendRaw.

However, I am attempting to truncate that so it sends the hex values as raw bytes using sendRawBytes.
I don’t want any delimiters, but I need a known segment size. I have it working to a degree, but instead of each hex value being one byte, each is 2 bytes. For instance, FF isn’t read as 1 byte, but as one byte for each ‘F’ character. This doesn’t help when I am trying to get the data stream as small as possible. There are over 30000 data sets I need to send and speed is a priority. I am planning a UDP version, but I want the TCP to work as well.

I have tried several things, but currently I am using

                            sprintf(tcpData, "%02x%04x%02x%02x%02x", // 12 hex chars per data set
                                    i,                  //Microcontroller (0-33)
                                    j,                  //Pixel (0-909)
                                    structure[i][j][2], //red
                                    structure[i][j][3], //green
                                    structure[i][j][4]  //blue
                            );
                            txMessage += tcpData;

When I send the data I use


The length of the message is around 360360 characters, and the Python code parsing it has to read each data set as 12 bytes. I want it to be half that, with each segment being only 6 bytes. For example, MC(33) Pixel(909) r(255) g(255) b(255) should read as 21038dffffff and be 6 bytes (0x21 0x03 0x8d 0xff 0xff 0xff).

I am still fairly new to this sort of thing, so I appreciate any help. I have tried sscanf as well, but the output, while accurately showing the number of bytes, was pointing to the wrong data:

                            sscanf(tcpData, "%2x%4x%2x%2x%2x",
                                    &i,                  //Microcontroller (1-33)
                                    &j,                  //Pixel (0-909)
                                    &structure[i][j][2], //red
                                    &structure[i][j][3], //green
                                    &structure[i][j][4]  //blue
                            );

With sscanf formatting the data, Python prints out bytearray(b'\x80\xf280\xfc\x7f\x80') for the first 6 bytes, when it should be bytearray(b'\x01\x00\x00\x04\x0e\x16'). The total bytes shown is correct though, at 176910.

When I send it to Python formatted with sprintf, Python prints out bytearray(b'010000040e16') for the first 12 bytes, for a total message of 353820.

*I have been able to send it as CSV values in a string, and that works, but it is over twice the data I want to be sending. Python splits the data and sends it to the appropriate Raspberry Pi, and since the Pi only has 100 Mbit Ethernet, coupled with the amount of data, I can only get 15 fps consistently. If I can cut the dataset in half, I should be able to increase the fps.

If more code is needed to get me in the right direction that is no problem.

Thanks for any help.

If you want to send the raw bytes of the colors directly, you just need to cast the array to unsigned char, like:

tcp.sendRawBytes((unsigned char*)&structure[i][j][2], 3);

I need to send 170+k bytes at a time. The tcpChar variable I’m using to store each set of pixel data is 6 bytes.

I suppose a clearer way to ask the question is: how do I append or set bytes in an unsigned char buffer that is 170k bytes, with each segment being 6 bytes: 1 byte for microcontroller #, 2 bytes for address, 1 byte r, 1 byte g, 1 byte b?

unsigned char tcpMessage is the full message,
tcpChar is the individual pixel data set.

Then I could send with:

tcp.sendRawBytes((unsigned char*)&tcpMessage, 17600);

I mostly solved it. Sadly it requires a preset char size, and I would rather it be truly dynamic, as it is designed to work with any screen size based on the coordinates set out in a CSV file. Regardless, here is the solution:

//Truncated version of formatData function
void ofApp::formatData(){
       uint buffNum;
       uint msgSizeBytes = pixelsPerController * bytesPerPixel + 1;
       char chrMessage[msgSizeBytes];

       for(unsigned int i = 1; i < sizeof(structure)/sizeof(structure[0]); i++ ){
           buffNum = 0; //restart for each controller
           for(unsigned int j = 0; j < sizeof(structure[i])/sizeof(structure[i][0]); j++ ){
               chrMessage[buffNum++] = (i & 0xff);                  //micro-controller address
               // Address is 2 bytes - requires bit shifting
               chrMessage[buffNum++] = ((j >> 8) & 0xff);           //address byte 1
               chrMessage[buffNum++] = (j & 0xff);                  //address byte 2
               chrMessage[buffNum++] = (structure[i][j][2] & 0xff); //r
               chrMessage[buffNum++] = (structure[i][j][3] & 0xff); //g
               chrMessage[buffNum++] = (structure[i][j][4] & 0xff); //b
           }
           sendDataBytes(chrMessage, sizeof(chrMessage));
       }
}

//Send byte data
void ofApp::sendDataBytes(char message[], int size){
    if(sendData && !waitInit){
        TCPtx.sendRawBytes(message, size);
    }else if(udp){
        UDPtx.Send(message, size);
    }
}
Writing to the APA102C RGB LEDs via Adafruit’s DotStar library (Adafruit_DotStar) on a Raspberry Pi.
Note that the data is sent in big-endian (network byte order). Since we are sending the data to a Python script over UDP, it is important to note that Python wanted to read it as little-endian by default; we forced big-endian with ‘>HBBB’, so our Python function in the class looks like:

def writeData(self, data):
        """
        Assumes data is received as raw bytes in the sequence
        PIXEL R G B, where PIXEL is two bytes, and each of R, G,
        and B is one byte. This data should be equivalent to a byte
        string which was packed with struct.pack('>HBBB', ...) and
        then joined to a byte string b''.
        """
        numChunks = len(data) / self.chunkSize
        for i in xrange(numChunks):
            chunk = data[i*self.chunkSize: (i+1)*self.chunkSize]
            pixel, R, G, B = struct.unpack('>HBBB', chunk)
            color = self.sortRGB(R, G, B)
#            pixel = ((pixel & 0xff00) >> 8 ) + ((pixel & 0x00ff) << 8)
#            if 900 < i: print pixel, color
            self.strip.setPixelColor(pixel, color)