I’m trying to decode my data in the TTN uplink decoder, but I am completely lost on how to determine how many bytes belong to what. For example, in “data.Latitude = (input.bytes[0] << 8) + input.bytes[1]”, I don’t understand why one byte is shifted with “<< 8” and then added to the other byte. I’ve read both the Heltec documentation and the TTN documentation, but all I get is an example decoder that doesn’t explain why certain lines of the decoder use different byte counts for things like GPS, an analog sensor, or time.
The decoder needs to agree with the node on the byte layout: the server can only interpret the payload the same way the node packed it.
I guess a better question is: how do I determine how many bytes each value uses so I can put that in my decoder? Taking “data.Latitude = (input.bytes[0] << 8) + input.bytes[1]” again, what makes the byte count for a value 2, or 4, or some other number? I have other values I’m sending, and I need a clearer understanding of why the byte count can be 4, or 5, or 6. Is this based on how many bytes are sent? And if so, how do I determine how many are being sent in my Arduino sketch for the Heltec LoRa 32 microcontroller, and how do I determine what order to arrange the fields in the decoder?