HTCC-AB01 RadioLib OTAA join request fails

Hi, I (also) have trouble getting RadioLib working on CubeCell AB01 v1 boards.
I am aware of posts referring to AB01 timing issues, but in my case OTAA join already fails…

state = node.activateOTAA(); // <=== returns -1116
(see also below)
Infamous -1116 error = “RADIOLIB_ERR_NO_JOIN_ACCEPT”:
“No JoinAccept was received - check your keys, or otherwise likely a range issue!”

What works:

  • TTS receives the join request via the gateway (Mikrotik LR8G), accepts it and forwards the join-accept message (TTS device settings: LoRaWAN Specification 1.1.0 + RP002 Regional Parameters 1.0.4)
  • The gateway receives the join-accept and transmits it.

What I also tried:

  • Moved an existing LoRa app from the Heltec lib to RadioLib 7.5.0: works fine;
  • Ran the basic Heltec LoRaWAN example on the same board: OTAA join works fine;
    • (TTS device settings: LoRaWAN Specification 1.0.2 + RP001 Regional Parameters 1.0.2 revision B)
  • Switched board: CubeCell AB01 v2 instead of v1: same -1116 error;
  • Used RadioLib 7.0.0 instead of 7.5.0: same -1116 error.

Any hint would be appreciated!

#include "sensor-lorawan-node-PSoC4100S-sandbox.h"
#include <CubeCell_NeoPixel.h>
#include <RadioLib.h>

// Unique chip id is used as DevEUI
uint64_t devEUI = 0x0000000000000000;

// Regional choices: EU868, US915, AU915, AS923, AS923_2, AS923_3, AS923_4, IN865, KR920, CN470.
const LoRaWANBand_t Region = EU868;
// Subband choice: for US915/AU915 set to 2, for CN470 set to 1, otherwise leave on 0.
const uint8_t subBand = 0;

uint64_t joinEUI = 0x0000000000000000;
// These are generated by TTS.
uint8_t appKey[] = { APPKEY };
uint8_t nwkKey[] = { NWKKEY };

TimerEvent_t joinBlink;

int radiolibState = RADIOLIB_ERR_NONE;
#define printState(x, y) Serial.printf("%s - RadioLib state: %d\n", x, y); // Use https://radiolib-org.github.io/status_decoder/decode.html

CubeCell_NeoPixel Pixels(1, RGB, NEO_GRB + NEO_KHZ800);
SX1262 radio = new Module(P4_3, P4_6, P5_7, P4_7); // cores/asr650x/board/inc/board-config.h  pins 35, 38, 47, 39

LoRaWANNode node(&radio, &Region, subBand);

void setup() {
  Serial.begin(115200);
  // SK6812 RGB Leds
  pinMode(Vext, OUTPUT);
  digitalWrite(VBAT_ADC_CTL, HIGH);
  digitalWrite(Vext, LOW); // set power
  Pixels.begin(); // init NeoPixel strip
  Pixels.clear(); // all pixels off

  generateDeveuiByChipID(); // fill-in devEUI

  radiolibState = radio.begin(RF_FREQUENCY, LORA_BANDWIDTH, LORA_SPREADING_FACTOR, LORA_CODINGRATE,
    RADIOLIB_SX126X_SYNC_WORD_PRIVATE, TX_OUTPUT_POWER, LORA_PREAMBLE_LENGTH);
  if (radiolibState != RADIOLIB_ERR_NONE) {
    Led(0x500);
    while (true) ; // Radio fail
  }

  // Setup the OTAA session information
  node.beginOTAA(joinEUI, devEUI, nwkKey, appKey);
  
  // Override the default join rate
  node.setDatarate(4);
  
  TimerInit(&joinBlink, Blink);
  TimerStart(&joinBlink);
  
  int8_t joinAttempts = 10;
  while (true) {
    radiolibState = node.activateOTAA(); // <=== returns -1116
    if (radiolibState == RADIOLIB_LORAWAN_NEW_SESSION)
      break;
    printState("act-otaa", radiolibState);
    if (--joinAttempts > 0) {
      unsigned long t0 = millis();
      while (millis() - t0 < 2000) ; // wait 2 s before the next attempt
    }
    else {
      TimerStop(&joinBlink);
      Led(0); // RGB off
      while (true) ; // Join failed
    }
  }
  TimerStop(&joinBlink);
 
  // Enable the ADR algorithm (on by default which is preferable).
  node.setADR(true);

  // Set a datarate to start off with.
  node.setDatarate(5);
}
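
(generateDeveuiByChipID() is not shown above; a minimal sketch of it, assuming the CubeCell core’s getID() helper returns the 64-bit chip ID, could look like this — not necessarily the exact code used:)

// Sketch only: fill devEUI from the chip's unique ID via getID()
void generateDeveuiByChipID(void) {
  devEUI = getID();
  Serial.printf("DevEUI: %08X%08X\n",
    (uint32_t)(devEUI >> 32), (uint32_t)devEUI);
}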

You’ll need to adjust for the clock drift. There is some information about this in BuildOpt.h; if you search for those keywords you’ll find the instructions in that file!

1 Like

@bns Many thanks for pointing me to this build option setting!
I will share my findings on this forum.

1 Like

First findings after measuring delay times to see how far off the AB01 clock is from its nominal value. Tests were done with the same AB01 v1 board.

Method 1 - use a Timer

void NextCycle(void); // forward declaration

TimerEvent_t cycleTimer;

void setup() {
  Serial.begin(115200);

  TimerInit(&cycleTimer, NextCycle);
  NextCycle();
  while (1) ;
}

void loop() {}

void NextCycle(void) {
  TimerSetValue(&cycleTimer, 10 * 1000); // re-arm for 10 s
  TimerStart(&cycleTimer);
  Serial.println(); // emit a newline to be timestamped on the host
}

Average deviation: 1208 ppm (≈1.2 ms per 1000 ms)

Method 2 - use delay()

void setup() {
  Serial.begin(115200);
  while (1) {
    Serial.println();
    delay(10 * 1000);
  }
}

void loop() {}

Average deviation: 18852 ppm (≈19 ms per 1000 ms)

In both cases, the following terminal command sequence was used (cu.usbserial-0001 = host-specific port name):
(stty speed 115200 >/dev/null && cat) </dev/cu.usbserial-0001 | ts -i %.S
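
(The deviation figures above follow from comparing the timestamped interval against the nominal 10 s. Just to illustrate the arithmetic — the function name and example values below are mine, not taken from the measurement logs:)

// ppm = (measured - nominal) / nominal * 1e6
double driftPpm(double measuredMs, double nominalMs) {
  return (measuredMs - nominalMs) / nominalMs * 1.0e6;
}
// e.g. a 10 s delay() that actually takes ~10188.5 ms:
// driftPpm(10188.5, 10000.0) ≈ 18850 ppm, i.e. ~19 ms per second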

  1. So it appears the AB01 clock itself is not the root cause of the problem, but rather a software/firmware issue. That would also explain why @bns found almost the same ‘drift’ value. Using Timer events instead of delay() seems a much better choice when accurate delays are required.

  2. The next step was to change the radio clock-drift setting in ‘BuildOpt.h’, assuming that RadioLib uses delay():
    #define RADIOLIB_CLOCK_DRIFT_MS (19)
    Et voilà, the OTAA join request works!

(For now, one hurdle taken…)

1 Like

Coool, it would be an interesting exercise to modify the RadioLib HAL to use hardware timers instead!
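
For instance, a minimal sketch (untested; the class and callback names are mine) could subclass RadioLib’s ArduinoHal and override delay() with a CubeCell TimerEvent instead of the drifting software delay:

#include <RadioLib.h>

// flag set by the hardware timer callback
static volatile bool delayDone = false;
static TimerEvent_t delayTimer;

static void onDelayTimer(void) {
  delayDone = true;
}

// HAL that replaces the software delay() with the ASR650x hardware timer
class CubeCellHal : public ArduinoHal {
  public:
    CubeCellHal() : ArduinoHal() {}

    void delay(RadioLibTime_t ms) override {
      delayDone = false;
      TimerInit(&delayTimer, onDelayTimer);
      TimerSetValue(&delayTimer, ms);
      TimerStart(&delayTimer);
      while (!delayDone) ; // wait for the timer callback
    }
};

// hand the custom HAL to the Module constructor instead of the default one
CubeCellHal hal;
SX1262 radio = new Module(&hal, P4_3, P4_6, P5_7, P4_7);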