You’ll need to adjust for the clock drift. There is some information about this in BuildOpt.h. If you look for those keywords you’ll be able to find the instructions in that file!
HTCC-AB01 RadioLib OTAA join request fails
@bns Many thanks for pointing me to these build option settings!
I will share my findings on this forum.
First findings after measuring delay times to see how far off the AB01 clock is from its nominal value. Tests were done with the same AB01 v1 board.
Method 1 - use a Timer
TimerEvent_t cycleTimer;
void NextCycle(void);  // forward declaration

void setup() {
  Serial.begin(115200);
  TimerInit(&cycleTimer, NextCycle);
  NextCycle();   // kick off the first cycle
  while (1) ;    // all work happens in the timer callback
}

void loop() {}

void NextCycle(void) {
  TimerSetValue(&cycleTimer, 10 * 1000);  // 10 s period
  TimerStart(&cycleTimer);
  Serial.println();
}
Average deviation: 1208 ppm (≈1.2 ms per 1000 ms)
Method 2 - use delay()
void setup() {
  Serial.begin(115200);
  while (1) {
    Serial.println();
    delay(10 * 1000);  // nominal 10 s between prints
  }
}

void loop() {}
Average deviation: 18852 ppm (≈19 ms per 1000 ms)
In both cases, the following terminal command sequence was used (cu.usbserial-0001 = host-specific port name):
(stty speed 115200 >/dev/null && cat) </dev/cu.usbserial-0001 | ts -i %.S
-
So it appears the AB01 clock is not the root cause of the problem, but that it’s a software/firmware issue. That would also explain why @bns found almost the same ‘drift’ value. The use of Timer events instead of delay() seems a much better choice when accurate delays are required.
-
The next step was to change the radio clock drift setting in ‘BuildOpt.h’, assuming that RadioLib uses delay():
#define RADIOLIB_CLOCK_DRIFT_MS (19)
Et voilà, OTAA join request works!
(For now, one hurdle taken…)
Coool, it would be an interesting exercise to modify the RadioLib HAL to use hardware timers instead!
@bns, I agree, but RL also offers the possibility to use user-supplied delay & sleep functions instead of using delay().
Here’s a mix of 2 comments found in different RL ‘config.h’ example files:
// "Custom delay function:
// Communication over LoRaWAN includes a lot of delays.
// By default, RadioLib will use the Arduino delay() function,
// which will waste a lot of power. However, you can put your
// microcontroller to sleep instead by customizing the function below,
// and providing it to RadioLib via "node.setSleepFunction".
// If set, LoRaWAN node will call this function to wait for periods of time longer than RADIOLIB_LORAWAN_DELAY_SLEEP_THRESHOLD.
// This can be used to lower the power consumption by putting the host microcontroller to sleep.
// NOTE: Since this method will call a user-provided function, it is up to the user to ensure that the time duration spent in that
// sleep function is accurate to at least 1 ms!
// void customDelay(RadioLibTime_t ms) {
// ::delay(ms);
// }
// "
Changed RADIOLIB_LORAWAN_DELAY_SLEEP_THRESHOLD from (50) to (0) in ‘LoRaWAN.h’ and disabled RADIOLIB_CLOCK_DRIFT_MS (19) in ‘BuildOpt.h’ again.
Implemented Timer-based delay functions for RL to use:
void RadioDelay(RadioLibTime_t ms); // start delay timer & sleep
void RadioDelayDone(); // wake-up call
RL is indeed using my RadioDelay(), but somehow I’m back to square one: node.activateOTAA() returns -5 using interrupt-enabled TX (radio.setPacketSentAction(UplinkDone);), and -1116 (again) without interrupt TX.
(-5 = RADIOLIB_ERR_TX_TIMEOUT Timed out waiting for transmission finish.)
No beginners luck this time, but too early to give up…
It needs a bit more than beginner’s luck: RadioLib uses not only the default delay() but also the default millis(). That millis() is used, for instance, to verify the TxDone interrupt, which is not properly awaited in your modification, resulting in a TxTimeout being generated before TxDone had a chance to occur.
So we would need access to a lower-level value for millis() which isn’t affected by the borked software translation layer. But is that available? I don’t know…
(By the way: it would really be better to plug this into the ArduinoHal than to override the delay function; the sleepDelay isn’t used everywhere in the stack, while the normal delay is.)
Thanks for saving me from going down a fruitless rabbit hole.
CubeCell AB01 millis() is not using 16-bit numbers (as suggested earlier). It’s happily counting and has passed the 24-bit mark ‘as we speak’. I don’t see problems using it as-is.
Timer events and millis() on the CubeCell AB01 work fine and produce accurate & reproducible results.
delay() sucks, as we’ve seen, but apparently it serves a purpose within RL when it needs a blocking delay.
Found this old posting from a former Heltec staff member:
Just added two examples: micros and millis .
micros
The micros function uses the internal system ticker (48 MHz timer). This timer of the ASR6501 is not accurate; as tested, it has a 1.7% error.
[img removed…]
The delay in the example is 1000 ms; ideally, the output value here should be 1000000. But look at the printed time stamps: each print has a 20 - 22 ms error. It means the actual delay(1000); spends 1020 - 1022 ms, because the delay uses the internal system ticker too. So the error is (1020000 - 1002735) / 1020000 ≈ 1.7%. That’s why I didn’t use the system ticker for LoRaWAN operations. Another problem is that the system ticker can’t run during the deep sleep period; waking up from deep sleep will reset micros back to 0.
millis
[img removed…]
The millis function uses the external 32.768 kHz RTC clock. It’s a very accurate timer, and it can run during the deep sleep period. As the picture shows, it matches the delay time. But its minimum period is
1/32768 s ≈ 30 µs, so it can’t be the source of micros.
RadioLib assumes (‘ArduinoHal.cpp’) that the RADIOLIB_CLOCK_DRIFT_MS adjustment should be applied to all timing functions: micros(), delayMicroseconds(), delay() and millis().
CubeCell has two different clocks each driving different timing functions.
(For now, excluded RL millis() from drift adjustment.)
You appear to be implying that RadioLib is somehow wrong for using the Arduino API when it is running within the Arduino framework and then is further wrong for trying to find a way to ameliorate the issues with what is a rather suspect platform.
Are you hoping for help or are you just trying to make a point?
Looks mostly like a collection of statements to me, not an attack in any form on RadioLib.
@peterm is doing some pretty decent investigation and I’m rooting for an actual solution. The CubeCell hardware is good, it’s the software that sucks in random places and if we can make RadioLib run properly on them, I’d be delighted!
I personally have spent only a day or so trying to fix the platform before giving up, so I may have made some mistakes or missed some observations - please go on Peter!
@bns Thanks, you’re spot on about my intentions. I see it as a bumpy journey trying to get RadioLib reliably working on the CubeCell platform.
@nmcc If you can make substantial contributions, please join the discussion and try to be constructive.
I have chosen RadioLib because this is what you guys - amongst others - have been promoting. And for very good reasons: even I can read the code, it’s well documented, actively maintained & supported, no vendor lock in - the list goes on.
The easy way forward for me would be to stick with the LoRaWAN lib from Heltec & co. and have my LoRa -> LoRaWAN project done within hours.
I took up the challenge while having fairly limited knowledge (understatement) of LoRaWAN, RadioLib, Arduino. (Fyi - my 40+ years EE background is in the embedded development tools business for mostly automotive applications.)
So I am very glad with any help & guidance I can get - and already got from @bns, thanks for that! - and others.
Enough, back to ‘work’…
I can and I was.
You made some statements about RadioLib that I interpreted as being that RL is at fault when RL is using the standards set up in the Arduino framework. These can’t be changed easily, if at all, so it was hard to know what your intent is here.
PS, long-standing, since 2019, owner of 2 x HTCC-AB01 that are in storage, as the timing using the Heltec & LMIC code bases was so borked I moved on to other MCUs. I know a bit about LMIC, RL LW and LW in general.
So, refactoring RL to use a different set of delay & timing functions isn’t an option, as it would need doing for each & every platform it can run on, thereby making it non-Arduino. And you don’t appear to like the clock drift option, a technique that was a highlight of using the older Nanos with LMIC. What other options are there?
@BNS, no one said it was an attack, I said “it looked like … please clarify”. The message heard is not necessarily the message intended. If I’d said “are you an idiot, you can’t just start re-arranging the whole delay code in Arduino”, then it wouldn’t be so cool. But I didn’t.
If a post isn’t clear, how should we ask for clarification?
That stack is based on a very old version of the LoRaMac-node, most of v1.0.2, which gets round timings by being less energy efficient - opening the Rx windows early and leaving them open longer than necessary. This will have a substantial effect on battery life if that’s how it’s powered. If you are developing something more than a home / hobby application, there are far more up to date boards in the Heltec range that can support the current offerings.
Infineon PSoC 4100 supports two clocks. Heltec has communicated their (in)accuracies, which timing functions are affected, and which are not (millis()).
RL currently does not support architectures with multiple clock sources.
One ‘drift’ value can be specified, which is then applied to all four timing functions RL uses. What I would like to have is a generic way to control this in RL. Has nothing to do with non-Arduino.
I would then also like this ‘drift’ value to be runtime-changeable. Then Makers can let their mixed-clock device do the ‘clock calibration’ internally (e.g. at startup), instead of having to run a tiny calibration app first, do the math manually for each device, put the value into the real app, and build and flash that for deployment.
RL does support architectures with multiple clock sources; you are more than welcome to build your own HAL for it if you want - there aren’t that many functions needed. But then you have to compile using the Infineon environment or create an “interesting” BSP for Arduino or hack RL to call your own timing functions.
For a business solution, this introduces considerable risk. If it’s for a home project and you fancy it, then that’s all good.
Most makers entering the LW world would go with one of the many other done-for-you boards, the Heltec LoRa 32 v3 being the gold standard, or the Tracker if you want to add a GNSS. I’ve no idea how these ones still get sold - attractive on paper at first & second reading, I guess. But there are other issues with the implementation as well.
From almost five years watching makers & LW, I’d actually go as far as saying there is a measurable number that wouldn’t be able to get to a working state with this board and any LW stack.
TL;DR: You can work on this issue but it’s unlikely to have a huge uptake by other users and, unless very very carefully aligned with the RL repo, aka invisible in use, unlikely to get merged in. The challenge with releasing a code base is people start to use it and then people start asking questions.
If it does support multiple clocks, how can I specify which clock source drives which timing function in RL?
@bns: Making progress. Happily uploading packets using node.sendReceive(packet, packetSize);. Not just a few…
As mentioned earlier, I had to tweak ArduinoHal.h because it does NOT support multiple clock sources out of the box.
Next is to see how to use interrupt-driven TX and why it doesn’t work yet.
BTW, AB01 v2 board HF clock is indeed off less than v1 (as Heltec already published). About 9ms instead of 19-20ms. Heltec’s Aaron also mentioned back in 2020 that HF clock offsets differ from device to device, so (manual) calibration would be required for every CubeCell device being deployed as LoRaWAN node.
I think Heltec’s Aaron did a great job at that time - would not disqualify the result as ‘borked’…
It may not be broken, but it is suboptimal in a number of places. Clocks aren’t the only thing that are regularly mentioned on this forum 
The micros + delayMicros and millis + delayMillis (which is the normal delay()) can use separate clocks. I can’t remember any place in the stack where the micros and millis are mixed, it’s either one or the other. But the millis and delayMillis should use the same if at all possible, and the micros and delayMicros should use the same. Otherwise a lot of stuff is very likely to break.
Glad to hear you are getting proper results. But are you also receiving all downlinks? Or is it missing them? And if you receive downlinks, is that with the default scanGuard, or overridden to a larger value? If you get a reliable solution which is reproducible across a number of CubeCell boards and preferably versions, we can make a CubeCellHal.h instead. But that is a bit too early for now 
Can’t tell you yet. Clearly have to learn more about LW first, so that’s next for me then instead of interrupt stuff. Thanks!
If you mean that you want interrupt driven LoRaWAN: that’s not going to work and not going to be supported, too, in RadioLib. I know that it would be sweet and cool and may save a second here and there because you can for instance update the display while it’s transmitting. But there is so much that the user can do wrong during a LoRaWAN up/downlink cycle that we will not support that. The sleepDelay can be (ab)used for updating a display or writing to a file if you take proper care of the duration and remaining time. But we would strongly advise leaving everything intact, and maybe using a light sleep as a sleepDelay and not much else.
RL supports multiple clocks via its support for writing your own HAL - your own interface to the underlying MCU’s API.
For sure for a one-off for a Maker with the time & inclination. But if you are making & shipping v1 proof-of-concept devices using this board in the 100’s, not so much fun.
And does anyone want to have to start working around idiosyncrasies in a device to get timings right, and then have to calibrate each board they have? And not have this highlighted in the docs with some good workarounds, so you know what you are getting into pre-purchase, rather than having to discover it the hard way?
You can see I’m active on here, I love many Heltec products, the LoRa 32 v3, the Tracker and the CT-62 are brilliant. But some should have been retired a long time ago and some have been released far too early - the v4, the MeshPocket and the MeshSolar seem to have raised some non-trivial issues.
Ah, I see. I was triggered by
RadioLib/examples/SX126x/SX126x_Transmit_Interrupt/SX126x_Transmit_Interrupt.ino
RadioLib SX126x Transmit with Interrupts Example
This example transmits LoRa packets with one second delays
between them. Each packet contains up to 256 bytes
of data, in the form of:
- Arduino String
- null-terminated char array (C-string)
- arbitrary binary data (byte array)
Other modules from SX126x family can also be used.
Does only LoRa.
OK, got it now. Thanks.
I recall back in '20 or so ST announced they would come out with - in my view - a superior MCU + LoRa radio device compared to the ASR650x. Loved it, but didn’t want to wait for that; besides, it would probably also mean spending €€€ on things like a Segger J-Link and professional dev tools. Plus a big learning curve.
So yes, I totally agree with you there are way better (= more robust/stable, well documented & supported, open sourced, …) solutions out there than the EOL’d HTCC-AB0x’s, but as a - meanwhile - hobbyist with 6 already on the shelf and sufficient time now, I didn’t want to trash these yet.