I am using a BME680 sensor with an STM32 running FreeRTOS, in ULP sampling mode.
A dedicated thread manages the BME sensor; in its loop it sleeps for the duration of the interval returned by the library itself, as suggested in the bsec_iot_loop example:
time_stamp_interval_ms = (sensor_settings.next_call - get_timestamp_us() * 1000) / 1000000;
I noticed that the returned value is normally slightly less than 300,000 ms, which looks correct and accounts for the processing time.
Everything works, and I reached an accuracy of 3 after a few hours. So far, so good.
The point is that sometimes the returned interval is much shorter than the expected 300,000 ms.
I have sometimes seen 130,000, 50,000, or even 24,000 ms, yet the accuracy still suggests the library is on track.
Is this normal, or could there be a problem in my initialization code?
I'll take a deeper look at your code, but at first sight it looks very similar to mine.
And indeed it works well for me too: I get IAQ with accuracy up to 3, and reasonable values for the other virtual sensors.
The point is that sometimes the returned interval is very different from the expected 300 s, and I wanted to know whether that is normal ...