03-30-2020 04:13 PM - edited 03-30-2020 04:37 PM
Hi there!
To verify that my integration is correct, I used the BSEC configuration "generic_33v_3s_4d" and the sampling rate BSEC_SAMPLE_RATE_LP (one sample every 3 seconds).
The BSEC library wakes up every 3 seconds to take a BME680 reading and report the calculated values (IAQ, eCO2, VOC, ...). The whole operation takes a whopping 281 ms out of 3000 ms and that will kill the battery in no time 😞
I then proceeded to use the BSEC configuration "generic_33v_300s_4d" and the sampling rate BSEC_SAMPLE_RATE_ULP (one sample every 300 seconds).
The BSEC library still wants to wake up every 3 seconds, but only wastes 3 ms each time. Every 300 seconds it takes a reading and reports calculated values. This operation takes 2031 ms, not 281 ms. This is still bad if the application is supposed to last for years.
Why does the BSEC library want to wake up every 3 seconds? Can I safely ignore and only call it once every 300 seconds?
Are there BSEC library settings to improve battery life, i.e. sleep for a long time, wake up to take a measurement and perform a calculation quickly and then go back to sleep?
Why must the time stamps be in 64-bit nanosecond resolution? It feels like overkill and a waste to me on a 32-bit architecture. Why would a 32-bit millisecond time stamp not suffice?
Thanks in advance,
Pieter
03-30-2020 07:02 PM
@piconomix wrote:
Every 300 seconds it takes a reading and reports calculated values.
This seems to indicate that BSEC_SAMPLE_RATE_ULP was successfully used in bsec_update_subscription().
@piconomix wrote:
The BSEC library still wants to wake up every 3 seconds, but only wastes 3 ms each time.
But this seems to indicate that generic_33v_300s_4d was possibly not loaded properly, or there is some configuration mismatch somewhere.
If I try to reproduce your setup:
bsec_sensor_control() called at 6004ms, next call expected at 306004ms.
bsec_sensor_control() called at 306004ms, next call expected at 606004ms.
In bsec_sensor_control() at 309004ms:
BSEC WARNING: 100. Difference between actual and defined sampling intervals of bsec_sensor_control() greater than allowed.
@piconomix wrote:
Why does the BSEC library want to wake up every 3 seconds? Can I safely ignore and only call it once every 300 seconds?
Are there BSEC library settings to improve battery life, i.e. sleep for a long time, wake up to take a measurement and perform a calculation quickly and then go back to sleep?
Please review your implementation: check that you are importing a valid config string from the correct file, that you are making use of the next_call and process_data structure elements returned by BSEC, etc. Feel free to share which reference code you have been using, or your relevant source code snippets.
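A minimal sketch of the scheduling pattern this implies, assuming ULP operation. The bsec_bme_settings_t fields and the bsec_sensor_control() signature are modeled on BSEC's public headers but stubbed here so the example is self-contained; a real integration calls the library instead of the stub.

```c
#include <assert.h>
#include <stdint.h>

/* Minimal stand-ins for the BSEC types used here; the real
 * definitions live in bsec_interface.h / bsec_datatypes.h. */
typedef struct
{
    int64_t next_call;     /* Timestamp of next bsec_sensor_control() call [ns] */
    uint8_t process_data;  /* Bitfield: which sensor signals BSEC wants */
} bsec_bme_settings_t;

/* Stub standing in for the real bsec_sensor_control(): in ULP mode
 * the library asks to be called again 300 s (300e9 ns) later. */
static int bsec_sensor_control(int64_t ts_ns, bsec_bme_settings_t *settings)
{
    settings->next_call    = ts_ns + 300000000000LL;
    settings->process_data = 0x0F;
    return 0;
}

/* Scheduling pattern: only invoke BSEC once the current timestamp has
 * reached next_call; otherwise the MCU is free to sleep. Returns the
 * number of BSEC invocations performed over the simulated period. */
static int run_schedule(int64_t start_ns, int64_t end_ns, int64_t tick_ns)
{
    bsec_bme_settings_t settings = { .next_call = start_ns };
    int calls = 0;
    for (int64_t now = start_ns; now < end_ns; now += tick_ns)
    {
        if (now >= settings.next_call)
        {
            bsec_sensor_control(now, &settings);
            calls++;
        }
        /* else: sleep until the RTC fires again */
    }
    return calls;
}
```

Over a simulated 30 minutes with a 1 s RTC tick, run_schedule() performs six BSEC calls, one every 300 s; between calls the device sleeps instead of waking every 3 seconds.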
For optimal power savings, it is even possible to completely turn off (or deep-sleep) your MCU/system. In this case BSEC is operated slightly differently: you would need to add the extra steps of saving/restoring BSEC's state in some NVM between samples, and keep track of an absolute timestamp across power cycles. This approach was, for instance, discussed in another thread, but for a different platform.
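A rough skeleton of such a deep-sleep cycle, with a hypothetical RAM-backed nvm_write()/nvm_read() standing in for a real EEPROM/flash driver; the real bsec_get_state()/bsec_set_state() calls and BSEC_MAX_STATE_BLOB_SIZE constant from the BSEC API are indicated only in comments, with a caller-supplied blob standing in for them here:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

#define STATE_BLOB_MAX 221 /* Placeholder for BSEC_MAX_STATE_BLOB_SIZE */

/* Hypothetical NVM backed by RAM for this sketch; a real port would
 * use its EEPROM/flash driver instead. */
static uint8_t  nvm[STATE_BLOB_MAX];
static uint32_t nvm_len;

static void nvm_write(const uint8_t *buf, uint32_t len)
{
    memcpy(nvm, buf, len);
    nvm_len = len;
}

static uint32_t nvm_read(uint8_t *buf, uint32_t max)
{
    uint32_t len = (nvm_len < max) ? nvm_len : max;
    memcpy(buf, nvm, len);
    return len;
}

/* Wake-cycle skeleton: restore state, process one sample, save state;
 * after this the MCU can power off completely until the next RTC alarm. */
static void wake_cycle(uint8_t *state, uint32_t *state_len)
{
    /* 1. On wake-up: restore BSEC's state from NVM (bsec_set_state) */
    *state_len = nvm_read(state, STATE_BLOB_MAX);
    /* 2. Run bsec_sensor_control() / trigger measurement / process,
     *    using an absolute timestamp kept across power cycles (RTC) */
    /* 3. Before power-off: serialize state (bsec_get_state) to NVM */
    nvm_write(state, *state_len);
}
```

The essential point is that BSEC's accumulated calibration/baseline state must survive the power-off, otherwise accuracy is reset on every wake-up.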
@piconomix wrote:
Why must the time stamps be in 64-bit nanosecond resolution? It feels like overkill and a waste to me on a 32-bit architecture. Why would a 32-bit millisecond time stamps not suffice?
I believe that is just a matter of handling overflows somewhere, and I guess BSEC expects it to be handled by the host. BSEC relies on accurate timings for optimal performance, and with absolute/continuous timestamps it can trigger appropriate errors/warnings if violations are detected (as seen above). With an unsigned 32-bit integer, assuming the timestamp is in milliseconds, one can count up to (2^32-1)/1000/3600/24 ≈ 49.7 days, meaning the counter would overflow after less than two months of continuous operation.
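One common way to handle this on the host is to widen a wrapping 32-bit millisecond system tick into a monotonic 64-bit nanosecond timestamp. A sketch, assuming the function is called at least once per 49.7-day wrap period so no overflow is missed:

```c
#include <assert.h>
#include <stdint.h>

/* Widen a wrapping 32-bit millisecond tick into a monotonic 64-bit
 * nanosecond timestamp, as BSEC expects. Must be called at least once
 * per ~49.7-day wrap period so no overflow goes unnoticed. */
static int64_t ticks_ms_to_timestamp_ns(uint32_t ticks_ms)
{
    static uint32_t last_ticks;
    static uint64_t ms64;

    /* Unsigned subtraction yields the elapsed ms even across a wrap */
    ms64      += (uint32_t)(ticks_ms - last_ticks);
    last_ticks = ticks_ms;

    return (int64_t)(ms64 * 1000000ULL);
}
```

The 32-bit counter may wrap freely; the accumulated 64-bit value stays continuous, which is what allows BSEC to detect timing violations reliably.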
03-31-2020 02:52 PM
@handytec,
Thanks for the response and advice 🙂 FYI: we are using the BME680 in a battery-powered IoT device that must take a measurement and send an RF packet with the data to a gateway that relays the info to a server. It is vital that the device's battery lasts as long as possible.
I suspect that my Makefile did not pick up the change to the "bsec_serialized_configurations_iaq.c" file and used an old object file. After fixing it, the BSEC lib expects to be called every 300 seconds. Here is my simplified super loop that performs the operation every 300 seconds and feeds it an artificial timestamp:
// Configure RTC to wake up micro once every 5 minutes (300 seconds)
px_rtc_wakeup_tmr_enable(PX_RTC_WAKEUP_PRESC_CLK_DIV_1, 300 - 1);
while (1)
{
    // 300 seconds passed?
    if(px_rtc_wakeup_tmr_has_expired())
    {
        px_sysclk_ticks_t start_ms = px_sysclk_get_tick_count();
        // Update timestamp in nanoseconds before calling bsec_sensor_control()
        time_stamp += 300000000000LL;
        // Retrieve sensor settings to be used in this time instant by calling bsec_sensor_control()
        ret = bsec_sensor_control(time_stamp, &env_sensor_settings);
        PX_DBG_INFO("bsec_sensor_control() returned %d", ret);
        // Trigger a measurement if necessary
        bme680_bsec_trigger_measurement(&env_sensor_settings, sleep);
        // Read data from last measurement
        num_bsec_inputs = 0;
        bme680_bsec_read_data(time_stamp, env_sensor_bsec_inputs, &num_bsec_inputs, env_sensor_settings.process_data);
        // Invoke BSEC to perform the actual processing
        bme680_bsec_process_data(env_sensor_bsec_inputs, num_bsec_inputs, output_ready);
        PX_DBG_INFO("Duration = %u ms", px_sysclk_get_tick_count() - start_ms);
    }
}
I let the code run and here is a sample of my debug output:
I 03603.529 env 137 : IAQ = 74.959404 (46.823772) Accuracy 1
I 03603.530 env 138 : Temp = 24.963551 (24.964138)
I 03603.535 env 139 : Hum = 46.385796 (46.384171)
I 03603.540 env 140 : Press = 100085.429688
I 03603.545 env 141 : Gas = 334342.656250
I 03603.549 env 142 : eCO2 = 587.295105
I 03603.554 env 143 : VOC = 0.695413
I 03603.558 env 537 : Duration = 2032 ms
I 03901.491 env 529 : bsec_sensor_control() returned 0
I 03903.493 env 137 : IAQ = 56.065483 (38.570335) Accuracy 1
I 03903.494 env 138 : Temp = 25.041248 (25.041836)
I 03903.499 env 139 : Hum = 46.475689 (46.474064)
I 03903.504 env 140 : Press = 100083.601562
I 03903.509 env 141 : Gas = 336407.125000
I 03903.513 env 142 : eCO2 = 554.281372
I 03903.518 env 143 : VOC = 0.613845
I 03903.522 env 537 : Duration = 2032 ms
I 04201.468 env 529 : bsec_sensor_control() returned 0
I 04203.470 env 137 : IAQ = 72.424126 (45.716286) Accuracy 1
I 04203.471 env 138 : Temp = 25.103903 (25.104490)
I 04203.476 env 139 : Hum = 46.501877 (46.500244)
I 04203.481 env 140 : Press = 100090.484375
I 04203.486 env 141 : Gas = 332980.343750
I 04203.490 env 142 : eCO2 = 582.865112
I 04203.495 env 143 : VOC = 0.683868
Why does the IAQ value vary so much between readings?
If, because of the excessive calculation burden, we were not interested in IAQ but only in eCO2 and VOC, would there be a simpler formula / procedure available to calculate them from the gas resistance?
Thanks in advance,
Pieter
03-31-2020 05:05 PM
@piconomix wrote:
I suspect that my Makefile did not pick up the change to the "bsec_serialized_configurations_iaq.c" file and used an old object file. After fixing it, the BSEC lib expects to be called every 300 seconds.
Good to hear 🙂
@piconomix wrote:
Why does the IAQ value vary so much between readings?
Honestly, it is hard to say from 3 data points, and without knowing the environmental conditions during the 15 minutes observed here. What I can tell from this small amount of data is that:
@piconomix wrote:
If we were not interrested in IAQ, because of the excessive calculation burden, but only eCO2 and VOC, would there be a simpler formula / procedure available to calculate it from the gas resistance?
Which outputs are enabled will not impact the integration effort. In fact, eCO2 and bVOCeq are derived from the sIAQ output, so the same internal calculations would still be needed.
@piconomix wrote:
FYI: we are using the BME680 in a battery powered IoT device that must take a measurement and send an RF packet with the data to a gateway that relays the info to a server. It is vital that the device's battery last as long as possible.
Since you mention it, a valid integration is also to collect only raw BME680 data at the edge and run BSEC on your gateway or server. Of course, the BME680 still needs to be operated with the appropriate settings, and an absolute timestamp is still mandatory. If BSEC outputs are also needed at the edge (e.g. to be displayed on a screen), you would need to consider how to stream the relevant data back to your device. Pros and cons must be weighed for your specific use-case.
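As an illustration only, a hypothetical radio payload for that split could carry just the raw signals plus the absolute timestamp; the field names, scalings, and packing below are assumptions for the sketch, not a BSEC-defined format:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical payload for the "raw data at the edge" approach: only
 * the BME680's raw signals and an absolute timestamp are transmitted;
 * BSEC then runs on the gateway/server against these inputs. */
typedef struct __attribute__((packed))
{
    int64_t  timestamp_ns;     /* Absolute, continuous timestamp [ns]   */
    int32_t  temperature_cdeg; /* Raw temperature [0.01 degC]           */
    uint32_t pressure_pa;      /* Raw pressure [Pa]                     */
    uint32_t humidity_mperc;   /* Raw humidity [0.001 %rH]              */
    uint32_t gas_res_ohm;      /* Raw gas resistance [Ohm]              */
} env_raw_packet_t;
```

Keeping the payload to fixed-width packed fields (24 bytes here) makes the RF frame small and the gateway-side decoding unambiguous.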