Bosch Sensortec Community

    SOLVED

    BME680 + BSEC lib + ARM Cortex M0+ (STM32L071) optimization for battery powered applications?


    piconomix
    New Poster

    Hi there!

    To verify that my integration is correct, I used the BSEC configuration "generic_33v_3s_4d" and the sampling rate BSEC_SAMPLE_RATE_LP (one sample every 3 seconds).

    The BSEC library wakes up every 3 seconds to take a BME680 reading and report the calculated values (IAQ, eCO2, VOC, ...). The whole operation takes a whopping 281 ms out of 3000 ms and that will kill the battery in no time 😞

    I then proceeded to use the BSEC configuration "generic_33v_300s_4d" and the sampling rate BSEC_SAMPLE_RATE_ULP (one sample every 300 seconds).

    The BSEC library still wants to wake up every 3 seconds, but only wastes 3 ms each time. Every 300 seconds it takes a reading and reports the calculated values; that operation takes 2031 ms, even longer than the 281 ms in LP mode. This is still bad if the application is supposed to last for years.

    Why does the BSEC library want to wake up every 3 seconds? Can I safely ignore and only call it once every 300 seconds?

    Are there BSEC library settings to improve battery life, i.e. sleep for a long time, wake up to take a measurement and perform a calculation quickly and then go back to sleep?

    Why must the time stamps be in 64-bit nanosecond resolution? It feels like overkill and a waste to me on a 32-bit architecture. Why would a 32-bit millisecond time stamp not suffice?

    Thanks in advance,

    Pieter

    3 REPLIES

    handytech
    Community Moderator

    @piconomix wrote:

    Every 300 seconds it takes a reading and reports calculated values.


    This seems to indicate that BSEC_SAMPLE_RATE_ULP was successfully used in bsec_update_subscription().
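    For reference, subscribing the outputs at the ULP rate looks roughly like the sketch below. This is a sketch only: the exact BSEC_OUTPUT_* IDs vary slightly between BSEC versions, and the output set here is my assumption based on the values you listed.

        #include "bsec_interface.h"
        #include "bsec_datatypes.h"

        // Request all desired virtual sensor outputs at ULP rate (1/300 Hz)
        static bsec_library_return_t subscribe_outputs_ulp(void)
        {
            const uint8_t output_ids[] =
            {
                BSEC_OUTPUT_IAQ,
                BSEC_OUTPUT_CO2_EQUIVALENT,
                BSEC_OUTPUT_BREATH_VOC_EQUIVALENT,
                BSEC_OUTPUT_RAW_PRESSURE,
                BSEC_OUTPUT_SENSOR_HEAT_COMPENSATED_TEMPERATURE,
                BSEC_OUTPUT_SENSOR_HEAT_COMPENSATED_HUMIDITY,
            };
            bsec_sensor_configuration_t requested[sizeof(output_ids)];
            bsec_sensor_configuration_t required[BSEC_MAX_PHYSICAL_SENSOR];
            uint8_t n_required = BSEC_MAX_PHYSICAL_SENSOR;

            for(uint8_t i = 0; i < sizeof(output_ids); i++)
            {
                requested[i].sensor_id   = output_ids[i];
                requested[i].sample_rate = BSEC_SAMPLE_RATE_ULP;
            }
            return bsec_update_subscription(requested, sizeof(output_ids),
                                            required, &n_required);
        }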


    @piconomix wrote:

    The BSEC library still wants to wake up every 3 seconds, but only wastes 3 ms each time.


    But this seems to indicate that generic_33v_300s_4d was possibly not loaded properly, or there is some configuration mismatch somewhere.
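    For completeness, the config blob must be loaded right after bsec_init() and before any other BSEC call. A minimal sketch, assuming the bsec_config_iaq[] array from the bsec_serialized_configurations_iaq.c file that ships with the generic_33v_300s_4d package:

        #include "bsec_interface.h"
        #include "bsec_serialized_configurations_iaq.h"

        // Work buffer required by bsec_set_configuration()
        static uint8_t work_buffer[BSEC_MAX_PROPERTY_BLOB_SIZE];

        bsec_library_return_t ret = bsec_init();
        if(ret == BSEC_OK)
        {
            // Load the serialized generic_33v_300s_4d settings
            ret = bsec_set_configuration(bsec_config_iaq, sizeof(bsec_config_iaq),
                                         work_buffer, sizeof(work_buffer));
        }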

    If I try to reproduce your setup:

    • generic_33v_300s_4d configuration,
    • BSEC_SAMPLE_RATE_ULP  used for all outputs in bsec_update_subscription(),
    • calling bsec_sensor_control() with the expected 300 s delay runs successfully,
    • calling bsec_sensor_control() after a delay of only 3 seconds triggers a warning:

     

    bsec_sensor_control() called at 6004ms, next call expected at 306004ms.
    bsec_sensor_control() called at 306004ms, next call expected at 606004ms.
    In bsec_sensor_control() at 309004ms:
    BSEC WARNING: 100. Difference between actual and defined sampling intervals of bsec_sensor_control() greater than allowed.

     


    @piconomix wrote:

    Why does the BSEC library want to wake up every 3 seconds? Can I safely ignore and only call it once every 300 seconds?

    Are there BSEC library settings to improve battery life, i.e. sleep for a long time, wake up to take a measurement and perform a calculation quickly and then go back to sleep?


    Please review your implementation: check that you are importing a valid config string from the correct file, that you are making use of the next_call and process_data structure elements returned by BSEC, etc. Feel free to share which reference code you've been using, or your relevant source code snippets.
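    To illustrate the next_call usage, here is a sketch of a wake-up cycle driven by what BSEC itself returns (get_timestamp_ns() and sleep_until_ns() are hypothetical host functions, not BSEC APIs):

        // Let BSEC dictate the schedule instead of a hard-coded period
        bsec_bme_settings_t settings;
        int64_t now_ns = get_timestamp_ns();

        if(bsec_sensor_control(now_ns, &settings) == BSEC_OK)
        {
            if(settings.trigger_measurement)
            {
                // Configure the BME680 with settings.heater_temperature,
                // settings.heating_duration, oversampling settings, etc.
                // and run a forced measurement
            }
            if(settings.process_data)
            {
                // Read the results and feed them to bsec_do_steps()
            }
            // Sleep until the exact timestamp BSEC asked for
            sleep_until_ns(settings.next_call);
        }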

    For optimal power savings, it is even possible to completely turn off (or deep-sleep) your MCU/system. In this case BSEC is operated slightly differently: mainly, you would need to add the extra steps of saving/restoring BSEC's state in some NVM between samples, and keep track of an absolute timestamp. This approach was for instance discussed in another thread, but for another platform.
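    A sketch of that save/restore cycle around deep-sleep, assuming hypothetical nvm_read()/nvm_write() helpers for your NVM of choice (EEPROM, flash page, backup registers, ...):

        #include "bsec_interface.h"

        static uint8_t  state[BSEC_MAX_STATE_BLOB_SIZE];
        static uint8_t  work_buffer[BSEC_MAX_STATE_BLOB_SIZE];
        static uint32_t state_len;

        // After wake-up (and bsec_init() + config): restore the previous state
        if(nvm_read(state, sizeof(state), &state_len))
        {
            bsec_set_state(state, state_len, work_buffer, sizeof(work_buffer));
        }

        // ... bsec_sensor_control() / measurement / bsec_do_steps() ...

        // Before powering down: serialize and persist the current state
        if(bsec_get_state(0, state, sizeof(state),
                          work_buffer, sizeof(work_buffer), &state_len) == BSEC_OK)
        {
            nvm_write(state, state_len);
        }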


    @piconomix wrote:

    Why must the time stamps be in 64-bit nanosecond resolution? It feels like overkill and a waste to me on a 32-bit architecture. Why would a 32-bit millisecond time stamps not suffice?


    I believe that is just a matter of handling overflows somewhere, and I guess BSEC expects it to be handled by the host. BSEC relies on accurate timing for optimal performance, and with absolute/continuous timestamps it can trigger appropriate errors/warnings if violations are detected (as seen above). With an unsigned 32-bit integer, assuming the timestamp is in milliseconds, one can count up to (2^32-1)/1000/3600/24 ≈ 49.7 days, meaning the counter would overflow before 2 months of continuous operation.
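    In practice the host only needs to extend its native tick into a 64-bit value. A sketch, assuming a hypothetical get_tick_ms() returning the host's 32-bit millisecond tick:

        #include <stdint.h>

        // Extend a 32-bit ms tick into the 64-bit ns timestamp BSEC expects
        static int64_t timestamp_ns(void)
        {
            static uint32_t last_ms;
            static uint64_t total_ms;

            uint32_t now_ms = get_tick_ms();
            // Unsigned subtraction is rollover-safe, provided this function
            // is called at least once per 2^32 ms (~49.7 days)
            total_ms += (uint32_t)(now_ms - last_ms);
            last_ms   = now_ms;

            return (int64_t)(total_ms * UINT64_C(1000000));
        }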

    piconomix
    New Poster

    @handytech,

    Thanks for the response and advice 🙂 FYI: we are using the BME680 in a battery powered IoT device that must take a measurement and send an RF packet with the data to a gateway that relays the info to a server. It is vital that the device's battery lasts as long as possible.

    I suspect that my Makefile did not pick up the change to the "bsec_serialized_configurations_iaq.c" file and used an old object file. After fixing it, the BSEC lib expects to be called every 300 seconds. Here is my simplified super loop that performs the operation every 300 seconds and feeds BSEC an artificial timestamp:

        // Configure RTC to wake up micro once every 5 minutes (300 seconds)
        px_rtc_wakeup_tmr_enable(PX_RTC_WAKEUP_PRESC_CLK_DIV_1, 300 - 1);
    
        while (1)
        {
            // 300 seconds passed?
            if(px_rtc_wakeup_tmr_has_expired())
            {
                px_sysclk_ticks_t start_ms = px_sysclk_get_tick_count();
                // Update timestamp in nanoseconds before calling bsec_sensor_control()
                time_stamp += 300000000000LL; // 300 s in ns, as a 64-bit integer literal
                // Retrieve sensor settings to be used in this time instant by calling bsec_sensor_control
                ret = bsec_sensor_control(time_stamp, &env_sensor_settings);
                PX_DBG_INFO("bsec_sensor_control() returned %d", ret);
                // Trigger a measurement if necessary
                bme680_bsec_trigger_measurement(&env_sensor_settings, sleep);
                // Read data from last measurement
                num_bsec_inputs = 0;
                bme680_bsec_read_data(time_stamp, env_sensor_bsec_inputs, &num_bsec_inputs, env_sensor_settings.process_data);
                // Time to invoke BSEC to perform the actual processing
                bme680_bsec_process_data(env_sensor_bsec_inputs, num_bsec_inputs, output_ready);
                PX_DBG_INFO("Duration = %u ms", px_sysclk_get_tick_count() - start_ms);
            }
        }

    I let the code run and here is a sample of my debug output:

    I 03603.529 env 137 : IAQ = 74.959404 (46.823772) Accuracy 1
    I 03603.530 env 138 : Temp = 24.963551 (24.964138)
    I 03603.535 env 139 : Hum = 46.385796 (46.384171)
    I 03603.540 env 140 : Press = 100085.429688
    I 03603.545 env 141 : Gas = 334342.656250
    I 03603.549 env 142 : eCO2 = 587.295105
    I 03603.554 env 143 : VOC = 0.695413
    I 03603.558 env 537 : Duration = 2032 ms
    I 03901.491 env 529 : bsec_sensor_control() returned 0
    I 03903.493 env 137 : IAQ = 56.065483 (38.570335) Accuracy 1
    I 03903.494 env 138 : Temp = 25.041248 (25.041836)
    I 03903.499 env 139 : Hum = 46.475689 (46.474064)
    I 03903.504 env 140 : Press = 100083.601562
    I 03903.509 env 141 : Gas = 336407.125000
    I 03903.513 env 142 : eCO2 = 554.281372
    I 03903.518 env 143 : VOC = 0.613845
    I 03903.522 env 537 : Duration = 2032 ms
    I 04201.468 env 529 : bsec_sensor_control() returned 0
    I 04203.470 env 137 : IAQ = 72.424126 (45.716286) Accuracy 1
    I 04203.471 env 138 : Temp = 25.103903 (25.104490)
    I 04203.476 env 139 : Hum = 46.501877 (46.500244)
    I 04203.481 env 140 : Press = 100090.484375
    I 04203.486 env 141 : Gas = 332980.343750
    I 04203.490 env 142 : eCO2 = 582.865112
    I 04203.495 env 143 : VOC = 0.683868

     Why does the IAQ value vary so much between readings?

    If, because of the excessive calculation burden, we were not interested in IAQ but only in eCO2 and VOC, would there be a simpler formula / procedure available to calculate them from the gas resistance?

    Thanks in advance,

    Pieter

    handytech
    Community Moderator
    Community Moderator

    @piconomix wrote:

    I suspect that my Makefile did not pick up the change to the "bsec_serialized_configurations_iaq.c" file and used an old object file. After fixing it, the BSEC lib expects to be called every 300 seconds.


    Good to hear 🙂


    @piconomix wrote:

    Why does the IAQ value vary so much between readings?


    Honestly, it is hard to say from 3 data points, and without knowing the environmental conditions during the 15 minutes observed here. What I can tell from this small amount of data is that:

    • The IAQ is reacting to stimuli in the raw gas resistance signal, thus it seems to behave as expected,
    • It looks like IAQ accuracy is still at 1. This typically indicates that BSEC hasn't experienced significant enough stimuli (yet?), thus the self-calibration status is uncertain, which could explain such variations. If the second value (sIAQ) shows less significant variations, this would support the hypothesis.

    @piconomix wrote:

    If we were not interrested in IAQ, because of the excessive calculation burden, but only eCO2 and VOC, would there be a simpler formula / procedure available to calculate it from the gas resistance?


    Which outputs are enabled will not reduce the processing effort. In fact, eCO2 and bVOCeq are derived from the sIAQ output, so the same internal calculations would still be needed.


    @piconomix wrote:

    FYI: we are using the BME680 in a battery powered IoT device that must take a measurement and send an RF packet with the data to a gateway that relays the info to a server. It is vital that the device's battery last as long as possible.


    Since you mention it, a valid integration is also to collect only raw BME680 data at the edge and run BSEC on your gateway or server. Of course the BME680 needs to be operated with the appropriate settings, and an absolute timestamp is still mandatory. If BSEC outputs are still needed at the edge (e.g. to be displayed on a screen), you would need to consider how to stream the relevant data back to your device. Pros and cons must be weighed for your specific use-case.
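    As an illustration, such a raw-data payload could look like the packed struct below. The fields and scaling are my assumptions for the sketch, not a defined Bosch format:

        #include <stdint.h>

        // Raw BME680 sample shipped over RF; BSEC runs on the gateway/server
        typedef struct __attribute__((packed))
        {
            int64_t  timestamp_ns;   // absolute timestamp, still mandatory
            uint32_t pressure;       // Pa
            int32_t  temperature;    // 0.01 degC
            uint32_t humidity;       // 0.001 %rH
            uint32_t gas_resistance; // Ohm
        } env_rf_packet_t;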
