02-19-2019 01:24 PM - edited 02-20-2019 11:33 AM
It would be nice to get a better understanding of the internal definition of the IAQ accuracy.
The following happens:
1) starting the sensor - IAQ=0, accuracy=0
2) some time later - IAQ=50, accuracy=0 (sample)
3) some time later - IAQ=55, accuracy=1 (sample)
4) some time later - IAQ=87, accuracy=1 (sample)
.....
1000) some time later - IAQ=76, accuracy=2 (sample)
......
2000) some time later - IAQ=155, accuracy=2 (sample)
......
3000) some time later - IAQ=125, accuracy=3 (sample)
THEN
3000) some time later - IAQ=87, accuracy=2 (sample)
3005) some time later - IAQ=25, accuracy=2 (sample)
3010) some time later - IAQ=5, accuracy=2 (sample)
3020) some time later - IAQ=0, accuracy=2 (sample)
3021) some time later - IAQ=0, accuracy=2 (sample)
3022) some time later - IAQ=0, accuracy=2 (sample)
.......and so on for a long time
xxxx) some time later - IAQ=0, accuracy=2 (sample)
--------------------------------
As far as I understand, the sensor collects data and, once it decides that enough data is available, the accuracy is set to 1/2/3.
So the questions:
1) How long should this process take, and what kind of data is needed?
(One sensor here has already been collecting data for >10 hours, every 3 seconds, and still has accuracy=1.)
2) What are the conditions for accuracy=2 and accuracy=3? Can somebody describe the internal workflow?
3) How can it be correct that accuracy=2 while IAQ=0 for a long time?
I think in such a case there should be an internal "reset" of the collected data (or a recalculation/shift), and the accuracy should be set back to 0, right?
Does the current accuracy=2 mean "do not trust these values, they may be completely wrong"?
But if that is right, after what time can we expect accuracy=3?
Maybe somebody from development can tell us a bit more about this?
Thanks!
Michael
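For readers landing on this thread: the IAQ accuracy codes are commonly described as a 0-3 scale. A minimal sketch of that mapping follows; the wording in the dictionary is my paraphrase of commonly circulated descriptions, not an official Bosch definition, and `describe_accuracy` is a hypothetical helper, not part of the BSEC API:

```python
# Hypothetical helper illustrating the commonly described meaning of the
# BSEC IAQ accuracy codes (0-3). The descriptions are paraphrased, not
# taken verbatim from the BSEC reference documentation.
IAQ_ACCURACY = {
    0: "stabilizing: sensor just started, output not yet meaningful",
    1: "low: calibration ongoing, signal history too uniform to calibrate",
    2: "medium: a new calibration event is being tracked",
    3: "high: calibration successful, output can be trusted",
}

def describe_accuracy(code):
    """Return a human-readable description for an IAQ accuracy code."""
    return IAQ_ACCURACY.get(code, "unknown accuracy code")
```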
05-23-2019 01:48 PM
@ljh95 wrote: 1. Is the calibration algorithm triggered not just by deviation in IAQ, but does it also have to accumulate a certain amount of data (which is pretty much the same as its running time since it was first turned on) if the whole state is reset?
The calibrating algorithm is always active and monitoring the baseline, as well as the amplitude of the signal. If you delete the state, it will take the same amount of time to get back to 3/3 in the same conditions.
@ljh95 wrote: 2. In order to make a new device reach accuracy 3, would it be fine to save the state of one device that has already reached accuracy 3 and then load it onto the newly reset device?
No. Each device is unique, and mixing and matching state files will only increase the time to calibrate and decrease the accuracy before it calibrates.
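Since saving and reloading the BSEC state comes up repeatedly in this thread: the usual pattern is to persist the state blob periodically and restore it at boot. Below is a hedged sketch of that persistence loop; `get_state_blob` is a hypothetical stand-in for the real `bsec_get_state()` call (the actual blob is opaque and comes from the BSEC library), and the atomic-write trick is a generic embedded/filesystem technique, not something BSEC itself mandates:

```python
import os
import tempfile

# Hypothetical stand-in for bsec_get_state(): in real firmware the BSEC
# library produces an opaque blob. Here we just encode a counter.
def get_state_blob(sample_count):
    return sample_count.to_bytes(4, "little")

def save_state(path, blob):
    # Write to a temp file, then rename: os.replace is atomic on a single
    # filesystem, so a power cut never leaves a truncated state file.
    tmp = path + ".tmp"
    with open(tmp, "wb") as f:
        f.write(blob)
    os.replace(tmp, path)

def load_state(path):
    # Returns the saved blob, or None on first boot (no state yet).
    try:
        with open(path, "rb") as f:
            return f.read()
    except FileNotFoundError:
        return None
```

At boot, a `None` result would mean starting calibration from scratch; otherwise the blob would be fed back to the library (via `bsec_set_state()` in the real API).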
06-09-2019 09:46 PM
I've been wondering about the meaning of the IAQ number and the raw resistance.
It appears to me the BME680 gas sensor is a device that generally reacts to VOCs. I'm guessing the sensor is not stable (most likely the zero offset, but I'm not sure) and needs to be exposed to different environments to get a sense of high and low VOC exposure.
If I'm correct, a sensor exposed to a constant VOC environment would never leave accuracy [1].
When I first got my sensor running I let it run for 3 to 4 days. I had not yet gotten the BSEC code working. When I got the BSEC code working (but without state saving) I performed the following:
So it seems the IAQ algorithm takes what is generally expected from the VOC sensor, factors in the min and max VOC exposure and tries to create some calibration coefficients.
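That min/max intuition can be illustrated with a toy normalization. This is purely a sketch of the idea described above, not Bosch's proprietary algorithm; the function name, the 0-500 output scale, and the linear mapping are all my assumptions:

```python
def iaq_from_resistance(r_ohm, r_min, r_max):
    """Map a gas-resistance reading onto a 0-500 IAQ-like scale.

    Lower plate resistance on the BME680 generally means more VOCs, so
    r_max (cleanest air seen so far) maps to 0 and r_min maps to 500.
    Purely illustrative; the real BSEC algorithm is proprietary.
    """
    if r_max <= r_min:
        return 0.0  # no dynamic range observed yet, nothing to scale
    frac = (r_max - r_ohm) / (r_max - r_min)  # 0 = cleanest seen, 1 = dirtiest
    return max(0.0, min(500.0, 500.0 * frac))
```

This also illustrates why a sensor that has never seen extremes cannot produce a meaningful reading: until `r_min` and `r_max` are far enough apart, every output is essentially arbitrary.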
For those who know more about this sensor than us mere mortals, is there a way to simulate some high and low VOC conditions (in the home) to accelerate the calibration procedure? And perhaps increase the accuracy.
I was thinking about near-100% helium for the "clean air" point. I also thought about a container with 100% ethanol (the sensor would be in the vapor, not the liquid), but I don't know offhand how to estimate the concentration, and I'm concerned such a high VOC level might shift the sensor output and/or damage the sensor.
Having said all this, I realize the VOC sensor is what I would call a "ballpark" sensor, but I'm a curious fellow 🙂 and am interested what goes into the IAQ reading.
06-11-2019 09:33 AM
@JohnRob wrote:
For those who know more about this sensor than us mere mortals, is there a way to simulate some high and low VOC conditions (in the home) to accelerate the calibration procedure? And perhaps increase the accuracy.
Hi JohnRob,
Firstly, HAHHAHAHAHHA - we are all mere mortals.
Secondly, you can also use a whiteboard marker to trigger the IAQ accuracy 2 state. I will check with our experts and get back to you on how one can build a desk-based setup to completely calibrate the sensor.
Regards,
kgoveas
03-28-2023 12:34 PM
Sorry to revive this old post but it is the best info I have found on this topic so far.
I have been using the slightly newer BME688 and have this same calibration issue.
If I turn the sensor on in the office I'm in, it normally sits just above 50 IAQ. However, if I turn the sensor on outside for 10 minutes, it gets a baseline of good air quality, so when I bring it back inside it goes up towards 150 IAQ.
It seems the calibration depends on the extremes the sensor sees to determine the IAQ value; it can't tell if the air is bad if it hasn't seen other quality levels.
Is there a way to load a profile of captured data as the baseline calibration, so the sensor knows what is good and bad when it loads and doesn't need to be shown the extremes each time it is restarted?
"I will check with our experts and get back to you" I'm hoping 3.5 years is enough time to get more information hahaha.
Thank you for your help.