It would be nice to get a better understanding of the internal definition of the IAQ accuracy.
Here is what happens (a sketch of the readout loop follows the samples):
1) sensor start - IAQ=0, accuracy=0
2) some time later - IAQ=50, accuracy=0 (sample value)
3) some time later - IAQ=55, accuracy=1 (sample value)
4) some time later - IAQ=87, accuracy=1 (sample value)
.....
1000) some time later - IAQ=76, accuracy=2 (sample value)
......
2000) some time later - IAQ=155, accuracy=2 (sample value)
......
3000) some time later - IAQ=125, accuracy=3 (sample value)
THEN
3001) some time later - IAQ=87, accuracy=2 (sample value)
3005) some time later - IAQ=25, accuracy=2 (sample value)
3010) some time later - IAQ=5, accuracy=2 (sample value)
3020) some time later - IAQ=0, accuracy=2 (sample value)
3021) some time later - IAQ=0, accuracy=2 (sample value)
3022) some time later - IAQ=0, accuracy=2 (sample value)
....... and so on for a long time
xxxx) some time later - IAQ=0, accuracy=2 (sample value)
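For reference, the values above are read in a simple loop, roughly like this simplified sketch based on the Bsec class from the BSEC Arduino wrapper (member names as I recall them from that library, so the exact names may differ from my real code):

#include <Wire.h>
#include "bsec.h"   // Bsec class from the BSEC Arduino wrapper

Bsec iaqSensor;

void setup() {
  Serial.begin(115200);
  Wire.begin();
  iaqSensor.begin(BME680_I2C_ADDR_PRIMARY, Wire);

  // subscribe to the IAQ output at the low-power rate (one sample every 3 s)
  bsec_virtual_sensor_t sensorList[] = { BSEC_OUTPUT_IAQ };
  iaqSensor.updateSubscription(sensorList, 1, BSEC_SAMPLE_RATE_LP);
}

void loop() {
  // run() returns true whenever BSEC has produced a new output sample
  if (iaqSensor.run()) {
    Serial.print("IAQ=");
    Serial.print(iaqSensor.iaq);
    Serial.print(", accuracy=");
    Serial.println(iaqSensor.iaqAccuracy);
  }
}

With BSEC_SAMPLE_RATE_LP this prints one IAQ/accuracy pair roughly every 3 seconds, which is the sampling rate mentioned in question 1 below.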
--------------------------------
As far as I understand, the sensor is collecting data, and once it is decided that enough data is available the accuracy is raised to 1/2/3.
So the questions:
1) How long should this process take, and what kind of data is needed?
(One sensor here has already been collecting data for more than 10 hours (one sample every 3 seconds) and still reports accuracy=1.)
2) What are the conditions for accuracy=2 and accuracy=3? Can somebody describe the internal workflow?
3) How can it be correct that accuracy=2 while the IAQ stays at 0 for a long time?
I think in such a case there should be an internal "reset" of the collected data (or a recalculation/shift), and the accuracy should be set back to 0, right? (See the sketch below for the kind of logic I mean.)
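What I would expect internally is something roughly like the following. This is purely my own guess at the logic, not actual BSEC code; the helper name and the threshold are made up:

#include <stdint.h>

// Hypothetical plausibility check on top of the BSEC outputs - NOT real
// BSEC code, just what I would expect the internal workflow to do.

const uint32_t STUCK_LIMIT = 1200;   // made-up value: 1200 samples * 3 s = 1 hour

uint32_t stuckSamples = 0;

// hypothetical helper: throw away the learned baseline so the accuracy
// has to start again from 0
void resetCalibrationData() {
  // ... whatever the library does when it re-learns the gas baseline ...
}

void checkIaqPlausibility(float iaq, uint8_t accuracy) {
  if (accuracy >= 2 && iaq == 0.0f) {
    // the accuracy claims the output is usable, but IAQ is pinned at 0
    if (++stuckSamples >= STUCK_LIMIT) {
      resetCalibrationData();
      stuckSamples = 0;            // and report accuracy=0 again from here on
    }
  } else {
    stuckSamples = 0;              // IAQ looks plausible, clear the counter
  }
}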
Does the current accuracy=2 mean that these values cannot be trusted and may be completely wrong?
But if that is right, after what time can we expect accuracy=3?
Maybe somebody from development can tell us a bit more about this?
Thanks!
Michael