I'm using a BME280 to estimate the altitude of an object.
I designed a Kalman filter that fuses IMU and barometer data to better estimate movement along the Z axis.
My problem is that I noticed a drift in the pressure measured by the BME280 that is correlated with the rise/fall of its own temperature.
According to the BME280 datasheet, pressure and temperature are related by a "temperature coefficient of offset" (TCO) of 1.5 Pa/K.
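To see why this coefficient matters for altitude estimation, here is a minimal sketch (my own illustrative numbers, not measurements) of how the datasheet TCO translates into apparent altitude drift; the `12 Pa` per metre figure is the approximate sea-level sensitivity:

```python
# Illustrative sketch: how the datasheet TCO turns into apparent altitude drift.
TCO = 1.5            # Pa/K, BME280 datasheet "temperature coefficient of offset"
PA_PER_METER = 12.0  # ~12 Pa per metre near sea level (1 Pa ~ 8 cm)

def apparent_altitude_drift(delta_t_kelvin):
    """Metres of apparent altitude change caused by a temperature change."""
    delta_p = TCO * delta_t_kelvin   # pressure offset in Pa
    return delta_p / PA_PER_METER    # converted to metres

# A 10 K self-heating warm-up alone can look like ~1.25 m of motion:
print(round(apparent_altitude_drift(10.0), 2))  # → 1.25
```

So even a sensor that is perfectly within spec can appear to move by more than a metre purely from board self-heating.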
Analyzing the pressure compensation formula in the BME280 API provided by Bosch (I'm using the latest version from GitHub with the "double" formulas), I assumed this drift was corrected through the "t_fine" value, but according to my measurements this is not the case, as you can see in the following comparison plots:
The acquisition lasts about one hour, and the temperature rises due to self-heating of the PCB, which is enclosed in an aluminium case.
The device is placed alone in a room, and no doors or windows are opened or closed during the test.
Do I have to calibrate the sensor myself, i.e. find the formula that links the pressure trend to the rise/fall in temperature?
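One practical approach along these lines is an empirical per-unit calibration: log a stationary warm-up run, fit pressure against temperature, and subtract the fitted linear trend. The sketch below (my own hypothetical helper functions, stdlib only) shows the idea on synthetic data where the true drift is exactly 1.5 Pa/K:

```python
# Sketch: estimate a per-unit TCO from a stationary warm-up log and remove it.
def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def correct_pressure(p_pa, t_c, tco_pa_per_k, t_ref_c):
    """Remove the temperature-correlated offset around a reference temperature."""
    return p_pa - tco_pa_per_k * (t_c - t_ref_c)

# Synthetic stationary log: constant 101325 Pa plus a 1.5 Pa/K drift.
temps = [25.0, 27.0, 30.0, 33.0, 36.0]
press = [101325.0 + 1.5 * (t - 25.0) for t in temps]

tco, _ = fit_linear(temps, press)                  # recovered slope ≈ 1.5 Pa/K
corrected = [correct_pressure(p, t, tco, 25.0) for p, t in zip(press, temps)]
print(round(tco, 3))                               # → 1.5
print(round(max(corrected) - min(corrected), 6))   # → 0.0 (drift removed)
```

The caveat is that the fit only captures temperature-correlated drift; it cannot distinguish self-heating from genuine weather changes unless the calibration run is done in controlled conditions.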
Understanding this behavior is really important for my application, because 1 Pa of pressure corresponds to about 8 cm of altitude, and a stationary sensor is reported as moving more than 1 meter in less than 10 minutes...
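For reference, the "1 Pa ≈ 8 cm" figure follows from the hypsometric relation dh ≈ (R·T)/(g·M) · dP/P; a quick check with assumed standard-atmosphere values:

```python
# Sanity check of the "1 Pa ~ 8 cm" figure near sea level (standard atmosphere).
R = 8.314     # J/(mol*K), universal gas constant
M = 0.029     # kg/mol, molar mass of dry air (approx.)
G = 9.81      # m/s^2, gravitational acceleration
T = 288.15    # K, standard temperature
P = 101325.0  # Pa, standard sea-level pressure

# dh/dP from the hypsometric relation dh = (R*T)/(g*M) * dP/P
metres_per_pascal = (R * T) / (G * M) / P
print(round(metres_per_pascal * 100, 1))  # → 8.3 (cm per Pa)
```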
From the data shared we cannot tell whether the observed drift is due to the sensor or to environmental conditions (e.g. local weather changes). It would be helpful to repeat the tests in controlled conditions and/or include data from an appropriate reference sensor.