The BMP280 and other pressure sensor datasheets list a relative accuracy specification that is tighter than absolute accuracy. In some industries (metrology for example), relative accuracy is the accuracy of the instrument compared to the calibration standard while absolute accuracy takes the error of the calibration standard into account, as well. Is that what is meant by the BMP280 relative accuracy?
Or, alternatively, does the relative accuracy instead mean the accuracy of a measurement relative to a prior measurement from the same device? E.g., if I measure 900 hPa and then measure 910 hPa, does the relative accuracy mean that the second measurement is actually 910 − 900 = 10 hPa ± 0.12 hPa higher than the former?
If so, how does this compound as pressure is increased and decreased? If I applied a 900, 910, 900, 910 hPa square wave, would the sensor values be repeatable?
With the BMP280, there can be offsets induced by the soldering process, which greatly impact the absolute accuracy. If this offset is removed via calibration against a reference device, then the relative accuracy becomes the key parameter.
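A one-point offset calibration of the kind described can be sketched like this. This is not BMP280 driver code; the function names and the numeric values are hypothetical, and the scheme assumes the solder-induced error is a constant additive offset:

```python
# Sketch of a one-point offset calibration for a BMP280-style sensor.
# Assumption: the soldering-induced error is a constant additive offset in hPa.

def calibrate_offset(sensor_reading_hpa: float, reference_hpa: float) -> float:
    """Return the constant offset (hPa) between sensor and reference."""
    return sensor_reading_hpa - reference_hpa

def corrected_pressure(raw_hpa: float, offset_hpa: float) -> float:
    """Remove the previously measured offset from a raw reading."""
    return raw_hpa - offset_hpa

# Hypothetical example: the sensor reads 1014.5 hPa while a reference
# barometer shows 1013.2 hPa, so the solder-induced offset is 1.3 hPa.
offset = calibrate_offset(1014.5, 1013.2)
print(round(corrected_pressure(1016.0, offset), 1))  # prints 1014.7
```

After this correction, the remaining error within a 200 hPa window is what the relative accuracy specification bounds.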
For the BMP280, relative accuracy is defined over a 200 hPa window (approximately 2000 m of elevation change), as the maximum deviation from the expected output across that window.
That is, you would expect the sensor to respond linearly (a 1 Pa change in pressure producing a 1 Pa change in the reading), and the relative accuracy bounds how far the measurement deviates from that ideal linear response. Over smaller altitude changes the deviation is of course smaller; the BMP280 can reliably detect about 1 m of elevation change.
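The ~1 m figure can be sanity-checked by converting the ±0.12 hPa relative accuracy into an altitude uncertainty. This sketch uses the standard international barometric formula with standard-atmosphere constants; it is a back-of-the-envelope check, not BMP280 driver code:

```python
# Convert a 0.12 hPa pressure change into metres of altitude near sea level,
# using the standard-atmosphere barometric formula (44330 m, exponent 1/5.255).

def altitude_m(p_hpa: float, p0_hpa: float = 1013.25) -> float:
    """Altitude (m) above the reference pressure p0 for a given pressure."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

# Altitude difference corresponding to a 0.12 hPa change near sea level.
# Near sea level the slope is about 8.3 m per hPa, so 0.12 hPa is about 1 m.
delta = altitude_m(1013.25 - 0.12) - altitude_m(1013.25)
print(round(delta, 1))  # prints 1.0
```

So a relative accuracy of ±0.12 hPa corresponds to roughly ±1 m of altitude near sea level, consistent with the claim above.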