This device produces readings in g, but g is not a constant value over the surface of the Earth. What value of g, in m/s², does an output value of unity represent? Is 1 g the value of the Earth's gravitational field at the location where the device was calibrated, or is the device calibrated in m/s², with 1 g representing a fixed constant such as 9.81 m/s²?
The sensor's 1 g output refers to the g value at the location where the last calibration was performed. For most applications, the slight difference in g between locations doesn't influence performance.
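To make the distinction concrete, here is a minimal sketch (my own illustration, not from the datasheet) of how a reading in g would be rescaled to absolute m/s² once you know which g value the sensor's unity output actually corresponds to. The numeric values below are assumptions for the example, not calibration constants of any real device.

```python
# Hypothetical values for illustration only:
G_CAL = 9.80665    # g value assumed at the calibration site, m/s^2
G_LOCAL = 9.7803   # locally estimated g at the measurement site, m/s^2

def reading_to_ms2(reading_g: float, g_cal: float = G_CAL) -> float:
    """Convert a sensor output expressed in 'g' to m/s^2.

    g_cal is the g value that the sensor's 1 g output corresponds to,
    which is exactly the quantity being asked about in this thread.
    """
    return reading_g * g_cal

# A unity output maps to whatever g the factory calibration assumed:
print(reading_to_ms2(1.0))            # 9.80665 under the assumed G_CAL
# The relative error from using the wrong g at another site:
print((G_CAL - G_LOCAL) / G_LOCAL)    # a few parts in 10^4
```

This is why the question matters for precise gravimetry: the conversion is a simple scale factor, but you need to know which g the factory baked into it.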
Hi handytech, what do you mean by "the last calibration"? We're interested in very precise gravity measurements that require using our locally estimated gravitational acceleration. Can we redo this calibration, or is it only done at the factory? Is it possible to read out the stored constant, or at least to know what value is used inside the sensor?
Thanks for pointing me to the application note. Still, the sensor has to be calibrated at the factory, and some conversion factor has to be defined to express the specific forces in g. I am wondering what this conversion factor, i.e. the calibrated or assumed value of g, is by default.
In addition, I am wondering how a constant bias can be distinguished from a variation in g. Maybe I am missing something, but with the simple static calibration described in the application note, it does not seem possible to do so. No?
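A sketch of my own reasoning on this point (an assumption about how the static calibration works, not something stated in the app note): in a two-position static calibration you measure the axis pointing up (+g) and down (−g). The bias appears as the common-mode term and does separate out, but the scale factor and the local g value only ever appear as a product, so a scale-factor error cannot be distinguished from an error in the assumed g by static measurements alone.

```python
import math

def static_two_position(up: float, down: float):
    """Recover (bias, scale*g) from up/down static readings.

    Model assumed here: reading = scale * a_true + bias,
    with a_true = +g_local (up) or -g_local (down).
    """
    bias = (up + down) / 2.0           # common-mode term: the bias
    scale_times_g = (up - down) / 2.0  # differential term: scale * g_local
    return bias, scale_times_g

# Two different (scale, g_local) pairs with the same product produce
# identical readings, whatever the bias is:
b = 0.02
s1, g1 = 1.0, 9.81
s2, g2 = 2.0, 9.81 / 2.0   # deliberately exaggerated scale error

up1, down1 = s1 * g1 + b, -s1 * g1 + b
up2, down2 = s2 * g2 + b, -s2 * g2 + b

r1 = static_two_position(up1, down1)
r2 = static_two_position(up2, down2)
print(math.isclose(r1[0], r2[0]) and math.isclose(r1[1], r2[1]))  # True
```

So under this model the bias itself is separable from the ±g term by flipping the axis; what the static procedure cannot resolve is scale error versus g error, which is the ambiguity relevant to precise gravity work.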