I'm trying to learn about the offset compensation feature of the BMA456. There's a small section in the datasheet dedicated to this topic, but I would like further clarification from you regarding the process involved, please.
Is it basically a manual process whereby three accelerometer readings, one per axis, are taken in the "starting position", and those three values are then written by the user into the OFFSET_0, OFFSET_1, and OFFSET_2 registers, so that all subsequent readings produced by the device take these offsets into account (provided this has all been enabled via the acc_off_en bit)?
So it's a way in which we can (ideally) produce three zero outputs for x, y, and z at this starting position, because we have previously performed this compensation process at that same position? The datasheet has a separate heading "manual offset compensation", but as far as I can see there is no "automatic" process. By that I mean the user needs to manually read three values from the accelerometer, write them to those three registers, and then set that bit, after which the device compensates future readings with the saved values. So there is no automatic way for the device to populate those three registers by itself - is that correct?
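To make my mental model concrete, here is a minimal sketch of the write sequence I have in mind. The register addresses (NV_CONF = 0x70, OFFSET_0..2 = 0x71..0x73) and the acc_off_en bit position are only my reading of the register map, so please treat them as assumptions to verify against the datasheet:

```c
#include <stdint.h>

/* Register addresses as I read them from the BMA456 register map --
 * assumptions to verify against your datasheet revision.            */
#define BMA456_NV_CONF   0x70u  /* holds acc_off_en (assumed bit 3)  */
#define BMA456_OFFSET_0  0x71u  /* x-axis offset                     */
#define BMA456_OFFSET_1  0x72u  /* y-axis offset                     */
#define BMA456_OFFSET_2  0x73u  /* z-axis offset                     */

struct reg_write { uint8_t reg; uint8_t val; };

/* Build the four register writes that store the offsets and enable
 * compensation.  The bus transport (I2C/SPI) is deliberately left
 * out; `seq` is simply what would be sent over the wire.  NV_CONF
 * is read back first so a read-modify-write preserves other bits.  */
void build_offset_sequence(int8_t off_x, int8_t off_y, int8_t off_z,
                           uint8_t nv_conf_readback,
                           struct reg_write seq[4])
{
    seq[0] = (struct reg_write){ BMA456_OFFSET_0, (uint8_t)off_x };
    seq[1] = (struct reg_write){ BMA456_OFFSET_1, (uint8_t)off_y };
    seq[2] = (struct reg_write){ BMA456_OFFSET_2, (uint8_t)off_z };
    seq[3] = (struct reg_write){ BMA456_NV_CONF,
                                 (uint8_t)(nv_conf_readback | (1u << 3)) };
}
```

Is that the right shape of the procedure, leaving aside the exact addresses and bit position?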
Is it true to say this compensation process doesn't increase accuracy, but rather just ensures, or tries to ensure, a "zero output" at the desired "starting" position?
The datasheet says the device supports manual compensation as well as inline calibration, but I don't see the difference - can you please explain?
How does the accuracy of the BMA456 vary, if at all, with its starting position ? If the device is initially "perfectly" flat, such that x = 0g, y = 0g, and z = 1g, will its future readings be more or less accurate than if its initial position was not in this "perfect" position ? Basically, is the device equally accurate with regards to its x, y, and z output readings regardless of its initial position, or is there a best initial position ?
Whether to perform offset compensation depends on the requirements of the application; it is not mandatory. Offset compensation only removes the offset, it does not improve accuracy. As described in the datasheet, manual offset compensation means writing the read x, y, and z data into the offset compensation registers. Inline offset compensation writes the offset values to NVM, so they are stored in non-volatile memory.
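To put that description into code: a minimal sketch of how the value for each offset register could be derived from an averaged at-rest reading. The scale factors here (16384 LSB/g for 16-bit output at +/-2g, and roughly 3.9 mg per offset-register LSB) are assumptions to check against the datasheet, not confirmed figures:

```c
#include <stdint.h>

#define OUT_LSB_PER_G  16384.0  /* assumed 16-bit output at +/-2g      */
#define OFFSET_LSB_MG  3.9      /* ASSUMED offset resolution in mg/LSB */

/* Turn an averaged at-rest reading (in raw output LSB) into a signed
 * 8-bit value for OFFSET_0/1/2.  expected_g is 0 for x and y, and 1
 * for z, when the device sits perfectly flat during compensation.   */
int8_t offset_from_rest_reading(double avg_raw, double expected_g)
{
    double error_mg = (avg_raw / OUT_LSB_PER_G - expected_g) * 1000.0;
    double off = -error_mg / OFFSET_LSB_MG;  /* correction opposes error */
    if (off >  127.0) off =  127.0;          /* clamp to int8_t range    */
    if (off < -128.0) off = -128.0;
    return (int8_t)off;
}
```

For example, an x-axis average of +639 LSB at rest (about +39 mg of offset) would give an OFFSET_0 value of -10 under these assumed scale factors.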
Ok BSTRobin that's clear, and thanks for the quick reply.
As for the second part of my query ...
"How does the accuracy of the BMA456 vary, if at all, with its starting position ? If the device is initially "perfectly" flat, such that x = 0g, y = 0g, and z = 1g, will its future readings be more or less accurate than if its initial position was not in this "perfect" position ? Basically, is the device equally accurate with regards to its x, y, and z output readings regardless of its initial position, or is there a best initial position ?"
... does that make any sense, or is it irrelevant? I guess I'm asking whether the BMA456 is as accurate when moving from, say, 90 degrees to 89 degrees as it is when moving from 60 degrees to 59 degrees. I've heard that accelerometers are not linear in their accuracy as they are tilted from one angle to another, and I was wondering if this is true of the BMA456 - if so, what information can you provide, please?