Hi
For a research project I am currently trying to read acceleration data from the BMA280. This chip is used by Silicon Labs on their WSTK expansion board. At the moment I am following this complete procedure to initialize the accelerometer:
- Perform self test
  - Check that the difference between the +X and -X readouts is greater than 800 mg (answer: check)
  - Check that the difference between the +Y and -Y readouts is greater than 800 mg (answer: check)
  - Check that the difference between the +Z and -Z readouts is greater than 400 mg (answer: check)
- Soft-reset the accelerometer
- Wait 1.5 s
- Initialize the accelerometer to my preferences
- Perform fast compensation
  - Clear the offset registers
  - Set the offset-compensation targets to X: 0 g, Y: 0 g, Z: +1 g, BW 1%
  - X-axis calibration
  - Y-axis calibration
  - Z-axis calibration
  - Read offset X (my result: x = -31 mg)
  - Read offset Y (my result: y = -15 mg)
  - Read offset Z (my result: z = 0 mg)
- Init done
The values afterwards are x ≈ +20 mg, y ≈ +2 mg and z ≈ -19 mg. I expected roughly x = 0 g, y = 0 g and z = -1 g with the chip lying flat on my desk.
I convert the acceleration data in this way.
volatile float factor = 0.488f; // +/-4 g range: 8000 mg / 2^14 LSB
// X-axis data in 0x02 (LSB) and 0x03 (MSB); the 14-bit sample is
// two's complement, so it must be sign-extended, which means tmpx
// has to be a signed type (int16_t), not uint16_t.
LSBx = BMA280_read_register(USART1, BMA280_ACCD_X_LSB);
MSBx = BMA280_read_register(USART1, BMA280_ACCD_X_MSB);
// Combine into a signed 16-bit word, then arithmetic-shift right by 2:
// this drops the two status bits and sign-extends in one step.
tmpx = (int16_t)(((uint16_t)MSBx << 8) | LSBx) >> 2;
acceleration[0] = (float)tmpx * factor;
Can somebody confirm whether this is correct, or point me to the correct init sequence for this accelerometer?
Thank you in advance
Best Regards
Nick