I need to know how the raw magnetometer data is processed, please.
When I plot raw magnetometer data with the CHR serial interface, the range is roughly -100 to 150.
When I plot processed magnetometer data, the range is roughly -0.5 to 1.5, and this does not look like 16-bit data. What’s going on?
After the magnetometer is calibrated, the processed data should be normalized so that the 3-element mag vector is unit-norm regardless of the orientation. If the processed data is not unit-norm, then it reflects poor calibration or the possible presence of nearby objects distorting the magnetic field.
Also note that the processed mag data registers (addresses 0x69, 0x6A, and 0x6B) are 32-bit floating-point values, while the raw data is stored as 16-bit integers.
That could be tricky. The process of converting the raw data to processed data involves a floating point matrix multiply and an addition. To do that on a microcontroller, you’d need to pull the floating point calibration terms off the device beforehand, convert them to fixed-point, and implement a fixed-point matrix multiply algorithm on the micro.