How to get raw sensor data of Oculus with the SDK later than v1.3?

zhangyuchi45
Honored Guest
I've tried getting the raw sensor data (raw acceleration, gyro, magnetometer) of my Oculus DK2 via ovrTrackingState::RawSensorData, which is described in the official documentation for SDK v0.4 through v0.8.
But the section on raw sensor data seems to be gone from the documentation of versions later than 1.3, and there is no RawSensorData field in ovrTrackingState anymore.
So can I still get the raw sensor data of an Oculus DK2 or CV1 through the official SDK? Or is there another way to get it?
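
For reference, this is roughly how I've been reading it with the 0.8 SDK (a sketch from memory; the exact entry point and handle type changed across the 0.x versions, so treat the names as approximate):

#include <OVR_CAPI.h>
#include <cstdio>

// Sketch (0.8-era API, names from memory): dump one raw IMU sample from the HMD.
void DumpRawSensorData(ovrHmd hmd)
{
    ovrTrackingState ts = ovr_GetTrackingState(hmd, ovr_GetTimeInSeconds(), ovrFalse);
    const ovrSensorData& raw = ts.RawSensorData;
    std::printf("t=%.6f accel=(%f, %f, %f) gyro=(%f, %f, %f) mag=(%f, %f, %f)\n",
                raw.TimeInSeconds,
                raw.Accelerometer.x, raw.Accelerometer.y, raw.Accelerometer.z,
                raw.Gyro.x, raw.Gyro.y, raw.Gyro.z,
                raw.Magnetometer.x, raw.Magnetometer.y, raw.Magnetometer.z);
}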




4 REPLIES

galopin
Heroic Explorer
All you have for the headset and Touch controllers is this :smile: The fields have been renamed, but they are what you are asking for, I believe.

typedef struct OVR_ALIGNAS(8) ovrPoseStatef_
{
    ovrPosef     ThePose;               ///< Position and orientation.
    ovrVector3f  AngularVelocity;       ///< Angular velocity in radians per second.
    ovrVector3f  LinearVelocity;        ///< Velocity in meters per second.
    ovrVector3f  AngularAcceleration;   ///< Angular acceleration in radians per second per second.
    ovrVector3f  LinearAcceleration;    ///< Acceleration in meters per second per second.
    OVR_UNUSED_STRUCT_PAD(pad0, 4)      ///< \internal struct pad.
    double       TimeInSeconds;         ///< Absolute time that this pose refers to. \see ovr_GetTimeInSeconds
} ovrPoseStatef;
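
To read those fields on a DK2 or CV1 with the 1.x API, something like this should work (untested sketch, the helper function name is just for illustration):

#include <OVR_CAPI.h>
#include <cstdio>

// Sketch: read the per-device derivatives for the HMD and both Touch controllers (1.x API).
void DumpPoseDerivatives(ovrSession session)
{
    ovrTrackingState ts = ovr_GetTrackingState(session, ovr_GetTimeInSeconds(), ovrFalse);

    const ovrPoseStatef& head = ts.HeadPose;
    std::printf("head  angular vel (rad/s): %f %f %f  linear accel (m/s^2): %f %f %f\n",
                head.AngularVelocity.x, head.AngularVelocity.y, head.AngularVelocity.z,
                head.LinearAcceleration.x, head.LinearAcceleration.y, head.LinearAcceleration.z);

    for (int hand = 0; hand < 2; ++hand)
    {
        const ovrPoseStatef& h = ts.HandPoses[hand];
        std::printf("hand%d angular vel (rad/s): %f %f %f  linear accel (m/s^2): %f %f %f\n",
                    hand,
                    h.AngularVelocity.x, h.AngularVelocity.y, h.AngularVelocity.z,
                    h.LinearAcceleration.x, h.LinearAcceleration.y, h.LinearAcceleration.z);
    }
}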

zhangyuchi45
Honored Guest

galopin said:

All you have for the headset and Touch controllers is this :smile: The fields have been renamed, but they are what you are asking for, I believe.

typedef struct OVR_ALIGNAS(8) ovrPoseStatef_
{
    ovrPosef     ThePose;               ///< Position and orientation.
    ovrVector3f  AngularVelocity;       ///< Angular velocity in radians per second.
    ovrVector3f  LinearVelocity;        ///< Velocity in meters per second.
    ovrVector3f  AngularAcceleration;   ///< Angular acceleration in radians per second per second.
    ovrVector3f  LinearAcceleration;    ///< Acceleration in meters per second per second.
    OVR_UNUSED_STRUCT_PAD(pad0, 4)      ///< \internal struct pad.
    double       TimeInSeconds;         ///< Absolute time that this pose refers to. \see ovr_GetTimeInSeconds
} ovrPoseStatef;


Thanks galopin, but I don't think AngularVelocity and LinearAcceleration are the raw sensor data from the gyro and IMU, since those fields already exist in SDK v0.8, where the raw sensor data is provided separately as the variable RawSensorData of type ovrSensorData (see the RawSensorData field in the quoted code below; I've also pasted the ovrSensorData definition after it). And the ovrPoseStatef HeadPose is no different between v0.8 and v1.3 or later.
typedef struct OVR_ALIGNAS(8) ovrTrackingState_
{
    /// Predicted head pose (and derivatives) at the requested absolute time.
    /// The look-ahead interval is equal to (HeadPose.TimeInSeconds - RawSensorData.TimeInSeconds).
    ovrPoseStatef HeadPose;

    /// Current pose of the external camera (if present).
    /// This pose includes camera tilt (roll and pitch). For a leveled coordinate
    /// system use LeveledCameraPose.
    ovrPosef CameraPose;

    /// Camera frame aligned with gravity.
    /// This value includes position and yaw of the camera, but not roll and pitch.
    /// It can be used as a reference point to render real-world objects in the correct location.
    ovrPosef LeveledCameraPose;

    /// The most recent calculated pose for each hand when hand controller tracking is present.
    /// HandPoses[ovrHand_Left] refers to the left hand and HandPoses[ovrHand_Right] to the right hand.
    /// These values can be combined with ovrInputState for complete hand controller information.
    ovrPoseStatef HandPoses[2];

    /// The most recent sensor data received from the HMD.
    ovrSensorData RawSensorData;

    /// Tracking status described by ovrStatusBits.
    unsigned int StatusFlags;

    /// Hand status flags described by ovrStatusBits.
    /// Only ovrStatus_OrientationTracked and ovrStatus_PositionTracked are reported.
    unsigned int HandStatusFlags[2];

    /// Tags the vision processing results to a certain frame counter number.
    uint32_t LastCameraFrameCounter;

    OVR_UNUSED_STRUCT_PAD(pad0, 4) ///< \internal struct padding

} ovrTrackingState;
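
For reference, ovrSensorData in v0.8 is defined roughly like this (written from memory, so the exact types, padding and comments may differ slightly from the real header):

typedef struct ovrSensorData_
{
    ovrVector3f Accelerometer;  ///< Acceleration reading in m/s^2.
    ovrVector3f Gyro;           ///< Rotation rate in rad/s.
    ovrVector3f Magnetometer;   ///< Magnetic field in Gauss.
    float       Temperature;    ///< Temperature of the sensor in degrees Celsius.
    float       TimeInSeconds;  ///< Time when the reported IMU reading took place.
} ovrSensorData;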

galopin
Heroic Explorer
I think what you see here is an API evolution: they first moved the raw data into the pose to support several devices (HMD + Touch), then later decided to remove what had become a redundant entry for the HMD when moving to the final API.

If you look at the definition of ovrSensorData here ( https://developer.oculus.com/doc/0.4.4-libovr/structovr_sensor_data.html ), it is the same kind of data, with the same units (for example rad/s², exactly like AngularAcceleration in the pose now).

Of course, they may apply some hidden filtering to the values, but I do believe what is in the pose now is what you are asking for, and there is nothing more they could have provided here.
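
If you want something closer to a continuous IMU-like stream from 1.x, about the best I can think of is polling ovr_GetTrackingState yourself with absTime = 0.0 (which, as far as I remember, returns the most recent sensor state without prediction) and logging the derivative fields, e.g. (untested sketch, helper name just for illustration):

#include <OVR_CAPI.h>
#include <cstdio>

// Untested sketch: poll the head pose derivatives as a pseudo IMU stream (1.x API).
// Passing absTime = 0.0 should return the latest sensor state rather than a predicted one.
void LogHeadDerivatives(ovrSession session, int samples)
{
    double lastTime = -1.0;
    for (int i = 0; i < samples; ++i)
    {
        ovrTrackingState ts = ovr_GetTrackingState(session, 0.0, ovrFalse);
        const ovrPoseStatef& head = ts.HeadPose;
        if (head.TimeInSeconds != lastTime)  // only log when a new sample arrived
        {
            lastTime = head.TimeInSeconds;
            std::printf("%.6f, %f, %f, %f, %f, %f, %f\n",
                        head.TimeInSeconds,
                        head.AngularVelocity.x, head.AngularVelocity.y, head.AngularVelocity.z,
                        head.LinearAcceleration.x, head.LinearAcceleration.y, head.LinearAcceleration.z);
        }
    }
}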

EDIT: @cybereality Is it possible to fix the forum that spawns random smileys everywhere?

zhangyuchi45
Honored Guest

galopin said:

I think what you see here is an API evolution: they first moved the raw data into the pose to support several devices (HMD + Touch), then later decided to remove what had become a redundant entry for the HMD when moving to the final API.

If you look at the definition of ovrSensorData here ( https://developer.oculus.com/doc/0.4.4-libovr/structovr_sensor_data.html ), it is the same kind of data, with the same units (for example rad/s², exactly like AngularAcceleration in the pose now).

Of course, they may apply some hidden filtering to the values, but I do believe what is in the pose now is what you are asking for, and there is nothing more they could have provided here.

EDIT: @cybereality Is it possible to fix the forum that spawns random smileys everywhere?



Thanks galopin, I'll try those and see whether they are what I'm looking for. With v0.8 I can apparently get raw sensor data even when the Constellation tracking system is not working; I hope that will still be the case with SDKs later than v1.3.