07-27-2016 01:21 AM
07-27-2016 11:02 AM
typedef struct OVR_ALIGNAS(8) ovrPoseStatef_
{
    ovrPosef    ThePose;              ///< Position and orientation.
    ovrVector3f AngularVelocity;      ///< Angular velocity in radians per second.
    ovrVector3f LinearVelocity;       ///< Velocity in meters per second.
    ovrVector3f AngularAcceleration;  ///< Angular acceleration in radians per second per second.
    ovrVector3f LinearAcceleration;   ///< Acceleration in meters per second per second.
    OVR_UNUSED_STRUCT_PAD(pad0, 4)    ///< \internal struct pad.
    double      TimeInSeconds;        ///< Absolute time that this pose refers to. \see ovr_GetTimeInSeconds
} ovrPoseStatef;
07-28-2016 12:01 AM
galopin said:
All you have for the headset and touch controllers is this; they have been renamed, but they are what you are asking for, I believe.

typedef struct OVR_ALIGNAS(8) ovrPoseStatef_
{
    ovrPosef    ThePose;              ///< Position and orientation.
    ovrVector3f AngularVelocity;      ///< Angular velocity in radians per second.
    ovrVector3f LinearVelocity;       ///< Velocity in meters per second.
    ovrVector3f AngularAcceleration;  ///< Angular acceleration in radians per second per second.
    ovrVector3f LinearAcceleration;   ///< Acceleration in meters per second per second.
    OVR_UNUSED_STRUCT_PAD(pad0, 4)    ///< \internal struct pad.
    double      TimeInSeconds;        ///< Absolute time that this pose refers to. \see ovr_GetTimeInSeconds
} ovrPoseStatef;
typedef struct OVR_ALIGNAS(8) ovrTrackingState_
{
    /// Predicted head pose (and derivatives) at the requested absolute time.
    /// The look-ahead interval is equal to (HeadPose.TimeInSeconds - RawSensorData.TimeInSeconds).
    ovrPoseStatef HeadPose;

    /// Current pose of the external camera (if present).
    /// This pose includes camera tilt (roll and pitch). For a leveled coordinate
    /// system use LeveledCameraPose.
    ovrPosef CameraPose;

    /// Camera frame aligned with gravity.
    /// This value includes position and yaw of the camera, but not roll and pitch.
    /// It can be used as a reference point to render real-world objects in the correct location.
    ovrPosef LeveledCameraPose;

    /// The most recent calculated pose for each hand when hand controller tracking is present.
    /// HandPoses[ovrHand_Left] refers to the left hand and HandPoses[ovrHand_Right] to the right hand.
    /// These values can be combined with ovrInputState for complete hand controller information.
    ovrPoseStatef HandPoses[2];

    /// The most recent sensor data received from the HMD.
    ovrSensorData RawSensorData;

    /// Tracking status described by ovrStatusBits.
    unsigned int StatusFlags;

    /// Hand status flags described by ovrStatusBits.
    /// Only ovrStatus_OrientationTracked and ovrStatus_PositionTracked are reported.
    unsigned int HandStatusFlags[2];

    /// Tags the vision processing results to a certain frame counter number.
    uint32_t LastCameraFrameCounter;

    OVR_UNUSED_STRUCT_PAD(pad0, 4)    ///< \internal struct padding
} ovrTrackingState;
07-28-2016 11:13 AM
07-31-2016 12:28 AM
galopin said:
I think what you see here is an API evolution: they moved the raw data into the pose first to support several devices (HMD + Touches), then later decided to remove what was a redundant entry for the HMD when moving toward the final API.
If you look at the definition of ovrSensorData here ( https://developer.oculus.com/doc/0.4.4-libovr/structovr_sensor_data.html ), it is just that, e.g. the magnetometer defined in rad/s², exactly the same as AngularAcceleration in the pose now.
Of course, they may have some hidden filtering over the values, but I do believe what is in the pose now is what you are asking for, and there is nothing more they could have provided here.
EDIT: @cybereality Is it possible to fix the forum that spawns random smileys everywhere?