Build custom arm model for VR

It's time for some fun: an awesome use case for quaternions that yields portable results. If you aren't familiar with quaternions, please feel free to review them in my previous article. If that's still confusing, no problem; you can use the code below, but it might be a bit tough to follow without the prerequisite math knowledge.

So let's look at the inspiration:


Notice that controller?

One here too!

In the world of mobile VR, the two main systems, Google's Daydream and Oculus' mobile GearVR, both now ship with a controller that is used much like a laser pointer in the virtual world. While both Google and Oculus provide an SDK for working with the controller, much of what it takes to work with it is missing from any form of documentation. As a result, most apps rely on someone else's controller implementation and take on a project-wide dependency (Unity, Unreal) just for one very small aspect of the application. My project, the VR-controlled drone, will also require custom work with the controller SDK to get right. So it's time to fill in the gap in knowledge and show that, with a little math, we can free ourselves of the third-party engine requirement and keep our code base portable.

First, let's talk about what our two control points are. We get an orientation quaternion from both the controller and the headset, which gives us the 3-axis orientation of each device. We also know, from each device's IMU, how it is currently moving: the gyroscope reports angular velocity and the accelerometer reports acceleration.

The other thing we know is the basic physiology of the human body and the way humans move in relation to those quaternions.


There's a limited set of ways humans move.

Let's encode all of that knowledge into structures that can hold it:

    // Joint state for our arm model, with brief notes on what each field holds:
    Vector3f torsoDir;          // horizontal direction the torso faces
    Quatf shoulderRotation;     // orientation of the shoulder joint
    Vector3f shoulderPosition;  // shoulder relative to the center of the head
    Vector3f elbowOffset;       // accumulated positional drift of the elbow
    Quatf elbowRotation;        // orientation of the elbow joint
    Vector3f elbowPosition;     // elbow relative to the shoulder
    Quatf wristRotation;        // orientation of the wrist joint
    Vector3f wristPosition;     // wrist relative to the elbow
    Vector3f zero_accel;        // slowly calibrated estimate of gravity
    Vector3f filtered_accel;    // acceleration with gravity removed
    Vector3f filtered_velocity; // integrated, damped velocity
    Quatf startDir;             // controller orientation at the first update
    Vector3f controllerOrient;  // cached controller orientation
    // A few more members the code below relies on:
    bool first_update = true;         // no tracking history yet
    float added_elbow_height = 0.0f;  // optional elbow height tweak
    float added_elbow_depth = 0.0f;   // optional elbow depth tweak
Now notice what we're interested in here: which direction and position each individual part of the human anatomy is situated in. I would submit that this gives us the ability to model a human arm holding a controller: if we fill in what we know from the headset and the controller, and make positional assumptions based on knowledge of normal human anatomy, we can fill in the blanks.

Time to start filling in the blanks:

// First, define what direction Forward is in our world; we go INTO the z axis (positive Z comes toward the eye, negative Z goes away from it).
const Vector3f FORWARD{ 0.0f, 0.0f, -1.0f };
// Following that, Up means the Y axis in the positive direction goes, well, UP.
const Vector3f UP{0.0f, 1.0f, 0.0f};

// Now let's define our gravity calibration constants, which let us smooth out the readings we're going to take
const float GRAVITY_CALIB_STRENGTH = 0.999f;
const float VELOCITY_FILTER_SUPPRESS = 0.99f;
const float MIN_ACCEL = 1.0f;
// Remember this from high school physics: gravity accelerates objects at 9.807 meters per second per second
const float GRAVITY_FORCE = 9.807f;

// Now for the positions of the parts of the body; the following are all in meters.
// The right shoulder, relative to the center of the eyes (bridge of the nose), is a little to the right, a little down, and a little behind.
Vector3f DEFAULT_SHOULDER_RIGHT{ 0.015f, -0.015f, 0.003f };
// Relative to the shoulder, the elbow is about half a meter away from the shoulder's current rotation and position. In the resting position, the elbow also sits a little out to the side of the shoulder.
Vector3f DEFAULT_ELBOW_POS{0.095f, -0.5f, 0.075f};
// Finally, relative to the elbow, the wrist is a quarter of a meter away from the elbow's current rotation and position. In the resting position the wrist points straight out from the elbow.
Vector3f DEFAULT_WRIST_POS{0.0f, 0.0f, -0.25f};

// We also bound how far the elbow offset may drift from its resting position (in meters); it doesn't make sense for an elbow to wander arbitrarily far.
Vector3f ELBOW_MIN_RANGE{ -0.05f, -0.1f, -0.2f };
Vector3f ELBOW_MAX_RANGE{ 0.05f, 0.1f, 0.0f };

Next let's look at what we can do every frame to fill out our torso orientation:

    void ControllerArmModel::UpdateTorsoDirection() {
        // Place the shoulder in its anatomical position based on handedness.
        shoulderRotation = Quatf();
        shoulderPosition = DEFAULT_SHOULDER_RIGHT;
        shoulderPosition = shoulderPosition * Vector3f(getCurrentHandedness() == RightHanded ? 1.0f : -1.0f, 1.0f, 1.0f);

        // Determine the horizontal gaze direction by grabbing the center view rotation
        // matrix from the headset, turning it into a quaternion, and flattening out
        // the vertical component.
        Vector3f headDirection = Quatf(getCenterViewMatrix()) * FORWARD;
        headDirection.y = 0;
        headDirection.Normalize();
        torsoDir = headDirection;

        // Rotate the fixed joints to match the gaze.
        Quatf gazeRotation = FromToRotation(FORWARD, torsoDir);
        shoulderRotation = gazeRotation;
        shoulderPosition = gazeRotation.Rotate(shoulderPosition);
    }
So now we have oriented the torso horizontally based on the center of the headset's orientation, and rotated and positioned the shoulder so that it makes sense with that headset position.
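One helper the code above assumes is FromToRotation, which builds the shortest-arc quaternion rotating one direction onto another. Your math library may already provide an equivalent; if not, here's a minimal sketch (the Cross/Dot/Normalized member names and the (x, y, z, w) Quatf layout are assumptions of mine):

    // Minimal shortest-arc rotation sketch: returns the quaternion that rotates
    // the unit vector `from` onto the unit vector `to`.
    Quatf FromToRotation(const Vector3f& from, const Vector3f& to) {
        float cosAngle = from.Dot(to);
        if (cosAngle < -0.9999f) {
            // Opposite vectors: any perpendicular axis works for a 180 degree turn.
            Vector3f ortho = fabsf(from.x) < 0.9f ? Vector3f{1.0f, 0.0f, 0.0f}
                                                  : Vector3f{0.0f, 1.0f, 0.0f};
            Vector3f axis = from.Cross(ortho).Normalized();
            return Quatf(axis.x, axis.y, axis.z, 0.0f);
        }
        // q = (axis * sin(angle/2), cos(angle/2)), built without trig calls:
        // Cross(from, to) has length sin(angle), and s equals 2 * cos(angle/2).
        Vector3f axis = from.Cross(to);
        float s = sqrtf((1.0f + cosAngle) * 2.0f);
        return Quatf(axis.x / s, axis.y / s, axis.z / s, s * 0.5f);
    }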

To move further down the arm, we now need to know how the controller, and with it the wrist, is moving:

    void ControllerArmModel::UpdateFromController() {
        // Get the orientation-adjusted acceleration.
        Vector3f Accel = getControllerAcceleration();
        Quatf controllerOrientation = getControllerOrientation();
        Accel = controllerOrientation.Rotate(Accel);

        // Very slowly calibrate the force of gravity out of the acceleration
        // (an exponential moving average of the raw readings).
        zero_accel = zero_accel * GRAVITY_CALIB_STRENGTH + Accel * (1.0f - GRAVITY_CALIB_STRENGTH);
        filtered_accel = Accel - zero_accel;

        // if no tracking history, reset the velocity.
        if (first_update) {
            filtered_velocity = {0.0f, 0.0f, 0.0f};
            startDir = controllerOrientation;
            first_update = false;
        }

        // IMPORTANT: The accelerometer is not reliable at these low magnitudes
        // so ignore it to prevent drift
        if (filtered_accel.Magnitude() < MIN_ACCEL) {
            // Suppress the acceleration.
            filtered_accel = {0.0f, 0.0f, 0.0f};
            filtered_velocity *= 0.5f;
        } else {
            // if the velocity is decreasing, prevent snap-back by reducing deceleration
            Vector3f new_velocity =
                    filtered_velocity + filtered_accel * getDeltaTimeSinceLastFrame();
            if (new_velocity.MagnitudeSquared() < filtered_velocity.MagnitudeSquared()) {
                filtered_accel *= 0.5f;
            }
        }
    }

    void ControllerArmModel::UpdateVelocity() {
        // Update the filtered velocity
        filtered_velocity = filtered_velocity + (filtered_accel * getDeltaTimeSinceLastFrame());
        filtered_velocity = filtered_velocity * VELOCITY_FILTER_SUPPRESS;
    }

So all we did here was get information from the controller and update our understanding of how it's moving within our virtual space; note that with GRAVITY_CALIB_STRENGTH at 0.999, the zero_accel moving average converges toward the gravity vector over roughly a thousand frames, so what remains in filtered_accel is genuine hand motion. Next we want to apply this information to the arm model. We already have the position and rotation of the shoulder, so we want to place the arm in the anatomically correct orientation and position with respect to the shoulder:

    void ControllerArmModel::TransformElbow() {
        // Drift the elbow by the filtered velocity, but keep it within anatomical bounds.
        elbowOffset += filtered_velocity * getDeltaTimeSinceLastFrame();
        elbowOffset.x = Clamp(elbowOffset.x, ELBOW_MIN_RANGE.x, ELBOW_MAX_RANGE.x);
        elbowOffset.y = Clamp(elbowOffset.y, ELBOW_MIN_RANGE.y, ELBOW_MAX_RANGE.y);
        elbowOffset.z = Clamp(elbowOffset.z, ELBOW_MIN_RANGE.z, ELBOW_MAX_RANGE.z);
    }
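The Clamp helper used here (and again below) just pins a value into a range; if your math library doesn't have one, a minimal version looks like this:

    // Minimal clamp helper: pins value into [minVal, maxVal].
    inline float Clamp(float value, float minVal, float maxVal) {
        return value < minVal ? minVal : (value > maxVal ? maxVal : value);
    }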

Now that we know where the arm is positioned, we can rotate it and position the wrist so that the controller's location and orientation make sense. Here comes the magic:

    void ControllerArmModel::ApplyArmModel() {
        // Find the controller's orientation relative to the player
        Quatf controllerOrientation = GetCurrentOrient();
        controllerOrientation = shoulderRotation.Inverted() * controllerOrientation;

        // get the relative positions of the joints
        Vector3f handedMultiplier = {getCurrentHandedness() == RightHanded ? 1.0f : -1.0f,
                                               1.0f, 1.0f};
        elbowPosition = Vector3f(DEFAULT_ELBOW_POS.x,
                                 DEFAULT_ELBOW_POS.y + added_elbow_height,
                                 DEFAULT_ELBOW_POS.z + added_elbow_depth);
        elbowPosition = (elbowPosition * handedMultiplier) + elbowOffset;
        wristPosition = DEFAULT_WRIST_POS * handedMultiplier;

        // Extract just the pitch: the angle of rotation about the x axis
        Vector3f controller_forward = controllerOrientation.Rotate(FORWARD);
        float x_angle = 90.0f - controller_forward.AngleDegrees(UP);

        // Build the controller's yaw/pitch rotation, discarding the roll (z rotation)
        Quatf x_y_rotation = FromToRotation(FORWARD, controller_forward);

        // Offset the elbow by the extension
        const float MIN_EXTENSION_ANGLE = 7.0f;
        const float MAX_EXTENSION_ANGLE = 60.0f;
        float normalized_angle =
                (x_angle - MIN_EXTENSION_ANGLE) / (MAX_EXTENSION_ANGLE - MIN_EXTENSION_ANGLE);
        float extension_ratio = Clamp(normalized_angle, 0.0f, 1.0f);

        // Calculate the lerp interpolation factor
        const float EXTENSION_WEIGHT = 0.4f;
        float total_angle = RadToDegree(x_y_rotation.Angle(Quatf()));
        float lerp_suppression = 1.0f - powf(total_angle / 180.0f, 6);
        float lerp_value = lerp_suppression * (0.4f + 0.6f * extension_ratio * EXTENSION_WEIGHT);

        // Apply the absolute rotations to the joints
        Quatf lerp_rotation = Quatf().Lerp(x_y_rotation, lerp_value);
        elbowRotation = shoulderRotation * lerp_rotation.Inverted() * controllerOrientation;
        wristRotation = shoulderRotation * controllerOrientation;

        // Determine the relative positions
        elbowPosition = shoulderRotation.Rotate(elbowPosition);
        wristPosition = elbowPosition + elbowRotation.Rotate(wristPosition);
    }

As you can see from the above, the process is to rotate and translate each of the joints in turn. We also introduce Linear intERPolation, or Lerp, to track from one orientation to the next; there are only so many places a joint can move to with respect to where it has just been, which is what gives us the extra help. Most importantly, recognize that the inverse of a rotation is the same rotation in the opposite direction. So the wrist rotation is the shoulderRotation (which embeds the torso orientation) composed with the controllerOrientation, while the elbow rotation is the shoulderRotation composed with the inverse of the lerped pointing rotation and then the controllerOrientation; the inverse backs out the portion of the pointing motion we attribute to the forearm rather than the elbow. Look at this long enough and it does indeed start to make sense.
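A note on Quatf().Lerp: it's a normalized quaternion lerp (nlerp) from the identity toward x_y_rotation. If your math library lacks one, a minimal sketch could look like this (again assuming an (x, y, z, w) layout and a Normalized() member):

    // Minimal nlerp sketch: blend this quaternion toward `target` by t in [0, 1],
    // then renormalize so the result is a valid rotation.
    Quatf Quatf::Lerp(const Quatf& target, float t) const {
        // Flip the target's sign if needed so we interpolate along the shorter arc.
        float dot = x * target.x + y * target.y + z * target.z + w * target.w;
        float sign = (dot < 0.0f) ? -1.0f : 1.0f;
        return Quatf(x + (sign * target.x - x) * t,
                     y + (sign * target.y - y) * t,
                     z + (sign * target.z - z) * t,
                     w + (sign * target.w - w) * t).Normalized();
    }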

To sum it all up: to work with the Snapdragon Flight drone project, we're going to take the wrist position computed above, in both the Daydream and GearVR clients, which lets us know the position and orientation of the controller. To get this data, every frame we'll follow the sequence below (sketched in code right after the list):

  • Update our understanding of the two sources of information, the headset and the controller, by:
    • Updating the torso direction
    • Updating our understanding of the controller's orientation and acceleration
  • TransformElbow to compute the correct offset to use from the shoulder
  • ApplyArmModel to actually place everything in the correct place
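Here's a sketch of what that per-frame driver could look like (the Update name and this exact call order are my framing of the list above, not an SDK requirement):

    // One possible per-frame driver for the arm model, following the sequence above.
    void ControllerArmModel::Update() {
        UpdateTorsoDirection();  // orient the torso and shoulder from the headset
        UpdateFromController();  // filter gravity out of the controller's acceleration
        UpdateVelocity();        // integrate the acceleration into a damped velocity
        TransformElbow();        // drift and clamp the elbow offset
        ApplyArmModel();         // place and rotate the elbow and wrist
    }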
If you look carefully, every frame we check the current handedness and apply the right- or left-arm transformation by multiplying through the handed multiplier to place the shoulder in the right spot. Now, to get this to work on Daydream and GearVR, we need the controller to give us this information. Here's how I do it.

First, Daydream (where, I'll note, the bulk of the code in this tutorial comes from):

    void DaydreamController::OnSurfaceCreated(gvr_context *gvrContext) {
        controller_api_.reset(new gvr::ControllerApi);
        controller_api_->Init(gvr::ControllerApi::DefaultOptions(), gvrContext);
        controller_api_->Resume();
    }

    int DaydreamController::getCurrentHandedness() {
        int result = UNKNOWN;
        if(GetIsReady()) {
            switch (gvr_api_->GetUserPrefs().GetControllerHandedness()) {
                case GVR_CONTROLLER_RIGHT_HANDED:
                    result = RIGHT;
                    break;
                case GVR_CONTROLLER_LEFT_HANDED:
                    result = LEFT;
                    break;
                default:
                    result = UNKNOWN;
                    break;
            }
        }
        return result;
    }

    bool DaydreamController::GetIsReady() {
        return controller_state_.GetApiStatus() == GVR_CONTROLLER_API_OK &&
               controller_state_.GetConnectionState() == GVR_CONTROLLER_CONNECTED;
    }

    Vector3f DaydreamController::GetCurrentAccel() {
        gvr_vec3f gvrAccel = controller_state_.GetAccel();
        return {gvrAccel.x, gvrAccel.y, gvrAccel.z};
    }

    Quatf DaydreamController::GetCurrentOrient() {
        auto quat = controller_state_.GetOrientation();
        return {quat.qx, quat.qy, quat.qz, quat.qw};
    }

    void DaydreamController::PauseTracking() {
        if (controller_api_ != nullptr)
            controller_api_->Pause();
    }

    void DaydreamController::ResumeTracking() {
        if (controller_api_ != nullptr)
            controller_api_->Resume();
    }

Now we need to look at how to do the same thing for GearVR:

OculusController::OculusController()
: CurrentHandedness(Unknown)
, HeadsetDeviceID(ovrDeviceIdType_Invalid)
, RemoteDeviceID(ovrDeviceIdType_Invalid)
{

}

OculusController::~OculusController() {

}

int OculusController::getCurrentHandedness() {
    return CurrentHandedness;
}

void OculusController::Update() {
    bool foundRemote = false;
    bool foundHeadset = false;

    for (uint32_t deviceIndex = 0; ; deviceIndex++)
    {
        ovrInputCapabilityHeader curCaps;

        if(vrapi_EnumerateInputDevices(Ovr, deviceIndex, &curCaps ) < 0)
        {
            // No more devices, we are done!
            break;
        }
        switch(curCaps.Type)
        {
            case ovrControllerType_TrackedRemote:
                if( !foundRemote )
                {
                    foundRemote = true;
                    if(RemoteDeviceID != curCaps.DeviceID )
                    {
                        RemoteDisconnect( RemoteDeviceID );
                        RemoteConnect( curCaps.DeviceID );
                    }
                }
                break;
            case ovrControllerType_Headset:
                if( !foundHeadset )
                {
                    foundHeadset = true;
                    if( HeadsetDeviceID != curCaps.DeviceID )
                    {
                        HeadsetDisconnect(HeadsetDeviceID);
                        HeadsetConnect(curCaps.DeviceID);
                    }
                }
                break;
            default:
                break;
        }
    }

    // if no remotes found, disconnect any connected remote
    if(!foundRemote && RemoteDeviceID != ovrDeviceIdType_Invalid)
        RemoteDisconnect(RemoteDeviceID);
    if(!foundHeadset && HeadsetDeviceID != ovrDeviceIdType_Invalid)
        HeadsetDisconnect(HeadsetDeviceID);
}
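GetCurrentAccel and GetCurrentOrient below read from remoteTracking, which also has to be refreshed every frame. Here's a sketch of that refresh using the VrApi (the UpdateTracking name and the predicted-display-time plumbing from the render loop are my assumptions):

void OculusController::UpdateTracking(double predictedDisplayTime) {
    // Refresh the remote's tracking state once per frame so the pose and
    // acceleration reads below see current data.
    if (RemoteDeviceID != ovrDeviceIdType_Invalid) {
        vrapi_GetInputTrackingState(Ovr, RemoteDeviceID,
                                    predictedDisplayTime, &remoteTracking);
    }
}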

void OculusController::PauseTracking() {
    if(RemoteDeviceID != ovrDeviceIdType_Invalid)
        RemoteDisconnect(RemoteDeviceID);
    if(HeadsetDeviceID != ovrDeviceIdType_Invalid)
        HeadsetDisconnect(HeadsetDeviceID);
}

bool OculusController::GetIsReady() {
    return RemoteDeviceID != ovrDeviceIdType_Invalid;
}

Vector3f OculusController::GetCurrentAccel() {
    if(GetIsReady()) {
        // The arm model calibrates gravity out of a linear acceleration, so we
        // read the pose's LinearAcceleration; AngularAcceleration would not
        // carry the gravity vector we filter against.
        OVR::Vector3f r( remoteTracking.HeadPose.LinearAcceleration );
        return { r.x, r.y, r.z };
    }
    return Vector3f();
}

Quatf OculusController::GetCurrentOrient() {
    if(GetIsReady()) {
        OVR::Quatf r( remoteTracking.HeadPose.Pose.Orientation );
        return { r.x, r.y, r.z, r.w };
    }
    return Quatf();
}

void OculusController::RemoteDisconnect(ovrDeviceID id) {
    log("Remote disconnected, ID = %d", id);
    RemoteDeviceID = ovrDeviceIdType_Invalid;
}

void OculusController::RemoteConnect(ovrDeviceID id) {
    RemoteDeviceID = id;
    ovrInputTrackedRemoteCapabilities remoteCapabilities;

    remoteCapabilities.Header.Type = ovrControllerType_TrackedRemote;
    remoteCapabilities.Header.DeviceID = RemoteDeviceID;
    ovrResult result = vrapi_GetInputDeviceCapabilities( Ovr, &remoteCapabilities.Header );
    if ( result == ovrSuccess )
    {
        CurrentHandedness = (remoteCapabilities.ControllerCapabilities & ovrControllerCaps_RightHand )? Right : Left;
        ovrInputStateTrackedRemote remoteInputState;
        remoteInputState.Header.ControllerType = ovrControllerType_TrackedRemote;
        vrapi_GetCurrentInputState( Ovr, RemoteDeviceID, &remoteInputState.Header );
    }
    else
    {
        err( "vrapi_GetInputDeviceCapabilities: Error %d", result );
    }
    vrapi_RecenterInputPose( Ovr, RemoteDeviceID );
}

void OculusController::HeadsetDisconnect(ovrDeviceID id) {
    log("Headset disconnected, ID=%d", id);
    HeadsetDeviceID = ovrDeviceIdType_Invalid;
}

void OculusController::HeadsetConnect(ovrDeviceID id) {
    HeadsetDeviceID = id;
    ovrInputHeadsetCapabilities hmtCapabilities;
    hmtCapabilities.Header.Type = ovrControllerType_Headset;
    hmtCapabilities.Header.DeviceID = HeadsetDeviceID;
    ovrResult result = vrapi_GetInputDeviceCapabilities(Ovr, &hmtCapabilities.Header);
    if(result == ovrSuccess)
    {
        log("Headset connected %d", HeadsetDeviceID);
    }
    else
    {
        err( "vrapi_GetInputDeviceCapabilities: Error %d", result );
    }
    vrapi_SetRemoteEmulation( Ovr, false );
}
Now just call update once per frame and draw a controller at the wrist position and orientation. Draw the cursor on the screen by projecting it out from the quaternion of the wrist orientation at a set distance. Finally, set the cursor to look at the headset, and you have a laser-pointer implementation that accurately mimics the real-life human arm it is emulating. You also have the keys to make it work in both Daydream and GearVR.
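For concreteness, here's a sketch of that cursor projection (the 2.5 meter distance, the headPosition variable, and the reuse of FromToRotation from earlier are illustrative assumptions):

// Project the cursor out along the wrist's forward direction (sketch).
const float CURSOR_DISTANCE = 2.5f;  // illustrative laser length in meters
Vector3f cursorPosition = wristPosition +
                          wristRotation.Rotate(FORWARD) * CURSOR_DISTANCE;
// Orient the cursor so it faces back toward the headset ("look at").
Vector3f toHead = (headPosition - cursorPosition).Normalized();
Quatf cursorRotation = FromToRotation(FORWARD, toHead);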

If all of this is too confusing, please stay tuned; we'll use this in our Snapdragon Flight VR-controlled drone project. That means the entire app will not only be released on the Oculus Store/Play Store, but I'll also open source the whole thing when I'm done so you can see how I did it. Next up, hardware!! Stay tuned!
