Augmented Reality in Xamarin.Android with ARCore

Jon

Now that you’ve had a chance to augment reality in your Xamarin.iOS apps with ARKit, it’s time to explore Google’s take on AR in your Xamarin.Android apps.

The new ARCore SDK provides APIs for Augmented Reality features, such as motion tracking, plane detection, and light estimation. These are the building blocks you will use to add AR experiences to your Android apps.

Getting Started with ARCore

ARCore is currently only available on select devices such as the Google Pixel, Google Pixel 2, and the Samsung Galaxy S8.

In order to use ARCore, you need to prepare your device by downloading and installing arcore-preview.apk.

After you set up your device for ARCore development, you need to install the ARCore prerelease NuGet package.

ARCore API Basics

To help you detect surfaces to place objects on and calculate their location in space relative to the camera, ARCore uses a few basic types.

  • Session: Your main point of interaction with ARCore. It manages the AR state, keeping track of any anchors you add, surfaces the engine has detected, and the current snapshot of the device.
  • Plane: A surface the SDK has detected, onto which you can place an Anchor describing a fixed real-world location of an object (including its orientation). Currently, surfaces facing upward and downward can be detected separately (think floor and ceiling).
  • Frame: The current snapshot of the AR state, returned each time you call Session.Update(). Each frame contains information about the camera’s orientation and relation to the real world and helps compute projection matrices for displaying visual representations to the user. Frame also exposes a convenient HitTest(..) method, which can help determine if tapped coordinates on the display intersect with any planes, for example.
  • LightEstimate: Another interesting feature of the SDK is the ability to obtain a LightEstimate from a given frame. This estimate includes the PixelIntensity of the camera view, which you can use to light virtual objects consistently with the scene.
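To see how these types fit together, here’s a minimal sketch of an update pass using only the members that appear later in this post (session, Update, GetTrackingState, AllPlanes, and LightEstimate); it assumes a session has already been created as shown in the walkthrough below:

```csharp
// Sketch: how Session, Frame, Plane, and LightEstimate relate
var frame = session.Update();   // snapshot of the current AR state

if (frame.GetTrackingState() == Frame.TrackingState.Tracking) {
    // Surfaces ARCore has detected so far
    foreach (var plane in session.AllPlanes) {
        // Each Plane can host Anchors added via session.AddAnchor(...)
    }

    // Average pixel intensity of the camera image, usable for lighting
    var intensity = frame.LightEstimate.PixelIntensity;
}
```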

Basic Walkthrough

We’ve ported the HelloAR sample to Xamarin, and you can go check it out on GitHub! Now let’s walk through a few of the basic things going on in this sample.

First, in your activity you need to create a session in OnCreate and make sure ARCore is supported on the device at runtime:

var config = Config.CreateDefaultConfig();
session = new Session(this);

// Make sure ARCore is supported on this device
if (!session.IsSupported(config)) {
    Toast.MakeText(this, "ARCore unsupported!", ToastLength.Long).Show();
    Finish();
}

Remember, you also need to request the Android.Manifest.Permission.Camera permission to display the live camera feed / augmented reality view to the user. In the “HelloAR” sample, we use a GLSurfaceView to render the camera and augmentations to the user. Make sure you set up your GL surface, or look at how it’s done in the sample code.
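A minimal sketch of the runtime permission request and GL surface setup might look like the following. This assumes API level 23+ (runtime permissions), an activity that implements the GLSurfaceView renderer interface, and a placeholder field name surfaceView; consult the sample for the full setup:

```csharp
// In OnCreate, after setting the content view
if (CheckSelfPermission(Android.Manifest.Permission.Camera) != Permission.Granted) {
    // The result arrives in OnRequestPermissionsResult
    RequestPermissions(new[] { Android.Manifest.Permission.Camera }, 0);
}

// Configure the GLSurfaceView along the lines of the HelloAR sample
surfaceView.PreserveEGLContextOnPause = true;
surfaceView.SetEGLContextClientVersion(2);
surfaceView.SetEGLConfigChooser(8, 8, 8, 8, 16, 0); // alpha channel for blending
surfaceView.SetRenderer(this); // this activity implements the renderer interface
surfaceView.RenderMode = Rendermode.Continuously;
```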

With a session running, we can obtain a snapshot of the AR system state in our GL surface’s OnDrawFrame implementation. After we check that the frame is in a Tracking state, we can look for any hit results and, assuming they intersect a plane, add an anchor to that plane.

// See the PlaneAttachment class from the sample
// This helps associate Anchors with Planes they are attached to
List<PlaneAttachment> planeAttachments = new List<PlaneAttachment>();
void OnDrawFrame (IGL10 gl)
{
    var frame = session.Update();

    // You could keep track of taps by queueing up
    // MotionEvents from a tap gesture recognizer
    var tap = motionEventsQueue.Count > 0 ? motionEventsQueue.Dequeue() : null;

    // Make sure we've got a tap and are in a tracking state for our frame
    if (tap != null && frame.GetTrackingState() == Frame.TrackingState.Tracking) {
        // Look at each hittest result
        foreach (var hit in frame.HitTest(tap)) {
            // Hits could also be point cloud hit results, so check for a plane hit
            var planeHit = hit as PlaneHitResult;
            if (planeHit != null && planeHit.IsHitInPolygon) {
                // Create a new anchor
                var anchor = session.AddAnchor(hit.HitPose);
                // Keep track of our anchors and the planes they are attached to
                planeAttachments.Add(new PlaneAttachment(planeHit, anchor));
            }
        }
    }
}
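The motionEventsQueue above can be filled from a tap gesture listener. Here’s one possible sketch (the queue field and listener class are assumptions, not part of the sample’s API); since taps arrive on the UI thread while OnDrawFrame runs on the GL thread, access to the queue is locked:

```csharp
// Queue shared between the UI thread (taps) and the GL thread (OnDrawFrame)
Queue<MotionEvent> motionEventsQueue = new Queue<MotionEvent>();

// In OnCreate: forward touches on the GL surface to a gesture detector
var gestureDetector = new GestureDetector(this, new TapListener(motionEventsQueue));
surfaceView.Touch += (sender, e) => gestureDetector.OnTouchEvent(e.Event);

class TapListener : GestureDetector.SimpleOnGestureListener
{
    readonly Queue<MotionEvent> queue;

    public TapListener(Queue<MotionEvent> queue) { this.queue = queue; }

    public override bool OnSingleTapUp(MotionEvent e)
    {
        lock (queue) {
            // Keep at most one pending tap per frame
            queue.Clear();
            queue.Enqueue(e);
        }
        return true;
    }

    public override bool OnDown(MotionEvent e) => true;
}
```

For brevity, the dequeue in OnDrawFrame is shown unguarded in the walkthrough; a production app would take the same lock (or use a concurrent queue) when reading taps on the GL thread.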

We also want to render the various objects in our scene in our drawing method. The HelloAR sample has various renderers to do the heavy OpenGL lifting and achieve this based on the projections calculated from the frame:

// Get projection matrix.
float[] projectionMatrix = new float[16];
session.GetProjectionMatrix(projectionMatrix, 0, 0.1f, 100.0f);

// Get camera matrix and draw.
float[] viewMatrix = new float[16];
frame.GetViewMatrix(viewMatrix, 0);

// Draw the detected planes
planeRenderer.DrawPlanes(session.AllPlanes, frame.Pose, projectionMatrix);

// Get lighting from avg intensity of the image
var lightIntensity = frame.LightEstimate.PixelIntensity;

// Draw all of our anchors attached to planes
float scaleFactor = 1.0f;
float[] anchorMatrix = new float[16];

foreach (var planeAttachment in planeAttachments) {
    // Only draw attachments currently tracking
    if (!planeAttachment.IsTracking)
        continue;

    // Get the current combined pose of an Anchor and Plane in world space
    planeAttachment.GetPose().ToMatrix(anchorMatrix, 0);

    // Update and draw the model
    objectRenderer.UpdateModelMatrix(anchorMatrix, scaleFactor);
    objectRenderer.Draw(viewMatrix, projectionMatrix, lightIntensity);
}

After dissecting the sample, you can see that the actual ARCore code is relatively straightforward; most of the sample code deals with OpenGL rendering.

Again, be sure to check out the HelloAR sample in its entirety on GitHub! We look forward to seeing what Augmented Reality experiences you create with ARCore in your Xamarin.Android apps.

Discuss this post on the forums!
