Channel: Yudiz Solutions Ltd.

Object interaction in ARCore for Android


Overview

After a long wait, Google has finally released a stable version of the ARCore SDK for Android, and it hasn't disappointed from a functionality point of view. The new SDK can detect vertical as well as horizontal surfaces, unlike the earlier developer previews, which could only detect horizontal surfaces.

ARCore SDK 1.0


The latest SDK supports a wider range of devices, including Asus and Huawei products. But roses often come with thorns!

Google has imposed a restriction on the memory usage of devices running AR apps, which forces us to use background threads for heavy operations such as changing an object's texture at run time.
This is an obvious step to support devices with less memory.
In earlier versions of the SDK, device memory was rather vulnerable, so on balance I consider this a positive change.
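To illustrate working within that limit, a heavy step such as decoding a new texture can be pushed off the GL thread with a single-threaded executor. This is only a sketch; TextureLoader and decode( ) are illustrative names, not part of the ARCore API:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class TextureLoader {
    // One background worker keeps heavy decoding off the main/GL thread.
    private static final ExecutorService background = Executors.newSingleThreadExecutor();

    // Submit the decode and hand back a Future; on Android the decoded
    // result would then be uploaded to GL via GLSurfaceView.queueEvent().
    public static Future<byte[]> loadAsync(String assetName) {
        return background.submit(() -> decode(assetName));
    }

    private static byte[] decode(String name) {
        // Placeholder for the real BitmapFactory/texture decoding work.
        return name.getBytes();
    }
}
```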

Google has also released emulator support for testing AR apps. Isn't it mind-boggling?


I have kept an eye on ARCore since the day Google released its developer preview. The sample app provided by Google uses OpenGL to read and load objects, but it only shows how to place them.
As a developer, one cannot stay satisfied with such simple functionality!
Since I'm a beginner in OpenGL for Android, I searched a lot to add object interaction to the app.

I'll show you how to rotate, scale and move objects around, plus an eye-catching feature: changing objects at run time.


Initially, you will need the ARCore sample app provided by Google, an emulator that supports ARCore (or an ARCore-supported device), and 3D models with their textures.
I'll not dive into the deep ocean of ARCore and OpenGL by explaining the very basics. The only thing to keep in mind is that ARCore places points in the real world and tracks them; the task of drawing and moving objects is handled entirely by the graphics library, in this case OpenGL.

Loading object into ARCore

Place the models and their textures in the assets folder.

[Image: models and textures in the assets folder]

Here, andy and shoes are our two models.
Now, declare two String variables in MainActivity.java to hold the paths of the object file and the texture file, and initialize them as shown below.

private String objName = "models/shoes.obj";
private String textureName = "models/shoes.jpg";

In the onSurfaceCreated( ) method, use these variables to create the model.

@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {

   GLES20.glClearColor(0.1f, 0.1f, 0.1f, 1.0f);
   try {
       backgroundRenderer.createOnGlThread(getContext());
       planeRenderer.createOnGlThread(getContext(), "models/trigrid.png");
       pointCloudRenderer.createOnGlThread(getContext());
       virtualObject.createOnGlThread(getContext(), objName, textureName);
       virtualObject.setMaterialProperties(0.0f, 2.0f, 0.5f, 6.0f);

   } catch (IOException e) {
       Log.e(TAG, "Failed to read an asset file", e);
   }
}

Note: In the onDrawFrame( ) method, change the if-condition to restrict the user to placing one object at a time.

if (tap != null && camera.getTrackingState() == TrackingState.TRACKING) {
   for (HitResult hit : frame.hitTest(tap)) {
       Trackable trackable = hit.getTrackable();
       if ((trackable instanceof Plane && ((Plane)trackable).isPoseInPolygon(hit.getHitPose()))
               || (trackable instanceof Point
               && ((Point) trackable).getOrientationMode()
               == Point.OrientationMode.ESTIMATED_SURFACE_NORMAL)) {
           if (anchors.size() >= 1) {
               anchors.get(0).detach();
               anchors.remove(0);
           }
           anchors.add(hit.createAnchor());
           break;
       }
   }
}

Rotating object

I have used a helper class to detect the two-finger rotation gesture. Depending on your requirements, any gesture can be used here.

package com.yudiz.arexample.helpers;

import android.view.MotionEvent;

public class RotationGestureDetector {
   private static final int INVALID_POINTER_ID = -1;
   private float fX, fY, sX, sY;
   private int ptrID1, ptrID2;
   private float mAngle;

   private OnRotationGestureListener mListener;

   public float getAngle() {
       return mAngle;
   }

   public RotationGestureDetector(OnRotationGestureListener listener) {
       mListener = listener;
       ptrID1 = INVALID_POINTER_ID;
       ptrID2 = INVALID_POINTER_ID;
   }

   public boolean onTouchEvent(MotionEvent event) {
       switch (event.getActionMasked()) {
           case MotionEvent.ACTION_DOWN:
               ptrID1 = event.getPointerId(event.getActionIndex());
               break;
           case MotionEvent.ACTION_POINTER_DOWN:
               ptrID2 = event.getPointerId(event.getActionIndex());
               sX = event.getX(event.findPointerIndex(ptrID1));
               sY = event.getY(event.findPointerIndex(ptrID1));
               fX = event.getX(event.findPointerIndex(ptrID2));
               fY = event.getY(event.findPointerIndex(ptrID2));
               break;
           case MotionEvent.ACTION_MOVE:
               if (ptrID1 != INVALID_POINTER_ID && ptrID2 != INVALID_POINTER_ID) {
                   float nfX, nfY, nsX, nsY;
                   nsX = event.getX(event.findPointerIndex(ptrID1));
                   nsY = event.getY(event.findPointerIndex(ptrID1));
                   nfX = event.getX(event.findPointerIndex(ptrID2));
                   nfY = event.getY(event.findPointerIndex(ptrID2));

                   mAngle = angleBetweenLines(fX, fY, sX, sY, nfX, nfY, nsX, nsY);

                   if (mListener != null) {
                       mListener.OnRotation(this);
                   }
               }
               break;
           case MotionEvent.ACTION_UP:
               ptrID1 = INVALID_POINTER_ID;
               break;
           case MotionEvent.ACTION_POINTER_UP:
               ptrID2 = INVALID_POINTER_ID;
               break;
           case MotionEvent.ACTION_CANCEL:
               ptrID1 = INVALID_POINTER_ID;
               ptrID2 = INVALID_POINTER_ID;
               break;
       }
       return true;
   }

   private float angleBetweenLines(float fX, float fY, float sX, float sY, float nfX, float nfY, float nsX, float nsY) {
       float angle1 = (float) Math.atan2((fY - sY), (fX - sX));
       float angle2 = (float) Math.atan2((nfY - nsY), (nfX - nsX));

       float angle = ((float) Math.toDegrees(angle1 - angle2)) % 360;
       if (angle < -180.f) angle += 360.0f;
       if (angle > 180.f) angle -= 360.0f;
       return angle;
   }

   public interface OnRotationGestureListener {
       void OnRotation(RotationGestureDetector rotationDetector);
   }
}

We have to implement its OnRotationGestureListener interface in our main class.

@Override
public void OnRotation(RotationGestureDetector rotationDetector) {
   float angle = rotationDetector.getAngle();
   GlobalClass.rotateF = GlobalClass.rotateF + angle / 10;
}

Declare a public static float variable to store the rotation value, in this case GlobalClass.rotateF.
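The article never shows GlobalClass itself; a minimal sketch of such a shared-state holder (field names taken from the snippets, default values assumed) could be:

```java
// Hypothetical holder for gesture state shared between the activity and
// ObjectRenderer. Only rotateF is needed for rotation; scaleFactor is
// used later for scaling.
public class GlobalClass {
    public static float rotateF = 0.0f;      // accumulated rotation angle in degrees
    public static float scaleFactor = 1.0f;  // uniform model scale
}
```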

Now, in the ObjectRenderer.java class, declare and initialize two 4 x 4 matrices (16-element float arrays): one for the rotation and one for the final result.

private float[] mRotationMatrix = new float[16];
private float[] mFinalModelViewProjectionMatrix = new float[16];

In the draw( ) method, edit the code as shown below.

ShaderUtil.checkGLError(TAG, "Before draw");

Matrix.multiplyMM(modelViewMatrix, 0, cameraView, 0, modelMatrix, 0);
Matrix.multiplyMM(modelViewProjectionMatrix, 0, cameraPerspective, 0, modelViewMatrix, 0);

// rotation: apply the accumulated gesture angle about the y-axis
Matrix.setRotateM(mRotationMatrix, 0, GlobalClass.rotateF, 0.0f, 1.0f, 0.0f);
Matrix.multiplyMM(mFinalModelViewProjectionMatrix, 0, modelViewProjectionMatrix, 0, mRotationMatrix, 0);

GLES20.glUseProgram(program);

Matrix.setRotateM(mRotationMatrix, 0, GlobalClass.rotateF, 0.0f, 1.0f, 0.0f);

Here, I have used the rotation factor to rotate the object's matrix about the y-axis. The 3rd argument is the angle in degrees; the 4th, 5th and 6th arguments select the x, y and z axis components respectively.
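To make the matrix step concrete, here is a plain-Java sketch of the y-axis rotation that setRotateM produces, using the same column-major layout as android.opengl.Matrix (the class and method names here are illustrative):

```java
public class YRotation {
    // Build a column-major 4x4 rotation about the y-axis, like
    // Matrix.setRotateM(m, 0, angleDeg, 0f, 1f, 0f) does on Android.
    public static float[] setRotateY(float angleDeg) {
        double a = Math.toRadians(angleDeg);
        float c = (float) Math.cos(a), s = (float) Math.sin(a);
        float[] m = new float[16];
        m[0] = c;  m[2] = -s;   // column 0
        m[5] = 1f;              // column 1: y is unchanged
        m[8] = s;  m[10] = c;   // column 2
        m[15] = 1f;             // homogeneous coordinate
        return m;
    }

    // Multiply the column-major matrix by a vec4: r = M * v
    public static float[] transform(float[] m, float[] v) {
        float[] r = new float[4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                r[i] += m[j * 4 + i] * v[j];
        return r;
    }
}
```

For example, rotating the point (1, 0, 0) by 90 degrees about y swings it around to the negative z-axis.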

Use the final matrix as in the code below.

GLES20.glUniformMatrix4fv(modelViewProjectionUniform, 1, false, mFinalModelViewProjectionMatrix, 0);

Moving object

This is actually a workaround to translate the object along the surface: redrawing the object in the onScroll( ) callback of the gesture listener does the trick.

@Override
public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY) {
   // With fewer than 2 pointers down, treat the scroll position as a tap
   // so the object is re-anchored (and thus "moved") there.
   if (mPtrCount < 2) {
       queuedSingleTaps.offer(e2);
       return true;
   } else {
       return false;
   }
}

Here, I have used a counter (mPtrCount) to track the number of fingers on the screen. If fewer than 2 fingers are touching, the object is redrawn at the new point, so the scrolling gesture produces a translation effect on the object.
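The counter itself is not shown in the article; a sketch of the bookkeeping, driven by the same integer values that MotionEvent.getActionMasked( ) returns, might look like this (PointerCounter is an illustrative name):

```java
// Tracks how many fingers are currently on the screen so onScroll()
// can distinguish a one-finger drag from a two-finger gesture.
public class PointerCounter {
    // These constants mirror MotionEvent's masked action values.
    public static final int ACTION_DOWN = 0, ACTION_UP = 1, ACTION_CANCEL = 3,
            ACTION_POINTER_DOWN = 5, ACTION_POINTER_UP = 6;

    public int count = 0;

    public void onAction(int maskedAction) {
        switch (maskedAction) {
            case ACTION_DOWN:          // first finger down
            case ACTION_POINTER_DOWN:  // additional finger down
                count++;
                break;
            case ACTION_UP:            // last finger up
            case ACTION_POINTER_UP:    // one of several fingers up
                count--;
                break;
            case ACTION_CANCEL:        // gesture aborted, reset
                count = 0;
                break;
        }
    }
}
```

Feeding every event from the activity's onTouchEvent( ) into onAction( ) keeps the count in sync with the fingers on screen.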

Scaling object

To implement this functionality, we have used the onDoubleTap( ) listener.

We need to store the scale factor in a public static variable, in this case GlobalClass.scaleFactor.

This factor is used in the onDrawFrame( ) method of the main activity, which is executed many times per second.

virtualObject.updateModelMatrix(anchorMatrix, GlobalClass.scaleFactor);

updateModelMatrix( ) is a method of the object renderer class that sets the scale of the model.
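Under the hood, the scale is applied by multiplying the anchor's pose matrix with a uniform scale matrix. A plain-Java sketch of that scale matrix, in the column-major layout android.opengl.Matrix uses (ScaleMatrix is an illustrative name):

```java
public class ScaleMatrix {
    // Build a column-major uniform scale matrix, roughly the matrix
    // updateModelMatrix() multiplies into the anchor pose.
    public static float[] uniformScale(float s) {
        float[] m = new float[16];
        m[0] = s;   // scale x
        m[5] = s;   // scale y
        m[10] = s;  // scale z
        m[15] = 1f; // homogeneous coordinate
        return m;
    }
}
```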

@Override
public boolean onDoubleTap(MotionEvent e) {
   GlobalClass.scaleFactor += GlobalClass.scaleFactor;
   return true;
}

I have doubled the factor's value on each double-tap in the method.
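Note that doubling on every double-tap grows the model without bound. One simple alternative (the thresholds here are arbitrary, not from the article) is to wrap the factor back to a minimum once it passes a cap:

```java
public class ScaleHelper {
    static final float MIN_SCALE = 0.5f, MAX_SCALE = 4.0f;

    // Double the factor per double-tap, wrapping back to MIN_SCALE
    // once it would exceed MAX_SCALE.
    static float bumpScale(float current) {
        float next = current * 2f;
        return next > MAX_SCALE ? MIN_SCALE : next;
    }
}
```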

Changing object at run time

Here comes the coolest functionality: a button which, when clicked, changes the model. All we have to do is change the values of the String variables we declared for the model and texture names.

@Override
public void onClick(View view) {
   objName = "models/andy.obj";
   textureName = "models/andy.png";
   isObjectReplaced = true;
}

We have to keep track of whether the model has been changed; isObjectReplaced is used for that.

In the onDrawFrame( ) method, add the code below. This will replace the object.

if (isObjectReplaced) {
   isObjectReplaced = false;
   try {
       virtualObject.createOnGlThread(getContext(), objName, textureName);
       virtualObject.setMaterialProperties(0.0f, 2.0f, 0.5f, 6.0f);
   } catch (IOException e) {
       e.printStackTrace();
   }
   return;
}

Tip: To read the model and textures from the SD card, replace the code in createOnGlThread( ).

File dir = new File(<path_to_file>);
FileInputStream objInputStream = new FileInputStream(dir);
Obj obj = ObjReader.read(objInputStream);

Here at Yudiz, we are now concentrating on advanced ARCore topics such as managing multiple objects simultaneously and selecting models using touch gestures.

Conclusion

ARCore is Google's answer to Apple's ARKit. I personally think it will come to dominate the AR field, as it has great potential.

