
Sceneform SDK: A boon for Android developers


Overview

With AR trending, is it just me, or has using flat, stationary 2D images become too mainstream nowadays? 😀

ARCore brings the Sceneform SDK, which can scan images and load corresponding 3D models that users can interact with using gestures. Google calls this feature Augmented Images. That is not all: a lot of other AR work can be done with the Sceneform SDK, and far more easily than ever before.

Sceneform Overview

Google announced the Sceneform SDK at Google I/O 2018. It handles the 3D graphics, OpenGL and other complex rendering work by itself, allowing an Android developer to build AR apps with far fewer lines of code. It requires a plugin to be installed and Android Studio 3.1 or above.

Sceneform Capabilities

Even in beta, along with Augmented Images, it provides basic AR functionality such as moving, rotating and scaling a 3D model.
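In Sceneform, these gestures come from the ux package's TransformableNode. Here is a minimal sketch, assuming an ArFragment called arFragment and an already loaded modelRenderable (both names are illustrative, not from the demo):

// Place a model on a tapped plane; the user can then drag, rotate and pinch-scale it.
arFragment.setOnTapArPlaneListener((hitResult, plane, motionEvent) -> {
    AnchorNode anchorNode = new AnchorNode(hitResult.createAnchor());
    anchorNode.setParent(arFragment.getArSceneView().getScene());

    TransformableNode modelNode = new TransformableNode(arFragment.getTransformationSystem());
    modelNode.setParent(anchorNode);
    modelNode.setRenderable(modelRenderable);
    modelNode.select();
});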

It also has a feature called Cloud Anchors, where two or more users each place a 3D model using their own devices in the same environment, and both models can be viewed from both devices. The models can even interact with each other. That's cool, right?
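Under the hood this relies on ARCore's Cloud Anchor API. A rough sketch of the host and resolve flow follows; how the anchor id is shared between devices (for example via Firebase) is outside the SDK and only assumed here:

// Cloud Anchors must be enabled on the session configuration first.
config.setCloudAnchorMode(Config.CloudAnchorMode.ENABLED);

// Host (device A): start uploading a local anchor to Google's cloud service.
Anchor cloudAnchor = session.hostCloudAnchor(localAnchor);
// Poll cloudAnchor.getCloudAnchorState() each frame; once it reaches SUCCESS,
// share cloudAnchor.getCloudAnchorId() with the other device.

// Resolve (device B): recreate the same anchor from the shared id.
Anchor resolvedAnchor = session.resolveCloudAnchor(sharedCloudAnchorId);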

Now, here comes my favourite part… (drum roll)
Being a native Android developer, the functionality I find most interesting is that Sceneform can turn a layout/app screen into a renderable and load it as a 3D model in the physical environment. I can't stop thinking about the endless variety of applications that could be built on this concept!

[Image: sceneform-image1]

Without spending another minute dreaming about Sceneform-enabled app possibilities :D, let's dive into a practical scenario and see how it performs.

Basic information to get started

To start, we need the Sceneform plugin. Go to Preferences and search for "sceneform" as shown below, then install it and restart Android Studio.

[Image: sceneform-image2 – the Sceneform plugin in Android Studio's plugin preferences]

3D models can be downloaded from Google's own library at https://poly.google.com. Sceneform supports 3D models with .obj, .gltf and .fbx extensions. The SDK also has its own asset formats: it converts imported models into .sfa and .sfb files.

.sfb (Sceneform Binary asset) is the actual model that gets loaded into the app, and .sfa (Sceneform Asset Definition) is a human-readable description of the .sfb file.

Below is an example of an .sfa file.

[Image: sceneform-image3 – an example .sfa file]

It stores information about the model such as its scale, the textures to be loaded and other material properties. More information regarding .sfa attributes can be found at https://developers.google.com/ar/develop/java/sceneform/sfa

A 3D model can be converted into these formats simply by right-clicking it and selecting Import Sceneform Asset. This opens a dialog where we can specify the output locations.

[Image: sceneform-image4 – the Import Sceneform Asset dialog]

The plugin also provides a viewer to preview the model in Android Studio without running the app. It's something I craved while using ARCore's older SDKs. 😀

[Image: sceneform-image5 – the model viewer inside Android Studio]
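Once a model has been imported, the generated .sfb can be loaded at runtime into a ModelRenderable. A minimal sketch, where the asset name model.sfb and the modelRenderable field are assumptions:

// Asynchronously load the imported .sfb asset.
ModelRenderable.builder()
        .setSource(context, Uri.parse("model.sfb"))
        .build()
        .thenAccept(renderable -> modelRenderable = renderable)
        .exceptionally(throwable -> {
            Log.e(TAG, "Unable to load model.sfb", throwable);
            return null;
        });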

Practical

In our demo, we'll concentrate on converting a layout/screen into a 3D model. We'll build an app that scans an image (a picture of the Yudiz team) and pops up three tappable icons in 3D, each redirecting the user to a respective screen when clicked.

Below is the image that I’ll use.

[Image: sceneform-image6 – the Yudiz team picture used as the marker]

Remember: the image should be distinctive enough for the SDK to identify it.
Store it in the assets folder.

Let’s have a look at the other required resource.

[Image: sceneform-image7 – the layout that will be rendered in 3D]

This is the layout that will pop up when the image is detected by the SDK. You can design any layout based on your requirements.

Now, skipping a detailed explanation of the boilerplate code that checks device support and initializes the ARCore fragment (a rough sketch of that check is included below for reference), let's have a look at the core functionality.
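The device check is a sketch adapted from Google's Sceneform samples rather than the exact code in the demo: it verifies that the device runs Android N or later and supports OpenGL ES 3.0 before the AR fragment is created.

public static boolean checkIsSupportedDeviceOrFinish(final Activity activity) {
    // Sceneform requires Android N (API 24) or later.
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
        Toast.makeText(activity, "Sceneform requires Android N or later", Toast.LENGTH_LONG).show();
        activity.finish();
        return false;
    }
    // Sceneform requires OpenGL ES 3.0 or later.
    String openGlVersion = ((ActivityManager) activity.getSystemService(Context.ACTIVITY_SERVICE))
            .getDeviceConfigurationInfo()
            .getGlEsVersion();
    if (Double.parseDouble(openGlVersion) < 3.0) {
        Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG).show();
        activity.finish();
        return false;
    }
    return true;
}

With that out of the way, here is the core functionality, starting with the augmented image database.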

private boolean setupAugmentedImageDb(Config config) {
    AugmentedImageDatabase augmentedImageDatabase;

    Bitmap augmentedImageBitmap = loadAugmentedImage();
    if (augmentedImageBitmap == null) {
        return false;
    }

    // Register the marker image in the database under a unique name.
    augmentedImageDatabase = new AugmentedImageDatabase(session);
    augmentedImageDatabase.addImage("picTeamYudiz", augmentedImageBitmap);

    config.setAugmentedImageDatabase(augmentedImageDatabase);
    return true;
}

private Bitmap loadAugmentedImage() {
    // Load the marker image from the assets folder.
    try (InputStream is = getAssets().open("picTeamYudiz.png")) {
        return BitmapFactory.decodeStream(is);
    } catch (IOException e) {
        Log.e(TAG, "IO exception loading augmented image bitmap.", e);
    }
    return null;
}

We need to create an AugmentedImageDatabase and register our marker images in it under unique names.
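The database only takes effect once it is attached to the session configuration. A sketch of that wiring, assuming the session has already been created by the AR fragment:

// Attach the augmented image database to the ARCore session configuration.
Config config = new Config(session);
config.setUpdateMode(Config.UpdateMode.LATEST_CAMERA_IMAGE);
if (!setupAugmentedImageDb(config)) {
    Log.e(TAG, "Could not set up the augmented image database");
}
session.configure(config);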

private void onUpdateFrame(FrameTime frameTime) {
    Frame frame = arSceneView.getArFrame();
    if (frame == null) {
        return;
    }

    // Augmented images that ARCore updated in this frame.
    Collection<AugmentedImage> updatedAugmentedImages =
            frame.getUpdatedTrackables(AugmentedImage.class);

    if (node == null)
        node = new AugmentedImageNode(this);

    for (AugmentedImage augmentedImage : updatedAugmentedImages) {
        if (augmentedImage.getTrackingState() == TrackingState.TRACKING)
            if (augmentedImage.getName().equals("picTeamYudiz")) {
                // Our marker is being tracked: attach the layout node to the scene.
                node.setImage(augmentedImage);
                arSceneView.getScene().addChild(node);
            }
    }
}

This method is fired whenever the camera frame is updated.
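For context, the callback is registered on the Sceneform scene, typically right after the fragment's scene view becomes available (arSceneView here stands in for the fragment's ArSceneView):

// Call onUpdateFrame() once per rendered frame.
arSceneView.getScene().addOnUpdateListener(this::onUpdateFrame);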

Collection<AugmentedImage> updatedAugmentedImages =
       frame.getUpdatedTrackables(AugmentedImage.class);

This call fetches the augmented images that ARCore has updated in the current frame, i.e. the images from our database that it is tracking.

for (AugmentedImage augmentedImage : updatedAugmentedImages) {
   if (augmentedImage.getTrackingState() == TrackingState.TRACKING)
       if (augmentedImage.getName().equals("picTeamYudiz")) {
           node.setImage(augmentedImage);
           arSceneView.getScene().addChild(node);
       }
}

Here, the for loop checks whether any of the tracked images matches the one we stored in the database.
When this condition is satisfied, the layout is converted into a renderable and added to the AR scene, as shown in the code below.

public void setImage(AugmentedImage image) {
    this.image = image;

    // Build a ViewRenderable from the layout asynchronously.
    CompletableFuture<ViewRenderable> viewCompFuture =
            ViewRenderable.builder().setView(context, R.layout.layout_renderable).build();

    viewCompFuture
            .thenAccept(renderable -> {
                renderableView = renderable;

                // Anchor this node at the centre of the detected image.
                setAnchor(image.createAnchor(image.getCenterPose()));

                // Attach a child node that displays the layout renderable.
                Node layoutNode = new Node();
                layoutNode.setParent(this);
                layoutNode.setLocalPosition(new Vector3(0.0f, 0.0f, 0.0f));
                layoutNode.setRenderable(renderableView);

                // Wire up click listeners on the inflated layout.
                listeners(renderableView.getView());
            })
            .exceptionally(throwable -> {
                Log.e(TAG, "Unable to load the layout renderable.", throwable);
                return null;
            });
}

private void listeners(View renderableLayout) {
   renderableLayout.findViewById(R.id.ivContactUs).setOnClickListener(this);
   renderableLayout.findViewById(R.id.ivYudiz).setOnClickListener(this);
   renderableLayout.findViewById(R.id.ivLinkedIn).setOnClickListener(this);
}

Here, a CompletableFuture is created from the layout; once it completes, it provides a ViewRenderable.
Inside the completion callback I anchor the node at the image's centre, attach a child node that displays the renderable, and obtain the inflated view to find the element IDs and set click listeners on them.
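The click handling itself is plain Android. A sketch of what the listener might do; the target screens are assumptions, not taken from the demo:

@Override
public void onClick(View v) {
    int id = v.getId();
    if (id == R.id.ivContactUs) {
        // e.g. open the Contact Us screen
    } else if (id == R.id.ivYudiz) {
        // e.g. open the Yudiz website
    } else if (id == R.id.ivLinkedIn) {
        // e.g. open the LinkedIn page
    }
}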

That's it. We have successfully added interactions to the image. Yay! 😀

Check out the Git repository for the full source:
https://gitlab.com/YudizSumeet/augmented-images.git

Video Description

Google's Sceneform SDK brings a marker-detection feature called Augmented Images. Scanning a unique image to load 3D models into the virtual environment can be used in many ways for business and personal purposes. Here is one such demo.

Application ideas

An ID-card application could be developed using this feature: a card carrying machine-readable content such as a QR code could be scanned, and the actual information fetched and shown in 3D using ARCore.

Conclusion

The Sceneform SDK is nothing less than a boon for Android developers who are eager to learn AR. Given how powerful it already is in beta, I'm eager to see what its future releases will bring.

