ARCore and Sceneform in Android — Performing Gestures and Collisions on Transformable Nodes | by Anupam Chugh

Build AR apps easily on Android with Sceneform


At the turn of this new decade, if there’s one field that has the potential to completely change our way of interacting with smartphones, it’s augmented reality (AR Glasses: are you reading this?). Google has been pushing ahead with its own platform—ARCore—that enables developers to create AR experiences on Android, iOS, Unity, and more.

ARCore does a lot of things. From motion tracking to environmental understanding and light estimation, it has most of the bases covered. Yet, despite being introduced in 2017, it didn’t garner much attention, or much progress for that matter, until recently, when new features such as augmented faces, a shared camera, and AR elements were added and began to show ARCore’s true potential.

One reason ARCore’s adoption took time was its initial dependency on OpenGL for 3D graphics. Importing and viewing 3D content wasn’t possible in Java (or Kotlin) when ARCore was released, forcing developers to use either OpenGL (whose steep learning curve can give anyone nightmares) or Unity, which is a pain when it comes to integration with Android components. Google had to do something about this.

They introduced Sceneform in 2018, and the rest is history.

Sceneform is a 3D framework that allows us to render 3D models by using a high-level graphics API. It comes with a plugin that lets you import, preview, and build 3D assets directly from Android Studio. Sceneform is tightly integrated with ARCore and makes it easy for Java and Kotlin developers to build high-quality Android AR Apps.

The Sceneform SDK handles the following things on your behalf:

  • Device compatibility check for ARCore
  • Camera permissions
  • Creating ARCore sessions

Now that we’ve got a good idea of what Sceneform does for us, let’s define the goals of this article.

  • Addressing some key terminologies and building blocks of ARCore and Sceneform.
  • Setting up plane detection and hit testing to add multiple nodes in an Android AR Application using Kotlin.
  • Handling gestures and collisions on the transformable nodes.

Sceneform maintains a node-based scene graph and does quite a few things for us: it handles plane detection, lets us place nodes on the scene (zero or more), performs hit testing, and more.

Hit testing is a way of transforming 2D coordinates from the screen where the user taps into their 3D projection in the AR scene. Imagine a ray light originating from the tapped point on the screen and going through the camera view of your phone. The first point of intersection of this imaginary ray of light with the plane surface of the AR scene gives us the world position. We can then set up our nodes on this world position.
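To build intuition for that ray idea, here is a simplified, plain-Kotlin sketch of a ray intersecting a horizontal plane. This is for illustration only; in a real app, ARCore's hit testing does this against its detected planes, and the `Vec3` type and `rayPlaneHit` function below are made up for the example:

```kotlin
// Simplified illustration of a hit test: intersect a ray (from the camera
// through the tapped screen point) with a horizontal plane at height planeY.
data class Vec3(val x: Float, val y: Float, val z: Float)

fun rayPlaneHit(origin: Vec3, direction: Vec3, planeY: Float): Vec3? {
    // The ray is origin + t * direction; solve for t where y == planeY.
    if (direction.y == 0f) return null            // ray parallel to the plane
    val t = (planeY - origin.y) / direction.y
    if (t < 0f) return null                       // intersection behind the camera
    return Vec3(origin.x + t * direction.x,
                origin.y + t * direction.y,
                origin.z + t * direction.z)
}

fun main() {
    // Camera 1.5 m above the floor, looking down and forward.
    val hit = rayPlaneHit(Vec3(0f, 1.5f, 0f), Vec3(0f, -1f, 1f), planeY = 0f)
    println(hit)   // the "world position" where a node could be anchored
}
```

The point this returns plays the role of the world position described above, which ARCore hands back to us as a HitResult.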

In the previous sections, we mentioned nodes, transformable nodes, and more. Let’s define them here:

  • Scene —where our 3D objects are rendered. It has a tree-like data structure.
  • Node — an object that contains all the information required to be rendered on the screen. It can contain a renderable property on which we can set our 3D assets and material shapes. Also, it consists of a collisionShape that helps detect collisions. Nodes can have zero or more child nodes and one parent.
  • Renderable — 3D models that can be created from asset files (OBJ, FBX, glTF), or by using MaterialFactory and ShapeFactory to create basic shapes such as cubes, spheres, and cylinders with textures.
  • Anchor Nodes — These types of nodes are assigned a particular position in the AR world space. Typically, this is the first node that’s placed once a plane is detected.
  • Transformable Nodes — As the name suggests, they can be transformed. These have the ability to scale, translate, and rotate in the AR scene by reacting to user gestures.
  • Pose — Provides the position and orientation of the node in the scene. We can also determine the pose of the camera and find the distance between the camera and an anchor node in the scene.
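For instance, the camera-to-anchor distance mentioned in the Pose bullet can be derived from two poses. A minimal sketch, assuming `frame` and `anchor` come from the current ARCore session (Pose exposes its translation via tx(), ty(), and tz()):

```kotlin
val cameraPose = frame.camera.pose
val anchorPose = anchor.pose

val dx = cameraPose.tx() - anchorPose.tx()
val dy = cameraPose.ty() - anchorPose.ty()
val dz = cameraPose.tz() - anchorPose.tz()

// Straight-line distance between the camera and the anchor, in meters
val distance = Math.sqrt((dx * dx + dy * dy + dz * dz).toDouble()).toFloat()
```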

Once you add the ArFragment to your Activity’s layout, it does the groundwork for you: after verifying that the device passes the ARCore compatibility check, it sets up the ARCore session and the ArSceneView.

setOnTapArPlaneListener is set on the ArFragment to listen for tap events on a detected plane. Using the HitResult returned to the listener, we can add anchors to the plane, onto which we’ll set our nodes.

MaterialFactory and ShapeFactory

The following code shows how to create a renderable with a shape and put it on the node:

MaterialFactory.makeOpaqueWithColor(this, Color(android.graphics.Color.RED))
    .thenAccept { material ->
        val vector3 = Vector3(0.05f, 0.05f, 0.05f)
        // makeCube(size, center, material): center the cube at the node's origin
        cubeRenderable = ShapeFactory.makeCube(vector3, Vector3.zero(), material)

        cubeRenderable!!.isShadowCaster = false
        cubeRenderable!!.isShadowReceiver = false
    }

MaterialFactory lets us define the material type — metallic, color, opacity, and more — and pass it to a ShapeFactory call. The ShapeFactory class allows us to define the size of the shape. The functions for setting up a cylinder (with radius and height) and a sphere (with radius) are:

ShapeFactory.makeCylinder(0.1f, 0.3f, Vector3.zero(), material)
ShapeFactory.makeSphere(0.1f, Vector3.zero(), material)

Here’s an illustration of what it looks like when you place a node with a renderable set on the ArSceneView.

The dots you’re seeing are feature points on the plane where an anchor can be placed.

The circle around the node is displayed when the user selects it or when the select() method is called programmatically.

The following code snippet showcases how to add the renderable we created above to a transformable node on a plane:

arFragment!!.setOnTapArPlaneListener { hitResult, plane, motionEvent ->
    val anchor = hitResult.createAnchor()
    val anchorNode = AnchorNode(anchor)
    anchorNode.setParent(arFragment!!.arSceneView.scene)

    val node = TransformableNode(arFragment!!.transformationSystem)
    node.renderable = cubeRenderable
    node.setParent(anchorNode)
    node.select()
}



As a result, upon tapping on a plane, we can add a transformable node that can be moved about in the boundaries of that plane.


Now that we’ve had a good look at the different components of Sceneform, let’s build an ARCore-based Android application that detects collisions and computes the distance between two transformable nodes.

To start, create a new Android Studio project in Kotlin. Please ensure that the minimum Android API Level is 27 for ARCore and Sceneform to work, and add the following dependency in your app’s build.gradle file:

implementation ''

Next up, let’s add an ArFragment to the activity_main.xml layout file, as shown below:

We’ve also added a TextView that’ll display the distance between the nodes.
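A minimal sketch of such a layout could look like the following (the IDs sceneform_fragment and tvDistance are assumptions for this example):

```xml
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <fragment
        android:id="@+id/sceneform_fragment"
        android:name="com.google.ar.sceneform.ux.ArFragment"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <TextView
        android:id="@+id/tvDistance"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="bottom|center_horizontal"
        android:padding="16dp"
        android:textColor="@android:color/white" />
</FrameLayout>
```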

Now head back to the MainActivity.kt class, where we’ll check if the OpenGL version is 3.0 or higher before hooking up the layout:

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    if (!checkIsSupportedDeviceOrFinish(this)) {
        Toast.makeText(applicationContext, "Device not supported", Toast.LENGTH_LONG).show()
        return
    }
    setContentView(R.layout.activity_main)
    // ... rest of the setup
}

// MIN_OPENGL_VERSION is a constant set to 3.0
fun checkIsSupportedDeviceOrFinish(activity: Activity): Boolean {
    val openGlVersionString =
        (Objects.requireNonNull(activity.getSystemService(Context.ACTIVITY_SERVICE)) as ActivityManager)
            .deviceConfigurationInfo
            .glEsVersion
    if (java.lang.Double.parseDouble(openGlVersionString) < MIN_OPENGL_VERSION) {
        Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG).show()
        activity.finish()
        return false
    }
    return true
}

Now, let’s modify the setOnTapArPlaneListener callback to accommodate two transformable nodes. In the following code from the MainActivity.kt class, we’ve initialized two 3D models — one of them carries an alternate material that we’ll swap in during a collision.

Take note of the Scene.OnUpdateListener interface, whose onUpdate(frameTime: FrameTime) callback we’ve set in the Activity class. It is invoked once per frame, immediately before the scene is updated, so we can handle per-frame events there. The FrameTime parameter provides timing information for the current frame, which we can leverage if we want to customize the scene periodically.
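Hooking up such a listener might look like the following sketch (arFragment is assumed from the earlier setup; the two helper names are placeholders for this example, not Sceneform API):

```kotlin
arFragment!!.arSceneView.scene.addOnUpdateListener { frameTime ->
    // Invoked once per frame, just before the scene is updated.
    // frameTime.deltaSeconds holds the time elapsed since the last frame.
    checkForCollision()    // hypothetical helper: overlap detection
    updateDistanceLabel()  // hypothetical helper: refreshes the TextView
}
```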

In the next section, we’ll detect collision events on the two nodes.

Sceneform’s way of dealing with collisions is a little different (it terms them overlaps). Currently, it doesn’t provide an event handler that tells you when a collision has begun or ended.

By invoking the overlapTest() function and passing the node, you can determine if it’s overlapping with any of the nodes in the scene. Alternatively, you can use the overlapTestAll() function to get a list of overlapping nodes.

The following code handles collision and changes the renderable material of the node accordingly:
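A minimal sketch of such a handler, assuming nodeA, nodeB, and two pre-built materials from the earlier setup (originalMaterial and collisionMaterial are placeholder names):

```kotlin
private fun checkForCollision() {
    val a = nodeA ?: return
    val b = nodeB ?: return

    // overlapTestAll() returns every node whose collision shape
    // overlaps the given node's collision shape.
    val overlapping = arFragment!!.arSceneView.scene.overlapTestAll(a)

    // Swap in the alternate material while the two nodes overlap.
    a.renderable?.material =
        if (overlapping.contains(b)) collisionMaterial else originalMaterial
}
```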

For the sake of simplicity, we’re detecting the overlap only once two transformable nodes are laid on the plane.

To compute the distance between the two transformable nodes continuously, add the following code inside the onUpdate callback function:

val positionA = nodeA!!.worldPosition
val positionB = nodeB!!.worldPosition

val dx = positionA.x - positionB.x
val dy = positionA.y - positionB.y
val dz = positionA.z - positionB.z

val distanceMeters = Math.sqrt((dx * dx + dy * dy + dz * dz).toDouble()).toFloat()

val distanceFormatted = String.format("%.2f", distanceMeters)

tvDistance!!.text = "Distance between nodes: $distanceFormatted metres"

Finally, with everything set, running the above application on an ARCore-compatible device gave us the following:

ARCore with Sceneform has been making good progress recently. With the latest updates introducing interesting features such as augmented faces, UI elements, runtime loading of assets in AR, a shared camera, and more, Google is slowly and steadily catching up with its rivals. ARCore’s cross-platform support also gives it a big advantage over the competition.

The release of the new Depth API should only help developers build more immersive AR scenes and experiences.

The full source code for this piece is available in the GitHub Repository. That’s it for this one. Thanks for reading.
