Rendering

Common questions about KudanAR’s custom rendering engine


Do I need to use Kudan’s rendering engine?

No. With KudanCV, you can interface our computer vision with any third-party rendering engine, including custom engines you have written yourself. The AR SDKs, however, do come with their own rendering capabilities, which cannot be decoupled.
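
For illustration, here is a minimal sketch of how a pose reported by KudanCV could drive an external renderer. All of the types below (PoseSource, ExternalRenderer, CvToRendererBridge) are hypothetical placeholders for this sketch, not the actual KudanCV or renderer API.

```java
// Sketch only: how KudanCV output could feed a third-party renderer.
// KudanCV itself draws nothing; it just reports poses. All types here are
// hypothetical placeholders standing in for the real KudanCV classes and
// for whatever rendering engine you use.
interface PoseSource {                 // stand-in for a KudanCV tracker wrapper
    float[] currentPosition();         // e.g. {x, y, z}
    float[] currentOrientation();      // e.g. quaternion {x, y, z, w}
    boolean isTracking();
}

interface ExternalRenderer {           // stand-in for your own engine
    void setModelTransform(float[] position, float[] orientationQuat);
    void render();
}

final class CvToRendererBridge {
    private final PoseSource tracker;
    private final ExternalRenderer renderer;

    CvToRendererBridge(PoseSource tracker, ExternalRenderer renderer) {
        this.tracker = tracker;
        this.renderer = renderer;
    }

    // Call once per camera frame, after the tracker has processed it.
    void onFrameProcessed() {
        if (tracker.isTracking()) {
            renderer.setModelTransform(tracker.currentPosition(),
                                       tracker.currentOrientation());
        }
        renderer.render();
    }
}
```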

What 3D model formats does the renderer support?

The renderer requires a custom lightweight format, known as .armodel, for fast loading and increased performance on mobile devices. Kudan’s ARToolkit can convert .FBX, .OBJ, and .DAE model files to the .armodel format. For more information, see our 3D Models page.
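
As a rough sketch of what loading a converted model might look like in the Android SDK (the ARModelImporter class and its loadFromAsset/getNode methods are stated from memory, so check them against the SDK reference):

```java
// Illustrative sketch of loading a converted .armodel in the Android SDK.
// The names ARModelImporter, loadFromAsset and getNode are assumptions about
// the KudanAR Java API; verify them against the SDK reference.
import eu.kudan.kudan.ARModelImporter;
import eu.kudan.kudan.ARModelNode;

public class ModelLoadingExample {

    public static ARModelNode loadBundledModel() {
        ARModelImporter importer = new ARModelImporter();

        // Load the .armodel produced by the Kudan toolchain from the app's assets.
        importer.loadFromAsset("model.armodel");

        // The importer returns a node that can be added to a trackable's world.
        return importer.getNode();
    }
}
```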

What 3D model features does the renderer support?

The renderer supports:

  • Meshes
  • Complex Scene Graphs (Nodes)
  • Blend Shapes / Morph Targets
  • Bones
  • Animations

For more information, see our 3D Models page.

What properties are keyable in animations?

Node transformations, blend shape influences and node visibility are all keyable.

How many blend shapes can be active at once?

Currently, the renderer morphs between two shapes at a time, whether or not they come from separate channels. In-between shapes on a blend shape deformer work fine, because only two shapes ever contribute at once.

Is there a maximum polygon count?

We do not place a hard limit on the number of polygons a model can have; our renderer can handle models with over 1 million triangles without much performance impact. That said, it is always better to keep the polygon count as low as possible when working with mobile applications.

What material types do you support?

Per-pixel lighting with Fresnel reflections, occlusion maps and normal maps. We also provide custom AR-specific materials for use with object occlusion or for working with the camera texture.
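
As an example of one of the AR-specific materials, the sketch below applies an occlusion material to every mesh of a loaded model. The class and method names (AROcclusionMaterial, getMeshNodes, setMaterial) are assumptions about the KudanAR Java SDK, so verify them against the reference.

```java
// Sketch: applying the AR-specific occlusion material to a loaded model.
// AROcclusionMaterial, getMeshNodes and setMaterial are assumed names for the
// KudanAR Java API; check the SDK reference before relying on them.
import eu.kudan.kudan.ARMeshNode;
import eu.kudan.kudan.ARModelNode;
import eu.kudan.kudan.AROcclusionMaterial;

public class OcclusionMaterialExample {

    public static void applyOcclusion(ARModelNode modelNode) {
        // An occlusion material draws nothing visible but still writes depth,
        // so virtual content behind the model is hidden by it.
        AROcclusionMaterial occlusion = new AROcclusionMaterial();

        for (ARMeshNode meshNode : modelNode.getMeshNodes()) {
            meshNode.setMaterial(occlusion);
        }
    }
}
```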

Can I create my own shaders?

Not currently. We are looking at adding programmable shader support in a future release.

Can I use camera texture deformation without a marker?

Yes. The camera texture extractor works regardless of what is controlling the position, so it also works with ArbiTrack; just position the target node as usual.

Can I load content from the web?

All content loaders, such as the model importer and the texture and video loaders, can load from a full path. Your downloader should save the content to the app’s cache directory and pass the resulting full path to the appropriate loader, as in the sketch below.
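
A minimal sketch of that flow on Android is shown below, assuming the importer accepts a full path via a loadFromPath method (an assumption about the Java API; substitute whichever loader you are actually using).

```java
// Sketch: downloading an .armodel to the app's cache directory and loading it
// from the resulting full path. The loadFromPath call is an assumption about
// the KudanAR Java API.
import android.content.Context;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

import eu.kudan.kudan.ARModelImporter;
import eu.kudan.kudan.ARModelNode;

public class WebContentExample {

    /** Downloads the file at the given URL into the app's cache directory. */
    public static File downloadToCache(Context context, String urlString, String fileName)
            throws IOException {
        File destination = new File(context.getCacheDir(), fileName);
        HttpURLConnection connection =
                (HttpURLConnection) new URL(urlString).openConnection();
        try (InputStream in = connection.getInputStream();
             OutputStream out = new FileOutputStream(destination)) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        } finally {
            connection.disconnect();
        }
        return destination;
    }

    /** Loads a downloaded .armodel from its full path. */
    public static ARModelNode loadDownloadedModel(File modelFile) {
        ARModelImporter importer = new ARModelImporter();
        importer.loadFromPath(modelFile.getAbsolutePath()); // assumed API
        return importer.getNode();
    }
}
```

Run the download off the main thread, then hand the returned path to the loader once the file is on disk.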

Can I control the scenegraph from the web?

Yes, this is possible. You would need to write your own importer for a format such as JSON or XML that describes the scene, then build the nodes with the appropriate node creation API.
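
A minimal sketch of such an importer is shown below. The JSON layout is invented purely for illustration, and the node calls (the ARImageNode constructor, setPosition, scaleByUniform, addChild) are assumptions about the KudanAR Java API.

```java
// Sketch: a minimal JSON scene importer. The scene format is invented for
// illustration, and the node creation calls are assumed names for the
// KudanAR Java API; verify them against the SDK reference.
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;

import eu.kudan.kudan.ARImageNode;
import eu.kudan.kudan.ARNode;

public class JsonSceneImporter {

    /**
     * Builds child nodes described by JSON such as:
     * {"nodes":[{"image":"label.png","position":[0,0,10],"scale":0.5}]}
     * and attaches them under the given parent (e.g. a trackable's world).
     */
    public static void importScene(String json, ARNode parent) throws JSONException {
        JSONArray nodes = new JSONObject(json).getJSONArray("nodes");

        for (int i = 0; i < nodes.length(); i++) {
            JSONObject description = nodes.getJSONObject(i);

            // Create a node for the referenced image asset.
            ARImageNode node = new ARImageNode(description.getString("image"));

            JSONArray position = description.getJSONArray("position");
            node.setPosition((float) position.getDouble(0),
                             (float) position.getDouble(1),
                             (float) position.getDouble(2));
            node.scaleByUniform((float) description.getDouble("scale"));

            parent.addChild(node);
        }
    }
}
```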

Can I use the 3D format for 2D animations?

Yes, 2D animations are also supported by the renderer.