Using 3D models in KudanAR
KudanAR supports the following formats:
- FBX (*.fbx)
- OBJ (*.obj)
- COLLADA (*.dae)
Most modelling software can work with these formats, though Maya is recommended for the best results with the framework. If you need to work with a format other than those listed, you can try importing the model into a 3D modelling package and exporting it as one of the supported formats.
FBX is by far the most robust format, with support for animations and scenegraphs. It is also widely used and supported by most modelling applications. When exporting an FBX file, use the binary format if it is available; the ASCII format is also supported, but binary gives the best results.
OBJ is incapable of representing scenegraphs and has no support for animations, but is still a popular format that can be used for basic, static models.
COLLADA is another format widely used by many modelling applications, though because it is designed as an intermediary format, rather than an endpoint for models, it is recommended that it be re-exported as one of the other supported formats.
Meshes should be made with polygons for the best results; non-polygon meshes will be converted by the model importer. Each mesh should have a single material to reduce the number of draw calls and render passes, improving rendering performance. For good performance, try to keep the number of meshes rendered at any one time below 50, as a high mesh count can cause significant slowdown on the GPU. Due to hardware limitations, meshes exceeding 65,535 vertices will be split into separate meshes.
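To illustrate the vertex limit (65,535 is the largest count addressable with 16-bit vertex indices), the number of sub-meshes produced by the split can be estimated with a ceiling division. This is a sketch of the arithmetic only; the importer's actual split strategy may differ.

```java
public class MeshSplit {
    // Hardware vertex limit: 16-bit indices allow at most 65,535 vertices per mesh.
    static final int MAX_VERTICES = 65535;

    // Estimate how many sub-meshes the importer will split a large mesh into.
    static int subMeshCount(int vertexCount) {
        return (vertexCount + MAX_VERTICES - 1) / MAX_VERTICES; // ceiling division
    }

    public static void main(String[] args) {
        System.out.println(subMeshCount(150000)); // a 150,000-vertex mesh splits into 3
    }
}
```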
Complex scenegraphs can be utilised when using the FBX format. It is helpful to have unique names for each node in the scene, though this is not compulsory.
Skinned meshes are supported. There is a hard limit of 64 bones per mesh on iOS and 32 on Android, but you can work around this by splitting the mesh into multiple meshes. The model converter supports meshes where any one vertex can be influenced by any number of bones, but only the 4 highest-weighted bones will be used for rendering. You should check that your model looks correct under these circumstances, although it is uncommon for this to be problematic.
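The renderer's pruning to the four highest-weighted bones can be previewed offline with a helper like the one below. This is our own illustration, not part of the KudanAR API, and the real pipeline may break ties between equal weights differently.

```java
import java.util.Arrays;

public class BoneWeights {
    // Keep only the 4 largest weights for a vertex and renormalise them so
    // they sum to 1, approximating what the renderer does when a vertex is
    // influenced by more than 4 bones.
    static double[] topFourNormalised(double[] weights) {
        double[] sorted = weights.clone();
        Arrays.sort(sorted); // ascending order
        int keep = Math.min(4, sorted.length);
        double[] top = new double[keep];
        double sum = 0;
        for (int i = 0; i < keep; i++) {
            top[i] = sorted[sorted.length - 1 - i]; // take from the largest end
            sum += top[i];
        }
        for (int i = 0; i < keep; i++) top[i] /= sum; // renormalise
        return top;
    }
}
```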
Blend shapes / morph targets
The renderer supports blend shapes (also known as morph targets). This allows you to deform a mesh and is particularly well-suited to our camera texture mapping feature.
Due to rendering in real-time on mobile devices, there are limitations on the blend shapes. You can only mix between two different shapes at any one time. For example, you cannot mix between a happy face blend shape and an angry face blend shape because there are three shapes involved: happy, angry and regular. But a single shape with 10 in-between shapes is perfectly fine because you’re only ever mixing between two shapes.
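Conceptually, mixing between the base mesh and one blend shape is a linear interpolation of vertex positions: a weight of 0 gives the base mesh and 1 gives the target shape. A minimal sketch in plain Java (not KudanAR API code), treating vertex positions as a flat coordinate array:

```java
public class MorphMix {
    // Interpolate each vertex coordinate between the base mesh and one target
    // shape: result = base + weight * (target - base). Only two shapes are ever
    // involved at once, which is the constraint described above.
    static float[] mix(float[] base, float[] target, float weight) {
        float[] out = new float[base.length];
        for (int i = 0; i < base.length; i++) {
            out[i] = base[i] + weight * (target[i] - base[i]);
        }
        return out;
    }
}
```

A chain of in-between shapes works under this constraint because each step of the animation only ever interpolates between two adjacent shapes.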
Animations are imported when using the FBX format. The following properties are keyable:
- Node transformation.
- Node visibility.
- Blend shape influence.
Currently animation takes are not supported, so different animation cycles need to be split into separate files. Support for takes will be added in the future.
Animations don’t need to be baked and only the channels that actually change need to be keyed.
On Android, bone animations are supported, but only when using an ARLightMaterial. Other materials or textures will not correctly display a bone animation.
Kudan’s ARModel Format
The KudanAR framework only works with our .armodel format. This format is designed for small file size and fast loading.
.armodel files can be created using the Kudan AR Toolkit. (N.B. If you are using your model on Android, you will need to change the extension of the exported file from .armodel to .jet, or include .armodel in a noCompress block in your build.gradle.) Textures are bundled with the app separately.
ARModel files can be loaded via the ARModelImporter class:
Objective-C:

ARModelImporter *importer = [[ARModelImporter alloc] initWithBundled:@"test.armodel"];
ARModelNode *modelNode = [importer getNode];

Swift:

let importer = ARModelImporter(bundled: "test.armodel")
let modelNode: ARModelNode? = importer.getNode()

Java:

ARModelImporter importer = new ARModelImporter();
importer.loadFromAsset("test.jet");
ARModelNode modelNode = importer.getNode();
The resulting ARModelNode can then be added to other nodes in order to get rendered.
Textures are limited to a maximum size of 2048x2048px and 8-bit colour depth. The dimensions must be a power of two.
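Texture dimensions can be validated before bundling. The helper below is a hypothetical illustration (not part of the SDK) using the standard bit trick: a positive integer is a power of two iff it has exactly one bit set.

```java
public class TextureCheck {
    static final int MAX_SIZE = 2048; // maximum texture dimension per the text

    // True iff n is a positive power of two (exactly one bit set).
    static boolean isPowerOfTwo(int n) {
        return n > 0 && (n & (n - 1)) == 0;
    }

    // A valid texture has power-of-two dimensions within the size limit.
    static boolean isValidTexture(int width, int height) {
        return isPowerOfTwo(width) && isPowerOfTwo(height)
                && width <= MAX_SIZE && height <= MAX_SIZE;
    }
}
```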
On Android, the file extensions of model assets must be changed to ".jet" to avoid file compression.
Materials are used in conjunction with the renderer and define how your object is displayed.
These materials implement Phong shading and are made up of a collection of ARMaterialProperties. For most properties, you can use either an ARVector value, which stores a uniform value across the material and is multiplied by a float factor, or an ARTexture, which can apply a coloured texture to the model or a map used for lighting.
Light Materials contain the following properties:
- colour – The material’s main colour texture, representing the base colour of the surface. Using a vector for this parameter to set a single colour is possible, but usually you would assign a texture map representing the colours of the object’s surface. If using a texture map, make sure it is unlit, since lighting will be added via a combination of ARLights and the other material parameters.
- blendTexture – A second, alternative texture that can be loaded to transform the material. Use the blendParameter to determine the ratio of colour texture to blend texture when drawing.
- ambient – The ambient light that universally illuminates the material. It can be thought of as a “minimum light level”. Can be set using a vector for uniform lighting or a map passed as a texture.
- diffuse – Diffuse lighting computes a simple lighting model: lighting on the surface decreases as the angle between the surface normal and the light direction increases. Lighting depends only on this angle and does not change as the camera moves or rotates. Diffuse lighting can be set using a vector for uniform diffuse lighting or a diffuse map passed as a texture.
- reflection – This property controls the reflectivity of a material and what the object reflects. It contains two properties of its own, an environment map and a reflectivity factor, which are used for image-based reflections and refractions. The environment map is a texture cube that simulates the environment surrounding the material, and it must be assigned for reflection to occur.
- normal – Normal maps are a type of Bump Map, a special kind of texture that allows you to add surface detail such as bumps, grooves, and scratches to a model which catch the light as if they are represented by real geometry. The normal map used for a material must be set using a texture.
- shininess – The shininess parameter affects the amount of specular light reflected by the material. A lower value makes the material appear duller, and a higher value makes the material appear shinier.
- alpha – The alpha value controls the transparency level for the material. This only affects the material if transparency is enabled.
- blendParameter – A float used to blend the colour texture with the blend texture: 0 (the default) shows only the colour texture, and 1 shows only the blend texture.
- refractivityRatio – Represents the ratio between the refractive indices of mediums the light enters and exits during refraction.
- maxLights – Maximum number of dynamic lights the material can be lit by in the scene. More lights will cause more intensive calculations and produce lower frame rates.
- perPixelShader – Whether the lighting calculations should occur on a per-vertex or a per-pixel basis. Per-pixel lighting calculations produce better visual results at the expense of performance. Some lighting functions, such as normal maps and accurate dynamic lighting, are unavailable in the per-vertex case.
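As a worked example of the refractivityRatio property, Snell's law relates the angles of incidence and refraction through the ratio of refractive indices. The sketch below is our own illustration (not SDK code) and assumes the ratio is n_entered-from / n_exited-into, e.g. light passing from air (1.0) into glass (1.5).

```java
public class Refraction {
    // Snell's law: sin(theta_out) = ratio * sin(theta_in), where ratio is the
    // quotient of the refractive indices of the two media (assumption: the
    // medium the light comes from over the medium it enters).
    static double refractedAngle(double ratio, double incidenceRadians) {
        return Math.asin(ratio * Math.sin(incidenceRadians));
    }

    public static void main(String[] args) {
        double ratio = 1.0 / 1.5; // light entering glass from air
        System.out.println(Math.toDegrees(refractedAngle(ratio, Math.toRadians(30))));
    }
}
```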
A colour material is a simple unlit material that represents a material of a singular solid colour. The same effect can be achieved by assigning a vector to the colour property of a light material and setting the ambient lighting on the material to 1. However, if such an effect is the desired outcome, using a colour material is recommended for its low overhead.
A texture material is a simple unlit material that is created using an image with a texture map. This particular material also supports transparency, which can be modified programmatically, and is the material used by ARImageNode to display images.
An occlusion material allows you to hide certain parts of the scene. This is useful for causing parts of the camera texture to occlude your CG content, for instance, creating a hole in a marker, or making a model appear to move behind a real-world object.
Working with materials
Materials are created programmatically and must be assigned to meshes accordingly. Meshes in the scene are represented by ARMeshNodes which have a material property designating which material to render it with.
All mesh nodes in a 3D model are grouped together in the ARModelNode’s meshNodes property. The following is a quick example of how to create a texture material and assign it to a mesh:
Objective-C:

ARModelImporter *modelImporter = [[ARModelImporter alloc] initWithBundled:@"example.armodel"];
ARModelNode *modelNode = [modelImporter getNode];
ARMeshNode *meshNode = [modelNode findMeshNode:@"Example Mesh"];
ARTexture *texture = [[ARTexture alloc] initWithUIImage:[UIImage imageNamed:@"exampleTexture.jpg"]];
ARTextureMaterial *textureMaterial = [ARTextureMaterial materialWithTexture:texture];
meshNode.material = textureMaterial;

Swift:

let modelImporter = ARModelImporter(bundled: "example.armodel")
let modelNode: ARModelNode? = modelImporter.getNode()
let meshNode: ARMeshNode? = modelNode?.findMeshNode("Example Mesh")
let texture = ARTexture(uiImage: UIImage(named: "exampleTexture.jpg"))
let textureMaterial = ARTextureMaterial(texture)
meshNode?.material = textureMaterial

Java:

ARModelImporter modelImporter = new ARModelImporter();
modelImporter.loadFromAsset("example.jet");
ARModelNode modelNode = modelImporter.getNode();
ARMeshNode meshNode = modelNode.getMeshNodes().get(0);
ARTexture2D texture = new ARTexture2D();
texture.loadFromAsset("exampleTexture.jpg");
ARTextureMaterial textureMaterial = new ARTextureMaterial(texture);
meshNode.setMaterial(textureMaterial);
Extracted Camera Texture
The Extracted Camera Texture is a type of texture that can be used with materials. It is responsible for taking part of the live camera texture and making it usable as part of the rendering. It is useful for deforming the camera image and making it come alive, while maintaining realism due to perfectly matching the lighting and noise of the real world image. For more information, see our public code samples and API documentation.
Life Size Models
In order to get your 3D model to appear life size whilst using ArbiTrack, you will need to know the real-world dimensions of your object.
The units in our engine are arbitrary, much like you would have within a 3D modelling application.
By default, when using KudanAR, we set the camera height to be 120. We generally think of this as 120cm, but you could interpret it as 12 inches or 12cm; you simply need to be consistent in what you choose the unit to be, and then apply that scale consistently to your 3D models.
So let’s go with 1 unit as 1cm; if you create a 3D box that is 120x120x120 units, it should appear as a cube that reaches the height of the camera. If you have a cube that is 5 units on each axis, but its real size is 130cm, the calculation you would have to do is:
Real Size / Unit Size = Uniform Scale Value
130/5 = 26
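The calculation above can be wrapped in a small helper. This is our own illustration; any consistent unit works, following the text's convention of 1 unit = 1cm.

```java
public class LifeSizeScale {
    // Uniform scale = real-world size / model size in scene units.
    static double uniformScale(double realSizeCm, double unitSize) {
        return realSizeCm / unitSize;
    }

    public static void main(String[] args) {
        // A 130cm object modelled as 5 units needs a uniform scale of 26.
        System.out.println(uniformScale(130, 5)); // prints 26.0
    }
}
```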
While the renderer is capable of handling highly detailed 3D models exceeding 1 million triangles, there are certain things you can do to improve performance, especially on lower power devices:
- Reduce the number of triangles. 100,000 is a good target to aim for.
- Reduce the number of different materials, combining any meshes with identical materials. Instead of having lots of different textures, compact them into an atlas and share it across meshes.
- Bones are more efficient than blend shapes. If your deformation can be achieved with bones, prefer that approach.
- Try using normal maps with simpler geometry. This can significantly reduce the model file size as well.
- Try to share materials as much as possible. If the same image appears in multiple locations throughout the scene, that image should be loaded only once. ARMaterial instances can be shared by multiple meshes.
These features are in development and will be made public in the near future:
- Custom Shader Support
- Real-time shadows
- Mixing of blend shapes
- Post-processing effects