KudanAR - iOS  1.6.0
ARCameraStream Class Reference

#import <ARCameraStream.h>

Inherits NSObject, and <AVCaptureVideoDataOutputSampleBufferDelegate>.

Instance Methods

(ARColour *) - averageColourFromData:withChannels:downsamplingWidth:height:
 
(void) - initialise
 
(void) - deinitialise
 
(void) - start
 
(void) - stop
 
(void) - addDelegate:
 
(void) - removeDelegate:
 
(void) - removeDelegates
 

Class Methods

(ARCameraStream *) + getInstance
 

Properties

float width
 
float height
 
float padding
 
ARTexture * cameraTexture
 
ARTexture * cameraTextureY
 
ARTexture * cameraTextureUV
 
NSArray * delegates
 

Detailed Description

The ARCameraStream is a singleton manager class for handling the camera stream. It can be used to gain access to the background camera texture or to receive camera event notifications.

Method Documentation

◆ addDelegate:

- (void) addDelegate: (id<ARCameraStreamEvent>) delegate

Add a delegate for camera update event notifications.

Parameters
delegate	The delegate to add.
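
A minimal sketch, assuming self is an object that adopts the ARCameraStreamEvent protocol:

[[ARCameraStream getInstance] addDelegate:self];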

◆ averageColourFromData:withChannels:downsamplingWidth:height:

- (ARColour *) averageColourFromData: (NSData *)  data
withChannels: (int)  channels
downsamplingWidth: (int)  width
height: (int)  height 

Returns the average colour of an image expressed as a byte array. The image will be downsized before computation of the average. Smaller images will compute faster at the expense of accuracy. It is recommended to reduce image size as much as possible.

Example of use:

ARColour *averageColour = [cameraStream averageColourFromData:cameraStream.cameraTexture._rawImage withChannels:3 downsamplingWidth:64 height:64];
Parameters
data	NSData object containing the image data.
channels	The number of channels contained within the image data. For example, RGB data would contain 3 channels.
width	The width, in pixels, to which the image will be scaled before computing the average.
height	The height, in pixels, to which the image will be scaled before computing the average.
Returns
Average colour of the image as an ARColour object.

◆ deinitialise

- (void) deinitialise

Deinitialise the camera. This is handled automatically when your view controller is a subclass of the ARCameraViewController.

Example of use:
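
A minimal sketch, assuming the camera lifecycle is being managed manually (i.e. outside an ARCameraViewController subclass):

[[ARCameraStream getInstance] deinitialise];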

◆ getInstance

+ (ARCameraStream *) getInstance

Gets the instance of the camera stream singleton.

Example of use:
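
ARCameraStream *cameraStream = [ARCameraStream getInstance];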

Returns
The singleton instance.

◆ initialise

- (void) initialise

Initialise the camera. This is handled automatically when your view controller is a subclass of the ARCameraViewController. When the ARCameraStream instance is initialised, it performs the following:
- Creates a reference to the renderer's texture cache.
- Creates a reference to a valid camera device.
- Sets up a new capture session with a valid camera input device.
- Sets the camera to "continuous autofocus" mode, if supported.

Example of use:
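
A minimal sketch, assuming manual setup outside an ARCameraViewController subclass:

[[ARCameraStream getInstance] initialise];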

◆ removeDelegate:

- (void) removeDelegate: (id<ARCameraStreamEvent>) delegate

Remove a delegate from camera update event notifications.

Parameters
delegate	The delegate to remove.
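
A minimal sketch, assuming self was previously added as a delegate:

[[ARCameraStream getInstance] removeDelegate:self];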

◆ removeDelegates

- (void) removeDelegates

Removes all ARCameraStreamEvent delegates currently associated with this singleton.
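
Example of use:

[[ARCameraStream getInstance] removeDelegates];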

◆ start

- (void) start

Start the camera stream. This is handled automatically when your view controller is a subclass of the ARCameraViewController.

Example of use:
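
A minimal sketch, starting the stream manually on the shared instance:

[[ARCameraStream getInstance] start];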

◆ stop

- (void) stop

Stop the camera stream. This is handled automatically when your view controller is a subclass of the ARCameraViewController.

Example of use:
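
A minimal sketch, stopping the stream manually, for example when pausing AR content:

[[ARCameraStream getInstance] stop];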

Property Documentation

◆ cameraTexture

- (ARTexture*) cameraTexture
readwrite, nonatomic, assign

The full-colour camera texture, including all 3 channels.

◆ cameraTextureUV

- (ARTexture*) cameraTextureUV
readwrite, nonatomic, assign

Camera chroma texture. This texture includes both the U and V channels, which contain the colour information of the image. This is not a full-colour texture, as the U and V channels are colour differences that depend on the luminance values of the Y channel to be correct.

◆ cameraTextureY

- (ARTexture*) cameraTextureY
readwrite, nonatomic, assign

Camera luma texture. This texture only includes the Y (luma) channel, meaning it will appear as a black-and-white image.

◆ delegates

- (NSArray*) delegates
readwrite, nonatomic, assign

Array containing all ARCameraStreamEvent delegates that have been added to this singleton.

◆ height

- (float) height
readwrite, nonatomic, assign

The height of the camera image, in pixels.

◆ padding

- (float) padding
readwrite, nonatomic, assign

The thickness of padding around the camera image, in pixels. This effectively creates a border around the camera stream.

◆ width

- (float) width
readwrite, nonatomic, assign

The width of the camera image, in pixels.


The documentation for this class was generated from the following files:
ARCameraStream.h
ARCameraStream.mm