# Streaming Depth Data from the TrueDepth Camera

Visualize depth data in 2D and 3D from the TrueDepth camera.

## Overview

The TrueDepth camera provides real-time depth data that lets you determine the distance of each pixel from the front-facing camera. This sample demonstrates how to use the AVFoundation framework’s capture API to read data from the TrueDepth camera, and how to display it onscreen in an intuitive fashion.

The sample shows two different views: a 2D view that distinguishes depth values by mapping depth to color, and a 3D view that renders data as a point cloud.

To see this sample app in action, build and run the project in Xcode on an iOS device running iOS 11 or later. Because the Simulator doesn’t have access to the TrueDepth camera, this sample won’t run in the Simulator.

## Set Up a Capture Session

Set up an `AVCaptureSession` on a separate thread via the session queue. Initialize this session queue before configuring the camera for capture, like so: 

``` swift
private let sessionQueue = DispatchQueue(label: "session queue", attributes: [], autoreleaseFrequency: .workItem)
```

The `startRunning` method is a blocking call that may take time to execute. Dispatch session setup to the session queue so the main queue isn’t blocked, allowing the app’s UI to stay responsive:

``` swift
sessionQueue.async {
    self.configureSession()
}
```

Setting up the TrueDepth camera for capture follows many of the same steps as normal video capture. See [Setting Up a Capture Session](https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/setting_up_a_capture_session) for details on configuring a streaming capture session.
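
As a minimal sketch, locating the front-facing TrueDepth camera and adding it as an input might look like the following. This is illustrative only; the sample’s `configureSession` method performs these steps with additional error handling:

``` swift
// Sketch: find the front-facing TrueDepth camera and add it to the session.
guard let videoDevice = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front) else {
    print("Could not find the TrueDepth camera")
    return
}

session.beginConfiguration()
do {
    let videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
    if session.canAddInput(videoDeviceInput) {
        session.addInput(videoDeviceInput)
    }
} catch {
    print("Could not create video device input: \(error)")
}
session.commitConfiguration()
```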

On top of normal setup, request depth data by declaring a separate output:

``` swift
private let depthDataOutput = AVCaptureDepthDataOutput()
```

Explicitly add this output type to your capture session:

``` swift
session.addOutput(depthDataOutput)
depthDataOutput.isFilteringEnabled = false
if let connection = depthDataOutput.connection(with: .depthData) {
    connection.isEnabled = true
} else {
    print("No AVCaptureConnection")
}
```

Search for the highest-resolution depth format that delivers floating-point depth values, then lock the device configuration and select that format.

``` swift
let depthFormats = videoDevice.activeFormat.supportedDepthDataFormats
let filtered = depthFormats.filter({
    CMFormatDescriptionGetMediaSubType($0.formatDescription) == kCVPixelFormatType_DepthFloat16
})
let selectedFormat = filtered.max(by: {
    first, second in CMVideoFormatDescriptionGetDimensions(first.formatDescription).width < CMVideoFormatDescriptionGetDimensions(second.formatDescription).width
})

do {
    try videoDevice.lockForConfiguration()
    videoDevice.activeDepthDataFormat = selectedFormat
    videoDevice.unlockForConfiguration()
} catch {
    print("Could not lock device for configuration: \(error)")
    setupResult = .configurationFailed
    session.commitConfiguration()
    return
}
```

Synchronize the normal RGB video data with depth data output. The first output in the `dataOutputs` array is the master output.

``` swift
outputSynchronizer = AVCaptureDataOutputSynchronizer(dataOutputs: [videoDataOutput, depthDataOutput])
outputSynchronizer!.setDelegate(self, queue: dataOutputQueue)
```

The `CameraViewController` implementation creates and manages this session to interface with the camera. It also contains UI to toggle between the two viewing modes, 2D and 3D.

## Visualize Depth Data in 2D

The sample uses JET color coding to distinguish depth values, ranging from red (close) to blue (far). A slider controls the blending of the color code and the actual color values. Touching a pixel displays its depth value.
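
A minimal sketch of looking up the depth value at a touched pixel, assuming the touch location has already been converted to depth-map coordinates and the map uses the 16-bit float format selected above (Swift’s `Float16` type requires iOS 14 or later):

``` swift
// Sketch: read a single depth value from a DepthFloat16 pixel buffer.
func depthValue(at point: CGPoint, in depthMap: CVPixelBuffer) -> Float? {
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
    let x = min(max(Int(point.x), 0), CVPixelBufferGetWidth(depthMap) - 1)
    let y = min(max(Int(point.y), 0), CVPixelBufferGetHeight(depthMap) - 1)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

    // Each pixel is a 16-bit float; widen to Float for display.
    let value = base.advanced(by: y * rowBytes + x * MemoryLayout<Float16>.stride)
        .assumingMemoryBound(to: Float16.self).pointee
    return Float(value)
}
```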

`DepthToJETConverter` performs the conversion. It separates the color spectrum into histogram bins, colors a Metal texture from depth values obtained in the image buffer, and renders that texture into the preview.

``` swift
var cvTextureOut: CVMetalTexture?
CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, nil, textureFormat, width, height, 0, &cvTextureOut)
guard let cvTexture = cvTextureOut, let texture = CVMetalTextureGetTexture(cvTexture) else {
    print("Depth converter failed to create preview texture")
    CVMetalTextureCacheFlush(textureCache, 0)
    return nil
}
```
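
Conceptually, a JET-style mapping turns a normalized depth value into a color along the blue-to-red ramp. The following CPU-side function is an illustrative sketch only; the sample performs the equivalent work per pixel on the GPU using histogram bins:

``` swift
// Illustration only: map a normalized value in [0, 1] to an approximate JET color.
func jetColor(for normalizedValue: Float) -> (red: Float, green: Float, blue: Float) {
    let v = min(max(normalizedValue, 0), 1)
    let red   = min(max(1.5 - abs(4 * v - 3), 0), 1)
    let green = min(max(1.5 - abs(4 * v - 2), 0), 1)
    let blue  = min(max(1.5 - abs(4 * v - 1), 0), 1)
    return (red, green, blue)
}
```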

## Visualize Depth Data in 3D

The sample’s 3D viewer renders data as a point cloud. Control the camera with the following gestures:

* Pinch to zoom. 
* Pan to move the camera around the center. 
* Rotate with two fingers to turn the camera angle. 
* Double-tap the screen to reset the initial position. 

The sample implements a 3D point cloud as a `PointCloudMetalView`. It uses a Metal vertex shader to control geometry and a Metal fragment shader to color individual vertices, keeping the depth texture and color texture separate:

``` objective-c
CVMetalTextureCacheRef _depthTextureCache;
CVMetalTextureCacheRef _colorTextureCache;
```

The depth frame’s depth map provides the basis for the Metal view’s depth texture:

``` objective-c
CVPixelBufferRef depthFrame = depthData.depthDataMap;
CVMetalTextureRef cvDepthTexture = nullptr;
if (kCVReturnSuccess != CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                        _depthTextureCache,
                        depthFrame,
                        nil,
                        MTLPixelFormatR16Float,
                        CVPixelBufferGetWidth(depthFrame),
                        CVPixelBufferGetHeight(depthFrame),
                        0,
                        &cvDepthTexture)) {
    NSLog(@"Failed to create depth texture");
    CVPixelBufferRelease(colorFrame);
    return;
}

id<MTLTexture> depthTexture = CVMetalTextureGetTexture(cvDepthTexture);
```

The RGB image provides the basis for the Metal view’s color texture:

``` objective-c
CVMetalTextureRef cvColorTexture = nullptr;
if (kCVReturnSuccess != CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                        _colorTextureCache,
                        colorFrame,
                        nil,
                        MTLPixelFormatBGRA8Unorm,
                        CVPixelBufferGetWidth(colorFrame),
                        CVPixelBufferGetHeight(colorFrame),
                        0,
                        &cvColorTexture)) {
    NSLog(@"Failed to create color texture");
    CVPixelBufferRelease(colorFrame);
    return;
}

id<MTLTexture> colorTexture = CVMetalTextureGetTexture(cvColorTexture);
```

## Track Thermal State

Processing depth data from a live stream may cause the device to heat up. Keep tabs on the thermal state so you can alert the user if it exceeds a dangerous threshold.

``` swift
@objc
func thermalStateChanged(notification: NSNotification) {
    if let processInfo = notification.object as? ProcessInfo {
        showThermalState(state: processInfo.thermalState)
    }
}

func showThermalState(state: ProcessInfo.ThermalState) {
    DispatchQueue.main.async {
        var thermalStateString = "UNKNOWN"
        if state == .nominal {
            thermalStateString = "NOMINAL"
        } else if state == .fair {
            thermalStateString = "FAIR"
        } else if state == .serious {
            thermalStateString = "SERIOUS"
        } else if state == .critical {
            thermalStateString = "CRITICAL"
        }
        
        let message = NSLocalizedString("Thermal state: \(thermalStateString)", comment: "Alert message when thermal state has changed")
        let alertController = UIAlertController(title: "TrueDepthStreamer", message: message, preferredStyle: .alert)
        alertController.addAction(UIAlertAction(title: NSLocalizedString("OK", comment: "Alert OK button"), style: .cancel, handler: nil))
        self.present(alertController, animated: true, completion: nil)
    }
}
```
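
For the handler above to be called, the view controller must observe the system’s thermal state notification, typically during view setup:

``` swift
// Register for thermal state change notifications (for example, in viewDidLoad).
NotificationCenter.default.addObserver(self,
                                       selector: #selector(thermalStateChanged),
                                       name: ProcessInfo.thermalStateDidChangeNotification,
                                       object: nil)
```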