
AVFoundation is Apple's high-level framework for working with time-based media on iOS and OS X. It provides the tools developers need to build state-of-the-art media applications on Apple platforms. The framework is designed for 64-bit processors, takes full advantage of multicore hardware, and automatically uses hardware acceleration where available, so most devices run at their best performance. It is one of the essential frameworks to learn for audio/video development on iOS.

As part of my ongoing AVFoundation study notes, the companion Demo (see the Demo link) wraps a number of utility classes that can be used directly. This article focuses on implementing media capture with AVCaptureSession and related classes; the usage of the other classes is covered in my other articles.

Capturing Media

Media capture is one of AVFoundation's core features and is indispensable when building audio/video apps. The classes involved in capture are shown in the figure below.

(Figure: AVFoundation capture class diagram – AVCaptureSession media capture)

  • AVCaptureSession is the central class of the AVFoundation capture stack. A capture session connects input and output resources: it manages the input streams coming from physical devices, such as video from the camera and audio from the microphone, and routes them to one or more outputs in different forms. Inputs and outputs can be reconfigured dynamically, so the capture pipeline can be rewired on demand within a running session.

  • AVCaptureDevice defines an interface for physical devices such as the camera and the microphone. Since iOS 10, devices are obtained through AVCaptureDeviceDiscoverySession.

  • An AVCaptureDevice must be wrapped in an AVCaptureDeviceInput before it can be added to a capture session.

  • AVCaptureOutput is an abstract base class for getting data out of a capture session, and AVFoundation defines a number of concrete subclasses of it. The commonly used ones are AVCaptureStillImageOutput (still images), AVCaptureMovieFileOutput (movie files), AVCaptureVideoDataOutput (raw video frames), AVCaptureAudioDataOutput (raw audio buffers) and AVCaptureMetadataOutput (metadata). Note that AVCaptureVideoDataOutput and AVCaptureMovieFileOutput cannot be used in the same session at the same time.

  • AVCaptureVideoPreviewLayer is a subclass of Core Animation's CALayer that previews the captured video in real time. Alternatively, you can render the live video buffers yourself with a GLKView or a UIImageView. A minimal wiring sketch of these classes follows this list.
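Before diving into the Demo, here is a minimal, self-contained wiring sketch of how these classes fit together. This is a hypothetical example, not code from the Demo; permission checks and error handling are omitted, and someView is a placeholder:

// Session
AVCaptureSession *session = [[AVCaptureSession alloc] init];
// Input: wrap the default camera in an AVCaptureDeviceInput and add it
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *cameraInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
if (cameraInput && [session canAddInput:cameraInput]) [session addInput:cameraInput];
// Output: a movie file output, for example
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieOutput]) [session addOutput:movieOutput];
// Preview: an AVCaptureVideoPreviewLayer attached to the same session
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.frame = someView.bounds;
[someView.layer addSublayer:previewLayer];
// Start the session (in practice on a background queue, see section 6)
[session startRunning];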

For the full implementation, see the CQCaptureManager class in the Demo, which wraps the capture utilities described below.

1. Creating the Session

Create the session and configure its resolution (session preset).

  • When setting a preset, check whether it is supported first; for example, the front camera on older models does not support 4K.
  • Different presets also correspond to different zoom factors.
// Create the session (e.g. in the manager's init)
self.captureSession = [[AVCaptureSession alloc] init];
/// Configure the session preset (resolution), falling back to High if the requested preset is unsupported
- (void)configSessionPreset:(AVCaptureSessionPreset)sessionPreset {
    [self.captureSession beginConfiguration];
    if ([self.captureSession canSetSessionPreset:sessionPreset]) {
        self.captureSession.sessionPreset = sessionPreset;
    } else {
        // The requested preset is not supported on this device; fall back to a safe default
        self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;
    }
    [self.captureSession commitConfiguration];
    self.isConfigSessionPreset = YES;
}

2. Configuring the Video Input

/// Configure the video input
- (BOOL)configVideoInput:(NSError * _Nullable *)error {
    // Add a video capture device
    // The default video capture device on iOS is the back camera
//    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *videoDevice = [self getCameraWithPosition:AVCaptureDevicePositionBack];
    // Wrap the capture device in an AVCaptureDeviceInput
    // Note: a session cannot use an AVCaptureDevice directly; it must be wrapped in an AVCaptureDeviceInput
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:error];
    // Add the input to the session
    // Check that videoInput is non-nil and can be added first: the camera is a shared system resource
    // that does not belong to any single app, so another app may already be using it
    if (videoInput && [self.captureSession canAddInput:videoInput]) {
        // Add videoInput to captureSession
        [self.captureSession beginConfiguration];
        [self.captureSession addInput:videoInput];
        [self.captureSession commitConfiguration];
        self.videoDeviceInput = videoInput;
        return YES;
    } else {
        return NO;
    }
}
/// Remove the video input device
- (void)removeVideoDeviceInput {
    if (self.videoDeviceInput) [self.captureSession removeInput:self.videoDeviceInput];
    self.videoDeviceInput = nil;
}
  • Getting a camera: since iOS 10, use AVCaptureDeviceDiscoverySession.
  • Telephoto, ultra-wide, dual and triple cameras can only be obtained through AVCaptureDeviceDiscoverySession; [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] cannot return them.
/// Return the camera for the given position
- (AVCaptureDevice *)getCameraWithPosition:(AVCaptureDevicePosition)position {
    /**
     AVCaptureDeviceTypeBuiltInWideAngleCamera  Wide-angle (the default device, roughly a 28mm-equivalent lens)
     AVCaptureDeviceTypeBuiltInTelephotoCamera  Telephoto (the 2x or 3x lens; only available via AVCaptureDeviceDiscoverySession)
     AVCaptureDeviceTypeBuiltInUltraWideCamera  Ultra-wide (the 0.5x lens; only available via AVCaptureDeviceDiscoverySession)
     AVCaptureDeviceTypeBuiltInDualCamera       Wide + telephoto (iPhone 7 Plus, iPhone X); switches lenses automatically; only via AVCaptureDeviceDiscoverySession
     AVCaptureDeviceTypeBuiltInDualWideCamera   Ultra-wide + wide (iPhone 12, iPhone 13); switches lenses automatically; only via AVCaptureDeviceDiscoverySession
     AVCaptureDeviceTypeBuiltInTripleCamera     Ultra-wide + wide + telephoto (iPhone 11 Pro Max, iPhone 12 Pro Max, iPhone 13 Pro Max); switches lenses automatically; only via AVCaptureDeviceDiscoverySession
     AVCaptureDeviceTypeBuiltInTrueDepthCamera  Infrared depth sensor plus camera (the front TrueDepth camera on Face ID devices)
     */
    NSArray *deviceTypes;
    if (position == AVCaptureDevicePositionBack) {
        deviceTypes = @[AVCaptureDeviceTypeBuiltInDualCamera,
                        AVCaptureDeviceTypeBuiltInDualWideCamera,
                        AVCaptureDeviceTypeBuiltInTripleCamera];
    } else {
        deviceTypes = @[AVCaptureDeviceTypeBuiltInWideAngleCamera];
    }
    AVCaptureDeviceDiscoverySession *deviceSession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:deviceTypes mediaType:AVMediaTypeVideo position:position];
    if (deviceSession.devices.count) return deviceSession.devices.firstObject;
    if (position == AVCaptureDevicePositionBack) {
        // Single-camera (non multi-cam) devices: fall back to the plain wide-angle camera
        deviceTypes = @[AVCaptureDeviceTypeBuiltInWideAngleCamera];
        AVCaptureDeviceDiscoverySession *deviceSession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:deviceTypes mediaType:AVMediaTypeVideo position:position];
        if (deviceSession.devices.count) return deviceSession.devices.firstObject;
    }
    return nil;
}

3. Configuring the Audio Input

/// Configure the audio input
- (BOOL)configAudioInput:(NSError * _Nullable *)error {
    // Add an audio capture device; this can be skipped if you only capture still images
    // The default audio capture device is a built-in microphone
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    self.audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:error];
    if (self.audioDeviceInput && [self.captureSession canAddInput:self.audioDeviceInput]) {
        [self.captureSession beginConfiguration];
        [self.captureSession addInput:self.audioDeviceInput];
        [self.captureSession commitConfiguration];
        return YES;
    } else {
        return NO;
    }
}
/// Remove the audio input device
- (void)removeAudioDeviceInput {
    if (self.audioDeviceInput) [self.captureSession removeInput:self.audioDeviceInput];
    self.audioDeviceInput = nil; // clear the reference, matching removeVideoDeviceInput
}

5. Configuring the Outputs

#pragma mark - Func Still image output configuration
/// Configure the still image output
- (void)configStillImageOutput {
    // AVCaptureStillImageOutput captures still images from the camera
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    // Output settings dictionary: we want JPEG images
    self.stillImageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
    // Check whether the output can be added before adding it to the session
    [self.captureSession beginConfiguration];
    if ([self.captureSession canAddOutput:self.stillImageOutput]) {
        [self.captureSession addOutput:self.stillImageOutput];
    }
    [self.captureSession commitConfiguration];
}
/// Remove the still image output
- (void)removeStillImageOutput {
    if (self.stillImageOutput) [self.captureSession removeOutput:self.stillImageOutput];
}
#pragma mark - Func Movie file output configuration
/// Configure the movie file output
- (void)configMovieFileOutput {
    // AVCaptureMovieFileOutput records QuickTime movies to the file system
    self.movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    [self.captureSession beginConfiguration];
    if ([self.captureSession canAddOutput:self.movieFileOutput]) {
        [self.captureSession addOutput:self.movieFileOutput];
    }
    [self.captureSession commitConfiguration];
}
/// Remove the movie file output
- (void)removeMovieFileOutput {
    if (self.movieFileOutput) [self.captureSession removeOutput:self.movieFileOutput];
}
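Note that AVCaptureStillImageOutput has been deprecated since iOS 10 in favor of AVCapturePhotoOutput. The code above still works and matches the Demo, but on newer systems the same configuration step would look roughly like the following sketch (hypothetical, not part of CQCaptureManager):

// Minimal sketch of the AVCapturePhotoOutput replacement (iOS 10+)
AVCapturePhotoOutput *photoOutput = [[AVCapturePhotoOutput alloc] init];
[self.captureSession beginConfiguration];
if ([self.captureSession canAddOutput:photoOutput]) {
    [self.captureSession addOutput:photoOutput];
}
[self.captureSession commitConfiguration];
// Capturing a photo later: settings are passed per capture instead of via outputSettings
// (AVVideoCodecTypeJPEG on iOS 11+, AVVideoCodecJPEG before)
AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey: AVVideoCodecTypeJPEG}];
[photoOutput capturePhotoWithSettings:settings delegate:self]; // self must conform to AVCapturePhotoCaptureDelegate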

6. Starting and Stopping the Session

// Start the session asynchronously
- (void)startSessionAsync {
    // Check whether the session is already running
    if (![self.captureSession isRunning]) {
        // startRunning is a blocking call that takes some time, so dispatch it to a background queue
        dispatch_async(self.captureVideoQueue, ^{
            [self.captureSession startRunning];
        });
    }
}
// Stop the session asynchronously
- (void)stopSessionAsync {
    // Check whether the session is running
    if ([self.captureSession isRunning]) {
        dispatch_async(self.captureVideoQueue, ^{
            [self.captureSession stopRunning];
        });
    }
}
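The captureVideoQueue used above, and the startSessionSync method called later in captureStillImage and startRecordingMovieFile, belong to the Demo but are not shown in this article. A minimal sketch of what they presumably look like (the queue label is a guess):

// Serial queue for session work, created once (e.g. in init)
self.captureVideoQueue = dispatch_queue_create("CQCaptureManager.captureVideoQueue", DISPATCH_QUEUE_SERIAL);

// Synchronous start: blocks the caller until startRunning returns
- (void)startSessionSync {
    if (![self.captureSession isRunning]) {
        [self.captureSession startRunning];
    }
}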

7. Capturing Still Images

#pragma mark - Still image capture
#pragma mark Public Func Still image capture
// Capture a still image
- (void)captureStillImage {
    if (!self.isConfigSessionPreset) [self configSessionPreset:AVCaptureSessionPresetMedium];
    if (!self.videoDeviceInput) {
        NSError *configError;
        BOOL configResult = [self configVideoInput:&configError];
        if (!configResult) return;
    }
    if (!self.stillImageOutput) [self configStillImageOutput];
    [self startSessionSync];
    // Get the connection that carries the still image output
    AVCaptureConnection *connection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    // Even if the app only supports portrait, the resulting photo's orientation still needs to be
    // adjusted when the user shoots in landscape
    // Check whether the connection supports setting the video orientation; if so, set it from the device orientation
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = [self getCurrentVideoOrientation];
    }
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef _Nullable imageDataSampleBuffer, NSError * _Nullable error) {
        if (imageDataSampleBuffer != NULL) {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (self.delegate && [self.delegate respondsToSelector:@selector(mediaCaptureImageFileSuccess)]) {
                    [self.delegate mediaCaptureImageFileSuccess];
                }
            });
            // Convert the CMSampleBufferRef to a UIImage and write it to the photo library
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            [self writeImageToAssetsLibrary:image];
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (self.delegate && [self.delegate respondsToSelector:@selector(mediaCaptureImageFailedWithError:)]) {
                    [self.delegate mediaCaptureImageFailedWithError:error];
                }
            });
            NSLog(@"NULL sampleBuffer: %@", [error localizedDescription]);
        }
    }];
}
#pragma mark Private Func Still image capture
/**
 Assets Library framework
 Gives developers programmatic access to the iOS photo library.
 Note: this touches the user's photo library, so the corresponding usage description must be added
 to Info.plist, otherwise the app will crash.
 */
/// Write a UIImage to the user's photo library
- (void)writeImageToAssetsLibrary:(UIImage *)image {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Parameter 1: the image; parameter 2: the orientation; parameter 3: the completion block
    [library writeImageToSavedPhotosAlbum:image.CGImage orientation:(NSUInteger)image.imageOrientation completionBlock:^(NSURL *assetURL, NSError *error) {
        if (!error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (self.delegate && [self.delegate respondsToSelector:@selector(assetLibraryWriteImageSuccessWithImage:)]) {
                    [self.delegate assetLibraryWriteImageSuccessWithImage:image];
                }
            });
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (self.delegate && [self.delegate respondsToSelector:@selector(assetLibraryWriteImageFailedWithError:)]) {
                    [self.delegate assetLibraryWriteImageFailedWithError:error];
                }
            });
        }
    }];
}
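The getCurrentVideoOrientation helper used when setting connection.videoOrientation above (and again when recording movies below) is also part of the Demo and not shown in this article. A minimal sketch, assuming it simply maps the current UIDevice orientation to an AVCaptureVideoOrientation:

/// Map the current device orientation to a video orientation (sketch)
- (AVCaptureVideoOrientation)getCurrentVideoOrientation {
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            return AVCaptureVideoOrientationPortraitUpsideDown;
        case UIDeviceOrientationLandscapeLeft:
            // The device's landscape-left corresponds to the camera's landscape-right, and vice versa
            return AVCaptureVideoOrientationLandscapeRight;
        case UIDeviceOrientationLandscapeRight:
            return AVCaptureVideoOrientationLandscapeLeft;
        case UIDeviceOrientationPortrait:
        default:
            return AVCaptureVideoOrientationPortrait;
    }
}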

8. Capturing Movie Files

#pragma mark - Movie file capture
#pragma mark Public Func Movie file capture
// Start recording a movie file
- (void)startRecordingMovieFile {
    if (!self.isConfigSessionPreset) [self configSessionPreset:AVCaptureSessionPresetMedium];
    if (!self.videoDeviceInput) {
        NSError *configError;
        BOOL configResult = [self configVideoInput:&configError];
        if (!configResult) return;
    }
    if (!self.movieFileOutput) [self configMovieFileOutput];
    [self startSessionSync];
    if ([self isRecordingMovieFile]) return;
    AVCaptureConnection *videoConnection = [self.movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    // Set the output orientation
    // Even if the app only supports portrait, the resulting video's orientation still needs to be
    // adjusted when the user shoots in landscape
    // Check whether the connection supports setting the video orientation; if so, set it from the device orientation
    if (videoConnection.isVideoOrientationSupported) {
        videoConnection.videoOrientation = [self getCurrentVideoOrientation];
    }
    // Enable video stabilization
    // Check whether stabilization is supported; it can noticeably improve quality and only applies to movie recording
    // (enablesVideoStabilizationWhenAvailable is deprecated; use preferredVideoStabilizationMode instead)
    if (videoConnection.isVideoStabilizationSupported) {
        videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
    }
    // Configure focus
    AVCaptureDevice *device = [self getActiveCamera];
    // Smooth autofocus slows down the lens movement; without it the camera tries to refocus abruptly
    // while the user is moving, which is visible in the recording
    if (device.isSmoothAutoFocusSupported) { // note: check *supported*, not *enabled*
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.smoothAutoFocusEnabled = YES;
            [device unlockForConfiguration];
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (self.delegate && [self.delegate respondsToSelector:@selector(deviceConfigurationFailedWithError:)]) {
                    [self.delegate deviceConfigurationFailedWithError:error];
                }
            });
        }
    }
    self.movieFileOutputURL = [self getVideoTempPathURL];
    // Start recording. Parameter 1: the output file URL; parameter 2: the recording delegate
    [self.movieFileOutput startRecordingToOutputFileURL:self.movieFileOutputURL recordingDelegate:self];
}
// Stop recording the movie file
- (void)stopRecordingMovieFile {
    if ([self isRecordingMovieFile]) {
        [self.movieFileOutput stopRecording];
    }
}
// Whether a movie file is currently being recorded
- (BOOL)isRecordingMovieFile {
    return self.movieFileOutput.isRecording;
}
// Duration recorded so far
- (CMTime)movieFileRecordedDuration {
    return self.movieFileOutput.recordedDuration;
}
#pragma mark AVCaptureFileOutputRecordingDelegate
/// Called when recording to the movie file finishes
- (void)captureOutput:(AVCaptureFileOutput *)output didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray<AVCaptureConnection *> *)connections error:(NSError *)error {
    if (error) {
        dispatch_async(dispatch_get_main_queue(), ^{
            if (self.delegate && [self.delegate respondsToSelector:@selector(mediaCaptureMovieFileFailedWithError:)]) {
                [self.delegate mediaCaptureMovieFileFailedWithError:error];
            }
        });
    } else {
        dispatch_async(dispatch_get_main_queue(), ^{
            if (self.delegate && [self.delegate respondsToSelector:@selector(mediaCaptureMovieFileSuccess)]) {
                [self.delegate mediaCaptureMovieFileSuccess];
            }
        });
        // Write a copy of the URL to the photo library, then clear the property
        [self writeVideoToAssetsLibrary:self.movieFileOutputURL.copy];
        self.movieFileOutputURL = nil;
    }
}
#pragma mark Private Func Movie file capture
/// Build a temporary file URL for the recorded video
- (NSURL *)getVideoTempPathURL {
    NSFileManager *fileManager = [NSFileManager defaultManager];
    // temporaryDirectoryWithTemplateString: is not a system API; it is presumably an NSFileManager
    // category from the Demo that creates a unique temp directory from the mkstemp-style template
    NSString *tempPath = [fileManager temporaryDirectoryWithTemplateString:@"video.XXXXXX"];
    if (tempPath) {
        NSString *filePath = [tempPath stringByAppendingPathComponent:@"temp_video.mov"];
        return [NSURL fileURLWithPath:filePath];
    }
    return nil;
}
/// Write the video file to the user's photo library
- (void)writeVideoToAssetsLibrary:(NSURL *)videoURL {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Unlike images, writing video is more expensive, so check whether the file can be written first
    if (![library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) return;
    [library writeVideoAtPathToSavedPhotosAlbum:videoURL completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (self.delegate && [self.delegate respondsToSelector:@selector(assetLibraryWriteMovieFileFailedWithError:)]) {
                    [self.delegate assetLibraryWriteMovieFileFailedWithError:error];
                }
            });
        } else {
            // Written successfully: generate a cover image and report it through the delegate
            [self getVideoCoverImageWithVideoURL:videoURL callBlock:^(UIImage *coverImage) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (self.delegate && [self.delegate respondsToSelector:@selector(assetLibraryWriteMovieFileSuccessWithCoverImage:)]) {
                        [self.delegate assetLibraryWriteMovieFileSuccessWithCoverImage:coverImage];
                    }
                });
            }];
        }
    }];
}
/// Generate a cover image for the video file
- (void)getVideoCoverImageWithVideoURL:(NSURL *)videoURL callBlock:(void(^)(UIImage *))callBlock {
    dispatch_async(self.captureVideoQueue, ^{
        AVAsset *asset = [AVAsset assetWithURL:videoURL];
        AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
        // A maximumSize with width 100 and height 0 derives the height from the video's aspect ratio
        imageGenerator.maximumSize = CGSizeMake(100.0f, 0.0f);
        // Apply the track's preferred transform (e.g. orientation changes); without this the
        // thumbnail's orientation may come out wrong
        imageGenerator.appliesPreferredTrackTransform = YES;
        CGImageRef imageRef = [imageGenerator copyCGImageAtTime:kCMTimeZero actualTime:NULL error:nil];
        UIImage *image = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);
        dispatch_async(dispatch_get_main_queue(), ^{
            !callBlock ?: callBlock(image);
        });
    });
}
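Note that ALAssetsLibrary has been deprecated since iOS 9; the Photos framework (PHPhotoLibrary) is its replacement. A minimal sketch of the equivalent video write, shown for illustration only (videoURL stands for the recorded file URL):

#import <Photos/Photos.h>
// Sketch: save a recorded video with the Photos framework instead of ALAssetsLibrary
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:videoURL];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    // Dispatch back to the main queue before notifying a delegate, as in the ALAssetsLibrary version
    NSLog(@"save video success: %d, error: %@", success, error);
}];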

9. Previewing the Video

The Demo previews the live video by handing the capture session to a preview view whose backing layer is an AVCaptureVideoPreviewLayer:

previewView.session = captureManager.captureSession;
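A minimal sketch of what such a preview view could look like (CQPreviewView is a hypothetical name; the actual class in the Demo may differ):

@interface CQPreviewView : UIView
@property (nonatomic, strong) AVCaptureSession *session;
@end

@implementation CQPreviewView
// Back the view with an AVCaptureVideoPreviewLayer instead of a plain CALayer
+ (Class)layerClass {
    return [AVCaptureVideoPreviewLayer class];
}
- (AVCaptureSession *)session {
    return ((AVCaptureVideoPreviewLayer *)self.layer).session;
}
- (void)setSession:(AVCaptureSession *)session {
    AVCaptureVideoPreviewLayer *previewLayer = (AVCaptureVideoPreviewLayer *)self.layer;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    previewLayer.session = session;
}
@end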