What this article covers

I. Introduction to AVFoundation's core features and core classes
II. The video preview layer
III. Configuring AVCaptureSession for video capture
IV. Switching between the front and back cameras
V. Implementing tap-to-focus
VI. Implementing auto exposure and exposure lock
VII. Toggling the torch and flash modes
VIII. Capturing still images
IX. Recording video and generating a video thumbnail

I. Introduction to AVFoundation's core features and core classes

AVFoundation core features

  • Apple's audio/video framework (the capture APIs covered here have been available since iOS 4)
  • Photo and video capture
  • Short video / live streaming

AVFoundation core classes

  • Capture session: AVCaptureSession (the most commonly used class; it acts like a "power strip", wiring inputs to outputs)

  • Capture device: AVCaptureDevice (e.g. a camera or microphone)

  • Capture device input: AVCaptureDeviceInput (audio/video input)

  • Capture output: AVCaptureOutput, an abstract class (still images, audio/video data, etc.), with concrete subclasses such as AVCaptureStillImageOutput, AVCaptureMovieFileOutput, AVCaptureAudioDataOutput, and AVCaptureVideoDataOutput

  • Capture connection: AVCaptureConnection (established automatically between an input and an output based on the capture device)

  • Capture preview: AVCaptureVideoPreviewLayer (displays what the camera is capturing in real time)

II. The video preview layer

1. The preview view

  • THPreviewView: shows the live preview while shooting
  • Built mainly on AVCaptureVideoPreviewLayer

THPreviewView.h

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@protocol THPreviewViewDelegate <NSObject>
// Tap to focus
- (void)tappedToFocusAtPoint:(CGPoint)point;
// Tap to expose
- (void)tappedToExposeAtPoint:(CGPoint)point;
// Tap to reset focus & exposure
- (void)tappedToResetFocusAndExposure;
@end
@interface THPreviewView : UIView
// The session ties the AVCaptureVideoPreviewLayer to the active AVCaptureSession
@property (strong, nonatomic) AVCaptureSession *session;
@property (weak, nonatomic) id<THPreviewViewDelegate> delegate;
@property (nonatomic) BOOL tapToFocusEnabled;  // whether tap-to-focus is enabled
@property (nonatomic) BOOL tapToExposeEnabled; // whether tap-to-expose is enabled
@end

Key code from THPreviewView.m

// Private helper supporting the touch-handling methods defined by this class:
// converts a touch point in the view's (screen) coordinate system to the camera's device coordinate system
- (CGPoint)captureDevicePointForPoint:(CGPoint)point {
  AVCaptureVideoPreviewLayer *layer = (AVCaptureVideoPreviewLayer *)self.layer;
  return [layer captureDevicePointOfInterestForPoint:point];
}
AVCaptureVideoPreviewLayer defines two methods for converting between camera and screen coordinate systems:
1. captureDevicePointOfInterestForPoint: takes a CGPoint in screen coordinates and returns the converted CGPoint in the capture device's coordinate system
2. pointForCaptureDevicePointOfInterest: takes a CGPoint in the capture device's coordinate system and returns the converted CGPoint in screen coordinates
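As a sketch of how this conversion is typically used (the gesture recognizer and the `handleSingleTap:` selector here are assumptions for illustration, not part of the original listing):

```objectivec
// Hypothetical tap handler in THPreviewView.m: converts the tap location
// from screen coordinates to device coordinates and notifies the delegate.
- (void)handleSingleTap:(UIGestureRecognizer *)recognizer {
    if (!self.tapToFocusEnabled) {
        return;
    }
    // The tap location in this view's (screen) coordinate system
    CGPoint screenPoint = [recognizer locationInView:self];
    // Converted to device space: (0,0) top-left through (1,1) bottom-right
    CGPoint devicePoint = [self captureDevicePointForPoint:screenPoint];
    [self.delegate tappedToFocusAtPoint:devicePoint];
}
```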

2. The capture controller

  • THCameraController

Key code from THCameraController.h

#import <AVFoundation/AVFoundation.h>
@protocol THCameraControllerDelegate <NSObject>
// 1. Callbacks for device-configuration errors, media-capture errors, and asset-library write errors
- (void)deviceConfigurationFailedWithError:(NSError *)error;
- (void)mediaCaptureFailedWithError:(NSError *)error;
- (void)assetLibraryWriteFailedWithError:(NSError *)error;
@end
@interface THCameraController : NSObject
@property (weak, nonatomic) id<THCameraControllerDelegate> delegate;
@property (nonatomic, strong, readonly) AVCaptureSession *captureSession;
// 2. Setting up and configuring the capture session
- (BOOL)setupSession:(NSError **)error;
- (void)startSession;
- (void)stopSession;
// 3. Switching between the front and back cameras
- (BOOL)switchCameras;    // switch cameras
- (BOOL)canSwitchCameras; // whether switching is supported
@property (nonatomic, readonly) NSUInteger cameraCount;         // number of cameras
@property (nonatomic, readonly) BOOL cameraHasTorch;            // torch available
@property (nonatomic, readonly) BOOL cameraHasFlash;            // flash available
@property (nonatomic, readonly) BOOL cameraSupportsTapToFocus;  // tap-to-focus supported
@property (nonatomic, readonly) BOOL cameraSupportsTapToExpose; // tap-to-expose supported
@property (nonatomic) AVCaptureTorchMode torchMode; // torch mode
@property (nonatomic) AVCaptureFlashMode flashMode; // flash mode
// 4. Focus, exposure, and reset methods
- (void)focusAtPoint:(CGPoint)point;
- (void)exposeAtPoint:(CGPoint)point;
- (void)resetFocusAndExposureModes;
// 5. Still-image & video capture
// Capture a still image
- (void)captureStillImage;
// Video recording
// Start recording
- (void)startRecording;
// Stop recording
- (void)stopRecording;
// Recording state
- (BOOL)isRecording;
// Recorded duration
- (CMTime)recordedDuration;
@end

III. Configuring AVCaptureSession

  • THCameraController's job: capturing video and photos
  • 1. Initialize the session
  • 2. Set the session preset (resolution)
  • 3. Configure the input devices (note: they must be wrapped in AVCaptureDeviceInput objects)
  • 4. The inputs include both audio and video input
  • 5. The outputs include a still-image output and a movie-file output
  • Note: always check whether an input/output can be added before adding it to the session, because the camera is a shared system resource, not something owned by any single app.
  • Because this touches the camera, photo library, and microphone, the user must be prompted, so the privacy usage descriptions are required
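The privacy prompts mentioned above come from usage-description keys in Info.plist; a minimal fragment (the description strings are placeholders):

```xml
<key>NSCameraUsageDescription</key>
<string>Used to capture photos and video</string>
<key>NSMicrophoneUsageDescription</key>
<string>Used to record audio while capturing video</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Used to save captured photos and videos</string>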

1. Basic AVCaptureSession configuration

Key code from THCameraController.m

@interface THCameraController () <AVCaptureFileOutputRecordingDelegate>
// Video queue
@property (strong, nonatomic) dispatch_queue_t videoQueue;
// Capture session
@property (strong, nonatomic) AVCaptureSession *captureSession;
// Input (strong, so it survives removal from the session when switching cameras)
@property (strong, nonatomic) AVCaptureDeviceInput *activeVideoInput;
@property (strong, nonatomic) AVCaptureStillImageOutput *imageOutput;
@property (strong, nonatomic) AVCaptureMovieFileOutput *movieOutput;
@property (strong, nonatomic) NSURL *outputURL;
@end
@implementation THCameraController
- (BOOL)setupSession:(NSError **)error {
    // 1. Create the session
    self.captureSession = [[AVCaptureSession alloc] init];
    // 2. Set the session preset (resolution / aspect ratio)
    self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;
#pragma mark - (1) Add the video input device
    // 3. Get the default video capture device: on iOS this is the back camera.
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // 4. Always wrap the capture device in an AVCaptureDeviceInput
    // Note: a device must be wrapped in an AVCaptureDeviceInput object before it can be added to the session
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:error];
    // 5. Check that videoInput is valid
    if (videoInput) {
        // 5.1 Note: check whether the input can be added
        if ([self.captureSession canAddInput:videoInput]) {
            // 5.2 Add videoInput to the session
            [self.captureSession addInput:videoInput];
            // 5.3 Make it the active video input
            self.activeVideoInput = videoInput;
        }
    } else {
        return NO;
    }
#pragma mark - (2) Add the audio input device: the built-in microphone
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    // Always wrap the capture device in an AVCaptureDeviceInput
    // Note: a device must be wrapped in an AVCaptureDeviceInput object before it can be added to the session
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:error];
    if (audioInput) {
        if ([self.captureSession canAddInput:audioInput]) {
            [self.captureSession addInput:audioInput];
        }
    } else {
        return NO;
    }
#pragma mark - Set up the outputs (photo / movie file)
    // Still images: AVCaptureStillImageOutput
    self.imageOutput = [[AVCaptureStillImageOutput alloc] init];
    // Store captured images as JPEG
    self.imageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
    if ([self.captureSession canAddOutput:self.imageOutput]) {
        [self.captureSession addOutput:self.imageOutput];
    }
    // Movies: an AVCaptureMovieFileOutput instance (QuickTime by default)
    self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([self.captureSession canAddOutput:self.movieOutput]) {
        [self.captureSession addOutput:self.movieOutput];
    }
    // Video queue
    self.videoQueue = dispatch_queue_create("cc.videoQueue", NULL);
    return YES;
}

2. Starting and stopping the AVCaptureSession

Key code from THCameraController.m

// Start capturing
- (void)startSession {
    // Only start if not already running
    if (![self.captureSession isRunning]) {
        // startRunning is a blocking call that takes some time, so dispatch it asynchronously
        dispatch_async(self.videoQueue, ^{
            [self.captureSession startRunning];
        });
    }
}
// Stop capturing
- (void)stopSession {
    // Only stop if currently running
    if ([self.captureSession isRunning]) {
        // Stop asynchronously as well
        dispatch_async(self.videoQueue, ^{
            [self.captureSession stopRunning];
        });
    }
}

IV. Switching between the front and back cameras

  • The back camera is the default

Key code from THCameraController.m

#pragma mark - Device Configuration   camera configuration helpers
// Find the capture device at the given position (front or back camera)
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    // Get all video devices
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    // Iterate over them
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}
// The currently active device
- (AVCaptureDevice *)activeCamera {
    return self.activeVideoInput.device;
}
// The currently inactive device
- (AVCaptureDevice *)inactiveCamera {
    AVCaptureDevice *device = nil;
    if (self.cameraCount > 1) {
        // Is the active camera the back or the front one?
        if ([self activeCamera].position == AVCaptureDevicePositionBack) {
            device = [self cameraWithPosition:AVCaptureDevicePositionFront];
        } else {
            device = [self cameraWithPosition:AVCaptureDevicePositionBack];
        }
    }
    return device;
}
// Whether the camera can be switched
- (BOOL)canSwitchCameras {
    return self.cameraCount > 1;
}
// Number of cameras
- (NSUInteger)cameraCount {
    return [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count];
}
// Switch cameras
- (BOOL)switchCameras {
    // 1. Check whether switching is possible
    if (![self canSwitchCameras]) {
        return NO;
    }
    // 2. The device opposite the current one
    AVCaptureDevice *videoDevice = [self inactiveCamera];
    // 3. Wrap the device in an AVCaptureDeviceInput
    NSError *error;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    // 4. Add it
    if (videoInput) {
        // Mark the start of a configuration change
        [self.captureSession beginConfiguration];
        // Remove the current input
        [self.captureSession removeInput:self.activeVideoInput];
        // Check whether the new input can be added
        if ([self.captureSession canAddInput:videoInput]) {
            [self.captureSession addInput:videoInput];
            // Update the active input
            self.activeVideoInput = videoInput;
        } else {
            // If the new device cannot be added, restore the original video input
            [self.captureSession addInput:self.activeVideoInput];
        }
        // Commit the configuration change
        [self.captureSession commitConfiguration];
    } else {
        // An error occurred while creating the input
        [self.delegate deviceConfigurationFailedWithError:error];
        return NO;
    }
    return YES;
}

V. Implementing tap-to-focus

  • Before changing a device's configuration, always check that the device supports the change

Key code from THCameraController.m

#pragma mark - Focus Methods   tap-to-focus implementation
// Whether tap-to-focus is supported
- (BOOL)cameraSupportsTapToFocus {
    // Ask the active camera whether it supports a focus point of interest
    return [[self activeCamera] isFocusPointOfInterestSupported];
}
- (void)focusAtPoint:(CGPoint)point {
    AVCaptureDevice *device = [self activeCamera];
    // Check that the device supports a focus point of interest and auto-focus mode
    if (device.isFocusPointOfInterestSupported && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error;
        // Only one object may reconfigure the device at a time, so lock it first
        if ([device lockForConfiguration:&error]) {
            // Focus point
            device.focusPointOfInterest = point;
            // Focus mode
            device.focusMode = AVCaptureFocusModeAutoFocus;
            // Done; release the lock
            [device unlockForConfiguration];
        } else {
            // Device error
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

VI. Implementing auto exposure and exposure lock

Key code from THCameraController.m

#pragma mark - Exposure Methods   tap-to-expose implementation
// Whether tap-to-expose is supported
- (BOOL)cameraSupportsTapToExpose {
    return [[self activeCamera] isExposurePointOfInterestSupported];
}
static const NSString *THCameraAdjustingExposureContext;
- (void)exposeAtPoint:(CGPoint)point {
    AVCaptureDevice *device = [self activeCamera];
    // Exposure mode
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    // Check support for an exposure point of interest and for this exposure mode
    if (device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
        NSError *error;
        // Lock the device for configuration
        if ([device lockForConfiguration:&error]) {
            // Set the desired values
            device.exposurePointOfInterest = point;
            device.exposureMode = exposureMode;
            // If the device supports a locked exposure mode...
            if ([device isExposureModeSupported:AVCaptureExposureModeLocked]) {
                // ...use KVO to observe the device's adjustingExposure property
                [device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:&THCameraAdjustingExposureContext];
            }
            [device unlockForConfiguration];
        } else {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    // Is this our THCameraAdjustingExposureContext?
    if (context == &THCameraAdjustingExposureContext) {
        // Get the device
        AVCaptureDevice *device = (AVCaptureDevice *)object;
        // If the device is no longer adjusting exposure and exposureMode can be set to AVCaptureExposureModeLocked
        if (!device.isAdjustingExposure && [device isExposureModeSupported:AVCaptureExposureModeLocked]) {
            // Remove self as observer so we no longer receive adjustingExposure change notifications
            [object removeObserver:self forKeyPath:@"adjustingExposure" context:&THCameraAdjustingExposureContext];
            // Hop back to the main queue asynchronously
            dispatch_async(dispatch_get_main_queue(), ^{
                NSError *error;
                if ([device lockForConfiguration:&error]) {
                    // Lock the exposure
                    device.exposureMode = AVCaptureExposureModeLocked;
                    // Release the lock
                    [device unlockForConfiguration];
                } else {
                    [self.delegate deviceConfigurationFailedWithError:error];
                }
            });
        }
    } else {
        // Not our context; pass it up to super
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
// Reset focus & exposure
- (void)resetFocusAndExposureModes {
    AVCaptureDevice *device = [self activeCamera];
    AVCaptureFocusMode focusMode = AVCaptureFocusModeContinuousAutoFocus;
    // Are a focus point of interest and continuous auto-focus supported?
    BOOL canResetFocus = [device isFocusPointOfInterestSupported] && [device isFocusModeSupported:focusMode];
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    // Can the exposure be reset?
    BOOL canResetExposure = [device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode];
    // Recall: in the capture device's coordinate space the top-left is (0,0), the bottom-right (1,1), and the center (0.5,0.5)
    CGPoint centerPoint = CGPointMake(0.5f, 0.5f);
    NSError *error;
    // Lock the device for configuration
    if ([device lockForConfiguration:&error]) {
        // Reset the focus if possible
        if (canResetFocus) {
            device.focusMode = focusMode;
            device.focusPointOfInterest = centerPoint;
        }
        // Reset the exposure to the desired mode if possible
        if (canResetExposure) {
            device.exposureMode = exposureMode;
            device.exposurePointOfInterest = centerPoint;
        }
        // Release the lock
        [device unlockForConfiguration];
    } else {
        [self.delegate deviceConfigurationFailedWithError:error];
    }
}

VII. Toggling the torch and flash modes

Key code from THCameraController.m

#pragma mark - Flash and Torch Modes   flash & torch
// Whether the camera has a flash
- (BOOL)cameraHasFlash {
    return [[self activeCamera] hasFlash];
}
// Current flash mode
- (AVCaptureFlashMode)flashMode {
    return [[self activeCamera] flashMode];
}
// Set the flash mode
- (void)setFlashMode:(AVCaptureFlashMode)flashMode {
    // Get the active device
    AVCaptureDevice *device = [self activeCamera];
    // Check that this flash mode is supported
    if ([device isFlashModeSupported:flashMode]) {
        // If so, lock the device
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            // Change the flash mode
            device.flashMode = flashMode;
            // Done; unlock and release the device
            [device unlockForConfiguration];
        } else {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}
// Whether the camera has a torch
- (BOOL)cameraHasTorch {
    return [[self activeCamera] hasTorch];
}
// Current torch mode
- (AVCaptureTorchMode)torchMode {
    return [[self activeCamera] torchMode];
}
- (void)setTorchMode:(AVCaptureTorchMode)torchMode {
    AVCaptureDevice *device = [self activeCamera];
    if ([device isTorchModeSupported:torchMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = torchMode;
            [device unlockForConfiguration];
        } else {
            [self.delegate deviceConfigurationFailedWithError:error];
        }
    }
}

VIII. Capturing still images

  • Uses the session configured above

Key code from THCameraController.m

#pragma mark - Image Capture Methods   capturing still images
/*
    AVCaptureStillImageOutput is an AVCaptureOutput subclass used to capture still images
*/
- (void)captureStillImage {
    // Get the connection
    AVCaptureConnection *connection = [self.imageOutput connectionWithMediaType:AVMediaTypeVideo];
    // The app only supports portrait UI, but if the user shoots in landscape, the resulting photo's orientation must be adjusted
    // Check whether setting the video orientation is supported
    if (connection.isVideoOrientationSupported) {
        // Apply the current orientation
        connection.videoOrientation = [self currentVideoOrientation];
    }
    // Capture the still image; the completion handler receives a sample buffer holding the image data
    [self.imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef _Nullable imageDataSampleBuffer, NSError *_Nullable error) {
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            // Key step: on success, hand the image off for writing
            [self writeImageToAssetsLibrary:image];
        } else {
            NSLog(@"NULL sampleBuffer: %@", [error localizedDescription]);
        }
    }];
}
// Map the current device orientation to a video orientation
- (AVCaptureVideoOrientation)currentVideoOrientation {
    AVCaptureVideoOrientation orientation;
    // Read the UIDevice orientation
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortrait:
            orientation = AVCaptureVideoOrientationPortrait;
            break;
        case UIDeviceOrientationLandscapeRight:
            // Device landscape-right corresponds to video landscape-left
            orientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            orientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        default:
            orientation = AVCaptureVideoOrientationLandscapeRight;
            break;
    }
    return orientation;
}
/*
    The Assets Library framework
    gives developers programmatic access to the iOS photo library.
    Note: it accesses the photo library, so the usage-description entry must be added to Info.plist, otherwise the app will crash.
*/
- (void)writeImageToAssetsLibrary:(UIImage *)image {
    // Create an ALAssetsLibrary instance (requires #import <AssetsLibrary/AssetsLibrary.h>)
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Parameter 1: the image (the API takes a CGImageRef, hence image.CGImage)
    // Parameter 2: the orientation, cast to NSUInteger
    // Parameter 3: success/failure completion block
    [library writeImageToSavedPhotosAlbum:image.CGImage orientation:(NSUInteger)image.imageOrientation completionBlock:^(NSURL *assetURL, NSError *error) {
        if (!error) {
            [self postThumbnailNotification:image];
        } else {
            // On failure, log the error
            id message = [error localizedDescription];
            NSLog(@"%@", message);
        }
    }];
}
// Post the thumbnail notification
- (void)postThumbnailNotification:(UIImage *)image {
    // Back on the main queue
    dispatch_async(dispatch_get_main_queue(), ^{
        // Post the notification (THThumbnailCreatedNotification is defined elsewhere in the project)
        NSNotificationCenter *nc = [NSNotificationCenter defaultCenter];
        [nc postNotificationName:THThumbnailCreatedNotification object:image];
    });
}

IX. Recording video and generating a video thumbnail

Capturing video content. When the capture session was set up, we added an output named AVCaptureMovieFileOutput, a class that captures QuickTime movies to disk. Most of its core functionality is inherited from its superclass, AVCaptureFileOutput, such as stopping once a maximum duration or a maximum file size is reached; it can also be configured to preserve a minimum amount of free disk space, which matters when recording video on storage-constrained mobile devices. Normally, when a QuickTime movie is ready for distribution, the movie header metadata sits at the start of the file, which lets a video player quickly read the header to determine the file's contents, structure, and the locations of the many samples it contains. When recording a QuickTime movie, however, the header cannot be created until all of the samples have finished being captured; when recording ends, the header is created and appended to the end of the file.

iOS Audio/Video Fundamentals (1): AVFoundation Media Capture

Creating the header only after all of the movie samples have been captured is a problem, especially on mobile devices. If a crash or other interruption occurs, say an incoming phone call, the movie header is never written correctly, leaving an unreadable movie file on disk. A core feature AVCaptureMovieFileOutput provides is capturing the QuickTime movie in fragments.
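The fragment behavior is controlled through AVCaptureMovieFileOutput's movieFragmentInterval property; as a sketch, shortening the interval reduces how much footage an interruption can lose (the 5-second value here is an arbitrary choice, not from the original article):

```objectivec
// Write a movie fragment every 5 seconds instead of the default 10,
// so an interruption loses at most ~5 seconds of footage.
self.movieOutput.movieFragmentInterval = CMTimeMake(5, 1);
```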


When recording begins, a minimal header is written at the front of the file; as recording proceeds, fragments are written at a set interval, progressively building the complete header. By default a fragment is written every 10 seconds, though the interval can be changed via the capture output's movieFragmentInterval property. Writing fragments this way incrementally builds a complete QuickTime movie header, guaranteeing that if the app crashes or is interrupted, the movie is still saved up to the last successfully written fragment.

Key code from THCameraController.m

#pragma mark - Video Capture Methods   capturing video
// Whether we are currently recording
- (BOOL)isRecording {
    return self.movieOutput.isRecording;
}
// Start recording
- (void)startRecording {
    if (![self isRecording]) {
        // 1. Get the current video capture connection, used to configure some core properties of the captured video
        AVCaptureConnection *videoConnection = [self.movieOutput connectionWithMediaType:AVMediaTypeVideo];
        // Check whether setting the videoOrientation property is supported
        if ([videoConnection isVideoOrientationSupported]) {
            // If so, apply the current orientation
            videoConnection.videoOrientation = [self currentVideoOrientation];
        }
        // 2. Enable video stabilization if supported: it can noticeably improve quality, and it only applies when recording to a file
        if ([videoConnection isVideoStabilizationSupported]) {
            videoConnection.enablesVideoStabilizationWhenAvailable = YES;
        }
        // 3. Smooth auto-focus: slows the lens's focusing speed so the camera does not hunt abruptly while the user moves
        AVCaptureDevice *device = [self activeCamera];
        if (device.isSmoothAutoFocusSupported) {
            NSError *error;
            if ([device lockForConfiguration:&error]) {
                device.smoothAutoFocusEnabled = YES;
                [device unlockForConfiguration];
            } else {
                [self.delegate deviceConfigurationFailedWithError:error];
            }
        }
        // 4. Find a unique file-system URL to write the captured movie to
        self.outputURL = [self uniqueURL];
        // 5. Camera configuration complete
        // Start recording: live stream / short video -> capture -> compress (AAC/H.264)
        // The movie is recorded as a QuickTime file (hardware encoding via AVFoundation)
        [self.movieOutput startRecordingToOutputFileURL:self.outputURL recordingDelegate:self];
    }
}
// Recorded duration
- (CMTime)recordedDuration {
    return self.movieOutput.recordedDuration;
}
// Build a unique file-system URL for the movie
- (NSURL *)uniqueURL {
    NSFileManager *fileManager = [NSFileManager defaultManager];
    // temporaryDirectoryWithTemplateString: is a category helper (from the project's NSFileManager additions) that creates a unique temp directory from a mkdtemp-style template
    NSString *dirPath = [fileManager temporaryDirectoryWithTemplateString:@"kamera.XXXXXX"];
    if (dirPath) {
        NSString *filePath = [dirPath stringByAppendingPathComponent:@"kamera_movie.mov"];
        return [NSURL fileURLWithPath:filePath];
    }
    return nil;
}
// Stop recording
- (void)stopRecording {
    // Only if currently recording
    if ([self isRecording]) {
        [self.movieOutput stopRecording];
    }
}
#pragma mark - AVCaptureFileOutputRecordingDelegate
// Recording finished
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    // Handle errors
    if (error) {
        [self.delegate mediaCaptureFailedWithError:error];
    } else {
        // Write the movie to the photo library
        [self writeVideoToAssetsLibrary:[self.outputURL copy]];
    }
    self.outputURL = nil;
}
// Write the captured video to the photo library
- (void)writeVideoToAssetsLibrary:(NSURL *)videoURL {
    // ALAssetsLibrary instance: provides the interface for writing videos
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // Before writing, check that the video can be written to the saved photos album
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {
        // Create the completion block
        ALAssetsLibraryWriteVideoCompletionBlock completionBlock;
        completionBlock = ^(NSURL *assetURL, NSError *error) {
            if (error) {
                [self.delegate assetLibraryWriteFailedWithError:error];
            } else {
                // Generate a thumbnail for display in the UI
                [self generateThumbnailForVideoAtURL:videoURL];
            }
        };
        // Perform the actual write to the library
        [library writeVideoAtPathToSavedPhotosAlbum:videoURL completionBlock:completionBlock];
    }
}
// Generate the video thumbnail (shown in the bottom-left corner of the UI)
- (void)generateThumbnailForVideoAtURL:(NSURL *)videoURL {
    // On the videoQueue
    dispatch_async(self.videoQueue, ^{
        // Create a new AVAsset & AVAssetImageGenerator
        AVAsset *asset = [AVAsset assetWithURL:videoURL];
        AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
        // Set maximumSize to width 100, height 0: the height is computed from the video's aspect ratio
        imageGenerator.maximumSize = CGSizeMake(100.0f, 0.0f);
        // Apply the preferred track transform so the thumbnail accounts for the video's orientation; without it the thumbnail may come out rotated
        imageGenerator.appliesPreferredTrackTransform = YES;
        // Get a CGImageRef; note we own it and must release it ourselves
        CGImageRef imageRef = [imageGenerator copyCGImageAtTime:kCMTimeZero actualTime:NULL error:nil];
        // Convert it to a UIImage
        UIImage *image = [UIImage imageWithCGImage:imageRef];
        // Release the CGImageRef to avoid a memory leak
        CGImageRelease(imageRef);
        // Back to the main thread
        dispatch_async(dispatch_get_main_queue(), ^{
            // Post the notification carrying the new image
            [self postThumbnailNotification:image];
        });
    });
}

Questions and corrections are welcome in the comments! If you found this post useful, please give it a like.