iOS custom camera: UIImagePickerController && AVCaptureSession (with a WeChat-style short-video demo)




Today I'll introduce two ways to build a custom camera: UIImagePickerController and AVCaptureSession.

UIImagePickerController

UIImagePickerController is the simple, convenient option — a complete API that Apple has already packaged for us.

AVCaptureSession

The API above, however, only supports simple tweaks to the camera view, which sometimes isn't enough. For example, when we need a more elaborate overlay (a fully custom camera view), we have to build the camera ourselves, and AVCaptureSession is what makes that possible.

Code

If you're in a hurry, the complete demo is at the very bottom; everyone else can read on in order.

UIImagePickerController

Properties

First of all, there are two protocols to conform to: UINavigationControllerDelegate and UIImagePickerControllerDelegate

@property(nullable, nonatomic, weak) id <UINavigationControllerDelegate, UIImagePickerControllerDelegate> delegate; // the two protocols to conform to

There are also some basic settings, such as the source type and the media types. Setting allowsEditing to YES lets the user edit the photo right after taking it, with the crop box shown.

// basic settings
@property(nonatomic) UIImagePickerControllerSourceType sourceType; // image source: camera, photo library, or saved photos album
@property(nonatomic, copy) NSArray<NSString *> *mediaTypes; // media types
@property(nonatomic) BOOL allowsEditing; // whether the user may edit the result
@property(nonatomic) BOOL allowsImageEditing; // deprecated, use allowsEditing instead

These two properties relate to video:

@property(nonatomic) NSTimeInterval videoMaximumDuration; // maximum recording duration
@property(nonatomic) UIImagePickerControllerQualityType videoQuality; // video recording quality

The following relate to customizing the camera view. Note: they are only usable when the source type is UIImagePickerControllerSourceTypeCamera; otherwise the app crashes.

@property(nonatomic) BOOL showsCameraControls; // defaults to YES; set to NO to hide all the default UI
@property(nullable, nonatomic, strong) __kindof UIView *cameraOverlayView; // custom camera overlay view
@property(nonatomic) CGAffineTransform cameraViewTransform; // transform applied to the camera view while shooting; enables rotation and scaling
@property(nonatomic) UIImagePickerControllerCameraCaptureMode cameraCaptureMode; // photo or video mode
@property(nonatomic) UIImagePickerControllerCameraDevice cameraDevice; // which camera (front/rear) to start with
@property(nonatomic) UIImagePickerControllerCameraFlashMode cameraFlashMode; // default flash mode: on/off/auto

Methods

The first checks the source type, typically to determine whether the device can take photos at all.

typedef NS_ENUM(NSInteger, UIImagePickerControllerSourceType) {
    UIImagePickerControllerSourceTypePhotoLibrary, // photo library
    UIImagePickerControllerSourceTypeCamera, // camera --> this is the one for taking photos
    UIImagePickerControllerSourceTypeSavedPhotosAlbum // saved photos album
};
+ (BOOL)isSourceTypeAvailable:(UIImagePickerControllerSourceType)sourceType; // whether a given source type is available

This one checks whether the front/rear camera is available:

typedef NS_ENUM(NSInteger, UIImagePickerControllerCameraDevice) {
    UIImagePickerControllerCameraDeviceRear, // rear
    UIImagePickerControllerCameraDeviceFront // front
};
+ (BOOL)isCameraDeviceAvailable:(UIImagePickerControllerCameraDevice)cameraDevice; // whether the front/rear camera is available

Methods for taking photos and recording video:

- (void)takePicture; // takes a still photo programmatically; the result arrives via the -imagePickerController:didFinishPickingMediaWithInfo: delegate callback; cannot capture video
- (BOOL)startVideoCapture; // start recording
- (void)stopVideoCapture; // stop recording

Delegate

Note: don't dismiss UIImagePickerController directly; dismiss it only after receiving one of the delegate callbacks below.

@protocol UIImagePickerControllerDelegate<NSObject>
@optional
// deprecated, ignore this one
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(nullable NSDictionary<NSString *,id> *)editingInfo;
// the two that matter are below
// called after taking a photo and tapping "Use Photo", and after picking a photo from the library
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary<NSString *,id> *)info;
// called when the user taps Cancel; usually where you dismiss
- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker;
@end
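Putting the pieces together, presenting the picker might look like the following sketch. The method name `showCamera` and the surrounding view-controller context are my own illustration, not part of the original demo:

```objective-c
// Present the camera picker and handle its delegate callbacks (sketch).
- (void)showCamera
{
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        NSLog(@"Camera not available on this device");
        return;
    }
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.allowsEditing = YES; // show the edit/crop step after capture
    picker.delegate = self;     // self conforms to both delegate protocols
    [self presentViewController:picker animated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary<NSString *,id> *)info
{
    // UIImagePickerControllerEditedImage is only present when allowsEditing == YES
    UIImage *image = info[UIImagePickerControllerEditedImage] ?: info[UIImagePickerControllerOriginalImage];
    NSLog(@"picked image: %@", image);
    [picker dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [picker dismissViewControllerAnimated:YES completion:nil];
}
```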

Three important functions save photos/videos to the device. Remember to link AssetsLibrary.framework and import <AssetsLibrary/AssetsLibrary.h> wherever you call the save methods.

// saves a photo to the album; the callback must have the signature - (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo;
UIKIT_EXTERN void UIImageWriteToSavedPhotosAlbum(UIImage *image, __nullable id completionTarget, __nullable SEL completionSelector, void * __nullable contextInfo);
// whether the video can be saved to the album; usually called before the function below
UIKIT_EXTERN BOOL UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(NSString *videoPath) NS_AVAILABLE_IOS(3_1);
// saves a video to the album; the callback must have the signature - (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo;
UIKIT_EXTERN void UISaveVideoAtPathToSavedPhotosAlbum(NSString *videoPath, __nullable id completionTarget, __nullable SEL completionSelector, void * __nullable contextInfo) NS_AVAILABLE_IOS(3_1);

Usage:

- (void)savePic
{
    // note: the first argument must be a UIImage, not a file path
    UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error) {
        NSLog(@"Error while saving photo: %@", error.localizedDescription);
    } else {
        NSLog(@"Photo saved successfully.");
    }
}
- (void)saveVideo
{
    UISaveVideoAtPathToSavedPhotosAlbum(videoPath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil); // save the video to the album
}
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error) {
        NSLog(@"Error while saving video: %@", error.localizedDescription);
    } else {
        NSLog(@"Video saved successfully.");
    }
}

The other option is ALAssetsLibrary, but it was deprecated wholesale in iOS 9.0 in favor of PHPhotoLibrary (part of the Photos framework — follow the link for the complete reference; I'll write up that framework another day).
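For completeness, the modern PHPhotoLibrary route looks roughly like this (a sketch based on the Photos framework API; the method name and parameters are my own illustration, not part of the original demo):

```objective-c
#import <Photos/Photos.h>

// Save an image and a video with the Photos framework (sketch).
- (void)saveWithPhotosFramework:(UIImage *)image videoURL:(NSURL *)videoURL
{
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        // each creation request adds one asset to the user's library
        [PHAssetChangeRequest creationRequestForAssetFromImage:image];
        [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:videoURL];
    } completionHandler:^(BOOL success, NSError * _Nullable error) {
        if (success) {
            NSLog(@"Saved to photo library.");
        } else {
            NSLog(@"Save failed: %@", error);
        }
    }];
}
```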

The pre-iOS 9 way:

// save a photo to the album
[[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[img CGImage] orientation:(ALAssetOrientation)img.imageOrientation completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Save image fail: %@", error);
    } else {
        NSLog(@"Save image succeed.");
    }
}];
// save a video to the album
[[[ALAssetsLibrary alloc] init] writeVideoAtPathToSavedPhotosAlbum:[NSURL URLWithString:videoPath] completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Save video fail: %@", error);
    } else {
        NSLog(@"Save video succeed.");
    }
}];

AVCaptureSession

First, here is the demo imitating WeChat's short video. It only runs on a real device, so the footage is a bit noisy — bear with me.

Workflow

AVCaptureSession maps the device's microphone/camera (AVCaptureDevice) into input objects (AVCaptureDeviceInput), then exports the recorded data through output objects (AVCaptureOutput) over established connections (AVCaptureConnection); while recording, we can preview the live picture through AVCaptureVideoPreviewLayer.

`AVCaptureSession` is the session object, the manager of the whole audio/video recording.
`AVCaptureDevice` is the physical device mapped into the program as an object; through it we control the flash, torch, focus mode, and so on.
`AVCaptureDeviceInput` manages the input data stream during recording.
`AVCaptureConnection` is the connection linking input streams to output streams; video/audio stabilization and keeping the preview orientation in sync with the recording are configured here, as are the audioChannels.
`AVCaptureOutput` manages the output data stream; the header shows many subclasses, and in practice we use one of those subclasses.
`AVCaptureVideoPreviewLayer` is a `CALayer` that lets us preview the picture while shooting.

The parts in detail

Here we'll imitate a WeChat-style short video demo and walk through the main steps below.

Authorization

First check the authorization status (AVAuthorizationStatus) and request access for whichever device we need. Note that if the user has not yet made a choice, we request access again.

// obtain authorization
- (void)getAuthorization
{
    // camera authorization
    switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo])
    {
        case AVAuthorizationStatusAuthorized: // authorized, ready to use. The client is authorized to access the hardware supporting a media type.
        {
            break;
        }
        case AVAuthorizationStatusNotDetermined: // no choice made yet. Indicates that the user has not yet made a choice regarding whether the client can access the hardware.
        {
            // request authorization again
            [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
                if (granted) { // the user granted access
                    return;
                } else { // the user denied access
                    return;
                }
            }];
            break;
        }
        default: // denied / restricted
        {
            break;
        }
    }
}
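A side note beyond the original post: starting with iOS 10, the app's Info.plist must also declare usage descriptions for the camera and microphone, otherwise the system terminates the app on first access. A minimal fragment (the description strings are placeholders):

```xml
<key>NSCameraUsageDescription</key>
<string>Used to record short videos.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Used to record audio for short videos.</string>
```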

Related enums:

/*
 AVAuthorizationStatusNotDetermined = 0, // the user has not made a choice yet
 AVAuthorizationStatusRestricted,        // not authorized, and the user cannot change this (e.g. parental controls)
 AVAuthorizationStatusDenied,            // the user denied access to the app
 AVAuthorizationStatusAuthorized,        // authorized, ready to use
 */

Creating the objects along the workflow

AVCaptureSession

First create the session object _captureSession for the short video, and set the video resolution. Note: the preset you choose here determines the size of the photos/videos captured later.

- (void)addSession
{
    _captureSession = [[AVCaptureSession alloc] init];
    if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
        [_captureSession setSessionPreset:AVCaptureSessionPreset640x480];
    }
}

Related presets (note: the list below contains AVAssetExportSession presets, which we use later for compression; session presets are the AVCaptureSessionPreset* constants):

/* the following presets are usually supported
 (
 AVAssetExportPresetLowQuality,
 AVAssetExportPreset960x540,
 AVAssetExportPreset640x480,
 AVAssetExportPresetMediumQuality,
 AVAssetExportPreset1920x1080,
 AVAssetExportPreset1280x720,
 AVAssetExportPresetHighestQuality,
 AVAssetExportPresetAppleM4A
 )
 */

AVCaptureDevice

Next we bring the devices in, _videoDevice and then _audioDevice. Note that when adding device objects to the session, wrap the changes in _captureSession's beginConfiguration / commitConfiguration pair. Finally, add the recording preview layer (PreviewLayer), and once everything is configured, start the session with startRunning — note, this does not mean recording has started. When the session is no longer needed, call stopRunning.

[_captureSession beginConfiguration];
[self addVideo];
[self addAudio];
[self addPreviewLayer];
[_captureSession commitConfiguration];
[_captureSession startRunning];

Video
- (void)addVideo
{
    // fetch the camera device and create the AVCaptureDeviceInput from it
    _videoDevice = [self deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
    [self addVideoInput];
    [self addMovieOutput];
}

Related enums:

/* MediaType
 AVF_EXPORT NSString *const AVMediaTypeVideo NS_AVAILABLE(10_7, 4_0); // video
 AVF_EXPORT NSString *const AVMediaTypeAudio NS_AVAILABLE(10_7, 4_0); // audio
 AVF_EXPORT NSString *const AVMediaTypeText NS_AVAILABLE(10_7, 4_0);
 AVF_EXPORT NSString *const AVMediaTypeClosedCaption NS_AVAILABLE(10_7, 4_0);
 AVF_EXPORT NSString *const AVMediaTypeSubtitle NS_AVAILABLE(10_7, 4_0);
 AVF_EXPORT NSString *const AVMediaTypeTimecode NS_AVAILABLE(10_7, 4_0);
 AVF_EXPORT NSString *const AVMediaTypeMetadata NS_AVAILABLE(10_8, 6_0);
 AVF_EXPORT NSString *const AVMediaTypeMuxed NS_AVAILABLE(10_7, 4_0);
 */
/* AVCaptureDevicePosition
 typedef NS_ENUM(NSInteger, AVCaptureDevicePosition) {
     AVCaptureDevicePositionUnspecified = 0,
     AVCaptureDevicePositionBack = 1,  // rear camera
     AVCaptureDevicePositionFront = 2  // front camera
 } NS_AVAILABLE(10_7, 4_0) __TVOS_PROHIBITED;
 */

Here is the method that fetches a camera:

#pragma mark fetch the camera --> front/rear
- (AVCaptureDevice *)deviceWithMediaType:(NSString *)mediaType preferringPosition:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:mediaType];
    AVCaptureDevice *captureDevice = devices.firstObject;
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            captureDevice = device;
            break;
        }
    }
    return captureDevice;
}
// the two methods below can also fetch the front/rear camera, but they carry some risk:
// if the device has a problem and no hardware matches the hard-coded uniqueID, you're out of luck
//- (AVCaptureDevice *)frontCamera
//{
//    return [AVCaptureDevice deviceWithUniqueID:@"com.apple.avfoundation.avcapturedevice.built-in_video:1"];
//}
//
//- (AVCaptureDevice *)backCamera
//{
//    return [AVCaptureDevice deviceWithUniqueID:@"com.apple.avfoundation.avcapturedevice.built-in_video:0"];
//}

Add the video input object AVCaptureDeviceInput: initialize it from the input device (it is used to obtain the input data), then add it to the session (AVCaptureSession).

- (void)addVideoInput
{
    NSError *videoError;
    _videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_videoDevice error:&videoError];
    if (videoError) {
        NSLog(@"---- error obtaining the camera device ------ %@", videoError);
        return;
    }
    if ([_captureSession canAddInput:_videoInput]) {
        [_captureSession addInput:_videoInput];
    }
}

Next come the movie output object AVCaptureMovieFileOutput and the connection object AVCaptureConnection, along with video stabilization (preferredVideoStabilizationMode) and video orientation (setVideoOrientation).

- (void)addMovieOutput
{
    _movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([_captureSession canAddOutput:_movieOutput]) {
        [_captureSession addOutput:_movieOutput];
        AVCaptureConnection *captureConnection = [_movieOutput connectionWithMediaType:AVMediaTypeVideo];
        if ([captureConnection isVideoStabilizationSupported]) {
            captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
        }
        captureConnection.videoScaleAndCropFactor = captureConnection.videoMaxScaleAndCropFactor;
    }
}

Related enums:

// video orientation
/*
 typedef NS_ENUM(NSInteger, AVCaptureVideoOrientation) {
     AVCaptureVideoOrientationPortrait = 1,
     AVCaptureVideoOrientationPortraitUpsideDown = 2,
     AVCaptureVideoOrientationLandscapeRight = 3,
     AVCaptureVideoOrientationLandscapeLeft = 4,
 } NS_AVAILABLE(10_7, 4_0) __TVOS_PROHIBITED;
 */

Audio

Now add the audio side: the audio input device (AVCaptureDevice) and the audio input object (AVCaptureDeviceInput), and add the input object to the session.

- (void)addAudio
{
    NSError *audioError;
    _audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    _audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:_audioDevice error:&audioError];
    if (audioError) {
        NSLog(@"error obtaining the microphone ------ %@", audioError);
        return;
    }
    if ([_captureSession canAddInput:_audioInput]) {
        [_captureSession addInput:_audioInput];
    }
}

AVCaptureVideoPreviewLayer

Create the preview layer AVCaptureVideoPreviewLayer from the session, set its fill mode (videoGravity) and orientation (videoOrientation), and place the layer where it should appear.

- (void)addPreviewLayer
{
    _captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    _captureVideoPreviewLayer.frame = self.view.layer.bounds;
    // _captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    _captureVideoPreviewLayer.connection.videoOrientation = [_movieOutput connectionWithMediaType:AVMediaTypeVideo].videoOrientation;
    // width/height here come from a UIView frame-accessor category
    _captureVideoPreviewLayer.position = CGPointMake(self.view.width * 0.5, self.videoView.height * 0.5);
    CALayer *layer = self.videoView.layer;
    layer.masksToBounds = true;
    [self.view layoutIfNeeded];
    [layer addSublayer:_captureVideoPreviewLayer];
}

Related constants:

/* fill modes
 Options are AVLayerVideoGravityResize, AVLayerVideoGravityResizeAspect and AVLayerVideoGravityResizeAspectFill. AVLayerVideoGravityResizeAspect is default.
 */

Finally, starting, stopping, and restarting a recording:

- (void)startRecord
{
    [_movieOutput startRecordingToOutputFileURL:[self outPutFileURL] recordingDelegate:self];
}
- (NSURL *)outPutFileURL
{
    return [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"outPut.mov"]];
}
- (void)stopRecord
{
    [_movieOutput stopRecording];
}
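One caveat worth adding (my note, not from the original demo): AVCaptureMovieFileOutput reports an error if a file already exists at the output URL, so when reusing a fixed name like outPut.mov it is safer to delete any leftover file first. A variant of startRecord:

```objective-c
- (void)startRecord
{
    NSURL *outputURL = [self outPutFileURL];
    // remove a leftover file from a previous recording, otherwise
    // startRecordingToOutputFileURL: fails with an error
    [[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
    [_movieOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
}
```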

Recording delegate

These include the start and finish callbacks:

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    NSLog(@"---- recording started ----");
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSLog(@"---- recording finished ---- %@", captureOutput.outputFileURL);
    if (self.canSave) {
        [self pushToPlay:captureOutput.outputFileURL];
        self.canSave = NO;
    }
}

Compressing / saving the video

We need to save the finished recording; since raw recordings are usually large, we compress before saving.

Compression is done with AVAssetExportSession; we can also optimize for network use (shouldOptimizeForNetworkUse), set the output format (outputFileType), and run the export asynchronously (exportAsynchronouslyWithCompletionHandler).

#pragma mark save & compress
- (NSURL *)compressedURL
{
    return [NSURL fileURLWithPath:[[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, true) lastObject] stringByAppendingPathComponent:@"compressed.mp4"]];
}
- (CGFloat)fileSize:(NSURL *)path
{
    return [[NSData dataWithContentsOfURL:path] length] / 1024.00 / 1024.00;
}
// compress the video
- (IBAction)compressVideo:(id)sender
{
    NSLog(@"Starting compression; size before: %f MB", [self fileSize:self.videoUrl]);
    AVURLAsset *avAsset = [[AVURLAsset alloc] initWithURL:self.videoUrl options:nil];
    // check the preset we actually use (the original checked AVAssetExportPresetLowQuality here)
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
    if ([compatiblePresets containsObject:AVAssetExportPreset640x480]) {
        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:avAsset presetName:AVAssetExportPreset640x480];
        exportSession.outputURL = [self compressedURL];
        exportSession.shouldOptimizeForNetworkUse = true;
        exportSession.outputFileType = AVFileTypeMPEG4;
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            if ([exportSession status] == AVAssetExportSessionStatusCompleted) {
                NSLog(@"Compression finished; size after: %f MB", [self fileSize:[self compressedURL]]);
                [self saveVideo:[self compressedURL]];
            } else {
                NSLog(@"Export did not complete, status: %ld, error: %@", (long)exportSession.status, exportSession.error);
            }
        }];
    }
}
- (void)saveVideo:(NSURL *)outputFileURL
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
                                completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Failed to save video: %@", error);
        } else {
            NSLog(@"Video saved to the album.");
        }
    }];
}

Playing the recording and looping it

Play back the recorded video, and replay it when it ends:

- (void)create
{
    _playItem = [AVPlayerItem playerItemWithURL:self.videoUrl];
    _player = [AVPlayer playerWithPlayerItem:_playItem];
    _playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
    _playerLayer.frame = CGRectMake(200, 200, 100, 100);
    _playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // video fill mode
    [self.view.layer addSublayer:_playerLayer];
    // listen for the end of playback so playbackFinished: gets called
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playbackFinished:) name:AVPlayerItemDidPlayToEndTimeNotification object:_playItem];
    [_player play];
}
- (void)playbackFinished:(NSNotification *)notification
{
    [_player seekToTime:CMTimeMake(0, 1)];
    [_player play];
}

Interaction

Next, some interaction-related features: switching cameras, toggling the flash, white balance, and so on.

Note: before changing any device property, you must first lock the device with lockForConfiguration, and unlock it afterwards with unlockForConfiguration.

The point is to lock the device while its properties are being modified, so that concurrent modifications don't collide. Since several different changes are needed, it's best to wrap the pattern up:

- (void)changeDevicePropertySafety:(void (^)(AVCaptureDevice *captureDevice))propertyChange
{
    // _videoDevice would also work, but going through the input is nicer
    AVCaptureDevice *captureDevice = [_videoInput device];
    NSError *error;
    BOOL lockAcquired = [captureDevice lockForConfiguration:&error];
    if (!lockAcquired) {
        NSLog(@"Error locking the device: %@", error.localizedDescription);
    } else {
        [_captureSession beginConfiguration];
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
        [_captureSession commitConfiguration];
    }
}

Toggling the flash

Turning on flash mode alone has no visible effect, so we also turn on the torch. Check that both are supported first, or the app will crash.

- (IBAction)changeFlashlight:(UIButton *)sender {
    BOOL con1 = [_videoDevice hasTorch]; // torch mode supported
    BOOL con2 = [_videoDevice hasFlash]; // flash mode supported
    if (con1 && con2)
    {
        [self changeDevicePropertySafety:^(AVCaptureDevice *captureDevice) {
            if (_videoDevice.flashMode == AVCaptureFlashModeOn) // flash is on
            {
                [_videoDevice setFlashMode:AVCaptureFlashModeOff];
                [_videoDevice setTorchMode:AVCaptureTorchModeOff];
            } else if (_videoDevice.flashMode == AVCaptureFlashModeOff) // flash is off
            {
                [_videoDevice setFlashMode:AVCaptureFlashModeOn];
                [_videoDevice setTorchMode:AVCaptureTorchModeOn];
            }
        }];
        sender.selected = !sender.isSelected;
    } else {
        NSLog(@"Cannot toggle the flash mode");
    }
}

Switching cameras

Pick the camera to switch to based on the one currently in use:

- (IBAction)changeCamera {
    switch (_videoDevice.position) {
        case AVCaptureDevicePositionBack:
            _videoDevice = [self deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionFront];
            break;
        case AVCaptureDevicePositionFront:
            _videoDevice = [self deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
            break;
        default:
            return;
    }
    [self changeDevicePropertySafety:^(AVCaptureDevice *captureDevice) {
        NSError *error;
        AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_videoDevice error:&error];
        if (newVideoInput != nil) {
            // must remove the old input first, otherwise canAddInput: says NO
            [_captureSession removeInput:_videoInput];
            if ([_captureSession canAddInput:newVideoInput]) {
                [_captureSession addInput:newVideoInput];
                _videoInput = newVideoInput;
            } else {
                [_captureSession addInput:_videoInput];
            }
        } else if (error) {
            NSLog(@"Failed to switch cameras, error = %@", error);
        }
    }];
}

Focus mode, exposure mode, and zoom

With both a single-tap and a double-tap gesture present, set them up as follows; requireGestureRecognizerToFail: ensures only one of them fires at a time.

- (void)addGestureRecognizer {
    UITapGestureRecognizer *singleTapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(singleTap:)];
    singleTapGesture.numberOfTapsRequired = 1;
    singleTapGesture.delaysTouchesBegan = YES;
    UITapGestureRecognizer *doubleTapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(doubleTap:)];
    doubleTapGesture.numberOfTapsRequired = 2;
    doubleTapGesture.delaysTouchesBegan = YES;
    [singleTapGesture requireGestureRecognizerToFail:doubleTapGesture];
    [self.videoView addGestureRecognizer:singleTapGesture];
    [self.videoView addGestureRecognizer:doubleTapGesture];
}

A single tap changes the focus mode (setFocusMode) and focus point (setFocusPointOfInterest), plus the exposure mode (setExposureMode) and exposure point (setExposurePointOfInterest).

Note: the camera's point coordinates run from 0 to 1, so convert the tap location with captureDevicePointOfInterestForPoint:.

- (void)singleTap:(UITapGestureRecognizer *)tapGesture {
    CGPoint point = [tapGesture locationInView:self.videoView];
    CGPoint cameraPoint = [_captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorAnimationWithPoint:point];
    [self changeDevicePropertySafety:^(AVCaptureDevice *captureDevice) {
        // focus mode
        if ([captureDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
            [captureDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        } else {
            NSLog(@"Failed to change focus mode");
        }
        // focus point
        if ([captureDevice isFocusPointOfInterestSupported]) {
            [captureDevice setFocusPointOfInterest:cameraPoint];
        }
        // exposure mode
        if ([captureDevice isExposureModeSupported:AVCaptureExposureModeAutoExpose]) {
            [captureDevice setExposureMode:AVCaptureExposureModeAutoExpose];
        } else {
            NSLog(@"Failed to change exposure mode");
        }
        // exposure point
        if ([captureDevice isExposurePointOfInterestSupported]) {
            [captureDevice setExposurePointOfInterest:cameraPoint];
        }
    }];
}

A double tap changes the zoom factor (videoZoomFactor):

- (void)doubleTap:(UITapGestureRecognizer *)tapGesture {
    NSLog(@"double tap");
    [self changeDevicePropertySafety:^(AVCaptureDevice *captureDevice) {
        if (captureDevice.videoZoomFactor == 1.0) {
            CGFloat current = 1.5;
            if (current < captureDevice.activeFormat.videoMaxZoomFactor) {
                [captureDevice rampToVideoZoomFactor:current withRate:10];
            }
        } else {
            [captureDevice rampToVideoZoomFactor:1.0 withRate:10];
        }
    }];
}

Related enums:

/*
 @constant AVCaptureFocusModeLocked (focus locked at the current lens position)
 Indicates that the focus should be locked at the lens' current position.
 @constant AVCaptureFocusModeAutoFocus (autofocus once, then lock)
 Indicates that the device should autofocus once and then change the focus mode to AVCaptureFocusModeLocked.
 @constant AVCaptureFocusModeContinuousAutoFocus (autofocus whenever needed)
 Indicates that the device should automatically focus when needed.
 */
/*
 @constant AVCaptureExposureModeLocked (exposure locked at its current value)
 Indicates that the exposure should be locked at its current value.
 @constant AVCaptureExposureModeAutoExpose (adjust exposure once, then lock)
 Indicates that the device should automatically adjust exposure once and then change the exposure mode to AVCaptureExposureModeLocked.
 @constant AVCaptureExposureModeContinuousAutoExposure (adjust exposure automatically when needed)
 Indicates that the device should automatically adjust exposure when needed.
 @constant AVCaptureExposureModeCustom (exposure follows only user-provided values)
 Indicates that the device should only adjust exposure according to user provided ISO, exposureDuration values.
 */

Demo

That's everything. I felt myself running out of steam toward the end, so here is a demo that includes both approaches described above.

Demo download link


Original article; please credit the source when reposting: https://kevinmky.github.io