GPUImage Source Code Reading (13)

GPUImageMovieWriter does not expose many methods, but each one is fairly long and its internal processing is relatively complex. Only the most commonly used methods are shown here; if you need to record video, they are worth reading carefully.
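
Before looking at the implementation, here is a minimal usage sketch (not taken from the original article) of how a GPUImageMovieWriter is typically wired to a GPUImageVideoCamera. The file path, video size, and session preset are illustrative assumptions, and the sketch assumes GPUImage.h has been imported.

// Minimal, hedged usage sketch: record the camera feed to a movie file.
GPUImageVideoCamera *camera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                                   cameraPosition:AVCaptureDevicePositionBack];
camera.outputImageOrientation = UIInterfaceOrientationPortrait;

NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/movie.m4v"];
unlink([path UTF8String]); // AVAssetWriter refuses to write over an existing file
NSURL *movieURL = [NSURL fileURLWithPath:path];

GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL
                                                                             size:CGSizeMake(480.0, 640.0)];
movieWriter.encodingLiveVideo = YES;
[movieWriter setHasAudioTrack:YES audioSettings:nil]; // nil -> the default AAC settings built below

[camera addTarget:movieWriter];
camera.audioEncodingTarget = movieWriter;

[camera startCameraCapture];
[movieWriter startRecording];

// ...later, stop recording:
// [movieWriter finishRecordingWithCompletionHandler:^{ /* move or export the file */ }];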

// Initialize the audio parameters: encoding format, channel count, sample rate, and bit rate
- (void)setHasAudioTrack:(BOOL)newValue audioSettings:(NSDictionary *)audioOutputSettings;
{
    _hasAudioTrack = newValue;

    if (_hasAudioTrack)
    {
        if (_shouldPassthroughAudio)
        {
            // Do not set any settings so audio will be the same as passthrough
            audioOutputSettings = nil;
        }
        else if (audioOutputSettings == nil)
        {
            AVAudioSession *sharedAudioSession = [AVAudioSession sharedInstance];
            double preferredHardwareSampleRate;

            if ([sharedAudioSession respondsToSelector:@selector(sampleRate)])
            {
                preferredHardwareSampleRate = [sharedAudioSession sampleRate];
            }
            else
            {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
                preferredHardwareSampleRate = [[AVAudioSession sharedInstance] currentHardwareSampleRate];
#pragma clang diagnostic pop
            }

            AudioChannelLayout acl; // Initialize the audio channel layout
            bzero( &acl, sizeof(acl));
            acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono; // Mono (single-channel) mode
            // Configure the audio output settings
            audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                         [ NSNumber numberWithInt: kAudioFormatMPEG4AAC], AVFormatIDKey,
                                         [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
                                         [ NSNumber numberWithFloat: preferredHardwareSampleRate ], AVSampleRateKey,
                                         [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
                                         //[ NSNumber numberWithInt:AVAudioQualityLow], AVEncoderAudioQualityKey,
                                         [ NSNumber numberWithInt: 64000 ], AVEncoderBitRateKey,
                                         nil];
/*
            AudioChannelLayout acl;
            bzero( &acl, sizeof(acl));
            acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;

            audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [ NSNumber numberWithInt: kAudioFormatMPEG4AAC ], AVFormatIDKey,
                                   [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
                                   [ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
                                   [ NSNumber numberWithInt: 64000 ], AVEncoderBitRateKey,
                                   [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
                                   nil];*/
        }

        self.assetWriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
        [assetWriter addInput:assetWriterAudioInput];
        assetWriterAudioInput.expectsMediaDataInRealTime = NO;
    }
    else
    {
        // Remove audio track if it exists
    }
}
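
When passthrough is off, the caller can also supply its own dictionary instead of nil and skip the defaults built above. Below is a hedged sketch, assuming the standard AVFoundation settings keys and a stereo AAC track; movieWriter is the writer instance from the earlier usage sketch, and the sample rate and bit rate are illustrative.

// Hedged sketch: custom AAC settings (stereo, 44.1 kHz, 128 kbps) passed in place of nil.
AudioChannelLayout stereoLayout = {0};
stereoLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;

NSDictionary *customAudioSettings = @{
    AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey : @(2),
    AVSampleRateKey       : @(44100.0),
    AVEncoderBitRateKey   : @(128000),
    AVChannelLayoutKey    : [NSData dataWithBytes:&stereoLayout length:sizeof(stereoLayout)]
};
[movieWriter setHasAudioTrack:YES audioSettings:customAudioSettings];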

Next, the video setup method.

// Video input source settings
- (void)initializeMovieWithOutputSettings:(NSDictionary *)outputSettings;
{
    isRecording = NO;

    self.enabled = YES;
    NSError *error = nil;
    self.assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType:fileType error:&error];

    if (error != nil)
    {
        NSLog(@"Error: %@", error);
        if (failureBlock)
        {
            failureBlock(error);
        }
        else
        {
            if(self.delegate && [self.delegate respondsToSelector:@selector(movieRecordingFailedWithError:)])
            {
                [self.delegate movieRecordingFailedWithError:error];
            }
        }
    }

    // Set this to make sure that a functional movie is produced, even if the recording is cut off mid-stream. Only the last second should be lost in that case.
    // The default value is 10 seconds
//    _captureMovieFileOutput.movieFragmentInterval = kCMTimeInvalid
    assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);

    // use default output settings if none specified
    // Set the output video's width and height
    if (outputSettings == nil)
    {
        NSMutableDictionary *settings = [[NSMutableDictionary alloc] init];
        [settings setObject:AVVideoCodecH264 forKey:AVVideoCodecKey];
        [settings setObject:[NSNumber numberWithInt:videoSize.width] forKey:AVVideoWidthKey];
        [settings setObject:[NSNumber numberWithInt:videoSize.height] forKey:AVVideoHeightKey];
        outputSettings = settings;
    }
    // custom output settings specified
    else
    {
        NSString *videoCodec = [outputSettings objectForKey:AVVideoCodecKey];
        NSNumber *width = [outputSettings objectForKey:AVVideoWidthKey];
        NSNumber *height = [outputSettings objectForKey:AVVideoHeightKey];

        NSAssert(videoCodec && width && height, @"OutputSettings is missing required parameters.");

        if( [outputSettings objectForKey:@"EncodingLiveVideo"] ) {
            NSMutableDictionary *tmp = [outputSettings mutableCopy];
            [tmp removeObjectForKey:@"EncodingLiveVideo"];
            outputSettings = tmp;
        }
    }

    /*
    NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                                [NSNumber numberWithInt:videoSize.width], AVVideoCleanApertureWidthKey,
                                                [NSNumber numberWithInt:videoSize.height], AVVideoCleanApertureHeightKey,
                                                [NSNumber numberWithInt:0], AVVideoCleanApertureHorizontalOffsetKey,
                                                [NSNumber numberWithInt:0], AVVideoCleanApertureVerticalOffsetKey,
                                                nil];

    NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              [NSNumber numberWithInt:3], AVVideoPixelAspectRatioHorizontalSpacingKey,
                                              [NSNumber numberWithInt:3], AVVideoPixelAspectRatioVerticalSpacingKey,
                                              nil];

    NSMutableDictionary * compressionProperties = [[NSMutableDictionary alloc] init];
    [compressionProperties setObject:videoCleanApertureSettings forKey:AVVideoCleanApertureKey];
    [compressionProperties setObject:videoAspectRatioSettings forKey:AVVideoPixelAspectRatioKey];
    [compressionProperties setObject:[NSNumber numberWithInt: 2000000] forKey:AVVideoAverageBitRateKey];
    [compressionProperties setObject:[NSNumber numberWithInt: 16] forKey:AVVideoMaxKeyFrameIntervalKey];
    [compressionProperties setObject:AVVideoProfileLevelH264Main31 forKey:AVVideoProfileLevelKey];

    [outputSettings setObject:compressionProperties forKey:AVVideoCompressionPropertiesKey];
    */

    float bitRate = 87500.0f * 8.0f;
    int frameInterval = 24;


    NSDictionary *compressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                         [NSNumber numberWithFloat:bitRate], AVVideoAverageBitRateKey,
                                         [NSNumber numberWithInt:frameInterval*2], AVVideoMaxKeyFrameIntervalKey,
                                         nil];

    NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              AVVideoScalingModeResizeAspectFill, AVVideoScalingModeKey,
                                              [NSNumber numberWithInteger:videoSize.width], AVVideoWidthKey,
                                              [NSNumber numberWithInteger:videoSize.height], AVVideoHeightKey, // square format
                                              compressionSettings, AVVideoCompressionPropertiesKey,
                                              nil];

    self.assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];
    assetWriterVideoInput.expectsMediaDataInRealTime = YES;

    // You need to use BGRA for the video in order to get realtime encoding. I use a color-swizzling shader to line up glReadPixels' normal RGBA output with the movie input's BGRA.
    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                           [NSNumber numberWithInt:videoSize.width], kCVPixelBufferWidthKey,
                                                           [NSNumber numberWithInt:videoSize.height], kCVPixelBufferHeightKey,
                                                           nil];
//    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey,
//                                                           nil];

    assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:assetWriterVideoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
    // Add the video input to the asset writer
    [assetWriter addInput:assetWriterVideoInput];
}
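
Note that in the variant shown here the video input is ultimately created from the hard-coded videoCompressionSettings above (roughly 700 kbps, a keyframe at most every 48 frames), so the caller's outputSettings mainly serve validation and the "EncodingLiveVideo" flag. For reference, a hedged sketch of the kind of caller-side dictionary this method expects, assuming the four-argument initializer that forwards it here and reusing movieURL from the earlier sketch; the bit rate and profile values are illustrative.

// Hedged sketch: caller-supplied output settings with explicit compression properties.
NSMutableDictionary *videoSettings = [@{
    AVVideoCodecKey  : AVVideoCodecH264,
    AVVideoWidthKey  : @(720),
    AVVideoHeightKey : @(1280),
    AVVideoCompressionPropertiesKey : @{
        AVVideoAverageBitRateKey      : @(2500000),
        AVVideoMaxKeyFrameIntervalKey : @(48),
        AVVideoProfileLevelKey        : AVVideoProfileLevelH264HighAutoLevel
    }
} mutableCopy];

GPUImageMovieWriter *writer =
    [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL
                                             size:CGSizeMake(720.0, 1280.0)
                                         fileType:AVFileTypeQuickTimeMovie
                                   outputSettings:videoSettings];
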
// Process incoming audio sample buffers
- (void)processAudioBuffer:(CMSampleBufferRef)audioBuffer;
{
    if (!isRecording)
    {
        return;
    }

    @autoreleasepool {
        //    if (_hasAudioTrack && CMTIME_IS_VALID(startTime))
        // Only process when there is an audio track
        if (_hasAudioTrack)
        {
            CFRetain(audioBuffer);

            // The current presentation timestamp
            CMTime currentSampleTime = CMSampleBufferGetOutputPresentationTimeStamp(audioBuffer);

            //        if (CMTIME_IS_INVALID(startTime))
            //        {
            //            runSynchronouslyOnContextQueue(_movieWriterContext, ^{
            //                if ((audioInputReadyCallback == NULL) && (assetWriter.status != AVAssetWriterStatusWriting))
            //                {
            //                    [assetWriter startWriting];
            //                }
            //                [assetWriter startSessionAtSourceTime:currentSampleTime];
            //                startTime = currentSampleTime;
            //            });
            //        }

            // The audio input is not ready to accept more media data
            if (!assetWriterAudioInput.readyForMoreMediaData && _encodingLiveVideo)
            {
                NSLog(@"1: Had to drop an audio frame: %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
                if (_shouldInvalidateAudioSampleWhenDone)
                {
                    CMSampleBufferInvalidate(audioBuffer);
                }
                CFRelease(audioBuffer);
                return;
            }

            previousAudioTime = currentSampleTime;

            //if the consumer wants to do something with the audio samples before writing, let him.
            // Hand the audio data to the processing callback
            if (self.audioProcessingCallback) {
                //need to introspect into the opaque CMBlockBuffer structure to find its raw sample buffers.
                CMBlockBufferRef buffer = CMSampleBufferGetDataBuffer(audioBuffer);
                CMItemCount numSamplesInBuffer = CMSampleBufferGetNumSamples(audioBuffer);
                AudioBufferList audioBufferList;

                CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(audioBuffer,
                                                                        NULL,
                                                                        &audioBufferList,
                                                                        sizeof(audioBufferList),
                                                                        NULL,
                                                                        NULL,
                                                                        kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
                                                                        &buffer
                                                                        );
                //passing a live pointer to the audio buffers, try to process them in-place or we might have syncing issues.
                for (int bufferCount=0; bufferCount < audioBufferList.mNumberBuffers; bufferCount++) {
                    SInt16 *samples = (SInt16 *)audioBufferList.mBuffers[bufferCount].mData;
                    self.audioProcessingCallback(&samples, numSamplesInBuffer);
                }
            }

            //        NSLog(@"Recorded audio sample time: %lld, %d, %lld", currentSampleTime.value, currentSampleTime.timescale, currentSampleTime.epoch);
            // Block that writes the audio buffer
            void(^write)() = ^() {
                //            while( ! assetWriterAudioInput.readyForMoreMediaData && ! _encodingLiveVideo && ! audioEncodingIsFinished ) {
                //                NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.5];
                //                NSLog(@"audio waiting...");
                //                [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
                //            }
                if (!assetWriterAudioInput.readyForMoreMediaData)
                {
                    NSLog(@"2: Had to drop an audio frame %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
                }
                else if(assetWriter.status == AVAssetWriterStatusWriting)
                {
                    if (![assetWriterAudioInput appendSampleBuffer:audioBuffer])
                        NSLog(@"Problem appending audio buffer at time: %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
                }
                else
                {
                    //NSLog(@"Wrote an audio frame %@", CFBridgingRelease(CMTimeCopyDescription(kCFAllocatorDefault, currentSampleTime)));
                }
                // Invalidate the sample buffer so it is no longer used
                if (_shouldInvalidateAudioSampleWhenDone)
                {
                    CMSampleBufferInvalidate(audioBuffer);
                }
                CFRelease(audioBuffer);
            };
            //        runAsynchronouslyOnContextQueue(_movieWriterContext, write);
            if( _encodingLiveVideo )
            {
                runAsynchronouslyOnContextQueue(_movieWriterContext, write);
            }
            else
            {
                // Otherwise, write directly on the current queue
                write();
            }
        }
    }
}
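
The loop above hands a live pointer to the raw 16-bit samples to audioProcessingCallback, so the callback must work in place. Below is a hedged sketch of installing such a callback, assuming the block signature implied by the call above; movieWriter is the writer instance from the earlier sketch and the attenuation is purely illustrative.

// Hedged sketch: in-place processing of the PCM samples before they are written.
movieWriter.audioProcessingCallback = ^(SInt16 **samplesRef, CMItemCount numSamplesInBuffer) {
    SInt16 *samples = *samplesRef;
    for (CMItemCount i = 0; i < numSamplesInBuffer; i++) {
        samples[i] = samples[i] / 2;   // crude -6 dB attenuation
    }
};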