Reversing Video on iOS
Reversing a video means playing it back from the end to the beginning. This only applies to the picture: reversed audio is just noise and carries no meaning, so the audio track is dropped during reversal.
Implementing the Reversal
H.264 video is normally decoded from front to back, because the stream is built from I-, P-, and B-frames: decoding a P-frame depends on the nearest preceding I-frame or the previous P-frame, and decoding a B-frame depends on data both before and after it. The decoder therefore cannot simply run backwards; the only option is to cut the video into sufficiently small segments and process each segment on its own.

The approach is this: seek to the I-frame that opens the n-th GOP from the end, decode every frame from that point to the end of the span, and store the decoded images in an array. The segment size is dictated by the size of the decoded data: if a segment decodes into too much data, memory usage balloons and the app gets killed. I split the video into one-second segments, walk through them from last to first, and re-encode each segment's decoded frames in reverse order. AVFoundation makes this straightforward to implement. A short sketch of the segment walk comes first, followed by the full utility, which is also available on GitHub:
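To make the slicing concrete, here is a minimal, self-contained sketch of the backwards segment walk in isolation. It mirrors the arithmetic used in startProcessing further down; the function name is illustrative and the code only logs the computed ranges:

#import <AVFoundation/AVFoundation.h>

// Illustrative only: log the one-second segment ranges, last segment first.
static void LogReversedSegments(AVAsset *asset) {
    CMTime duration = asset.duration;
    CMTime segDuration = CMTimeMake(1, 1); // one-second slices
    int n = (int)(CMTimeGetSeconds(duration) / CMTimeGetSeconds(segDuration)) + 1;
    for (int i = 1; i < n; i++) {
        CMTime offset = CMTimeMultiply(segDuration, i);
        if (CMTimeCompare(offset, duration) > 0) {
            break;
        }
        // Segment i spans [duration - i*segDuration, duration - (i-1)*segDuration):
        // it is decoded forwards, then its frames are appended in reverse.
        CMTime start = CMTimeSubtract(duration, offset);
        CMTimeRangeShow(CMTimeRangeMake(start, segDuration));
    }
}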
// SJReverseUtility.h
// playback
//
// Created by Lightning on 2018/7/12.
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
typedef void(^ReverseCallBack)(AVAssetWriterStatus status, float progress, NSError *error);
@interface SJReverseUtility : NSObject
- (instancetype)initWithAsset:(AVAsset *)asset outputPath:(NSString *)path;
- (void)startProcessing;
- (void)cancelProcessing;
@property (nonatomic, copy) ReverseCallBack callBack;
@property (nonatomic, assign) CMTimeRange timeRange;
@end
//
// SJReverseUtility.m
// playback
//
// Created by Lightning on 2018/7/12.
#import "SJReverseUtility.h"
@interface SJReverseUtility()
@property (nonatomic, strong) NSMutableArray *samples;
@property (nonatomic, strong) AVAsset *asset;
@property (nonatomic, strong) NSMutableArray *tracks;
@property (nonatomic, strong) AVMutableComposition *composition;
@property (nonatomic, strong) AVAssetWriter *writer;
@property (nonatomic, strong) AVAssetWriterInput *writerInput;
@property (nonatomic, strong) AVAssetWriterInputPixelBufferAdaptor *writerAdaptor;
@property (nonatomic, assign) NSUInteger frame_count;
@property (nonatomic, strong) AVMutableCompositionTrack *compositionTrack;
@property (nonatomic, assign) CMTime offsetTime;
@property (nonatomic, assign) CMTime intervalTime;
@property (nonatomic, assign) CMTime segDuration;
@property (nonatomic, assign) BOOL shouldStop;
@property (nonatomic, copy) NSString *path;
@end
@implementation SJReverseUtility
- (instancetype)initWithAsset:(AVAsset *)asset outputPath:(NSString *)path
{
self = [super init];
if (self) {
_asset = asset;
_composition = [AVMutableComposition composition];
AVMutableCompositionTrack *ctrack = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
_compositionTrack = ctrack;
_timeRange = kCMTimeRangeInvalid;
_frame_count = 0;
_offsetTime = kCMTimeZero;
_intervalTime = kCMTimeZero;
[self setupWriterWithPath:path];
}
return self;
}
- (void)cancelProcessing
{
self.shouldStop = YES;
}
- (void)startProcessing
{
if (CMTIMERANGE_IS_INVALID(_timeRange)) {
_timeRange = CMTimeRangeMake(kCMTimeZero, _asset.duration);
}
CMTime duration = _asset.duration;
CMTime segDuration = CMTimeMake(1, 1);
self.segDuration = segDuration;
NSArray *videoTracks = [_asset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *track = videoTracks.firstObject; // assumes the asset contains video
// Must be set before startWriting; carries over the source orientation.
self.writerInput.transform = track.preferredTransform;
[self.writer startWriting];
[self.writer startSessionAtSourceTime:kCMTimeZero]; //start processing
// Divide the video into n segments. _timeRange was validated above, so
// derive the segment count and the overall end time directly from it.
int n = (int)(CMTimeGetSeconds(_timeRange.duration)/CMTimeGetSeconds(segDuration)) + 1;
duration = CMTimeAdd(_timeRange.start, _timeRange.duration);
__weak typeof(self) weakSelf = self;
for (int i = 1; i < n; i++) {
CMTime offset = CMTimeMultiply(segDuration, i);
if (CMTimeCompare(offset, duration) > 0) {
break;
}
CMTime start = CMTimeSubtract(duration, offset);
if (CMTimeCompare(start, _timeRange.start) < 0) {
// Final, possibly shorter segment: clamp to the start of the requested
// range and shrink the duration to whatever is left.
start = _timeRange.start;
segDuration = CMTimeSubtract(CMTimeSubtract(duration, CMTimeMultiply(segDuration, i - 1)), _timeRange.start);
}
self.compositionTrack = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[self.compositionTrack insertTimeRange:CMTimeRangeMake(start, segDuration) ofTrack:track atTime:kCMTimeZero error:nil];
[self generateSamplesWithAsset:_composition];
[self encodeSampleBuffer];
if (self.shouldStop) {
[self.writer cancelWriting];
if ([[NSFileManager defaultManager] fileExistsAtPath:_path]) {
[[NSFileManager defaultManager] removeItemAtPath:_path error:nil];
}
if (self.callBack) {
self.callBack(self.writer.status, -1, self.writer.error);
}
return;
}
[self.compositionTrack removeTimeRange:CMTimeRangeMake(start, segDuration)];
if (self.callBack) {
self.callBack(self.writer.status, (float)i/n, self.writer.error);
}
}
[self.writer finishWritingWithCompletionHandler:^{
if (weakSelf.callBack) {
weakSelf.callBack(weakSelf.writer.status, 1.0f, weakSelf.writer.error);
}
}];
}
- (void)setupWriterWithPath:(NSString *)path
{
NSURL *outputURL = [NSURL fileURLWithPath:path];
AVAssetTrack *videoTrack = [[_asset tracksWithMediaType:AVMediaTypeVideo] lastObject];
// Initialize the writer
self.writer = [[AVAssetWriter alloc] initWithURL:outputURL
fileType:AVFileTypeMPEG4
error:nil];
NSDictionary *videoCompressionProps = @{
AVVideoAverageBitRateKey : @(videoTrack.estimatedDataRate),
};
int width = (int)videoTrack.naturalSize.width;
int height = (int)videoTrack.naturalSize.height;
NSDictionary *writerOutputSettings = @{
AVVideoCodecKey : AVVideoCodecH264,
AVVideoWidthKey : @(width),
AVVideoHeightKey : @(height),
AVVideoCompressionPropertiesKey : videoCompressionProps,
};
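// Note (added, not from the original code): estimatedDataRate can be 0 for
// some tracks, e.g. tracks pulled out of an AVComposition. AVAssetWriter may
// reject a zero average bitrate, so falling back to a computed value (say,
// width * height * 4) is safer whenever estimatedDataRate reports 0.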
AVAssetWriterInput *writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
outputSettings:writerOutputSettings
sourceFormatHint:(__bridge CMFormatDescriptionRef)[videoTrack.formatDescriptions lastObject]];
[writerInput setExpectsMediaDataInRealTime:NO];
self.writerInput = writerInput;
self.writerAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:writerInput sourcePixelBufferAttributes:nil];
[self.writer addInput:self.writerInput];
}
- (void)generateSamplesWithAsset:(AVAsset *)asset
{
// Initialize the reader
AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:nil];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] lastObject];
NSDictionary *readerOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange], kCVPixelBufferPixelFormatTypeKey, nil];
AVAssetReaderTrackOutput* readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
outputSettings:readerOutputSettings];
[reader addOutput:readerOutput];
[reader startReading];
// read in the samples
_samples = [[NSMutableArray alloc] init];
CMSampleBufferRef sample;
while ((sample = [readerOutput copyNextSampleBuffer])) {
[_samples addObject:(__bridge id)sample];
CFRelease(sample); // the array retains the buffer; balance the copy
}
NSLog(@"sample count = %lu", (unsigned long)_samples.count);
if (_samples.count > 0 ) {
self.intervalTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(self.segDuration)/(float)(_samples.count), _asset.duration.timescale);
}
}
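// Rough memory math (added note): each decoded NV12 (420f) frame occupies
// about width * height * 1.5 bytes, so one second of 30 fps 1080p footage
// keeps roughly 1920 * 1080 * 1.5 * 30 ≈ 93 MB alive in _samples. This is
// why the segments are kept to about one second.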
- (void)encodeSampleBuffer
{
for (NSInteger i = 0; i < _samples.count; i++) {
// Reversed frames are spaced evenly: each timestamp is the previous one
// plus intervalTime; the source frame's own PTS is deliberately ignored.
CMTime presentationTime = CMTimeAdd(_offsetTime, self.intervalTime);
size_t index = _samples.count - i - 1;
if (0 == _frame_count && _samples.count > 1) {
presentationTime = kCMTimeZero;
index = _samples.count - i - 2; // the first reversed frame tends to be black, so drop it
}
CMTimeShow(presentationTime);
CVPixelBufferRef imageBufferRef = CMSampleBufferGetImageBuffer((__bridge CMSampleBufferRef)_samples[index]);
// Crude back-pressure: spin until the writer input can take more data.
while (!_writerInput.readyForMoreMediaData) {
[NSThread sleepForTimeInterval:0.1];
}
_offsetTime = presentationTime;
BOOL success = [self.writerAdaptor appendPixelBuffer:imageBufferRef withPresentationTime:presentationTime];
_frame_count++;
if (!success) {
NSLog(@"writer status = %ld", (long)self.writer.status);
NSLog(@"writer error = %@", self.writer.error);
}
}
}
@end
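For completeness, here is a minimal sketch of how a caller might drive the utility. The input file name, output location, and callback body are all illustrative, not part of the original post:

NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"demo" withExtension:@"mp4"]; // hypothetical asset
AVAsset *asset = [AVAsset assetWithURL:inputURL];
NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"reversed.mp4"];
// AVAssetWriter fails to start if the target file already exists.
[[NSFileManager defaultManager] removeItemAtPath:outputPath error:NULL];

SJReverseUtility *utility = [[SJReverseUtility alloc] initWithAsset:asset outputPath:outputPath];
utility.callBack = ^(AVAssetWriterStatus status, float progress, NSError *error) {
    if (error) {
        NSLog(@"reverse failed: %@", error);
    } else if (progress >= 1.0f) {
        NSLog(@"reversed file written to %@", outputPath);
    }
};
// startProcessing runs its loop synchronously, so keep it off the main thread.
dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
    [utility startProcessing];
});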
On iOS, this code can reverse a video of any length. The per-frame timestamps, though, still leave room for improvement: every reversed frame is placed at a fixed interval, regardless of how the source frames were actually spaced.
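One possible refinement (my suggestion, not something the original code does) is to reuse the source frames' own durations in reverse order instead of the uniform intervalTime, so variable-frame-rate footage keeps its original cadence. A sketch of the idea, written against the same ivars as encodeSampleBuffer:

// Collect the original presentation timestamps of the segment's frames.
NSUInteger count = _samples.count;
NSMutableArray<NSValue *> *ptsList = [NSMutableArray arrayWithCapacity:count];
for (NSUInteger i = 0; i < count; i++) {
    CMTime pts = CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)_samples[i]);
    [ptsList addObject:[NSValue valueWithCMTime:pts]];
}
CMTime t = _offsetTime;
for (NSUInteger i = 0; i < count; i++) {
    NSUInteger index = count - i - 1; // write frames back to front
    // A reversed frame lasts as long as the gap between it and its successor
    // in the original timeline; fall back to intervalTime at the edge.
    CMTime frameDuration = (index + 1 < count)
        ? CMTimeSubtract([ptsList[index + 1] CMTimeValue], [ptsList[index] CMTimeValue])
        : self.intervalTime;
    CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer((__bridge CMSampleBufferRef)_samples[index]);
    while (!_writerInput.readyForMoreMediaData) {
        [NSThread sleepForTimeInterval:0.1];
    }
    [self.writerAdaptor appendPixelBuffer:buffer withPresentationTime:t];
    t = CMTimeAdd(t, frameDuration);
}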