[iOS Notes] Adding Animated Watermarks to Video: An Overview
yuyutoo · 2025-01-12 19:56
Overview:
This article walks through several ways to add an animated watermark to a video, along with the implementation code:
Compositing with AVFoundation + CoreAnimation
Since Lottie is itself built on CoreAnimation, compositing with AVFoundation + Lottie works the same way
A frame-sequence or GIF resource can likewise drive a CAKeyframeAnimation, so we also cover AVFoundation + GIF compositing
Using GPUImageUIElement to blend a frame sequence onto the target video
Using GPUImage to blend a watermark video onto the target video
If you have questions, or any comments or suggestions about what follows, feel free to leave a reply at the end of the article.
Result GIFs after processing:
Original video.gif
CoreAnimation.gif
Lottie.gif
GIF.gif
GPUImageType1.gif
GPUImageType2.gif
1. Compositing with AVFoundation + CoreAnimation
#pragma mark CoreAnimation
+ (void)addWaterMarkTypeWithCorAnimationAndInputVideoURL:(NSURL *)InputURL WithCompletionHandler:(void (^)(NSURL *outPutURL, int code))handler {
    // Load the source asset with precise duration/timing so insertTimeRange is accurate
    NSDictionary *opts = [NSDictionary dictionaryWithObject:@(YES) forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    AVAsset *videoAsset = [AVURLAsset URLAssetWithURL:InputURL options:opts];
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                         preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *errorVideo = nil;
    AVAssetTrack *assetVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    CMTime endTime = assetVideoTrack.asset.duration;
    BOOL bl = [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetVideoTrack.asset.duration)
                                  ofTrack:assetVideoTrack
                                   atTime:kCMTimeZero
                                    error:&errorVideo];
    videoTrack.preferredTransform = assetVideoTrack.preferredTransform;
    NSLog(@"errorVideo:%ld %d", (long)errorVideo.code, bl);

    // Build an output URL in the Documents directory, named with a timestamp
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    formatter.dateFormat = @"yyyyMMddHHmmss";
    NSString *outPutFileName = [formatter stringFromDate:[NSDate dateWithTimeIntervalSinceNow:0]];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mov", outPutFileName]];
    NSURL *outPutVideoUrl = [NSURL fileURLWithPath:myPathDocs];

    CGSize videoSize = [videoTrack naturalSize];
    // Five CATextLayers spelling "HELLO", laid out left to right
    UIFont *font = [UIFont systemFontOfSize:60.0];
    CATextLayer *aLayer = [[CATextLayer alloc] init];
    [aLayer setFontSize:60];
    [aLayer setString:@"H"];
    [aLayer setAlignmentMode:kCAAlignmentCenter];
    [aLayer setForegroundColor:[[UIColor greenColor] CGColor]];
    [aLayer setBackgroundColor:[UIColor clearColor].CGColor];
    CGSize textSize = [@"H" sizeWithAttributes:[NSDictionary dictionaryWithObjectsAndKeys:font, NSFontAttributeName, nil]];
    [aLayer setFrame:CGRectMake(240, 470, textSize.width, textSize.height)];
    aLayer.anchorPoint = CGPointMake(0.5, 1.0);

    CATextLayer *bLayer = [[CATextLayer alloc] init];
    [bLayer setFontSize:60];
    [bLayer setString:@"E"];
    [bLayer setAlignmentMode:kCAAlignmentCenter];
    [bLayer setForegroundColor:[[UIColor greenColor] CGColor]];
    [bLayer setBackgroundColor:[UIColor clearColor].CGColor];
    CGSize textSizeb = [@"E" sizeWithAttributes:[NSDictionary dictionaryWithObjectsAndKeys:font, NSFontAttributeName, nil]];
    [bLayer setFrame:CGRectMake(240 + textSize.width, 470, textSizeb.width, textSizeb.height)];
    bLayer.anchorPoint = CGPointMake(0.5, 1.0);

    CATextLayer *cLayer = [[CATextLayer alloc] init];
    [cLayer setFontSize:60];
    [cLayer setString:@"L"];
    [cLayer setAlignmentMode:kCAAlignmentCenter];
    [cLayer setForegroundColor:[[UIColor greenColor] CGColor]];
    [cLayer setBackgroundColor:[UIColor clearColor].CGColor];
    CGSize textSizec = [@"L" sizeWithAttributes:[NSDictionary dictionaryWithObjectsAndKeys:font, NSFontAttributeName, nil]];
    [cLayer setFrame:CGRectMake(240 + textSizeb.width + textSize.width, 470, textSizec.width, textSizec.height)];
    cLayer.anchorPoint = CGPointMake(0.5, 1.0);

    CATextLayer *dLayer = [[CATextLayer alloc] init];
    [dLayer setFontSize:60];
    [dLayer setString:@"L"];
    [dLayer setAlignmentMode:kCAAlignmentCenter];
    [dLayer setForegroundColor:[[UIColor greenColor] CGColor]];
    [dLayer setBackgroundColor:[UIColor clearColor].CGColor];
    CGSize textSized = [@"L" sizeWithAttributes:[NSDictionary dictionaryWithObjectsAndKeys:font, NSFontAttributeName, nil]];
    [dLayer setFrame:CGRectMake(240 + textSizec.width + textSizeb.width + textSize.width, 470, textSized.width, textSized.height)];
    dLayer.anchorPoint = CGPointMake(0.5, 1.0);

    CATextLayer *eLayer = [[CATextLayer alloc] init];
    [eLayer setFontSize:60];
    [eLayer setString:@"O"];
    [eLayer setAlignmentMode:kCAAlignmentCenter];
    [eLayer setForegroundColor:[[UIColor greenColor] CGColor]];
    [eLayer setBackgroundColor:[UIColor clearColor].CGColor];
    CGSize textSizede = [@"O" sizeWithAttributes:[NSDictionary dictionaryWithObjectsAndKeys:font, NSFontAttributeName, nil]];
    [eLayer setFrame:CGRectMake(240 + textSized.width + textSizec.width + textSizeb.width + textSize.width, 470, textSizede.width, textSizede.height)];
    eLayer.anchorPoint = CGPointMake(0.5, 1.0);

    // Repeating scale animation applied to every letter layer.
    // For AVVideoCompositionCoreAnimationTool the beginTime must be AVCoreAnimationBeginTimeAtZero,
    // because a plain 0 would be remapped to CACurrentMediaTime() when the animation is added.
    CABasicAnimation *basicAni = [CABasicAnimation animationWithKeyPath:@"transform.scale"];
    basicAni.fromValue = @(0.2f);
    basicAni.toValue = @(1.0f);
    basicAni.beginTime = AVCoreAnimationBeginTimeAtZero;
    basicAni.duration = 2.0f;
    basicAni.repeatCount = HUGE_VALF;
    basicAni.removedOnCompletion = NO;
    basicAni.fillMode = kCAFillModeForwards;
    [aLayer addAnimation:basicAni forKey:nil];
    [bLayer addAnimation:basicAni forKey:nil];
    [cLayer addAnimation:basicAni forKey:nil];
    [dLayer addAnimation:basicAni forKey:nil];
    [eLayer addAnimation:basicAni forKey:nil];

    // parentLayer holds the video layer plus all watermark layers
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
    videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:aLayer];
    [parentLayer addSublayer:bLayer];
    [parentLayer addSublayer:cLayer];
    [parentLayer addSublayer:dLayer];
    [parentLayer addSublayer:eLayer];

    AVMutableVideoComposition *videoComp = [AVMutableVideoComposition videoComposition];
    videoComp.renderSize = videoSize;
    parentLayer.geometryFlipped = YES;
    videoComp.frameDuration = CMTimeMake(1, 30);
    videoComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, endTime);
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    instruction.layerInstructions = [NSArray arrayWithObjects:layerInstruction, nil];
    videoComp.instructions = [NSArray arrayWithObject:instruction];

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outPutVideoUrl;
    exporter.outputFileType = AVFileTypeMPEG4;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = videoComp;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            // Do whatever you need with the exported video here
            NSLog(@"Output video path: %@ error: %@", myPathDocs, exporter.error);
            handler(outPutVideoUrl, (int)exporter.error.code);
        });
    }];
}
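For context, a minimal call site might look like the following sketch. The WatermarkEngine class name, the sample file name, and the success check are assumptions on my part (the later snippets in this article do call sibling methods on WatermarkEngine):

// Hypothetical caller; "demo.mp4" is a placeholder bundle resource
NSURL *inputURL = [[NSBundle mainBundle] URLForResource:@"demo" withExtension:@"mp4"];
[WatermarkEngine addWaterMarkTypeWithCorAnimationAndInputVideoURL:inputURL
                                            WithCompletionHandler:^(NSURL *outPutURL, int code) {
    if (code == 0) {
        NSLog(@"Watermarked video written to %@", outPutURL);
    } else {
        NSLog(@"Export failed with code %d", code);
    }
}];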
2. Compositing with AVFoundation + Lottie (Lottie is built on CoreAnimation under the hood)
The only parts that differ from the first snippet are below. Because LOTAnimationView renders through Core Animation, its backing layer can be added straight into the parentLayer; the rest of the composition and export code is unchanged.
LOTAnimationView* animation = [LOTAnimationView animationNamed:@"青蛙"];
animation.frame = CGRectMake(150, 340, 240, 240);
animation.animationSpeed = 5.0;
animation.loopAnimation = YES;
[animation play];
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:animation.layer];
3. Compositing with AVFoundation + GIF (building a CAKeyframeAnimation from a frame sequence or GIF resource)
The difference from the first snippet is that the GIF is converted into a CAKeyframeAnimation on a layer:
CALayer *gifLayer1 = [[CALayer alloc] init];
gifLayer1.frame = CGRectMake(150, 340, 298, 253);
CAKeyframeAnimation *gifLayer1Animation = [WatermarkEngine animationForGifWithURL:[[NSBundle mainBundle] URLForResource:@"雪人完成_1" withExtension:@"gif"]];
gifLayer1Animation.beginTime = AVCoreAnimationBeginTimeAtZero;
gifLayer1Animation.removedOnCompletion = NO;
[gifLayer1 addAnimation:gifLayer1Animation forKey:@"gif"];
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:gifLayer1];
+ (CAKeyframeAnimation *)animationForGifWithURL:(NSURL *)url {
    CAKeyframeAnimation *animation = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
    NSMutableArray *frames = [NSMutableArray new];
    NSMutableArray *delayTimes = [NSMutableArray new];
    CGFloat totalTime = 0.0;
    CGFloat gifWidth;
    CGFloat gifHeight;
    CGImageSourceRef gifSource = CGImageSourceCreateWithURL((CFURLRef)url, NULL);
    // get frame count
    size_t frameCount = CGImageSourceGetCount(gifSource);
    for (size_t i = 0; i < frameCount; ++i) {
        // get each frame
        CGImageRef frame = CGImageSourceCreateImageAtIndex(gifSource, i, NULL);
        [frames addObject:(__bridge id)frame];
        CGImageRelease(frame);
        // get gif info for each frame
        NSDictionary *dict = (NSDictionary *)CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(gifSource, i, NULL));
        NSLog(@"kCGImagePropertyGIFDictionary %@", [dict valueForKey:(NSString *)kCGImagePropertyGIFDictionary]);
        // get gif size
        gifWidth = [[dict valueForKey:(NSString *)kCGImagePropertyPixelWidth] floatValue];
        gifHeight = [[dict valueForKey:(NSString *)kCGImagePropertyPixelHeight] floatValue];
        // In kCGImagePropertyGIFDictionary, kCGImagePropertyGIFDelayTime and kCGImagePropertyGIFUnclampedDelayTime carry the same value
        NSDictionary *gifDict = [dict valueForKey:(NSString *)kCGImagePropertyGIFDictionary];
        [delayTimes addObject:[gifDict valueForKey:(NSString *)kCGImagePropertyGIFUnclampedDelayTime]];
        totalTime = totalTime + [[gifDict valueForKey:(NSString *)kCGImagePropertyGIFUnclampedDelayTime] floatValue];
        // CFRelease((__bridge CFTypeRef)(dict));
    }
    if (gifSource) {
        CFRelease(gifSource);
    }
    // Convert the cumulative frame delays into normalized keyTimes
    NSMutableArray *times = [NSMutableArray arrayWithCapacity:3];
    CGFloat currentTime = 0;
    NSInteger count = delayTimes.count;
    for (int i = 0; i < count; ++i) {
        [times addObject:[NSNumber numberWithFloat:(currentTime / totalTime)]];
        currentTime += [[delayTimes objectAtIndex:i] floatValue];
    }
    NSMutableArray *images = [NSMutableArray arrayWithCapacity:3];
    for (int i = 0; i < count; ++i) {
        [images addObject:[frames objectAtIndex:i]];
    }
    animation.keyTimes = times;
    animation.values = images;
    animation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear];
    animation.duration = totalTime;
    animation.repeatCount = HUGE_VALF;
    return animation;
}
4. Using GPUImage to blend a watermark video onto the target video
#pragma mark GPUImage TWO VIDEO INPUT
+ (void)addWaterMarkTypeWithGPUImageAndInputVideoURL:(NSURL *)InputURL AndWaterMarkVideoURL:(NSURL *)InputURL2 WithCompletionHandler:(void (^)(NSURL *outPutURL, int code))handler {
    // Build an output URL in the Documents directory, named with a timestamp
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    formatter.dateFormat = @"yyyyMMddHHmmss";
    NSString *outPutFileName = [formatter stringFromDate:[NSDate dateWithTimeIntervalSinceNow:0]];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mov", outPutFileName]];
    NSURL *outPutVideoUrl = [NSURL fileURLWithPath:myPathDocs];

    // Feed both videos into a screen-blend filter, then record the blended result
    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:InputURL];
    GPUImageMovie *movieFile2 = [[GPUImageMovie alloc] initWithURL:InputURL2];
    GPUImageScreenBlendFilter *filter = [[GPUImageScreenBlendFilter alloc] init];
    [movieFile addTarget:filter];
    [movieFile2 addTarget:filter];
    GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:outPutVideoUrl
                                                                                size:CGSizeMake(540, 960)
                                                                            fileType:AVFileTypeQuickTimeMovie
                                                                      outputSettings:@{
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: @540,   // set your output width here
        AVVideoHeightKey: @960,  // set your output height here
        AVVideoCompressionPropertiesKey: @{
            // 2000*1000; a range of roughly 800*1000 to 5000*1000 is recommended
            // AVVideoAverageBitRateKey: @2500000, // use a lower bitrate for a smaller file
            AVVideoAverageBitRateKey: @5000000,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264HighAutoLevel,
            AVVideoAverageNonDroppableFrameRateKey: @30,
        },
    }];
    [filter addTarget:movieWriter];

    AVAsset *videoAsset = [AVAsset assetWithURL:InputURL];
    AVAssetTrack *assetVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    movieWriter.transform = assetVideoTrack.preferredTransform;
    // [movie enableSynchronizedEncodingUsingMovieWriter:movieWriter];
    [movieWriter startRecording];
    [movieFile startProcessing];
    [movieFile2 startProcessing];
    [movieWriter setCompletionBlock:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"movieWriter Completion");
            handler(outPutVideoUrl, 1);
        });
    }];
}
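A minimal call might look like the sketch below; both file names are placeholders. One design note: GPUImageScreenBlendFilter screen-blends the two frames, so black areas of the watermark clip leave the base video essentially unchanged, which is why watermark clips authored on a black background work best with this approach.

// Hypothetical caller; "demo.mp4" and "watermark.mp4" are placeholder resources
NSURL *baseURL = [[NSBundle mainBundle] URLForResource:@"demo" withExtension:@"mp4"];
NSURL *watermarkURL = [[NSBundle mainBundle] URLForResource:@"watermark" withExtension:@"mp4"];
[WatermarkEngine addWaterMarkTypeWithGPUImageAndInputVideoURL:baseURL
                                         AndWaterMarkVideoURL:watermarkURL
                                        WithCompletionHandler:^(NSURL *outPutURL, int code) {
    NSLog(@"Blended video written to %@", outPutURL);
}];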
5. Using GPUImageUIElement to blend a frame sequence onto the target video
#pragma mark GPUImageUIElement
+ (void)addWaterMarkTypeWithGPUImageUIElementAndInputVideoURL:(NSURL *)InputURL WithCompletionHandler:(void (^)(NSURL *outPutURL, int code))handler {
    // Build an output URL in the Documents directory, named with a timestamp
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    formatter.dateFormat = @"yyyyMMddHHmmss";
    NSString *outPutFileName = [formatter stringFromDate:[NSDate dateWithTimeIntervalSinceNow:0]];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mov", outPutFileName]];
    NSURL *outPutVideoUrl = [NSURL fileURLWithPath:myPathDocs];

    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:InputURL];
    // Frame and transform for the watermark image view, centered on screen
    NSValue *value = [NSValue valueWithCGRect:CGRectMake([UIScreen mainScreen].bounds.size.width / 2.0 - (332 / 2.0),
                                                         [UIScreen mainScreen].bounds.size.height / 2.0 - (297 / 2.0),
                                                         332, 297)];
    NSValue *value2 = [NSValue valueWithCGAffineTransform:CGAffineTransformMake(1, 0, 0, 1, 0, 0)];
    UIView *view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 1, 1)];
    GPUImageFilterGroup *filter = [WatermarkEngine addWatermarkWithResourcesNames:@[@"雨天青蛙"]
                                                                        Andframes:@[value]
                                                                     AndTransform:@[value2]
                                                                    AndLabelViews:@[view]];
    [movieFile addTarget:filter];
    GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:outPutVideoUrl
                                                                                size:CGSizeMake(540, 960)
                                                                            fileType:AVFileTypeQuickTimeMovie
                                                                      outputSettings:@{
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: @540,   // set your output width here
        AVVideoHeightKey: @960,  // set your output height here
        AVVideoCompressionPropertiesKey: @{
            // 2000*1000; a range of roughly 800*1000 to 5000*1000 is recommended
            // AVVideoAverageBitRateKey: @2500000, // use a lower bitrate for a smaller file
            AVVideoAverageBitRateKey: @5000000,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264HighAutoLevel,
            AVVideoAverageNonDroppableFrameRateKey: @30,
        },
    }];
    [filter addTarget:movieWriter];

    AVAsset *videoAsset = [AVAsset assetWithURL:InputURL];
    AVAssetTrack *assetVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    movieWriter.transform = assetVideoTrack.preferredTransform;
    // [movie enableSynchronizedEncodingUsingMovieWriter:movieWriter];
    [movieWriter startRecording];
    [movieFile startProcessing];
    [movieWriter setCompletionBlock:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"movieWriter Completion");
            handler(outPutVideoUrl, 1);
        });
    }];
}
+ (GPUImageFilterGroup *)addWatermarkWithResourcesNames:(NSArray *)resourcesNames Andframes:(NSArray *)frams AndTransform:(NSArray *)transforms AndLabelViews:(NSArray *)labelViews {
    __block int currentPicIndex = 0;
    CGFloat width = CGRectGetWidth([UIScreen mainScreen].bounds);
    UIView *temp = [[UIView alloc] initWithFrame:[UIScreen mainScreen].bounds];
    [temp setContentScaleFactor:[[UIScreen mainScreen] scale]];
    // One image view per watermark resource; the frame images are PNGs named "<resource>_<index>.png"
    __block UIImageView *waterImageView1 = [[UIImageView alloc] init];
    __block UIImageView *waterImageView2 = [[UIImageView alloc] init];
    __block UIImageView *waterImageView3 = [[UIImageView alloc] init];
    for (int index = 0; index < resourcesNames.count; index++) {
        if (index == 0) {
            waterImageView1.frame = [frams[index] CGRectValue];
            UIImage *tempImage = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@_%d", resourcesNames[index], currentPicIndex] ofType:@"png"]];
            waterImageView1.image = tempImage;
            waterImageView1.transform = [transforms[index] CGAffineTransformValue];
            [temp addSubview:waterImageView1];
            [temp addSubview:labelViews[index]];
        } else if (index == 1) {
            waterImageView2.frame = [frams[index] CGRectValue];
            UIImage *tempImage = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@_%d", resourcesNames[index], currentPicIndex] ofType:@"png"]];
            waterImageView2.image = tempImage;
            waterImageView2.transform = [transforms[index] CGAffineTransformValue];
            [temp addSubview:waterImageView2];
            [temp addSubview:labelViews[index]];
        } else {
            waterImageView3.frame = [frams[index] CGRectValue];
            UIImage *tempImage = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@_%d", resourcesNames[index], currentPicIndex] ofType:@"png"]];
            waterImageView3.image = tempImage;
            waterImageView3.transform = [transforms[index] CGAffineTransformValue];
            [temp addSubview:waterImageView3];
            [temp addSubview:labelViews[index]];
        }
    }

    // Chain: video frame -> filter -> blendFilter <- uiFilter <- uiElement (the watermark overlay)
    GPUImageFilterGroup *filterGroup = [[GPUImageFilterGroup alloc] init];
    GPUImageUIElement *uiElement = [[GPUImageUIElement alloc] initWithView:temp];
    GPUImageTwoInputFilter *blendFilter = [[GPUImageTwoInputFilter alloc] initWithFragmentShaderFromString:[WatermarkEngine loadShader:@"AlphaBlend_Normal" extension:@"frag"]];
    GPUImageFilter *filter = [[GPUImageFilter alloc] init];
    GPUImageFilter *uiFilter = [[GPUImageFilter alloc] init];
    [uiElement addTarget:uiFilter];
    // [uiFilter setInputRotation:kGPUImageRotateLeft atIndex:0];
    [filter addTarget:blendFilter];
    [uiFilter addTarget:blendFilter];
    [filterGroup addFilter:filter];
    [filterGroup addFilter:uiFilter];
    [filterGroup addFilter:blendFilter];
    [filterGroup setInitialFilters:@[filter]];
    [filterGroup setTerminalFilter:blendFilter];

    // __unsafe_unretained typeof(self) this = self;
    [filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
        // Advance the frame sequence once per processed video frame
        currentPicIndex += 1;
        for (int index = 0; index < resourcesNames.count; index++) {
            if (index == 0) {
                waterImageView1.image = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@_%d", resourcesNames[index], currentPicIndex] ofType:@"png"]];
            } else if (index == 1) {
                waterImageView2.image = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@_%d", resourcesNames[index], currentPicIndex] ofType:@"png"]];
            } else {
                waterImageView3.image = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@_%d", resourcesNames[index], currentPicIndex] ofType:@"png"]];
            }
        }
        // This sample sequence has 90 frames (indices 0-89), so wrap around
        if (currentPicIndex == 89) {
            currentPicIndex = 0;
        }
        [uiElement update];
    }];
    return filterGroup;
}
+ (NSString * _Nonnull)loadShader:(NSString *)name extension:(NSString *)extenstion {
    NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:extenstion];
    return [NSString stringWithContentsOfURL:url encoding:NSUTF8StringEncoding error:nil];
}
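The contents of AlphaBlend_Normal.frag are not shown in the article. As a rough sketch of what such a shader might contain (an assumption, not the author's actual file), a two-input normal alpha blend for GPUImageTwoInputFilter follows GPUImage's conventions (inputImageTexture is the first input, i.e. the video frame, and inputImageTexture2 the second, i.e. the UIElement overlay) and can also be written inline with GPUImage's SHADER_STRING macro instead of being loaded from a file:

// Hypothetical inline equivalent of AlphaBlend_Normal.frag:
// composite the overlay (second input) over the video frame (first input) using the overlay's alpha.
NSString *const kAlphaBlendNormalFragmentShader = SHADER_STRING
(
 varying highp vec2 textureCoordinate;
 varying highp vec2 textureCoordinate2;
 uniform sampler2D inputImageTexture;
 uniform sampler2D inputImageTexture2;
 void main()
 {
     lowp vec4 base = texture2D(inputImageTexture, textureCoordinate);
     lowp vec4 overlay = texture2D(inputImageTexture2, textureCoordinate2);
     gl_FragColor = vec4(mix(base.rgb, overlay.rgb, overlay.a), base.a);
 }
);

GPUImage's built-in GPUImageNormalBlendFilter performs essentially this blend, so it could be used in place of GPUImageTwoInputFilter plus a custom shader.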