iOS Swift – Merge .wav files and convert the result to .mp3

I want to merge two or more .wav files into one and then convert the result to .mp3, and I would like to do this in Swift (or at least have the option to include it in a Swift project).

Merging two .wav files in Swift is not a problem (here is my example). What I don't know is how to add the LAME library to a Swift project, and how to use it (how to adapt the Objective-C LAME sample code so it can be called from Swift).

Since I got stuck on the Swift side, I tried the LAME library from Objective-C. I found sample code that converts .caf to .mp3, so I tried it. This is what I have so far:

- (void)toMp3 {
    NSString *cafFilePath = [[NSBundle mainBundle] pathForResource:@"sound" ofType:@"caf"];

    NSString *mp3FileName = @"Mp3File";
    mp3FileName = [mp3FileName stringByAppendingString:@".mp3"];
    NSString *mp3FilePath = [[NSHomeDirectory() stringByAppendingFormat:@"/Documents/"] stringByAppendingPathComponent:mp3FileName];

    NSLog(@"%@", mp3FilePath);

    @try {
        int read, write;

        FILE *pcm = fopen([cafFilePath cStringUsingEncoding:1], "rb");  // source
        fseek(pcm, 4*1024, SEEK_CUR);                                   // skip file header
        FILE *mp3 = fopen([mp3FilePath cStringUsingEncoding:1], "wb");  // output

        const int PCM_SIZE = 8192;
        const int MP3_SIZE = 8192;
        short int pcm_buffer[PCM_SIZE*2];
        unsigned char mp3_buffer[MP3_SIZE];

        lame_t lame = lame_init();
        lame_set_in_samplerate(lame, 44100);
        lame_set_VBR(lame, vbr_default);
        lame_init_params(lame);

        do {
            read = fread(pcm_buffer, 2*sizeof(short int), PCM_SIZE, pcm);
            if (read == 0)
                write = lame_encode_flush(lame, mp3_buffer, MP3_SIZE);
            else
                write = lame_encode_buffer_interleaved(lame, pcm_buffer, read, mp3_buffer, MP3_SIZE);

            fwrite(mp3_buffer, write, 1, mp3);
        } while (read != 0);

        lame_close(lame);
        fclose(mp3);
        fclose(pcm);
    } @catch (NSException *exception) {
        NSLog(@"%@", [exception description]);
    } @finally {
        [self performSelectorOnMainThread:@selector(convertMp3Finish) withObject:nil waitUntilDone:YES];
    }
}

- (void)convertMp3Finish {
}

But the resulting .mp3 contains nothing but noise.

  • Here is the sample .caf file.
  • Here is the resulting .mp3 file.

So I need to solve three problems:

  • create a proper mp3 from the caf file in Objective-C
  • change the code so it works on wav files
  • and change it so I can use it from Swift

I know there are many questions about encoding and converting mp3 on iOS, but I can't find an example with Swift, and I can't find working Objective-C code either (only the code above). Thanks for any help.

I want to post my working solution, since I got so many upvotes here and naresh's answer didn't help much.

  1. I generated the lame.framework library from this project: https://github.com/wuqiong/mp3lame-for-iOS
  2. I added the library to my Swift project (Build Phases -> Link Binary With Libraries).
  3. I wrote an Objective-C wrapper around the C functions and use it from Swift through a bridging header.
  4. For concatenating the .wav files I use AVAssetExportSession in Swift.

And now the source code. First the wrapper: it's a class that converts a .wav file to .mp3. It could use plenty of improvements (for example, parameters for the output file and other options), but I think anyone can adapt it. It could probably be rewritten in Swift, but I wasn't sure how to do that, so it stays an Objective-C class:

 #import "AudioWrapper.h" #import "lame/lame.h" @implementation AudioWrapper + (void)convertFromWavToMp3:(NSString *)filePath { NSString *mp3FileName = @"Mp3File"; mp3FileName = [mp3FileName stringByAppendingString:@".mp3"]; NSString *mp3FilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:mp3FileName]; NSLog(@"%@", mp3FilePath); @try { int read, write; FILE *pcm = fopen([filePath cStringUsingEncoding:1], "rb"); //source fseek(pcm, 4*1024, SEEK_CUR); //skip file header FILE *mp3 = fopen([mp3FilePath cStringUsingEncoding:1], "wb"); //output const int PCM_SIZE = 8192; const int MP3_SIZE = 8192; short int pcm_buffer[PCM_SIZE*2]; unsigned char mp3_buffer[MP3_SIZE]; lame_t lame = lame_init(); lame_set_in_samplerate(lame, 44100); lame_set_VBR(lame, vbr_default); lame_init_params(lame); do { read = fread(pcm_buffer, 2*sizeof(short int), PCM_SIZE, pcm); if (read == 0) write = lame_encode_flush(lame, mp3_buffer, MP3_SIZE); else write = lame_encode_buffer_interleaved(lame, pcm_buffer, read, mp3_buffer, MP3_SIZE); fwrite(mp3_buffer, write, 1, mp3); } while (read != 0); lame_close(lame); fclose(mp3); fclose(pcm); } @catch (NSException *exception) { NSLog(@"%@",[exception description]); } @finally { [self performSelectorOnMainThread:@selector(convertMp3Finish) withObject:nil waitUntilDone:YES]; } } 

And the Swift AudioHelper class that concatenates the audio files, plus the method that calls the wrapper to convert the merged .wav file to .mp3:

import UIKit
import AVFoundation

protocol AudioHelperDelegate {
    func assetExportSessionDidFinishExport(session: AVAssetExportSession, outputUrl: NSURL)
}

class AudioHelper: NSObject {

    var delegate: AudioHelperDelegate?

    func concatenate(audioUrls: [NSURL]) {
        // Create an AVMutableComposition object. This object will hold our multiple AVMutableCompositionTrack.
        var composition = AVMutableComposition()
        var compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())

        // Create a new file to receive the data.
        var documentDirectoryURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask).first! as! NSURL
        var fileDestinationUrl = NSURL(fileURLWithPath: NSTemporaryDirectory().stringByAppendingPathComponent("resultmerge.wav"))
        println(fileDestinationUrl)

        StorageManager.sharedInstance.deleteFileAtPath(NSTemporaryDirectory().stringByAppendingPathComponent("resultmerge.wav"))

        var avAssets: [AVURLAsset] = []
        var assetTracks: [AVAssetTrack] = []
        var durations: [CMTime] = []
        var timeRanges: [CMTimeRange] = []

        var insertTime = kCMTimeZero

        for audioUrl in audioUrls {
            let avAsset = AVURLAsset(URL: audioUrl, options: nil)
            avAssets.append(avAsset)

            let assetTrack = avAsset.tracksWithMediaType(AVMediaTypeAudio)[0] as! AVAssetTrack
            assetTracks.append(assetTrack)

            let duration = assetTrack.timeRange.duration
            durations.append(duration)

            let timeRange = CMTimeRangeMake(kCMTimeZero, duration)
            timeRanges.append(timeRange)

            compositionAudioTrack.insertTimeRange(timeRange, ofTrack: assetTrack, atTime: insertTime, error: nil)
            insertTime = CMTimeAdd(insertTime, duration)
        }

        // AVAssetExportPresetPassthrough => concatenation
        var assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetPassthrough)
        assetExport.outputFileType = AVFileTypeWAVE
        assetExport.outputURL = fileDestinationUrl
        assetExport.exportAsynchronouslyWithCompletionHandler({
            self.delegate?.assetExportSessionDidFinishExport(assetExport, outputUrl: fileDestinationUrl!)
        })
    }

    func exportTempWavAsMp3() {
        let wavFilePath = NSTemporaryDirectory().stringByAppendingPathComponent("resultmerge.wav")
        AudioWrapper.convertFromWavToMp3(wavFilePath)
    }
}

The bridging header contains:

 #import "lame/lame.h" #import "AudioWrapper.h" 

We have dedicated classes for reading and writing media from/to a file, AVAssetReader and AVAssetWriter, and with the help of AVAssetReader you can export to an mp3 file. Otherwise you can use https://github.com/michaeltyson/TPAACAudioConverter
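
For what it's worth, here is a rough, untested sketch (in the same Swift 1.x-era syntax as the answer above) of how an AVAssetReader could be set up to pull interleaved 16-bit PCM out of an asset. Note that iOS itself does not provide an MP3 encoder, so in practice the samples read this way still have to go through an encoder such as LAME:

import AVFoundation

// Rough sketch only: read one audio track as interleaved 16-bit PCM.
func readPCMSamples(url: NSURL) {
    let asset = AVURLAsset(URL: url, options: nil)
    let track = asset.tracksWithMediaType(AVMediaTypeAudio)[0] as! AVAssetTrack

    var error: NSError?
    let reader = AVAssetReader(asset: asset, error: &error)

    // Ask for uncompressed, interleaved 16-bit PCM regardless of the source format.
    let settings: [NSObject: AnyObject] = [
        AVFormatIDKey: Int(kAudioFormatLinearPCM),
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsNonInterleaved: false
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.addOutput(output)
    reader.startReading()

    var sampleBuffer: CMSampleBuffer? = output.copyNextSampleBuffer()
    while sampleBuffer != nil {
        // Each CMSampleBuffer holds a chunk of PCM; copy the bytes out with
        // CMSampleBufferGetDataBuffer / CMBlockBufferCopyDataBytes and feed them
        // to an encoder such as lame_encode_buffer_interleaved().
        sampleBuffer = output.copyNextSampleBuffer()
    }
}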