How to use FFmpeg in iOS

FFmpeg, a versatile multimedia processing library, is a valuable tool for developers building iOS applications that involve audio and video manipulation. While iOS provides its own frameworks for media handling, incorporating FFmpeg into an iOS project unlocks advanced functionality and broader codec support. In this guide, we will walk through the steps to integrate FFmpeg into an iOS project and demonstrate some basic usage examples.

  1. Setting Up the Development Environment:
    Before diving into FFmpeg integration, ensure you have Xcode installed on your macOS machine. Additionally, you’ll need the CocoaPods dependency manager for easier integration.
  • Install CocoaPods using the terminal:
    sudo gem install cocoapods
  2. Creating a New iOS Project:
    Create a new iOS app project in Xcode (or use an existing one). Then open Terminal, navigate to the project directory, and run the following command:
   pod init

This will create a Podfile in your project directory.

  3. Edit the Podfile:
    Open the Podfile in a text editor and add the following lines to integrate FFmpeg:
   platform :ios, '12.1'
   target 'YourApp' do
     use_frameworks!
     pod 'mobile-ffmpeg-full', '~> 4.4'
   end

Replace ‘YourApp’ with your actual target name.

  4. Install FFmpeg with CocoaPods:
    Save the changes to the Podfile and run the following command in the terminal:
   pod install

This will download and install FFmpeg as a dependency for your project. CocoaPods also generates a .xcworkspace file; open that workspace (rather than the .xcodeproj) in Xcode from now on.

  5. Import FFmpeg into Your Code:
    With CocoaPods handling dependencies, import FFmpeg into your Swift or Objective-C code:
   import mobileffmpeg

In Objective-C:

   #import <mobileffmpeg/MobileFFmpeg.h>
  6. Basic FFmpeg Usage:
    Now that FFmpeg is integrated, you can use it for various tasks. For instance, let’s consider a simple command to convert a video file:
   MobileFFmpeg.execute("-i input.mp4 -c:v h264 output.mp4")

You can adapt the command to suit your specific needs, such as trimming, resizing, or applying filters.
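For example, a trim and a resize might look like the sketches below (the input and output names are placeholders; in a real app both should be full paths inside your app's sandbox, such as the Documents directory):

   // Trim: keep 10 seconds starting at 0:05, copying streams without re-encoding
   MobileFFmpeg.execute("-i input.mp4 -ss 00:00:05 -t 10 -c copy trimmed.mp4")

   // Resize: scale to 640 pixels wide, keeping the aspect ratio (re-encodes the video)
   MobileFFmpeg.execute("-i input.mp4 -vf scale=640:-2 -c:v mpeg4 resized.mp4")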

  7. Handling FFmpeg Output:
    FFmpeg commands may produce a substantial amount of output. To run a command without blocking and be notified when it finishes, use the executeAsync method with a callback object (a class conforming to ExecuteDelegate, as in the Objective-C example later in this post); the full console output can then be read with MobileFFmpegConfig.getLastCommandOutput():
   // Kick off the conversion; completion is reported to the ExecuteDelegate passed in
   let executionId = MobileFFmpeg.executeAsync("-i input.mp4 -c:v h264 output.mp4", withCallback: self)

   // Delegate method on the class conforming to ExecuteDelegate
   func executeCallback(_ executionId: Int, _ returnCode: Int32) {
       print("Return code: \(returnCode)")
       print("Output: \(MobileFFmpegConfig.getLastCommandOutput() ?? "")")
   }

This allows you to analyze the execution results and respond accordingly.
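For instance, if the return code signals a failure you might remove a partially written output file before retrying (a sketch; outputPath is a hypothetical full path to the file the command was writing):

   if returnCode != 0 { // 0 is RETURN_CODE_SUCCESS
       // Discard the incomplete output so a retry starts clean (outputPath is hypothetical)
       try? FileManager.default.removeItem(atPath: outputPath)
   }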

  8. Handling FFmpeg in the Background:
    Long-running FFmpeg tasks should be executed in the background to prevent freezing the UI. Use tools like DispatchQueue to run FFmpeg commands asynchronously:
   DispatchQueue.global(qos: .background).async {
       MobileFFmpeg.execute("-i input.mp4 -c:v h264 output.mp4")
   }
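If the UI should reflect the result, hop back to the main queue once the command returns. A minimal sketch, assuming a hypothetical statusLabel outlet:

   DispatchQueue.global(qos: .background).async {
       // execute blocks this background queue until FFmpeg finishes
       let rc = MobileFFmpeg.execute("-i input.mp4 -c:v h264 output.mp4")
       DispatchQueue.main.async {
           // UI updates must happen on the main thread; 0 is RETURN_CODE_SUCCESS
           statusLabel.text = rc == 0 ? "Conversion finished" : "Conversion failed (rc=\(rc))"
       }
   }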

Here is the complete integration of FFmpeg (via MobileFFmpeg) in iOS.

Follow these steps to use FFmpeg in iOS.

iOS / tvOS

Add the MobileFFmpeg dependency to your Podfile using the mobile-ffmpeg-<package name> pattern

  • iOS
pod 'mobile-ffmpeg-full', '~> 4.4'
  • tvOS
pod 'mobile-ffmpeg-tvos-full', '~> 4.4'

Execute synchronous FFmpeg commands

#import <mobileffmpeg/MobileFFmpegConfig.h>
#import <mobileffmpeg/MobileFFmpeg.h>

int rc = [MobileFFmpeg execute: @"-i file1.mp4 -c:v mpeg4 file2.mp4"];

if (rc == RETURN_CODE_SUCCESS) {
    NSLog(@"Command execution completed successfully.\n");
} else if (rc == RETURN_CODE_CANCEL) {
    NSLog(@"Command execution cancelled by user.\n");
} else {
    NSLog(@"Command execution failed with rc=%d and output=%@.\n", rc, [MobileFFmpegConfig getLastCommandOutput]);
}

Execute asynchronous FFmpeg commands

#import <mobileffmpeg/MobileFFmpegConfig.h>
#import <mobileffmpeg/MobileFFmpeg.h>

long executionId = [MobileFFmpeg executeAsync:@"-i file1.mp4 -c:v mpeg4 file2.mp4" withCallback:self];

- (void)executeCallback:(long)executionId :(int)returnCode {
    if (returnCode == RETURN_CODE_SUCCESS) {
        NSLog(@"Async command execution completed successfully.\n");
    } else if (returnCode == RETURN_CODE_CANCEL) {
        NSLog(@"Async command execution cancelled by user.\n");
    } else {
        NSLog(@"Async command execution failed with rc=%d.\n", returnCode);
    }
}
}

Execute FFprobe commands

#import <mobileffmpeg/MobileFFmpegConfig.h>
#import <mobileffmpeg/MobileFFprobe.h>

int rc = [MobileFFprobe execute: @"-i file1.mp4"];

if (rc == RETURN_CODE_SUCCESS) {
    NSLog(@"Command execution completed successfully.\n");
} else if (rc == RETURN_CODE_CANCEL) {
    NSLog(@"Command execution cancelled by user.\n");
} else {
    NSLog(@"Command execution failed with rc=%d and output=%@.\n", rc, [MobileFFmpegConfig getLastCommandOutput]);
}

Check execution output later

int rc = [MobileFFmpegConfig getLastReturnCode];
NSString *output = [MobileFFmpegConfig getLastCommandOutput];

if (rc == RETURN_CODE_SUCCESS) {
    NSLog(@"Command execution completed successfully.\n");
} else if (rc == RETURN_CODE_CANCEL) {
    NSLog(@"Command execution cancelled by user.\n");
} else {
    NSLog(@"Command execution failed with rc=%d and output=%@.\n", rc, output);
}

Stop ongoing FFmpeg operations

  • Stop all executions
[MobileFFmpeg cancel];
  • Stop a specific execution
[MobileFFmpeg cancel:executionId];

Get media information for a file

MediaInformation *mediaInformation = [MobileFFprobe getMediaInformation:@"<file path or uri>"];

Record video and audio using iOS camera

[MobileFFmpeg execute: [NSString stringWithFormat:@"-f avfoundation -r 30 -video_size 1280x720 -pixel_format bgr0 -i 0:0 -vcodec h264_videotoolbox -vsync 2 -f h264 -t 00:00:05 %@", recordFilePath]];

Enable log callback

[MobileFFmpegConfig setLogDelegate:self];

- (void)logCallback:(long)executionId :(int)level :(NSString*)message {
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"%@", message);
    });
}

Enable statistics callback

[MobileFFmpegConfig setStatisticsDelegate:self];

- (void)statisticsCallback:(Statistics *)newStatistics {
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"frame: %d, time: %d\n", newStatistics.getVideoFrameNumber, newStatistics.getTime);
    });
}

Ignore the handling of a signal

[MobileFFmpegConfig ignoreSignal:SIGXCPU];

List ongoing executions

NSArray* ffmpegExecutions = [MobileFFmpeg listExecutions];
for (int i = 0; i < [ffmpegExecutions count]; i++) {
    FFmpegExecution* execution = [ffmpegExecutions objectAtIndex:i];
    NSLog(@"Execution %d = id: %ld, startTime: %@, command: %@.\n", i, [execution getExecutionId], [execution getStartTime], [execution getCommand]);
}

Set default log level

[MobileFFmpegConfig setLogLevel:AV_LOG_FATAL];

Register custom fonts directory

[MobileFFmpegConfig setFontDirectory:@"<folder with fonts>" with:nil];

Conclusion:
Integrating FFmpeg into your iOS project opens up a myriad of possibilities for multimedia processing. With this guide, you have learned the steps to set up FFmpeg using CocoaPods and execute basic FFmpeg commands in an iOS application. Remember to consult the FFmpeg documentation for more advanced features and customization options based on your project requirements.
