Monday, June 4, 2012

Image and Video Processing on Android with NDK

Although Android 4.0 Ice Cream Sandwich provides several impressive media processing features and support for various media codecs, we may still need to port open-source code or existing algorithms written in native C/C++ for media processing.

FFmpeg is an open-source framework for recording, converting, playing, and streaming video and audio. It includes libavcodec, a widely used audio/video codec library. Several popular Android applications are built on FFmpeg, including RockPlayer, MoboPlayer, acrMedia, vitalPlayer, and V-Cut Express.
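
As a rough sketch of what using libavcodec from native code looks like (not code from any of the players above), the following opens an H.264 decoder. Exact identifiers such as CODEC_ID_H264 and the registration call depend on the FFmpeg version you build against:

```c
/* Minimal sketch: opening an H.264 decoder with libavcodec.
 * Error handling is abbreviated; API names follow FFmpeg releases of this era. */
#include <libavcodec/avcodec.h>
#include <stdio.h>

int open_h264_decoder(AVCodecContext **out_ctx)
{
    AVCodec *codec;
    AVCodecContext *ctx;

    avcodec_register_all();                      /* register all codecs once */

    codec = avcodec_find_decoder(CODEC_ID_H264); /* look up the H.264 decoder */
    if (!codec) {
        fprintf(stderr, "H.264 decoder not found\n");
        return -1;
    }

    ctx = avcodec_alloc_context3(codec);         /* allocate a codec context */
    if (!ctx || avcodec_open2(ctx, codec, NULL) < 0) {
        fprintf(stderr, "could not open codec\n");
        return -1;
    }

    *out_ctx = ctx;                              /* caller feeds packets to it */
    return 0;
}
```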

OpenCV is a strong library for state-of-the-art image and video processing. Its application areas include human-computer interaction (HCI); object identification, segmentation, and recognition; face recognition; gesture recognition; motion tracking, ego-motion estimation, and motion understanding; structure from motion (SFM); stereo and multi-camera calibration and depth computation; and mobile robotics.
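
As a small illustration (an assumed example, not from a specific OpenCV sample), here is Canny edge detection using OpenCV's C interface, which was commonly called from NDK code at the time:

```c
/* Minimal sketch: Canny edge detection with OpenCV's C API. */
#include <opencv2/core/core_c.h>
#include <opencv2/imgproc/imgproc_c.h>

/* Run Canny on an 8-bit single-channel image and return the edge map. */
IplImage *detect_edges(IplImage *gray)
{
    IplImage *edges = cvCreateImage(cvGetSize(gray), IPL_DEPTH_8U, 1);
    cvCanny(gray, edges, 50.0, 150.0, 3);  /* low/high thresholds, aperture size */
    return edges;                          /* caller releases with cvReleaseImage */
}
```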

If we want to develop multimedia applications that needs a video/audio codec and computer vision, ffmpeg and opencv are good choices and examples.

The Android NDK (Native Development Kit) allows working with native C code through a shared C library. It includes the entire toolchain needed to build for the target platform (ARM). Native C code called via JNI runs inside the application's process alongside the Dalvik VM, and is therefore subject to the same life-cycle rules that any Android application lives by. The main advantage of writing parts of your app in a native language is speed in certain cases, such as compute-intensive image and video processing.
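
To make the JNI bridge concrete, here is a minimal sketch of a native function callable from Java; the package, class, and method names are hypothetical. The Java side would declare `public static native int invertPixels(int[] pixels);` and load the library with `System.loadLibrary("imageproc")`:

```c
/* Minimal sketch of a JNI entry point (hypothetical class and method names). */
#include <jni.h>

JNIEXPORT jint JNICALL
Java_com_example_imageproc_NativeLib_invertPixels(JNIEnv *env, jclass clazz,
                                                  jintArray pixels)
{
    jsize len = (*env)->GetArrayLength(env, pixels);
    jint *data = (*env)->GetIntArrayElements(env, pixels, NULL);
    if (!data)
        return -1;

    /* Invert each ARGB pixel's RGB channels, leaving alpha untouched. */
    for (jsize i = 0; i < len; i++)
        data[i] = (data[i] & 0xFF000000) | (~data[i] & 0x00FFFFFF);

    (*env)->ReleaseIntArrayElements(env, pixels, data, 0); /* copy results back */
    return 0;
}
```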


Fortunately, many developers have already ported FFmpeg to Android using the NDK. mcclanahoochie's blog describes how OpenCV was ported to Android as well.
