ofxiPhoneVidGrabber and ofxOpenCviPhone

UPDATE: see this post.

The 4.0 release of the iPhone OS has introduced access to the pixel data of the live camera stream through the AVCaptureDevice class in the AVFoundation framework.  Want to use it?  OK!

If you want the SUPER SIMPLE way and just want to start playing around immediately, get this version of oF that I fixed up for the 4.0 OS.

Otherwise (and I recommend it), read on.

NOTE: This is written for OSX/XCode.  I have no idea about other platforms.

  1. Of course, make sure you have the iPhone OS 4.0 and the iPhone version of oF
  2. Copy the emptyExample and rename it to something else
  3. You’ll probably have to fix some settings in your new project
    1. Go to Project > Edit Project Settings
    2. Change “Base SDK” to iPhone Device 4.0 (this stuff won’t work on anything else – even the 4.0 Simulator)
    3. I usually have to change the Target settings also.  Project > Edit Active Target “Blah”
  4. Toggle between “release” and “debug” in the build type dropdown to get the frameworks (libs > core > core frameworks) to switch over to 4.0.  If they are still red after doing this, something went wrong.
  5. Add the following frameworks by right-clicking on the “core frameworks” folder and selecting “Add > Existing Frameworks”
    1. CoreVideo.framework
    2. CoreMedia.framework
    3. AVFoundation.framework
  6. Fix FreeImage
    1. Easy Way: Replace your libs/FreeImage folder with this one: FreeImage.  You’ll have to also remove the existing one from your XCode project and drag this one in.
    2. Hard Way: Read this in the oF forum
  7. get ofxiPhoneVidGrabber and put it in your addons folder
  8. Get the example: iPhoneVidGrabberExample and put it in your apps/[something] folder
  9. Run and enjoy!
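Once it builds, using the grabber from your app is a lot like using the stock ofVideoGrabber. Here is a rough sketch of a testApp; method names like initGrabber(), isFrameNew(), and getPixels() are my guesses based on the stock grabber API, so check the example project for the real calls:

```cpp
// testApp.h -- a minimal sketch only; the grabber's exact method names
// are assumed to mirror the stock ofVideoGrabber and may differ in the
// actual addon. See iPhoneVidGrabberExample for the real API.
#include "ofMain.h"
#include "ofxiPhoneVidGrabber.h"

class testApp : public ofxiPhoneApp {
public:
	ofxiPhoneVidGrabber grabber;
	ofTexture tex;

	void setup() {
		grabber.initGrabber(480, 360);   // assumed signature
		tex.allocate(480, 360, GL_RGB);
	}
	void update() {
		grabber.update();
		if( grabber.isFrameNew() ) {     // assumed
			tex.loadData(grabber.getPixels(), 480, 360, GL_RGB);
		}
	}
	void draw() {
		tex.draw(0, 0);
	}
};
```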

Want to use OpenCV too?

  1. Add this to your addons folder: ofxOpenCviPhone
  2. Add this to your apps/iPhoneSpecificExamples folder: openCviPhoneExample
  3. Run and enjoy!
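The example's update loop boils down to something like this sketch (getCvImage() is a guessed accessor name on my part; see openCviPhoneExample for the actual call). The 4-channel camera frame gets converted by the modified ofxCvColorImage::operator= described in the next section:

```cpp
// sketch of testApp::update() -- assumes the grabber hands back a
// 4-channel IplImage* (getCvImage() is a guessed name, check the example)
void testApp::update() {
	grabber.update();
	if( grabber.isFrameNew() ) {
		colorImg = grabber.getCvImage();  // 4 channels in; operator= converts to 3
		grayImg  = colorImg;              // standard ofxOpenCV color-to-gray
		grayImg.threshold(80);            // e.g. for contour finding
	}
}
```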

What happened there?

To use OpenCV, you have to recompile it for armv6.  I also added a function to ofxCvColorImage so that you can feed it a 4 channel IplImage* and it will know how to convert it to 3 channels.  If this doesn’t make sense to you, you probably shouldn’t read any further.  But if you are really curious, here is how I compiled for armv6.  I couldn’t figure out the configure flag to leave out JPEG support, so the cvconfig.h step below is a little ghetto.

  1. Make a copy of the ofxOpenCV library from the normal oF distribution.
  2. Get OpenCV 1.0
  3. Expand the file you just downloaded
  4. Crack open a terminal and ‘cd’ to the resulting directory
  5. mkdir build; cd build; ../configure --prefix=/path/to/somewhere/stage --without-imageio --without-python --without-swig --disable-apps --disable-dependency-tracking --without-carbon --without-quicktime --enable-shared --without-gtk --host=arm-apple-darwin10 CXX=/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/g++-4.2 CXXFLAGS="-arch armv6 -isysroot /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.0.sdk"
  6. Open build/cvconfig.h and change:
    #define HAVE_JPEG

    to:

    //#define HAVE_JPEG
  7. make
  8. make install
  9. Now the include and lib folders that you need are in /path/to/somewhere/stage. Replace them with the stuff in the ofxOpenCV that you got from the normal oF distro.
  10. Here is what I replaced in ofxCvColorImage (the original post was missing the early return, the break statements, and the closing braces; swapTemp() swaps cvImage and cvImageTemp so the converted frame becomes the current image)
    void ofxCvColorImage::operator = ( const IplImage* _mom ) {
    	if( _mom->width != width || _mom->height != height ||
    	    _mom->depth != cvImage->depth ) {
    		ofLog(OF_LOG_ERROR, "in =, images need to match in size and depth");
    		return;
    	}
    	switch( _mom->nChannels ) {
    		case 3: cvCopy( _mom, cvImage ); break;
    		case 4: cvCvtColor( _mom, cvImageTemp, CV_RGBA2RGB ); swapTemp(); break;
    		case 1: cvCvtColor( _mom, cvImageTemp, CV_GRAY2RGB ); swapTemp(); break;
    		default: ofLog(OF_LOG_ERROR, "in =, nChannels not allowed."); break;
    	}
    }

Please let me know if there is anything I need to correct here.

People: Jeff Crouse
Tags: Howtus