Monday, June 29, 2015

Integrated Image Picker for iOS

In this post I am going to discuss a custom integrated image picker for iOS, with code examples. So why another image picker? There are hundreds available on GitHub, and the default UIImagePickerController in the SDK does the job too. Here are the reasons why I wrote a custom one anyway:


  • In most image pickers you switch between the camera and the image gallery. In this one, a single view controller gives you both. Inspired by the Facebook iOS app.
  • This image picker shows a live camera preview using AVCaptureSession. Inspired by the Twitter iOS app.


So now let us jump right into the code. The code, along with an example project, is hosted on GitHub at https://github.com/cspnanda/CustomImagePicker



Step 1: Set Up the Collection View

Make a UICollectionView with two kinds of cells: one for gallery images (ALAssets) and one for the camera (this is cell 0). The XIB file is actually pretty simple: you have a UICollectionView and a Next button. Connect the collection view's data source and delegate to self and set the referencing outlet. All the magic happens in the code.
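As a rough sketch, the view controller's interface might look like the following. The names mirror the ones used later in this post, but the repo's actual declarations may differ:

  #import <UIKit/UIKit.h>
  #import <AssetsLibrary/AssetsLibrary.h>

  @interface CustomeImagePicker : UIViewController <UICollectionViewDataSource, UICollectionViewDelegateFlowLayout>

  @property (nonatomic, retain) IBOutlet UICollectionView *collectionView; // grid of photos plus the camera cell
  @property (nonatomic, retain) NSMutableArray *assets;                    // ALAssets backing the grid

  @end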




Step 2: Populate the Collection View Data Source


Now we need to iterate through the assets and populate the collection view. Open viewDidLoad in your view controller and add the following code. The first element I add is the camera cell; then the ALAssetsLibrary block iterates through all the photos in the library. Note: you may want to do this on a background thread with GCD and hop back to the main queue to update the UI (reload the collection view); see the sketch after this code block.


  // Register the two cell types: one for gallery thumbnails, one for the live camera.
  [self.collectionView registerClass:[PhotoPickerCell class] forCellWithReuseIdentifier:@"PhotoPickerCell"];
  [self.collectionView registerClass:[CameraCell class] forCellWithReuseIdentifier:@"CameraCell"];
  
  UICollectionViewFlowLayout *flowLayout = [[UICollectionViewFlowLayout alloc] init];
  if (IS_IPHONE_6 || IS_IPHONE_6P)
    [flowLayout setItemSize:CGSizeMake(120, 120)];
  else
    [flowLayout setItemSize:CGSizeMake(100, 100)];
  [flowLayout setScrollDirection:UICollectionViewScrollDirectionVertical];
  [self.collectionView setCollectionViewLayout:flowLayout];
  
  _assets = [@[] mutableCopy];
  __block NSMutableArray *tmpAssets = [@[] mutableCopy];
  // Slot 0 holds a placeholder asset that stands in for the camera cell.
  if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    ALAsset *asset = [[ALAsset alloc] init];
    if (showOnlyPhotosWithGPS == NO)
      [tmpAssets insertObject:asset atIndex:0];
  }
  ALAssetsLibrary *assetsLibrary = [CustomeImagePicker defaultAssetsLibrary];
  [assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    [group setAssetsFilter:[ALAssetsFilter allPhotos]];
    // Enumerate in reverse so the newest photos come first.
    [group enumerateAssetsWithOptions:NSEnumerationReverse usingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
      if (result)
      {
        if (showOnlyPhotosWithGPS == YES)
        {
          if ([result valueForProperty:ALAssetPropertyLocation])
            [tmpAssets addObject:result];
        }
        else
          [tmpAssets addObject:result];
      }
    }];
  } failureBlock:^(NSError *error) {
    NSLog(@"Error loading images %@", error);
    if ([ALAssetsLibrary authorizationStatus] != ALAuthorizationStatusAuthorized)
    {
      [self displayErrorOnMainQueue:@"Photo Access Disabled" message:@"Please allow Photo Access in System Settings"];
    }
  }];
  self.assets = tmpAssets;
  // The enumeration above is asynchronous, so give it a moment to fill
  // tmpAssets before the first reload.
  dispatch_time_t popTime1 = dispatch_time(DISPATCH_TIME_NOW, 0.5 * NSEC_PER_SEC);
  dispatch_after(popTime1, dispatch_get_main_queue(), ^(void){
    if (showOnlyPhotosWithGPS == YES) {
      OLGhostAlertView *ghastly = [[OLGhostAlertView alloc] initWithTitle:@"Photos with Location" message:@"Only Photos with Location Information are shown here." timeout:2.0 dismissible:YES];
      [ghastly show];
    }
    [self.collectionView reloadData];
  });
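As an aside, the fixed half-second delay can be avoided: ALAssetsLibrary invokes the enumeration block one final time with a nil group once it has finished, so you can publish the results and reload from there. A minimal sketch of that variant, reusing assetsLibrary and tmpAssets from above (GPS filtering omitted for brevity):

  [assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    if (group) {
      [group setAssetsFilter:[ALAssetsFilter allPhotos]];
      [group enumerateAssetsWithOptions:NSEnumerationReverse usingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
        if (result) [tmpAssets addObject:result];
      }];
    } else {
      // A nil group marks the end of enumeration: publish and reload once.
      dispatch_async(dispatch_get_main_queue(), ^{
        self.assets = tmpAssets;
        [self.collectionView reloadData];
      });
    }
  } failureBlock:^(NSError *error) {
    NSLog(@"Error loading images %@", error);
  }];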

Now we need to implement a few UICollectionView methods. Look carefully at cellForItemAtIndexPath: it consults an array (highLightThese) that holds the photos you have selected. Once the number of selected photos reaches the maximum (defined by you in MAX_PHOTOS), the remaining cells are grayed out and disabled. When you run the code you will get a collection view like the one in the screenshot.

Step 3: Implement the Delegate Methods


// Shared cell styling: outline the selected photos, and gray out and disable
// the rest once the selection limit (maxPhotos) has been reached.
- (void) styleCell:(UICollectionViewCell *)cell forAssetURL:(NSString *)assetURL
{
  cell.backgroundColor = [UIColor whiteColor];
  if ([highLightThese containsObject:assetURL])
  {
    cell.layer.borderColor = [[UIColor orangeColor] CGColor];
    cell.layer.borderWidth = 4.0;
    [cell setAlpha:1.0];
    [cell setUserInteractionEnabled:YES];
  }
  else
  {
    cell.layer.borderColor = nil;
    cell.layer.borderWidth = 0.0;
    BOOL limitReached = ([highLightThese count] == maxPhotos);
    [cell setAlpha:(limitReached ? 0.5 : 1.0)];
    [cell setUserInteractionEnabled:!limitReached];
  }
}

- (UICollectionViewCell *) collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
  static NSString *cellIdentifier = @"PhotoPickerCell";
  static NSString *cameraCellIdentifier = @"CameraCell";
  
  ALAsset *asset = self.assets[indexPath.row];
  NSString *assetURL = [[[asset defaultRepresentation] url] absoluteString];
  
  BOOL hasCamera = [UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera];
  if (hasCamera && indexPath.row == 0 && indexPath.section == 0 && showOnlyPhotosWithGPS == NO)
  {
    // Cell 0 hosts the live camera preview.
    CameraCell *cell = (CameraCell *)[collectionView dequeueReusableCellWithReuseIdentifier:cameraCellIdentifier forIndexPath:indexPath];
    cell.asset = asset;
    [self styleCell:cell forAssetURL:assetURL];
    return cell;
  }
  // Every other cell shows a gallery thumbnail.
  PhotoPickerCell *cell = (PhotoPickerCell *)[collectionView dequeueReusableCellWithReuseIdentifier:cellIdentifier forIndexPath:indexPath];
  cell.asset = asset;
  [self styleCell:cell forAssetURL:assetURL];
  return cell;
}


- (NSInteger) collectionView:(UICollectionView *)collectionView numberOfItemsInSection:(NSInteger)section
{
  return self.assets.count;
}
- (CGFloat) collectionView:(UICollectionView *)collectionView layout:(UICollectionViewLayout *)collectionViewLayout minimumLineSpacingForSectionAtIndex:(NSInteger)section
{
  return 4;
}

- (CGFloat) collectionView:(UICollectionView *)collectionView layout:(UICollectionViewLayout *)collectionViewLayout minimumInteritemSpacingForSectionAtIndex:(NSInteger)section
{
  return 1;
}
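One delegate method not shown above is didSelectItemAtIndexPath:, where the selection toggling happens. A minimal sketch, assuming highLightThese holds the asset URL strings of the picked photos and maxPhotos is the selection cap:

- (void) collectionView:(UICollectionView *)collectionView didSelectItemAtIndexPath:(NSIndexPath *)indexPath
{
  ALAsset *asset = self.assets[indexPath.row];
  NSString *assetURL = [[[asset defaultRepresentation] url] absoluteString];
  if (assetURL == nil)
    return; // the camera placeholder has no representation
  if ([highLightThese containsObject:assetURL])
    [highLightThese removeObject:assetURL];   // tap again to deselect
  else if ([highLightThese count] < maxPhotos)
    [highLightThese addObject:assetURL];      // select, up to the cap
  [self.collectionView reloadData];           // re-apply borders and graying
}

A tap on the camera cell would instead trigger the capture described in Step 4.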


Step 4: Add the Live Camera Feed to the First Cell


For this you need AVCaptureSession. Add the headers and define the following instance variables.

#import <AssetsLibrary/AssetsLibrary.h>
#import <AVFoundation/AVFoundation.h>

/* Define the following instance variables */

  AVCaptureSession *session;
  AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;
  AVCaptureStillImageOutput *stillImageOutput;

Now initialize the AVCaptureSession and start it running. This code borrows heavily from other published AVCaptureSession examples.


Now call initializeCamera with a short delay, so the collection view has laid out the camera cell before the preview layer is attached:

    [self performSelector:@selector(initializeCamera) withObject:nil afterDelay:0.1];


- (void) initializeCamera {
  if (session)
    [session release], session=nil;
  CameraCell *firstCell = (CameraCell*)[self.collectionView cellForItemAtIndexPath:[NSIndexPath indexPathForRow:0 inSection:0]];
  UIImageView *view = [firstCell getImageView];
  session = [[AVCaptureSession alloc] init];
  session.sessionPreset = AVCaptureSessionPresetLow;
  
  if (captureVideoPreviewLayer)
    [captureVideoPreviewLayer release], captureVideoPreviewLayer=nil;
  
  captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
  [captureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
  [captureVideoPreviewLayer setFrame:[view bounds]];
  [[view layer] setMasksToBounds:YES];
  [[view layer] addSublayer:captureVideoPreviewLayer];
  
  NSArray *devices = [AVCaptureDevice devices];
  AVCaptureDevice *backCamera=nil;
  
  // check if device available
  if (devices.count==0) {
    NSLog(@"No Camera Available");
    return;
  }
  
  for (AVCaptureDevice *device in devices) {
    
    NSLog(@"Device name: %@", [device localizedName]);
    
    if ([device hasMediaType:AVMediaTypeVideo]) {
      
      if ([device position] == AVCaptureDevicePositionBack) {
        NSLog(@"Device position : back");
        backCamera = device;
      }
    }
  }
  NSError *error = nil;
  AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
  if (!input) {
    NSLog(@"ERROR: trying to open camera: %@", error);
  }
  else
  {
    [session addInput:input];
    if (stillImageOutput)
      [stillImageOutput release], stillImageOutput=nil;
    
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil] autorelease];
    [stillImageOutput setOutputSettings:outputSettings];
    
    [session addOutput:stillImageOutput];
    
    [session startRunning];
  }
  
}



Now when you run the code on a real device (the simulator has no camera support) you will see a live preview from the back camera. Isn't that cool? You can either select photos or tap the camera cell to take a capture. The capture code using AVCaptureSession was also heavily borrowed from other published examples.
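A minimal sketch of the capture, using the stillImageOutput configured in Step 4 (captureStillImage is an illustrative name, not necessarily the method in the repo):

- (void) captureStillImage
{
  AVCaptureConnection *connection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
  [stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                 completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    if (error) {
      NSLog(@"Capture failed: %@", error);
      return;
    }
    // Convert the sample buffer to JPEG data, then to a UIImage.
    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [[[UIImage alloc] initWithData:jpegData] autorelease];
    NSLog(@"Captured image of size %@", NSStringFromCGSize(image.size));
  }];
}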

Once you choose the photos, you need to implement the delegate method

-(void) imageSelected:(NSArray *)arrayOfImages

to get the list of picked assets.
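A minimal sketch of an implementation on the presenting controller, assuming the array holds the picked ALAssets:

- (void) imageSelected:(NSArray *)arrayOfImages
{
  // Receive the picked assets and do something useful with them.
  for (ALAsset *asset in arrayOfImages) {
    NSLog(@"Picked: %@", [[[asset defaultRepresentation] url] absoluteString]);
  }
}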
One last thing: an image captured through the AVCaptureSession did not contain location EXIF information, so I had to add it manually, like so:
// "attachments" is the metadata dictionary of the captured sample buffer;
// "currentUserLocation" is the most recent CLLocation fix.
CFMutableDictionaryRef mutable = CFDictionaryCreateMutableCopy(NULL, 0, attachments);
NSTimeZone      *timeZone  = [NSTimeZone timeZoneWithName:@"UTC"];
NSDateFormatter *formatter = [[[NSDateFormatter alloc] init] autorelease];
[formatter setTimeZone:timeZone];
[formatter setDateFormat:@"HH:mm:ss.SS"];
NSDictionary *gpsDict = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithFloat:fabs(currentUserLocation.coordinate.latitude)], kCGImagePropertyGPSLatitude,
                         ((currentUserLocation.coordinate.latitude >= 0) ? @"N" : @"S"), kCGImagePropertyGPSLatitudeRef,
                         [NSNumber numberWithFloat:fabs(currentUserLocation.coordinate.longitude)], kCGImagePropertyGPSLongitude,
                         ((currentUserLocation.coordinate.longitude >= 0) ? @"E" : @"W"), kCGImagePropertyGPSLongitudeRef,
                         [formatter stringFromDate:[currentUserLocation timestamp]], kCGImagePropertyGPSTimeStamp,
                         [NSNumber numberWithFloat:fabs(currentUserLocation.altitude)], kCGImagePropertyGPSAltitude,
                         [NSNumber numberWithFloat:currentUserLocation.horizontalAccuracy], kCGImagePropertyGPSDOP,
                         nil];
CFDictionarySetValue(mutable, kCGImagePropertyGPSDictionary, (__bridge void *)gpsDict);
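From there, the merged metadata can be written out with the image data. A sketch using ALAssetsLibrary, assuming jpegData is the JPEG captured from the still image output:

// Save the captured JPEG together with the GPS metadata dictionary.
[assetsLibrary writeImageDataToSavedPhotosAlbum:jpegData
                                       metadata:(__bridge NSDictionary *)mutable
                                completionBlock:^(NSURL *assetURL, NSError *error) {
  if (error)
    NSLog(@"Save failed: %@", error);
  else
    NSLog(@"Saved with GPS metadata at %@", assetURL);
}];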

Thanks for reading. If you decide to use this control in your application, don't forget to give credit back to me. It means a lot. Happy coding.