The source code for this post can be found at https://github.com/cspnanda/DrawFlag. I have been working on a travel-related app where I want to highlight the countries I have visited. MKMapView lets you create a polygon overlay and render it with a stroke and a color fill, and it takes only a few lines of code in rendererForOverlay.
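A minimal sketch of that delegate method, for a polygon overlay already added to the map (the fill and stroke colors here are my own illustrative choices, not taken from the DrawFlag source):

```objc
- (MKOverlayRenderer *)mapView:(MKMapView *)mapView
            rendererForOverlay:(id<MKOverlay>)overlay
{
    if ([overlay isKindOfClass:[MKPolygon class]]) {
        MKPolygonRenderer *renderer =
            [[MKPolygonRenderer alloc] initWithPolygon:(MKPolygon *)overlay];
        // Translucent fill so the map detail stays visible underneath
        renderer.fillColor   = [[UIColor redColor] colorWithAlphaComponent:0.3];
        renderer.strokeColor = [UIColor redColor];
        renderer.lineWidth   = 1.0;
        return renderer;
    }
    return nil;
}
```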
Falling water makes for a cool piece of animation in iOS apps. I had to use a particle system to create an animation of a water hose spraying water. Think of a real hose: 1. You have a continuous flow of water, which is a collection of water drops. 2. The water falls in a particular shape (you may have seen the settings on a hose nozzle: shower, mist, full, and so on). These two concepts translate directly to an emitter cell and an emitter layer.
There is one more thing: in the real world, water always falls down because of gravity, but in code the direction of flow has to be set explicitly. This is controlled by a property called emissionLongitude.
// Initialize the emitter cell. "drop" is a simple picture of a water drop.
// From the Apple docs: a layer can set its contents property to a CGImageRef
// to display that image as its contents.
CAEmitterCell *emitterCell = [CAEmitterCell emitterCell];
emitterCell.contents = (id)[[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"drop" ofType:@"png"]] CGImage];
// Without a birth rate and lifetime nothing is emitted, so give the cell a
// steady stream of drops that each live a few seconds (values are illustrative)
emitterCell.birthRate = 100;
emitterCell.lifetime = 5.0;
// Define the velocity and the range by which it can vary per drop
emitterCell.velocity = 100;
emitterCell.velocityRange = 1;
// The cone (in radians) in which the water falls
emitterCell.emissionRange = M_PI_2;
// Set the x and y components of an acceleration vector
emitterCell.xAcceleration = -10;
emitterCell.yAcceleration = 300;
// Set the initial direction of emission
emitterCell.emissionLongitude = longitude;
// Construct the emitter layer. This code runs inside the custom view, whose
// backing layer is a CAEmitterLayer, so the cast is safe.
emitterLayer = (CAEmitterLayer *)self.layer;
// Attach the emitter cell to the layer
emitterLayer.emitterCells = [NSArray arrayWithObject:emitterCell];
// Finally, from the view controller, add the custom view to the main view
[self.view addSubview:waterFall];
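For the cast of self.layer to a CAEmitterLayer to be valid, the custom view has to tell UIKit to back it with that layer class. A minimal sketch (the class name WaterFallView is my own placeholder):

```objc
@interface WaterFallView : UIView
@end

@implementation WaterFallView
// Make UIKit back this view with a CAEmitterLayer instead of a plain CALayer,
// so (CAEmitterLayer *)self.layer is the layer that actually emits particles.
+ (Class)layerClass
{
    return [CAEmitterLayer class];
}
@end
```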
Voila! When you compile and run, you should see the hose spraying water. Let us look at two properties of the emitter cell: xAcceleration and yAcceleration. As the names suggest, xAcceleration controls how the water accelerates along the X axis and yAcceleration along the Y axis. When the segmented control sets yAcceleration to 0, the particles no longer fall along the Y axis.
In this post I am going to discuss a custom integrated image picker for iOS, with code examples. So why another image picker? There are hundreds available on GitHub, and the default UIImagePickerController in the SDK does the job too. Here are the reasons why I wrote a custom one.
1. In most image pickers you switch between the camera and the image gallery. In this one you get both in a single view controller, inspired by the Facebook iOS app.
2. This image picker has a live camera preview using AVCaptureSession, inspired by the Twitter iOS app.
Step 1: Set up the collection view. Make a UICollectionView with two kinds of cells: one for gallery images (ALAssets) and one for the camera (this is cell 0). The XIB file is actually pretty simple: you have a UICollectionView and a Next button. Connect the collection view's data source and delegate to self and set the referencing outlet. All the magic happens in the code.
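Assuming each cell type lives in its own nib (the nib and reuse-identifier names below are my own placeholders), registering the two kinds of cells might look like this:

```objc
// Register the camera cell (index 0) and the photo cell with the
// collection view so dequeueReusableCellWithReuseIdentifier: can vend them
[self.collectionView registerNib:[UINib nibWithNibName:@"CameraCell" bundle:nil]
      forCellWithReuseIdentifier:@"CameraCell"];
[self.collectionView registerNib:[UINib nibWithNibName:@"PhotoCell" bundle:nil]
      forCellWithReuseIdentifier:@"PhotoCell"];
```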
Step 2: Populate the collection view data source. Now we need to iterate through the assets and populate the collection view. Open viewDidLoad in your view controller and add the following code. The first element I add is the camera cell; then the ALAssetsLibrary block iterates through all the photos in the library. Note: you may want to do this on a background thread with GCD and then update the UI (reload the collection view) on dispatch_get_main_queue.
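The original snippet is collapsed on the blog, so here is a minimal sketch of that enumeration, assuming an `assets` mutable array property and a `collectionView` outlet (both placeholder names):

```objc
- (void)viewDidLoad
{
    [super viewDidLoad];
    self.assets = [NSMutableArray array];
    // Slot 0 is reserved for the live camera cell
    [self.assets addObject:@"CameraCell"];

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
        usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
            if (group == nil) {
                // A nil group marks the end of the enumeration
                [self.collectionView reloadData];
                return;
            }
            [group setAssetsFilter:[ALAssetsFilter allPhotos]];
            [group enumerateAssetsUsingBlock:^(ALAsset *asset,
                                               NSUInteger idx,
                                               BOOL *innerStop) {
                if (asset) {
                    [self.assets addObject:asset];
                }
            }];
        }
        failureBlock:^(NSError *error) {
            NSLog(@"Could not enumerate assets: %@", error);
        }];
}
```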
Now we need to implement a few UICollectionView methods. Look carefully at cellForItemAtIndexPath: it maintains an array holding the photos you selected, and once the number of selected photos reaches the maximum (defined by you in MAX_PHOTOS) the other cells are grayed out. When you run the code you will get a collection view like the one shown on the right. Step 3: Implement the delegate methods.
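The delegate code is collapsed on the blog; a sketch of the selection logic described above could look like this (the `PhotoCell` class, reuse identifiers, and the `selectedAssets` array are my own placeholders):

```objc
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    if (indexPath.row == 0) {
        // Cell 0 hosts the live camera preview
        return [collectionView dequeueReusableCellWithReuseIdentifier:@"CameraCell"
                                                         forIndexPath:indexPath];
    }
    PhotoCell *cell =
        [collectionView dequeueReusableCellWithReuseIdentifier:@"PhotoCell"
                                                  forIndexPath:indexPath];
    ALAsset *asset = self.assets[indexPath.row];
    cell.imageView.image = [UIImage imageWithCGImage:[asset thumbnail]];

    // Gray out unselected cells once MAX_PHOTOS have been picked
    BOOL selectionFull = (self.selectedAssets.count >= MAX_PHOTOS);
    BOOL isSelected = [self.selectedAssets containsObject:asset];
    cell.contentView.alpha = (selectionFull && !isSelected) ? 0.4 : 1.0;
    return cell;
}
```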
Now initialize the AVCaptureSession and start it running. This code has been borrowed heavily from here and the reference. Then call this initializeCamera method, for example from viewDidLoad.
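The initialization code itself is also collapsed on the blog; a minimal sketch, assuming `session`, `stillImageOutput`, and `cameraCell` properties (placeholder names) on the view controller:

```objc
- (void)initializeCamera
{
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetPhoto;

    // Default video device is the back camera
    AVCaptureDevice *device =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input) {
        [self.session addInput:input];
    }

    // Still-image output for taking captures from the live session
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    self.stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
    [self.session addOutput:self.stillImageOutput];

    // Show the live preview inside the camera cell
    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    previewLayer.frame = self.cameraCell.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.cameraCell.layer addSublayer:previewLayer];

    [self.session startRunning];
}
```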
At this point, when you run the code on a real device (the simulator has no camera support), you will see a live preview from the back camera. Isn't that cool? You can either select photos or tap the camera cell to take a capture. The camera capture code using AVCaptureSession has been heavily borrowed from here. Once you choose the photos, you need to implement the delegate method
-(void) imageSelected:(NSArray *)arrayOfImages
to get the list of assets picked. One last thing: an image captured with AVCaptureSession does not contain location Exif information, so I had to add it manually.
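The post's snippet for this is not shown, but attaching GPS metadata can be sketched as follows, assuming a `CLLocation *location` (for example from a CLLocationManager) and a mutable `metadata` dictionary built from the captured image's Exif attachments:

```objc
// Build a GPS dictionary from the current location. ImageIO expects
// unsigned coordinate values plus separate N/S and E/W reference strings.
NSMutableDictionary *gps = [NSMutableDictionary dictionary];
gps[(NSString *)kCGImagePropertyGPSLatitude] =
    @(fabs(location.coordinate.latitude));
gps[(NSString *)kCGImagePropertyGPSLatitudeRef] =
    (location.coordinate.latitude >= 0) ? @"N" : @"S";
gps[(NSString *)kCGImagePropertyGPSLongitude] =
    @(fabs(location.coordinate.longitude));
gps[(NSString *)kCGImagePropertyGPSLongitudeRef] =
    (location.coordinate.longitude >= 0) ? @"E" : @"W";

// Merge the GPS dictionary into the image metadata before writing it out
metadata[(NSString *)kCGImagePropertyGPSDictionary] = gps;
```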