I have several camera apps, as well as apps that use photos from the photo library, on my iPhone. Why do these require location services to be on in order to access the photo library? I don't particularly care to have the location where I take the pictures recorded, and leaving it on uses up battery, etc. But the apps require that it is on in order for them to work, even though they don't need the information. Is it something in how photos are accessed through the programming library? Just curious, and thought that a programmer might know...
Zincous summed up the difficulties of ALAssetsLibrary quite well, but there's one more. When you use a UIImagePickerController, you can obtain a valid file URL for the image or video the user has chosen. Using this URL, you can retrieve the data like this:

    NSData *theData = [NSData dataWithContentsOfURL:url
                                            options:NSDataReadingMappedIfSafe
                                              error:&dataError];

The NSDataReadingMappedIfSafe option means the data is never loaded into RAM until it is actually read from the NSData object. In our application, this means we can easily upload a 100MB video to our server, passing the NSData directly to the networking code; in effect, the networking code reads the data straight from the file.

We tried hard to find a way to do the same trick with ALAssets, but there doesn't seem to be one. The URL you can obtain from an ALAssetRepresentation doesn't reference a file; it just identifies the representation. The only way to get at the data is by loading it into a buffer that you allocate in RAM, and allocating a 100MB buffer on a mobile device is likely to lead to trouble.

So we're winding up doing everything with UIImagePickerController. The picker UI is not nearly as nice as what we could build with ALAssets, but we don't have to worry about location services and we don't have to worry about RAM footprint.
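For what it's worth, here is a minimal sketch of both paths, assuming the methods live in a view controller that presents the picker. The delegate method, NSDataReadingMappedIfSafe, and the ALAssetRepresentation calls are real API; uploadVideoData: and dataForAsset:error: are just illustrative names I made up for this example.

    #import <UIKit/UIKit.h>
    #import <AssetsLibrary/AssetsLibrary.h>

    // --- UIImagePickerController path: the file URL can be memory-mapped ---
    - (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary *)info
    {
        // For movies, the picker hands back a real file URL in the app's tmp directory.
        NSURL *url = [info objectForKey:UIImagePickerControllerMediaURL];

        NSError *dataError = nil;
        // NSDataReadingMappedIfSafe maps the file instead of copying it into RAM;
        // pages are only faulted in as the bytes are actually read.
        NSData *theData = [NSData dataWithContentsOfURL:url
                                                options:NSDataReadingMappedIfSafe
                                                  error:&dataError];
        if (theData) {
            [self uploadVideoData:theData];   // hypothetical upload helper
        } else {
            NSLog(@"Could not map media file: %@", dataError);
        }
        [picker dismissViewControllerAnimated:YES completion:nil];
    }

    // --- ALAssets path: the only way to the bytes is a caller-allocated buffer ---
    - (NSData *)dataForAsset:(ALAsset *)asset error:(NSError **)error
    {
        ALAssetRepresentation *rep = [asset defaultRepresentation];
        long long size = [rep size];
        uint8_t *buffer = malloc((size_t)size);          // the whole asset ends up in RAM
        NSUInteger bytesRead = [rep getBytes:buffer fromOffset:0
                                      length:(NSUInteger)size error:error];
        if (bytesRead == 0) {
            free(buffer);
            return nil;
        }
        return [NSData dataWithBytesNoCopy:buffer length:bytesRead freeWhenDone:YES];
    }

The second method works, but that malloc is exactly the 100MB allocation described above, which is why we stuck with the picker's file URL.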