How to implement accessibility for ALAsset photos on iOS
I wrote a custom image picker based on ALAssetsLibrary. Everything works
fine, but under VoiceOver every photo is announced only as "Button", which
is a poor experience.
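As a baseline fix, each photo cell can be made an accessibility element
with a descriptive label, so VoiceOver has something better than "Button"
to say. A minimal sketch, where PhotoCell and applyAccessibility(label:)
are placeholder names for my own collection view cell:

    import UIKit

    final class PhotoCell: UICollectionViewCell {
        // Hypothetical helper; the label text is built elsewhere from the asset.
        func applyAccessibility(label: String) {
            isAccessibilityElement = true
            accessibilityLabel = label       // e.g. "Photo, Landscape, March 5, 2013"
            accessibilityTraits = .image     // announce as an image instead of "Button"
        }
    }

If the cell should still act like a button under VoiceOver, the traits can
be combined as [.image, .button]. The open question is what to put in the
label string.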
So I checked the built-in Photos app on iOS; VoiceOver speaks the
following information for each photo:
Whether it's a photo, a video, a screenshot, etc.
Whether it's portrait or landscape.
Its creation date.
Whether it's sharp or blurry.
Whether it's bright or dark.
I think I can get the first three from these ALAsset properties (sketched
below):
ALAssetPropertyType
ALAssetPropertyOrientation
ALAssetPropertyDate
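Here is a rough sketch of turning those three properties into a spoken
label. The portrait/landscape mapping and the wording are my own guesses,
not the Photos app's exact phrasing (and note that ALAssetsLibrary was
deprecated in iOS 9 in favor of PHAsset, which exposes the same
information):

    import AssetsLibrary
    import UIKit

    func accessibilityLabel(for asset: ALAsset) -> String {
        var parts: [String] = []

        // 1. Media type: ALAssetPropertyType is ALAssetTypePhoto or ALAssetTypeVideo.
        if let type = asset.value(forProperty: ALAssetPropertyType) as? String {
            parts.append(type == ALAssetTypeVideo ? "Video" : "Photo")
        }

        // 2. Portrait vs. landscape: compare the pixel dimensions, swapping
        //    width and height when the orientation rotates the image by 90 degrees.
        if let rep = asset.defaultRepresentation() {
            var size = rep.dimensions()
            if let raw = asset.value(forProperty: ALAssetPropertyOrientation) as? Int,
               let o = ALAssetOrientation(rawValue: raw),
               [ALAssetOrientation.left, .right, .leftMirrored, .rightMirrored].contains(o) {
                size = CGSize(width: size.height, height: size.width)
            }
            parts.append(size.height > size.width ? "Portrait" : "Landscape")
        }

        // 3. Creation date, spoken in a long format.
        if let date = asset.value(forProperty: ALAssetPropertyDate) as? Date {
            let formatter = DateFormatter()
            formatter.dateStyle = .long
            parts.append(formatter.string(from: date))
        }

        return parts.joined(separator: ", ")
    }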
But what about sharpness and brightness? Can I read them from the image
metadata, or do I have to derive them myself?
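For brightness, two avenues look plausible. Some photos carry an EXIF
BrightnessValue (an APEX scene-brightness number recorded at capture time,
so only a proxy for how bright the final image looks, and often absent);
when it's missing, the average luminance could be computed from the
thumbnail, for example with Core Image's CIAreaAverage filter. A sketch,
where the APEX cutoff and the luma threshold are arbitrary illustrations:

    import AssetsLibrary
    import CoreImage
    import ImageIO

    func isBright(_ asset: ALAsset) -> Bool? {
        // Try the EXIF metadata first.
        if let metadata = asset.defaultRepresentation()?.metadata(),
           let exif = metadata[kCGImagePropertyExifDictionary as String] as? [String: Any],
           let apex = exif[kCGImagePropertyExifBrightnessValue as String] as? Double {
            return apex > 0   // arbitrary cutoff for "bright"
        }

        // Fall back to averaging the thumbnail's pixels down to one RGBA value.
        guard let thumb = asset.thumbnail()?.takeUnretainedValue() else { return nil }
        let input = CIImage(cgImage: thumb)
        guard let filter = CIFilter(name: "CIAreaAverage",
                                    parameters: [kCIInputImageKey: input,
                                                 kCIInputExtentKey: CIVector(cgRect: input.extent)]),
              let output = filter.outputImage else { return nil }

        var pixel = [UInt8](repeating: 0, count: 4)   // one RGBA pixel
        CIContext().render(output, toBitmap: &pixel, rowBytes: 4,
                           bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                           format: .RGBA8, colorSpace: nil)

        // Rec. 601 luma; more than 128 of 255 counts as "bright" here.
        let luma = 0.299 * Double(pixel[0]) + 0.587 * Double(pixel[1]) + 0.114 * Double(pixel[2])
        return luma > 128
    }

Sharpness looks harder: nothing in the metadata describes it, so it would
presumably need actual edge analysis (e.g., the variance of a Laplacian
over the thumbnail), which I haven't attempted.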