“Siri, make yourself useful and sort out my holiday snaps, will you?” This could soon be a legitimate command, if Apple makes use of a recent patent application. The patent is for voice tagging, AppleInsider reports, meaning that you could just describe your photos by speaking, and your iPhone would do the rest.
The patent application, filed with the US Patent and Trademark Office, is titled “Voice-Based Image Tagging and Searching”, and would let you describe your shots using “natural language”. Your spoken description becomes what the filing calls a “text string”, which the iPhone would then process to identify terms associated with people, an activity, or a location. And hey presto, the photo is tagged, so you can sort through your snaps quickly and easily.
At the moment, the iPhone can sort your snaps by time taken and location, but this would simplify the entire process. The use of natural language sounds very similar to Siri, Apple’s robot butler, which you can chat to in a conversational way.
One example given would have you say: “This is me at the beach”, and the picture would be tagged accordingly. Then if you search for photos of yourself, photos of the beach, or snaps taken that day, it’ll show up in the results.
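To give a rough idea of what that tagging step might look like under the hood, here’s a minimal Swift sketch. To be clear, this isn’t Apple’s implementation: the Photo struct, the word lists and the keyword matching are all invented for illustration, and the real system described in the patent would lean on proper natural-language processing rather than a simple lookup.

```swift
import Foundation

// Purely illustrative: a toy tagger that pulls people, place and activity
// terms out of a spoken description. The Photo struct and the word lists
// below are made up for this example; they are not from the patent.
struct Photo {
    let filename: String
    var tags: Set<String> = []
}

func tags(from spokenDescription: String) -> Set<String> {
    // Hypothetical vocabularies standing in for real entity recognition.
    let people: Set<String> = ["me", "mum", "dad"]
    let places: Set<String> = ["beach", "park", "london"]
    let activities: Set<String> = ["swimming", "hiking", "birthday"]

    // Split the phrase into lowercase words and keep any that match.
    let words = spokenDescription
        .lowercased()
        .components(separatedBy: CharacterSet.alphanumerics.inverted)

    return Set(words.filter {
        people.contains($0) || places.contains($0) || activities.contains($0)
    })
}

var snap = Photo(filename: "IMG_0042.JPG")
snap.tags = tags(from: "This is me at the beach")
print(snap.tags)   // e.g. ["beach", "me"]
```

Searching would then just be a matter of pulling the same sort of terms out of your spoken query and matching them against the tags stored on each photo.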
You might not even have to tag each picture yourself: the system could work out who features in each shot, plus when and where it was taken, and do the hard work for you. Which would be a huge time saver.
Apple revamped its Photos app in iOS 7. The new app displays your photos as tiny thumbnails, sorted by year. Then you just zoom in to find them based on when and where they were taken.
Voice tagging would also be a decent use for Siri, who has been at a bit of a loose end of late. No one I know with an iPhone uses Siri to search for things, but then very few of my friends on Android opt for Google’s voice search as their first port of call, either.
What do you think of the patent? Is speaking to your gadgets the future, or will it never take off? Let me know in the comments, or speak up on our Facebook page.