As mobile devices become more complex, the trend is for applications to provide more data automatically. Google is already moving in this direction, but will Apple follow suit?
iOS6, the latest version of Apple's mobile software and currently at beta 3, has a number of parallels with the recently launched Android Jelly Bean: it initially seems like a minor revision, but it contains a number of tweaks and small features that you would miss if you reverted to the previous version of the OS.
With both iOS and Android, however, the latest updates are creating a deeper symbiotic relationship between our devices and the cloud. Jelly Bean's killer feature is Google Now, the system designed to provide data cards automatically based on what we are doing, while iOS6 continues the development of Siri - especially now that local support is available outside the US - and its interaction with the new Maps application.
On the face of it, iOS6 doesn't look like it has enough new features to be a meaningful upgrade over iOS5, but we have reached the point where mobile operating systems are sufficiently advanced that they need only minor adjustments. Fanboys would probably disagree, but the reality is that the OS alone cannot progress much further without the cloud.
Robert Scoble and Shel Israel are collaborating on a book about "the coming automatic, freaky, contextual world" where being always on, always connected and monitored by a multitude of sensors means we can have more data than ever pushed to us based on our location or activities: the context. Context is something I have been talking about quite a bit recently in a purely social sense, but the marriage of context and modern hardware is a powerful mix, and one that will take time for many to feel comfortable with because of fears over privacy invasion.
Google is ahead of Apple in this regard: Google Now pushes information to you, whereas Siri pulls it based on your requests. I think it is only a matter of time before Apple goes a similar route and starts to use context to supply data automatically.
Ditching Google Maps is about much more than no longer relying on a competitor for data; it is about Apple taking back control of part of its ecosystem so that it can be used for myriad purposes. Maps is just the beginning: the partnerships with other companies (to provide subject-specific information) enable Apple to offer far greater utility, and this will only continue as more data is hooked in to the central engine. The lines between Siri, Maps and data from the likes of Yelp will blur, with multiple sources being amalgamated to present a more coherent, useful picture.
Those now in partnership with Apple may need to heed the Google Maps warning: as Apple builds services around the new mapping service, there is a risk that they too will reach a point where they have served their purpose and are discarded as Apple moves to provide its own in-house solutions. Perhaps we will see Apple acquire some of these partners to lock the data in to the iOS environment and reduce Google's advantage.
Apple will no doubt be accused of copying Google Now, but I feel that the data engine behind Maps and Siri will need to become more proactive in supplying us with information rather than waiting for us to request it - a logical progression considering the way the market is moving. As I have said previously, the data engine should also be integrated with Spotlight search so that we can perform text searches on the same information when we are unable, or unwilling, to use Siri's voice capabilities.
The only variable is time but, now that Google is well on the road, is it time that Apple can afford?
Image by Extra Medium