iOS 8 wants: A smarter, contextually aware Spotlight search

Rene Ritchie

Imagine if, in iOS 8 or some future version of Apple's mobile operating system, Spotlight became a secondary, text-based point of access to Siri, able to parse the same natural language queries and commands, retrieve the same kinds of responses, and perform the same kinds of actions. Back in June of 2012 I hoped for a Spotlight that could access Siri, and I'm still hoping for it today. Why?

Siri, voice free


Apple has made great progress expanding Siri since its debut in iOS 5, including adding new data sources for movies, sports, and more, and they've also expanded Siri's accessibility: Eyes Free arrived in iOS 6 and CarPlay in iOS 7. However, there's no equivalent for "voice free".

No matter how useful and enabling Siri is, there will be times when it's simply not possible or socially acceptable to talk out loud to our phones. In those situations, being able to type "Cupertino weather?" or even "Text Georgia I'll be late" would be incredibly useful.

Sure, you can pull down Spotlight from any Home screen, search for the Weather or Messages app, enter the right city or add the right contact, type in your message, and so on, but Siri has shown that natural language can be faster using voice, and apps like Fantastical have shown it can be faster using text input as well.

What makes this feature so tantalizing is that Siri can already handle text input. Once you've made a voice query or command, an "edit" button appears. Type in anything you like, tap "Done" and Siri will process it. However, you have to start it with voice, which limits the utility.

Imagine instead if you could simply pull down Spotlight from the Home screen, enter your text, and access Siri directly. As I said last time, multiple points of entry into the Siri system don't increase complexity, they increase accessibility.

Quick access to quick actions


Siri can not only answer questions but also execute commands. If Spotlight could parse text and hook into the system the way Siri does, it could be used to execute simple commands as well.

There's lots of precedent for this. Quicksilver, LaunchBar, and Alfred are all text-based quick-action launchers that have existed on the Mac for years. Just Type from webOS and Type and Go from BlackBerry have tried the same on mobile. Launch Center Pro has even made an icon-driven version work as far as current iOS limitations will allow.

With Spotlight hooked up to Siri's action engine, "Text Georgia I'm running late" is just one example of the kind of text-based quick action that could be possible. "Tweet Guy Wow, Arrow was bananas!" could instantly send your status. "Meeting with Ally at 6pm tomorrow" could add an event to your calendar.
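
To make that idea concrete, here's a minimal sketch of what a text-based quick-action parser could look like, written in Swift. To be clear, this is hypothetical: iOS exposes no such hook into Siri, and the SpotlightAction type and parse function below are invented purely for illustration.

    import Foundation

    // Hypothetical quick actions a Siri-backed Spotlight might support.
    enum SpotlightAction {
        case sendText(recipient: String, body: String)
        case tweet(status: String)
        case webSearch(query: String) // fallback: today's Spotlight behavior
    }

    // A naive keyword-prefix parser, just to show the shape of the idea.
    // Siri's real natural-language handling is far more sophisticated than
    // splitting on the first word.
    func parse(_ input: String) -> SpotlightAction {
        let words = input.split(separator: " ").map(String.init)
        guard let first = words.first?.lowercased() else {
            return .webSearch(query: input)
        }
        switch first {
        case "text" where words.count >= 3:
            return .sendText(recipient: words[1],
                             body: words.dropFirst(2).joined(separator: " "))
        case "tweet" where words.count >= 2:
            return .tweet(status: words.dropFirst().joined(separator: " "))
        default:
            return .webSearch(query: input)
        }
    }

    // parse("Text Georgia I'm running late")
    // -> .sendText(recipient: "Georgia", body: "I'm running late")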

If Apple gets around to adding a DocumentPicker to iOS, Spotlight could even see into any and all of the files you're storing locally and on iCloud. Spotlight on the Mac can already be used to search for OS X files; "Team roster" or "WWDC keynote" would be just as useful on iOS.

I firmly believe iOS should stay as simple as possible for as many users as possible. When power can be added below the surface, however, when functionality can be enabled only for those who want and need it, then it's to everyone's advantage. Just like Notification Center, Control Center, and fast app switching stay completely out of the way and all but invisible unless expressly called upon, so too could Spotlight quick actions.

A smarter Spotlight


There's a lot I'd like to see from Spotlight and Apple's text-based search in general, including and especially how it works on the stores: nearest-neighbor matching and automatic search widening, so that spelling mistakes become irrelevant and queries too narrow to return exact results can return close-enough results anyway.
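
As a rough illustration of what nearest-neighbor matching and search widening could mean in practice, here's a toy fuzzy matcher built on classic Levenshtein edit distance; the tolerance threshold and ranking are my own invention for this sketch, not anything Apple has described.

    import Foundation

    // Levenshtein edit distance: the number of single-character insertions,
    // deletions, and substitutions needed to turn one string into another.
    func editDistance(_ lhs: String, _ rhs: String) -> Int {
        let a = Array(lhs.lowercased()), b = Array(rhs.lowercased())
        guard !a.isEmpty else { return b.count }
        guard !b.isEmpty else { return a.count }
        var row = Array(0...b.count)
        for i in 1...a.count {
            var diagonal = row[0]
            row[0] = i
            for j in 1...b.count {
                let above = row[j]
                row[j] = min(above + 1,      // deletion
                             row[j - 1] + 1, // insertion
                             diagonal + (a[i - 1] == b[j - 1] ? 0 : 1)) // substitution
                diagonal = above
            }
        }
        return row[b.count]
    }

    // Widen a query: keep anything within a small edit distance and rank by
    // closeness, so "Cupertno" still finds "Cupertino".
    func widen(_ query: String, among candidates: [String], tolerance: Int = 2) -> [String] {
        candidates
            .map { ($0, editDistance(query, $0)) }
            .filter { $0.1 <= tolerance }
            .sorted { $0.1 < $1.1 }
            .map { $0.0 }
    }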

Server-side delays, the kind that have sometimes been problematic for the current Siri implementation, could also be mitigated. Since the text has already been typed, there's no need to run speech-to-text, and any local actions could be processed locally. Any queries that require a trip to the internet would suffer little more than the same limitations Spotlight's web and Wikipedia searches already have today.
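
The routing being described could be as simple as the hypothetical dispatch below, reusing the invented SpotlightAction type from the earlier sketch: only queries that genuinely need remote data pay the network cost.

    // Stubs stand in for the real handlers.
    func performLocally(_ action: SpotlightAction) { /* on-device, e.g. Messages */ }
    func sendToServer(_ action: SpotlightAction) { /* network round trip */ }

    func route(_ action: SpotlightAction) {
        switch action {
        case .sendText:
            performLocally(action) // no speech-to-text, no server trip needed
        case .tweet, .webSearch:
            sendToServer(action)   // same network cost web searches have today
        }
    }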

Put all that together and Spotlight, like Siri, becomes a parallel, powerful, useful way to interact with the iPhone and iPad.

It makes Spotlight smart, and who doesn't want that?
