With iOS 9, you can use Spotlight to search for content from the web, your contacts, apps, nearby places, and more. Spotlight can also index and search the content of your own app. In this post, we are going to see how to add Spotlight search to your app.

We have already integrated Spotlight search into our clients' applications as well as our in-house applications. As an example, here is our in-house application "Driving Licence Test", in which we have integrated Spotlight search.

(Screenshots: Spotlight search results opening content from the Driving Licence Test app.)

It looks like an interesting and helpful feature for the app, doesn't it? Below are the steps to implement Spotlight search in your app.

 

Add Spotlight search to your app

    1. Open AppDelegate.swift and import CoreSpotlight (and MobileCoreServices, which provides the kUTTypeText constant used below).
    2. Add the function below.
func setUpSpotlight()
{
    // Load the image file names, display names and descriptions from the bundled plists.
    let imgPath = NSBundle.mainBundle().pathForResource("images", ofType: "plist")!
    let imgArray = NSMutableArray(contentsOfFile: imgPath)!
    let namePath = NSBundle.mainBundle().pathForResource("images_name", ofType: "plist")!
    let imgNameArray = NSMutableArray(contentsOfFile: namePath)!
    let infoPath = NSBundle.mainBundle().pathForResource("Images_info", ofType: "plist")!
    let imgInfoArray = NSMutableArray(contentsOfFile: infoPath)!

    for i in 0..<imgNameArray.count
    {
        // Describe one searchable item: title, description, thumbnail and keywords.
        let attributeSet = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
        attributeSet.title = imgNameArray[i] as? String
        attributeSet.contentDescription = imgInfoArray[i] as? String
        if let image = UIImage(named: imgArray[i] as! String)
        {
            attributeSet.thumbnailData = UIImageJPEGRepresentation(image, 0.4)
        }
        else
        {
            // Fall back to a default thumbnail when the image is missing.
            attributeSet.thumbnailData = UIImageJPEGRepresentation(UIImage(named: "ic_traffic_rule_e")!, 0.4)
        }
        attributeSet.keywords = imgNameArray as NSArray as? [String]

        // Index the item; the unique identifier is used later to restore the tapped item.
        let item = CSSearchableItem(uniqueIdentifier: String(i), domainIdentifier: "com.TrafficRules", attributeSet: attributeSet)
        CSSearchableIndex.defaultSearchableIndex().indexSearchableItems([item]) { (error: NSError?) -> Void in
            if let error = error {
                print("Indexing error: \(error.localizedDescription)")
            } else {
                print("Search item successfully indexed!")
            }
        }
    }
}

In imgArray, imgNameArray, and imgInfoArray, store the data you want to show. You can also enable extra features, such as phone calling, through the attribute set:

attributeSet.supportsPhoneCall = true
attributeSet.phoneNumbers = [phone_no]
attributeSet.emailAddresses = [email_id]
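
For instance, a searchable item that represents a contact could be set up roughly like this (a minimal sketch; the content type, name, number and address are placeholder values, not data from the app above):

let contactAttributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeContact as String)
contactAttributes.title = "John Appleseed"                 // placeholder name
contactAttributes.supportsPhoneCall = true                 // lets Spotlight offer a call action
contactAttributes.phoneNumbers = ["+1 555 0100"]           // placeholder number
contactAttributes.emailAddresses = ["john@example.com"]    // placeholder address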

    3. Call this function in the didFinishLaunchingWithOptions method of AppDelegate.swift.

func application(application: UIApplication, didFinishLaunchingWithOptions launchOptions: [NSObject: AnyObject]?) -> Bool {
    // Index the app's content so it shows up in Spotlight.
    setUpSpotlight()
    return true
}
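
Indexing is not supported on every device. If you want to be defensive about it, you could guard the call with a quick availability check; a minimal sketch:

if CSSearchableIndex.isIndexingAvailable() {
    setUpSpotlight()
}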

    4. Add the method below to AppDelegate.swift.

func application(application: UIApplication, continueUserActivity userActivity: NSUserActivity, restorationHandler: ([AnyObject]?) -> Void) -> Bool {
    var wasHandled = false

    // This activity type means the user opened one of our Core Spotlight items.
    if userActivity.activityType == CSSearchableItemActionType
    {
        // The unique identifier we assigned while indexing (String(i) above).
        let activityIdentifier = userActivity.userInfo![CSSearchableItemActivityIdentifier]
        NSNotificationCenter.defaultCenter().postNotificationName("second", object: nil, userInfo: ["index" : activityIdentifier!])
        wasHandled = true
    }
    return wasHandled
}

The method above is called when the user searches and taps one of your items in the results. You can handle this however you like; here we post a notification, add the matching observer in didFinishLaunchingWithOptions, and navigate the user accordingly.

// In didFinishLaunchingWithOptions:
NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(AppDelegate.receiveTestNotification(_:)), name: "second", object: nil)

func receiveTestNotification(notification: NSNotification) {
    if notification.name == "second" {
        // The searchable item's unique identifier that we posted from continueUserActivity.
        let userInfo = notification.userInfo as! Dictionary<String, String>
        let indexString = userInfo["index"]
        let storyBoard: UIStoryboard = UIStoryboard(name: "Main", bundle: nil)
        let controller: CustomNavigation = storyBoard.instantiateViewControllerWithIdentifier("CustomNavigation") as! CustomNavigation
        let mainviewcontroller: HomeVC = storyBoard.instantiateViewControllerWithIdentifier("HomeVC") as! HomeVC
        let viewcontroller: LearnVC = storyBoard.instantiateViewControllerWithIdentifier("LearnVC") as! LearnVC
        // Open the screen that matches the tapped search result.
        viewcontroller.imageIndex = Int(indexString!)!
        controller.viewControllers = [mainviewcontroller, viewcontroller]
        self.window?.rootViewController = controller
    }
}

    5. Now run the application so the content gets indexed, then go to the home screen and swipe right to open Search. Enjoy searching. 🙂

 

Want to work with us? We're hiring!