In this segment, continuing our discussion of iOS external accessories and multitasking, we are going to look at ways an application can be launched without any action on the part of the user.
Case Study:
In-car navigation and management systems, so-called infotainment systems, are radio head units that, depending on the OEM, may offer hands-free calling, messaging, navigation, and control over the vehicle's environmental systems. Examples include SYNC from Microsoft and Ford, MyLink from GM, and Entune from Toyota.
All of the above systems, and many others, also provide some level of integration between the radio head unit and applications on the driver's mobile device using simple steering wheel or voice controls. Pandora Radio is a widely included application, but the list is growing.
These infotainment systems come with special challenges: many state legislatures and the NTSB are developing strict driver-distraction regulations, mandating (and in some states enforcing) the use of hands-free devices for making calls and limiting certain controls to when the vehicle is at rest or at a safe speed. No one wants to read an accident report stating that the driver hit a pedestrian while their attention was focused on searching for "Lost Control" by Unwritten Law.
These systems work with a handful of smartphones, e.g. BlackBerry, Android, and iPhone. Android is by far the easiest to interface with: Android devices can connect to these systems via Bluetooth socket connections, and since Android allows true multitasking, multiple socket connections can be open at once, so it is fairly simple for the infotainment system to provide a mechanism to swap from application to application.
Apple presents a more difficult challenge. Apple didn't add Bluetooth support to MFi certification until after many of the current crop of devices were on the market, which means that for these devices the only connection available is the iPod Accessory Protocol (iAP) over the Dock Connector, a serial connection.
iAP is a very simple control protocol that runs over a standard RS-232 serial link. With the approved Apple APIs, the only way to do serial communication between an iPhone and another device is through an MFi-approved accessory. There are a few backdoor approaches to serial communication, such as copying portions of the IOKit API from Mac OS or jailbreaking the device and installing a serial port handler, but for an application the developer hopes to get through the Apple review process, the only approved way is to use the Apple Dock Connector and belong to the MFi program.
All MFi accessories include a decoder chip, and the Dock Connector side has one as well. When the iDevice is connected to the accessory, the two go through something resembling an SSL handshake; after successful authentication, the serial port is available for standard communication. The iAP framework handles all the low-level conversions, and the user's application uses the standard iOS stream protocol for all traffic.
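On the app side, that stream traffic is exposed through the ExternalAccessory framework. A minimal sketch, assuming a hypothetical protocol string "com.example.infotainment" declared under UISupportedExternalAccessoryProtocols in the app's Info.plist and a class that acts as the NSStreamDelegate, might look like this:

#import <ExternalAccessory/ExternalAccessory.h>

// Sketch: find a connected accessory that speaks our (assumed) protocol string
// and open an EASession to it. Once the session is open, all traffic is
// ordinary NSStream I/O handled by this class as the stream delegate.
- (EASession *)openSessionToAccessory
{
    NSString *protocol = @"com.example.infotainment";   // assumed protocol string
    NSArray *accessories = [[EAAccessoryManager sharedAccessoryManager] connectedAccessories];
    for (EAAccessory *accessory in accessories) {
        if (![accessory.protocolStrings containsObject:protocol]) continue;

        EASession *session = [[EASession alloc] initWithAccessory:accessory
                                                      forProtocol:protocol];
        [session.inputStream setDelegate:self];
        [session.inputStream scheduleInRunLoop:[NSRunLoop currentRunLoop]
                                       forMode:NSDefaultRunLoopMode];
        [session.inputStream open];

        [session.outputStream setDelegate:self];
        [session.outputStream scheduleInRunLoop:[NSRunLoop currentRunLoop]
                                        forMode:NSDefaultRunLoopMode];
        [session.outputStream open];
        return session;
    }
    return nil;   // no accessory offering our protocol is connected
}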
The whole process is fairly fluid, but there is a major flaw: only one accessory can be connected at a time, and because of the lack of multitasking, only one app can be connected to the accessory.
With the latest iOS, Apple has introduced AirPlay and Bluetooth 4.0, a recent standard for the new low-powered Bluetooth devices now coming on the market. It is clear that Apple plans for AirPlay to be the new direction for all future accessories.
Apple announces new MFi components.
This is an exciting new direction, but software is a lot easier to change than hardware; new chips mean new engineering, and with automotive product cycles that can mean months or years before the public sees it. Meanwhile, what about current hardware?
With iOS 5, Apple has tried to address this issue by providing a new notification to the accessory. A current-generation accessory can now send an iAP command to the device to prompt the user to launch a particular app: "Health Monitor needs your attention," Allow or Dismiss. For the heart monitor example, the user simply taps Allow, which launches the application and brings it to the foreground. This is a great improvement; it means the user doesn't have to search for the app icon on the phone or remember which app was used with the accessory. But it still presents problems for a driver, who must take their eyes off the road to tap the Allow button. Better, but not useful just yet.
I have observed a regulatory test of an infotainment system. The test consisted of the smartphone being paired with the system and then placed in the car's glove compartment. From that point on, the only way to interact with the apps on the phone was via the hands-free microphone or the steering column buttons.
Apple also included new commands to send metadata, e.g. artist and title, to existing MFi accessories. Nice touches, but for existing accessories it still means the user has to select a favorite app, say Pandora Radio, and stick with it until they can safely (or unsafely) switch to another application.
But a method has existed since iOS 2 for launching applications without using the application springboard: the custom URL scheme. Apple uses custom URL schemes for many standard applications. If you open Safari and type mailto:someone@example.com, Safari will close (or, nowadays, go into the background) and the Mail application will open. You can use the URL schemes of native applications, including the Phone application and FaceTime, or define schemes for your own applications.
To communicate with an app using a custom URL, create an NSURL object with properly formatted content and pass that object to the openURL: method of the shared UIApplication object. The openURL: method launches the app that registered to receive URLs of that type and passes it the URL. At that point, control passes to the new app.
NSURL *myURL = [NSURL URLWithString:@"Pandora://play?Sweet+Dreams&artist=sucker+punch"];
[[UIApplication sharedApplication] openURL:myURL];
The above code snippet assumes the Pandora app developer has implemented a URL handler that would process the search query specified in the URL.
To create an application with a custom URL scheme
1. Right-click the Information Property List key and select Add Row. Select "URL types" from the list.
(Figure: Adding a URL type.)
2. Expand Item 1, right-click URL identifier, and again select Add Row. Select "URL Schemes" from the list.
3. Select Item 1 and set the value to yourapp.
An app that has its own custom URL scheme must be able to handle URLs passed to it. All URLs are passed to your app delegate, either at launch time or while your app is running or in the background. To handle incoming URLs, your delegate should implement the following methods:
If your app is not running when a URL request arrives, it is launched and moved to the foreground so that it can open the URL.
If your app is running but is in the background or suspended when a URL request arrives, it is moved to the foreground to open the URL.
The system calls the delegate's application:openURL:sourceApplication:annotation: method to check the URL and open it. If your delegate does not implement this method (or the current system version is iOS 4.1 or earlier), the system calls your delegate's application:handleOpenURL: method instead.
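As a rough sketch, assuming the "yourapp" scheme registered in the steps above and a hypothetical "play" action (startPlaybackForQuery: is a made-up helper standing in for whatever the receiving app actually does), the delegate method might look like this:

// Sketch of the receiving app's side: accept only our own scheme and
// dispatch on a hypothetical "play" action in the URL's host component.
- (BOOL)application:(UIApplication *)application
            openURL:(NSURL *)url
  sourceApplication:(NSString *)sourceApplication
         annotation:(id)annotation
{
    if (![[url.scheme lowercaseString] isEqualToString:@"yourapp"]) {
        return NO;                       // not a URL this app understands
    }
    if ([[url.host lowercaseString] isEqualToString:@"play"]) {
        // For yourapp://play?track=Sweet+Dreams, url.query is "track=Sweet+Dreams".
        [self startPlaybackForQuery:url.query];   // hypothetical helper
        return YES;
    }
    return NO;
}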
Using the information above, it is a fairly trivial task to construct a simple launcher with a button representing each application, where each button's IBAction contains the openURL: call for that application.
- (IBAction)openSms {
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:@"sms://466453"]];
}
It's also not very difficult to see how you could code a simple voice command application using Siri or Nuance.
- (void)recognizer:(SKRecognizer *)recognizer didFinishWithResults:(SKRecognition *)results
{
    if ([results.results count] > 0) {
        NSString *wellFormedRequest = [self getUrlRequest:[results firstResult]];
        if (wellFormedRequest != nil) {
            NSURL *url = [NSURL URLWithString:wellFormedRequest];
            [[UIApplication sharedApplication] openURL:url];
        } else {
            [self speakText:MESSAGE_I_DID_NOT_UNDERSTAND_YOU];
        }
    } else {
        [self speakText:MESSAGE_NO_RESULTS_RETURNED];
    }
}
In the code snippet above, for a typical speech recognizer, the didFinishWithResults: method passes the result of the speech transcription to getUrlRequest:, which returns either a well-formed URL request in the form appName://action?parameters, or nil if the request couldn't be mapped to an application. The URL request is then passed to openURL:, which opens the target application and puts the launcher into the background.
If getUrlRequest: returns nil, or the results count is zero, an appropriate message is passed to the text-to-speech engine.
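The getUrlRequest: helper is where the mapping lives, and it will vary with the vocabulary you want to support. A minimal sketch, assuming a hardcoded keyword-to-scheme table (the scheme strings here are illustrative placeholders, not any app's documented scheme), might look like this:

// Hypothetical getUrlRequest: - maps a transcription such as
// "play sweet dreams" to "pandora://play?sweet+dreams", or returns nil
// when no known application matches the spoken keyword.
- (NSString *)getUrlRequest:(NSString *)transcription
{
    NSString *spoken = [transcription lowercaseString];
    NSDictionary *schemes = [NSDictionary dictionaryWithObjectsAndKeys:
                             @"pandora://play?", @"play",
                             @"sms://",          @"text",
                             @"stitcher://",     @"podcast",
                             nil];
    for (NSString *keyword in schemes) {
        if ([spoken hasPrefix:keyword]) {
            // Everything after the keyword becomes the URL's parameter string.
            NSString *rest = [[spoken substringFromIndex:[keyword length]]
                              stringByTrimmingCharactersInSet:
                                  [NSCharacterSet whitespaceCharacterSet]];
            NSString *escaped = [rest stringByReplacingOccurrencesOfString:@" "
                                                                withString:@"+"];
            return [[schemes objectForKey:keyword] stringByAppendingString:escaped];
        }
    }
    return nil;   // request couldn't be mapped to an application
}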
So now we have a fairly decent little voice-controlled launcher that can be used with any hands-free device, not just MFi external accessories. It's easy to imagine one of those small Bluetooth headsets, e.g. a Jabra, used as a voice interface to your applications. This is quite possible using the new wideband 16 kHz SCO support in iOS 5.
With a voice command we can now start an SMS message, listen to Pandora Radio, or get the latest podcast from Stitcher. Our launcher has all the compatible phone apps hardcoded into a list of choices.
But what if the application isn't available on the device? It would be nice to find all the applications that are present on the phone and compatible with the accessory.
At first thought you might consider having each application, as it is started and backgrounded, write its ID to iCloud. After all, isn't that what the iOS 5 key-value storage feature is for: storing small bits of data, usually for restoring the state of the app, such as the page the reader is on in an ebook reader application?
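As a sketch of that idea, each cooperating app would record its URL scheme when it heads to the background; the key name "RegisteredSchemes" and the scheme "yourapp" are made up for illustration.

// Hypothetical: add this app's URL scheme to a shared iCloud key-value list
// when it is backgrounded, so a launcher could read the list back later.
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    NSUbiquitousKeyValueStore *store = [NSUbiquitousKeyValueStore defaultStore];
    NSArray *existing = [store arrayForKey:@"RegisteredSchemes"];
    NSMutableArray *schemes = [NSMutableArray array];
    if (existing != nil) {
        [schemes addObjectsFromArray:existing];
    }
    if (![schemes containsObject:@"yourapp"]) {
        [schemes addObject:@"yourapp"];
        [store setArray:schemes forKey:@"RegisteredSchemes"];
        [store synchronize];
    }
}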
But iCloud requires an internet connection, and it would be ideal if the launcher had as few dependencies as possible.
The next consideration might be that when an application starts it could register with the accessory, and the accessory could keep a list of active applications. But this would require special coding on the accessory. Accessory code is usually embedded in firmware that must be flashed to add new features; for some accessories this may only require an over-the-air update, but for others, as previously mentioned, this could be a complex and time-expensive task. And we wanted the launcher to work with non-MFi devices like the Jabra headsets mentioned above.
A somewhat simpler approach is to use a little bit of standard C code to enumerate all the running processes on the device, much like the list pictured in AirTracer's Processes view. (A shameless plug: AirTracer can list all the processes on an iDevice along with their PIDs and other information, and AirTracer Pro can terminate processes as well.)
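A minimal sketch of that enumeration, using the BSD sysctl() interface, might look like the following; it returns the short process names, which are what we can match against our hardcoded list.

#import <Foundation/Foundation.h>
#import <sys/sysctl.h>
#import <stdlib.h>

// Sketch: ask the kernel for the full process table and return the short
// process names (kp_proc.p_comm) as NSStrings.
static NSArray *runningProcessNames(void)
{
    int mib[4] = { CTL_KERN, KERN_PROC, KERN_PROC_ALL, 0 };
    size_t length = 0;

    // First call: ask how much buffer space the process table needs.
    if (sysctl(mib, 4, NULL, &length, NULL, 0) != 0) return nil;

    struct kinfo_proc *procs = malloc(length);
    if (procs == NULL) return nil;

    // Second call: fill the buffer with one kinfo_proc entry per process.
    if (sysctl(mib, 4, procs, &length, NULL, 0) != 0) {
        free(procs);
        return nil;
    }

    NSMutableArray *names = [NSMutableArray array];
    size_t count = length / sizeof(struct kinfo_proc);
    for (size_t i = 0; i < count; i++) {
        [names addObject:[NSString stringWithUTF8String:procs[i].kp_proc.p_comm]];
    }
    free(procs);
    return names;
}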
Once we have the list of processes, we can build the list of voice commands by comparing the process list against the hardcoded list of compatible applications, as sketched below.
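The comparison itself is a simple intersection; the app names here are illustrative and assume the runningProcessNames() helper sketched above.

// Illustrative only: keep the running processes that appear in our hardcoded
// list of apps we know how to launch by URL scheme.
NSArray *compatibleApps = [NSArray arrayWithObjects:@"Pandora", @"Spotify", @"Stitcher", nil];
NSMutableArray *voiceCommands = [NSMutableArray array];
for (NSString *name in runningProcessNames()) {
    if ([compatibleApps containsObject:name]) {
        [voiceCommands addObject:name];   // becomes a recognized voice target
    }
}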
A fragment of a typical Siri voice tree using the above technique might go as follows.
User: Siri, please play "Sweet Dreams" by Eurythmics.
Siri: OK, I found two applications on your device that can play the original mix of "Sweet Dreams": Spotify or Tunes. Which app would you like to use?
If the user replies "Tunes," Siri will launch Tunes and play the requested single; if the user fails to respond, either Spotify or Tunes may be launched depending on which application the user has selected as the default.