We are no longer adding new content to this blog.
Please visit our two new blogs,
and the blog for our new crowdsourcing effort:
Posted at 11:06 AM | Permalink | Comments (0) | TrackBack (0)
Clearly, having a central Controller or Navigator application is an efficient approach. If it uses a GIP protocol it can act as the single central connection to the hardware accessory; even if it doesn't, it can provide the central menu structure for starting applications.
Also, as previously mentioned, this Control Center, which presumably would be provided by the accessory vendor, could also have the ability to intermix its own alternative content, e.g. web content, into the streams flowing from the accessories to the third party applications.
Thus channel guides or service information could be provided once and would not have to be duplicated among partner apps.
But, also as mentioned, someone, presumably the OEM that created the accessory, must take ownership of building and distributing the Control Center application.
One more scenario is worth looking at: side loading of the applications.
As an example of what we are referring to as "side loading": if you have two apps A and B, both able to connect to the accessory, A knows about B and B knows about A, and either can start the other.
We can presume that the GIP protocol is already embedded in a static library distributed by the MFI OEM; perhaps at least part of the menu system we discussed could also be embedded. A means to determine which other apps may be loaded, either by iterating through the installed URL schemes or through the process list, can be provided, and would create a menu that can be added alongside the third party app menu on the hardware device, as sketched below.
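For illustration, here is a minimal sketch of the URL-scheme approach, assuming a hypothetical hardcoded list of partner app schemes; canOpenURL: returns YES only when some installed app has registered the scheme, so the hits become the menu entries:

// Hypothetical partner schemes; a real list would ship in the MFI OEM's static library.
NSArray *knownSchemes = [NSArray arrayWithObjects:@"radioplay", @"infoplay", @"healthshare", nil];
NSMutableArray *menuEntries = [NSMutableArray array];
for (NSString *scheme in knownSchemes) {
    NSURL *probe = [NSURL URLWithString:[NSString stringWithFormat:@"%@://", scheme]];
    if ([[UIApplication sharedApplication] canOpenURL:probe]) {
        [menuEntries addObject:scheme]; // this app is installed; add it to the accessory menu
    }
}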
Thus, when the user exits app "A", it doesn't actually close any connections; it simply shows the "application menu" from which the user can select application "B" to start. Once application "B" starts, it takes over the foreground and opens an accessory session, causing app "A" to lose its connection.
Posted at 04:20 PM in InterApp-GIP | Permalink | Comments (0) | TrackBack (0)
In this continuation of the last post, "playing with app states", we are going to look at two hypothetical case studies.
Case Study I: Hardware jukebox
Whether the jukebox is meant for a motor vehicle or the dressing table is irrelevant. The design is a simple display of some kind and a voice interface. All command and control is via the accessory; once the iPhone or iPod touch is placed in the dock it can be ignored.
We will assume we have two applications: RadioPlay, which we mentioned before, a simple radio streamer, and InfoPlay, which provides news, weather and alerts. How the apps are voice enabled is again irrelevant; we will just assume they are using one of the several API libraries available.
The user puts the iPhone into the dock connector, and the default application launches. This might happen automatically or require tapping an "open application" popup, depending on the implementation level of the accessory. In either case the Jukebox application is now foreground, and RadioPlay and InfoPlay are available on the menu. Perhaps these applications are part of a suite of apps that can be purchased and downloaded. Newly downloaded applications would get added to the menu on the next app start.
The Jukebox is the only application that has a connection to the accessory. The Jukebox was developed by the company that developed the certified MFI device. The developers of RadioPlay and InfoPlay are irrelevant.
The user issues a voice command to start RadioPlay.
Case Study II: Return to the pulse monitor
This is the pulse monitor wrist band we discussed earlier. It is an ANT device controlled via a Bluetooth headset.
In this implementation only a single application, HealthMonitor, directly receives updates from the accessory. Since it is attached to the accessory it can run in the background.
HealthShare is a third party application; it provides a snapshot of pulse activity over some time slice as a histogram which can be posted to the user's Facebook wall. HealthShare does not need to be aware of the physical connection to the accessory, or even the fact that one exists; the implementation is totally hidden. It just needs one piece of data, the pulse update, which it gets via a GIP connection to HealthMonitor.
Analyzing both use cases above, we can draw some conclusions.
Posted at 10:37 AM in InterApp-GIP | Permalink | Comments (0) | TrackBack (0)
Thinking about the different app states in relation to background applications and interapplication messaging.
Before iOS 4 this was all irrelevant; there was no way to keep an application running in the background. But now, if you have an application that uses background audio or external accessories, an interesting idea comes to mind.
Say you have a hardware jukebox or a radio head unit that lets you stream audio via A2DP through its speakers, with command and control via IAP.
You want to be able to select applications from the jukebox and switch between them. A jukebox that required you to physically launch an application on the phone to connect wouldn't be much fun.
So how might this work?
Let's say you have three applications, all compatible with the jukebox.
When you plug the 30-pin cable into the hardware it launches the control center application, or maybe you have to manually launch the app. Now this control center application has command and control, and its application selection menu is on the hardware display.
You select one of the three apps; let's call it RadioPlay.
The control center launches RadioPlay with an openURL call.
The control center is now in the background.
RadioPlay opens a session to the accessory, the control center closes its session, and RadioPlay's menu is now on the hardware display.
RadioPlay issues an openURL request bringing the control center to the foreground, while still maintaining the accessory connection.
The control center is foreground; RadioPlay is background and still has command and control.
The user issues an exit command to RadioPlay, whose menu is on the hardware screen. RadioPlay sends a GIP (see our proposed General Interapp Protocol) message to the control center requesting that the control center open an accessory session.
Because only one accessory session can exist, the session associated with RadioPlay is closed. The control center now has command and control, and RadioPlay remains in the background until suspended.
The user can now select any other application from the menu.
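As a rough sketch of what "open an accessory session" means in code (the protocol string here is a hypothetical example; a real accessory declares its own protocol under the MFI program, and the app must list it under UISupportedExternalAccessoryProtocols in its Info.plist):

#import <ExternalAccessory/ExternalAccessory.h>

- (EASession *)openAccessorySession {
    for (EAAccessory *accessory in [[EAAccessoryManager sharedAccessoryManager] connectedAccessories]) {
        if ([accessory.protocolStrings containsObject:@"com.example.gip"]) { // hypothetical protocol string
            EASession *session = [[EASession alloc] initWithAccessory:accessory forProtocol:@"com.example.gip"];
            [session.inputStream setDelegate:self];
            [session.inputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
            [session.inputStream open]; // the outputStream would be opened the same way
            return session; // opening this session is what bumps the other app off the accessory
        }
    }
    return nil;
}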
Posted at 05:36 PM in InterApp-GIP | Permalink | Comments (0) | TrackBack (0)
We can now reveal the feature set for AirTracer and AirTracerLite.
AirTracerLite is on track to be submitted to Apple for review the last week in January. We anticipate a typical 2-3 week review, so you should see it in the App Store in February. Both AirTracerLite and AirTracer Pro are perfect debugging tools for when you can't use Xcode or Instruments, such as during remote QA testing or when working with Apple external accessories.
Gabrielle Larose, our Art Director and UI advisor, is finishing up the promo art and we will post it here as soon as it's available.
AirTracerLite will be free and include:
AirTracerLite is roughly equivalent in features to Android's CatLog, which was its inspiration.
AirTracer Pro will have a price point of $1.99 and include all the above features, plus:
AirTracer Pro will follow shortly after AirTracerLite; we hope to incorporate user feedback and suggestions from the Lite product. AirTracer Pro will have some unique features that may cause a lengthy Apple review; however, we feel they are important enough to take the risk.
An Android version is coming later this year. Are you an Android developer who wants to help us build this exciting tool? Contact shelly@mooncatventures.com.
Posted at 02:01 PM | Permalink | Comments (0) | TrackBack (0)
When Apple first announced the External Accessory Framework with iOS 3 there was a great deal of excitement; it was thought that the new framework would open the iPhone up to a host of new third party devices which could be attached via the dock connector, camera adaptor, USB and even Bluetooth.
But along with the new Framework came new rules and limitations that tempered those early expectations.
For one, the new framework required enrollment in the Made for iPod program, or MFI (which can be interpreted nowadays as Made for iPhone, or whatever device you care to prefix with "i"). MFI membership is pretty much limited to OEMs that have a clear vision of a real product and can prove that to Apple.
The USB and Bluetooth frameworks were restricted to MFI members only.
The new framework also highlighted one of the biggest complaints about iOS: the lack of multitasking.
Prior to iOS version 4 the lifecycle of an application was not very complicated.
The app is started by calling application:didFinishLaunchingWithOptions:, which starts the event loop. When the user quits the application, the whole app is closed and the method applicationWillTerminate: is called. The next time the user opens the application, application:didFinishLaunchingWithOptions: is called again, as in the sketch below.
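In delegate form, the entire pre-iOS 4 lifecycle fits in two methods; nothing survives between runs except what the app persists itself:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // The app always cold-starts; restore any state cached on the last run here.
    return YES;
}

- (void)applicationWillTerminate:(UIApplication *)application {
    // Last chance to save state; the process is about to exit.
}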
This presented a problem for Apple, which they have been incrementally addressing since iOS 4.0.
Apple was not about to allow wholesale multitasking of the kind available on Android. Android multitasking can be a headache for some applications, e.g. streaming IPTV, which may need to maintain large buffers for audio and video. On iPhone, applications are free to grab all the memory they can, but on Android the OS provides only a limited amount of memory and the app "lives" in fear of the OS killing it at any time for higher priority tasks.
But no multitasking at all is equally a problem.
Let's take as a scenario an external heart monitoring accessory, such as the one provided by Wahoo Fitness, and an iPhone with two applications:
HealthMonitor, which is attached to the accessory via the ANT protocol.
Pandora Radio, for those tunes you just must listen to while you exercise.
Prior to iOS 4, whenever the user switched between Pandora Radio and HealthMonitor the former application had to completely quit and the application coming to the foreground had to completely reinitialize itself. Developers were required to implement their own protocols for fast restarts by caching data locally. The connection to the monitor would be severed and have to be reopened when the monitor app was foreground again. Not a happy lifecycle for an application that needed realtime sensor data.
Furthermore, although Bluetooth allows for multiple socket connections and multiple devices, the lack of multitasking meant that there could be only a "single connection" to a "single accessory".
With iOS 4 Apple attempted to address these limitations in a number of ways.
First "fast task switching" gave the os the ability to save enough of the state of the application to quickly restore it upon resumption of the application.
iOS 4 introduces a more complicated lifecycle. When the user quits the app, it enters the background and the method applicationDidEnterBackground: is called. The application remains in the background for some determined time period, depending on what it was doing before it entered the background, usually not exceeding 10 minutes; this allows time for the application to complete whatever task it was doing before entering a suspended state. If the application is still in the middle of some long task it may "request" additional time from the OS in small increments to complete that task, as sketched below. Then the application enters the suspended state.
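A minimal sketch of that "request" using the iOS 4 API; finishPendingWork is a hypothetical long-running task:

- (void)applicationDidEnterBackground:(UIApplication *)application {
    __block UIBackgroundTaskIdentifier taskId;
    taskId = [application beginBackgroundTaskWithExpirationHandler:^{
        // The OS is out of patience; clean up now or be suspended mid-task.
        [application endBackgroundTask:taskId];
    }];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [self finishPendingWork]; // hypothetical
        [application endBackgroundTask:taskId];
    });
}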
An iOS application can be in one of five states: not running, inactive, active, background, or suspended.
Apple also defined an elite set of task types which are allowed to run in the background:
- Background audio - allows applications to play audio continuously.
- Voice over IP - users can now receive VoIP calls and have conversations while using another app.
- Background location - navigation apps continue to guide users who are using other apps.
- Push notifications - receive alerts from your remote servers even when your app isn't running.
- Local notifications - applications can alert users of scheduled events and alarms in the background.
- Task finishing - if your app is in mid-task, the app can now keep running to finish the task.
- Fast app switching - quick restore when swapping applications.
Recall we said Apple was making incremental tweaks to multitasking. iOS 5 added two additions to this list.
A new notification was added to the accessory protocol that allows an accessory to launch an application when it has updated data.
Consider now our heart monitor example.
The user launches HealthMonitor, which opens an NSStream to the heart monitoring accessory, and begins a brisk jaunt. While running she switches to the Pandora application, which causes the HealthMonitor app to enter the background state. Because HealthMonitor has indicated in its plist that it is one of Apple's blessed background types, with the key "external-accessory", the application continues to collect the data stream from the accessory. The user is free to switch back and forth between the apps, forcing each in turn to swap states.
To make it interesting, let's add a third application, HealthShare, which logs the user's heart rate and makes it available to Facebook. HealthShare also accesses the accessory. The heart monitor accessory can only handle a single serial connection at a time, so when HealthShare is started it forces HealthMonitor into the background. HealthMonitor is still connected to the accessory but gets an NSStreamEventEndEncountered event, because HealthShare now has the connection to the stream. This handing off of streams can be repeated as needed; a sketch follows.
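Here is a sketch of how the backgrounded app might consume the accessory stream and notice the handoff. The app's Info.plist must list external-accessory under UIBackgroundModes for this to keep running; processAccessoryBytes:length: and closeAccessorySession are hypothetical helpers:

- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode {
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable: {
            uint8_t buf[512];
            NSInteger len = [(NSInputStream *)aStream read:buf maxLength:sizeof(buf)];
            if (len > 0) [self processAccessoryBytes:buf length:len]; // hypothetical
            break;
        }
        case NSStreamEventEndEncountered:
            // Another app has taken over the accessory session;
            // tear down our streams and wait to be foregrounded again.
            [self closeAccessorySession]; // hypothetical
            break;
        default:
            break;
    }
}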
The new lifecycle is an improvement over the previous one but still has flaws; for instance, the user must physically bring each application back to the foreground state. Apple did provide accessories with the ability to send a launch notification, but the programming for most accessories is embedded in firmware and in most cases must be physically flashed with new programming. For some OEMs this can be a simple process of providing the user with an over-the-air update, but for others it can involve a tedious trip by the customer to a dealer, or actually having to wait for a new model, which can be several months or even years down the road.
Having to manually switch applications can be especially troublesome for in-car infotainment or management systems, because having to physically handle the device is a driver distraction. This would be a problem with many a state legislature.
In the next segment we will discuss a way of switching between applications without the user having to intervene.
Posted at 06:29 PM in Accessories | Permalink | Comments (0) | TrackBack (0)
In this segment, continuing our discussion of iOS external accessories and multitasking, we are going to look at ways an application can be launched without any action on the part of the user.
Case Study:
In-car navigation and management systems, so-called infotainment systems. These are radio head units that may offer, depending on the OEM, handsfree calling, messaging, navigation and control over the vehicle's environmental systems. Examples would be SYNC from Microsoft and Ford, MyLink from GM and Entune from Toyota.
All of the above named systems, and many others, also provide some level of integration between the radio head unit and applications on the driver's mobile device, using simple steering wheel or voice controls. Pandora Radio is a widely included application, but the list is growing.
These infotainment systems have special challenges associated with them, in that many state legislatures and the NTSB are developing strict regulations for driver distraction, mandating, and in some states enforcing, the use of handsfree devices for making calls, and limiting certain controls of the equipment to when the vehicle is at rest or at a safe speed. No one wants to see an accident report stating that the driver hit a pedestrian while their attention was focused on searching for "Lost Control" by Unwritten Law.
These systems work with a few smart phones, e.g. BlackBerry, Android and iPhone. Android is by far the easiest to interface with: Android devices can connect to these systems via Bluetooth socket connections, and since Android allows for true multitasking and multiple socket connections can be opened, it is fairly simple for the infotainment system to provide a mechanism to swap from application to application.
Apple presents a more difficult challenge. Apple didn't add Bluetooth support to the MFI certification until after many of the current crop of devices were on the market, which means that for these devices the only connection available uses the iPod/iPhone/iPad accessory protocol, or IAP, over the dock connector, which is a serial connection.
IAP is a very simple control protocol which uses standard RS232 serial. With the approved Apple API, the only way you can do serial communication between an iPhone and another device is via an MFI approved appliance. There are a few backdoor approaches to serial communication, involving copying portions of the IOKit API from Mac OS or jailbreaking the device and installing a serial port handler, but for an application that the developer hopes to get through the Apple review process the only approved way is to use the Apple dock connector and belong to the MFI program.
All MFI accessories include a decoder chip, and the dock connector has one as well; when the iDevice is connected to the accessory, the two go through something resembling an SSL handshake. After successful authentication the TTY port is available for standard communication. The IAP framework handles all the low level conversions; the user's application uses the standard iOS stream protocol for all traffic.
The whole process is fairly fluid, but there is a major flaw: only one accessory can be connected at a time, and because of the lack of multitasking only one app can be connected to the accessory.
With the latest iOS, Apple has introduced AirPlay and Bluetooth 4.0. Bluetooth 4.0 is a recent standard for the new low powered Bluetooth devices now coming on the market. It is clear that Apple plans for AirPlay to be the new direction for all future accessories.
Apple announces new MFI components
This is an exciting new direction, but software is a lot easier to change than hardware, and new chips mean new engineering, which with automotive product cycles can mean months or years before the public sees it. Meanwhile, what about current hardware?
With iOS 5, Apple has tried to address this issue by providing a new notification to the accessory. A current generation accessory can now send an IAP command to the device to alert the user to launch a particular app: "HealthMonitor needs your attention", with Allow or Dismiss buttons. For the heart monitor example the user simply taps "Allow", which starts the application and brings it to the foreground. This is a great improvement; it means the user doesn't have to search for the app icon on the phone or remember which app was used with the accessory. But it still presents problems for a driver, who must take their eyes off the road to tap the Allow button. Better, but not useful just yet.
I have observed a regulatory test of an infotainment system. The test consisted of the smart phone being paired with the system and then placed in the car's glove compartment. From that point on, the only way to interact with the apps on the phone was via the handsfree microphone or the steering column buttons.
Apple also included new commands to send metadata to existing MFI accessories, e.g. artist, title, etc. Nice touches, but for existing accessories it still means that the user has to select a favorite app, i.e. Pandora Radio, and stick with it until they can safely (or unsafely) switch to another application.
But a method has existed since iOS 2 for launching applications without using the application springboard. That method is the custom URL scheme. Apple uses custom URL schemes for many standard applications. If you open Safari and type mailto:johnsmith@apple.com, Safari will close (or nowadays go into the background) and the Mail application will open. You can designate custom URL schemes for any native applications, including the Phone application and FaceTime, or define schemes for your own applications.
To communicate with an app using a custom URL, create an NSURL object with some properly formatted content and pass that object to the openURL: method of the shared UIApplication object. The openURL: method launches the app that registered to receive URLs of that type and passes it the URL. At that point, control passes to the new app.
NSURL *myURL = [NSURL URLWithString:@"Pandora://play?Sweet+Dreams&artist=sucker+punch"];
[[UIApplication sharedApplication] openURL:myURL];
The above code snippet assumes the Pandora app developer has coded a URL handler which would process the search query specified on the URL. To register your own scheme, edit the app's Info.plist:
1. Right-click the Information Property List key and select Add Row. Select "URL types" from the list. (Figure: Adding a URL type.)
2. Expand Item 1, right-click URL identifier, and again select Add Row. Select "URL Schemes" from the list.
3. Handle the URLs passed to your app. An app that has its own custom URL scheme must be able to handle URLs passed to it. All URLs are passed to your app delegate, either at launch time or while your app is running or in the background. To handle incoming URLs, your delegate should implement the following methods:
- Use the application:didFinishLaunchingWithOptions: method to retrieve information about the URL and decide whether you want to open it. This method is called only when your app is launched.
- In iOS 4.2 and later, use the application:openURL:sourceApplication:annotation: method to open the URL.
- In iOS 4.1 and earlier, use the application:handleOpenURL: method to open the URL.
If your app is not running when a URL request arrives, it is launched and moved to the foreground so that it can open the URL. If your app is running but is in the background or suspended when a URL request arrives, it is moved to the foreground to open the URL. The system calls the delegate's application:openURL:sourceApplication:annotation: method to check the URL and open it. If your delegate does not implement this method (or the current system version is iOS 4.1 or earlier), the system calls your delegate's application:handleOpenURL: method instead. A minimal handler is sketched below.
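A minimal handler for a hypothetical radioplay:// scheme might look like this, with the URL host carrying the command and the query string the parameters; handleCommand:query: is a hypothetical dispatcher:

- (BOOL)application:(UIApplication *)application openURL:(NSURL *)url sourceApplication:(NSString *)sourceApplication annotation:(id)annotation {
    if (![[url scheme] isEqualToString:@"radioplay"]) {
        return NO; // not a URL we understand
    }
    [self handleCommand:[url host] query:[url query]]; // hypothetical
    return YES;
}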
Using the information above, it is a fairly trivial task to construct a simple launcher with a button representing each application, where each button's action contains the code to openURL to the application.
-(IBAction)openSms {
[[UIApplication sharedApplication] openURL:[NSURL URLWithString:@"sms://466453"]];
}
It's also not very difficult to see how you can code a simple voice command application using Siri or Nuance.
- (void)recognizer:(SKRecognizer *)recognizer didFinishWithResults:(SKRecognition *)results
{
    if ([results.results count] > 0) {
        NSString *wellFormedRequest = [self getUrlRequest:[results firstResult]];
        if (wellFormedRequest != nil) {
            NSURL *url = [NSURL URLWithString:wellFormedRequest];
            [[UIApplication sharedApplication] openURL:url];
        } else {
            [self speakText:MESSAGE_I_DID_NOT_UNDERSTAND_YOU];
        }
    } else {
        [self speakText:MESSAGE_NO_RESULTS_RETURNED];
    }
}
In the code snippet above, for a typical speech recognizer, the method didFinishWithResults: passes the results of the speech transcription to getUrlRequest, which returns either a well formed URL request in the form appName?parameters, or nil if the request couldn't be mapped to an application. The well formed request is passed to openURL, which opens the application and puts the launcher into the background.
If getUrlRequest returned nil, or [results.results count] == 0, then an appropriate message is passed to the text to speech engine.
So now we have a fairly decent little voice controlled launcher that can be used with any handsfree device, not just MFI external accessories. It's easy to imagine one of those small Bluetooth headsets, e.g. a Jabra, used as a voice interface to your applications. This is quite possible using the new wideband 16 kHz SCO support in iOS 5.
So now with a voice command we can start an SMS message, listen to Pandora Radio or get the latest podcast from Stitcher. Our launcher has all the compatible phone apps hardcoded into a list of choices.
But what if the application isn't available on the device? It would be nice to find all the applications that are present on the phone and compatible with the accessory.
At first thought you might consider having each application, as it is started and backgrounded, write its ID to iCloud; after all, isn't that what the iOS 5 "key value storage" feature is for, storing small bits of data, usually for restoring the state of the app, such as the page the reader is on in an ebook reader application?
But iCloud requires an internet connection, and it would be ideal if the launcher had as few dependencies as possible.
The next consideration might be that when an application starts it could register with the accessory, and the accessory could keep a list of active applications; but this would require special coding on the accessory. Accessory code is usually embedded in firmware that must be flashed to add new features. For some accessories this may only require an over-the-air update, but for others, as previously mentioned, it could be a complex and time-expensive task. And we wanted the launcher to function with non-MFI devices like the Jabra headsets mentioned.
A somewhat simpler approach is to use a little bit of standard C code to provide an enumeration of all the running processes on the device, much like that pictured below in the AirTracer Processes view. (A shameless plug for AirTracer, which can list all the processes on an iDevice along with their PIDs and other information; AirTracer Pro can terminate processes as well.)
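That "little bit of standard C" is the classic sysctl process-table query; a minimal sketch, using only public calls:

#include <sys/sysctl.h>
#include <stdlib.h>

- (void)logRunningProcesses {
    int mib[4] = { CTL_KERN, KERN_PROC, KERN_PROC_ALL, 0 };
    size_t size = 0;
    if (sysctl(mib, 4, NULL, &size, NULL, 0) < 0) return; // first call just sizes the buffer
    struct kinfo_proc *procs = malloc(size);
    if (procs && sysctl(mib, 4, procs, &size, NULL, 0) == 0) {
        int count = (int)(size / sizeof(struct kinfo_proc));
        for (int i = 0; i < count; i++) {
            NSLog(@"pid %d: %s", procs[i].kp_proc.p_pid, procs[i].kp_proc.p_comm);
        }
    }
    free(procs);
}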
Once we have the list of processes we can build a list of voice commands by comparing the process list against the hardcoded list of compatible processes.
A fragment from a typical Siri voice tree using the above technique might go as follows.
User: Siri, please play "Sweet Dreams" by Eurythmics.
Siri: OK, I found 2 applications on your device that can play the original mix of "Sweet Dreams", Spotify or Tunes. Which app would you like to use?
If the user replies Tunes, Siri will launch Tunes and play the requested single; if the user fails to respond, either Spotify or Tunes may be launched, depending on which application the user has selected as the default.
Posted at 01:02 PM in Accessories | Permalink | Comments (0) | TrackBack (0)
So now, using all the information presented so far, we can turn to our hypothetical marathon runner trainee. She has the pulse monitor band wrapped around her wrist and the Jabra headset mic over her right earlobe. Using the headset she can switch between music applications on her phone as well as call up spoken statistics from the HealthMonitor application. She can also post her progress to her Facebook wall using the HealthShare application.
The Heart Monitor accessory allows for a single serial radio connection.
When she requests the heart monitor application using a voice command, HealthMonitor starts if it's not running already and connects to the accessory. The launcher enters the background and is then suspended, because it's not one of the apps "blessed" to run in the background. On receiving EAAccessoryDidConnectNotification, HealthMonitor issues an openURL request for the launcher. HealthMonitor moves to the background, but because it is a "background allowed" application it stays connected to the accessory and can still alert the runner of changes in pulse rate via notification alerts.
The user now issues the "Pandora" voice command to start Pandora Radio; once again the launcher moves to the background and is suspended. HealthMonitor continues to receive updates from the accessory.
Pandora is now in complete control until the runner exits it, after which Pandora enters applicationDidEnterBackground: and issues an openURL back to the launcher. Now the launcher is the foreground application, and HealthMonitor continues to run in the background.
She can continue this round-robin cycle of opening applications, bringing them to the foreground or sending them to the background, indefinitely. HealthMonitor will continue to receive updates until it becomes disconnected from the heart monitor or another foreground app opens an EASession.
HealthMonitor talks to the heart monitor accessory using the ubiquitous NSStream that Apple uses for virtually all stream connections. The same protocol can be used for TCP, Bluetooth and IAP.
The runner now issues a voice command to start HealthShare; as before, the launcher enters the background and shortly after a suspended state. HealthShare does open a session to the accessory. When HealthShare opens a stream to the heart monitor, HealthMonitor receives an NSStreamEventEndEncountered message. Since HealthMonitor no longer has an active session to the accessory, it will eventually be suspended until such time as it is brought again to the foreground. HealthShare issues an openURL request to the launcher and enters the background. It is still receiving updates from the accessory.
Reality check: there doesn't seem to be any means to start an application directly in the background; it must be "pushed" there by a foreground application, so apps must be specifically coded to return to the launcher with the openURL method. Unfortunately there is no voice controlled version of Pandora; the app mentioned here is simply a hypothetical version.
Posted at 11:03 PM in Accessories, Code Trails | Permalink | Comments (0) | TrackBack (0)
So now we have a fairly useful multi-application usage scheme. The runner can use her voice to start and swap between applications. She can also connect or disconnect applications to the heart monitor that perform different functions.
But there are definite refinements that can be made around HealthMonitor and HealthShare. Since the monitor accessory only allows a single connection, only one of the applications can be getting updates from the monitor. HealthShare in particular would benefit from a persistent connection, because as implemented it can only deliver a log of pulse activity over a particular time slice.
Also, any application which, like HealthShare, collects statistics directly from the monitor needs to be certified by the MFI program, leaving potential but small independent developers out in the cold. It would be far more efficient and beneficial if the HealthShare application could just get the pulse updates from HealthMonitor rather than directly from the accessory, but in a way that did not require changes to the firmware to add newer Apple protocols. It turns out such a mechanism exists, and we will examine it now.
This mechanism is the pasteboard, which was introduced by Apple to provide cut and paste within and between applications. The pasteboard is often thought of as a simple clipboard, but this is not an entirely accurate analogy.
There are two types of pasteboards that an application can interact with. The first is the general, systemwide pasteboard. This pasteboard can be accessed by all applications on the system and is the default pasteboard that is used by UITextView, UITextField, and so on.
The second type is a named pasteboard. These pasteboards are created by your application and given a name. Other applications that know the name of your pasteboard can also access that pasteboard and retrieve your data. In this way, you can share data amongst a select set of applications if you need to. Local notifications can be set up between applications. System pasteboards are persistent by default; named pasteboards can be made persistent by setting the persistent property.
The UIPasteboard class provides the mechanism by which applications interact at a low level with both the general system pasteboard and specific named pasteboards that they themselves create.
To access the general system pasteboard, you call the method generalPasteboard on the UIPasteboard class. This returns the general pasteboard singleton.
To access a named pasteboard, you call the method pasteboardWithName:create: on the UIPasteboard class. If you would like the UIPasteboard class to create a unique name for you, then you can pass nil as the first parameter. The second parameter indicates whether or not you want the UIPasteboard class to create the pasteboard for you if it does not exist.
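For example, a minimal sketch of creating a shared named pasteboard (the name here is a hypothetical example):

UIPasteboard *shared = [UIPasteboard pasteboardWithName:@"com.example.SharedMessageQueue" create:YES];
shared.persistent = YES; // survives application termination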
You can get a reference to the pasteboard as follows:
UIPasteboard *pasteboard = [UIPasteboard generalPasteboard];
You can then place a string on the pasteboard by using this:
pasteboard.string = @"here is some data";
After launching your app via a call to openURL:, you can then retrieve the string from the pasteboard by writing this:
UIPasteboard *pasteboard = [UIPasteboard generalPasteboard];
NSString *myString = pasteboard.string;
The UIPasteboard class contains convenience properties like this for NSString, UIImage, NSURL, and UIColor.
Pasteboards are implemented as an array of items. Each item may have multiple representations, and is represented by a dictionary. For instance, a URL can have a number of representations:
To retrieve a pasteboard’s items, invoke items. To add an item, create the appropriate dictionary and add it to the pasteboard with addItems:.
// The kUTType* constants require <MobileCoreServices/MobileCoreServices.h>.
NSDictionary *item = [NSDictionary dictionaryWithObjectsAndKeys:
                      string, (NSString *)kUTTypeUTF8PlainText,
                      url, (NSString *)kUTTypeURL,
                      png, (NSString *)kUTTypePNG, nil];
[pasteboard addItems:[NSArray arrayWithObject:item]];
Standard UTI types are used to access each item, but you can also define custom keys if needed.
Due to a limitation of iPhone OS, the size of each item on the pasteboard cannot exceed roughly 5-8 MB. If an item is greater than 5120 KB it should be broken into smaller items of less than 5120 KB each.
Using pasteboards, it is not hard to imagine that you could come up with a protocol for sharing information between applications. A company called INTUA markets an application named BeatMaker that uses such a custom sharing protocol to share audio data between BeatMaker compatible applications. They have open sourced the protocol under the MIT license.
BeatMaker uses this UTI to achieve copy and paste operations of uncompressed audio data with other applications.
Items with type kUTTypeAudio contain audio data in a specific sample format.
BeatMaker assembles all contiguous items of type kUTTypeAudio into one audio file.
Each item is added to the pasteboard and made available to other applications.
INTUA's AudioCopy only works for PCM audio, but it's not much of a stretch to devise a general protocol for interapplication sharing. Mooncatventures is in the process of developing such a General Interapp Protocol (GIP) and we will provide more details soon. A rough sketch of what an exchange might look like follows.
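This is a hedged sketch only; the pasteboard name, type identifier and payload format here are assumptions, not a published protocol:

// Sender, e.g. HealthMonitor:
UIPasteboard *queue = [UIPasteboard pasteboardWithName:@"SharedMessageQueue" create:YES];
NSDictionary *update = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:72] forKey:@"pulse"];
[queue setData:[NSKeyedArchiver archivedDataWithRootObject:update]
 forPasteboardType:@"com.example.gip.message"]; // hypothetical custom UTI

// Receiver, e.g. HealthShare:
NSData *raw = [queue dataForPasteboardType:@"com.example.gip.message"];
NSDictionary *received = raw ? [NSKeyedUnarchiver unarchiveObjectWithData:raw] : nil;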
Posted at 11:22 AM in Accessories | Permalink | Comments (0) | TrackBack (0)
So now we come to the end of this series, the final cut if you will; we hope you found it both educational and enjoyable.
So, looking again at our marathon runner trainee: she has the pulse rate monitor band wrapped around her left wrist and the Jabra headset behind her right ear. The heart monitor "talks" to the applications on her phone using the same ANT+ protocol used by the resident Nike application. The headset allows her to control a springboard application on her iPhone using Siri (or perhaps Nuance) over the new wideband 16 kHz SCO channel.
As before, she has several audio applications, including her favorite radio application. The device also has the HealthMonitor application and a hypothetical version of HealthShare using the GIP protocol.
As before, she starts HealthMonitor with a voice command; HealthMonitor connects to the accessory, then does an openURL transfer back to the launcher.
She now starts Pandora, which becomes the foreground application. When she is done with Pandora she exits and returns to the launcher.
She now issues the voice command to start HealthShare, which sends the launcher to the background.
HealthMonitor continues to send updates from the background; whenever it has an update for HealthShare it retrieves HealthShare's "SharedMessageQueue" (a UIPasteboard) and encodes the update as a GIP message. Messages greater than 2048 bytes are broken into smaller messages, each containing a protocol frame header. Multiple messages are held in the form of a circular linked list, and each frame carries a message ID to indicate the message it belongs to. Again, each frame is no more than 2048 bytes. A framing sketch follows.
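A sketch of that framing rule, assuming payload is an NSData update; the frame dictionary layout is an illustrative assumption:

NSMutableArray *frames = [NSMutableArray array];
NSUInteger offset = 0, index = 0;
uint32_t messageId = arc4random(); // ties the frames of one message together
while (offset < [payload length]) {
    NSUInteger chunkLen = MIN((NSUInteger)2048, [payload length] - offset);
    NSDictionary *frame = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithUnsignedInt:messageId], @"msgId",
        [NSNumber numberWithUnsignedInteger:index], @"index",
        [payload subdataWithRange:NSMakeRange(offset, chunkLen)], @"body", nil];
    [frames addObject:frame]; // each frame body is at most 2048 bytes
    offset += chunkLen;
    index++;
}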
After adding the message to the SharedMessageQueue, HealthMonitor sends a notification to HealthShare that it has an update. Remember, this protocol is in early development; perhaps it would be better, from a message traffic perspective, if HealthShare just had a background task that watched the UIPasteboard named SharedMessageQueue every 100 ms or so. In any case, HealthShare gets the update and sends it to Facebook. It may also update the runner's location via Facebook check-in.
The one drawback here is that when the runner swaps HealthShare for another application, HealthShare will become suspended. A workaround may be to have HealthShare receive periodic location changes, which allows it to join the list of blessed background applications.
So ends our saga. This seems to be a fairly nice mechanism for interapp communication. Undoubtedly there are holes that may need to be filled, but all in all, theoretically, it should work.
The technique is also quite cross-platform and may be duplicated on Android using Intents.
Posted at 10:39 AM in Accessories | Permalink | Comments (0) | TrackBack (0)
Learn more about AirTracer in our iOS Debugging Tips blog, which we will be adding to weekly until the release of the application.
Along with AirTracer we will be providing a Twitter support site where developers can share debugging problems and collaborate on log snapshots uploaded by AirTracer users.
Posted at 07:59 PM | Permalink | Comments (0) | TrackBack (0)
We promised you expanded blogs, or themes. Our first theme is Microsite. AirTracer uses microsites to provide a common user experience across all platforms: Android, iPhone and browsers that support WebKit, while still providing a basic web page for unsupported browsers such as Internet Explorer. This is used for the remote console logs and trace reports.
In this theme we are going to build a little application called Smart Photos. Some of you may recognize Smart Photos as similar to a project we started back in 2010 but never completed: Photostream. (I believe we coined that term long before Apple; we started referring to Photostream as far back as 2007.)
We recently decided to resurrect some of Photostream as a way of sharing iCloud content between Facebook, Apple devices and Android.
A very quick overview of this theme: the images are from StreamShare, the new incarnation of Photostream, but the flow, with the exception of the tab bar, is similar.
Images are imported into a SQLite database from the photo album and put into the documents folder. You could (and StreamShare does) use the Asset Library, but for simplicity we will just store photos in the documents folder and a thumbnail in the database. These are displayed in a simple table view as shown in the first image above. The second screen is the sharing view controller, which starts a small HTTP server and publishes its socket address using Bonjour, roughly as sketched below. JmDNS would be used for the equivalent Android version.
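Publishing the server over Bonjour is only a few lines; this sketch assumes the HTTP server is already listening on the given port, and the service name is illustrative:

NSNetService *service = [[NSNetService alloc] initWithDomain:@"local."
                                                        type:@"_http._tcp."
                                                        name:@"SmartPhotos"
                                                        port:8080];
[service setDelegate:self]; // optional: observe publish success or failure
[service publish];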
This series of images is the Microsite browser. The same browser would be included in all mobile platform versions of the application.
The actual code creates a webpage with the documents directory as its root. On a desktop web browser a simple image list is shown, but in the Microsite browser the div with the id "json" is used.
JSON code for the images is embedded in the webpage but only seen by the special Microsite browser.
<html><head><title>Files from </title><style>html {background-color:#eeeeee} body { background-color:#FFFFFF; font-family:Tahoma,Arial,Helvetica,sans-serif; font-size:18px; margin-left:15%; margin-right:15%; border:3px groove #006600; padding:15px; } </style></head><body><a href="..">..</a><br />
<a href="IMAGE_0001.jpg">IMAGE_0001.jpg</a> ( 59.9 Kb, 2012-01-04 00:33:13 +0000)<br />
<a href="IMAGE_0002.jpg">IMAGE_0002.jpg</a> ( 45.6 Kb, 2012-01-04 00:33:32 +0000)<br />
<a href="IMAGE_0003.jpg">IMAGE_0003.jpg</a> ( 49.0 Kb, 2012-01-04 00:33:49 +0000)<br />
<a href="IMAGE_0004.jpg">IMAGE_0004.jpg</a> ( 78.5 Kb, 2012-01-04 00:33:57 +0000)<br />
<div id='json' style='display:none'>{{"results":[{"name": "IMAGE_0001.jpg"} ,{"name": "IMAGE_0002.jpg"} ,{"name": "IMAGE_0003.jpg"} ,{"name": "IMAGE_0004.jpg"} , {"name":"test1"} ]}}</div></body></html>
Below is the javascript code embedded in the browser that does the json parsing and displays the table view in the webpage.
/*
This file was generated by Dashcode.
You may edit this file to customize your widget or web page
according to the license.txt file included in the project.
*/
var url="";
var listController = {
// This object acts as a controller for the list UI.
// It implements the dataSource methods for the list.
numberOfRows: function() {
// The List calls this dataSource method to find out how many rows should be in the list.
return items.length-1;
},
prepareRow: function(rowElement, rowIndex, templateElements) {
// The List calls this dataSource method for every row. templateElements contains references to all elements inside the template that have an id. We use it to fill in the text of the rowTitle element.
if (templateElements.rowTitle) {
var displayName = items[rowIndex].name.split(".")[0];
var ext = items[rowIndex].name.split(".")[1];
templateElements.rowTitle.innerText = displayName;
if (ext!="jpg") {
templateElements.rowArrow.innerHTML
= "<img src='video.png' width='50' height='50' ></img>";
}else {
templateElements.rowArrow.innerHTML
= "<img src='"+url+displayName+"-Small.png' width='50' height='50' ></img>";
}
}
// We also assign an onclick handler that will cause the browser to go to the detail page.
var self = this;
var handler = function() {
var item = items[rowIndex];
detailController.setitem(item);
var browser = document.getElementById('browser').object;
// The Browser's goForward method is used to make the browser push down to a new level. Going back to previous levels is handled automatically.
browser.goForward(document.getElementById('detailLevel'), item.name);
};
rowElement.onclick = handler;
}
};
var detailController = {
// This object acts as a controller for the detail UI.
setitem: function(item) {
this._item = item;
this._representedObject = item.name;
// When the item is set, this controller also updates the DOM for the detail page appropriately. As you customize the design for the detail page, you will want to extend this code to make sure that the correct information is populated into the detail UI.
var detailTitle = document.getElementById('detailTitle');
detailTitle.innerHTML = "";
var detailLocation = document.getElementById('detailLocation'); // assumes a 'detailLocation' element exists in the page
detailLocation.innerHTML = "";
var detailDescription = document.getElementById('detailDescription');
var ext = this._item.name.split(".")[1];
if (ext!="jpg") {
detailDescription.innerHTML = "<div><img src='video.png' width='80px' height='80px' /><embed src='"+url+this._item.name+"'></embed></div>";
// embed src=”http://www.mypage.com/test.wav”>
}else {
detailDescription.innerHTML = "<img src='"+url+this._item.name+"' width='320px' height='480px' />";
}
}
};
//
// Function: load()
// Called by HTML body element's onload event when the web application is ready to start
//
function load(urlvar)
{
url=urlvar;
dashcode.setupParts();
var onloadHandler = function() { xmlLoaded(xmlRequest); };
// XMLHttpRequest setup code
var xmlRequest = new XMLHttpRequest();
xmlRequest.onload = onloadHandler;
xmlRequest.open("GET", url);
xmlRequest.setRequestHeader("Cache-Control", "no-cache");
xmlRequest.send(null);
function xmlLoaded(xmlRequest) {
if (xmlRequest.readyState == 4 && xmlRequest.status == 200) {
// call the function to handle the response data
var result = xmlRequest.responseText.split(":[")[1];
result = result.split("]")[0];
result = "[" + result + "]";
var itemobj = eval(result);
items=itemobj;
document.getElementById("list").object.reloadData();
}
};
}
// Sample data. Some applications may have static data like this, but most will want to use information fetched remotely via XMLHttpRequest.
var items = [
];
<insert link to theme >
Posted at 05:46 PM | Permalink | Comments (0) | TrackBack (0)
Here are some screenshots from a location app utilizing voice searches to find your points of interest.
You can give the app a POI, such as "find Starbucks", and it will show Starbucks locations in your immediate area using the Yahoo local search API, or you can give it an address, such as "1600 Pennsylvania Ave, Washington DC", and it will forward geocode it and show the exact POI on a Google map.
- (void)recognizer:(SKRecognizer *)recognizer didFinishWithResults:(SKRecognition *)results
{
    NSLog(@"Got results.");
    transactionState = TS_IDLE;
    [recordButton setTitle:@"Listen" forState:UIControlStateNormal];
    if ([results.results count] > 0) {
        searchBox.text = [results firstResult];
        alternativesDisplay.text = [results.results componentsJoinedByString:@"\n"];
    }
    // If the transcription starts with a digit, treat it as a street address
    // and forward geocode it; otherwise treat it as a local POI search.
    NSCharacterSet *digits = [NSCharacterSet decimalDigitCharacterSet];
    if ([searchBox.text length] > 0 &&
        [digits characterIsMember:[searchBox.text characterAtIndex:0]]) {
        Forward_GeocodingViewController *googleViewController = [[Forward_GeocodingViewController alloc] init];
        [googleViewController setQuery:searchBox.text];
        [self.navigationController pushViewController:googleViewController animated:YES];
        [googleViewController release];
    } else {
        SearchViewController *searchViewController = [[SearchViewController alloc] init];
        [searchViewController searchForString:searchBox.text];
        [self.navigationController pushViewController:searchViewController animated:YES];
        [searchViewController release];
    }
}
And here is the Yahoo search connection class, including its NSURLConnection and NSXMLParser delegate methods.
- (id) initWithDelegate:(id <YahooSearchConnectionDelegate>) delegate
{
if(self = [super init])
{
connDelegate = delegate;
}
return self;
}
/*
Search the Yahoo Local service
*/
- (void) searchByString:(NSString *)searchString
{
NSString *urlString = [NSString stringWithFormat:@"http://local.yahooapis.com/LocalSearchService/V3/localSearch?appid=%@&query=%@&latitude=%f&longitude=%f&results=10"
,yahooAPIKey, searchString, [UserData data].currentLocation.coordinate.latitude, [UserData data].currentLocation.coordinate.longitude];
urlString = [urlString stringByAddingPercentEscapesUsingEncoding: NSASCIIStringEncoding];
NSLog(@"%@", urlString);
NSURL *url = [NSURL URLWithString:urlString];
NSMutableURLRequest *theRequest = [NSMutableURLRequest requestWithURL:url];
NSURLConnection *theConnection = [[NSURLConnection alloc] initWithRequest:theRequest delegate:self];
if( theConnection )
{
webData = [[NSMutableData data] retain];
}
else
{
NSLog(@"Connection failed.");
}
}
/*
NSURLConnectionDelegate
*/
-(void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response
{
[webData setLength: 0];
}
-(void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
[webData appendData:data];
}
-(void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error
{
[connection release];
[webData release];
}
-(void)connectionDidFinishLoading:(NSURLConnection *)connection
{
NSLog(@"Got search results");
results = [[NSMutableArray alloc] init];
NSString *theXML = [[NSString alloc] initWithBytes: [webData mutableBytes] length:[webData length] encoding:NSUTF8StringEncoding];
//NSLog(theXML);
[theXML release];
if( xmlParser )
{
[xmlParser release];
}
xmlParser = [[NSXMLParser alloc] initWithData: webData];
[xmlParser setDelegate: self];
[xmlParser setShouldProcessNamespaces:NO];
[xmlParser setShouldReportNamespacePrefixes:NO];
[xmlParser setShouldResolveExternalEntities:NO];
[xmlParser parse];
[connection release];
}
/*
XML Parser Delegate
*/
- (void)parser:(NSXMLParser *)parser didStartElement:(NSString *)elementName namespaceURI:(NSString *)namespaceURI qualifiedName:(NSString *)qName attributes:(NSDictionary *)attributeDict
{
currentElement = [elementName copy];
if ([elementName isEqualToString:@"Result"])
{
tempResult = [[YahooSearchResult alloc] init];
}
}
- (void)parser:(NSXMLParser *)parser didEndElement:(NSString *)elementName namespaceURI:(NSString *)namespaceURI qualifiedName:(NSString *)qName
{
if ([elementName isEqualToString:@"Result"])
{
// Trim the results
tempResult.title = (NSMutableString *)[tempResult.title stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
tempResult.address = (NSMutableString *)[tempResult.address stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
tempResult.city = (NSMutableString *)[tempResult.city stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
tempResult.state = (NSMutableString *)[tempResult.state stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
tempResult.zip = (NSMutableString *)[tempResult.zip stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
tempResult.phone = (NSMutableString *)[tempResult.phone stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
tempResult.averageRating = (NSMutableString *)[tempResult.averageRating stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
tempResult.lastReviewIntro = (NSMutableString *)[tempResult.lastReviewIntro stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
tempResult.yahooURL = (NSMutableString *)[tempResult.yahooURL stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
tempResult.businessURL = (NSMutableString *)[tempResult.businessURL stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
tempResult.latitude = (NSMutableString *)[tempResult.latitude stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
tempResult.longitude = (NSMutableString *)[tempResult.longitude stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
[results addObject:tempResult];
}
}
- (void)parser:(NSXMLParser *)parser foundCharacters:(NSString *)string
{
// Append the string to the current element of a search result
if ([currentElement isEqualToString:@"Title"])
{
[tempResult.title appendString:string];
}
else if ([currentElement isEqualToString:@"Address"])
{
[tempResult.address appendString:string];
}
else if ([currentElement isEqualToString:@"City"])
{
[tempResult.city appendString:string];
}
else if ([currentElement isEqualToString:@"State"])
{
[tempResult.state appendString:string];
}
else if ([currentElement isEqualToString:@"Zip"])
{
[tempResult.zip appendString:string];
}
else if ([currentElement isEqualToString:@"Phone"])
{
[tempResult.phone appendString:string];
}
else if ([currentElement isEqualToString:@"AverageRating"])
{
[tempResult.averageRating appendString:string];
}
else if ([currentElement isEqualToString:@"LastReviewIntro"])
{
[tempResult.lastReviewIntro appendString:string];
}
else if ([currentElement isEqualToString:@"ClickUrl"])
{
[tempResult.yahooURL appendString:string];
}
else if ([currentElement isEqualToString:@"BusinessClickUrl"])
{
[tempResult.businessURL appendString:string];
}
else if ([currentElement isEqualToString:@"Latitude"])
{
[tempResult.latitude appendString:string];
}
else if ([currentElement isEqualToString:@"Longitude"])
{
[tempResult.longitude appendString:string];
}
}
- (void)parserDidEndDocument:(NSXMLParser *)parser
{
[connDelegate yahooSearchConnection:self receivedResult:results];
}
@end
Posted at 02:26 PM | Permalink | Comments (0) | TrackBack (0)
Where do new apps come from? The best ideas come from real life experience. Over the next few weeks you will hear about our crowdsource shopping application. The spark for this came from the desperate act of trying to find gifts on Black Friday. What if I just took a picture of what I wanted, posted it to Facebook and asked my network of friends to post the stores where they found the product? We'd sort by geolocation and price and show the newest location.
Applications may start as blog posts of concepts that interest us; these idea posts may evolve into themes. Themes are larger concepts that we choose to look at more closely and develop apps from. One theme you will hear about a lot is "template matching", which came out of our interest in OpenCV. Template matching is taking a photo and matching it against template images, trying to identify a match. An app idea emerging from this is a bug tracker, e.g. take a picture of an insect and match it against a database of insects.
So the progression is ideas become blog posts, posts grow into themes and themes may eventually end up as applications.
Posted at 08:33 PM | Permalink | Comments (0) | TrackBack (0)
We've been doing some YouTube walk-thrus of our apps as they get closer to submission.
This is a walk-thru flow of the Photostream app, highlighting iPhone to iPhone sharing.
Posted at 10:27 PM | Permalink | Comments (0) | TrackBack (0)
We've decided on a release date for StreamX: Summer 2010.
It may be completed sooner, but there always seems to be just one more thing...
StreamX is being demoed all over the internet.
Here is a storyboard; we also posted a tutorial here:
http://web.me.com/cannonwc/Site_4/Blog/Entries/2010/2/3_A_Walk_Thru_of_a_Typical_Use_Case.html
Posted at 08:45 PM | Permalink | Comments (0) | TrackBack (0)
Posted at 08:29 AM | Permalink | Comments (0) | TrackBack (0)
More on this later, but here is the POC on combining a Cocoa Touch frontend with SDL.
http://99.139.107.194/svn/test/sdldml/sdldml/
http://web.me.com/cannonwc/Site/Photos_7.html
Please use "test" for both user and password when prompted on the svn
Posted at 10:54 PM | Permalink | Comments (2) | TrackBack (0)
Here are the results with an Xvid movie.
Bitrate 1500 kbps, DVD resolution, size 6, streamed from an Elgato media server.
http://web.me.com/cannonwc/Site/Photos_5.html#3
You can get the updated code here. The zip has not been repacked, so if you choose to use the zip file you need to copy the ffplay.c from the svn.
http://99.139.107.194/svn/test/portalServer/ffplay/
The svn password and userid are both "test".
ffplay isn't much good without a way of selecting movies for playback; right now the movies are hardcoded in the player.
I am trying to come up with a frontend for the SDL player.
I am exploring two methods.
1) Hacking libsdl to expose the application delegate and/or forcing the SDL surface into an EAGLView.
This is partially successful, though very difficult.
2) Some kind of back end launcher. The issue with this is I don't know how that would fly in the App Store, one app dependent on another; but I've seen some other suites. Also, it's not clear how SDL programs will be accepted.
The player would be launched via a custom URL scheme.
The URL would be passed via the shared keychain, since there doesn't appear to be any way to pass the URL to the SDL app. A sketch of the keychain handoff follows.
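A minimal sketch of stashing the movie URL in a shared keychain item for the player to read after launch; both apps would need the same keychain access group, and the service and account names here are assumptions:

#import <Security/Security.h>

NSMutableDictionary *query = [NSMutableDictionary dictionary];
[query setObject:(id)kSecClassGenericPassword forKey:(id)kSecClass];
[query setObject:@"com.example.sdlplayer" forKey:(id)kSecAttrService]; // hypothetical service name
[query setObject:@"pendingURL" forKey:(id)kSecAttrAccount];
SecItemDelete((CFDictionaryRef)query); // drop any stale value first

[query setObject:[urlString dataUsingEncoding:NSUTF8StringEncoding] forKey:(id)kSecValueData];
SecItemAdd((CFDictionaryRef)query, NULL);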
SDL is designed as a graphics library and it's great at that; FPS is very high with ffplay. But not allowing more native controller support is a bad shortcoming.
FFplay is going to remain totally open source.
The enhancements will be jointly hosted and handled by the ffmpeg4iphone Google site run by Yonas, who's done a great job with ffmpeg, and me at mooncatventures.com.
Posted at 09:57 AM | Permalink | Comments (0) | TrackBack (0)
As mentioned previously, we have two choices for a transcoding server: write one ourselves or find one that meets our requirements.
Our research into using TVersity did not go very well, and basic questions put to the support forum about how the server transcodes went unanswered. We decided it wasn't worth the effort.
Next we looked at PS3 Media Server; this one is more promising.
It seems to be very DLNA compliant and our bridge had no problems "talking" to it, passing basic UPnP commands.
A few more promising facts.
Problem areas
Planned approach
Branch the project and create an iPhone-specific version.
Planned additions
Posted at 11:49 PM | Permalink | Comments (0) | TrackBack (0)
It was fairly easy to modify the three20 TTPhotoViewController to add the action-sheet and save-to-photos toolbar buttons.
In TTPhotoViewController, modify these lines:
_clickActionItem = [[UIBarButtonItem alloc]
    initWithBarButtonSystemItem:UIBarButtonSystemItemAction // was: TTIMAGE(@"UIBarButtonReply.png")
                         target:self
                         action:@selector(clickActionItem)];
UIBarButtonItem* playButton = [[[UIBarButtonItem alloc] initWithBarButtonSystemItem:
UIBarButtonSystemItemPlay target:self action:@selector(playAction)] autorelease];
playButton.tag = 1;
UIBarItem* space = [[[UIBarButtonItem alloc] initWithBarButtonSystemItem:
UIBarButtonSystemItemFlexibleSpace target:nil action:nil] autorelease];
_toolbar = [[UIToolbar alloc] initWithFrame:
CGRectMake(0, screenFrame.size.height - TT_ROW_HEIGHT,
screenFrame.size.width, TT_ROW_HEIGHT)];
_toolbar.barStyle = self.navigationBarStyle;
_toolbar.autoresizingMask = UIViewAutoresizingFlexibleWidth
| UIViewAutoresizingFlexibleTopMargin;
NSString *searchCaption = @"Photo from networked";
NSRange range = [[_centerPhoto caption] rangeOfString:searchCaption];
if (range.location == NSNotFound) {
_toolbar.items = [NSArray arrayWithObjects:
space, _previousButton, space, _nextButton, space,_clickActionItem, nil];
[_innerView addSubview:_toolbar];
}else{
_toolbar.items = [NSArray arrayWithObjects:
space, _previousButton, space, _nextButton, space, nil];
[_innerView addSubview:_toolbar];
}
- (void)actionSheet:(UIActionSheet *)actionSheet clickedButtonAtIndex:(NSInteger)buttonIndex {
}

- (void)actionSheet:(UIActionSheet *)actionSheet didDismissWithButtonIndex:(NSInteger)buttonIndex {
    if (6 == actionSheet.tag) {
        if (0 == buttonIndex) { // save page
            NSLog(@"photo: %@", [_centerPhoto URLForVersion:TTPhotoVersionLarge]);
            NSURL *aUrl = [NSURL URLWithString:[_centerPhoto URLForVersion:TTPhotoVersionLarge]];
            NSData *data = [NSData dataWithContentsOfURL:aUrl];
            UIImage *img = [[UIImage alloc] initWithData:data];
            NSLog(@"photo:class %@", [img class]);
            UIImageWriteToSavedPhotosAlbum(img, nil, nil, nil);
        }
    }
}
And add this towards the bottom
Posted at 01:15 AM | Permalink | Comments (1) | TrackBack (0)