tappr.tv is all about community, and ultimately about building beautiful experiences together. In that spirit, tappr.tv itself is built using a wide assortment of open source projects, kits, and frameworks, without which I couldn't do what I do.
Version 5 was built in part using:
QuickTime | No, there's no QuickTime inside tappr.tv. But ever since I learned about this fantastic temporal framework, I've been fascinated with the idea of tools and communities for music visuals. tappr.tv is the culmination of my thoughts spanning almost 15 years. Inspired by the filmstrip concepts of QT, a "dance" in tappr.tv ultimately boils down to a long list of events. "Inputs" normalize data from various hardware services, "recorders" capture and play back those event streams, and "outputs" render. All the parts are loosely coupled, making it easier to extend in new feature directions. |
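To make that loose coupling concrete, here's a toy sketch in Python (not tappr.tv's actual code; the class and field names are invented for illustration):

```python
class Recorder:
    """Captures a timestamped event stream and replays it to any outputs."""

    def __init__(self):
        self.events = []   # a "dance" boils down to a long list of events
        self.outputs = []  # loosely coupled renderers

    def record(self, timestamp, event):
        self.events.append((timestamp, event))

    def play(self):
        for timestamp, event in self.events:
            for output in self.outputs:
                output.render(event)  # each output decides how to draw it


class CollectingOutput:
    """Stand-in renderer that just collects what it is asked to draw."""

    def __init__(self):
        self.drawn = []

    def render(self, event):
        self.drawn.append(event)


recorder = Recorder()
stage = CollectingOutput()
recorder.outputs.append(stage)

# an "input" would normalize raw hardware data into events like these
recorder.record(0.0, {"type": "touch", "x": 0.5, "y": 0.5})
recorder.record(0.4, {"type": "tilt", "dx": 0.1})
recorder.play()
```

Because recorders know nothing about how outputs render, a new wand type or a new input device slots in without touching the event stream itself.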
UIKit | I'm aiming for the UI to be as invisible as possible. One thing Photoshop got right in its UX was trying to disappear. The real genius of UIKit is all the input capabilities: touch, accelerometers, and audio are all enabled in v5, and more are already being built. |
|
OpenGL / Cocos2D | Particle-based wands are just the first feature when it comes to visuals. By leveraging OpenGL, tappr.tv is prepared to add more wand types, more depth, and interactive spatial experiences. I'm still on OpenGL and Cocos2D v1.x, though, unsure whether I can still render UIKit at the same time as the CCDirector in v2. |
|
CoreData | The structure of dances is rather complex, by design. Each dance comprises one or more tracks of normalized input data, recorded by a user using a wand. By separating all these elements out, I'm laying the foundation for a future release of tappr.tv that includes a variety of Remix features. The granularity also enables sophisticated MVC architectures in various areas of the app. For instance, the Studio has a list of tracks on the left and a "stage" view on the right, where the various tracks can be touch-recorded and played back. Both views actually use NSFetchedResultsControllers to dynamically detect changes. So changes to the input or mirror attributes on the left get saved and propagated to the view on the right, which handles the implementation of those two features. Creating a new track on the left similarly propagates to the right, where an appropriate CCLayer is instantiated and managed. |
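The propagation pattern amounts to an observer relationship: save a model change, and every interested view hears about it. A minimal sketch in Python (standing in for NSFetchedResultsController; the names are invented, not the app's):

```python
class Track:
    """Minimal model object whose saved changes notify interested views."""

    def __init__(self, name):
        self.name = name
        self.mirror = False
        self._observers = []

    def add_observer(self, callback):
        self._observers.append(callback)

    def save(self, mirror):
        # saving the change is what triggers the fetched-results callbacks
        self.mirror = mirror
        for callback in self._observers:
            callback(self)


list_view_log = []   # the track list on the left
stage_view_log = []  # the "stage" on the right, which implements the effect

track = Track("wand-1")
track.add_observer(lambda t: list_view_log.append((t.name, t.mirror)))
track.add_observer(lambda t: stage_view_log.append((t.name, t.mirror)))

track.save(mirror=True)  # an edit on the left propagates to the stage view
```

Neither view talks to the other directly; both observe the store, which is what keeps the left and right panes of the Studio in sync.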
|
Parse.framework | I've written several Heroku/RoR servers to work with iOS apps, and it's laborious. Creating synchronization engines requires replicated schemas on both sides, and there are few general frameworks for creating truly asynchronous CoreData-social model layers. When I finally got to the social release of tappr.tv (v4, Nov 2011), I probably spent a month staring at the task at hand, while also wondering if the brand-new Parse team had the chops to deliver a solid platform. I took a chance on them, and I've been thrilled ever since. Stupid-simple schema development, support for synchronous APIs (YEAH!), server-side logic, REST/JSON for websites, third-party social authentication… good good stuff. I'm particularly thrilled with their support for APNS and their "channels" feature. In tappr.tv, each artist has a channel, and as tappr.tv scans your music, it subscribes you to the top 100 artists in your collection. Then, when someone publishes a dance to a song, a push notification is broadcast on that artist's channel. Similarly, each user in tappr.tv has her own push channel, so when others watch, rate, and share dances, tappr.tv sends a push notification back to that user. |
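The channel flow boils down to subscribe-then-broadcast. A toy in-memory sketch in Python (Parse's real push API differs; all names here are invented):

```python
class PushHub:
    """In-memory stand-in for channel-based push notifications."""

    def __init__(self):
        self.channels = {}    # channel name -> set of subscribed devices
        self.delivered = []   # (device, message) pairs "pushed" so far

    def subscribe(self, device, channel):
        self.channels.setdefault(channel, set()).add(device)

    def broadcast(self, channel, message):
        # sorted() just makes delivery order deterministic for this sketch
        for device in sorted(self.channels.get(channel, ())):
            self.delivered.append((device, message))


hub = PushHub()

# scanning a music library subscribes the user to their top artists' channels
hub.subscribe("alices-iphone", "artist-daft-punk")
hub.subscribe("bobs-ipad", "artist-daft-punk")

# publishing a dance to one of that artist's songs notifies every subscriber
hub.broadcast("artist-daft-punk", "New dance published!")
```

Per-user channels work the same way: each user's device subscribes to a channel named for that user, and watch/rate/share events broadcast on it.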
|
NSOperations | Because of the complexity of a dance structure, I needed to write rather complex synchronization logic, which was best accomplished using NSOperations as wrappers for long strings of synchronous calls to Parse. For instance, publishing a dance involves up to 3 network saves for song and artist info, up to 6 network saves per track, and 2 additional network saves for the dance record itself. I just can't imagine how complex this 11-step pseudo-transactional logic would be if I had to string together a set of asynchronous calls with delegate+selector or block-based callbacks. |
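The appeal of the operation wrapper is that the steps read top to bottom and can bail out at the first failure. A simplified Python sketch (not the actual publish code; step names are invented):

```python
def run_steps(steps):
    """Run synchronous save steps in order, stopping at the first failure.

    Each step is a (name, callable) pair; the callable returns True on
    success. Wrapping the whole chain in one operation keeps the logic
    linear instead of a pyramid of asynchronous callbacks.
    """
    completed = []
    for name, save in steps:
        if not save():
            return completed, name  # where the pseudo-transaction broke down
        completed.append(name)
    return completed, None


# a miniature version of the publish flow: song/artist info, a track,
# then the dance record itself
publish = [
    ("save_artist", lambda: True),
    ("save_song", lambda: True),
    ("save_track_1", lambda: True),
    ("save_dance", lambda: True),
]
completed, failed_at = run_steps(publish)
```

Because the chain runs inside a single background operation, each save can block on the network, and the failure point falls out naturally instead of being scattered across eleven callbacks.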
|
AQGridView | This class enabled so many of us to jump into iPad development with the kinds of grid experiences Apple's updated apps demonstrated. Of course, now that Apple has released UICollectionView, AQ and I must bid adieu soon. Big kudos to Alan Quatermain for releasing this grid implementation. |
|
Spotify / CocoaLibSpotify / AirPlay | In tappr.tv, Spotify premium members can watch just about any dance in the Theater. tappr.tv uses the song information in each published dance to find a matching song in Spotify, and choreographs the two. This, along with continuous playback and AirPlay support, brings a dance party to any room in the house. (One aside: sometimes songs disappear from Spotify for reasons I can probably understand, and then the match results may contain only covers or parodies, which can make dances either somewhat or entirely off. C'est la vie.) |
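That matching, and the cover/parody failure mode, can be sketched in a few lines of Python (an invented catalog format for illustration, not CocoaLibSpotify's API):

```python
def match_song(catalog, title, artist):
    """Prefer an exact title+artist match, falling back to title only.

    The fallback is how a cover or parody can end up choreographed to a
    dance when the original recording disappears from the catalog.
    """
    for song in catalog:
        if song["title"] == title and song["artist"] == artist:
            return song
    for song in catalog:
        if song["title"] == title:
            return song
    return None


catalog = [
    {"title": "Around the World", "artist": "A Cover Band"},
]

# the original has vanished from the catalog, so only the cover matches
match = match_song(catalog, "Around the World", "Daft Punk")
```

When the original is still in the catalog, the exact pass wins and the dance lines up as recorded; it's only the fallback pass that can drift.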
|
David Cairns & Demetri Miller | These two gents published source online that enabled the tappr.tv audio input feature. In the Studio, I convert songs from iTunes into a local audio file, then set up an Audio Unit chain in order to play and detect at the same time. |
|
MGSplitViewController | I was drawn to this class because it would work inside a UINavigationController, thus enabling me to have different UXs in certain areas of the app. I am, however, interested in developing a column-view navigation controller, where the columns can adjust as they move from primary to indexed layers. |
|
ASIHTTP | This class provided a solid S3 API. But, as you probably know, it's been superseded by the likes of AFNetworking. I was so saddened to see the abuse the originator of this repository received. I hope AllSeeing-i lives long and prospers. |
|
iVersion | Stupid-simple implementation of release notices with notes. The +load class method does all the work, so you literally just add the code to your project and you're done. |
|
JSONKit | Man, this JSON lib screamed in its day. Like so many things, though, Apple eventually provided a native API (NSJSONSerialization), which will take over in time. |
|
Mugunth | Some nice block-based categories for various Apple Foundation and UIKit classes. |
|
InAppSettingsKit / QuickDialog | Used in the Settings area of the app. |
|
ShareKit | Kudos to all who contributed to this open source project; it clearly filled a need, and Apple seemed to agree with the overall approach, given the sharing API in iOS 6. |
|
CMPopTipView | This is used in all of the bubbles that present themselves during the first-launch UX. |
Airship | This is what manages the store for Wand Packs. Unfortunately, I'm considering an entirely different approach to the freemium store UX. |
Touchpose | For when I start giving demos on stage ;-) |
Trevor Harmon | Thanks for sharing with us the best way to resize images. |
Coming in v6: