I am totally astonished at how well AirplaySDK works. The guys at Ideaworks Labs have done it right. They have made all the right choices in developing an SDK for cross-platform mobile game development.
The issue with mobile game development is supporting all the different mobile OSes, plus dealing with different hardware running the same OS. Game developers are somewhat lucky if an OpenGL ES implementation is available, as then they don't have to worry too much about the rendering part. Sound, file I/O and networking still mostly remain an issue; for instance, OpenAL is not supported on most phones for sound. EdgeLib appeared a few years back to solve I/O, sound and networking, plus to provide a software renderer in case OpenGL ES is not available. More on EdgeLib in another post.
I tested the kartz sample from AirplaySDK on an N82, N80, HTC Diamond and Samsung Omnia.
It is easiest to just list the pros and cons of AirplaySDK that I have figured out so far by reading and trying it out.
PROS
- Native binary without any VM.
- Good packaging tools for all OS.
- Fixed point computation as well as floating point available.
- Fast software rendering engine.
- Auto fallback to software rendering when no hardware accelerator present.
- Sticks to open-source libraries.
- Open Dynamics Engine (ODE) for rigid-body physics.
- Support for multiple mobile OSes: Symbian, iPhone, Windows Mobile, Android, BREW and Linux. EdgeLib doesn't support Android and BREW.
- Highly optimised compression of game assets. Not just simple zip compression, but their Derbh archiver, which uses a shared buffer across files. The site claims Metal Gear Solid came down to 1.5 MB.
- ARM debugger/emulator available that runs on Windows. Debugging is much easier this way.
- Windows emulator allows testing the software in various different scenarios (tilt/compass/keypad/touch/resolution).
- Visual Studio programmers would love it, as they can use VS with this SDK. Very tight integration.
- UI development possible for software other than games. This is an interesting feature, but as the core system API is not extensive, it won't be possible to create full applications; only simple apps can be built. There is the possibility of extensions that make OS-dependent API calls, but this system is not exposed for developers to create their own extensions. There seems to be discussion on the AirplaySDK forum about opening this up.
- Extensions available (iPhone only) for some iPhone-specific APIs.
- Phone orientation awareness and auto-rotation of the GUI.
- Awareness of human-interface limitations/possibilities. Can detect touch screen and keypad and adjust controls accordingly.
- Handles phone events (incoming calls/notifications) without extra code.
- UI code for input is intelligent enough to display a touch keypad on touch-UI systems without any extra work. MoSync can't do that, and EdgeLib has no support for UI controls/widgets.
CONS
- No support for OS X/Xcode development yet. It is planned, though.
- Integrates best with Visual Studio, which can be costly, although the free Express edition should be enough.
- Not a very extensive API for core OS features like the system API.