Hi there

I've got an idea for a native app, but since I'm a bit of a noob at coding for the iPhone, I might need someone to hold my hand and let me know what's possible and what isn't.

First of all, I know it's possible to support multitouch in applications (cf. the NES emulator), but what I want to know is whether it's then possible to route that touch information out through the dock connector and turn it into a ctlin signal in MaxMSP. My thinking is that this would offer a different method of interaction from wireless networking, and would work with older Macs that don't support WiFi. The idea is to control Max patches with on-screen faders, use the ambient light sensor theremin-style, and monitor the accelerometer on all three x/y/z axes, with everything sent to the computer over the USB dock connector.
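
For what it's worth, here's a minimal sketch of the phone-side sending half, assuming quite a lot: that a jailbroken iPhone exposes the dock-connector UART as a serial device (the path `/dev/tty.iap` below is an assumption, as is the 9600 baud rate), and that packing each control as a raw MIDI Control Change triple is an acceptable wire format. The controller-number mapping is entirely hypothetical.

```c
/*
 * Sketch: pack a control value as a MIDI Control Change message
 * and push it out the dock-connector serial port.
 *
 * Assumptions (not verified against real hardware):
 *   - the jailbroken iPhone exposes the dock UART as /dev/tty.iap
 *   - 9600 baud is acceptable on both ends
 *   - the Mac side parses the three raw bytes back into
 *     controller/value pairs for ctlin-style use
 */
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

#define DOCK_SERIAL "/dev/tty.iap"   /* assumed device path */

static int open_dock_serial(void)
{
    int fd = open(DOCK_SERIAL, O_WRONLY | O_NOCTTY);
    if (fd < 0) {
        perror("open");
        return -1;
    }
    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                 /* raw bytes, no line discipline */
    cfsetspeed(&tio, B9600);         /* assumed baud rate */
    tcsetattr(fd, TCSANOW, &tio);
    return fd;
}

/* One MIDI Control Change: status 0xB0 | channel, controller, value. */
static void send_cc(int fd, uint8_t channel, uint8_t controller, uint8_t value)
{
    uint8_t msg[3] = {
        (uint8_t)(0xB0 | (channel & 0x0F)),
        (uint8_t)(controller & 0x7F),
        (uint8_t)(value & 0x7F)
    };
    write(fd, msg, sizeof msg);
}

int main(void)
{
    int fd = open_dock_serial();
    if (fd < 0)
        return 1;

    /* Hypothetical mapping: fader on controller 1, light sensor on 2,
     * accelerometer x/y/z on 10-12. Values are 0-127 as MIDI requires. */
    send_cc(fd, 0, 1, 64);   /* fader at half travel   */
    send_cc(fd, 0, 10, 72);  /* accelerometer x sample */

    close(fd);
    return 0;
}
```

On the Mac side, I'd imagine Max's [serial] object could pick up the raw bytes and unpack them into controller/value pairs; getting a true ctlin message would presumably need something presenting the stream as an actual MIDI source, but that's beyond what I know so far.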

Certainly, a lot of this is modelled after aka.iPhone, but I really can't wait until February to start mucking about with a multitouch interface! Ideally, I'd like a system of multiple landscape scenes, navigable by dragging, in a similar way to the SummerBoard desktops, with a view to first producing a portable multitouch mixing environment.

Do you guys think this might be possible given what's currently known about developing for the iPhone?

Might I suggest the project be called I/OPhone?

Thanks in advance

M