December 22nd, 2012, 03:28 PM
Android device to Android device (Remote desktop type control)?
I've done some Windows desktop and Windows PDA/phone development, but have no experience (yet) with Android other than as a user.
I have a specific interest in something akin to remote desktop: when both devices are connected (probably via wifi), one Android device [slave] would show an image of the other device's [host] screen, and (multi)touch input on the slave would be directed to the host, essentially allowing the slave to control the host remotely in real time, while seeing the effect of that control.
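To make the idea concrete, here is roughly how I picture the slave side (a Java sketch with made-up names; it assumes a background thread has already opened a TCP socket to the host and set "out"):

Code:
import java.io.DataOutputStream;
import java.io.IOException;

import android.app.Activity;
import android.util.Log;
import android.view.MotionEvent;

// Slave-side sketch: capture touches on this activity and forward the raw
// coordinates to the host. 'out' is assumed to wrap a socket that a
// background thread has already connected to the host over wifi. In real
// code the write itself would also be handed off to that thread, since
// Android does not allow network I/O on the UI thread.
public class SlaveControlActivity extends Activity {
    private volatile DataOutputStream out;   // set once the connection is up

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (out == null) return false;        // not connected yet
        try {
            // Single pointer only, for brevity; multitouch would also send
            // the pointer count/ids from event.getPointerCount() etc.
            out.writeInt(event.getActionMasked());   // DOWN / MOVE / UP
            out.writeFloat(event.getX());
            out.writeFloat(event.getY());
            out.flush();
        } catch (IOException e) {
            Log.e("RemoteControl", "could not send touch event", e);
        }
        return true;
    }
}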
The screen presentation could probably be handled fairly easily, and the same goes for the wifi connection; what I'm really interested in is the options for screen control.
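For the connection part I'm assuming ordinary sockets over wifi would do, something like this (port number and host address are just illustrative, error handling omitted):

Code:
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

// Connection sketch: plain TCP over wifi, one side per device, each run on
// a background thread (Android will not allow this on the UI thread).
public class WifiLink {
    // Host side: wait for the slave to connect. Port number is arbitrary.
    public static Socket waitForSlave() throws IOException {
        ServerSocket server = new ServerSocket(5555);
        return server.accept();
    }

    // Slave side: connect to the host's wifi address (hard-coded here
    // purely for illustration).
    public static Socket connectToHost() throws IOException {
        return new Socket("192.168.1.42", 5555);
    }
}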
Does Android expose (at any level) the events for user input, both to capture input on the slave and to push it to the host? That could be the raw multitouch input or some intermediate level of control. The basic question here is whether input on one device can be used as direct input on a different device, without having to create control wrappers for each program, or (as with one existing program I read about) having to change the default keyboard entry to a host program written specifically to enable this type of functionality.
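From the little reading I've done, the Instrumentation class looks like it might be the closest thing the SDK exposes on the host side; please correct me if I'm wrong. My understanding is that injecting events into other apps' windows this way requires android.permission.INJECT_EVENTS, a signature-level permission, so in practice a system app or a rooted device. Something along these lines:

Code:
import android.app.Instrumentation;
import android.os.SystemClock;
import android.view.MotionEvent;

// Host-side sketch: replay a touch received from the slave.
// sendPointerSync() injects into whatever window currently has focus, but
// outside your own app it needs the INJECT_EVENTS signature permission.
// Within your own app, view.dispatchTouchEvent(ev) works without any
// special permission.
public class TouchInjector {
    private final Instrumentation inst = new Instrumentation();

    // Call from a background thread, not the UI thread.
    public void inject(int action, float x, float y) {
        long now = SystemClock.uptimeMillis();
        MotionEvent ev = MotionEvent.obtain(now, now, action, x, y, 0);
        inst.sendPointerSync(ev);
        ev.recycle();
    }
}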
I'd welcome technical information (what can and can't be done in Android), conceptual solutions ("here is one way it might work"), and suggestions for better search terms so I can find existing posts on the subject.
December 25th, 2012, 11:22 AM
Or, are there better forums on which to post this type of question? Being new to Android, I'm not sure which forums are active and which are more technical in nature.