I'm reposting from someone who asked this on the Macrumors forums, and thought it would be interesting to get the opinions of some of the more seasoned touchscreen developers. So, now that we have a device big enough, what's to stop a developer making something akin to this in an iPad format? Using capacitive controls, the app recognises these being overlaid onto the screen, and can then use their movement as a control. At first I couldn't see a way for the touchscreen to recognize the objects or an object turning, but the OP added a very good suggestion: say the piece is put on the screen. It has two capacitive points opposite each other on the underside, with a central capacitive point. That distance could be used in part to identify a player or piece. It'd be more complex if the dial itself is moving its centre whilst you turn it, but I'm sure it's not too hard to work out turning: you do the math for the angle the two points are at in relation to the centre of the dial. http://forums.macrumors.com/showthread.php?t=863687
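The angle math the OP describes is pretty simple in practice. A rough sketch in Python (not iPhone SDK code) of what an app might do, assuming it already knows which touch is the dial's centre point and which is one of the outer points:

```python
import math

def point_angle(centre, point):
    """Angle of an outer capacitive point about the dial's centre,
    in radians relative to the x-axis."""
    return math.atan2(point[1] - centre[1], point[0] - centre[0])

def rotation_delta(prev, curr):
    """Smallest signed change between two angles, handling the
    wrap-around at pi/-pi, so turning can be tracked frame to frame."""
    d = curr - prev
    return (d + math.pi) % (2 * math.pi) - math.pi
```

Because the two outer points are symmetric, a given reading is ambiguous by 180 degrees; tracking the change in angle over time (as `rotation_delta` does) sidesteps that, since the dial can't jump half a turn between frames.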
Wow, that is a neat idea. If the iPad supports a very large number of simultaneous touches, then this could work really well. Each "gamepiece" could have a unique dot pattern (3 dots minimum).
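To make the "unique dot pattern" idea concrete: the pairwise distances between a piece's dots form a signature the app can match against. A hedged Python sketch, where the piece names and tolerance value are made up for illustration:

```python
import math

def signature(dots):
    """Sorted pairwise distances between a piece's dots. Pieces would
    be manufactured so each has a unique set of spacings."""
    dists = []
    for i in range(len(dots)):
        for j in range(i + 1, len(dots)):
            dists.append(math.dist(dots[i], dots[j]))
    return sorted(dists)

def identify(dots, known, tolerance=3.0):
    """Match a cluster of touches against known piece signatures.
    Distances are in screen points; tolerance absorbs touch jitter."""
    sig = signature(dots)
    for name, ref in known.items():
        if len(ref) == len(sig) and all(abs(a - b) <= tolerance
                                        for a, b in zip(sig, ref)):
            return name
    return None
```

Sorting the distances makes the signature rotation- and position-independent, which is exactly what you want for a piece that gets picked up and set down anywhere on the board.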
Yeah, now that I think about it this could work and be implemented really well. Does the iPad still have the 5-touch hardware restriction?
This looks like an interesting idea. The only thing that could ruin it is the whole 5 finger multi-touch limit.
Well, maybe instead of them relying on the touch, it could be something like... Wi-Fi sensing? The pieces send out a signal that the iPad reads to work out where they are, and then you can tap there?
I don't think Wi-Fi and Bluetooth can sense distance; maybe to some extent you could check for signal strength but it would be very unreliable.
Well, maybe an add-on for the iPad (seeing as these objects will probably be bought at a store) could be something like a sonar attachment?
Distributing an app to all iPad users is going to be quite simple; however, distributing physical pieces of the right caliber is another story. OP is right, the idea seems totally possible, and pretty cool at that. Only problem is, the pieces moving across the screen.
Hate to rain on the parade, but set an object on one of the icons on your iPod or iPhone and watch what happens... nothing.
Read it again. If there are bumps on the bottom made of capacitive materials, then the configuration of those bumps will make a particular pattern that can be detected and measured by the app. The only problems I can see:
- Unless it has been changed (and there's no reason to think it has yet), it will be limited to 5 simultaneous multitouch points maximum.
- The bumps will have to have enough flat surface area on the bottom to register a touch with the screen.
It would still be workable though. I just don't see how it would be practical.
The five touch limit is on the iPhone, but I don't think anyone's discovered the iPad's limit yet, have they?
Is there a reason it needs to be 3 dots that I'm not seeing…? Couldn't you determine a piece using just 2 dots and a specific distance between them?
You need a third dot to know which way it is facing. If that doesn't matter you could do two and still use it for relative rotation, or just one for position only.
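A sketch of why the third dot gives you facing: if the piece is built with two closely spaced "base" dots and one farther "nose" dot (an assumed layout, just for illustration), the direction from the base toward the nose is the heading.

```python
import math

def heading(dots):
    """Facing direction of a 3-dot piece, in radians. Assumes the two
    closest dots form the base and the remaining dot marks the front."""
    pairs = [(0, 1), (0, 2), (1, 2)]
    # The closest pair is the base; the leftover index is the nose dot.
    i, j = min(pairs, key=lambda p: math.dist(dots[p[0]], dots[p[1]]))
    k = ({0, 1, 2} - {i, j}).pop()
    base_mid = ((dots[i][0] + dots[j][0]) / 2,
                (dots[i][1] + dots[j][1]) / 2)
    return math.atan2(dots[k][1] - base_mid[1],
                      dots[k][0] - base_mid[0])
```

With only two identical dots there's no way to tell the pair apart, so you get orientation modulo 180 degrees at best, which matches the point above about relative rotation.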
No, but I can't see Apple being particularly compelled to up the limit. It seems like the sort of thing that would be low on Apple's list of priorities if it's even there, given that it's unlikely that there are enough practical application possibilities to make it worth Apple's time.
I have to disagree there, Mindfield. I think Apple are just waiting for the first person to create a nice two-player game they can show in an ad, with two people using one iPad. The fact that the iPad's virtual keyboard now supports 10 fingers at once also suggests it can handle more input points.
I would expect the code to let you grab as many points as the hardware allows. Why wouldn't it? Basically you are getting a list of touches. The limitations are more about what is considered a touch: the SDK is not giving you the exact readout from the hardware, it's interpreting touches for you. So tiny, closely spaced touches will most likely register as one touch. I'd really like to see some support in the SDK for raw info from the touchscreen, or maybe something specific to what we are talking about here: tags on the bottom of small objects that can be recognized. Not going to happen in the near future, but as the surfaces get larger it's something that will be necessary. It will be nice when we can play with this stuff without shelling out for a Microsoft Surface.