What do I mean by multi-touch? Windows Phone is already a multi-touch device, but for some reason (I guess it is a legacy of Silverlight) the standard XAML controls don't have real multi-touch behavior. Real multi-touch means that I should be able to interact with two different objects at the same time, with multiple fingers. What happens now with the standard XAML controls is that once a control starts receiving touch events (ManipulationStarted, ManipulationDelta, ManipulationEnded), no other control will receive touch events. The easiest test is to add a standard button to a page, put one finger on the screen outside the button and try to press the button with another finger: you will see that the button does not respond to your commands. For XAML applications this might not be a problem, but for games it becomes one, and XAML (I mean XAML+C# or XAML+VB.NET) is a fast way to develop simple games.
The solution is to build your own control and use Touch.FrameReported to "drive" it. In this sample I will build a multi-touch button. I will call it ButtonEx (some of you remember OpenNETCF? :)) and I will just add three events to it: TouchDown, TouchUpInside and TouchUpOutside (the iOS MonoTouch event names). With these three events I have better control (Click is in reality a TouchUpInside event).
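As a sketch, this is the event surface the rest of the snippets assume (the delegate shape is my choice; the (sender, position) signature simply matches how the events are raised further down):

// Sketch: delegate and events assumed by the snippets below.
public delegate void TouchedHandler(object sender, Point position);

public event TouchedHandler TouchDown;
public event TouchedHandler TouchUpInside;
public event TouchedHandler TouchUpOutside;

In the page code-behind you would then subscribe as usual, e.g. myButtonEx.TouchUpInside += (sender, position) => { /* handle the tap */ };.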
So I've created a new Windows Phone Class Library called ControlsEx and added a ButtonEx control derived from ContentControl. I copied the standard style of the Button control (you can easily generate it from a standard button using Blend and the Edit a Copy command on the Button template) and added it to the /Themes/generic.xaml file of the project. In the constructor I subscribe to the Loaded and Unloaded events, because I want to start receiving touch events when the control loads and to unsubscribe from them when the control gets unloaded.
public ButtonEx()
{
    DefaultStyleKey = typeof(ButtonEx);
    this.Loaded += ButtonEx_Loaded;
    this.Unloaded += ButtonEx_Unloaded;
    IsEnabledChanged += ButtonEx_IsEnabledChanged;
    IsPressed = false;
}

void ButtonEx_Loaded(object sender, RoutedEventArgs e)
{
    // Start listening to low-level touch frames while the control is on screen.
    Touch.FrameReported += Touch_FrameReported;
}

void ButtonEx_Unloaded(object sender, RoutedEventArgs e)
{
    Touch.FrameReported -= Touch_FrameReported;
}
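The snippets also rely on two members that are worth spelling out, shown here as a minimal sketch (the names idPointer and IsPressed come from the code below; the exact declarations are my assumption):

// Id of the finger currently being tracked; -1 means no finger is tracked.
int idPointer = -1;

// True while the tracked finger is down on the control.
public bool IsPressed { get; private set; }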
Now everything we need "happens" inside the Touch_FrameReported method. For my button I am interested in tracking only one finger (by its Id), from TouchAction.Down until TouchAction.Up. Once the first finger is down on the surface of my control, I memorize its Id and track its actions until it leaves the screen. Depending on the control you are building, you might have to take multiple fingers into consideration. One thing that is pretty important when you start tracking a finger is to check whether your control is actually in front (imagine a MessageBox over your controls: when you press its OK button you would also press the button behind it). To resolve this issue I've used the TouchDevice.DirectlyOver property of the TouchPoint and the VisualTreeHelper to see whether the UIElement returned by DirectlyOver is a child of my control or not.
bool IsControlChild(DependencyObject element)
{
    // Walk up the visual tree: if we reach this control, the element belongs to it.
    DependencyObject parent = element;
    while ((parent != this) && (parent != null))
        parent = VisualTreeHelper.GetParent(parent);
    return parent == this;
}
Here is the Touch_FrameReported method:
void Touch_FrameReported(object sender, TouchFrameEventArgs e)
{
    if (Visibility == Visibility.Collapsed)
        return;
    // Touch points are reported in this control's coordinate space.
    TouchPointCollection pointCollection = e.GetTouchPoints(this);
    for (int i = 0; i < pointCollection.Count; i++)
    {
        if (idPointer == -1)
        {
            // Not tracking a finger yet: start when one goes down on a child of this control.
            if (IsEnabled && (Visibility == Visibility.Visible) && (pointCollection[i].Action == TouchAction.Down) && IsControlChild(pointCollection[i].TouchDevice.DirectlyOver))
            {
                // start tracking this finger
                idPointer = pointCollection[i].TouchDevice.Id;
                IsPressed = true;
                VisualStateManager.GoToState(this, "Pressed", true);
                if (TouchDown != null)
                    TouchDown(this, pointCollection[i].Position);
            }
        }
        else if ((pointCollection[i].TouchDevice.Id == idPointer) && (pointCollection[i].Action == TouchAction.Up))
        {
            // The finger we were tracking has left the screen.
            idPointer = -1;
            IsPressed = false;
            UpdateIsEnabledVisualState();
            // Did the finger go up inside or outside the control's bounds?
            if ((pointCollection[i].Position.X > 0 && pointCollection[i].Position.X < ActualWidth) && (pointCollection[i].Position.Y > 0 && pointCollection[i].Position.Y < ActualHeight))
            {
                if (TouchUpInside != null)
                    TouchUpInside(this, pointCollection[i].Position);
            }
            else
            {
                if (TouchUpOutside != null)
                    TouchUpOutside(this, pointCollection[i].Position);
            }
        }
    }
}
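UpdateIsEnabledVisualState is not shown above; a minimal sketch just brings the control back to the proper common state once "Pressed" no longer applies (the state names "Normal" and "Disabled" come from the standard Button template):

void UpdateIsEnabledVisualState()
{
    // Leave the "Pressed" state and reflect IsEnabled again.
    VisualStateManager.GoToState(this, IsEnabled ? "Normal" : "Disabled", true);
}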
For the button control we don't have to track the finger's movements until the Up action, but we might need to if we were writing, say, a Slider control (see the sketch below). The sample application that you will find in the source code uses two ButtonEx controls and a standard Button control. The ButtonEx controls should always respond to your commands (fingers).
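A slider-like control could lock onto the finger exactly as above and then also watch TouchAction.Move for it. Here is a sketch of the extra branch to add inside the for loop of Touch_FrameReported (the TouchMove event is hypothetical, not part of ButtonEx):

// Hypothetical event for a slider-like control:
// public event TouchedHandler TouchMove;
else if ((pointCollection[i].TouchDevice.Id == idPointer) && (pointCollection[i].Action == TouchAction.Move))
{
    if (TouchMove != null)
        TouchMove(this, pointCollection[i].Position); // a slider would move its thumb here
}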
I've also used this approach to develop a multi-touch XAML game for flying a Bluetooth BeeWi helicopter. On the 27th of February I will present a session at Community Days here in Italy on developing a game for Windows Phone, and I will use this game as a starting point. The application has multi-touch buttons, a slider and a joystick control.
I also have to thank the TTT (Train The Trainer) program, which awarded me a beautiful Sphero for my multi-touch controls.
As always, don't hesitate to contact me if you have further questions.
NAMASTE