I had been looking for a display for my Raspberry Pi for a while, but most of the displays I saw had some kind of crazy wiring setup along the lines of this:
Now since my interest wasn’t so much in messing with hardware as in building a home automation gateway, I wanted a neat package so I could just focus on building the software.
Now compare this to the picture here below:
This is exactly what I was looking for. A 5” 800x480 LCD touch display that simply plugs in on top of the Raspberry.
If you’re interested in the same display, you can get it here on Amazon:
Here’s mine running the default Windows 10 IoT app:
Nice and clean :-)
Now this display supports touch, but there aren’t any Windows 10 IoT-supported touch drivers for any touch display out there. This is very frustrating, because you can run these neat XAML-based apps on it, but the only way to interact with anything on the screen is to use a mouse.
Luckily, someone has already done the hard work: you can find a project on hackster.io with code for reading this particular touch controller. However, it doesn’t really deal with XAML elements (it uses some custom SPI rendering and manual touch triggers on rectangular regions), and I personally didn’t like the code too much for my needs, so I decided to do a major refactor for easier reuse. So, in best stand-on-the-shoulders-of-giants style, I built on top of this and created a more reusable touch processor, and a XAML-based calibration view to calibrate your screen. None of this is an actual touch driver that just feeds touch into any app running on the Pi; it’s all app-specific code. A driver would be best, but the following is a good start. If you just want the code, jump to the bottom.
The library that I provide here consists of 3 things:
- An implementation of the TSC2046 touch screen controller
- A touch processor that runs a thread for changes to touch and raises events on pointer down, moved and up
- A UI Control for performing a screen calibration.
Item 1 is very hardware-specific. If you have a different controller, you can implement the ITouchDevice interface – items 2 and 3 will work with any device implementing this interface.
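For reference, here’s roughly the shape such an interface could take. This is a minimal sketch based on how the device is used below; the actual ITouchDevice in the library may differ in member names and signatures:

//Minimal sketch of a touch-device abstraction (assumed, not the
//library's exact definition)
public interface ITouchDevice
{
    //Read the latest raw sample from the controller
    void ReadTouchpoints();
    //Position of the current touch, in screen units once calibrated
    double TouchPositionX { get; }
    double TouchPositionY { get; }
    //Pressure reported by the controller; 0 means 'not touched'
    double Pressure { get; }
}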
The screen calibration utility lets the user tap a number of control points, so that reported touch points are returned in screen units rather than raw touch device units. You should always run a screen calibration the first time (and save it for future use). The calibration even returns an “s0” estimate of its accuracy; if this is more than a few pixels, you should probably reject the calibration and have the user try again. Calibration can be run with 4, 5 or 7 points (well, anything from 4 and up, but those are the ones I made UI for). The more points you have, the better the calibration and the better it is at detecting outliers, but for the most part 4 or 5 points are plenty to get a good calibration.
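For the curious, those six coefficients (the A through F values that SetCalibration takes below) presumably define an affine transform from raw controller readings to screen coordinates, which is the standard approach for resistive touch calibration:

//Assumed form of the calibration mapping, given coefficients A..F:
double screenX = A * rawX + B * rawY + C;
double screenY = D * rawX + E * rawY + F;

With four or more control points the system is over-determined, so the coefficients can be fitted by least squares, and the residual of that fit is presumably what the “s0” accuracy estimate reports.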
So to use the library: when your page loads, create the device, load or create the calibration, and start listening for events. Here’s an example of that:
TouchPanels.Devices.Tsc2046 tsc2046 = await TouchPanels.Devices.Tsc2046.GetDefaultAsync();
try
{
    await tsc2046.LoadCalibrationAsync("TSC2046");
}
catch (System.IO.FileNotFoundException)
{
    //Initiate calibration if we don't have a calibration on file
    var calibration = await TouchPanels.UI.LcdCalibrationView.CalibrateScreenAsync(tsc2046);
    tsc2046.SetCalibration(calibration.A, calibration.B, calibration.C,
                           calibration.D, calibration.E, calibration.F);
    await tsc2046.SaveCalibrationAsync("TSC2046");
}
//Load up the touch processor and listen for touch events
//('processor' is a field on the page: TouchPanels.TouchProcessor processor;)
processor = new TouchPanels.TouchProcessor(tsc2046);
processor.PointerDown += Processor_PointerDown;
processor.PointerMoved += Processor_PointerMoved;
processor.PointerUp += Processor_PointerUp;
If you haven’t already calibrated, a UI automatically pops up asking you to calibrate:
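One housekeeping note before we move on: since the processor polls on its own thread, unhook the handlers when the page goes away, so that thread no longer calls into a page that isn’t showing. For example, in the page’s OnNavigatedFrom:

protected override void OnNavigatedFrom(Windows.UI.Xaml.Navigation.NavigationEventArgs e)
{
    //Stop listening for touch events when leaving the page
    processor.PointerDown -= Processor_PointerDown;
    processor.PointerMoved -= Processor_PointerMoved;
    processor.PointerUp -= Processor_PointerUp;
    base.OnNavigatedFrom(e);
}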
So this part just reports touch events, but what if we want to click a button? We can use the VisualTreeHelper to look for any visual elements under the touch point, use the AutomationPeer APIs to filter down to any that implement the IInvokeProvider interface, and invoke the provider – this would usually be buttons. Here’s that code:
private void Processor_PointerDown(object sender, TouchPanels.PointerEventArgs e)
{
    FindElementsToInvoke(e.Position);
}

private void FindElementsToInvoke(Point screenPosition)
{
    var elements = VisualTreeHelper.FindElementsInHostCoordinates(
        new Windows.Foundation.Point(screenPosition.X, screenPosition.Y), this, false);
    foreach (var e in elements.OfType<FrameworkElement>())
    {
        var element = e;
        AutomationPeer peer = null;
        object pattern = null;
        while (true)
        {
            peer = FrameworkElementAutomationPeer.FromElement(element);
            if (peer != null)
            {
                pattern = peer.GetPattern(PatternInterface.Invoke);
                if (pattern != null)
                {
                    break;
                }
            }
            var parent = VisualTreeHelper.GetParent(element);
            if (parent is FrameworkElement)
                element = parent as FrameworkElement; //Walk up the tree
            else
                break;
        }
        if (pattern != null)
        {
            var p = pattern as Windows.UI.Xaml.Automation.Provider.IInvokeProvider;
            p?.Invoke();
        }
    }
}
You can also expand on this and use the scroll pattern (IScrollProvider) to scroll ScrollViewers on PointerMoved; the sample code shows this as well.
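The lookup works the same way as for invoke, just with the scroll pattern instead. Here’s a rough sketch (not the sample’s exact code): it keeps the previous touch position in a hypothetical _previousPosition field (which you’d also want to reset on PointerDown) and nudges any scrollable element under the finger:

Point _previousPosition; //field on the page

private void Processor_PointerMoved(object sender, TouchPanels.PointerEventArgs e)
{
    var dy = e.Position.Y - _previousPosition.Y;
    _previousPosition = e.Position;
    var elements = VisualTreeHelper.FindElementsInHostCoordinates(
        new Windows.Foundation.Point(e.Position.X, e.Position.Y), this, false);
    foreach (var element in elements.OfType<FrameworkElement>())
    {
        var peer = FrameworkElementAutomationPeer.FromElement(element);
        var scroll = peer?.GetPattern(PatternInterface.Scroll)
            as Windows.UI.Xaml.Automation.Provider.IScrollProvider;
        if (scroll != null && scroll.VerticallyScrollable)
        {
            //Nudge the content one small step in the drag direction
            //(ScrollAmount lives in Windows.UI.Xaml.Automation)
            scroll.Scroll(ScrollAmount.NoAmount,
                dy < 0 ? ScrollAmount.SmallIncrement : ScrollAmount.SmallDecrement);
            break;
        }
    }
}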
So with this code, you can now just focus on creating ordinary XAML with buttons, and any tap on any button automatically executes its click handler.
Now you can move this code to App.xaml.cs so all pages in your app “just work” and you don’t have to create a touch processor on each page – in the sample you can download here, I put it all in MainPage, just to keep things in the same place.
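If you do go the app-wide route, the wiring could look roughly like this (a sketch: FindElementsToInvoke is assumed to be refactored into a helper that takes the page to hit-test as a parameter, and OnLaunched is made async void):

//In App.xaml.cs, at the end of OnLaunched, after the root frame is
//set up. Assumes a calibration file has already been saved.
var tsc2046 = await TouchPanels.Devices.Tsc2046.GetDefaultAsync();
await tsc2046.LoadCalibrationAsync("TSC2046");
_processor = new TouchPanels.TouchProcessor(tsc2046); //field on App
_processor.PointerDown += (s, args) =>
{
    //Route the touch to whatever page is currently shown
    var frame = Window.Current.Content as Frame;
    var page = frame?.Content as Page;
    if (page != null)
        FindElementsToInvoke(page, args.Position);
};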
Of course, a proper touch screen driver would be very helpful and is the natural next step: once a driver is installed, touch would just work in all apps. I just have no idea how to get started on building a touch driver :-)