Pointer events provide a unified way of handling all input events.
Back in the good old days we only had a mouse, and listening for events was simple. Web content assumed the user's pointing device would be a mouse.
Nowadays we have many devices that don't involve a mouse at all, like phones with touch screens, or pens.
How can we listen for events if the user doesn’t have a mouse?
The touch events interfaces are relatively low-level APIs that can be used to support application specific multi-touch interactions such as a two-finger gesture. A multi-touch interaction starts when a finger (or stylus) first touches the contact surface. Other fingers may subsequently touch the surface and optionally move across the touch surface. The interaction ends when the fingers are removed from the surface. During this interaction, an application receives touch events during the start, move and end phases.
Adding touch event listeners is rather simple, but you have to support both the touch and the mouse events.
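The two-listener setup might look like this. This is a minimal sketch, not the article's original code: the `#circle` element and the helper names are assumptions. The coordinate lookup is pulled into a pure helper because touch and mouse events expose their coordinates differently.

```javascript
// Sketch of the two-listener approach (assumes a <div id="circle"> in the page).
function getPoint(e) {
  // Touch events carry coordinates on e.touches; mouse events carry
  // clientX/clientY directly on the event object.
  const source = e.touches ? e.touches[0] : e;
  return { x: source.clientX, y: source.clientY };
}

function moveCircle(e) {
  const { x, y } = getPoint(e);
  document.getElementById('circle').style.transform =
    `translate(${x}px, ${y}px)`;
}

// Two listeners for the same behavior: one per device type.
if (typeof document !== 'undefined') {
  document.addEventListener('mousemove', moveCircle);
  document.addEventListener('touchmove', moveCircle);
}
```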
Try the example on your phone too. On the desktop version, try splitting the screen and inspecting the events.
Supporting both the touch and the mouse events can become very bloated and hard to maintain, since you basically have to duplicate the same logic for each kind of device.
Remember the example above, where we had two event listeners: one for the mouse and one for touch?
Well, instead of writing two event listeners, we can simply change one of them to pointermove and remove the other.
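A sketch of the unified version, under the same assumptions as before (the `#circle` element is not the article's original markup). One pointermove listener covers mouse, touch, and pen alike, so the per-device branching disappears.

```javascript
// Build the transform string from a pointer event's coordinates.
function toTransform(e) {
  return `translate(${e.clientX}px, ${e.clientY}px)`;
}

// A single pointermove listener replaces the mousemove/touchmove pair.
if (typeof document !== 'undefined') {
  const circle = document.getElementById('circle');
  document.addEventListener('pointermove', (e) => {
    circle.style.transform = toTransform(e);
  });
}
```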
It's amazing what browsers can do today! Fingers crossed we get a fully native experience inside browsers someday.
React recently launched version 16.4 with native pointer events support.
Let's create a quick React project and try out these nifty new events. Update to the latest create-react-app version and start the project.
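The setup could look like the following (the app name pointer-demo is arbitrary; the explicit version bump is only needed if the generated project ships something older than React 16.4):

```shell
# Scaffold a fresh app and enter it
npx create-react-app pointer-demo
cd pointer-demo

# Ensure react/react-dom are at least 16.4, which added pointer event support
npm install react@^16.4 react-dom@^16.4

# Start the dev server (defaults to http://localhost:3000)
npm start
```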
Since pointer events cover touch input, we want to test the app on a phone too. Open it on your phone via the "On Your Network" URL: http://xxx.xxx.x.xx:3000/.
We start off by creating the same circle and declaring a state called
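A hedged sketch of what that component could look like. The class name `Circle` and the state name `coords` are assumptions (the article's own identifiers aren't shown here), `React.createElement` stands in for JSX, and the guard only exists so the pure helper also runs where React isn't loaded. React 16.4 forwards pointer events as regular synthetic events, so `onPointerMove` works just like `onMouseMove` did before.

```javascript
// Pure helper: derive the next circle position from a pointer event.
function nextCoords(e) {
  return { x: e.clientX, y: e.clientY };
}

// Hypothetical component; guarded so the snippet also runs without React.
if (typeof React !== 'undefined') {
  class Circle extends React.Component {
    state = { coords: { x: 0, y: 0 } };

    handlePointerMove = (e) => this.setState({ coords: nextCoords(e) });

    render() {
      const { x, y } = this.state.coords;
      // JSX equivalent: <div className="circle" onPointerMove={...} style={...} />
      return React.createElement('div', {
        className: 'circle',
        onPointerMove: this.handlePointerMove,
        style: { transform: `translate(${x}px, ${y}px)` },
      });
    }
  }
}
```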