AI learns to type on a phone like humans

A new computational model precisely replicates the eye and finger movements of touchscreen typists, and could lead to better auto-correction and keyboard designs suited to users with unique needs.

Touchscreens are notoriously difficult to type on. Because we cannot feel the keys, we rely on vision both to guide our fingers to the right places and to check the text for errors, two tasks our eyes cannot perform at the same time. To understand how people really type on touchscreens, researchers at Aalto University and the Finnish Center for Artificial Intelligence (FCAI) have created the first artificial intelligence model that predicts how people move their eyes and fingers while typing.

The AI model can simulate how a human user would type any sentence on any keyboard design. It makes errors, detects them (though not always immediately) and corrects them, much as a human would. The simulation also predicts how people adapt to changing circumstances, such as how their typing style shifts when they start using a new auto-correction system or keyboard design.
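To make the behaviour described above concrete, here is a minimal toy sketch, not the Aalto/FCAI model itself: a simulated typist that presses keys with occasional slips onto neighbouring keys, proofreads the text field only every few keystrokes (so errors are sometimes caught late), and backspaces to correct what it finds. Every name and parameter here (ERROR_RATE, PROOFREAD_EVERY, NEIGHBOURS, simulate_typing) is invented purely for illustration.

```python
# Toy illustration only; not the researchers' model, which learns this
# behaviour rather than hard-coding it.
import random

ERROR_RATE = 0.1       # assumed chance of hitting a neighbouring key
PROOFREAD_EVERY = 4    # keystrokes typed between glances at the text field

# Tiny stand-in for keyboard geometry: which wrong keys a slip can land on.
NEIGHBOURS = {"a": "qs", "e": "wr", "o": "ip", "t": "ry", "n": "bm", "h": "gj"}


def noisy_key(ch: str) -> str:
    """Return the intended key, or occasionally a neighbouring key."""
    if ch in NEIGHBOURS and random.random() < ERROR_RATE:
        return random.choice(NEIGHBOURS[ch])
    return ch


def simulate_typing(sentence: str, seed: int = 0) -> list[str]:
    """Simulate typing `sentence`; return the trace of key presses and backspaces."""
    random.seed(seed)
    typed, trace = [], []
    i = 0
    while i < len(sentence):
        key = noisy_key(sentence[i])
        typed.append(key)
        trace.append(f"press '{key}'")
        i += 1
        # Proofread only every few keystrokes, so errors are found with a delay.
        if i % PROOFREAD_EVERY == 0 or i == len(sentence):
            # Backspace until the typed text matches the intended prefix again.
            while typed and typed != list(sentence[: len(typed)]):
                typed.pop()
                trace.append("backspace")
                i -= 1
    return trace


if __name__ == "__main__":
    for step in simulate_typing("not on a phone"):
        print(step)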

Read more here.