Tired of swiping? Now an AI simulation helps us understand why
Prolonged scrolling is bad for your well-being, but is it also physically tiring? Until now, we haven't really been able to say. That's why researchers from Aalto University and Leipzig University created a new AI model that simulates muscle activations and energy expenditure to work out how physically effortful smartphone interactions are for users.
'It's the first time anyone has developed a tool that can help designers and developers quickly assess how physically tiring a real mobile user interface could be,' says Antti Oulasvirta, Professor at Aalto University and ELLIS Institute Finland. 'So far, smartphone logs have only told us where a finger has touched the screen – not whether it felt comfortable.'
To bridge this gap, Oulasvirta and his colleagues at Leipzig University developed Log2Motion, an AI model that translates smartphone logs into simulated human motion. The movement of this musculoskeletal simulation is grounded in data from previous motion-capture studies.
In the simulation, a human model consisting of digital bones and muscles moves its index finger to interact with a smartphone laid out on a desk. Through a software emulator, the model can use real mobile apps in real time. It can re-enact logs collected from users to illuminate what happened during an interaction. The Log2Motion model then estimates the motion, speed, accuracy and effort of these biomechanical movements.
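The pipeline described above – sparse touch-log events turned into a dense fingertip trajectory, from which an effort score is estimated – can be sketched roughly as follows. This is an illustrative toy, not the authors' actual model: the names (`TouchEvent`, `interpolate_trajectory`, `effort_estimate`) are assumptions, the linear interpolation stands in for the learned motion synthesis, and the speed-based effort proxy stands in for the real musculoskeletal simulation of muscle activations.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class TouchEvent:
    t: float  # timestamp in seconds
    x: float  # screen position in millimetres
    y: float

def interpolate_trajectory(log, steps_per_segment=10):
    """Densify a sparse touch log into a fingertip path.

    Linear interpolation is a stand-in for the learned motion model,
    which would synthesise biomechanically plausible movement.
    """
    path = []
    for a, b in zip(log, log[1:]):
        for i in range(steps_per_segment):
            f = i / steps_per_segment
            path.append((a.t + f * (b.t - a.t),
                         a.x + f * (b.x - a.x),
                         a.y + f * (b.y - a.y)))
    path.append((log[-1].t, log[-1].x, log[-1].y))
    return path

def effort_estimate(path):
    """Toy effort proxy: integrate squared fingertip speed over time.

    The real model derives effort from simulated muscle activations
    and energy expenditure rather than kinematics alone.
    """
    effort = 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(path, path[1:]):
        dt = t1 - t0
        if dt > 0:
            speed = hypot(x1 - x0, y1 - y0) / dt
            effort += speed ** 2 * dt
    return effort
```

With this kind of proxy, a longer swipe performed in the same time yields a higher effort score than a shorter one, mirroring the article's point that some gestures are more demanding than others.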
The model provides entirely new horizons for smartphone use research – as well as design.
'We found that some gestures are harder to perform – in this case, up-down and down-up swipes,' explains Oulasvirta. 'Small icons and locations toward the corners of the display also require additional effort.'
Using such simulation early in the process could help designers create user-friendly interfaces. It can also provide insight into accessibility needs for users with tremors, reduced strength or prosthetics.
'It is possible to scale the Log2Motion model to simulate other scenarios, such as the more classic one: lying on the couch, holding the phone in one hand and scrolling with the thumb,' Oulasvirta says.
The researchers hope that human simulations will be adopted to help design interactions that are more ergonomic and pleasant for users. In the future, these simulations could be combined with other AI methods to optimise user interfaces for a user's needs.
The paper, 'Log2Motion: Biomechanical Motion Synthesis from Touch Logs', will be presented on 17 April at CHI 2026, the leading conference on human–computer interaction. It is also available online (DOI: 10.1145/3772318.3790773).