Interactive Machine Learning for More Expressive Game Interactions with Carlos Gonzalez Diaz
In this session I’ll share some of the findings of my research into how machine learning can be applied in games and how it can enhance positive aspects of player-computer interaction. There is an increasing trend of incorporating a diverse variety of sensors into videogame systems, ranging from game controllers to current VR/AR kits, yet there are no standard practices for designing sensor-based control schemes. Interactive machine learning (IML) is a novel interaction paradigm in which users iteratively build machine learning models. IML has been successfully used to let designers and end users fine-tune, or even design from scratch, wholly personal control schemes for interactive music and other applications. Could these techniques be used in games development?
The research and technology presented in this session explore how using interactive machine learning to design sensor-based control schemes for digital games can improve the player experience, and how simple it can be to use. This technology can be used to quickly prototype controls by directly demonstrating human actions and computer responses, without writing code! Players or designers may use interactive machine learning to design enjoyable control schemes for themselves and create unexpected playful interactions.
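To make the idea concrete, here is a minimal sketch of the demonstrate-then-test loop at the heart of interactive machine learning. This is an illustrative toy (the class name, sensor values, and action labels are invented for this example, not taken from the tools presented in the session): the user demonstrates a few sensor readings paired with game actions, and a simple 1-nearest-neighbour model maps new readings to actions.

```python
import math

class IMLGestureMapper:
    """Toy interactive-machine-learning loop: the user demonstrates
    sensor readings paired with game actions, and new readings are
    classified by the closest demonstrated example."""

    def __init__(self):
        self.examples = []  # list of (sensor_vector, action_label)

    def demonstrate(self, sensor_vector, action):
        # Each demonstration is one "show, don't code" training example.
        self.examples.append((list(sensor_vector), action))

    def predict(self, sensor_vector):
        # Classify by the nearest demonstrated example (Euclidean distance).
        closest = min(self.examples,
                      key=lambda ex: math.dist(ex[0], sensor_vector))
        return closest[1]

# The iterative loop: demonstrate, play-test, add corrective examples, repeat.
mapper = IMLGestureMapper()
mapper.demonstrate([0.9, 0.1], "jump")    # e.g. controller tilted up
mapper.demonstrate([0.1, 0.8], "crouch")  # e.g. controller tilted down
print(mapper.predict([0.85, 0.2]))        # -> jump
```

If a prediction feels wrong during play, the user simply demonstrates a corrective example and tries again; that tight feedback loop, rather than the specific classifier, is what makes the approach accessible to non-programmers.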
I will run through specific tools and examples built with Unity 3D that we researched and developed with the help of a Google grant and presented at the GDC 2019 AI track and Develop:Brighton 2019. Whatever your background, all code and examples will be open sourced for you to explore after the session!