
Interactive Machine Learning Visual Scripting for Unity

InteractML brings machine learning visual scripting to Unity, empowering game creators to develop novel gameplay mechanics and control schemes without writing code. You can create ML models by joining nodes together and visualise, in real time, the data from a Unity scene in the graph.

Interactive machine learning (IML) is a subfield of artificial intelligence (AI) research in which users, generally non-experts, can quickly create and test ML models. These models learn input/output mappings from real-time data through examples provided by a human (e.g. when the user moves their arms up and down, the character swims upwards).
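To make that idea concrete, here is a minimal conceptual sketch in Python using scikit-learn. It is purely illustrative and is not InteractML's API: the feature values, labels, and the choice of a k-nearest-neighbours model are all assumptions made for the example. The point is the workflow: record a few human examples, train instantly, and map live input to game actions.

```python
# Conceptual illustration of interactive machine learning (not InteractML's API):
# the user records a few example input/output pairs, a model is trained in an
# instant, and the model then maps live input data to outputs in real time.
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical hand-height and hand-speed features recorded while the user
# demonstrates each gesture a few times.
examples = [
    [0.90, 0.80], [0.85, 0.70], [0.95, 0.90],   # arms moving up and down quickly
    [0.10, 0.05], [0.15, 0.10], [0.20, 0.00],   # arms resting
]
labels = [
    "swim_up", "swim_up", "swim_up",
    "idle", "idle", "idle",
]

# Training is fast enough to stay interactive: record, train, test, repeat.
model = KNeighborsClassifier(n_neighbors=3)
model.fit(examples, labels)

# At runtime, each new frame of input data is mapped to a game action.
live_frame = [0.88, 0.75]
print(model.predict([live_frame])[0])  # -> "swim_up"
```

In InteractML this record-train-run loop is expressed by joining nodes in the graph rather than by writing code like the sketch above.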

Create New Interfaces

Joysticks for videogames? Really? We can do more for our players. Using the power of Interactive Machine Learning, InteractML lets you control what interfaces you use as inputs and what the gestures look and feel like in your game. Let players tell you how they want to play. Set your game's interface free.

We provide a number of ready-made examples using different input devices, ranging from mouse and keyboard to Arduinos and modern VR systems with motion tracking. You can also pipe in your own custom devices if you wish. Anything you can send to Unity can be used in InteractML!

Follow The Project

  • Get the Code

    This entire project is a work in progress. We are currently in an alpha release with slim documentation.

    Download on GitHub
  • Join the Discord Community

    If you have questions, feedback, or want to join our community and studies, you can find us on Discord. Click on the following button and follow the guidelines to join us!

    Say Hi
  • Keep Up To Date

    If you want to keep up to date with the events and updates we release, don't forget to sign up to the mailing list!

Learn about Interactive Machine Learning

Interested in Machine Learning?

InteractML builds heavily on Rebecca Fiebrink's Wekinator project. To jumpstart your journey, you can watch videos about ML & Wekinator here. All of the concepts you will learn for Wekinator can be applied to InteractML and vice versa.

Get Smarter

Events

We organise workshops, talks and game jams where we teach you about InteractML and help you create something with it. Here are some of the latest events we are running:

About Our Team

Rebecca Fiebrink

I am a Reader at the Creative Computing Institute at University of the Arts London. My students, research assistants, and I work on a variety of projects developing new technologies to enable new forms of human expression, creativity, and embodied interaction. Much of my current research combines techniques from human-computer interaction, machine learning, and signal processing to allow people to apply machine learning more effectively to new problems, such as the design of new digital musical instruments and gestural interfaces for gaming and accessibility. I am also involved in projects developing rich interactive technologies for digital humanities scholarship, and machine learning education.




Phoenix Perry

Phoenix Perry creates physical games and user experiences. As an advocate for women in game development, she founded the Code Liberation Foundation. In her role at Goldsmiths, University of London, she lectures on Physical Computing and games and leads an MA in Independent Games and Experience Design. Fostering professional growth and mentoring new leaders in the field, she strives to infuse the industry with new voices. She is currently doing a PhD at Goldsmiths, University of London and has an MS from NYU Tandon School of Engineering.




Marco Gillies

Marco Gillies is Principal Investigator on the 4i Project. Marco’s research centres on how we can create technologies that work with embodied, tacit human knowledge. He has many years’ experience of research into how to generate non-verbal communication for animated virtual characters, particularly for social interaction in virtual reality. His approach focuses on the role actors and performers can play in creating autonomous characters. He has also worked on other forms of immersive experience and embodied interaction, particularly applied to immersive theatre and performance. His recent research has been on human-centred machine learning, in which humans guide machine learning algorithms interactively as a way of making use of tacit human knowledge in artificial intelligence systems.




Ruth Gibson

Ruth Gibson is a Reader at the Centre for Dance Research and a certified teacher in Skinner Releasing Technique. She works across disciplines to produce objects, software and installations in partnership with artist Bruno Martelli as Gibson/Martelli. She exhibits in galleries and museums internationally, creating award-winning projects using computer games, virtual and augmented reality, print and moving image. Ruth has worked as a motion capture performer, supervisor and advisor for Vicon, Motek, Animazoo, Televirtual, and the BBC. A recipient of a BAFTA nomination, a Creative Fellowship from the AHRC, and awards from NESTA, the Arts Council and The Henry Moore Foundation, she won the Lumen Gold Prize and the Perception Neuron contest. Widely exhibited, her work has been shown at the Venice Biennale, SIGGRAPH, ISEA and Transmediale, and is currently touring with the Barbican’s ‘Digital Revolution’. She is PI on Reality Remix, an AHRC/EPSRC Immersive Experiences Award.




Carlos Gonzalez Diaz

I am a PhD Candidate at the Centre for Doctoral Training in Intelligent Games and Games Intelligence (IGGI), spanning the University of York, Goldsmiths and Queen Mary universities. I am passionate about science and coding, and I have developed games and tech in the UK, Sweden and Spain. I am interested in understanding how the use of interactive machine learning in the design of movement interactions for virtual reality games can affect the player experience. I was previously an intern at Sony Interactive Entertainment R&D, researching the PSVR game system. After that, I jumped to this project and architected the first version of InteractML. Currently I am working with the rest of the team on further developing InteractML and exploring its possibilities.




Michael Zbyszynski

Michael Zbyszyński is a lecturer in the Department of Computing, where he teaches perception & multimedia computing, live electroacoustic music, and real-time interaction. His research involves applications of interactive machine learning to musical instrument design and performance. As a musician, his work spans from brass bands to symphony orchestras, including composition and improvisation with woodwinds and electronics. He has been a software developer at Avid, SoundHound, Cycling ’74, and Keith McMillen Instruments, and was Assistant Director of Pedagogy at UC Berkeley’s Center for New Music and Audio Technologies (CNMAT). He holds a PhD from UC Berkeley and studied at the Academy of Music in Kraków on a Fulbright Grant. His work has been included in Make Magazine, the Rhizome Artbase, and on the ARTSHIP recording label.




Nicola Plant

Nicola Plant is a new media artist, researcher and developer currently working as a researcher on a project developing machine learning tools for movement interaction design in immersive media at Goldsmiths, University of London. Nicola holds a PhD in Computer Science that focuses on embodiment, non-verbal communication and expression in human interaction from Queen Mary University of London. She has an artistic practice that specialises in movement-based interactivity and motion capture, creating interactive artworks exploring expressive movement within VR.




Clarice Hilton

Clarice Hilton is a creative technologist and researcher specialising in Unity and immersive artwork. She is a researcher at the University of London developing a movement-based tool to intuitively design interaction in Unity using machine learning. In her interdisciplinary practice she collaborates with filmmakers, dance practitioners, theatre makers and other artists to explore participatory and embodied experiences. She developed an interactive puppetry and AR touring show, If Not Here… Where?, with The Little Angel, Great Ormond Street Hospital. She was the creative technologist on SOMA, a virtual reality experience by Lisa May Thomas exploring the somatic experience between the physical and the virtual in VR. She worked as a developer on The Collider, developed by Anagram, which has toured internationally at Tribeca, Venice Film Festival, IDFA Doc Lab and Sandbox Immersive Festival (best immersive artwork) and was named one of 2019's top immersive experiences by Forbes. She previously taught Interactive Storytelling and Unity at UCL on the Immersive Factual Storytelling course.




Bruno Martelli

Bruno Martelli’s practice examines figure and landscape, transposing sites to create ambiguous topographies exploring the relationship between the natural and the artificial. He works with live simulation, performance capture, installation and video to create immersive virtual realities. He holds a doctorate in Immersive Environments from RMIT. Commissioned by Wallpaper, Selfridges, the Henry Moore Foundation, The Barbican & NESTA, his AHRC projects include: ‘Error Network’, ‘Capturing Stillness – visualisations of dance through motion capture technologies’ and ‘Reality Remix’. He led serious gaming projects to create permanent installations in James Cook University Hospital, Middlesbrough, for the ‘Healing Arts Project’, and Ashfield School in Leicester, part of the ‘Building Schools for the Future’ programme. He directed motion capture for an award-winning UNICEF animation, and his artworks have been commissioned by Great Ormond Street Hospital Trust. Based in London, Bruno collaborates with artist Ruth Gibson as Gibson/Martelli. Their first work together was BAFTA nominated; recently their ground-breaking ‘MAN A’ project won the Lumen Gold Prize.




Get in Touch

If you want to email us, you can do so at hello at interactml.com. You can also find us on Twitter at @getinteractml.

Leave us your email

If you want to keep up to date with events and the latest news, you can do so by leaving your email address here.

Subscribe to mailing list