Brain Relevance Feedback for Interactive Image Generation

UIST'20, 2020

Carlos de la Torre-Ortiz, Michiel Spapé, Lauri Kangassalo and Tuukka Ruotsalo

Image generation with multiple visual features inferred directly from EEG via brain-computer interfacing


  • We present a first-of-its-kind interactive brain-computer interface for image generation, combining generative adversarial networks (GANs) with brain feedback.

  • We demonstrate the technique on realistic tasks (such as generating a blond face) and on complex combinations of tasks (such as generating an old female face that is not smiling), and show that it can generate images matching user intentions.

  • We validate the approach in an offline study. Our results show performance significantly better than random feedback and approaching that of manually provided explicit feedback. The results and generated example images suggest that the approach is practical for real-world use.

Abstract

Brain-computer interfaces (BCIs) are increasingly used to perform simple operations such as moving a cursor, but have remained of limited use for more complex tasks. In our new approach to BCI, we use brain relevance feedback to control a generative adversarial network (GAN). We obtained EEG data from 31 participants who viewed face images while concentrating on particular facial features. An EEG relevance classifier was then trained, and its output was propagated as feedback to the latent image representation provided by the GAN. Estimates of the latent vectors matching the relevance criteria were iteratively updated to steer the image generation process toward mental targets. A double-blind evaluation showed high accuracy (86.26%) against random feedback (18.71%), not significantly lower than with explicit feedback (93.30%). Furthermore, we show the feasibility of the method with simultaneous task targets, demonstrating BCI operation beyond individual task constraints. Thus, brain relevance feedback can validly control a generative model, overcoming a critical limitation of current BCI approaches.
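The iterative feedback loop described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function names, the latent dimensionality, and the simple relevance-weighted mean update are all assumptions, and the EEG relevance classifier is replaced by a random stand-in.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 512  # assumed GAN latent dimensionality

def classify_relevance(latents):
    """Stand-in for the EEG relevance classifier: returns one
    relevance score in [0, 1] per shown image (here: random)."""
    return rng.random(len(latents))

def feedback_iteration(estimate, n_shown=8, noise=0.5, threshold=0.5):
    """One round of brain relevance feedback: sample candidate
    latents around the current estimate, score them, and move the
    estimate to the relevance-weighted mean of the relevant ones."""
    candidates = estimate + noise * rng.standard_normal((n_shown, LATENT_DIM))
    scores = classify_relevance(candidates)
    relevant = scores > threshold
    if relevant.any():
        weights = scores[relevant] / scores[relevant].sum()
        estimate = weights @ candidates[relevant]
    return estimate

# Iterate the estimate toward the mental target; each iteration
# would correspond to showing a new batch of generated faces.
estimate = rng.standard_normal(LATENT_DIM)
for _ in range(10):
    estimate = feedback_iteration(estimate)
```

In the actual system the updated latent estimate is decoded by the GAN generator into a new image at each round; here only the latent-space update is shown.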

Materials

Publication

Carlos de la Torre-Ortiz, Michiel Spapé, Lauri Kangassalo and Tuukka Ruotsalo

Brain Relevance Feedback for Interactive Image Generation

UIST'20

@inproceedings{delatorreortiz2020brain,
author = {de la Torre-Ortiz, Carlos and Spap\'{e}, Michiel M. and Kangassalo, Lauri and Ruotsalo, Tuukka},
title = {Brain Relevance Feedback for Interactive Image Generation},
year = {2020},
isbn = {9781450375146},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3379337.3415821},
booktitle = {Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology},
pages = {1060--1070},
numpages = {11}
}