
Better late than never – we made a promise and we're going to keep it. Back in May this year, just after the NeuroGaming 2013 conference and expo – where people from around the world came together in San Francisco to discuss the present state, and the future, of neurogaming – we promised that we would post two exclusive interviews taken at this first-of-its-kind event. The first interview we published was with Ariel Garten, CEO and co-founder of InteraXon; you can watch and read it here. Now here comes the second interview, this time with Zack Lynch, the brain behind the whole NeuroGaming event.

The interview (scroll down for the video) was recorded by NeuroGadget contributor Melanie on May 1, 2013, at the NeuroGaming 2013 event in San Francisco. As was the case with our Ariel Garten interview, the audio quality of this video is not perfect either (to say the least). Hopefully the full transcript below will make the video more enjoyable for everyone.

We want to say thank you to Zack for bringing this event to life and also for taking the time to talk with us.

Full transcript of our interview with Zack Lynch

Melanie (NeuroGadget): I was hoping you could tell me a little bit about how you think the conference is going, and what you think the highlights are in terms of the future of Brain-Computer Interface (BCI) technology being integrated into the gaming world.

Zack Lynch (NeuroGaming): Well, I mean, it’s the first ever NeuroGaming Conference and Expo, so my hope was, first of all, just to bring the community together to begin to have the conversation around different types of technologies, and how the interplay between them could begin to spark new experiences for players – and I think it’s a great success. I mean, the conversations that I’m having with people and hearing them have in the hallways, at the Q&A sessions during each [panel] session – people are really trying to push the edge of where the technology is, where it’s going, how inexpensive sensors will allow us to do fundamentally new things in game design – not just for gaming, but for health, for education, a little bit for national defense, so I’m really excited!

I mean, highlights? Right now, every panel is a highlight because each one brings sort of a new area of neurogaming into focus. Instead of just having the names of a couple of companies, what we’re seeing is the actual people running these companies – the inventors, the cutting-edge designers – actually talking about these cutting-edge technologies.

We started the conference out with the ‘Sensory Gaming Platforms’ panel discussion – I mean, the haptics technology discussion from Disney – WOW – that was absolutely fantastic. Then we moved on to the ‘Emotion Sensing’ panel and got to hear from Valve what they’re doing in game design research and how they’re integrating biometric feedback into the game design process. Then the ‘Cognitive Gaming Platforms’ panel, where people were talking about BCI and how the BCI technologies are being used. It just keeps building, right?

The ‘Investing in Neurogaming’ panel was very, sort of, honest and refreshing. We didn’t have people saying ‘This is going to be huge!’ – we had people saying ‘Well, we don’t know yet.’ It still seems sort of fragmented. But then again, what we’re trying to do here is create a community that actually comes together and creates that next big thing. When I get to listen to the game developers talk about where they can take these different technologies, it’s just heartwarming. It’s amazing to see these people down here in the expo arena with their different technologies, and the conversations are just going and going. The lines to try the different technologies just keep moving forward.

M: So I was a little bit surprised at how much of the focus of the cognitive and sensory gaming platforms was geared towards healthcare applications, as opposed to, like, commercial integration into ‘gaming’ applications. Did you expect that, or did that surprise you as well?

ZL: Well, no, the genesis of neurogaming has really been in the health space. Really, what we’re trying to do by bringing the neurogaming meme into the entertainment industry is to make neurogaming an interesting, attractive, and profitable space for the overall entertainment gaming industry. It’s sort of new to them, so finding enough people who have cutting-edge experience in the real entertainment neurogaming space, rather than the therapeutic neurogaming space, is a challenge – and because there’s a little bit more history and a little bit more expertise on the therapeutic side, the conversation sort of shifts that way.

What I’m really trying to do is bring the game designers in. That’s why we’re hosting it right in Yetizen, which is a game accelerator right in the middle of San Francisco. We’ve got to bring the game designers – the entertainment game designers – into the conversation, and I think in future years what we’ll see is a lot more of these entertainment-focused discussions as that becomes a profit center, and as that becomes a viable market for all the neurotech-enabled game design products.

M: One more question: if you had to guess which modality of neurogaming will catch on first – will it be EEG headsets being integrated more into the gaming world, or haptic technology – which do you think is the best avenue for integration?

ZL: It’s impossible to know, and it’s all happening at once. It’s a convergence, and that’s why it’s so exciting. That’s why this isn’t just a haptics technology conference, that’s why I didn’t focus only on sound design and scent design – that’s why I wanted to bring together all of these different verticals into one area and one meeting, and get that conversation going. It really is about the cheap sensors across all these different areas making possible all these new types of game mechanics and experiences that will push the possibilities of play – and that’s what makes it exciting.

So, I think the answer is convergence. It’s Oculus Rift, with a headset to improve your concentration, an EEG to understand your emotional reactions, and facial tracking technology to capture pupil dilation and figure out what you’re actually looking at. All four of those combined with sound, some haptics-oriented headsets, a haptics-oriented chair.

While that sounds like a lot, a couple of years down the road those converged technologies might be a $400–$500 device system – that’s an iPad! If you could get all of that in one system, think of the experiences. So it’s really about the convergence, not about one thing being the single thing. I mean, I think obviously Oculus Rift is going to take off very quickly – there’s a lot of interest there – and some of the EEG headsets are catching on in new ways. I love what Puzzlebox is doing, taking it into the real world of gaming. There are a lot of exciting things happening.

M: Awesome, thank you so much!

ZL: You’re welcome.