
Scanning my head in 3D to further virtual reality research

A 3D rendering of my head and shoulders, made as part of the SONICOM project.

I had my head 3D scanned as part of the Imperial-run SONICOM project – here’s my experience and how it contributes to VR research.

At the top of an ordinary metal staircase at the back of Imperial’s Dyson School of Design Engineering is a room like no other I’ve been in. Sitting in what looks like a soundproof chapel is a contraption that wouldn’t look out of place in an episode of Star Trek.

This is the Turret Lab, a room specially fitted out for the SONICOM project as part of its research to develop immersive audio technologies for virtual and augmented reality (VR/AR). Coordinated by Imperial’s Dr Lorenzo Picinali, the Audio Experience Design team seeks to use artificial intelligence (AI) to develop tools that can customize the way sound is delivered in virtual and augmented reality so that it is as similar to real-world sound as possible.

The SONICOM project is managed by Imperial’s research project management team, and as the new science communications manager, I decided to volunteer for the study. What better way to get to know a project than to have a 3D render of your head created as part of it?

A photo of the Turret Lab, consisting of soundproof walls and a machine to measure an individual's HRTF in the middle.

Measuring how my ears receive sound

I was greeted by SONICOM PhD student Rapolas Daugintis and postdoc Dr Aidan Hogg, who walked me through what was to come. They invited me to sit in the chair at the center of the contraption, which, it turned out, was not an interdimensional gateway but a device that would measure something called my “head-related transfer function” (HRTF).

Basically, an HRTF is a mathematical way of describing the unique way a person’s ears receive sound. Because everyone’s head, ears, and shoulders are different sizes and shapes, the way sound waves actually enter your ears is different from person to person.

Recording someone’s HRTF and using it when streaming sound through headphones is the key to creating “binaural” sound. It’s the kind of audio we experience in real life where we can pinpoint where a sound is coming from, made possible by the fact that we have an ear on either side of our head that allows us to notice slight differences in what they receive.
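One of those "slight differences" is the interaural time difference: a sound off to one side reaches the nearer ear a fraction of a millisecond before the farther one. As a rough illustration (not SONICOM's method), the classic Woodworth spherical-head model estimates this delay from head size and source direction:

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference (seconds) for a source at
    the given azimuth (0 = straight ahead, 90 = directly to one side),
    using Woodworth's spherical-head model: ITD = (a/c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A sound directly to one side arrives at the far ear ~0.66 ms later.
print(round(woodworth_itd(90) * 1e6))  # → 656 (microseconds)
```

Even delays this small are enough for the brain to localize a sound, which is why a personalized HRTF, which captures these differences exactly, matters so much.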

Using an HRTF means the headset can reproduce “3D sound”: audio that appears to come from a specific direction, whether far above you or right behind you, as if someone were whispering in your ear.
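In practice, this rendering is often done by convolving a mono signal with the left- and right-ear impulse responses (HRIRs, the time-domain form of an HRTF) for the desired direction. A minimal sketch with toy, hand-made impulse responses rather than a real measured HRTF:

```python
import numpy as np

def render_binaural(mono, hrir_left, hrir_right):
    """Convolve a mono signal with per-ear head-related impulse responses
    to produce a stereo signal that seems to come from the HRIRs' direction."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=-1)

# Toy HRIRs for a source off to the listener's right: the right ear
# hears the click earlier and louder than the left ear.
hrir_l = np.array([0.0, 0.0, 0.0, 0.5])  # delayed and attenuated
hrir_r = np.array([1.0, 0.0, 0.0, 0.0])  # direct
click = np.array([1.0])

stereo = render_binaural(click, hrir_l, hrir_r)
print(stereo.shape)  # → (4, 2): samples x (left, right) channels
```

Real HRIRs are measured per direction, exactly what the rotating-chair session captures, so a renderer would pick the pair matching where the virtual sound should be.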

Apparently, the process of measuring this complex mathematical function involves putting a stocking over my head, placing microphones in my ears, and keeping still while sitting in a chair that slowly rotates as sounds play through loudspeakers all around me.

It might not sound like everyone’s idea of a good time, but the novelty of this bizarre series of events meant it was actually quite enjoyable. And in the end, the team had measured my own personal HRTF.

A 3D render of the author's ear

Matching math to physics

Measuring individual HRTFs is the best way to deliver immersive binaural sound, but realistically, not everyone can have their own HRTF measured in a lab. This is where a 3D scan of my head comes in.

SONICOM wants to relate the mathematical nature of HRTFs to the physical shapes of the ears. This means, for example, that if you could record the shape of your ear, an AI could use it to choose, from a database of measured HRTFs, the one it thinks is closest to your own, and so provide realistic sound.
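At its simplest, that matching idea is a nearest-neighbor lookup. The sketch below uses made-up ear measurements and HRTF labels purely for illustration; the features, values, and IDs are hypothetical, not SONICOM's actual data or model:

```python
import math

# Hypothetical database mapping ear-shape features (say, pinna height
# and concha width in mm) to a stored HRTF identifier.
database = {
    "hrtf_A": (62.0, 18.0),
    "hrtf_B": (58.0, 21.0),
    "hrtf_C": (66.0, 16.0),
}

def closest_hrtf(ear_features):
    """Return the stored HRTF whose ear measurements are nearest
    (by Euclidean distance) to the new listener's measurements."""
    return min(database, key=lambda k: math.dist(database[k], ear_features))

print(closest_hrtf((59.0, 20.0)))  # → hrtf_B, the closest match
```

A real system would learn far richer features from scans and photos, but the principle, mapping ear shape to the best-fitting measured HRTF, is the same.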

This also doesn’t mean that everyone would need their own personal 3D scanner (although I really want one now). The team scanned my head with a high-quality handheld 3D scanner, but also took several photos of it from different angles with just a phone camera.

By linking all of this different data together – HRTFs, 3D scans and static photos – one day someone could sign up for a virtual concert, take a few pictures of their ears, and the AI could then make their experience sound as close to being there in person as possible.

The work SONICOM members do is incredibly exciting, and if the past few years have proven anything, it’s that virtual experiences are here to stay – so let’s make them as realistic as possible.

Want to record your own HRTF and 3D scan your head? The SONICOM project is always looking for new volunteers to add to its database! If you are interested, contact Dr Picinali.
