Sound field visualization helps us understand complicated sound propagation. However, visualizing a sound field in detail is difficult because it requires many measurement points. In this study, we propose a method, based on physical models and deep learning, for visualizing the scattered sound field around a rigid object, including non-spherical geometries, using a small number of microphones. In simulation experiments on two-dimensional sound fields, the proposed method improved estimation accuracy by introducing the boundary conditions in addition to the wave equation.
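The idea of constraining the estimate by both the wave equation and the boundary conditions can be illustrated with a small sketch. The Python code below is a minimal, hypothetical sketch (not the authors' implementation) of a physics-informed loss: a data term at the few microphone positions, a residual of the 2D wave equation at collocation points, and a rigid-boundary (Neumann) residual on the object surface. The network `net`, the sampling points, and the loss weights are assumptions.

```python
# Minimal sketch of a physics-informed loss combining the 2-D wave equation
# with a rigid (Neumann) boundary condition; network, sampling, and weights
# are hypothetical placeholders.
import torch

def wave_equation_residual(net, xyt, c):
    """Residual p_tt - c^2 (p_xx + p_yy) at collocation points (x, y, t)."""
    xyt = xyt.detach().requires_grad_(True)
    p = net(xyt)
    g = torch.autograd.grad(p, xyt, torch.ones_like(p), create_graph=True)[0]
    p_x, p_y, p_t = g[:, 0:1], g[:, 1:2], g[:, 2:3]
    p_xx = torch.autograd.grad(p_x, xyt, torch.ones_like(p_x), create_graph=True)[0][:, 0:1]
    p_yy = torch.autograd.grad(p_y, xyt, torch.ones_like(p_y), create_graph=True)[0][:, 1:2]
    p_tt = torch.autograd.grad(p_t, xyt, torch.ones_like(p_t), create_graph=True)[0][:, 2:3]
    return p_tt - c**2 * (p_xx + p_yy)

def neumann_residual(net, xyt_b, normals):
    """Rigid boundary: normal derivative of pressure vanishes on the object surface."""
    xyt_b = xyt_b.detach().requires_grad_(True)
    p = net(xyt_b)
    g = torch.autograd.grad(p, xyt_b, torch.ones_like(p), create_graph=True)[0]
    return (g[:, 0:2] * normals).sum(dim=1, keepdim=True)

def total_loss(net, mic_xyt, mic_p, pde_xyt, bnd_xyt, bnd_n, c=343.0, w_pde=1.0, w_bc=1.0):
    data = torch.mean((net(mic_xyt) - mic_p) ** 2)                    # fit the few microphones
    pde = torch.mean(wave_equation_residual(net, pde_xyt, c) ** 2)    # wave equation
    bc = torch.mean(neumann_residual(net, bnd_xyt, bnd_n) ** 2)       # boundary condition
    return data + w_pde * pde + w_bc * bc
```

Dropping the boundary term reduces this to a wave-equation-only constraint, which is the baseline the abstract compares against.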
Sound field visualization helps us understand and analyze sound propagation. To visualize the sound field in an actual room, the sound pressure at every observation point must be measured. However, the number of measurement points increases significantly as the resolution and area of the visualization grow. Moreover, the large amount of data makes it difficult to display all the information in the 3D sound field. In this study, we propose an interactive mixed reality (MR) visualization system for the 3D sound field in a room. We estimate the sound field using a small number of microphones based on a physical model of sound propagation and then visualize it while the visualization region is changed interactively by hand gestures.
Sound field visualization aids in elucidating complicated sound propagation. Numerical simulation is one such visualization method, but it incurs high computational costs. In this study, we propose 2D sound field visualization using deep learning, with the room shape as input. In an experiment, the proposed method yielded high estimation accuracy for visualization in asymmetric trapezoidal rooms.
Mixed reality (MR) can be used to visualize three-dimensional (3D) sound fields in real space. In our previous study, we proposed a sound intensity visualization system using MR. The system visualizes the flow of sound energy in a stationary sound field by measuring sound intensity. However, room impulse responses (RIRs) are essential data when investigating the sound field of a room. Therefore, to understand the time variation of the sound field, it is crucial to visualize the spatial distribution of RIRs. However, measuring multipoint RIRs requires considerable time and effort and a large microphone array. In this paper, we propose an MR visualization system for RIR mapping on two planes based on dynamic RIR measurement using a moving microphone. The proposed system simplifies the measurement of RIRs at multiple points owing to the dynamic measurement capability of a hand-held microphone. In a simulation experiment, the RIRs at the grid points were estimated from the microphone signal and the known moving path of the microphone. The estimated results were visualized as an animation of RIR maps in real space using MR. The experimental results show that the MR animation of RIR maps on two orthogonal planes helps clarify 3D sound propagation.
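One way to make the grid-point estimation from a moving microphone concrete is to pose it as a linear inverse problem. The Python sketch below assumes a simple observation model in which the moving microphone picks up, at each sample, a spatially weighted mix of the grid-point RIRs convolved with the excitation, and solves for the stacked RIRs by Tikhonov-regularized least squares; the interpolation weights, excitation model, and regularization are illustrative assumptions, not the formulation of the system described above.

```python
# Minimal sketch: estimate grid-point RIRs from a moving-microphone recording
# given the known trajectory, as a regularized least-squares problem.
import numpy as np

def build_observation_matrix(traj_weights, excitation, n_rir, n_grid):
    """
    traj_weights: (T, n_grid) spatial weights of the mic position at each sample
    excitation:   (T,) loudspeaker drive signal
    Returns A with y ~= A @ h, where h stacks the grid RIRs (n_grid * n_rir,).
    """
    T = len(excitation)
    A = np.zeros((T, n_grid * n_rir))
    for t in range(T):
        for g in range(n_grid):
            w = traj_weights[t, g]
            if w == 0.0:
                continue
            # convolution row: y[t] += w * sum_k h_g[k] * excitation[t - k]
            k = np.arange(n_rir)
            idx = t - k
            valid = idx >= 0
            A[t, g * n_rir + k[valid]] = w * excitation[idx[valid]]
    return A

def estimate_rirs(y, A, n_grid, n_rir, lam=1e-3):
    """Tikhonov-regularized least squares for the stacked RIR vector."""
    h = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)
    return h.reshape(n_grid, n_rir)
```

The estimated `(n_grid, n_rir)` array can then be replayed frame by frame to produce the animated RIR maps mentioned above.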
To achieve a highly immersive auditory experience, physically accurate sound field reproduction is required. Many local sound field synthesis (SFS) methods have been proposed to accurately reproduce the sound field around a listener. However, the narrow listening area in local SFS prevents the listener from moving freely. In this study, we developed a dynamic 2.5-dimensional local SFS system that moves the listening area to follow the listener using a body-tracking camera. To evaluate the proposed system, azimuth localization accuracy was investigated by measuring interaural level differences in an anechoic chamber. The results show that, when the distance between the listener and the virtual point source is 1.0 m, the proposed system has a smaller azimuthal localization error than conventional SFS.
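The interaural level difference (ILD) used in the evaluation can be computed directly from binaural recordings captured at the listening position. The short Python sketch below uses a broadband RMS-energy ratio, which is an illustrative simplification; band-limited or frame-wise analysis would be natural extensions.

```python
# Minimal sketch: broadband ILD in dB from left/right ear recordings.
import numpy as np

def interaural_level_difference(left, right, eps=1e-12):
    """ILD in dB between left- and right-ear signals (positive: left ear louder)."""
    e_left = np.mean(np.asarray(left, dtype=float) ** 2)
    e_right = np.mean(np.asarray(right, dtype=float) ** 2)
    return 10.0 * np.log10((e_left + eps) / (e_right + eps))
```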
The visualization of the sound field, an invisible physical phenomenon, is useful for detecting noise sources, understanding sound propagation, and architectural acoustics. The room impulse response (RIR) is widely used in various acoustic applications because it characterizes the sound propagation between a source and a measurement point in a room. Thus, measuring RIRs at multiple points allows us to visualize the sound field in more detail; for example, we can observe an animation of the sound wave fronts using RIRs measured at multiple points. However, many microphones and repeated measurements are required to measure RIRs with high spatial resolution. In this paper, we propose a method for visualizing the sound field with high spatial resolution based on estimating the RIRs around the microphones using a small number of microphones. The RIRs are modeled using sparse equivalent sources and the image source method. We conducted an evaluation experiment in an anechoic chamber. By animating the RIRs estimated from three microphones, the sound wave fronts, including the direct sound and first reflections, can be clearly observed.
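As a rough illustration of the sparse equivalent-source idea, the Python sketch below fits the RIRs measured at a few microphones with a sparse set of candidate point sources via LASSO and then predicts the RIR at arbitrary visualization points. The free-field dictionary, the candidate grid, and the solver are assumptions rather than the authors' exact model; image sources could be accommodated by adding mirrored candidate positions across the room boundaries.

```python
# Minimal sketch: sparse equivalent-source fit of measured RIRs and
# prediction of the RIR at new visualization points.
import numpy as np
from sklearn.linear_model import Lasso

def free_field_ir(src, mic, fs, n_rir, c=343.0):
    """Monopole impulse response in free field (nearest-sample delay, 1/r decay)."""
    r = np.linalg.norm(np.asarray(src, float) - np.asarray(mic, float))
    h = np.zeros(n_rir)
    d = int(round(fs * r / c))
    if d < n_rir:
        h[d] = 1.0 / max(r, 1e-3)
    return h

def fit_sparse_sources(mic_pos, rirs, cand_pos, fs, alpha=1e-3):
    """
    mic_pos:  list of microphone positions
    rirs:     (n_mic, n_rir) measured RIRs
    cand_pos: list of candidate equivalent-source positions (add mirrored
              copies here to emulate image sources)
    Returns one sparse gain per candidate source, shared across microphones.
    """
    n_mic, n_rir = rirs.shape
    D = np.vstack([
        np.column_stack([free_field_ir(s, m, fs, n_rir) for s in cand_pos])
        for m in mic_pos
    ])
    y = rirs.reshape(-1)
    model = Lasso(alpha=alpha, max_iter=10000).fit(D, y)
    return model.coef_

def reconstruct_rir(point, gains, cand_pos, fs, n_rir):
    """Predict the RIR at an arbitrary visualization point from the sparse sources."""
    cols = np.column_stack([free_field_ir(s, point, fs, n_rir) for s in cand_pos])
    return cols @ gains
```

Evaluating `reconstruct_rir` on a dense grid of points and animating the result over time gives the kind of wave-front visualization described above.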