
Deconstructing Crowd Noise at College Basketball Games




WASHINGTON, D.C., NOVEMBER 5, 2018 – With thousands of fans cheering, chanting, yelling and jeering, college basketball games can be almost deafening. Some arenas have decibel meters that, accurately or not, give some indication of the noise generated by spectators and sound systems. Rarely, however, is crowd noise itself the focus of scientific research.

"Whenever it comes up in the literature, it's mostly something that researchers are trying to understand," said Brooks Butler, an undergraduate physics student at Brigham Young University (BYU), who will present the research at the 176th Meeting of the Acoustical Society of America, held in conjunction with the Canadian Acoustical Association's Acoustics Week in Canada, November 5-9, at the Victoria Conference Centre in Victoria, Canada.

"Crowd noise is usually treated as background interference – something to filter out." But the BYU researchers felt that crowd noise was worthy of investigation in its own right. In particular, they wanted to see whether machine learning algorithms could pick out patterns in the raw acoustic data that indicated what the crowd was doing at a given moment, providing clues about what was happening in the game itself. One possible application could be the early detection of unruly or violent crowd behavior, although this idea has not yet been tested.

The BYU team made high-fidelity acoustic measurements during men's and women's basketball games at the university, later doing the same for football and volleyball matches. They divided the games into half-second intervals, measuring the frequency content (as shown in spectrograms), sound levels, the ratio of maximum to minimum sound level within a defined block of time, and other variables. They then applied signal processing tools that identified 512 distinct acoustic features, comprising different frequency bands, amplitudes, and so on.
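The press release does not include the team's actual analysis code. As an illustrative sketch only, splitting a recording into half-second blocks and computing a few per-block features might look like the following in Python; the sample rate, block length, and particular features chosen here are assumptions, not the team's published pipeline:

```python
import numpy as np

def block_features(signal, fs=48000, block_s=0.5):
    """Split a mono signal into fixed-length blocks and compute
    simple per-block acoustic features (illustrative only)."""
    n = int(fs * block_s)  # samples per half-second block
    blocks = [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]
    feats = []
    for b in blocks:
        spectrum = np.abs(np.fft.rfft(b))                 # magnitude spectrum
        level = 10 * np.log10(np.mean(b**2) + 1e-12)      # mean-square level in dB
        crest = np.max(np.abs(b)) / (np.sqrt(np.mean(b**2)) + 1e-12)
        # Keep two scalar features plus the first 8 spectral bins per block
        feats.append(np.concatenate([[level, crest], spectrum[:8]]))
    return np.array(feats)

# Two seconds of synthetic noise stands in for a crowd recording
rng = np.random.default_rng(0)
x = rng.standard_normal(2 * 48000)
F = block_features(x)
print(F.shape)  # (4, 10): four half-second blocks, ten features each
```

A real feature set like the team's 512-dimensional one would draw on many more frequency bands and level statistics per block, but the block-then-featurize structure is the same.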

The group used these variables to construct a 512-dimensional space, applying machine learning techniques to perform a computerized cluster analysis of this complicated, multidimensional domain.

BYU physics professor Kent Gee was the lead investigator on the project, along with professors Mark Transtrum and Sean Warnick. Together, they led a multi-student team focusing on different aspects of the problem, including data collection, analysis, and machine learning.

Gee explained the process with a simple analogy. "Suppose you have a scatter of points on a two-dimensional x-y chart and measure the distances between those points," he said. "You might see that the points are grouped into, say, three distinct clusters. We did something similar with our 512-dimensional space, although you obviously need a computer to keep track of all of that."
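Gee's two-dimensional analogy can be sketched numerically: points within a group lie much closer to each other than to points in other groups, which is exactly what a clustering algorithm exploits. The synthetic point groups below are an illustration, not the team's data:

```python
import numpy as np

# Three loose groups of 2-D points, as in the x-y chart analogy
rng = np.random.default_rng(1)
centers = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
points = np.vstack([c + rng.normal(scale=0.5, size=(20, 2)) for c in centers])

# All pairwise Euclidean distances between the 60 points
d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
same = d[:20, :20].mean()    # mean distance within the first group
cross = d[:20, 20:40].mean() # mean distance between the first two groups
print(same < cross)          # True: within-group distances are far smaller
```

The same distance computation works unchanged in 512 dimensions; only the cost of computing and inspecting the distances grows, which is why "you obviously need a computer to keep track of all of that."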

The k-means clustering analysis they ran revealed six separate groups that corresponded to what was happening in the arena: whether people were cheering, chanting, booing, falling silent, or letting the loudspeakers dominate the soundscape.
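As a hedged sketch of the technique (the team's implementation and data are not published here), k-means alternates between assigning each feature vector to its nearest centroid and moving each centroid to the mean of its assigned points. The minimal version below runs on synthetic 2-D stand-ins for the real 512-dimensional feature vectors:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Minimal k-means: alternately assign each point to its nearest
    centroid, then move each centroid to the mean of its points."""
    # Simple deterministic init: spread the starting centroids across the data
    centroids = X[np.linspace(0, len(X) - 1, k).astype(int)].astype(float)
    for _ in range(iters):
        # Distance from every point to every centroid
        d = np.linalg.norm(X[:, None] - centroids[None, :], axis=-1)
        labels = d.argmin(axis=1)  # nearest-centroid assignment
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels

# Synthetic stand-ins for per-block feature vectors (the real data had 512 dims)
rng = np.random.default_rng(2)
X = np.vstack([c + rng.normal(scale=0.3, size=(30, 2))
               for c in [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]])
labels = kmeans(X, k=3)
print(sorted(set(labels.tolist())))  # [0, 1, 2]: all three groups recovered
```

In the team's application, k would be six and each cluster label would correspond to one crowd state (cheering, chanting, booing, and so on); in practice a library implementation such as scikit-learn's KMeans would replace this hand-rolled loop.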

In this way, Gee and his colleagues were able to gauge the mood of the crowd purely from machine analysis of the sound data. "An important potential application of our research," he said, "may be the early detection of unruly or violent crowd behavior."

###

Brooks A. Butler, Mylan R. Cook, Kent L. Gee, Mark K. Transtrum, Sean Warnick, Eric Todd and Harald Larsen will present this work on Monday, November 5, at 4:25 p.m. in the Shaughnessy Room (FE) of the Victoria Conference Centre in Victoria, British Columbia, Canada.

———————– MORE MEETING INFORMATION ———————–

USEFUL LINKS

Main meeting site: https://acousticalsociety.org/asa-meetings/

Meeting technical program: https://ep70.eventpilotadmin.com/web/planner.php?id=ASAFALL18

Hotel information: https://acousticalsociety.org/asa-meetings/#hotel

WORLDWIDE PRESS ROOM

In the coming weeks, the ASA Worldwide Press Room will be updated with additional tips on dozens of newsworthy stories and with lay-language papers, which are 300-800 word summaries of presentations written by scientists for a general audience and accompanied by photos, audio and video. You can visit the site, beginning in late October, at http://acoustics.org/world-wide-press-room/.

PRESS REGISTRATION

We will grant free registration to accredited journalists and professional freelance journalists. If you are a reporter and would like to attend, please contact Rhys Leahy or the AIP Media Line ([email protected], 301-209-3090). We can also help with setting up interviews and obtaining images, sound clips or background information.

LIVE MEDIA WEBCAST

A press conference featuring a selection of newsworthy research will be webcast live from the conference on Tuesday, November 6, 2018. Times and topics to be announced. Members of the media must register in advance at http://aipwebcasting.com.

ABOUT ASA

The Acoustical Society of America (ASA) is the premier international scientific society in acoustics, devoted to the science and technology of sound. Its 7,000 members worldwide represent a broad spectrum of the study of acoustics. ASA publications include The Journal of the Acoustical Society of America (the world's leading journal on acoustics), Acoustics Today magazine, books, and standards on acoustics. The society also holds two major scientific meetings each year. For more information about ASA, visit https://acousticalsociety.org.

ABOUT CAA

The Canadian Acoustical Association (CAA) is an interdisciplinary professional organization that fosters communication among people working in all areas of acoustics in Canada; promotes the growth and practical application of knowledge in acoustics; encourages education, research, protection of the environment, and employment in acoustics; and serves as an umbrella organization through which general issues in education, employment and research can be addressed at a national and multidisciplinary level. For more information about the CAA, visit http://caa-aca.ca.

Legal Notice: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.
