Dancing with Myself: Interdisciplinary Machine Learning Methods for Choreography (feat. Mariel Pettee)

A recording of the event can be found here: Kaltura, YouTube

DATE & TIME: Monday, April 18, 5:00 - 6:00 PM
LOCATION: Paino Lecture Hall

In these years marked by physical distance, Dr. Pettee’s primary dance partner has been a machine learning (ML) model. Inspired by her applications of ML in the domain of high-energy particle physics during her PhD, over the past few years she has led several independent teams of researchers across academia, industry, and the arts to create state-of-the-art ML-generated choreography using techniques including Variational Autoencoders and Graph Neural Networks. In this talk, she will discuss the research trajectories leading to her models, which were designed to understand dynamic many-body systems, along with their generative implications for her creative practice.
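For readers wondering what a Variational Autoencoder looks like in this kind of setting, the sketch below is a minimal, hypothetical illustration (the joint count, layer sizes, and training loop are assumptions, not the speaker's actual code): each motion-capture pose is flattened into a vector, a small latent space is learned, and new poses can then be decoded from samples in that space.

```python
# Hypothetical sketch of a pose Variational Autoencoder (VAE).
# Assumes each pose is 53 motion-capture joints x 3 coordinates, flattened to 159 values.
# Illustrative only -- not the architecture presented in the talk.

import torch
import torch.nn as nn
import torch.nn.functional as F

N_JOINTS = 53          # assumed marker/joint count
POSE_DIM = N_JOINTS * 3
LATENT_DIM = 16

class PoseVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(POSE_DIM, 128), nn.ReLU())
        self.to_mu = nn.Linear(128, LATENT_DIM)
        self.to_logvar = nn.Linear(128, LATENT_DIM)
        self.decoder = nn.Sequential(
            nn.Linear(LATENT_DIM, 128), nn.ReLU(), nn.Linear(128, POSE_DIM)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.decoder(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction error plus KL divergence to a standard normal prior.
    recon_err = F.mse_loss(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_err + kl

# Toy usage with random "poses" standing in for real motion-capture frames.
model = PoseVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
fake_poses = torch.randn(256, POSE_DIM)
for _ in range(5):
    recon, mu, logvar = model(fake_poses)
    loss = vae_loss(recon, fake_poses, mu, logvar)
    opt.zero_grad()
    loss.backward()
    opt.step()

# "Generating choreography": sample latent vectors and decode them into new poses.
with torch.no_grad():
    new_poses = model.decoder(torch.randn(10, LATENT_DIM)).reshape(10, N_JOINTS, 3)
```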

Biography

Mariel Pettee (Physics PhD, Yale University) is a Chamberlain Postdoctoral Research Fellow at Lawrence Berkeley National Laboratory. Her research encompasses the development of custom machine learning techniques for high-energy particle physics as well as astrophysics. She is particularly interested in creating generic techniques that have broad applicability across other areas of fundamental science and art. Since 2017, she has also led independent teams of researchers using machine learning to generate choreography based on 3D motion capture of her own movements. As a choreographer, director, and performer, she also uses theater and dance work to research audience activation, duration, power, self-documentation, authenticity, fear, and playfulness. Most recently, she has worked with companies including Kinetech Arts, LeeSaar, Urban Bush Women, Paul Taylor 2, and the Bill T. Jones/Arnie Zane Company. She was a choreographer-in-residence at Harvard University’s Dance Center in 2014. Prior to her PhD, she earned her Bachelor’s in Physics & Mathematics from Harvard University and her Master’s in Physics at the University of Cambridge (Trinity College) as a Harvard-Cambridge Scholar.

Looks like a fun talk. Looking forward to it!!!

Great presentation. Very thought-provoking. FYI: that school on the other side of town already has a lab set up to capture similar datasets.

Their motion capture system appears to have much smaller body markers than what Dr. Pettee used during her first study.

They also have a set of Delsys (a local company) EMG motion-recording sensors. These measure body movement via gyroscopes and accelerometers, and muscle exertion as bioelectrical (EMG) activity, roughly a measure of strain relative to capacity.

Last time I checked (just prior to COVID), this equipment had seen surprisingly little use. Maybe using ML/AI to formulate insights beyond simple comparison and measurement could help change that.
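As a concrete (and purely illustrative) starting point before any ML, raw EMG is typically rectified, low-pass filtered, and normalized against a maximum-voluntary-contraction reference to get the kind of "exertion relative to capacity" signal mentioned above. The sketch below assumes a sampling rate and calibration value; it is not Delsys-specific code.

```python
# Purely illustrative sketch: turning raw EMG into an "exertion relative to capacity" signal.
# Hypothetical pipeline (rectify -> smooth -> normalize by a maximum-voluntary-contraction value);
# the sampling rate and MVC reference are assumptions, not Delsys specifics.

import numpy as np
from scipy.signal import butter, filtfilt

FS = 2000.0  # assumed EMG sampling rate in Hz

def emg_envelope(raw_emg, fs=FS, cutoff_hz=6.0):
    """Rectify the EMG signal and low-pass filter it into a smooth activation envelope."""
    rectified = np.abs(raw_emg - np.mean(raw_emg))        # remove offset, full-wave rectify
    b, a = butter(4, cutoff_hz / (fs / 2), btype="low")   # 4th-order low-pass filter
    return filtfilt(b, a, rectified)

def exertion_fraction(raw_emg, mvc_envelope_peak):
    """Express the envelope as a fraction of a maximum-voluntary-contraction (MVC) reference."""
    return emg_envelope(raw_emg) / mvc_envelope_peak

# Toy usage with synthetic data standing in for a recorded channel.
t = np.arange(0, 5, 1 / FS)
synthetic_emg = np.random.randn(t.size) * (0.5 + 0.5 * np.sin(2 * np.pi * 0.5 * t))
mvc_peak = emg_envelope(np.random.randn(t.size) * 1.5).max()  # stand-in MVC calibration
effort = exertion_fraction(synthetic_emg, mvc_peak)           # values roughly in [0, 1]
```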

Is there any knowledge about what it takes to interact with the equipment? If someone were interested in things like that, what would they have to do in order to learn more and potentially use the equipment?

The staff at UMA IALS would operate (or facilitate operation of) the equipment. This comes with a cost, but it does avoid a rather steep learning curve. It only returns input data, similar to what Dr. Pettee used to inform her models.
The lab also has this software. It looks interesting, but that’s all I know about it.

As for how to cover the research expense on a student budget… well… if you formed a small business (S-corp, perhaps d/b/a) focused on creating a product (or maybe just becoming a motion-capture data service provider would work), Massachusetts offers a very generous voucher program that effectively reduces lab charges by 75% for start-ups or companies with fewer than 10 workers.

There was also a seed grant program for smaller (or perhaps more experimental) projects. Links to it are still present, but the webpage has no content. Unsure what happened with that.

If you were looking to do something related to human movement, the next step would probably be reaching out to the lab to discuss your project idea. A copy (or recording) of Dr. Pettee’s presentation could be very helpful, since this kind of work can be complicated to explain otherwise. Prof. Spector (iirc) also maintains faculty status at CICS with the University, so he might also be a helpful resource for making introductions, etc.

And if you’re considering a product/service-provider business approach in order to use (or apply for) the voucher program, you might want to consider the iCorps program. I’m unsure if the College has a chapter, but UMass does.

iCorps is a business development program to aid student inventors with commercialization of their products. Fun fact: the program’s national office is located in Hadley.

The contact at the University is Karen Utgoff. Although she’s not necessarily linked to the CH2P lab, her office (iirc) is across the hall from it at IALS. (And yeah, the iCorps info was slightly off-topic, but it ties into the voucher program, project funding strategies, the Five College ecosystem, etc.)

Hope that helps. Thanks.

Thank you.

It’s really great to hear about all the tools we have available! Maybe we can find students working on projects that could make use of them!

Perhaps the most intriguing pieces of data-collection equipment at the CH2P lab are the 1,000 health-data wristbands and smartphones.

The devices are (iirc) Microsoft Bands. Details aren’t discussed much because the hardware was discontinued long ago. What’s unique about the Band is that it could measure galvanic skin response.

No other piece of consumer-grade equipment (beyond one Fitbit model) can do this, and the Band is/was the only one with an open-source(ish) tool for collecting and aggregating that data. (The Fitbit path gets expensive quickly.)

I can’t speak to how the Band measured those electrical currents, but there is some accepted correlation between skin conductivity and emotional response. I’ll let you research that claim further.

So what could one do with this stuff… well… perhaps measure the emotive response of an audience during a dance performance? (or any artistic work constructed to elicit an emotional human response)

Hollywood does this, Madison Ave does this, and I’m sure those in Menlo Park must do this (combined with pupil dilation, EEG, etc.). Nobody talks about it… or uses it to advance art rather than advertising. (soapbox out)
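To make the audience-measurement idea concrete, here is a rough, hypothetical sketch of how one might summarize a galvanic skin response trace: separate a slow tonic baseline from fast phasic fluctuations and count response-like peaks per minute as a crude arousal proxy. The sampling rate and thresholds are assumptions, not anything specific to the Band.

```python
# Hypothetical sketch: a crude "audience arousal" summary from a galvanic skin response trace.
# Separates a slow baseline (tonic level) from fast fluctuations (phasic responses) and counts
# response-like peaks per minute. Window sizes and thresholds are assumptions, not Band specifics.

import numpy as np
from scipy.ndimage import uniform_filter1d
from scipy.signal import find_peaks

FS = 5.0  # assumed wristband sampling rate in Hz

def arousal_summary(gsr_microsiemens, fs=FS):
    # Tonic level: slow moving average over roughly 30 seconds.
    tonic = uniform_filter1d(gsr_microsiemens, size=int(30 * fs))
    # Phasic component: what remains after removing the slow baseline.
    phasic = gsr_microsiemens - tonic
    # Count skin-conductance-response-like peaks above an arbitrary amplitude threshold.
    peaks, _ = find_peaks(phasic, height=0.05, distance=int(2 * fs))
    minutes = len(gsr_microsiemens) / fs / 60.0
    return {"responses_per_minute": len(peaks) / minutes,
            "mean_tonic_level": float(tonic.mean())}

# Toy usage on synthetic data standing in for one audience member's recording.
t = np.arange(0, 10 * 60, 1 / FS)                            # ten "minutes" of performance
synthetic_gsr = 2.0 + 0.001 * t + 0.1 * np.random.rand(t.size)
print(arousal_summary(synthetic_gsr))
```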

The most challenging aspect of this field of research (right now, and imho) is discovering novel ways to inform and construct models. It’s what (again, imho) made Dr. Pettee’s presentation so memorable and intriguing.
