MGT-python

The Musical Gestures Toolbox for Python is a collection of tools for visualizing and analyzing audio and video files.


📖 Documentation & Examples

Quick Start

Installation

pip install musicalgestures

musicalgestures installs its core Python dependencies automatically. You still need a working ffmpeg installation on your system for video processing.
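Since ffmpeg must be installed separately, it can be useful to confirm it is on your PATH before loading any video. The following sketch uses only the Python standard library; the ffmpeg_available helper is illustrative and not part of musicalgestures:

```python
import shutil
import subprocess

def ffmpeg_available():
    """Return True if an ffmpeg executable is on the system PATH."""
    return shutil.which("ffmpeg") is not None

if ffmpeg_available():
    # Print the first line of `ffmpeg -version`, e.g. "ffmpeg version 6.0 ..."
    out = subprocess.run(["ffmpeg", "-version"],
                         capture_output=True, text=True).stdout
    print(out.splitlines()[0])
else:
    print("ffmpeg not found - install it before processing video")
```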

Basic Usage

import musicalgestures as mg

# Load a video
v = mg.MgVideo('dance.avi')

# Create visualizations
v.grid()
v.videograms()
v.average()
v.history()

# Perform motion analysis
v.motion()

# Audio analysis
v.audio.waveform()
v.audio.spectrogram()
v.audio.tempogram()

# Pose estimation
v.pose(model='body_25', device='cpu')

Runtime Notes

  • ffmpeg is required for video I/O and preprocessing.
  • pose() downloads OpenPose weights on first use if they are missing.
  • In notebooks and other non-interactive runs, missing pose weights are downloaded automatically when possible.
  • If device='gpu' is requested but OpenCV CUDA support is unavailable, pose() falls back to CPU execution.
  • flow.dense(), flow.sparse(), and blur_faces() use CPU by default (use_gpu=False). Set use_gpu=True to opt into CUDA acceleration with automatic CPU fallback.
  • get_cuda_device_count() is available to quickly check whether OpenCV sees CUDA devices.
  • blur_faces() returns the generated result object consistently, including when save_data=True.
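The GPU fallback behavior described above can be sketched as a small decision function. pick_backend is a hypothetical helper written for illustration, not the library's internal API; it only mirrors the documented rule that CUDA is used when it is both requested and available:

```python
def pick_backend(use_gpu, cuda_device_count):
    """Run on the GPU only when it was requested AND OpenCV reports
    at least one CUDA device; otherwise fall back to the CPU."""
    if use_gpu and cuda_device_count > 0:
        return "gpu"
    return "cpu"

# With no CUDA devices, a GPU request silently falls back to CPU.
print(pick_backend(True, 0))   # cpu
print(pick_backend(True, 2))   # gpu
```

In practice, the value reported by the toolbox's get_cuda_device_count() would play the role of the second argument before opting into use_gpu=True.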

Try Online

Open In Colab

Features

  • Video Analysis: Motion detection, optical flow, pose estimation
  • Audio Processing: Spectrograms, audio descriptors, tempo analysis
  • Visualizations: Motiongrams, videograms, motion history
  • Integration: Works with NumPy, SciPy, and Matplotlib ecosystems
  • Cross-platform: Linux, macOS, Windows support

Presentation

See this short video presentation made for the Nordic Sound and Music Computing Conference 2021:


Requirements

  • Core Python dependencies are installed automatically by pip.
  • ffmpeg must be installed separately for video I/O and preprocessing.

Research Background

This toolbox builds on the Musical Gestures Toolbox for Matlab, which in turn builds on the Musical Gestures Toolbox for Max. Many researchers and research assistants have contributed to its development over the years, including Balint Laczko, Joachim Poutaraud, Frida Furmyr, Marcus Widmer, and Alexander Refsum Jensenius.

The software is currently maintained by the fourMs lab at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.

Reference

If you use this toolbox in your research, please cite this article:

@inproceedings{laczkoReflectionsDevelopmentMusical2021,
    title = {Reflections on the Development of the Musical Gestures Toolbox for Python},
    author = {Laczkó, Bálint and Jensenius, Alexander Refsum},
    booktitle = {Proceedings of the Nordic Sound and Music Computing Conference},
    year = {2021},
    address = {Copenhagen},
    url = {http://urn.nb.no/URN:NBN:no-91935}
}

License

This toolbox is released under the GNU General Public License v3.0.