---
title: AnalysisGNN Music Analysis
emoji: 🎵
colorFrom: red
colorTo: pink
sdk: gradio
sdk_version: 5.49.1
app_file: app.py
pinned: false
license: mit
short_description: Inference for the AnalysisGNN score analysis model
---
# AnalysisGNN Gradio Interface
A Gradio web interface for AnalysisGNN, a unified music analysis model using Graph Neural Networks.
## Features
- 🎼 **MusicXML Upload**: Upload and analyze musical scores in MusicXML format
- 🎨 **Score Visualization**: Automatic rendering of uploaded scores to images (now with built-in MuseScore AppImage fallback)
- 🔍 **Multi-task Analysis**: Perform various music analysis tasks:
  - Cadence Detection
  - Key Analysis (Local & Tonalized)
  - Harmonic Analysis (Chord Quality, Root, Bass, Inversion)
  - Roman Numeral Analysis
  - Phrase & Section Segmentation
  - Harmonic Rhythm
  - Pitch-Class Set Groupings
  - Non-Chord Tone (TPC-in-label / NCT) Detection
  - Note Degree Labeling
- 📊 **Results Table**: View analysis results in an interactive table
- 💾 **Export Results**: Download analysis results as CSV
- 🧾 **Parsed Score Download**: Grab the normalized MusicXML that is produced after parsing with Partitura
## Quick Start

### Local Installation
```bash
# Clone the repository
git clone https://github.com/manoskary/analysisgnn-gradio.git
cd analysisgnn-gradio

# Create a virtual environment (recommended)
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Run the app
python app.py
```
The app will be available at http://localhost:7860
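If port 7860 is already taken, Gradio also reads the `GRADIO_SERVER_NAME` and `GRADIO_SERVER_PORT` environment variables, so (assuming `app.py` does not hard-code them in `launch()`) you can move the app without editing any code:

```bash
# Bind to all interfaces and serve on a different port without touching app.py.
GRADIO_SERVER_NAME=0.0.0.0 GRADIO_SERVER_PORT=7861 python app.py
```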
### Hugging Face Spaces
This app is designed to run on Hugging Face Spaces. Simply deploy it as a Gradio Space.
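One way to do that is to create an empty Gradio Space in the Hugging Face web UI and push this repository to it over git; a minimal sketch, with a placeholder Space name, looks like this:

```bash
# Push this repository to a freshly created Gradio Space.
# <your-username>/analysisgnn-gradio is a placeholder; create the Space in the
# Hugging Face UI first and authenticate with your access token when git prompts you.
git remote add space https://huggingface.co/spaces/<your-username>/analysisgnn-gradio
git push space main
```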
## Usage
- Upload a MusicXML file using the file upload button
- Select analysis tasks you want to perform (cadence, key, harmony, etc.)
- Click "Analyze Score" to run the inference
- View results:
  - Score visualization (rendered image)
  - Analysis results table (note-level predictions)
- Download results as CSV if needed
## Score Rendering Backend

The interface first tries to render MusicXML scores with Partitura. If that backend cannot find MuseScore or LilyPond, the app now mirrors the manoskary/weavemuse approach: it automatically downloads and extracts the MuseScore AppImage (stored under `./artifacts/musescore/`) and calls it headlessly (`QT_QPA_PLATFORM=offscreen`). You can override the binary by setting `MUSESCORE_BIN=/path/to/mscore` before launching the app.
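For example, to point the renderer at a MuseScore binary that is already installed (the path below is only illustrative):

```bash
# Use an existing MuseScore installation instead of the auto-downloaded AppImage.
# /usr/bin/mscore3 is an example path; substitute your own MuseScore executable.
export MUSESCORE_BIN=/usr/bin/mscore3
python app.py
```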
## Model

The app uses a pre-trained AnalysisGNN model automatically downloaded from Weights & Biases. The model is cached in the `./artifacts/` folder to avoid re-downloading.
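If you want to warm that cache up front (for example, before running on a machine without network access), the same kind of artifact can be fetched with the `wandb` CLI; the artifact path below is a placeholder, since the exact name is resolved inside `app.py`:

```bash
# Pre-download a W&B model artifact into the cache directory the app reads from.
# <entity>/<project>/<artifact-name>:latest is a placeholder, not the real artifact path.
wandb artifact get <entity>/<project>/<artifact-name>:latest --root ./artifacts/
```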
## Dependencies

- `analysisgnn`: Core music analysis library
- `gradio`: Web interface framework
- `partitura`: Music processing library
- `torch`: Deep learning framework
- `pandas`: Data manipulation
- See `requirements.txt` for the complete list
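For reference, a minimal `requirements.txt` consistent with the list above might look like the following sketch; the actual file in the repository (with its version pins) is authoritative:

```text
analysisgnn
gradio
partitura
torch
pandas
```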
## Citation
If you use this interface or AnalysisGNN in your research, please cite:
```bibtex
@inproceedings{karystinaios2024analysisgnn,
  title={AnalysisGNN: A Unified Music Analysis Model with Graph Neural Networks},
  author={Karystinaios, Emmanouil and Hentschel, Johannes and Neuwirth, Markus and Widmer, Gerhard},
  booktitle={International Symposium on Computer Music Multidisciplinary Research (CMMR)},
  year={2025}
}
```
## License

MIT License. See the AnalysisGNN repository for more details.
## Acknowledgments
- Built with Gradio
- Powered by AnalysisGNN
- Music processing with Partitura