---
title: AnalysisGNN Music Analysis
emoji: 🎡
colorFrom: red
colorTo: pink
sdk: gradio
sdk_version: 5.49.1
app_file: app.py
pinned: false
license: mit
short_description: Inference for the AnalysisGNN score analysis model
---

AnalysisGNN Gradio Interface

A Gradio web interface for AnalysisGNN, a unified music analysis model using Graph Neural Networks.

Features

  • 🎼 MusicXML Upload: Upload and analyze musical scores in MusicXML format
  • 🎨 Score Visualization: Automatic rendering of uploaded scores to images (now with built-in MuseScore AppImage fallback)
  • 📊 Multi-task Analysis: Perform various music analysis tasks:
    • Cadence Detection
    • Key Analysis (Local & Tonicized)
    • Harmonic Analysis (Chord Quality, Root, Bass, Inversion)
    • Roman Numeral Analysis
    • Phrase & Section Segmentation
    • Harmonic Rhythm
    • Pitch-Class Set Groupings
    • Non-Chord Tone (TPC-in-label / NCT) Detection
    • Note Degree Labeling
  • 📈 Results Table: View analysis results in an interactive table
  • 💾 Export Results: Download analysis results as CSV
  • 🧾 Parsed Score Download: Retrieve the normalized MusicXML produced after parsing with Partitura (see the sketch after this list)
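
The parsed-score download comes from round-tripping the uploaded file through Partitura. Below is a minimal sketch of that step with illustrative file names; the app's actual parsing code may differ.

import partitura as pt

# Load the uploaded MusicXML into Partitura's score model
score = pt.load_musicxml("uploaded_score.musicxml")

# Write it back out; this normalized MusicXML is what the app offers for download
pt.save_musicxml(score, "parsed_score.musicxml")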

Quick Start

Local Installation

# Clone the repository
git clone https://github.com/manoskary/analysisgnn-gradio.git
cd analysisgnn-gradio

# Create a virtual environment (recommended)
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Run the app
python app.py

The app will be available at http://localhost:7860

Hugging Face Spaces

This app is designed to run on Hugging Face Spaces. Simply deploy it as a Gradio Space.

Usage

  1. Upload a MusicXML file using the file upload button
  2. Select the analysis tasks you want to perform (cadence, key, harmony, etc.)
  3. Click "Analyze Score" to run the inference
  4. View results:
    • Score visualization (rendered image)
    • Analysis results table (note-level predictions)
  5. Download the results as CSV if needed (see the programmatic sketch below)
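
The same workflow can be scripted with gradio_client. The sketch below is hypothetical: the Space ID, api_name, and input arguments are placeholders and must match whatever app.py actually exposes (check the Space's API page for the real signature).

from gradio_client import Client, handle_file

# Connect to the deployed Space (Space ID assumed for illustration)
client = Client("manoskary/analysisgnn")

# Hypothetical endpoint and arguments; adjust to the app's actual interface
result = client.predict(
    handle_file("my_score.musicxml"),
    ["Cadence Detection", "Roman Numeral Analysis"],
    api_name="/analyze",
)
print(result)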

Score Rendering Backend

The interface first tries to render MusicXML scores with Partitura. If that backend cannot find MuseScore or LilyPond, the app falls back to the approach used in manoskary/weavemuse: it automatically downloads and extracts the MuseScore AppImage (stored under ./artifacts/musescore/) and calls it headlessly (QT_QPA_PLATFORM=offscreen). You can override the binary by setting MUSESCORE_BIN=/path/to/mscore before launching the app.
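
In rough terms, the fallback boils down to a headless MuseScore call like the one below. This is a sketch under stated assumptions, not the app's exact code; the AppImage path is illustrative.

import os
import subprocess

def render_with_musescore(xml_path: str, png_path: str) -> None:
    # MUSESCORE_BIN overrides the binary; otherwise the downloaded AppImage is used
    mscore = os.environ.get("MUSESCORE_BIN", "./artifacts/musescore/MuseScore.AppImage")
    # Run Qt headlessly; MuseScore's CLI converts based on the output file extension
    env = dict(os.environ, QT_QPA_PLATFORM="offscreen")
    subprocess.run([mscore, "-o", png_path, xml_path], check=True, env=env)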

Model

The app uses a pre-trained AnalysisGNN model automatically downloaded from Weights & Biases. The model is cached in the ./artifacts/ folder to avoid re-downloading.
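
For reference, downloading a Weights & Biases artifact into ./artifacts/ typically looks like the snippet below; the entity, project, and artifact name are placeholders, not the real identifiers used by the app, and a logged-in W&B session is assumed.

import wandb

# Fetch the checkpoint artifact and cache it locally (identifiers are placeholders)
api = wandb.Api()
artifact = api.artifact("some-entity/analysisgnn/model-checkpoint:latest")
checkpoint_dir = artifact.download(root="./artifacts")
print(checkpoint_dir)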

Dependencies

  • analysisgnn: Core music analysis library
  • gradio: Web interface framework
  • partitura: Music processing library
  • torch: Deep learning framework
  • pandas: Data manipulation
  • See requirements.txt for complete list

Citation

If you use this interface or AnalysisGNN in your research, please cite:

@inproceedings{karystinaios2025analysisgnn,
    title={AnalysisGNN: A Unified Music Analysis Model with Graph Neural Networks},
    author={Karystinaios, Emmanouil and Hentschel, Johannes and Neuwirth, Markus and Widmer, Gerhard},
    booktitle={International Symposium on Computer Music Multidisciplinary Research (CMMR)},
    year={2025}
}

License

MIT License - See the AnalysisGNN repository for more details.

Acknowledgments