BlossomTune is an open-source orchestrator designed to simplify and democratize federated learning.

It provides an intuitive web interface for managing participants, launching experiments, and securely distributing credentials, making complex federated learning setups accessible to everyone.

Why Support BlossomTune?

By supporting BlossomTune, you are investing in the open-source infrastructure needed to make federated learning practical for a wider audience of developers, researchers, and organizations.

Your contribution will directly fund the development of new features, improve documentation and help us grow a vibrant community around accessible and secure AI/ML.

The BlossomTune Family

BlossomTune is an ecosystem of open-source tools designed to advance and simplify federated learning.

The project is divided into the core orchestrator and specialized Flower Apps for cutting-edge research.

Orchestrator

Flower Apps

These applications are built on the FlowerTune LLM templates from the paper "FlowerTune: A Cross-Domain Benchmark for Federated Fine-Tuning of Large Language Models" (https://arxiv.org/abs/2506.02961), presented at the NeurIPS 2025 conference, and showcase federated fine-tuning of large language models on specialised tasks.
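To give a feel for what "federated" means here, the following is a minimal, self-contained sketch of federated averaging (FedAvg), the aggregation idea at the heart of frameworks like Flower: each participant trains on its own data, and a server combines the resulting model weights, weighted by each client's dataset size. The function and variable names are illustrative only and are not BlossomTune's actual API.

```python
# Illustrative FedAvg sketch (not BlossomTune's real code):
# each client sends back its locally trained weights plus its
# dataset size; the server returns the size-weighted average.

def fedavg(client_weights, client_sizes):
    """Weighted average of per-client model weights (lists of floats)."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(n_params)
    ]

# Example: two clients; the larger dataset (30 samples) pulls the
# global model closer to that client's weights.
global_weights = fedavg(
    client_weights=[[1.0, 2.0], [3.0, 4.0]],
    client_sizes=[10, 30],
)
# global_weights == [2.5, 3.5]
```

The raw training data never leaves the clients; only weight updates are exchanged, which is what makes this setup attractive for privacy-sensitive domains.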

Citations

@misc{gao-2025,
    author = {Gao, Yan and Scamarcia, Massimo Roberto and Fernandez-Marques, Javier and Naseri, Mohammad and Ng, Chong Shen and Stripelis, Dimitris and Li, Zexi and Shen, Tao and Bai, Jiamu and Chen, Daoyuan and Zhang, Zikai and Hu, Rui and Song, InSeo and KangYoon, Lee and Jia, Hong and Dang, Ting and Wang, Junyan and Liu, Zheyuan and Beutel, Daniel Janes and Lyu, Lingjuan and Lane, Nicholas D.},
    month = {6},
    title = {{FlowerTune: A Cross-Domain Benchmark for Federated Fine-Tuning of Large Language Models}},
    year = {2025},
    url = {https://arxiv.org/abs/2506.02961},
}