Communication Compression

Improved Convergence in Parameter-Agnostic Error Feedback through Momentum

We study normalized error feedback algorithms with momentum and parameter-agnostic stepsizes, eliminating the need for problem-dependent tuning while achieving competitive convergence rates.

Nov 1, 2025
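
A minimal sketch of the ingredients named in the summary: a contractive (top-k) compressor, exponential-moving-average momentum, an error-feedback state per worker, and a normalized step with a tuning-free gamma0/sqrt(t+1) schedule. This is an illustration under those assumptions, not the paper's exact method; all names (top_k, run, eta, ...) are ours.

```python
import numpy as np

def top_k(v, k):
    """Keep the k largest-magnitude entries (a contractive compressor)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def run(grads, x0, T=1000, k=2, eta=0.1, gamma0=1.0):
    """grads: list of per-worker gradient oracles grad_i(x)."""
    n = len(grads)
    x = x0.copy()
    v = [np.zeros_like(x0) for _ in range(n)]   # momentum buffers (EMA)
    g = [np.zeros_like(x0) for _ in range(n)]   # error-feedback states
    g_bar = np.zeros_like(x0)                   # server-side average of the g_i
    for t in range(T):
        norm = np.linalg.norm(g_bar)
        if norm > 0:                            # normalized, tuning-free step
            x -= gamma0 / np.sqrt(t + 1) * g_bar / norm
        for i in range(n):
            v[i] = (1 - eta) * v[i] + eta * grads[i](x)  # momentum update
            delta = top_k(v[i] - g[i], k)       # compress the residual
            g[i] += delta                       # worker refreshes its state
            g_bar += delta / n                  # server applies the same delta
    return x
```

Because only the direction of g_bar enters the step, no smoothness constant appears in the stepsize, which is one sense in which such methods can run without problem-dependent tuning.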

Bernoulli-LoRA: A Theoretical Framework for Randomized Low-Rank Adaptation

We introduce Bernoulli-LoRA, a theoretical framework for randomized Low-Rank Adaptation that unifies existing approaches and provides convergence guarantees for various optimization methods.

Aug 1, 2025
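
The summary suggests the randomization acts on which low-rank factor gets updated. A toy sketch under that assumption: the adapter is W0 + B @ A, and each step a Bernoulli coin decides whether A or B takes a gradient step. lora_step, p, and the toy loop are illustrative, not the paper's method or API.

```python
import numpy as np

rng = np.random.default_rng(0)

def lora_step(W0, A, B, grad_W, lr=0.01, p=0.5):
    """One step on the adapter factors of W = W0 + B @ A.

    grad_W is the gradient of the loss w.r.t. the full weight W; by the
    chain rule, dL/dB = grad_W @ A.T and dL/dA = B.T @ grad_W.
    """
    if rng.random() < p:
        B -= lr * grad_W @ A.T    # coin says: update B, freeze A this step
    else:
        A -= lr * B.T @ grad_W    # coin says: update A, freeze B this step
    return A, B

# toy usage: drive W0 + B @ A toward a target under squared loss
d, k, r = 8, 6, 2
W0 = rng.standard_normal((d, k))
target = rng.standard_normal((d, k))
A = 0.1 * rng.standard_normal((r, k))
B = np.zeros((d, r))              # standard LoRA-style init: B = 0
for _ in range(500):
    G = (W0 + B @ A) - target     # gradient of 0.5 * ||W - target||^2
    A, B = lora_step(W0, A, B, G)
```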

MARINA-P: Superior Performance in Non-smooth Federated Optimization with Adaptive Stepsizes

We extend the MARINA-P algorithm to non-smooth federated optimization, providing the first theoretical analysis of this setting with server-to-worker compression and adaptive stepsizes, while achieving optimal convergence rates.

Dec 22, 2024
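
Server-to-worker (downlink) compression can be pictured with a permutation-style compressor: the server sends each worker a disjoint, rescaled slice of its update, and the slices average back to the exact vector. The sketch below shows only this communication pattern; it is not MARINA-P itself, and it omits the adaptive stepsize rule. perm_slices is an illustrative name.

```python
import numpy as np

def perm_slices(update, n, rng):
    """Split `update` into n disjoint coordinate blocks, each scaled by n,
    so that the average of the n messages equals the original update."""
    d = update.size
    perm = rng.permutation(d)
    msgs = []
    for i in range(n):
        block = perm[i * d // n:(i + 1) * d // n]
        m = np.zeros(d)
        m[block] = n * update[block]
        msgs.append(m)              # worker i receives only this sparse slice
    return msgs

# each worker gets ~d/n coordinates; averaging recovers the update exactly
rng = np.random.default_rng(0)
u = rng.standard_normal(12)
assert np.allclose(np.mean(perm_slices(u, 3, rng), axis=0), u)
```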

3PC: Three Point Compressors for Communication-Efficient Distributed Training and a Better Theory for Lazy Aggregation

We introduce three point compressors (3PC), a novel framework unifying and improving communication-efficient distributed optimization methods. Presented as a Poster at ICML 2022.

Feb 2, 2022
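
The "three points" refer to a compressor that may depend on a maintained state in addition to the fresh input. Two instances the framework is known to cover are EF21-style compressed corrections and lazy ("skip the round") aggregation; the sketch below shows both, with illustrative names and thresholds.

```python
import numpy as np

def top_k(v, k):
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def ef21_style(h, x, k=2):
    """EF21 as a three point compressor: move the maintained state h toward
    the fresh input x by a compressed correction; only top_k(x - h) is sent."""
    return h + top_k(x - h, k)

def lazy_style(h, x, zeta=0.5):
    """Lazy aggregation as a three point compressor: send nothing (keep h)
    unless x has drifted far enough from the state."""
    return x if np.linalg.norm(x - h) ** 2 > zeta * np.linalg.norm(x) ** 2 else h
```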

EF21 with Bells & Whistles: Practical Algorithmic Extensions of Modern Error Feedback

We present six practical algorithmic extensions of the EF21 error feedback method for communication-efficient distributed learning, each with strong convergence guarantees.

Oct 7, 2021
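
One extension of this flavor, sketched here under the assumption that it resembles partial participation for EF21-style methods: each round only a sampled subset of workers refreshes its error-feedback state, and the server folds in just the deltas it received. The sampling rule, names, and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def top_k(v, k):
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def pp_round(x, grads, g, g_bar, gamma=0.1, k=2, m=2):
    """One round in which only m of the n workers participate."""
    n = len(grads)
    for i in rng.choice(n, size=m, replace=False):
        delta = top_k(grads[i](x) - g[i], k)  # residual vs. (possibly stale) state
        g[i] += delta                         # participant refreshes its state
        g_bar += delta / n                    # server folds in received deltas
    return x - gamma * g_bar                  # server steps on the average state
```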

EF21: A New, Simpler, Theoretically Better, and Practically Faster Error Feedback

We propose EF21, a novel approach to error feedback offering a better theoretical rate and strong empirical results. Presented as an Oral + Poster at NeurIPS 2021.

Jun 9, 2021
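
The EF21 mechanism itself is compact: each worker keeps a state g_i that chases its gradient, communicates only a compressed residual, and the server steps along the average state. Below is a minimal sketch with a top-k compressor and deterministic gradients; the stepsize and initialization are illustrative choices.

```python
import numpy as np

def top_k(v, k):
    """Keep the k largest-magnitude entries (a standard contractive compressor)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def ef21(grads, x0, gamma=0.1, k=2, T=500):
    n = len(grads)
    x = x0.copy()
    g = [grads[i](x0) for i in range(n)]      # per-worker states (full init)
    g_bar = sum(g) / n                        # server keeps the average state
    for _ in range(T):
        x = x - gamma * g_bar                 # gradient-like step
        for i in range(n):
            delta = top_k(grads[i](x) - g[i], k)  # the only message sent
            g[i] = g[i] + delta
            g_bar = g_bar + delta / n
    return x

# toy usage: two quadratic workers f_i(x) = ||x - b_i||^2 / 2,
# whose average is minimized at the mean of the b_i
b = [np.array([1.0, -2.0, 3.0]), np.array([-1.0, 4.0, 0.0])]
grads = [lambda x, bi=bi: x - bi for bi in b]
x_star = ef21(grads, np.zeros(3))             # approaches (b[0] + b[1]) / 2
```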