PyTorchFSDP
Educational PyTorch repo for distributed training from scratch: data parallelism (DP), fully sharded data parallelism (FSDP), tensor parallelism (TP), FSDP combined with TP, and pipeline parallelism (PP)
A new open-source repo implements all of PyTorch's major parallelism strategies explicitly, without relying on high-level framework abstractions.
Apr 12 · 3 min read
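To make the FSDP idea concrete before diving into the repo: each rank owns only a shard of the parameters, all-gathers the full set just in time for compute, and reduce-scatters gradients so each rank updates only its own shard. The sketch below is a hypothetical single-process illustration of that communication pattern in plain Python, not code from the repo, and the function names (`shard`, `all_gather`, `reduce_scatter`) are chosen here to mirror the collective ops they stand in for.

```python
# Minimal single-process sketch of the FSDP pattern. Hypothetical
# illustration only; the real thing uses torch.distributed collectives.

WORLD_SIZE = 4

def shard(params, world_size):
    """Split a flat parameter list into one shard per rank."""
    n = len(params) // world_size
    return [params[r * n:(r + 1) * n] for r in range(world_size)]

def all_gather(shards):
    """Every rank reconstructs the full parameter vector from all shards."""
    return [p for s in shards for p in s]

def reduce_scatter(grads_per_rank, world_size):
    """Sum gradients element-wise across ranks, then hand each rank
    only the slice corresponding to its parameter shard."""
    full = [sum(g[i] for g in grads_per_rank)
            for i in range(len(grads_per_rank[0]))]
    n = len(full) // world_size
    return [full[r * n:(r + 1) * n] for r in range(world_size)]

params = [1.0] * 8
shards = shard(params, WORLD_SIZE)       # each rank holds 2 of 8 params
full = all_gather(shards)                # materialized only around compute
grads = [[1.0] * 8 for _ in range(WORLD_SIZE)]   # per-rank local gradients
grad_shards = reduce_scatter(grads, WORLD_SIZE)  # each rank's summed slice
```

In real FSDP the full parameters are freed again after the forward and backward passes, which is what keeps per-rank memory proportional to the shard size rather than the full model.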