Chipstrat
GPU Networking Basics
We’re going to very gently discuss networking and GPUs. It’s an important topic, but it can feel boring or esoteric. Hang with me!

Motivation

Training an LLM requires a lot of floating point operations (FLOPs):

[Chart: training compute (FLOPs) of notable models. Source]

How long does it take to train these models? If a single GPU can produce about 2 PetaFLOP/s (2