Composable transformations of Python+NumPy programs: automatic differentiation, vectorization, and JIT compilation to GPU/TPU via XLA. 35K+ GitHub stars.

JAX has become the framework of choice for DeepMind and Google research, powering the training of Gemini, AlphaFold, GraphCast, and most of DeepMind's scientific models. Its functional style enables clean composition of parallelism, batching, and differentiation. Apache 2.0.
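The transformations named above compose freely because each is just a function that takes and returns a pure function. A minimal sketch using the public `jax.grad`, `jax.vmap`, and `jax.jit` API (the toy loss function is illustrative, not from the source):

```python
import jax
import jax.numpy as jnp

# A toy scalar loss; any pure function of arrays works.
def loss(w, x):
    return jnp.sum((w * x - 1.0) ** 2)

# Compose transformations: differentiate, batch, then JIT-compile via XLA.
grad_loss = jax.grad(loss)                         # d(loss)/dw for one example
batched = jax.vmap(grad_loss, in_axes=(None, 0))   # map over a batch of x
fast = jax.jit(batched)                            # compile the whole pipeline

w = jnp.array(2.0)
xs = jnp.arange(4.0)       # batch of 4 scalar inputs
g = fast(w, xs)            # one gradient per batch element, shape (4,)
```

The order of composition is flexible: `jit(vmap(grad(f)))` and `vmap(jit(grad(f)))` both work, which is the "clean composition" the description refers to.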

Library

GitHub Repository

open-source · infrastructure · foundational