nanoGPT is designed to be the simplest and fastest repository for training and finetuning medium-sized GPTs, on the scale of GPT-2 (roughly 124M to 1.5B parameters). It is well suited to research projects and small-scale applications, supports both CPU and GPU training, and uses PyTorch as its backend. nanoGPT provides a quick install path, a baseline training script, and support for finetuning from pretrained GPT-2 checkpoints, along with efficiency notes, troubleshooting tips, and acknowledgements. The code is open source and deliberately minimal, which keeps it easy to read, hack on, and contribute to, in keeping with its goal of making this kind of research more accessible.
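
As a rough sketch of a typical workflow, the commands below follow the repository's layout at the time of writing (`train.py`, `sample.py`, and the character-level Shakespeare config); consult the README itself for the exact, current invocations:

```sh
# install PyTorch and the small set of helper packages nanoGPT uses
pip install torch numpy transformers datasets tiktoken wandb tqdm

# prepare a tiny character-level Shakespeare dataset
python data/shakespeare_char/prepare.py

# train a small GPT from scratch
# (add --device=cpu --compile=False on machines without a GPU)
python train.py config/train_shakespeare_char.py

# sample text from the trained checkpoint
python sample.py --out_dir=out-shakespeare-char
```

Finetuning follows the same pattern: a config that initializes the model from a pretrained GPT-2 checkpoint is passed to `train.py` instead of the from-scratch config.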