Describing DeepSeek-V3 as “a frontier-grade [large language model] trained on a joke of a budget”, Andrej Karpathy, a founding member of OpenAI, said on the social media platform X that a model of this scale was generally believed to require a cluster of at least 16,000 GPUs, while the most advanced models are typically trained on clusters of around 100,000 GPUs.