The conventional wisdom goes that the more compute or training data you have, the smarter your AI will be. Sutskever said in the interview that, for roughly the past half-decade, this "recipe" has produced impactful results. It's also attractive to companies because it offers a simple, "very low-risk way" of investing resources, compared with pouring money into research that could lead nowhere.
AI labs are racing to build data centers as large as Manhattan, each costing billions of dollars and consuming as much energy as a small city. The effort is driven by a deep belief in "scaling": the idea that adding more computing power to existing AI training methods will eventually yield superintelligent systems capable of performing all kinds of tasks.