Training GPT-3 consumed as much electricity as 120 US homes use in a year

  • Training GPT-3, a single general-purpose AI model that generates language and has many different uses, took 1.287 gigawatt-hours of electricity, according to a research paper, about as much as 120 US homes consume in a year.
  • That training generated 502 tons of carbon emissions, according to the same paper, roughly what 110 US cars emit in a year.
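The equivalences above can be checked with back-of-the-envelope arithmetic. The sketch below uses two reference values that are assumptions, not from the article: an average US home using roughly 10,700 kWh of electricity per year (an EIA estimate) and an average US passenger car emitting roughly 4.6 metric tons of CO2 per year (an EPA estimate).

```python
# Back-of-the-envelope check of the article's equivalences.
# Assumed reference values (not stated in the article):
#   average US home : ~10,700 kWh of electricity per year (EIA estimate)
#   average US car  : ~4.6 metric tons of CO2 per year (EPA estimate)

GWH_TO_KWH = 1_000_000

training_energy_kwh = 1.287 * GWH_TO_KWH  # 1.287 GWh, from the article
home_kwh_per_year = 10_700                # assumed average US home usage

homes_equivalent = training_energy_kwh / home_kwh_per_year
print(f"Homes powered for a year: {homes_equivalent:.0f}")  # ~120

training_emissions_tons = 502             # from the article
car_tons_per_year = 4.6                   # assumed average US car emissions

cars_equivalent = training_emissions_tons / car_tons_per_year
print(f"Cars' annual emissions: {cars_equivalent:.0f}")     # ~109
```

With these assumed averages, the energy figure works out to about 120 homes and the emissions figure to about 109 cars, consistent with the article's rounded numbers.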
