It’s crazy to think that in 2023, with billions of internet users around the world, one of the industry’s greatest initiatives is to raise awareness about data centers: what they are, what they do, and the career opportunities they offer.

Data centers have been around for decades and there’s one behind nearly everything we do. But, mention one outside of an industry event, and, more often than not, you’ll hear something along the lines of “What’s that?” in response. 

Yet, ChatGPT, which hasn’t even been around for a full year, is magically part of everyone’s vernacular. 

Are you kidding me?

I have a whole slew of issues with this, but, for now, I’m going to home in on one: the environmental impact.

According to a Scientific American article, training OpenAI’s GPT-3 model, with its 175 billion parameters, consumed around 1,287 megawatt-hours (MWh) of electricity and generated 552 tons of carbon dioxide, roughly the emissions of 123 gasoline-powered passenger vehicles driven for one year. And, those numbers only represent training the model before launch, before any users.
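For anyone who wants to sanity-check that vehicle comparison, here’s a quick back-of-the-envelope sketch in Python. The 4.6-metric-ton figure is the EPA’s estimate of a typical passenger vehicle’s annual CO2 emissions, not a number from the article itself, so treat the result as approximate.

```python
# Rough sanity check of the GPT-3 training figures cited above.
# Assumption: a typical passenger vehicle emits about 4.6 metric tons of CO2
# per year (EPA estimate); the article itself doesn't state this number.

TRAINING_CO2_TONS = 552          # metric tons of CO2 from training GPT-3
CO2_TONS_PER_CAR_PER_YEAR = 4.6  # assumed annual emissions of one passenger vehicle

car_equivalents = TRAINING_CO2_TONS / CO2_TONS_PER_CAR_PER_YEAR
print(f"≈ {car_equivalents:.0f} passenger vehicles driven for one year")
# Prints ≈ 120, which lines up with the article's figure of 123.
```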

For an industry focused on sustainability and reaching carbon neutrality in a few short years, that’s pretty bad. But, that’s not the whole story. 

Generative AI is defined as a particular type of AI that can generate content — text, images, graphics, and other forms of multimedia — in response to a user prompt. There are so many ways this can be used for good. For instance, it can slash the amount of time it takes to research a particular topic. This could help doctors save lives in critical care scenarios. It could potentially keep innocent people out of jail by aiding legal teams in finding precedent cases or other relevant information. 

But, one downside of generative AI is that it’s not necessarily generating new content, per se. Sure, you could ask ChatGPT to write a term paper for you, but here’s what’s going on behind the scenes: the model isn’t thinking up original ideas. It’s drawing on patterns learned from the mountains of existing text it was trained on, much of it scraped from the web, and recombining that material into one cohesive document. And, who knows, someone else could have asked for a term paper on the very same topic, in which case it may hand them something strikingly similar.

That’s not to mention that, since its information ultimately comes from the internet, a certain amount of fact checking still needs to be performed … by a human.

Do I think generative AI can play a role in making the world a better place? Absolutely. I think there are industries that can use this technology to help humanity. Not only in the instances I mentioned above, but in so many other ways too. Imagine if you were a doctor and you had all of the information about every patient case that ever existed filed away in a Rolodex in your mind. But, on top of that, you also had all of the same information from every doctor and every hospital. It would take the guesswork out of so many situations. That’s basically what generative AI is.

So, my problem with it is this: out of pure laziness, people are asking it to do things they could easily do themselves. I was reading a forum recently where people were sharing the ways they’ve been using generative AI. Not only are people abandoning the art of being creative, asking their bots to “come up with a clever subject line” for their emails or “write an interesting bio” for their profiles, but they’re also no longer allocating time for basic human tasks, like deciding what to eat and making a grocery list. There are some students who don’t even bother doing homework anymore; they just ask an AI model to do it for them, trusting it with things like writing code and research papers.

Security issues aside, there’s a problem with this picture.

Generative AI is an immensely energy-intensive technology, but I don’t see the industry raising awareness of the environmental impact of our blasé attitude toward data consumption. It’s a huge problem — one we can solve but that we seem to be perpetuating instead …