How Data Centers Actually Work


Tech giants have been investing hundreds of billions of dollars in AI data centers this year alone. But as the deals pile up, so have the concerns around their viability and sustainability. Michael Calore and senior correspondent Lauren Goode sit down with senior writer Molly Taft to discuss how these energy-hungry facilities actually work, the different industry interests at stake, and whether it’ll all come crumbling down.

Mentioned in this episode:
The AI Industry’s Scaling Obsession Is Headed for a Cliff by Will Knight
OpenAI’s Blockbuster AMD Deal Is a Bet on Near-Limitless Demand for AI by Will Knight
A Political Battle Is Brewing Over Data Centers by Molly Taft
How Much Energy Does AI Use? The People Who Know Aren’t Saying by Molly Taft

You can follow Michael Calore on Bluesky at @snackfight, Lauren Goode on Bluesky at @laurengoode, and Molly Taft on Bluesky at @mollytaft. Write to us at uncannyvalley@wired.com.

How to Listen

You can always listen to this week’s podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here’s how:

If you’re on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts and search for “uncanny valley.” We’re on Spotify too.

Transcript

Note: This is an automated transcript, which may contain errors.

Michael Calore: Hey, Lauren. How are you doing?

Lauren Goode: Hey, Mike. I’m great. It’s so nice to be back in studio with you again, because our schedules were not aligning for the past few weeks.

Michael Calore: Nope. But the stars and the moon have aligned now, and here we are once again.

Lauren Goode: Here we are. And I’m sure all of our listeners have just been sitting here wondering, “When are Lauren and Mike getting back together? When is the band getting back together?” And I just went to yet another AI dinner last night. Everyone’s talking about AI. Everyone is talking about whether or not we’re in an AI bubble, and I think a lot of this is fueled by the news that we’ve been seeing trickling out about all of these AI infrastructure projects.

Michael Calore: Yes, data centers.

Lauren Goode: Data centers.

Michael Calore: Big warehouses, stuffed with computers.

Lauren Goode: Stuffed with server racks.

Michael Calore: And for sure we’re going to be touching on all of that on today’s show. But before we get into it, we have to welcome our guest: WIRED’s senior writer and climate and energy expert, Molly Taft, is joining us today. Hello, Molly.

Molly Taft: Hello.

Lauren Goode: Hi, Molly. Is this your inaugural Uncanny Valley podcast?

Molly Taft: Yeah, and I’m a fan, so this is a big deal for me, frankly. I’m very excited to be here talking about my favorite topic and the thing people ask me about the most these days at parties.

Lauren Goode: And what is that?

Molly Taft: Data centers. It’s data centers.

Lauren Goode: I thought you were going to say their electricity bills.

Molly Taft: Yeah, I mean, they’re related, aren’t they?

Michael Calore: Yeah.

Molly Taft: People are fired up about it.

Michael Calore: This is WIRED’s Uncanny Valley, a show about the people, power, and influence of Silicon Valley. Today we’re talking about the AI infrastructure boom. In recent years, tech giants like OpenAI, Amazon, Meta, and Microsoft have invested hundreds of billions of dollars in data centers. These are large warehouses full of servers delivering the huge amounts of computing power needed to run AI models. Recent announcements, like OpenAI’s new Stargate data center in Texas or the AI startup’s deal with the chip maker AMD, have only added to the hype and the capital surrounding the expansion of these data centers. But just as the deals keep piling up, the issues around this rapid expansion are becoming more worrisome. We’ll dive into how these data centers work, what effect they’re already having in some communities, and what they reveal about the current state of the AI industry and our economy. I’m Michael Calore, Director of Consumer Tech and Culture.

Lauren Goode: I’m Lauren Goode. I’m a senior correspondent.

Molly Taft: And I’m Molly Taft, a senior writer covering energy and the environment.

Michael Calore: I think it’s safe to say that all of us, including our listeners, have heard of data centers somewhere in the news and they have a pretty good idea of what a data center is. But let’s talk about why AI companies rely on them. To put it another way, how does a ChatGPT query on my computer end up in a data center?

Lauren Goode: This is an excellent question and I think we’re going to nerd out a little bit.

Michael Calore: Awesome.

Lauren Goode: Okay. Say you type into ChatGPT that you want a few options for dinner recipes tonight, or “What am I going to get Mike for his birthday next year?”

Michael Calore: Bless you.

Lauren Goode: You send your request through and then it goes to OpenAI’s servers. It goes through a few checkpoints first. There’s authentication: are you a valid user? Moderation: does the prompt align with their guidelines? And then some load balancing, which decides which data center should ultimately handle the request. The words or strings of text that you write are broken down into little chunks of text called tokens. They’re kind of like puzzle pieces that artificial intelligence models can process. And at this point, the request lands on specialized hardware, in most cases GPUs.

Molly Taft: Yeah. The next part is key. GPUs, which stands for graphics processing units, are essential to AI systems. They’re electronic cards that are really good at parallel processing, which basically means they can perform many calculations at once. Data centers are full of metallic rows of servers that contain these GPUs, so once you send your query in, it gets processed by those GPUs. You’ve probably heard of some of the most famous ones in tech headlines, like Nvidia’s H100s.
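
(Editor’s note: a rough illustration of what the parallel processing Molly describes buys you. A transformer layer is, at heart, large matrix multiplications, and this toy sketch uses NumPy on a CPU — an assumption for readability; real inference runs on GPU libraries like CUDA — to contrast doing that work one token at a time versus all at once. The matrix sizes are made up.)

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.random((512, 512))  # toy stand-in for model weights
tokens = rng.random((64, 512))    # 64 token embeddings

# Sequential view: multiply one token vector at a time,
# the way a simple CPU loop would.
out_sequential = np.stack([tokens[i] @ weights for i in range(64)])

# Parallel view: one batched multiply over the whole batch,
# the kind of operation a GPU spreads across thousands of cores.
out_parallel = tokens @ weights

# Both produce the same numbers; the batched form is what makes
# GPUs so much faster at this workload.
assert np.allclose(out_sequential, out_parallel)
```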

Lauren Goode: Ah, the H100. The kids can’t get enough of them.

Michael Calore: My old friend.

Lauren Goode: And then I guess to bring it back full circle to our query for what to eat for dinner tonight or Mike’s birthday present next year: once that query arrives at the data center and the servers, the AI model starts working. This is called inference time. The model predicts what words, in the form of tokens, should come next, one after another, until it builds out a full answer. And then finally, the response is sent back through the same network path to your browser or app. This is a very summarized version of what happens, but on a high level, that’s why your ChatGPT question or query needs to go to these data centers.
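
(Editor’s note: the “predict the next token, one after another” loop Lauren describes can be sketched in a few lines of Python. The lookup table below is a toy stand-in for a real language model — nothing here resembles OpenAI’s actual system — but the shape of the loop is the same: one model pass per new token, appending until a stop condition.)

```python
# Toy "model": a bigram table mapping each token to its most likely
# successor. A real LLM scores the entire vocabulary with a neural
# network at every step; the loop structure is what matters here.
BIGRAMS = {
    "what": "should", "should": "we", "we": "eat",
    "eat": "tonight", "tonight": "<end>",
}

def generate(prompt_tokens, max_tokens=10):
    tokens = list(prompt_tokens)
    for _ in range(max_tokens):        # one "forward pass" per new token
        next_token = BIGRAMS.get(tokens[-1], "<end>")
        if next_token == "<end>":      # stop token ends the answer
            break
        tokens.append(next_token)
    return tokens

print(generate(["what"]))  # ['what', 'should', 'we', 'eat', 'tonight']
```

Because each step depends on the previous one, inference is inherently serial over tokens, which is part of why serving millions of concurrent queries takes so much hardware.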

Michael Calore: And this all happens in a matter of seconds.

Lauren Goode: It’s pretty miraculous.

Michael Calore: Yeah.

Lauren Goode: Yeah.

Michael Calore: Well, I love the breakdown. Also, just for the record, I have to say that I probably would never ask ChatGPT for recipe recommendations. And Lauren, I really hope that you get a good answer as to what to get me for my birthday, because I’m already very excited.

Lauren Goode: You might recall that I did this last year for your birthday, and I think I still haven’t gotten you anything.

Michael Calore: Well, I think if there’s a next question that everybody has about data centers, it’s about how energy-intensive they are. Because even a simple query like that, we’re talking about accessing multiple different servers. You’re breaking up your query into different tokens, spraying them into this data center. How much impact do those computing requirements have on the environment? And Molly, I want to ask you this question because I know you’ve reported on this a lot, what does the energy consumption of a data center look like? And are we talking about just energy? Are we talking about water? What does it look like?

Molly Taft: Large data centers have a lot going on inside of them. They need cooling systems, they need to keep the lights on, they need to run their network equipment. That all consumes a lot of energy. They don’t run at the same energy level all the time; they often run in cycles based on query volume, taking a break at night when queries drop off. And the environmental footprint of a data center also depends on what type of energy it’s hooked up to. If you’re hooked up to a dirtier grid, a grid powered by fossil fuels, you’re going to create more emissions. A data center that runs on solar power or wind power will have a lower environmental footprint. It’s funny trying to calculate the footprint of a specific data center, because a lot of the information about the environmental impacts is proprietary, so a lot of what we’re working with is stuff that companies volunteer themselves. You’ll see headlines. Meta is building a data center they’re calling Hyperion in Louisiana. It’s part of their bigger build-out, and that’s going to be huge. Meta says it’s going to be about five gigawatts, which is massive. That’s about half the peak power load of New York City. That’s the biggest one under construction right now, but there’s a lot more out there. There are also regional-level approximations, and there are some places in the world that are starting to get worried about how much energy data centers are consuming or are set to consume. In Ireland right now, data centers use more than 20 percent of the country’s electricity, which is crazy.

Lauren Goode: Wow.

Molly Taft: Virginia is also facing a huge cliff, and they’re projected to use a lot more over the next couple of years. So as a group, there’s definitely an energy trend that’s going up.

Lauren Goode: You mentioned that each tech company, data center company, is reporting their energy usage. I’m curious how far down the stack they’re going. For example, if they are using a certain number of GPUs and those components from GPUs have to be shipped from different parts of the world or manufactured in different parts of the world, that is also an emissions cost. So are any of them incorporating that?

Molly Taft: Yeah, the problem with climate stuff and the problem with emission stuff is you can keep going down this rabbit hole forever. What you’re talking about gets into emissions reporting, which is what is the total environmental footprint of what you’re doing? And this is something that climate folks and people who like to do math on emissions are constantly trying to figure out when it comes to specific companies. And many companies don’t necessarily want to go down that chain. But I think when we talk about power use, we are talking about on-site power use. We’re talking about just what it’s going to take to turn that thing on and to run that thing. The total footprint of these things is probably a lot bigger than we think.

Lauren Goode: And when you mentioned people who like to do math and figure this out, I think of someone like a Sasha Luccioni who, she’s pretty vocal out there. She’s the climate lead at Hugging Face. And she tends to, to put it eloquently, call bullshit on a lot of the numbers that these tech leaders put out there.

Molly Taft: Yeah, and Sasha’s been really good at calling out, especially with models that are less transparent, how much energy is actually going into a query. There are numbers that kind of get tossed around, and Sam Altman put out a blog post this summer where he said the average ChatGPT query uses about what an oven would use in a little over one second, or what a high-efficiency light bulb would use in a couple of minutes. It’s about 0.34 watt-hours of energy. But Sasha correctly points out that these kinds of figures don’t give us a lot to work with. When he says an average query, what does that mean? How many of those queries are happening? What are these data centers hooked up to? Are they hooked up to grids that are mostly renewables? Are they hooked up to grids that use coal? There are so many factors that go into the energy use of these products. And Sasha put it really, really well. She said that Altman “pulled that figure out of his ass.” That’s a direct quote. And she told me, “It blows my mind that you can buy a car and know how many miles per gallon it consumes, yet we use all these AI tools every day and we have absolutely no efficiency metrics, emissions factors, nothing.” So just one number doesn’t help us get to the bigger picture, and a lot of the time it may just help companies procrastinate on putting out more data.
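
(Editor’s note: Altman’s 0.34 watt-hour figure can at least be sanity-checked against his own comparisons. The oven and bulb wattages below are typical assumed values, not numbers from his blog post, and the daily query count is a round illustrative assumption.)

```python
QUERY_WH = 0.34      # Altman's claimed energy per ChatGPT query, in watt-hours

# Typical appliance wattages (assumptions, for the sanity check)
OVEN_W = 1000.0      # a modest electric oven element
BULB_W = 10.0        # a high-efficiency LED bulb

oven_seconds = QUERY_WH / OVEN_W * 3600   # seconds of oven use per query
bulb_minutes = QUERY_WH / BULB_W * 60     # minutes of bulb use per query
print(f"{oven_seconds:.2f} s of oven, {bulb_minutes:.2f} min of bulb")
# "a little over one second" and "a couple of minutes" both roughly check out

# But scale matters: at, say, 2.5 billion queries a day (an assumed figure),
# the total is large even when each individual query is tiny.
daily_mwh = 2.5e9 * QUERY_WH / 1e6        # Wh -> MWh
print(f"~{daily_mwh:,.0f} MWh per day")
```

The arithmetic shows why a single per-query number is so slippery: it says nothing about query volume, the grid mix behind the data center, or anything upstream of the moment of inference.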

Michael Calore: We don’t have any clear indicators about how much energy each query is taking and how much energy these data centers are consuming, other than it’s a lot and it’s a problem. And even knowing that it’s a problem, a lot of the big tech companies that are investing in AI infrastructure are continuing to build, build, build. We’ve talked about some of the big players and we should make sure that we mention as many of them as we can. Who are the other big names in this space?

Lauren Goode: Honestly, I think the biggest ones are the folks we’ve already mentioned: OpenAI, AMD, Nvidia, and the Stargate Project, which is kind of an umbrella term for a bunch of different partnerships. The Stargate Project is a $500 billion, 10-gigawatt commitment between OpenAI, SoftBank, Oracle, and MGX, all of these interconnected relationships between the different hyperscalers and the chip makers. What’s interesting is that there has been a little bit of a shift in the terminology, in that these investments are being described as gigawatt investments. So they’re kind of staggered investments in many cases. They’re based on the assumption that the demand for AI will continue to scale upwards, and that eventually more and more energy, or compute power, I should say, is going to be needed. It’s not some kind of clear-cut, “Well, in exchange for this amount of investment, this company is going to get X million GPUs.” It’s not basic unit reporting.

Michael Calore: Okay, so there’s one thing that you said there that I want to hear more about, and that’s hyperscaling. Because it sounds like what we’re already talking about is hyperscaling, but you’re saying that there are companies that are interested in growing even more than what we’ve already talked about?

Lauren Goode: Well, they’re all interested in growing more. Who among us, Mike? But “hyperscalers” refers to a class of major tech companies and cloud service providers. So Meta, Amazon, Microsoft, Google, they’re all in that category.

Molly Taft: Yeah, and I think it’s important to remember that these companies have so much money and they have an ability to raise capital like nobody’s business. So they’re able to do some really crazy stuff to build quick and to build-out really, really big. And they’re getting pretty creative, because their goals right now are to build these things quickly and get them up and running so they can basically use this physical infrastructure to compete with each other.

Lauren Goode: I think that’s right, Molly. I think there’s a lot of frenemy building happening right now, and I would just love to be a part of their group chats when all of these announcements are being made.

Michael Calore: Yeah, and speaking of frenemies, the other sphere of influence that these companies are operating in is the political sphere. Obviously, in order to build a giant data center somewhere, you need to have the political will to do it, which means you need buy-in from the local residents, the local government, the state, the country. So what’s happening in the political sphere with folks who want to build more data centers and people who oppose it, regulation? How is that playing out?

Molly Taft: That’s a great question, and I think if you look at the national conversation, it’s quite different from what’s happening on the local level. In Washington, you obviously have an administration that is very friendly to the idea of an American AI empire. Importantly for the energy conversation, the way the Trump administration has approached this support has been through support of fossil fuels. They would really like for all data centers to be powered with oil and gas, a little bit of nuclear and coal. And this works out great for those industries as well. If you’re going to have this massive expansion of power demand, it’s really cool to be in the middle of that and be the one everyone wants to turn to for energy resourcing. And then on the other side, there has been this influx of local opposition to these data centers for a variety of reasons, be it the water use, be it fears about rising electricity rates, be it noise, and some of the really big struggles have catapulted this issue into the national conversation. I’m thinking about xAI in Memphis. When Elon Musk wanted to get xAI up and running, he installed a bunch of unpermitted gas turbines in a majority Black community in Memphis that already had severe issues with air pollution and asthma. And those folks made themselves known. Earlier this year, there was an attempt in DC to impose a moratorium on any state regulation around AI at all. It was an incredibly broad inclusion in the Big Beautiful Bill that ultimately didn’t succeed. But one of the people who opposed it publicly was Marjorie Taylor Greene, who actually mentioned data centers in her opposition, and she compared AI to Skynet, the fictional AI from the Terminator movie franchise.
So this is creating some strange bedfellows: on one side, what the administration is trying to push forward and some very powerful energy companies that stand to gain from it, versus some truly grassroots local movements and people concerned about the impacts these things are going to have in their communities.

Michael Calore: We are going to take a quick break right now, but when we come back we’re going to dive into why AI companies’ aggressive bet on scaling might come back to bite them. So stay with us.

Welcome back to Uncanny Valley. Today we’re talking about the AI infrastructure boom. Now, Lauren and Molly, we just talked about why data center investments have been ramping up and what the known impacts are. So now comes another key question: is all of this aggressive expansion even a good idea?

Lauren Goode: It depends on who you ask, Mike. AI founders will typically tell you that there’s nothing to worry about and that this is necessary to meet the current and future demand for AI. But the big elephant in the room is that for all of this investment, consumer spending on AI is not there yet. And some of the frontier model companies have seen a significant amount of their revenue coming from enterprise developers who are actually building for their companies and using AI for that reason. I have to imagine, and this is just my own theory, that unless we become a world where consumers essentially are developers, where consumers are using AI to the point where we’re all building things and writing lines of code with it, I don’t see how the revenues get there. I also found this really interesting: The Economist reported that AI hyperscalers, the cloud service providers I talked about who are the largest spenders on AI, are using some interesting accounting tricks to try to depress their reported infrastructure spending, which then of course has the effect of inflating the profits from it. So I think that’s partly why a lot of people are concerned that there could be an AI bubble. It’s about supply and demand, and right now we are spending so much money as a nation on supply and just hoping that the demand continues to grow.

Michael Calore: Fingers crossed.

Molly Taft: I think when we talk about the demand for this, you have to understand that behind it, there’s a lot more going on than just accurately reading the tea leaves. There was an incident in the late ’90s and early 2000s, where there was all of a sudden this narrative about how much energy the internet was going to use. And there were a bunch of high-profile articles that made some claims about how by the 2010s, the internet was going to use half of the US’s electricity. The internet was posed as this huge electricity suck, and we were going to need to make more coal-fired power plants to deal with it. There’s a really smart researcher named John Koomey who dug into this, and he found that it was really being pushed by the industries that stood to gain from such a build-out. And of course, that didn’t end up happening. Efficiency gains were great. The internet doesn’t actually use that much energy. I think we’re in this weird moment where US energy use has stayed pretty stagnant. We’re getting better at using what we have and we’re getting better at being more efficient. And all of a sudden there’s the possibility for a big customer of various industries, not just energy. And so, it’s hard to tell if we’re in a bubble when there are so many people who stand to benefit from people believing that this is really going to happen.

Lauren Goode: And to that point, when you make this kind of investment in infrastructure, it’s a very fixed investment, and you’re basically doing it under the assumption that the way the compute happens will remain the same for a certain period of time. And actually, what we’re seeing is that some of the more computationally intensive AI models could soon offer diminishing returns compared to smaller models. For example, a frontier model from a company like OpenAI is currently much better, smarter, than a smaller model from an academic lab, but that may not always be the case. There’s a lot of interesting research going on. There are new alternatives to deep learning, there are novel chip designs, there are new approaches like quantum computing. And I think what we saw with DeepSeek out of China and its remarkably, or reportedly, low-cost model this past January has already served as a reality check for the AI industry.

Michael Calore: What do you think is something that we as citizens, as people who have the decision of whether or not to use AI tools, and we have decisions to get involved in whether or not data centers are going to come to our state or come to our county, what can we do?

Molly Taft: I think about this all the time. I’m going to assume, I’m going to go out on a limb, that most people listening to this are nerds and enjoy reading and learning, and one thing I… I’m so much fun at parties. One thing I love to tell people at parties is not even to get involved, but to learn more about how their local electric utility works. Because if a data center comes to your area and impacts your electric bill, it’s because of your utility. And utilities in the US are set up in this really, really strange way. A lot of them are investor-owned, so they still have to make a profit for themselves, but they also control how much money you’re seeing on your electric bill every month. And so this kind of problem can feel so big and so outside your control, but the more you dig into the wonkier side of things, the more you will find people who are organizing around even just basic asks of utilities: to use more renewable energy, to monitor rate hikes. And they will be the ones who will be up on it if a data center is coming to town and what it might be doing to your electric bill. So I would encourage people to read more about the fascinating topic of electric utilities in the United States.

Lauren Goode: That’s really great.

Molly Taft: Yeah.

Lauren Goode: Molly, you’ll be happy to hear that a few weeks ago I was down in Chandler, Arizona for a tour of Intel’s new chip fabrication plant. There were a handful of us who were what I would consider maybe mainstream tech journalists there, but then there was one woman there from the paper in Arizona who was paying very close attention to power and utilities and water usage in the area.

Molly Taft: Yeah, she’s the real one, honestly. Read her stuff.

Lauren Goode: Yeah. My piece of advice is not related so much to energy usage or your own usage of AI, but I would just say double-down on the humanities. I don’t think AI is going away, even if we’re in a bubble, even if that bubble bursts. The way that it looks may change in the next few years, but I do not think it’s going away. And I think in the future, I want to believe that what will make us stand out is our ability to think and our human relationships, our human capital, if you will, and our ability to appreciate real human-generated art. And so that’s what I’ve been doing. I just spend a lot of time when I’m not on the internet, reading books written by humans and watching films, reconnecting with family and friends. That’s my, I don’t know, I guess my small act of resistance.

Michael Calore: That’s great.

Molly Taft: Same, honestly. Yeah.

Michael Calore: Yeah. And I would say that you do need to understand these technologies in order to have an opinion about them, and in order to understand them, you have to engage with them. So I would encourage you to dabble a little bit, but don’t use AI like a monster. Don’t say thank you to the machine, because we know that people typing in thank you at the end of a query, or after they get an answer, is burning even more resources. So just understand the technology enough that you understand why it’s important to have an opinion about it, and it could help you form your opinion about it. And also push back on AI features being crammed into things where they don’t need to be. So if you don’t see the utility in the AI features on your phone or in your car, figure out how to turn them off, because then it’s like turning off the lights when you leave the house. All right, well, let’s take another break, and then we’ll come back and do our WIRED-Tired, our new little experiment. Welcome back. Lauren and Molly, we are now going to be diving into a new segment that we’re calling WIRED-Tired, and there’s a slash in between those two words. It’s WIRED-Tired. If you’re a longtime WIRED reader, then you will know that the WIRED-Tired format is a big part of our brand. So whatever is new and cool is WIRED, and whatever passé thing it’s replacing is tired. And the WIRED and the tired don’t have to be related to each other, but it rhymes better and really rings the bell if they are. So we’re going to try it here on the show and talk about what is WIRED and tired in our lives. Are you ready?

Molly Taft: Yes.

Lauren Goode: We’re all tired.

Molly Taft: I’m exhausted, personally, but thrilled about this.

Michael Calore: Well, let’s start with Lauren. Lauren, tell us what is WIRED and tired to you.

Lauren Goode: Okay. Tired to me are extravagant coffee drinks, and WIRED is just go for the simple drip coffee.

Michael Calore: And what do you put in your drip coffee?

Lauren Goode: It’s pumpkin spice season, everyone. And I have to say I’m sick of seeing the signs for not only pumpkin spice, but weird twists on it. And now it’s like Dubai chocolate pumpkin spice lattes. And where are we going with this?

Michael Calore: No. We’re done.

Lauren Goode: There’s no end, so just stop it now. Shut it down, and just go for either pour-over or drip, or maybe if you’re going to get a little fancy, an Americano. I love a good Americano. I put a little half-and-half in mine and sometimes a sweetener. Whatever. Yeah, that’s it.

Michael Calore: So tired, PSL.

Lauren Goode: Yeah.

Michael Calore: WIRED-

Lauren Goode: Drip.

Lauren Goode: Be a drip.

Michael Calore: Be a drip.

Lauren Goode: Yeah, that’s it.

Michael Calore: Love it.

Molly Taft: I love that.

Lauren Goode: Thank you.

Michael Calore: Molly, your turn.

Molly Taft: Tired is looking at my phone. I don’t want to do it anymore.

Lauren Goode: Yes.

Molly Taft: I don’t want to do it anymore, frankly, for anything. I don’t care. This is a public apology to all of my friends. I haven’t answered texts in months because I don’t want to look at my phone. So it’s very tired. I’m very tired about it. WIRED, new thing called books. I love to read them. And my recommendation, the book that I read earlier this year and I have not stopped thinking about, is a novel called Sky Daddy by Kate Folk. It is set in the Bay Area, so it ties in, and it’s about a woman in love with planes and I’m not spoiling anything. It’s a little not safe for work, so don’t read it at work, but it’s honestly one of the funniest books I’ve read in a very long time.

Michael Calore: Nice. WIRED, books. Tired, phones. Solid.

Lauren Goode: Great. Mike, what’s yours?

Michael Calore: My WIRED is hydration tablets, and my tired is plain old water.

Lauren Goode: Oh, I love how we both went with beverages.

Michael Calore: Yeah. Well, okay, so you should still drink water, and I drink water at meals, but for those times in between meals when I’m doing the thing and hydrating at my desk, I’ve gotten really into hydration tablets and hydration powders. These are the things that basically turn your bottle of water into a Gatorade. It gives you electrolytes, it gives you some flavor, a little bit of sugar. They have caffeine in them now, which is amazing.

Lauren Goode: Your eyes just lit up.

Michael Calore: Oh, yeah.

Molly Taft: Game changer.

Michael Calore: Totally. You get them at the sporting goods store, you can get them on the internet or whatever. They’re sold as sports nutrition. But really they’re great because they get me excited about drinking the water that’s in front of me. Too many times I’m filling up my water bottle and then just forgetting about it, and then I come back into the office four days later on a Monday and unscrew the cap and it tastes like pennies. Nobody wants that. So put a hydration tablet in your water bottle and suck it down and love it. I like Nuun. Those are good.

Lauren Goode: And they come in handy-

Molly Taft: Those are really good.

Lauren Goode: … packets for travel too.

Molly Taft: Yeah.

Michael Calore: They do, yeah.

Lauren Goode: Yeah.

Michael Calore: Yeah, lots of single-use plastic. It’s awesome. Also-

Molly Taft: WIRED.

Michael Calore: … Liquid I.V. is good.

Molly Taft: Liquid I.V. has a Firecracker Popsicle flavor. That is my household’s favorite.

Michael Calore: We need PSL-flavored electrolytes.

Lauren Goode: Yes.

Molly Taft: No.

Michael Calore: Thanks for listening to Uncanny Valley. If you liked what you heard today, make sure to follow our show and rate it on your podcast app of choice. If you’d like to get in touch with us with any questions, comments, or show suggestions, write to us at uncannyvalley@WIRED.com. Today’s show is produced by Adriana Tapia. Amar Lal at Macro Sound mixed this episode. Mark Lyda is our San Francisco studio engineer. Pran Bandi is our New York studio engineer. Kate Osborn is our executive producer. Katie Drummond is WIRED’s global editorial director, and Chris Bannon is Condé Nast’s head of global audio.
