Texas’ grid operator is developing a new process to evaluate multiple large-load interconnection requests at the same time. The question for cryptocurrency miners and data center developers that are already in line is: who gets to go first?
That should be sorted out by month’s end. That’s when the Electric Reliability Council of Texas (ERCOT) expects to announce criteria for which energy-intensive projects will be considered for “Batch Zero,” the first group to go through its revised planning process.
At a Public Utility Commission of Texas (PUC) meeting on Thursday, Jeff Billo, ERCOT’s vice president of interconnection and grid analysis, said that for “Batch Zero,” the grid operator will consider proposed large-load interconnection requests from projects that have been in the queue for some time and don’t need to be restudied. Projects that are less far along will be studied in a later batch, Billo said.
These requests primarily come from data centers, crypto mines, industrial sites and hydrogen projects. Billo also said that existing projects in ERCOT’s queue that might warrant another transmission study could still be considered for “Batch Zero.”
“There are a lot of details to fill out there,” Billo said. “We are still really early in the process of designing how that batch study would work.”
The timing of a transmission study matters greatly for companies with billions of dollars at stake in developments requiring grid connection. PUC Chairman Thomas Gleeson said one of the largest points of contention among data center developers has been uncertainty about where their projects stand, despite some having been in the queue for years. Gleeson said it’s important not to leave these companies in limbo while ERCOT sorts out its new procedures for reviewing multiple requests from large energy users together and allocating existing transmission capacity among them. ERCOT previously studied large-load users one at a time.
“Transparency around this is going to be critically important to ensuring success,” the chairman said.
The planning around “Batch Zero” comes as ERCOT is well into reforming its transmission planning procedures. With the increasing number of large loads seeking to connect to the grid, ERCOT and utilities cannot keep up with the required transmission planning and end up issuing repeated restudies.
The current system, built for a large-load queue totaling 40 to 50 projects, is now bogged down by the 225 new interconnection requests ERCOT received last year, according to a December report.
Under its previous planning process, by the time one data center finished planning studies, the results often had to be reconsidered almost immediately, as new projects joining the interconnection queue changed local transmission needs and reliability.
The consensus from early conversations with corporate stakeholders, including Google, Meta, CenterPoint, Amazon and OpenAI—all looking for grid capacity in Texas—was that the uncertainty in the current process creates undue risk for developers with existing interconnection requests.
The proposed batch method aims to reduce that risk.
The outcome of the new process would determine the number of megawatts that Texas’ independent grid could reliably deliver and the additional transmission projects needed to enable full interconnection.
As an example, Billo said that if a developer requested interconnection of a 500-megawatt project in 2028, but the batch study showed ERCOT could reliably provide only 100 megawatts until a transmission upgrade was completed in 2030, the developer would be offered an “on-ramp” of 100 megawatts. Once the upgrade was finished in 2030, the developer would receive the full 500 megawatts of grid power.
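To make the arithmetic of that example concrete, here is a minimal, purely illustrative sketch of the phased “on-ramp” logic Billo described. The function and class names (build_on_ramp, OnRampOffer) are hypothetical and do not reflect any actual ERCOT tool or methodology.

```python
# Illustrative sketch only: the names below are assumptions,
# not part of ERCOT's actual batch study process.
from dataclasses import dataclass

@dataclass
class OnRampOffer:
    year: int
    megawatts: int

def build_on_ramp(requested_mw: int, request_year: int,
                  available_mw: int, upgrade_done_year: int) -> list[OnRampOffer]:
    """Return a phased delivery schedule: partial power at interconnection,
    full power once the transmission upgrade is complete."""
    schedule = [OnRampOffer(request_year, min(requested_mw, available_mw))]
    if available_mw < requested_mw:
        schedule.append(OnRampOffer(upgrade_done_year, requested_mw))
    return schedule

# Billo's example: 500 MW requested for 2028, with 100 MW deliverable
# until a transmission upgrade finishes in 2030.
for offer in build_on_ramp(500, 2028, 100, 2030):
    print(f"{offer.year}: {offer.megawatts} MW")
# 2028: 100 MW
# 2030: 500 MW
```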
At the end of the batch study process, developers would have a set amount of time to make a financial commitment demonstrating they will proceed with the project, Billo said. After that, ERCOT could begin a transmission project to cover the “firm” commitments, which could then be accounted for in later batch studies.
The initial sentiment from “hyperscale” data center users like Google, Amazon Web Services, Meta and Microsoft, as well as from developers and independent power generators, is that a batch-based approach is necessary for large-load interconnection. “Everyone that we have talked to so far has been supportive of us moving to a batch study process,” Billo said.