The recent Biden White House Executive Order on AI addresses important questions. If it’s not implemented in a dynamic and flexible way, however, it runs the risk of impeding the kinds of dramatic improvements in both government and community participation that generative AI stands to offer.
Current bureaucratic procedures, developed 150 years ago, need reform, and generative AI presents a unique opportunity to do just that. As two lifelong public servants, we believe that the risk of delaying reform is just as great as the risk of negative impacts.
Anxiety around generative AI, which has been spilling across sectors from screenwriting to university education, is understandable. Too often, though, the debate is framed only around how these tools will disrupt us, not how they might reform systems that have been calcified for too long in regressive and inefficient patterns.
OpenAI’s ChatGPT and its competitors are not yet part of the government reform movement, but they should be. Most recent attempts to reinvent government have centered on elevating good people within bad systems, in the hope that this will chip away at fossilized bad practices.
The level of transformative change now will depend on visionary political leaders willing to work through the tangle of outdated procedures, inequitable services, hierarchical practices, and siloed agency verticals that hold back advances in responsive government.
New AI tools offer the greatest hope yet for creating broadly reformed, citizen-oriented governance. The reforms we propose do not demand reorganization of municipal departments; rather, they require examining the fundamental government operating systems and using generative AI to empower employees to look across agencies for solutions, analyze problems, calculate risk, and respond in record time.
What makes generative AI’s potential so great is its ability to fundamentally change the operations of government.
Bureaucracies rely on paper and routines. The red tape of bureaucracy has been strangling employees and constituents alike. Employees, denied the ability to quickly examine underlying problems or risks, resort to slow-moving approval processes despite knowing, through frontline experience, how systems could be optimized. And the big machine of bureaucracy, unable or unwilling to identify the cause of a prospective problem, resorts to reaction rather than preemption.