The rapid expansion of these tools offers an example of how America’s biggest city is aiming to be transparent about which departments use them, what kind of data is being processed and by whom.
In a decade, the number of reported algorithmic tools used in New York City has surged from eight to 46, highlighting the city’s growing reliance on digital solutions.
For a clear sense of how algorithms affect the average citizen, here are a few examples of the dozens of tools used by the city in 2023:
- Child welfare: Machine learning models analyze historical child welfare cases to predict future risks, helping the Administration for Children’s Services prioritize interventions and allocate resources more effectively.
- Education: The Department of Education uses algorithms to assess educator performance, informing personnel decisions and potentially shaping the quality of education for students.
- Public health: Algorithms monitor social media and 311 complaints for signs of foodborne illness outbreaks, while other tools analyze wastewater and genetic data to predict disease risks and optimize public health programs.
- Emergency response: The Fire Department employs algorithms to match patients with the nearest appropriate hospitals (a matching step sketched after this list), optimize ambulance dispatching, predict peak times for emergency calls and prioritize fire inspections based on risk assessment.
- Environmental protection: An AI-powered tool analyzes citizen-submitted media to identify idling vehicles, aiding the Department of Environmental Protection in enforcing regulations and improving air quality.
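To make the emergency-response example concrete, here is a minimal illustrative sketch of how nearest-appropriate-hospital matching can work in principle: filter to hospitals that offer the needed capability, then pick the one closest to the patient by great-circle distance. The hospital names, coordinates and capabilities below are entirely hypothetical, and the Fire Department’s actual system is certainly more sophisticated.

```python
import math

# Hypothetical hospital records: name, coordinates and clinical capabilities.
HOSPITALS = [
    {"name": "Hospital A", "lat": 40.7128, "lon": -74.0060, "capabilities": {"trauma", "stroke"}},
    {"name": "Hospital B", "lat": 40.7484, "lon": -73.9857, "capabilities": {"stroke"}},
    {"name": "Hospital C", "lat": 40.6782, "lon": -73.9442, "capabilities": {"trauma", "burn"}},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two points."""
    r = 6371.0  # approximate Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_appropriate_hospital(patient_lat, patient_lon, needed_capability):
    """Return the closest hospital offering the needed capability, or None."""
    candidates = [h for h in HOSPITALS if needed_capability in h["capabilities"]]
    if not candidates:
        return None
    return min(candidates, key=lambda h: haversine_km(patient_lat, patient_lon, h["lat"], h["lon"]))

if __name__ == "__main__":
    match = nearest_appropriate_hospital(40.7300, -73.9900, "trauma")
    print(match["name"] if match else "No suitable hospital found")
```

In practice, a dispatch system would weigh travel time, hospital capacity and diversion status rather than straight-line distance alone; the sketch only shows the basic shape of the matching problem.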
The widespread use of these tools — with roughly 35 percent processing identifying information like names, locations or health data — underscores the importance of transparency and accountability.
This year, the NYC Office of Technology and Innovation (OTI) released a set of definitions, including the meaning of an “algorithmic tool,” to help departments adhere to the reporting requirements.
The definition broadly encompasses technology that analyzes data to support decision-making, and isn't limited to AI. According to OTI, the surge in reported algorithmic tools reflects not only their growing prevalence but also a heightened awareness among city agencies regarding their reporting obligations.
“You must have clear definitions and guidance to effectively support agencies,” wrote Alex Foard, the executive director of research and collaboration at OTI, in an email to Government Technology. “And to do this, you have to speak the same language and communicate the same way — which is why we published definitions of more than 20 key terms to enable better support for agency implementation of AI projects and improved accountability. It also cannot be overstated how important it is to get started having conversations with your stakeholders — including members of other agencies, innovators in industry and academia and engaging with the public directly.”
In addition to annual reports on the OTI website, a list of the tools and descriptions of their use is available on the city’s open data portal.
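For readers who want the raw list, here is a minimal sketch of pulling records from NYC Open Data, which runs on the Socrata platform and serves datasets as JSON through the SODA API. The dataset identifier below is a placeholder, not the real ID for the algorithmic tools reporting dataset; that would need to be looked up on the portal.

```python
import requests

# NYC Open Data exposes datasets at
# https://data.cityofnewyork.us/resource/<dataset-id>.json
# "xxxx-xxxx" is a placeholder; substitute the actual dataset ID
# for the algorithmic tools reporting data from the portal.
DATASET_ID = "xxxx-xxxx"
URL = f"https://data.cityofnewyork.us/resource/{DATASET_ID}.json"

def fetch_tools(limit=100):
    """Fetch up to `limit` rows of the dataset as a list of dicts."""
    response = requests.get(URL, params={"$limit": limit}, timeout=30)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    for row in fetch_tools(limit=10):
        # Field names depend on the dataset schema; print whatever comes back.
        print(row)
```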
OTI doesn’t expect the adoption of these tools to slow down. The city considers itself a “global leader” in the space, pioneering processes to use AI responsibly, and released a first-of-its-kind action plan for AI in October 2023. Leaders say they’re looking forward to exploring ways to better serve their constituents.
“Across city government, agencies are relying upon algorithms more than ever to deliver more efficient services, maintain public safety and make our city more livable for New Yorkers,” said Foard. “We expect that trend will continue — and, as it does, the action plan will be there to provide the necessary guardrails to make sure that agencies responsibly and safely adopt these technologies.”