Minnesota Department of Transportation

Research & Innovation

NS739: Use of Artificial Intelligence in the Analysis of Qualitative Public Input

Problem

Artificial intelligence (AI) can summarize large volumes of information and data, including survey research and public engagement comments. Because many AI programs offer data summarization, storage, retrieval, and organization, these tools could help users organize and summarize such qualitative data effectively. MnDOT and local agencies rely on public input to make important planning, program, and project-related decisions. Could AI enable us to be more responsive to the public by putting this public input data to use more easily, quickly, and cost-effectively? Would access to AI programs help broaden outreach to multiple communities and encourage more rigorous efforts to invite public input in the future? Further research is needed before we can understand how AI can be used as a research tool and, more importantly, how it can advance our knowledge and understanding.

Objective

MnDOT planning, public engagement, and market research staff wish to understand more about the use of artificial intelligence (AI) programs to analyze qualitative public input. In particular, can AI deliver high-quality, trustworthy summary data to decision makers without the high cost and onerous practices currently associated with manual coding and analysis?

Previous research

Our team plans to consider previous research as outlined in the literature search. In addition, we would like to engage a consultant with previous experience in qualitative research methods and artificial intelligence to provide guidance.

See the literature search for previous examples; however, there have been no studies at MnDOT to build upon.

Expected outcomes

  • New or improved technical standard, plan, or specification
  • New or improved manual, handbook, guidelines, or training
  • New or improved policy, rules, or regulations
  • New or improved business practices, procedure, or process
  • New or improved tool or equipment
  • New or improved decision support tool, simulation, or model/algorithm (software)

Expected benefits

The number 1 or 2 after each benefit indicates the source of the benefit measurement:

  1. A specific research task in your project that supports measuring this particular benefit, or
  2. Implementation of the research findings (anticipating positive results)
  • Decrease Engineering/Administrative Cost (1)
    • Online public engagement/participation platforms have gained significant popularity and can reach thousands of participants, and government agencies are using them to collect public input to inform decision making. Handling these inputs has become increasingly challenging because it requires manual analysis, which is time-consuming, inconsistent, and expensive. Existing approaches to analyzing public input have not kept pace with the tremendous increase in qualitative data from the various platforms (SurveyMonkey, Let’s Talk Transportation surveys, etc.) that Market Research works with daily. (A minimal sketch of the kind of batch summarization at issue appears after this list.)
  • User Benefits: Advancing Equity (2)
    • Applicability of AI to summarize/translate non-English language public participation contributions. In addition, this adds to transparency in reporting methods and outcomes, as summaries of market research studies and public engagement comments could be shared with the public and stakeholders in a timely manner. This creates an opportunity to increase public trust. As reported in biennial Omnibus studies, public trust in MnDOT has trended downwards. There in an opportunity to evaluate data quickly, and report through social media, GovDelivery, and other communication methods and could be tracked through survey questions in the Omnibus study and post-construction and post-engagement surveys after a construction project or engagement activity takes place.
  • Risk Management (2)
    • This is qualitative rather than quantitative public input, so how do we make room for the quiet voices and comments, the one brilliant idea that cracks open the issue or points to the solution, or the unusual or different “take” on the topic?
      • Issues around the quality and rigor of the questions and the impact on results
      • Whether the results are for a stand-alone project (one or two rounds) or part of a multi-year, multi-phase, multi-project effort, and thus the context of the results and subsequent analysis
      • The impact of assumptions and understandings about what decision makers expect and will accept
      • The impact of the extent to which decision makers are committed to using the results to shape decisions, and how that commitment shapes the analysis
      • The impact of unexplored or unnamed “standard” practices, processes, approaches, and methods of analysis, along with assumptions, directions, and expectations, as well as experience and expertise. How do the notions of “objectivity,” “bias,” and “accuracy” fit in? What does the analyst believe they are supposed to do and deliver?
      • Concerns about public data collected through a survey or other method: data will need to be scanned for personally identifiable information before being included in artificial intelligence tools. Including this sort of data in public AI tools carries a risk that must be mitigated beforehand (see the redaction sketch after this list).
      • Data security: how do the AI tools interpret the data that is input? Where is the data stored? Are there risks associated with using AI tools?
      • Are others at MnDOT using AI in their work? We will focus only on the data collected by Market Research. Are there risks associated with untrained users using AI (data breaches, security of public data, etc.)?
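
To make the cost question above concrete, the sketch below illustrates one way batches of public comments could be summarized and then rolled up for decision makers. This is a hypothetical Python sketch, not a proposed MnDOT workflow: the summarize_batch function is a stand-in for whatever vetted AI service this research might identify, and the sample comments are invented.

    # Illustrative sketch only; assumes comments were already exported from a
    # survey platform and scrubbed of personal identifiers (see next sketch).
    from typing import List

    def chunk(comments: List[str], size: int) -> List[List[str]]:
        """Split comments into batches small enough for one model request."""
        return [comments[i:i + size] for i in range(0, len(comments), size)]

    def summarize_batch(batch: List[str]) -> str:
        """Placeholder for a vetted AI summarization call (assumption).

        A real implementation would send the batch to an approved model with
        a prompt such as "Identify the main themes in these public comments."
        Here it returns a stub so the sketch runs offline.
        """
        return f"[theme summary of {len(batch)} comments]"

    def summarize_all(comments: List[str], batch_size: int = 50) -> List[str]:
        """First pass of a two-pass analysis: one summary per batch.

        A second pass over these batch summaries (not shown) would yield a
        single project-level summary for decision makers.
        """
        return [summarize_batch(b) for b in chunk(comments, batch_size)]

    if __name__ == "__main__":
        sample = [  # invented comments, for illustration only
            "Please add a crosswalk near the Main St. transit stop.",
            "Route 94 buses are frequently late in the evening.",
            "The detour signage during construction was confusing.",
        ]
        for summary in summarize_all(sample, batch_size=2):
            print(summary)

Batching matters because AI models limit how much text a single request can carry; summarizing batches and then summarizing the summaries is one common way to scale past that limit.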
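
Similarly, for the privacy risk noted above, a first-pass redaction step could run before any comment text leaves MnDOT systems. The patterns below are illustrative assumptions that catch only email addresses and US-style phone numbers; names, street addresses, and other identifiers would need additional handling and review against MNIT data security guidance.

    # Illustrative sketch only: redact obvious identifiers before comment
    # text is sent to an external AI tool. This is a first pass, not a
    # substitute for policy review.
    import re

    # Assumption: simple US-format patterns are adequate for a first pass.
    EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
    PHONE = re.compile(r"(\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}")

    def scrub(text: str) -> str:
        """Replace detected identifiers with placeholder tokens."""
        return PHONE.sub("[PHONE]", EMAIL.sub("[EMAIL]", text))

    if __name__ == "__main__":
        comment = "Call me at (651) 555-0142 or jdoe@example.com about Hwy 36."
        print(scrub(comment))  # -> Call me at [PHONE] or [EMAIL] about Hwy 36.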

Technical advisory panel

  • Hally Turner
  • Nathan Lassila
  • Rep from PELG
  • Rep(s) from DPE
  • Rep from PMG
  • Rep from FHWA (Josh Pearson?)
  • Rep from MNIT