NS739: Use of artificial intelligence (AI) in the analysis of qualitative public input
Problem
The community is a major user of the road transport system, so it is important to incorporate community members into the assessment and implementation of road transport policy. Artificial intelligence (AI) can summarize large volumes of information and data, including survey research and public engagement comments. AI tools can help users organize and summarize such qualitatively derived data, since many programs offer data summary, storage, retrieval, and organization. MnDOT and local agencies rely on public input to make important planning, program, and project-related decisions. Could AI enable us to be more responsive to the public by putting this public input data to use more easily, quickly, and cost-effectively? Would access to AI programs help broaden outreach to multiple communities and encourage more rigorous efforts to invite public input in the future? Further research is needed before we can understand how AI can be used as a research tool and, more importantly, how it can advance our knowledge and understanding.
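For illustration only, below is a minimal sketch (in Python) of what an AI summarization step over open-ended public comments might look like. The client library (OpenAI's Python client), model name, prompt wording, and function names are assumptions made for this example, not a recommended or evaluated MnDOT workflow.

```python
# Hypothetical sketch: grouping open-ended public comments into themes with a
# general-purpose large language model. The client library, model name, and
# prompt wording are placeholders for illustration only.
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable


def summarize_comments(comments: list[str], max_themes: int = 5) -> str:
    """Ask the model to group comments into a handful of labeled themes."""
    prompt = (
        f"Group the following public comments into at most {max_themes} themes. "
        "For each theme, give a short label, a one-sentence summary, and an "
        "approximate count of supporting comments.\n\n"
        + "\n".join(f"- {c}" for c in comments)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any capable model could be substituted
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


# Example usage with made-up comments:
# print(summarize_comments([
#     "Please add a protected bike lane on Main Street.",
#     "The detour signage during construction was confusing.",
#     "More frequent bus service would help me get to work.",
# ]))
```

Any such output would still require human review before informing decisions, consistent with the objective below.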
Objective
MnDOT planning, public engagement, and market research staff wish to understand more about the use of artificial intelligence (AI) programs to analyze qualitative public input. In particular, can AI provide high-quality, trustworthy summary data to decision makers without the high cost and onerous practices currently associated with manual coding and analysis? Because decisions about transportation investments and policies often carry significant public and environmental stakes, and because surveys and interviews face unique challenges in integrating AI, a rigorous, resource-efficient approach is needed that enhances participant engagement and ensures privacy. This includes ensuring that humans remain responsible for reviewing and approving AI recommendations that affect internal and external stakeholders. In the current environment, it is very important to manage public input data effectively and accurately in order to deliver on transportation goals.
Previous research
Our team plans to consider previous research as outlined in the literature search. In addition, we would like to find a consultant with previous experience in qualitative research methods and artificial intelligence, preferably in the field of transportation, to provide guidance.
See the literature search for previous examples; the most relevant studies are located at the top of the literature search. Findings indicate that contrasting quantitative and qualitative results highlight the need for mixed methods when assessing public perceptions. No studies have been done at MnDOT in this space.
Expected outcomes
- New or improved technical standard, plan, or specification
- New or improved manual, handbook, guidelines, or training
- New or improved policy, rules, or regulations
- New or improved business practices, procedure, or process
- New or improved tool or equipment
- New or improved decision support tool, simulation, or model/algorithm (software)
Expected benefits
For each anticipated benefit, a brief note describes how that benefit might be tracked or measured (quantitatively or qualitatively). The number in parentheses indicates whether the source of the benefit measurement is:
- (1) A specific research task in the project that supports measuring this particular benefit, or
- (2) Implementation of the research findings (anticipating positive results)
- Decrease Engineering/Administrative Cost: (1)
Online public engagement/participation and survey platforms have gained significant popularity in reaching thousands of participants, and government agencies are using them to collect public input to inform decision making. Handling these inputs has become increasingly challenging because it requires manual analysis, which is time-consuming, inconsistent, and expensive. The existing approaches for analyzing public input do not keep pace with the tremendous increase in qualitative data from the various platforms (SurveyMonkey, Let’s Talk Transportation surveys, etc.) that Market Research works with daily.
- User Benefits, Advancing Equity: (2)
AI may be applicable to summarizing and translating non-English-language public participation contributions. In addition, this adds to transparency in reporting methods and outcomes, because summaries of market research studies and public engagement comments could be shared with the public and stakeholders in a timely manner, creating an opportunity to increase public trust. As reported in biennial Omnibus studies, public trust in MnDOT has trended downward. There is an opportunity to evaluate data quickly and report it through social media, GovDelivery, and other communication methods; this could be tracked through survey questions in the Omnibus study and in post-construction and post-engagement surveys after a construction project or engagement activity takes place.
- Risk Management: (2)
This is qualitative rather than quantitative public input, so how do we make room for the quiet voices and comments, the one brilliant idea that cracks open the issue or points to the solution, or the unusual or different “take” on the topic? Related concerns include:
- Issues around the quality and rigor of the questions and the impact on results
- Whether the results are for a stand-alone project (one or two rounds) or part of a multi-year / multi-phase / multi-project effort…and thus the context of the results and subsequent analysis
- The impact of assumptions and understandings about what decision makers expect and will accept
- The impact of the extent to which decision makers are committed to using the results to shape decisions…and how that shapes the analysis
- The impact of unexplored or unnamed “standard” practices, processes, approaches, and methods of doing analyses, along with assumptions, directions, and expectations, as well as experience and expertise. How do the notions of “objectivity,” “bias,” or “accuracy” fit in? What does the analyst believe they are supposed to do and deliver?
- Concerns about public data collected through a survey or other method: data will need to be scanned for personally identifiable information before being included in artificial intelligence tools. There is a risk of including this sort of data in public AI tools, which will need to be mitigated beforehand (a minimal illustrative sketch follows this list).
- Data security. How do the AI tools understand the data that is input? Where is the data stored? Are there risks associated with using AI tools?
- Are others at MnDOT using AI in their work? We will only be focusing on the data that is collected by Market Research. Are there risks associated with untrained users using AI (data breaches, security with public data, etc.)?
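As a minimal illustration of the pre-screening step mentioned above, the sketch below uses simple regular expressions to flag and redact common forms of personally identifiable information before a comment is passed to any AI tool. The patterns shown are assumptions for the example and are not an exhaustive or approved PII policy; a production workflow would require a vetted detection tool and human review.

```python
# Illustrative sketch of pre-screening survey responses for common personally
# identifiable information (PII) before they are sent to any AI tool. The
# patterns below are assumptions for illustration, not an approved PII policy.
import re

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\b(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}


def redact_pii(text: str) -> str:
    """Replace matches of the patterns above with a labeled placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[REDACTED {label.upper()}]", text)
    return text


# Example: redact a comment before it is included in any AI analysis.
print(redact_pii("Call me at 651-555-0123 or email jane.doe@example.com."))
# -> "Call me at [REDACTED PHONE] or email [REDACTED EMAIL]."
```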
Technical advisory panel
Names of individuals to consider for the Technical Advisory Panel:
- Hally Turner
- Nathan Lassila
- Melissa Barnes
- Rep(s) from DPE
- Rep from FHWA
- Rep from MNIT or MNIT @ MnDOT