
In 2025, the Levin Center conducted a study of all 50 states to examine how legislatures ensure government programs work effectively for citizens. The study paid particular attention to factors such as legislative oversight committees, routine engagement with executive agencies, collection of casework data, digital transparency tools, and user-centered policy evaluation. The full report can be found here.
State Summary #
In Minnesota, the bipartisan, bicameral Legislative Audit Commission plays an important role in the state’s oversight landscape. The Commission appoints the Legislative Auditor, selects topics for review, and periodically reviews the work of the Office of the Legislative Auditor. While the Commission does not typically take public testimony, the Office of the Legislative Auditor is notable for its in-depth work with end-users of the programs it evaluates. More information on this work and how it fits into a people-centered oversight framework is included in this report as a case study.
In Minnesota, the Senate uses a centralized system for managing constituent contact and casework.
Case study: The Minnesota Office of the Legislative Auditor and people-centered data in formal program evaluation #
Legislative oversight is only as strong as the information that informs it. While hearings and constituent casework can provide valuable snapshots of how government is working, they are not a substitute for more formal, systematic tools for gathering information. For that, legislative institutions often turn to analytic agencies – entities like auditors and inspectors general, legislative research agencies, and so on – that help legislators assess conditions in society, examine financial performance, and evaluate the impact of government programs. The field of program evaluation is a broad one, but its general focus on operational performance, service delivery, and program outcomes, along with aspects like statutory compliance, makes it a natural venue for people-centered oversight. In this case study, we will look to Minnesota, where the Office of the Legislative Auditor’s efforts to incorporate citizens’ lived experience with government make it a model of people-centered oversight in action.
The Minnesota Office of the Legislative Auditor’s (OLA) Program Evaluation Division exists to “determine the degree to which the activities and programs entered into or funded by the state are accomplishing their goals and objectives” and whether they are using resources efficiently.[1] As a part of the legislative branch, it answers to a bipartisan, bicameral commission of legislators that appoints the Legislative Auditor and selects topics for review.
In recent years, OLA staff have used tools like surveys and interviews in their program evaluation work to collect feedback from groups as wide-ranging as cosmetologists, court officials, parents of new drivers, and individuals incarcerated in state prisons. OLA’s consistency in gathering this user- and implementer-centered information, its care in working with vulnerable populations, and its commitment to using the resulting information to produce substantive data and insights that drive oversight make it a valuable national model for the role that legislative analytic agencies can play in people-centered oversight.
While not every program evaluation topic lends itself to extensive interviews and surveys, most OLA program evaluations in recent years have included some component in which OLA staff solicit direct feedback from end-users or implementers of programs. A 2021 report on driver testing and licensing included not just information garnered from surveys of supervisors at testing sites, but also in-person surveys about the experience of scheduling a test, conducted with 45 parents while they waited for their child to take a road test.[2] A 2023 evaluation of a COVID-era rental assistance program included feedback from 207 landlord survey responses on their experiences with program aspects like application processing times and call center support.[3] Even a recent report on the implementation of task force recommendations on aggregate resources (a term encompassing materials like sand and gravel) – not a subject on which surveys might typically be used as an evaluation tool – included survey information gathered from county zoning administrators across the state.[4]
Including perspectives from people who have lived experience with government programs adds a great deal of value to OLA’s program evaluations. This is especially true in cases when those perspectives come from vulnerable populations or other groups whose perspectives might not traditionally be considered in oversight work. Working to include voices from those populations – and doing so in a thoughtful, ethical way – is challenging, but in recent reports examining state correctional facilities and parts of the state’s child protection system, OLA did just that.
As part of a program evaluation examining safety in state correctional facilities, OLA collected survey responses from 246 prisoners and 1,469 prison staff.[5] These surveys yielded valuable insights from both groups on the safety of correctional facilities, but the process of collecting data from them – particularly from prisoners – required careful planning and execution. An appendix to the report discusses some of the challenges involved.[6]
Department staff needed to develop methods for prisoners to take an online survey without access to the wider internet. Prison staff had to arrange for prisoners selected by random sample to take the survey and to supervise the process without observing survey answers. Because of the unique ethical considerations of conducting such a survey, OLA voluntarily consulted the Department of Corrections’ Institutional Review Board when developing survey protocols. These protocols included emphasizing the voluntary nature of the survey and making audio instructions available for respondents with limited reading ability. All these measures and others described in the survey methodology appendix (which stands on its own as a worthwhile read for anyone interested in the work of people-centered oversight) took considerable effort. They resulted, however, in a richer and more valuable program evaluation for the Legislature, one that carried with it the voices and points of view necessary to understand the full picture of the state’s corrections system – voices and perspectives that might otherwise never have made it beyond the walls of a prison.
Similarly, a 2022 report on Child Protection Removals and Reunifications included, along with survey data gathered from law enforcement agencies and county child protection agency administrators, information from a series of interviews with young people who were directly impacted by child protection removals. From the report: “We worked through a DHS-coordinated youth advisory council to interview several teenagers and young adults who had been removed from their homes and placed in foster care. We appreciated their willingness to share their stories with us.”[7] As it had done in its correctional facilities project, OLA consulted with an agency Institutional Review Board to assess the ethical challenges of conducting research with a vulnerable population.
While these interviews did not produce the same type of statistically rigorous data as some of OLA’s other work, they did provide valuable perspective from people who had experienced the effects of child protection removals firsthand. “Most of the young people we spoke with acknowledged that they had been in abusive or neglectful situations prior to their removal from the home,” reads the report. “However, a common concern in these interviews was that the young person was not aware of what was happening at the time nor did they know the reason for their removal from the home. The young people we spoke with expressed a desire for greater communication at the time of removal.”[8] Once again, this information – key to understanding policy’s impact on the people to whom it matters most – is available to inform legislative oversight only because of its inclusion in OLA’s reports.
OLA’s work in Minnesota is an excellent example of people-centered oversight because it so consistently includes frontline perspectives on program implementation – from both staff and “end-users” of programs. By meeting people where they are (sometimes very literally, as in the case of parents during their teen’s road test), OLA can include more candid and meaningful information on how policy impacts the people government serves. This work is challenging and takes real effort – far more than any web form – but the results speak for themselves.
Around the country, legislative analytic agencies with the ability to conduct program evaluations were born out of the idea that legislatures, to serve as a coequal branch of government, must develop better capacity for independent information gathering and oversight.[9] The information they collect and report is one of the best tools the legislative branch has to answer the central questions of people-centered oversight – questions of whether and how well government delivers on its promises. By collecting information from the front lines of program implementation thoughtfully and consistently, the Minnesota OLA does more than just enhance its reports – it strengthens the legislative branch and its oversight efforts with valuable, people-centered data. The Minnesota approach shows the power of people-centered oversight that extends beyond token efforts and includes deliberate, systematic integration of feedback from the people whose perspectives matter most. It serves as a compelling blueprint for people-centered oversight that is well within the reach of most analytic agencies.
[1] Minnesota Statutes 2024, 3.971, subd. 7.
[2] “Driver Examination Stations” (Minnesota Office of the Legislative Auditor, Program Evaluation Division, 2021), 20-21.
[3] “RentHelpMN” (Minnesota Office of the Legislative Auditor, Program Evaluation Division, 2023), 37.
[4] “Aggregate Resources” (Minnesota Office of the Legislative Auditor, Program Evaluation Division, 2025), 18-24.
[5] “Safety in State Correctional Facilities” (Minnesota Office of the Legislative Auditor, Program Evaluation Division, 2020), 87-90.
[6] Ibid.
[7] “Child Protection Removals and Reunifications” (Minnesota Office of the Legislative Auditor, Program Evaluation Division, 2022), 2.
[8] Ibid., 27.
[9] “NLPES and Its History,” National Conference of State Legislatures, February 25, 2025, https://www.ncsl.org/legislative-staff/nlpes/nlpes-and-its-history.