Using data analytics in regulation: Federal and local implementations
What does data analytics look like in action? In our latest Ascend article, we look at an example of analytics used at the national level, along with considerations local governments can weigh when tackling analytics projects of their own.

Thentia is a highly configurable, end-to-end regulatory and licensing solution designed exclusively for regulators, by regulators.


The power of data analytics is enormous. Over the past 20 years, regulators have found themselves better equipped than ever to extract meaningful insights from readily available data in their effort to serve the public interest. We now find ourselves in a worldwide “trial and error” phase of public data analytics: every day, regulators attempt to take fuller advantage of the data at their disposal. Once government leaders can access and analyze data efficiently, however they choose to do it, what can actually be done with it?

Here we will look at a real-life example of data analytics in action at the national level, as well as basic considerations regulators are encouraged to weigh when planning analytics projects.

Complaint analytics: the national and state levels

Analytics is an approach to governance in which regulators use data to generate new insights into the needs they serve and the services they provide. Ideally, a successful analytics project uses these insights to improve services, producing better results with fewer resources. At the national and state levels, regulators are using data analytics to craft policy, identify trends in complaints, find opportunities for regulatory reform, and much more. The Financial Conduct Authority (FCA), the U.K.’s financial regulator, for example, uses data to analyze complaints, gauging how firms in the country treat their customers.

The FCA’s 2017 Complaints Data Analysis report provides a key example of the use of data analytics at the national level. Government researchers analyzed approximately 3.32 million complaints received by 3,160 firms — a Herculean task that would require a serious budget increase were it performed manually by data scientists. Using analytics, officials found that 97% of these complaints were received by just 226 firms. In six out of 10 of these cases (59%), the firm agreed with the complainant, and in the first half of 2017 alone, just under £2 billion was paid in redress to complainants.
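A concentration analysis of this kind can be sketched in a few lines of Python. The complaint counts below are synthetic stand-ins — the FCA’s actual data pipeline is not public — but the ranking logic is the same: sort firms by complaint volume and find the smallest group accounting for a given share of the total.

```python
# Sketch of a complaint-concentration analysis using synthetic data.
# The firm counts below are invented; only the method is illustrated.
import random

random.seed(0)

# Hypothetical dataset: firm ID -> number of complaints received.
# A small number of large firms dominate, as in the FCA's findings.
complaints = {f"firm_{i}": random.randint(1, 10) for i in range(3000)}
for i in range(200):
    complaints[f"big_firm_{i}"] = random.randint(5000, 20000)

total = sum(complaints.values())

# Rank firms by complaint volume and count how many are needed
# to reach 97% of all complaints.
ranked = sorted(complaints.values(), reverse=True)
running, firms_needed = 0, 0
for count in ranked:
    running += count
    firms_needed += 1
    if running / total >= 0.97:
        break

print(f"{firms_needed} of {len(complaints)} firms hold 97% of complaints")
```

With real complaint records, the same loop would run over per-firm totals pulled from a reporting database rather than a synthetic dictionary.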

Though the report does not outline a regulatory response to this volume of complaints, it gives regulators valuable insight into which firms and practices most need their attention. For example, 80% of redress payments in 2017 were for payment protection insurance (PPI), an insurance product guaranteeing repayment of credit in the event of death or serious hardship on the part of the borrower. Regulators can use the information gleaned from this analysis to scrutinize PPI products more closely moving forward.
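The product-level breakdown works the same way: group redress payments by product and compute each product’s share of the total. The payment records below are invented for illustration.

```python
# Sketch of grouping redress payments by product to spot concentrations
# like the PPI finding above. The figures are hypothetical.
from collections import defaultdict

payments = [  # (product, redress amount in GBP) -- invented records
    ("PPI", 1200.0), ("PPI", 800.0), ("current account", 150.0),
    ("PPI", 2000.0), ("insurance", 300.0), ("PPI", 500.0),
]

# Sum redress per product.
by_product = defaultdict(float)
for product, amount in payments:
    by_product[product] += amount

total = sum(by_product.values())
ppi_share = by_product["PPI"] / total
print(f"PPI share of redress: {ppi_share:.0%}")
```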

Planning an analytics project: the local level

A 2018 paper from the Civic Analytics Network initiative at Harvard’s Ash Center discusses different ways city governments have used data analytics to maximize efficiency and improve quality of life for their residents. The paper also breaks down some best practices local regulators can take when planning their analytics projects. This includes making sure government leaders target the correct problems and use the appropriate data when building out these projects. The paper outlines five basic steps toward a successful analytics project:

Identifying the problem

While it may seem a simple enough task, identifying which problems could best be solved with data analytics presents a challenge for government leaders. Regulators can empower themselves to better identify problems in their jurisdictions by hiring data scientists as internal consultants. For example, Chicago was one of the first cities to create a chief data officer (CDO) position within the city’s Department of Innovation and Technology (DoIT).

While DoIT does not prescribe data solutions to the problems the city faces, it uses data-driven technology to help leaders identify weaknesses and improve the performance and delivery of services wherever those weaknesses may be. Sometimes the department does intervene directly, however, as when it created an analytical model to improve the efficiency of restaurant inspections.
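A risk-based inspection queue of the kind Chicago built can be sketched as follows. The features and weights here are invented for illustration and are not the city’s actual model, which was trained on far richer data.

```python
# Hypothetical sketch of risk-based inspection ordering, loosely inspired
# by Chicago's restaurant-inspection model. Features and weights are
# invented for illustration only.
def risk_score(days_since_inspection, past_violations, complaint_count):
    # Weighted sum: staleness, violation history, and complaints
    # all raise a restaurant's inspection priority.
    return (0.01 * days_since_inspection
            + 1.0 * past_violations
            + 0.5 * complaint_count)

restaurants = [
    {"name": "A", "days": 400, "violations": 0, "complaints": 0},
    {"name": "B", "days": 90,  "violations": 5, "complaints": 2},
    {"name": "C", "days": 200, "violations": 1, "complaints": 0},
]

# Inspect the highest-risk restaurants first.
queue = sorted(
    restaurants,
    key=lambda r: risk_score(r["days"], r["violations"], r["complaints"]),
    reverse=True,
)
print([r["name"] for r in queue])
```

In practice the scoring function would be replaced by a trained model, but the operational output is the same: a re-ordered inspection queue.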

Assessing data readiness

Data readiness is not always as simple as government leaders may assume. Though regulators sit atop a wealth of data that could help them solve problems, utilizing unused data as an end goal is a misguided approach, according to the Civic Analytics Network paper. Author Jessica A. Gover argues that “successful analytics projects begin with the identification not of unused data, but of critical issues in need of data-driven solutions.”

The University of Chicago’s Center for Data Science and Public Policy (DSaPP) has created a “Data Maturity Framework,” designed to help public officials determine how ready their organizations are to undertake an analytical project. It comes in the form of a scorecard identifying factors like how data is stored, what data is collected, what privacy practices are in place, what data use policies are in place, and more.
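A readiness scorecard in the spirit of DSaPP’s framework might be structured like this. The category names and the 1–4 scale below are assumptions for illustration, not the framework’s actual rubric.

```python
# Illustrative readiness scorecard; categories and scoring scale are
# assumed, not taken from the actual Data Maturity Framework.
from dataclasses import dataclass

@dataclass
class ReadinessItem:
    category: str   # e.g. storage, collection, privacy, use policy
    question: str
    score: int      # assumed scale: 1 (ad hoc) to 4 (institutionalized)

def overall_readiness(items):
    """Average the item scores into a single readiness figure."""
    return sum(item.score for item in items) / len(items)

scorecard = [
    ReadinessItem("storage", "Is data stored in a queryable database?", 3),
    ReadinessItem("collection", "Is key data collected digitally?", 2),
    ReadinessItem("privacy", "Are privacy practices documented?", 4),
    ReadinessItem("policy", "Are data use policies in place?", 1),
]

print(f"Overall readiness: {overall_readiness(scorecard):.2f} / 4")
```

A low aggregate score would signal that foundational work — digitizing collection, writing use policies — should precede any analytics project.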

Scoping the project

The scope of a city’s analytics project can vary depending on the municipality, the task at hand, and many other variables. Still, researchers have attempted to create scoping standards to help regulators figure out just how much data they should analyze, and to what extent. For example, DSaPP has also created a Data Science Project Scoping Guide, which helps speed up project development with its simple framework. DSaPP proposes four steps for this process:

  • Goals – Defining the regulator’s (or researcher’s) goals.
  • Actions – Determining which actionable steps the analytics project will inform.
  • Data – Identifying what data is available and necessary to the project.
  • Analysis – Pinpointing what type of analysis is required (and whether it involves description, detection, prediction, or behavior change).
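The four steps above can be captured as a simple checklist structure. The field names and validation below are assumptions for illustration, not part of DSaPP’s guide.

```python
# Minimal sketch of DSaPP's four scoping steps as a checklist structure.
# Field names and the example project are invented for illustration.
ANALYSIS_TYPES = {"description", "detection", "prediction", "behavior change"}

def scope_project(goals, actions, data_sources, analysis_type):
    """Bundle the four scoping answers, rejecting unknown analysis types."""
    if analysis_type not in ANALYSIS_TYPES:
        raise ValueError(f"unknown analysis type: {analysis_type}")
    return {
        "goals": goals,
        "actions": actions,
        "data": data_sources,
        "analysis": analysis_type,
    }

plan = scope_project(
    goals="Reduce foodborne illness from delayed inspections",
    actions=["Re-order the inspection queue by predicted risk"],
    data_sources=["past inspection results", "complaint records"],
    analysis_type="prediction",
)
print(plan["analysis"])
```

Forcing each answer to be written down before work begins is the point of the framework: an empty or vague field flags a project that is not yet ready to scope.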

Piloting the project

This is the classic trial-and-error phase that every analytics project must undergo. Piloting carries real risks, but regulators must do their best to mitigate them and proceed: a pilot is the only way to uncover the insights needed to run the project at a larger scale. The U.K. government argues the following:

“Once embarked upon, a pilot must be allowed to run its course. Notwithstanding the familiar pressures of government timetables, the full benefits of a policy pilot will not be realized if the policy is rolled out before the results of the pilot have been absorbed and acted upon.”

Implementing and scaling the model

Even more variable than the pilot phase of an analytics project is its implementation and scaling. Regulators must work on a case-by-case basis, evaluating what worked and what didn’t during the pilot, deciding whether to move forward with implementation, and determining the scope of the implementation itself. But if the first four steps are taken with diligence and careful consideration, this fifth phase will be that much better informed, and the public that much better served by a regulator’s data-driven actions.

Written by Jordan Milian
Jordan Milian is a writer covering government regulation and occupational licensing for Ascend, with a professional background in journalism and marketing.


Ascend Magazine lives at the nexus of regulation, licensing, public policy, and digital government. We share news, insight, and exclusive commentary from leaders in regulation and technology. 
