Data virtualization and regulation: Creating a logical digital fabric for government leaders
Government leaders often find themselves analyzing data from multiple discrete sources in their everyday work. In the past, physically integrating this data has proven to be a costly and time-consuming process. Thanks to data virtualization, however, regulators can now access data from many different sources without relocating any of it. Here we look at the basics of data virtualization as well as different ways governments have adopted the technology.

Thentia is a highly configurable, end-to-end regulatory and licensing solution designed exclusively for regulators, by regulators.

The transition toward a data-driven world has not historically been a uniform process. Because different types of information have been compiled and made machine-friendly on different timelines and in different ways over the years, governments today often find themselves juggling multiple datasets with their own unique structures. But what if regulators could access and analyze data regarding complaints, credentials, policy changes, and everything in between, all without physically integrating their data into one container? In recent years, this has become a reality, thanks to data virtualization.

In this article, we will look at the basic concepts and mechanics of data virtualization, as well as steps governments have taken to implement this technology in their work.

What is data virtualization?

Data virtualization is an approach to data management that allows regulators to access multiple datasets directly, regardless of incongruities in the way each dataset is built and organized. It represents an alternative to methods like the “extract, transform, load” (ETL) approach, under which these same datasets would be scanned, processed, and reformatted until all data is neatly organized in one output container.

The most prominent difference between data virtualization and ETL is the amount of time and processing power each method requires. Under the ETL approach, human operators must often replicate and manipulate separate datasets until they all fit into one overarching data structure. Of course, technological advancements have led to the creation of automated ETL software that expedites this process, but it does not change the amount of processing power required to compile such a wealth of information.
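The extract, transform, load pattern described above can be sketched in a few lines of Python. The sources, field names, and shared schema below are invented for illustration: records are extracted from two incompatible datasets, transformed to one layout, and loaded into a single output table.

```python
import sqlite3

# Two hypothetical source datasets with incompatible schemas.
licenses = [{"licensee": "A. Smith", "lic_no": "RN-1001"}]
complaints = [{"subject_name": "A. Smith", "case_id": "C-77"}]

def etl(conn):
    """Extract records, transform them to one shared schema, load into one table."""
    conn.execute("CREATE TABLE records (name TEXT, ref TEXT, source TEXT)")
    # Transform: map each source's fields onto the shared layout.
    rows = [(r["licensee"], r["lic_no"], "licenses") for r in licenses]
    rows += [(r["subject_name"], r["case_id"], "complaints") for r in complaints]
    # Load: everything ends up physically copied into one container.
    conn.executemany("INSERT INTO records VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
etl(conn)
```

Even in this toy version, the defining cost is visible: every record is replicated into the output container, which is the work (and processing power) that virtualization avoids.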

With data virtualization, regulators can avoid the time and costs required to create a brand-new integrated data platform. The approach relies on “middleware”: specialized software designed to connect to each data source, access the information therein, and present it to an end user, while also providing tools to analyze the data as the user sees fit. The software acts as a layer of “digital fabric” that covers each discrete data source and produces results much more quickly than physical data integration.
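The middleware idea can be illustrated with a minimal sketch, not any real product: a toy layer that, at query time, fans each request out to adapters for two differently structured sources, leaving the data where it lives. All class, source, and field names here are hypothetical.

```python
import sqlite3

class VirtualLayer:
    """Toy middleware: delegates each query to every registered source adapter."""
    def __init__(self, adapters):
        self.adapters = adapters  # maps source name -> query function

    def find(self, name):
        # Fan the request out at query time; no data is copied or relocated.
        results = []
        for source, fetch in self.adapters.items():
            results += [(source, rec) for rec in fetch(name)]
        return results

# One relational source and one in-memory source, each left in place.
licensing_db = sqlite3.connect(":memory:")
licensing_db.execute("CREATE TABLE licenses (name TEXT, number TEXT)")
licensing_db.execute("INSERT INTO licenses VALUES ('A. Smith', 'RN-1001')")

complaint_log = [{"name": "A. Smith", "case": "C-77"}]

layer = VirtualLayer({
    "licensing": lambda n: licensing_db.execute(
        "SELECT number FROM licenses WHERE name = ?", (n,)).fetchall(),
    "complaints": lambda n: [c["case"] for c in complaint_log if c["name"] == n],
})

matches = layer.find("A. Smith")
```

The contrast with the ETL pattern is that the integration work happens per query rather than up front: the layer returns a combined answer while each dataset keeps its own structure.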

How have regulators used data virtualization?

Aside from reductions in processing time and monetary cost, data virtualization offers a few additional advantages that can help government agencies more efficiently and effectively serve the public. For example, using a data virtualization layer, different public employees with different objectives can access the same information in the ways that are most relevant to their goals. Data scientists, business analysts, and public policymakers, for example, can access an agency’s complete datasets and organize the data in their preferred manner.
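One way to picture role-specific access is with database views: over a single shared dataset, each audience gets a presentation shaped to its goals, with no copies made. The table, view, and field names below are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cases (id TEXT, region TEXT, days_open INTEGER)")
conn.executemany("INSERT INTO cases VALUES (?, ?, ?)",
                 [("C-1", "North", 12), ("C-2", "South", 40)])

# The same underlying data, presented two ways: an analyst sees regional
# aggregates, while a policymaker sees individual long-open cases.
conn.execute("""CREATE VIEW analyst_view AS
    SELECT region, AVG(days_open) AS avg_days FROM cases GROUP BY region""")
conn.execute("""CREATE VIEW policy_view AS
    SELECT id, region FROM cases WHERE days_open > 30""")
```

Neither view duplicates a row; each is just a saved way of looking at the one dataset, which is the organizing idea behind a shared virtualization layer.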

As the digital world has become increasingly reliant on data over the past 15 years, analysts have been quick to point out the potential of data virtualization to streamline regulatory work. For example, a 2012 study of federal, state, and local government IT decision-makers posited that the technology could save governments $30 billion by the end of 2015. An early use case for this approach can be found in the local government of North Myrtle Beach, which by 2015 had been working to virtualize the city’s servers for seven years.

Rick Wall, director of information services for the city, said this effort emerged from the need to provide mobile city workers with specialized applications and access to large databases. Public Works employees and city inspectors, for example, can now access datasets securely and contribute their own updates to help the local government function more efficiently overall. Beach Services auditors, too, can use portable computers to perform field audits with concession workers.

Government leaders at the federal level have also adopted data virtualization in their approaches. A Defense Department agency, for example, seeking to combine two data centers and provide widespread access to the information contained therein, recently used virtualization to accomplish this goal. In doing so, the agency reduced its data integration expenses by 80%. The agency can also now respond up to 97% faster to inquiries and it can quickly and securely deploy new cloud-based platforms and services.

The future of data virtualization

As regulators at every level continue to migrate their work to the cloud and adopt data-driven technologies in their everyday work, the way they organize their data will be crucial to their success as digital governors. The promise of data virtualization is huge – by streamlining the way data is accessed and organized by multiple decision makers, governments can save millions of dollars in processing and labor costs.

Many public officials have already capitalized on this promise, using data virtualization to give their agencies a new level of efficiency, accessibility, and flexibility in the way they organize data and respond to external events. When used in conjunction with other data-driven technologies like predictive analytics and artificial intelligence, data virtualization offers an encouraging path forward for regulators to better serve the public in our new world of digital governance.

Written by Jordan Milian
Jordan Milian is a writer covering government regulation and occupational licensing for Ascend, with a professional background in journalism and marketing.
