Guest Post: How Agencies Can Gain Faster Mission Value from Legacy Data
Following is a guest blog post from Bob Jeffcott, Principal Systems Engineer at Software AG Government Solutions and Brendan Schultz, Sales Engineer at Snowflake, about how agencies can transform legacy data for faster mission value.
While IT modernization is a key trending topic in government, the reality is that many agencies still rely on legacy mainframe data for maintaining the mission.
Getting faster value from legacy data comes with perceived challenges: the data is often seen as too hard to access, not scalable, and expensive and complex to work with.
Data engineers also face complexities around security, governance, application programming interfaces (APIs) developed over decades, and data scattered across various message queues. Troubleshooting and resolving access issues across firewalls adds even more complexity.
While modernizing legacy data can seem like a herculean task, data integration capabilities make it far more approachable. There are three key steps to achieve this: liberating the data, connecting it through data repositories and then building the data pipeline.
The first step is creating a comprehensive data dictionary, which involves understanding the structure and organization of the mainframe data, identifying the different data stores, and mapping the data elements to their corresponding applications.
This helps to develop a clear understanding of the data landscape, where agencies can gain insights into data relationships and ensure accurate data integration. From there, agencies can build and configure data pipelines for moving data from legacy applications to target systems.
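The data-dictionary step above can be sketched in code. The following is a minimal, hypothetical illustration (all store and application names are invented for the example), showing how each mainframe data element might be mapped to its data store and consuming applications so that relationships are easy to query:

```python
# Minimal sketch of a data dictionary for mainframe data.
# All data store and application names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class DataElement:
    name: str                     # logical name of the field
    store: str                    # which mainframe data store holds it
    applications: list = field(default_factory=list)  # consuming apps

# The dictionary itself: element name -> metadata about that element.
data_dictionary = {
    "CLAIM_ID": DataElement("CLAIM_ID", "VSAM.CLAIMS",
                            ["claims-intake", "reporting"]),
    "PATIENT_DOB": DataElement("PATIENT_DOB", "DB2.PATIENTS",
                               ["eligibility"]),
}

def applications_using(element_name: str) -> list:
    """Return the applications that consume a given data element."""
    entry = data_dictionary.get(element_name)
    return entry.applications if entry else []
```

A mapping like this lets an agency answer impact questions (for example, which applications break if a data element changes) before any data is moved.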
Pre-built connectors and adapters can also establish seamless data flows and automate the movement of data between systems. This eliminates the need for cumbersome and error-prone custom code while ensuring scalability and reliability. It is also important to consider how these data modernization efforts require robust governance and security measures to protect sensitive information.
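The connector-and-pipeline pattern can be sketched the same way: a source adapter extracts records from the legacy system, a transform normalizes them into the target schema, and a sink adapter loads them. This is a generic illustration of the extract-transform-load flow, not any vendor's actual API, and the record fields are hypothetical:

```python
# Hedged sketch of a connector-based pipeline: extract -> transform -> load.
# The adapters below are stand-ins for pre-built vendor connectors.

def extract(source_records):
    """Source adapter: yield raw records from the legacy system."""
    yield from source_records

def transform(record):
    """Normalize a raw mainframe record into the target schema."""
    return {"id": record["CLAIM_ID"].strip(),
            "amount": float(record["AMT"])}

def load(records, target):
    """Sink adapter: append normalized records to the target store."""
    for r in records:
        target.append(r)

def run_pipeline(source_records, target):
    """Wire the three stages together lazily, record by record."""
    load((transform(r) for r in extract(source_records)), target)

raw = [{"CLAIM_ID": "  C100 ", "AMT": "12.50"}]
warehouse = []
run_pipeline(raw, warehouse)
```

Keeping each stage a small, separately testable function is what lets pre-built connectors replace error-prone custom code: only the transform step is agency-specific, while extraction and loading stay reusable.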
Use cases of faster value coming to life
There are several use cases where bringing faster value from legacy data is possible, as highlighted by Gartner. These span a wide range of mission-focused efforts.
Public health monitoring and response: The integration and analysis of health-related data from multiple sources, such as electronic health records, hospital systems, and disease surveillance systems enables real-time monitoring of public health trends, early detection of disease outbreaks and effective response planning.
Smart city initiatives: The collection, integration and analysis of data from various smart city components, including sensors, internet-of-things devices, transportation systems and utility infrastructure helps to optimize urban planning, traffic management, energy consumption, waste management and public services.
Emergency management and response: Officials can better coordinate response efforts by providing a centralized data platform that enables the integration of data from emergency services, weather forecasting, social media and other sources to facilitate situational awareness, resource allocation and timely decision-making.
Homelessness management: Integration of data from social services agencies, housing providers, health care providers and law enforcement to develop a comprehensive understanding of the factors contributing to homelessness, identify trends and develop targeted interventions.
Industry partnerships drive faster value
Thanks to industry partnerships, agencies can take a 360-degree approach to modernizing mainframes and get faster value from the data. Ultimately, this allows government to accelerate innovation and be at the forefront of the mission.
In addition, state and local governments can move data from the mainframe to dynamic industry platforms so that they can analyze it in new ways, and embrace a truly digital government.
An integrated solution from combined industry partners starts from the same foundation: a comprehensive data dictionary that captures the structure of the mainframe data, its data stores, and the applications that depend on each data element.
Many top industry players are also helping agencies to leverage APIs to analyze operational data. This comes down to accessing analytical data and creating a truly unified solution for building a connected agency enterprise.
Another critical element is that all partners work in concert on solutions that are truly “built for government.” The most ideal integrated solutions have been fully validated by government, whether through authorities to operate or StateRAMP designations.
Fortunately, the days of legacy data sitting locked away in government systems are coming to a close. Thanks to the right industry partnerships, it’s possible to access, share and leverage that data in ways that make digital government a reality. In the end, agencies can better support mission efforts and meet the needs of all customers and constituents.
About the authors:
Bob Jeffcott is a Principal Systems Engineer at Software AG Government Solutions. He has been with Software AG for 25+ years. His focus is state and local government, and he has covered the state of Texas for over 10 years. He has worked with dozens of organizations to modernize and integrate their legacy systems at the federal, state, and local levels.
Brendan Schultz is a Sales Engineer at Snowflake. His focus spans both federal and state and local government. He has helped organizations modernize and build powerful, flexible data platforms to advance their missions.