Editor's Note: This piece was written by Gary Newgaard, Vice President, Public Sector at Pure Storage. The opinions represented in this piece are independent of Smart Cities Dive's views.
Ask average citizens about their biggest frustrations in dealing with government organizations and you’re likely to hear at least a few stories of never-ending lines at the Department of Motor Vehicles (DMV). Bureaucracy and manual processes have, fairly or not, become synonymous with the business of government. They upset constituents, and chances are they don’t help government workers get their jobs done, either.
But the advent of new technologies promises to simplify work for state and local employees. Powered by artificial intelligence (AI) and machine learning, these emerging capabilities have tremendous potential for a smarter and more agile government and public sector as a whole.
Examples of this already exist. Chatbots on agency websites now answer increasingly complex questions and execute customer assistance tasks that previously drained employee resources and hours. It’s meant fewer calls to government agencies and more valuable staff hours available for other tasks.
And those other tasks are being streamlined, too. Paperwork presents the biggest opportunity for cost-cutting across agencies, with state and local employees spending 10% of their time documenting information, according to Deloitte.
The firm’s 2017 report, AI-augmented Government, illustrates the dollars-and-cents potential at both the federal and state level. It projects that "simply automating tasks that computers already routinely do could free up to 96.7 million federal government working hours annually, potentially saving $3.3 billion. At the high end ... AI technology could free up as many as 1.2 billion working hours every year, saving $41.1 billion."
Now, consider that state and local governments throughout the United States employ nearly four times as many people as the federal government, around 8 million, not including the education sector. Deloitte estimates that state governments perform an even higher percentage of tasks suitable for automation, which means the annual savings across the nation could eclipse $100 billion.
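To see how a figure that large falls out of those inputs, here is a minimal back-of-envelope sketch in Python. It simply scales Deloitte’s high-end federal estimate by the relative size of the state and local workforce; the federal headcount and the per-employee scaling approach are illustrative assumptions, not Deloitte’s methodology.

```python
# Illustrative scaling of Deloitte's high-end federal estimate to the state/local
# workforce. The federal headcount below is an assumption; Deloitte's actual model
# works task by task, not per employee.

FEDERAL_HIGH_END_SAVINGS_B = 41.1   # Deloitte high-end federal estimate, $B per year
FEDERAL_EMPLOYEES_M = 2.1           # approximate federal civilian workforce, millions (assumption)
STATE_LOCAL_EMPLOYEES_M = 8.0       # state/local workforce excluding education, millions (from the article)

scaled_savings_b = FEDERAL_HIGH_END_SAVINGS_B * (STATE_LOCAL_EMPLOYEES_M / FEDERAL_EMPLOYEES_M)
print(f"Rough state/local potential: ${scaled_savings_b:.0f}B per year")
# Prints roughly $157B; even with noticeably more conservative per-employee
# assumptions, the total still clears the $100 billion mark cited above.
```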
The potential for AI extends well beyond reducing administrative backlogs. Modernization efforts in various localities continue to attract press coverage, and smart city rollouts are among the biggest trending technology topics.
In Philadelphia, smart water meters monitor usage in real time and provide insight into potential leaks or maintenance needs. In Los Angeles, smart parking meters inform customers of available spots, and street lights dim or turn on based on weather conditions or the presence of pedestrians.
State governments are embracing innovation, as well. Iowa will launch an AI-based system this year that pairs flood-sensor data with AI chatbots to deliver real-time flood forecasts. Sustainable, cost-saving practices that often improve citizen safety and quality of life are arriving every day.
Soon our traffic control systems will interact with self-driving cars. Machine learning will improve public safety with predictions on traffic, crime and even disease outbreaks, and all of this will run on vast stores of data. Communities that want to remain adaptable and future-proof must take steps now to ensure their technology infrastructure will support further innovation.
The journey to AI is not without its hurdles, including navigating ethical questions about black-box algorithms that prevent visibility into learning paths. These challenges and questions, however, can be resolved with the creation of best practices and ethical guidelines. Stagnancy could hold organizations back, but state and local agencies can take meaningful steps today to prime the AI engine. Big data is an important fuel: it’s the bedrock for the real-time analytics needed to promote more efficient use of jurisdictional resources.
While the volume of unstructured data has exploded, the legacy storage built to house that data has not fundamentally changed in decades. Deep learning (DL), graphics processing units (GPUs) and the ability to store and process very large datasets at high speed are fundamental for AI. DL and GPUs are massively parallel, but legacy storage technologies were not designed for these workloads; they were designed in an era with an entirely different set of expectations around speed, capacity and density.
While determining how best to bring AI into their existing operations, organizations should already be collecting and cleaning data in preparation for implementation. Because large, high-quality datasets are a key determinant of a deep learning model’s accuracy, organizations can accelerate their path to production by collecting, indexing and exploring their data ahead of time.
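As a concrete, simplified illustration, the sketch below shows what “collect, index and explore ahead of time” might look like for a batch of agency CSV exports, using Python and pandas. The directory, file names and column names are hypothetical placeholders, not references to any specific system.

```python
# A minimal sketch of collecting, indexing and cleaning raw agency data ahead of
# any model training. Paths and columns are hypothetical.
from pathlib import Path
import pandas as pd

RAW_DIR = Path("data/raw/permit_records")   # hypothetical agency data dump

frames = []
for csv_path in sorted(RAW_DIR.glob("*.csv")):
    df = pd.read_csv(csv_path)
    df["source_file"] = csv_path.name       # keep provenance for every row
    frames.append(df)

records = pd.concat(frames, ignore_index=True)

# Basic cleaning: drop exact duplicates, normalize dates, report missing fields.
records = records.drop_duplicates()
records["filed_date"] = pd.to_datetime(records["filed_date"], errors="coerce")
missing_report = records.isna().mean().sort_values(ascending=False)

print(f"{len(records)} records from {records['source_file'].nunique()} files")
print("Share of missing values per column:")
print(missing_report.head(10))

# Store in a columnar format so later exploration and training read it quickly
# (writing Parquet assumes pyarrow is installed).
records.to_parquet("data/curated/permit_records.parquet")
```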
It is critical to invest today in hardware that is flexible for the future. GPU hardware, along with data preparation and training techniques, is evolving rapidly, and newer, more powerful GPU chips make it possible to train more complex models, which in turn require larger datasets.
Legacy storage has become a bottleneck for municipalities that want to turn big data into a big advantage through real-time intelligence. Within the last two years, the amount of compute required to run bleeding-edge deep learning algorithms has jumped 15-fold. Compute delivered by GPUs has jumped 10-fold. By and large, legacy storage capabilities have remained stagnant.
If data is the new currency for the 21st century, and we are committed to improving communities through AI innovation, we cannot design the systems we need on the technologies of the last century.
We need to begin building the new foundation today with data platforms that are re-imagined from the ground up for the modern era of intelligent analytics. Slow storage means slow machine learning performance. Imagine a marathon runner trying to re-hydrate after the race through a wafer-thin straw: essentially, this is what happens to organizational data run on yesterday’s storage. Ultimately, many of the available insights remain locked away, limiting the intelligence that can be extracted from the stored data.
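To make the straw metaphor concrete, here is a small, hedged calculation: the time to make one pass over a training set is bounded by whichever is slower, the storage reads or the GPUs consuming the data. The dataset size and throughput figures are illustrative assumptions, not benchmarks of any particular product.

```python
# A rough illustration of the "thin straw" effect: epoch time is limited by
# whichever is slower, storage reads or GPU compute. All numbers are assumed.

def epoch_time_hours(dataset_tb, storage_gb_per_s, gpu_ingest_gb_per_s):
    dataset_gb = dataset_tb * 1000
    read_s = dataset_gb / storage_gb_per_s        # time to stream the data once
    compute_s = dataset_gb / gpu_ingest_gb_per_s  # time the GPUs need to consume it
    return max(read_s, compute_s) / 3600

# Example: a 50 TB training set, GPUs able to consume ~8 GB/s.
legacy = epoch_time_hours(50, storage_gb_per_s=0.5, gpu_ingest_gb_per_s=8)   # disk-era array
modern = epoch_time_hours(50, storage_gb_per_s=10.0, gpu_ingest_gb_per_s=8)  # parallel flash

print(f"Storage-bound epoch on legacy array: {legacy:.1f} h")   # ~27.8 h, GPUs mostly idle
print(f"Compute-bound epoch on fast storage: {modern:.1f} h")   # ~1.7 h, storage keeps up
```

With these assumed inputs, the same GPUs sit idle behind the disk-era array for most of a day, but finish the pass in under two hours once storage can keep them fed.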
Several key characteristics define data platform and storage requirements for the cloud era:
- Silicon-optimized versus disk-optimized storage, to support gigabytes/second of bandwidth per application. The performance of SSD technology exceeds that of hard disk drive-based storage many times over.
- A highly parallel application architecture that can support thousands to tens of thousands of composite applications sharing petabytes of data, rather than tens to hundreds of monolithic applications each consuming terabytes of data siloed to that application (a rough sizing sketch follows this list).
- Elastic scale to petabytes that allows organizations to pay as they grow, with perpetual forward compatibility.
- Full automation to minimize management resources required to maintain the platform.
- The ability to support and span multiple cloud environments from core data centers to edge data centers, as well as across multi-cloud infrastructure-as-a-service (IaaS) and software-as-a-service (SaaS) providers.
- An open development platform versus a closed ecosystem built on complex one-off storage software solutions.
- A subscription-consumption model that supports constant innovation and eliminates the endless churn of expanding storage to meet growing needs and refreshing hardware every three to five years.
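As referenced above, here is a rough sizing sketch for the “highly parallel” requirement. The application count, per-application bandwidth and concurrency figures are hypothetical, chosen only to show the scale of aggregate throughput a shared platform would need to sustain.

```python
# Hypothetical sizing sketch: aggregate bandwidth a shared data platform must
# sustain when thousands of applications share the same petabytes of data.

apps = 5000                 # composite applications sharing the platform (assumption)
gb_per_s_per_app = 0.5      # modest sustained bandwidth per application (assumption)
concurrency = 0.10          # fraction of applications active at any moment (assumption)

aggregate_gb_per_s = apps * gb_per_s_per_app * concurrency
print(f"Aggregate sustained bandwidth: {aggregate_gb_per_s:.0f} GB/s")
# Even at 10% concurrency, that is ~250 GB/s of shared bandwidth, well beyond
# what a disk-optimized array built for tens of siloed applications can deliver.
```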
State and local governments have an immense opportunity to leverage technology for significant cost savings and a better world for citizens. AI and other next-gen technologies offer an important path forward, and it’s vital to begin the journey today with a data foundation designed to accelerate adoption and impact.