Unlocking the potential to transform economies by improving time-to-insight

Ray Falcione Jr., Vice President of Federal Business, OmniSci

The past decade has seen rapid change, brought about by the accelerating digitisation of the world economy. Across Asia Pacific, people, businesses and even the public sector are collecting more data more rapidly than ever before.

In Singapore, digital transformation and Smart Nation initiatives in the public sector have been a key focus, with the Digital Government Blueprint (DGB) outlining plans to re-engineer the citizen experience by leveraging data, connectivity and computing.

This has only raised citizen expectations, requiring government agencies to plan for the long term, predict trends and account for future scenarios that could positively or negatively affect a nation’s economy.

However, this is a challenge when entities struggle to join their most valuable databases. Keeping data segregated is expensive and inhibits the integrated insights that good decision-making depends on. Government agencies must constantly look for new ways to connect siloed information so they can make better decisions, reduce costs and improve time-to-value.

When critical questions need answering, the typical approach is to analyse each silo of data individually. That can be acceptable when the results can wait hours or days, but today, and for government agencies in particular, that is rarely the case. In fact, data can go stale before a query even completes.

Slow, disjointed data analysis presents two kinds of difficulty. The first is time: in many situations the quality of a decision degrades quickly when insight is delayed. There is a big difference between an answer that arrives in minutes and one that arrives in half a second, and those small deltas can make an immense difference when critical services, and even lives, are on the line.

The other challenge is the curiosity factor. In conventional analytics, the freedom to be curious is time-consuming and costly; simply asking a question may require a large effort to prepare the data and reserve the hardware. Yet if decision-makers can ask one question, get the answer, and then ask three more questions based on that first answer without paying a time penalty, they have the opportunity to be significantly more curious.

Agencies typically under-appreciate or overlook this benefit until they’ve experienced it. Without speed and agility, agencies can end up in a rut, asking the same stale questions over and over again because the time/value equation never improves.

A new era in analytics

Better technology, however, is beginning to break down silos and significantly bend the time curve. New platforms built on parallel processing architectures deliver insights at the speed of curiosity. When CPUs and GPUs can ingest and interrogate billions of rows of data per second through familiar interfaces such as JDBC, SQL and Kafka, the possibilities are transformative, not only in speed but in the scope of inquiry.
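To make that concrete, the sketch below shows what querying such a platform can look like from Python. It assumes pymapd, the Python client OmniSci publishes, and a local server with default credentials; the vehicle_telemetry table and its columns are invented for illustration.

```python
# A minimal sketch: connecting to a GPU-accelerated SQL engine and
# running an ordinary aggregate. Assumes OmniSci's pymapd client and
# a local server with default credentials; the table and columns
# are invented for illustration.
from pymapd import connect

con = connect(user="admin", password="HyperInteractive",
              host="localhost", dbname="omnisci")

cur = con.cursor()
cur.execute("""
    SELECT dropoff_region, COUNT(*) AS trips
    FROM vehicle_telemetry
    WHERE pickup_time >= '2020-01-01'
    GROUP BY dropoff_region
    ORDER BY trips DESC
    LIMIT 10
""")
for region, trips in cur:
    print(region, trips)
```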

With parallel processing it’s possible to reset the analytics paradigm by merging conventional data analytics with spatio-temporal analysis. The result is a visualised, map-based analytics environment that provides democratised insight to everyone, regardless of technical background.
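In practice, that merger can be a single query mixing an ordinary time filter with a geospatial predicate. A minimal sketch, under the same assumptions as above: ST_DWithin and ST_GeomFromText are standard geospatial SQL functions, while the geotagged_events table is hypothetical.

```python
# One query that blends a time window with a distance predicate
# around central Singapore. Assumes pymapd and a local OmniSci
# server, as above; the geotagged_events table is hypothetical.
from pymapd import connect

con = connect(user="admin", password="HyperInteractive",
              host="localhost", dbname="omnisci")
cur = con.cursor()
cur.execute("""
    SELECT DATE_TRUNC(DAY, event_time) AS event_day, COUNT(*) AS events
    FROM geotagged_events
    WHERE event_time BETWEEN '2020-03-01' AND '2020-03-31'
      AND ST_DWithin(location,
                     ST_GeomFromText('POINT(103.8198 1.3521)'),
                     0.05)  -- roughly 5 km, expressed in degrees
    GROUP BY 1
    ORDER BY 1
""")
for event_day, events in cur:
    print(event_day, events)
```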

Parallel processing removes the need for traditional data-manipulation workarounds such as pre-indexing, down-sampling and pre-aggregation. It combines this power with visualisation capabilities that let users not only view data geographically, but also drill down, pivot and cross-filter in real time, regardless of scale or data size.
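The cross-filter mechanic is easy to sketch. In the hypothetical pattern below, each chart’s current selection becomes a WHERE clause that every other chart appends to its own aggregate, so every interaction re-scans the raw table rather than consulting a pre-built cube. All names are illustrative.

```python
# Hypothetical sketch of cross-filtering: a chart re-runs its
# aggregate with every *other* chart's selection applied as a
# filter, straight against the raw table (no cube, no roll-up).
def chart_query(select_sql: str, filters: dict, own_name: str) -> str:
    # A chart applies every filter except the one it generated itself.
    clauses = [sql for name, sql in filters.items() if name != own_name]
    where = " WHERE " + " AND ".join(clauses) if clauses else ""
    return select_sql + where + " GROUP BY 1"

active_filters = {
    "map": ("ST_Contains(ST_GeomFromText('POLYGON((103.6 1.2, 104.0 1.2,"
            " 104.0 1.5, 103.6 1.5, 103.6 1.2))'), location)"),
    "timeline": "event_time BETWEEN '2020-03-01' AND '2020-03-31'",
}

# Rebuild the severity histogram after a lasso selection on the map:
print(chart_query("SELECT severity, COUNT(*) FROM incidents",
                  active_filters, own_name="histogram"))
```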

What’s more, datasets — even joined internal and external datasets — can be cross-indexed for greater understanding. Comparing data from multiple sources allows users to assess what is happening not just within their organisation, but also with other entities and even the world at large.
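As a hedged illustration of that cross-source comparison, the sketch below joins a hypothetical internal service-request table to an equally hypothetical external weather feed, over the same connection and interface as the earlier examples.

```python
# An internal table joined to externally sourced data over the same
# SQL interface. Assumes pymapd and a local server, as above; both
# schemas are invented for illustration.
from pymapd import connect

con = connect(user="admin", password="HyperInteractive",
              host="localhost", dbname="omnisci")
cur = con.cursor()
cur.execute("""
    SELECT r.region,
           COUNT(*)           AS requests,
           AVG(w.rainfall_mm) AS avg_rainfall
    FROM service_requests AS r
    JOIN external_weather AS w
      ON r.region = w.region
     AND CAST(r.created_at AS DATE) = w.observation_day
    GROUP BY r.region
    ORDER BY requests DESC
""")
for region, requests, rainfall in cur:
    print(region, requests, rainfall)
```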

Benefits across government

Parallel processor-based analytics platforms support more than just faster querying. They are also able to render hundreds of millions of points, lines and polygons in mere milliseconds, then make them filterable and interactive. This kind of geospatial interrogation has never before been achievable.

In the past, analysts either had to down-sample data to accommodate the limited power of their platform or deliberately look at only a small portion of a map. If health officials studying the COVID-19 pandemic, for instance, wanted to track patterns of life for people who had been geolocated with an infected person, they were stuck with a terrible choice: look at one small area of a rendering, or wait for hours.

Even further, if analysts wanted to measure levels of social distancing, contact tracing and travel or activity restrictions across specific geographies, then correlate those findings with the rate of COVID-19 spread in each region to predict spread from movement behaviour, those results simply weren’t readily available.

Today, however, they are, thanks to several analytics and data companies working in task forces with non-profit organisations, private industry, academic institutions and the federal government. Their efforts are helping health officials look at data in new ways to address the immediate needs of the global community.
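The correlation step itself is a small computation once movement and case figures have been aggregated per region. A minimal sketch, assuming pandas, with figures invented purely for illustration:

```python
import pandas as pd

# Invented per-region aggregates: how much movement fell versus a
# pre-pandemic baseline, and the observed daily growth rate of cases
# over the same window. Real figures would come from the joined
# datasets described above.
df = pd.DataFrame({
    "region":             ["A", "B", "C", "D", "E"],
    "mobility_reduction": [0.62, 0.35, 0.71, 0.48, 0.55],
    "case_growth_rate":   [0.04, 0.11, 0.03, 0.08, 0.06],
})

# Does less movement track with slower spread in these figures?
print(df["mobility_reduction"].corr(df["case_growth_rate"]))
```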

Advanced analytics also makes it possible to democratise the pursuit of curiosity and insight in every corner of an organisation. No longer are people outside the data team locked into a limited selection of static pie charts or stale histograms; now those same decision-makers can explore geospatially mapped, cross-filtered data on their own. Such agility not only opens data to end-users in a self-serve environment, but also frees data scientists and analysts to pursue the deeper, predictive, machine learning work they were probably hired to do.

When an agency’s collections of data are finally connected, deeper insights are possible; and when the time it takes to get those insights is no longer an issue, better decision-making becomes the norm. Today, everyone can access the right data from the right sources, explore them in the right way and obtain answers never before possible — at the time those answers are needed. Silos may still be necessary — but with the new analytics technologies available, their contents are sprouting fields of knowledge like never before.