Data, Insights, Action!

Generative AI and Large Language Models (LLMs) are unquestionably the most talked-about subjects at the moment - they’re sucking all the oxygen from the room. But do you know what the biggest challenge with this AI revolution is? Building an effective data management value chain that can deliver powerful, game-changing benefits. For AI to thrive, organizations will need to tackle data obstacles and fix flawed data, implementing principles to manage, cleanse, and enhance it effectively, thereby enabling the realization of broader AI aspirations.

Forward-looking, data-driven companies are bringing in a product mindset, managing data like a product across its entire life cycle. Nearly one-third of executives identified data-related challenges among the top three concerns hampering their company’s AI initiatives. A lot of true disruption is coming, mostly around end-user experience and how people interact with technology. Based on what we’re seeing, it’s not just hype: the dramatic change everyone’s talking about is real. Generative AI and related technologies will affect productivity, job roles, and responsibilities. They will aid creative processes and create entirely different experiences.

The simple diagram of data -> insight -> action is actually very hard. It may look like three steps, but it can really take 10 steps to get from data to insight, and 20 more to get from insight to action. Generative AI will help reduce the steps for the end user, for greater efficiency and velocity. Since the introduction of the iPhone, the extent to which we engage with data and applications has increased significantly, fundamentally altering our daily routines. The emergence of generative AI is poised to bring about a comparable transformation, albeit at a much-accelerated pace.

“The idea of AI disruption is very real”

There is a need for the “democratization of data”. Natural language interfaces allow business decision-makers to dive deep into data that previously required the help of gatekeepers such as data scientists, business analysts, and other highly technical experts.
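To make that concrete, here is a minimal sketch of such a natural-language interface, assuming the OpenAI Python SDK and a local SQLite database; the table schema, model name, and helper functions are illustrative assumptions, not a production design.

```python
# A minimal sketch of a natural-language interface over a database.
# Assumptions: the OpenAI Python SDK (>=1.0) is installed and OPENAI_API_KEY
# is set; the table, columns, and model name are illustrative only.
import sqlite3
from openai import OpenAI

client = OpenAI()
SCHEMA = "orders(order_id INTEGER, region TEXT, amount REAL, order_date TEXT)"

def question_to_sql(question: str) -> str:
    """Ask the model to translate a business question into a single SQL query."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": f"Translate the user's question into one SQLite query "
                        f"against this schema: {SCHEMA}. Return only SQL."},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content.strip()

def ask(question: str, db_path: str = "warehouse.db"):
    """End-to-end: question -> SQL -> rows. A human should still review the SQL."""
    sql = question_to_sql(question)
    with sqlite3.connect(db_path) as conn:
        return sql, conn.execute(sql).fetchall()

# Example: ask("What was total revenue by region last quarter?")
```

This is what lets a decision-maker query data directly; the gatekeeping shifts from writing every query to reviewing the generated ones.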

As data sources proliferate, integrating and harmonizing disparate datasets from various sources become increasingly complex. Ensuring interoperability and consistency across different data formats, structures, and systems is essential for effective data analysis and decision-making. Having a data strategy in place is an initial indicator of being product-led, yet lacking a single source of data makes maintaining data quality harder. Issues such as incomplete, inaccurate, or inconsistent data can significantly impact a company’s strategic decision-making capabilities. From a resource perspective, too much employee time and associated expense go toward managing and preparing data for one-off analyses.

I remember when I was working as a Senior Technical Product Manager for Infrastructure and Internal Tools at a mobile banking company, my main role was to do gap analysis and assess the need for the multiple internal tools the company already had in place. There were a couple of marketing/acquisition growth tools like AppsFlyer and CleverTap, along with various BI tools like Amplitude, Looker, and Power BI. We were also collecting data from Salesforce and managing some reports in spreadsheets. The biggest challenge was data integration and data interoperability.
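To illustrate what that harmonization work looks like at a small scale, here is a sketch in pandas; the file names, column names, and mappings are hypothetical stand-ins for real tool exports.

```python
# A small sketch of harmonizing two hypothetical exports before analysis.
# File names, columns, and mappings are illustrative, not a real integration.
import pandas as pd

# Source A: a marketing-tool export with camelCase columns.
a = pd.read_csv("appsflyer_export.csv")    # e.g. userId, installDate, mediaSource
# Source B: a BI-tool export with snake_case columns.
b = pd.read_csv("amplitude_export.csv")    # e.g. user_id, first_seen, channel

# Step 1: normalize schemas to one naming convention.
a = a.rename(columns={"userId": "user_id",
                      "installDate": "first_seen",
                      "mediaSource": "channel"})

# Step 2: normalize formats (dates, categorical labels).
for df in (a, b):
    df["first_seen"] = pd.to_datetime(df["first_seen"], errors="coerce")
    df["channel"] = df["channel"].str.lower().str.strip()

# Step 3: merge into one consistent view, keeping both sides for review.
merged = a.merge(b, on="user_id", how="outer", suffixes=("_mkt", "_bi"))
print(merged.isna().mean())  # quick per-column completeness check
```

Each tool in the stack adds another round of this renaming, reformatting, and reconciling, which is exactly the overhead that piles up as sources multiply.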

After working with big data for several years, I’ve realized that one doesn’t really need a lot of fancy tools; having one single tool is enough. Centralizing data into a single source of truth has the potential to cut by 50% the overhead costs stemming from the upkeep of numerous data repositories and vendor partnerships. It also gives decision-makers access to reliable, up-to-date information, enabling them to make informed decisions quickly and confidently. And it facilitates smooth scaling as data volume and complexity rise.
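As a toy illustration of the single-source-of-truth idea, the sketch below publishes a harmonized table into one store that every downstream tool reads; SQLite and the names used here are stand-ins for a real warehouse.

```python
# A minimal sketch of consolidating harmonized data into one shared store.
# SQLite stands in for a real warehouse; table and database names are illustrative.
import sqlite3
import pandas as pd

def publish(df: pd.DataFrame, table: str, db_path: str = "warehouse.db") -> None:
    """Replace the table so every consumer reads the same, latest snapshot."""
    with sqlite3.connect(db_path) as conn:
        df.to_sql(table, conn, if_exists="replace", index=False)

# publish(merged, "user_acquisition")  # one table, one source of truth
```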

As the experts put it, “The generative AI era does not call for a fundamental shift in data strategy. It calls for an acceleration of the trend toward breaking down silos and opening access to data sources wherever they might be in the organization.” And if you don’t accelerate, you risk being left behind, virtually overnight.

For years, companies have been encouraged, or even warned, to formulate an extensive, future-oriented data strategy. Just as a growing number of businesses were meeting that bar, advancements in AI threaten to make last year’s strategy obsolete. Fortunately, the experts are unanimous: if you’ve already put in the work to create a solid data strategy, you’re on the right track.

I think the key is to continue to focus on your core needs, the things that keep you up at night, and invest in those. That means how you manage and govern your data becomes even more important. Today’s analysts generally create and present a canned report, following up with new queries when an executive has a specific supplemental question.

With advanced use of generative AI, execs can expect to interact directly with the data summarized in that overview report. This self-service will free analysts to work on deeper questions, bringing their own expertise to what the organization really should be analyzing. For AI-fueled organizations, data becomes a resource that sparks innovation and competitive advantage.
