– By Executive Director & CTO, Data & AI Services, Kyndryl India
The importance of harnessing value from data and artificial intelligence at scale, and their role in any digital transformation, cannot be overstated. However, the questions enterprises and CxOs are grappling with are: can we keep depending on data ninjas for data wrangling and deriving insights? Can we make data consumption and insight generation agile, secure and simple? Are the insights generated of only marginal value?
In most large enterprises, we have multiple data silos, petabytes of data distributed across disparate data platforms, data streaming in from multiple edge locations, thousands of data pipelines, and hundreds of different ML models deployed. There is also the question of whether data is an asset or increasingly a liability: huge amounts of time are lost getting to the most recent version of the truth, establishing lineage, determining data quality, and making the data ready for consumption. The challenge is compounded by the fact that best-in-class data engineering skills, combined with domain expertise, are very scarce.
From our observations working with various clients and managing complex environments, there are a few super-catalysts that accelerate time to insight, helping organizations gain business agility and unlock value from data.
DataOps: brings reliability and intelligence into the data lifecycle. Implementing it can be complex, and at the same time most rewarding. Automation and orchestration can be applied from the moment data is ingested, in batch or in real time, through transformation and on to the point of consumption. The time to insight can be significantly reduced, and, more importantly, enterprises can trust their decision support systems to be fail-safe, to ride on top of disparate data, and to maintain the right compliance posture. Enterprises struggle to get a consolidated view of platform and pipeline health, provide the necessary data governance, and at the same time give data consumers curated feature stores and data products. This becomes possible by gleaning all of the business, technical and operational metadata from the data being managed. Once all of that metadata is aggregated, you can not only manage data operations seamlessly, but also bring observability, predictability and intelligent automation to data lifecycle operations.
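To make the observability idea concrete, here is a minimal sketch of how aggregated operational metadata can drive anomaly detection on pipeline health. The `PipelineRun` fields, the pipeline name and the z-score threshold are illustrative assumptions, not any specific product's schema:

```python
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class PipelineRun:
    pipeline: str       # hypothetical pipeline identifier
    duration_s: float   # wall-clock run time in seconds
    rows_out: int       # rows delivered downstream

def is_anomalous(history, latest, z_threshold=3.0):
    """Flag the latest run if its duration deviates strongly from
    the pipeline's historical baseline (simple z-score check)."""
    durations = [r.duration_s for r in history]
    mu, sigma = mean(durations), stdev(durations)
    if sigma == 0:
        return False
    return abs(latest.duration_s - mu) / sigma > z_threshold

# Five healthy historical runs, then one suspiciously slow run.
history = [PipelineRun("orders_daily", d, 1_000_000) for d in (60, 62, 58, 61, 59)]
slow = PipelineRun("orders_daily", 240, 1_000_000)
print(is_anomalous(history, slow))  # → True: trigger an operations alert
```

In a real estate of thousands of pipelines, the same pattern is applied per pipeline and per metric (row counts, freshness, failure rates), with the metadata store as the single source feeding both dashboards and automated remediation.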
MLOps: provides a solution framework for monitoring and managing the enterprise AI models that are in production. For most ML models in production, there is no unified view of whether the models are continuing to deliver consistent results or outcomes for the business. Apart from monitoring for model decay, running A/B tests and flagging the need for model re-training, MLOps helps with establishing lineage, providing explainability and detecting bias, all of which are critical to ensuring trust in decision automation and decision support systems. In many industry verticals, this is becoming an essential regulatory requirement. MLOps also extends to integrating AI/ML-based insights with regular business workflows. Last-mile consumption of insights at scale and in production implies that data is accessed securely, that privacy controls (data masking, anonymization or encryption) are in place, and that the data and ML insight pipelines have CI/CD integration with the downstream applications or microservices that business services are composed of.
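One common way to monitor for model decay is to track drift in the model's input features against their training-time distribution. Below is a minimal sketch using the Population Stability Index (PSI); the data, bin count and the conventional 0.25 "retrain" threshold are illustrative assumptions:

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a training-time (expected)
    and a production (actual) sample of one numeric feature."""
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0

    def frac(values, b):
        left = lo + b * width
        right = left + width
        n = sum(left <= v < right or (b == bins - 1 and v == hi) for v in values)
        return max(n / len(values), 1e-6)  # floor avoids log(0) on empty bins

    return sum((frac(actual, b) - frac(expected, b)) *
               math.log(frac(actual, b) / frac(expected, b))
               for b in range(bins))

train = [i / 100 for i in range(100)]               # feature seen at training time
live_shifted = [0.5 + i / 200 for i in range(100)]  # production mass moved right
print(psi(train, train) < 0.1)          # identical distributions: stable
print(psi(train, live_shifted) > 0.25)  # heavy drift: candidate for re-training
```

An MLOps platform runs checks like this continuously per feature and per model, alongside outcome-level metrics, so that re-training is triggered by evidence rather than a fixed calendar.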
Data & AI FinOps: provides a solution framework to monitor spend and resource consumption (across greenfield data environments, forays into cloud data platforms, and the legacy data estate), identify redundancy in data pipelines and resource-intensive queries, and map spend to business outcomes. It is very common to encounter runaway costs after migrating to cloud platforms: cloud PaaS consumption can overshoot the original estimates, besides running into previously unplanned costs for data egress or ingress. With a fit-for-purpose FinOps framework, you can monitor end-to-end lifecycle processing costs, set thresholds, and put an alerting mechanism in place to trigger preventive actions when resource consumption thresholds are breached or anomalies are detected. Simply put, FinOps begins to address: what are we paying for? Business insight isn't about restating the obvious. It's a leap in understanding. A good insight can reframe your business model and change your convictions about what matters to the business.
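The threshold-and-alert mechanism can be sketched in a few lines. The budget figures and the 80%/100% warn/breach levels below are purely illustrative, not a FinOps standard:

```python
def check_spend(daily_costs, budget, warn_at=0.8):
    """Return an alert level for cumulative spend against a period budget.
    warn_at=0.8 means a warning fires at 80% of budget (illustrative)."""
    spent = sum(daily_costs)
    if spent >= budget:
        return "breach"   # trigger preventive action, e.g. pause non-critical jobs
    if spent >= warn_at * budget:
        return "warn"     # notify platform owners before the budget is exhausted
    return "ok"

print(check_spend([1200, 1500, 900], budget=10_000))  # → "ok" (3,600 spent)
print(check_spend([4000, 4500], budget=10_000))       # → "warn" (8,500 ≥ 80%)
```

The same check, pointed at per-pipeline or per-query cost streams rather than a single total, is what lets a FinOps practice attribute spend to business outcomes instead of discovering overruns in the monthly invoice.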
Backed by these powerful solutions, you can unleash the value of the data your enterprise holds, sustainably and at scale. Business data workers can then focus on exploiting data and insights and discovering dark data, rather than chipping away at the proverbial tip of the iceberg (i.e., the data that is visibly available). The agility and simplification these solutions introduce move your enterprise from being overwhelmed with "integrating and processing" data to a future-proof vision of self-service, driverless data consumption.
Disclaimer: Content Produced by ET Edge