Innovation at Scale with AI is a team sport

Alessio Bagnaresi, AI & Advanced Analytics Director, Microsoft EMEA

Artificial intelligence initiatives are maturing across organizations, with a growing number of use cases ready to add unprecedented value to the business. However, innovation stakeholders across many enterprises continue to face significant challenges in moving these use cases to production. In this article, my aim is to offer some tips on how to turn investments in AI into business value by accelerating model operationalization.

As I meet with many organizations, I see recurring challenges in getting IT, developer, and data science teams to work together to operationalize end-to-end data, application, and ML model pipelines. While building accurate models is critical, it is even more critical to sustain that accuracy over time by putting proper governance in place for data preparation, operationalization, and integration with core applications, leveraging existing core infrastructure. Indeed, security and privacy are common concerns when integrating AI with both systems of record and systems of engagement, and they typically become blockers when moving use cases from pilot to production.

While traditional DevOps procedures cater for code changes, the data science world is more dynamic by nature, with code, models, and data all changing. This poses an operational burden that must be addressed by interdisciplinary teams, not just the data scientists. Typically, the challenge lies not only in building solutions to automate end-to-end ML pipelines (including data preparation, model building, deployment, and lifecycle management) but also in defining stakeholder responsibilities and accountabilities.
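To illustrate why ML pipelines must react to more than code changes, the sketch below fingerprints code, data, and model configuration together so that a change in any one of them can trigger retraining. This is a minimal, hypothetical example (the file names and contents are invented), not the API of any particular MLOps platform:

```python
import hashlib

def fingerprint(artifacts: dict) -> str:
    """Hash code, data, and config artifacts together: a change in any one changes the digest."""
    digest = hashlib.sha256()
    for name in sorted(artifacts):          # sorted so the digest is order-independent
        digest.update(name.encode())
        digest.update(artifacts[name])
    return digest.hexdigest()

def needs_retraining(artifacts: dict, last_fingerprint: str) -> bool:
    """Retrain whenever code, data, or configuration has changed since the last run."""
    return fingerprint(artifacts) != last_fingerprint

# Hypothetical artifact contents; in practice these would be read from a repo and a data lake.
run_1 = {"train.py": b"def train(): ...",
         "data.csv": b"x,y\n1,2\n",
         "params.json": b'{"lr": 0.01}'}
baseline = fingerprint(run_1)

run_2 = dict(run_1, **{"data.csv": b"x,y\n1,2\n3,4\n"})  # new data arrived, code unchanged

print(needs_retraining(run_1, baseline))  # False: nothing changed
print(needs_retraining(run_2, baseline))  # True: the data changed, so the pipeline should rerun
```

The point of the sketch is that the trigger watches data and configuration with the same rigor traditional DevOps applies to code.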

Furthermore, organizations should modernize their data estate if the aim is to innovate rapidly with ML/AI. This typically means building a unified, modular, elastic, scalable, secure, and well-governed data platform supporting a wealth of use cases across the entire analytical spectrum: descriptive, diagnostic, predictive, and prescriptive. Breaking down data silos and storing big data and the relevant metadata in a modern data platform enables aggregation, enrichment, discovery, exploration, and modeling. The resulting high-quality “data products” become the fuel that ignites ML engines.

Moreover, establishing a DataOps practice is a critical enabler to ensure that the right data is in the hands of the right stakeholder at the right time, with the appropriate level of access control, metadata, and governance. Indeed, a DataOps practice is the precursor of MLOps because it allows teams to build multi-stage data pipelines encompassing data collection, data wrangling, feature engineering, and data segregation. All of these stages are then needed by MLOps for its subsequent stages.
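The four stages above can be sketched as a chain of small, composable steps. This is a deliberately minimal illustration in plain Python with invented records, assuming a simple tabular use case rather than any real DataOps toolkit:

```python
import random

def collect():
    """Data collection: stand-in for ingestion (a real pipeline would pull from source systems)."""
    return [{"age": 25, "income": 40000},
            {"age": 40, "income": 85000},
            {"age": None, "income": 52000},   # a dirty record
            {"age": 33, "income": 61000}]

def wrangle(rows):
    """Data wrangling: here, simply drop records with missing values."""
    return [r for r in rows if all(v is not None for v in r.values())]

def engineer(rows):
    """Feature engineering: derive a ratio feature from the raw columns."""
    return [dict(r, income_per_year=r["income"] / r["age"]) for r in rows]

def segregate(rows, train_frac=0.7, seed=0):
    """Data segregation: deterministic train/test split, reproducible for downstream MLOps."""
    shuffled = rows[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

clean = engineer(wrangle(collect()))
train, test = segregate(clean)
print(len(train), len(test))  # 2 1
```

Each stage hands a well-defined artifact to the next, which is exactly what allows MLOps to pick up the segregated datasets as its starting point.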

Once the data is ready to be consumed by ML engines thanks to DataOps, implementing a model management capability alongside a DevOps toolkit is the catalyst to automate and streamline model development, training, validation, testing, deployment, and lifecycle management. This capability typically needs to consider hybrid cloud deployment models, as organizations often pursue use cases involving online inferencing at the edge (such as quality control, workplace health and safety, and predictive maintenance, to name a few). I therefore encourage organizations to establish an MLOps practice that performs model tracking, lineage, and versioning; continuously evaluates model accuracy (which often degrades due to data drift); and monitors model usage, model interpretability, and fairness in training datasets (limiting bias as much as possible).
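One concrete way to watch for the data drift mentioned above is to compare the distribution of a feature in live traffic against its training baseline. The sketch below uses the population stability index (PSI), a common drift heuristic; the sample values, the bin count, and the 0.2 alert threshold are illustrative assumptions, not a feature of any specific platform:

```python
import math

def psi(expected, actual, bins=4):
    """Population Stability Index between a training baseline and a live sample of one feature."""
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]  # equal-width bin boundaries

    def bucket_fractions(values):
        counts = [0] * bins
        for v in values:
            counts[sum(v > e for e in edges)] += 1
        # small floor avoids log(0) when a bucket is empty
        return [max(c / len(values), 1e-4) for c in counts]

    e, a = bucket_fractions(expected), bucket_fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [10, 12, 11, 13, 12, 11, 10, 13]   # feature values seen at training time
live_ok = [11, 12, 10, 13]                    # live traffic, same distribution
live_shifted = [20, 22, 21, 23]               # live traffic after an upstream change

# A common rule of thumb treats PSI > 0.2 as meaningful drift worth an alert.
print(psi(baseline, live_ok))        # low: no action needed
print(psi(baseline, live_shifted))   # high: schedule retraining / investigation
```

A check like this can run on a schedule against inference logs, turning "continuously evaluate model accuracy" from a principle into an automated gate.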

Many robust ML/AI platforms commercially available today provide all the aforementioned capabilities, but they also need to integrate smoothly with other platforms in order to stitch together DataOps, MLOps, and App DevOps into a truly end-to-end “AI factory” where ML models are infused into downstream apps and business processes.

Building cohesive interdisciplinary teams of data scientists, data engineers, ML engineers, app developers, enterprise architects, and subject matter experts, and defining well-thought-out, end-user-success-driven KPIs and operational SLAs among all these actors, is the real game changer here and can help organizations innovate faster and gain a competitive edge. In fact, the commercial ML/AI platforms on the market are now ready to help in this endeavor thanks to their built-in collaboration capabilities. Now it is about exploiting these capabilities by embracing the cultural shift needed to promote cross-team collaboration and by establishing enabling processes and procedures.

To conclude, innovating at scale is critical for organizations to cut costs, create new revenue streams, and ensure sustainability. ML/AI can help organizations of every size stay relevant, grow, and build a strong reputation, but only if they take timely measures to bring this technology to mainstream adoption, both internally and for the benefit of their external stakeholders. A platform integrating DataOps, MLOps, and App DevOps, along with cross-team collaboration, can ultimately turn out to be the secret sauce.