
Would you like to receive AI Weekly in your inbox every Thursday for free? Sign up here.
My AI-related news feeds this week were filled with oil-painting-style images of fuzzy pandas wearing leather jackets and riding skateboards, thanks to Google Research’s new Imagen diffusion model. I’ll admit it: I long to pet the uber-realistic corgi in sunglasses riding his bike in New York’s Times Square.
I’ve also been down the Twitter rabbit hole, from reports that the controversial Clearview AI facial recognition software is being rolled out in private businesses and schools, to more memes, threads and emoji-filled posts about the “will-AGI-come-soon-or-won’t-AGI-ever-come” debate that I highlighted last week.
However, I’ve chosen to focus this issue of AI Weekly on something very timely and applicable to companies of all shapes and sizes working to implement AI in their organizations: the “last mile” problem of AI deployment.
Oh, and the Transform 2022 agenda is now live.
Let’s dive in.
— Sharon Goldman, Editor-in-Chief and Author
Twitter: @sharongoldman
Can the “last mile” AI problem be solved?
Since joining VentureBeat six weeks ago to cover the AI Beat, few stats have been repeated to me more than a version of “the vast majority of AI and machine learning projects fail.”
Whether that number is reported as 80%, 85%, or 90%, it seems clear that the biggest problem is getting AI and machine learning projects from pilot to production. This is known as the “last mile” problem. The term comes from the supply chain industry’s famous “last mile”, which is described as the highly complex final leg in the journey of people and packages from hubs to their final destinations.
This week, I asked some AI vendors, executives, and practitioners for their thoughts on how they view the “last mile” issue and how organizations can address this challenge.
Enterprises need to develop model deployment fluency
“The challenge is that many companies simply lack the data engineering, data science and MLops expertise required to properly build a model, deploy it in a given system or environment (cloud or on-premises), and run it there. Organizations need to learn how to deploy models in different environments, how to run those models, and how to monitor them for drift and related issues. That doesn’t mean they have to hire legions of technical talent. Rather, they can acquire the basic organizational skills and fluency and leverage emerging players who will shoulder much of that complexity and offer MLaaS, or machine learning as a service.”
– Edward Scott, CEO, ElectrifAi
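Monitoring a deployed model for drift, as Scott describes, often starts with something very simple: comparing the distribution of live inputs (or scores) against the training distribution. As a purely illustrative sketch — not ElectrifAi’s tooling, and with function names, thresholds and data invented for the example — here is a population stability index (PSI) check in plain Python:

```python
import math
import random

def psi(reference, current, bins=10):
    """Population Stability Index: a common, simple drift score.

    Bins both samples using the reference's quantile edges, then sums
    (p_cur - p_ref) * ln(p_cur / p_ref) over the bins. A rough rule of
    thumb: < 0.1 stable, 0.1-0.25 moderate drift, > 0.25 significant.
    """
    ref = sorted(reference)
    # Quantile-based bin edges taken from the reference distribution.
    edges = [ref[int(len(ref) * i / bins)] for i in range(1, bins)]

    def proportions(sample):
        counts = [0] * bins
        for x in sample:
            idx = sum(1 for e in edges if x > e)  # which bin x falls in
            counts[idx] += 1
        # Small epsilon so an empty bin doesn't blow up the logarithm.
        return [(c + 1e-6) / (len(sample) + bins * 1e-6) for c in counts]

    p_ref = proportions(reference)
    p_cur = proportions(current)
    return sum((c - r) * math.log(c / r) for r, c in zip(p_ref, p_cur))

# Synthetic example: training-time scores vs. two batches of live scores.
random.seed(0)
train_scores = [random.gauss(0.0, 1.0) for _ in range(5000)]
live_same = [random.gauss(0.0, 1.0) for _ in range(5000)]     # no drift
live_shifted = [random.gauss(1.0, 1.0) for _ in range(5000)]  # shifted mean

print(round(psi(train_scores, live_same), 3))     # low score: stable
print(round(psi(train_scores, live_shifted), 3))  # high score: drift alarm
```

In practice a check like this would run on a schedule against production inference logs, with an alert (and possibly a retraining job) triggered when the score crosses a threshold — which is exactly the kind of operational plumbing MLaaS vendors offer to take off an organization’s plate.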
We need the “bad” AI models that are never released
“It’s easy to say that 80% of models never go into production. But we have to do that work so that the 20% that do get released are the good ones. A lot of de-biasing, optimization and learning comes from the “bad” ones, but you don’t want to publish them. I think this shows the interplay between leadership expectations and the actual issues of running a machine learning program. Is it reasonable to expect that 100% of models make it into production? Of course not: some problems can’t be solved by machine learning, and the people working on them move on to something else that actually does go into production. But there are also models that don’t make it for more structural reasons: data scientists can’t get access to the data, engineers don’t have access to the infrastructure, or the systems are too fragmented to run large data jobs.”
– Joe Doliner, Co-Founder and CEO, Pachyderm
Focusing on outcomes and data strategy is critical
“The ‘last mile’ problem wastes the work of data practitioners and costs organizations hundreds of thousands of dollars. Imagine a data science team that has access to the data it wants, has built and validated a predictive model, and that model generates exciting results locally… and then it sits on the shelf. It is imperative for data science teams to move away from the traditional ‘data science project’ approach and focus on outcomes. At the organizational level, a data strategy that explicitly includes AI mandates is the first step. The next step is to embed data practitioners, both engineers and scientists, in cross-functional teams and workgroups. Technical teams need input from business users to ensure their models are fit for purpose and meet business needs, and business users need to be comfortable using the model and fully understand how an application fits into their day-to-day work.”
– Stuart Davie, Vice President of Data Science at Peak
Processes, procedures and roles must be clearly defined
“Three weeks ago, a prospective customer volunteered that a model driving insurance pricing in their organization had not been looked at for 18 months. The data scientist who created the model had left the organization, and the model was undocumented and unattended. The organization had relied on its data science team to develop and maintain its models. While that is a perfectly acceptable way of running a business, once something goes into production, processes, procedures and roles must be clearly defined. Data scientists care deeply about their AI models making it to production. In fact, after salary, lack of business impact is the top reason data scientists leave companies. Nevertheless, many do not understand production operations or do not want to maintain them, especially as the number of models in use grows from two or three into the hundreds.”
– Grant Case, APAC Director of Sales Engineering, Dataiku
AI engineering will be a game changer
“The true differentiator for organizations will be their ability to continually add value through rapid AI changes. The “AI engineering age” will be necessary for companies to achieve true customer-centric AI. AI engineering is a discipline that enables both business and IT leaders to work together to provide repeatable patterns for AI solution success. According to Gartner, AI Engineering, one of the top 12 strategic technology trends of 2022, will operationalize the delivery of AI to ensure its continued business value. It is an essential bridge between MLOps and customer-centric AI, where companies see enduring value because they will be able to truly know and serve their customers throughout the customer journey.”
– Akshay Sabhikhi, Founder and COO, CognitiveScale
I hope you’ll share your thoughts on the AI “last mile” problem with me: sharon.goldman@venturebeat.com.
PS I also accept AI-related Twitter memes, emojis, and images of teddy bears swimming the 400m butterfly at the Olympics.
VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Learn more about membership.