r/LeopardsAteMyFace: The Decommissioning of AI

  • By Michele Goetz, Forrester
  • January 24, 2023

It’s AI strategy season in a tough economic climate. Cutting IT costs is a top priority even as chief data and analytics officers want to scale AI. This led to a conversation that I had with a services provider today about the cost of running AI models. It seems that there are several clients seeking to remove AI models because cloud costs are too high. I thought, “What a horrible idea!” Then, ignoring my filter, I blurted it out. Under conditions of economic uncertainty, extending your AI footprint and building insights-driven capabilities ensures enterprise resilience. That was proven during the pandemic.

This is a classic “leopard ate my face” moment. If you aren’t familiar, r/LeopardsAteMyFace is a subreddit that collects stories of people suffering ironic consequences from their own poorly considered decisions.

Retiring models based on cost alone is an avoidable catastrophe. It indicates a lack of ModelOps and AI governance, as well as a failure to measure AI monetization by model and by business value stream. Cost-based model retirement ignores the money made and saved using AI, and it ignores the question of what replaces the AI-driven intelligence and decision automation once the model no longer exists.

So if you must retire models, and cost is the key driver, be smart about it and provide insights that a non-data scientist understands. Here are the tools you need to avoid hungry leopards:

  • A CxO-level business performance framework for AI. CxOs need to see AI’s overall speed to value and scale of value, as well as its cost to own and serve. An AI business performance framework helps CxOs interpret AI’s contribution to overall goals, with metrics that quantify money made and money saved. For example, chief revenue officers care about the overall contribution of AI personalization to revenue generation. (A minimal per-model value rollup is sketched after this list.)
  • Audits of model performance and process stream performance over time. ModelOps tools help data scientists know when model performance degrades: indications of data drift, bias, and overall model degradation are early warning signals (a simple drift check is sketched after this list). Business intelligence on AI, in the form of continuous audits, uncovers the decay trajectory and guides the model optimization strategy. Because models are often interdependent, business intelligence on AI also extends ModelOps to surface model dependencies, helping business decision-makers tune models in the context of each other for a holistic assessment of model performance.
  • Data intelligence. Data intelligence (data observability tools, pipeline profiling and lineage, data catalogs and glossaries) brings fidelity to the state and value of a machine-learning model. New data and metadata capture is required, along with knowledge graph capabilities that link and describe the state and dependencies of the data, model performance, data and AI policies, domains, and business metrics (a toy linking structure is sketched after this list). While feature stores are all the rage and simplify model deployment, management, and reuse, they need integration with data intelligence capabilities to give audits closed-loop traceability.
  • Model testing and lifecycle plans. Unlike traditional technologies, AI is not implemented and forgotten. Continuous monitoring and optimization frequently mean multiple models performing the same task in production as part of testing plans, which has a multiplier effect on cost. The aim, however, should not be to limit in-production testing, but to maintain lifecycle best practices that update, replace, and retire degraded ML models (see the champion/challenger sketch after this list).
  • Up-front plans for cost optimization. Self-service, citizen ML model development, and increased application and data-flow complexity all affect the efficiency of models. Poorly crafted transformations and queries can make the difference between milliseconds and seconds in a transaction, increasing compute and therefore cost. Edge use cases add further cost through hybrid (cloud/edge) storage and compute requirements. Upskill data scientists on data engineering basics, integrate their activities with engineering and DevOps to properly test ML models before deployment, and make cost a KPI for testing and release within data engineering, ML engineering, and DevOps (a simple cost gate is sketched after this list).
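
To make the CxO-level framework concrete, here is a minimal sketch of a per-model value rollup. The model names, value streams, and dollar figures are invented for illustration, and such a framework would normally live in a BI dashboard rather than in code; the point is only that retirement decisions should rank models by net value (money made plus money saved minus cost to own and serve), not by cost alone.

```python
from dataclasses import dataclass

@dataclass
class ModelValueRecord:
    """One row of a CxO-level AI performance view (fields are illustrative)."""
    model: str
    value_stream: str          # e.g., "personalization", "retention"
    revenue_attributed: float  # money made per quarter
    cost_avoided: float        # money saved per quarter
    cost_to_serve: float       # cloud + ops cost per quarter

    @property
    def net_value(self) -> float:
        return self.revenue_attributed + self.cost_avoided - self.cost_to_serve

portfolio = [
    ModelValueRecord("reco-v7", "personalization", 420_000, 0, 65_000),
    ModelValueRecord("churn-v3", "retention", 0, 180_000, 30_000),
    ModelValueRecord("ocr-v1", "back office", 0, 12_000, 28_000),
]

# Rank by net value, not by cost alone: the "expensive" model may be the one
# paying for everything else, and the cheap one may be the real drag.
for rec in sorted(portfolio, key=lambda r: r.net_value, reverse=True):
    print(f"{rec.model:10s} {rec.value_stream:16s} net ${rec.net_value:>10,.0f}")
```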
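
As an example of the early warning signals a ModelOps audit watches for, the sketch below computes a Population Stability Index (PSI) for a single model input, comparing its training-time distribution against production data. The synthetic data and the thresholds in the comments are illustrative assumptions (a common rule of thumb, not a standard); real ModelOps tooling would track many such signals alongside bias and accuracy metrics.

```python
import numpy as np

def population_stability_index(baseline, current, bins=10):
    """Population Stability Index: a common early-warning signal for data drift.

    Compares the distribution a model input (or score) had at training time
    against what the model is seeing in production.
    """
    # Bucket both samples using the baseline's quantile edges.
    edges = np.quantile(baseline, np.linspace(0, 1, bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf
    base_pct = np.histogram(baseline, edges)[0] / len(baseline)
    curr_pct = np.histogram(current, edges)[0] / len(current)
    # Floor the percentages to avoid division by zero and log(0).
    base_pct = np.clip(base_pct, 1e-6, None)
    curr_pct = np.clip(curr_pct, 1e-6, None)
    return float(np.sum((curr_pct - base_pct) * np.log(curr_pct / base_pct)))

# Rule-of-thumb reading (illustrative): < 0.1 stable, 0.1-0.25 monitor,
# > 0.25 investigate before the model decays further.
psi = population_stability_index(np.random.normal(0.0, 1.0, 5000),
                                 np.random.normal(0.3, 1.2, 5000))
print(f"PSI = {psi:.3f}")
```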
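
The value of data intelligence in a retirement decision is traceability: for any model on the chopping block, which datasets, pipelines, policies, and business metrics hang off it? The toy in-memory linking structure below (all identifiers are hypothetical) shows the idea; in practice these links live in a data catalog or knowledge graph, not in application code.

```python
from collections import defaultdict

# A toy lineage/impact graph linking a model to upstream data assets,
# governing policies, and downstream business metrics. All identifiers
# are hypothetical examples.
edges = defaultdict(list)

def link(subject, relation, obj):
    edges[subject].append((relation, obj))

link("model:churn-v3", "trained_on", "dataset:crm_events_2024")
link("model:churn-v3", "reads_feature", "feature:days_since_last_order")
link("model:churn-v3", "governed_by", "policy:pii_masking")
link("model:churn-v3", "feeds_metric", "kpi:retention_revenue")
link("dataset:crm_events_2024", "produced_by", "pipeline:crm_ingest_daily")

def impact_of_retiring(model):
    """List everything an audit should review before this model is retired."""
    for relation, obj in edges[model]:
        print(f"{model} --{relation}--> {obj}")

impact_of_retiring("model:churn-v3")
```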
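
One reason multiple models for the same task end up in production is champion/challenger testing. The sketch below shows the basic traffic-split mechanics under assumed model names and an assumed 10% split; the cost multiplier mentioned above comes from serving both models at once, which is exactly why lifecycle rules for promoting or retiring the challenger matter.

```python
import hashlib

CHALLENGER_SHARE = 0.10  # illustrative: 10% of traffic goes to the challenger

def route(request_id: str) -> str:
    """Deterministically assign a request to the champion or the challenger."""
    # Hash-based bucketing keeps a given request/customer on one variant.
    digest = hashlib.md5(request_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF
    return "churn-v4-challenger" if bucket < CHALLENGER_SHARE else "churn-v3-champion"

counts = {"churn-v3-champion": 0, "churn-v4-challenger": 0}
for i in range(10_000):
    counts[route(f"req-{i}")] += 1
print(counts)  # roughly a 90/10 split; both models incur serving cost until one is retired
```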
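
Finally, here is what “make cost a KPI for testing and release” could look like as a minimal release gate. The budget figures and metric names are assumptions for illustration; the point is that unit cost and latency are checked alongside accuracy before a model is promoted, rather than discovered later on the cloud bill.

```python
# A minimal "cost as a release KPI" gate. Budgets and metric names are
# illustrative assumptions, not a standard.
BUDGETS = {
    "p95_latency_ms": 150.0,
    "cost_per_1k_predictions_usd": 0.40,
}

def passes_cost_gate(candidate_metrics: dict) -> bool:
    """Return True only if every budgeted KPI is within its limit."""
    over_budget = {
        kpi: candidate_metrics.get(kpi, float("inf"))
        for kpi, limit in BUDGETS.items()
        if candidate_metrics.get(kpi, float("inf")) > limit
    }
    if over_budget:
        print(f"Release blocked, over budget on: {over_budget}")
        return False
    return True

# Example: an accurate but expensive candidate fails the gate.
print(passes_cost_gate({"p95_latency_ms": 120.0, "cost_per_1k_predictions_usd": 0.55}))
```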

The cost of AI matters. Reducing the number of ML models based only on cost, however, is a recipe for business latency, missed opportunity, and poor resilience. With the practices above in place, your organization will be better positioned to ride out economic and market conditions without being eaten by the leopard.

The original article by Michele Goetz, Forrester's vice president and principal analyst, is here.

The views and opinions expressed in this article are those of the author and do not necessarily reflect those of CDOTrends.

Image credit: iStockphoto/PashaIgnatov

FAQs

How long until AI takes over?

The consensus among many experts is that a number of professions will be totally automated in the next five to 10 years.

Why is AI not taking over?

Data Dependencies And Their Limitations

Unlike humans, who can learn from a few examples or even from a single experience, AI systems need thousands—or even millions—of data points to master even simple tasks. This difference highlights a fundamental gap in how humans and machines process information.

Will AI one day take over the world?

If you believe science fiction, then you don't understand the meaning of the word fiction. The short answer to this fear is: no, AI will not take over the world, at least not as it is depicted in the movies.

Will technology take over the world?

Notwithstanding fears of an AI takeover, in which machines supplant people as the predominant intelligence in the world, such a scenario appears far-fetched. That said, the professional services network PwC predicts that up to 30% of jobs could be automated by the mid-2030s.

Will AI wipe out humanity?

Many experts believe that fears of AI wiping out humanity are unrealistic and a distraction from issues, such as bias, that are already a problem in today's systems.

What did Stephen Hawking say about AI?

Hawking's biggest warning is about the rise of artificial intelligence: It will either be the best thing that's ever happened to us, or it will be the worst thing. If we're not careful, it very well may be the last thing.

Is Bill Gates worried about AI?

Gates expressed concerns about the downsides of advanced artificial intelligence, such as taking people's jobs, including his own.

Why does Elon Musk oppose AI?

In an interview with Tucker Carlson last year, Musk said that AI was more dangerous than a 'mismanaged aircraft' and has the potential for 'civilisation destruction'. Yann LeCun, however, disagreed, saying that the assumption that AI is an existential threat is 'false'.

What will AI look like in 10 years?

Robots, Co-bots And Automated Friends

Big advances have been made in robotics in recent years, thanks to the application of AI to problems like balancing and moving in proximity to humans. So, by 2034, it might seem reasonable to think that mechanical companions will be all around us.

Will AI become sentient?

We don't know whether AI could have conscious experiences and, unless we crack the problem of consciousness, we never will. But here's the tricky part: when we start to consider the ethical ramifications of artificial consciousness, agnosticism no longer seems like a viable option.

Is AI a threat to humanity?

If AI algorithms are biased or used maliciously, such as in deliberate disinformation campaigns or autonomous lethal weapons, they could cause significant harm to humans. As of right now, though, it is unknown whether AI is capable of causing human extinction.

What did Elon Musk say about AI?

Elon Musk says artificial intelligence will take all our jobs, and that's not necessarily a bad thing. “Probably none of us will have a job,” Musk said about AI at a recent tech conference.

What jobs will AI replace first?

  • Data Entry and Administrative Tasks. One of the first job categories in AI's crosshairs is data entry and administrative tasks.
  • Customer Service
  • Manufacturing and Assembly Line Jobs
  • Retail Checkouts
  • Basic Analytical Roles
  • Entry-Level Graphic Design
  • Translation
  • Corporate Photography

How likely is an AI takeover?

The possibility of an AI takeover, like the one depicted in the movie Terminator 2, is no longer the stuff of science fiction but a real concern that experts in the field are grappling with. Could artificial intelligence (AI) really take over the world? The answer is no: AI will not take over the world.

Will AI take over by 2030?

According to a McKinsey report, AI is expected to replace 2.4 million US jobs by 2030, with an additional 12 million occupational shifts. An estimated 400 to 800 million people worldwide could lose their jobs to AI and automation. Jobs with repetitive and routine tasks are the most vulnerable.

How advanced will AI be in 2050?

By 2050, AI-powered technologies could revolutionize patient care, enabling faster and more accurate diagnoses, customized treatment plans, and the discovery of groundbreaking therapies. AI may also play a significant role in predicting and preventing diseases, leading to better population health management.

Will AI take over by 2025?

According to the World Economic Forum's "The Future of Jobs Report 2020," AI is expected to replace 85 million jobs worldwide by 2025. Though that sounds scary, the report goes on to say that it will also create 97 million new jobs in that same timeframe.
