
What makes large language models expensive?



The larger the [AI] model is, the more parameters it has, and that’s going to drive compute and different resources. – Jessica Ridella

Jessica Ridella, IBM’s Global Sales Leader for the watsonx.ai generative AI platform, breaks down the pivotal factors that drive the cost of large language models for businesses.

She covers model size, use case, pre-training costs, inferencing, tuning, hosting costs, and deployment options.

Table of Contents

  1. The Role of Use Case
  2. Model Size Impact
  3. Pre-Training Costs
  4. Understanding Inferencing Costs
  5. The Significance of Tuning
  6. Hosting Costs
  7. Deployment Options

The Role of Use Case

Different use cases call for different models and methods, which in turn require varying amounts of compute power.

It’s advisable to run a pilot with a partner or vendor to identify potential challenges and determine whether generative AI is an appropriate solution.


Model Size Impact

The size and complexity of the generative AI model significantly affect pricing.

Larger models demand more computing resources.
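To make that concrete, here is a minimal sketch that estimates the accelerator memory needed just to hold a model’s weights at a few common numeric precisions. The parameter counts are arbitrary examples, not figures from the talk, and real serving also needs memory for activations and the key/value cache, so actual requirements run higher.

    # Back-of-envelope estimate: memory to store model weights alone.
    # Rule of thumb: bytes = parameters x bytes-per-parameter at the chosen precision.
    # Parameter counts below are illustrative, not any specific vendor's models.
    BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

    def weight_memory_gb(num_params: float, precision: str) -> float:
        """Approximate GB of accelerator memory to hold the weights."""
        return num_params * BYTES_PER_PARAM[precision] / 1e9

    for params in (7e9, 70e9, 180e9):
        row = ", ".join(
            f"{prec}: {weight_memory_gb(params, prec):,.1f} GB"
            for prec in BYTES_PER_PARAM
        )
        print(f"{params / 1e9:.0f}B parameters -> {row}")

Under these assumptions, a 70-billion-parameter model needs on the order of 140 GB at 16-bit precision before any inference overhead, which is why larger models typically call for more, or larger, accelerators.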

Vendors often price in tiers based on model size, so it’s important to determine whether they offer flexibility for your specific use case.
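As a hypothetical illustration of how those tiers add up, the sketch below applies made-up per-1,000-token prices (not any vendor’s actual rates) to a fixed monthly workload.

    # Hypothetical pricing tiers keyed to model size (USD per 1,000 tokens).
    # These prices and model sizes are placeholders, not real vendor rates.
    PRICING_TIERS = {
        "small (~7B params)": 0.0006,
        "medium (~70B params)": 0.0018,
        "large (~180B params)": 0.0100,
    }

    def monthly_cost(requests_per_day: int, tokens_per_request: int, price_per_1k: float) -> float:
        """Estimate monthly spend for a workload at a given per-1K-token price."""
        tokens_per_month = requests_per_day * tokens_per_request * 30
        return tokens_per_month / 1_000 * price_per_1k

    # Example workload: 10,000 requests/day, ~1,500 tokens (prompt + completion) each.
    for tier, price in PRICING_TIERS.items():
        cost = monthly_cost(10_000, 1_500, price)
        print(f"{tier}: ~${cost:,.0f}/month")

With these placeholder numbers, the same workload runs roughly $270, $810, or $4,500 a month depending on the tier, which is why matching model size to the use case matters as much as the headline per-token price.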



