As a data practitioner, you’re on the leading edge of AI, a field moving almost faster than we can keep up with. This course will give you a critical understanding of large language models: their history, how they work, and how they can be leveraged.
What you’ll learn
Just when we had gotten used to tools like Alexa, Siri, and Cortana, applications such as ChatGPT suddenly reset the world’s expectations of what is possible with AI. Powering this massive leap in natural language processing are large language models.
In this course, Introduction to Large Language Models for Data Practitioners, you’ll learn what you, as a data practitioner, need to know about large language models (LLMs) and how you can leverage them moving forward.
First, you’ll explore the evolution of LLMs over the past 70 years, from the first conceptual neural network through advancements in machine learning, deep learning, and the development of various language model architectures, to the transformer-based LLMs now emerging for general use, such as PaLM, Claude, LLaMA, and GPT.
Next, you’ll discover what makes the transformer model so revolutionary by learning more about its internal workings and defining concepts and terms critical to transformer-based LLMs, such as parameters, encoding, decoding, attention, weights, training, and tuning.
Finally, you’ll see where LLMs fit among the primitives available to data practitioners for solving real-world problems, gain confidence in understanding the power and limitations of these models, examine ethical considerations and harmful biases, and learn how accuracy and relevance can be improved through fine-tuning and through feedback vs. feedforward approaches in machine learning.
When you’re finished with this course, you’ll have the knowledge necessary to determine where LLMs fit in your domain, the ability to conceptualize how LLMs work, and, more importantly, an understanding of the capabilities and limitations they present as you prepare to add LLMs to your toolkit moving forward.