
Offered By: IBMSkillsNetwork

Fine-Tuning Transformers and Gen AI Models

Premium

Course

Artificial Intelligence

At a Glance

Familiar with Python and PyTorch? Build job-ready generative AI skills by fine-tuning transformers in just 2 weeks! Get practical experience and an industry-recognized credential.

This intermediate-level course, Fine-Tuning Transformers and Gen AI Models, teaches you the skills you need to advance your AI career.
 
You will learn:
 
  • Job-ready skills in 2 weeks, plus you’ll get practical experience employers look for on a resume and an industry-recognized credential 
  • How to perform parameter-efficient fine-tuning (PEFT) using LoRA and QLoRA 
  • How to use pretrained transformers for language tasks and fine-tune them for specific tasks 
  • How to load models, run inference, and train models with Hugging Face (see the sketch after this list)
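
To give you a feel for the tooling, here is a minimal sketch (not taken from the course materials) of loading a pretrained model and running inference with the Hugging Face transformers pipeline; the checkpoint shown is just an illustrative choice:

    # Minimal sketch: load a pretrained model and run inference with Hugging Face.
    # The checkpoint below is an illustrative choice, not one specified by the course.
    from transformers import pipeline

    classifier = pipeline(
        "sentiment-analysis",
        model="distilbert-base-uncased-finetuned-sst-2-english",
    )
    print(classifier("Fine-tuning transformers is easier than I expected."))
    # e.g. [{'label': 'POSITIVE', 'score': ...}]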
 
 
Course Overview 
 
The demand for technical Generative AI skills is exploding. Businesses are hunting hard for AI engineers who can work with large language models (LLMs). This Fine-Tuning Transformers and Gen AI Models course builds job-ready skills to advance your AI career.   
 
In this course, you’ll explore transformers, model frameworks, and platforms such as Hugging Face and PyTorch. You’ll begin with a general framework for optimizing LLMs and quickly move on to fine-tuning generative AI models. Further, you’ll learn about PEFT, low-rank adaptation (LoRA), quantized low-rank adaptation (QLoRA), and prompting.   
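
As a taste of the parameter-efficient tooling the course covers, here is a minimal sketch assuming the Hugging Face peft library; the base checkpoint and LoRA hyperparameters below are illustrative placeholders, not values from the course:

    # Minimal sketch: attach a LoRA adapter to a pretrained model with Hugging Face peft.
    # Base checkpoint and hyperparameters are illustrative placeholders.
    from transformers import AutoModelForSequenceClassification
    from peft import LoraConfig, TaskType, get_peft_model

    base = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2
    )
    lora_cfg = LoraConfig(
        task_type=TaskType.SEQ_CLS,  # sequence classification task
        r=8,                         # rank of the low-rank update matrices
        lora_alpha=16,               # scaling factor for the update
        lora_dropout=0.1,
    )
    model = get_peft_model(base, lora_cfg)
    model.print_trainable_parameters()  # only a small fraction of weights train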
 
Additionally, you’ll get valuable hands-on experience in online labs that you can talk about in interviews, including loading, pretraining, and fine-tuning models with Hugging Face and PyTorch.   
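
For context, a typical supervised fine-tuning run with the Hugging Face Trainer looks roughly like the sketch below; the dataset, checkpoint, and hyperparameters are placeholders rather than actual lab content:

    # Rough sketch of fine-tuning a transformer with the Hugging Face Trainer API.
    # Dataset, checkpoint, and hyperparameters are illustrative placeholders.
    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                              Trainer, TrainingArguments)

    dataset = load_dataset("imdb")
    tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, padding="max_length")

    tokenized = dataset.map(tokenize, batched=True)
    model = AutoModelForSequenceClassification.from_pretrained(
        "distilbert-base-uncased", num_labels=2
    )

    args = TrainingArguments(output_dir="out", num_train_epochs=1,
                             per_device_train_batch_size=8)
    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
        eval_dataset=tokenized["test"].select(range(500)),
    )
    trainer.train()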
 
If you’re keen to take your AI career to the next level and boost your resume with in-demand gen AI competencies that catch an employer’s eye, enroll today and gain job-ready skills you can put to use within two weeks!

Course Syllabus

Module 0: Welcome

·         Video: Course Introduction
·         Specialization Overview
·         Reading: Helpful Tips for Course Completion
·         Reading: General Information
·         Reading: Learning Objectives and Syllabus
·         Reading: Grading Scheme

Module 1: Transformers and Fine-Tuning

·         Reading: Module Introduction and Learning Objectives
·         Video: Hugging Face vs. PyTorch
·         Lab: Loading Models and Inference with Hugging Face
·         Video: Using Pre-Trained Transformers and Fine-Tuning
·         [Optional] Pre-training LLMs with Hugging Face
·         Video: Fine-Tuning with PyTorch
·         Video: Fine-Tuning with Hugging Face
·         Lab: Pre-Training and Fine-Tuning with PyTorch 
·         Lab: Fine-Tuning Transformers with PyTorch and Hugging Face
·         Reading: Summary and Highlights: Transformers and Fine-Tuning
·         Practice Quiz: Transformers and Fine-Tuning
·         Graded Quiz: Transformers and Fine-Tuning

Module 2: Parameter Efficient Fine-Tuning (PEFT)

·         Reading: Module Introduction and Learning Objectives
·         Video: Introduction to PEFT
·         Lab: Adapters with PyTorch
·         Video: Low-Rank Adaptation (LoRA)
·         Video: LoRA with Hugging Face and PyTorch 
·         Lab: LoRA with PyTorch
·         Video: From Quantization to QLoRA
·         [Optional] Lab: QLoRA with Hugging Face
·         Reading: Soft Prompts 
·         Reading: Summary and Highlights: Parameter Efficient Fine-Tuning (PEFT)
·         Practice Quiz: Parameter Efficient Fine-Tuning (PEFT)
·         Graded Quiz: Parameter Efficient Fine-Tuning (PEFT)
·         Reading: Cheat Sheet: Generative AI Engineering and Fine-tuning Transformers
·         Reading: Course Glossary: Generative AI Engineering and Fine-Tuning Transformers

Course Wrap-Up

·         Reading: Course Conclusion
·         Reading: Congratulations and Next Steps
·         Reading: Teams and Acknowledgements
·         Reading: Copyrights and Trademarks
·         Course Rating and Feedback
·         Feedback

Recommended Skills Prior to Taking this Course

Basic knowledge of Python, PyTorch, and transformer architecture. You should also be familiar with machine learning and neural network concepts.

Estimated Effort

8 Hours

Level

Intermediate

Skills You Will Learn

Fine-tuning LLMs, Hugging Face, LoRA and QLoRA, Pretraining Transformers, PyTorch

Language

English

Course Code

AI0211EN
