Attention, Transformers, and LLMs: a hands-on introduction in Pytorch

Submission Number: 286
Submission ID: 4310
Submission UUID: e3e92f64-5378-409e-b803-bf18eaaeb53a
Submission URI: /form/resource

Created: Thu, 01/18/2024 - 13:50
Completed: Thu, 01/18/2024 - 13:51
Changed: Fri, 03/14/2025 - 11:43

Submitted by: Dane Smith
Language: English

Is draft: No
Learning
Intermediate
This workshop focuses on the fundamentals of attention and the transformer architecture, so that you can understand how LLMs work and use them in your own projects.
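The core idea the workshop builds on, scaled dot-product attention, can be sketched in a few lines. The following is a minimal NumPy illustration (not taken from the workshop materials): queries are compared against keys, the scores are scaled and softmax-normalized, and the result is a weighted sum of the values.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Scores measure the similarity of each query to each key,
    # scaled by sqrt(d_k) to keep gradients well behaved.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    # Output is a convex combination of the value vectors.
    return weights @ V, weights

# Toy example: 3 query/key/value vectors of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

In a transformer, Q, K, and V are learned linear projections of the same input sequence (self-attention), and several such attention "heads" run in parallel; PyTorch wraps this pattern in modules such as `torch.nn.MultiheadAttention`.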