Cross Attention | Method Explanation | Math Explained
1.5K likes
32,643 views
Mar 20, 2023
Cross Attention is one of the most important mechanisms in deep learning today. It is what enables many models to work the way they do and produce impressive results, as seen with #stablediffusion #imagen #muse and others. In this video I give a visual and (hopefully) intuitive explanation of how Cross Attention works, using simple examples to make the math easy to follow.

00:00 Introduction
01:10 Self Attention explained
07:40 Cross Attention explained
11:28 Summary
12:25 Outro

#crossattention #attention
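As a rough companion to the video's explanation, here is a minimal NumPy sketch of scaled dot-product cross attention. The shapes, weight names, and random inputs are illustrative assumptions, not taken from the video; the key point is that queries come from one sequence while keys and values come from another (e.g. image latents attending to text embeddings in a text-to-image model).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(x, context, Wq, Wk, Wv):
    # Queries are projected from x; keys and values from context.
    Q = x @ Wq          # (n_x, d)
    K = context @ Wk    # (n_ctx, d)
    V = context @ Wv    # (n_ctx, d)
    # Scaled dot-product attention: each query token mixes
    # the context's values, weighted by query-key similarity.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])   # (n_x, n_ctx)
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V                        # (n_x, d)

# Toy example: 4 query tokens, 6 context tokens, model dim 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
context = rng.normal(size=(6, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = cross_attention(x, context, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Setting `context = x` recovers self attention, which is why the two mechanisms are usually explained together, as in this video.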

Outlier (17.5K subscribers)