If you REALLY want to understand the math of ML, look no further.
This is the third and final blog of my series on the math of AI and LLMs.
A while ago, I wrote a blog about PCA, a crucial dimensionality reduction method that’s still used to make ML systems more efficient.
I explained how it worked, and how eigenvectors are crucial to computing the principal components. But wait, what exactly are eigenvectors?
Today, we’ll learn about exactly that: eigenvalues, eigenvectors, and how they’re crucial mathematical foundations to AI concepts like dimensionality reduction and more.
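To make the PCA connection concrete, here is a minimal NumPy sketch (the dataset and all numbers are illustrative, not from the original post): we eigendecompose a covariance matrix, and the top eigenvector is the first principal component.

```python
import numpy as np

# A hypothetical 2-D dataset: 200 correlated points (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[2.0, 0.0], [1.2, 0.5]])

# PCA via eigendecomposition: center the data, build the covariance
# matrix, then take its eigenvectors as the principal components.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: cov is symmetric

# Sort by eigenvalue, largest first: the top eigenvector points in
# the direction of greatest variance, i.e. the first principal component.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Each eigenvector v satisfies the defining equation cov @ v = lambda * v.
v, lam = eigvecs[:, 0], eigvals[0]
assert np.allclose(cov @ v, lam * v)
```

The `assert` at the end is the whole story in one line: an eigenvector is a direction the matrix only stretches (by its eigenvalue), never rotates, which is exactly why it can describe the axes of variance in PCA.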
If you like learning AI concepts through easy-to-understand diagrams, I’ve created a free resource that organises all my work in one place — feel free to check it out!