Ten years ago I wrote “A Beginner's Guide to Coding Graphics Shaders”. For ~7 years it remained the top Google search result for the generic term “graphics shader tutorial”. It got me my first computer graphics job[1], and also my second one[2], as well as a ton of inbound for freelance projects over the years.
The article had hundreds of comments saying:
“I finally get it! I can’t believe no one has ever explained it this way!!”
It also got translated to 3 other languages.

The strange thing is that I barely understood shaders myself when I wrote it. I knew just enough to explain the one thing in there. Despite the popularity of this article on the internet, and despite how often I got this reaction:
“Oh, you’re the guy from that article that unlocked shaders for me! You changed my life!!!”
No one in the computer graphics industry proper praised it or recommended it. The experts did often point out its flaws[3].
Why was I the first to fill this gap?
I think the way to understand what happened there is something like this:
You start a new job, and follow the instructions to get set up.
At some point you get stuck; you can’t figure it out. What the instructions say doesn’t seem to work.
After lots of blood, sweat, and tears, you finally figure it out!
Now, instead of going back & updating the instructions, you think to yourself:
“ah, well, I’m the new guy, I don’t know much. It makes sense that I struggled. I bet everyone else got it right away…”
And you move on. The next guy joins, and the cycle repeats.
When I was first learning, it took me like 4 months to figure out what shaders are. Everyone kept saying things like:
“shaders are really complicated! Only really smart people can do that work”
“you have to read this book on GPU hardware fundamentals before you can even begin to understand shaders”
“you have to read this textbook on linear algebra first”
Since everyone was saying that, I believed them. So I did all the work. And AFTER learning all that, I realized it was completely unnecessary for what I needed to do.
The gatekeeping really bothered me. Some of it was ego. Like:
“if I had to suffer to earn my knowledge, then you should too”
Some of it I think was genuine: if the people “above you” said this knowledge was important, and you don’t think it’s important…then you must not be “there yet”. And once “you’re there” you’ll agree with the elders.
But sometimes the elders are wrong. Correcting them accelerates the creation of new elders, and pushes the whole field forward.
The thing that beginners can do that experts cannot is compress knowledge.
Before my article, let’s say the path everyone had to go through to learn to write shaders was ~4 months of prerequisites. My contribution was figuring out how to do something useful WITHOUT all those prereqs. I charted a new path.
It’s almost impossible for an expert to do this because they cannot remember what it’s like to NOT have this knowledge. A lack of knowledge is a very difficult thing to simulate in your mind, perhaps impossible.
It’s kind of like when a piece of software runs on your machine, but doesn’t run on your co-worker’s machine. You know that you have some dependency that they don’t, but which one? You can just give them a full list of your dependencies, and they can install all of them, and then it works. And now they have become “the expert”. But now NEITHER of you has any idea what the necessary subset actually is!
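For what it’s worth, that necessary subset CAN be recovered, it’s just tedious: remove dependencies one at a time and re-test after each removal. Here’s a toy sketch of that idea in Python (the package names and the `works` check are made up for illustration; in real life `works` would mean reinstalling and re-running the software, which is exactly the grind nobody bothers to do):

```python
def minimal_subset(deps, works):
    """Greedily try dropping each dependency; keep the drop if
    everything still works. Returns a locally minimal subset
    that still satisfies `works`."""
    needed = list(deps)
    for dep in list(needed):
        trial = [d for d in needed if d != dep]
        if works(trial):  # still runs without `dep`? then it wasn't needed
            needed = trial
    return needed

# Toy stand-in: pretend the program secretly only needs numpy and libpng.
truly_needed = {"numpy", "libpng"}
installed = ["numpy", "libpng", "boost", "qt", "ffmpeg"]
works = lambda subset: truly_needed.issubset(subset)

print(minimal_subset(installed, works))  # ['numpy', 'libpng']
```

The greedy pass isn’t guaranteed to find the globally smallest set when dependencies interact, but the point stands: discovering the minimal path takes deliberate, boring work that neither the old expert nor the new one has any incentive to do.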
The reason this matters so much is that knowledge is not always additive. Getting beginners to contribute is NOT just a matter of accelerating the path from layperson to expert (even though that is extremely high value); it also potentially unblocks major breakthroughs.
What if one of the pieces of knowledge along the recommended path is actually wrong? What if it introduces a blindspot? What if, without this piece of knowledge, the breakthrough is obvious?
Gary Marcus recently wrote about exactly this kind of thing, the damage of intellectual monocultures:
“Hinton’s long-time hostility against any role at all for symbols has, in my judgement, cost the field dearly. Ideas that were only discovered in the last couple years (e.g., some discussed later in this essay) may have been discovered much later than they might otherwise have been.
Many other important ideas have likely also yet to be discovered, precisely because the Hinton path has distracted immense resources from other ideas, fostering an intellectual monoculture that, in the words of Emily Bender, has been ‘sucking the oxygen from the room.’”
From: How o3 and Grok 4 Accidentally Vindicated Neurosymbolic AI[4]
I think a lot of the world and our pursuit of knowledge is currently stuck in this way. And I think it’s fairly easy to get out of this loop. Like, people NOT knowing things is not a problem, it’s an opportunity. If you struggle to learn something, and then shorten that path for others, you will be, at minimum, accelerating the field, and perhaps creating a new school of thought that will surface a major breakthrough.
I want to say it one more time for emphasis: you can accelerate the entire field, on day one, as a beginner. This isn’t like “give the new guy a toy problem”, this is:
“we NEED you to contribute because we literally cannot do this work up here in the ivory tower. For the love of god we need you”
[1] Robert Taylor found a link to it on a game dev forum, and cold-emailed me asking if I wanted an internship at the game studio where he was working at the time.
[2] This was at Cesium, where they were looking for someone who understood graphics but could also write. A piece of finished work is infinitely more persuasive than any resume.
[3] The major one being that the article claims to explain shaders, but it only talks about fragment shaders, which are only one part of a larger GPU pipeline.
[4] Really great read.
The curse of the experts