When learning any hardware for the first time, one invariably runs into peculiarities that are so foreign and frustrating that they discourage the learner (especially the self-taught learner) from progressing further.
Many a development board has been left collecting dust for months -- or set aside forever -- in favor of a more approachable piece of hardware that is less powerful but easier to use. For this reason, difficult-to-learn hardware is most often mastered in a university or professional setting. In these environments, the user is forced to use a particular piece of hardware, but at the same time has access to resources and mentors that are simply not available to the average do-it-yourself electronics enthusiast.
In the case of FPGAs, the home-learner can read all the books and cruise all the online forums out there, but -- inevitably -- he will run across an issue specific to FPGAs that is so counter-intuitive that he stops having fun and starts being frustrated.
This FPGA will rue the day it crossed me!
My job is to give that DIY user access to adequate resources that he or she can use to learn complicated hardware -- in this case, FPGAs. But how can I possibly deal with the frequent and varied peculiarities that seem to arise when learning programmable logic? The short answer is "I can't!" The long answer is that, by assuming the user is familiar with MCUs, I can use that assumed knowledge as a basis for comparison in teaching the fundamental concepts of FPGA programming.
In my Digital Logic class in college, for example, our first assignment with VHDL was to write a simple circuit that used a couple of switches as inputs and modified the state of some corresponding LED outputs accordingly. Because of my experience with MCUs and C programming, I could not get my head around the idea that I was "writing a circuit." I understood the digital logic I was being taught, but when it was put into a code format, I was still thinking sequentially. Where was the "while(1)" loop at the end? This code was going to run off into the weeds!
I sat in my first lab completely frustrated until the teaching assistant pointed to my code and said, "When it's uploaded to the board, everything in this code is happening all at once." Thinking sequentially was my first "chute," my error in understanding, and my TA's statement was my first "ladder," a way of thinking that allowed me to gain a real understanding as to what was going on inside my code.
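To make that ladder concrete, here is a minimal sketch of the kind of switch-to-LED circuit described above. The entity and signal names are my own illustration, not the original assignment:

```vhdl
library ieee;
use ieee.std_logic_1164.all;

-- Two switches drive two LEDs. Both assignments below describe
-- wires that exist simultaneously -- there is no "first" line.
entity switch_leds is
    port (
        sw  : in  std_logic_vector(1 downto 0);
        led : out std_logic_vector(1 downto 0)
    );
end entity switch_leds;

architecture rtl of switch_leds is
begin
    -- Concurrent signal assignments: both "run" at once, forever.
    -- No while(1) needed -- the hardware itself is the loop.
    led(0) <= sw(0);
    led(1) <= sw(0) xor sw(1);
end architecture rtl;
```

Reordering those two assignments changes nothing, which is exactly the point my TA was making: this is a description of hardware, not a list of instructions.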
So now I'm trying to put ladders in place along the learning path before the user even hits the chutes. Portions of VHDL code that are reliant on a clock signal, for example, can be very hard for a new user to understand. Having MCU experience might even come as a detriment, because the user already has a bias as to how things are "supposed" to go. How do I even explain a constraint file to an MCU user if she can't initially visualize the connections between the constraints and the hardware description language (HDL) representation? I need to dispel biased ideas before they can become frustrations so the student can understand FPGAs as painlessly as possible.
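As one example of both chutes at once -- clocked logic and the constraint file -- here is a sketch of an LED blinker. The names, the counter width, and the pin location are all hypothetical, chosen only to illustrate the idea:

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

-- Nothing inside the process "executes" line by line at run time;
-- the synthesizer infers a bank of flip-flops that all update on
-- the same rising clock edge.
entity blink is
    port (
        clk : in  std_logic;
        led : out std_logic
    );
end entity blink;

architecture rtl of blink is
    signal count : unsigned(24 downto 0) := (others => '0');
begin
    process (clk)
    begin
        if rising_edge(clk) then
            count <= count + 1;  -- new value visible only after the edge
        end if;
    end process;

    led <= count(24);  -- MSB divides the clock down to a visible blink
end architecture rtl;
```

The constraint file is simply the bridge between the port names above and physical package pins -- for example, in Xilinx XDC syntax (pin name hypothetical): `set_property PACKAGE_PIN W5 [get_ports clk]`. Once the student sees that `clk` in the HDL and `clk` in the constraints are the same wire, the constraint file stops being mysterious.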
Everyone has a different experience when learning FPGAs. When you were learning, at what point did you become hopelessly stuck? What "peculiarities" of FPGAs really made you bang your head on your keyboard? And when you finally did get it, what was the trick? What did you read or hear that finally gave you a ladder and made the idea "click"?