Is this an example of "everybody programs really well in their first computer language, and just makes all the others they learn fit that paradigm"? Equally I know modern flavours of C have any number of bells and whistles, and nice standard libraries, but it was often, though jokingly at points, referred to as "portable assembler" and that still applies. Hopefully said 4 years were also not spent staring at a CPU description document and hoping it leaps into your head one day to suddenly grant a full understanding.

https://stuff.pypt.lt/ggt80x86a/asm1.htm
does pretty well at building up a practical end result from basic instructions.
If you get on well with that then https://www.youtube.com/watch?v=hE7l6Adoiiw&list=PL6B940F08B9773B9F&index=1
is also something that might work for you. You don't have to do the whole series (and he has a nice followup one as well), but if you already have pointers on lock from C in general then by the time they cover those you will probably have most of what you want from it.
To round out the list then http://www.plantation-productions.com/Webster/
is also for X86.
Anyway I can't say I have ever really had that problem. However I mostly came from an electronics background (or at least had one), so the "everything is adding" idea (addition is adding, subtraction is a type of adding, multiplication is long-form adding, division is subtraction if you use logarithms which is a type of adding, comparisons are adding) and building up from basic logic steps is nothing unusual (build any gate from a collection of NAND gates, build an adding machine, a flip flop, a shift register, a set of shift registers; if you know C then I assume you can skip all the ceiling/floor/float/shift/rotate/signed... stuff, and hopefully you also have maths as far as log tables and trigonometry tables). Whether you then want to do a little crash course in digital electronics and digital logic I don't know, but it usually puts people in reasonable stead, especially when it comes time to start playing with extra hardware on carts/addons/the system in question that might be rather basic in what it does but is still digital electronics.
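To make the "everything is adding" idea concrete, here is a minimal C sketch (function names are mine, purely for illustration): subtraction as adding the two's complement, and multiplication as batched repeated addition.

```c
#include <stdint.h>

/* Subtraction as adding: a - b == a + (~b + 1), i.e. add the
 * two's complement of b. */
static uint32_t sub_by_add(uint32_t a, uint32_t b) {
    return a + (~b + 1u);
}

/* Multiplication as long-form adding: shift-and-add, one addition
 * per set bit of the multiplier. Even the doubling is just a + a. */
static uint32_t mul_by_add(uint32_t a, uint32_t b) {
    uint32_t result = 0;
    while (b) {
        if (b & 1u)
            result += a;   /* conditionally add this power of two */
        a += a;            /* double: shifting is adding to itself */
        b >>= 1;
    }
    return result;
}
```

So `sub_by_add(10, 3)` comes out as 7 and `mul_by_add(6, 7)` as 42, with nothing but addition and bit tests underneath.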
Most high level languages are working around the limitations of using bits to store information in what is anything but a binary universe, and abstracting that away from people to free them up to think about bigger things. CPUs feed back into that as newer classes of information processing come into vogue and get put in silicon, rather than burning hundreds of instructions on what might be done in one, or indeed having hundreds of effectively identical operations done in one (SIMD = single instruction, multiple data after all).
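You can get a software taste of the SIMD idea even without a SIMD unit, using the so-called SWAR trick ("SIMD within a register"); a sketch, with the function name being mine:

```c
#include <stdint.h>

/* Treat one 32-bit word as four independent byte lanes and add all
 * four pairs at once. Add the low 7 bits of each lane so no carry
 * crosses a lane boundary, then patch the top bit of each lane back
 * in with XOR (adding 0x80 mod 256 is the same as XOR with 0x80).
 * Real SIMD hardware does this across much wider registers. */
static uint32_t add_four_bytes(uint32_t a, uint32_t b) {
    return ((a & 0x7F7F7F7Fu) + (b & 0x7F7F7F7Fu))
         ^ ((a ^ b) & 0x80808080u);
}
```

For example, adding the lanes {0x04,0x03,0x02,0x01} to {0x28,0x1E,0x14,0x0A} gives {0x2C,0x21,0x16,0x0B} in one pass, and a lane that overflows wraps within itself instead of carrying into its neighbour.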
Analogies are not usually how I would view things for assembly either. It is not without merit (decompilation is sure to be really fun over the coming decades), but I think instead in terms of processes/systems in the classical machine sense (a car has a drive system, power system, braking system, control system... and they interact and influence each other at those points) or the biology sense of the term. Usually then it is maths, housekeeping aka basic IO (fiddling with registers, flags, memory, states and what have you; get a bit further* and you can also wind in a bit of security and operating systems**) and program flow.
*not that all but the most recent consoles really have it.
**it cuts the other way as well. Once you have to do the whole push and pop routine to make a function call or something, you will quickly understand why you are told it is bad form to call a subroutine within a subroutine.
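The push and pop point can be sketched in C with a toy model of an ARM-style link register (this is not any real ABI; every name here is invented for illustration): a call puts the return address in lr, so a routine that calls another routine must push lr first or it loses its own way home.

```c
/* Pretend machine state: a link register plus a manual stack. */
static int lr = 0;              /* pretend link register */
static int stk[16];             /* pretend stack memory */
static int sp = 0;              /* stack pointer */

static void push(int v) { stk[sp++] = v; }
static int  pop(void)   { return stk[--sp]; }

static int leaf(void) { return lr; }   /* a leaf call leaves lr alone */

/* Correct form: save lr around the inner call. */
static int non_leaf(void) {
    push(lr);                   /* save our own return address */
    lr = 42;                    /* the inner "BL" clobbers lr... */
    (void)leaf();
    lr = pop();                 /* ...so restore it before returning */
    return lr;
}

/* Bad form: a nested call with nothing saved. Whatever return
 * address we were given is gone; we would "return" to 42 instead. */
static int broken_non_leaf(void) {
    lr = 42;
    (void)leaf();
    return lr;
}
```

If the caller stashed 7 in lr, `non_leaf` hands back 7 as it should, while `broken_non_leaf` hands back 42: it has forgotten where it came from, which is exactly the bug the push/pop ceremony exists to prevent.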
Maths then is one of the big ones, and that usually just comes down to some CPU designer having to do every case of an add, sub, divide, multiply... operation for every length of signed, float, fixed point, boolean, simple flag or whatever they figured was necessary and within their transistor budget. The 6502 peeps scare me with having registers be unique special things rather than the list of equally viable options (for most purposes) that I view ARM and x86 registers as, but in the end there are so few registers in the 6502 that it makes some sense to do it that way, and underneath it all I know that if I looked at the transistors I would see the same thing.
Some also find it valuable to figure out what is missing (the ARM cores in the GBA and DS lack a divide instruction, and both do poorly for anything floating point). Some also reckon more limitations are good, and it is not a position without merit, but I would sooner train my mind to avoid doing certain actions on certain systems than have to remember to use features that are only available on another.
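When the hardware lacks a divide instruction, division gets done in software, and it really is subtraction underneath; a sketch of the classic shift-and-subtract (restoring) long division, with the function name being mine:

```c
#include <stdint.h>

/* Unsigned division with no divide instruction: repeated
 * subtraction, batched by powers of two so it finishes in 32
 * steps instead of `quotient` steps. */
static uint32_t div_by_sub(uint32_t n, uint32_t d) {
    uint32_t q = 0, r = 0;
    for (int i = 31; i >= 0; i--) {
        r = (r << 1) | ((n >> i) & 1u);  /* bring down the next bit */
        if (r >= d) {                    /* does the divisor fit? */
            r -= d;                      /* subtract it out */
            q |= 1u << i;                /* record a 1 in the quotient */
        }
    }
    return q;
}
```

So `div_by_sub(100, 7)` yields 14, with the remainder 2 sitting in `r` at the end, and the whole thing is just compares, shifts and subtracts: the kind of routine a GBA-era toolchain or BIOS has to supply for you.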
Spend some time playing hacker for games, either hardcoding cheats or tweaking things slightly: cause it to miss a calculation (congratulations, there is no more stupid random luck variable in what might otherwise have been a pure game), tweak some maths just slightly, add something a bit more random to something... and most of the rest will fall into place.
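The "cause it to miss a calculation" mechanic boils down to overwriting the offending instruction with a NOP. A heavily simplified sketch (the opcode values are invented, not from any real CPU, and real patching would match whole instructions at known addresses rather than scanning for single bytes):

```c
#include <stdint.h>
#include <stddef.h>

enum { OP_NOP = 0x00 };  /* made-up NOP encoding */

/* Replace every occurrence of a chosen instruction byte with NOP so
 * the game skips that step (say, the "subtract luck from damage"
 * calculation). Returns how many bytes were patched. */
static size_t nop_out(uint8_t *code, size_t len, uint8_t opcode) {
    size_t patched = 0;
    for (size_t i = 0; i < len; i++) {
        if (code[i] == opcode) {
            code[i] = OP_NOP;   /* this instruction now does nothing */
            patched++;
        }
    }
    return patched;
}
```

The same scan-and-overwrite shape covers the other tweaks mentioned: instead of a NOP you might write back a slightly different constant, or swap a subtract for an add.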
To answer some of the questions directly.
I am not sure what you mean by incremental. It has been a while since I delved into MIPS, but I never noted anything strange there, and a quick scan turned up nothing I would particularly call incremental.
Quick and dirty guide... that depends more on how well you know C and can guess what something is attempting to do and fit it accordingly, possibly tempered further with knowledge of how coding played out on the system in question and how it was taught back in the day. Sometimes you can augment that a bit with the output of a dynamic recompiler, but eh.
Book of comparisons... no, and I doubt there ever will be one. At very best, some PhD student working on a decompiler might have done something exhaustive for a given piece of code or the standard library from a given version of a given compiler (or maybe two), and that is more for those writing a decompiler than anything like what you might want.