RISC-V from scratch (1/n)

January 19, 2024

RISC-V has become something of a buzzword in the electronics space over the last few years. What is it? Why has it become so popular? And, more importantly, how can I make cool stuff with it?

Disclaimer: I am by NO MEANS an expert on RISC-V. I’m just a guy who fiddles occasionally with his keyboard on random evenings.

I wound up on the ‘slow boat’ to RISC-V very much by accident. I interned at Raspberry Pi back in the summer of 2021, nearly 3 years ago, as a ‘software intern’. I was responsible for writing a bunch of embedded software to support their then-newly released RP2040 chip. I managed to finish the tasks I was given within the first few weeks and was thus discreetly pulled into the team opposite me (the ASIC team) to write some internal tooling.

During my time there, I overheard a lot of interesting words (and in many cases acronyms…because you can never have enough acronyms). I would sit in meetings and just absorb all the language: DMA, datapath, execute-in-place, and so on. I was just 18 at the time, and though I had blinked a few LEDs, all of my work was at the software level, never anything below that. I enjoyed it so much that I went back again in the summer of 2022. Suffice it to say that I learned a lot in those two summers. I walked away with a brain brimming with knowledge of embedded software and chip design, as well as an FPGA (hint hint!!!).

First appearances matter

A year or so later, I took a course at university called “Computer Systems”. The lecturer was known to be something of a stern man, and the frightening stare of death I received in my first year of university, when he was a lab leader, certainly didn’t help. Oddly enough, the extent to which you’re willing to tolerate the BS of first-year undergraduates seems inversely correlated with how good a lecturer you are, and I certainly did enjoy that computer systems course…

It was really well explained, the handouts were to the point, and the lecturer managed to cultivate a genuine interest in computing in his audience. The course textbook was a work of art as well. Luckily for me, I had already read a good portion of it as preparatory reading for my summer internship before the fall.

During the course we used the MIPS instruction set architecture (ISA) as a learning reference to understand how assemblers, instruction sets, and other related concepts worked. I had actually written some Verilog during my internship and, in a very distant corner of my mind, remembered bits of VHDL from my first year of university in 2019.

With all that I had learned in the two years leading up to this, I decided that this was a great moment to write my own CPU from scratch. The textbook explained it so clearly. Datapaths. Pipelines. ALUs. The funny clips the lecturer showed of ‘the early nerds of computing’, like Wozniak and Bill Gates huddled around their little terminals, made it all seem so fun and appealing. I excitedly compiled a binary of Icarus Verilog (a Verilog simulator) and created a Makefile and a bit of scaffolding. After a few hours, progress ground to a halt. I was scratching my head more and more often, and my excitement waned. I never once touched the GitHub repository after that day.

The sad road to failure

So it turns out that writing your own CPU is a lot more complex and involved than I thought. There were concepts I didn’t know about, but more importantly, describing a design at the register transfer level (RTL) is very different from drawing up all those fancy diagrams of a processor. You might recall that FPGA I so triumphantly received at the end of my 2022 internship. I remember walking out of the office with that little guy in my hand as if it were the World Cup trophy.

Unfortunately, it sat on my desk for nearly two years before I even got the chance to touch it. There was a huge mental block between me writing some Verilog and actually getting it running. Until now, there was no real impetus to work out the complex open-source toolchain required to get an LED blinking.

In parallel with all this, I had started to hear the term ‘RISC-V’ in more of the media I consumed: YouTube videos, news articles, Twitter posts from some respected tech-y people, and a whole host of other sources. I was curious. I also needed to brush up on my Verilog due to some new responsibilities at work, and I thought RISC-V could be an interesting way to improve my skills. As anyone learning a new skill knows, it’s much easier to learn when you have a passion project that you’re willing to dedicate hours of your free time to.

To see what RISC-V was even about, I looked up a couple of articles and blog posts online. It wasn’t until I read a copy of the RISC-V ISA specification (namely the RV32I base instruction set, version 2.0) that I realized RISC-V looked oddly familiar. It was like MIPS, only more so. Suddenly, this big scary behemoth I had heard about wasn’t so scary after all. Thus, it was decided. I was going to make my own CPU from scratch. Second time’s the charm, right?
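To make that familiarity concrete, here’s a quick Python sketch (purely illustrative, not from the spec or any of my own code) of what struck me: like MIPS, every RV32I R-type instruction is a fixed 32 bits, with each field sitting at a fixed bit position, so decoding is just a handful of shifts and masks:

    # Illustrative only: slice a 32-bit RV32I R-type instruction into its
    # fixed-position fields. The field layout below comes straight from the
    # RV32I base instruction set; the function name is my own invention.
    def decode_rtype(insn: int) -> dict:
        return {
            "opcode": insn & 0x7F,          # bits 6:0
            "rd":     (insn >> 7) & 0x1F,   # bits 11:7, destination register
            "funct3": (insn >> 12) & 0x7,   # bits 14:12
            "rs1":    (insn >> 15) & 0x1F,  # bits 19:15, first source register
            "rs2":    (insn >> 20) & 0x1F,  # bits 24:20, second source register
            "funct7": (insn >> 25) & 0x7F,  # bits 31:25
        }

    # 0x003100B3 encodes "add x1, x2, x3" in RV32I.
    print(decode_rtype(0x003100B3))

Every R-type instruction decodes with those same six slices; no variable-length encodings, no special cases. That regularity is a big part of why the spec felt so approachable after a course built on MIPS.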

To be continued…

Read more about my RISC-V journey in the next journal entry.