The architectural decision everything else follows from: a decimal calculator should store numbers as BCD, one decimal digit per 4-bit nibble. A standard byte-oriented CPU (Z80, 6502) fights that layout constantly. So I designed a small custom CPU in Verilog where 4 bits is the natural data width and memory is nibble-addressable.
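To make the layout concrete, here's a minimal Python sketch of the digit-per-nibble idea (my illustration only, not the project's code; the hardware keeps the digits in registers and RAM, not lists):

    # Pack a non-negative decimal integer into BCD nibbles,
    # one digit per 4-bit nibble, least-significant digit first.
    def to_bcd(n):
        nibbles = []
        while True:
            nibbles.append(n % 10)   # each digit 0-9 fits in 4 bits
            n //= 10
            if n == 0:
                return nibbles

    def from_bcd(nibbles):
        return sum(d * 10 ** i for i, d in enumerate(nibbles))

    assert to_bcd(2024) == [4, 2, 0, 2]
    assert from_bcd(to_bcd(2024)) == 2024

On a byte-addressed CPU you end up packing two of those digits per byte and masking/shifting on every access; with nibble-addressable memory each digit is simply its own word.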
What the project covers:
- Custom CPU: Harvard architecture, 12-bit ISA, 8-state execution FSM, hardware stack guard with a FAULT state for microcode debugging
- CORDIC for trig functions, verified to 14 significant digits (a rough sketch of the idea follows this list)
- Two-pass assembler in Python (~700 lines)
- Verilator + Qt framework: same Verilog source runs in simulation, as a desktop GUI debugger, as WebAssembly, and on real hardware
- Scripting language on top of the microcode for adding functions without touching hardware
- Custom PCB (EasyEDA/JLCPCB), battery, charging circuit
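On the CORDIC item, for anyone unfamiliar with it: here's a rough Python sketch of the rotation-mode idea, just to show the shape of the algorithm. It uses floats for readability; the real thing runs as BCD microcode on the calculator's registers.

    import math

    # Rough CORDIC sketch (rotation mode): start from a pre-scaled unit
    # vector and rotate toward angle z in fixed arctan(2^-i) steps.
    # x, y converge to cos(z), sin(z) for |z| up to ~1.74 rad.
    def cordic_sin_cos(z, iterations=40):
        # Pre-scale by the CORDIC gain so no per-step multiply is needed.
        k = 1.0
        for i in range(iterations):
            k /= math.sqrt(1.0 + 2.0 ** (-2 * i))
        x, y = k, 0.0
        for i in range(iterations):
            d = 1.0 if z >= 0.0 else -1.0          # rotate toward the target
            x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
            z -= d * math.atan(2.0 ** -i)
        return y, x                                # (sin z, cos z)

    s, c = cordic_sin_cos(math.radians(30))
    print(round(s, 10), round(c, 10))              # ~0.5  ~0.8660254038

The per-step work is just shifts, adds, and a small arctan table lookup, which is why CORDIC is such a good fit for a small CPU.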
Write-up: https://baltazarstudios.com
Hackaday: https://hackaday.com/2026/05/13/build-the-cpu-then-build-the...