My new uni completely rejected my previous uni's intro CS curriculum and forced me to retake their version: a Lisp-then-Java sequence that was pure theoretical whiplash. I resented it at first. Who wants to retake a class on topics they "already" learned?
By my second year, I saw the light. I understood the approach.
While my high school friends at other unis had courses built around a particular topic or a particular language, the uni I was at had a policy of "languages are incidental" for most courses. Professors kept a list of the languages they would accept for projects. "Anything but Perl" was common, although "systems" courses required C. Even in a class with a required language, the language itself was not taught during class. There would be an optional lab you could attend to learn at least one of the languages allowed for the course. The expectation was that you could learn enough in two or three weeks before the first project kicked off, and the TAs really were there to help you figure it out (most courses had 20-30 students for 2-3 TAs).
The "bookend" intro course that taught extreme FP and extreme OOP served as a forcing function that made me realize "Oh, languages really don't matter. Just pick up the details for the paradigm, and get on with the work."
As much as I resented having to "redo" an intro course cycle at first, I am forever grateful for the fearlessness that the overall experience instilled in me.
It always struck me as strange that universities never had a course teaching students to work with open source code. As in: grab the repo of a popular open source project, read part of it, and do your best to contribute to it.
The lectures should be about different open source projects and their design choices.
I could see LLMs affecting that, though. Their ability to output shitty yet somewhat functional skeletons that you then refine manually is just spot on for this.
There are problems, like: do you teach Python or C? It sounds silly, but the difference is not really about languages, it's about how much you teach about the underlying system. Teaching Python gets people going and producing faster, which does help students get less discouraged. But teaching C forces learning about the computer system and enables students to dive deeper and teach themselves the many subtopics that no 4-year program can cover.
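For a concrete (and hedged) illustration of the Python side of that trade-off: a typical first-week exercise is a few lines, which is exactly why it's encouraging, and exactly what it hides.

```python
# Toy first-assignment exercise: average a list of numbers.
# Python hides what a C version of the same exercise would force
# you to face: declaring types, sizing an array, parsing input by
# hand, and integer vs. floating-point division.

line = "3 1 4 1 5 9"                      # stand-in for user input
numbers = [float(tok) for tok in line.split()]
print(sum(numbers) / len(numbers))        # 3.8333...
```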
What I think is generally missing, and would be good to implement, is code review and teaching how to understand a large existing codebase (all that grep, find, profilers, traces, tags, and all that jazz). This often gets taught in parallel (e.g., having students review each other's code), but it's hit or miss, a lot of extra work, and not everyone does it.
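The shell tools do this better, but as a sketch of the skill being taught, here's a hypothetical grep-flavored helper (names are mine): walk a repo and list every line mentioning an identifier.

```python
# Hypothetical sketch of the "find every use of X in a big codebase" skill.
import os

def find_uses(root, needle, ext=".py"):
    """Yield (path, line number, line) for every line containing `needle`."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(ext):
                continue
            path = os.path.join(dirpath, name)
            with open(path, errors="ignore") as f:
                for lineno, line in enumerate(f, 1):
                    if needle in line:
                        yield path, lineno, line.rstrip()

for path, lineno, line in find_uses(".", "TODO"):
    print(f"{path}:{lineno}: {line}")
```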
Here's the shitty part: I was often told by peers and people higher up, "don't look at students' code, just look at the output and run tests." I always ignored that advice, because it is why we're failing so many students. But I also understand it, because professors are overburdened. There's too much work to do, and teaching isn't even half the job. And with every new administrator or "office assistant" they hire, the more work you have (seriously, it'll take days to book a flight because you have to use some code, but it takes 2 days for someone to tell you the code and 5 more to tell you it was the wrong code, and it's clearly your fault because you clicked "book flight" instead of trips > booking > flights > schedules > trips > access code > flights > search available flights). Honestly, I think all this LLM agent stuff would sound silly if people actually knew how to design things...
This isn't a plug/whatever - just good content from an old friend.
Data, data, data :))) Some basic notions to know: Input → Computation → Output
Information is omnipresent (this is just an intuition, not a claim). It serves as both input and output.
Computation—also known as a procedure, function, set of instructions, transformation, method, algorithm, or calculation.
In my early days, I ignored the fundamental notion of data and procedures. But eventually, it clicked: Programs = Data + Instructions
Watch Feynman on computing—he even starts with the same concept of data, introducing computers as information processing systems. And processing requires algorithms (i.e., instructions or procedures).
Programming is simply writing instructions for the computer to perform computations.
A computer is just a machine for computing.
Computation is a general idea: a transformation of one form of information into another.
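A minimal sketch of that framing (my toy example, not from the links below): data flows in, the instructions transform it, data flows out.

```python
# Input -> Computation -> Output, in the smallest possible program.

def computation(data):
    # The "instructions": transform one form of information into another.
    return [x * 2 for x in data]

input_data = [1, 2, 3]                 # input: information coming in
output_data = computation(input_data)  # computation: the transformation
print(output_data)                     # output: [2, 4, 6]
```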
Richard Feynman Computer Science Lecture: https://www.youtube.com/watch?v=EKWGGDXe5MA
Old documentary on programming: https://www.youtube.com/watch?v=dFZecokdHLo
George Hotz video: what is programming? https://www.youtube.com/watch?v=N2bXEUSAiTI
https://denninginstitute.com/pjd/GP/gp_overview.html
https://htdp.org/2003-09-26/Book/curriculum-Z-H-5.html#node_...
Learning to program without knowing the language is useless and counter-productive.
Of course, this doesn't mean you have to learn 10+ languages first... but you have to learn a real programming language (not a toy one) before you can learn to program.
Edit: * a language
Which language is "the language"? A competent programmer can think about programming and reason about programs written in most languages without knowing that particular language intimately. (There are exceptions that push outside the normal algorithmic notation of the Fortran, C, Java, JS, Common Lisp, Rust, Go, etc. family of languages; but those are minority languages, and a competent programmer shouldn't need more than a short period of time to become literate, if not expressive, in them.)
> A competent programmer can think about programming and reason about programs written in most languages without having to know that particular language intimately
That's because the programmer already learned how to program.
But when they started, they definitely didn't spend months or years writing only pseudocode that they couldn't run to see the results.
GT started students that way, and it worked well for years: a full semester taught with only pseudocode (the course number varied, but it was the CS 101 course, 1301/1311/1501 or something like that). They got rid of it because of appearances, trying to be like every other school out there, eventually settling on Python, I think, after a brief stint with Scheme (which ended after a major cheating scandal).
You need to learn to leetcode in pseudocode first.
I never see anyone learning to program using pseudocode (which you can't run to get feedback).
If they used pseudocode, did they just run the program in their heads?
In Software Tools, Brian Kernighan and P.J. Plauger describe a pseudo-language called RATFOR (Rational Fortran), and then over the course of the book implement RATFOR in itself.
Getting feedback while learning to program has a lot of value, but so does learning to think through code in your head. People old enough to remember when you had to wait a day to run your program and get results back (very slow turnaround) know the value of that skill. We used to call it "desk checking": reading through your code and running it in your head and on paper.
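For anyone who never saw it done: a small made-up example of what desk checking looks like. The trace in the comments is the part you'd do on paper before ever touching the machine.

```python
# Desk-checking a tiny loop: run it on paper before you run it for real.

def mystery(n):
    total = 0
    i = 1
    while i <= n:
        total += i
        i += 2
    return total

# Hand trace for n = 5 (the desk check):
#   i=1 -> total=1
#   i=3 -> total=4
#   i=5 -> total=9
#   i=7 -> loop exits (7 > 5)
# Prediction: mystery(5) == 9. Only then do you run it to confirm.
assert mystery(5) == 9
```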
This is itself a skill people need to learn, that I'm not sure is possible with pseudocode and no prior experience. Too easy to gloss over details without actually running it to learn where your blind spots are.
I did a workshop a decade or so ago where I learned my co-workers don't do this, and I never did learn how they understand code otherwise. One of them mentioned he didn't even realize this was a thing.
I haven't seen this pedagogical practice in any other introductory course since. I believe it's a holdover from the early days of computing, when programmers didn't have access to personal computers or even interactive computing, which meant they needed to spend more up-front time on design. Think of the punchcard era, for example.
I teach introductory programming in C++ at Ohlone College in Fremont, and I have my students write C++ on Day 1, starting with “Hello World” and going from there without flowcharts.
Now they teach the language, but you just ask agents to check the code's accuracy and rarely read it yourself.
Only a few devs ever wrote new algorithms, and only a few devs will now write the actual new code. Those few don't need courses, but all the other devs need to pretend they're part of these "few," so they need all the courses, just in case...
If you want to be a well-paid developer, learn leetcode.
Just "hey nobody can understand why that line is the way it is, what should we do about that" is probably one of the basic building-block skills of developing on a team, and you teach it wholly by abusing prima donna cowboys until they write something legible or quit.
You can get all these fundamentals for free and probably better from an LLM.