Make.ts (matklad.github.io) | HN | 117 points | 5 hours ago | 17 comments
syhol
1 hour ago
My gut reaction is to rush to the comments to shill my favourite task runner (mise tasks[1], now with shell aliases[2]!), but pushing past that, the core idea of writing scripts in a file rather than at a shell prompt is a great nugget of wisdom. But I disagree with this bit:

"I want to be clear here, I am not advocating writing “proper” scripts, just capturing your interactive, ad-hoc command to a persistent file."

What's the difference? Why not version-control it and share it with colleagues? Imagine writing a unit test for a new feature, then deleting it when you're done; what a waste. OK, it's not exactly the same, because you aren't using these scripts to catch regressions, but all of that useful learning and context can be reused.

I don't think the language you use for scripting is too important as long as the runtime is pinned and easily available on all engineers' machines, perhaps using a toolchain manager like... mise[3].

[1] https://mise.jdx.dev/tasks/ [2] https://mise.jdx.dev/shell-aliases.html [3] https://mise.jdx.dev/dev-tools/

pzmarzly
4 hours ago
This is the way. Shell makes for a terrible scripting language; I usually start regretting choosing it around the time I have to introduce the first `if` into my "simple" script, or when I have to do some more complex string manipulation.

At least nowadays LLMs can rewrite Bash to JS/Python/Ruby pretty quickly.

kh_hk
4 hours ago
Well, at least I will be able to run my bash scripts in 5 years

g947o
44 minutes ago
I don't know Ruby, but chances are that your Python/JavaScript scripts are going to run in 5 years as well, if you stick to the standard library.

ChrisGreenHeur
13 minutes ago
and then your mamba changes

nilamo
6 minutes ago
What does that even mean?

pzmarzly
3 hours ago
Fair. My bash scripts have only broken 3 times over the years:

- when ls started quoting filenames with spaces (add -N)

- when perl stopped being installed by default on CentOS and AlmaLinux (had to add dnf install -y perl)

- when the egrep alias disappeared (use grep -E)

greener_grass
2 hours ago
Bash is not a great cross-platform choice. Too many subtle differences.

The best way is a scripting language with a locked-down dependency spec inside the script itself. Weirdly, .NET is leading the way here.

Imustaskforhelp
2 hours ago
Python with uv seems decent here too.

kh_hk
2 hours ago
Python EOLs releases after 5 years. I guess old versions are readily available for downloading and running with uv, but at that point you are on your own.

Bash is glue, and for me, glue code must survive the passage of time. The moment you use a high-level language for glue code, it stops being glue code.

gf000
4 hours ago
For some value of "run", because I'm hella sure it has quite a few serious bugs no matter what, starting from escaping, or just a folder being empty (or having files) unlike when the script was written, causing it to break in a completely unintelligible way.

kh_hk
4 hours ago
I guess we have wildly different expectations of what a language is responsible for and what it isn't.

frizlab
4 hours ago
I use Swift! I even (re-)wrote swift-sh[0] to make it possible to import external modules in a script (à la uv).

[0] https://github.com/xcode-actions/swift-sh

sureglymop
3 hours ago
Agreed. The shell is great for chaining together atomic operations on plain text; that is to say, it is great for one-liners that do that. The main reason probably isn't that it all operates on plain text, but how easy it makes it to start processes, do process substitution, redirections, etc.

As soon as you have state accumulating somewhere, branching, or loops, it becomes chaotic too quickly.
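
For what it's worth, the Deno-plus-dax approach from the article keeps that kind of control flow as plain TypeScript. A minimal sketch (the package directories in the loop are hypothetical):

    import $ from "jsr:@david/dax";

    // Ordinary TypeScript control flow wrapped around shell commands:
    for (const pkg of ["app", "lib"]) {
      const status = await $`git status --porcelain ${pkg}`.text();
      if (status.length > 0) {
        console.log(`${pkg} has uncommitted changes`);
      }
    }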

ctenb
2 hours ago
"Just" is exactly made for this, and it is amazing. You write a justfile that is somewhat similar to a makefile but without the painpoints and it provides a CLI interface of commands you want to run

nilamo
2 minutes ago
I was a Just enjoyer for quite a while, until I tried mise. Mise does all the same things as just, but also has source/output tracking to avoid rerunning build jobs (like make), and bundles runtimes the way asdf does. It's become my all-in-one task runner of choice.

epaga
4 hours ago
It's almost depressing to me how much this post feels like a breath of fresh air, if for nothing else than because it's clearly hand-written, not ghost-written by an LLM.

No repetitive short sentences, no "Not X, just Y" patterns, and lots of opinionated statements, written confidently in the first person.

Please, more of this.

raincole
1 hour ago
Completely off-topic, but I recently had my "AI-depression moment" when I found out that the domain writer.com is now owned by an AI company.

hsbauauvhabzb
2 hours ago
It’s also relatively short and concise :)

forty
4 hours ago
In the web/JS/TS ecosystem, most people use npm scripts in package.json rather than a custom make.ts. Scripts you launch from there can be in any language, so nothing prevents you from using TS shell scripts if that's your thing.

Another quite standard way of saving your command history in a file, one I have seen used in all ecosystems, is called "make". It even saves you a few characters when you have to type it, and at least people don't have to discover your custom system, autocomplete works out of the box, etc.

Cthulhu_
2 hours ago
The main downside to putting scripts into package.json (or NX's project.json) is that you have to wrap them in JSON. That is fine for simple commands, but when you start adding stuff like quotes or multi-command invocations, it gets a bit busy.

I quite like make or just as a task runner, since the syntax/indentation/etc. overhead is a lot lower. I haven't yet tried to introduce either into any JS-based projects though, because it adds yet another tool.

forty
27 minutes ago
I put any sufficiently complex command into scripts/<command>.sh and keep the package.json as light as possible.

One very big upside of package.json for me is that we use pnpm, which has a very sophisticated way of targeting packages with --filter (like "run tests for packages that have modifications compared to master, plus all their transitive dependents", which is often exactly what you want to do).

soulofmischief
3 hours ago
My monorepos have become increasingly multilingual over the years, often due to dependencies, and it's not uncommon to find a Makefile, Cargo.toml, package.json, deno.json, venv + requirements.txt, etc. all living in the same root.

Coming from a web background, my usual move is to put all scripts in the package.json, if present. I'd use make for everything, but it's overkill for a lot of stuff and non-standard in a lot of the domains I work in.

embedding-shape
3 hours ago
> My monorepos have become increasingly multilingual over the years, often due to dependencies, and it's not uncommon to find a Makefile, Cargo.toml, package.json, deno.json, venv + requirements.txt, etc. all living in the same root.

Same!

My usual move used to be putting everything in a Makefile, but after getting traumatized time and time again by ever-growing complexity, I've started to embrace Just (https://github.com/casey/just), which is basically just a simpler Make. I tend to work across teams a lot, and a Makefile/justfile seems easier for people to spot at a glance than scripts inside a package.json, which mostly frontend/JavaScript/TypeScript people know to look at.

But in the end I think it matters less what specifically you use, as long as you have one entrypoint that collects everything. It could be a Makefile, a justfile, or a package.json, as long as everything gets put under the same thing. It could be a .sh for all I care :)

oulipo2
3 hours ago
Mise is also very nice (for dependencies and for scripts) https://mise.jdx.dev/

tcoff91
1 hour ago
How am I the first person to mention fzf?

Just integrate fzf into your shell and use ctrl-r to instantly summon a fuzzy shell history search and re-execute any command from your history!

I cannot imagine going back to using a terminal without this.

I still write plenty of scripts when I need to repeat multi-command processes, but for one-liners I just use fzf to re-execute them.

Also in a shared project you can ignore script files with .git/info/exclude instead of .gitignore so you don’t have to check in your personal exclusion patterns to the main branch.

Seriously, people: if you use a terminal, you need the following tools to dominate the shell:

ripgrep, zoxide, fzf, fd

spiffytech
50 minutes ago
I can't believe how long I was sleeping on fd and zoxide. zoxide is now one of my top commands, and fd feels like when I switched to ripgrep. So fast and easy there's no reason not to run it.

tcoff91
44 minutes ago
Zoxide is incredible! Going from cd to zoxide is like going from walking to driving an F1 car around the directory tree.

I made a function called y that is like the z function but is git worktree / jj workspace aware. So useful!

arnorhs
4 hours ago
I mostly have my scripts in the package.json "scripts" section, but sometimes the scripts invoked are actually .ts files, and sometimes just bash if that makes more sense.

Though I generally run these scripts using Bun (and the corresponding `$` in Bun). It's basically the same thing, but I just prefer Bun over Deno.

mcapodici
2 hours ago
If you want it to be an alternative to shell history then ~/make.ts is better, since that'll be the same wherever you are.
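
A minimal sketch of what such a global ~/make.ts could look like, using Deno and the dax `$` the article uses (the task names and commands are hypothetical placeholders):

    #!/usr/bin/env -S deno run --allow-all
    import $ from "jsr:@david/dax";

    // Dispatch on the first CLI argument: `make.ts <task>`.
    const [task] = Deno.args;
    switch (task) {
      case "serve":
        await $`python3 -m http.server 8000`;
        break;
      case "todos":
        await $`git grep -n TODO`;
        break;
      default:
        console.error(`unknown task: ${task}`);
        Deno.exit(1);
    }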

matklad
2 hours ago
Thanks, I hadn't considered this! My history is usually naturally project-scoped, but I bet I'll find ~/make.ts useful now that I have it!

nextaccountic
2 hours ago
Using backticks to interpolate command arguments is very clever! What's missing is a discussion of how quoting works (for example, how to ls a directory with spaces in its name).

Anyway, what kills this for me is the need to add await before every command.
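
As far as I can tell, dax answers the quoting question itself: values interpolated into the tagged template are escaped before the command runs. A small sketch (the directory name is hypothetical):

    import $ from "jsr:@david/dax";

    const dir = "My Documents"; // note the space
    // dax escapes the interpolated value, so ls receives one argument
    // instead of two:
    await $`ls ${dir}`;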

worldsayshi
4 hours ago
It sounds like at least some of the problems pointed out would be mitigated by using fzf. It has greatly improved my terminal UX, at least.

theanonymousone
3 hours ago
I already do it, but not in TS. There is a scripting language that is as available in most/all (non-Windows) systems as Bash: Python.

Edit: zero-dependency Python.

verdverm
3 hours ago
That's all well and good until you need a dependency; then you need to do all the same project setup as normal.

I stopped using Python for scripting for this reason.

GeneralMaximus
3 hours ago
If you use `uv`, you can declare your dependencies at the top of a script: https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...

I've started using Python for many more tasks after I discovered this feature. I'm primarily a JS/TS developer, but the ability to write a "standalone" script that can pull in third-party dependencies without affecting your current project is a massive productivity boost.

theanonymousone
3 hours ago
I don't think you need any dependencies to match Bash scripting in capability.

hsbauauvhabzb
2 hours ago
You can even wrap shell / system commands in python and capture the output, so it’s basically a superset!

kh_hk
1 hour ago
You can also inline python inside shell scripts, does that make them equal sets? :)

    life() {
      python3 << EOF
    print(42)
    EOF
    }

IshKebab
5 hours ago
This is one of Deno's killer use cases IMO. 100x better than shell scripting and like 5x better than Python scripting. Python should be good for this sort of thing, but it isn't.

Historically we had to use pip, which was super janky. uv solves most of pip's issues, but you still have to deal with venvs, and one issue it doesn't solve is that you can't import by relative file path, which is something you always end up wanting for ad-hoc scripting. You can use relative package paths, but that's totally different.
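
In Deno, by contrast, a relative file-path import in a one-off script just works, with no manifest or package structure. For example (file names hypothetical):

    // Assuming helper.ts sits next to the script and contains:
    //   export const greet = (name: string) => `hello ${name}`;
    import { greet } from "./helper.ts";

    console.log(greet("world"));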

PurpleRamen
3 hours ago
> you can't do imports by relative file path

Just add the targeted path to sys.path, or write your own import handler; importlib might help there. But true, out of the box, imports in Python 3 are a bit wacky for more flexible usage.

IshKebab
1 hour ago
Both of those are horrible and break all tooling. Deno's imports work properly.

PurpleRamen
1 hour ago
> Both of those are horrible and break all tooling.

No, they don't. Tooling is fine with those things.

wiseowise
4 hours ago
> 5x better than Python scripting

I’m not sure about that. All those ‘await’s and parentheses really kill my mojo. Why do you find it better than Python?

IshKebab
4 hours ago
> Why do you find it better than Python?

I said already: the main reason is that you can import files by relative file path.

You can get close to the Deno UX with uv and a script like this:

  #!/usr/bin/env -S uv run --script
  #
  # /// script
  # requires-python = ">=3.12"
  # dependencies = ["httpx"]
  # ///
  import httpx
  print(httpx.get("https://example.com"))
But you still have to deal with the venv, e.g. for IDE support, linting and so on. It's just more janky than Deno.

I wish someone would make a nice modern scripting language with arbitrary-precision integers, static types, file-path imports, third-party dependencies in single files, etc. Deno is the closest thing I've found, but in spite of how good TypeScript is, there are still a ton of JavaScript warts you can't get away from (`var`, `==`, the number format, the prototype system, the janky map/reduce design, etc.)

fainpul
2 hours ago
PowerShell is pretty good for shell scripting.

  iwr https://example.com
You also have arbitrary precision integers and all the other stuff from .NET

  $b = [BigInt]::Parse('10000000000000000000000000000000000000000000000000')

IshKebab
1 hour ago
PowerShell has god-awful syntax though. There's no way I'd want to do anything remotely significant with it.

flohofwoe
4 hours ago
Heh, I went down that same rabbit hole recently, but in addition to 'shell scripting tasks' I also describe a whole C/C++ build in Deno-flavoured TS instead of wrestling with CMake syntax: https://github.com/floooh/fibs - and, while at it, also allow integrating build jobs written in TypeScript into the C/C++ build.

...this is the same sort of 'works for me' philosophy as in matklad's post though; it's so heavily opinionated and personalized that I don't expect other people to pick it up, but it makes my day-to-day work a lot easier (especially since I switch between macOS, Linux and Windows multiple times on a typical day).

I'm not sure if Bun can do it too, but the one great thing about Deno is that it can directly import without requiring a 'manifest file' (e.g. package.json or deno.json); you can do something like this right in the code:

    import { Bla } from 'jsr:@floooh/bla@^1';

This is just perfect for this type of command-line tool.

jauntywundrkind
5 hours ago
Zx is great. Really easy scripting!

This article used dax instead, which also looks fine! https://github.com/dsherret/dax
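
For reference, a minimal zx script looks something like this (zx also escapes interpolated values; the commands are arbitrary examples):

    #!/usr/bin/env zx
    import { $ } from "zx";

    const branch = await $`git branch --show-current`;
    await $`echo current branch is ${branch}`;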

pzmarzly
5 hours ago
There is also the built-in Bun shell library, which I liked. https://bun.com/docs/runtime/shell
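
For comparison, a minimal Bun shell sketch (the file name is a hypothetical example):

    import { $ } from "bun";

    // Bun's $ also escapes interpolated values:
    const file = "notes with spaces.txt";
    await $`ls -l ${file}`;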

Imustaskforhelp
2 hours ago
Agreed, I was looking for this comment. Bun shell is amazing, although I sometimes (not always) had trouble getting LLMs to write it, but overall Bun shell is really cool.

One of my projects actually uses Bun shell to call a Rust binary from a website itself, and I really liked this use case.

drcongo
3 hours ago
I use mise for this, as it then also gives you a handy `mise tasks` command so you can see what commands are available and what they do. Mise has been a real game-changer for my ailing memory.

chanux
3 hours ago
Any good write-up about this that you can recommend, please? I have been struggling to get on the mise tasks train.

throwaway290
5 hours ago
> I have definitelly crossed the line where writing a script makes sense

...and that was also the one concrete example where it makes sense to have an extra dependency and abstraction layer on top of a shell script :)

Say you know TS: even if you walk back to where $ is defined, can you tell immediately why $`ls {dir}` gets executed and not just logged?

supernes
4 hours ago
You can make it more explicit by renaming the import to something like "shell_exec". Tagged templates are already pretty common in TS projects for things like gql or sql queries.
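
To the question above: a tagged template is just a function, and nothing stops that function from spawning a process. A stripped-down sketch of the idea in Deno (not dax's actual implementation; the real `$` escapes values and, as far as I know, only runs the command once awaited):

    // Hypothetical mini-$: concatenate the template, then run it via sh.
    function sh(strings: TemplateStringsArray, ...values: unknown[]) {
      const cmd = strings.reduce(
        (acc, part, i) => acc + part + (i < values.length ? String(values[i]) : ""),
        "",
      );
      return new Deno.Command("sh", { args: ["-c", cmd] }).output();
    }

    const out = await sh`ls ${"/tmp"}`;
    console.log(new TextDecoder().decode(out.stdout));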

throwaway290
4 hours ago
A tagged template does not cause execution of the given string; a tagged template is just a function, and in this case it's simply a proxy for console.log(), which also doesn't cause execution of the given string.

So how does it get executed?

Unless it was just an example and you are supposed to swap in a $ from some third-party library... which is another dependency in addition to Deno... and which can be shai-huluded anytime, or you may be offline and unable to install it when you run the script?

supernes
4 hours ago
Yes, it's another dependency (dax). The example with console.log is just that, an example. Standard dependency management practices apply, e.g. pinning a version/commit hash.
throwaway290
3 hours ago
That explains it :) Maybe the original article deserves a clarification.

doanbactam
5 hours ago
Does it track file hashes or just timestamps? This looks like a solid middle ground between npm scripts and a full-blown CI system. I've always hated the tab syntax in GNU Make, so a typed alternative is appealing.
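
As the reply below notes, the article's make.ts doesn't really cache anything; it just runs whatever the script says. If you wanted make-style staleness checks, a small timestamp-based helper is easy to sketch (the helper and paths are hypothetical):

    // Rebuild only if any source is newer than the output (mtime-based).
    async function stale(output: string, sources: string[]): Promise<boolean> {
      const outInfo = await Deno.stat(output).catch(() => null);
      if (!outInfo?.mtime) return true; // output missing: definitely rebuild
      for (const src of sources) {
        const srcInfo = await Deno.stat(src);
        if (srcInfo.mtime && srcInfo.mtime > outInfo.mtime) return true;
      }
      return false;
    }

    if (await stale("out/app.js", ["src/main.ts"])) {
      // run the build command here
    }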

hdjrudni
4 hours ago
I don't think you understand what he's proposing here. This isn't really a replacement for Make at all. This is just using Deno to run random script files.

forty
4 hours ago
There are two things in the article: a kind of make alternative to "save your command history" (basically avoiding retyping large commands), and how they use TS to write shell scripts.