"Arvados is a modern open source platform for managing and processing large biomedical data. By combining robust data and workflow management capabilities in a single platform, Arvados can organize and analyze petabytes of data and run reproducible and versioned computational workflows."
The primary thing with any workflow system is to see what it abstracts and whether that matches your needs. Airflow would never be suitable for what I need (diverse tools in many languages, some needing conflicting library versions); Arvados would work, but I prefer WDL, Snakemake, or even Nextflow. Snakemake for incrementally building up a prototype pipeline or a one-off analysis; WDL when I need a production pipeline that will run for years; Nextflow or CWL/Arvados when I need to fit into somebody else's culture or compute infrastructure.
Edit: and something I have never seen in any workflow system is a looping mechanism that allows testing for convergence, dynamic parameter sweeps, etc. Only the homegrown systems, built on top of cluster managers like SLURM, have been that flexible. But those homegrown systems were never quite mature and generalizable enough to release as open source, even if the company had been willing to open-source them.
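To make the gap concrete, here is a minimal sketch of the kind of convergence loop a static-DAG engine can't express: a driver that keeps re-running a step until successive parameter sets stop changing. The `run_step` function is a hypothetical stand-in for whatever a homegrown system would actually do (e.g. submit a job via `sbatch` and collect its output); here it's a toy fixed-point update so the sketch is self-contained.

```python
def run_step(params):
    # Hypothetical stand-in for one pipeline stage (in a real homegrown
    # system this would submit a cluster job, e.g. via SLURM's sbatch,
    # and parse its results). Toy update: Newton's iteration for sqrt(2).
    x = params["x"]
    return {"x": (x + 2.0 / x) / 2.0}

def run_until_converged(params, tol=1e-9, max_iters=50):
    """Re-run the step until the parameter set stops changing.

    This data-dependent loop is exactly what a fixed, up-front DAG
    cannot encode: the number of iterations is unknown until runtime.
    """
    for i in range(max_iters):
        new_params = run_step(params)
        delta = max(abs(new_params[k] - params[k]) for k in params)
        params = new_params
        if delta < tol:
            return params, i + 1
    raise RuntimeError(f"no convergence after {max_iters} iterations")

result, iters = run_until_converged({"x": 1.0})
```

The same driver shape handles dynamic parameter sweeps: replace the convergence test with logic that generates the next batch of parameter sets from the results of the last one.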
And most workflow systems that support loops/cycles could be used for that too (e.g. Cylc, ecFlow, Prefect, Orquesta/StackStorm, Covalent, etc.).