But certain managers are now very keen on making a lot of noise about just how effectively their teams are using AI. So I took my four Python scripts, which together form a pipeline that solves a scheduling problem with OR-Tools, and renamed my README.md to skill.md so agents would think it was for them.
The LLM does pretty much nothing except run the commands in order; CP-SAT does the real work and is being mistaken for AI. Yet when I demoed it, people were like:
> wow, neat, look at what's now possible in this dawning age of AI
I've not bothered to tell them that it's 1960s technology and that the AI part of it could also be adequately performed by a README of fewer than 100 words. I guess everything the managers haven't heard of is now "AI", and golly, look how effectively we're all using it.

Maybe they're not AI in the way you've heard marketing teams use it recently, but they are artificial intelligence nonetheless.
If you open any AI textbook written before 2022, there is almost surely a chapter on these methods (cf. Russell and Norvig's Artificial Intelligence: A Modern Approach).
Run a cheap batch process at night that computes the CP-SAT solution. If someone calls in sick, be prepared to run it again with more horsepower so you can update the schedule.
Metaheuristic solvers are different in that you don't need to model your problem as a mixed integer program. Instead, all the solver cares about is having a function that returns a score you can compare. This lets you model your problem however you like. Some metaheuristic techniques include Simulated Annealing, Late Acceptance Local Search, and Tabu Search.
Metaheuristic solvers may not find optimal solutions (after all, by their nature, they don't know the structure of the problem), but they produce "good enough", close-to-optimal solutions. Metaheuristic solvers tend to beat MIP and CP-SAT for vehicle routing (VRP), whereas MIP and CP-SAT are better for bin packing.
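To make "all it needs is a comparable score" concrete, here is a minimal, dependency-free simulated annealing sketch on a tiny TSP instance (everything here is a toy illustration, not production code):

```python
# Minimal simulated annealing: the "model" is just a score function,
# here tour length over a dozen random cities.
import math, random

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(12)]

def tour_length(order):
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def anneal(steps=20000, temp=1.0, cooling=0.9995):
    order = list(range(len(cities)))
    best, best_len = order[:], tour_length(order)
    cur_len = best_len
    for _ in range(steps):
        i, j = sorted(random.sample(range(len(cities)), 2))
        order[i:j + 1] = reversed(order[i:j + 1])  # 2-opt style move
        new_len = tour_length(order)
        # accept improvements always, worsenings with a temperature-based chance
        if new_len < cur_len or random.random() < math.exp((cur_len - new_len) / temp):
            cur_len = new_len
            if new_len < best_len:
                best, best_len = order[:], new_len
        else:
            order[i:j + 1] = reversed(order[i:j + 1])  # undo the move
        temp *= cooling
    return best, best_len
```

Note that nothing in `anneal` knows anything about geometry or routing; swap in any other score function and it still runs.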
If you want to try a metaheuristic solver, I can recommend Timefold, which allows you to define your constraints using your domain objects in an incremental manner (it has an SQL/Java-Streams-like syntax, which in my opinion is more readable than formulas) (disclosure: I work for Timefold).
This basically means you have two choices:
1. Translate the constraints from the new language to Java bytecode at runtime.
2. Translate the entire solver to a new language.
We did (1) for a while for CPython, but since CPython bytecode constantly changes and breaks (and is so poorly documented), it was a nightmare to maintain. You can find a blog post where I explain it in more detail here: https://timefold.ai/blog/java-vs-python-speed. The CPython port is no longer maintained and has quite a few missing features.
That being said, we have a wide range of ready-made models that you can access via an API, which might fit your use case (you can see the list at https://docs.timefold.ai/).
One practical consideration that is often ignored is the difficulty of implementation. To build a MIP model, you only have the tools of linear programming: linear equations and inequalities, like mx >= y. If you want a MIP, you need to write down ALL aspects of your problem as a list of such inequalities, which can be difficult for some real-world domains. There is a real art to MIP modeling.
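A toy way to see what "everything is a list of inequalities" means (this is a brute-force illustration, not a real MIP solver): a knapsack problem is nothing but binary variables, one linear constraint, and a linear objective.

```python
# Toy illustration: a 3-item knapsack expressed purely as linear
# (in)equalities over binary variables, solved by exhaustive enumeration.
from itertools import product

values = [6, 10, 12]   # objective coefficients
weights = [1, 2, 3]    # constraint coefficients
capacity = 5

# The whole "model": maximize sum(v*x) s.t. sum(w*x) <= capacity, x[i] in {0,1}.
best_x, best_val = None, -1
for x in product([0, 1], repeat=3):
    if sum(w * xi for w, xi in zip(weights, x)) <= capacity:  # feasibility check
        val = sum(v * xi for v, xi in zip(values, x))
        if val > best_val:
            best_x, best_val = x, val
```

The art is in getting messy real-world rules ("no employee works two weekends in a row", "trucks cool down between runs") into that rigid shape.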
On the other hand, metaheuristics are like an interface where you only need to implement a few functions in plain old code and you can get good answers.
It’s not quite that simple, since there is still an art to modeling a problem in a suitable input format for a metaheuristic (for example, in a genetic algorithm, how do I encode my delivery schedule as a genome?), but the upshot is that the formulation doesn’t have to be mathematically precise to work correctly.
Examples include:
- School Timetabling
- Employee Scheduling
- Conference Scheduling
- Flight Crew Scheduling
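One common answer to the "delivery schedule as a genome" question is a permutation encoding (a sketch under my own assumptions; the stop names, the fixed-size-route decoding, and the swap mutation are all hypothetical choices):

```python
# Sketch: encode a delivery schedule as a permutation of stop IDs, decode it
# into per-van routes, and mutate by swapping two stops. Toy example only.
import random

STOPS = ["A", "B", "C", "D", "E", "F"]
STOPS_PER_VAN = 3

def random_genome():
    genome = STOPS[:]
    random.shuffle(genome)
    return genome  # a permutation, e.g. ["C", "A", "F", "B", "E", "D"]

def decode(genome):
    """Split the permutation into consecutive per-van routes."""
    return [genome[i:i + STOPS_PER_VAN]
            for i in range(0, len(genome), STOPS_PER_VAN)]

def mutate(genome):
    """Swap two stops: the child is still a valid permutation by construction."""
    child = genome[:]
    i, j = random.sample(range(len(child)), 2)
    child[i], child[j] = child[j], child[i]
    return child
```

The nice property is that every genome decodes to *some* schedule, so the search never has to repair infeasible individuals.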
Metaheuristics are also very useful for puzzle games; you can run a metaheuristic to generate a difficult-but-solvable puzzle in less than a second, in only about 20 lines of code without libraries (though as you scale up to hundreds of different pieces, you'll probably want a library for its incremental score calculation).
So you could use it for any application where you saw benefit from genetic algorithms, simulated annealing, or tabu search. You can even use those to optimize neural networks without backpropagation, with fewer local optima to get stuck in. There are many papers on this, but it's computationally heavier.
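As a toy illustration of training without backpropagation (my own minimal sketch of the idea, not any particular paper's method): a (1+1) evolution strategy mutating the weights of a tiny network on XOR.

```python
# Sketch: a (1+1) evolution strategy trains a 2-2-1 tanh network on XOR by
# mutating the flat weight vector and keeping the child only if loss improves.
import math, random

random.seed(1)
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # XOR

def forward(w, x):
    # w is a flat list of 9 weights: two hidden units plus a linear output.
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return w[6] * h0 + w[7] * h1 + w[8]

def loss(w):
    return sum((forward(w, x) - y) ** 2 for x, y in DATA)

w = [random.uniform(-1, 1) for _ in range(9)]
best = loss(w)
for _ in range(20000):  # mutate, keep the child only when it scores better
    child = [wi + random.gauss(0, 0.1) for wi in w]
    c = loss(child)
    if c < best:
        w, best = child, c
```

No gradients anywhere; the same loop works for non-differentiable losses, which is where these methods earn their (considerable) compute cost.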
I used it in a past life to build a Kubernetes scheduler [2] and to tackle some cluster management problems.
[1] https://www.youtube.com/watch?v=lxiCHRFNgno
[2] https://www.usenix.org/system/files/osdi20-suresh.pdf
It allows me to put a host in maintenance mode and redistribute all of its virtual machines across the remaining hosts; it's really awesome.
Constraint Programming is quite interesting and hard, as you have to reformulate the problem domain in a way that is not always intuitive.
The one thing I've been trying to model well is cover constraints, where for each x : xs there is some y : ys s.t. pred(x, y). I've tried both boolean matrices and index constraints; they work but seem to be quite taxing on the solver. Maybe there's a better formulation.