🧺 ICS 2025 Solvers Cluster Takeaways
I just returned from the 2025 INFORMS Computing Society conference, where I had the privilege of organizing a cluster on optimization solvers. The cluster had two sessions, Solvers I and Solvers II, and focused on new developments in solver implementation. In the coming days, I’ll share a few posts about some of the solvers covered in those sessions, what makes them interesting, and how to test drive them....
👔 Hierarchical Optimization with HiGHS
In the last post, we used Gurobi’s hierarchical optimization features to compute the Pareto front for primary and secondary objectives in an assignment problem. This relied on Gurobi’s setObjectiveN method and its internal code for managing hierarchical problems. Some practitioners may need to do this without access to a commercial license. This post adapts the previous example to use HiGHS and its native Python interface, highspy. Spelling the procedure out by hand is also a good way to understand it better....
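As a rough preview of the approach, here is a minimal sketch of doing lexicographic (hierarchical) optimization by hand: solve for the primary objective, pin its value with a constraint, then re-solve for the secondary objective. The toy model and tolerance are illustrative assumptions rather than the post’s example, and the sketch assumes the modeling-interface calls (addVariable, addConstr, maximize, getObjectiveValue) available in recent highspy releases.

```python
import highspy

h = highspy.Highs()

# Toy stand-in model: two bounded variables sharing one resource.
x = h.addVariable(lb=0, ub=10)
y = h.addVariable(lb=0, ub=10)
h.addConstr(x + y <= 10)

# Pass 1: optimize the primary objective on its own.
h.maximize(x + 2 * y)
primary = h.getObjectiveValue()

# Pass 2: hold the primary objective (within a small tolerance) and
# optimize the secondary objective over whatever freedom remains.
h.addConstr(x + 2 * y >= primary - 1e-6)
h.maximize(3 * x + y)
print(h.getObjectiveValue())
```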
👔 Hierarchical Optimization with Gurobi
One of the first technology choices to make when setting up an optimization stack is which modeling interface to use. Even if we restrict our choices to Python interfaces for MIP modeling, there are lots of options to consider. If you use a specific solver, you can opt for its native Python interface. Examples include gurobipy, Fusion, highspy, and PySCIPOpt. This approach provides access to important solver-specific features such as lazy constraints, heuristics, and various solver settings....
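For reference, here is a minimal sketch of what Gurobi’s built-in hierarchical (multi-objective) support looks like through gurobipy’s setObjectiveN, using a made-up toy model rather than the assignment problem from these posts; the priorities and bounds below are illustrative assumptions.

```python
import gurobipy as gp
from gurobipy import GRB

m = gp.Model("hierarchical-toy")
x = m.addVar(ub=10, name="x")
y = m.addVar(ub=10, name="y")
m.addConstr(x + y <= 10)

# All objectives share the model sense; the higher-priority objective is
# optimized first, and lower priorities only use the slack that remains.
m.ModelSense = GRB.MAXIMIZE
m.setObjectiveN(x + 2 * y, 0, priority=2, name="primary")
m.setObjectiveN(3 * x + y, 1, priority=1, name="secondary")
m.optimize()
```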
📅 Reducing Overscheduling
At a Nextmv tech talk a couple weeks ago, I showed a least absolute deviations (LAD) regression model using OR-Tools. This isn’t new – I pulled the formulation from Rob Vanderbei’s “Local Warming” paper, and I’ve shown similar models at conference talks in the past using other modeling APIs and solvers. There are a couple reasons I keep coming back to this problem. One is that it’s a great example of how to build a machine learning model using an optimization solver....
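To give a flavor of that formulation (this is a hedged sketch with made-up data, not the talk’s code), least absolute deviations regression fits cleanly as an LP: each observation gets a nonnegative deviation variable that bounds the absolute residual, and the solver minimizes the sum of deviations.

```python
from ortools.linear_solver import pywraplp

# Made-up data: fit y ≈ a*x + b by minimizing the sum of absolute residuals.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 1.9, 3.2, 3.8, 5.1]

solver = pywraplp.Solver.CreateSolver("GLOP")
a = solver.NumVar(-solver.infinity(), solver.infinity(), "a")
b = solver.NumVar(-solver.infinity(), solver.infinity(), "b")

# One nonnegative deviation variable per observation, bounding |residual|.
devs = []
for i, (x, y) in enumerate(zip(xs, ys)):
    e = solver.NumVar(0, solver.infinity(), f"e_{i}")
    solver.Add(e >= y - (a * x + b))
    solver.Add(e >= (a * x + b) - y)
    devs.append(e)

solver.Minimize(solver.Sum(devs))
if solver.Solve() == pywraplp.Solver.OPTIMAL:
    print("a =", a.solution_value(), "b =", b.solution_value())
```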
🖍 Visualizing Decision Diagrams
I attended DPSOLVE 2023 recently and found lots of good inspiration for the next version of Nextmv’s Decision Diagram (DD) solver, Hop. It’s a few years old now, and we learned a lot applying it in the field. Hop formed the basis for our first routing models. While those models moved to a different structure in our latest routing code, the first version broke ground combining DDs with Adaptive Large Neighborhood Search (ALNS), and its use continues to grow organically....
🚀 Blogging is back, baby!
I’ve been a mostly absent blogger for the past few years. I could make excuses. They might sound like, “I was busy finishing my dissertation!” or “I founded a company and have a toddler!” or “The static site generator I used was abandoned!” Whatever they might be, these excuses would certainly end in exclamation points. But, ultimately, for several years it just felt like blogging was dead. Its space was usurped by Tweets, LinkedIn hustle posts, long-form Medium content aimed at attracting talent, and other content trends....
🏖️ Lagrangian Relaxation with Gurobi
Note: This post was updated to work with Python 3 and the 2nd edition of “Integer Programming” by Laurence Wolsey. We’ve been studying Lagrangian Relaxation (LR) in the Advanced Topics in Combinatorial Optimization course I’m taking this term, and I had some difficulty finding a simple example covering its application. In case anyone else finds it useful, I’m posting a Python version for solving the Generalized Assignment Problem (GAP). This won’t discuss the theory of LR at all, just give example code using Gurobi....
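To give a flavor of the setup (a hedged sketch with made-up data, not the post’s example): relaxing the “assign each job exactly once” constraints of the GAP into the objective with multipliers u leaves only the machine capacity constraints, and the optimal value of that penalized problem is an upper bound on the original maximization.

```python
import gurobipy as gp
from gurobipy import GRB

# Tiny made-up GAP instance: profit c[i][j] and weight w[i][j] for putting
# job j on machine i, with capacity b[i] per machine.
c = [[6, 10, 1], [12, 12, 5]]
w = [[4, 6, 3], [7, 5, 4]]
b = [10, 8]
machines, jobs = range(len(b)), range(len(c[0]))

def lagrangian_bound(u):
    """Upper bound from relaxing the assignment constraints with multipliers u[j]."""
    m = gp.Model()
    m.Params.OutputFlag = 0
    x = m.addVars(machines, jobs, vtype=GRB.BINARY)
    # Only the machine capacity (knapsack) constraints stay in the subproblem.
    m.addConstrs(
        gp.quicksum(w[i][j] * x[i, j] for j in jobs) <= b[i] for i in machines
    )
    # Penalized objective: sum_ij (c_ij - u_j) x_ij + sum_j u_j.
    m.setObjective(
        gp.quicksum((c[i][j] - u[j]) * x[i, j] for i in machines for j in jobs)
        + sum(u),
        GRB.MAXIMIZE,
    )
    m.optimize()
    return m.ObjVal

print(lagrangian_bound([0.0, 0.0, 0.0]))
```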
🔲 Normal Magic Squares
Note: This post was updated to work with Python 3 and PySCIPOpt. The original version used Python 2 and python-zibopt. It has also been edited for clarity. As a followup to the last post, I created another SCIP example for finding Normal Magic Squares. This is similar to solving a Sudoku problem, except that here the number of binary variables depends on the square size. In the case of Sudoku, each cell has 9 binary variables – one for each potential value it might take....
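To make that counting concrete, here is a hedged sketch (not the post’s code) of the binary formulation for an n×n normal magic square in PySCIPOpt: each of the n² cells gets one binary variable per candidate value 1..n², so n⁴ binaries in total.

```python
from pyscipopt import Model, quicksum

n = 3
values = range(1, n * n + 1)
magic = n * (n * n + 1) // 2  # the common row/column/diagonal sum

m = Model()
# x[i, j, v] = 1 if cell (i, j) takes value v.
x = {(i, j, v): m.addVar(vtype="B") for i in range(n) for j in range(n) for v in values}

# Each cell takes exactly one value; each value is used exactly once.
for i in range(n):
    for j in range(n):
        m.addCons(quicksum(x[i, j, v] for v in values) == 1)
for v in values:
    m.addCons(quicksum(x[i, j, v] for i in range(n) for j in range(n)) == 1)

# Rows, columns, and both diagonals sum to the magic constant.
for i in range(n):
    m.addCons(quicksum(v * x[i, j, v] for j in range(n) for v in values) == magic)
for j in range(n):
    m.addCons(quicksum(v * x[i, j, v] for i in range(n) for v in values) == magic)
m.addCons(quicksum(v * x[i, i, v] for i in range(n) for v in values) == magic)
m.addCons(quicksum(v * x[i, n - 1 - i, v] for i in range(n) for v in values) == magic)

m.optimize()
```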
🔲 Magic Squares and Big-Ms
Note: This post was updated to work with Python 3 and PySCIPOpt. The original version used Python 2 and python-zibopt. It has also been edited for clarity. Back in October of 2011, I started toying with a model for finding magic squares using SCIP. This is a fun modeling exercise and a challenging problem. First one constructs a square matrix of integer-valued variables.

```python
from pyscipopt import Model

# [...snip...]

m = Model()
matrix = []
for i in range(size):
    row = [m....
```
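Continuing past where the excerpt cuts off, a minimal sketch of how that matrix of integer variables might be built; the variable type and bounds here are my assumptions, not necessarily what the post uses.

```python
from pyscipopt import Model

size = 3
m = Model()
matrix = []
for i in range(size):
    # One integer variable per cell; values in a normal magic square
    # run from 1 to size * size.
    row = [m.addVar(vtype="I", lb=1, ub=size * size) for j in range(size)]
    matrix.append(row)
```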
⏳️ Know Your Time Complexities - Part 2
In response to this post, Ben Bitdiddle inquires: I understand the concept of using a companion set to remove duplicates from a list while preserving the order of its elements. But what should I do if these elements are composed of smaller pieces? For instance, say I am generating combinations of numbers in which order is unimportant. How do I make a set recognize that [1,2,3] is the same as [3,2,1] in this case?...
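One common answer, sketched below (not necessarily the post’s exact solution): canonicalize each combination before it goes into the companion set, for example as a sorted tuple, so that order no longer matters.

```python
def dedupe_unordered(combos):
    """Remove duplicate combinations, treating [1, 2, 3] and [3, 2, 1] as equal."""
    seen = set()
    unique = []
    for combo in combos:
        # A sorted tuple is hashable and order-insensitive, so it works as the
        # key in the companion set. (frozenset also works, but it collapses
        # repeated elements.)
        key = tuple(sorted(combo))
        if key not in seen:
            seen.add(key)
            unique.append(combo)
    return unique

print(dedupe_unordered([[1, 2, 3], [3, 2, 1], [1, 1, 2]]))
# [[1, 2, 3], [1, 1, 2]]
```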