Industry Signals
Jacobi Robotics + Delta Technology Deploy Live Mixed-Case Palletizing Cell at Defence Manufacturer: 90% Cube Utilisation, 100% Pallet Stability, 2-Week Commissioning
Foundation: At a defence manufacturer’s outbound dock, operators were hand-building pallets from completely random, unsequenced case streams all day — the highest-injury task in the building, and one that conventional palletizing automation cannot handle because it requires pre-planned sequencing the live workflow cannot provide. Mixed-case palletizing (building stable freight pallets from random inbound cases with no advance knowledge of what arrives next) is a real-time 3D bin-packing problem: for each incoming case the system must assign a stable pallet position, enforce stacking rules (heavy-on-bottom, crush limits, carrier lane sequencing), and compute the robot arm trajectory before the next case lands on the conveyor. Jacobi Robotics’ OmniPalletizer solves this by planning every placement inside a physics-aware digital twin (a simulation that enforces stability and motion constraints in real time) and executing the plans on a FANUC M-710iC/45M industrial robot, in a cell deployed by system integrator Delta Technology.
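The per-case decision loop described above can be sketched as a greedy heightmap heuristic. This is a toy illustration of the online placement problem, not Jacobi's planner: the grid resolution, flat-support rule, heavy-on-bottom check, and lowest-top scoring are all simplifying assumptions.

```python
# Toy online mixed-case placement: a greedy heightmap heuristic.
# An illustrative sketch of the per-cycle decision, not Jacobi's
# planner; grid size, rules, and scoring are assumptions.

GRID_W, GRID_D = 4, 4          # pallet footprint in grid cells

def new_pallet():
    # h[x][y] = current stack height; w[x][y] = weight of the top case
    # (infinity on the bare floor, so anything may sit there)
    return {"h": [[0] * GRID_D for _ in range(GRID_W)],
            "w": [[float("inf")] * GRID_D for _ in range(GRID_W)]}

def candidate_ok(p, x, y, cw, cd, weight):
    # Return the support height at (x, y) if the case fits, else None.
    cells = [(x + i, y + j) for i in range(cw) for j in range(cd)]
    if any(cx >= GRID_W or cy >= GRID_D for cx, cy in cells):
        return None
    base = {p["h"][cx][cy] for cx, cy in cells}
    if len(base) != 1:                       # support must be flat
        return None
    if any(weight > p["w"][cx][cy] for cx, cy in cells):
        return None                          # heavy-on-bottom rule
    return base.pop()

def place(p, case):
    # Choose the feasible position that keeps the stack lowest
    # (a crude proxy for cube utilisation).
    cw, cd, ch, weight = case
    best = None
    for x in range(GRID_W):
        for y in range(GRID_D):
            base = candidate_ok(p, x, y, cw, cd, weight)
            if base is None:
                continue
            top = base + ch
            if best is None or top < best[0]:
                best = (top, x, y)
    if best is None:
        return False                         # reject: start a new pallet
    top, x, y = best
    for i in range(cw):
        for j in range(cd):
            p["h"][x + i][y + j] = top
            p["w"][x + i][y + j] = weight
    return True
```

A production planner would also score edge support, interlock, and robot reachability, and would validate each candidate placement in physics simulation rather than with grid rules.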
Delta Technology installed the OmniPalletizer cell into the manufacturer’s existing conveyor lanes with no upstream sequencing buffers and no warehouse redesign. The system handles thousands of cases per day across dozens of SKUs, boxes up to 60 lbs, and two destination pallets, with operator training completed in under one day. Commissioning took two weeks with zero gap between simulated and actual cycle time. In live production since Q4 2025, the cell has held 100% pallet stability and 90% cube utilisation.
Any warehouse removing people from manual palletizing can validate OmniPalletizer against their own SKU history in simulation before committing; the Delta Technology deployment shows two-week commissioning with no surprises between sim and production floor.
Research Papers
ML-Guided Primal Heuristic Outperforms Gurobi Defaults on MBQP Benchmarks 🔗
Foundation: A mixed-binary quadratic program (MBQP) is an optimisation problem with binary 0/1 decision variables and a quadratic objective function, covering problem classes like quadratic assignment, max-cut graph partitioning, and portfolio selection. Inside a branch-and-bound solver (a tree search that systematically partitions the solution space into subproblems, pruning branches once a bound proves they cannot improve on the best solution found), a primal heuristic is a fast shortcut that produces a concrete feasible solution early, giving the solver a bound to prune unpromising branches. Standard primal heuristics use fixed, hand-designed rules; this paper trains a machine learning classifier to predict which variable fixings produce the best early incumbents on MBQP instances.
The classifier is trained on solved MBQP instances and predicts which variables to fix and to what values, producing a warm-start incumbent that branch-and-bound inherits. On standard MBQP benchmark suites it finds better feasible solutions earlier in the search than Gurobi's default primal heuristics. The formulation, benchmarks, and results are in the case study below.
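The mechanism can be sketched end to end in a few dozen lines. Everything here is a stand-in: the "classifier" is a hand-set confidence rule playing the role of the trained model, the relaxation vector plays the role of its features, exhaustive search plays the role of branch-and-bound, and the instance numbers are invented.

```python
# Toy sketch of an ML-guided primal heuristic for a small MBQP:
#   minimize  x'Qx + c'x,  x in {0,1}^n.
# The "classifier" is a hand-set confidence rule standing in for a model
# trained on solved instances; `relax` stands in for its features (e.g. a
# relaxation solution). All numbers are illustrative.
from itertools import product

def objective(Q, c, x):
    n = len(x)
    quad = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
    return quad + sum(c[i] * x[i] for i in range(n))

def predict_fixings(relax, lo=0.1, hi=0.9):
    # Stand-in for the learned classifier: fix only confident variables.
    fixed = {i: 1 for i, r in enumerate(relax) if r >= hi}
    fixed.update({i: 0 for i, r in enumerate(relax) if r <= lo})
    return fixed

def warm_start(Q, c, relax):
    # Fix the confident variables, then greedily complete the rest,
    # producing the incumbent that branch-and-bound inherits.
    fixed = predict_fixings(relax)
    n = len(relax)
    x = [fixed.get(i, 0) for i in range(n)]
    for i in (j for j in range(n) if j not in fixed):
        x[i] = 0
        f0 = objective(Q, c, x)
        x[i] = 1
        f1 = objective(Q, c, x)
        x[i] = 0 if f0 <= f1 else 1
    return x, objective(Q, c, x)

def solve(Q, c, incumbent_val):
    # Stand-in for branch-and-bound: exhaustive search that keeps only
    # solutions beating the warm-start incumbent the heuristic supplied.
    n = len(c)
    best_x, best = None, incumbent_val
    for x in product((0, 1), repeat=n):
        v = objective(Q, c, x)
        if v < best:
            best_x, best = list(x), v
    return best_x, best
```

In a real solver stack the incumbent would be handed over as a MIP start rather than threaded through the search loop by hand; the point is only the division of labour between the learned fixing policy and exact search.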
Unified CP Model Produces Shorter Job Shop Makespans Than Decomposed Approach 🔗
Foundation: The flexible job shop scheduling problem (FJSP) is the task of assigning manufacturing operations to machines and sequencing them to minimise total completion time (makespan), with each operation eligible to run on any machine in a given set. When physical parts must be transported between machines by mobile robots, scheduling and routing are coupled: a robot that is busy elsewhere delays the next operation even if its target machine is free. Solving scheduling and routing sequentially (first schedule ignoring robot availability, then route) produces suboptimal schedules because the scheduling stage cannot account for robot travel times. Constraint programming is a solver paradigm that expresses each decision as a variable with a domain of allowed values and uses propagation (the systematic removal of values that cannot appear in any feasible solution) to shrink domains before and during search.
The paper encodes machine assignment, robot assignment, and travel-time constraints in a single constraint programming model, so the solver reasons simultaneously about scheduling and routing. On benchmark instances with multiple machines and multiple heterogeneous robots it produces shorter makespans than a decomposed baseline that schedules first and routes second. The formulation generalises to any manufacturing setting where mobile material transport is a binding constraint alongside machine scheduling.
Any manufacturer where parts move between machines on mobile robots can apply this constraint programming formulation to eliminate the makespan penalty that sequential scheduling-then-routing imposes.
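The coupling argument can be demonstrated on a toy instance. Brute force stands in for the CP solver, and the instance (two unit-duration jobs, two machines, one delivery robot) is invented and deliberately constructed so that the machine-only optimum is exactly the assignment that robot routing punishes.

```python
# Tiny illustration of why a unified model beats schedule-then-route.
# Brute force stands in for the CP solver; jobs, machines, and travel
# times are invented for the demo.
from itertools import permutations, product

DUR = {"J1": 1, "J2": 1}          # processing time on either machine
TRAVEL = {"M1": 1, "M2": 5}       # one-way dock->machine robot travel

def makespan(assign, order):
    # assign: job -> machine; order: the robot's delivery sequence.
    robot_free, machine_free, end = 0, {"M1": 0, "M2": 0}, 0
    for job in order:
        m = assign[job]
        arrive = robot_free + TRAVEL[m]      # robot drops the part off
        robot_free = arrive + TRAVEL[m]      # ...and returns to the dock
        start = max(arrive, machine_free[m])
        machine_free[m] = start + DUR[job]
        end = max(end, machine_free[m])
    return end

def best_makespan(assignments):
    return min(makespan(a, o)
               for a in assignments
               for o in permutations(a))

def unified():
    # Joint model: machine assignment and routing decided together.
    all_assigns = [dict(zip(DUR, ms)) for ms in product(TRAVEL, repeat=2)]
    return best_makespan(all_assigns)

def decomposed():
    # Stage 1 ignores the robot: spreading jobs over both machines looks
    # optimal (machine-only makespan 1), so that assignment is locked in
    # before routing ever sees the travel times.
    return best_makespan([{"J1": "M1", "J2": "M2"}])
```

The joint search discovers that putting both jobs on the near machine M1 finishes at time 4, while the decomposed pipeline is stuck at 8 because its locked-in assignment forces a long robot trip to M2. A real CP model expresses the same coupling with interval variables and no-overlap constraints instead of enumeration.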
Term of the Day
Arc Consistency
"Local consistency, applied relentlessly to every constrained pair, delivers global coherence; the solver need only discover what remains." — Adapted from Alan Mackworth, Consistency in Networks of Relations, Artificial Intelligence (1977)
Arc consistency enforces a rule across every constrained pair of variables: every value in one variable's domain (the set of values it is still allowed to take) must have at least one compatible partner in the other variable's domain. When a value has no compatible partner, it is permanently removed before the solver branches. This is the classical definition for binary constraints (rules between exactly two variables); for global constraints involving more variables at once, modern CP solvers enforce the stronger Generalized Arc Consistency (GAC), checking that a complete tuple satisfying the full constraint exists for every retained value. What both forms of propagation prevent is the solver wasting search effort on values that were never part of any feasible solution; a single constraint propagates consequences through every variable connected to it.
A concrete example
A two-machine job shop schedules operations for three jobs. Machine A processes each job first; Machine B handles a follow-on operation; every operation takes one hour. A constraint says Machine A must finish a job before Machine B can start it. Machine A's start-time domain is {9am, 10am, 11am} and Machine B's start-time domain (per job) is {10am, 11am, noon}. So far every value in A's domain has a compatible partner in B's domain, and every value in B's domain has a compatible partner in A's, so arc consistency finds nothing to prune.
Now a delivery deadline forces Machine B to finish all work by noon. Each B operation takes one hour, so noon is eliminated from B's domain (a noon start would finish at 1pm), leaving {10am, 11am}. Arc consistency immediately re-checks: can A's 11am start-time (finishing at noon) still find a compatible B start? No: B can no longer start at noon. A's 11am value is eliminated. The solver derived that A cannot start at 11am from a deadline placed on B, without branching at all.
This bidirectional propagation continues until no more values can be removed. In a real job shop with hundreds of operations and dozens of machines, each value eliminated via arc consistency spares the solver from generating and evaluating thousands of partial assignments.
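The pruning in this example can be reproduced with a small arc-consistency routine. The sketch below follows the textbook AC-3 scheme (the `revise`/`ac3` names are conventional, not from any particular solver); start times are plain hours, and the delivery deadline is assumed to have already removed noon from B's domain before propagation runs.

```python
# Minimal AC-3-style arc consistency, applied to the two-machine example.
# Start times are hours; the deadline has already pruned noon from B.

def revise(domains, xi, xj, allowed):
    # Remove values of xi that have no compatible partner left in xj.
    pruned = {v for v in domains[xi]
              if not any(allowed(v, w) for w in domains[xj])}
    domains[xi] -= pruned
    return bool(pruned)

def ac3(domains, arcs):
    # arcs: (xi, xj, allowed) triples with allowed(vi, vj) -> bool
    queue = list(arcs)
    while queue:
        xi, xj, allowed = queue.pop(0)
        if revise(domains, xi, xj, allowed):
            # xi's domain shrank: re-check every arc pointing at xi
            queue.extend(a for a in arcs if a[1] == xi)
    return domains

domains = {"A": {9, 10, 11}, "B": {10, 11}}
arcs = [("A", "B", lambda a, b: a + 1 <= b),  # A finishes before B starts
        ("B", "A", lambda b, a: a + 1 <= b)]
ac3(domains, arcs)
# A's 11am start loses its only support (a noon start for B) and is pruned
```

The re-queueing step is the "live, recursive" behaviour discussed below: any domain that shrinks triggers re-checks of every arc that depends on it.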
Why practitioners misread this
Most practitioners assume arc consistency runs once as a pre-processing step, before the solver starts branching. In reality, modern constraint programming solvers run arc consistency continuously: every time the solver branches and fixes a variable to a value, arc consistency immediately propagates that assignment to all connected variables, removing any value in their domains that has now lost its compatible partner. It is not front-loaded pre-processing; it is a live, recursive pruning mechanism woven into every node of the search tree.
A related misread is treating arc consistency as if it covers all constraint types uniformly. Classical arc consistency operates on binary constraints (rules between exactly two variables); for constraints involving more than two variables at once, such as AllDifferent (the rule that no two variables in a set may share the same value) or NoOverlap (the rule that no two operations may use the same machine at the same time), modern CP solvers enforce the stronger Generalized Arc Consistency (GAC, also called Hyper-Arc Consistency): a value is GAC-supported only if a complete tuple satisfying the full multi-variable constraint exists with that value in its slot. Whether the constraint is binary or n-ary, the same ceiling applies: a fully consistent network can still have no feasible solution; the infeasibility only surfaces when three or more variables must be assigned simultaneously. What propagation delivers is aggressive elimination of locally impossible values, not a proof that a solution exists. (The property that does guarantee extensibility is called strong k-consistency, which requires checking all subsets of k variables.)
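The ceiling described above is easy to exhibit concretely: three variables constrained pairwise-different, each with the two-value domain {1, 2}. The short stdlib check below (variable names are hypothetical) shows the network is arc consistent on every pair, while GAC on the global AllDifferent finds no supporting tuple.

```python
# Three variables constrained pairwise-different, each with domain {1, 2}.
# Every arc is consistent, yet no complete assignment exists; the
# infeasibility only appears when all three variables are considered at
# once, which is what GAC on a global AllDifferent constraint checks.
from itertools import product

domains = {"x": {1, 2}, "y": {1, 2}, "z": {1, 2}}
pairs = [("x", "y"), ("x", "z"), ("y", "z")]

def arc_consistent():
    # For every directed arc (a, b), every value of a must have at least
    # one differing partner left in b's domain.
    arcs = pairs + [(b, a) for a, b in pairs]
    return all(any(v != w for w in domains[b])
               for a, b in arcs for v in domains[a])

def gac_supported():
    # GAC on AllDifferent: some complete all-distinct tuple must exist.
    names = list(domains)
    return any(len(set(t)) == len(t)
               for t in product(*(domains[n] for n in names)))
```

Here `arc_consistent()` returns True while `gac_supported()` returns False: exactly the gap between pairwise propagation and the global constraint.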
Where this shows up in practice
- Manufacturing scheduling: Every CP-based job shop solver, including Google CP-SAT and IBM ILOG CP Optimizer, enforces arc consistency on machine capacity, no-overlap conditions, and sequence-dependent setup times.
- Vehicle routing: Time-window constraints on customer visits are enforced by arc consistency: if a vehicle cannot arrive within a customer's window given current assignments, that assignment is pruned before the solver branches.
- Employee rostering: Shift-pattern rules and working-time regulations are expressed as pairwise constraints and pruned by arc consistency before any person is assigned to a shift.
- Supply chain planning: Lead-time and warehouse capacity constraints expressed as variable domains are shrunk by arc consistency before order allocation decisions are fixed.

The diagnostic question to ask when evaluating any published constraint programming model: are all binding operational constraints expressed explicitly as constraints between pairs of variables? If a constraint is left implicit, arc consistency cannot prune it.
- ML + MIP Heuristics: A machine learning classifier trained on solved instances predicts which variable fixings produce the best feasible solutions in branch-and-bound for mixed-binary quadratic programs, outperforming Gurobi's default primal heuristics on standard benchmark suites.
- CP for Job Shop + Robots: Encoding machine scheduling and robot routing as a single constraint programming model eliminates the makespan degradation that sequential decomposition imposes, and the formulation applies to any manufacturing setting where mobile material transport is a binding constraint alongside machine capacity.