Signomial Programming

Signomial programming finds a local solution to a problem of the form:

\[\begin{split}\begin{array}{lll}\text{minimize} & g_0(x) & \\ \text{subject to} & f_i(x) = 1, & i = 1, \ldots, m \\ & g_i(x) - h_i(x) \leq 1, & i = 1, \ldots, n \end{array}\end{split}\]

where each \(f\) is a monomial, while each \(g\) and \(h\) is a posynomial.

This requires solving a sequence of geometric programs, and so will take longer than an equivalent geometric programming formulation.

In general, when given the choice of which variables to include in the positive-posynomial / \(g\) side of the constraint, the modeler should (see the sketch after this list):

  1. maximize the number of variables in \(g\),
  2. prioritize variables that are in the objective,
  3. then prioritize variables that are present in other constraints.
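
For instance, here is a minimal sketch of that choice, reusing the variables and objective of the example below: the relation \(x + y \geq 1\) can be written with either variable kept on the \(g\) (unsubtracted) side, and since \(x\) appears in the objective, the guidelines above favor the first form.

from gpkit import Variable, SignomialsEnabled

x = Variable("x")
y = Variable("y")

with SignomialsEnabled():
    preferred = [x >= 1 - y]    # keeps the objective variable x unsubtracted
    alternative = [y >= 1 - x]  # algebraically equivalent, but subtracts x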

The .localsolve syntax was chosen to emphasize that signomial programming returns a local optimum. For the same reason, calling .solve on an SP will raise an error.

By default, signomial programs are first solved conservatively (by assuming each \(h\) is equal only to its constant portion) and then become less conservative on each iteration.
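
As a worked sketch of that first iteration (the exact rearrangement is an implementation detail), take the constraint \(x \geq 1 - y\) from the example below. Dividing through by \(x\) puts it in the standard form above:

\[\frac{1}{x} - \frac{y}{x} \leq 1,\]

so \(g = 1/x\) and \(h = y/x\). Since \(h\) has no constant portion, the first GP conservatively solves \(1/x \leq 1\) (that is, \(x \geq 1\)); each later iteration approximates \(h\) around the previous solution, converging toward \(x = 0.9\).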

Example Usage

"""Adapted from t_SP in tests/t_geometric_program.py"""
from gpkit import Model, Variable, SignomialsEnabled

# Decision variables
x = Variable('x')
y = Variable('y')

# must enable signomials for subtraction
with SignomialsEnabled():
    constraints = [x >= 1-y, y <= 0.1]

# create and solve the SP
m = Model(x, constraints)
print(m.localsolve(verbosity=0).summary())
assert abs(m.solution(x) - 0.9) < 1e-6

# full interim solutions are available
print("x values of each GP solve (note convergence)")
print(", ".join("%.5f" % sol["freevariables"][x] for sol in m.program.results))

# use x0 to give the solution, reducing number of GPs needed
m.localsolve(verbosity=0, x0={x: 0.9, y: 0.1})
assert len(m.program.results) == 2
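
# As noted above, calling .solve on a signomial program raises an error.
# A minimal check (catching broadly; the exact exception type is an
# assumption, not confirmed here):
try:
    m.solve()
except Exception as err:
    print(err)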

When using the localsolve method, the reltol argument specifies the relative tolerance of the solver: that is, by what percent does the solution have to improve between iterations? If any iteration improves less than that amount, the solver stops and returns its value.
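
For example, a one-line sketch reusing the model m from above (1e-6 here is an arbitrary illustration, not gpkit's default):

m.localsolve(verbosity=0, reltol=1e-6)  # stop when an iteration improves by less than 1e-6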

If you wish to start the local optimization at a particular point \(x_0\), you may do so by passing that point (a dictionary formatted as you would a substitution) as the x0 argument, as the example above does.

Sequential Geometric Programs

The method of solving local GP approximations of a non-GP-compatible model can be generalized, at the cost of the smoothness and the freedom from trust regions that SPs guarantee.

For some applications, it is useful to call external codes which may not be GP compatible. Imagine we wished to solve the following optimization problem:

\[\begin{split}\begin{array}{lll}\text{minimize} & y & \\ \text{subject to} & y \geq \sin(x) & \\ & \frac{\pi}{4} \leq x \leq \frac{\pi}{2} & \end{array}\end{split}\]

This problem is not GP compatible due to the \(\sin(x)\) constraint. One approach might be to take the first term of the Taylor expansion of \(\sin(x)\), which is simply \(x\), and attempt to solve:

"Can be found in gpkit/docs/source/examples/sin_approx_example.py"
import numpy as np
from gpkit import Variable, Model


x = Variable("x")
y = Variable("y")

objective = y

constraints = [y >= x,
               x <= np.pi/2,
               x >= np.pi/4,
              ]

m = Model(objective, constraints)
print(m.solve(verbosity=0).summary())

         ┃┓
    Cost╺┫┃
 (0.785) ┃┣╸0.785
         ┃┛



       ┃┓
       ┃┃
       ┃┣╸x ≥ 0.785
 Model╺┫┛
       ┃┓
       ┃┃
       ┃┣╸y ≥ x
       ┃┛


Free Variables
--------------
x : 0.7854
y : 0.7854

Assume we have some external code which is capable of evaluating our incompatible function:

"""External function for GPkit to call.  Can be found
in gpkit/docs/source/examples/external_function.py"""
import numpy as np

def external_code(x):
    "Returns sin(x)"
    return np.sin(x)
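
A quick sanity check of this helper (hypothetical usage, not part of the original example): \(\sin(\pi/4) \approx 0.7071\), which matches the optimum we expect below.

import numpy as np
from external_function import external_code

print(external_code(np.pi / 4))  # prints 0.7071...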

Now, we can create a ConstraintSet-like class that allows GPkit to treat the incompatible constraint as though it were a signomial programming constraint:

"Can be found in gpkit/docs/source/examples/external_constraint.py"
from external_function import external_code


class ExternalConstraint:
    "Class for external calling"

    def __init__(self, x, y):
        # We need GPkit variables defined to return in our constraint.  The
        # easiest way to do this is to read in the variables of interest
        # when the class is initialized and store them here.
        self.x = x
        self.y = y

    def as_gpconstr(self, x0):
        "Returns locally-approximating GP constraint"
        # Creating a default constraint for the first solve
        if self.x not in x0:
            return (self.y >= self.x)
        # Otherwise calls external code at the current position...
        x_star = x0[self.x]
        res = external_code(x_star)
        # ...and returns a posynomial approximation around that position
        return (self.y >= res * self.x/x_star)

and replace the incompatible constraint in our GP:

"Can be found in gpkit/docs/source/examples/external_sp.py"
import numpy as np
from gpkit import Variable, Model
from external_constraint import ExternalConstraint

x = Variable("x")
y = Variable("y")

objective = y

constraints = [ExternalConstraint(x, y),
               x <= np.pi/2,
               x >= np.pi/4,
              ]

m = Model(objective, constraints)
print(m.localsolve(verbosity=0).summary())

         ┃┓
    Cost╺┫┃
 (0.707) ┃┣╸0.785
         ┃┛



       ┃┓
       ┃┃
       ┃┣╸<external_constraint.ExternalConstraint object>
 Model╺┫┛
       ┃┓
       ┃┃
       ┃┣╸x ≥ 0.785
       ┃┛


Free Variables
--------------
x : 0.7854
y : 0.7071

which is the expected result. This method has been generalized to larger problems, such as calling XFOIL and AVL.
