Transforms

Transforms provide additional ways of manipulating CVXPY objects beyond the atomic functions. While atomic functions operate only on expressions, transforms may also take Problem, Objective, or Constraint objects as input. Transforms do not need to conform to any specific API.

SuppFunc

The SuppFunc transform accepts an implicit representation of a convex set in terms of some CVXPY Variable and returns a function handle representing the convex set's support function. When the function handle is evaluated, it returns a SuppFuncAtom object. That object can be used like any other CVXPY expression for purposes of convex optimization modeling.

class cvxpy.transforms.suppfunc.SuppFunc(x, constraints)[source]

Given a list of CVXPY Constraint objects \(\texttt{constraints}\) involving a real CVXPY Variable \(\texttt{x}\), consider the convex set

\[S = \{ v : \text{it's possible to satisfy all } \texttt{constraints} \text{ when } \texttt{x.value} = v \}.\]

This object represents the support function of \(S\). This is the convex function

\[y \mapsto \max\{ \langle y, v \rangle : v \in S \}.\]

The support function is a fundamental object in convex analysis. It’s extremely useful for expressing dual problems using Fenchel duality.

Parameters:
  • x (Variable) – This variable cannot have any attributes, such as PSD=True, nonneg=True, symmetric=True, etc…

  • constraints (list[Constraint]) – Usually, these are constraints over \(\texttt{x}\), and some number of auxiliary CVXPY Variables. It is valid to supply \(\texttt{constraints = []}\).

Examples

If \(\texttt{h = cp.SuppFunc(x, constraints)}\), then you can use \(\texttt{h}\) just like any other scalar-valued atom in CVXPY. For example, if \(\texttt{x}\) was a CVXPY Variable with \(\texttt{x.ndim == 1}\), you could do the following:

import cvxpy as cp
import numpy as np

# Assumes x and h = cp.SuppFunc(x, constraints) are defined as described above.
z = cp.Variable(shape=(10,))
A = np.random.standard_normal((x.size, 10))
c = np.random.rand(10)
objective = h(A @ z) - c @ z
prob = cp.Problem(cp.Minimize(objective), [])
prob.solve()

Notes

You are allowed to use CVXPY Variables other than \(\texttt{x}\) to define \(\texttt{constraints}\), but the set \(S\) only consists of objects (vectors or matrices) with the same shape as \(\texttt{x}\).

It’s possible for the support function to take the value \(+\infty\) for a fixed vector \(\texttt{y}\). This is an important point, and it’s one reason why support functions are formally defined with the supremum “\(\sup\)” rather than the maximum “\(\max\)”. For more information, see the Wikipedia article on support functions.

__call__(y) SuppFuncAtom[source]

Return an atom representing

max{ cvxpy.vec(y) @ cvxpy.vec(x) : x in S }

where S is the convex set associated with this SuppFunc object.
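
For a complete end-to-end illustration, here is a minimal sketch (the set, data, and variable names below are chosen purely for illustration) that builds the support function of the probability simplex and minimizes it over a constrained argument:

import cvxpy as cp
import numpy as np

# S is the probability simplex {v : v >= 0, sum(v) == 1}, so its support
# function evaluates to max_i y_i.
x = cp.Variable(3)
h = cp.SuppFunc(x, [x >= 0, cp.sum(x) == 1])

y = cp.Variable(3)
lower = np.array([1.0, 2.0, 3.0])
prob = cp.Problem(cp.Minimize(h(y)), [y >= lower])
prob.solve()
print(prob.value)  # roughly 3.0, attained e.g. at y = lower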

Scalarize

The *scalarize* transforms convert a list of objectives into a single objective, for example a weighted sum. All scalarizations are monotone in each objective, which means that optimizing over the scalarized objective always returns a point that is Pareto optimal with respect to the original list of objectives. Moreover, all points on the Pareto curve except for boundary points can be attained given some weighting of the objectives.

scalarize.weighted_sum(objectives, weights) Minimize | Maximize

Combines objectives as a weighted sum.

Parameters:
  • objectives – A list of Minimize/Maximize objectives.

  • weights – A vector of weights.

Returns:

A Minimize/Maximize objective.
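
As a quick illustration, the following sketch (with made-up data and weights) scalarizes a least-squares fit and an l1 regularizer into a single problem and sweeps the weights to trace out Pareto-optimal points:

import cvxpy as cp
import numpy as np
from cvxpy.transforms import scalarize

np.random.seed(0)
A = np.random.standard_normal((20, 5))
b = np.random.standard_normal(20)
x = cp.Variable(5)

# Two competing objectives: data fit and sparsity.
objectives = [cp.Minimize(cp.sum_squares(A @ x - b)),
              cp.Minimize(cp.norm1(x))]

for w in (0.1, 1.0, 10.0):
    combined = scalarize.weighted_sum(objectives, [1.0, w])
    cp.Problem(combined).solve()
    print(w, cp.norm1(x).value)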

scalarize.max(objectives, weights) Minimize

Combines objectives as max of weighted terms.

Parameters:
  • objectives – A list of Minimize/Maximize objectives.

  • weights – A vector of weights.

Returns:

A Minimize objective.

scalarize.log_sum_exp(objectives, weights, gamma: float = 1.0) Minimize

Combines objectives as log_sum_exp of weighted terms.

The objective takes the form

log(sum_{i=1}^n exp(gamma*weights[i]*objectives[i]))/gamma

As gamma goes to 0, log_sum_exp approaches weighted_sum. As gamma goes to infinity, log_sum_exp approaches max.

Parameters:
  • objectives – A list of Minimize/Maximize objectives.

  • weights – A vector of weights.

  • gamma – Parameter interpolating between weighted_sum and max.

Returns:

A Minimize objective.
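
The sketch below (with illustrative objectives and weights) shows how gamma interpolates between the weighted_sum and max scalarizations; it assumes an installed solver that supports the exponential cone:

import cvxpy as cp
from cvxpy.transforms import scalarize

x = cp.Variable()
objectives = [cp.Minimize(cp.square(x - 1)),
              cp.Minimize(cp.square(x + 1))]
weights = [1.0, 3.0]

# Small gamma behaves like weighted_sum (optimum near x = -0.5); large gamma
# approaches the max of the weighted terms (optimum near x = -0.27).
for gamma in (0.1, 10.0):
    combined = scalarize.log_sum_exp(objectives, weights, gamma=gamma)
    cp.Problem(combined).solve()
    print(gamma, x.value)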

scalarize.targets_and_priorities(objectives, priorities, targets, limits=None, off_target: float = 1e-05) Minimize | Maximize

Combines objectives with penalties within a range between target and limit.

For nonnegative priorities, each Minimize objective i has value

off_target*objectives[i] when objectives[i] < targets[i]

(priorities[i]-off_target)*objectives[i] when targets[i] <= objectives[i] <= limits[i]

+infinity when objectives[i] > limits[i]

and each Maximize objective i has value

off_target*objectives[i] when objectives[i] > targets[i]

(priorities[i]-off_target)*objectives[i] when targets[i] >= objectives[i] >= limits[i]

-infinity when objectives[i] < limits[i]

A negative priority flips the objective sense, i.e., we use -objectives[i], -targets[i], and -limits[i] with abs(priorities[i]).

Parameters:
  • objectives – A list of Minimize/Maximize objectives.

  • priorities – The weight applied within the range between target and limit.

  • targets – The start (end) of the penalty range for Minimize (Maximize) objectives.

  • limits – Optional hard end (start) of the penalty range for Minimize (Maximize) objectives.

  • off_target – Penalty applied outside of the target range.

Returns:

A Minimize/Maximize objective.

Raises:

ValueError – If the scalarized objective is neither convex nor concave.
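
A minimal sketch of targets_and_priorities (the targets and priorities below are arbitrary illustration values; limits is left at its default):

import cvxpy as cp
from cvxpy.transforms import scalarize

x = cp.Variable(3)
objectives = [cp.Minimize(cp.sum_squares(x - 1)),
              cp.Minimize(cp.norm1(x))]

# Objective 0 is penalized with priority 2 once it exceeds its target 0.5,
# objective 1 with priority 1 once it exceeds 1.0; below the targets only the
# small off_target penalty applies.
combined = scalarize.targets_and_priorities(
    objectives, priorities=[2.0, 1.0], targets=[0.5, 1.0])
cp.Problem(combined).solve()
print(x.value)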

Other

Here we list the other available transforms.

class cvxpy.transforms.indicator(constraints: List[Constraint], err_tol: float = 0.001)[source]

An expression representing the convex function I(constraints) = 0 if constraints hold, +infty otherwise.

Parameters:
  • constraints (list) – A list of constraint objects.

  • err_tol – A numeric tolerance for determining whether the constraints hold.
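
A minimal sketch (the path cvxpy.transforms.indicator is taken from the class signature above): adding the indicator of a constraint set to an objective imposes those constraints, so the problem below projects the point (3, 3) onto {x : x <= 1}.

import cvxpy as cp

x = cp.Variable(2)
ind = cp.transforms.indicator([x <= 1])  # 0 where x <= 1 holds, +infinity otherwise

prob = cp.Problem(cp.Minimize(cp.sum_squares(x - 3) + ind))
prob.solve()
print(x.value)  # roughly [1., 1.]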

transforms.linearize(expr)

Returns an affine approximation to the expression computed at the variable/parameter values.

Gives an elementwise lower (upper) bound for convex (concave) expressions that is tight at the current variable/parameter values. No guarantees for non-DCP expressions.

If f and g are convex, the objective f - g can be (heuristically) minimized using the implementation below of the convex-concave method:

for iters in range(N):
    Problem(Minimize(f - linearize(g))).solve()

Returns None if the expression cannot be linearized.

Parameters:

expr – An expression.

Returns:

An affine expression or None.
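
To make the convex-concave loop above concrete, here is a minimal runnable sketch (the functions f and g, the starting point, and the iteration count are illustrative; the import path cvxpy.transforms is assumed):

import cvxpy as cp
import numpy as np
from cvxpy.transforms import linearize

# Heuristically minimize f - g with both f and g convex (difference of convex).
x = cp.Variable(2)
f = cp.sum_squares(x - 2)
g = cp.norm1(x)

x.value = np.array([1.0, 1.0])  # linearize builds the tangent at the current value
for _ in range(10):
    g_affine = linearize(g)  # affine lower bound on g, tight at x.value
    cp.Problem(cp.Minimize(f - g_affine)).solve()
print(x.value)  # roughly [2.5, 2.5] for this f and g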

partial_optimize.partial_optimize(prob, opt_vars: List[Variable] | None = None, dont_opt_vars: List[Variable] | None = None, solver=None, **kwargs) PartialProblem

Partially optimizes the given problem over the specified variables.

Either opt_vars or dont_opt_vars must be given. If both are given, they must contain all the variables in the problem.

Partial optimize is useful for two-stage optimization and graph implementations. For example, we can write

x = Variable(n)
t = Variable(n)
abs_x = partial_optimize(Problem(Minimize(sum(t)),
          [-t <= x, x <= t]), opt_vars=[t])

to define the sum of the entrywise absolute values of x (that is, the l1 norm of x).

Parameters:
  • prob (Problem) – The problem to partially optimize.

  • opt_vars (list, optional) – The variables to optimize over.

  • dont_opt_vars (list, optional) – The variables to not optimize over.

  • solver (str, optional) – The default solver to use for value and grad.

  • kwargs (keywords, optional) – Additional solver specific keyword arguments.

Returns:

An expression representing the partial optimization. Convex for minimization objectives and concave for maximization objectives.

Return type:

Expression
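
Continuing the docstring example above, here is a minimal runnable sketch that uses the partially optimized expression inside a larger problem (the data are illustrative; the top-level alias cp.partial_optimize is assumed, otherwise import from cvxpy.transforms.partial_optimize):

import cvxpy as cp

n = 3
x = cp.Variable(n)
t = cp.Variable(n)

# min_t sum(t) s.t. -t <= x <= t equals sum(|x_i|), the l1 norm of x.
abs_x = cp.partial_optimize(
    cp.Problem(cp.Minimize(cp.sum(t)), [-t <= x, x <= t]),
    opt_vars=[t])

prob = cp.Problem(cp.Minimize(abs_x), [cp.sum(x) == -2, x >= -1])
prob.solve()
print(prob.value)  # roughly 2.0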