To establish whether a form h(x) is SOS amounts to solving a convex optimization problem. Indeed, any h(x) can be written as

$h(x) = x^{\{m\}'} \left( H + L(\alpha) \right) x^{\{m\}},$

where $x^{\{m\}}$ is a vector containing a base for the forms of degree m in x (such as all monomials of degree m in x), the prime $'$ denotes the transpose, H is any symmetric matrix satisfying $h(x) = x^{\{m\}'} H x^{\{m\}}$, and $L(\alpha)$ is a linear parameterization of the linear space

$\mathcal{L} = \left\{ L = L' : x^{\{m\}'} L x^{\{m\}} = 0 \right\}.$
The dimension of the vector $x^{\{m\}}$ is given by

$\sigma(n,m) = \binom{n+m-1}{m},$

whereas the dimension of the vector $\alpha$ is given by

$\omega(n,2m) = \frac{1}{2} \sigma(n,m) \left( 1 + \sigma(n,m) \right) - \sigma(n,2m).$
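As an illustration, for a bivariate quartic form (n = 2, 2m = 4) one has $\sigma(2,2) = \binom{3}{2} = 3$ and $\sigma(2,4) = \binom{5}{4} = 5$, so that $\omega(2,4) = \frac{1}{2} \cdot 3 \cdot (1+3) - 5 = 1$: the matrices $H + L(\alpha)$ representing such a form constitute a one-parameter affine family.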
Then, h(x) is SOS if and only if there exists a vector $\alpha$ such that $H + L(\alpha) \geq 0$, meaning that the matrix $H + L(\alpha)$ is positive semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression $h(x) = x^{\{m\}'} \left( H + L(\alpha) \right) x^{\{m\}}$ was introduced in [7] under the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as the Gram matrix.[8]
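As a concrete sketch of this LMI test, consider the bivariate quartic $h(x) = x_1^4 + 2x_1^2x_2^2 + x_2^4 = (x_1^2 + x_2^2)^2$ with base $x^{\{2\}} = (x_1^2, x_1x_2, x_2^2)$. The following minimal Python example searches directly for a single positive-semidefinite matrix $Q = H + L(\alpha)$ whose entries match the coefficients of h; the use of the cvxpy modelling library (with its default SDP solver) is an illustrative assumption, not part of the SMR method itself.

import cvxpy as cp
import numpy as np

# Base z = (x1^2, x1*x2, x2^2) of the forms of degree m = 2 in n = 2
# variables, so sigma(2, 2) = 3.  We look for one symmetric matrix
# Q = H + L(alpha) with z' Q z = h(x) and Q positive semidefinite.
Q = cp.Variable((3, 3), symmetric=True)
constraints = [
    Q >> 0,                        # the LMI: Q is positive semidefinite
    Q[0, 0] == 1,                  # coefficient of x1^4
    2 * Q[0, 1] == 0,              # coefficient of x1^3 x2
    2 * Q[0, 2] + Q[1, 1] == 2,    # coefficient of x1^2 x2^2
    2 * Q[1, 2] == 0,              # coefficient of x1 x2^3
    Q[2, 2] == 1,                  # coefficient of x2^4
]
prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility test
prob.solve()

if prob.status == cp.OPTIMAL:
    # Any factorization Q = C'C turns z'Qz into the SOS decomposition
    # h(x) = sum_i ((C z)_i)^2; C is obtained here via an eigendecomposition.
    w, V = np.linalg.eigh(Q.value)
    C = np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T
    print("h is SOS; a Gram matrix is:\n", Q.value)
else:
    print("the LMI is infeasible: h is not SOS")

For a form that is nonnegative but not SOS, such as the Motzkin form, the same test reports infeasibility.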
A matrix form F(x) (i.e., a matrix whose entries are forms) of dimension r and degree 2m in the real n-dimensional vector x is SOS if and only if there exist matrix forms $G_1(x), \ldots, G_k(x)$ of degree m such that

$F(x) = \sum_{i=1}^{k} G_i(x)' G_i(x).$
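For example, choosing $G_1(x) = \begin{pmatrix} x_1 & x_2 \\ x_2 & x_1 \end{pmatrix}$ yields the SOS matrix form of dimension r = 2 and degree 2m = 2

$F(x) = G_1(x)'G_1(x) = \begin{pmatrix} x_1^2 + x_2^2 & 2x_1x_2 \\ 2x_1x_2 & x_1^2 + x_2^2 \end{pmatrix}.$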
To establish whether a matrix form F(x) is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case, any F(x) can be written according to the SMR as

$F(x) = \left( x^{\{m\}} \otimes I_r \right)' \left( H + L(\alpha) \right) \left( x^{\{m\}} \otimes I_r \right),$

where $\otimes$ is the Kronecker product of matrices, H is any symmetric matrix satisfying

$F(x) = \left( x^{\{m\}} \otimes I_r \right)' H \left( x^{\{m\}} \otimes I_r \right),$

and $L(\alpha)$ is a linear parameterization of the linear space

$\mathcal{L} = \left\{ L = L' : \left( x^{\{m\}} \otimes I_r \right)' L \left( x^{\{m\}} \otimes I_r \right) = 0 \right\}.$
The dimension of the vector $\alpha$ is given by

$\omega(n,2m,r) = \frac{1}{2} r \left( \sigma(n,m) \left( r\sigma(n,m) + 1 \right) - (r+1)\sigma(n,2m) \right).$
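For the matrix form F(x) of the example above (n = 2, m = 1, r = 2) one has $\sigma(2,1) = 2$ and $\sigma(2,2) = 3$, hence $\omega(2,2,2) = \frac{1}{2} \cdot 2 \cdot \left( 2(2 \cdot 2 + 1) - 3 \cdot 3 \right) = 1$.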
Then, F(x) is SOS if and only if there exists a vector $\alpha$ such that the following LMI holds:

$H + L(\alpha) \geq 0.$
The expression $F(x) = \left( x^{\{m\}} \otimes I_r \right)' \left( H + L(\alpha) \right) \left( x^{\{m\}} \otimes I_r \right)$ was introduced in [9] in order to establish whether a matrix form is SOS via an LMI.
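The matrix case hence reduces to the same kind of LMI feasibility test as the scalar one. Continuing the example above with base $x^{\{1\}} = (x_1, x_2)$, the matrix $Q = H + L(\alpha)$ is 4 × 4 and, partitioned into 2 × 2 blocks $Q_{ij}$, satisfies $F(x) = x_1^2 Q_{11} + x_1x_2 \left( Q_{12} + Q_{12}' \right) + x_2^2 Q_{22}$. The sketch below (again assuming cvxpy, as before) imposes these block constraints directly.

import cvxpy as cp
import numpy as np

# Coefficient matrices of F(x) = [[x1^2 + x2^2, 2 x1 x2],
#                                 [2 x1 x2,     x1^2 + x2^2]].
F11 = np.eye(2)                           # coefficient of x1^2
F12 = np.array([[0.0, 2.0], [2.0, 0.0]])  # coefficient of x1 x2
F22 = np.eye(2)                           # coefficient of x2^2

Q = cp.Variable((4, 4), symmetric=True)   # Q = H + L(alpha), blocks Q_ij
constraints = [
    Q >> 0,                               # the LMI
    Q[0:2, 0:2] == F11,                   # block Q11
    Q[0:2, 2:4] + Q[2:4, 0:2] == F12,     # blocks Q12 + Q12'
    Q[2:4, 2:4] == F22,                   # block Q22
]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("F is SOS" if prob.status == cp.OPTIMAL else "the LMI is infeasible")

The one-dimensional family predicted by $\omega(2,2,2) = 1$ corresponds here to the antisymmetric part of the block $Q_{12}$, which the constraints leave free.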
Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X1, ..., Xn) and equipped with the involution T, such that T fixes R and X1, ..., Xn and reverses words formed by X1, ..., Xn (so that, for example, $(X_1X_2X_3)^T = X_3X_2X_1$). By analogy with the commutative case, the symmetric noncommutative polynomials are the noncommutative polynomials f of the form $f = f^T$. A symmetric noncommutative polynomial f is said to be matrix-positive if, whenever real symmetric matrices of any dimension r × r are substituted for the letters X1, ..., Xn, the resulting matrix f(X) is positive semidefinite.
A noncommutative polynomial f is SOS if there exist noncommutative polynomials $h_1, \ldots, h_k$ such that

$f(X) = \sum_{i=1}^{k} h_i(X)^T h_i(X).$
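For instance, $f = X_1^2 + X_1X_2 + X_2X_1 + X_2^2$ is SOS, since $f = h^T h$ with $h = X_1 + X_2$.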
Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive.[10] Moreover, there exist algorithms to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.[11]
[1] Hilbert, David (1888). "Ueber die Darstellung definiter Formen als Summe von Formenquadraten". Mathematische Annalen. 32 (3): 342–350. doi:10.1007/bf01443605. https://zenodo.org/record/1428214
[2] Choi, M. D.; Lam, T. Y. (1977). "An old question of Hilbert". Queen's Papers in Pure and Applied Mathematics. 46: 385–405.
[3] Goel, Charu; Kuhlmann, Salma; Reznick, Bruce (2016). "On the Choi–Lam analogue of Hilbert's 1888 theorem for symmetric forms". Linear Algebra and Its Applications. 496: 114–120. arXiv:1505.08145. doi:10.1016/j.laa.2016.01.024.
[4] Lasserre, Jean B. (2007). "Sufficient conditions for a real polynomial to be a sum of squares". Archiv der Mathematik. 89 (5): 390–398. arXiv:math/0612358. doi:10.1007/s00013-007-2251-y.
[5] Powers, Victoria; Wörmann, Thorsten (1998). "An algorithm for sums of squares of real polynomials". Journal of Pure and Applied Algebra. 127 (1): 99–104. doi:10.1016/S0022-4049(97)83827-3.
[6] Lasserre, Jean B. (2007). "A Sum of Squares Approximation of Nonnegative Polynomials". SIAM Review. 49 (4): 651–669. arXiv:math/0412398. doi:10.1137/070693709.
[7] Chesi, G.; Tesi, A.; Vicino, A.; Genesio, R. (1999). "On convexification of some minimum distance problems". Proceedings of the 5th European Control Conference. Karlsruhe, Germany: IEEE. pp. 1446–1451. https://ieeexplore.ieee.org/document/7099515
[8] Choi, M.; Lam, T.; Reznick, B. (1995). "Sums of squares of real polynomials". Proceedings of Symposia in Pure Mathematics. pp. 103–125.
[9] Chesi, G.; Garulli, A.; Tesi, A.; Vicino, A. (2003). "Robust stability for polytopic systems via polynomially parameter-dependent Lyapunov functions". Proceedings of the 42nd IEEE Conference on Decision and Control. Maui, Hawaii: IEEE. pp. 4670–4675. doi:10.1109/CDC.2003.1272307.
[10] Helton, J. William (2002). ""Positive" Noncommutative Polynomials Are Sums of Squares". The Annals of Mathematics. 156 (2): 675–694. doi:10.2307/3597203. JSTOR 3597203.
[11] Burgdorf, Sabine; Cafuta, Kristijan; Klep, Igor; Povh, Janez (2012). "Algorithmic aspects of sums of Hermitian squares of noncommutative polynomials". Computational Optimization and Applications. 55 (1): 137–153. doi:10.1007/s10589-012-9513-8.