Monday, March 31, 2014

SVD decomposition with numpy

Scientific computing in Python: computing the SVD decomposition with numpy

Source: http://glowingpython.blogspot.tw/2011/06/svd-decomposition-with-numpy.html
The original post is excerpted here for two reasons: original URLs often disappear after a while, and keeping the original text alongside lets readers check the source in case my rendering of it is inaccurate.

The SVD decomposition is a factorization of a matrix, with many useful applications in signal processing and statistics. In this post we will see

  • how to compute the SVD decomposition of a matrix A using numpy,
  • how to compute the inverse of A using the matrices computed by the decomposition,
  • and how to solve a linear equation system Ax=b using the SVD.

The SVD decomposition of a matrix A is of the form

A = U*Sigma*V^T

Since U and V are orthogonal (this means that U^T*U=I and V^T*V=I) we can write the inverse of A as (see Solving overdetermined systems with the QR decomposition for the tricks)

A^-1 = V*Sigma^-1*U^T

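The orthogonality claim is easy to verify numerically. The sketch below is my addition, not part of the original post, and uses an arbitrary example matrix; it checks that U^T*U=I, V^T*V=I, and that the factors reproduce A:

```python
import numpy as np

# a hypothetical 4x4 matrix, just for illustration
A = np.array([[ 2., -1.,  0.,  3.],
              [ 1.,  4., -2.,  0.],
              [ 0.,  5.,  1., -1.],
              [-3.,  2.,  0.,  1.]])

U, s, Vh = np.linalg.svd(A)   # note: np.linalg.svd returns V^T (here Vh)

# U and V are orthogonal: U^T*U = I and V^T*V = I
print(np.allclose(U.T @ U, np.eye(4)))      # True
print(np.allclose(Vh @ Vh.T, np.eye(4)))    # True

# the factors reproduce A: A = U*Sigma*V^T
print(np.allclose(U @ np.diag(s) @ Vh, A))  # True
```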
So, let's start computing the factorization and the inverse:

from numpy import *

A = floor(random.rand(4,4)*20-10) # generating a random
b = floor(random.rand(4,1)*20-10) # system Ax=b

U,s,V = linalg.svd(A) # SVD decomposition of A (note: V here holds V^T)

# computing the inverse using pinv
pinv = linalg.pinv(A)
# computing the inverse using the SVD decomposition
pinv_svd = dot(dot(V.T,linalg.inv(diag(s))),U.T)

print("Inverse computed by linalg.pinv()")
print(pinv)
print("Inverse computed using SVD")
print(pinv_svd)

As we can see from the output, pinv (computed by numpy's built-in linalg.pinv()) and pinv_svd (the pseudoinverse assembled from the SVD factors) are equal:

Inverse computed by linalg.pinv()
[[ 0.06578301 -0.04663721  0.0436917   0.089838  ]
 [ 0.15243004  0.044919   -0.03681885  0.00294551]
 [ 0.18213058 -0.01718213  0.06872852 -0.07216495]
 [ 0.03976436  0.09867452  0.03387334 -0.04270987]]
Inverse computed using SVD
[[ 0.06578301 -0.04663721  0.0436917   0.089838  ]
 [ 0.15243004  0.044919   -0.03681885  0.00294551]
 [ 0.18213058 -0.01718213  0.06872852 -0.07216495]
 [ 0.03976436  0.09867452  0.03387334 -0.04270987]]
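Comparing printed digits by eye is error-prone; numpy.allclose can confirm the agreement directly. A small self-contained sketch (my addition; the fixed seed is only for reproducibility):

```python
import numpy as np

np.random.seed(0)                        # fixed seed, illustrative only
A = np.floor(np.random.rand(4, 4) * 20 - 10)

U, s, V = np.linalg.svd(A)               # V here actually holds V^T
pinv = np.linalg.pinv(A)
pinv_svd = V.T @ np.linalg.inv(np.diag(s)) @ U.T

# the two pseudoinverses agree up to floating-point tolerance
print(np.allclose(pinv, pinv_svd))       # True
```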

Now, we can solve Ax=b using the inverse:

x = A^-1*b = V*Sigma^-1*U^T*b

or solving the system

U*Sigma*V^T*x = b

Multiplying both sides by U^T we obtain

Sigma*V^T*x = U^T*b

Then, letting c be U^T*b and w be V^T*x, we see that

Sigma*w = c

Since Sigma is diagonal, we can easily obtain w by solving the system above. And finally, we can obtain x from

x = V*w

Let's compare the results of those methods:

x = linalg.solve(A,b) # solve Ax=b using linalg.solve

xPinv = dot(pinv_svd,b) # solving Ax=b computing x = A^-1*b

# solving Ax=b using the equation above
c = dot(U.T,b)              # c = U^T*b
w = linalg.solve(diag(s),c) # solve Sigma*w = c for w
xSVD = dot(V.T,w)           # x = V*w (svd returned V^T, so V.T here is V)

print("Ax=b solutions compared")
print(x.T)
print(xSVD.T)
print(xPinv.T)

In the code above, x is the solution obtained directly with numpy's solver, xPinv is obtained by multiplying b by the pseudoinverse pinv_svd computed earlier, and xSVD is computed as x = V*w following the derivation above. Note in particular that linalg.svd returns the transpose of V, so when computing x = V*w the code must multiply V.T by w.
As expected, we have the same solutions:

Ax=b solutions compared
[[ 0.13549337 -0.37260677  1.62886598 -0.09720177]]
[[ 0.13549337 -0.37260677  1.62886598 -0.09720177]]
[[ 0.13549337 -0.37260677  1.62886598 -0.09720177]]
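Again, the agreement can be checked programmatically rather than by eye. The following sketch (my addition; the seed and matrices are illustrative) re-runs all three methods on a reproducible system and verifies that the solutions coincide and satisfy Ax=b:

```python
import numpy as np

np.random.seed(1)                        # fixed seed, illustrative only
A = np.floor(np.random.rand(4, 4) * 20 - 10)
b = np.floor(np.random.rand(4, 1) * 20 - 10)

U, s, V = np.linalg.svd(A)               # V here actually holds V^T

x = np.linalg.solve(A, b)                # direct solve of Ax=b
xPinv = np.linalg.pinv(A) @ b            # x = A^-1*b via pseudoinverse
c = U.T @ b                              # c = U^T*b
w = np.linalg.solve(np.diag(s), c)       # solve Sigma*w = c
xSVD = V.T @ w                           # x = V*w

print(np.allclose(x, xSVD))              # True
print(np.allclose(x, xPinv))             # True
print(np.allclose(A @ x, b))             # True
```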