Very efficient (or parallel) eigensolution for a complex symmetric matrix
I have a matrix that is complex but truly symmetric (not conjugate symmetric).
I'm trying to find a reference that will help me build an extremely efficient implementation for finding the first N eigenvalues, or the first N singular values if SVD makes more sense.
It would be even better if the implementation could be spread across multiple cores.
The matrix is up to about 1000x1000; a smaller size is possible, at some cost in quality, if absolutely essential.
David
Thanks to all for the kind contributions.
It turns out that I found an extremely promising scheme, referred to as randomized SVD, which fits our situation because the realistic number of modes is low.
I have experimented with it for my case; it seems to work very well and reduces the processing load by at least a factor of 10 in the experiments I have run.
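For anyone who wants to try it, here is a minimal sketch of the basic randomized SVD idea in Matlab. The function name, the oversampling parameter p, and the Gaussian test matrix are just illustrative choices, not our production code:

% Randomized SVD sketch: approximate the top k singular triplets of A
% (n x n, complex) by first compressing A onto a small random subspace.
function [U, S, V] = rsvd_sketch(A, k, p)
    % k: number of singular values wanted, p: oversampling (e.g. 10)
    n = size(A, 2);
    Omega = randn(n, k + p) + 1i*randn(n, k + p);  % random test matrix
    Y = A * Omega;                 % sample the range of A
    [Q, ~] = qr(Y, 0);             % orthonormal basis for that sample
    B = Q' * A;                    % small (k+p) x n matrix
    [Ub, S, V] = svd(B, 'econ');   % cheap SVD of the small matrix
    U = Q * Ub;                    % lift back to the original space
    U = U(:, 1:k);  S = S(1:k, 1:k);  V = V(:, 1:k);
end

The dominant cost is the two passes over A (A*Omega and Q'*A), which is why the load drops so sharply when the number of wanted modes is small compared to the matrix size.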
OK, that was cool. I hadn't seen the digitally reversed whiteboard before, unless he was really writing backwards. An interesting concept that I didn't know existed.
Very cool, and I bet he is writing backwards on a large transparent panel!
The reason why is that he is probably right-handed, and I watched him erase some things.
Also, the text is in front of him all the time, and it appears that he can see what we see by watching a monitor.
Ah. I think he is left-handed. Here's a YouTube video on how it's done:
Is the matrix well conditioned? Is an approximate solution sufficient?
If so, an iterative solver will probably have a reduced operation count compared to a classic QR/LU/SVD factorization. Have you already looked into iterative solvers?
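To illustrate what I mean (assuming Matlab, with A, n, and k as placeholder names), the built-in Krylov routines only ever touch the matrix through matrix-vector products:

% Iterative Krylov solver: the cost scales with the number of wanted
% eigenvalues, not with a full factorization of the 1000x1000 matrix.
n = size(A, 1);
k = 50;                      % how many eigenvalues to approximate
Afun = @(x) A*x;             % only a matvec is required
lambda = eigs(Afun, n, k);   % k largest-magnitude eigenvalues of A

If A has any structure (sparsity, low rank, a fast transform), the matvec inside Afun can exploit it, which is where iterative methods really pay off.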
Thanks for your insight.
For our Matlab implementation, we have used svds(), which allows us to specify that the processing should find only the first N singular values.
For our case, N is less than about 50 for a matrix with over 800x800 dimensionality.
Even with svds, the processing load is beyond what we can afford.
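For reference, the call is essentially this (placeholder names, and N = 50 is just the upper end of what we use):

% A: ~800x800 complex symmetric matrix, N: number of leading singular
% values we actually keep (typically under 50).
N = 50;
[U, S, V] = svds(A, N);    % only the N largest singular triplets
% versus the full decomposition, which computes all ~800 of them:
% [Uf, Sf, Vf] = svd(A);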
Just thinking out loud here. If the matrix is split into a complex conjugate part and a residual part (which would be complex and could be off-diagonal), does that shift the problem into an easier-to-solve arrangement?
Or any of the other ways of partitioning your matrix... then use lots of tricks I've long forgotten!
Thanks for your idea.
I will have to see if there is some promise there.
Offhand, I would say that you should separate the matrix into real and imaginary matrices, each of which would be symmetric. Factoring out "i" from the imaginary matrix would give you two eigenvalue problems to solve: one for the real-part matrix and one for the imaginary-part matrix. SVD can be used for both matrices.

Depending on the nature of each, it could turn out that the two matrices have different null spaces (eigenvectors associated with zero eigenvalues). If this is part of an inverse problem, that would complicate any regularization imposed on the problem.

I've performed SVD on matrices larger than 1000x1000 with Matlab on a single processor without much trouble. Interesting problem.
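Something like this is what I have in mind (just a sketch; B and C are names I'm using for the two parts):

% Split the complex symmetric matrix into two real symmetric matrices
% and decompose each separately (illustrative names; A is the original).
B = real(A);               % real-part matrix, symmetric
C = imag(A);               % imaginary-part matrix, symmetric (A = B + 1i*C)
[Ub, Sb, Vb] = svd(B);     % SVD of the real part
[Uc, Sc, Vc] = svd(C);     % SVD of the imaginary part
% The two parts can have different null spaces (singular values at zero),
% which is what complicates regularization in an inverse-problem setting.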
Chuck
Thanks Chuck.
We are using Matlab for the "offline" processing right now, and it does the job we need.
In fact, we are able to run in "real-time" for a significantly reduced number of simultaneous instances.
We are using svds() because we believe that the useful number of singular values is quite small compared to the size of the matrix (maybe < 50).