.. _example_cross_decomposition_plot_compare_cross_decomposition.py:


===================================
Compare cross decomposition methods
===================================

Simple usage of various cross decomposition algorithms:

- PLSCanonical
- PLSRegression, with multivariate response, a.k.a. PLS2
- PLSRegression, with univariate response, a.k.a. PLS1
- CCA

Given two multivariate, covarying, two-dimensional datasets, X and Y,
PLS extracts the 'directions of covariance', i.e. the components of each
dataset that explain the most shared variance between the two datasets.
This is apparent on the **scatterplot matrix** display: component 1 in
dataset X and dataset Y is maximally correlated (the points lie around the
first diagonal). This is also true for component 2 in both datasets;
however, the correlation across datasets for different components is
weak: the point cloud is very spherical.
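
As a rough sketch (using an assumed synthetic dataset, not the example's
actual source, which is included in full below), such correlated components
can be extracted with PLSCanonical by fitting on both datasets and comparing
the transformed scores::

  import numpy as np
  from sklearn.cross_decomposition import PLSCanonical

  rng = np.random.RandomState(0)
  n = 500
  # Two shared latent variables drive both datasets (illustrative assumption).
  l1 = rng.normal(size=n)
  l2 = rng.normal(size=n)
  latents = np.array([l1, l1, l2, l2]).T
  X = latents + rng.normal(size=(n, 4))
  Y = latents + rng.normal(size=(n, 4))

  plsca = PLSCanonical(n_components=2)
  plsca.fit(X, Y)
  X_scores, Y_scores = plsca.transform(X, Y)

  # Component 1 of X and component 1 of Y are strongly correlated, while
  # cross-correlations between different components remain weak.
  print(np.corrcoef(X_scores[:, 0], Y_scores[:, 0])[0, 1])
  print(np.corrcoef(X_scores[:, 0], Y_scores[:, 1])[0, 1])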



.. image:: images/plot_compare_cross_decomposition_001.png
    :align: center


**Script output**::

  Corr(X)
  [[ 1.    0.51  0.07 -0.05]
   [ 0.51  1.    0.11 -0.01]
   [ 0.07  0.11  1.    0.49]
   [-0.05 -0.01  0.49  1.  ]]
  Corr(Y)
  [[ 1.    0.48  0.05  0.03]
   [ 0.48  1.    0.04  0.12]
   [ 0.05  0.04  1.    0.51]
   [ 0.03  0.12  0.51  1.  ]]
  True B (such that: Y = XB + Err)
  [[1 1 1]
   [2 2 2]
   [0 0 0]
   [0 0 0]
   [0 0 0]
   [0 0 0]
   [0 0 0]
   [0 0 0]
   [0 0 0]
   [0 0 0]]
  Estimated B
  [[ 1.   1.   1. ]
   [ 1.9  2.   2. ]
   [-0.  -0.   0. ]
   [ 0.   0.   0. ]
   [ 0.   0.   0. ]
   [ 0.   0.  -0. ]
   [-0.  -0.  -0.1]
   [-0.  -0.   0. ]
   [ 0.   0.   0.1]
   [ 0.   0.  -0. ]]
  Estimated betas
  [[ 1.]
   [ 2.]
   [ 0.]
   [ 0.]
   [ 0.]
   [-0.]
   [-0.]
   [ 0.]
   [-0.]
   [-0.]]
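
The "Estimated B" and "Estimated betas" blocks above are the regression
coefficients learned by PLSRegression (PLS2 for the multivariate response,
PLS1 for the univariate one). The following is a minimal sketch of how such
coefficients can be recovered, assuming a random design in which only the
first two features are active; the values will differ from the output above,
and the shape convention of ``coef_`` may vary across scikit-learn versions::

  import numpy as np
  from sklearn.cross_decomposition import PLSRegression

  rng = np.random.RandomState(0)
  n, p, q = 1000, 10, 3
  X = rng.normal(size=(n, p))
  # True B: only the first two features contribute to the response.
  B = np.array([[1] * q, [2] * q] + [[0] * q] * (p - 2))
  Y = X.dot(B) + rng.normal(size=(n, q))

  # PLS2: multivariate response
  pls2 = PLSRegression(n_components=3)
  pls2.fit(X, Y)
  print(np.round(pls2.coef_, 1))   # estimate of B

  # PLS1: univariate response
  pls1 = PLSRegression(n_components=3)
  pls1.fit(X, Y[:, 0])
  print(np.round(pls1.coef_, 1))   # estimate of the betas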



**Python source code:** :download:`plot_compare_cross_decomposition.py <plot_compare_cross_decomposition.py>`

.. literalinclude:: plot_compare_cross_decomposition.py
    :lines: 21-

**Total running time of the example:**  0.24 seconds
( 0 minutes  0.24 seconds)