# ICA | Exploring the Spatio-temporal Dynamics of Intrinsic Networks in fMRI with RNN-ICA

• Reposted from the WeChat public account: 机器学习炼丹术
• Notes by: 陈亦新
• Paper: Spatio-temporal Dynamics of Intrinsic Networks in Functional Magnetic Imaging Data Using Recurrent Neural Networks

## Introduction

While model degeneracy in time is convenient for learning, as an assumption about the data the explicit lack of temporal dependence necessarily marginalizes out dynamics, which must then be extrapolated in post-hoc analysis.

## Background

Here we formalize the problem of source separation with temporal dependencies and formulate the solution in terms of maximum likelihood estimation (MLE) and a recurrent model that parameterizes a conditionally independent distribution.

The data is composed of $N$ ordered sequences of length $T$, $X_n = (x_{1,n}, \dots, x_{T,n})$, where each element $x_{t,n}$ is a $D$-dimensional vector and the index $n$ enumerates the sequences.

The goal is to find a corresponding set of source signals $S_n$.

This problem can generally be understood as inference of unobserved or latent configurations from time-series observations.

It is convenient to assume that the sources $S_n$ are stochastic random variables with well-understood and interpretable noise, such as Gaussian or logistic variables with independence constraints.

Representable as a directed graphical model in time, the choice of a-priori model structure, such as the relationship between latent variables and observations, can have consequences on model capacity and inference complexity.

Directed graphical models often require complex approximate inference, which introduces variance into learning. Rather than solving the general problem in Equation 3, we will assume that the generating function $G(\cdot)$ is noiseless and that the source sequences $S_n$ have the same dimensionality as the data $X_n$, with each source signal composed of a set of conditionally independent components whose density is parameterized by a recurrent neural network (RNN).

We will show that the learning objective closely resembles that of noiseless independent component analysis (ICA). Assuming generation is noiseless and preserves dimensionality reduces variance that would otherwise hinder learning with high-dimensional, low-sample-size data such as fMRI.
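For reference, the noiseless, dimension-preserving assumption leads to the classical ICA log-likelihood (a standard result; here $W$ is the unmixing matrix, $w_i$ its rows, and $p_i$ the assumed component densities, which the paper parameterizes with an RNN conditioned on past time steps):

$$\log p(x) = \sum_{i=1}^{D} \log p_i\!\left(w_i^{\top} x\right) + \log\left|\det W\right|$$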

### Independent component analysis

• ICA principles – 蔡希玉's blog (CSDN)

• The difference between ICA and PCA – psybrain's blog (CSDN)

• ICA is also known as blind source separation (BSS).

• ICA stands for independent component analysis.

• A common analogy is the cocktail-party problem: imagine a concert hall or a party with microphones placed at various positions around the stage. Each microphone captures a mixture of the original signals, and there are as many mixed signals as there are microphones. The goal of ICA is to separate, or reconstruct, the unmixed signals from these mixtures.

【ICA vs PCA】 ICA recovers the source data by multiplying the observations with an unmixing matrix, whereas PCA decorrelates the output so that each successive component explains as much of the variance in the data as possible. ICA instead seeks outputs that are statistically independent, so that each component captures as much of the time-independent information in the data as possible.

ICA requires the number of independent sources to be specified in advance, so the user needs prior knowledge of the data and a grasp of its characteristics; the choice cannot be arbitrary. The PCA computation, by contrast, is entirely parameter-free.

• PCA is generally taken to assume that the source signals are mutually uncorrelated; its "sources" are the principal-component directions, and uncorrelatedness simply means those directions are orthogonal.
• ICA assumes the source signals are mutually independent, since the separated sources must remain statistically independent.
• PCA assumes the principal components are mutually orthogonal and the samples Gaussian-distributed; ICA requires the data to be non-Gaussian.
• The aim of PCA is to find the uncorrelated (orthogonal) part of the signal, which corresponds to second-order statistics (maximum variance). As discussed earlier, PCA has two implementations: eigendecomposition and SVD. PCA is a change of basis for the data vectors such that the transformed data has maximal variance, and the size of the variance measures how much information a variable carries.
• ICA finds the mutually independent parts that make up the signal; it does not require orthogonality and relies on higher-order statistics. In ICA theory, the observations X are obtained by linearly mixing the independent sources S with a mixing matrix A. The goal of ICA is to compute from X an unmixing matrix W such that applying W to X yields the best approximation of the sources S.

$$X = AS, \qquad A = W^{-1}, \qquad Y = WX = \hat{S}$$

• Unlike PCA, the goal of ICA is not to reduce the dimensionality of the data but to recover, as far as possible, signal sources with physiological or physical meaning from the mixtures.
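The relation above can be checked in the noiseless case with a toy sketch (the 2×2 mixing matrix and Laplace sources below are made up for illustration): with W = A⁻¹, Y = WX recovers S exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.laplace(size=(2, 1000))   # two independent non-Gaussian sources
A = np.array([[1.0, 0.5],
              [0.3, 1.0]])        # hypothetical mixing matrix
X = A @ S                         # observed mixtures: X = AS
W = np.linalg.inv(A)              # ideal unmixing matrix: W = A^{-1}
Y = W @ X                         # recovered sources: Y = WX
print(np.allclose(Y, S))          # True: exact recovery in the noiseless case
```

In practice A is unknown and W must be estimated from X alone, which is exactly what the FastICA example later in this post does.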

【ICA assumptions】

1. The source signals are mutually independent; that is, their joint distribution is the product of their marginal distributions.
2. The source signal distributions are non-Gaussian.
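The non-Gaussianity assumption can be probed empirically with excess kurtosis, which is zero for a Gaussian, negative for sub-Gaussian sources, and positive for super-Gaussian ones. A minimal sketch, assuming nothing beyond NumPy (the helper `excess_kurtosis` is our own):

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (0 for a Gaussian)."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2)**2 - 3.0

rng = np.random.default_rng(0)
g = rng.normal(size=100_000)      # Gaussian: excess kurtosis near 0
u = rng.uniform(-1, 1, 100_000)   # uniform: sub-Gaussian (about -1.2)
l = rng.laplace(size=100_000)     # Laplace: super-Gaussian (about +3)
print(excess_kurtosis(g), excess_kurtosis(u), excess_kurtosis(l))
```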

## Implementing ICA in Python

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import FastICA

# build four different source signals of length 200
C = 200
x = np.arange(C)
s1 = 2 * np.sin(0.02 * np.pi * x)            # sine wave
a = np.linspace(-2, 2, 25)
s2 = np.concatenate([a] * 8)                 # sawtooth wave
s3 = np.array(20 * (5 * [2] + 5 * [-2]))     # square wave
s4 = np.random.random(C)                     # uniform noise

# plot the four sources
ax1 = plt.subplot(411)
ax2 = plt.subplot(412)
ax3 = plt.subplot(413)
ax4 = plt.subplot(414)
ax1.plot(s1)
ax2.plot(s2)
ax3.plot(s3)
ax4.plot(s4)
plt.show()
```



```python
# mix the sources with a random 4x4 mixing matrix: mix = ran @ s
s = np.array([s1, s2, s3, s4])
ran = np.random.random([4, 4])
mix = np.dot(ran, s)

# plot the four observed mixtures
ax1 = plt.subplot(411)
ax2 = plt.subplot(412)
ax3 = plt.subplot(413)
ax4 = plt.subplot(414)
ax1.plot(mix[0])
ax2.plot(mix[1])
ax3.plot(mix[2])
ax4.plot(mix[3])
plt.show()
```


```python
# unmix with FastICA; fit_transform expects (samples, features), hence mix.T
ica = FastICA(n_components=4)
u = ica.fit_transform(mix.T)   # estimated sources, shape (200, 4)
print(ica.n_iter_)             # iterations until convergence

# plot the recovered components (order and sign/scale are arbitrary)
ax1 = plt.subplot(411)
ax2 = plt.subplot(412)
ax3 = plt.subplot(413)
ax4 = plt.subplot(414)
ax1.plot(u[:, 0])
ax2.plot(u[:, 1])
ax3.plot(u[:, 2])
ax4.plot(u[:, 3])
plt.show()
```


```python
# equivalently, apply the learned unmixing matrix W manually;
# transform() subtracts the fitted mean first, so we do the same here
ica = FastICA(n_components=4)
ica.fit(mix.T)
w = ica.components_                      # unmixing matrix W
u = np.dot(w, mix - ica.mean_[:, None])  # Y = W(X - mean)
ax1 = plt.subplot(411)
ax2 = plt.subplot(412)
ax3 = plt.subplot(413)
ax4 = plt.subplot(414)
ax1.plot(u[0])
ax2.plot(u[1])
ax3.plot(u[2])
ax4.plot(u[3])
plt.show()
```
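As a final sanity check (not in the original snippet), each recovered component should match exactly one true source up to permutation and sign/scale; absolute correlations make this easy to verify. The seeds below are arbitrary:

```python
import numpy as np
from sklearn.decomposition import FastICA

# rebuild the four demo sources from the snippet above
rng = np.random.default_rng(0)
C = 200
x = np.arange(C)
s1 = 2 * np.sin(0.02 * np.pi * x)            # sine
a = np.linspace(-2, 2, 25)
s2 = np.concatenate([a] * 8)                 # sawtooth
s3 = np.array(20 * (5 * [2] + 5 * [-2]))     # square wave
s4 = rng.random(C)                           # uniform noise
s = np.array([s1, s2, s3, s4])

mix = rng.random((4, 4)) @ s                 # random linear mixtures
u = FastICA(n_components=4, random_state=0).fit_transform(mix.T).T

# absolute correlation between every true source and every estimate
corr = np.abs(np.corrcoef(np.vstack([s, u]))[:4, 4:])
print(corr.max(axis=1))   # one strong match per source
```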