Update README.md
parent 30a586eeac
commit 5c41e416b8
1 changed file with 18 additions and 13 deletions
README.md
@@ -31,7 +31,7 @@ We used a RTX 3090 as test GPU.
 - try to load previous mask
   - start: cleaned_load_data
   - start: load_data
-  - work in XXXX.npy
+  - work on XXXX.npy
   - np.load
   - organize acceptor (move to GPU memory)
   - organize donor (move to GPU memory)
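The load/organize steps above (np.load one trial, then move acceptor and donor to GPU memory) might look like this minimal sketch; the synthetic array stands in for the `XXXX.npy` trial file, and the channel order is an assumption:

```python
import numpy as np
import torch

# Stand-in for np.load("XXXX.npy"): a synthetic trial with
# 2 channels, 16 x 16 pixels, and 100 frames.
raw = np.random.rand(2, 16, 16, 100).astype(np.float32)

# Use GPU memory when available (the README tested on an RTX 3090).
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Organize acceptor and donor (channel indices 0/1 are an assumption)
# and move them to GPU memory.
acceptor = torch.from_numpy(raw[0]).to(device)
donor = torch.from_numpy(raw[1]).to(device)
```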
@@ -44,22 +44,23 @@ We used a RTX 3090 as test GPU.
 - move inter timeseries
   - acceptor time series and donor reference image
   - spatial pooling (i.e. 2d average pooling layer)
-  - data(x,y,t) = data(x,y,t) / data(x,y,t).mean(t) + 1
+  - acceptor(x,y,t) = acceptor(x,y,t) / acceptor(x,y,t).mean(t) + 1
+  - donor(x,y,t) = donor(x,y,t) / donor(x,y,t).mean(t) + 1
   - remove the heart beat via SVD from donor and acceptor
   - copy donor and acceptor and work on the copy with the SVD
   - remove the mean (over time)
   - use Cholesky whitening on data with SVD
   - scale the time series according to the spatial whitening
   - average time series over the spatial dimension (which is the global heart beat)
-  - use a normalized scalar product for get spatial scaling factors
+  - use a normalized scalar product for getting spatial scaling factors
   - scale the heartbeat with the spatial scaling factors into donor_residuum and acceptor_residuum
   - store the heartbeat as well as subtract it from the original donor and acceptor timeseries
-  - remove mean from donor and acceptor timeseries
-  - remove linear trends from donor and acceptor timeseries
+  - remove mean from donor and acceptor timeseries (- mean over time)
+  - remove linear trends from donor and acceptor timeseries (create a linear function and use a normalized scalar product for getting spatial scaling factors)
   - use the SVD heart beat for determining the scaling factors for donor and acceptor (heartbeat_scale)
   - apply bandpass donor_residuum (filtfilt)
   - apply bandpass acceptor_residuum (filtfilt)
-  - a normalized scalar product is used to determine the scale factor scale(x,y) from donor_residuum(x,y,t) and acceptor_residuum(x,y,t)
+  - a normalized scalar product is used to determine the scale factor scale(x,y) between donor_residuum(x,y,t) and acceptor_residuum(x,y,t)
   - calculate mask (optional); based on the heart beat power at the spatial positions
   - scale acceptor signal (heartbeat_scale_a(x,y) * result_a(x,y,t)) and donor signal (heartbeat_scale_d(x,y) * result_d(x,y,t))
   - heartbeat_scale_a = torch.sqrt(scale)
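The core of the hunk above — extract a global heart-beat time course via SVD, project each pixel onto it with a normalized scalar product, and subtract the scaled heartbeat — can be sketched as follows. This is a simplified stand-in, not the repository's implementation: it skips the Cholesky whitening and the filtfilt bandpass steps:

```python
import numpy as np

def remove_heartbeat_svd(data):
    """Simplified SVD heart-beat removal for data of shape (x, y, t)."""
    x, y, t = data.shape
    flat = data.reshape(x * y, t).astype(np.float64).copy()  # work on a copy
    flat -= flat.mean(axis=1, keepdims=True)        # remove the mean over time
    # The first right-singular vector approximates the global heart-beat
    # time course shared by all pixels.
    _, _, vt = np.linalg.svd(flat, full_matrices=False)
    heartbeat = vt[0]                               # shape (t,)
    # Normalized scalar product -> spatial scaling factor per pixel.
    scale = (flat @ heartbeat) / (heartbeat @ heartbeat)
    residuum = scale[:, None] * heartbeat[None, :]  # scaled heartbeat per pixel
    cleaned = flat - residuum                       # subtract from the timeseries
    return cleaned.reshape(x, y, t), residuum.reshape(x, y, t)
```

On rank-1 test data (one spatial map times one time course) this removes the heartbeat exactly; on real donor/acceptor data the whitening and bandpass steps listed above refine which component is picked and how it is scaled.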
@@ -74,8 +75,8 @@ We used a RTX 3090 as test GPU.
 - try to load previous mask
   - start cleaned_load_data
   - start load_data
-  - work in XXXX.npy
-  - np.load
+  - work on XXXX.npy
+  - np.load (load one trial)
   - organize acceptor (move to GPU memory)
   - organize donor (move to GPU memory)
   - organize oxygenation (move to GPU memory)
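The spatial pooling and per-pixel normalization that both pipelines apply after loading (2d average pooling, then `data(x,y,t) = data(x,y,t) / data(x,y,t).mean(t) + 1`) can be sketched like this; the array sizes and pooling kernel are assumptions:

```python
import torch

# Synthetic stand-in for one loaded channel, shape (x, y, t).
data = torch.rand(64, 64, 100)

# Spatial pooling (i.e. 2d average pooling layer): pool over x,y per frame.
pool = torch.nn.AvgPool2d(kernel_size=2, stride=2)
pooled = pool(data.permute(2, 0, 1)).permute(1, 2, 0)  # -> (32, 32, 100)

# data(x,y,t) = data(x,y,t) / data(x,y,t).mean(t) + 1
normalized = pooled / pooled.mean(dim=-1, keepdim=True) + 1
```

After this step every pixel's time series fluctuates around 2 (its mean over time is exactly 2), which puts donor, acceptor, oxygenation, and volume on a common scale before the later arithmetic.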
@@ -89,15 +90,19 @@ We used a RTX 3090 as test GPU.
 - move inter timeseries
   - acceptor time series and donor reference image; transformation also used on volume
   - spatial pooling (i.e. 2d average pooling layer)
-  - data(x,y,t) = data(x,y,t) / data(x,y,t).mean(t) + 1
+  - acceptor(x,y,t) = acceptor(x,y,t) / acceptor(x,y,t).mean(t) + 1
+  - donor(x,y,t) = donor(x,y,t) / donor(x,y,t).mean(t) + 1
+  - oxygenation(x,y,t) = oxygenation(x,y,t) / oxygenation(x,y,t).mean(t) + 1
+  - volume(x,y,t) = volume(x,y,t) / volume(x,y,t).mean(t) + 1
   - frame shift
   - the first frame of donor and acceptor time series is dropped
+  - the oxygenation and volume time series are interpolated between two frames (to compensate for the 5ms delay)
-  - measure heart rate (measure_heartbeat_frequency) i.e. find the frequency with the highest power in the frequency band
+  - measure heart rate (measure_heartbeat_frequency) i.e. find the frequency f_HB(x,y) with the highest power in the frequency band in the volume signal
   - use "regression" (i.e. iterative non-orthogonal basis decomposition); remove offset, linear trend, oxygenation and volume timeseries
-  - donor: measure heart beat spectral power (measure_heartbeat_power)
-  - acceptor: measure heart beat spectral power (measure_heartbeat_power)
+  - donor: measure heart beat spectral power (measure_heartbeat_power) f_HB(x,y) +/- 3Hz; results in power_d(x,y)
+  - acceptor: measure heart beat spectral power (measure_heartbeat_power) f_HB(x,y) +/- 3Hz; results in power_a(x,y)
   - scale acceptor and donor signals via the powers
   - scale(x,y) = power_d(x,y) / (power_a(x,y) + 1e-20)
   - heartbeat_scale_a = torch.sqrt(scale)
   - heartbeat_scale_d = 1.0 / (heartbeat_scale_a + 1e-20)
   - result(x,y,t) = 1.0 + result_a(x,y,t) - result_d(x,y,t)
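The final scaling arithmetic in the hunk above translates almost line for line into code. A minimal sketch, with random stand-ins for the measured heart-beat powers (`measure_heartbeat_power`) and the regression residuals:

```python
import torch

# Stand-ins for measured per-pixel heart-beat powers and for the
# regression residuals result_a / result_d; shapes are assumptions.
power_d = torch.rand(32, 32) + 0.1
power_a = torch.rand(32, 32) + 0.1
result_a = torch.rand(32, 32, 100)
result_d = torch.rand(32, 32, 100)

# scale(x,y) = power_d(x,y) / (power_a(x,y) + 1e-20)
scale = power_d / (power_a + 1e-20)
heartbeat_scale_a = torch.sqrt(scale)
heartbeat_scale_d = 1.0 / (heartbeat_scale_a + 1e-20)

# Scale both signals, then combine:
# result(x,y,t) = 1.0 + result_a(x,y,t) - result_d(x,y,t)
result = (1.0
          + heartbeat_scale_a[..., None] * result_a
          - heartbeat_scale_d[..., None] * result_d)
```

Splitting `scale` as `sqrt(scale)` on the acceptor and `1/sqrt(scale)` on the donor distributes the correction symmetrically over both channels instead of rescaling only one of them.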