{ "cells": [ { "cell_type": "markdown", "source": [ "# SPECTrecon ML-EM\n", "\n", "This page illustrates ML-EM reconstruction with the Julia package\n", "[`SPECTrecon`](https://github.com/JuliaImageRecon/SPECTrecon.jl)." ], "metadata": {} }, { "cell_type": "markdown", "source": [ "## Setup" ], "metadata": {} }, { "cell_type": "markdown", "source": [ "Packages needed here." ], "metadata": {} }, { "outputs": [], "cell_type": "code", "source": [ "using SPECTrecon: SPECTplan, psf_gauss, project!, backproject!, mlem, mlem!\n", "using MIRTjim: jim, prompt\n", "using Plots: scatter, plot!, default; default(markerstrokecolor=:auto)" ], "metadata": {}, "execution_count": null }, { "cell_type": "markdown", "source": [ "The following line is helpful when running this example.jl file as a script;\n", "this way it will prompt user to hit a key after each figure is displayed." ], "metadata": {} }, { "outputs": [], "cell_type": "code", "source": [ "isinteractive() ? jim(:prompt, true) : prompt(:draw);" ], "metadata": {}, "execution_count": null }, { "cell_type": "markdown", "source": [ "## Overview\n", "\n", "Maximum-likelihood expectation-maximization (ML-EM)\n", "is a classic algorithm for performing SPECT image reconstruction." ], "metadata": {} }, { "cell_type": "markdown", "source": [ "## Simulation data" ], "metadata": {} }, { "outputs": [], "cell_type": "code", "source": [ "nx,ny,nz = 64,64,50\n", "T = Float32\n", "xtrue = zeros(T, nx,ny,nz)\n", "xtrue[(1nx÷4):(2nx÷3), 1ny÷5:(3ny÷5), 2nz÷6:(3nz÷6)] .= 1\n", "xtrue[(2nx÷5):(3nx÷5), 1ny÷5:(2ny÷5), 4nz÷6:(5nz÷6)] .= 2\n", "\n", "average(x) = sum(x) / length(x)\n", "function mid3(x::AbstractArray{T,3}) where {T}\n", " (nx,ny,nz) = size(x)\n", " xy = x[:,:,ceil(Int, nz÷2)]\n", " xz = x[:,ceil(Int,end/2),:]\n", " zy = x[ceil(Int, nx÷2),:,:]'\n", " return [xy xz; zy fill(average(xy), nz, nz)]\n", "end\n", "jim(mid3(xtrue), \"Middle slices of xtrue\")" ], "metadata": {}, "execution_count": null }, { "cell_type": "markdown", "source": [ "## PSF" ], "metadata": {} }, { "cell_type": "markdown", "source": [ "Create a synthetic depth-dependent PSF for a single view" ], "metadata": {} }, { "outputs": [], "cell_type": "code", "source": [ "px = 11\n", "psf1 = psf_gauss( ; ny, px)\n", "jim(psf1, \"PSF for each of $ny planes\")" ], "metadata": {}, "execution_count": null }, { "cell_type": "markdown", "source": [ "In general the PSF can vary from view to view\n", "due to non-circular detector orbits.\n", "For simplicity, here we illustrate the case\n", "where the PSF is the same for every view." ], "metadata": {} }, { "outputs": [], "cell_type": "code", "source": [ "nview = 60\n", "psfs = repeat(psf1, 1, 1, 1, nview)\n", "size(psfs)" ], "metadata": {}, "execution_count": null }, { "cell_type": "markdown", "source": [ "## SPECT system model using `LinearMapAA`" ], "metadata": {} }, { "outputs": [], "cell_type": "code", "source": [ "dy = 8 # transaxial pixel size in mm\n", "mumap = zeros(T, size(xtrue)) # zero μ-map just for illustration here\n", "plan = SPECTplan(mumap, psfs, dy; T)\n", "\n", "using LinearMapsAA: LinearMapAA\n", "using LinearAlgebra: mul!\n", "forw! = (y,x) -> project!(y, x, plan)\n", "back! = (x,y) -> backproject!(x, y, plan)\n", "idim = (nx,ny,nz)\n", "odim = (nx,nz,nview)\n", "A = LinearMapAA(forw!, back!, (prod(odim),prod(idim)); T, odim, idim)" ], "metadata": {}, "execution_count": null }, { "cell_type": "markdown", "source": [ "## Basic Expectation-Maximization (EM) algorithm" ], "metadata": {} }, { "cell_type": "markdown", "source": [ "Noisy data" ], "metadata": {} }, { "outputs": [], "cell_type": "code", "source": [ "using Distributions: Poisson\n", "\n", "if !@isdefined(ynoisy) # generate (scaled) Poisson data\n", " ytrue = A * xtrue\n", " target_mean = 20 # aim for mean of 20 counts per ray\n", " scale = target_mean / average(ytrue)\n", " scatter_fraction = 0.1 # 10% uniform scatter for illustration\n", " scatter_mean = scatter_fraction * average(ytrue) # uniform for simplicity\n", " background = scatter_mean * ones(T,nx,nz,nview)\n", " ynoisy = rand.(Poisson.(scale * (ytrue + background))) / scale\n", "end\n", "jim(ynoisy, \"$nview noisy projection views\")" ], "metadata": {}, "execution_count": null }, { "cell_type": "markdown", "source": [ "## ML-EM algorithm - basic version" ], "metadata": {} }, { "outputs": [], "cell_type": "code", "source": [ "x0 = ones(T, nx, ny, nz) # initial uniform image\n", "\n", "niter = 30\n", "if !@isdefined(xhat1)\n", " xhat1 = mlem(x0, ynoisy, background, A; niter)\n", "end\n", "size(xhat1)" ], "metadata": {}, "execution_count": null }, { "cell_type": "markdown", "source": [ "This preferable ML-EM version preallocates the output `xhat2`:" ], "metadata": {} }, { "outputs": [], "cell_type": "code", "source": [ "if !@isdefined(xhat2)\n", " xhat2 = copy(x0)\n", " mlem!(xhat2, x0, ynoisy, background, A; niter)\n", "end\n", "@assert xhat1 ≈ xhat2\n", "\n", "jim(mid3(xhat2), \"ML-EM at $niter iterations\")" ], "metadata": {}, "execution_count": null }, { "cell_type": "markdown", "source": [ "---\n", "\n", "*This notebook was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).*" ], "metadata": {} } ], "nbformat_minor": 3, "metadata": { "language_info": { "file_extension": ".jl", "mimetype": "application/julia", "name": "julia", "version": "1.11.1" }, "kernelspec": { "name": "julia-1.11", "display_name": "Julia 1.11.1", "language": "julia" } }, "nbformat": 4 }