Visualization

Our Projects

The aim of the MMIV visualization project is to research new visualization solutions that enable users from the medical domain to explore, analyze, and present their data effectively and efficiently.

Core projects in the MMIV visualization project are:

News

MMIV Conference 2020 Videos

The videos from day 1 of the MMIV conference for which we have obtained permission from the speakers are now online on our YouTube channel. Be sure to subscribe to see the videos of day 2 as soon as they are uploaded!

Helse-Vest funding awarded to MMIV initiatives!

Helse-Vest has announced that it has allocated funds to 54 of the 276 applications submitted by the September 15 deadline. Leif Oltedal has received project support for his project “Disrupt, potentiate...

Related Publications

2020

  • E. Mörth, K. Wagner-Larsen, E. Hodneland, C. Krakstad, I. Haldorsen, S. Bruckner, and N. Smit, “RadEx: Integrated Visual Exploration of Multiparametric Studies for Radiomic Tumor Profiling,” Computer Graphics Forum, vol. 39, iss. 7, pp. 611-622, 2020. doi:10.1111/cgf.14172
    [BibTeX] [Download PDF]
    @article{morth2020radex,
      title        = {RadEx: Integrated Visual Exploration of Multiparametric Studies for Radiomic Tumor Profiling},
      author       = {M{\"o}rth, E and Wagner-Larsen, K and Hodneland, E and Krakstad, C and Haldorsen, IS and Bruckner, S and Smit, NN},
      year         = 2020,
      journal      = {Computer Graphics Forum},
      publisher    = {Wiley Online Library},
      volume       = {39},
      number       = {7},
      pages        = {611--622},
      doi          = {10.1111/cgf.14172},
      url          = {https://onlinelibrary.wiley.com/doi/full/10.1111/cgf.14172}
    }

  • E. Mörth, I. S. Haldorsen, S. Bruckner, and N. N. Smit, “ParaGlyder: Probe-driven Interactive Visual Analysis for Multiparametric Medical Imaging Data,” in Advances in Computer Graphics, 2020, pp. 351-363. doi:10.1007/978-3-030-61864-3_29
    [BibTeX] [Abstract] [Download PDF]

    Multiparametric imaging in cancer has been shown to be useful for tumor detection and may also depict functional tumor characteristics relevant for clinical phenotypes. However, when confronted with datasets consisting of multiple values per voxel, traditional reading of the imaging series fails to capture complicated patterns. These patterns of potentially important imaging properties of the parameter space may be critical for the analysis, but standard approaches do not deliver sufficient details. Therefore, in this paper, we present an approach that aims to enable the exploration and analysis of such multiparametric studies using an interactive visual analysis application to remedy the trade-offs between details in the value domain and in spatial resolution. This may aid in the discrimination between healthy and cancerous tissue and potentially highlight metastases that evolved from the primary tumor. We conducted an evaluation with eleven domain experts from different fields of research to confirm the utility of our approach.

    @inproceedings{moerth2020paraglyder,
      title        = {ParaGlyder: Probe-driven Interactive Visual Analysis for Multiparametric Medical Imaging Data},
      author       = {M{\"o}rth, Eric and Haldorsen, Ingfrid S. and Bruckner, Stefan and Smit, Noeska N.},
      year         = 2020,
      booktitle    = {Advances in Computer Graphics},
      publisher    = {Springer International Publishing},
      pages        = {351--363},
      doi          = {10.1007/978-3-030-61864-3_29},
      url          = {https://link.springer.com/chapter/10.1007%2F978-3-030-61864-3_29},
      abstract     = {Multiparametric imaging in cancer has been shown to be useful for tumor detection and may also depict functional tumor characteristics relevant for clinical phenotypes. However, when confronted with datasets consisting of multiple values per voxel, traditional reading of the imaging series fails to capture complicated patterns. These patterns of potentially important imaging properties of the parameter space may be critical for the analysis, but standard approaches do not deliver sufficient details. Therefore, in this paper, we present an approach that aims to enable the exploration and analysis of such multiparametric studies using an interactive visual analysis application to remedy the trade-offs between details in the value domain and in spatial resolution. This may aid in the discrimination between healthy and cancerous tissue and potentially highlight metastases that evolved from the primary tumor. We conducted an evaluation with eleven domain experts from different fields of research to confirm the utility of our approach.}
    }

  • L. Garrison, J. Vašíček, A. R. Craven, R. Grüner, N. N. Smit, and S. Bruckner, “Interactive visual exploration of metabolite ratios in MR spectroscopy studies,” Computers & Graphics, vol. 92, pp. 1-12, 2020. doi:10.1016/j.cag.2020.08.001
    [BibTeX] [Download PDF]
    @article{garrison2020spectra,
      title        = {Interactive visual exploration of metabolite ratios in {MR} spectroscopy studies},
      author       = {Garrison, Laura and Va\v{s}\'{\i}\v{c}ek, Jakub and Craven, Alexander R. and Gr\"{u}ner, Renate and Smit, Noeska N. and Bruckner, Stefan},
      year         = 2020,
      journal      = {Computers \& Graphics},
      publisher    = {Elsevier},
      volume       = {92},
      pages        = {1--12},
      doi          = {10.1016/j.cag.2020.08.001},
      url          = {https://www.sciencedirect.com/science/article/pii/S0097849320301199}
    }

2019

  • L. Garrison, J. Vasicek, R. Grüner, N. N. Smit, and S. Bruckner, “SpectraMosaic: An Exploratory Tool for the Interactive Visual Analysis of Magnetic Resonance Spectroscopy Data,” in Eurographics Workshop on Visual Computing for Biology and Medicine, 2019. doi:10.2312/vcbm.20191225
    [BibTeX] [Download PDF]
    @inproceedings{Garrison-2019-VCBM,
      title        = {{SpectraMosaic: An Exploratory Tool for the Interactive Visual Analysis of Magnetic Resonance Spectroscopy Data}},
      author       = {Garrison, Laura and Vasicek, Jakub and Gr\"{u}ner, Renate and Smit, Noeska N. and Bruckner, Stefan},
      year         = 2019,
      booktitle    = {Eurographics Workshop on Visual Computing for Biology and Medicine},
      publisher    = {The Eurographics Association},
      doi          = {10.2312/vcbm.20191225},
      url          = {https://diglib.eg.org/handle/10.2312/vcbm20191225}
    }

  • H. Bartsch, L. Garrison, S. Bruckner, A. Wang, S. F. Tapert, and R. Grüner, “MedUse: A Visual Analysis Tool for Medication Use Data in the ABCD Study,” in Eurographics Workshop on Visual Computing for Biology and Medicine, 2019. doi:10.2312/vcbm.20191236
    [BibTeX] [Download PDF]
    @inproceedings{bm.20191236,
      title        = {MedUse: A Visual Analysis Tool for Medication Use Data in the ABCD Study},
      author       = {Bartsch, Hauke and Garrison, Laura and Bruckner, Stefan and Wang, Ariel and Tapert, Susan F. and Gr\"{u}ner, Renate},
      year         = 2019,
      booktitle    = {Eurographics Workshop on Visual Computing for Biology and Medicine},
      publisher    = {The Eurographics Association},
      doi          = {10.2312/vcbm.20191236},
      url          = {https://diglib.eg.org/handle/10.2312/vcbm20191236}
    }

  • E. Mörth, R. G. Raidou, I. Viola, and N. Smit, “The Vitruvian Baby: Interactive Reformation of Fetal Ultrasound Data to a T-Position,” in Eurographics Workshop on Visual Computing for Biology and Medicine, 2019. doi:10.2312/vcbm.20191245
    [BibTeX] [Download PDF]
    @inproceedings{Moerth-2019-VCBM,
      title        = {The Vitruvian Baby: Interactive Reformation of Fetal Ultrasound Data to a T-Position},
      author       = {M{\"o}rth, Eric and Raidou, Renata Georgia and Viola, Ivan and Smit, Noeska},
      year         = 2019,
      booktitle    = {Eurographics Workshop on Visual Computing for Biology and Medicine},
      publisher    = {The Eurographics Association},
      doi          = {10.2312/vcbm.20191245},
      url          = {https://diglib.eg.org/handle/10.2312/vcbm20191245},
      pdf          = {pdfs/VCBM\_TheVitruvianBaby\_ShortPaper\_201-205.pdf},
      images       = {images/vcbmVitruvianBaby.jpg},
      thumbnails   = {images/vcbmVitruvianBaby.jpg}
    }

  • A. C. Kraima, N. P. West, N. Roberts, D. R. Magee, N. N. Smit, C. J. van de Velde, M. C. DeRuiter, H. J. Rutten, and P. Quirke, “The role of the longitudinal muscle in the anal sphincter complex: Implications for the Intersphincteric Plane in Low Rectal Cancer Surgery?,” Clinical Anatomy, 2019. doi:10.1002/ca.23444
    [BibTeX] [Download PDF]
    @article{kraima2019role,
      title        = {The role of the longitudinal muscle in the anal sphincter complex: Implications for the Intersphincteric Plane in Low Rectal Cancer Surgery?},
      author       = {Kraima, Anne C and West, Nicholas P and Roberts, Nicholas and Magee, Derek R and Smit, Noeska N and van de Velde, Cornelis JH and DeRuiter, Marco C and Rutten, Harm J and Quirke, Philip},
      year         = 2019,
      journal      = {Clinical Anatomy},
      publisher    = {Wiley Online Library},
      doi          = {10.1002/ca.23444},
      url          = {https://onlinelibrary.wiley.com/doi/full/10.1002/ca.23444}
    }

  • N. Smit and S. Bruckner, “Towards Advanced Interactive Visualization for Virtual Atlases,” in Biomedical Visualisation, Springer, 2019, pp. 85-96. doi:10.1007/978-3-030-19385-0_6
    [BibTeX] [Download PDF]
    @incollection{smit2019towards,
      title        = {Towards Advanced Interactive Visualization for Virtual Atlases},
      author       = {Smit, Noeska and Bruckner, Stefan},
      year         = 2019,
      booktitle    = {Biomedical Visualisation},
      publisher    = {Springer},
      pages        = {85--96},
      doi          = {10.1007/978-3-030-19385-0_6},
      url          = {http://noeskasmit.com/wp-content/uploads/2019/07/Smit\_AtlasVis\_2019.pdf}
    }

  • V. Solteszova, N. N. Smit, S. Stoppel, R. Grüner, and S. Bruckner, “Memento: Localized Time-Warping for Spatio-Temporal Selection,” Computer Graphics Forum, 2019. doi:10.1111/cgf.13763
    [BibTeX] [Download PDF]
    @article{solteszova2019memento,
      title        = {Memento: Localized Time-Warping for Spatio-Temporal Selection},
      author       = {Solteszova, V. and Smit, N. N. and Stoppel, S. and Gr\"{u}ner, R. and Bruckner, S.},
      year         = 2019,
      journal      = {Computer Graphics Forum},
      doi          = {10.1111/cgf.13763},
      url          = {https://onlinelibrary.wiley.com/doi/pdf/10.1111/cgf.13763},
      keywords     = {interaction, temporal data, visualization, spatio-temporal projection, Human-centred computing -> Visualization techniques, Scientific visualization, Mathematics of computing -> Time series analysis}
    }

  • N. Smit, K. Lawonn, A. Kraima, M. deRuiter, S. Bruckner, E. Eisemann, and A. Vilanova, “Model-based Visualization for Medical Education and Training,” in Eurographics 2019 – Dirk Bartz Prize, 2019. doi:10.2312/egm.20191033
    [BibTeX] [Download PDF]
    @inproceedings{m.20191033,
      title        = {{Model-based Visualization for Medical Education and Training}},
      author       = {Smit, Noeska and Lawonn, Kai and Kraima, Annelot and deRuiter, Marco and Bruckner, Stefan and Eisemann, Elmar and Vilanova, Anna},
      year         = 2019,
      booktitle    = {Eurographics 2019 - Dirk Bartz Prize},
      publisher    = {The Eurographics Association},
      doi          = {10.2312/egm.20191033},
      issn         = {1017-4656},
      url          = {http://noeskasmit.com/wp-content/uploads/2019/06/Smit\_DBPrize\_2019.pdf},
      editor       = {Bruckner, Stefan and Oeltze-Jafra, Steffen}
    }

  • M. Meuschke, N. N. Smit, N. Lichtenberg, B. Preim, and K. Lawonn, “EvalViz–Surface Visualization Evaluation Wizard for Depth and Shape Perception Tasks,” Computers & Graphics, vol. 82, pp. 250-263, 2019. doi:10.1016/j.cag.2019.05.022
    [BibTeX] [Download PDF]
    @article{meuschke2019evalviz,
      title        = {EvalViz--Surface Visualization Evaluation Wizard for Depth and Shape Perception Tasks},
      author       = {Meuschke, Monique and Smit, Noeska N and Lichtenberg, Nils and Preim, Bernhard and Lawonn, Kai},
      year         = 2019,
      journal      = {Computers \& Graphics},
      publisher    = {Elsevier},
      volume       = 82,
      pages        = {250--263},
      doi          = {10.1016/j.cag.2019.05.022},
      url          = {http://noeskasmit.com/wp-content/uploads/2019/06/Meuschke\_EvalViz\_2019.pdf}
    }

2018

  • M. Meuschke, N. N. Smit, N. Lichtenberg, B. Preim, and K. Lawonn, “Automatic Generation of Web-Based User Studies to Evaluate Depth Perception in Vascular Surface Visualizations,” in Eurographics Workshop on Visual Computing for Biology and Medicine, 2018, pp. 33-44. doi:10.2312/vcbm.20181227
    [BibTeX] [Abstract] [Download PDF]

    User studies are often required in biomedical visualization application papers in order to provide evidence for the utility of the presented approach. An important aspect is how well depth information can be perceived, as depth encoding is important to enable an understandable representation of complex data. Unfortunately, in practice there is often little time available to perform such studies, and setting up and conducting user studies may be labor-intensive. In addition, it can be challenging to reach enough participants to support the contribution claims of the paper. In this paper, we propose a system that allows biomedical visualization researchers to quickly generate perceptual task-based user studies for novel surface visualizations, and to perform the resulting experiment via a web interface. This approach helps to reduce effort in the setup of user studies themselves, and at the same time leverages a web-based approach that can help researchers attract more participants to their study. We demonstrate our system using the specific application of depth judgment tasks to evaluate vascular surface visualizations, since there is a lot of recent interest in this area. However, the system is also generally applicable for conducting other task-based user studies in biomedical visualization.

    @inproceedings{Meuschke_VCBM_2018,
      title        = {{Automatic Generation of Web-Based User Studies to Evaluate Depth Perception in Vascular Surface Visualizations}},
      author       = {Monique Meuschke and Noeska N. Smit and Nils Lichtenberg and Bernhard Preim and Kai Lawonn},
      year         = 2018,
      booktitle    = {{Eurographics Workshop on Visual Computing for Biology and Medicine}},
      publisher    = {Eurographics Association},
      pages        = {033--044},
      doi          = {10.2312/vcbm.20181227},
      isbn         = {978-3-03868-056-7},
      issn         = {2070-5786},
      url          = {http://noeskasmit.com/wp-content/uploads/2018/10/Meuschke\_2018.pdf},
      editor       = {Anna Puig Puig and Thomas Schultz and Anna Vilanova and Ingrid Hotz and Barbora Kozlikova and Pere-Pau V\'{a}zquez},
      abstract     = {User studies are often required in biomedical visualization application papers in order to provide evidence for the utility of the presented approach. An important aspect is how well depth information can be perceived, as depth encoding is important to enable an understandable representation of complex data. Unfortunately, in practice there is often little time available to perform such studies, and setting up and conducting user studies may be labor-intensive. In addition, it can be challenging to reach enough participants to support the contribution claims of the paper. In this paper, we propose a system that allows biomedical visualization researchers to quickly generate perceptual task-based user studies for novel surface visualizations, and to perform the resulting experiment via a web interface. This approach helps to reduce effort in the setup of user studies themselves, and at the same time leverages a web-based approach that can help researchers attract more participants to their study. We demonstrate our system using the specific application of depth judgment tasks to evaluate vascular surface visualizations, since there is a lot of recent interest in this area. However, the system is also generally applicable for conducting other task-based user studies in biomedical visualization.}
    }