
Biomedical Paper

Accuracy Evaluation of a 3D Ultrasound-Based Neuronavigation System

Frank Lindseth, M.Sc., Thomas Langø, Ph.D., Jon Bang, Ph.D., and Toril A. Nagelhus Hernes, Ph.D.

SINTEF Unimed, Ultrasound (F.L., T.L., J.B., T.A.N.H.), and the Norwegian University of Science and Technology, Department of Computer and Information Science (F.L.), Trondheim, Norway

ABSTRACT  We have investigated the 3D navigation accuracy of a frameless ultrasound-based neuronavigation system (SonoWand®) for surgical planning and intraoperative image guidance. In addition, we present a detailed description and review of the error sources associated with surgical neuronavigation based on preoperative MRI data and intraoperative ultrasound. A phantom with 27 precisely defined points was scanned with ultrasound by various translation and tilt movements of the ultrasound probe (180 3D scans in total), and the 27 image points in each volume were located using an automatic detection algorithm. These locations were compared to the physically measured locations of the same 27 points. The accuracy of the neuronavigation system and the effect of varying acquisition conditions were found through a thorough statistical analysis of the differences between the two point sets. The accuracy was found to be 1.40 ± 0.45 mm (arithmetic mean) for the ultrasound-based neuronavigation system in our laboratory setting. Improper probe calibration was the major contributor to this figure. Based on our extensive data set and thorough evaluation, the accuracy found in the laboratory setting is expected to be close to the overall clinical accuracy for ultrasound-based neuronavigation. Our analysis indicates that the overall clinical accuracy may be as low as 2 mm when using intraoperative imaging to compensate for brain shift. Comp Aid Surg 7:197–222 (2002). ©2002 Wiley-Liss, Inc.

Key words: accuracy evaluation; computer-assisted surgery; neuronavigation; 3D ultrasound; neurosurgery
Key links: http://www.us.unimed.sintef.no/; http://www.mison.no/

INTRODUCTION

Several commercial three-dimensional (3D) navigation systems for image-guided surgery are available today, and are being used routinely in neurosurgery.1–6 The expected benefits from such systems are improved and easier understanding of 3D orientation and anatomy, more confident surgeons, more precise surgical planning and interventions, reduction of residual tumor volumes (i.e., more radical tumor resections), reduced operation times, and better patient outcomes.7–17


Received October 19, 2001; accepted September 24, 2002.

Address correspondence/reprint requests to: Thomas Langø, SINTEF Unimed Ultrasound, 7465 Trondheim, Norway. E-mail: Thomas.Lango@sintef.no

Published online in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/igs.10046


Conventional systems based on preoperative data such as magnetic resonance imaging (MRI) and computed tomography (CT) scans will not reflect the true anatomy of the patient, as changes in anatomy occur during a surgical procedure. Systems based on intraoperative MRI or intraoperative ultrasound can, to a certain degree, compensate for these changes by updating the 3D map of the patient during surgery. In an interventional MRI system, the surgeon is operating inside the magnet, and by choosing speed over quality he is able to obtain one image per second.18 On the other hand, neuronavigation systems based on intraoperative ultrasound are becoming more accepted due to improved image quality and real-time 2D and 3D freehand capabilities, as well as future real-time 3D possibilities.19,20 There are two different ways of using 3D ultrasound to obtain a map that corresponds to the anatomy at all times. In the first, the surgeon uses ultrasound indirectly to track the anatomical changes that occur, then uses these changes to elastically deform the preoperative image data. Navigation is then undertaken with reference to the manipulated preoperative MRI/CT data.4 In the second approach, direct navigation is conducted according to the ultrasound data itself.1 The system investigated in the present study uses the latter approach.

The delicacy, precision, and extent of the work that the surgeon can perform based on image information rely on the surgeon's confidence in the overall clinical accuracy and the anatomic or pathologic representation. The overall clinical accuracy in image-guided surgery is a measure of the difference between the apparent location of a surgical tool relative to some structure as indicated in the image information presented to the surgeon, and the actual location of the tool relative to that same structure in the patient. This accuracy is difficult to assess in a clinical setting, due to the lack of fixed and well-defined landmarks inside the patient that can be reached accurately by a pointer. It is therefore common practice to estimate the system's overall accuracy in a controlled laboratory setting using precisely built phantoms.21–23 To draw a conclusion on the potential clinical accuracy, the differences between the clinical and laboratory settings must be carefully examined.

Several investigators have estimated the accuracy of navigation systems for surgery.6,21–25 Although much has been written about conventional neuronavigation systems, the literature on accuracy measurements of ultrasound-based neuronavigation is sparse. Hata et al.4 reported a root-mean-square (RMS) error of 3.1 mm, with standard deviation 2.5 mm, at a depth of 10 mm from the transducer. These numbers represent the difference between the position of a phantom point in one MRI scan and its position in several ultrasound scans. Similarly, Comeau et al.26 reported an error of less than 1.3 mm when mapping an ultrasound image pixel to its homologous MRI pixel, as measured on a custom-built phantom. Hartov et al.23 performed a phantom error analysis of a 3D ultrasound-based neuronavigation system, and reported an overall error of 2.96 ± 1.85 mm when locating features in ultrasound images.

More information is available on navigation systems based on preoperative MRI, and the studies cover a broad range of different positioning systems, various commercially available navigation tools, and imaging and registration techniques, as well as setup and measurement techniques. A common way of assessing the overall accuracy has been to use a rigid and precise phantom/head model in the laboratory and measure the registration accuracy. Typical results for skin fiducial-based registrations are in the order of 2 mm27 or better.28,29

Registration based on anatomical landmarks typically yields poorer accuracy.27,29 Other groups have measured the accuracy in a clinical setting where fiducials and/or the skin surface have been used. For fiducial registration, mean error results of 1.6 mm,30 2.51 mm,6 and approximately 2 mm31 have been found. Other reported mean error results are 3.03 mm for surface-fit registration,6 and 3.4 mm using facial landmarks.30

However, a perfect match on the skin surface does not necessarily imply a perfect match deep inside the brain. Rotation errors in the registration procedure may be difficult to discover, and may result in significant errors at the skull base. Schaller et al.5 performed measurements inside the brain using landmarks such as the internal table of the skull, the falx, the tentorium, or the clinoid processes. They reported errors in the order of 3 mm. This approach is interesting, but it is difficult to obtain a precise measurement in more than one or two dimensions, and the number of measurements in one patient will be sparse.

To gain a better understanding of the relationship between laboratory and clinical accuracy, we will describe and compare the most relevant error sources associated with neuronavigation based on preoperative MRI and the error sources associated with neuronavigation based on intraoperative 3D ultrasound. Navigation based on preoperative MRI is well established and more frequently used compared to interventional MRI. Furthermore, the analysis includes patient registration based on skin fiducials, with some comments on other registration methods. Apart from these restrictions, the description of MRI error sources is intended to be general. Similarly, the section on ultrasound error sources covers the general steps necessary for ultrasound-based neuronavigation. In both sections, however, the cited error magnitudes are based on a combination of the available literature and our own experience during 7 years of ultrasound-based image guidance in neurosurgery.

Error Sources Associated with Neuronavigation Based on Preoperative MRI

Preoperative imaging. Fiducials are glued to the skin of the patient (action item 1 in Table 1). The patient is then positioned in the MRI scanner on his/her back (action item 2). The normal procedure in our hospital is to stabilize the head with bi-temporal padding and a strap across the forehead. The fiducials can easily slide several millimeters during the procedure and cause a significant error (error source A in Table 1). We normally place five fiducials on the patient, but we avoid the back part of the head/skull and exclude the padding and strap to minimize this error. A 3D MRI scan is performed (3), and a digital data set is acquired. Inhomogeneities of the magnetic field and nonlinear gradients cause geometric errors (error source B) between the true anatomy and the image information. We have no documentation for this error in our MRI system. However, Sumanaweera et al.32 reported values in the order of 2 mm for the average difference between the true patient anatomy and the anatomy represented by MR images when no actions are taken to correct for MR field distortions. The selected slice thickness and distance between consecutive slices will also affect the accuracy.33,34 The images are transferred to the navigation system and organized into a regular 3D volume (4). If the original images have a different pixel resolution and image distance than the specified voxel resolution of the regular 3D volume (which is normally not the case), this process will introduce a small quantization/interpolation error (C).

Patient registration. Next, the patient is placed on the operating table and the position reference frame is attached to the head frame (5). Brain shift due to gravity can occur (D), especially if the head orientation is different from that in the MRI scanner. This effect is often accounted for in functional stereotaxy.

Table 1. Error Chain Associated with Neuronavigation Based on Preoperative MRI Data

Action item                                 Error source                                        Error magnitude (mm)
1  Glue fiducials to skin
2  Position patient in MR scanner           A  Skin/fiducial slide                              *
3  Perform 3D MRI                           B  Geometric distortion in MR data                  34; 34; 32
4  Load images and generate volume          C  Quantization in volume reconstruction            *
5  Position patient on operating table      D  Brain shift due to gravity                       46
                                            E  Skin/fiducial slide due to gravity               *
6  Mark fiducials in MR images              F  Fiducial identification                          *; 6,27
7  Touch fiducials with pointer             G  Skin/fiducial slide due to pointer pressure      *,27
                                            H  Position system and pointer tip definition       24,35
                                            I  Pointing                                         *; 47
8  Match images to patient                  J  Matching algorithm                               *
9  Point and reconstruct images with        K  2D image extraction                              *
   overlaid colored cross                   L  Quantization, colored cross                      *
                                            M  Interpretation                                   *
                                            N  Position system and tool tip definition          35
10 Surgery and brain movements              O  Discrepancy between anatomy and images           *,48; *,45,48

In the error magnitude column, where values are grouped as <1, 1–2, or >2 mm, a star (*) indicates that the error value is based on our own experience and algorithm tests, while the numbers refer to the list of references.


The skin and fiducials can also slide and deviate several millimeters from their original position in the MRI scanner due to gravity and/or manipulation during positioning of the head in the head frame (E). This effect is probably most pronounced for elderly people.

The next step in the procedure is to register the position of the fiducials in the 3D data set by a manual or automatic method (6). This process is subject to operator errors and possibly algorithm/quantization errors depending on the slice distance and zoom factor (magnification) of the images in which the operator is supposed to pinpoint the fiducials (F). A pointer is used to mark the fiducials, and hence register the patient relative to the preoperative image data (7). The fiducials/skin can again slide several millimeters27 due to the applied pointer pressure (G), and the position system is subject to a certain error in the measurement and calculation of the pointer tip position (H).24,35 The operator normally attempts to point at the center of the fiducial, but this procedure may also be subject to a pointing error (I), depending on the fiducial type being used.

A matching algorithm is then applied to find a best match, i.e., a transformation between the patient space and image space (8). Both direct and iterative minimization algorithms exist. For point (fiducial or anatomical)-based registration, the accuracy will depend on the chosen implementation and the number of points used (J).
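For illustration, one common direct (closed-form) solution for point-based registration is the least-squares rigid transform obtained from the singular value decomposition of the cross-covariance of the two point sets. The sketch below is a generic Python/NumPy implementation of that idea, not the algorithm used by any particular commercial system, and the fiducial coordinates are invented for the example.

```python
import numpy as np

def rigid_register(image_pts, patient_pts):
    """Least-squares rigid transform (rotation R, translation t) mapping
    image_pts onto patient_pts via the SVD of the cross-covariance matrix.
    Both inputs are (N, 3) arrays of corresponding points, N >= 3."""
    ci = image_pts.mean(axis=0)                    # centroid in image space
    cp = patient_pts.mean(axis=0)                  # centroid in patient space
    H = (image_pts - ci).T @ (patient_pts - cp)    # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = cp - R @ ci
    return R, t

# Hypothetical fiducial coordinates (mm) identified in the MR volume ...
mri_fiducials = np.array([[10.2, 35.1, 80.0],
                          [60.5, 40.3, 78.2],
                          [35.0, 90.8, 75.5],
                          [20.1, 60.0, 20.3],
                          [70.4, 65.2, 22.8]])
# ... and the same fiducials touched with the tracked pointer (patient space, mm).
pointer_fiducials = np.array([[112.0, 33.9, 181.1],
                              [162.1, 40.1, 178.0],
                              [136.8, 90.2, 177.4],
                              [121.5, 58.8, 121.9],
                              [171.9, 64.9, 123.6]])

R, t = rigid_register(mri_fiducials, pointer_fiducials)
# Fiducial registration error (FRE): RMS residual after the fit.
residuals = (R @ mri_fiducials.T).T + t - pointer_fiducials
fre = np.sqrt((residuals ** 2).sum(axis=1).mean())
print("FRE = %.2f mm" % fre)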

In the error chain of Table 1, we consider point (fiducial)-based patient registration only, as this method is used in our clinic, and is also frequently used by others. Another common method is surface-based patient registration, in which a surface map of the head/face is created by sweeping the skin with a 3D spatial digitizer, or by imaging a projected light pattern with an array of video cameras. The surface map is then registered to, for example, the preoperative MRI or CT data by minimizing some form of cost function. Some studies have shown that, in neurosurgery, the errors for fiducial-based methods are smaller than the errors for surface-fitting and anatomical landmark-based methods.6,27,30,31 However, for other surgical applications, such as ENT surgery, surface point registration techniques have proven reliable and robust, with reported errors as small as 1.5 mm.36

Preoperative planning. A pointer or other calibrated surgical instrument/tool is applied to the skin surface or within the brain to perform image-guided planning. The navigation system will reconstruct a 2D image from the 3D volume and draw a cross in the images at a location given by the tool (9). This process is subject to an interpolation error from 3D to 2D data (K), a quantization error in the positioning of the cross in the images (L), and an operator-dependent interpretation error when the surgeon is supposed to position the cross at a desired point in the image space (M). The tool tip is now supposed to be located exactly at the spot on the patient that is indicated by the crosshairs in the images. However, this will rarely happen in practice due to the error in measuring and estimating the tool tip position (N), as well as the contribution of the other error sources listed above.

Intraoperative guidance. The last action item on the list is the actual surgery in the brain (10). Most procedures will cause a certain amount of brain shift, and possibly a deformation of normal and pathologic structures. These changes may easily cause a discrepancy between the image space (3D MRI) and the patient space in the order of several millimeters or even centimeters (O).

Error Sources Associated with Neuronavigation Based on 3D Intraoperative Ultrasound

The following is a description of the general steps necessary when performing ultrasound-based neuronavigation, as outlined in Table 2. The action item and error source numbering is continued from Table 1.

The craniotomy is planned, and in some cases a separate minicraniotomy is made for the ultrasound probe. The skull is opened and the first 3D ultrasound scan is acquired for further planning. However, before 3D freehand ultrasound imaging can begin, a probe calibration procedure must be performed to determine the position and orientation of the scan plane relative to the sensor attached to the ultrasound probe (action item 11 in Table 2). Various algorithms exist for probe calibration (see, e.g., the work by Langø37 or Prager et al.38). The possible error in this transformation (P) has been investigated by others37–39 for various probe calibration methods, and will also be evaluated in this study. Prior to ultrasound imaging in the operating room, the ultrasound probe is covered with a thin sterile drape and the position sensor frame is attached to the probe housing (12). Proper design of the adapter glued to the probe housing and the sensor frame ensures a small or negligible repeatability error associated with this process, even through the sterile drape (Q).

A 3D ultrasound volume is then acquired using a position system to track the position and orientation of the probe (13). The probe is tilted ≈90° in ≈15 s to acquire ≈200 images from the volume of interest in the brain. This process is subject to errors in tracking of the position sensor attached to the probe (R), errors in the synchronization between the positioning data and the images (S), and uncertainty in the positioning of each 2D scan plane due to the finite time required for the acquisition of one image relative to the continuous probe movement during scanning (T).

The images with position tags are transferred to the navigation computer and reconstructed (scan-converted) into a regular 3D volume (14) with a certain grid resolution. This process is subject to geometric errors caused by the difference between the real speed of sound in the brain and that assumed in the algorithm (U), and by interpolation errors in the 3D scan conversion procedure, in combination with the selected 3D grid resolution (V). Furthermore, the ultrasound plane actually has a finite thickness, which implies an additional position uncertainty for the elements in the acquired image. The error may be associated with various stages in Table 2 (probe calibration, data acquisition, volume reconstruction). We have chosen to consider this error source as part of the 3D reconstruction error (W). The magnitude of this uncertainty varies with the probe type and the distance from the probe. The value listed in Table 2 is an estimated average over the depth range of interest in our experiment for the actual probe used.
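The core of any freehand 3D reconstruction of this kind is a chain of rigid transforms: each pixel is mapped from image coordinates, through the probe-calibration matrix and the tracked sensor pose, into the reference frame, and then binned into a voxel grid. The scan-conversion algorithm in the system studied here is proprietary; the sketch below only illustrates a generic pixel-nearest-neighbor variant, and all matrices and helper names are assumptions made for the example (the 0.65 mm voxel size used later in the paper would be a natural choice for voxel_mm).

```python
import numpy as np

def pixel_to_reference(u, v, px_mm, T_image_to_sensor, T_sensor_to_reference):
    """Map pixel (u, v) of one 2D frame into the tracker reference frame.
    px_mm is the pixel size in mm; the two 4x4 matrices are the probe
    calibration (error source P) and the tracked sensor pose (R/S/T)."""
    p_image = np.array([u * px_mm, v * px_mm, 0.0, 1.0])   # scan plane is z = 0
    return (T_sensor_to_reference @ T_image_to_sensor @ p_image)[:3]

def reconstruct(frames, poses, T_image_to_sensor, px_mm, voxel_mm, origin, shape):
    """Pixel-nearest-neighbor reconstruction: every position-tagged 2D pixel is
    written into the voxel it falls in; unfilled voxels would need hole filling."""
    volume = np.zeros(shape, dtype=np.float32)
    for frame, T_sensor_to_reference in zip(frames, poses):
        rows, cols = frame.shape
        for v in range(rows):
            for u in range(cols):
                xyz = pixel_to_reference(u, v, px_mm,
                                         T_image_to_sensor, T_sensor_to_reference)
                idx = np.round((xyz - origin) / voxel_mm).astype(int)
                if np.all(idx >= 0) and np.all(idx < np.array(shape)):
                    volume[tuple(idx)] = frame[v, u]
    return volume
```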

Surgical planning can then start by moving the pointer over the skin surface. Again, the position of the pointer determines which images are displayed on the monitor and the position of the cross in the images (9). The error sources associated with this process are the same as for MRI-based navigation (K, L, M, and N).

An important advantage of ultrasound-guided navigation is the ability to do repetitive 3D imaging during surgery and thus work with a data set that has recently been updated according to possible changes in the brain anatomy. The error or discrepancy between the images and the anatomy during surgery (O′, 10) will therefore be small or negligible.

The use of ultrasound does not require any patient registration, because throughout the operation all ultrasound volumes are acquired in the same coordinate system as the one in which navigation is performed. Hence, error sources associated with patient registration are excluded from the ultrasound error chain.

It should be mentioned that, in our clinic, preoperative data (MRI, CT) is routinely used for planning and throughout the surgical procedure for overview purposes. These data must therefore be registered to physical space through a patient registration procedure, as described in the previous section.

Table 2. Error Chain Associated with Neuronavigation Based on Intraoperative Ultrasound

Action item                                 Error source                                             Error magnitude (mm)
11 Probe calibration                        P   Sensor to scan plane transformation                  38,39; *,37–39
12 Mount position sensor                    Q   Sensor attachment repeatability                      *
13 Acquire 3D ultrasound data               R   Position sensor tracking                             35,49
                                            S   Synchronization between position data and images     *
                                            T   2D image position discretization                     *
14 Load images and reconstruct volume:      U   Sound speed value                                    *
   3D scan conversion                       V   Grid resolution/interpolation algorithms             *
                                            W   Finite thickness of ultrasound plane
9  Point and reconstruct images with        K   2D image extraction                                  *
   overlaid colored cross                   L   Quantization, colored cross                          *
                                            M   Interpretation                                       *
                                            N   Position system and tool tip definition              *,35
10 Surgery and brain movements              O′  Brain shift (between repetitive 3D ultrasound        *; (*)
                                                acquisitions)

In the error magnitude column, where values are grouped as <1, 1–2, or >2 mm, a star (*) indicates that the error value is based on our own experience and algorithm tests, while the numbers refer to the list of references.


Hence, the error chains associated with placing the MRI or CT volumes and the ultrasound volumes in the patient will be independent. We apply ultrasound repeatedly during surgery to obtain updated and detailed images of the area of interest in the brain. Tumor resections are often performed using 3D ultrasound alone, with preoperative MRI/CT data being used only for planning purposes, for example, for the craniotomy.40 An alternative registration approach is to match the MRI/CT data directly to the ultrasound data by volume-to-volume registration. This implies that the errors related to placing the MRI/CT volumes in the patient depend on the ultrasound error chain, in addition to the errors of the multimodal registration process. The latter option is not used in our clinic, and, to our knowledge, the procedure is still only used for research purposes elsewhere.

Scope of the Work

The overall clinical accuracy of a navigation system will be determined by the contribution from each of the individual error sources described in Tables 1 and 2. The net effect will not be the sum of all the error sources, but rather a stochastic contribution from all terms. In ultrasound-based navigation, error sources M and O′ (Table 2) are affected by the user, while the remaining sources are under the control of the system vendor. In MRI-based navigation, the user and procedure affect error sources A–G, M, and O (Table 1), while the remaining six error sources are under the control of the vendor (H–L and N).

The purpose of this study was to perform a 3D accuracy evaluation of an ultrasound-based navigation system in the laboratory setting to obtain an estimate of the potential overall clinical accuracy of ultrasound-based neuronavigation. Error sources P–W in Table 2 are included in this analysis, although Q and W cannot be quantified from our data.

MATERIALS AND METHODS

Navigation System

The SonoWand® system (MISON AS, Trondheim, Norway)1 is a neuronavigation system that differs from conventional neuronavigation systems by being integrated with a built-in high-performance digital ultrasound scanner for updating the 3D map during surgery. The system comprises a computer for image processing and navigation and an optical 3D position tracker (camera system). The single-rack system is shown in Figure 1(a). A direct link between the ultrasound scanner and the navigation computer provides rapid transfer of 3D ultrasound data. The system can function as a conventional ultrasound scanner, as a conventional neuronavigation system based on MRI or CT images, or, more importantly, as a combined system in which full use is made of the features and advantages of preoperative MRI and intraoperative 3D ultrasound. The system thus enables the surgeon to navigate directly by means of intraoperative 3D ultrasound.

The navigation software can import MRI or CT data, perform patient registration using fiducials or anatomical landmarks, and display navigation images on the monitor in the same manner as conventional neuronavigation systems. In addition, the system measures the position of the ultrasound probe and tags images from the ultrasound scanner with positions from the optical tracker. The ultrasound probe with the attached position-sensor frame is shown in Figure 1(b). The position of the ultrasound image relative to the sensor frame is determined through a probe calibration procedure. A scan conversion algorithm is performed on the 3D ultrasound data to convert it into a regular volume that is handled by the navigation software in a similar fashion to the preoperative MRI and CT volumes. The system supports various navigation features and display options.

Wire Phantom

The ultrasound volumes studied in this article were acquired by scanning a precisely built wire phantom. A schematic drawing of the phantom is shown in Figure 2. The phantom is made of aluminum, and has four infrared-reflecting spheres mounted as references for the camera positioning system. Eighteen polyester wires of diameter 0.3 mm are mounted inside the phantom, with spring loadings to keep the wires straight. The wires are parallel to either the reference frame's X- or Y-axis. They form 27 wire crosses in a cubic pattern, with vertical separation of 0.5 mm between the wire center axes at each cross. All the wire crosses lie within a volume of dimensions 5 × 5 × 5 cm.

The positions of all the wire crosses and the four reflecting spheres have been physically measured on a machining table, with an accuracy of 0.1 mm in all directions. These positions were then transformed into the reference coordinate system used by the position tracker. This is the coordinate system in which the ultrasound volumes are reconstructed.

Experimental Setup, Data Acquisition, and Volume Reconstruction

An overview of the data flow and the experimental setup is shown in Figure 3 (refer to this figure for the next three sections).


The wire phantom was immersed in water and scanned using a 5-MHz flat phased array ultrasound probe with an attached position sensor [Fig. 1(b)]. The probe was mounted in a rigid holder for practical reasons (overall stability; good repetition of scans; full coverage of all wire crosses in all scans), but the scanning was done by manually pulling or tilting the holder. As far as possible, the scanning motion was performed smoothly and at a constant speed, taking approximately 25 s to cover the total distance (≈10 cm). The ultrasound images were acquired at a constant frame rate (six frames/s), typically yielding 150 images per scan. These images are thus somewhat irregularly spaced throughout the volume, but the typical distance between consecutive images is 0.6 mm for translation scans and from 0.03 mm near the probe to ≈1.3 mm far from the probe for tilted scans. Scans were performed parallel to the phantom's X- and Y-axes and diagonally. Both translation scans and tilts were performed for each direction. A total of 180 scans were performed for the system evaluation (Table 3).

The highly accurate35 optical 3D tracking system (Polaris®, Northern Digital Inc., Ontario, Canada) monitored the positions of the phantom and the probe from various distances and elevation angles above the horizontal. Figure 4 shows the experimental setup from above and from the side, while the specific numbers are given in Table 3.

Fig. 1. (a) The single-rack ultrasound-based neuronavigation system (SonoWand®, MISON AS, Norway) with the position tracking system attached to an adjustable arm (Polaris®, Northern Digital, Canada). (b) Ultrasound probe (5 MHz FPA) with an attached position-sensor frame and reflecting spheres.

Fig. 2. The wire cross phantom. The four spheres (a–d) constitute the reference frame for the optical positioning system. The dimensions of the cube defined by the 27 wire crosses are 5 × 5 × 5 cm.


The five camera positions were chosen as: (1) the optimal distance (1.8 m) specified by the vendor, combined with an elevation angle (45°) that ensured good visibility of both the phantom's reference frame and the probe's tracking frame; and (2)–(5) reasonable variations in distance and elevation around position 1, based on vendor specifications and typical clinical use. The distances between the position cameras and the reference frame were measured to within ±10 cm accuracy, and the elevation to within ±2°. The phantom and probe were oriented such that all scans were made directly towards or away from the cameras.

High image quality is assured in the SonoWand system by using the raw digital data (not video-grabbing) from the built-in ultrasound scanner. Scanner settings such as frequency, depth, sector width, and frames per second were tuned to achieve a satisfactory view of the wires in the water bath on the scanner monitor. We scan-converted all volumes with voxels of 0.65 × 0.65 × 0.65 mm. This resulted in volumes having sizes from 6 to 24 megabytes, depending mainly on the acquisition time and scan distance. Sample ultrasound images from a reconstructed volume of a tilt scan are shown in Figure 5.

Important parameters in the reconstruction process are the probe calibration, which determines the position of the image relative to the position sensor attached to the probe; the synchronization of position and image data; the speed of sound in water used in the scan conversion; and the desired output resolution measured in mm/voxel. We measured the temperature of the water bath containing the phantom to determine the sound speed, following the tabulated data of Duck41 to set a proper value. Both the probe calibration method and the synchronization procedure used in the SonoWand system are proprietary to the vendor, and the details are unknown to us. Nevertheless, the effects of these parameters on the system accuracy can still be evaluated from our experiments (see the Discussion section).

Automatic Detection of Wire Crosses

The wire crosses were detected in the ultrasound volumes by an automatic procedure.42 A small data cube was extracted around the expected position of each cross, accessible from the physical measurements. The cube was then correlated with a template cube of the same size, containing a model description of the wire cross.

Fig. 3. Measurement setup and data flow. The position sensor tracks the wire phantom and the ultrasound probe during image acquisition. The automatic algorithm identifies the positions of the wire crosses (AIP) in the reconstructed volume. We compare AIP to the physically measured positions of the wire crosses (MP) to assess the accuracy of the whole 3D ultrasound-based navigation system.

Table 3. Overview of Camera Positions and Acquisition Conditions Used for the 180 Scanned Volumes

Camera position        1               2               3               4               5
Distance/Elevation     1.87 m/45.5°    1.48 m/44.7°    1.82 m/35.3°    1.79 m/61.9°    2.08 m/44.9°

Acquisition
Translation, −X        1, 2, 3         13, 14, 15      25, 26, 27      37, 38, 39      49, 50, 51
Translation, X         4, 5, 6         16, 17, 18      28, 29, 30      40, 41, 42      52, 53, 54
Tilt, −X               7, 8, 9         19, 20, 21      31, 32, 33      43, 44, 45      55, 56, 57
Tilt, X                10, 11, 12      22, 23, 24      34, 35, 36      46, 47, 48      58, 59, 60
Translation, −diag     61, 62, 63      73, 74, 75      85, 86, 87      97, 98, 99      109, 110, 111
Translation, diag      64, 65, 66      76, 77, 78      88, 89, 90      100, 101, 102   112, 113, 114
Tilt, −diag            67, 68, 69      79, 80, 81      91, 92, 93      103, 104, 105   115, 116, 117
Tilt, diag             70, 71, 72      82, 83, 84      94, 95, 96      106, 107, 108   118, 119, 120
Translation, −Y        121, 122, 123   133, 134, 135   145, 146, 147   157, 158, 159   169, 170, 171
Translation, Y         124, 125, 126   136, 137, 138   148, 149, 150   160, 161, 162   172, 173, 174
Tilt, −Y               127, 128, 129   139, 140, 141   151, 152, 153   163, 164, 165   175, 176, 177
Tilt, Y                130, 131, 132   142, 143, 144   154, 155, 156   166, 167, 168   178, 179, 180

Three repeated scans were made for each camera position and acquisition method. The volumes were scanned during a 10-h session, and are numbered 1–180 in chronologic order.


This model description is defined through a number of parameters, and we used the optimal parameter setting found earlier.42 The location of the correlation maximum was used to find the actual position of the wire cross in the ultrasound volume.
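The detection step can be pictured as a normalized cross-correlation between a small subvolume around the expected cross position and a synthetic template of two orthogonal wires. The sketch below, in Python/NumPy, is only an illustration of that principle under assumed parameter values (cube size, wire thickness, search range); it does not reproduce the published procedure42 or its optimal parameters.

```python
import numpy as np

def make_cross_template(size=15, sigma=1.0):
    """Synthetic wire-cross model: two orthogonal Gaussian 'wires' crossing
    at the cube center (an assumed model, not the published parameterization)."""
    ax = np.arange(size) - size // 2
    x, y, z = np.meshgrid(ax, ax, ax, indexing="ij")
    wire_x = np.exp(-(y**2 + z**2) / (2 * sigma**2))   # wire along the X-axis
    wire_y = np.exp(-(x**2 + z**2) / (2 * sigma**2))   # wire along the Y-axis
    return wire_x + wire_y

def detect_cross(volume, expected_idx, template, search=5):
    """Return the voxel index maximizing the normalized correlation between the
    template and the volume, searching +/- `search` voxels around the expected
    (physically measured) position. Boundary checks are omitted for brevity."""
    t = (template - template.mean()) / template.std()
    half = template.shape[0] // 2
    best, best_idx = -np.inf, expected_idx
    for di in range(-search, search + 1):
        for dj in range(-search, search + 1):
            for dk in range(-search, search + 1):
                i, j, k = np.add(expected_idx, (di, dj, dk))
                cube = volume[i - half:i + half + 1,
                              j - half:j + half + 1,
                              k - half:k + half + 1]
                c = (cube - cube.mean()) / (cube.std() + 1e-9)
                score = np.mean(c * t)
                if score > best:
                    best, best_idx = score, (i, j, k)
    return best_idx
```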

Statistical Analysis

The physical position measurements of the wire crosses performed on the machining table were considered true values. The positions found from the 3D ultrasound scans were then compared to the true values through a statistical analysis. We denote the image points AIP_{p,v} (Automatic Image Points, p = point 1 . . . 27, v = volume 1 . . . 180) and the true values MP_p (Measured Points, p = point 1 . . . 27). These points are all described in the same coordinate system, i.e., the reference system used by the position tracker.

Subtracting one data set from the other, we get 27 residual vectors in 3D space for each of the 180 volumes, i.e., a total of 4,860 3D error vectors:

$$D_{p,v} = AIP_{p,v} - MP_p \qquad (1)$$

The Euclidean lengths of these vectors are:

$$d_{p,v} = \left| D_{p,v} \right| = \sqrt{D_{p,v}(X)^2 + D_{p,v}(Y)^2 + D_{p,v}(Z)^2} \qquad (2)$$

From the error vectors, or any subset of the vectors, we may calculate the vector mean and standard deviation:

$$\bar{D} = \frac{1}{N_p N_v} \sum_{p}^{N_p} \sum_{v}^{N_v} D_{p,v} \qquad (3)$$

$$\sigma_D = \sqrt{\frac{1}{N_p N_v - 1} \sum_{p}^{N_p} \sum_{v}^{N_v} \left( D_{p,v} - \bar{D} \right)^2} \qquad (4)$$

where N_p and N_v are the number of wire crosses and volumes in the data set, respectively. Similarly, we find the mean and standard deviation of the vector lengths as:

$$\bar{d} = \frac{1}{N_p N_v} \sum_{p}^{N_p} \sum_{v}^{N_v} d_{p,v} \qquad (5)$$

$$\sigma_d = \sqrt{\frac{1}{N_p N_v - 1} \sum_{p}^{N_p} \sum_{v}^{N_v} \left( d_{p,v} - \bar{d} \right)^2} \qquad (6)$$

The end points of the vectors D_{p,v} describe a cloud of points in three dimensions. If the cloud is not centered at the origin, there is a bias in the error estimate, given by D̄. The measurements may be considered accurate if the bias D̄ and the spread σ_D are both small. The parameter d̄ is a single number describing the overall accuracy of the system, i.e., the value represents the mean distance between a point in physical space and the corresponding point in ultrasound image space. We also present d_α percentiles, where α% (α = 50 or 95) of all the observed values are below d_α.
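Equations (1)–(6) and the percentiles translate almost directly into a few lines of array code. The sketch below assumes the detected and measured points are already available as NumPy arrays (the variable names are ours, not the authors'), and treats the spread of Equation (4) component-wise, as in the Results.

```python
import numpy as np

# Assumed inputs (our variable names):
#   aip: detected wire-cross positions, shape (180, 27, 3), in mm  (AIP_{p,v})
#   mp:  physically measured positions, shape (27, 3), in mm       (MP_p)
def accuracy_statistics(aip, mp):
    D = aip - mp[np.newaxis, :, :]                   # Eq. (1): 180 x 27 error vectors
    d = np.linalg.norm(D, axis=2)                    # Eq. (2): Euclidean lengths

    D_mean = D.reshape(-1, 3).mean(axis=0)           # Eq. (3): bias vector
    sigma_D = D.reshape(-1, 3).std(axis=0, ddof=1)   # Eq. (4): per-component spread

    d_mean = d.mean()                                # Eq. (5): overall accuracy d-bar
    sigma_d = d.std(ddof=1)                          # Eq. (6): spread of the lengths
    d50, d95 = np.percentile(d, [50, 95])            # d_alpha percentiles
    return D_mean, sigma_D, d_mean, sigma_d, d50, d95
```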


Fig. 4. Experimental setup during scanning of the phantom. (a) View from above. The figure indicates three different camera positions, i.e., for scans along the X- and Y-axes, and along the diagonal of the phantom. (b) View from the side. The elevation angle from the horizontal plane and the distance l are given in Table 3.


To examine whether varying factors (e.g., acquisition conditions like camera position) had significant influence upon the accuracy, we performed an analysis of variance (ANOVA) on the data. A thorough description of this technique can be found in statistical textbooks.43 The basic principle is that the data material is grouped into several subsets, one for each level of the factor being considered. The variance between the groups is then compared to the variance within each group, according to well-defined algorithms. The comparison is quantified by one calculated parameter, the so-called p value, which will be discussed below.

In the present context, we shall restrict ourselves to only one response variable, namely accuracy in terms of d. This response variable is modeled as a sum of contributions from the various data subsets, plus a residual error term. The application of the standard ANOVA technique assumes that the residual error obeys a normal distribution, and this assumption holds well for our data set, as can be seen from a histogram plot of d. We shall apply a multiway ANOVA analysis, which allows us to identify interaction between several factors, in addition to the main effects of each single factor. The analysis will be applied throughout to balanced data sets and subsets, i.e., sets containing the same number of elements.

Our initial assumption (null hypothesis) is that the factors do not affect the accuracy. In other words, the between-groups variation should not differ from the within-groups variation. Whether this is so is determined by the p value, which can be interpreted as "the probability of obtaining the given data material provided that the null hypothesis is true." Thus, a p value close to zero indicates that we can conclude, at significance level p, that the null hypothesis was wrong and must be rejected, and that the investigated factor or interaction has significant influence on the accuracy.
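A multiway ANOVA of this kind is available in standard statistics packages. The sketch below shows how such a four-factor analysis could be expressed with statsmodels in Python, assuming a data frame with one row per error observation and the invented column names d, cam, orient, trati, and scandir; it is not the authors' original analysis.

```python
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

# df is assumed to hold one row per error observation, e.g.:
#   d (mm), cam (camera position), orient (X/diag/Y),
#   trati (translation/tilt), scandir (towards/away)
df = pd.read_csv("error_vectors.csv")        # hypothetical file name

# Full factorial model: main effects plus all interactions of the four factors.
model = ols("d ~ C(cam) * C(orient) * C(trati) * C(scandir)", data=df).fit()
table = anova_lm(model, typ=2)               # p-values analogous to Table 4
print(table[["F", "PR(>F)"]])
```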

RESULTS AND INTERPRETATION

Results for All 180 Volumes

The 4,860 individual error vectors D_{p,v} are plotted as projections onto the XY-, XZ-, and YZ-planes in Figure 6. This plot clearly shows the bias and spread of the data set. The bias D̄ (Eq. 3) (offset of the center of gravity) is (0.21 mm, 0.90 mm, 0.27 mm), while the standard deviation of each component is (0.88 mm, 0.50 mm, 0.43 mm).

The X, Y, and Z components, and the length d, of the individual error vectors D are calculated and then averaged over each volume (27 individual vectors). The results are shown in Figure 7, where the volume numbers 1 through 180 follow the chronological order listed in Table 3. Rulers are included to show the acquisition conditions of each volume.

As an overall result, d averaged over all volumes yields d̄ = 1.40 mm, with a standard deviation of σ_d = 0.45 mm. In both Figures 6 and 7, we notice biases and systematic variations that suggest that the error vectors are not purely stochastic according to a normal distribution. On the contrary, the figures indicate significant systematic errors that are strongly correlated to the acquisition conditions. An overall impression of the results can be gained from the following observations on Figure 7:

1. The results are fairly consistent within each group of three repeated volumes.

2. The results typically cluster into groups corresponding to the series of translation or tilt scans.

Fig. 5. Sample ultrasound images (three orthogonal views) from a tilted (−X direction) 3D scan of the phantom.

3. Scan orientation (X/diagonal/Y scans) and camera position show some correlation with the results; however, these trends are not unambiguous.

4. The D(X) and D(Y) components are quite irregular; some jumps and trends seem to correlate with the rulers, while other trends do not. The D(Z) component is more clearly related to the translation/tilt ruler division. At the same time, an overall drift term seems to be superimposed on this component.

5. d varies in a rather complex manner with camera position and acquisition method, due to its dependence upon D(X), D(Y), and D(Z).

Figure 8 shows the histogram of the 4,860 individual lengths, and the accumulated histogram. The 50 and 95% percentile values are 1.35 and 2.12 mm, respectively. The maximum d in the whole data set was 2.99 mm.

As a check of the system repeatability, we group all three volumes scanned under the same acquisition conditions. The average d for each volume is plotted in Figure 9. Within each group of three volumes, we use the max(d) − min(d) difference as a measure of the repeatability. The average of this difference over all 60 groups is 0.10 mm. Being an order of magnitude smaller than the variation between the groups, this number indicates that the overall repeatability of the system is good.
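In array form, this repeatability check is a per-group range of the 60 volume-averaged errors. A minimal sketch, assuming a per-volume mean error array d_vol ordered as in Table 3 (our variable name):

```python
import numpy as np

# d_vol: mean error vector length per volume, shape (180,), in mm, ordered
# chronologically so that consecutive triplets share acquisition conditions.
def repeatability(d_vol):
    groups = d_vol.reshape(60, 3)                     # 60 groups of 3 repeated scans
    ranges = groups.max(axis=1) - groups.min(axis=1)  # max(d) - min(d) per group
    return ranges.mean()                              # reported as 0.10 mm in the text
```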

Although the features observed in Figures 6 through 9 are, in general, correlated to the acquisition scheme, the relations cannot be determined directly from this overview. We shall therefore investigate the effects of single factors and their interactions on the accuracy by performing an ANOVA on the appropriate subsets (see the next section). Furthermore, the extensive data set enables us to search for the underlying error sources that cause the systematic variations. This will be covered in the Discussion.

Results for Data Subsets

The extensive data set allows us to investigate how various camera positions and acquisition conditions affect the accuracy by analyzing appropriate subsets of the data (ANOVA). We first consider interaction results of the multiway ANOVA applied to the following factors: camera distance/camera elevation; scan orientation (X, diagonal, or Y scans); translation versus tilt scanning; and scanning towards versus away from the camera. As our experimental setup (Table 3) does not cover all possible combinations of camera distance and elevation, we analyze the effect of distance versus acquisition parameters (data set A; camera positions 1, 2, and 5) separately from the effect of elevation versus acquisition parameters (data set B; camera positions 1, 3, and 4). Each of these analyses thus involves 3/5 (2,916 points) of the total data set. We will classify the results against the 5% significance level (the effect is significant when p < 0.050; not significant when p > 0.050).

The ANOVA results are listed in Table 4. The results for data sets A and B are fairly similar. In both cases, we find a significant three-factor interaction between camera distance (elevation), scan orientation, and translation/tilt.

Fig. 6. Projections of individual error vectors D (27 × 180 = 4,860 vectors) onto the XY-, XZ-, and YZ-planes.


All two-factor combinations of the same three factors, and the single factors themselves, also have a significant effect on the accuracy. The fourth factor, scan direction, shows no effect in data set A, neither alone nor in combination with other factors. In data set B, however, scan direction gives a significant effect in combination with translation/tilt and in the four-factor interaction, but not otherwise.

Fig. 7. X, Y, and Z components, and length d, of the average error vector D̄ for each of the 180 volumes. The average d over all volumes (d̄ = 1.40 mm) is shown by the horizontal line in the lowest plot. The volumes are numbered and presented in the order they were measured. The four rulers at the top indicate the acquisition conditions for each volume, in accordance with Table 3. The smallest interval (subdivision of the "Scan dir." ruler) represents the repetition of three volumes under identical scanning conditions.

Fig. 8. (a) Histogram of all 27 × 180 = 4,860 error vector lengths d. (b) Cumulative histogram with the 50 and 95% percentiles indicated.


This may be related to the way the volume is actually scanned during translation or tilt (see the comments on Table 7 for details), or possibly to the time-tagging effect (see the Discussion section), but the latter will be shown to be rather small.

Because the accuracy results depend strongly upon the combination of camera distance/elevation, scan orientation, and translation/tilt, we present interaction plots in Tables 5 (data set A) and 6 (data set B). In both cases, the following trends seem to dominate (although exceptions do occur):

Fig. 9. System repeatability, characterized by the error vector lengths, for all 180 volumes. The three volumes taken under the same acquisition conditions are plotted at the same abscissa value (star symbol). From left to right, the volume groups are displayed in chronologic order, i.e., volumes 1–3, 4–6, 7–9, etc. (cf. Table 3). For each volume group, max(d) − min(d) was calculated, and this difference is shown by the solid line. The mean difference is 0.10 mm.

Table 4. Results of Multiway ANOVA for Data Set A (Variation of Camera Distance versus Acquisition Parameters) and Data Set B (Variation of Camera Elevation versus Acquisition Parameters)

Factor/interaction                    A: Cam = Camera Distance    B: Cam = Camera Elevation
ScanOrient (scan orientation)         <.001                       <.001
Cam                                   <.001                       <.001
TraTi (translation/tilt)              <.001                       <.001
ScanDir (scan direction)              .154                        .885
ScanOrient*Cam                        <.001                       <.001
ScanOrient*TraTi                      <.001                       <.001
ScanOrient*ScanDir                    .713                        .426
Cam*TraTi                             <.001                       <.001
Cam*ScanDir                           .941                        .069
TraTi*ScanDir                         .396                        .021
ScanOrient*Cam*TraTi                  <.001                       <.001
ScanOrient*Cam*ScanDir                .247                        .546
ScanOrient*TraTi*ScanDir              .528                        .668
Cam*TraTi*ScanDir                     .625                        .247
ScanOrient*Cam*TraTi*ScanDir          .077                        <.001

The table lists p-values for the single factors and interactions.


1. X scans give better accuracy than Y scans, with diagonal scans somewhere between.

2. Increased camera distance or elevation improves the accuracy.

3. Translational scans give better accuracy than tilt scans.

To facilitate the comprehension of the main (single-factor) effects, Table 7 presents the effects of varying camera distance, camera elevation, scan orientation (X, diagonal, or Y), translation versus tilt scan, scan direction (towards or away from the camera), and the distance from the probe. Due to the balanced experiment design, d̄ values for selected factors in Table 7 can be found simply by averaging over the superfluous dimensions of Tables 5 and 6. The additional information presented in Table 7 is the spread (σ_d) and range (d95), as well as factors not included in Tables 5 and 6. The results confirm that all these factors except scanning direction have a significant effect on the accuracy, as seen from the low p values.

When comparing the camera positions, we assume approximate equality between the distances for positions 1, 3, and 4 (1.8 m), and between the elevations for positions 1, 2, and 5 (45°) (cf. Table 3). Of the five investigated camera positions, the combination of 1.8 m distance and 62° elevation gives the best accuracy (lowest d̄). Tables 7a and b confirm the previous result that increased distance and increased elevation may improve the accuracy, within the limits of the position tracker. The effect of the camera position is discussed further in the section below entitled Implications for Clinical Use of the System.

As already revealed by Tables 5 and 6, the scan orientation has a significant influence upon the accuracy, with X scans yielding the best results. This is confirmed by Table 7c.

Table 5. The Significant (p < .050) Three-Factor Interactions for Data Set A: Camera Distance, Scan Orientation, and Translation/Tilt

                      Camera position (distance, m)
Scan orientation      2 (1.48)      1 (1.87)      5 (2.08)

a) Translation scans.
X                     1.24          1.09          0.59
Diagonal              1.57          1.34          1.35
Y                     1.68          1.67          1.30

b) Tilt scans.
X                     1.48          1.38          0.82
Diagonal              1.35          1.45          1.59
Y                     2.01          1.95          1.80

The table lists d̄ for each specific combination of the three factors.


Tables 7d through f confirm that the accuracy is better for translation scans than for tilt scans. This is to be expected, because all wire crosses in our phantom are imaged rather close (within ≈9 cm) to the probe in translation scans, thus reducing the effect of the lateral image resolution degrading with distance from the probe. The distance between the probe and the farthest wire crosses may have been in the order of 12 cm during tilt scans. In addition, the tilt movement implies low resolution (spatial sampling) normal to the image plane at large distances from the probe compared to the translation movement. These arguments are supported by Table 7g, which shows the actual degradation of accuracy with distance from the probe.

When comparing translation and tilt results, it should also be kept in mind that tilting is the dominating 3D acquisition method in the clinical situation.

The effect of scanning direction is explored in Tables 7e and f. Here, the translation and tilt scans are separated because they differ with respect to the interpretation of "direction." During a translation scan, the probe and the ultrasound plane both move either towards or away from the cameras. However, the rotational movement during a tilt scan causes the ultrasound plane to approach the cameras when the probe moves away, and vice versa. The scan directions stated in Tables 7e and f refer to the motion of the probe. The results show that the scanning direction itself has a negligible effect. See, however, the section of the Discussion entitled Time Tagging for further details.

As already mentioned, Table 7g confirms that the accuracy degrades with distance from the probe, due to reduced resolution as a function of depth, particularly for the type of sector-scanning probe used here.

Table 6. The Significant (p < .050) Three-Factor Interactions for Data Set B: Camera Elevation, Scan Orientation, and Translation/Tilt

                      Camera position (elevation, deg.)
Scan orientation      3 (35.3)      1 (45.5)      4 (61.9)

a) Translation scans.
X                     1.20          1.09          0.99
Diagonal              1.62          1.34          1.10
Y                     1.77          1.67          1.27

b) Tilt scans.
X                     1.42          1.38          0.73
Diagonal              1.13          1.45          1.41
Y                     1.89          1.95          1.68

The table lists d̄ for each specific combination of the three factors.


Table 7. Effect on Accuracy of Varying Acquisition Conditions (ANOVA Results for Single Factors)

a) Effect of varying camera distance. Camera elevation angle was constant at 45°.
Camera distance      n      d̄      σ_d     d95
1.5 m                972    1.55    0.37    2.21
1.8 m                972    1.48    0.39    2.11
2.1 m                972    1.24    0.52    2.07

b) Effect of varying camera elevation. Camera distance was constant at 1.8 m.
Camera elevation     n      d̄      σ_d     d95
35°                  972    1.50    0.41    2.21
45°                  972    1.48    0.39    2.11
62°                  972    1.20    0.42    1.87

c) Effect of scan orientation.
Orientation of scan  n      d̄      σ_d     d95
X                    1620   1.10    0.40    1.74
Diagonal             1620   1.39    0.34    2.01
Y                    1620   1.70    0.38    2.29

d) Effect of translation versus tilt scan.
Acquisition type     n      d̄      σ_d     d95
Translation          2430   1.32    0.42    2.00
Tilt                 2430   1.47    0.46    2.17


DISCUSSION

Error Sources

From the total of 4,860 points, we have obtained an overall laboratory accuracy of 1.40 mm. However, it is obvious that only a minor fraction of this number should be attributed to random noise. First, the correlation between accuracy and acquisition conditions is revealed graphically in Figures 6 and 7, and by the ANOVA analysis (Tables 4 through 7). Second, the repeatability test (Fig. 9) shows that the random contribution is typically one order of magnitude less than the overall variations.

We therefore assume that the results are greatly influenced by systematic error sources, and will investigate this in detail. We consider the potential error sources listed in Table 8 to be the most important in this context (cf. also Table 2). The table also shows how each error source would individually affect the results under varying acquisition conditions.

Table 7. Effect on Accuracy of Varying Acquisition Conditions (ANOVA Results for Single Factors) (cont'd.)

e) Effect of scanning towards versus away from the camera (translation scans).
Scan direction       n      D̄                   d̄      σ_d     d95
Transl. towards      1215   0.36, 0.88, 0.00    1.32    0.43    2.01
Transl. away         1215   0.42, 0.82, 0.01    1.32    0.42    1.99

f) Effect of scanning towards versus away from the camera (tilt scans).
Scan direction       n      D̄                   d̄      σ_d     d95
Tilt towards         1215   0.01, 0.95, 0.53    1.48    0.44    2.13
Tilt away            1215   0.05, 0.96, 0.55    1.47    0.47    2.22

g) Effect of depth inside the phantom.
Wire layer (vertical distance from probe)    n      d̄      σ_d     d95
Top (4.0 cm)                                 1620   1.35    0.42    2.05
Middle (6.5 cm)                              1620   1.37    0.41    1.99
Bottom (9.0 cm)                              1620   1.47    0.50    2.28

The plots show d̄ within each group, and the 95% confidence interval on d̄, which is equal to ±1.96 × σ_d/√n for a normal distribution. n is the number of points in each group, and p is a measure of the effect of each factor (low p means significant effect). All D and d values are given in millimeters. All acquired data are used in Tables c through g, while data subsets A and B (each covering three out of five camera positions) are used in Tables a and b, respectively.
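As a quick check of the confidence-interval formula quoted in the table footnote, the interval half-width for one row (here the 2.1 m camera distance of Table 7a) is only a few hundredths of a millimeter; the short calculation below simply plugs in the numbers from that row.

```python
import math

# Row "2.1 m" of Table 7a: n = 972 points, sigma_d = 0.52 mm.
n, sigma_d = 972, 0.52
half_width = 1.96 * sigma_d / math.sqrt(n)          # 95% CI half-width on d-bar
print("95%% CI half-width: %.3f mm" % half_width)   # ~0.033 mm
```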


Thus, we may identify the error sources of greatest influence by investigating appropriate data subsets for these effects.

Bias in Automatic Method

The automatic method for locating the wire crosses in the images has been thoroughly tested, and was found to be as accurate as any human operator.42 In the present study, the resolution of the reconstructed volume is the same (0.65 mm/voxel), and we apply the "optimal" algorithm parameters from the previous work. The automatic method is therefore expected to be unbiased, and to have an accuracy of 0.25 voxels (0.16 mm).42

However, because we now have an entirely new data set, acquired by a different ultrasound scanner, we confirm the applicability of the automatic method and its parameter settings by running the following test. A total of 12 reconstructed volumes were selected, one for each acquisition condition (randomized over repetition number and camera position; see Table 3). We then designed a computer program that looped through each volume and presented each of the 27 wire crosses on the screen in three orthogonal projections, together with the automatic method's results as shown by the hair crosses. Three skilled operators judged these images independently, classifying each wire cross according to whether he/she agreed or disagreed with the automatic results. The criterion for agreement was that the hair cross should be within one voxel of what the operator considered to be the "correct" wire cross location.

The combined results of the three independent operators (a total of 12 × 27 × 3 = 972 cases) indicated that 97.8% of the crosses (951 cases) were in agreement with the operators' opinion, whereas 2.2% (21 cases) were not. These findings confirm our statement that the automatic method is as precise as any human operator, and hence has no significant bias.

Time Tagging

By time tagging, we mean synchronization of the position data and the image before reconstructing the volume. Position data and ultrasound data will not be recorded exactly simultaneously, so a time lag (positive or negative) will occur between these data sets. The time lag is constant for a particular scanning setup. In a dynamic scanning session, the time lag converts into a spatial offset between the "true" image position and the position recorded by the position sensor. The offset is ideally proportional to probe velocity. Hence, reversing the scanning direction should give a sign change in the offset component along the scanning direction. We will therefore investigate our data for such direction-dependent errors.

We compare only translation scans along the X- or Y-axis, because, for these cases, time-tagging errors give offsets in only one component. Table 9 summarizes the results. As expected for the scans in the X direction, only the X component of D is affected by the change of direction. D(X) is directed away from the camera, and has the largest magnitude when scanning towards the camera. The same effect, measured relative to the camera position, is seen for the scans in the Y direction. This indicates that incorrect time tagging may account for an offset of half the difference, i.e., approximately 0.06 mm at the specified scanning speed.
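The conversion between a residual time lag and a spatial offset follows directly from the scanning speed reported earlier (roughly 10 cm covered in about 25 s). The few lines below simply invert that relation for the 0.06 mm offset estimated above; they are an illustrative back-of-the-envelope calculation, not part of the original analysis.

```python
# Probe translation speed during a scan: ~100 mm over ~25 s.
probe_speed = 100.0 / 25.0          # mm/s  (= 4 mm/s)

offset = 0.06                       # mm, direction-dependent offset estimated above
time_lag = offset / probe_speed     # s
print("implied position/image time lag: %.0f ms" % (time_lag * 1e3))   # ~15 ms
```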

The results thus indicate that time-tagging errors may have contributed to the overall accuracy, but only as a minute error source. The contribution itself is small and relatively insignificant compared to the standard deviations listed in Table 9. Furthermore, the effect is not seen as a sign change around zero, but merely as a fluctuation around a considerable offset value, which must then be attributed to other error sources.

Speed of Sound

Using the correct sound speed is crucial for obtaining the correct geometry of the reconstructed volume. The water tank used in the study was filled more than 3 days before the measurements were performed; hence, the water could be assumed to be degassed. During the data acquisition (10 h), we monitored the air temperature close to the tank using a standard thermometer with a resolution of 0.5°C. The temperature was kept in the range 25.5–27.0°C throughout, corresponding to theoretical values for the sound speed in water of 1,498.0–1,501.9 m/s.41

Table 8. Possible Error Sources and Their Expected Effect under Various Scan Configurations

Error source                        Action: Rotate probe (and scan)     Action: Scan in
                                    90° about probe's axis              opposite direction
Bias in automatic method            No effect                           No effect
Time tagging                        Effect rotates 90°                  Sign change
Sound speed                         No effect on axial beam             No effect
Other geometric distortion          Depends on distortion type          No effect
  (scan conversion effect)
Probe calibration                   Effect rotates 90°                  No effect


Note that this variation in sound speed is much smaller than the variations one may expect in a clinical situation due to variation in tissue types (see below).

For volume reconstruction, we used a constant sound speed of 1,499.3 m/s. The maximum deviation from this value was thus 1.6 m/s, giving a percentage uncertainty of maximum 0.11% (1.6/1,499.3). This implies a stretching or compression of the volume in the beam direction by 0.16 mm at a distance of 15 cm (maximum sector depth). However, these numbers are extreme values. Considering typical temperature fluctuations and averaging over the whole imaged volume, we expect the sound speed variation to account for far less than 0.1 mm.
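The scaling argument above can be written out explicitly: a relative sound-speed error translates one-to-one into a relative range error along the beam. The short calculation below reproduces the 0.11% and 0.16 mm figures; the numbers are taken from the text, and the snippet is only a worked restatement.

```python
c_assumed = 1499.3      # m/s, sound speed used in the scan conversion
c_max_dev = 1.6         # m/s, largest deviation implied by the water temperature

relative_error = c_max_dev / c_assumed       # ~0.0011, i.e. ~0.11%
depth = 150.0                                # mm, maximum sector depth (15 cm)
range_error = relative_error * depth         # ~0.16 mm stretching/compression
print("relative error: %.2f%%, range error at 15 cm: %.2f mm"
      % (100 * relative_error, range_error))
```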

We also note that an incorrect value for the sound speed should mainly affect the Z component of the error vector D, for the given probe orientation and acquisition protocol. However, Figures 6 and 7 show that the largest discrepancy typically occurred in the Y component. This confirms that variation in the sound speed was not a significant source of error in our experiment.

It is appropriate here to comment on the drift term observed in the D(Z) component in Figure 7. Considering translation and tilt scans separately, the increase in D(Z) over 10 h is approximately 0.6 mm. We are unable to give a satisfactory explanation for the drift. From the above discussion, it is unlikely that this is a sound speed effect caused by temperature variation, because such a variation would only account for errors in the order of 0.1 mm. We have also considered the effect of microbubbles in the water, assuming that the water was not completely degassed at the start of the measurement session. Due to repeated exposure to the ultrasound, such bubbles might be gradually removed from the water, thus altering the bulk sound speed. However, our analysis shows that decreasing the amount of free gas would cause D(Z) to decrease with time, i.e., the opposite of the observed effect.

Other Geometric Distortion Effects

By visual inspection, we observe no obvious geometric distortions in the reconstructed volumes. Unfortunately, we do not have access to the details of the 3D scan conversion algorithms. However, a simple test for geometric consistency is to check the dimensions of the reconstructed volume against the true dimensions of the phantom.

The imaged phantom contains 27 wire crosses in a 3 × 3 × 3 cubic pattern, where the distances between the wires are known to an accuracy of ±0.1 mm.42 For the two outer layers of crosses, the nominal distance is found by averaging over the nine coordinate differences. We calculate the same distance for each processed volume (27 automatically located wire cross positions), and further average the results over all 180 volumes. The differences between these distances are thus a measure of the stretching/compression of the reconstructed volume. The results are listed in Table 10. Compared to the nominal dimensions, the reconstructed volumes are, on average, expanded along the X- and Z-axes and compressed along the Y-axis, the X discrepancy being the largest. Considering each distance as the difference between two stochastically independent numbers, it is appropriate to divide by √2 to find the bias of a single layer (single point). This gives an average bias of ΔX = 0.21 mm, ΔY = 0.05 mm, and ΔZ = 0.05 mm for the reconstructed volume compared to the phantom's physical dimensions. The average bias vector has a length of 0.22 mm. Considering only the mean value, we know from the discussion above that the variation in sound speed does not account for a discrepancy of this magnitude. In any case, the sound speed would primarily affect the Z dimension of the volume, while the observed effect is largest in the horizontal plane. The spread around the mean values is also relatively high (note also that the manufacturing uncertainty in wire cross locations is excluded). An extensive study would thus be needed to determine whether the observed discrepancy is significant and, if so, whether it might be due to (for example) the 3D scan conversion algorithm. We have not pursued this topic further, because, after all, the observed discrepancy in geometric dimensions may only account for a minor fraction of the total error of 1.4 mm.

Table 10. Comparison of Nominal Distances between Wire Cross Planes, and Distances Found in Reconstructed Ultrasound Volumes

Wire cross planes | Nominal distance (mm) in phantom (n = 9) | Distance (mm) in ultrasound volumes (n = 1620)
Xfront–Xback      | 50.04 ± 0.10                             | 50.34 ± 0.51
Yfront–Yback      | 50.05 ± 0.08                             | 49.98 ± 0.39
Ztop–Zbottom      | 50.03 ± 0.08                             | 50.10 ± 0.25

Means and standard deviations are based on the n coordinate differences included in each analysis; the manufacturing uncertainty in physical wire locations is not included.
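The per-axis bias and the 0.22-mm bias vector quoted above follow directly from the mean distances in Table 10; a minimal Python sketch of that arithmetic is:

    import math

    # Nominal and reconstructed mean distances between the outer wire-cross planes (mm), from Table 10.
    nominal = {"X": 50.04, "Y": 50.05, "Z": 50.03}
    measured = {"X": 50.34, "Y": 49.98, "Z": 50.10}

    # Each distance is the difference of two independently located layers, so the
    # per-layer (single-point) bias is the discrepancy divided by sqrt(2).
    bias = {axis: (measured[axis] - nominal[axis]) / math.sqrt(2) for axis in nominal}
    bias_length = math.sqrt(sum(b * b for b in bias.values()))

    print({axis: round(b, 2) for axis, b in bias.items()})  # {'X': 0.21, 'Y': -0.05, 'Z': 0.05}
    print(round(bias_length, 2))                            # 0.22 (mm)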



Probe Calibration

Incorrect probe calibration implies that an image point will be displaced from its "true" position. This displacement is constant in the ultrasound plane's coordinate system. Thus, if the probe is shifted/rotated, the same shift/rotation occurs to the displacement.

We exploit this feature by comparing results obtained from all translation scans, sorted according to scanning direction (X, Y, or diagonally). We need to compare the error vectors D in the ultrasound plane's coordinate system, which is Xus (normal to image), Yus (lateral image coordinate), Zus (radial or depth image coordinate), as shown in Figure 10. The error vectors must therefore be transformed from the phantom's coordinate system X, Y, Z. One step in this transform involves the transform between the probe system Xpr, Ypr, Zpr and the ultrasound system Xus, Yus, Zus. This is, by definition, the unknown probe calibration. However, for the present purpose, a sufficient approximation is to assume no rotation between these systems, such that Xus ≈ Xpr, Yus ≈ Ypr, Zus ≈ Zpr. With this approximation, the relations between the phantom's system and the ultrasound plane's system are:

(1) Scans in the X direction [Fig. 10(a)]:

    Xus = X,   Yus = Y,   Zus = Z                     (7)

(2) Scans in the diagonal direction [Fig. 10(b)]:

    Xus = (X − Y)/√2,   Yus = (X + Y)/√2,   Zus = Z   (8)

(3) Scans in the Y direction [Fig. 10(c)]:

    Xus = Y,   Yus = X,   Zus = Z                     (9)

Fig. 10. Coordinate systems for the phantom (X, Y, Z) and ultrasound plane (Xus, Yus, Zus) viewed from above. The dashed rectangle indicates the probe orientation during the scan. (a) Scans in the X direction; (b) scans in the diagonal direction; (c) scans in the Y direction.

The results of these transformations are listed in Table 11. In the ultrasound plane's coordinate system, the average D vectors become (0.27, 0.63, 0.24) for the X scans, (0.80, 1.01, 0.05) for the diagonal scans, and (0.63, 1.29, 0.22) for the Y scans. Despite some discrepancies (note also the relatively large standard deviations in Table 11), we observe a certain consistency between these offset vectors. The mean of the vectors in the (Xus, Yus, Zus) system is (0.57, 0.98, 0.01), which has a length of 1.13 mm.

Table 11. Effect on Accuracy of Rotating the Probe and Scan Direction

Acquisition        | n   | Mean D(X), D(Y), D(Z) | SD of D(X), D(Y), D(Z) | Transform from X, Y, Z to Xus, Yus, Zus | Mean D(Xus), D(Yus), D(Zus)
All X scans        | 810 | 0.27, 0.63, 0.24      | 0.70, 0.33, 0.18       | Equation 7                              | 0.27, 0.63, 0.24
All diagonal scans | 810 | 0.15, 1.28, 0.05      | 0.43, 0.43, 0.19       | Equation 8                              | 0.80, 1.01, 0.05
All Y scans        | 810 | 1.29, 0.63, 0.22      | 0.45, 0.43, 0.16       | Equation 9                              | 0.63, 1.29, 0.22

Only translation scans are considered. (X, Y, Z) is the reference frame's (phantom's) coordinate system, while (Xus, Yus, Zus) is the coordinate system of the ultrasound image plane. n is the number of points included in the analysis. All D values are given in millimeters.
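As an illustration of the reasoning behind this consistency check, the following numpy sketch applies the transforms of Equations (7)–(9) to a purely hypothetical calibration offset that is held fixed in the ultrasound plane's coordinate system. The offset (0.6, 1.0, 0.0) mm is an illustrative value, not a measured quantity; the point is that its appearance in the phantom frame depends on the scan direction, while transforming each appearance back yields the same ultrasound-frame vector.

    import numpy as np

    # Transform matrices for Equations (7)-(9): phantom frame (X, Y, Z) ->
    # ultrasound frame (Xus, Yus, Zus), one per translation-scan direction.
    s = 1.0 / np.sqrt(2.0)
    TRANSFORMS = {
        "X scans":        np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float),  # Eq. (7)
        "diagonal scans": np.array([[s, -s, 0], [s, s, 0], [0, 0, 1]]),               # Eq. (8)
        "Y scans":        np.array([[0, 1, 0], [1, 0, 0], [0, 0, 1]], dtype=float),  # Eq. (9)
    }

    # Hypothetical calibration offset, fixed in the ultrasound plane's system (mm).
    d_us = np.array([0.6, 1.0, 0.0])

    for name, M in TRANSFORMS.items():
        d_phantom = np.linalg.inv(M) @ d_us   # how the fixed offset appears in the phantom frame
        recovered = M @ d_phantom             # transforming back recovers the same ultrasound-frame vector
        print(f"{name:15s} phantom-frame offset = {np.round(d_phantom, 2)}, "
              f"ultrasound-frame offset = {np.round(recovered, 2)}")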

This analysis indicates that erroneous probe calibration is a major contributor to the system error in the present study. To confirm this further, we used the mean (Xus, Yus, Zus) vector as compensation to the original data set by converting it back into the reference system (X, Y, Z) (using separate conversion for the X, Y, and diagonal scanning directions, by inversion of the above relations), and subtracting it from the respective data subsets. Although the compensation vector was derived from only the translation scans, and thus is not, strictly speaking, valid for the tilt scans, we applied the compensation to the whole data set. The average error vector length d̄ was 0.84 mm after compensation, compared to 1.40 mm before compensation.

For certain probe calibration procedures, the finite width of the ultrasound plane [classified as reconstruction error (W) in Table 2] may influence the calibration accuracy. However, we cannot say whether this is so in the present case, because the details of the procedure used in the neuronavigation system studied here are unavailable to us. This also implies that we are unable to suggest improvements to the procedure.

Accuracy When Compensating for Systematic Error Sources

The systematic error effects discussed above imply that the reconstructed volume is positioned with an offset relative to the physical volume. This offset may be compensated for by moving the reconstructed volume until the residual d̄ attains a minimum value. We have done this for each of the 180 volumes, using standard point set registration algorithms (translational and rotational motion). After this compensation, the d̄ over all volumes becomes 0.37 mm. Additional uniform scaling of the translated and rotated volumes gives negligible improvement, while nonuniform scaling brings d̄ down to 0.26 mm.
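The "standard point set registration" step can be realized in several ways; one common least-squares solution for the rigid (rotation plus translation) part is the SVD-based Kabsch/Procrustes method. The sketch below is a generic implementation of that idea under the assumption of known point correspondences, not the specific routine used in this study, and the two point sets are random placeholders standing in for the located and physical wire-cross positions.

    import numpy as np

    def rigid_register(moving, fixed):
        """Least-squares rigid registration (rotation R, translation t) of two
        corresponding 3D point sets, so that R @ moving_i + t ~ fixed_i."""
        mu_m, mu_f = moving.mean(axis=0), fixed.mean(axis=0)
        H = (moving - mu_m).T @ (fixed - mu_f)                         # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])    # avoid reflections
        R = Vt.T @ D @ U.T
        t = mu_f - R @ mu_m
        return R, t

    def mean_residual(moving, fixed):
        R, t = rigid_register(moving, fixed)
        aligned = moving @ R.T + t
        return np.linalg.norm(aligned - fixed, axis=1).mean()

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        fixed = rng.uniform(0, 50, size=(27, 3))   # stand-in for the 27 physical wire crosses (mm)
        moving = fixed + np.array([0.5, 1.0, 0.2]) + rng.normal(0, 0.3, size=(27, 3))
        print(f"mean residual before: {np.linalg.norm(moving - fixed, axis=1).mean():.2f} mm")
        print(f"mean residual after rigid registration: {mean_residual(moving, fixed):.2f} mm")

In this setup the rigid fit removes the common offset, so the remaining residual reflects only the random per-point noise, which is the same interpretation given to the 0.37-mm figure above.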

These numbers result from individual offset compensation and therefore cannot be achieved by a common correction to all reconstructed volumes. However, they confirm that the volume is fairly accurately reconstructed with respect to geometric proportions. In addition, they give an indication of the overall (random) noise level in our experimental design.

Implications for Clinical Use of the System

Differences in Acquisition Conditions between Laboratory and Clinic

In this work, we have evaluated the neuronavigation system in a controlled laboratory setting. It is, however, important to understand how this setting differs from the clinical situation to assess the overall clinical accuracy of the system.

In the clinical situation, 3D volumes are typically acquired by tilted scans. We also made linear translation scans, because such volumes allow for easier analysis of various effects. It should be noted that tilted scans give slightly poorer accuracy than translation scans (Table 7d).

The camera position relative to the reference and probe frames is also important. According to the manufacturer of the positioning system, the best performance is obtained when the frames face the cameras directly. This was impossible in our laboratory setting, because the reference frame affixed to the wire phantom is horizontal. This probably accounts for the improvement in accuracy as the cameras are raised higher (Table 7b), although the probe frame visibility is simultaneously degraded. Our evaluation includes five different camera positions to cover the situation where the cameras may be moved during an operation. However, it can be expected that the accuracy will improve if the frames and cameras are optimally positioned in the clinical situation. We get an indication of this by considering only the tilted scans for the "best" camera position, which gives an average error vector length d̄ = 1.28 mm; σd = 0.45 mm (camera position 4; distance 1.8 m and 62° elevation).

Sensor Frame Attachment/Probe Calibration

Ideally, one unique position sensor should be permanently attached to one unique ultrasound probe before calibration in the factory. However, this option represents practical difficulties in that most ultrasound probes cannot be sterilized, while the position-sensor frame can. Alternatively, both devices can be covered by sterile drapes, but this may interfere with the view of the cameras.



To our knowledge, there is still no good solution to this problem for optical tracking systems, but current strategies include two methods: (a) calibration of any probe to any position sensor inside the operating room using a special ultrasound calibration phantom and algorithm; or (b) calibration of one particular probe to one particular position sensor in the laboratory or factory, using a special adapter that ensures repetitive and precise attachment between the two, even through a sterile probe drape. The latter option allows neither the probe nor the position-sensor adapter to be replaced without returning both to the factory for new calibration. The neuronavigation system investigated here is based on the latter solution to give the vendor better control over the accuracy.

Sound Speed Variation in the Brain

For the human brain, the average value for the sound speed at 37°C and 5 MHz is 1,568 m/s.44 The average value for soft human tissue is 1,540 m/s.41

Some variation exists between different tissue types; it can be as low as 1,504 m/s for tumor cysts at 22°C, and as high as 1,569 m/s for a certain type of meningioma at 20°C.41 Temperature coefficients for acoustic velocity are sparse, but for human fetal brain the coefficient has been estimated to be 1.5 m/(s·°C) for the temperature range 24–37°C.41

Assuming this coefficient for cysts and meningiomas, the sound speed at 37°C becomes approximately 1,527 m/s and 1,588 m/s, respectively.

We illustrate how the sound speed may affect the accuracy in a practical situation with the idealized example shown in Figure 11(a). The area of interest is located a constant distance r from the probe at the surface of the brain. We consider three types of intervening tissue: cyst with sound speed 1,527 m/s; meningioma with sound speed 1,588 m/s; or normal brain tissue with sound speed 1,568 m/s. A displacement error occurs if the image is reconstructed with a sound speed differing from the true value. This error, at depth r, is given by

    Δr = (v/vsc − 1) r    (10)

where v is the true sound speed, and vsc is the sound speed used for reconstruction (scan conversion). We have calculated Equation (10) as a function of distance r for two cases of scan-conversion sound speed: using the average sound speed of soft human tissue (1,540 m/s), or using the average sound speed of brain tissue (1,568 m/s). The results are shown in Figure 11(b) and (c), respectively.

Fig. 11. Idealized model to illustrate the effect of using erroneous sound speed in volume reconstruction. (a) Ultrasound imaging of the area of interest in the brain at a distance r from the probe. (b) The error in radial distance due to volume reconstruction with a sound speed of 1,540 m/s (average value for soft tissue), when the true sound speed in intervening tissue is 1,527 m/s (cyst, circle symbols), 1,568 m/s (average in normal brain, star symbols), or 1,588 m/s (meningioma, plus symbols). (c) Similar to (b), but with 1,568 m/s (average in normal brain) used for volume reconstruction. The true sound speed in the intervening tissue is 1,527 m/s (cyst, circle symbols) or 1,588 m/s (meningioma, plus symbols).
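A short Python sketch of Equation (10), evaluated for the tissue sound speeds quoted above, indicates the magnitude of the effect; the depths are arbitrary illustration values:

    def sound_speed_error_mm(v_true, v_scan_conversion, depth_mm):
        """Radial displacement error from Equation (10): delta_r = (v/v_sc - 1) * r."""
        return (v_true / v_scan_conversion - 1.0) * depth_mm

    if __name__ == "__main__":
        v_sc = 1540.0                       # reconstruction speed: soft-tissue average (m/s)
        tissues = {"cyst": 1527.0, "normal brain": 1568.0, "meningioma": 1588.0}
        for depth in (30.0, 60.0, 90.0):    # arbitrary example depths (mm)
            errors = {name: sound_speed_error_mm(v, v_sc, depth) for name, v in tissues.items()}
            print(depth, {name: round(e, 2) for name, e in errors.items()})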

It should be mentioned that these examples represent extreme cases, because the intervening tissue during imaging is seldom only cystic or a meningioma. Nevertheless, for the smaller values of r, the plot may have practical implications.

Overall Clinical Accuracy

As stated previously, the most important parameter for the surgeon is the overall clinical accuracy. Although this parameter is difficult to assess, we believe that an estimate can be made, based on our laboratory evaluation and a thorough understanding of the significant additional error sources that occur in the clinical setting. In neuronavigation based on preoperative MRI, the laboratory accuracy may be well controlled, with the clinical accuracy being considerably worse due to error sources like patient registration and brain shift during surgery. For neuronavigation based on intraoperative ultrasound, however, data registration is superfluous, and the brain shift can be kept small by repeated image acquisition. Hence, the overall clinical accuracy is likely to be closer to the overall laboratory accuracy. This is further supported by the ultrasound modality being less susceptible to user- and procedure-dependent error sources than MRI. We will therefore estimate the overall clinical accuracy of the neuronavigation system used in this study based on our laboratory results. This calculation is summarized in Table 12.

Our starting point is the laboratory accuracy for all tilt scans at the best camera setting, because this configuration is considered to be most relevant in the clinical situation (see the earlier section entitled Differences in Acquisition Conditions between Laboratory and Clinic). Our study does not include calibration and tracking errors for a rigid pointer or surgical tool, which are in the order of 0.6 mm.35

Furthermore, we have not considered errors associated with the extraction and presentation of a 2D image from the 3D volume. However, such errors should be rather small for a system using good interpolation routines and a high-resolution monitor.

By adding these error numbers, we obtain an estimate of the overall laboratory accuracy. The error sources are assumed to be stochastically independent. Hence, their contributions are added on a sum-of-squares basis.

For a rigid phantom, the overall laboratory accuracy may be better for an MRI system than for the ultrasound-based system, mainly due to the probe calibration needed for ultrasound-based navigation. However, in the clinical setting, the MRI option will require a registration based on invasive skull fiducials to be comparable to the laboratory (phantom) accuracy. The registration process is unnecessary when using ultrasound.

The main error sources for the ultrasound system in the clinical situation, sound speed variation and brain shift, have already been discussed in detail. The sound speed range indicated in Table 12 is based on Figure 11. The brain-shift error may be negligible if the 3D maps are frequently updated (zero when using real-time 3D ultrasound), whereas the error increases when the update is done at longer intervals. The extreme situation would be where the surgery is based on a single ultrasound volume acquired before the operation starts. However, even in this case, the situation is favorable compared to preoperative MRI. For example, when performing a craniotomy, the brain shift may be as large as 1 cm,45,46 and this kind of error will be eliminated by using ultrasound. The problem of brain shift during the operation can also be reduced by using intraoperative MRI.

The surgeon's interpretation of the images will be individual and subjective, and the accuracy will depend on the monitor's size and resolution, in addition to the image resolution and contrast. Hence, no exact number can be given for this error. However, we estimate that an accuracy in the order of 0.5 mm should be attainable.

Table 12. Overall Clinical Accuracy of an Ultrasound-Based Neuronavigation System

Laboratory accuracy (tilt scans, one camera position)       | 1.3 mm
+ Calibration and position tracking of rigid surgical tool  | 0.6 mm
+ Interpolation 2D slice from 3D/tool cross indication      | 0.1 mm
Overall laboratory accuracy                                 | 1.4 mm
+ Sound speed uncertainty                                   | 0.5–3 mm
+ Brain shift                                               | 1–10 mm
+ Interpretation of images on monitor                       | 0.5 mm
Overall clinical accuracy                                   | 1.9–10.6 mm

The magnitude of the error numbers is discussed in the text. The numbers are summed as stochastically independent contributions (√Σ(...)²).

When summing these contributions, again as stochastically independent sources, we obtain an estimated overall clinical accuracy for the ultrasound system of ~2 mm under favorable conditions, i.e., when the sound speed used in the scanner is close to the average sound speed in the imaged tissue and the ultrasound volumes are frequently updated. The need for updating can be determined by real-time 2D imaging. If these conditions are not met, the accuracy becomes poorer.
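The quadrature sums behind Table 12 can be reproduced in a few lines of Python, using the error magnitudes listed in the table and evaluating the clinical range at its favorable and unfavorable ends:

    import math

    def rss(*errors_mm):
        """Combine stochastically independent error sources on a sum-of-squares basis."""
        return math.sqrt(sum(e * e for e in errors_mm))

    # Laboratory accuracy (Table 12): tilt scans at one camera position, tool calibration
    # and tracking, and 2D-slice interpolation / tool cross indication.
    lab = rss(1.3, 0.6, 0.1)
    print(f"overall laboratory accuracy ~ {lab:.1f} mm")                           # ~1.4 mm

    # Clinical accuracy: favorable case (small sound-speed error, frequent 3D updates)
    # versus unfavorable case (large sound-speed error, no update for brain shift).
    favorable = rss(lab, 0.5, 1.0, 0.5)
    unfavorable = rss(lab, 3.0, 10.0, 0.5)
    print(f"overall clinical accuracy ~ {favorable:.1f} to {unfavorable:.1f} mm")  # ~1.9 to 10.6 mm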

CONCLUSION

We have evaluated the 3D navigation accuracy of an ultrasound-based neuronavigation system, using a wire phantom in a laboratory setup. A total of 180 scans of the phantom were performed under various acquisition conditions. The laboratory accuracy was found to be 1.40 ± 0.45 mm. Detailed analysis of the data indicates that inappropriate probe calibration is the main error source in this study.

The differences between our setup and the clinical situation have been discussed. The overall clinical accuracy for ultrasound-based neuronavigation is expected to be close to the overall laboratory accuracy. This is partly due to the fact that patient registration is unnecessary. In addition, intraoperative ultrasound imaging may eliminate the brain-shift problem, which is potentially the largest source of error.

ACKNOWLEDGMENT

This work was supported by grants from the Research Council of Norway through the Strategic University Program in Medical Technology at the Norwegian University of Science and Technology (NTNU), the Strategic Institute Program at SINTEF Unimed, and the Norwegian Ministry of Health and Social Affairs. We wish to thank the Laboratory for Fine Mechanics at SINTEF Unimed in Trondheim, Norway, for making the equipment necessary for this study, especially the main phantom, and also for providing indispensable measurements. We also thank Geir Arne Tangen, M.Sc. (SINTEF Unimed) for valuable contributions to the laboratory setup, and John S. Tyssedal (Associate Professor, Department of Mathematical Sciences, NTNU, Trondheim, Norway) for helpful suggestions on the ANOVA analysis. We wish to thank MISON AS and the Department of Neurosurgery at the University Hospital in Trondheim, Norway, for providing us with access to the neuronavigation system evaluated in this study, and, in particular, we thank Åge Grønningsæter (MISON AS) for helpful discussions.

REFERENCES

1. Gronningsaeter A, Kleven A, Ommedal S, Aarseth TE, Lie T, Lindseth F, Langø T, Unsgård G. SonoWand, an ultrasound-based neuronavigation system. Neurosurgery 2000;47(6):1373–1380.
2. Gumprecht HK, Widenka DC, Lumenta CB. BrainLab VectorVision neuronavigation system: technology and clinical experiences in 131 cases. Neurosurgery 1999;44(1):97–105.
3. Hadani M, Spiegelman R, Feldman Z, Berkenstadt H, Ram Z. Novel, compact, intraoperative magnetic resonance imaging-guided system for conventional neurosurgical operating rooms. Neurosurgery 2001;48(4):799–809.
4. Hata N, Dohi T, Iseki H, Takakura K. Development of a frameless and armless stereotactic neuronavigation system with ultrasonographic registration. Neurosurgery 1997;41(3):608–613.
5. Schaller C, Meyer B, van Roost D, Schramm J. Image guided microsurgery with a semifreehand neuronavigational device. Comp Aid Surg 1997;2(3–4):162–171.
6. Sipos EP, Tebo SA, Zinreich SJ, Long DM, Brem H. In vivo accuracy testing and clinical experience with the ISG Viewing Wand instrumentation application. Neurosurgery 1996;39(1):194–204.
7. Eljamel MS. Frameless stereotactic neurosurgery: two steps towards the holy grail of surgical navigation. Stereotact Funct Neurosurg 1999;72(2–4):125–128.
8. Gumprecht H, Trost HA, Lumenta CB. Neuroendoscopy combined with frameless neuronavigation. J Neurosurg 2000;14(2):129–131.
9. Knauth M, Witz CR, Tronnier VM, Aras N, Kunze S, Sartor K. Intraoperative MR imaging increases the extent of tumor resection in patients with high-grade gliomas. AJNR Am J Neuroradiol 1999;20:1642–1646.
10. Paleologos TS, Wadley JP, Kitchen ND, Thomas DGT. Clinical utility and cost-effectiveness of interactive image-guided craniotomy: Clinical comparison between conventional and image-guided meningioma surgery. Neurosurgery 2000;47(1):40–48.
11. Reinhardt H, Trippel M, Westermann B, Gratzl O. Computer aided surgery with special focus on neuronavigation. Comput Med Imaging Graph 1999;23(5):237–244.
12. Roessler K, Ungersboeck K, Aichholzer M, Dietrich W, Goerzer H, Matula C, Czech T, Koos WT. Frameless stereotactic lesion contour-guided surgery using a computer-navigated microscope. Surg Neurol 1998;49:282–289.
13. Seifert V, Zimmermann M, Trantakis C, Vitzthum H-E, Kuhnel K, Raabe A, Bootz F, Schneider J-P, Schmidt F, Dietrich J. Open MRI-guided neurosurgery. Acta Neurochir 1999;141(5):455–464.
14. Steinmeier R, Fahlbusch R, Ganslandt O, Nimsky C, Buchfelder M, Kaus M, Heigl T, Lenz G, Kuth R, Huk W. Intraoperative magnetic resonance imaging with the Magnetom open scanner: concepts, neurosurgical indications, and procedures. A preliminary report. Neurosurgery 1998;43(4):739–748.
15. Wadley J, Dorward N, Kitchen N, Thomas D. Pre-operative planning and intra-operative guidance in modern neurosurgery: a review of 300 cases. Ann R Coll Surg Engl 1999;81(4):217–225.

16. Wirtz CR, Albert FK, Schwaderer M, Heuer C, Staubert A, Tronnier VM, Knauth M, Kunze S. The benefit of neuronavigation for neurosurgery analyzed by its impact on glioblastoma surgery. Neurol Res 2000;22:354–360.
17. Zakhary R, Keles E, Berger MS. Intraoperative imaging techniques in the treatment of brain tumors. Curr Opin Oncol 1999;11:152–156.
18. Samset E, Hirschberg H. Neuronavigation in intraoperative MRI. Comp Aid Surg 1999;4(4):200–207.
19. Fenster A, Downey DB. 3-D ultrasound imaging: a review. IEEE Eng Med Biol 1996;15(6):41–51.
20. Nelson TR, Pretorius DH. Three-dimensional ultrasound imaging. Ultrasound Med Biol 1998;24(9):1243–1270.
21. Cartellieri M, Kremser J, Vorbeck F. Comparison of different 3D navigation systems by a clinical "user." Eur Arch Otorhinolaryngol 2001;258:38–41.
22. Dorward NL, Alberti O, Palmer JD, Kitchen ND, Thomas DGT. Accuracy of true frameless stereotaxy: in vivo measurement and laboratory phantom studies. J Neurosurg 1999;90:160–168.
23. Hartov A, Eisner SD, Roberts DW, Paulsen KD, Platenik LA, Miga MI. Error analysis for a free-hand three-dimensional ultrasound system for neuronavigation. Neurosurg Focus 1999;6(3):Article 5.
24. Kaus M, Steinmeier R, Sporer T, Ganslandt O, Fahlbusch R. Technical accuracy of a neuronavigation system measured with a high-precision mechanical micromanipulator. Neurosurgery 1997;41(6):1431–1436.
25. Wirtz CR, Kanuth M, Hassfeld S, Tronnier VM, Albert FK, Bonsanto MM, Kunze S. Neuronavigation—first experiences with three different commercially available systems. Zentralbl Neurochir 1998;59:14–22.
26. Comeau RM, Sadikot AF, Fenster A, Peters TM. Intraoperative ultrasound for guidance and tissue shift correction in image-guided surgery. Med Phys 2000;27(4):787–800.
27. Helm PA, Eckel TS. Accuracy of registration methods in frameless stereotaxis. Comp Aid Surg 1998;3(2):51–56.
28. Kall B, Goerss SJ, Stiving SO, Davis DH, Kelly PJ. Quantitative analysis of a noninvasive stereotactic image registration technique. Stereotact Funct Neurosurg 1996;66:69–74.
29. Smith KR, Frank KJ, Bucholz RD. The NeuroStation™—a highly accurate, minimally invasive solution to frameless stereotactic neurosurgery. Comp Med Imaging Graph 1994;18(4):247–256.
30. Villalobos H, Germano IM. Clinical evaluation of multimodality registration in frameless stereotaxy. Comp Aid Surg 1999;4(1):45–49.
31. Sandemann DR, Patel N, Chandler C, Nelson RJ, Coakham HB, Griffith HB. Advances in image-directed neurosurgery: preliminary experience with the ISG Viewing Wand compared with the Leksell G frame. Br J Neurosurg 1994;8:529–544.
32. Sumanaweera TS, Adler JR, Napel S, Glover G. Characterization of spatial distortion in magnetic resonance imaging and its implications for stereotactic surgery. Neurosurgery 1994;35(4):696–704.
33. Maciunas RJ, Galloway RL, Latimer JW. The application accuracy of stereotactic frames. Neurosurgery 1994;35(4):682–695.
34. Walton LB, Hampshire A, Forster DMC, Kemeny AA. A phantom study to assess the accuracy of stereotactic localization, using T1-weighted magnetic resonance imaging with the Leksell stereotactic system. Neurosurgery 1996;38(1):170–178.
35. Chassat F, Lavallee S. Experimental protocol of accuracy evaluation of 6-D localizers for computer-integrated surgery: Application to four optical localizers. In: Wells WM, Colchester A, Delp S, editors: Proceedings of First International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI'98), Cambridge, MA, October 1998. Lecture Notes in Computer Science 1496. Berlin: Springer, 1998. p 277–284.
36. Schmerber S, Chassat F. Accuracy evaluation of a CAS system: laboratory protocol and results with 6D localizers, and clinical experiences in otorhinolaryngology. Comp Aid Surg 2001;6:1–13.
37. Langø T. Ultrasound guided surgery: image processing and navigation. Doctoral thesis, Norwegian University of Science and Technology, Trondheim, Norway, 2000.
38. Prager RW, Rohling RN, Gee AH, Berman L. Rapid calibration for 3-D freehand ultrasound. Ultrasound Med Biol 1998;24(6):855–869.
39. Prager RW, Rohling RN, Gee AH, Berman L. Automatic calibration of 3-D free-hand ultrasound. Technical report: CUED/F-INFENG/TR 303. Department of Engineering, Cambridge University, UK, 1997.
40. Unsgård G, Ommedal S, Muller T, Gronningsaeter A, Hernes TAN. Neuronavigation by intraoperative 3D ultrasound: initial experiences during brain tumor resections. Neurosurgery 2002;50(4):804–812.
41. Duck FA. Physical Properties of Tissue. A Comprehensive Reference Book. London: Academic Press Ltd., 1990.
42. Lindseth F, Bang J, Langø T. A robust and automatic method for evaluating the accuracy in 3D ultrasound-based navigation. Manuscript under review.
43. Box GEP, Hunter WG, Hunter JS. Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building (chapter 6). New York: John Wiley & Sons, 1978.
44. Kremkau FW, Barnes RW, McGraw CP. Ultrasonic attenuation and propagation speed in normal human brain. J Acoust Soc Am 1981;70:29–38.
45. Nimsky C, Ganslandt O, Cerny S, Hastreiter P, Greiner G, Fahlbusch R. Quantification of, visualization of, and compensation for brain shift using intraoperative magnetic resonance imaging. Neurosurgery 2000;47(5):1070–1080.


46. Roberts DW, Hartov A, Kennedy FE, Miga MI, Paulsen KD. Intraoperative brain shift and deformation: a quantitative analysis of cortical displacement in 28 cases. Neurosurgery 1998;43(4):749–760.
47. Maurer CR, Fitzpatrick JM, Wang MY, Galloway RL, Maciunas RJ, Allen GS. Registration of head volume images using implantable fiducial markers. IEEE Trans Med Imaging 1997;16(4):447–462.
48. Hill DLG, Maurer CR, Maciunas RJ, Barwise JA, Fitzpatrick JM, Wang MY. Measurement of intraoperative brain surface deformation under a craniotomy. Neurosurgery 1998;43(3):514–528.
49. Khadem R, Yeh C, Sadeghi-Tehrani M, Bax MR, Johnson JA, Welch JN, Wilkinson EP, Shahidi R. Comparative tracking error analysis of five different tracking systems. Comp Aid Surg 2000;5(2):98–107.
