Modeling a Sensor to Improve Its Efficacy


Hindawi Publishing Corporation, Journal of Sensors, Volume 2013, Article ID 481054, 11 pages. http://dx.doi.org/10.1155/2013/481054

Research Article: Modeling a Sensor to Improve Its Efficacy

Nabin K. Malakar,1 Daniil Gladkov,2 and Kevin H. Knuth3

1 W. B. Hanson Center for Space Sciences, University of Texas at Dallas, Richardson, TX 75080, USA
2 Department of Physics, University at Albany (SUNY), Albany, NY 12222, USA
3 Departments of Physics and Informatics, University at Albany (SUNY), Albany, NY 12222, USA

Correspondence should be addressed to Nabin K. Malakar; nabinkm@gmail.com

Received 16 March 2013; Accepted 20 May 2013

Academic Editor: Guangming Song

Copyright © 2013 Nabin K. Malakar et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Robots rely on sensors to provide them with information about their surroundings. However, high-quality sensors can be extremely expensive and cost-prohibitive. Thus many robotic systems must make do with lower-quality sensors. Here we demonstrate via a case study how modeling a sensor can improve its efficacy when employed within a Bayesian inferential framework. As a test bed we employ a robotic arm that is designed to autonomously take its own measurements using an inexpensive LEGO light sensor to estimate the position and radius of a white circle on a black field. The light sensor integrates the light arriving from a spatially distributed region within its field of view, weighted by its spatial sensitivity function (SSF). We demonstrate that by incorporating an accurate model of the light sensor SSF into the likelihood function of a Bayesian inference engine, an autonomous system can make improved inferences about its surroundings. The method presented here is data based, fairly general, and made with plug-and-play in mind so that it could be implemented in similar problems.

1. Introduction

Robots rely on sensors to provide them with information about their surroundings. However, high-quality sensors can be cost-prohibitive, and often one must make do with lower-quality sensors. In this paper we present a case study which demonstrates how employing an accurate model of a sensor within a Bayesian inferential framework can improve the quality of inferences made from the data produced by that sensor. In fact, the quality of the sensor can be quite poor, but if it is known precisely how it is poor, this information can be used to improve the results of inferences made from the sensor data.

To accomplish this we rely on a Bayesian inferential framework in which a machine learning system considers a set of hypotheses about its surroundings and identifies more probable hypotheses given incoming sensor data. Such inferences rely on a likelihood function, which quantifies the probability that a hypothesized situation could have given rise to the data. The likelihood is often considered to represent the noise model, and this inherently includes a model of how the sensor is expected to behave when presented with a given stimulus. By incorporating an accurate model of the sensor, the inferences made by the system are improved.

As a test bed we employ an autonomous robotic arm developed in the Knuth Cyberphysics Laboratory at the University at Albany (SUNY). The robot is designed to perform studies in autonomous experimental design [1, 2]. In particular, it performs autonomous experiments in which it uses an inexpensive LEGO light sensor to estimate the position and radius of a white circle on a black field. The light sensor integrates the light arriving from a spatially distributed region within its field of view, weighted by its spatial sensitivity function (SSF). We consider two models of the light sensor. The naïve model predicts that the light sensor will return one value on average if it is centered over a black region and another, higher value on average if it is centered on a white region. The more accurate model incorporates information about the SSF of the light sensor to predict what values the sensor would return given a hypothesized surface albedo field. We demonstrate that by incorporating a more accurate model of the light sensor into the likelihood function of a Bayesian inference engine, a robot can make improved inferences about its surroundings.


The efficacy of the sensor model is quantified by the average number of measurements the robot needs to make to estimate the circle parameters within a given precision.

There are two aspects to this work. First is the characterization of the light sensor, and second is the incorporation of the light sensor model into the likelihood function of the robot's machine learning system in a demonstration of improved efficacy. In Section 2 we describe the robot, its light sensor, the experiment that it is designed to perform, and the machine learning system employed. We then discuss the methods used to collect data from the light sensor, the models used to describe the light sensor, their incorporation into the machine learning system, and the methods used to estimate the model parameters and select the model order. Section 3 describes the resulting SSF model and the results of the experiments comparing the naïve light sensor model to the more accurate SSF model. In Section 4 we summarize our results, which demonstrate how, by incorporating a more accurate model of the sensor, one can improve its efficacy.

2. Materials and Methods

In this section we begin by discussing various aspects of the robotic test bed, followed by a discussion of the techniques used to characterize the light sensor.

2.1. Robotic Arm Test Bed. The robotic arm is designed to perform studies in autonomous experimental design [1, 2]. The robot itself is constructed using the LEGO NXT Mindstorms system (Figure 1). The LEGO Mindstorms system was utilized in part to demonstrate that high-quality autonomous systems can be achieved when using lower-quality equipment if the machine learning and data analysis algorithms are handled carefully. It employs one motor to allow it to rotate about a vertical axis, indicated by the black line in the top center of the figure, and two motors to extend and lower the arm about the joints located at the positions indicated by the short arrows. The motors are controlled directly by the LEGO brick, which is commanded via Bluetooth by a Dell Latitude laptop computer running the robot's machine learning system, which is programmed in MatLab (Mathworks Inc.). The LEGO light sensor is attached to the end of the arm (indicated by the long arrow in Figure 1 and displayed in the inset at the upper right). Based on commands issued by the laptop, the robotic arm can deploy the light sensor to any position within its reach on the playing field. The light sensor is lowered to an average height of 14 mm above the surface before taking a measurement. The arm is designed using a trapezoidal construction that keeps the sensor aimed at nadir, always normal to the surface, despite the extension of the arm.

The LEGO light sensor (LEGO Part 9844) consists of a photodiode-LED pair. The white circle is the photodiode, and the red circle is the illuminating LED. Note that they are separated by a narrow plastic ridge, which prevents the LED from shining directly into the photodiode. This ridge, along with the plastic lenses and the presence of the illuminating LED, affects the spatial sensitivity of the sensor.

Figure 1: A photograph showing the robotic arm along with the circle it is programmed to characterize. The robotic arm is constructed using the LEGO NXT Mindstorms system. It employs one motor to allow it to rotate about a vertical axis, indicated by the black line in the top center of the image, and two motors to extend and lower the arm about the joints located at the positions indicated by the short arrows. The LEGO light sensor, also shown in the inset, is attached to the end of the arm, as indicated by the long arrow.

When activated, the light sensor flashes for a brief instant and measures the intensity of the reflected light. The photodiode and its support circuitry are connected to the sensor port of the LEGO Brick (LEGO Part 9841), which runs on a 32-bit ARM7 ATMEL microcontroller. The measured intensities are converted by internal software running on the ATMEL microcontroller to a scale of 1 to 100, which we refer to as LEGO units. The light sensor integrates the light arriving from a spatially distributed region within its field of view. The spatial sensitivity of the sensor to light sources within its field of view is described by the spatial sensitivity function (SSF). This function is unknown, but if it were known, one could weight the surface albedo field with the SSF and integrate to obtain an estimate of the recorded intensity of the reflected light.

2.2. The Circle Characterization Experiment. The robotic arm is designed to deploy the light sensor to take measurements of the surface albedo at locations within a playing field of dimensions approximately 131 × 65 LEGO distance units (1048 mm × 520 mm) within an automated experimental design paradigm [1–3]. The robot is programmed with a set of hypotheses of what shapes it could find placed on the playing field. Instead of being programmed with a specific set of strategies for characterizing the hypothesized shapes, the robot utilizes a generalized Bayesian Inference Engine coupled to an Inquiry Engine to make inferences about the hypotheses based on the recorded data and to use uncertainties in its inferences to drive further exploration by autonomously selecting a new measurement location that promises to provide the maximum amount of relevant information about the problem.

In this experiment the robotic arm is instructed that there is a white circle of unknown radius arbitrarily placed on the black field.


Such an instruction is encoded by providing the robot with a model of the surface albedo consisting of three parameters: the center location $(x_o, y_o)$ and radius $r_o$, written jointly as
$$C = \{(x_o, y_o), r_o\}, \tag{1}$$
so that, given a measurement location $(x_i, y_i)$, the albedo $S$ is expected to be
$$S(x_i, y_i; C) = \begin{cases} 1 & \text{if } D((x_i, y_i), (x_o, y_o)) \le r_o, \\ 0 & \text{if } D((x_i, y_i), (x_o, y_o)) > r_o, \end{cases} \tag{2}$$
where
$$D((x_i, y_i), (x_o, y_o)) = \sqrt{(x_i - x_o)^2 + (y_i - y_o)^2} \tag{3}$$
is the Euclidean distance between the measurement location $(x_i, y_i)$ and the center of the circle $(x_o, y_o)$, and an albedo of 1 signifies that the surface is white and 0 signifies that the surface is black. Precisely how these expectations are used to make inferences from data is explained in the next section. Keep in mind that, while the circle's precise radius and position are unknown, the robot has been provided with limited prior information about the allowable range of radii and positions.
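The albedo model of (1)–(3) is simple enough to state directly in code. The following sketch evaluates the hypothesized albedo; it is a minimal Python illustration (the authors' system was implemented in MatLab, and the function and variable names here are our own, not the authors' code).

```python
import numpy as np

def circle_albedo(x, y, xo, yo, ro):
    """Hypothesized surface albedo S(x, y; C) of (2): 1 (white) inside the
    circle C = {(xo, yo), ro}, and 0 (black) outside."""
    distance = np.hypot(x - xo, y - yo)          # Euclidean distance D of (3)
    return np.where(distance <= ro, 1.0, 0.0)

# Example: albedo at two candidate measurement locations for a hypothesized
# circle centered at (10, 5) with radius 8 (LEGO distance units).
print(circle_albedo(np.array([12.0, 30.0]), np.array([6.0, 5.0]), 10.0, 5.0, 8.0))
# -> [1. 0.]
```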

Again, it is important to note that the robot does not scan the surface to solve the problem, nor does it try to find three points along the edge of the circle. Instead it employs a general system, which works for any expected shape or set of shapes, that autonomously and intelligently determines optimal measurement locations based both on what it knows and on what it does not know. The number of measurements needed to characterize all three circle parameters within the desired accuracy is a measure of the efficiency of the experimental procedure.

2.3. The Machine Learning System. The machine learning system employs a Bayesian Inference Engine to make inferences about the circle parameters given the recorded light intensities, as well as an Inquiry Engine designed to use the uncertainties in the circle parameter estimates to autonomously select measurement locations that promise to provide the maximum amount of relevant information about the problem.

The core of the Bayesian Inference Engine is centered around the computation of the posterior probability $\Pr(C \mid D, I)$ of the albedo model parameters $C$ in (1), given the light sensor recordings (data) $d_i$ recorded at locations $(x_i, y_i)$, which we write compactly as
$$D = \{(d_1, (x_1, y_1)), \ldots, (d_N, (x_N, y_N))\}, \tag{4}$$
and any additional prior information $I$. Bayes' Theorem allows one to write the posterior probability as a function of three related probabilities:
$$\Pr(C \mid D, I) = \Pr(C \mid I)\,\frac{\Pr(D \mid C, I)}{\Pr(D \mid I)}, \tag{5}$$

where the right-hand side consists of the product of the prior probability of the circle parameters, $\Pr(C \mid I)$, which describes what is known about the circle before any sensor data are considered, with a ratio of probabilities that are sensor-data dependent. It is in this sense that Bayes' Theorem represents a learning algorithm, since what is known about the circle parameters before the data are considered (prior probability) is modified by the recorded sensor data, resulting in a quantification of what is known about the circle parameters after the data are considered (posterior probability). The probability in the numerator on the right is the likelihood $\Pr(D \mid C, I)$, which quantifies the probability that the sensor data could have resulted from the hypothesized circle parameterized by $C$. The probability in the denominator is the marginal likelihood, or evidence, which here acts as a normalization factor. Later, when estimating the SSF of the light sensor (which is a different problem), the evidence, which can be written as the integral of the product of the prior and the likelihood over all possible model parameters, will play a critical role.

The likelihood term $\Pr(D \mid C, I)$ plays a critical role in the inference problem since it essentially compares predictions made using the hypothesized circle parameters to the observed data. A naïve light sensor model would predict that if the sensor was centered on a black region it would return a small number, and if the sensor was centered on a white region it would return a large number. A more accurate light sensor model would take into account the SSF of the light sensor, perform an SSF-weighted integral of the hypothesized albedo field, and compare this to the recorded sensor data. These two models of the light sensor will be discussed in detail in the next section.

The robot not only makes inferences from data but also designs its own experiments by autonomously deciding where to take subsequent measurements [1–6]. This can be viewed in terms of Bayesian experimental design [7–13], where the Shannon entropy [14] is employed as the utility function to decide where to take the next measurement. In short, the Bayesian Inference Engine samples 50 circles from the posterior probability using the nested sampling algorithm [15, 16]. Since these samples may consist of replications, they are further diversified by taking approximately 1000 Metropolis-Hastings steps [17, 18]. Figure 2 illustrates the robot's machine learning system's view of the playing field after several measurements have been recorded. The 50 circles sampled from the posterior probability (blue circles) reflect what the robot knows about the white circle (not shown) given the light sensor data that it has collected. The algorithm then considers a fine grid of potential measurement locations on the playing field. At each potential measurement location the 50 sampled circles are queried using the likelihood function to produce 50 samples of what measurement could be expected at that location given each hypothesized circle. This results in a set of potential measurement values at each measurement location. The Shannon entropy of the set of potential measurements at each location is computed, which results in an entropy map, illustrated in Figure 2 as the copper-toned coloration upon which the blue circles lie.



Figure 2: This figure illustrates the robot's machine learning system's view of the playing field using the naïve light sensor model. The axes label playing field coordinates in LEGO distance units. The previously obtained measurement locations used to obtain light sensor data are indicated by the black and white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, such that brighter areas indicate the measurement locations that promise to provide greater information about the circle to be characterized. Note that the low-entropy area bounded by the white squares indicates that this region is probably inside the white circle and that measurements made here will not be as informative as measurements made elsewhere. The dark jagged edges at the bottom of the colored high-entropy regions reflect the boundary between the playing field and the region that is outside of the robotic arm's reach.

The set of measurement locations with the greatest entropy are identified, and the next measurement location is randomly chosen from that set. Thus the likelihood function not only affects the inferences about the circles made by the machine learning algorithm but also affects the entropy of the measurement locations, which guides further exploration.
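A minimal sketch of this selection step is shown below, assuming that posterior samples of the circle and a predictive sensor model are already available (Python for illustration; the histogram-based entropy estimate and all names are our own choices, not the authors' code).

```python
import numpy as np

def next_measurement_location(candidates, sampled_circles, predict_reading,
                              bins=10, rng=None):
    """Select the next measurement location by maximizing the Shannon entropy
    of the readings predicted at each candidate location.
    candidates: list of (x, y) grid locations on the playing field
    sampled_circles: e.g., 50 circles (xo, yo, ro) drawn from the posterior
    predict_reading: callable (x, y, circle) -> predicted sensor reading,
                     wrapping either the naive or the SSF-based sensor model"""
    rng = rng or np.random.default_rng()
    entropies = []
    for (x, y) in candidates:
        preds = np.array([predict_reading(x, y, c) for c in sampled_circles])
        counts, _ = np.histogram(preds, bins=bins)
        p = counts[counts > 0] / preds.size
        entropies.append(-np.sum(p * np.log2(p)))    # Shannon entropy in bits
    entropies = np.array(entropies)
    best = np.flatnonzero(entropies == entropies.max())
    return candidates[rng.choice(best)]              # random choice among ties
```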

In the next section we describe the two models of the light sensor and their corresponding likelihood functions. The efficacy of the sensor model will be quantified by the average number of measurements the robot needs to make to estimate the circle parameters within a given precision.

2.4. Models Describing a Light Sensor. In this section we discuss two models of a light sensor and indicate precisely how they are integrated into the likelihood function used by both the Bayesian Inference and Inquiry Engines.

2.4.1. The Naïve Likelihood. A naïve light sensor model would predict that if the sensor was centered on a black region (surface albedo of zero) the sensor would return a small number on average, and if it were centered on a white region (surface albedo of unity) it would return a large number on average. Of course, there are expected to be noise variations from numerous sources, such as uneven lighting of the surface, minor variations in albedo, and noise in the sensor itself. So one might expect that there is some expected squared deviation from the two mean sensor values for the white and black cases. For this reason we model the expected sensor response with a Gaussian distribution with mean $\mu_B$ and standard deviation $\sigma_B$ for the black surface and a Gaussian distribution with mean $\mu_W$ and standard deviation $\sigma_W$ for the white surface. The likelihood of a measurement $d_i$ at location $(x_i, y_i)$ corresponding to the naïve light sensor model, $\Pr_{\text{naive}}((d_i, (x_i, y_i)) \mid C, I)$, can be written compactly as
$$\Pr_{\text{naive}}((d_i, (x_i, y_i)) \mid C, I) = \begin{cases} \left(2\pi\sigma_W^2\right)^{-1/2} \exp\!\left[-\dfrac{(\mu_W - d_i)^2}{2\sigma_W^2}\right] & \text{for } D((x_i, y_i), (x_o, y_o)) \le r_o, \\[2ex] \left(2\pi\sigma_B^2\right)^{-1/2} \exp\!\left[-\dfrac{(\mu_B - d_i)^2}{2\sigma_B^2}\right] & \text{for } D((x_i, y_i), (x_o, y_o)) > r_o, \end{cases} \tag{6}$$
where $C = \{(x_o, y_o), r_o\}$ represents the parameters of the hypothesized circle and $D((x_i, y_i), (x_o, y_o))$ is the Euclidean distance given in (3). The joint likelihood for $N$ independent measurements is found by taking the product of the $N$ single-measurement likelihoods. In a practical experiment the means $\mu_B$ and $\mu_W$ and the standard deviations $\sigma_B$ and $\sigma_W$ can be easily estimated by sampling known black and white regions several times using the light sensor.
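As a concrete illustration, the following sketch evaluates the joint log-likelihood corresponding to (6) for a set of measurements (the use of logarithms and all names are our own choices; the means and standard deviations would be estimated beforehand from known black and white regions).

```python
import numpy as np

def naive_log_likelihood(data, locations, circle, mu_W, sigma_W, mu_B, sigma_B):
    """Joint log-likelihood of (6) for N independent measurements.
    data: N recorded intensities d_i (LEGO units)
    locations: (N, 2) array of measurement locations (x_i, y_i)
    circle: hypothesized circle parameters C = (xo, yo, ro)"""
    xo, yo, ro = circle
    dist = np.hypot(locations[:, 0] - xo, locations[:, 1] - yo)
    inside = dist <= ro                             # which readings fall inside C
    mu = np.where(inside, mu_W, mu_B)               # expected mean reading
    sigma = np.where(inside, sigma_W, sigma_B)      # expected standard deviation
    return np.sum(-0.5 * np.log(2.0 * np.pi * sigma**2)
                  - (np.asarray(data) - mu)**2 / (2.0 * sigma**2))
```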

2.5. The SSF Likelihood. A more accurate likelihood can be developed by taking into account the fact that the photodiode performs a weighted integral of the light arriving from a spatially distributed region within its field of view, the weights being described by the spatial sensitivity function (SSF) of the sensor. Since the SSF of the light sensor could be arbitrarily complex, with many local peaks, but is expected to decrease to zero far from the sensor, we characterize it using a mixture of Gaussians (MoG) model, which we describe in this section.

The response of the sensor, situated a fixed distance above the point $(x_i, y_i)$, to a known black-and-white pattern can be modeled in the lab frame by
$$M(x_i, y_i) = I_{\min} + (I_{\max} - I_{\min})\, R(x_i, y_i), \tag{7}$$
where $I_{\min}$ and $I_{\max}$ are the observed intensities for a completely black surface (surface albedo of zero) and a completely white surface (surface albedo of unity), respectively, and $R$ is a scalar response function, varying between zero and one, that depends both on the SSF and on the surface albedo [19]. The minimum intensity $I_{\min}$ acts as an offset, and $(I_{\max} - I_{\min})$ serves to scale the response to LEGO units.

The response of the sensor is described by $R(x_i, y_i)$, which is a convolution of the sensor SSF and the surface albedo $S(x, y)$, given by
$$R(x_i, y_i) = \int dx\, dy\; \mathrm{SSF}(x - x_i, y - y_i)\, S(x, y), \tag{8}$$
where the SSF is defined so that the convolution with a completely white surface results in a response of unity. In practice we approximate this integral as a sum over a pixelated grid with 1 mm square pixels:
$$R(x_i, y_i) = \sum_{x, y} \mathrm{SSF}(x - x_i, y - y_i)\, S(x, y). \tag{9}$$


We employ a mixture of Gaussians (MoG) as a parameterized model to describe the SSF in the sensor's frame coordinates $(x', y') = (x - x_i, y - y_i)$:
$$\mathrm{SSF}(x', y') = \frac{1}{K} \sum_{n=1}^{N} a_n \exp\!\left[-\left(A_n (x' - u'_n)^2 + B_n (y' - v'_n)^2 + 2 C_n (x' - u'_n)(y' - v'_n)\right)\right], \tag{10}$$
where $(u'_n, v'_n)$ denotes the center of the $n$th two-dimensional Gaussian with amplitude $a_n$ and covariance matrix elements given by $A_n$, $B_n$, and $C_n$. The constant $K$ denotes a normalization factor, which ensures that the SSF integrates to unity [19]. The model is sufficiently general that one could vary the number of Gaussians to any number, although we found that in modeling this light sensor testing models with $N$ varying from $N = 1$ to $N = 4$ was sufficient. The MoG model results in a set of six parameters to be estimated for each Gaussian:
$$\theta_n = \{a_n, u'_n, v'_n, A_n, B_n, C_n\}, \tag{11}$$
where the subscript $n$ is used to denote the $n$th Gaussian in the set. These must be estimated along with the two intensity parameters $I_{\min}$ and $I_{\max}$ in (7), so that an MoG model with $N$ Gaussians consists of $6N + 2$ parameters to be estimated.
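To make (10) concrete, the sketch below evaluates an MoG SSF on a pixel grid and normalizes the discrete field so that it sums to one (the discrete counterpart of the constant K; the parameter layout is our own choice, not the authors' code).

```python
import numpy as np

def mog_ssf(grid_x, grid_y, gaussians):
    """Mixture-of-Gaussians SSF of (10) evaluated on a sensor-frame grid.
    grid_x, grid_y: 2D arrays of sensor-frame coordinates (x', y') in mm
    gaussians: list of dicts with keys a, u, v, A, B, C (one per Gaussian)"""
    ssf = np.zeros_like(grid_x, dtype=float)
    for g in gaussians:
        dx, dy = grid_x - g["u"], grid_y - g["v"]
        ssf += g["a"] * np.exp(-(g["A"] * dx**2 + g["B"] * dy**2
                                 + 2.0 * g["C"] * dx * dy))
    return ssf / ssf.sum()        # discrete normalization (the role of 1/K)

# Example: a single Gaussian on a 1 mm grid, offset 2 mm from the package center.
gx, gy = np.meshgrid(np.arange(-20.0, 21.0), np.arange(-20.0, 21.0))
ssf = mog_ssf(gx, gy, [{"a": 1.0, "u": 0.0, "v": 2.0,
                        "A": 0.02, "B": 0.01, "C": 0.0}])
```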

We assign a Student-t distribution to the SSF model likelihood, which can be arrived at by assigning a Gaussian likelihood and integrating over the unknown standard deviation $\sigma$ [16]. By defining $D = \{(d_1, (x_1, y_1)), (d_2, (x_2, y_2)), \ldots, (d_N, (x_N, y_N))\}$ and writing the hypothesized circle parameters as $C = \{(x_o, y_o), r_o\}$, we have
$$\Pr_{\mathrm{SSF}}(D \mid C, I) \propto \left[\sum_{i=1}^{N} \left(M(x_i, y_i) - d_i\right)^2\right]^{-(N-1)/2}, \tag{12}$$
where $N$ is the number of measurements made by the robot and the function $M$ defined in (7) relies on both (8) and (10). Note that, in practice, the MoG SSF model described in (10) is used to generate a discrete SSF matrix, which is used in the convolution (9) to compute the likelihood function via the sensor response model $M$ in (7).
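A sketch of the corresponding log-likelihood, reusing the response model sketched above, might look as follows (the proportionality constant in (12) is dropped, which is all that is needed for posterior sampling; names are ours).

```python
import numpy as np

def ssf_log_likelihood(data, predictions):
    """Log of the Student-t likelihood (12), up to an additive constant.
    data: N recorded intensities d_i
    predictions: N modeled intensities M(x_i, y_i) from (7) and (9)"""
    data, predictions = np.asarray(data), np.asarray(predictions)
    ssq = np.sum((predictions - data) ** 2)          # sum of squared residuals
    return -0.5 * (len(data) - 1) * np.log(ssq)
```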

2.6. Data Collection for SSF Estimation. In this section we describe the collection of the light sensor readings in the laboratory that were used to estimate the SSF of the light sensor. The SSF is a function not only of the properties of the photodiode but also of the illuminating LED and of the height above the surface. For this reason, measurements were made at a height of 14 mm above a surface with a known albedo, in a darkened room, to avoid complications due to ambient light and to eliminate the possibility of shadows cast by the sensor or other equipment [19].

The surface, which we refer to as the black-and-white boundary pattern, consisted of two regions: a black region on the left and a white region on the right, separated by a sharp linear boundary, as shown in Figure 3(a).


Figure 3: (a) This figure illustrates the laboratory surface, referred to as the black-and-white boundary pattern, with known albedo, which consisted of two regions: a black region on the left and a white region on the right, separated by a sharp linear boundary. (b) This illustrates the four sensor orientations (0°, 90°, 180°, 270°) used to collect data for the estimation of the SSF, along with the symbols used to indicate the measured values plotted in Figure 5. The top row views the sensors as if looking up from the table, so that the photodiode placement in the sensor package can be visualized. Below these, the gray sensor shapes illustrate how they are oriented looking down at both the sensors and the albedo surface. Measurements were taken as the sensor was incrementally moved in one-millimeter steps in a direction perpendicular to the boundary (as indicated by the arrow at the bottom of the figure), from a position 5 cm to the left of the boundary (well within the completely black region) to a position 5 cm to the right of the boundary (well within the completely white region). This process was repeated four times, with the sensor in each of the four orientations.

Here the surface and sensor are illustrated as if viewing them by looking up at them from below the table surface, so that the placement of the photodiode in the sensor package can be visualized. The lab frame was defined to be at the center of the black-white boundary, so that the surface albedo $S_{\mathrm{BW}}(x, y)$ is given by
$$S_{\mathrm{BW}}(x, y) = \begin{cases} 1 & \text{for } x > 0, \\ 0 & \text{for } x \le 0. \end{cases} \tag{13}$$

Measurements were taken as the sensor was incrementally moved in one-millimeter steps in a direction perpendicular to the boundary, from a position 5 cm to the left of the boundary (well within the completely black region) to a position 5 cm to the right of the boundary (well within the completely white region), resulting in 101 measurements.


[Figure 4 panels (a)–(d): recorded intensities of 40 and 39 for the patterns centered at (0, 0), and 35 and 32 for the patterns shifted to (4, 4) and (−4, 4), respectively.]

Figure 4: Four additional symmetry-breaking albedo patterns were employed. In all cases the sensor was placed in the 0° orientation at the center of the pattern, indicated by (0, 0). In the two lower patterns the center of the white square was shifted diagonally from the center by 4 mm, as indicated by the coordinates (white text) of the corner. The recorded intensity levels are displayed in the white albedo area.

This process was repeated four times, with the sensor in each of the four orientations (see Figure 3 for an explanation), giving a total of 404 measurements using this pattern.

The black-and-white boundary pattern does not provide sufficient information to uniquely infer the SSF, since the sensor may have a response that is symmetric about a line oriented at 45° with respect to the linear boundary. For this reason we employed four additional albedo patterns consisting of black regions with one white quadrant, as illustrated in Figure 4, which resulted in four more measurements. These have surface albedos defined by

$$\begin{aligned} S_a(x, y) &= \begin{cases} 1 & \text{for } x > 0\ \mathrm{mm},\ y > 0\ \mathrm{mm}, \\ 0 & \text{otherwise}, \end{cases} \qquad & S_b(x, y) &= \begin{cases} 1 & \text{for } x < 0\ \mathrm{mm},\ y > 0\ \mathrm{mm}, \\ 0 & \text{otherwise}, \end{cases} \\ S_c(x, y) &= \begin{cases} 1 & \text{for } x > 4\ \mathrm{mm},\ y > 4\ \mathrm{mm}, \\ 0 & \text{otherwise}, \end{cases} \qquad & S_d(x, y) &= \begin{cases} 1 & \text{for } x < -4\ \mathrm{mm},\ y > 4\ \mathrm{mm}, \\ 0 & \text{otherwise}, \end{cases} \end{aligned} \tag{14}$$

where the subscripts relate each albedo function to the pattern illustrated in Figure 4.
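These calibration albedos are simple indicator functions; one way they might be coded (Python, names ours) is sketched below.

```python
import numpy as np

def S_bw(x, y):        # black-and-white boundary pattern of (13)
    return np.where(x > 0.0, 1.0, 0.0)

def S_a(x, y):         # white quadrant: x > 0 mm and y > 0 mm
    return np.where((x > 0.0) & (y > 0.0), 1.0, 0.0)

def S_b(x, y):         # white quadrant: x < 0 mm and y > 0 mm
    return np.where((x < 0.0) & (y > 0.0), 1.0, 0.0)

def S_c(x, y):         # shifted white quadrant: x > 4 mm and y > 4 mm
    return np.where((x > 4.0) & (y > 4.0), 1.0, 0.0)

def S_d(x, y):         # shifted white quadrant: x < -4 mm and y > 4 mm
    return np.where((x < -4.0) & (y > 4.0), 1.0, 0.0)
```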

2.7. Estimating SSF MoG Model Parameters. In this section we describe the application of Bayesian methods to estimate the SSF MoG model parameters. Keep in mind that in this paper we are considering two distinct inference problems: the robot's inferences about a circle and our inferences about the light sensor SSF. Both of these problems rely on making predictions about measured intensities using a light sensor. For this reason, many of these equations will not only look similar to those presented previously but also depend on the same functions mapping modeled albedo fields to predicted sensor responses, which are collectively represented using the generic symbol $D$. It may help to keep in mind that the robot is characterizing a circle, quantified by model parameters represented jointly by the symbol $C$, while we are estimating the SSF of a light sensor, quantified by model parameters represented jointly by the symbol $\theta$ below. With the exception of the evidence, represented by the function $Z$ below (which is not used by the robot in this experiment), all of the probability functions contain the model parameters in their list of arguments, making it clear to which inference problem they refer.

The posterior probability for the SSF MoG model parameters, collectively referred to as $\theta = \{\theta_1, \theta_2, \ldots, \theta_N\}$ for a model consisting of $N$ Gaussians, is given by
$$\Pr(\theta \mid D, I) = \frac{1}{Z} \Pr(\theta \mid I)\,\Pr(D \mid \theta, I), \tag{15}$$
where here $D$ refers to the data collected for the SSF estimation experiment described in the previous section, $I$ refers to our prior information about the SSF (which is that it may have several local peaks and falls off to zero far from the sensor), and $Z$ refers to the evidence, $Z = \Pr(D \mid I)$, which can be found by
$$Z = \int d\theta\, \Pr(\theta \mid I)\,\Pr(D \mid \theta, I). \tag{16}$$

In our earlier discussion, where Bayes' theorem was used to make inferences about circles, the evidence played the role of a normalization factor. Here, since we can consider MoG models with different numbers of Gaussians, and since we integrate over all of the possible values of the parameters, the evidence quantifies the degree to which the hypothesized model order $N$ supports the data. That is, the optimal number of Gaussians to be used in the MoG model of the SSF can be found by computing the evidence.

All five sets of data $D$ described in the previous section were used to compute the posterior probability. These are each indexed by the subscript $i$, so that $D_i$ for $i = 1, 2, 3, 4$ refers to the data collected by scanning the black-and-white boundary pattern with each of the four orientations $\phi_i = 0°, 90°, 180°, 270°$, resulting in $N_i = 101$ measurements for $i = 1, 2, 3, 4$, and the value $i = 5$ refers to the $N_i = 4$ measurements attained using the 0° orientation with the set of four additional patterns.

We assign uniform priors, so that this is essentially a maximum likelihood calculation, with the posterior being proportional to the likelihood. We assign a Student-t distribution to the likelihood, which can be arrived at by assigning a Gaussian likelihood and integrating over the unknown standard deviation $\sigma$ [16]. This can be written as


$$\Pr(D_i \mid \theta, I) \propto \left[\sum_{j=1}^{N_i} \left(M_{ij}(x_{ij}, y_{ij}) - D_i(x_{ij}, y_{ij})\right)^2\right]^{-(N_i - 1)/2}, \tag{17}$$
where $i$ denotes each of the five sets of data and $j$ denotes the $j$th measurement of that set, which was taken at position $(x_{ij}, y_{ij})$. The function $M_{ij}(x, y)$ represents a predicted measurement value obtained from (7) using (9), with the albedo function $S(x, y)$ defined using the albedo pattern $S_{\mathrm{BW}}(x, y)$ with orientation $\phi_i$ for $i = 1, 2, 3, 4$, and the albedo patterns $S_a, S_b, S_c, S_d$ for $i = 5$ and $j = 1, 2, 3, 4$, respectively. As such, the likelihood relies on a difference between predicted and measured sensor responses.

The joint likelihood for the five data sets is found by taking the product of the likelihoods for each data set, since we expect that the standard deviations that were marginalized over to get the Student-t distribution could have been different for each of the five data sets, as they were not all recorded at the same time:
$$\Pr(D \mid \theta, I) = \prod_{i=1}^{5} \Pr(D_i \mid \theta, I). \tag{18}$$

We employed nested sampling [15, 16] to explore the posterior probability since, in addition to providing parameter estimates, it is explicitly designed to perform evidence calculations, which we use to perform model comparison in identifying the most probable number of Gaussians in the MoG model. For each of the four MoG models (number of Gaussians varying from one to four), the nested sampling algorithm was initialized with 300 samples and iterated until the change in consecutive log-evidence values was less than $10^{-8}$. Typically one estimates the mean parameter values by taking an average of the samples weighted by a quantity computed by the nested sampling algorithm, called the logWt in [15, 16]. Here we simply performed a logWt-weighted average of the sampled SSF fields computed using the sampled MoG model parameters (rather than the logWt-weighted average of the MoG model parameters themselves), so the result obtained using an MoG model consisting of a single Gaussian is not strictly a single two-dimensional Gaussian distribution. It is this discretized estimated SSF field matrix that is used directly in the convolution (9) to compute the likelihood functions, as mentioned in the last lines of Section 2.5.
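A sketch of that averaging step is shown below, assuming the nested sampler has already returned the logWt values and the corresponding sampled MoG parameter sets, and reusing the mog_ssf sketch above (the names and the normalization details are ours, not the authors' code).

```python
import numpy as np

def mean_ssf_field(log_weights, sampled_params, grid_x, grid_y):
    """logWt-weighted average of the sampled SSF fields (see [15, 16]).
    log_weights: logWt value for each posterior sample
    sampled_params: one MoG parameter set per sample, as accepted by mog_ssf"""
    w = np.exp(np.asarray(log_weights) - np.max(log_weights))  # avoid underflow
    w /= w.sum()
    fields = np.array([mog_ssf(grid_x, grid_y, p) for p in sampled_params])
    return np.tensordot(w, fields, axes=1)      # weighted sum over the samples
```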

3. Results and Discussion

In this section we present the SSF MoG light sensor model estimated from the laboratory data and evaluate its efficacy by demonstrating a significant improvement in the performance of the autonomous robotic platform.

3.1. Light Sensor SSF MoG Model. The light sensor data collected using the black-and-white boundary pattern are illustrated in Figure 5.


Figure 5: This figure illustrates the intensity measurements $D_1, D_2, D_3, D_4$ from the four sensor orientations, recorded using the black-and-white boundary pattern (recorded intensity in LEGO units versus sensor position in cm). Figure 3 shows the orientations of the sensor corresponding to the symbols used in this figure.

Table 1: A comparison of the tested MoG SSF models and their respective log evidence.

MoG model order | Log evidence | Number of parameters
1 Gaussian | −665.5 ± 0.3 | 6
2 Gaussians | −674.9 ± 0.3 | 12
3 Gaussians | −671.9 ± 0.4 | 18
4 Gaussians | −706.1 ± 0.4 | 24

One can see that the intensity recorded by the sensor increases dramatically as the sensor crosses the boundary from the black region to the white region, but the change is not a step function, which indicates the finite size of the surface area integrated by the sensor. It is this effect that is to be modeled by the SSF. There is an obvious asymmetry between the 90° and 270° orientations, due to the shift of the transition region. In addition, there is a significant difference in the slope of the transition region between the 0°, 180° orientations and the 90°, 270° orientations, indicating a significant difference in the width of the SSF in those directions. Note also that the minimum recorded response is not zero, as the reflectance of the black surface is not completely zero.

The nested sampling algorithm produced the mean SSF fields for each of the MoG models tested, as well as the corresponding log evidence. Table 1, which lists the log evidence computed for each MoG model order, shows that the most probable model is the one consisting of a single two-dimensional Gaussian, which is favored by a factor of about exp(9), meaning that it is about 8000 times more probable than the MoG consisting of two Gaussians. Figure 6 shows the mean SSF fields described by the MoG models of different orders. In all cases the center of the SSF is shifted slightly above the physical center of the sensor package, due to the placement of the photodiode (refer to Figure 1), as indicated by the data in Figure 5.


[Figure 6 panels (a)–(d): mean SSF fields (axes in mm) for the 1-, 2-, 3-, and 4-Gaussian MoG models, with log Z = −665.5 ± 0.3, −674.9 ± 0.3, −671.9 ± 0.4, and −706.1 ± 0.4, respectively.]

Figure 6: This figure illustrates the SSF obtained from each of the four MoG models, along with the corresponding log-evidence values.

In addition, as predicted, one sees that the SSF is wider along the 90°–270° axis than along the 0°–180° axis. Last, Figure 7 demonstrates that the predicted light sensor output shows excellent agreement with the recorded data for the black-and-white boundary pattern.

In the next section we demonstrate that explicit knowledge about how the light sensor integrates light arriving from within its field of view improves the inferences one can make from its output.

3.2. Efficacy of Sensor Model. The mean SSF field obtained using a single two-dimensional Gaussian model (Figure 6(a)) was incorporated into the likelihood function used by the robot's machine learning system. Here we compare the robot's performance in locating and characterizing a circle by observing the average number of measurements necessary to characterize the circle parameters within a precision of 4 mm (which is one-half of the spacing between the holes in the LEGO Technic parts).

The three panels comprising Figure 8(a) illustrate the robot's machine learning system's view of the playing field using the naïve light sensor model. The previously obtained measurement locations used to obtain light sensor data are indicated by the black and white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, which indicates the measurement locations that promise to provide maximal information about the circle to be characterized. The low-entropy area surrounding the white square indicates that the region is probably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhere. Note that the circles partition the plane and that each partition has a uniform entropy.



Figure 7: A comparison of the observed data (red) with the predictions (black) made by the SSF field estimated using the single two-dimensional Gaussian MoG model, shown as intensity (LEGO units) versus the position of the center of the sensor (cm).

All measurement locations within that partition, or within any other partition sharing the same entropy, stand to be equally informative. Here it is the fact that the shape is known to be a circle that is driving the likelihood function.

In contrast, the three panels comprising Figure 8(b) illustrate the playing field using the more accurate SSF model. Here one can see that the entropy is higher along the edges of the sampled circles. This indicates that the circle edges promise to provide more information than the centers of the partitioned regions. This is because the SSF model enables one to detect not only whether the light sensor is situated above the circle's edge but also how much of the SSF overlaps with the white circle. That is, it helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model) but also the extent to which the sensor is on the edge of the circle. The additional information provided about the functioning of the light sensor translates directly into additional information about the albedo that gives rise to the sensor output.

This additional information can be quantified by observing how many measurements the robot is required to take to obtain estimates of the circle parameters within the same precision for each light sensor model. Our experiments revealed that on average it takes 19 ± 1.8 measurements using the naïve light sensor model, compared to an average of 11.6 ± 3.9 measurements for the more accurate SSF light sensor model.

4. Conclusion

The quality of the inferences one makes from a sensor depends not only on the quality of the data returned by the sensor but also on the information one possesses about the sensor's performance. In this paper we have demonstrated via a case study how more precisely modeling a sensor's performance can improve the inferences one can make from its data. In this case, we demonstrated that by more precisely modeling its light sensor, a robot can reduce the average number of measurements needed to make the same inferences from 19 ± 1.8 to 11.6 ± 3.9, roughly a 40% reduction.

This paper demonstrates how a machine learning system that employs Bayesian inference (and inquiry) relies on the likelihood function of the data given the hypothesized model parameters. Rather than simply representing a noise model, the likelihood function quantifies the probability that a hypothesized situation could have given rise to the recorded data. By incorporating more information about the sensors (or equipment) used to record the data, one naturally incorporates this information into the posterior probability on which one's inferences are based.

This is made even more apparent by a careful study of the experimental design problem that this particular robotic system is designed to explore. For example, it is easy to show that, using the naïve light sensor model, the entropy distribution for a proposed measurement location depends solely on the number of sampled circles for which the location is in the interior of the circle and the number of sampled circles for which the location is exterior to the circle. Given that we represented the posterior probability by sampling 50 circles, the maximum entropy occurs when the proposed measurement location is inside 25 circles (and outside 25 circles). As the robot's parameter estimates converge, one can show that the system is simply performing a binary search by asking "yes" or "no" questions, which implies that each measurement results in one bit of information. However, in the case where the robot employs an SSF model of the light sensor, the question the robot is essentially asking is more detailed: "to what degree does the circle overlap the light sensor's SSF?" The answer to such a question tends to provide more information, which significantly improves system performance. One can estimate the information gain achieved by employing the SSF model. Consider that the naïve model reveals that estimating the circle's position and radius with a precision of 4 mm, given the prior information about the circle and the playing field, requires 25 bits of information. The experiment using the SSF model requires on average 11.6 measurements, which implies that on average each measurement obtained using the SSF model provides about 25/11.6 ≈ 2.15 bits of information. One must keep in mind, however, that this is due to the fact that the SSF model is being used not only to infer the circle parameters from the data but also to select the measurement locations.
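Written out explicitly, these two information bookkeeping steps (a restatement of the numbers above, not a new result) are
$$H = -\tfrac{25}{50}\log_2\tfrac{25}{50} - \tfrac{25}{50}\log_2\tfrac{25}{50} = 1\ \text{bit per measurement (naïve model)},$$
$$\frac{25\ \text{bits}}{11.6\ \text{measurements}} \approx 2.15\ \text{bits per measurement (SSF model)}.$$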

Because the method presented here is based on a very general inferential framework, these methods can easily be applied to other types of sensors and equipment in a wide variety of situations. If one has designed a robotic machine learning system to rely on likelihood functions, then sensor models can be incorporated in a more or less plug-and-play fashion. This not only promises to improve the quality of robotic systems forced to rely on lower-quality sensors but also opens the possibility of calibration on the fly, by updating sensor models as data are continually collected.



Figure 8: (a) These three panels illustrate the robot's machine learning system's view of the playing field using the naïve light sensor model as the system progresses through the first three measurements. The previously obtained measurement locations used to obtain light sensor data are indicated by the black-and-white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, which indicates the measurement locations that promise to provide maximal information about the circle to be characterized. Note that the low-entropy area surrounding the white square indicates that the region is probably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhere. The entropy map in Figure 2 shows the same experiment at a later stage, after seven measurements have been recorded. (b) These three panels illustrate the robot's machine learning system's view of the playing field using the more accurate SSF light sensor model. Note that the entropy map reveals the circle edges to be highly informative. This is because the SSF model helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model on the left) but also the extent to which the sensor is on the edge of the circle.

Conflict of Interests

The authors have no conflict of interests to report.

Acknowledgments

This research was supported in part by the University at Albany Faculty Research Awards Program (Knuth) and the University at Albany Benevolent Research Grant Award (Malakar). The authors would like to thank Scott Frasso and Rotem Guttman for their assistance in interfacing MatLab to the LEGO Brick. They would also like to thank Phil Erner and A. J. Mesiti for their preliminary efforts on the robotic arm experiments and sensor characterization.

References

[1] K. H. Knuth, P. M. Erner, and S. Frasso, "Designing intelligent instruments," in Proceedings of the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, K. Knuth, A. Caticha, J. Center, A. Giffin, and C. Rodriguez, Eds., vol. 954, pp. 203–211, AIP, July 2007.

[2] K. H. Knuth and J. L. Center Jr., "Autonomous science platforms and question-asking machines," in Proceedings of the 2nd International Workshop on Cognitive Information Processing (CIP '10), pp. 221–226, June 2010.

[3] K. H. Knuth and J. L. Center, "Autonomous sensor placement," in Proceedings of the IEEE International Conference on Technologies for Practical Robot Applications, pp. 94–99, November 2008.


[4] K. H. Knuth, "Intelligent machines in the twenty-first century: foundations of inference and inquiry," Philosophical Transactions of the Royal Society A, vol. 361, no. 1813, pp. 2859–2873, 2003.

[5] N. K. Malakar and K. H. Knuth, "Entropy-based search algorithm for experimental design," in Proceedings of the 30th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, A. M. Djafari, J. F. Bercher, and P. Bessiere, Eds., vol. 1305, pp. 157–164, AIP, July 2010.

[6] N. K. Malakar, K. H. Knuth, and D. J. Lary, "Maximum joint entropy and information-based collaboration of automated learning machines," in Proceedings of the 31st International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goyal, A. Giffin, K. H. Knuth, and E. Vrscay, Eds., vol. 1443, pp. 230–237, AIP, 2012.

[7] D. V. Lindley, "On a measure of the information provided by an experiment," The Annals of Mathematical Statistics, vol. 27, no. 4, pp. 986–1005, 1956.

[8] V. V. Fedorov, Theory of Optimal Experiments, Probability and Mathematical Statistics, Academic Press, 1972.

[9] K. Chaloner and I. Verdinelli, "Bayesian experimental design: a review," Statistical Science, vol. 10, pp. 273–304, 1995.

[10] P. Sebastiani and H. P. Wynn, "Bayesian experimental design and Shannon information," in Proceedings of the Section on Bayesian Statistical Science, pp. 176–181, 1997, http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.56.6037.

[11] P. Sebastiani and H. P. Wynn, "Maximum entropy sampling and optimal Bayesian experimental design," Journal of the Royal Statistical Society B, vol. 62, no. 1, pp. 145–157, 2000.

[12] T. J. Loredo and D. F. Chernoff, "Bayesian adaptive exploration," in Proceedings of the 23rd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, G. Erickson and Y. Zhai, Eds., pp. 330–346, AIP, August 2003.

[13] R. Fischer, "Bayesian experimental design—studies for fusion diagnostics," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 76–83, November 2004.

[14] C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Chicago, Ill, USA, 1949.

[15] J. Skilling, "Nested sampling," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 395–405, 2004.

[16] D. Sivia and J. Skilling, Data Analysis: A Bayesian Tutorial, Oxford University Press, New York, NY, USA, 2nd edition, 2006.

[17] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, "Equation of state calculations by fast computing machines," The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.

[18] W. K. Hastings, "Monte Carlo sampling methods using Markov chains and their applications," Biometrika, vol. 57, no. 1, pp. 97–109, 1970.

[19] N. K. Malakar, A. J. Mesiti, and K. H. Knuth, "The spatial sensitivity function of a light sensor," in Proceedings of the 29th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goggans and C. Y. Chan, Eds., vol. 1193 of AIP Conference Proceedings, pp. 352–359, American Institute of Physics, Oxford, Mass, USA, July 2009.


2 Journal of Sensors

is quantified by the average number of measurements the robot needs to make to estimate the circle parameters within a given precision.

There are two aspects to this work. First is the characterization of the light sensor, and second is the incorporation of the light sensor model into the likelihood function of the robot's machine learning system in a demonstration of improved efficacy. In Section 2 we describe the robot, its light sensor, the experiment that it is designed to perform, and the machine learning system employed. We then discuss the methods used to collect data from the light sensor, the models used to describe the light sensor, their incorporation into the machine learning system, and the methods used to estimate the model parameters and select the model order. Section 3 describes the resulting SSF model and the results of the experiments comparing the naïve light sensor model to the more accurate SSF model. In Section 4 we summarize our results, which demonstrate how, by incorporating a more accurate model of a sensor, one can improve its efficacy.

2. Materials and Methods

In this section we begin by discussing various aspects of the robotic test bed, followed by a discussion of the techniques used to characterize the light sensor.

2.1. Robotic Arm Test Bed. The robotic arm is designed to perform studies in autonomous experimental design [1, 2]. The robot itself is constructed using the LEGO NXT Mindstorms system (Figure 1). The LEGO Mindstorms system was utilized in part to demonstrate that high-quality autonomous systems can be achieved when using lower quality equipment if the machine learning and data analysis algorithms are handled carefully. It employs one motor to allow it to rotate about a vertical axis, indicated by the black line in the top center of the figure, and two motors to extend and lower the arm about the joints located at the positions indicated by the short arrows. The motors are controlled directly by the LEGO brick, which is commanded via Bluetooth by a Dell Latitude laptop computer running the robot's machine learning system, which is programmed in MatLab (MathWorks Inc.). The LEGO light sensor is attached to the end of the arm (indicated by the long arrow in Figure 1 and displayed in the inset at the upper right). Based on commands issued by the laptop, the robotic arm can deploy the light sensor to any position within its reach on the playing field. The light sensor is lowered to an average height of 14 mm above the surface before taking a measurement. The arm is designed using a trapezoidal construction that maintains the sensor's orientation so that it is always aimed at nadir, normal to the surface, despite the extension of the arm.

The LEGO light sensor (LEGO Part 9844) consists of a photodiode-LED pair. The white circle (visible in the inset of Figure 1) is the photodiode and the red circle is the illuminating LED. Note that they are separated by a narrow plastic ridge, which prevents the LED from shining directly into the photodiode. This ridge, along with the plastic lenses and the presence of the illuminating LED, affects the spatial sensitivity of the sensor. When activated, the light sensor flashes for a brief instant and measures the intensity of the reflected light. The photodiode and its support circuitry are connected to the sensor port of the LEGO Brick (LEGO Part 9841), which runs on a 32-bit ARM7 ATMEL microcontroller. The measured intensities are converted by internal software running on the ATMEL microcontroller to a scale of 1 to 100, which we refer to as LEGO units. The light sensor integrates the light arriving from a spatially distributed region within its field of view. The spatial sensitivity of the sensor to light sources within its field of view is described by the spatial sensitivity function (SSF). This function is unknown, but if it were known, one could weight the surface albedo field with the SSF and integrate to obtain an estimate of the recorded intensity of the reflected light.

Figure 1: A photograph showing the robotic arm along with the circle it is programmed to characterize. The robotic arm is constructed using the LEGO NXT Mindstorms system. It employs one motor to allow it to rotate about a vertical axis, indicated by the black line in the top center of the image, and two motors to extend and lower the arm about the joints located at the positions indicated by the short arrows. The LEGO light sensor, also shown in the inset, is attached to the end of the arm, as indicated by the long arrow.

2.2. The Circle Characterization Experiment. The robotic arm is designed to deploy the light sensor to take measurements of the surface albedo at locations within a playing field of dimensions approximately 131 × 65 LEGO distance units (1048 mm × 520 mm) within an automated experimental design paradigm [1–3]. The robot is programmed with a set of hypotheses of what shapes it could find placed on the playing field. Instead of being programmed with a specific set of strategies for characterizing the hypothesized shapes, the robot utilizes a generalized Bayesian Inference Engine coupled to an Inquiry Engine to make inferences about the hypotheses based on the recorded data and to use uncertainties in its inferences to drive further exploration by autonomously selecting a new measurement location that promises to provide the maximum amount of relevant information about the problem.

In this experiment the robotic arm is instructed that there is a white circle of unknown radius arbitrarily placed on the black field. Such an instruction is encoded by providing the robot with a model of the surface albedo consisting of three parameters: the center location (x_o, y_o) and radius r_o, written jointly as

\[
C = \{(x_o, y_o),\ r_o\},
\tag{1}
\]

so that, given a measurement location (x_i, y_i), the albedo S is expected to be

\[
S(x_i, y_i; C) =
\begin{cases}
1 & \text{if } D((x_i, y_i), (x_o, y_o)) \le r_o,\\
0 & \text{if } D((x_i, y_i), (x_o, y_o)) > r_o,
\end{cases}
\tag{2}
\]

where

\[
D((x_i, y_i), (x_o, y_o)) = \sqrt{(x_i - x_o)^2 + (y_i - y_o)^2}
\tag{3}
\]

is the Euclidean distance between the measurement location (x_i, y_i) and the center of the circle (x_o, y_o), and an albedo of 1 signifies that the surface is white and 0 signifies that the surface is black. Precisely how these expectations are used to make inferences from data is explained in the next section. Keep in mind that while the circle's precise radius and position are unknown, the robot has been provided with limited prior information about the allowable range of radii and positions.

Again, it is important to note that the robot does not scan the surface to solve the problem, nor does it try to find three points along the edge of the circle. Instead, it employs a general system, which works for any expected shape or set of shapes, that autonomously and intelligently determines optimal measurement locations based both on what it knows and on what it does not know. The number of measurements needed to characterize all three circle parameters within the desired accuracy is a measure of the efficiency of the experimental procedure.

2.3. The Machine Learning System. The machine learning system employs a Bayesian Inference Engine to make inferences about the circle parameters given the recorded light intensities, as well as an Inquiry Engine designed to use the uncertainties in the circle parameter estimates to autonomously select measurement locations that promise to provide the maximum amount of relevant information about the problem.

The core of the Bayesian Inference Engine is centered around the computation of the posterior probability Pr(C | D, I) of the albedo model parameters C in (1), given the light sensor recordings (data) d_i recorded at locations (x_i, y_i), which we write compactly as

\[
D = \{(d_1, (x_1, y_1)), \ldots, (d_N, (x_N, y_N))\},
\tag{4}
\]

and any additional prior information I. Bayes' theorem allows one to write the posterior probability as a function of three related probabilities:

\[
\Pr(C \mid D, I) = \Pr(C \mid I)\,\frac{\Pr(D \mid C, I)}{\Pr(D \mid I)},
\tag{5}
\]

where the right-hand side consists of the product of the prior probability of the circle parameters, Pr(C | I), which describes what is known about the circle before any sensor data are considered, with a ratio of probabilities that are sensor-data dependent. It is in this sense that Bayes' theorem represents a learning algorithm, since what is known about the circle parameters before the data are considered (prior probability) is modified by the recorded sensor data, resulting in a quantification of what is known about the circle parameters after the data are considered (posterior probability). The probability in the numerator on the right is the likelihood Pr(D | C, I), which quantifies the probability that the sensor data could have resulted from the hypothesized circle parameterized by C. The probability in the denominator is the marginal likelihood, or the evidence, which here acts as a normalization factor. Later, when estimating the SSF of the light sensor (which is a different problem), the evidence, which can be written as the integral of the product of the prior and the likelihood over all possible model parameters, will play a critical role.

The likelihood term Pr(D | C, I) plays a critical role in the inference problem, since it essentially compares predictions made using the hypothesized circle parameters to the observed data. A naïve light sensor model would predict that if the sensor was centered on a black region it would return a small number, and if the sensor was centered on a white region it would return a large number. A more accurate light sensor model would take into account the SSF of the light sensor, perform an SSF-weighted integral of the hypothesized albedo field, and compare this to the recorded sensor data. These two models of the light sensor will be discussed in detail in the next section.

The robot not only makes inferences from data but also designs its own experiments by autonomously deciding where to take subsequent measurements [1–6]. This can be viewed in terms of Bayesian experimental design [7–13], where the Shannon entropy [14] is employed as the utility function to decide where to take the next measurement. In short, the Bayesian Inference Engine samples 50 circles from the posterior probability using the nested sampling algorithm [15, 16]. Since these samples may consist of replications, they are further diversified by taking approximately 1000 Metropolis-Hastings steps [17, 18]. Figure 2 illustrates the robot's machine learning system's view of the playing field after several measurements have been recorded. The 50 circles sampled from the posterior probability (blue circles) reflect what the robot knows about the white circle (not shown) given the light sensor data that it has collected. The algorithm then considers a fine grid of potential measurement locations on the playing field. At each potential measurement location, the 50 sampled circles are queried using the likelihood function to produce 50 samples of what measurement could be expected at that location given each hypothesized circle. This results in a set of potential measurement values at each measurement location. The Shannon entropy of the set of potential measurements at each location is computed, which results in an entropy map, illustrated in Figure 2 as the copper-toned coloration upon which the blue circles lie. The set of measurement locations with the greatest entropy are identified, and the next measurement location is randomly chosen from that set. Thus the likelihood function not only affects the inferences about the circles made by the machine learning algorithm but also affects the entropy of the measurement locations, which guides further exploration. In the next section we describe the two models of the light sensor and their corresponding likelihood functions.

Figure 2: This figure illustrates the robot's machine learning system's view of the playing field using the naïve light sensor model. The axes label playing field coordinates in LEGO distance units. The previously obtained measurement locations used to obtain light sensor data are indicated by the black and white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, such that brighter areas indicate the measurement locations that promise to provide greater information about the circle to be characterized. Note that the low entropy area bounded by the white squares indicates that this region is probably inside the white circle and that measurements made here will not be as informative as measurements made elsewhere. The dark jagged edges at the bottom of the colored high entropy regions reflect the boundary between the playing field and the region that is outside of the robotic arm's reach.

The efficacy of the sensor model will be quantified by the average number of measurements the robot needs to make to estimate the circle parameters within a given precision.
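The entropy-driven selection of the next measurement location described above can be sketched as follows. This is a simplified illustration (our own naming, not the authors' code) for the naïve sensor model, where, as noted in Section 4, the entropy at a candidate location reduces to the binary entropy of the fraction of sampled circles that contain that location.

% Entropy map over a grid of candidate locations, given sampled circles
% (columns of Cs = [xo; yo; ro]).  With the naive two-valued prediction, the
% sample entropy reduces to a binary entropy of the "inside" fraction.
function H = entropy_map(Xg, Yg, Cs)
    ns = size(Cs, 2);
    H = zeros(size(Xg));
    for k = 1:numel(Xg)
        d = hypot(Xg(k) - Cs(1,:), Yg(k) - Cs(2,:));
        p = sum(d <= Cs(3,:)) / ns;              % fraction of circles containing the point
        if p == 0 || p == 1
            H(k) = 0;
        else
            H(k) = -p*log2(p) - (1-p)*log2(1-p); % binary entropy in bits
        end
    end
end
% The next location is then drawn at random from the grid points with maximal H.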

2.4. Models Describing a Light Sensor. In this section we discuss two models of a light sensor and indicate precisely how they are integrated into the likelihood function used by both the Bayesian Inference and Inquiry Engines.

2.4.1. The Naïve Likelihood. A naïve light sensor model would predict that if the sensor was centered on a black region (surface albedo of zero) the sensor would return a small number on average, and if it were centered on a white region (surface albedo of unity) it would return a large number on average. Of course, there are expected to be noise variations from numerous sources, such as uneven lighting of the surface, minor variations in albedo, and noise in the sensor itself. So one might expect that there is some expected squared deviation from the two mean sensor values for the white and black cases. For this reason we model the expected sensor response with a Gaussian distribution with mean μ_B and standard deviation σ_B for the black surface and a Gaussian distribution with mean μ_W and standard deviation σ_W for the white surface. The likelihood of a measurement d_i at location (x_i, y_i) corresponding to the naïve light sensor model, Pr_naive((d_i, (x_i, y_i)) | C, I), can be written compactly as

\[
\Pr{}_{\text{naive}}\big((d_i, (x_i, y_i)) \mid C, I\big) =
\begin{cases}
(2\pi\sigma_W^2)^{-1/2} \exp\!\left[-\dfrac{(\mu_W - d_i)^2}{2\sigma_W^2}\right] & \text{for } D((x_i, y_i), (x_o, y_o)) \le r_o,\\[2ex]
(2\pi\sigma_B^2)^{-1/2} \exp\!\left[-\dfrac{(\mu_B - d_i)^2}{2\sigma_B^2}\right] & \text{for } D((x_i, y_i), (x_o, y_o)) > r_o,
\end{cases}
\tag{6}
\]

where C = {(x_o, y_o), r_o} represents the parameters of the hypothesized circle and D((x_i, y_i), (x_o, y_o)) is the Euclidean distance given in (3). The joint likelihood for N independent measurements is found by taking the product of the N single-measurement likelihoods. In a practical experiment the means μ_B and μ_W and standard deviations σ_B and σ_W can be easily estimated by sampling known black and white regions several times using the light sensor.
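A minimal MatLab sketch of the corresponding log-likelihood, summed over N independent measurements as described above, might look as follows (the function and variable names are our own):

% Naive log-likelihood of measurements d taken at locations (x, y), given the
% circle C = [xo, yo, ro] and the calibrated Gaussian parameters
% muW, sigW (white) and muB, sigB (black); see (6).
function logL = loglike_naive(C, d, x, y, muW, sigW, muB, sigB)
    inside = hypot(x - C(1), y - C(2)) <= C(3);
    mu  = muB  * ones(size(d));  mu(inside)  = muW;
    sig = sigB * ones(size(d));  sig(inside) = sigW;
    logL = sum(-0.5*log(2*pi*sig.^2) - (mu - d).^2 ./ (2*sig.^2));
end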

2.5. The SSF Likelihood. A more accurate likelihood can be developed by taking into account the fact that the photodiode performs a weighted integral of the light arriving from a spatially distributed region within its field of view, the weights being described by the spatial sensitivity function (SSF) of the sensor. Since the SSF of the light sensor could be arbitrarily complex, with many local peaks, but is expected to decrease to zero far from the sensor, we characterize it using a mixture of Gaussians (MoG) model, which we describe in this section.

The sensor's response, situated a fixed distance above the point (x_i, y_i), to a known black-and-white pattern can be modeled in the lab frame by

\[
M(x_i, y_i) = I_{\min} + (I_{\max} - I_{\min})\, R(x_i, y_i),
\tag{7}
\]

where I_min and I_max are observed intensities for a completely black surface (surface albedo of zero) and a completely white surface (surface albedo of unity), respectively, and R is a scalar response function varying between zero and one that depends both on the SSF and the surface albedo [19]. The minimum intensity I_min acts as an offset, and (I_max − I_min) serves to scale the response to LEGO units.

The response of the sensor is described by R(x_i, y_i), which is a convolution of the sensor SSF and the surface albedo S(x, y), given by

\[
R(x_i, y_i) = \int dx\, dy\ \mathrm{SSF}(x - x_i, y - y_i)\, S(x, y),
\tag{8}
\]

where the SSF is defined so that the convolution with a completely white surface results in a response of unity. In practice we approximate this integral as a sum over a pixelated grid with 1 mm square pixels:

\[
R(x_i, y_i) = \sum_{x, y} \mathrm{SSF}(x - x_i, y - y_i)\, S(x, y).
\tag{9}
\]
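On the discrete grid, (9) amounts to centering the SSF matrix on the sensor position and summing its elementwise product with the underlying albedo patch. The sketch below is our own illustration, assuming a 1 mm grid, an odd-sized square SSF matrix that already sums to one over a white surface, and a sensor position far enough from the image border that the footprint stays inside the albedo image:

% Discrete sensor response (9): SSF-weighted sum of the albedo S around (ix, iy).
% S is the albedo image on a 1 mm grid, ssf is an odd-sized square SSF matrix
% that sums to one, and (ix, iy) are the pixel indices of the sensor position.
function R = sensor_response(S, ssf, ix, iy)
    h = (size(ssf, 1) - 1) / 2;            % half-width of the SSF footprint
    patch = S(iy-h:iy+h, ix-h:ix+h);       % albedo under the sensor footprint
    R = sum(sum(ssf .* patch));            % response in [0, 1]
end
% The predicted intensity in LEGO units then follows (7):
%   M = Imin + (Imax - Imin) * R;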


We employ a mixture of Gaussians (MoG) as a parameterized model to describe the SSF in the sensor's frame coordinates (x', y') = (x − x_i, y − y_i):

\[
\mathrm{SSF}(x', y') = \frac{1}{K} \sum_{n=1}^{N} a_n
\exp\!\left[ -\left( A_n (x' - u'_n)^2 + B_n (y' - v'_n)^2 + 2 C_n (x' - u'_n)(y' - v'_n) \right) \right],
\tag{10}
\]

where (u'_n, v'_n) denotes the center of the nth two-dimensional Gaussian with amplitude a_n and covariance matrix elements given by A_n, B_n, and C_n. The constant K denotes a normalization factor which ensures that the SSF integrates to unity [19]. The model is sufficiently general so that one could vary the number of Gaussians to any number, although we found that in modeling this light sensor, testing models with N varying from N = 1 to N = 4 was sufficient. The MoG model results in a set of six parameters to be estimated for each Gaussian:

\[
\theta_n = \{a_n, u'_n, v'_n, A_n, B_n, C_n\},
\tag{11}
\]

where the subscript n is used to denote the nth Gaussian in the set. These must be estimated along with the two intensity parameters I_min and I_max in (7), so that an MoG model with N Gaussians consists of 6N + 2 parameters to be estimated.
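The MoG SSF in (10) can be evaluated on the 1 mm grid directly from the parameter set (11). In the sketch below (our own illustrative code, not the authors' implementation), the normalization constant K is handled numerically by requiring the discrete SSF to sum to one, the discrete analogue of the condition stated above:

% Evaluate the MoG SSF (10) on a grid of sensor-frame coordinates (Xp, Yp),
% given per-Gaussian parameters a, u, v, A, B, Cc (vectors of length N).
% The field is normalized numerically so that it sums to one on the grid.
function ssf = ssf_mog(Xp, Yp, a, u, v, A, B, Cc)
    ssf = zeros(size(Xp));
    for n = 1:numel(a)
        dx = Xp - u(n);  dy = Yp - v(n);
        ssf = ssf + a(n) * exp(-(A(n)*dx.^2 + B(n)*dy.^2 + 2*Cc(n)*dx.*dy));
    end
    ssf = ssf / sum(ssf(:));   % discrete analogue of the 1/K normalization
end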

We assign a Student-t distribution to the SSF model likelihood, which can be arrived at by assigning a Gaussian likelihood and integrating over the unknown standard deviation σ [16]. By defining D = {(d_1, (x_1, y_1)), (d_2, (x_2, y_2)), ..., (d_N, (x_N, y_N))} and writing the hypothesized circle parameters as C = {(x_o, y_o), r_o}, we have

\[
\Pr{}_{\mathrm{SSF}}(D \mid C, I) \propto \left[ \sum_{i=1}^{N} \big( M(x_i, y_i) - d_i \big)^2 \right]^{-(N-1)/2},
\tag{12}
\]

where N is the number of measurements made by the robot and the function M defined in (7) relies on both (8) and (10). Note that, in practice, the MoG SSF model described in (10) is used to generate a discrete SSF matrix, which is used in the convolution (9) to compute the likelihood function via the sensor response model M in (7).
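Up to an additive constant, the log of (12) is proportional to the log of the summed squared residuals between the predicted intensities and the recorded data; a minimal sketch (our own naming) is:

% Log of the SSF likelihood (12), up to an additive constant.
% Mpred is the vector of predicted intensities M(xi, yi) computed from (7)-(9)
% for the hypothesized circle, and d is the vector of recorded intensities.
function logL = loglike_ssf(Mpred, d)
    N = numel(d);
    logL = -((N - 1) / 2) * log(sum((Mpred - d).^2));
end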

2.6. Data Collection for SSF Estimation. In this section we describe the collection of the light sensor readings in the laboratory that were used to estimate the SSF of the light sensor. The SSF is a function not only of the properties of the photodiode but also of the illuminating LED and the height above the surface. For this reason, measurements were made at a height of 14 mm above a surface with a known albedo, in a darkened room, to avoid complications due to ambient light and to eliminate the possibility of shadows cast by the sensor or other equipment [19].

The surface, which we refer to as the black-and-white boundary pattern, consisted of two regions: a black region on the left and a white region on the right, separated by a sharp linear boundary, as shown in Figure 3(a). Here the surface and sensor are illustrated as if viewing them by looking up at them from below the table surface, so that the placement of the photodiode in the sensor package can be visualized. The lab frame was defined to be at the center of the black-white boundary, so that the surface albedo S_BW(x, y) is given by

\[
S_{\mathrm{BW}}(x, y) =
\begin{cases}
1 & \text{for } x > 0,\\
0 & \text{for } x \le 0.
\end{cases}
\tag{13}
\]

Measurements were taken as the sensor was incrementally moved in one-millimeter steps in a direction perpendicular to the boundary, from a position 5 cm to the left of the boundary (well within the completely black region) to a position 5 cm to the right of the boundary (well within the completely white region), resulting in 101 measurements. This process was repeated four times, with the sensor in each of the four orientations (see Figure 3 for an explanation), giving a total of 404 measurements using this pattern.

Figure 3: (a) This figure illustrates the laboratory surface, referred to as the black-and-white boundary pattern, with known albedo, which consisted of two regions: a black region on the left and a white region on the right, separated by a sharp linear boundary. (b) This illustrates the four sensor orientations (0°, 90°, 180°, 270°) used to collect data for the estimation of the SSF, along with the symbols used to indicate the measured values plotted in Figure 5. The top row views the sensors as if looking up from the table, so that the photodiode placement in the sensor package can be visualized. Below these, the gray sensor shapes illustrate how they are oriented looking down at both the sensors and the albedo surface. Measurements were taken as the sensor was incrementally moved in one-millimeter steps in a direction perpendicular to the boundary (as indicated by the arrow at the bottom of the figure), from a position 5 cm to the left of the boundary (well within the completely black region) to a position 5 cm to the right of the boundary (well within the completely white region). This process was repeated four times, with the sensor in each of the four orientations.


Figure 4: Four additional symmetry-breaking albedo patterns were employed. In all cases the sensor was placed in the 0° orientation at the center of the pattern, indicated by (0, 0). In the two lower patterns the center of the white square was shifted diagonally from the center by 4 mm, as indicated by the coordinates (white text) of the corner. The recorded intensity levels are displayed in the white albedo area: 40 and 39 for the two patterns centered at (0, 0), and 35 and 32 for the patterns shifted to (4, 4) and (−4, 4), respectively.

The black-and-white boundary pattern does not provide sufficient information to uniquely infer the SSF, since the sensor may have a response that is symmetric about a line oriented at 45° with respect to the linear boundary. For this reason we employed four additional albedo patterns, consisting of black regions with one white quadrant, as illustrated in Figure 4, which resulted in four more measurements. These have surface albedos defined by

\[
S_a(x, y) =
\begin{cases}
1 & \text{for } x > 0\ \mathrm{mm},\ y > 0\ \mathrm{mm},\\
0 & \text{otherwise,}
\end{cases}
\qquad
S_b(x, y) =
\begin{cases}
1 & \text{for } x < 0\ \mathrm{mm},\ y > 0\ \mathrm{mm},\\
0 & \text{otherwise,}
\end{cases}
\]
\[
S_c(x, y) =
\begin{cases}
1 & \text{for } x > 4\ \mathrm{mm},\ y > 4\ \mathrm{mm},\\
0 & \text{otherwise,}
\end{cases}
\qquad
S_d(x, y) =
\begin{cases}
1 & \text{for } x < -4\ \mathrm{mm},\ y > 4\ \mathrm{mm},\\
0 & \text{otherwise,}
\end{cases}
\tag{14}
\]

where the subscripts relate each albedo function to the pattern illustrated in Figure 4.
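For completeness, the calibration albedos (13) and (14) are straightforward to realize as images on the same 1 mm grid used in (9). The sketch below assumes, purely for illustration, a 101 mm × 101 mm patch centered on the lab-frame origin:

% Calibration albedo patterns (13)-(14) on a 1 mm grid, lab-frame coordinates in mm.
[x, y] = meshgrid(-50:50, -50:50);   % a 101 x 101 mm patch centered on the origin
S_BW = double(x > 0);                % black-and-white boundary pattern (13)
S_a  = double(x > 0  & y > 0);       % white quadrant with corner at (0, 0)
S_b  = double(x < 0  & y > 0);
S_c  = double(x > 4  & y > 4);       % corner shifted to (4, 4) mm
S_d  = double(x < -4 & y > 4);       % corner shifted to (-4, 4) mm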

2.7. Estimating SSF MoG Model Parameters. In this section we describe the application of Bayesian methods to estimate the SSF MoG model parameters. Keep in mind that in this paper we are considering two distinct inference problems: the robot's inferences about a circle and our inferences about the light sensor SSF. Both of these problems rely on making predictions about measured intensities using a light sensor. For this reason, many of these equations will not only look similar to what we have presented previously but also depend on the same functions mapping modeled albedo fields to predicted sensor responses, which are collectively represented using the generic symbol D. It may help to keep in mind that the robot is characterizing a circle quantified by model parameters represented jointly by the symbol C, and we are estimating the SSF of a light sensor quantified by model parameters represented jointly by the symbol θ below. With the exception of the evidence, represented by the function Z below (which is not used by the robot in this experiment), all of the probability functions contain the model parameters in their list of arguments, making it clear to which inference problem they refer.

The posterior probability for the SSF MoG model parameters, collectively referred to as θ = {θ_1, θ_2, ..., θ_N} for a model consisting of N Gaussians, is given by

\[
\Pr(\theta \mid D, I) = \frac{1}{Z}\, \Pr(\theta \mid I)\, \Pr(D \mid \theta, I),
\tag{15}
\]

where here D refers to the data collected for the SSF estimation experiment described in the previous section, I refers to our prior information about the SSF (which is that it may have several local peaks and falls off to zero far from the sensor), and Z refers to the evidence Z = Pr(D | I), which can be found by

\[
Z = \int d\theta\, \Pr(\theta \mid I)\, \Pr(D \mid \theta, I).
\tag{16}
\]

In our earlier discussion, where Bayes' theorem was used to make inferences about circles, the evidence played the role of a normalization factor. Here, since we can consider MoG models with different numbers of Gaussians, and since we integrate over all of the possible values of the parameters, the evidence quantifies the degree to which the hypothesized model order N supports the data. That is, the optimal number of Gaussians to be used in the MoG model of the SSF can be found by computing the evidence.

All five sets of data D described in the previous section were used to compute the posterior probability. These are each indexed by the subscript i, where i = 1, ..., 5: D_i refers to the data collected using each of the four orientations φ_i = 0°, 90°, 180°, 270° to scan the black-and-white boundary pattern, resulting in N_i = 101 measurements for i = 1, 2, 3, 4, and the value i = 5 refers to the N_i = 4 measurements attained using the 0° orientation with the set of four additional patterns.

We assign uniform priors, so that this is essentially a maximum likelihood calculation, with the posterior being proportional to the likelihood. We assign a Student-t distribution to the likelihood, which can be arrived at by assigning a Gaussian likelihood and integrating over the unknown standard deviation σ [16]. This can be written as

\[
\Pr(D_i \mid \theta, I) \propto \left[ \sum_{j=1}^{N_i} \big( M_{ij}(x_{ij}, y_{ij}) - D_i(x_{ij}, y_{ij}) \big)^2 \right]^{-(N_i - 1)/2},
\tag{17}
\]

where i denotes each of the five sets of data and j denotes the jth measurement of that set, which was taken at position (x_ij, y_ij). The function M_ij(x, y) represents a predicted measurement value obtained from (7) using (9), with the albedo function S(x, y) defined using the albedo pattern S_BW(x, y) with orientation φ_i for i = 1, 2, 3, 4, and the albedo patterns S_a, S_b, S_c, S_d for i = 5 and j = 1, 2, 3, 4, respectively. As such, the likelihood relies on a difference between predicted and measured sensor responses.

The joint likelihood for the five data sets is found by taking the product of the likelihoods for each data set, since we expect that the standard deviations that were marginalized over to get the Student-t distribution could have been different for each of the five data sets, as they were not all recorded at the same time:

\[
\Pr(D \mid \theta, I) = \prod_{i=1}^{5} \Pr(D_i \mid \theta, I).
\tag{18}
\]
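In practice, (17) and (18) combine into a single joint log-likelihood over the five calibration data sets; a sketch under our own (assumed) structure-array convention is:

% Joint log-likelihood (17)-(18) for the SSF MoG parameters theta.
% dsets(i).d holds the measured intensities of set i, and dsets(i).Mpred is a
% function handle returning the predicted intensities for that set given theta.
function logL = loglike_ssf_joint(theta, dsets)
    logL = 0;
    for i = 1:numel(dsets)
        r  = dsets(i).Mpred(theta) - dsets(i).d;       % predicted minus measured
        Ni = numel(dsets(i).d);
        logL = logL - ((Ni - 1) / 2) * log(sum(r.^2)); % Student-t form (17)
    end
end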

We employed nested sampling [15, 16] to explore the posterior probability since, in addition to providing parameter estimates, it is explicitly designed to perform evidence calculations, which we use to perform model comparison in identifying the most probable number of Gaussians in the MoG model. For each of the four MoG models (number of Gaussians varying from one to four), the nested sampling algorithm was initialized with 300 samples and iterated until the change in consecutive log-evidence values was less than 10^(-8). Typically, one estimates the mean parameter values by taking an average of the samples weighted by a quantity computed by the nested sampling algorithm, called logWt in [15, 16]. Here we simply performed a logWt-weighted average of the sampled SSF fields computed using the sampled MoG model parameters (rather than the logWt-weighted average of the MoG model parameters themselves), so the result obtained using an MoG model consisting of a single Gaussian is not strictly a single two-dimensional Gaussian distribution. It is this discretized estimated SSF field matrix that is used directly in the convolution (9) to compute the likelihood functions, as mentioned at the end of Section 2.5.
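The posterior-mean SSF field described above can be sketched as a weighted average of the SSF fields generated from the posterior samples, assuming the nested-sampling code returns, for each sample k, its MoG parameters and a log-weight logWt(k) as in [15, 16]; the variable names here are illustrative:

% Posterior-mean SSF field: a logWt-weighted average of the SSF fields
% generated from the sampled MoG parameters (e.g. via ssf_mog above).
function meanSSF = posterior_mean_ssf(ssfFields, logWt)
    w = exp(logWt - max(logWt));          % subtract the max for numerical stability
    w = w / sum(w);                       % normalized posterior weights
    meanSSF = zeros(size(ssfFields{1}));
    for k = 1:numel(w)
        meanSSF = meanSSF + w(k) * ssfFields{k};
    end
end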

3. Results and Discussion

In this section we present the SSF MoG light sensor model estimated from the laboratory data and evaluate its efficacy by demonstrating a significant improvement in the performance of the autonomous robotic platform.

3.1. Light Sensor SSF MoG Model. The light sensor data collected using the black-and-white boundary pattern are illustrated in Figure 5. One can see that the intensity recorded by the sensor increases dramatically as the sensor crosses the boundary from the black region to the white region, but the change is not a step function, which indicates the finite size of the surface area integrated by the sensor. It is this effect that is to be modeled by the SSF. There is an obvious asymmetry between the 90° and 270° orientations due to the shift of the transition region. In addition, there is a significant difference in slope of the transition region between the 0°, 180° orientations and the 90°, 270° orientations, indicating a significant difference in the width of the SSF in those directions. Note also that the minimum recorded response is not zero, as the reflectance of the black surface is not completely zero.

Figure 5: This figure illustrates the intensity measurements D_1, D_2, D_3, D_4 from the four sensor orientations (0°, 90°, 180°, 270°) recorded from the sensor using the black-and-white boundary pattern. Figure 3 shows the orientations of the sensor corresponding to the symbols used in this figure. (Axes: position of the sensor in cm versus recorded intensity in LEGO units.)

Table 1: A comparison of the tested MoG SSF models and their respective log evidence (in units of data^(-408)).

MoG model order    Log evidence    Number of parameters
1 Gaussian         −665.5 ± 0.3     6
2 Gaussians        −674.9 ± 0.3    12
3 Gaussians        −671.9 ± 0.4    18
4 Gaussians        −706.1 ± 0.4    24

The nested sampling algorithm produced the mean SSF fields for each of the MoG models tested, as well as the corresponding log evidence. Table 1, which lists the log evidence computed for each MoG model order, illustrates that the most probable model was the one consisting of a single two-dimensional Gaussian, by a factor of about exp(9), which means that it is about 8000 times more probable than the MoG consisting of two Gaussians. Figure 6 shows the mean SSF fields described by the MoG models of different orders. In all cases the center of the SSF is shifted slightly above the physical center of the sensor package due to the placement of the photodiode (refer to Figure 1), as indicated by the data in Figure 5. In addition, as predicted, one sees that the SSF is wider along the 90°–270° axis than along the 0°–180° axis. Last, Figure 7 demonstrates that the predicted light sensor output shows excellent similarity to the recorded data for the black-and-white boundary pattern.

Figure 6: This figure illustrates the SSF obtained from the four MoG models, along with their corresponding log-evidence values: (a) 1 MoG, log Z = −665.5 ± 0.3; (b) 2 MoG, log Z = −674.9 ± 0.3; (c) 3 MoG, log Z = −671.9 ± 0.4; (d) 4 MoG, log Z = −706.1 ± 0.4. (Axes in mm.)

In the next section we demonstrate that explicit knowledge about how the light sensor integrates light arriving from within its field of view improves the inferences one can make from its output.

3.2. Efficacy of Sensor Model. The mean SSF field obtained using a single two-dimensional Gaussian model (Figure 6(a)) was incorporated into the likelihood function used by the robot's machine learning system. Here we compare the robot's performance in locating and characterizing a circle by observing the average number of measurements necessary to characterize the circle parameters within a precision of 4 mm (which is one-half of the spacing between the holes in the LEGO Technic parts).

The three panels comprising Figure 8(a) illustrate the robot's machine learning system's view of the playing field using the naïve light sensor model. The previously obtained measurement locations used to obtain light sensor data are indicated by the black and white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, which indicates the measurement locations that promise to provide maximal information about the circle to be characterized. The low entropy area surrounding the white square indicates that the region is probably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhere. Note that the circles partition the plane and that each partition has a uniform entropy. All measurement locations within that partition, or any other partition sharing the same entropy, stand to be equally informative. Here it is the fact that the shape is known to be a circle that is driving the likelihood function.

Figure 7: A comparison of the observed data (red) with predictions (black) made by the SSF field estimated using the single two-dimensional Gaussian MoG model, for the four sensor orientations, plotted as recorded intensity (LEGO units) versus the position of the center of the sensor (cm).

In contrast, the three panels comprising Figure 8(b) illustrate the playing field using the more accurate SSF model. Here one can see that the entropy is higher along the edges of the sampled circles. This indicates that the circle edges promise to provide more information than the centers of the partitioned regions. This is because the SSF model enables one to detect not only whether the light sensor is situated above the circle's edge but also how much of the SSF overlaps with the white circle. That is, it helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model) but also the extent to which the sensor is on the edge of the circle. The additional information provided about the functioning of the light sensor translates directly into additional information about the albedo that results in the sensor output.

This additional information can be quantified by observing how many measurements the robot is required to take to obtain estimates of the circle parameters within the same precision in the cases of each light sensor model. Our experiments revealed that, on average, it takes 19 ± 18 measurements using the naïve light sensor model, compared to an average of 11.6 ± 3.9 measurements for the more accurate SSF light sensor model.

4. Conclusion

The quality of the inferences one makes from a sensor depends not only on the quality of the data returned by the sensor but also on the information one possesses about the sensor's performance. In this paper we have demonstrated, via a case study, how more precisely modeling a sensor's performance can improve the inferences one can make from its data. In this case, we demonstrated that, by more precisely modeling its light sensor, a robot can make the same inferences with substantially fewer measurements on average.

This paper demonstrates how a machine learning system that employs Bayesian inference (and inquiry) relies on the likelihood function of the data given the hypothesized model parameters. Rather than simply representing a noise model, the likelihood function quantifies the probability that a hypothesized situation could have given rise to the recorded data. By incorporating more information about the sensors (or equipment) used to record the data, one naturally incorporates this information into the posterior probability on which one's inferences are based.

This is made even more apparent by a careful study of the experimental design problem that this particular robotic system is designed to explore. For example, it is easy to show that, by using the naïve light sensor model, the entropy distribution for a proposed measurement location depends solely on the number of sampled circles for which the location is in the interior of the circle and the number of sampled circles for which the location is exterior to the circle. Given that we represented the posterior probability by sampling 50 circles, the maximum entropy occurs when the proposed measurement location is inside 25 circles (and outside 25 circles). As the robot's parameter estimates converge, one can show that the system is simply performing a binary search by asking "yes" or "no" questions, which implies that each measurement results in one bit of information. However, in the case where the robot employs an SSF model of the light sensor, the question the robot is essentially asking is more detailed: "to what degree does the circle overlap the light sensor's SSF?" The answer to such a question tends to provide more information, which significantly improves system performance. One can estimate the information gain achieved by employing the SSF model. Consider that the naïve model reveals that estimating the circle's position and radius with a precision of 4 mm, given the prior information about the circle and the playing field, requires 25 bits of information. The experiment using the SSF model requires on average 11.6 measurements, which implies that, on average, each measurement obtained using the SSF model provides about 25/11.6 ≈ 2.15 bits of information. One must keep in mind, however, that this is due to the fact that the SSF model is being used not only to infer the circle parameters from the data but also to select the measurement locations.

Because the method presented here is based on a very general inferential framework, these methods can easily be applied to other types of sensors and equipment in a wide variety of situations. If one has designed a robotic machine learning system to rely on likelihood functions, then sensor models can be incorporated in more or less a plug-and-play fashion. This not only promises to improve the quality of robotic systems forced to rely on lower quality sensors but also opens the possibility for calibration on the fly by updating sensor models as data are continually collected.


Figure 8: (a) These three panels illustrate the robot's machine learning system's view of the playing field using the naïve light sensor model as the system progresses through the first three measurements. The previously obtained measurement locations used to obtain light sensor data are indicated by the black-and-white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, which indicates the measurement locations that promise to provide maximal information about the circle to be characterized. Note that the low entropy area surrounding the white square indicates that the region is probably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhere. The entropy map in Figure 2 shows the same experiment at a later stage, after seven measurements have been recorded. (b) These three panels illustrate the robot's machine learning system's view of the playing field using the more accurate SSF light sensor model. Note that the entropy map reveals the circle edges to be highly informative. This is because it helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model on the left) but also the extent to which the sensor is on the edge of the circle. (Axes in LEGO distance units.)

Conflict of Interests

The authors have no conflict of interests to report.

Acknowledgments

This research was supported in part by the University at Albany Faculty Research Awards Program (Knuth) and the University at Albany Benevolent Research Grant Award (Malakar). The authors would like to thank Scott Frasso and Rotem Guttman for their assistance in interfacing MatLab to the LEGO Brick. They would also like to thank Phil Erner and A. J. Mesiti for their preliminary efforts on the robotic arm experiments and sensor characterization.

References

[1] K. H. Knuth, P. M. Erner, and S. Frasso, "Designing intelligent instruments," in Proceedings of the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, K. Knuth, A. Caticha, J. Center, A. Giffin, and C. Rodriguez, Eds., vol. 954, pp. 203–211, AIP, July 2007.

[2] K. H. Knuth and J. L. Center Jr., "Autonomous science platforms and question-asking machines," in Proceedings of the 2nd International Workshop on Cognitive Information Processing (CIP '10), pp. 221–226, June 2010.

[3] K. H. Knuth and J. L. Center, "Autonomous sensor placement," in Proceedings of the IEEE International Conference on Technologies for Practical Robot Applications, pp. 94–99, November 2008.

[4] K. H. Knuth, "Intelligent machines in the twenty-first century: foundations of inference and inquiry," Philosophical Transactions of the Royal Society A, vol. 361, no. 1813, pp. 2859–2873, 2003.

[5] N. K. Malakar and K. H. Knuth, "Entropy-based search algorithm for experimental design," in Proceedings of the 30th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, A. M. Djafari, J. F. Bercher, and P. Bessiere, Eds., vol. 1305, pp. 157–164, AIP, July 2010.

[6] N. K. Malakar, K. H. Knuth, and D. J. Lary, "Maximum joint entropy and information-based collaboration of automated learning machines," in Proceedings of the 31st International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goyal, A. Giffin, K. H. Knuth, and E. Vrscay, Eds., vol. 1443, pp. 230–237, AIP, 2012.

[7] D. V. Lindley, "On a measure of the information provided by an experiment," The Annals of Mathematical Statistics, vol. 27, no. 4, pp. 986–1005, 1956.

[8] V. V. Fedorov, Theory of Optimal Experiments, Probability and Mathematical Statistics, Academic Press, 1972.

[9] K. Chaloner and I. Verdinelli, "Bayesian experimental design: a review," Statistical Science, vol. 10, pp. 273–304, 1995.

[10] P. Sebastiani and H. P. Wynn, "Bayesian experimental design and Shannon information," in 1997 Proceedings of the Section on Bayesian Statistical Science, pp. 176–181, 1997, http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.56.6037.

[11] P. Sebastiani and H. P. Wynn, "Maximum entropy sampling and optimal Bayesian experimental design," Journal of the Royal Statistical Society B, vol. 62, no. 1, pp. 145–157, 2000.

[12] T. J. Loredo and D. F. Chernoff, "Bayesian adaptive exploration," in Proceedings of the 23rd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, G. Erickson and Y. Zhai, Eds., pp. 330–346, AIP, August 2003.

[13] R. Fischer, "Bayesian experimental design—studies for fusion diagnostics," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 76–83, November 2004.

[14] C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Chicago, Ill, USA, 1949.

[15] J. Skilling, "Nested sampling," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 395–405, 2004.

[16] D. Sivia and J. Skilling, Data Analysis: A Bayesian Tutorial, Oxford University Press, New York, NY, USA, 2nd edition, 2006.

[17] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, "Equation of state calculations by fast computing machines," The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.

[18] W. K. Hastings, "Monte Carlo sampling methods using Markov chains and their applications," Biometrika, vol. 57, no. 1, pp. 97–109, 1970.

[19] N. K. Malakar, A. J. Mesiti, and K. H. Knuth, "The spatial sensitivity function of a light sensor," in Proceedings of the 29th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goggans and C.-Y. Chan, Eds., vol. 1193 of AIP Conference Proceedings, pp. 352–359, American Institute of Physics, Oxford, Miss, USA, July 2009.

Submit your manuscripts athttpwwwhindawicom

Control Scienceand Engineering

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

International Journal of

RotatingMachinery

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2013Part I

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

DistributedSensor Networks

International Journal of

ISRN Signal Processing

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Mechanical Engineering

Advances in

Modelling amp Simulation in EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Advances inOptoElectronics

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2013

ISRN Sensor Networks

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

VLSI Design

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawi Publishing Corporation httpwwwhindawicom Volume 2013

The Scientific World Journal

ISRN Robotics

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

International Journal of

Antennas andPropagation

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

ISRN Electronics

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

thinspJournalthinspofthinsp

Sensors

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Active and Passive Electronic Components

Chemical EngineeringInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Electrical and Computer Engineering

Journal of

ISRN Civil Engineering

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Advances inAcoustics ampVibration

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Journal of Sensors 3

the black field Such an instruction is encoded by providingthe robot with a model of the surface albedo consisting ofthree parameters the center location (119909

119900 119910119900) and radius 119903

119900

written jointly as

C = (119909119900 119910119900) 119903119900 (1)

so that given a measurement location (119909119894 119910119894) the albedo 119878 is

expected to be

119878 (119909119894 119910119894C) = 1 if 119863((119909

119894 119910119894) (119909119900 119910119900)) le 119903

119900

0 if 119863((119909119894 119910119894) (119909119900 119910119900)) gt 119903

119900

(2)

where

119863((119909119894 119910119894) (119909119900 119910119900)) = radic(119909

119894minus 119909119900)2

+ (119910119894minus 119910119900)2 (3)

is the Euclidean distance between the measurement location(119909119894 119910119894) and the center of the circle (119909

119900 119910119900) and an albedo of

1 signifies that the surface is white and 0 signifies that thesurface is black Precisely how these expectations are used tomake inferences from data is explained in the next sectionKeep in mind that while the circlersquos precise radius and posi-tion are unknown the robot has been provided with limitedprior information about the allowable range of radii andpositions

Again it is important to note that the robot does not scanthe surface to solve the problem nor does it try to find threepoints along the edge of the circle Instead it employs a gen-eral system that works for any expected shape or set of shapesthat autonomously and intelligently determines optimalmea-surement locations based both on what it knows and onwhat it does not knowThe number of measurements neededto characterize all three circle parameters within the desiredaccuracy is a measure of the efficiency of the experimentalprocedure

23TheMachine Learning System Themachine learning sys-tem employs a Bayesian Inference Engine to make inferencesabout the circle parameters given the recorded light intensi-ties as well as an Inquiry Engine designed to use the uncer-tainties in the circle parameter estimates to autonomouslyselect measurement locations that promise to provide themaximum amount of relevant information about the prob-lem

The core of the Bayesian Inference Engine is cen-tered around the computation of the posterior probabilityPr(C | D 119868) of the albedo model parameters C in (1) giventhe light sensor recordings (data) 119889

119894recorded at locations

(119909119894 119910119894) which we write compactly as

D = (1198891 (1199091 1199101)) (119889

119873 (119909119873 119910119873)) (4)

and any additional prior information 119868 Bayesrsquo Theoremallows one to write the posterior probability as a function ofthree related probabilities

Pr (C | D 119868) = Pr (C | 119868) Pr (D | C 119868)Pr (D | 119868)

(5)

where the right-hand side consists of the product of the priorprobability of the circle parameters Pr(C | 119868) which describeswhat is known about the circle before any sensor data areconsidered with a ratio of probabilities that are sensor datadependent It is in this sense that Bayesrsquo Theorem representsa learning algorithm since what is known about the circleparameters before the data are considered (prior probability)is modified by the recorded sensor data resulting in a quan-tification of what is known about the circle parameters afterthe data are considered (posterior probability) The prob-ability in the numerator on the right is the likelihoodPr(D | C 119868) which quantifies the probability that the sensordata could have resulted from the hypothesized circle param-eterized by C The probability in the denominator is themarginal likelihood or the evidence which here acts as anormalization factor Later when estimating the SSF of thelight sensor (which is a different problem) the evidencewhich can bewritten as the integral of the product of the priorand the likelihood over all possible model parameters willplay a critical role

The likelihood term Pr(D | C 119868) plays a critical role inthe inference problem since it essentially compares predic-tions made using the hypothesized circle parameters to theobserved data A naıve light sensor model would predict thatif the sensor was centered on a black region it would returna small number and if the sensor was centered on a whiteregion it would return a large number A more accurate lightsensor model would take into account the SSF of the lightsensor and perform an SSF-weighted integral of the hypoth-esized albedo field and compare this to the recorded sensordata These two models of the light sensor will be discussedin detail in the next section

The robot not only makes inferences from data but also designs its own experiments by autonomously deciding where to take subsequent measurements [1-6]. This can be viewed in terms of Bayesian experimental design [7-13], where the Shannon entropy [14] is employed as the utility function to decide where to take the next measurement. In short, the Bayesian Inference Engine samples 50 circles from the posterior probability using the nested sampling algorithm [15, 16]. Since these samples may consist of replications, they are further diversified by taking approximately 1000 Metropolis-Hastings steps [17, 18]. Figure 2 illustrates the robot's machine learning system's view of the playing field after several measurements have been recorded. The 50 circles sampled from the posterior probability (blue circles) reflect what the robot knows about the white circle (not shown) given the light sensor data that it has collected. The algorithm then considers a fine grid of potential measurement locations on the playing field. At each potential measurement location, the 50 sampled circles are queried using the likelihood function to produce 50 samples of what measurement could be expected at that location given each hypothesized circle. This results in a set of potential measurement values at each measurement location. The Shannon entropy of the set of potential measurements at each location is computed, which results in an entropy map, illustrated in Figure 2 as the copper-toned coloration upon which the blue circles lie. The set of measurement locations with the greatest entropy is identified, and the next measurement location is randomly chosen from that set. Thus the likelihood function not only affects the inferences about the circles made by the machine learning algorithm but also affects the entropy of the measurement locations, which guides further exploration. In the next section we describe the two models of the light sensor and their corresponding likelihood functions.

Figure 2: This figure illustrates the robot's machine learning system's view of the playing field using the naïve light sensor model. The axes label playing field coordinates in LEGO distance units. The previously obtained measurement locations used to obtain light sensor data are indicated by the black and white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, such that brighter areas indicate the measurement locations that promise to provide greater information about the circle to be characterized. Note that the low entropy area bounded by the white squares indicates that this region is probably inside the white circle and that measurements made here will not be as informative as measurements made elsewhere. The dark jagged edges at the bottom of the colored high-entropy regions reflect the boundary between the playing field and the region that is outside of the robotic arm's reach.
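The entropy-based selection step described above can be sketched as follows (Python; a simplified illustration rather than the robot's actual code). The naïve prediction rule, the illustrative mean intensities mu_B and mu_W, and the histogram binning are assumptions made for this sketch; the paper's system predicts measurements through its likelihood function.

import numpy as np

def predict_naive(circle, x, y, mu_B, mu_W):
    # Predicted intensity under a naive model: white mean inside the circle, black mean outside.
    (xo, yo), ro = circle
    return mu_W if (x - xo) ** 2 + (y - yo) ** 2 <= ro ** 2 else mu_B

def entropy_map(circles, locations, mu_B=25.0, mu_W=45.0, bins=16):
    # Shannon entropy (in bits) of the set of predicted measurements at each candidate location.
    H = np.zeros(len(locations))
    for k, (x, y) in enumerate(locations):
        preds = np.array([predict_naive(c, x, y, mu_B, mu_W) for c in circles])
        counts, _ = np.histogram(preds, bins=bins)
        p = counts[counts > 0] / counts.sum()
        H[k] = -np.sum(p * np.log2(p))
    return H

def choose_next_location(circles, locations, rng=None):
    # Identify the set of maximum-entropy locations and pick one of them at random.
    rng = rng or np.random.default_rng()
    H = entropy_map(circles, locations)
    best = np.flatnonzero(np.isclose(H, H.max()))
    return locations[int(rng.choice(best))]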

The efficacy of the sensor model will be quantified by the average number of measurements the robot needs to make to estimate the circle parameters within a given precision.

2.4. Models Describing a Light Sensor. In this section we discuss two models of a light sensor and indicate precisely how they are integrated into the likelihood function used by both the Bayesian Inference and Inquiry Engines.

2.4.1. The Naïve Likelihood. A naïve light sensor model would predict that if the sensor was centered on a black region (surface albedo of zero) the sensor would return a small number on average, and if it were centered on a white region (surface albedo of unity) it would return a large number on average. Of course, noise variations are expected from numerous sources, such as uneven lighting of the surface, minor variations in albedo, and noise in the sensor itself. So one might expect that there is some expected squared deviation from the two mean sensor values for the white and black cases. For this reason we model the expected sensor response with a Gaussian distribution with mean $\mu_B$ and standard deviation $\sigma_B$ for the black surface and a Gaussian distribution with mean $\mu_W$ and standard deviation $\sigma_W$ for the white surface. The likelihood of a measurement $d_i$ at location $(x_i, y_i)$ corresponding to the naïve light sensor model, $\Pr_{\text{naive}}((d_i, (x_i, y_i)) \mid C, I)$, can be written compactly as

$$\Pr_{\text{naive}}((d_i, (x_i, y_i)) \mid C, I) =
\begin{cases}
(2\pi\sigma_W^2)^{-1/2} \exp\left[-\dfrac{(\mu_W - d_i)^2}{2\sigma_W^2}\right], & \text{for } D((x_i, y_i), (x_o, y_o)) \le r_o,\\[2ex]
(2\pi\sigma_B^2)^{-1/2} \exp\left[-\dfrac{(\mu_B - d_i)^2}{2\sigma_B^2}\right], & \text{for } D((x_i, y_i), (x_o, y_o)) > r_o,
\end{cases} \tag{6}$$

where $C = \{(x_o, y_o), r_o\}$ represents the parameters of the hypothesized circle and $D((x_i, y_i), (x_o, y_o))$ is the Euclidean distance given in (3). The joint likelihood for $N$ independent measurements is found by taking the product of the $N$ single-measurement likelihoods. In a practical experiment, the means $\mu_B$ and $\mu_W$ and standard deviations $\sigma_B$ and $\sigma_W$ can be easily estimated by sampling known black and white regions several times using the light sensor.
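A minimal sketch of the naïve likelihood in (6), written in Python for illustration (the function and argument names are ours, not the paper's); the joint likelihood over independent measurements is accumulated as a sum of logarithms.

import numpy as np

def log_like_naive(data, circle, mu_B, sigma_B, mu_W, sigma_W):
    # data: sequence of (d_i, (x_i, y_i)); circle: ((x_o, y_o), r_o).
    (xo, yo), ro = circle
    logL = 0.0
    for d, (x, y) in data:
        inside = np.hypot(x - xo, y - yo) <= ro      # Euclidean distance test of Eq. (6)
        mu, sigma = (mu_W, sigma_W) if inside else (mu_B, sigma_B)
        logL += -0.5 * np.log(2.0 * np.pi * sigma ** 2) - (mu - d) ** 2 / (2.0 * sigma ** 2)
    return logL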

2.5. The SSF Likelihood. A more accurate likelihood can be developed by taking into account the fact that the photodiode performs a weighted integral of the light arriving from a spatially distributed region within its field of view, the weights being described by the spatial sensitivity function (SSF) of the sensor. Since the SSF of the light sensor could be arbitrarily complex, with many local peaks, but is expected to decrease to zero far from the sensor, we characterize it using a mixture of Gaussians (MoG) model, which we describe in this section.

The sensor's response to a known black-and-white pattern, with the sensor situated a fixed distance above the point $(x_i, y_i)$, can be modeled in the lab frame by

$$M(x_i, y_i) = I_{\min} + (I_{\max} - I_{\min})\, R(x_i, y_i), \tag{7}$$

where $I_{\min}$ and $I_{\max}$ are the observed intensities for a completely black surface (surface albedo of zero) and a completely white surface (surface albedo of unity), respectively, and $R$ is a scalar response function varying between zero and one that depends both on the SSF and the surface albedo [19]. The minimum intensity $I_{\min}$ acts as an offset, and $(I_{\max} - I_{\min})$ serves to scale the response to LEGO units.

The response of the sensor is described by $R(x_i, y_i)$, which is a convolution of the sensor SSF and the surface albedo $S(x, y)$ given by

$$R(x_i, y_i) = \int dx\, dy\, \mathrm{SSF}(x - x_i, y - y_i)\, S(x, y), \tag{8}$$

where the SSF is defined so that the convolution with a completely white surface results in a response of unity. In practice we approximate this integral as a sum over a pixelated grid with 1 mm square pixels:

$$R(x_i, y_i) = \sum_{x, y} \mathrm{SSF}(x - x_i, y - y_i)\, S(x, y). \tag{9}$$


We employ a mixture of Gaussians (MoG) as a parameterized model to describe the SSF in the sensor's frame coordinates $(x', y') = (x - x_i, y - y_i)$:

$$\mathrm{SSF}(x', y') = \frac{1}{K} \sum_{n=1}^{N} a_n \exp\left[ -\left( A_n (x' - u'_n)^2 + B_n (y' - v'_n)^2 + 2 C_n (x' - u'_n)(y' - v'_n) \right) \right], \tag{10}$$

where $(u'_n, v'_n)$ denotes the center of the $n$th two-dimensional Gaussian with amplitude $a_n$ and covariance matrix elements given by $A_n$, $B_n$, and $C_n$. The constant $K$ denotes a normalization factor, which ensures that the SSF integrates to unity [19]. The model is sufficiently general that one could vary the number of Gaussians to any number, although we found that in modeling this light sensor, testing models with $N$ varying from $N = 1$ to $N = 4$ was sufficient. The MoG model results in a set of six parameters to be estimated for each Gaussian,

$$\theta_n = \{a_n, u'_n, v'_n, A_n, B_n, C_n\}, \tag{11}$$

where the subscript $n$ is used to denote the $n$th Gaussian in the set. These must be estimated along with the two intensity parameters $I_{\min}$ and $I_{\max}$ in (7), so that an MoG model with $N$ Gaussians consists of $6N + 2$ parameters to be estimated.
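A sketch of (10) and of building the discrete SSF matrix used in the convolution (9); this is illustrative Python, and the grid half-width and the list-of-dicts parameterization are our assumptions.

import numpy as np

def mog_ssf(xp, yp, components):
    # Eq. (10) without the 1/K factor: sum of anisotropic Gaussian components evaluated
    # at sensor-frame coordinates (xp, yp).  Each component is a dict with keys
    # a, u, v, A, B, C (amplitude, center, and covariance-related coefficients).
    value = np.zeros_like(np.asarray(xp, dtype=float))
    for c in components:
        dx, dy = xp - c["u"], yp - c["v"]
        value += c["a"] * np.exp(-(c["A"] * dx ** 2 + c["B"] * dy ** 2 + 2.0 * c["C"] * dx * dy))
    return value

def discrete_ssf(components, half_width_mm=20, step_mm=1.0):
    # Tabulate the SSF on a 1 mm grid and normalize it to sum to one, which plays the role
    # of the constant K and guarantees that a fully white surface yields R = 1 in Eq. (9).
    g = np.arange(-half_width_mm, half_width_mm + step_mm, step_mm)
    X, Y = np.meshgrid(g, g)
    ssf = mog_ssf(X, Y, components)
    return ssf / ssf.sum()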

We assign a Student-t distribution to the SSF model likelihood, which can be arrived at by assigning a Gaussian likelihood and integrating over the unknown standard deviation $\sigma$ [16]. By defining $\mathbf{D} = \{(d_1, (x_1, y_1)), (d_2, (x_2, y_2)), \ldots, (d_N, (x_N, y_N))\}$ and writing the hypothesized circle parameters as $C = \{(x_o, y_o), r_o\}$, we have

$$\Pr_{\mathrm{SSF}}(\mathbf{D} \mid C, I) \propto \left[ \sum_{i=1}^{N} (M(x_i, y_i) - d_i)^2 \right]^{-(N-1)/2}, \tag{12}$$

where $N$ is the number of measurements made by the robot and the function $M$, defined in (7), relies on both (8) and (10). Note that in practice the MoG SSF model described in (10) is used to generate a discrete SSF matrix, which is used in the convolution (9) to compute the likelihood function via the sensor response model $M$ in (7).
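The log of (12), up to an additive constant, reduces to a simple function of the summed squared residuals. A minimal sketch (Python; predict is a placeholder standing in for the response model $M$ built from (7)-(10)):

import numpy as np

def log_like_ssf(data, predict):
    # Eq. (12) in log form, up to an additive constant.
    # data: sequence of (d_i, (x_i, y_i)); predict(x, y) returns M(x, y) from Eq. (7).
    N = len(data)
    sum_sq = sum((predict(x, y) - d) ** 2 for d, (x, y) in data)
    return -0.5 * (N - 1) * np.log(sum_sq)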

2.6. Data Collection for SSF Estimation. In this section we describe the collection of the light sensor readings in the laboratory that were used to estimate the SSF of the light sensor. The SSF is a function not only of the properties of the photodiode but also of the illuminating LED and the height above the surface. For this reason, measurements were made at a height of 14 mm above a surface with a known albedo, in a darkened room, to avoid complications due to ambient light and to eliminate the possibility of shadows cast by the sensor or other equipment [19].

The surface, which we refer to as the black-and-white boundary pattern, consisted of two regions: a black region on the left and a white region on the right, separated by a sharp linear boundary, as shown in Figure 3(a). Here the surface and sensor are illustrated as if viewing them by looking up at them from below the table surface, so that the placement of the photodiode in the sensor package can be visualized. The lab frame was defined to be at the center of the black-white boundary, so that the surface albedo $S_{\mathrm{BW}}(x, y)$ is given by

$$S_{\mathrm{BW}}(x, y) = \begin{cases} 1, & \text{for } x > 0,\\ 0, & \text{for } x \le 0. \end{cases} \tag{13}$$

Figure 3: (a) This figure illustrates the laboratory surface, referred to as the black-and-white boundary pattern, with known albedo, which consisted of two regions: a black region on the left and a white region on the right, separated by a sharp linear boundary. (b) This illustrates the four sensor orientations (0°, 90°, 180°, 270°) used to collect data for the estimation of the SSF, along with the symbols used to indicate the measured values plotted in Figure 5. The top row views the sensors as if looking up from the table, so that the photodiode placement in the sensor package can be visualized. Below these, the gray sensor shapes illustrate how they are oriented looking down at both the sensors and the albedo surface. Measurements were taken as the sensor was incrementally moved in one-millimeter steps in a direction perpendicular to the boundary (as indicated by the arrow at the bottom of the figure), from a position 5 cm to the left of the boundary (well within the completely black region) to a position 5 cm to the right of the boundary (well within the completely white region). This process was repeated four times with the sensor in each of the four orientations.

Measurements were taken as the sensor was incrementally moved in one-millimeter steps in a direction perpendicular to the boundary, from a position 5 cm to the left of the boundary (well within the completely black region) to a position 5 cm to the right of the boundary (well within the completely white region), resulting in 101 measurements. This process was repeated four times with the sensor in each of the four orientations (see Figure 3 for an explanation), giving a total of 404 measurements using this pattern.

Figure 4: Four additional symmetry-breaking albedo patterns were employed. In all cases the sensor was placed in the 0° orientation at the center of the pattern, indicated by (0, 0). In the two lower patterns the center of the white square was shifted diagonally from the center by 4 mm, as indicated by the coordinates of the corner. The recorded intensity levels (40, 39, 35, and 32 LEGO units for patterns (a)-(d), respectively) are displayed in the white albedo area.

The black-and-white boundary pattern does not provide sufficient information to uniquely infer the SSF, since the sensor may have a response that is symmetric about a line oriented at 45° with respect to the linear boundary. For this reason we employed four additional albedo patterns consisting of black regions with one white quadrant, as illustrated in Figure 4, which resulted in four more measurements. These have surface albedos defined by

$$\begin{aligned}
S_a(x, y) &= \begin{cases} 1, & \text{for } x > 0\ \mathrm{mm},\ y > 0\ \mathrm{mm},\\ 0, & \text{otherwise}, \end{cases}\\
S_b(x, y) &= \begin{cases} 1, & \text{for } x < 0\ \mathrm{mm},\ y > 0\ \mathrm{mm},\\ 0, & \text{otherwise}, \end{cases}\\
S_c(x, y) &= \begin{cases} 1, & \text{for } x > 4\ \mathrm{mm},\ y > 4\ \mathrm{mm},\\ 0, & \text{otherwise}, \end{cases}\\
S_d(x, y) &= \begin{cases} 1, & \text{for } x < -4\ \mathrm{mm},\ y > 4\ \mathrm{mm},\\ 0, & \text{otherwise}, \end{cases}
\end{aligned} \tag{14}$$

where the subscripts relate each albedo function to the pattern illustrated in Figure 4.
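For reference, the albedo patterns in (13) and (14) are trivial to encode; a sketch in Python (coordinates in mm):

import numpy as np

def S_bw(x, y):
    # Eq. (13): black-and-white boundary pattern, white for x > 0.
    return np.where(x > 0.0, 1.0, 0.0)

def S_a(x, y):
    # Eq. (14): white quadrant for x > 0 mm and y > 0 mm.
    return np.where((x > 0.0) & (y > 0.0), 1.0, 0.0)

def S_b(x, y):
    return np.where((x < 0.0) & (y > 0.0), 1.0, 0.0)

def S_c(x, y):
    return np.where((x > 4.0) & (y > 4.0), 1.0, 0.0)

def S_d(x, y):
    return np.where((x < -4.0) & (y > 4.0), 1.0, 0.0)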

2.7. Estimating SSF MoG Model Parameters. In this section we describe the application of Bayesian methods to estimate the SSF MoG model parameters. Keep in mind that in this paper we are considering two distinct inference problems: the robot's inferences about a circle, and our inferences about the light sensor SSF. Both of these problems rely on making predictions about measured intensities using a light sensor. For this reason, many of these equations will not only look similar to what we have presented previously, but will also depend on the same functions mapping modeled albedo fields to predicted sensor responses, which are collectively represented using the generic symbol $\mathbf{D}$. It may help to keep in mind that the robot is characterizing a circle, quantified by model parameters represented jointly by the symbol $C$, and we are estimating the SSF of a light sensor, quantified by model parameters represented jointly by the symbol $\theta$ below. With the exception of the evidence, represented by the function $Z$ below (which is not used by the robot in this experiment), all of the probability functions contain the model parameters in their list of arguments, making it clear to which inference problem they refer.

The posterior probability for the SSF MoG model parameters, collectively referred to as $\theta = \{\theta_1, \theta_2, \ldots, \theta_N\}$ for a model consisting of $N$ Gaussians, is given by

$$\Pr(\theta \mid \mathbf{D}, I) = \frac{1}{Z} \Pr(\theta \mid I) \Pr(\mathbf{D} \mid \theta, I), \tag{15}$$

where here $\mathbf{D}$ refers to the data collected for the SSF estimation experiment described in the previous section, $I$ refers to our prior information about the SSF (which is that it may have several local peaks and falls off to zero far from the sensor), and $Z$ refers to the evidence, $Z = \Pr(\mathbf{D} \mid I)$, which can be found by

$$Z = \int d\theta\, \Pr(\theta \mid I) \Pr(\mathbf{D} \mid \theta, I). \tag{16}$$

In our earlier discussion, where Bayes' theorem was used to make inferences about circles, the evidence played the role of a normalization factor. Here, since we can consider MoG models with different numbers of Gaussians, and since we integrate over all of the possible values of the parameters, the evidence quantifies the degree to which the hypothesized model order $N$ supports the data. That is, the optimal number of Gaussians to be used in the MoG model of the SSF can be found by computing the evidence.

All five sets of data $\mathbf{D}$ described in the previous section were used to compute the posterior probability. These are each indexed by the subscript $i$, so that $D_i$ for $i = 1, 2, 3, 4$ refers to the data collected using each of the four orientations $\phi_i = 0°, 90°, 180°, 270°$ to scan the black-and-white boundary pattern, resulting in $N_i = 101$ measurements for $i = 1, 2, 3, 4$, and the value $i = 5$ refers to the $N_i = 4$ measurements attained using the 0° orientation with the set of four additional patterns.

We assign uniform priors, so that this is essentially a maximum likelihood calculation, with the posterior being proportional to the likelihood. We assign a Student-t distribution to the likelihood, which can be arrived at by assigning a Gaussian likelihood and integrating over the unknown standard deviation $\sigma$ [16]. This can be written as

$$\Pr(D_i \mid \theta, I) \propto \left[ \sum_{j=1}^{N_i} \left( M_{ij}(x_{ij}, y_{ij}) - D_i(x_{ij}, y_{ij}) \right)^2 \right]^{-(N_i - 1)/2}, \tag{17}$$

where $i$ denotes each of the five sets of data and $j$ denotes the $j$th measurement of that set, which was taken at position $(x_{ij}, y_{ij})$. The function $M_{ij}(x, y)$ represents a predicted measurement value obtained from (7) using (9), with the albedo function $S(x, y)$ defined using the albedo pattern $S_{\mathrm{BW}}(x, y)$ with orientation $\phi_i$ for $i = 1, 2, 3, 4$, and the albedo patterns $S_a$, $S_b$, $S_c$, $S_d$ for $i = 5$ and $j = 1, 2, 3, 4$, respectively. As such, the likelihood relies on a difference between predicted and measured sensor responses.

The joint likelihood for the five data sets is found by taking the product of the likelihoods for each data set, since we expect that the standard deviations that were marginalized over to get the Student-t distribution could have been different for each of the five data sets, as they were not all recorded at the same time:

$$\Pr(\mathbf{D} \mid \theta, I) = \prod_{i=1}^{5} \Pr(D_i \mid \theta, I). \tag{18}$$
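Because the data sets are treated as independent, the joint log-likelihood of (17) and (18) is simply a sum over the five per-set Student-t terms. A minimal sketch (Python; the data structure holding the measured and predicted values is our own convention):

import numpy as np

def log_like_set(measured, predicted):
    # Eq. (17) in log form, up to a constant, for one data set.
    measured, predicted = np.asarray(measured, float), np.asarray(predicted, float)
    N_i = measured.size
    return -0.5 * (N_i - 1) * np.log(np.sum((predicted - measured) ** 2))

def log_like_joint(sets):
    # Eq. (18): the product over independent data sets becomes a sum of log-likelihoods.
    # sets: iterable of (measured, predicted) pairs, one per data set D_i.
    return sum(log_like_set(m, p) for m, p in sets)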

We employed nested sampling [15, 16] to explore the posterior probability since, in addition to providing parameter estimates, it is explicitly designed to perform the evidence calculations, which we use to perform model comparison in identifying the most probable number of Gaussians in the MoG model. For each of the four MoG models (number of Gaussians varying from one to four), the nested sampling algorithm was initialized with 300 samples and iterated until the change in consecutive log-evidence values was less than $10^{-8}$. Typically, one estimates the mean parameter values by taking an average of the samples weighted by a quantity computed by the nested sampling algorithm, called logWt in [15, 16]. Here we simply performed a logWt-weighted average of the sampled SSF fields computed using the sampled MoG model parameters (rather than the logWt-weighted average of the MoG model parameters themselves), so the result obtained using an MoG model consisting of a single Gaussian is not strictly a single two-dimensional Gaussian distribution. It is this discretized estimated SSF field matrix that is used directly in the convolution (9) to compute the likelihood functions, as mentioned earlier in the last lines of Section 2.5.
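The logWt-weighted averaging of the sampled SSF fields described above might be sketched as follows (Python; the array shapes and the log-sum-exp-style normalization are our assumptions about an implementation detail the paper does not spell out):

import numpy as np

def posterior_mean_ssf(ssf_samples, logWt):
    # ssf_samples: array of shape (n_samples, ny, nx) holding the discretized SSF field
    # computed from each set of sampled MoG parameters.
    # logWt: the corresponding nested-sampling log-weights.
    logWt = np.asarray(logWt, dtype=float)
    w = np.exp(logWt - logWt.max())   # shift before exponentiating for numerical stability
    w /= w.sum()
    return np.tensordot(w, np.asarray(ssf_samples, dtype=float), axes=1)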

3. Results and Discussion

In this section we present the SSF MoG light sensor model estimated from the laboratory data and evaluate its efficacy by demonstrating a significant improvement of the performance in the autonomous robotic platform.

3.1. Light Sensor SSF MoG Model. The light sensor data collected using the black-and-white boundary pattern are illustrated in Figure 5. One can see that the intensity recorded by the sensor increases dramatically as the sensor crosses the boundary from the black region to the white region, but the change is not a step function, which indicates the finite size of the surface area integrated by the sensor. It is this effect that is to be modeled by the SSF. There is an obvious asymmetry between the 90° and 270° orientations due to the shift of the transition region. In addition, there is a significant difference in the slope of the transition region between the 0°, 180° orientations and the 90°, 270° orientations, indicating a significant difference in the width of the SSF in those directions. Note also that the minimum recorded response is not zero, as the reflectance of the black surface is not completely zero.

Figure 5: This figure illustrates the intensity measurements $D_1, D_2, D_3, D_4$ from the four sensor orientations, recorded from the sensor using the black-and-white boundary pattern, plotted as recorded intensity (LEGO units) against the position of the sensor (cm). Figure 3 shows the orientations of the sensor corresponding to the symbols used in this figure.

Table 1: A comparison of the tested MoG SSF models and their respective log evidence (in units of data$^{-408}$).

MoG model order | Log evidence | Number of parameters
1 Gaussian | −665.5 ± 0.3 | 6
2 Gaussians | −674.9 ± 0.3 | 12
3 Gaussians | −671.9 ± 0.4 | 18
4 Gaussians | −706.1 ± 0.4 | 24

The nested sampling algorithm produced the mean SSF fields for each of the MoG models tested, as well as the corresponding log evidence. Table 1, which lists the log evidence computed for each MoG model order, shows that the most probable model was the one consisting of a single two-dimensional Gaussian, favored by a factor of about exp(9), which means that it is about 8000 times more probable than the MoG consisting of two Gaussians. Figure 6 shows the mean SSF fields described by the MoG models of different orders. In all cases the center of the SSF is shifted slightly above the physical center of the sensor package due to the placement of the photodiode (refer to Figure 1), as indicated by the data in Figure 5. In addition, as predicted, one sees that the SSF is wider along the 90°-270° axis than along the 0°-180° axis. Last, Figure 7 demonstrates that the predicted light sensor output shows excellent similarity to the recorded data for the black-and-white boundary pattern.
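The model comparison above amounts to exponentiating a difference of log-evidence values. A small sketch (Python; the numerical values repeat Table 1 as reconstructed here, with the decimal placement being our reading of the source):

import numpy as np

# Log-evidence values from Table 1 (model orders of one to four Gaussians).
log_Z = {1: -665.5, 2: -674.9, 3: -671.9, 4: -706.1}

# With equal prior probabilities on the model orders, the posterior odds of the
# single-Gaussian model over the two-Gaussian model is the Bayes factor:
bayes_factor = np.exp(log_Z[1] - log_Z[2])   # roughly exp(9), i.e., on the order of 10^4
print(f"1 MoG vs. 2 MoG: {bayes_factor:.3g}")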

Figure 6: This figure illustrates the SSF obtained from each of the four MoG models (panels (a)-(d), one to four Gaussians; axes in mm), along with their corresponding log-evidence values (see Table 1).

In the next section we demonstrate that explicit knowledge about how the light sensor integrates light arriving from within its field of view improves the inferences one can make from its output.

3.2. Efficacy of Sensor Model. The mean SSF field obtained using a single two-dimensional Gaussian model (Figure 6(a)) was incorporated into the likelihood function used by the robot's machine learning system. Here we compare the robot's performance in locating and characterizing a circle by observing the average number of measurements necessary to characterize the circle parameters within a precision of 4 mm (which is one-half of the spacing between the holes in the LEGO Technic parts).

The three panels comprising Figure 8(a) illustrate the robot's machine learning system's view of the playing field using the naïve light sensor model. The previously obtained measurement locations used to obtain light sensor data are indicated by the black and white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, which indicates the measurement locations that promise to provide maximal information about the circle to be characterized. The low entropy area surrounding the white square indicates that the region is probably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhere. Note that the circles partition the plane and that each partition has a uniform entropy: all measurement locations within that partition, or any other partition sharing the same entropy, stand to be equally informative. Here it is the fact that the shape is known to be a circle that is driving the likelihood function.

Figure 7: A comparison of the observed data (red) with predictions (black) made by the SSF field estimated using the single two-dimensional Gaussian MoG model, plotted as intensity (LEGO units) against the position of the center of the sensor (cm).

In contrast, the three panels comprising Figure 8(b) illustrate the playing field using the more accurate SSF model. Here one can see that the entropy is higher along the edges of the sampled circles. This indicates that the circle edges promise to provide more information than the centers of the partitioned regions. This is because the SSF model enables one to detect not only whether the light sensor is situated above the circle's edge but also how much of the SSF overlaps with the white circle. That is, it helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model) but also the extent to which the sensor is on the edge of the circle. The additional information provided about the functioning of the light sensor translates directly into additional information about the albedo that results in the sensor output.

This additional information can be quantified by observing how many measurements the robot is required to take to obtain estimates of the circle parameters within the same precision in the cases of each light sensor model. Our experiments revealed that on average it takes 19 ± 1.8 measurements using the naïve light sensor model, compared to an average of 11.6 ± 3.9 measurements for the more accurate SSF light sensor model.

4. Conclusion

The quality of the inferences one makes from a sensor depends not only on the quality of the data returned by the sensor but also on the information one possesses about the sensor's performance. In this paper we have demonstrated, via a case study, how more precisely modeling a sensor's performance can improve the inferences one can make from its data. In this case we demonstrated that one can achieve about a 1.8-fold reduction in the number of measurements needed by a robot to make the same inferences by more precisely modeling its light sensor.

This paper demonstrates how a machine learning system that employs Bayesian inference (and inquiry) relies on the likelihood function of the data given the hypothesized model parameters. Rather than simply representing a noise model, the likelihood function quantifies the probability that a hypothesized situation could have given rise to the recorded data. By incorporating more information about the sensors (or equipment) used to record the data, one is naturally incorporating this information into the posterior probability, which results in one's inferences.

This is made even more apparent by a careful study of the experimental design problem that this particular robotic system is designed to explore. For example, it is easy to show that, using the naïve light sensor model, the entropy distribution for a proposed measurement location depends solely on the number of sampled circles for which the location is in the interior of the circle and the number of sampled circles for which the location is exterior to the circle. Given that we represented the posterior probability by sampling 50 circles, the maximum entropy occurs when the proposed measurement location is inside 25 circles (and outside 25 circles). As the robot's parameter estimates converge, one can show that the system is simply performing a binary search by asking "yes" or "no" questions, which implies that each measurement results in one bit of information. However, in the case where the robot employs an SSF model of the light sensor, the question the robot is essentially asking is more detailed: "to what degree does the circle overlap the light sensor's SSF?" The answer to such a question tends to provide more information, which significantly improves system performance. One can estimate the information gain achieved by employing the SSF model. Consider that the naïve model reveals that estimating the circle's position and radius with a precision of 4 mm, given the prior information about the circle and the playing field, requires 25 bits of information. The experiment using the SSF model requires on average 11.6 measurements, which implies that on average each measurement obtained using the SSF model provides about 25/11.6 ≈ 2.15 bits of information. One must keep in mind, however, that this is due to the fact that the SSF model is being used not only to infer the circle parameters from the data but also to select the measurement locations.

Because the method presented here is based on a very general inferential framework, these methods can easily be applied to other types of sensors and equipment in a wide variety of situations. If one has designed a robotic machine learning system to rely on likelihood functions, then sensor models can be incorporated in more or less a plug-and-play fashion. This not only promises to improve the quality of robotic systems forced to rely on lower quality sensors, but it also opens the possibility of calibration on the fly by updating sensor models as data are continually collected.


Figure 8: (a) These three panels illustrate the robot's machine learning system's view of the playing field using the naïve light sensor model as the system progresses through the first three measurements. The previously obtained measurement locations used to obtain light sensor data are indicated by the black-and-white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, which indicates the measurement locations that promise to provide maximal information about the circle to be characterized. Note that the low entropy area surrounding the white square indicates that the region is probably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhere. The entropy map in Figure 2 shows the same experiment at a later stage, after seven measurements have been recorded. (b) These three panels illustrate the robot's machine learning system's view of the playing field using the more accurate SSF light sensor model. Note that the entropy map reveals the circle edges to be highly informative. This is because it helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model on the left) but also the extent to which the sensor is on the edge of the circle.

Conflict of Interests

The authors have no conflict of interests to report

Acknowledgments

This research was supported in part by the University at Albany Faculty Research Awards Program (Knuth) and the University at Albany Benevolent Research Grant Award (Malakar). The authors would like to thank Scott Frasso and Rotem Guttman for their assistance in interfacing MatLab to the LEGO Brick. They would also like to thank Phil Erner and A. J. Mesiti for their preliminary efforts on the robotic arm experiments and sensor characterization.

References

[1] K. H. Knuth, P. M. Erner, and S. Frasso, "Designing intelligent instruments," in Proceedings of the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, K. Knuth, A. Caticha, J. Center, A. Giffin, and C. Rodriguez, Eds., vol. 954, pp. 203-211, AIP, July 2007.

[2] K. H. Knuth and J. L. Center Jr., "Autonomous science platforms and question-asking machines," in Proceedings of the 2nd International Workshop on Cognitive Information Processing (CIP '10), pp. 221-226, June 2010.

[3] K. H. Knuth and J. L. Center, "Autonomous sensor placement," in Proceedings of the IEEE International Conference on Technologies for Practical Robot Applications, pp. 94-99, November 2008.

[4] K. H. Knuth, "Intelligent machines in the twenty-first century: foundations of inference and inquiry," Philosophical Transactions of the Royal Society A, vol. 361, no. 1813, pp. 2859-2873, 2003.

[5] N. K. Malakar and K. H. Knuth, "Entropy-based search algorithm for experimental design," in Proceedings of the 30th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, A. M. Djafari, J. F. Bercher, and P. Bessiere, Eds., vol. 1305, pp. 157-164, AIP, July 2010.

[6] N. K. Malakar, K. H. Knuth, and D. J. Lary, "Maximum joint entropy and information-based collaboration of automated learning machines," in Proceedings of the 31st International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goyal, A. Giffin, K. H. Knuth, and E. Vrscay, Eds., vol. 1443, pp. 230-237, AIP, 2012.

[7] D. V. Lindley, "On a measure of the information provided by an experiment," The Annals of Mathematical Statistics, vol. 27, no. 4, pp. 986-1005, 1956.

[8] V. V. Fedorov, Theory of Optimal Experiments, Probability and Mathematical Statistics, Academic Press, 1972.

[9] K. Chaloner and I. Verdinelli, "Bayesian experimental design: a review," Statistical Science, vol. 10, pp. 273-304, 1995.

[10] P. Sebastiani and H. P. Wynn, "Bayesian experimental design and Shannon information," in 1997 Proceedings of the Section on Bayesian Statistical Science, pp. 176-181, 1997, http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.56.6037.

[11] P. Sebastiani and H. P. Wynn, "Maximum entropy sampling and optimal Bayesian experimental design," Journal of the Royal Statistical Society B, vol. 62, no. 1, pp. 145-157, 2000.

[12] T. J. Loredo and D. F. Chernoff, "Bayesian adaptive exploration," in Proceedings of the 23rd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, G. Erickson and Y. Zhai, Eds., pp. 330-346, AIP, August 2003.

[13] R. Fischer, "Bayesian experimental design—studies for fusion diagnostics," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 76-83, November 2004.

[14] C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Chicago, Ill, USA, 1949.

[15] J. Skilling, "Nested sampling," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 395-405, 2004.

[16] D. Sivia and J. Skilling, Data Analysis: A Bayesian Tutorial, Oxford University Press, New York, NY, USA, 2nd edition, 2006.

[17] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, "Equation of state calculations by fast computing machines," The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087-1092, 1953.

[18] W. K. Hastings, "Monte Carlo sampling methods using Markov chains and their applications," Biometrika, vol. 57, no. 1, pp. 97-109, 1970.

[19] N. K. Malakar, A. J. Mesiti, and K. H. Knuth, "The spatial sensitivity function of a light sensor," in Proceedings of the 29th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goggans and C. Y. Chan, Eds., vol. 1193 of AIP Conference Proceedings, pp. 352-359, American Institute of Physics, Oxford, Mass, USA, July 2009.



The three panels comprising Figure 8(a) illustrate therobotrsquos machine learning systemrsquos view of the playing fieldusing the naıve light sensor model The previously obtainedmeasurement locations used to obtain light sensor data areindicated by the black and white squares indicating therelative intensity with respect to the naıve light sensor modelThe next selected measurement location is indicated by thegreen square The blue circles represent the 50 hypothesizedcircles sampled from the posterior probability The shadedbackground represents the entropy map which indicates themeasurement locations that promise to provide maximalinformation about the circle to be characterized The lowentropy area surrounding the white square indicates that theregion is probably inside the white circle (not shown) andthat measurements made there will not be as informative asmeasurementsmade elsewhere Note that the circles partitionthe plane and that each partition has a uniform entropy Allmeasurement locations within that partition or any otherpartition sharing the same entropy all stand to be equally

Journal of Sensors 9

50

55

45

40

35

30

25

20

minus3 minus2 minus1 0 1 2 3Position of the center of the sensor (cm)

Inte

nsity

(LEG

O u

nits)

Intensity measurement compared for 1MoG

Prediction1Prediction2Prediction3Prediction4

Data1Data2Data3Data4

Figure 7 A comparison of the observed data (red) with predictions(black) made by the SSF field estimated using the single two-dimensional Gaussian MoG model

informative Here it is the fact that the shape is known to bea circle that is driving the likelihood function

In contrast the three panels comprising Figure 8(b)illustrate the playing field using themore accurate SSFmodelHere one can see that the entropy is higher along the edgesof the sampled circles This indicates that the circle edgespromise to provide more information than the centers ofthe partitioned regions This is because that the SSF modelenables one to detect not only whether the light sensor issituated above the circles edge but also how much of the SSFoverlaps with the white circle That is it helps to identify notonlywhether the sensor is inside the circle (as is accomplishedusing the naıve light sensor model) but also the extent towhich the sensor is on the edge of the circle The additionalinformation provided about the functioning of the lightsensor translates directly into additional information aboutthe albedo that results in the sensor output

This additional information can be quantified by observ-ing how many measurements the robot is required to taketo obtain estimates of the circle parameters within thesame precision in the cases of each light sensor modelOur experiments revealed that on average it takes 19 plusmn 18measurements using the naıve light sensor model comparedto an average of 116plusmn 39measurements for themore accurateSSF light sensor model

4 Conclusion

Thequality of the inferences onemakes froma sensor dependnot only on the quality of the data returned by the sensorbut also on the information one possesses about the sensorrsquos

performance In this paper we have demonstrated via a casestudy how more precisely modeling a sensorrsquos performancecan improve the inferences one can make from its data Inthis case we demonstrated that one can achieve about 18reduction in the number of measurements needed by a robotto make the same inferences by more precisely modeling itslight sensor

This paper demonstrates how a machine learning systemthat employs Bayesian inference (and inquiry) relies onthe likelihood function of the data given the hypothesizedmodel parameters Rather than simply representing a noisemodel the likelihood function quantifies the probability thata hypothesized situation could have given rise to the recordeddata By incorporating more information about the sensors(or equipment) used to record the data one naturally isincorporating this information into the posterior probabilitywhich results in onersquos inferences

This is made even more apparent by a careful study ofthe experimental design problem that this particular roboticsystem is designed to explore For example it is easy toshow that by using the naıve light sensor model the entropydistribution for a proposed measurement location dependssolely on the number of sampled circles forwhich the locationis in the interior of the circle and the number of sampledcircles for which the location is exterior to the circle Giventhat we represented the posterior probability by sampling50 circles the maximum entropy occurs when the proposedmeasurement location is inside 25 circles (and outside 25circles) As the robotrsquos parameter estimates converge one canshow that the system is simply performing a binary searchby asking ldquoyesrdquo or ldquonordquo questions which implies that eachmeasurement results in one bit of information However inthe case where the robot employs an SSF model of the lightsensor the question the robot is essentially asking is moredetailed ldquoto what degree does the circle overlap the lightsensorrsquos SSFrdquoThe answer to such a question tends to providemore information which significantly improves system per-formance One can estimate the information gain achievedby employing the SSF model Consider that the naive modelreveals that estimating the circlersquos position and radius with aprecision of 4mmgiven the prior information about the circleand the playing field requires 25 bits of information Theexperiment using the SSFmodel requires on average 116mea-surements which implies that on average each measurementobtained using the SSF model provides about 25116 = 215bits of information One must keep in mind however thatthis is due to the fact that the SSF model is being used notonly to infer the circle parameters from the data but also toselect the measurement locations

Because the method presented here is based on a verygeneral inferential framework these methods can easily beapplied to other types of sensors and equipment in a widevariety of situations If one has designed a robotic machinelearning system to rely on likelihood functions then sensormodels can be incorporated in more or less a plug-and-playfashion This not only promises to improve the quality ofrobotic systems forced to rely on lower quality sensors but italso opens the possibility for calibration on the fly by updatingsensor models as data are continually collected

10 Journal of Sensors

60

50

40

30

20

10

060

50

40

30

20

10

060

50

40

30

20

10

0minus60 minus40 minus20 0 20 40 60

LEGO distance units

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

(a)

60

50

40

30

20

10

060

50

40

30

20

10

060

50

40

30

20

10

0minus60 minus40 minus20 0 20 40 60

LEGO distance units

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

(b)

Figure 8 (a)These three panels illustrate the robotrsquos machine learning systemrsquos view of the playing field using the naıve light sensor model asthe system progresses through the first three measurements The previously obtained measurement locations used to obtain light sensor dataare indicated by the black-and-white squares indicating the relative intensity with respect to the naıve light sensor model The next selectedmeasurement location is indicated by the green square The blue circles represent the 50 hypothesized circles sampled from the posteriorprobabilityThe shaded background represents the entropymap which indicates themeasurement locations that promise to providemaximalinformation about the circle to be characterized Note that the low entropy area surrounding the white square indicates that the region isprobably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhereThe entropy map in Figure 2 shows the same experiment at a later stage after seven measurements have been recorded (b)These three panelsillustrate the robotrsquos machine learning systemrsquos view of the playing field using the more accurate SSF light sensor model Note that the entropymap reveals the circle edges to be highly informative This is because it helps to identify not only whether the sensor is inside the circle (as isaccomplished using the naıve light sensor model on the left) but also the extent to which the sensor is on the edge of the circle

Conflict of Interests

The authors have no conflict of interests to report

Acknowledgments

This research was supported in part by the University atAlbany Faculty Research Awards Program (Knuth) and theUniversity at Albany Benevolent Research Grant Award(Malakar) The authors would like to thank Scott Frasso andRotem Guttman for their assistance in interfacing MatLab tothe LEGOBrickTheywould also like to thank Phil Erner andA J Mesiti for their preliminary efforts on the robotic armexperiments and sensor characterization

References

[1] K H Knuth P M Erner and S Frasso ldquoDesigning intelligentinstrumentsrdquo in Proceedings of the 27th International Workshopon Bayesian Inference andMaximumEntropyMethods in Scienceand Engineering K Knuth A Caticha J Center A Giffin andC Rodriguez Eds vol 954 pp 203ndash211 AIP July 2007

[2] K H Knuth and J L Center Jr ldquoAutonomous science platformsand question-asking machinesrdquo in Proceedings of the 2nd Inter-national Workshop on Cognitive Information Processing (CIPrsquo10) pp 221ndash226 June 2010

[3] K H Knuth and J L Center ldquoAutonomous sensor placementrdquoin Proceedings of the IEEE International Conference on Technolo-gies for Practical Robot Applications pp 94ndash99 November 2008

Journal of Sensors 11

[4] K H Knuth ldquoIntelligent machines in the twenty-first centuryfoundations of inference and inquiryrdquo Philosophical Transac-tions of the Royal Society A vol 361 no 1813 pp 2859ndash28732003

[5] N K Malakar and K H Knuth ldquoEntropy-based search algo-rithm for experimental designrdquo in Proceedings of the 30thInternational Workshop on Bayesian Inference and MaximumEntropy Methods in Science and Engineering A M Djafari JF Bercher and P Bessiere Eds vol 1305 pp 157ndash164 AIP July2010

[6] N K Malakar K H Knuth and D J Lary ldquoMaximum jointentropy and information-based collaboration of automatedlearning machinesrdquo in Proceedings of the 31st InternationalWorkshop on Bayesian Inference and Maximum Entropy Meth-ods in Science and Engineering P Goyal A Giffin K H Knuthand E Vrscay Eds vol 1443 pp 230ndash237 AIP 2012

[7] D V Lindley ldquoOn a measure of the information provided by anexperimentrdquo The Annals of Mathematical Statistics vol 27 no4 pp 986ndash1005 1956

[8] V V Fedorov Theory of Optimal Experiments Probability andMathematical Statistics Academic Press 1972

[9] K Chaloner and I Verdinelli ldquoBayesian experimental design areviewrdquo Statistical Science vol 10 pp 273ndash304 1995

[10] P Sebastiani and H P Wynn ldquoBayesian experimental designand shannon informationrdquo in 1997 Proceedings of the Section onBayesian Statistical Science pp 176ndash181 1997 httpciteseerxistpsueduviewdocsummarydoi=1011566037

[11] P Sebastiani and H P Wynn ldquoMaximum entropy samplingand optimal Bayesian experimental designrdquo Journal of the RoyalStatistical Society B vol 62 no 1 pp 145ndash157 2000

[12] T J Loredo and D F Cherno ldquoBayesian adaptive explorationrdquoin Proceedings of the 23rd International Workshop on BayesianInference and Maximum Entropy Methods in Science and Engi-neering G Erickson and Y Zhai Eds pp 330ndash346 AIP August2003

[13] R Fischer ldquoBayesian experimental designmdashstudies for fusiondiagnosticsrdquo in Proceedings of the 24th International Workshopon Bayesian Inference andMaximumEntropyMethods in Scienceand Engineering vol 735 pp 76ndash83 November 2004

[14] C E Shannon and W Weaver The Mathematical Theory ofCommunication University of Illinois Press Chicago Ill USA1949

[15] J Skilling ldquoNested samplingrdquo in Proceedings of the 24th Inter-nationalWorkshop onBayesian Inference andMaximumEntropyMethods in Science and Engineering vol 735 pp 395ndash405 2004

[16] D Sivia and J Skilling Data Analysis A Bayesian TutorialOxford University Press New York NY USA 2nd edition2006

[17] N Metropolis A W Rosenbluth M N Rosenbluth A HTeller and E Teller ldquoEquation of state calculations by fastcomputing machinesrdquo The Journal of Chemical Physics vol 21no 6 pp 1087ndash1092 1953

[18] W K Hastings ldquoMonte carlo sampling methods using Markovchains and their applicationsrdquo Biometrika vol 57 no 1 pp 97ndash109 1970

[19] N K Malakar A J Mesiti and K H Knuth ldquoThe spatialsensitivity function of a light sensorrdquo in Proceedings of the 29thInternational Workshop on Bayesian Inference and MaximumEntropy Methods in Science and Engineering P Goggans andC Y Chan Eds vol 1193 of AIP Conference Proceedings pp352ndash359American Institute of PhysicsOxfordMassUSA July2009


We employ a mixture of Gaussians (MoG) as a parameterized model to describe the SSF in the sensor's frame coordinates $(x', y') = (x - x_i, y - y_i)$:

\[
\mathrm{SSF}(x', y') = \frac{1}{K} \sum_{n=1}^{N} a_n \exp\!\left[ -\left( A_n (x' - u_n')^2 + B_n (y' - v_n')^2 + 2 C_n (x' - u_n')(y' - v_n') \right) \right], \qquad (10)
\]

where $(u_n', v_n')$ denotes the center of the $n$th two-dimensional Gaussian with amplitude $a_n$ and covariance matrix elements given by $A_n$, $B_n$, and $C_n$. The constant $K$ denotes a normalization factor, which ensures that the SSF integrates to unity [19]. The model is sufficiently general that one could vary the number of Gaussians to any number, although we found that in modeling this light sensor, testing models with $N$ varying from $N = 1$ to $N = 4$ was sufficient. The MoG model results in a set of six parameters to be estimated for each Gaussian,

\[
\theta_n = \{a_n, u_n', v_n', A_n, B_n, C_n\}, \qquad (11)
\]

where the subscript $n$ is used to denote the $n$th Gaussian in the set. These must be estimated along with the two intensity parameters $I_{\min}$ and $I_{\max}$ in (7), so that an MoG model with $N$ Gaussians consists of $6N + 2$ parameters to be estimated.
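As a concrete illustration (not code from the original system), the following Python sketch evaluates an MoG SSF of the form (10) on a discrete grid in the sensor frame and normalizes it so that it sums to one, which is the role played by the factor 1/K once the SSF is discretized; the helper name mog_ssf and the parameter values in the example are hypothetical.

import numpy as np

def mog_ssf(grid_x, grid_y, gaussians):
    # Evaluate the mixture-of-Gaussians SSF of (10) on a grid of sensor-frame
    # coordinates. Each entry of `gaussians` holds the six parameters of (11):
    # amplitude a, center (u, v), and quadratic-form elements A, B, C.
    field = np.zeros_like(grid_x, dtype=float)
    for g in gaussians:
        dx = grid_x - g["u"]
        dy = grid_y - g["v"]
        field += g["a"] * np.exp(-(g["A"] * dx**2 + g["B"] * dy**2 + 2.0 * g["C"] * dx * dy))
    return field / field.sum()  # discrete normalization (the role of 1/K)

# Hypothetical example: a single Gaussian (N = 1) evaluated on a 1 mm grid.
coords = np.arange(-20.0, 21.0, 1.0)
X, Y = np.meshgrid(coords, coords)
ssf_matrix = mog_ssf(X, Y, [{"a": 1.0, "u": 0.0, "v": 2.0, "A": 0.02, "B": 0.05, "C": 0.0}])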

We assign a Student-$t$ distribution to the SSF model likelihood, which can be arrived at by assigning a Gaussian likelihood and integrating over the unknown standard deviation $\sigma$ [16]. By defining $\mathbf{D} = \{(d_1, (x_1, y_1)), (d_2, (x_2, y_2)), \ldots, (d_N, (x_N, y_N))\}$ and writing the hypothesized circle parameters as $\mathbf{C} = \{(x_o, y_o), r_o\}$, we have

\[
\Pr_{\mathrm{SSF}}(\mathbf{D} \mid \mathbf{C}, I) \propto \left[ \sum_{i=1}^{N} \left( M(x_i, y_i) - d_i \right)^2 \right]^{-(N-1)/2}, \qquad (12)
\]

where $N$ is the number of measurements made by the robot, and the function $M$, defined in (7), relies on both (8) and (10). Note that in practice the MoG SSF model described in (10) is used to generate a discrete SSF matrix, which is used in the convolution (9) to compute the likelihood function via the sensor response model $M$ in (7).
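To illustrate how such a discrete SSF matrix enters the prediction step, the following sketch computes a map of predicted sensor readings by sweeping the SSF over a hypothesized albedo field. It assumes a sensor response of the form $M = I_{\min} + (I_{\max} - I_{\min})$ times the SSF-weighted albedo, which is our reading of the roles of (7)-(9); the exact expressions are not reproduced in this section, so the formula, the helper name, and the boundary handling should be treated as illustrative assumptions.

import numpy as np
from scipy.signal import correlate2d

def predicted_intensity_map(albedo, ssf_matrix, i_min, i_max):
    # Sweep the discrete SSF (which sums to one) over the albedo field; at each
    # position the SSF-weighted albedo is rescaled between the black-level
    # response i_min and the white-level response i_max (assumed form of (7)).
    weighted_albedo = correlate2d(albedo, ssf_matrix, mode="same",
                                  boundary="fill", fillvalue=0.0)
    return i_min + (i_max - i_min) * weighted_albedo

A predicted measurement $M(x_i, y_i)$ for use in (12) is then simply the value of this map at the hypothesized sensor position.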

2.6. Data Collection for SSF Estimation. In this section we describe the collection of the light sensor readings in the laboratory that were used to estimate the SSF of the light sensor. The SSF is a function not only of the properties of the photodiode but also of the illuminating LED and the height above the surface. For this reason, measurements were made at a height of 14 mm above a surface with a known albedo, in a darkened room, to avoid complications due to ambient light and to eliminate the possibility of shadows cast by the sensor or other equipment [19].

Figure 3: (a) This figure illustrates the laboratory surface, referred to as the black-and-white boundary pattern, with known albedo, which consisted of two regions: a black region on the left and a white region on the right, separated by a sharp linear boundary. (b) This illustrates the four sensor orientations (0°, 90°, 180°, 270°) used to collect data for the estimation of the SSF, along with the symbols used to indicate the measured values plotted in Figure 5. The top row views the sensors as if looking up from the table so that the photodiode placement in the sensor package can be visualized. Below these, the gray sensor shapes illustrate how they are oriented looking down at both the sensors and the albedo surface. Measurements were taken as the sensor was incrementally moved with one-millimeter steps in a direction perpendicular to the boundary (as indicated by the arrow at the bottom of the figure), from a position 5 cm to the left of the boundary (well within the completely black region) to a position 5 cm to the right of the boundary (well within the completely white region). This process was repeated four times, with the sensor in each of the four orientations.

The surface, which we refer to as the black-and-white boundary pattern, consisted of two regions: a black region on the left and a white region on the right, separated by a sharp linear boundary, as shown in Figure 3(a). Here the surface and sensor are illustrated as if viewing them by looking up at them from below the table surface so that the placement of the photodiode in the sensor package can be visualized. The lab frame was defined to be at the center of the black-white boundary, so that the surface albedo $S_{\mathrm{BW}}(x, y)$ is given by

\[
S_{\mathrm{BW}}(x, y) = \begin{cases} 1 & \text{for } x > 0, \\ 0 & \text{for } x \le 0. \end{cases} \qquad (13)
\]

Measurements were taken as the sensor was incrementally moved with one-millimeter steps in a direction perpendicular to the boundary, from a position 5 cm to the left of the boundary (well within the completely black region) to a position 5 cm to the right of the boundary (well within the completely white region), resulting in 101 measurements. This process was repeated four times, with the sensor in each of the four orientations (see Figure 3 for an explanation), giving a total of 404 measurements using this pattern.

Figure 4: Four additional symmetry-breaking albedo patterns were employed. In all cases the sensor was placed in the 0° orientation at the center of the pattern, indicated by (0, 0). In the two lower patterns the center of the white square was shifted diagonally from the center by 4 mm, as indicated by the coordinates (white text) of the corner. The recorded intensity levels are displayed in the white albedo area: 40, 39, 35, and 32 LEGO units for panels (a), (b), (c), and (d), respectively.

The black-and-white boundary pattern does not provide sufficient information to uniquely infer the SSF, since the sensor may have a response that is symmetric about a line oriented at 45° with respect to the linear boundary. For this reason we employed four additional albedo patterns, consisting of black regions with one white quadrant, as illustrated in Figure 4, which resulted in four more measurements. These have surface albedos defined by

\[
S_a(x, y) = \begin{cases} 1 & \text{for } x > 0\,\mathrm{mm},\ y > 0\,\mathrm{mm}, \\ 0 & \text{otherwise}, \end{cases}
\qquad
S_b(x, y) = \begin{cases} 1 & \text{for } x < 0\,\mathrm{mm},\ y > 0\,\mathrm{mm}, \\ 0 & \text{otherwise}, \end{cases}
\]
\[
S_c(x, y) = \begin{cases} 1 & \text{for } x > 4\,\mathrm{mm},\ y > 4\,\mathrm{mm}, \\ 0 & \text{otherwise}, \end{cases}
\qquad
S_d(x, y) = \begin{cases} 1 & \text{for } x < -4\,\mathrm{mm},\ y > 4\,\mathrm{mm}, \\ 0 & \text{otherwise}, \end{cases} \qquad (14)
\]

where the subscripts relate each albedo function to the pattern illustrated in Figure 4.
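For reference, the albedo patterns of (13) and (14) are simple to encode; the sketch below (coordinates in millimeters, function names chosen here for illustration) returns the albedo at a point in the lab frame.

def s_bw(x, y):
    # Black-and-white boundary pattern, Eq. (13): white (albedo 1) for x > 0.
    return 1.0 if x > 0.0 else 0.0

def s_a(x, y):
    # White quadrant with its corner at the origin, Eq. (14).
    return 1.0 if (x > 0.0 and y > 0.0) else 0.0

def s_b(x, y):
    return 1.0 if (x < 0.0 and y > 0.0) else 0.0

def s_c(x, y):
    # White quadrant shifted so that its corner sits at (4 mm, 4 mm).
    return 1.0 if (x > 4.0 and y > 4.0) else 0.0

def s_d(x, y):
    return 1.0 if (x < -4.0 and y > 4.0) else 0.0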

2.7. Estimating SSF MoG Model Parameters. In this section we describe the application of Bayesian methods to estimate the SSF MoG model parameters. Keep in mind that in this paper we are considering two distinct inference problems: the robot's inferences about a circle and our inferences about the light sensor SSF. Both of these problems rely on making predictions about measured intensities using a light sensor. For this reason, many of these equations will not only look similar to what we have presented previously but also depend on the same functions mapping modeled albedo fields to predicted sensor responses, which are collectively represented using the generic symbol $\mathbf{D}$. It may help to keep in mind that the robot is characterizing a circle, quantified by model parameters represented jointly by the symbol $\mathbf{C}$, and we are estimating the SSF of a light sensor, quantified by model parameters represented jointly by the symbol $\theta$ below. With the exception of the evidence, represented by the function $Z$ below (which is not used by the robot in this experiment), all of the probability functions contain the model parameters in their list of arguments, making it clear to which inference problem they refer.

The posterior probability for the SSF MoG model parameters, collectively referred to as $\theta = \{\theta_1, \theta_2, \ldots, \theta_N\}$ for a model consisting of $N$ Gaussians, is given by

\[
\Pr(\theta \mid \mathbf{D}, I) = \frac{1}{Z} \Pr(\theta \mid I) \Pr(\mathbf{D} \mid \theta, I), \qquad (15)
\]

where here $\mathbf{D}$ refers to the data collected for the SSF estimation experiment described in the previous section, $I$ refers to our prior information about the SSF (which is that it may have several local peaks and falls off to zero far from the sensor), and $Z$ refers to the evidence, $Z = \Pr(\mathbf{D} \mid I)$, which can be found by

\[
Z = \int d\theta \, \Pr(\theta \mid I) \Pr(\mathbf{D} \mid \theta, I). \qquad (16)
\]

In our earlier discussion, where Bayes' theorem was used to make inferences about circles, the evidence played the role of a normalization factor. Here, since we can consider MoG models with different numbers of Gaussians, and since we integrate over all of the possible values of the parameters, the evidence quantifies the degree to which the hypothesized model order $N$ supports the data. That is, the optimal number of Gaussians to be used in the MoG model of the SSF can be found by computing the evidence.
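In other words, if each candidate model order is assigned the same prior probability (an assumption we adopt here for illustration), the posterior probability of a model with $N$ Gaussians is directly proportional to its evidence $Z_N$:

\[
\Pr(N \mid \mathbf{D}, I) = \frac{\Pr(N \mid I)\, Z_N}{\sum_{N'} \Pr(N' \mid I)\, Z_{N'}} \;\propto\; Z_N .
\]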

All five sets of data $\mathbf{D}$ described in the previous section were used to compute the posterior probability. These are each indexed by the subscript $i$, where $i = 1, 2, 3, 4$, so that $D_i$ refers to the data collected using each of the four orientations $\phi_i = 0°, 90°, 180°, 270°$ to scan the black-and-white boundary pattern, resulting in $N_i = 101$ measurements for $i = 1, 2, 3, 4$, and where the value $i = 5$ refers to the $N_i = 4$ measurements attained using the 0° orientation with the set of four additional patterns.

We assign uniform priors so that this is essentially a maximum likelihood calculation, with the posterior being proportional to the likelihood. We assign a Student-$t$ distribution to the likelihood, which can be arrived at by assigning a Gaussian likelihood and integrating over the unknown standard deviation $\sigma$ [16]. This can be written as

\[
\Pr(D_i \mid \theta, I) \propto \left[ \sum_{j=1}^{N_i} \left( M_{ij}(x_{ij}, y_{ij}) - D_i(x_{ij}, y_{ij}) \right)^2 \right]^{-(N_i - 1)/2}, \qquad (17)
\]

where $i$ denotes each of the five sets of data and $j$ denotes the $j$th measurement of that set, which was taken at position $(x_{ij}, y_{ij})$. The function $M_{ij}(x, y)$ represents a predicted measurement value obtained from (7) using (9), with the albedo function $S(x, y)$ defined using the albedo pattern $S_{\mathrm{BW}}(x, y)$ with orientation $\phi_i$ for $i = 1, 2, 3, 4$, and the albedo patterns $S_a, S_b, S_c, S_d$ for $i = 5$ and $j = 1, 2, 3, 4$, respectively. As such, the likelihood relies on the difference between predicted and measured sensor responses.

The joint likelihood for the five data sets is found by taking the product of the likelihoods for each data set, since we expect that the standard deviations that were marginalized over to get the Student-$t$ distribution could have been different for each of the five data sets, as they were not all recorded at the same time:

\[
\Pr(\mathbf{D} \mid \theta, I) = \prod_{i=1}^{5} \Pr(D_i \mid \theta, I). \qquad (18)
\]
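As a minimal sketch of how (17) and (18) could be evaluated in practice (helper names are ours, and the expressions are written only up to the additive constant implied by the proportionality), the product over data sets becomes a sum of per-set log-likelihoods:

import numpy as np

def log_like_dataset(predicted, measured):
    # Log of the Student-t form (17) for one data set, up to an additive
    # constant: -((N_i - 1) / 2) * log(sum of squared prediction errors).
    predicted = np.asarray(predicted, dtype=float)
    measured = np.asarray(measured, dtype=float)
    n_i = measured.size
    return -0.5 * (n_i - 1) * np.log(np.sum((predicted - measured) ** 2))

def joint_log_like(predictions_per_set, data_per_set):
    # Joint likelihood (18): the product over the five data sets becomes a
    # sum of the per-set log-likelihoods.
    return sum(log_like_dataset(p, d)
               for p, d in zip(predictions_per_set, data_per_set))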

We employed nested sampling [15, 16] to explore the posterior probability since, in addition to providing parameter estimates, it is explicitly designed to perform evidence calculations, which we use to perform model comparison in identifying the most probable number of Gaussians in the MoG model. For each of the four MoG models (number of Gaussians varying from one to four), the nested sampling algorithm was initialized with 300 samples and iterated until the change in consecutive log-evidence values was less than $10^{-8}$. Typically one estimates the mean parameter values by taking an average of the samples weighted by a quantity computed by the nested sampling algorithm, called logWt in [15, 16]. Here we simply performed a logWt-weighted average of the sampled SSF fields computed using the sampled MoG model parameters (rather than the logWt-weighted average of the MoG model parameters themselves), so the result obtained using an MoG model consisting of a single Gaussian is not strictly a single two-dimensional Gaussian distribution. It is this discretized estimated SSF field matrix that is used directly in the convolution (9) to compute the likelihood functions, as mentioned in the last lines of Section 2.5.
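A sketch of that post-processing step is shown below, assuming the sampler returns a list of sampled parameter sets together with their log-weights (the logWt quantity of [15, 16]); field_of is any function mapping one sampled parameter set to its discrete SSF field, for example the mog_ssf helper sketched after (11). The names and the numerical-stabilization step are ours.

import numpy as np

def posterior_mean_field(sampled_params, log_weights, field_of):
    # logWt-weighted average of the SSF fields implied by each posterior sample.
    # The log-weights are exponentiated after subtracting their maximum (for
    # numerical stability) and normalized to sum to one before averaging.
    log_w = np.asarray(log_weights, dtype=float)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    return sum(weight * field_of(params)
               for weight, params in zip(w, sampled_params))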

3. Results and Discussion

In this section we present the SSF MoG light sensor model estimated from the laboratory data and evaluate its efficacy by demonstrating a significant improvement in the performance of the autonomous robotic platform.

3.1. Light Sensor SSF MoG Model. The light sensor data collected using the black-and-white boundary pattern are illustrated in Figure 5. One can see that the intensity recorded by the sensor increases dramatically as the sensor crosses the boundary from the black region to the white region, but the change is not a step function, which indicates the finite size of the surface area integrated by the sensor. It is this effect that is to be modeled by the SSF. There is an obvious asymmetry between the 90° and 270° orientations due to the shift of the transition region. In addition, there is a significant difference in the slope of the transition region between the 0°, 180° orientations and the 90°, 270° orientations, indicating a significant difference in the width of the SSF in those directions. Note also that the minimum recorded response is not zero, as the reflectance of the black surface is not completely zero.

Figure 5: This figure illustrates the intensity measurements $D_1, D_2, D_3, D_4$ from the four sensor orientations (0°, 90°, 180°, 270°), recorded from the sensor using the black-and-white boundary pattern and plotted as recorded intensity (LEGO units) against the position of the sensor (cm). Figure 3 shows the orientations of the sensor corresponding to the symbols used in this figure.

Table 1: A comparison of the tested MoG SSF models and their respective log evidence (in units of data$^{-408}$).

MoG model order    Log evidence      Number of parameters
1 Gaussian         -665.5 ± 0.3       6
2 Gaussians        -674.9 ± 0.3      12
3 Gaussians        -671.9 ± 0.4      18
4 Gaussians        -706.1 ± 0.4      24

The nested sampling algorithm produced the mean SSF fields for each of the MoG models tested, as well as the corresponding log evidence. Table 1, which lists the log evidence computed for each MoG model order, illustrates that the most probable model is the one consisting of a single two-dimensional Gaussian, which is favored over the MoG consisting of two Gaussians by a factor of about exp(9), meaning that it is about 8000 times more probable. Figure 6 shows the mean SSF fields described by the MoG models of different orders. In all cases the center of the SSF is shifted slightly above the physical center of the sensor package due to the placement of the photodiode (refer to Figure 1), as indicated by the data in Figure 5. In addition, as predicted, one sees that the SSF is wider along the 90°–270° axis than along the 0°–180° axis. Last, Figure 7 demonstrates that the predicted light sensor output shows excellent similarity to the recorded data for the black-and-white boundary pattern.

Figure 6: This figure illustrates the SSF obtained from the four MoG models along with their corresponding log-evidence values: (a) 1 MoG, log Z = -665.5 ± 0.3; (b) 2 MoG, log Z = -674.9 ± 0.3; (c) 3 MoG, log Z = -671.9 ± 0.4; (d) 4 MoG, log Z = -706.1 ± 0.4. Axes are in mm.
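The comparison in Table 1 can be turned into Bayes factors and posterior model probabilities directly from differences of log evidence; a minimal sketch, with the log-evidence values transcribed from Table 1 and equal prior probabilities assumed over the four model orders:

import numpy as np

log_z = {1: -665.5, 2: -674.9, 3: -671.9, 4: -706.1}  # log evidence, Table 1

# Bayes factor of the single-Gaussian model over each alternative.
bayes_factor_vs_1 = {n: np.exp(log_z[1] - z) for n, z in log_z.items() if n != 1}

# Posterior probability of each model order under equal priors.
shifted = {n: np.exp(z - max(log_z.values())) for n, z in log_z.items()}
total = sum(shifted.values())
model_probability = {n: v / total for n, v in shifted.items()}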

In the next section we demonstrate that explicit knowledge about how the light sensor integrates light arriving from within its field of view improves the inferences one can make from its output.

3.2. Efficacy of Sensor Model. The mean SSF field obtained using a single two-dimensional Gaussian model (Figure 6(a)) was incorporated into the likelihood function used by the robot's machine learning system. Here we compare the robot's performance in locating and characterizing a circle by observing the average number of measurements necessary to characterize the circle parameters within a precision of 4 mm (which is one-half of the spacing between the holes in LEGO Technic parts).

Figure 7: A comparison of the observed data (red) with predictions (black) made by the SSF field estimated using the single two-dimensional Gaussian MoG model, plotted as intensity (LEGO units) versus the position of the center of the sensor (cm) for each of the four data sets.

The three panels comprising Figure 8(a) illustrate the robot's machine learning system's view of the playing field using the naïve light sensor model. The previously obtained measurement locations used to obtain light sensor data are indicated by the black and white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, which indicates the measurement locations that promise to provide maximal information about the circle to be characterized. The low-entropy area surrounding the white square indicates that the region is probably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhere. Note that the circles partition the plane and that each partition has a uniform entropy. All measurement locations within that partition, or any other partition sharing the same entropy, stand to be equally informative. Here it is the fact that the shape is known to be a circle that is driving the likelihood function.

In contrast, the three panels comprising Figure 8(b) illustrate the playing field using the more accurate SSF model. Here one can see that the entropy is higher along the edges of the sampled circles. This indicates that the circle edges promise to provide more information than the centers of the partitioned regions. This is because the SSF model enables one to detect not only whether the light sensor is situated above the circle's edge but also how much of the SSF overlaps with the white circle. That is, it helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model) but also the extent to which the sensor is on the edge of the circle. The additional information provided about the functioning of the light sensor translates directly into additional information about the albedo that results in the sensor output.

This additional information can be quantified by observing how many measurements the robot is required to take to obtain estimates of the circle parameters within the same precision for each light sensor model. Our experiments revealed that on average it takes 19 ± 18 measurements using the naïve light sensor model, compared to an average of 11.6 ± 3.9 measurements for the more accurate SSF light sensor model.

4. Conclusion

The quality of the inferences one makes from a sensor depends not only on the quality of the data returned by the sensor but also on the information one possesses about the sensor's performance. In this paper we have demonstrated via a case study how more precisely modeling a sensor's performance can improve the inferences one can make from its data. In this case we demonstrated that, by more precisely modeling its light sensor, a robot can make the same inferences from substantially fewer measurements (11.6 on average, compared with 19 using the naïve sensor model).

This paper demonstrates how a machine learning system that employs Bayesian inference (and inquiry) relies on the likelihood function of the data given the hypothesized model parameters. Rather than simply representing a noise model, the likelihood function quantifies the probability that a hypothesized situation could have given rise to the recorded data. By incorporating more information about the sensors (or equipment) used to record the data, one naturally incorporates this information into the posterior probability on which one's inferences are based.

This is made even more apparent by a careful study of the experimental design problem that this particular robotic system is designed to explore. For example, it is easy to show that, using the naïve light sensor model, the entropy distribution for a proposed measurement location depends solely on the number of sampled circles for which the location is in the interior of the circle and the number of sampled circles for which the location is exterior to the circle. Given that we represented the posterior probability by sampling 50 circles, the maximum entropy occurs when the proposed measurement location is inside 25 circles (and outside 25 circles). As the robot's parameter estimates converge, one can show that the system is simply performing a binary search by asking "yes" or "no" questions, which implies that each measurement results in one bit of information. However, in the case where the robot employs an SSF model of the light sensor, the question the robot is essentially asking is more detailed: "to what degree does the circle overlap the light sensor's SSF?" The answer to such a question tends to provide more information, which significantly improves system performance. One can estimate the information gain achieved by employing the SSF model. Consider that the naïve model reveals that estimating the circle's position and radius with a precision of 4 mm, given the prior information about the circle and the playing field, requires 25 bits of information. The experiment using the SSF model requires on average 11.6 measurements, which implies that, on average, each measurement obtained using the SSF model provides about 25/11.6 ≈ 2.15 bits of information. One must keep in mind, however, that this is due to the fact that the SSF model is being used not only to infer the circle parameters from the data but also to select the measurement locations.
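The one-bit argument for the naïve model can be made concrete with a short sketch (variable and function names are ours): the predicted outcome at a candidate location is binary, inside or outside, so the entropy at that location reduces to the binary entropy of the fraction of the 50 sampled circles that contain it, which peaks at one bit when exactly 25 of the 50 contain the location.

import numpy as np

def naive_entropy_map(grid_x, grid_y, circles):
    # Entropy of the binary inside/outside outcome at each candidate location,
    # computed from the sampled circles given as (center_x, center_y, radius).
    inside = np.zeros_like(grid_x, dtype=float)
    for cx, cy, r in circles:
        inside += ((grid_x - cx) ** 2 + (grid_y - cy) ** 2) <= r ** 2
    p = inside / len(circles)
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))
    return np.nan_to_num(h)  # define 0 log 0 = 0 where p is 0 or 1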

Because the method presented here is based on a very general inferential framework, it can easily be applied to other types of sensors and equipment in a wide variety of situations. If one has designed a robotic machine learning system to rely on likelihood functions, then sensor models can be incorporated in a more or less plug-and-play fashion. This not only promises to improve the quality of robotic systems forced to rely on lower-quality sensors, but it also opens the possibility of calibration on the fly by updating sensor models as data are continually collected.


Figure 8: (a) These three panels illustrate the robot's machine learning system's view of the playing field, using the naïve light sensor model, as the system progresses through the first three measurements (axes in LEGO distance units). The previously obtained measurement locations used to obtain light sensor data are indicated by the black-and-white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, which indicates the measurement locations that promise to provide maximal information about the circle to be characterized. Note that the low-entropy area surrounding the white square indicates that the region is probably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhere. The entropy map in Figure 2 shows the same experiment at a later stage, after seven measurements have been recorded. (b) These three panels illustrate the robot's machine learning system's view of the playing field using the more accurate SSF light sensor model. Note that the entropy map reveals the circle edges to be highly informative. This is because it helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model on the left) but also the extent to which the sensor is on the edge of the circle.

Conflict of Interests

The authors have no conflict of interests to report.

Acknowledgments

This research was supported in part by the University at Albany Faculty Research Awards Program (Knuth) and the University at Albany Benevolent Research Grant Award (Malakar). The authors would like to thank Scott Frasso and Rotem Guttman for their assistance in interfacing MatLab to the LEGO Brick. They would also like to thank Phil Erner and A. J. Mesiti for their preliminary efforts on the robotic arm experiments and sensor characterization.

References

[1] K. H. Knuth, P. M. Erner, and S. Frasso, "Designing intelligent instruments," in Proceedings of the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, K. Knuth, A. Caticha, J. Center, A. Giffin, and C. Rodriguez, Eds., vol. 954, pp. 203–211, AIP, July 2007.

[2] K. H. Knuth and J. L. Center Jr., "Autonomous science platforms and question-asking machines," in Proceedings of the 2nd International Workshop on Cognitive Information Processing (CIP '10), pp. 221–226, June 2010.

[3] K. H. Knuth and J. L. Center, "Autonomous sensor placement," in Proceedings of the IEEE International Conference on Technologies for Practical Robot Applications, pp. 94–99, November 2008.

[4] K. H. Knuth, "Intelligent machines in the twenty-first century: foundations of inference and inquiry," Philosophical Transactions of the Royal Society A, vol. 361, no. 1813, pp. 2859–2873, 2003.

[5] N. K. Malakar and K. H. Knuth, "Entropy-based search algorithm for experimental design," in Proceedings of the 30th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, A. M. Djafari, J. F. Bercher, and P. Bessière, Eds., vol. 1305, pp. 157–164, AIP, July 2010.

[6] N. K. Malakar, K. H. Knuth, and D. J. Lary, "Maximum joint entropy and information-based collaboration of automated learning machines," in Proceedings of the 31st International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goyal, A. Giffin, K. H. Knuth, and E. Vrscay, Eds., vol. 1443, pp. 230–237, AIP, 2012.

[7] D. V. Lindley, "On a measure of the information provided by an experiment," The Annals of Mathematical Statistics, vol. 27, no. 4, pp. 986–1005, 1956.

[8] V. V. Fedorov, Theory of Optimal Experiments, Probability and Mathematical Statistics, Academic Press, 1972.

[9] K. Chaloner and I. Verdinelli, "Bayesian experimental design: a review," Statistical Science, vol. 10, pp. 273–304, 1995.

[10] P. Sebastiani and H. P. Wynn, "Bayesian experimental design and Shannon information," in Proceedings of the Section on Bayesian Statistical Science, pp. 176–181, 1997, http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.56.6037.

[11] P. Sebastiani and H. P. Wynn, "Maximum entropy sampling and optimal Bayesian experimental design," Journal of the Royal Statistical Society B, vol. 62, no. 1, pp. 145–157, 2000.

[12] T. J. Loredo and D. F. Chernoff, "Bayesian adaptive exploration," in Proceedings of the 23rd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, G. Erickson and Y. Zhai, Eds., pp. 330–346, AIP, August 2003.

[13] R. Fischer, "Bayesian experimental design - studies for fusion diagnostics," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 76–83, November 2004.

[14] C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Chicago, Ill, USA, 1949.

[15] J. Skilling, "Nested sampling," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 395–405, 2004.

[16] D. Sivia and J. Skilling, Data Analysis: A Bayesian Tutorial, Oxford University Press, New York, NY, USA, 2nd edition, 2006.

[17] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, "Equation of state calculations by fast computing machines," The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.

[18] W. K. Hastings, "Monte Carlo sampling methods using Markov chains and their applications," Biometrika, vol. 57, no. 1, pp. 97–109, 1970.

[19] N. K. Malakar, A. J. Mesiti, and K. H. Knuth, "The spatial sensitivity function of a light sensor," in Proceedings of the 29th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goggans and C. Y. Chan, Eds., vol. 1193 of AIP Conference Proceedings, pp. 352–359, American Institute of Physics, Oxford, Mass, USA, July 2009.


6 Journal of Sensors

40

(0 0)

(a)

39

(0 0)

(b)

35

(4 4)

(c)

32

(minus4 4)

(d)

Figure 4 Four additional symmetry-breaking albedo patterns wereemployed In all cases the sensor was placed in the 0∘ orientationat the center of the pattern indicated by (0 0) In the two lowerpatterns the center of the white square was shifted diagonally fromthe center by 4mm as indicated by the coordinates (white text) ofthe corner The recorded intensity levels are displayed in the whitealbedo area

the four orientations (see Figure 3 for an explanation) givinga total of 404 measurements using this pattern

The black-and-white boundary pattern does not providesufficient information to uniquely infer the SSF since thesensor may have a response that is symmetric about a lineoriented at 45∘ with respect to the linear boundary For thisreason we employed four additional albedo patterns consist-ing of black regions with one white quadrant as illustrated inFigure 4 which resulted in four more measurements Thesehave surface albedos defined by

119878119886(119909 119910) =

1 for 119909 gt 0mm 119910 gt 0mm0 otherwise

119878119887(119909 119910) =

1 for 119909 lt 0mm 119910 gt 0mm0 otherwise

119878119888(119909 119910) =

1 for 119909 gt 4mm 119910 gt 4mm0 otherwise

119878119889(119909 119910) =

1 for 119909 lt minus4mm 119910 gt 4mm0 otherwise

(14)

where the subscripts relate each albedo function to thepattern illustrated in Figure 4

27 Estimating SSF MoG Model Parameters In this sectionwe describe the application of Bayesian methods to estimatethe SSF MoG model parameters Keep in mind that in thispaper we are considering two distinct inference problems therobotrsquos inferences about a circle and our inferences about

the light sensor SSF Both of these problems rely on makingpredictions about measured intensities using a light sensorFor this reason many of these equations will not only looksimilar to what we have presented previously but also dependon the same functions mapping modeled albedo fields topredicted sensor responses which are collectively repre-sented using the generic symbol D It may help to keep inmind that the robot is characterizing a circle quantified bymodel parameters represented jointly by the symbol C andwe are estimating the SSF of a light sensor quantified bymodelparameters represented jointly by the symbol 120579 below Withthe exception of the evidence represented by function 119885below (which is not used by the robot in this experiment)all of the probability functions contain the model parametersin their list of arguments making it clear to which inferenceproblem they refer

Theposterior probability for the SSFMoGmodel parame-ters collectively referred to as 120579 = 120579

1 1205792 120579

119873 for amodel

consisting of119873 Gaussians is given by

Pr (120579 | D 119868) = 1119885

Pr (120579 | 119868)Pr (D | 120579 119868) (15)

where here D refers to the data collected for the SSFestimation experiment described in the previous section 119868refers to our prior information about the SSF (which is that itmay have several local peaks and falls off to zero far from thesensor) and119885 refers to the evidence119885 = Pr(D | 119868) which canbe found by

119885 = int119889120579Pr (120579 | 119868)Pr (D | 120579 119868) (16)

In our earlier discussion where Bayesrsquo theorem was used tomake inferences about circles the evidence played the roleof a normalization factor Here since we can consider MoGmodels with different numbers of Gaussians and since weintegrate over all of the possible values of the parametersthe evidence quantifies the degree to which the hypothesizedmodel order119873 supports the dataThat is the optimal numberof Gaussians to be used in the MoG model of the SSF can befound by computing the evidence

All five sets of data D described in the previous sectionwere used to compute the posterior probability These areeach indexed by the subscript 119894 where 119894 = 1 2 3 4 so that119863

119894

refers to the data collected using each of the four orienta-tions 120601

119894= 0∘

90∘

180∘

270∘

to scan the black-and-whiteboundary pattern resulting in 119873

119894= 101 measurements for

119894 = 1 2 3 4 and where the value 119894 = 5 refers to the 119873119894= 4

measurements attained using the 0∘ orientation with the setof four additional patterns

We assign uniform priors so that this is essentially amaximum likelihood calculation with the posterior beingproportional to the likelihood We assign a Student-119905 distri-bution to the likelihood which can be arrived at by assigning

Journal of Sensors 7

a Gaussian likelihood and integrating over the unknownstandard deviation 120590 [16] This can be written as

Pr (119863119894| 120579 119868) prop [

[

119873119894

sum

119895=1

(119872119894119895(119909119894119895 119910119894119895)minus119863119894(119909119894119895 119910119894119895))

2

]

]

minus(119873119894minus1)2

(17)

where 119894 denotes each of the five sets of data and 119895 denotesthe 119895th measurement of that set which was taken at position(119909119894119895 119910119894119895) The function119872

119894119895(119909 119910) represents a predicted mea-

surement value obtained from (7) using (9) with the albedofunction 119878(119909 119910) defined using the albedo pattern 119878BW(119909 119910)with orientation 120601

119894for 119894 = 1 2 3 4 and the albedo patterns

119878119886 119878119887 119878119888 119878119889for 119894 = 5 and 119895 = 1 2 3 4 respectively As such

the likelihood relies on a difference between predicted andmeasured sensor responses

The joint likelihood for the five data sets is foundby takingthe product of the likelihoods for each data set since weexpect that the standard deviations that were marginalizedover to get the Student-119905 distribution could have beendifferent for each of the five data sets as they were not allrecorded at the same time

Pr (D | 120579 119868) =5

prod

119894=1

Pr (119863119894| 120579 119868) (18)

We employed nested sampling [15 16] to explore the pos-terior probability since in addition to providing parameterestimates it is explicitly designed to perform evidence calcu-lations which we use to perform model comparison in iden-tifying the most probable number of Gaussians in the MoGmodel For each of the four MoG models (number of Gaus-sians varying from one to four) the nested sampling algo-rithm was initialized with 300 samples and iterated until thechange in consecutive log-evidence values is less than 10minus8Typically one estimates the mean parameter values by takingan average of the samples weighted by a quantity computedby the nested sampling algorithm called the logWt in [15 16]Here we simply performed a logWt-weighted average of thesampled SSF fields computed using the sampled MoG modelparameters (rather than the logWt-weighted average of theMoG model parameters themselves) so the result obtainedusing an MoG model consisting of a single Gaussian is notstrictly a single two-dimensional Gaussian distribution It isthis discretized estimated SSF fieldmatrix that is used directlyin the convolution (9) to compute the likelihood functions asmentioned earlier in the last lines of Section 25

3. Results and Discussion

In this section we present the SSF MoG light sensor model estimated from the laboratory data and evaluate its efficacy by demonstrating a significant improvement in the performance of the autonomous robotic platform.

3.1. Light Sensor SSF MoG Model. The light sensor data collected using the black-and-white boundary pattern are illustrated in Figure 5.

Figure 5: The intensity measurements $D_1, D_2, D_3, D_4$ recorded by the sensor at each of the four orientations (0°, 90°, 180°, 270°) while scanning the black-and-white boundary pattern, plotted as recorded intensity (LEGO units) versus position of the sensor (cm). Figure 3 shows the orientations of the sensor corresponding to the symbols used in this figure.

Table 1: A comparison of the tested MoG SSF models and their respective log evidence (in units of data^−408).

MoG model order | Log evidence | Number of parameters
1 Gaussian  | −665.5 ± 0.3 | 6
2 Gaussians | −674.9 ± 0.3 | 12
3 Gaussians | −671.9 ± 0.4 | 18
4 Gaussians | −706.1 ± 0.4 | 24

One can see that the intensity recorded by the sensor increases dramatically as the sensor crosses the boundary from the black region to the white region, but the change is not a step function, which reflects the finite size of the surface area integrated by the sensor. It is this effect that is to be modeled by the SSF. There is an obvious asymmetry between the 90° and 270° orientations due to the shift of the transition region. In addition, there is a significant difference in the slope of the transition region between the 0°, 180° orientations and the 90°, 270° orientations, indicating a significant difference in the width of the SSF in those directions. Note also that the minimum recorded response is not zero, since the reflectance of the black surface is not completely zero.

The nested sampling algorithm produced the mean SSF fields for each of the MoG models tested, as well as the corresponding log evidence. Table 1, which lists the log evidence computed for each MoG model order, shows that the most probable model is the one consisting of a single two-dimensional Gaussian, favored by a factor of about exp(9), which means that it is about 8000 times more probable than the MoG consisting of two Gaussians. Figure 6 shows the mean SSF fields described by the MoG models of different orders. In all cases the center of the SSF is shifted slightly above the physical center of the sensor package due to the placement of the photodiode (refer to Figure 1), as indicated by the data in Figure 5.
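The factor quoted above follows directly from the difference of the log-evidence values in Table 1. A short check, assuming equal prior probabilities for the model orders:

```python
import math

log_z = {1: -665.5, 2: -674.9, 3: -671.9, 4: -706.1}  # log evidence, Table 1

delta = log_z[1] - log_z[2]      # ~9.4 in favor of the single-Gaussian model
bayes_factor = math.exp(delta)   # on the order of exp(9), i.e. thousands
print(delta, bayes_factor)
```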

Figure 6: The mean SSF fields obtained from the four MoG models, plotted over the same 30–70 mm region, along with their corresponding log-evidence values: 1 MoG, log Z = −665.5 ± 0.3 (a); 2 MoG, log Z = −674.9 ± 0.3 (b); 3 MoG, log Z = −671.9 ± 0.4 (c); 4 MoG, log Z = −706.1 ± 0.4 (d).

In addition, as predicted, one sees that the SSF is wider along the 90°–270° axis than along the 0°–180° axis. Last, Figure 7 demonstrates that the predicted light sensor output agrees closely with the recorded data for the black-and-white boundary pattern.

In the next section we demonstrate that explicit knowledge about how the light sensor integrates light arriving from within its field of view improves the inferences one can make from its output.

3.2. Efficacy of Sensor Model. The mean SSF field obtained using a single two-dimensional Gaussian model (Figure 6(a)) was incorporated into the likelihood function used by the robot's machine learning system. Here we compare the robot's performance in locating and characterizing a circle by observing the average number of measurements necessary to characterize the circle parameters within a precision of 4 mm (which is one-half of the spacing between the holes in the LEGO Technic parts).

The three panels comprising Figure 8(a) illustrate the robot's machine learning system's view of the playing field using the naïve light sensor model. The previously obtained measurement locations are indicated by the black and white squares, whose shade indicates the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, which indicates the measurement locations that promise to provide maximal information about the circle to be characterized. The low-entropy area surrounding the white square indicates that the region is probably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhere. Note that the circles partition the plane and that each partition has a uniform entropy, so all measurement locations within a given partition, or within any other partition sharing the same entropy, stand to be equally informative.
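Under the naïve sensor model, the entropy at a candidate location reduces to the binary entropy of the fraction of sampled circles that contain it (see the discussion in Section 4). A minimal sketch of such an entropy map, using an illustrative circle representation rather than the authors' code:

```python
import numpy as np

def naive_entropy_map(grid_x, grid_y, circles):
    """Entropy (in bits) of each candidate measurement location.

    circles : sequence of (xc, yc, r) tuples, the sampled circles.
    Under the naïve model the predicted reading takes only two values
    (inside vs. outside), so the entropy at a location is the binary
    entropy of the fraction of sampled circles containing that location.
    """
    X, Y = np.meshgrid(grid_x, grid_y)
    inside = np.zeros_like(X, dtype=float)
    for xc, yc, r in circles:
        inside += (X - xc) ** 2 + (Y - yc) ** 2 <= r ** 2
    p = inside / len(circles)
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))
    return np.nan_to_num(h)   # zero entropy where all sampled circles agree
```

Locations where the 50 sampled circles split evenly (25 inside, 25 outside) attain the maximum of one bit, which produces the uniform-entropy partitions visible in Figure 8(a).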

Figure 7: A comparison of the observed data (red) with the predictions (black) made by the SSF field estimated using the single two-dimensional Gaussian MoG model, for each of the four scans of the black-and-white boundary pattern, plotted as intensity (LEGO units) versus position of the center of the sensor (cm).

Here it is the fact that the shape is known to be a circle that is driving the likelihood function.

In contrast, the three panels comprising Figure 8(b) illustrate the playing field using the more accurate SSF model. Here one can see that the entropy is higher along the edges of the sampled circles. This indicates that the circle edges promise to provide more information than the centers of the partitioned regions. This is because the SSF model enables one to detect not only whether the light sensor is situated above the circle's edge but also how much of the SSF overlaps with the white circle. That is, it helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model) but also the extent to which the sensor is on the edge of the circle. The additional information provided about the functioning of the light sensor translates directly into additional information about the albedo that gives rise to the sensor output.

This additional information can be quantified by observing how many measurements the robot is required to take to obtain estimates of the circle parameters within the same precision for each light sensor model. Our experiments revealed that, on average, it takes 19 ± 1.8 measurements using the naïve light sensor model, compared to an average of 11.6 ± 3.9 measurements for the more accurate SSF light sensor model.

4. Conclusion

The quality of the inferences one makes from a sensor depends not only on the quality of the data returned by the sensor but also on the information one possesses about the sensor's performance.

In this paper we have demonstrated via a case study how more precisely modeling a sensor's performance can improve the inferences one can make from its data. In this case, we demonstrated that by more precisely modeling its light sensor, a robot can reduce the number of measurements needed to make the same inferences by roughly 40% (on average, from about 19 to 11.6 measurements).

This paper demonstrates how a machine learning system that employs Bayesian inference (and inquiry) relies on the likelihood function of the data given the hypothesized model parameters. Rather than simply representing a noise model, the likelihood function quantifies the probability that a hypothesized situation could have given rise to the recorded data. By incorporating more information about the sensors (or equipment) used to record the data, one naturally incorporates this information into the posterior probability, from which one's inferences result.

This is made even more apparent by a careful study of the experimental design problem that this particular robotic system is designed to explore. For example, it is easy to show that, using the naïve light sensor model, the entropy distribution for a proposed measurement location depends solely on the number of sampled circles for which the location is in the interior of the circle and the number of sampled circles for which the location is exterior to the circle. Given that we represented the posterior probability by sampling 50 circles, the maximum entropy occurs when the proposed measurement location is inside 25 circles (and outside 25 circles). As the robot's parameter estimates converge, one can show that the system is simply performing a binary search by asking "yes" or "no" questions, which implies that each measurement results in one bit of information. However, in the case where the robot employs an SSF model of the light sensor, the question the robot is essentially asking is more detailed: "to what degree does the circle overlap the light sensor's SSF?" The answer to such a question tends to provide more information, which significantly improves system performance. One can estimate the information gain achieved by employing the SSF model. Consider that the analysis with the naïve model reveals that estimating the circle's position and radius with a precision of 4 mm, given the prior information about the circle and the playing field, requires 25 bits of information. The experiment using the SSF model requires on average 11.6 measurements, which implies that, on average, each measurement obtained using the SSF model provides about 25/11.6 ≈ 2.15 bits of information. One must keep in mind, however, that this is due to the fact that the SSF model is being used not only to infer the circle parameters from the data but also to select the measurement locations.

Because the method presented here is based on a very general inferential framework, these methods can easily be applied to other types of sensors and equipment in a wide variety of situations. If one has designed a robotic machine learning system to rely on likelihood functions, then sensor models can be incorporated in a more or less plug-and-play fashion. This not only promises to improve the quality of robotic systems forced to rely on lower-quality sensors but also opens the possibility of calibration on the fly by updating sensor models as data are continually collected.

Figure 8: (a) These three panels illustrate the robot's machine learning system's view of the playing field (axes in LEGO distance units) using the naïve light sensor model as the system progresses through the first three measurements. The previously obtained measurement locations are indicated by the black-and-white squares, whose shade indicates the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, which indicates the measurement locations that promise to provide maximal information about the circle to be characterized. Note that the low-entropy area surrounding the white square indicates that the region is probably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhere. The entropy map in Figure 2 shows the same experiment at a later stage, after seven measurements have been recorded. (b) These three panels illustrate the robot's machine learning system's view of the playing field using the more accurate SSF light sensor model. Note that the entropy map reveals the circle edges to be highly informative. This is because the SSF model helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model on the left) but also the extent to which the sensor is on the edge of the circle.

Conflict of Interests

The authors have no conflict of interests to report.

Acknowledgments

This research was supported in part by the University at Albany Faculty Research Awards Program (Knuth) and the University at Albany Benevolent Research Grant Award (Malakar). The authors would like to thank Scott Frasso and Rotem Guttman for their assistance in interfacing MATLAB to the LEGO Brick. They would also like to thank Phil Erner and A. J. Mesiti for their preliminary efforts on the robotic arm experiments and sensor characterization.

References

[1] K. H. Knuth, P. M. Erner, and S. Frasso, "Designing intelligent instruments," in Proceedings of the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, K. Knuth, A. Caticha, J. Center, A. Giffin, and C. Rodriguez, Eds., vol. 954, pp. 203–211, AIP, July 2007.
[2] K. H. Knuth and J. L. Center Jr., "Autonomous science platforms and question-asking machines," in Proceedings of the 2nd International Workshop on Cognitive Information Processing (CIP '10), pp. 221–226, June 2010.
[3] K. H. Knuth and J. L. Center, "Autonomous sensor placement," in Proceedings of the IEEE International Conference on Technologies for Practical Robot Applications, pp. 94–99, November 2008.
[4] K. H. Knuth, "Intelligent machines in the twenty-first century: foundations of inference and inquiry," Philosophical Transactions of the Royal Society A, vol. 361, no. 1813, pp. 2859–2873, 2003.
[5] N. K. Malakar and K. H. Knuth, "Entropy-based search algorithm for experimental design," in Proceedings of the 30th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, A. M. Djafari, J. F. Bercher, and P. Bessiere, Eds., vol. 1305, pp. 157–164, AIP, July 2010.
[6] N. K. Malakar, K. H. Knuth, and D. J. Lary, "Maximum joint entropy and information-based collaboration of automated learning machines," in Proceedings of the 31st International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goyal, A. Giffin, K. H. Knuth, and E. Vrscay, Eds., vol. 1443, pp. 230–237, AIP, 2012.
[7] D. V. Lindley, "On a measure of the information provided by an experiment," The Annals of Mathematical Statistics, vol. 27, no. 4, pp. 986–1005, 1956.
[8] V. V. Fedorov, Theory of Optimal Experiments, Probability and Mathematical Statistics, Academic Press, 1972.
[9] K. Chaloner and I. Verdinelli, "Bayesian experimental design: a review," Statistical Science, vol. 10, pp. 273–304, 1995.
[10] P. Sebastiani and H. P. Wynn, "Bayesian experimental design and Shannon information," in 1997 Proceedings of the Section on Bayesian Statistical Science, pp. 176–181, 1997, http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.56.6037.
[11] P. Sebastiani and H. P. Wynn, "Maximum entropy sampling and optimal Bayesian experimental design," Journal of the Royal Statistical Society B, vol. 62, no. 1, pp. 145–157, 2000.
[12] T. J. Loredo and D. F. Chernoff, "Bayesian adaptive exploration," in Proceedings of the 23rd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, G. Erickson and Y. Zhai, Eds., pp. 330–346, AIP, August 2003.
[13] R. Fischer, "Bayesian experimental design - studies for fusion diagnostics," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 76–83, November 2004.
[14] C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Chicago, Ill, USA, 1949.
[15] J. Skilling, "Nested sampling," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 395–405, 2004.
[16] D. Sivia and J. Skilling, Data Analysis: A Bayesian Tutorial, Oxford University Press, New York, NY, USA, 2nd edition, 2006.
[17] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, "Equation of state calculations by fast computing machines," The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.
[18] W. K. Hastings, "Monte Carlo sampling methods using Markov chains and their applications," Biometrika, vol. 57, no. 1, pp. 97–109, 1970.
[19] N. K. Malakar, A. J. Mesiti, and K. H. Knuth, "The spatial sensitivity function of a light sensor," in Proceedings of the 29th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goggans and C. Y. Chan, Eds., vol. 1193 of AIP Conference Proceedings, pp. 352–359, American Institute of Physics, Oxford, Mass, USA, July 2009.

Submit your manuscripts athttpwwwhindawicom

Control Scienceand Engineering

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

International Journal of

RotatingMachinery

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2013Part I

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

DistributedSensor Networks

International Journal of

ISRN Signal Processing

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Mechanical Engineering

Advances in

Modelling amp Simulation in EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Advances inOptoElectronics

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2013

ISRN Sensor Networks

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

VLSI Design

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawi Publishing Corporation httpwwwhindawicom Volume 2013

The Scientific World Journal

ISRN Robotics

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

International Journal of

Antennas andPropagation

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

ISRN Electronics

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

thinspJournalthinspofthinsp

Sensors

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Active and Passive Electronic Components

Chemical EngineeringInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Electrical and Computer Engineering

Journal of

ISRN Civil Engineering

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Advances inAcoustics ampVibration

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Journal of Sensors 7

a Gaussian likelihood and integrating over the unknownstandard deviation 120590 [16] This can be written as

Pr (119863119894| 120579 119868) prop [

[

119873119894

sum

119895=1

(119872119894119895(119909119894119895 119910119894119895)minus119863119894(119909119894119895 119910119894119895))

2

]

]

minus(119873119894minus1)2

(17)

where 119894 denotes each of the five sets of data and 119895 denotesthe 119895th measurement of that set which was taken at position(119909119894119895 119910119894119895) The function119872

119894119895(119909 119910) represents a predicted mea-

surement value obtained from (7) using (9) with the albedofunction 119878(119909 119910) defined using the albedo pattern 119878BW(119909 119910)with orientation 120601

119894for 119894 = 1 2 3 4 and the albedo patterns

119878119886 119878119887 119878119888 119878119889for 119894 = 5 and 119895 = 1 2 3 4 respectively As such

the likelihood relies on a difference between predicted andmeasured sensor responses

The joint likelihood for the five data sets is foundby takingthe product of the likelihoods for each data set since weexpect that the standard deviations that were marginalizedover to get the Student-119905 distribution could have beendifferent for each of the five data sets as they were not allrecorded at the same time

Pr (D | 120579 119868) =5

prod

119894=1

Pr (119863119894| 120579 119868) (18)

We employed nested sampling [15 16] to explore the pos-terior probability since in addition to providing parameterestimates it is explicitly designed to perform evidence calcu-lations which we use to perform model comparison in iden-tifying the most probable number of Gaussians in the MoGmodel For each of the four MoG models (number of Gaus-sians varying from one to four) the nested sampling algo-rithm was initialized with 300 samples and iterated until thechange in consecutive log-evidence values is less than 10minus8Typically one estimates the mean parameter values by takingan average of the samples weighted by a quantity computedby the nested sampling algorithm called the logWt in [15 16]Here we simply performed a logWt-weighted average of thesampled SSF fields computed using the sampled MoG modelparameters (rather than the logWt-weighted average of theMoG model parameters themselves) so the result obtainedusing an MoG model consisting of a single Gaussian is notstrictly a single two-dimensional Gaussian distribution It isthis discretized estimated SSF fieldmatrix that is used directlyin the convolution (9) to compute the likelihood functions asmentioned earlier in the last lines of Section 25

3 Results and Discussion

In this section we present the SSF MoG light sensor modelestimated from the laboratory data and evaluate its efficacy bydemonstrating a significant improvement of the performancein the autonomous robotic platform

31 Light Sensor SSF MoG Model The light sensor data col-lected using the black-and-white boundary pattern are illus-trated in Figure 5 One can see that the intensity recorded

0∘

90∘

180∘

270∘

Sensor orientation

50

45

40

35

30

25

20

minus5 minus4 minus3 minus2 minus1 0 1 2 3 4 5Position of the sensor (cm)

Reco

rded

inte

nsity

(LEG

O u

nits)

Figure 5 This figure illustrates the intensity measurements1198631 1198632 1198633 1198634from the four sensor orientations recorded from

the sensor using the black-and-white boundary pattern Figure 3shows the orientations of the sensor corresponding to the symbolsused in this figure

Table 1 A comparison of the tested MoG SSF models and theirrespective log evidence (in units of dataminus408)

MoG model order Log evidence Number of parameters1 Gaussian minus6655 plusmn 03 62 Gaussian minus6749 plusmn 03 123 Gaussian minus6719 plusmn 04 184 Gaussian minus7061 plusmn 04 24

by the sensor increases dramatically as the sensor crosses theboundary from the black region to the white region but thechange is not a step function which indicates the finite size ofthe surface area integrated by the sensor It is this effect that isto bemodeled by the SSF functionThere is an obvious asym-metry between the 90∘ and 270∘ orientations due to the shiftof the transition region In addition there is a significant dif-ference in slope of the transition region between the 0∘ 180∘orientations and the 90∘ 270∘ orientations indicating a sig-nificant difference in the width of the SSF in those directionsNote also that the minimum recorded response is not zero asthe reflectance of the black surface is not completely zero

The nested sampling algorithm produced the mean SSFfields for each of the MoG models tested as well as thecorresponding log evidence Table 1 which lists the log evi-dence computed for each MoG model order illustrates thatthe most probable model was obtained from the single two-dimensional Gaussian models by a factor of about exp(9)which means that it is about 8000 times more probable thanthe MoG consisting of two Gaussians Figure 6 shows themean SSF fields described by the MoG models of differentorders In all cases the center of the SSF is shifted slightlyabove the physical center of the sensor package due to theplacement of the photodiode (refer to Figure 1) as indicatedby the data in Figure 5 In addition as predicted one sees

8 Journal of Sensors

30

35

40

45

50

55

60

65

7030 35 40 45 50 55 60 65 70

log Z = minus6655 plusmn 03

1MoG

(mm)

(mm

)

(a)

30

35

40

45

50

55

60

65

7030 35 40 45 50 55 60 65 70

log Z = minus6749 plusmn 03

2 MoG

(mm)

(mm

)

(b)

30

35

40

45

50

55

60

65

7030 35 40 45 50 55 60 65 70

log Z = minus6719 plusmn 04

3MoG

(mm)

(mm

)

(c)

30

35

40

45

50

55

60

65

7030 35 40 45 50 55 60 65 70

log Z = minus7061 plusmn 04

4MoG

(mm)

(mm

)

(d)

Figure 6 This figure illustrates the SSF obtained from the four MoG models along with their corresponding log-evidence values

that the SSF is wider along the 90∘ndash270∘ axis than along the0∘ndash180∘ axis Last Figure 7 demonstrates that the predictedlight sensor output shows excellent similarity to the recordeddata for the black-and-white boundary pattern

In the next section we demonstrate that explicit knowl-edge about how the light sensor integrates light arriving fromwithin its field-of-view improves the inferences one canmakefrom its output

32 Efficacy of Sensor Model The mean SSF field obtainedusing a single two-dimensional Gaussianmodel (Figure 6(a))was incorporated into the likelihood function used by therobotrsquos machine learning system Here we compare therobotrsquos performance in locating and characterizing a circle byobserving the average number of measurements necessary tocharacterize the circle parameters within a precision of 4mm(which is one-half of the spacing between the holes in theLEGO technic parts)

The three panels comprising Figure 8(a) illustrate therobotrsquos machine learning systemrsquos view of the playing fieldusing the naıve light sensor model The previously obtainedmeasurement locations used to obtain light sensor data areindicated by the black and white squares indicating therelative intensity with respect to the naıve light sensor modelThe next selected measurement location is indicated by thegreen square The blue circles represent the 50 hypothesizedcircles sampled from the posterior probability The shadedbackground represents the entropy map which indicates themeasurement locations that promise to provide maximalinformation about the circle to be characterized The lowentropy area surrounding the white square indicates that theregion is probably inside the white circle (not shown) andthat measurements made there will not be as informative asmeasurementsmade elsewhere Note that the circles partitionthe plane and that each partition has a uniform entropy Allmeasurement locations within that partition or any otherpartition sharing the same entropy all stand to be equally

Journal of Sensors 9

50

55

45

40

35

30

25

20

minus3 minus2 minus1 0 1 2 3Position of the center of the sensor (cm)

Inte

nsity

(LEG

O u

nits)

Intensity measurement compared for 1MoG

Prediction1Prediction2Prediction3Prediction4

Data1Data2Data3Data4

Figure 7 A comparison of the observed data (red) with predictions(black) made by the SSF field estimated using the single two-dimensional Gaussian MoG model

informative Here it is the fact that the shape is known to bea circle that is driving the likelihood function

In contrast the three panels comprising Figure 8(b)illustrate the playing field using themore accurate SSFmodelHere one can see that the entropy is higher along the edgesof the sampled circles This indicates that the circle edgespromise to provide more information than the centers ofthe partitioned regions This is because that the SSF modelenables one to detect not only whether the light sensor issituated above the circles edge but also how much of the SSFoverlaps with the white circle That is it helps to identify notonlywhether the sensor is inside the circle (as is accomplishedusing the naıve light sensor model) but also the extent towhich the sensor is on the edge of the circle The additionalinformation provided about the functioning of the lightsensor translates directly into additional information aboutthe albedo that results in the sensor output

This additional information can be quantified by observ-ing how many measurements the robot is required to taketo obtain estimates of the circle parameters within thesame precision in the cases of each light sensor modelOur experiments revealed that on average it takes 19 plusmn 18measurements using the naıve light sensor model comparedto an average of 116plusmn 39measurements for themore accurateSSF light sensor model

4 Conclusion

Thequality of the inferences onemakes froma sensor dependnot only on the quality of the data returned by the sensorbut also on the information one possesses about the sensorrsquos

performance In this paper we have demonstrated via a casestudy how more precisely modeling a sensorrsquos performancecan improve the inferences one can make from its data Inthis case we demonstrated that one can achieve about 18reduction in the number of measurements needed by a robotto make the same inferences by more precisely modeling itslight sensor

This paper demonstrates how a machine learning systemthat employs Bayesian inference (and inquiry) relies onthe likelihood function of the data given the hypothesizedmodel parameters Rather than simply representing a noisemodel the likelihood function quantifies the probability thata hypothesized situation could have given rise to the recordeddata By incorporating more information about the sensors(or equipment) used to record the data one naturally isincorporating this information into the posterior probabilitywhich results in onersquos inferences

This is made even more apparent by a careful study ofthe experimental design problem that this particular roboticsystem is designed to explore For example it is easy toshow that by using the naıve light sensor model the entropydistribution for a proposed measurement location dependssolely on the number of sampled circles forwhich the locationis in the interior of the circle and the number of sampledcircles for which the location is exterior to the circle Giventhat we represented the posterior probability by sampling50 circles the maximum entropy occurs when the proposedmeasurement location is inside 25 circles (and outside 25circles) As the robotrsquos parameter estimates converge one canshow that the system is simply performing a binary searchby asking ldquoyesrdquo or ldquonordquo questions which implies that eachmeasurement results in one bit of information However inthe case where the robot employs an SSF model of the lightsensor the question the robot is essentially asking is moredetailed ldquoto what degree does the circle overlap the lightsensorrsquos SSFrdquoThe answer to such a question tends to providemore information which significantly improves system per-formance One can estimate the information gain achievedby employing the SSF model Consider that the naive modelreveals that estimating the circlersquos position and radius with aprecision of 4mmgiven the prior information about the circleand the playing field requires 25 bits of information Theexperiment using the SSFmodel requires on average 116mea-surements which implies that on average each measurementobtained using the SSF model provides about 25116 = 215bits of information One must keep in mind however thatthis is due to the fact that the SSF model is being used notonly to infer the circle parameters from the data but also toselect the measurement locations

Because the method presented here is based on a verygeneral inferential framework these methods can easily beapplied to other types of sensors and equipment in a widevariety of situations If one has designed a robotic machinelearning system to rely on likelihood functions then sensormodels can be incorporated in more or less a plug-and-playfashion This not only promises to improve the quality ofrobotic systems forced to rely on lower quality sensors but italso opens the possibility for calibration on the fly by updatingsensor models as data are continually collected

10 Journal of Sensors

60

50

40

30

20

10

060

50

40

30

20

10

060

50

40

30

20

10

0minus60 minus40 minus20 0 20 40 60

LEGO distance units

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

(a)

60

50

40

30

20

10

060

50

40

30

20

10

060

50

40

30

20

10

0minus60 minus40 minus20 0 20 40 60

LEGO distance units

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

(b)

Figure 8 (a)These three panels illustrate the robotrsquos machine learning systemrsquos view of the playing field using the naıve light sensor model asthe system progresses through the first three measurements The previously obtained measurement locations used to obtain light sensor dataare indicated by the black-and-white squares indicating the relative intensity with respect to the naıve light sensor model The next selectedmeasurement location is indicated by the green square The blue circles represent the 50 hypothesized circles sampled from the posteriorprobabilityThe shaded background represents the entropymap which indicates themeasurement locations that promise to providemaximalinformation about the circle to be characterized Note that the low entropy area surrounding the white square indicates that the region isprobably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhereThe entropy map in Figure 2 shows the same experiment at a later stage after seven measurements have been recorded (b)These three panelsillustrate the robotrsquos machine learning systemrsquos view of the playing field using the more accurate SSF light sensor model Note that the entropymap reveals the circle edges to be highly informative This is because it helps to identify not only whether the sensor is inside the circle (as isaccomplished using the naıve light sensor model on the left) but also the extent to which the sensor is on the edge of the circle

Conflict of Interests

The authors have no conflict of interests to report

Acknowledgments

This research was supported in part by the University atAlbany Faculty Research Awards Program (Knuth) and theUniversity at Albany Benevolent Research Grant Award(Malakar) The authors would like to thank Scott Frasso andRotem Guttman for their assistance in interfacing MatLab tothe LEGOBrickTheywould also like to thank Phil Erner andA J Mesiti for their preliminary efforts on the robotic armexperiments and sensor characterization

References

[1] K H Knuth P M Erner and S Frasso ldquoDesigning intelligentinstrumentsrdquo in Proceedings of the 27th International Workshopon Bayesian Inference andMaximumEntropyMethods in Scienceand Engineering K Knuth A Caticha J Center A Giffin andC Rodriguez Eds vol 954 pp 203ndash211 AIP July 2007

[2] K H Knuth and J L Center Jr ldquoAutonomous science platformsand question-asking machinesrdquo in Proceedings of the 2nd Inter-national Workshop on Cognitive Information Processing (CIPrsquo10) pp 221ndash226 June 2010

[3] K H Knuth and J L Center ldquoAutonomous sensor placementrdquoin Proceedings of the IEEE International Conference on Technolo-gies for Practical Robot Applications pp 94ndash99 November 2008

Journal of Sensors 11

[4] K H Knuth ldquoIntelligent machines in the twenty-first centuryfoundations of inference and inquiryrdquo Philosophical Transac-tions of the Royal Society A vol 361 no 1813 pp 2859ndash28732003

[5] N K Malakar and K H Knuth ldquoEntropy-based search algo-rithm for experimental designrdquo in Proceedings of the 30thInternational Workshop on Bayesian Inference and MaximumEntropy Methods in Science and Engineering A M Djafari JF Bercher and P Bessiere Eds vol 1305 pp 157ndash164 AIP July2010

[6] N K Malakar K H Knuth and D J Lary ldquoMaximum jointentropy and information-based collaboration of automatedlearning machinesrdquo in Proceedings of the 31st InternationalWorkshop on Bayesian Inference and Maximum Entropy Meth-ods in Science and Engineering P Goyal A Giffin K H Knuthand E Vrscay Eds vol 1443 pp 230ndash237 AIP 2012

[7] D V Lindley ldquoOn a measure of the information provided by anexperimentrdquo The Annals of Mathematical Statistics vol 27 no4 pp 986ndash1005 1956

[8] V V Fedorov Theory of Optimal Experiments Probability andMathematical Statistics Academic Press 1972

[9] K Chaloner and I Verdinelli ldquoBayesian experimental design areviewrdquo Statistical Science vol 10 pp 273ndash304 1995

[10] P Sebastiani and H P Wynn ldquoBayesian experimental designand shannon informationrdquo in 1997 Proceedings of the Section onBayesian Statistical Science pp 176ndash181 1997 httpciteseerxistpsueduviewdocsummarydoi=1011566037

[11] P Sebastiani and H P Wynn ldquoMaximum entropy samplingand optimal Bayesian experimental designrdquo Journal of the RoyalStatistical Society B vol 62 no 1 pp 145ndash157 2000

[12] T J Loredo and D F Cherno ldquoBayesian adaptive explorationrdquoin Proceedings of the 23rd International Workshop on BayesianInference and Maximum Entropy Methods in Science and Engi-neering G Erickson and Y Zhai Eds pp 330ndash346 AIP August2003

[13] R Fischer ldquoBayesian experimental designmdashstudies for fusiondiagnosticsrdquo in Proceedings of the 24th International Workshopon Bayesian Inference andMaximumEntropyMethods in Scienceand Engineering vol 735 pp 76ndash83 November 2004

[14] C E Shannon and W Weaver The Mathematical Theory ofCommunication University of Illinois Press Chicago Ill USA1949

[15] J Skilling ldquoNested samplingrdquo in Proceedings of the 24th Inter-nationalWorkshop onBayesian Inference andMaximumEntropyMethods in Science and Engineering vol 735 pp 395ndash405 2004

[16] D Sivia and J Skilling Data Analysis A Bayesian TutorialOxford University Press New York NY USA 2nd edition2006

[17] N Metropolis A W Rosenbluth M N Rosenbluth A HTeller and E Teller ldquoEquation of state calculations by fastcomputing machinesrdquo The Journal of Chemical Physics vol 21no 6 pp 1087ndash1092 1953

[18] W K Hastings ldquoMonte carlo sampling methods using Markovchains and their applicationsrdquo Biometrika vol 57 no 1 pp 97ndash109 1970

[19] N K Malakar A J Mesiti and K H Knuth ldquoThe spatialsensitivity function of a light sensorrdquo in Proceedings of the 29thInternational Workshop on Bayesian Inference and MaximumEntropy Methods in Science and Engineering P Goggans andC Y Chan Eds vol 1193 of AIP Conference Proceedings pp352ndash359American Institute of PhysicsOxfordMassUSA July2009

Submit your manuscripts athttpwwwhindawicom

Control Scienceand Engineering

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

International Journal of

RotatingMachinery

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2013Part I

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

DistributedSensor Networks

International Journal of

ISRN Signal Processing

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Mechanical Engineering

Advances in

Modelling amp Simulation in EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Advances inOptoElectronics

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2013

ISRN Sensor Networks

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

VLSI Design

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawi Publishing Corporation httpwwwhindawicom Volume 2013

The Scientific World Journal

ISRN Robotics

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

International Journal of

Antennas andPropagation

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

ISRN Electronics

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

thinspJournalthinspofthinsp

Sensors

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Active and Passive Electronic Components

Chemical EngineeringInternational Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Electrical and Computer Engineering

Journal of

ISRN Civil Engineering

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

Advances inAcoustics ampVibration

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2013

8 Journal of Sensors

30

35

40

45

50

55

60

65

7030 35 40 45 50 55 60 65 70

log Z = minus6655 plusmn 03

1MoG

(mm)

(mm

)

(a)

30

35

40

45

50

55

60

65

7030 35 40 45 50 55 60 65 70

log Z = minus6749 plusmn 03

2 MoG

(mm)

(mm

)

(b)

30

35

40

45

50

55

60

65

7030 35 40 45 50 55 60 65 70

log Z = minus6719 plusmn 04

3MoG

(mm)

(mm

)

(c)

30

35

40

45

50

55

60

65

7030 35 40 45 50 55 60 65 70

log Z = minus7061 plusmn 04

4MoG

(mm)

(mm

)

(d)

Figure 6 This figure illustrates the SSF obtained from the four MoG models along with their corresponding log-evidence values

that the SSF is wider along the 90∘ndash270∘ axis than along the0∘ndash180∘ axis Last Figure 7 demonstrates that the predictedlight sensor output shows excellent similarity to the recordeddata for the black-and-white boundary pattern

In the next section we demonstrate that explicit knowl-edge about how the light sensor integrates light arriving fromwithin its field-of-view improves the inferences one canmakefrom its output

32 Efficacy of Sensor Model The mean SSF field obtainedusing a single two-dimensional Gaussianmodel (Figure 6(a))was incorporated into the likelihood function used by therobotrsquos machine learning system Here we compare therobotrsquos performance in locating and characterizing a circle byobserving the average number of measurements necessary tocharacterize the circle parameters within a precision of 4mm(which is one-half of the spacing between the holes in theLEGO technic parts)

The three panels comprising Figure 8(a) illustrate therobotrsquos machine learning systemrsquos view of the playing fieldusing the naıve light sensor model The previously obtainedmeasurement locations used to obtain light sensor data areindicated by the black and white squares indicating therelative intensity with respect to the naıve light sensor modelThe next selected measurement location is indicated by thegreen square The blue circles represent the 50 hypothesizedcircles sampled from the posterior probability The shadedbackground represents the entropy map which indicates themeasurement locations that promise to provide maximalinformation about the circle to be characterized The lowentropy area surrounding the white square indicates that theregion is probably inside the white circle (not shown) andthat measurements made there will not be as informative asmeasurementsmade elsewhere Note that the circles partitionthe plane and that each partition has a uniform entropy Allmeasurement locations within that partition or any otherpartition sharing the same entropy all stand to be equally

Journal of Sensors 9

50

55

45

40

35

30

25

20

minus3 minus2 minus1 0 1 2 3Position of the center of the sensor (cm)

Inte

nsity

(LEG

O u

nits)

Intensity measurement compared for 1MoG

Prediction1Prediction2Prediction3Prediction4

Data1Data2Data3Data4

Figure 7 A comparison of the observed data (red) with predictions(black) made by the SSF field estimated using the single two-dimensional Gaussian MoG model

informative Here it is the fact that the shape is known to bea circle that is driving the likelihood function

In contrast the three panels comprising Figure 8(b)illustrate the playing field using themore accurate SSFmodelHere one can see that the entropy is higher along the edgesof the sampled circles This indicates that the circle edgespromise to provide more information than the centers ofthe partitioned regions This is because that the SSF modelenables one to detect not only whether the light sensor issituated above the circles edge but also how much of the SSFoverlaps with the white circle That is it helps to identify notonlywhether the sensor is inside the circle (as is accomplishedusing the naıve light sensor model) but also the extent towhich the sensor is on the edge of the circle The additionalinformation provided about the functioning of the lightsensor translates directly into additional information aboutthe albedo that results in the sensor output

This additional information can be quantified by observ-ing how many measurements the robot is required to taketo obtain estimates of the circle parameters within thesame precision in the cases of each light sensor modelOur experiments revealed that on average it takes 19 plusmn 18measurements using the naıve light sensor model comparedto an average of 116plusmn 39measurements for themore accurateSSF light sensor model

4 Conclusion

Thequality of the inferences onemakes froma sensor dependnot only on the quality of the data returned by the sensorbut also on the information one possesses about the sensorrsquos

performance In this paper we have demonstrated via a casestudy how more precisely modeling a sensorrsquos performancecan improve the inferences one can make from its data Inthis case we demonstrated that one can achieve about 18reduction in the number of measurements needed by a robotto make the same inferences by more precisely modeling itslight sensor

This paper demonstrates how a machine learning systemthat employs Bayesian inference (and inquiry) relies onthe likelihood function of the data given the hypothesizedmodel parameters Rather than simply representing a noisemodel the likelihood function quantifies the probability thata hypothesized situation could have given rise to the recordeddata By incorporating more information about the sensors(or equipment) used to record the data one naturally isincorporating this information into the posterior probabilitywhich results in onersquos inferences

This is made even more apparent by a careful study ofthe experimental design problem that this particular roboticsystem is designed to explore For example it is easy toshow that by using the naıve light sensor model the entropydistribution for a proposed measurement location dependssolely on the number of sampled circles forwhich the locationis in the interior of the circle and the number of sampledcircles for which the location is exterior to the circle Giventhat we represented the posterior probability by sampling50 circles the maximum entropy occurs when the proposedmeasurement location is inside 25 circles (and outside 25circles) As the robotrsquos parameter estimates converge one canshow that the system is simply performing a binary searchby asking ldquoyesrdquo or ldquonordquo questions which implies that eachmeasurement results in one bit of information However inthe case where the robot employs an SSF model of the lightsensor the question the robot is essentially asking is moredetailed ldquoto what degree does the circle overlap the lightsensorrsquos SSFrdquoThe answer to such a question tends to providemore information which significantly improves system per-formance One can estimate the information gain achievedby employing the SSF model Consider that the naive modelreveals that estimating the circlersquos position and radius with aprecision of 4mmgiven the prior information about the circleand the playing field requires 25 bits of information Theexperiment using the SSFmodel requires on average 116mea-surements which implies that on average each measurementobtained using the SSF model provides about 25116 = 215bits of information One must keep in mind however thatthis is due to the fact that the SSF model is being used notonly to infer the circle parameters from the data but also toselect the measurement locations

Because the method presented here is based on a verygeneral inferential framework these methods can easily beapplied to other types of sensors and equipment in a widevariety of situations If one has designed a robotic machinelearning system to rely on likelihood functions then sensormodels can be incorporated in more or less a plug-and-playfashion This not only promises to improve the quality ofrobotic systems forced to rely on lower quality sensors but italso opens the possibility for calibration on the fly by updatingsensor models as data are continually collected

10 Journal of Sensors

60

50

40

30

20

10

060

50

40

30

20

10

060

50

40

30

20

10

0minus60 minus40 minus20 0 20 40 60

LEGO distance units

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

(a)

60

50

40

30

20

10

060

50

40

30

20

10

060

50

40

30

20

10

0minus60 minus40 minus20 0 20 40 60

LEGO distance units

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

LEG

O d

istan

ce u

nits

(b)

Figure 8 (a)These three panels illustrate the robotrsquos machine learning systemrsquos view of the playing field using the naıve light sensor model asthe system progresses through the first three measurements The previously obtained measurement locations used to obtain light sensor dataare indicated by the black-and-white squares indicating the relative intensity with respect to the naıve light sensor model The next selectedmeasurement location is indicated by the green square The blue circles represent the 50 hypothesized circles sampled from the posteriorprobabilityThe shaded background represents the entropymap which indicates themeasurement locations that promise to providemaximalinformation about the circle to be characterized Note that the low entropy area surrounding the white square indicates that the region isprobably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhereThe entropy map in Figure 2 shows the same experiment at a later stage after seven measurements have been recorded (b)These three panelsillustrate the robotrsquos machine learning systemrsquos view of the playing field using the more accurate SSF light sensor model Note that the entropymap reveals the circle edges to be highly informative This is because it helps to identify not only whether the sensor is inside the circle (as isaccomplished using the naıve light sensor model on the left) but also the extent to which the sensor is on the edge of the circle

Conflict of Interests

The authors have no conflict of interests to report

Acknowledgments

This research was supported in part by the University atAlbany Faculty Research Awards Program (Knuth) and theUniversity at Albany Benevolent Research Grant Award(Malakar) The authors would like to thank Scott Frasso andRotem Guttman for their assistance in interfacing MatLab tothe LEGOBrickTheywould also like to thank Phil Erner andA J Mesiti for their preliminary efforts on the robotic armexperiments and sensor characterization

References

[1] K H Knuth P M Erner and S Frasso ldquoDesigning intelligentinstrumentsrdquo in Proceedings of the 27th International Workshopon Bayesian Inference andMaximumEntropyMethods in Scienceand Engineering K Knuth A Caticha J Center A Giffin andC Rodriguez Eds vol 954 pp 203ndash211 AIP July 2007

[2] K H Knuth and J L Center Jr ldquoAutonomous science platformsand question-asking machinesrdquo in Proceedings of the 2nd Inter-national Workshop on Cognitive Information Processing (CIPrsquo10) pp 221ndash226 June 2010

[3] K H Knuth and J L Center ldquoAutonomous sensor placementrdquoin Proceedings of the IEEE International Conference on Technolo-gies for Practical Robot Applications pp 94ndash99 November 2008

Journal of Sensors 11

[4] K H Knuth ldquoIntelligent machines in the twenty-first centuryfoundations of inference and inquiryrdquo Philosophical Transac-tions of the Royal Society A vol 361 no 1813 pp 2859ndash28732003

[5] N K Malakar and K H Knuth ldquoEntropy-based search algo-rithm for experimental designrdquo in Proceedings of the 30thInternational Workshop on Bayesian Inference and MaximumEntropy Methods in Science and Engineering A M Djafari JF Bercher and P Bessiere Eds vol 1305 pp 157ndash164 AIP July2010

[6] N K Malakar K H Knuth and D J Lary ldquoMaximum jointentropy and information-based collaboration of automatedlearning machinesrdquo in Proceedings of the 31st InternationalWorkshop on Bayesian Inference and Maximum Entropy Meth-ods in Science and Engineering P Goyal A Giffin K H Knuthand E Vrscay Eds vol 1443 pp 230ndash237 AIP 2012

[7] D V Lindley ldquoOn a measure of the information provided by anexperimentrdquo The Annals of Mathematical Statistics vol 27 no4 pp 986ndash1005 1956

[8] V V Fedorov Theory of Optimal Experiments Probability andMathematical Statistics Academic Press 1972



Figure 7: A comparison of the observed data (red) with predictions (black) made by the SSF field estimated using the single two-dimensional Gaussian MoG model. (Axes: intensity in LEGO units versus position of the center of the sensor in cm; four data/prediction pairs are shown.)

informative. Here it is the fact that the shape is known to be a circle that is driving the likelihood function.

In contrast, the three panels comprising Figure 8(b) illustrate the playing field using the more accurate SSF model. Here one can see that the entropy is higher along the edges of the sampled circles. This indicates that the circle edges promise to provide more information than the centers of the partitioned regions. This is because the SSF model enables one to detect not only whether the light sensor is situated above the circle's edge but also how much of the SSF overlaps with the white circle. That is, it helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model) but also the extent to which the sensor is on the edge of the circle. The additional information provided about the functioning of the light sensor translates directly into additional information about the albedo that results in the sensor output.
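For concreteness, the following short sketch illustrates the idea of an SSF-based forward prediction: the hypothesized albedo field (a white circle on a black field) is weighted by the SSF and summed to predict the sensor reading. This is only a minimal illustration; the Gaussian-shaped SSF, the grid, the dark/bright response levels, and the function names below are assumptions of ours and are not the values estimated in this work.

```python
import numpy as np

def predict_intensity(ssf, grid_x, grid_y, cx, cy, r, dark=20.0, bright=55.0):
    """Predict a sensor reading as the SSF-weighted average of the albedo field.

    ssf            : 2-D array of spatial sensitivity weights (normalized to sum to 1)
    grid_x, grid_y : field coordinates of each SSF cell, centered on the sensor (cm)
    cx, cy, r      : hypothesized circle center and radius (cm)
    dark, bright   : assumed mean responses over black and white regions (LEGO units)
    """
    inside = (grid_x - cx) ** 2 + (grid_y - cy) ** 2 <= r ** 2
    albedo = np.where(inside, 1.0, 0.0)       # white circle = 1, black field = 0
    overlap = np.sum(ssf * albedo)            # fraction of the SSF lying over white
    return dark + (bright - dark) * overlap   # mixed response

# Toy stand-in for the estimated SSF: a normalized Gaussian centered on the sensor.
x = np.linspace(-1.5, 1.5, 61)
gx, gy = np.meshgrid(x, x)
ssf = np.exp(-(gx**2 + gy**2) / (2 * 0.5**2))
ssf /= ssf.sum()

print(predict_intensity(ssf, gx, gy, cx=0.0, cy=0.0, r=3.0))  # sensor well inside the circle
print(predict_intensity(ssf, gx, gy, cx=3.0, cy=0.0, r=3.0))  # sensor straddling the edge
```

A naïve model, by contrast, would return only the dark or bright value depending on whether the sensor center lies inside the circle, discarding the partial-overlap information.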

This additional information can be quantified by observing how many measurements the robot is required to take to obtain estimates of the circle parameters within the same precision for each light sensor model. Our experiments revealed that on average it takes 19 ± 18 measurements using the naïve light sensor model, compared to an average of 116 ± 39 measurements for the more accurate SSF light sensor model.

4. Conclusion

The quality of the inferences one makes from a sensor depends not only on the quality of the data returned by the sensor but also on the information one possesses about the sensor's performance. In this paper we have demonstrated via a case study how more precisely modeling a sensor's performance can improve the inferences one can make from its data. In this case we demonstrated that one can achieve about an 18% reduction in the number of measurements needed by a robot to make the same inferences by more precisely modeling its light sensor.

This paper demonstrates how a machine learning system that employs Bayesian inference (and inquiry) relies on the likelihood function of the data given the hypothesized model parameters. Rather than simply representing a noise model, the likelihood function quantifies the probability that a hypothesized situation could have given rise to the recorded data. By incorporating more information about the sensors (or equipment) used to record the data, one naturally incorporates this information into the posterior probability from which one's inferences are drawn.
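As an illustration of where the sensor model enters the computation, the sketch below evaluates a Gaussian likelihood whose predicted mean comes from a swappable forward model of the sensor. The Gaussian noise model, the noise level, the dark/bright values, and all names here are illustrative assumptions rather than the actual implementation used in this work; an SSF-based predictor such as the one sketched earlier could be passed in place of the naïve model.

```python
import numpy as np

def log_likelihood(data, positions, circle, sensor_model, sigma=2.0):
    """Gaussian log-likelihood of sensor readings given a hypothesized circle.

    data         : measured intensities (LEGO units)
    positions    : (x, y) sensor positions at which the data were taken
    circle       : (cx, cy, r) hypothesized circle parameters
    sensor_model : callable(position, circle) -> predicted intensity
    sigma        : assumed measurement noise standard deviation
    """
    predictions = np.array([sensor_model(p, circle) for p in positions])
    residuals = np.asarray(data) - predictions
    return (-0.5 * np.sum((residuals / sigma) ** 2)
            - len(residuals) * np.log(sigma * np.sqrt(2.0 * np.pi)))

def naive_model(position, circle, dark=20.0, bright=55.0):
    """Naive sensor model: one mean value inside the circle, another outside."""
    (x, y), (cx, cy, r) = position, circle
    return bright if (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2 else dark

# Hypothetical readings at three positions, scored against one circle hypothesis.
readings = [54.0, 36.0, 21.0]
locations = [(0.0, 0.0), (2.5, 0.0), (5.0, 0.0)]
print(log_likelihood(readings, locations, circle=(0.0, 0.0, 3.0), sensor_model=naive_model))
```

Because the forward model is passed in as an argument, replacing the naïve predictor with an SSF-based one changes the likelihood, and hence the posterior, without touching the rest of the inference engine.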

This is made even more apparent by a careful study of the experimental design problem that this particular robotic system is designed to explore. For example, it is easy to show that by using the naïve light sensor model the entropy distribution for a proposed measurement location depends solely on the number of sampled circles for which the location is in the interior of the circle and the number of sampled circles for which the location is exterior to the circle. Given that we represented the posterior probability by sampling 50 circles, the maximum entropy occurs when the proposed measurement location is inside 25 circles (and outside 25 circles). As the robot's parameter estimates converge, one can show that the system is simply performing a binary search by asking "yes" or "no" questions, which implies that each measurement results in one bit of information. However, in the case where the robot employs an SSF model of the light sensor, the question the robot is essentially asking is more detailed: "to what degree does the circle overlap the light sensor's SSF?" The answer to such a question tends to provide more information, which significantly improves system performance. One can estimate the information gain achieved by employing the SSF model. Consider that the naïve model reveals that estimating the circle's position and radius with a precision of 4 mm, given the prior information about the circle and the playing field, requires 25 bits of information. The experiment using the SSF model requires on average 116 measurements, which implies that on average each measurement obtained using the SSF model provides about 25/116 ≈ 0.215 bits of information. One must keep in mind, however, that this is due to the fact that the SSF model is being used not only to infer the circle parameters from the data but also to select the measurement locations.
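The binary-search behavior of the naïve model can be made explicit with a small sketch: the entropy of a proposed measurement location is computed from the fraction of the posterior-sampled circles that contain it, and it peaks at one bit when the location lies inside exactly half of the 50 samples. The helper names and the randomly drawn circle samples below are ours, for illustration only.

```python
import numpy as np

def naive_entropy(point, circles):
    """Shannon entropy (bits) of the predicted black/white outcome at a location,
    given a set of circle hypotheses sampled from the posterior."""
    x, y = point
    inside = sum((x - cx) ** 2 + (y - cy) ** 2 <= r ** 2 for cx, cy, r in circles)
    p = inside / len(circles)                 # probability the location is white
    if p in (0.0, 1.0):                       # outcome already certain: nothing to learn
        return 0.0
    return -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))

# 50 illustrative posterior samples; entropy is maximal (1 bit) at a location
# that lies inside exactly 25 of them, i.e., an ideal yes/no question.
rng = np.random.default_rng(0)
samples = [(rng.normal(0, 5), rng.normal(0, 5), 20 + rng.normal(0, 2)) for _ in range(50)]
print(naive_entropy((0.0, 0.0), samples))    # deep inside most samples: low entropy
print(naive_entropy((20.0, 0.0), samples))   # near the sampled edges: close to 1 bit
```

Under the SSF model, by contrast, the predicted reading varies continuously with the degree of overlap, so the entropy map highlights the circle edges, as seen in Figure 8(b).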

Because the method presented here is based on a very general inferential framework, these methods can easily be applied to other types of sensors and equipment in a wide variety of situations. If one has designed a robotic machine learning system to rely on likelihood functions, then sensor models can be incorporated in more or less a plug-and-play fashion. This not only promises to improve the quality of robotic systems forced to rely on lower-quality sensors but also opens the possibility for calibration on the fly by updating sensor models as data are continually collected.


Figure 8: (a) These three panels illustrate the robot's machine learning system's view of the playing field using the naïve light sensor model as the system progresses through the first three measurements. The previously obtained measurement locations used to obtain light sensor data are indicated by the black-and-white squares, indicating the relative intensity with respect to the naïve light sensor model. The next selected measurement location is indicated by the green square. The blue circles represent the 50 hypothesized circles sampled from the posterior probability. The shaded background represents the entropy map, which indicates the measurement locations that promise to provide maximal information about the circle to be characterized. Note that the low-entropy area surrounding the white square indicates that the region is probably inside the white circle (not shown) and that measurements made there will not be as informative as measurements made elsewhere. The entropy map in Figure 2 shows the same experiment at a later stage, after seven measurements have been recorded. (b) These three panels illustrate the robot's machine learning system's view of the playing field using the more accurate SSF light sensor model. Note that the entropy map reveals the circle edges to be highly informative. This is because it helps to identify not only whether the sensor is inside the circle (as is accomplished using the naïve light sensor model on the left) but also the extent to which the sensor is on the edge of the circle. (All axes are in LEGO distance units.)

Conflict of Interests

The authors have no conflict of interests to report.

Acknowledgments

This research was supported in part by the University at Albany Faculty Research Awards Program (Knuth) and the University at Albany Benevolent Research Grant Award (Malakar). The authors would like to thank Scott Frasso and Rotem Guttman for their assistance in interfacing MatLab to the LEGO Brick. They would also like to thank Phil Erner and A. J. Mesiti for their preliminary efforts on the robotic arm experiments and sensor characterization.

References

[1] K. H. Knuth, P. M. Erner, and S. Frasso, "Designing intelligent instruments," in Proceedings of the 27th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, K. Knuth, A. Caticha, J. Center, A. Giffin, and C. Rodriguez, Eds., vol. 954, pp. 203–211, AIP, July 2007.

[2] K. H. Knuth and J. L. Center Jr., "Autonomous science platforms and question-asking machines," in Proceedings of the 2nd International Workshop on Cognitive Information Processing (CIP '10), pp. 221–226, June 2010.

[3] K. H. Knuth and J. L. Center, "Autonomous sensor placement," in Proceedings of the IEEE International Conference on Technologies for Practical Robot Applications, pp. 94–99, November 2008.


[4] K. H. Knuth, "Intelligent machines in the twenty-first century: foundations of inference and inquiry," Philosophical Transactions of the Royal Society A, vol. 361, no. 1813, pp. 2859–2873, 2003.

[5] N. K. Malakar and K. H. Knuth, "Entropy-based search algorithm for experimental design," in Proceedings of the 30th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, A. M. Djafari, J. F. Bercher, and P. Bessière, Eds., vol. 1305, pp. 157–164, AIP, July 2010.

[6] N. K. Malakar, K. H. Knuth, and D. J. Lary, "Maximum joint entropy and information-based collaboration of automated learning machines," in Proceedings of the 31st International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goyal, A. Giffin, K. H. Knuth, and E. Vrscay, Eds., vol. 1443, pp. 230–237, AIP, 2012.

[7] D. V. Lindley, "On a measure of the information provided by an experiment," The Annals of Mathematical Statistics, vol. 27, no. 4, pp. 986–1005, 1956.

[8] V. V. Fedorov, Theory of Optimal Experiments, Probability and Mathematical Statistics, Academic Press, 1972.

[9] K. Chaloner and I. Verdinelli, "Bayesian experimental design: a review," Statistical Science, vol. 10, pp. 273–304, 1995.

[10] P. Sebastiani and H. P. Wynn, "Bayesian experimental design and Shannon information," in 1997 Proceedings of the Section on Bayesian Statistical Science, pp. 176–181, 1997, http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.56.6037.

[11] P. Sebastiani and H. P. Wynn, "Maximum entropy sampling and optimal Bayesian experimental design," Journal of the Royal Statistical Society B, vol. 62, no. 1, pp. 145–157, 2000.

[12] T. J. Loredo and D. F. Chernoff, "Bayesian adaptive exploration," in Proceedings of the 23rd International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, G. Erickson and Y. Zhai, Eds., pp. 330–346, AIP, August 2003.

[13] R. Fischer, "Bayesian experimental design—studies for fusion diagnostics," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 76–83, November 2004.

[14] C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Chicago, Ill, USA, 1949.

[15] J. Skilling, "Nested sampling," in Proceedings of the 24th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, vol. 735, pp. 395–405, 2004.

[16] D. Sivia and J. Skilling, Data Analysis: A Bayesian Tutorial, Oxford University Press, New York, NY, USA, 2nd edition, 2006.

[17] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, "Equation of state calculations by fast computing machines," The Journal of Chemical Physics, vol. 21, no. 6, pp. 1087–1092, 1953.

[18] W. K. Hastings, "Monte Carlo sampling methods using Markov chains and their applications," Biometrika, vol. 57, no. 1, pp. 97–109, 1970.

[19] N. K. Malakar, A. J. Mesiti, and K. H. Knuth, "The spatial sensitivity function of a light sensor," in Proceedings of the 29th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, P. Goggans and C. Y. Chan, Eds., vol. 1193 of AIP Conference Proceedings, pp. 352–359, American Institute of Physics, Oxford, Mass, USA, July 2009.
