
Learning Temporal Point Processes with Intermittent Observations

Vinayak Gupta, Srikanta Bedathur
IIT Delhi
{vinayak.gupta, srikanta}@cse.iitd.ac.in

Sourangshu Bhattacharya
IIT Kharagpur
[email protected]

Abir De
IIT Bombay
[email protected]

Abstract

Marked temporal point processes (MTPP) have emerged as a powerful framework to model the underlying generative mechanism of asynchronous events localized in continuous time. Most existing models and inference methods in the MTPP framework consider only the complete-observation scenario, i.e., the event sequence being modeled is completely observed with no missing events – an ideal setting barely encountered in practice. A recent line of work that considers missing events uses supervised learning techniques, which require a missing or observed label for each event. In this work, we provide a novel unsupervised model and inference method for MTPPs in the presence of missing events. We first model the generative processes of observed events and missing events using two MTPPs, where the missing events are represented as latent random variables. Then we devise an unsupervised training method that jointly learns both MTPPs by means of variational inference. Experiments with real datasets show that our modeling and inference frameworks can effectively impute the missing data among the observed events, which in turn enhances the model's predictive prowess.

1 Introduction

In recent years, marked temporal point processes (MTPPs) (Valera et al., 2014; Rizoiu et al., 2017; Wang et al., 2017; Daley and Vere-Jones, 2007) have shown an outstanding potential to characterize asynchronous events localized in continuous time that

Proceedings of the 24th International Conference on Artificial Intelligence and Statistics (AISTATS) 2021, San Diego, California, USA. PMLR: Volume 130. Copyright 2021 by the author(s).

appear in a wide range of applications in healthcare (Lorch et al., 2018; Rizoiu et al., 2018), traffic (Du et al., 2016; Guo et al., 2018), web and social networks (Valera et al., 2014; Du et al., 2015; Tabibian et al., 2019; Kumar et al., 2019; De et al., 2016; Du et al., 2016; Farajtabar et al., 2017; Jing and Smola, 2017; Likhyani et al., 2020), finance (Bacry et al., 2015), and many more. In an MTPP, an event is represented using two quantities: (i) the time of its occurrence and (ii) the associated mark, where the latter indicates the category of the event and therefore bears a different meaning in different applications. For example, in a social network setting, the marks may indicate users' likes, topics and opinions of the posts; in finance, they may correspond to stock prices and sales amounts; in healthcare, they may indicate the state of an individual's disease.

In this context, most MTPP models (Valera et al., 2014; Wang et al., 2017; Huang et al., 2019; Du et al., 2016; Zuo et al., 2020; Zhang et al., 2020) – with a few recent exceptions (Shelton et al., 2018; Mei et al., 2019) – have considered only settings where the training data is completely observed or, in other words, where there is no missing observation at all. While working with fully observed data is ideal for understanding any dynamical system, it is not possible in many practical scenarios. We may miss observing events due to constraints such as crawling restrictions by social media platforms, privacy restrictions (certain users may disallow collection of certain types of data), budgetary factors such as data collection for exit polls, or other practical factors, e.g., a patient may not be available at a certain time. This results in poor predictive performance of MTPP models (Du et al., 2016; Zuo et al., 2020; Zhang et al., 2020) that skirt this issue.

Statistical analysis in the presence of missing data has been widely researched in various contexts (Che et al., 2018; Yoon et al., 2019; Tian et al., 2018; Śmieja et al., 2018); Little and Rubin (2019) offer a comprehensive survey. They describe three models that capture data-missing mechanisms in increasing order of complexity, viz., MCAR (missing completely at random), MAR (missing at random) and MNAR (missing not at random). Recently, Shelton et al. (2018) and Mei et al. (2019) proposed novel methods to impute missing events in MTPPs from the viewpoint of the MNAR mechanism. However, they focus on imputing missing data between a-priori available observed events, rather than predicting observed events in the face of missing events. Moreover, they deploy expensive learning and sampling mechanisms, which often make them intractable in practice, especially when learning from a stream of events. For example, Shelton et al. (2018) apply an expensive MCMC sampling procedure to draw missing events between observation pairs, which requires several simulations of the sampling procedure upon arrival of a new sample. Mei et al. (2019), on the other hand, use a bi-directional RNN which re-generates all missing events by making a completely new pass over the backward RNN whenever a new observation arrives. As a consequence, it suffers from quadratic complexity with respect to the number of observed events. Furthermore, the proposal of Shelton et al. (2018) depends on a pre-defined influence structure among the underlying events, which is available in linear multivariate parameterized point processes. In more complex point processes, such a structure is not explicit, which limits their applicability in practice.

1.1 Present work

In this work, we overcome the above limitations using a novel modeling framework for point processes called IMTPP (Intermittently-observed Marked Temporal Point Processes), which characterizes the dynamics of both observed and missing events as two coupled MTPPs, conditioned on the history of previous events. In our setup, the generation of missing events depends both on the previously occurred missing events as well as the previously occurred observed events. Therefore, they are MNAR (missing not at random), in the terminology of the missing-data literature (Little and Rubin, 2019). In contrast to the prior models (Mei et al., 2019; Shelton et al., 2018), IMTPP aims to learn the dynamics of both observed and missing events, rather than imputing missing events in between the known observed events, which is reflected in its superior predictive power over those existing models.

Precisely, IMTPP represents the missing events as latent random variables which, together with the previously observed events, seed the generative processes of the subsequent observed and missing events. It then deploys three generative models – an MTPP for observed events, a prior MTPP for missing events, and a posterior MTPP for missing events – using recurrent neural networks (RNNs) that capture the nonlinear influence of the past events. To do so, it enjoys several technical innovations, which significantly boost its efficiency as well as its predictive accuracy.

(1) In a marked departure from almost all existing MTPP models (Du et al., 2016; Tabibian et al., 2019; Mei et al., 2019; De et al., 2016), which rely strongly on conditional intensity functions, we use a log-normal distribution to sample the arrival times of events. As suggested by Shchur et al. (2020), such a distribution allows efficient sampling as well as more accurate prediction than intensity-based models.

(2) The built-in RNNs in our model only make forward computations, and therefore incrementally update the dynamics upon the arrival of a new observation. Consequently, unlike the prior models (Mei et al., 2019; Shelton et al., 2018), IMTPP does not need to re-generate all the missing events in response to the arrival of an observation, which significantly boosts the efficiency of both learning and prediction.

Our modeling framework allows us to train IMTPP using an efficient variational inference method that maximizes the evidence lower bound (ELBO) of the likelihood of the observed events. Such a formulation connects our model with variational autoencoders (VAEs) (Chung et al., 2015; Bowman et al., 2015). However, in sharp contrast to traditional VAEs, where the random noises or seeds often have no immediate interpretation, our random variables bear a concrete physical meaning, i.e., they are missing events, which renders our model more explainable than an off-the-shelf VAE.

Finally, our experiments with six diverse real datasets show that IMTPP can model missing observations within a stream of observed events, and enhances the predictive power of the original generative process for a full-observation scenario.

2 Problem setup

In this section, we first introduce our notation and then set up our problem of learning a marked temporal point process with missing events.

2.1 Preliminaries and notations

A marked temporal point process (MTPP) is a stochastic process whose realization consists of a sequence of discrete events Sk = {ei = (xi, ti) | i ∈ [k], ti < ti+1}, where ti ∈ R+ is the time of occurrence and xi ∈ C is the discrete mark of the i-th observed event that occurred at time ti, with C the set of discrete marks. Here, Sk denotes the set of the first k observed events. We denote the inter-arrival times of the observed events as ∆t,k = tk − tk−1.
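As a minimal illustration of this notation, an observed sequence and its inter-arrival times can be represented as follows. The values are toy data, and the t0 = 0 convention for the first gap is an assumption made purely for illustration:

```python
# Toy observed event stream S_k = {(x_i, t_i)}: marks from a small vocabulary C,
# strictly increasing arrival times. Values are illustrative, not from the paper.
from dataclasses import dataclass

@dataclass
class Event:
    mark: int    # x_i in C = {0, ..., |C|-1}
    time: float  # t_i, with t_i < t_{i+1}

S = [Event(2, 0.5), Event(0, 1.3), Event(1, 2.9), Event(2, 3.1)]

# Inter-arrival times Delta_{t,k} = t_k - t_{k-1} (t_0 = 0 assumed here).
times = [0.0] + [e.time for e in S]
deltas = [t1 - t0 for t0, t1 in zip(times, times[1:])]
print(deltas)  # ≈ [0.5, 0.8, 1.6, 0.2]
```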

However, in reality there may be instances where an event has actually taken place but is not recorded in S. To this end, we introduce the MTPP for missing events – a latent MTPP – which is characterized by a sequence of hidden events Mr = {εj = (yj, τj) | j ∈ [r], τj < τj+1}, where τj ∈ R+ and yj ∈ C are the time and the mark of the j-th missing event. Therefore, Mr defines the set of the first r missing events. Moreover, we denote the inter-arrival times of the missing events as ∆τ,r = τr − τr−1. Note that τ•, y•, M• and ∆τ,• for the MTPP of missing events share similar meanings with t•, x•, S• and ∆t,• respectively for the MTPP of observed events. We further define two quantities, k̲ and k̄, as follows:

k̲ = argmin_r {τr | tk < τr < tk+1}   (1)

k̄ = argmax_r {τr | tk < τr < tk+1}   (2)

Here, k̲ and k̄ are the indices of the first and the last missing events, respectively, among those which have arrived between the k-th and (k+1)-th observed events. Figure 1 (a) illustrates our setup.
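The interval indices of Eqs. (1)–(2) can be computed for toy data as follows. The names k_lo and k_hi stand in for the paper's underlined and overlined k, and all times below are made up for illustration:

```python
# For each pair of consecutive observed times (t_k, t_{k+1}), find the indices of
# the first and last missing events strictly inside the interval, i.e.
# k_lo = argmin_r {tau_r | t_k < tau_r < t_{k+1}} and the argmax analogue.
obs_times = [1.0, 4.0, 9.0]             # t_1, t_2, t_3 (toy values)
miss_times = [1.5, 2.2, 3.7, 5.0, 8.1]  # tau_1, ..., tau_5 (sorted)

def interval_indices(t_k, t_k1, taus):
    inside = [r for r, tau in enumerate(taus, start=1) if t_k < tau < t_k1]
    return (inside[0], inside[-1]) if inside else None  # (k_lo, k_hi), 1-indexed

print(interval_indices(1.0, 4.0, miss_times))  # (1, 3): eps_1..eps_3 lie in (t_1, t_2)
print(interval_indices(4.0, 9.0, miss_times))  # (4, 5)
```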

In practice, the arrival times (t and τ) of both observed and missing events are continuous random variables, whereas the marks (x and y) are discrete random variables. Therefore, following state-of-the-art MTPP models, we model a density function to draw timings and a probability mass function to draw marks, which together induce a density function characterizing the generative process.

2.2 Our distinctive goal

Our goal in this paper is to design an MTPP model which can generate the subsequent observed (ek+1) and missing (εr+1) events in a recursive manner, conditioned on the history of events Sk ∪ Mr that have occurred thus far. Given the input sequence of observations SK consisting of the first K observed events {e1, e2, ..., eK}, we first train our generative model and then predict the next observed event eK+1.¹

3 Components of IMTPP

At the very outset, IMTPP, our proposed generative model, connects two stochastic processes – one for the observed events and the other for the missing events – based on the history of previously generated missing and observed events. Note that the sequence of training events that is given as input to IMTPP consists of only the observed events. We model the missing event sequence through latent random variables, which, along with the previously observed events, drive a unified generative model for the complete (observed and missing) event sequence.

¹We can also predict the missing events, but we evaluate the predictive performance only on observed events, since the missing events are not available in practice.

Given a stream of observed events SK = {e1 = (x1, t1), e2 = (x2, t2), ..., eK = (xK, tK)}, if we use the maximum likelihood principle to train IMTPP, then we should maximize the marginal log-likelihood of the observed stream of events, i.e., log p(SK). However, computing log p(SK) demands marginalization with respect to the set of latent missing events MK−1, which is typically intractable. Therefore, we resort to maximizing a variational lower bound, or evidence lower bound (ELBO), of the log-likelihood of the observed stream of events SK. More specifically, we note that:

p(SK) = ∏_{k=0}^{K−1} ∫_{Mk} p(ek+1 | Sk, Mk) p(Mk) dω(Mk)

     = E_{q(MK−1 | SK)} [ ∏_{k=0}^{K−1} p(ek+1 | Sk, Mk) · ( ∏_{r=k̲}^{k̄} p(εr | Sk, Mr−1) / ∏_{r=k̲}^{k̄} q(εr | ek+1, Sk, Mr−1) ) ]

where ω(M) is the measure of the set M, and q is an approximate posterior distribution which aims to interpolate missing events εr within the interval (tk, tk+1), based on the knowledge of the next observed event ek+1 along with all previous events Sk ∪ Mr−1. Recall that k̲ (k̄) is the index r of the first (last) missing event εr among those which have arrived between the k-th and (k+1)-th observed events, i.e., k̲ = argmin_r {τr | tk < τr < tk+1} and k̄ = argmax_r {τr | tk < τr < tk+1}. Next, by Jensen's inequality, log p(SK) is at least

E_{q(MK−1 | SK)} [ ∑_{k=0}^{K−1} log p(ek+1 | Sk, Mk) ] − ∑_{k=0}^{K−1} ∑_{r=k̲}^{k̄} KL[ q(εr | ek+1, Sk, Mr−1) || p(εr | Sk, Mr−1) ].   (3)

While the above inequality holds for any distribution q, the quality of this lower bound depends on the expressivity of q, which we model using a deep recurrent neural network. Moreover, the above lower bound suggests that our model consists of the following components.
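To make the bound concrete, here is a minimal numeric sketch of an ELBO of this shape for a single interval with one latent missing event. All distributions are toy log-normals over inter-arrival times, and the conditional's dependence on the latent gap is invented for illustration; nothing here reflects the paper's learned networks:

```python
# Minimal sketch: ELBO = E_q[log p(e_{k+1} | ...)] - KL(q || p_prior),
# estimated by Monte Carlo for one interval with one latent missing event.
import math, random

random.seed(0)

def lognormal_logpdf(x, mu, sigma):
    z = (math.log(x) - mu) / sigma
    return -math.log(x * sigma * math.sqrt(2 * math.pi)) - 0.5 * z * z

def kl_lognormal(mu_q, sig_q, mu_p, sig_p):
    # KL between log-normals equals the Gaussian KL of their log-scale parameters.
    return (math.log(sig_p / sig_q)
            + (sig_q**2 + (mu_q - mu_p)**2) / (2 * sig_p**2) - 0.5)

mu_q, sig_q = 0.1, 0.3  # toy posterior q over the missing inter-arrival time
mu_p, sig_p = 0.0, 0.5  # toy prior p over the same quantity
obs_delta = 1.2         # observed inter-arrival time Delta_{t,k+1}

# Monte-Carlo estimate of E_q[log p(e_{k+1} | S_k, M_k)]: score obs_delta under a
# log-normal whose mean shifts with the sampled latent gap (made-up dependence).
n = 5000
recon = 0.0
for _ in range(n):
    tau_gap = random.lognormvariate(mu_q, sig_q)  # sample a latent gap from q
    recon += lognormal_logpdf(obs_delta, 0.2 * tau_gap, 0.4)
recon /= n

elbo = recon - kl_lognormal(mu_q, sig_q, mu_p, sig_p)
print(round(elbo, 3))
```

The KL term is available in closed form here because the log-normal is a reparameterized Gaussian; in IMTPP both terms would come from the neural components described below.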

MTPP for observed events. The distribution p(ek+1 | Sk, Mk) models the MTPP for observed events, which generates the (k+1)-th event ek+1 based on the history of all k observed events Sk and all missing events Mk generated so far.


Prior MTPP for missing events. The distribution p(εr | Sk, Mr−1) is the prior model of the MTPP for missing events. It generates the r-th missing event εr after the observed event ek, based on the prior information – the history of all k observed events Sk and all missing events Mr−1 generated so far.

Posterior MTPP for missing events. Given the set of observed events Sk+1 = {e1, e2, ..., ek+1}, the distribution q(εr | ek+1, Sk, Mr−1) generates the r-th missing event εr after the knowledge of the subsequent observed event ek+1 is taken into account, along with information about all previously observed events Sk and all missing events Mr−1 generated so far.

4 Architecture of IMTPP

We first present a high-level overview of the deep neural network parameterization of the different components of the IMTPP model, then describe the architecture of each component in detail, and finally briefly present the salient features of our proposal.

4.1 High level overview

We approximate the MTPP for observed events p(ek+1 | Sk, Mk) using pθ, and the posterior MTPP for missing events q(εr | ek+1, Sk, Mr−1) using qφ, both implemented as neural networks with parameters θ and φ respectively. We set the prior MTPP for missing events p(εr | Sk, Mr−1) to a known distribution pprior computed from the history of all the events it is conditioned on. In this context, we design two recurrent neural networks (RNNs) which embed the history of observed events S into hidden vectors s, and the missing events M into hidden vectors m, similar to several state-of-the-art MTPP models (Du et al., 2016; Mei and Eisner, 2017; Mei et al., 2019). In particular, the embeddings sk and mr encode the influence of the arrival times and the marks of the first k observed events from Sk and the first r missing events from Mr, respectively. Here, the RNN for the observed events updates sk to sk+1 by incorporating the effect of ek+1. Similarly, the RNN for the missing events updates mr to mr+1 by taking into account the event εr+1.

Each event has two components, its mark and its arrival time, which are discrete and continuous random variables respectively. Hence, we characterize the event distribution as a density function which is the product of the density function (pθ,∆, qφ,∆, pprior,∆) of the inter-arrival time and the probability distribution (Pθ,x, Qφ,y, Pprior,y) of the mark, i.e.,

pθ(ek+1 = (xk+1, tk+1) | Sk, Mk)
  = Pθ,x(xk+1 | ∆t,k+1, sk, mk) · pθ,∆(∆t,k+1 | sk, mk)   (4)

qφ(εr = (yr, τr) | ek+1, Sk, Mr−1)
  = Qφ,y(yr | ∆τ,r, ek+1, sk, mr−1) · qφ,∆(∆τ,r | ek+1, sk, mr−1)   (5)

pprior(εr = (yr, τr) | Sk, Mr−1)
  = Pprior,y(yr | ∆τ,r, sk, mr−1) · pprior,∆(∆τ,r | sk, mr−1)   (6)

where, as mentioned, the inter-arrival times ∆t,k and ∆τ,r are given by ∆t,k = tk − tk−1 and ∆τ,r = τr − τr−1. Moreover, pθ,∆, qφ,∆ and pprior,∆ denote the density of the inter-arrival times of the observed events and the posterior and prior densities of the inter-arrival times of the missing events, respectively; and Pθ,x, Qφ,y and Pprior,y denote the corresponding probability mass functions of the mark distributions. Panels (b) and (c) in Figure 1 illustrate the neural architecture of IMTPP.

4.2 Parameterization of pθ

We realize pθ in Eq. 4 using a three-layer architecture.

Input layer. The first level is the input layer, which takes the last event as input and represents it through a suitable vector. In particular, upon the arrival of ek, it computes the corresponding vector vk as:

vk = wt,v tk + wx,v xk + wt,∆ (tk − tk−1) + av,   (7)

where w•,• and av are trainable parameters.

Hidden layer. The next level is the hidden layer, which embeds the sequence of observations into finite-dimensional vectors s•, computed using an RNN. This layer takes vk as input and feeds it into the RNN to update its hidden state as follows:

sk = tanh(Ws,s sk−1 + Ws,v vk + (tk − tk−1) ws,k + as),

where Ws,• and as are trainable parameters. This hidden state sk can also be considered a sufficient statistic of Sk, the sequence of the first k observations.
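The input and hidden layers above can be sketched with numpy standing in for a trainable RNN cell. The dimensions, the one-hot mark embedding, and the random initialization are assumptions made purely for illustration:

```python
# Sketch of the input layer (Eq. 7) and the recurrent hidden-state update of p_theta.
import numpy as np

rng = np.random.default_rng(0)
d, n_marks = 8, 4  # hidden size and |C| (assumed values)

# Trainable parameters, randomly initialised here for illustration.
w_tv = rng.normal(size=d)
W_xv = rng.normal(size=(d, n_marks))   # mark embedding via one-hot (assumption)
w_tD = rng.normal(size=d)
a_v = np.zeros(d)
W_ss = rng.normal(size=(d, d)) * 0.1
W_sv = rng.normal(size=(d, d)) * 0.1
w_sk = rng.normal(size=d) * 0.1
a_s = np.zeros(d)

def input_vector(t_k, x_k, t_prev):
    # v_k = w_{t,v} t_k + w_{x,v} x_k + w_{t,Delta}(t_k - t_{k-1}) + a_v
    x_onehot = np.eye(n_marks)[x_k]
    return w_tv * t_k + W_xv @ x_onehot + w_tD * (t_k - t_prev) + a_v

def update_state(s_prev, v_k, t_k, t_prev):
    # s_k = tanh(W_{s,s} s_{k-1} + W_{s,v} v_k + (t_k - t_{k-1}) w_{s,k} + a_s)
    return np.tanh(W_ss @ s_prev + W_sv @ v_k + (t_k - t_prev) * w_sk + a_s)

s, t_prev = np.zeros(d), 0.0
for x_k, t_k in [(2, 0.5), (0, 1.3), (1, 2.9)]:  # toy observed events
    s = update_state(s, input_vector(t_k, x_k, t_prev), t_k, t_prev)
    t_prev = t_k
print(s.shape)  # (8,)
```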

Output layer. The final level is the output layer, which computes both pθ,∆(·) and Pθ,x(·) based on sk and mk. To this end, the density of the inter-arrival times is

pθ,∆(∆t,k+1 | sk, mk) = Lognormal(µe(sk, mk), σe²(sk, mk)),   (8)

with [µe(sk, mk), σe(sk, mk)] = W⊤t,s sk + W⊤t,m mk + at; and the mark distribution is

Pθ,x(xk+1 = x | ∆t,k+1, sk, mk) = exp(U⊤x,s sk + U⊤x,m mk) / ∑_{x′∈C} exp(U⊤x′,s sk + U⊤x′,m mk).   (9)

These distributions are finally used to draw the inter-arrival time ∆t,k+1 and the mark xk+1 of the event ek+1. The sampled inter-arrival time ∆t,k+1 gives the arrival time tk+1 = tk + ∆t,k+1.
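The output-layer sampling can be sketched as follows: draw the inter-arrival time from a log-normal whose parameters are affine in (sk, mk), and the mark from a softmax over C. Parameter shapes and values are illustrative assumptions, and parameterizing the scale through its log (for positivity) is a choice made here, not stated in the text:

```python
# Sketch of output-layer sampling: log-normal inter-arrival time and softmax mark.
import numpy as np

rng = np.random.default_rng(1)
d, n_marks = 8, 4
s_k, m_k = rng.normal(size=d), rng.normal(size=d)  # stand-in history embeddings

W_ts = rng.normal(size=(2, d)) * 0.1
W_tm = rng.normal(size=(2, d)) * 0.1
a_t = np.array([0.0, -1.0])
U_s = rng.normal(size=(n_marks, d))
U_m = rng.normal(size=(n_marks, d))

# [mu_e, log sigma_e] affine in (s_k, m_k); log-parameterized scale is an assumption.
mu, log_sigma = W_ts @ s_k + W_tm @ m_k + a_t
delta = float(rng.lognormal(mean=mu, sigma=np.exp(log_sigma)))  # Delta_{t,k+1}

logits = U_s @ s_k + U_m @ m_k
probs = np.exp(logits - logits.max())
probs /= probs.sum()                      # softmax over marks
x_next = int(rng.choice(n_marks, p=probs))

print(delta > 0, 0 <= x_next < n_marks)  # True True
```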


[Figure 1: (a) Illustration of the setup – observed events e1, e2, ..., ek+1 ∈ Sk arrive at times t1, ..., tk+1, with missing events ε1, ..., εr arriving at times τ1, ..., τr in between; (b, c) the neural architecture of IMTPP, built around p(ek+1 | Sk, Mk), p(εk+1 | Sk, Mk) and q(εr | ek+1, Sk, Mr−1).]

�⌧,r�⌧,r�⌧,r

<latexit sha1_base64="vs8xwDynI+x0/k3PK23y0jYRAbc=">AAAB/nicdVDJSgNBEO2JW4xbVDx5aQyCBxlmQtZbUA8eI5gFMsPQ0+kkTXoWumuEMAT8FS8eFPHqd3jzb+wsgoo+KHi8V0VVPT8WXIFlfRiZldW19Y3sZm5re2d3L79/0FZRIilr0UhEsusTxQQPWQs4CNaNJSOBL1jHH1/O/M4dk4pH4S1MYuYGZBjyAacEtOTlj5w48FPnigkgXuoASc7ldOrlC5ZZqZSK9Tq2TKtStsvVGanatWoZ26Y1RwEt0fTy704/oknAQqCCKNWzrRjclEjgVLBpzkkUiwkdkyHraRqSgCk3nZ8/xada6eNBJHWFgOfq94mUBEpNAl93BgRG6rc3E//yegkMam7KwzgBFtLFokEiMER4lgXuc8koiIkmhEqub8V0RCShoBPL6RC+PsX/k3bRtEtm+aZUaFws48iiY3SCzpCNqqiBrlETtRBFKXpAT+jZuDcejRfjddGaMZYzh+gHjLdPMm6WUQ==</latexit>

Mr = {(yj , ⌧j)|j 2 [r]}Mr = {(yj , ⌧j)|j 2 [r]}Mr = {(yj , ⌧j)|j 2 [r]}

<latexit sha1_base64="TKC1+8Xps6yA5TGqgCOyAxEMHEw=">AAACGnicdVDLSsNAFJ34rPVVdelmsAgKUhLtQxdC0Y0boYKtQhLCZDq1UyeTMDMRSsx3uPFX3LhQxJ248W+ctBFU9MDA4Zx7uXOOHzEqlWl+GBOTU9Mzs4W54vzC4tJyaWW1I8NYYNLGIQvFpY8kYZSTtqKKkctIEBT4jFz418eZf3FDhKQhP1fDiLgBuuK0RzFSWvJKlhMFfuIESPUxYslp6gl4CJ1ka+gNdhyFYm+wDW/hADqU28J10tQrlc1Ko75Xrx1As2KOkJF6tbpXg1aulEGOlld6c7ohjgPCFWZIStsyI+UmSCiKGUmLTixJhPA1uiK2phwFRLrJKFoKN7XShb1Q6McVHKnfNxIUSDkMfD2ZZZC/vUz8y7Nj1dt3E8qjWBGOx4d6MYMqhFlPsEsFwYoNNUFYUP1XiPtIIKx0m0VdwldS+D/p7FasaqV2Vi03j/I6CmAdbIAtYIEGaIIT0AJtgMEdeABP4Nm4Nx6NF+N1PDph5Dtr4AeM90/BA6C0</latexit>

✏j = (yj , ⌧j)✏j = (yj , ⌧j)✏j = (yj , ⌧j)

<latexit sha1_base64="DSZk9bw15GODbJcQYEyGaYs56/8=">AAACBnicdVDLSgMxFM3UV62vqksRgkWoIMNM3y6EohuXFewDOsOQSdM2beZBkhHK0JUbf8WNC0Xc+g3u/BszbQUVPXDh5Jx7yb3HDRkV0jA+tNTS8srqWno9s7G5tb2T3d1riSDimDRxwALecZEgjPqkKalkpBNygjyXkbY7vkz89i3hggb+jZyExPbQwKd9ipFUkpM9tELPjS0SCsrUe3SenzijU0uiyBmdTJ1sztCrZrFcKENDN8xKoXiWkEqtoBRTN2bIgQUaTvbd6gU48ogvMUNCdE0jlHaMuKSYkWnGigQJER6jAekq6iOPCDuenTGFx0rpwX7AVfkSztTvEzHyhJh4rur0kByK314i/uV1I9mv2TH1w0gSH88/6kcMygAmmcAe5QRLNlEEYU7VrhAPEUdYquQyKoSvS+H/pFXQzZJevi7l6heLONLgAByBPDBBFdTBFWiAJsDgDjyAJ/Cs3WuP2ov2Om9NaYuZffAD2tsnko2ZNQ==</latexit>

ei = (xi, ti)ei = (xi, ti)ei = (xi, ti)

<latexit sha1_base64="TZaxqv+SJkAmaJWmEu9Cx03DKlk=">AAAB/HicdVDLSgMxFM34rPU12qWbYBEqyJAZWm0XQtGNywr2Ae0wZNK0Dc08SDLiMNRfceNCEbd+iDv/xvQhqOiBC4dz7uXee/yYM6kQ+jCWlldW19ZzG/nNre2dXXNvvyWjRBDaJBGPRMfHknIW0qZiitNOLCgOfE7b/vhy6rdvqZAsCm9UGlM3wMOQDRjBSkueWejFgZ9Rj52X7jx2ojx2PPHMIrKQU62UHYgsp4Jqdk2TCrJrp2VoW2iGIlig4ZnvvX5EkoCGinAsZddGsXIzLBQjnE7yvUTSGJMxHtKupiEOqHSz2fETeKSVPhxEQleo4Ez9PpHhQMo08HVngNVI/vam4l9eN1GDqpuxME4UDcl80SDhUEVwmgTsM0GJ4qkmmAimb4VkhAUmSueV1yF8fQr/Jy3HsstW5bpcrF8s4siBA3AISsAGZ6AOrkADNAEBKXgAT+DZuDcejRfjdd66ZCxmCuAHjLdPepqUrw==</latexit>

{{{

<latexit sha1_base64="FAs3BJ+3msA/N4fbh1T/4CeEIh0=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoseiF48VTFtoQtlsN+3S3U3c3Qgl9E948aCIV/+ON/+N2zYHbX0w8Hhvhpl5UcqZNq777ZTW1jc2t8rblZ3dvf2D6uFRWyeZItQnCU9UN8Kaciapb5jhtJsqikXEaSca3878zhNVmiXywUxSGgo8lCxmBBsrdYNURHmQT/vVmlt350CrxCtIDQq0+tWvYJCQTFBpCMda9zw3NWGOlWGE02klyDRNMRnjIe1ZKrGgOszn907RmVUGKE6ULWnQXP09kWOh9UREtlNgM9LL3kz8z+tlJr4OcybTzFBJFovijCOToNnzaMAUJYZPLMFEMXsrIiOsMDE2oooNwVt+eZW0L+peo35536g1b4o4ynACp3AOHlxBE+6gBT4Q4PAMr/DmPDovzrvzsWgtOcXMMfyB8/kDbcqQPA==</latexit>

}}}

<latexit sha1_base64="+GSn2Z/3aqdTe8T3D1JXM0xTM1k=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoseiF48V7Ac0oWy2m3bp7ibuboQS8ie8eFDEq3/Hm//GbZuDtj4YeLw3w8y8MOFMG9f9dkpr6xubW+Xtys7u3v5B9fCoo+NUEdomMY9VL8SaciZp2zDDaS9RFIuQ0244uZ353SeqNIvlg5kmNBB4JFnECDZW6vmJCDM/zwfVmlt350CrxCtIDQq0BtUvfxiTVFBpCMda9z03MUGGlWGE07zip5ommEzwiPYtlVhQHWTze3N0ZpUhimJlSxo0V39PZFhoPRWh7RTYjPWyNxP/8/qpia6DjMkkNVSSxaIo5cjEaPY8GjJFieFTSzBRzN6KyBgrTIyNqGJD8JZfXiWdi7rXqF/eN2rNmyKOMpzAKZyDB1fQhDtoQRsIcHiGV3hzHp0X5935WLSWnGLmGP7A+fwBcNSQPg==</latexit>

,

<latexit sha1_base64="qV4ztchpZO4JJt3AzKkJY/q5rmI=">AAAB6HicbVDLSgNBEOyNrxhfUY9eBoPgQcKuRPQY9OIxAfOAZAmzk95kzOzsMjMrhJAv8OJBEa9+kjf/xkmyB00saCiquunuChLBtXHdbye3tr6xuZXfLuzs7u0fFA+PmjpOFcMGi0Ws2gHVKLjEhuFGYDtRSKNAYCsY3c381hMqzWP5YMYJ+hEdSB5yRo2V6he9Ysktu3OQVeJlpAQZar3iV7cfszRCaZigWnc8NzH+hCrDmcBpoZtqTCgb0QF2LJU0Qu1P5odOyZlV+iSMlS1pyFz9PTGhkdbjKLCdETVDvezNxP+8TmrCG3/CZZIalGyxKEwFMTGZfU36XCEzYmwJZYrbWwkbUkWZsdkUbAje8surpHlZ9irlq3qlVL3N4sjDCZzCOXhwDVW4hxo0gAHCM7zCm/PovDjvzseiNedkM8fwB87nD3WhjLg=</latexit>

222

<latexit sha1_base64="yYvOxg5Rc0rm5PQVxmQBqxKQClY=">AAAB8HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoseiF48V7Ic0oWy2m3bp7ibsboQS+iu8eFDEqz/Hm//GbZqDtj4YeLw3w8y8MOFMG9f9dkpr6xubW+Xtys7u3v5B9fCoo+NUEdomMY9VL8SaciZp2zDDaS9RFIuQ0244uZ373SeqNIvlg5kmNBB4JFnECDZWevQTEWY+k7NBtebW3RxolXgFqUGB1qD65Q9jkgoqDeFY677nJibIsDKMcDqr+KmmCSYTPKJ9SyUWVAdZfvAMnVlliKJY2ZIG5erviQwLracitJ0Cm7Fe9ubif14/NdF1kDGZpIZKslgUpRyZGM2/R0OmKDF8agkmitlbERljhYmxGVVsCN7yy6ukc1H3GvXL+0ateVPEUYYTOIVz8OAKmnAHLWgDAQHP8ApvjnJenHfnY9FacoqZY/gD5/MHIXKQog==</latexit>

xixixi

<latexit sha1_base64="TNzkDtrwpaEbn7W/6g+PIF0eO/U=">AAAB8HicdVDLSsNAFJ3UV62vqks3g0VwFSYh0XZXdOOygn1IG8pkOmmHziRhZiKW0K9w40IRt36OO//G6UNQ0QMXDufcy733hClnSiP0YRVWVtfWN4qbpa3tnd298v5BSyWZJLRJEp7ITogV5SymTc00p51UUixCTtvh+HLmt++oVCyJb/QkpYHAw5hFjGBtpNteKsL8vs+m/XIF2cit+p4Lke36qObUDPGRUzvzoGOjOSpgiUa//N4bJCQTNNaEY6W6Dkp1kGOpGeF0WupliqaYjPGQdg2NsaAqyOcHT+GJUQYwSqSpWMO5+n0ix0KpiQhNp8B6pH57M/Evr5vpqBrkLE4zTWOyWBRlHOoEzr6HAyYp0XxiCCaSmVshGWGJiTYZlUwIX5/C/0nLtR3P9q+9Sv1iGUcRHIFjcAoccA7q4Ao0QBMQIMADeALPlrQerRfrddFasJYzh+AHrLdPnVCQ9w==</latexit>

{{{

<latexit sha1_base64="FAs3BJ+3msA/N4fbh1T/4CeEIh0=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoseiF48VTFtoQtlsN+3S3U3c3Qgl9E948aCIV/+ON/+N2zYHbX0w8Hhvhpl5UcqZNq777ZTW1jc2t8rblZ3dvf2D6uFRWyeZItQnCU9UN8Kaciapb5jhtJsqikXEaSca3878zhNVmiXywUxSGgo8lCxmBBsrdYNURHmQT/vVmlt350CrxCtIDQq0+tWvYJCQTFBpCMda9zw3NWGOlWGE02klyDRNMRnjIe1ZKrGgOszn907RmVUGKE6ULWnQXP09kWOh9UREtlNgM9LL3kz8z+tlJr4OcybTzFBJFovijCOToNnzaMAUJYZPLMFEMXsrIiOsMDE2oooNwVt+eZW0L+peo35536g1b4o4ynACp3AOHlxBE+6gBT4Q4PAMr/DmPDovzrvzsWgtOcXMMfyB8/kDbcqQPA==</latexit>

}}}

<latexit sha1_base64="+GSn2Z/3aqdTe8T3D1JXM0xTM1k=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoseiF48V7Ac0oWy2m3bp7ibuboQS8ie8eFDEq3/Hm//GbZuDtj4YeLw3w8y8MOFMG9f9dkpr6xubW+Xtys7u3v5B9fCoo+NUEdomMY9VL8SaciZp2zDDaS9RFIuQ0244uZ353SeqNIvlg5kmNBB4JFnECDZW6vmJCDM/zwfVmlt350CrxCtIDQq0BtUvfxiTVFBpCMda9z03MUGGlWGE07zip5ommEzwiPYtlVhQHWTze3N0ZpUhimJlSxo0V39PZFhoPRWh7RTYjPWyNxP/8/qpia6DjMkkNVSSxaIo5cjEaPY8GjJFieFTSzBRzN6KyBgrTIyNqGJD8JZfXiWdi7rXqF/eN2rNmyKOMpzAKZyDB1fQhDtoQRsIcHiGV3hzHp0X5935WLSWnGLmGP7A+fwBcNSQPg==</latexit>

,

<latexit sha1_base64="qV4ztchpZO4JJt3AzKkJY/q5rmI=">AAAB6HicbVDLSgNBEOyNrxhfUY9eBoPgQcKuRPQY9OIxAfOAZAmzk95kzOzsMjMrhJAv8OJBEa9+kjf/xkmyB00saCiquunuChLBtXHdbye3tr6xuZXfLuzs7u0fFA+PmjpOFcMGi0Ws2gHVKLjEhuFGYDtRSKNAYCsY3c381hMqzWP5YMYJ+hEdSB5yRo2V6he9Ysktu3OQVeJlpAQZar3iV7cfszRCaZigWnc8NzH+hCrDmcBpoZtqTCgb0QF2LJU0Qu1P5odOyZlV+iSMlS1pyFz9PTGhkdbjKLCdETVDvezNxP+8TmrCG3/CZZIalGyxKEwFMTGZfU36XCEzYmwJZYrbWwkbUkWZsdkUbAje8surpHlZ9irlq3qlVL3N4sjDCZzCOXhwDVW4hxo0gAHCM7zCm/PovDjvzseiNedkM8fwB87nD3WhjLg=</latexit>

222

<latexit sha1_base64="yYvOxg5Rc0rm5PQVxmQBqxKQClY=">AAAB8HicbVBNS8NAEJ3Ur1q/qh69LBbBU0mkoseiF48V7Ic0oWy2m3bp7ibsboQS+iu8eFDEqz/Hm//GbZqDtj4YeLw3w8y8MOFMG9f9dkpr6xubW+Xtys7u3v5B9fCoo+NUEdomMY9VL8SaciZp2zDDaS9RFIuQ0244uZ373SeqNIvlg5kmNBB4JFnECDZWevQTEWY+k7NBtebW3RxolXgFqUGB1qD65Q9jkgoqDeFY677nJibIsDKMcDqr+KmmCSYTPKJ9SyUWVAdZfvAMnVlliKJY2ZIG5erviQwLracitJ0Cm7Fe9ubif14/NdF1kDGZpIZKslgUpRyZGM2/R0OmKDF8agkmitlbERljhYmxGVVsCN7yy6ukc1H3GvXL+0ateVPEUYYTOIVz8OAKmnAHLWgDAQHP8ApvjnJenHfnY9FacoqZY/gD5/MHIXKQog==</latexit>

✏j✏j✏j

<latexit sha1_base64="WJzmfElTPIG65bYsq0F5vBFW81o=">AAAB+XicdVDLSsNAFJ3UV62vqEs3g0VwFZIa2y6LblxWsA9oQplMJ+3YmSTMTAol9E/cuFDErX/izr9x0kZQ0QMXzpxzL3PvCRJGpbLtD6O0tr6xuVXeruzs7u0fmIdHXRmnApMOjlks+gGShNGIdBRVjPQTQRAPGOkF0+vc782IkDSO7tQ8IT5H44iGFCOlpaFpegkPMo8kkjL9vl8MzaptNWsXDbcBbcteIid1261fQKdQqqBAe2i+e6MYp5xECjMk5cCxE+VnSCiKGVlUvFSSBOEpGpOBphHiRPrZcvMFPNPKCIax0BUpuFS/T2SISznnge7kSE3kby8X//IGqQqbfkajJFUkwquPwpRBFcM8BjiigmDF5pogLKjeFeIJEggrHVZFh/B1KfyfdGuW41qXt261dVXEUQYn4BScAwc0QAvcgDboAAxm4AE8gWcjMx6NF+N11Voyiplj8APG2yd1s5Q3</latexit>

(a) Setup and components of IMTPP

wwwt,v, aaav

<latexit sha1_base64="IvLSn1C3yERO2JDv2jZEDDCcnrM=">AAACAXicbVDLSgMxFM3UV62vUTeCm2ARXJQyIxVdFt24rGAf0A5DJk3b0CQzJJlKGcaNv+LGhSJu/Qt3/o3pdBbaeuDCyTn3kntPEDGqtON8W4WV1bX1jeJmaWt7Z3fP3j9oqTCWmDRxyELZCZAijArS1FQz0okkQTxgpB2Mb2Z+e0KkoqG419OIeBwNBR1QjLSRfPuoF/EgeUj9RFcmaQVmT5T6E98uO1UnA1wmbk7KIEfDt796/RDHnAiNGVKq6zqR9hIkNcWMpKVerEiE8BgNSddQgThRXpJdkMJTo/ThIJSmhIaZ+nsiQVypKQ9MJ0d6pBa9mfif14314MpLqIhiTQSefzSIGdQhnMUB+1QSrNnUEIQlNbtCPEISYW1CK5kQ3MWTl0nrvOrWqhd3tXL9Oo+jCI7BCTgDLrgEdXALGqAJMHgEz+AVvFlP1ov1bn3MWwtWPnMI/sD6/AHmApcv</latexit>

wwwx,v

<latexit sha1_base64="7nWGp6aX6bd1MBEULKoQHavVwBA=">AAAB9HicbVBNSwMxEJ2tX7V+VT16CRbBg5Rdqeix6MVjBfsB7VKyadqGJtk1yVbLsr/DiwdFvPpjvPlvTNs9aOuDgcd7M8zMCyLOtHHdbye3srq2vpHfLGxt7+zuFfcPGjqMFaF1EvJQtQKsKWeS1g0znLYiRbEIOG0Go5up3xxTpVko780kor7AA8n6jGBjJb8TiSB5TLvJ09k47RZLbtmdAS0TLyMlyFDrFr86vZDEgkpDONa67bmR8ROsDCOcpoVOrGmEyQgPaNtSiQXVfjI7OkUnVumhfqhsSYNm6u+JBAutJyKwnQKboV70puJ/Xjs2/Ss/YTKKDZVkvqgfc2RCNE0A9ZiixPCJJZgoZm9FZIgVJsbmVLAheIsvL5PGedmrlC/uKqXqdRZHHo7gGE7Bg0uowi3UoA4EHuAZXuHNGTsvzrvzMW/NOdnMIfyB8/kDXf+Sfw==</latexit>

+

<latexit sha1_base64="3DnS3vImeIO9jz1IyYjxDR65oEU=">AAAB6HicbVDLSgNBEOyNrxhfUY9eBoMgCGFXInoMevGYgHlAsoTZSW8yZnZ2mZkVQsgXePGgiFc/yZt/4yTZgyYWNBRV3XR3BYng2rjut5NbW9/Y3MpvF3Z29/YPiodHTR2nimGDxSJW7YBqFFxiw3AjsJ0opFEgsBWM7mZ+6wmV5rF8MOME/YgOJA85o8ZK9YteseSW3TnIKvEyUoIMtV7xq9uPWRqhNExQrTuemxh/QpXhTOC00E01JpSN6AA7lkoaofYn80On5MwqfRLGypY0ZK7+npjQSOtxFNjOiJqhXvZm4n9eJzXhjT/hMkkNSrZYFKaCmJjMviZ9rpAZMbaEMsXtrYQNqaLM2GwKNgRv+eVV0rwse5XyVb1Sqt5mceThBE7hHDy4hircQw0awADhGV7hzXl0Xpx352PRmnOymWP4A+fzB3QdjLc=</latexit>

WWW s,s,WWW s,v,

<latexit sha1_base64="10zWS9OWZHJUzLGj6ZpET3V8+Co=">AAACBnicbVDLSgMxFM3UV62vUZciBIvgopQZqeiy6MZlBfuAdhgyaaYNTTJDkimUYVZu/BU3LhRx6ze4829M21nU1gMXTs65l9x7gphRpR3nxyqsrW9sbhW3Szu7e/sH9uFRS0WJxKSJIxbJToAUYVSQpqaakU4sCeIBI+1gdDf122MiFY3Eo57ExONoIGhIMdJG8u3TXsyDtJ35qaqorAIXnuOs4ttlp+rMAFeJm5MyyNHw7e9eP8IJJ0JjhpTquk6svRRJTTEjWamXKBIjPEID0jVUIE6Ul87OyOC5UfowjKQpoeFMXZxIEVdqwgPTyZEeqmVvKv7ndRMd3ngpFXGiicDzj8KEQR3BaSawTyXBmk0MQVhSsyvEQyQR1ia5kgnBXT55lbQuq26tevVQK9dv8ziK4AScgQvggmtQB/egAZoAgyfwAt7Au/VsvVof1ue8tWDlM8fgD6yvXzYzmPY=</latexit>

wwws,k, aaas

<latexit sha1_base64="I0BAguJoKmtzIpAGE8TBlv3UxhU=">AAACAnicbVDLSsNAFL3xWesr6krcDBbBRSmJVHRZdOOygn1AG8JkOm2HTiZhZqKUENz4K25cKOLWr3Dn3zhts9DWAxfOnHMvc+8JYs6Udpxva2l5ZXVtvbBR3Nza3tm19/abKkokoQ0S8Ui2A6woZ4I2NNOctmNJcRhw2gpG1xO/dU+lYpG40+OYeiEeCNZnBGsj+fYh6sZhkD5kfqrKo6w8e+LMV75dcirOFGiRuDkpQY66b391exFJQio04VipjuvE2kux1IxwmhW7iaIxJiM8oB1DBQ6p8tLpCRk6MUoP9SNpSmg0VX9PpDhUahwGpjPEeqjmvYn4n9dJdP/SS5mIE00FmX3UTzjSEZrkgXpMUqL52BBMJDO7IjLEEhNtUiuaENz5kxdJ86ziVivnt9VS7SqPowBHcAyn4MIF1OAG6tAAAo/wDK/wZj1ZL9a79TFrXbLymQP4A+vzByd0l0o=</latexit>

sssk�1 ! sssk

<latexit sha1_base64="iSFQSzdozAoOLDhNGsPV8dHO2Kg=">AAACBHicbVDLSgMxFM3UV62vUZfdBIvgxjIjFV0W3bisYB/QGYZMmmnDJJkhyQhl6MKNv+LGhSJu/Qh3/o1pO4K2HggczrmXm3PClFGlHefLKq2srq1vlDcrW9s7u3v2/kFHJZnEpI0TlsheiBRhVJC2ppqRXioJ4iEj3TC+nvrdeyIVTcSdHqfE52goaEQx0kYK7KqX8jBXkyCPT90J9HQCf5Q4sGtO3ZkBLhO3IDVQoBXYn94gwRknQmOGlOq7Tqr9HElNMSOTipcpkiIcoyHpGyoQJ8rPZyEm8NgoAxgl0jyh4Uz9vZEjrtSYh2aSIz1Si95U/M/rZzq69HMq0kwTgeeHooxBE3XaCBxQSbBmY0MQltT8FeIRkghr01vFlOAuRl4mnbO626if3zZqzauijjKogiNwAlxwAZrgBrRAG2DwAJ7AC3i1Hq1n6816n4+WrGLnEPyB9fEN0BqYNg==</latexit>

WWW t,s,WWW t,m, aaat

<latexit sha1_base64="1iHBemziWrvYqeHh4k4f2dVDMaI=">AAACEHicbVDLSgMxFM3UV62vqks3wSK6KGVGKrosunFZwT6gHYZMmmlDk5khuSOUYT7Bjb/ixoUibl26829MH0JtPRA495x7ubnHjwXXYNvfVm5ldW19I79Z2Nre2d0r7h80dZQoyho0EpFq+0QzwUPWAA6CtWPFiPQFa/nDm7HfemBK8yi8h1HMXEn6IQ84JWAkr3jajaWftjIvhbLOyniulL8lyTzwiiW7Yk+Al4kzIyU0Q90rfnV7EU0kC4EKonXHsWNwU6KAU8GyQjfRLCZ0SPqsY2hIJNNuOjkowydG6eEgUuaFgCfq/ERKpNYj6ZtOSWCgF72x+J/XSSC4clMexgmwkE4XBYnAEOFxOrjHFaMgRoYQqrj5K6YDoggFk2HBhOAsnrxMmucVp1q5uKuWatezOPLoCB2jM+SgS1RDt6iOGoiiR/SMXtGb9WS9WO/Wx7Q1Z81mDtEfWJ8/CcydOg==</latexit>

xxxk�1

<latexit sha1_base64="U7AW9mQbAr7EmhBz7XjX3Ce7fVE=">AAAB9HicdVDLSsNAFJ3UV62vqks3g0VwY5iERNtd0Y3LCvYBbSiT6aQdOnk4MymWkO9w40IRt36MO//G6UNQ0QMXDufcy733+AlnUiH0YRRWVtfWN4qbpa3tnd298v5BS8apILRJYh6Ljo8l5SyiTcUUp51EUBz6nLb98dXMb0+okCyObtU0oV6IhxELGMFKS14vCf3sPu9n4zMr75cryER21XVsiEzbRTWrpomLrNq5Ay0TzVEBSzT65ffeICZpSCNFOJaya6FEeRkWihFO81IvlTTBZIyHtKtphEMqvWx+dA5PtDKAQSx0RQrO1e8TGQ6lnIa+7gyxGsnf3kz8y+umKqh6GYuSVNGILBYFKYcqhrME4IAJShSfaoKJYPpWSEZYYKJ0TiUdwten8H/Ssk3LMd0bp1K/XMZRBEfgGJwCC1yAOrgGDdAEBNyBB/AEno2J8Wi8GK+L1oKxnDkEP2C8fQJMS5J3</latexit>

p✓,�

<latexit sha1_base64="56uHtmYdlptV3mBeu9A0ZPepG0Q=">AAAB/HicbVDLSgNBEJz1GeNrNUcvg0HwIGFXInoM6sFjBPOAbAizk04yZPbBTK8QlvVXvHhQxKsf4s2/cZLsQRMLGoqqbrq7/FgKjY7zba2srq1vbBa2its7u3v79sFhU0eJ4tDgkYxU22capAihgQIltGMFLPAltPzxzdRvPYLSIgofcBJDN2DDUAwEZ2iknl1K46yXejgCZGfeLUhkWc8uOxVnBrpM3JyUSY56z/7y+hFPAgiRS6Z1x3Vi7KZMoeASsqKXaIgZH7MhdAwNWQC6m86Oz+iJUfp0EClTIdKZ+nsiZYHWk8A3nQHDkV70puJ/XifBwVU3FWGcIIR8vmiQSIoRnSZB+0IBRzkxhHElzK2Uj5hiHE1eRROCu/jyMmmeV9xq5eK+Wq5d53EUyBE5JqfEJZekRu5InTQIJxPyTF7Jm/VkvVjv1se8dcXKZ0rkD6zPHx7tlRQ=</latexit>

U•,•U•,•U•,•

<latexit sha1_base64="VD6Be3xssqIavdN04myoL1ym8ZA=">AAACAnicbVDLSsNAFJ34rPUVdSVuBovgQkoiFV0W3bisYNpCE8JkOmmHzkzCzEQoIbjxV9y4UMStX+HOv3HaZqGtBy73cM69zNwTpYwq7Tjf1tLyyuraemWjurm1vbNr7+23VZJJTDycsER2I6QIo4J4mmpGuqkkiEeMdKLRzcTvPBCpaCLu9TglAUcDQWOKkTZSaB/6KY9yL8z9KGOM6LOyF0Vo15y6MwVcJG5JaqBEK7S//H6CM06Exgwp1XOdVAc5kppiRoqqnymSIjxCA9IzVCBOVJBPTyjgiVH6ME6kKaHhVP29kSOu1JhHZpIjPVTz3kT8z+tlOr4KcirSTBOBZw/FGYM6gZM8YJ9KgjUbG4KwpOavEA+RRFib1KomBHf+5EXSPq+7jfrFXaPWvC7jqIAjcAxOgQsuQRPcghbwAAaP4Bm8gjfryXqx3q2P2eiSVe4cgD+wPn8AEj+X4A==</latexit>

�t,k+1 ⇠

<latexit sha1_base64="pFjsGzuG926XMggqRe4IEatf5tM=">AAAB/HicdVDJSgNBEO1xjXGL5uilMQiCMvSERJNbUA8eI5gFMkPo6XSSJj0L3TVCGOKvePGgiFc/xJt/Y2cRVPRBweO9Kqrq+bEUGgj5sJaWV1bX1jMb2c2t7Z3d3N5+U0eJYrzBIhmptk81lyLkDRAgeTtWnAa+5C1/dDn1W3dcaRGFtzCOuRfQQSj6glEwUjeXd6+4BNpN4XR04kywq0XQzRWITYqVcqmIiV0sk6pTNaRMnOpZCTs2maGAFqh3c+9uL2JJwENgkmrdcUgMXkoVCCb5JOsmmseUjeiAdwwNacC1l86On+Ajo/RwP1KmQsAz9ftESgOtx4FvOgMKQ/3bm4p/eZ0E+hUvFWGcAA/ZfFE/kRgiPE0C94TiDOTYEMqUMLdiNqSKMjB5ZU0IX5/i/0mzaDslu3xTKtQuFnFk0AE6RMfIQeeohq5RHTUQQ2P0gJ7Qs3VvPVov1uu8dclazOTRD1hvnzN2lII=</latexit>

xk+1 ⇠

<latexit sha1_base64="yNCJ+sRnZa97gTxjQuiqQETHxlA=">AAAB83icdVDLSsNAFJ3UV62vqks3g0UQhDApiba7ohuXFewDmlAm00k7dCYJMxOxhP6GGxeKuPVn3Pk3Th+Cih64cDjnXu69J0w5UxqhD6uwsrq2vlHcLG1t7+zulfcP2irJJKEtkvBEdkOsKGcxbWmmOe2mkmIRctoJx1czv3NHpWJJfKsnKQ0EHsYsYgRrI/n3/Xx85kyhr5jolyvIRtWa51Yhsqseqjt1Qzzk1M9d6NhojgpYotkvv/uDhGSCxppwrFTPQakOciw1I5xOS36maIrJGA9pz9AYC6qCfH7zFJ4YZQCjRJqKNZyr3ydyLJSaiNB0CqxH6rc3E//yepmOakHO4jTTNCaLRVHGoU7gLAA4YJISzSeGYCKZuRWSEZaYaBNTyYTw9Sn8n7SrtuPa3o1baVwu4yiCI3AMToEDLkADXIMmaAECUvAAnsCzlVmP1ov1umgtWMuZQ/AD1tsn466RnQ==</latexit>

tk � tk�1

<latexit sha1_base64="r+hbt0vdq3NjOIxREhydzj8suzo=">AAAB83icdVDJSgNBEO2JW4xb1KOXxiB4ydAzTDS5Bb14jGAWSELo6XSSZnoWumuEMOQ3vHhQxKs/482/sbMIKvqg4PFeFVX1/EQKDYR8WLm19Y3Nrfx2YWd3b/+geHjU0nGqGG+yWMaq41PNpYh4EwRI3kkUp6EvedsPrud++54rLeLoDqYJ74d0HImRYBSM1INBgMswyIKyMxsUS8QmbrXiuZjYboXUnJohFeLULjzs2GSBElqhMSi+94YxS0MeAZNU665DEuhnVIFgks8KvVTzhLKAjnnX0IiGXPezxc0zfGaUIR7FylQEeKF+n8hoqPU09E1nSGGif3tz8S+vm8Ko2s9ElKTAI7ZcNEolhhjPA8BDoTgDOTWEMiXMrZhNqKIMTEwFE8LXp/h/0nJtx7Mrt16pfrWKI49O0Ck6Rw66RHV0gxqoiRhK0AN6Qs9Waj1aL9brsjVnrWaO0Q9Yb5+HiZFh</latexit>

P✓,x

<latexit sha1_base64="mz8jU2InqNC3zJVRfwt/hqDygJA=">AAAB/nicbVDLSsNAFJ34rPUVFVdugkVwISWRii6LblxWsA9oQphMJ+3QySTM3IglBPwVNy4Ucet3uPNvnLRZaOuBgcM593LPnCDhTIFtfxtLyyura+uVjerm1vbOrrm331FxKgltk5jHshdgRTkTtA0MOO0lkuIo4LQbjG8Kv/tApWKxuIdJQr0IDwULGcGgJd88dCMMoyDIWrmfuTCigM8ec9+s2XV7CmuROCWpoRIt3/xyBzFJIyqAcKxU37ET8DIsgRFO86qbKppgMsZD2tdU4IgqL5vGz60TrQysMJb6CbCm6u+NDEdKTaJATxZh1bxXiP95/RTCKy9jIkmBCjI7FKbcgtgqurAGTFICfKIJJpLprBYZYYkJ6MaqugRn/suLpHNedxr1i7tGrXld1lFBR+gYnSIHXaImukUt1EYEZegZvaI348l4Md6Nj9noklHuHKA/MD5/AMUelgQ=</latexit>

LognormalLognormalLognormal

<latexit sha1_base64="s9Vo1MH8DLHX39G5frcLWVtJ3aQ=">AAACAXicdVDLSgNBEJz1bXxFvQheBoPgKWziLiY30YsHDxFMIiQhzE4mccg8lpleMSzx4q948aCIV//Cm3/j5CGoaEFDUdVNd1cUC27B9z+8mdm5+YXFpeXMyura+kZ2c6tmdWIoq1IttLmKiGWCK1YFDoJdxYYRGQlWj/qnI79+w4zlWl3CIGYtSXqKdzkl4KR2dqcZyyhtArsFS9Nz3VPaSCKGw3Y25+f9w3JYKmFHwjDwwxEpBkG5jAt5f4wcmqLSzr43O5omkimggljbKPgxtFJigFPBhplmYllMaJ/0WMNRRSSzrXT8wRDvO6WDu9q4UoDH6veJlEhrBzJynZLAtf3tjcS/vEYC3VIr5SpOgCk6WdRNBAaNR3HgDjeMghg4Qqjh7lZMr4khFFxoGRfC16f4f1Ir5gtBPrwIcscn0ziW0C7aQweogI7QMTpDFVRFFN2hB/SEnr1779F78V4nrTPedGYb/YD39glSoJgg</latexit>

mmmk

<latexit sha1_base64="REot61VLIHrAokEBYNyt6oFsEVc=">AAAB/3icdVDNS8MwHE39nPOrKnjxEhyCp9LJZrfb0IvHCe4D1lLSLNvCkrYkqTBqD/4rXjwo4tV/w5v/jelWQUUfBB7vvV/yywtiRqWy7Q9jaXlldW29tFHe3Nre2TX39rsySgQmHRyxSPQDJAmjIekoqhjpx4IgHjDSC6aXud+7JULSKLxRs5h4HI1DOqIYKS355qEb8yDlmZ+6kc7l16TTLPPNim05jUbdbkLbsufIybntOE1YLZQKKND2zXd3GOGEk1BhhqQcVO1YeSkSimJGsrKbSBIjPEVjMtA0RJxIL53vn8ETrQzhKBL6hArO1e8TKeJSznigkxypifzt5eJf3iBRo4aX0jBOFAnx4qFRwqCKYF4GHFJBsGIzTRAWVO8K8QQJhJWurKxL+Pop/J90z6xqzapf1yqti6KOEjgCx+AUVIEDWuAKtEEHYHAHHsATeDbujUfjxXhdRJeMYuYA/IDx9gnTQ5dL</latexit>

(b) MTPP for observations pθ

+

<latexit sha1_base64="3DnS3vImeIO9jz1IyYjxDR65oEU=">AAAB6HicbVDLSgNBEOyNrxhfUY9eBoMgCGFXInoMevGYgHlAsoTZSW8yZnZ2mZkVQsgXePGgiFc/yZt/4yTZgyYWNBRV3XR3BYng2rjut5NbW9/Y3MpvF3Z29/YPiodHTR2nimGDxSJW7YBqFFxiw3AjsJ0opFEgsBWM7mZ+6wmV5rF8MOME/YgOJA85o8ZK9YteseSW3TnIKvEyUoIMtV7xq9uPWRqhNExQrTuemxh/QpXhTOC00E01JpSN6AA7lkoaofYn80On5MwqfRLGypY0ZK7+npjQSOtxFNjOiJqhXvZm4n9eJzXhjT/hMkkNSrZYFKaCmJjMviZ9rpAZMbaEMsXtrYQNqaLM2GwKNgRv+eVV0rwse5XyVb1Sqt5mceThBE7hHDy4hircQw0awADhGV7hzXl0Xpx352PRmnOymWP4A+fzB3QdjLc=</latexit>

⌧r�1 � ⌧r�2

<latexit sha1_base64="hsoMvyWDtqYhPDBy61dyHySSc0A=">AAAB/3icdVDLSsNAFJ34rPUVFdy4GSyCm4akb3dFNy4r2Ae0IUymk3bo5MHMRCgxC3/FjQtF3Pob7vwbJ32Aih64cDjnXu69x40YFdI0P7WV1bX1jc3cVn57Z3dvXz847Igw5pi0cchC3nORIIwGpC2pZKQXcYJ8l5GuO7nK/O4d4YKGwa2cRsT20SigHsVIKsnRjwcSxU7Ci1YKi0teSh29YBr1RqNcK0PTMGfISK1yUa9Ca6EUwAItR/8YDEMc+ySQmCEh+pYZSTtBXFLMSJofxIJECE/QiPQVDZBPhJ3M7k/hmVKG0Au5qkDCmfp9IkG+EFPfVZ0+kmPx28vEv7x+LL2GndAgiiUJ8HyRFzMoQ5iFAYeUEyzZVBGEOVW3QjxGHGGpIsurEJafwv9Jp2RYFaN6Uyk0Lxdx5MAJOAXnwAJ10ATXoAXaAIN78AiewYv2oD1pr9rbvHVFW8wcgR/Q3r8AZKCVuQ==</latexit>

ggg⌧,� , bbb�

<latexit sha1_base64="u0hrz+0nD91QkA9CsvS+KiZ/bFY=">AAACDnicbZDLSsNAFIYn9VbrLerSzWApuCglkYoui25cVrAXaEI4mU7boTNJmJkIJeQJ3Pgqblwo4ta1O9/G6WWhrT8M/HznHM6cP0w4U9pxvq3C2vrG5lZxu7Szu7d/YB8etVWcSkJbJOax7IagKGcRbWmmOe0mkoIIOe2E45tpvfNApWJxdK8nCfUFDCM2YAS0QYFd8RIRZsM8yDwNadUbghCQV/EMh3kwB4FddmrOTHjVuAtTRgs1A/vL68ckFTTShINSPddJtJ+B1Ixwmpe8VNEEyBiGtGdsBIIqP5udk+OKIX08iKV5kcYz+nsiA6HURISmU4AeqeXaFP5X66V6cOVnLEpSTSMyXzRIOdYxnmaD+0xSovnEGCCSmb9iMgIJRJsESyYEd/nkVdM+r7n12sVdvdy4XsRRRCfoFJ0hF12iBrpFTdRCBD2iZ/SK3qwn68V6tz7mrQVrMXOM/sj6/AHm9Zym</latexit>

ggg⌧,�

<latexit sha1_base64="pFDMHO4bOfKma37Dwzkg7hy+GzI=">AAAB/3icbVBNS8NAEN34WetXVPDiZbEIHqQkUtFj0YvHCvYDmlAm2226dDcJuxuhxBz8K148KOLVv+HNf+O2zUFbHww83pthZl6QcKa043xbS8srq2vrpY3y5tb2zq69t99ScSoJbZKYx7ITgKKcRbSpmea0k0gKIuC0HYxuJn77gUrF4uhejxPqCwgjNmAEtJF69iH2EhFkYd7LPA3pmReCEJD37IpTdabAi8QtSAUVaPTsL68fk1TQSBMOSnVdJ9F+BlIzwmle9lJFEyAjCGnX0AgEVX42vT/HJ0bp40EsTUUaT9XfExkIpcYiMJ0C9FDNexPxP6+b6sGVn7EoSTWNyGzRIOVYx3gSBu4zSYnmY0OASGZuxWQIEog2kZVNCO78y4ukdV51a9WLu1qlfl3EUUJH6BidIhddojq6RQ3URAQ9omf0it6sJ+vFerc+Zq1LVjFzgP7A+vwBIsKWMQ==</latexit>

GGGm,m,GGGm,�

<latexit sha1_base64="FngvNzVJMxVFqaVzfyjqdi063y8=">AAACCnicbVC7TsMwFHXKq5RXgJHFUCExVFWCimCsYICxSPQhNVHkuG5r1XYi20Gqosws/AoLAwix8gVs/A1um6G0HOlKx+fcK997wphRpR3nxyqsrK6tbxQ3S1vbO7t79v5BS0WJxKSJIxbJTogUYVSQpqaakU4sCeIhI+1wdDPx249EKhqJBz2Oic/RQNA+xUgbKbCPvZiH6W0WpLzCswqce3oDxDnKArvsVJ0p4DJxc1IGORqB/e31IpxwIjRmSKmu68TaT5HUFDOSlbxEkRjhERqQrqECcaL8dHpKBk+N0oP9SJoSGk7V+YkUcaXGPDSdHOmhWvQm4n9eN9H9Kz+lIk40EXj2UT9hUEdwkgvsUUmwZmNDEJbU7ArxEEmEtUmvZEJwF09eJq3zqlurXtzXyvXrPI4iOAIn4Ay44BLUwR1ogCbA4Am8gDfwbj1br9aH9TlrLVj5zCH4A+vrF00smqk=</latexit>

mmmr�2 ! mmmr�1

<latexit sha1_base64="cNPEoHsRSELjwZcVuMBY6VsI1Wg=">AAACCHicbVC7TsMwFHV4lvIKMDJgUSGxUCVVEYwVLIxFog+piSLHdVqrthPZDlIVZWThV1gYQIiVT2Djb3DbDKXlSJaOzrlX1+eECaNKO86PtbK6tr6xWdoqb+/s7u3bB4dtFacSkxaOWSy7IVKEUUFammpGuokkiIeMdMLR7cTvPBKpaCwe9DghPkcDQSOKkTZSYJ94CQ8zngeZvKjl0NMxnFPcPLArTtWZAi4TtyAVUKAZ2N9eP8YpJ0JjhpTquU6i/QxJTTEjedlLFUkQHqEB6RkqECfKz6ZBcnhmlD6MYmme0HCqzm9kiCs15qGZ5EgP1aI3Ef/zeqmOrv2MiiTVRODZoShl0MSdtAL7VBKs2dgQhCU1f4V4iCTC2nRXNiW4i5GXSbtWdevVy/t6pXFT1FECx+AUnAMXXIEGuANN0AIYPIEX8AberWfr1fqwPmejK1axcwT+wPr6BZvumbc=</latexit>

sssk

<latexit sha1_base64="DV7kuerZAkVNuchtuNE0l5TlO/U=">AAAB8nicdVDLSsNAFJ3UV62vqks3g0VwFSah0XZXdOOygn1AGspkOmmHTiZhZiKUkM9w40IRt36NO//G6UNQ0QMXDufcy733hClnSiP0YZXW1jc2t8rblZ3dvf2D6uFRVyWZJLRDEp7IfogV5UzQjmaa034qKY5DTnvh9Hru9+6pVCwRd3qW0iDGY8EiRrA2kj9I4zBXxTCfFsNqDdnIbXh1FyLb9VDTaRriIad5UYeOjRaogRXaw+r7YJSQLKZCE46V8h2U6iDHUjPCaVEZZIqmmEzxmPqGChxTFeSLkwt4ZpQRjBJpSmi4UL9P5DhWahaHpjPGeqJ+e3PxL8/PdNQIcibSTFNBlouijEOdwPn/cMQkJZrPDMFEMnMrJBMsMdEmpYoJ4etT+D/purZTt73beq11tYqjDE7AKTgHDrgELXAD2qADCEjAA3gCz5a2Hq0X63XZWrJWM8fgB6y3T2L6kgA=</latexit>

q�,�

<latexit sha1_base64="n+IOPmm4XmDc+E2IQ4krum+ZYLc=">AAAB+HicdVDLSsNAFJ3UV62PRl26GSyCCwlJ7GtZ1IXLCvYBTQiT6aQdOnk4MxFq6Je4caGIWz/FnX/jpK2gogcuHM65l3vv8RNGhTTND62wsrq2vlHcLG1t7+yW9b39rohTjkkHxyzmfR8JwmhEOpJKRvoJJyj0Gen5k4vc790RLmgc3chpQtwQjSIaUIykkjy9fOtlTjKmp84lYRLNPL1iGuaZbTbrUJGaWbetnNi1RrUOLcOcowKWaHv6uzOMcRqSSGKGhBhYZiLdDHFJMSOzkpMKkiA8QSMyUDRCIRFuNj98Bo+VMoRBzFVFEs7V7xMZCoWYhr7qDJEci99eLv7lDVIZNN2MRkkqSYQXi4KUQRnDPAU4pJxgyaaKIMypuhXiMeIIS5VVSYXw9Sn8n3Rtw6oatetqpXW+jKMIDsEROAEWaIAWuAJt0AEYpOABPIFn7V571F6010VrQVvOHIAf0N4+AQVmk1k=</latexit>

Q�,y

<latexit sha1_base64="O82aKMQ7V2t4lWSCGzzjdf0Ocpw=">AAAB/HicbVDLSsNAFL3xWesr2qWbwSK4kJJIRZdFNy5bsA9oQphMJ+3QyYOZiRBC/BU3LhRx64e482+ctllo64GBwzn3cs8cP+FMKsv6NtbWNza3tis71d29/YND8+i4J+NUENolMY/FwMeSchbRrmKK00EiKA59Tvv+9G7m9x+pkCyOHlSWUDfE44gFjGClJc+sOSFWE9/PO4WXO8mEXWSFZ9athjUHWiV2SepQou2ZX84oJmlII0U4lnJoW4lycywUI5wWVSeVNMFkisd0qGmEQyrdfB6+QGdaGaEgFvpFCs3V3xs5DqXMQl9PzqLKZW8m/ucNUxXcuDmLklTRiCwOBSlHKkazJtCICUoUzzTBRDCdFZEJFpgo3VdVl2Avf3mV9C4bdrNx1WnWW7dlHRU4gVM4BxuuoQX30IYuEMjgGV7hzXgyXox342MxumaUOzX4A+PzByrxlR0=</latexit>

LogNormalLogNormalLogNormal

<latexit sha1_base64="D7WLE+j/etF66sbp45HBdkJhYag=">AAACAXicbVBNS8NAEN34WetX1IvgJVgETyWRih6LXjyIVLAf0ISy2W7bpbvZsDsRS4gX/4oXD4p49V9489+4bXPQ1gcDj/dmmJkXxpxpcN1va2FxaXlltbBWXN/Y3Nq2d3YbWiaK0DqRXKpWiDXlLKJ1YMBpK1YUi5DTZji8HPvNe6o0k9EdjGIaCNyPWI8RDEbq2Pt+LMLUB/oAmqTXsn8jlcA8yzp2yS27EzjzxMtJCeWodewvvytJImgEhGOt254bQ5BiBYxwmhX9RNMYkyHu07ahERZUB+nkg8w5MkrX6UllKgJnov6eSLHQeiRC0ykwDPSsNxb/89oJ9M6DlEVxAjQi00W9hDsgnXEcTpcpSoCPDMFEMXOrQwZYYQImtKIJwZt9eZ40TspepXx6WylVL/I4CugAHaJj5KEzVEVXqIbqiKBH9Ixe0Zv1ZL1Y79bHtHXBymf20B9Ynz+te5ew</latexit>

TruncatedTruncatedTruncated

<latexit sha1_base64="5afmuY5LTvc9ChZ+ecxRuveTuEo=">AAAB/3icbVBNS8NAEN3Ur1q/ooIXL8EieCqJVPRY9OKxQr+gDWWznbZLN5uwOxFLzMG/4sWDIl79G978N24/Dtr6YODx3gwz84JYcI2u+23lVlbX1jfym4Wt7Z3dPXv/oKGjRDGos0hEqhVQDYJLqCNHAa1YAQ0DAc1gdDPxm/egNI9kDccx+CEdSN7njKKRuvZRJw6DtIPwgGlNJdLo0Muyrl10S+4UzjLx5qRI5qh27a9OL2JJCBKZoFq3PTdGP6UKOROQFTqJhpiyER1A21BJQ9B+Or0/c06N0nP6kTIl0ZmqvydSGmo9DgPTGVIc6kVvIv7ntRPsX/kpl3GCINlsUT8RDkbOJAynxxUwFGNDKFPc3OqwIVWUoYmsYELwFl9eJo3zklcuXdyVi5XreRx5ckxOyBnxyCWpkFtSJXXCyCN5Jq/kzXqyXqx362PWmrPmM4fkD6zPHzwBluU=</latexit>

�⌧,r ⇠

<latexit sha1_base64="wK04djhaYV0BE5IR6vS1KgTHGrc=">AAAB/HicdVDJSgNBEO1xjXEbzdFLYxA8SJjRLOYW1IPHCGaBTAg9nZ6kSc9Cd40QhvgrXjwo4tUP8ebf2JOMoKIPCh7vVVFVz40EV2BZH8bS8srq2npuI7+5tb2za+7tt1UYS8paNBSh7LpEMcED1gIOgnUjyYjvCtZxJ5ep37ljUvEwuIVpxPo+GQXc45SAlgZmwbliAsggcYDEJ3LmKO4PzKJVqtXPKlUbWyVrjpRULbtewXamFFGG5sB8d4YhjX0WABVEqZ5tRdBPiAROBZvlnVixiNAJGbGepgHxmeon8+Nn+EgrQ+yFUlcAeK5+n0iIr9TUd3WnT2Csfnup+JfXi8E77yc8iGJgAV0s8mKBIcRpEnjIJaMgppoQKrm+FdMxkYSCziuvQ/j6FP9P2qclu1yq3JSLjYssjhw6QIfoGNmohhroGjVRC1E0RQ/oCT0b98aj8WK8LlqXjGymgH7AePsEOhSVKw==</latexit>

VVV •,•

<latexit sha1_base64="4kC6Y4YPWSdTLzlwx1AZuMIigvA=">AAACAnicbVDLSsNAFL3xWesr6krcDBbBhZREKrosunFZwT6gCWEynbZDZ5IwMxFKCG78FTcuFHHrV7jzb5y2WWjrgcs9nHMvM/eECWdKO863tbS8srq2Xtoob25t7+zae/stFaeS0CaJeSw7IVaUs4g2NdOcdhJJsQg5bYejm4nffqBSsTi61+OE+gIPItZnBGsjBfahl4gwa+VB5oUp51SfFT0P7IpTdaZAi8QtSAUKNAL7y+vFJBU00oRjpbquk2g/w1Izwmle9lJFE0xGeEC7hkZYUOVn0xNydGKUHurH0lSk0VT9vZFhodRYhGZSYD1U895E/M/rprp/5WcsSlJNIzJ7qJ9ypGM0yQP1mKRE87EhmEhm/orIEEtMtEmtbEJw509eJK3zqlurXtzVKvXrIo4SHMExnIILl1CHW2hAEwg8wjO8wpv1ZL1Y79bHbHTJKnYO4A+szx8VVZfh</latexit>

gggm,⌧ , bbbm

<latexit sha1_base64="t4m4aMlz5oXUR+/f1IhWp1HKHuo=">AAACBHicbVDLSsNAFJ3UV62vqMtuBovgopREKrosunFZwT6gCWEynbZDZyZhZiKUkIUbf8WNC0Xc+hHu/BunaRbaeuDC4Zx7ufeeMGZUacf5tkpr6xubW+Xtys7u3v6BfXjUVVEiMengiEWyHyJFGBWko6lmpB9LgnjISC+c3sz93gORikbiXs9i4nM0FnREMdJGCuyqF/MwHWdByuueRklWh7kSZgEP7JrTcHLAVeIWpAYKtAP7yxtGOOFEaMyQUgPXibWfIqkpZiSreIkiMcJTNCYDQwXiRPlp/kQGT40yhKNImhIa5urviRRxpWY8NJ0c6Yla9ubif94g0aMrP6UiTjQReLFolDCoIzhPBA6pJFizmSEIS2puhXiCJMLa5FYxIbjLL6+S7nnDbTYu7pq11nURRxlUwQk4Ay64BC1wC9qgAzB4BM/gFbxZT9aL9W59LFpLVjFzDP7A+vwBD/GYXg==</latexit>

yyyr�1

<latexit sha1_base64="hPTM4VPV+xBFLD8chs9ef202tdU=">AAAB+HicdVDLSsNAFJ34rPXRqEs3g0VwY0js013RjcsK9gFtCJPppB06eTAzEWLIl7hxoYhbP8Wdf+OkjaCiBy4czrmXe+9xI0aFNM0PbWV1bX1js7RV3t7Z3avo+wd9EcYckx4OWciHLhKE0YD0JJWMDCNOkO8yMnDnV7k/uCNc0DC4lUlEbB9NA+pRjKSSHL0yjnw3TTInTfmZlWWOXjWNVrtda9agaZgL5KRZv2g1oFUoVVCg6+jv40mIY58EEjMkxMgyI2mniEuKGcnK41iQCOE5mpKRogHyibDTxeEZPFHKBHohVxVIuFC/T6TIFyLxXdXpIzkTv71c/MsbxdJr2ykNoliSAC8XeTGDMoR5CnBCOcGSJYogzKm6FeIZ4ghLlVVZhfD1Kfyf9M8Nq240burVzmURRwkcgWNwCizQAh1wDbqgBzCIwQN4As/avfaovWivy9YVrZg5BD+gvX0ClW2Ttg==</latexit>

yyyr ⇠

<latexit sha1_base64="pxLij24zbOEvFr/TSM/WefIUseY=">AAAB+3icdVDLSgMxFM3UV62vsS7dBIvgqkzt013RjcsK9gFtGTJp2oYmmSHJiMMwv+LGhSJu/RF3/o2ZdgQVPXDhcM693HuPFzCqtON8WLm19Y3Nrfx2YWd3b//APiz2lB9KTLrYZ74ceEgRRgXpaqoZGQSSIO4x0vcWV6nfvyNSUV/c6iggY45mgk4pRtpIrl0cBdyLo8SNY5kkcKQod+2SU262WtVGFTplZ4mUNGoXzTqsZEoJZOi49vto4uOQE6ExQ0oNK06gxzGSmmJGksIoVCRAeIFmZGioQJyocby8PYGnRpnAqS9NCQ2X6veJGHGlIu6ZTo70XP32UvEvbxjqaWscUxGEmgi8WjQNGdQ+TIOAEyoJ1iwyBGFJza0Qz5FEWJu4CiaEr0/h/6R3Xq7UyvWbWql9mcWRB8fgBJyBCmiCNrgGHdAFGNyDB/AEnq3EerRerNdVa87KZo7AD1hvnzpulTs=</latexit>

GGG⌧,m,GGG⌧,s, bbb⌧

<latexit sha1_base64="yUTDIdUUCYeX6z9GKX/2Q7e5bso=">AAACGHicdZDLSsNAFIYn9VbrrerSzWARXJSahLZ2WXShywr2Ak0Jk+m0HTqThJmJUEIew42v4saFIm67822cpBWs6A8DP985hzPn90JGpTLNTyO3tr6xuZXfLuzs7u0fFA+POjKIBCZtHLBA9DwkCaM+aSuqGOmFgiDuMdL1ptdpvftAhKSBf69mIRlwNPbpiGKkNHKLF07IvfgmcWNHoajMk/IqkEkZZsRL3BS4xZJZMat2w7SgNpZdszNjNurVOrS0SVUCS7Xc4twZBjjixFeYISn7lhmqQYyEopiRpOBEkoQIT9GY9LX1ESdyEGeHJfBMkyEcBUI/X8GM/pyIEZdyxj3dyZGayN+1FP5V60dq1BjE1A8jRXy8WDSKGFQBTFOCQyoIVmymDcKC6r9CPEECYaWzLOgQvi+F/5uOXbGqldpdtdS8WsaRByfgFJwDC1yCJrgFLdAGGDyCZ/AK3own48V4Nz4WrTljOXMMVmTMvwAWMKET</latexit>

(c)Posterior MTPP for missing events qφ

Figure 1: Overview and neural architecture of IMTPP. Panel (a) illustrates the problem setup and notation, as well as an overview of the components of IMTPP. Panel (b) shows the neural architecture of the MTPP of observations pθ. Panel (c) shows the neural architecture of the posterior MTPP of missing events qφ. Note that the information of ek+1 is used here to truncate the log-normal distribution, whereas the log-normal distribution of the prior MTPP is non-truncated. The prior MTPP for missing events has a simpler architecture and is therefore omitted from this illustration.

tk+1 = tk + ∆t,k+1. Here, the mark distribution is independent of ∆t,k+1. Finally, we note that θ = {W•,•, w•,•, U•,•, a•} are trainable parameters.

We would like to highlight that the proposed lognormal distribution of inter-arrival times ∆t,k allows an easy re-parameterization trick, Lognormal(µe, σe) = exp(µe + σe · Normal(0, 1)), which mitigates the variance of the estimated parameters and facilitates fast training and accurate prediction.
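The reparameterization above can be checked numerically. The sketch below (a toy illustration with NumPy; the values of µe and σe are illustrative assumptions, not learned outputs) samples gaps as exp(µe + σe · ε) with ε ∼ Normal(0, 1) and compares the empirical mean with the closed-form lognormal mean exp(µe + σe²/2).

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_lognormal_reparam(mu, sigma, size):
    # Reparameterization trick: Lognormal(mu, sigma) = exp(mu + sigma * Normal(0, 1)).
    # The randomness lives entirely in eps, so gradients can flow through mu and sigma.
    eps = rng.standard_normal(size)
    return np.exp(mu + sigma * eps)

# Illustrative parameter values (stand-ins for the network outputs).
mu_e, sigma_e = 0.5, 0.2
samples = sample_lognormal_reparam(mu_e, sigma_e, 200_000)

# Closed-form mean of Lognormal(mu, sigma^2) is exp(mu + sigma^2 / 2).
print(abs(samples.mean() - np.exp(mu_e + sigma_e**2 / 2)) < 0.01)
```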

4.3 Parameterization of qφ

At the very outset, qφ(• | ek+1, sk, mr−1) (Eq. 5) generates missing events that are likely to have been omitted during the interval (tk, tk+1), after the knowledge of the subsequent observed event ek+1 is taken into account. To ensure that missing events are generated within the desired interval (tk, tk+1), whenever an event is drawn with τr > tk+1, qφ(• | ek+1, sk, mr−1) is set to zero and k is set to r − 1; otherwise, k is flagged as ∞. Similar to the generator for observed events pθ, it also has a three-level neural architecture.

Input layer. Given the subsequent observed event tk+1 along with Sk, if εr−1 = (yr−1, τr−1) arrives with τr−1 < tk+1, or equivalently if r − 1 ≠ k, then we first convert τr−1 into a suitable representation:

γr−1 = gτ,γ τr−1 + gy,γ yr−1 + g∆,γ (τr−1 − τr−2) + bγ,

where g•,• and bγ are trainable parameters.

Hidden layer. Similar to the hidden layer used in the pθ model, the hidden layer here too embeds the sequence of missing events into finite-dimensional vectors m•, computed using an RNN in a recurrent manner. This layer takes γr−1 as input and feeds it into the RNN to update its hidden state as follows:

mr−1 = tanh (Gm,m mr−2 + Gm,γ γr−1 + (τr−1 − τr−2) gm,τ + bm) , (10)

where G•,•, g•,• and bm are trainable parameters.
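For intuition, the recurrent update in Eq. 10 can be sketched as below. The hidden dimension, the randomly initialized stand-ins for G•,•, gm,τ and bm, and the random embeddings in place of γr−1 are all illustrative assumptions, not the trained model.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 8  # illustrative hidden/embedding dimension

# Hypothetical stand-ins for the trainable parameters of Eq. 10.
G_mm = 0.1 * rng.normal(size=(D, D))   # G_{m,m}
G_mg = 0.1 * rng.normal(size=(D, D))   # G_{m,gamma}
g_mtau = 0.1 * rng.normal(size=D)      # g_{m,tau}
b_m = np.zeros(D)                      # b_m

def update_missing_state(m_prev, gamma, gap):
    # m_{r-1} = tanh(G_{m,m} m_{r-2} + G_{m,gamma} gamma_{r-1}
    #                + (tau_{r-1} - tau_{r-2}) g_{m,tau} + b_m)
    return np.tanh(G_mm @ m_prev + G_mg @ gamma + gap * g_mtau + b_m)

m = np.zeros(D)  # initial hidden state
for tau_prev, tau in [(0.0, 0.4), (0.4, 0.9)]:
    gamma = rng.normal(size=D)  # stand-in for the input-layer embedding gamma_{r-1}
    m = update_missing_state(m, gamma, tau - tau_prev)

print(m.shape)  # (8,)
```

The inter-arrival gap enters the update both through γr−1 and as an explicit scalar multiplier of gm,τ, so the state is sensitive to the timing, not just the order, of missing events.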

Output layer. The next level is the output layer, which computes both qφ,∆(·) and Qφ,y(·) based on mr−1 and sk. To compute these quantities, it takes five signals as input: (i) the current update of the hidden state mr−1 of the RNN in the previous layer, (ii) the current update of the hidden state sk that embeds the history of observed events, (iii) the timing of the last observed event, tk, (iv) the timing of the last missing event, τr−1, and (v) the timing of the next observation, tk+1. To this end, we have the density of inter-arrival times as

qφ,∆(∆τ,r | ek+1, sk, mr−1) = Lognormal(µε(mr−1, sk), σε²(mr−1, sk)) · ⟦τr−1 + ∆τ,r < tk+1⟧, (11)

with [µε(mr−1, sk), σε(mr−1, sk)] = G⊤τ,m mr−1 + G⊤τ,s sk + bτ; and the mark distribution as

Qφ,y(yr = y | ∆τ,r, ek+1, sk, mr−1) = ⟦τr−1 + ∆τ,r < tk+1⟧ · exp(V⊤y,s sk + V⊤y,m mr−1) / Σy′∈C exp(V⊤y′,s sk + V⊤y′,m mr−1). (12)

Hence, we have:

∆τ,r ∼ qφ,∆(• | ek+1, sk,mr−1)


If ∆τ,r < tk+1 − τr−1 :
    τr = τr−1 + ∆τ,r,
    yr ∼ Qφ,y(yr = y | ∆τ,r, ek+1, sk, mr−1),
    k = ∞ (allow more missing events);
Otherwise:
    k = r − 1.

Here, note that the mark distribution depends on ∆τ,r. Finally, φ = {G•,•, g•,•, V•,•, b•} are trainable parameters.

The distributions in Eqs. 11–12 ensure that, given the first k + 1 observations, qφ generates the missing events only for (tk, tk+1) and not for further subsequent intervals.
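The sampling procedure amounts to drawing lognormal gaps until one overshoots the next observed event. A minimal sketch of this loop follows, assuming fixed µε, σε and fixed mark logits in place of the state-dependent outputs of the network:

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def draw_missing_events(t_k, t_k1, mu, sigma, mark_logits, max_events=1000):
    # Sketch of q_phi's generation loop: draw gaps Delta ~ Lognormal(mu, sigma^2)
    # and accept an event only while tau_{r-1} + Delta < t_{k+1}, i.e. while the
    # indicator in Eqs. 11-12 holds; stop at the first overshoot.
    events, tau = [], t_k
    for _ in range(max_events):
        delta = np.exp(mu + sigma * rng.standard_normal())
        if tau + delta >= t_k1:
            break  # indicator fails: no more missing events in (t_k, t_{k+1})
        tau += delta
        y = rng.choice(len(mark_logits), p=softmax(mark_logits))
        events.append((y, tau))
    return events

missing = draw_missing_events(t_k=0.0, t_k1=2.0, mu=-1.0, sigma=0.5,
                              mark_logits=np.array([0.2, -0.1, 0.4]))
print(all(0.0 < tau < 2.0 for _, tau in missing))  # True
```

Every accepted event lands strictly inside (tk, tk+1), matching the truncation described above; the full model would instead recompute (µε, σε) and the mark logits from (mr−1, sk) before each draw.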

4.4 Prior MTPP model p_prior

We model the prior density (Eq. 6) of the arrival times of the missing events as

p_{prior,∆}(∆_{τ,r} | s_k, m_{r−1}) = Lognormal(μ(s_k, m_{r−1}), σ²(s_k, m_{r−1})),

with [μ(s_k, m_{r−1}), σ²(s_k, m_{r−1})] = q_{μ,m}^⊤ m_{r−1} + q_{μ,s}^⊤ s_k + c; and the mark distribution of the missing events as

P_{prior,y}(y_r = y | ∆_{τ,r}, s_k, m_{r−1}) = exp(Q_{y,s}^⊤ s_k + Q_{y,m}^⊤ m_{r−1}) / Σ_{y′∈C} exp(Q_{y′,s}^⊤ s_k + Q_{y′,m}^⊤ m_{r−1}).   (13)

All parameters Q_{•,•}, q_{•,•} and c are set a priori.

4.5 Training θ and φ

Note that the trainable parameters for the observed and posterior MTPPs are θ = {w_{•,•}, W_{•,•}, a_•, U_{•,•}} and φ = {g_{•,•}, G_{•,•}, b_•, V_{•,•}}, respectively. Given a history S_K of observed events, we aim to learn θ and φ by maximizing the ELBO defined in Eq. 3, i.e.,

max_{θ,φ} ELBO(θ, φ).   (14)

We compute the optimal parameters θ* and φ* that maximize ELBO(θ, φ) by means of stochastic gradient descent (SGD) (Rumelhart et al., 1986).
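As a toy illustration of the SGD step (not the full ELBO, which also involves the posterior over missing events), the snippet below fits the μ, σ of a single log-normal inter-arrival distribution to observed gaps by stochastic gradient ascent on the log-likelihood, using the closed-form gradients of the log-normal density:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "gaps" drawn from a known log-normal; we recover (μ, σ) by SGD.
true_mu, true_sigma = 0.5, 0.3
gaps = rng.lognormal(true_mu, true_sigma, size=5000)

mu, sigma = 0.0, 1.0
lr = 0.02
for step in range(600):
    z = np.log(rng.choice(gaps, size=256))                        # minibatch
    grad_mu = np.mean((z - mu) / sigma**2)                        # ∂ log-lik / ∂μ
    grad_sigma = np.mean((z - mu) ** 2 / sigma**3 - 1.0 / sigma)  # ∂ log-lik / ∂σ
    mu += lr * grad_mu                                            # gradient *ascent*
    sigma += lr * grad_sigma
```

In the full model, the same gradient-ascent step is applied to all parameters of p_θ and q_φ jointly, with the ELBO as the objective.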

4.6 Salient features of our proposal

It is worth noting the similarity of our modeling and inference framework to variational autoencoders (Chung et al., 2015; Doersch, 2016; Bowman et al., 2015), with q_φ and p_θ playing the roles of encoder and decoder, respectively, while p_prior plays the role of the prior distribution over latent events. However, the random seeds in our model are not simply noise, as they are interpreted in autoencoders; in IMTPP they can be concretely interpreted as missing events, which makes our model physically interpretable.

Secondly, note that the proposal of Mei et al. (2019) aims to impute missing events based on the entire observation sequence S_K, rather than to predict observed events in the face of missing events. For this purpose, it uses a bi-directional RNN and, whenever a new observation arrives, it re-generates all missing events by making a completely new pass over the backward RNN. As a consequence, such an imputation method suffers from quadratic complexity with respect to the number of observed events. In contrast, our proposal is designed to generate subsequent observed and missing events rather than to impute missing events in between observed events2. To that aim, we only make forward computations and therefore do not need to re-generate all missing events whenever a new observation arrives, which makes our method much more efficient than that of Mei et al. (2019) in terms of both learning and prediction. Through our experiments we also show the exceptionally time-effective operation of IMTPP over other missing-data models.

Finally, unlike most of the prior work (Du et al., 2016; Zhang et al., 2020; Mei et al., 2019; Mei and Eisner, 2017; Shelton et al., 2018; Zuo et al., 2020), we model the distribution of inter-arrival times using a log-normal. While Shchur et al. (2020) also model inter-arrival times using a log-normal, they do not focus on predicting observations in the face of missing events. However, it is important to reiterate (see Shchur et al. (2020) for details) that this modeling choice offers significant advantages over intensity-based models: it provides an easy re-parameterization trick for efficient training, allows a closed-form expression for expected arrival times, and enables usability for supervised training as well.
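Two of these advantages are easy to verify directly: a log-normal sample is a deterministic, differentiable transform of standard Gaussian noise (the re-parameterization trick), and its mean has the closed form E[∆] = exp(μ + σ²/2). A short self-contained check:

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 0.2, 0.5

# Re-parameterization trick: Δ = exp(μ + σ·ε) with ε ~ N(0, 1), so each sample
# is a differentiable function of (μ, σ) — handy for gradient-based training.
eps = rng.standard_normal(100_000)
samples = np.exp(mu + sigma * eps)

# Closed-form expected inter-arrival time: E[Δ] = exp(μ + σ²/2); no numerical
# integration of an intensity function is needed for time prediction.
expected = np.exp(mu + sigma**2 / 2)
```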

5 Experiments

In this section, we report a comprehensive empirical evaluation of IMTPP. Specifically, we address the following research questions. RQ1: Can IMTPP accurately predict the dynamics of the missing events? RQ2: What is the mark and time prediction performance of IMTPP in comparison to the state-of-the-art baselines? Where are the gains and losses? RQ3: How does IMTPP perform in long-term forecasting? RQ4: How does the efficiency of IMTPP compare with the proposal of Mei et al. (2019)?

2 However, note that the posterior distribution q_φ can easily be used to impute missing events between already occurred events.


Mean Absolute Error (MAE):

Method                       Movies  Toys   Taxi   Retweet  SO     Foursquare
HP (Hawkes, 1971)            0.060   0.062  0.220  0.049    0.010  0.098
SMHP (Liniger, 2009)         0.062   0.061  0.213  0.051    0.008  0.091
RMTPP (Du et al., 2016)      0.053   0.048  0.128  0.040    0.005  0.047
SAHP (Zhang et al., 2020)    0.072   0.073  0.174  0.081    0.017  0.108
THP (Zuo et al., 2020)       0.068   0.057  0.193  0.047    0.006  0.052
PFPP (Mei et al., 2019)      0.058   0.055  0.181  0.042    0.007  0.076
HPMD (Shelton et al., 2018)  0.060   0.061  0.208  0.048    0.008  0.087
IMTPP (our proposal)         0.049   0.045  0.108  0.038    0.005  0.041

Mean Prediction Accuracy (MPA):

Method                       Movies  Toys   Taxi   Retweet  SO     Foursquare
HP (Hawkes, 1971)            0.482   0.685  0.894  0.531    0.418  0.523
SMHP (Liniger, 2009)         0.501   0.683  0.893  0.554    0.423  0.520
RMTPP (Du et al., 2016)      0.548   0.734  0.929  0.572    0.446  0.605
SAHP (Zhang et al., 2020)    0.458   0.602  0.863  0.461    0.343  0.459
THP (Zuo et al., 2020)       0.537   0.724  0.931  0.526    0.458  0.624
PFPP (Mei et al., 2019)      0.559   0.738  0.925  0.569    0.437  0.582
HPMD (Shelton et al., 2018)  0.513   0.688  0.907  0.558    0.439  0.531
IMTPP (our proposal)         0.574   0.746  0.938  0.577    0.451  0.612

Table 1: Performance of all the methods in terms of time prediction error (MAE) and mark prediction accuracy (MPA) across all datasets on the 20% test set. Numbers with bold font (boxes) indicate the best (second-best) performer. It shows that IMTPP is the best performer on a majority of the datasets.

5.1 Experimental setup

Datasets. We utilize a publicly available synthetic dataset3 used in Zhang et al. (2020) to address RQ1. Here, we randomly sample 10% of events and tag them as missing. For all other experiments, we use six diverse datasets: Amazon movies (Movies) (Ni et al., 2019), Amazon toys (Toys) (Ni et al., 2019), NYC-Taxi (Taxi), Retweet (Zhao et al., 2015), Stackoverflow (SO) (Du et al., 2016), and Foursquare (Yang et al., 2019), as described below.

(1) Amazon Movies (Ni et al., 2019). For this dataset we consider the reviews given to items under the category "Movies" on Amazon. For each item, we take the time of the written review as the event time in the sequence and the rating as the corresponding mark.

(2) Amazon Toys (Ni et al., 2019). Similar to Amazon Movies, but here we consider the items under the category "Toys".

(3) NYC Taxi4. Each sequence corresponds to a series of time-stamped pick-up and drop-off events of a taxi in New York City, with locations as marks.

(4) Retweet (Zhao et al., 2015). Similar to (Mei and Eisner, 2017), we group retweeting users into three classes based on their connectivity: ordinary users (degree lower than the median), popular users (degree lower than the 95th percentile), and influencers (degree higher than the 95th percentile). Each stream of retweets is treated as a sequence of events with the retweet time as the event time and the user class as the mark.

(5) Stack Overflow. Similar to (Du et al., 2016), we treat the badge awarded to a user on the Stack Overflow forum as a mark. Thus each user corresponds to a sequence of events, with times corresponding to the times of badge affiliation.

(6) Foursquare. As a novel evaluation dataset, we use Foursquare (a location search and discovery app) crawls (Yang et al., 2019) to construct a collection of check-ins from Japan. Each user has a sequence in which the mark corresponds to the type of the check-in location (e.g., "Jazz Club") and the time is the timestamp of the check-in.

3 https://github.com/QiangAIResearcher/sahp_repo
4 https://chriswhong.com/open-data/foil_nyc_taxi/

Figure 2: Qualitative analysis of the mark and time prediction performance of IMTPP on the synthetic dataset. The top panel shows the observed events (bars in blue) as well as the events hidden during learning (bars in brown) between timestamps 6.0 and 6.4. The lower panel shows the events predicted by IMTPP for the same sequence. Marks are represented using differently colored (cyan and purple) circles.

Baselines. We compare IMTPP with seven baselines: (i) Hawkes process (HP) (Hawkes, 1971; Du et al., 2016), (ii) self-modulating Hawkes process (SMHP) (Liniger, 2009), (iii) recurrent marked temporal point process (RMTPP) (Du et al., 2016), (iv) self-attentive Hawkes process (SAHP) (Zhang et al., 2020), (v) Transformer Hawkes process (THP) (Zuo et al., 2020), (vi) particle filtering of point processes with missing events (PFPP) (Mei et al., 2019), and (vii) Hawkes process with missing data (HPMD) (Shelton et al., 2018).

Evaluation protocol. Given a stream of N observed events S_N, we split it into a training set S_K and a test set S_N \ S_K, where the training set (test set) consists of the first 80% (last 20%) of events, i.e., K = ⌈0.8N⌉. We train IMTPP and the baselines on S_K and then evaluate the trained models on the test set S_N \ S_K in terms of (i) mean absolute error (MAE) of predicted times, i.e., (1/|S_N \ S_K|) Σ_{e_i ∈ S_N \ S_K} E[|t_i − t̂_i|], and (ii) mark prediction accuracy (MPA), i.e., (1/|S_N \ S_K|) Σ_{e_i ∈ S_N \ S_K} P(x_i = x̂_i). Here t̂_i and x̂_i are the predicted time and mark of the i-th event in the test set. Note that such predictions are made only on observed events in real datasets.
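A minimal sketch of the two metrics, assuming aligned arrays of true and predicted times/marks over the test events:

```python
import numpy as np

def mae_and_mpa(true_times, pred_times, true_marks, pred_marks):
    """MAE = mean |t_i − t̂_i| over test events; MPA = fraction of
    correctly predicted marks."""
    true_times, pred_times = np.asarray(true_times), np.asarray(pred_times)
    mae = np.mean(np.abs(true_times - pred_times))
    mpa = np.mean(np.asarray(true_marks) == np.asarray(pred_marks))
    return mae, mpa

mae, mpa = mae_and_mpa([1.0, 2.0, 3.5], [1.1, 1.8, 3.5],
                       [0, 1, 2], [0, 1, 1])
# mae = (0.1 + 0.2 + 0.0) / 3 = 0.1; mpa = 2/3
```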

For our experiments in this paper, we make our code and data public at https://github.com/data-iitd/imtpp. Our code uses Tensorflow5 v1.13.1 and Tensorflow-Probability6 v0.6.0.

5.2 Implementation details

Parameter Settings. For our experiments, we set dim(v_•) = 16 and dim(γ_•) = 32, where v_• and γ_• are the outputs of the first layers in p*_θ and q*_φ, respectively; the sizes of the hidden states as dim(h_•) = 64 and dim(z_•) = 128; and the batch size as B = 64. In addition, we apply an l2 regularizer over the parameters, with a regularization coefficient of 0.001.

System Configuration. All our experiments were run on a server running Ubuntu 16.04, with an Intel(R) Xeon(R) Gold 5118 CPU @ 2.30GHz, 125GB RAM, and an NVIDIA Tesla T4 GPU with 16GB GDDR6 memory.

Baseline Implementation Details. For the Markov chain implementations, we use the code7 by Du et al. (2016), and for RMTPP we use the Python-based implementation8. For the Hawkes and self-modulating Hawkes processes, we use the code9 made available by Mei and Eisner (2017). Since HP and SMHP (Liniger, 2009) generate a sequence of events of a specified length from the weights learned over the training set, we generate N sequences as per the data, S = {s_1, s_2, · · ·, s_N}, each with the maximum sequence length; for evaluation, we consider the first l_i events of each sequence i. Recent research (Li et al., 2018) reports that the neural Hawkes process (Mei and Eisner, 2017) and RMTPP (Du et al., 2016) show similar performance; such deviations are subject to extensive parameter tuning, which, however, is beyond the scope of this paper. For the rest of the baselines, we used the implementations provided by the respective authors: PFPP10, HPMD11, THP12, and SAHP13.

5.3 Results

Prediction of missing events. To address research question RQ1, we first qualitatively demonstrate the ability of IMTPP to predict missing events in a sequence. To do so, we train IMTPP on the observed events and use the trained model to predict both the observed and the missing events. Figure 2 provides an illustrative setting of our results. It shows that IMTPP provides many accurate mark predictions. In addition, it qualitatively shows that the predicted inter-arrival times closely match the true inter-arrival times.

5 https://www.tensorflow.org/
6 https://www.tensorflow.org/probability
7 https://github.com/dunan/NeuralPointProcess
8 https://github.com/musically-ut/tf_rmtpp
9 https://github.com/HMEIatJHU/neurawkes
10 https://github.com/HMEIatJHU/neural-hawkes-particle-smoothing
11 https://github.com/cshelton/hawkesinf
12 https://github.com/SimiaoZuo/Transformer-Hawkes-Process
13 https://github.com/QiangAIResearcher/sahp_repo

Figure 3: Real-life examples of true and predicted inter-arrival times ∆_{t,k} of different events e_k, plotted against k for k ∈ {K + 1, . . . , N}.

Comparative analysis on real data. Next, we address research question RQ2. More specifically, we compare the performance of IMTPP with all the baselines introduced above on all six datasets.

— Analysis of MAE and MPA: Table 1 summarizes the results, which sketch the comparative analysis in terms of mean absolute error (MAE) of time prediction and mark prediction accuracy (MPA). They reveal the following observations. (1) IMTPP exhibits a steady improvement over all the baselines on most of the datasets, for both time and mark prediction. However, on the Stackoverflow and Foursquare datasets, THP outperforms all other models, including IMTPP, in terms of MPA. (2) RMTPP is the second-best performer in terms of MAE of time prediction on almost all datasets. In fact, on the Stackoverflow (SO) dataset, it shares the lowest MAE with IMTPP. However, there is no consistent second-best performer in terms of MPA. Notably, PFPP and IMTPP, which take missing events into account, are the second-best performers on four datasets. (3) Both PFPP (Mei et al., 2019) and HPMD (Shelton et al., 2018) fare poorly with respect to RMTPP in terms of MAE. This is because PFPP focuses on imputing missing events based on the complete observations and is not well suited to predicting observed events in the face of missing observations. In fact, PFPP does not offer a joint training mechanism for the MTPP of observed events and the imputation model; rather, it trains an imputation model based on an observation model learned a priori. On the other hand, HPMD assumes only a linear Hawkes process with a known influence structure, and therefore shows poor performance with respect to RMTPP.

Figure 4: Performance gain in terms of AE(baseline) − AE(IMTPP), i.e., the gain (above the x-axis) or loss (below the x-axis) of the average error per event E[|t_k − t̂_k|] of IMTPP with respect to two competitive baselines, RMTPP and PFPP, on (a) Movies and (b) Toys. Events in the test set are sorted along the x-axis by decreasing gain of IMTPP.

Figure 5: Variation of the forecasting performance of IMTPP and RMTPP in terms of MAE and MPA at predicting the next i-th event, against i, for the Movies dataset: panel (a) shows MAE and panel (b) shows MPA.

— Qualitative analysis: In addition, Figure 3 provides some real-life examples taken from the Movies and Toys datasets, which qualitatively show that the predicted inter-arrival times closely match the true inter-arrival times.

— Drill-down analysis: Next, we provide a comparative analysis of the time prediction performance at every event in the test set. To this end, for each observed event e_k in the test set, we compute the gain (or loss) IMTPP achieves in terms of the average error per event E[|t_k − t̂_k|], i.e., AE(baseline) − AE(IMTPP), for two competitive baselines, RMTPP and PFPP, on the Movies and Toys datasets. Figure 4 summarizes the results, which show that IMTPP outperforms the most competitive baseline (RMTPP) on more than 70% of the events across both the Movies and Toys datasets.
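The per-event gain AE(baseline) − AE(IMTPP) used in Figure 4 can be computed and sorted as follows (a small self-contained sketch with toy numbers):

```python
import numpy as np

def per_event_gain(true_t, pred_t_baseline, pred_t_ours):
    """Per-event gain of one model over another, sorted in decreasing order
    as plotted in Figure 4. Positive entries mean our model wins."""
    ae_base = np.abs(np.asarray(true_t) - np.asarray(pred_t_baseline))
    ae_ours = np.abs(np.asarray(true_t) - np.asarray(pred_t_ours))
    return np.sort(ae_base - ae_ours)[::-1]

gain = per_event_gain([1.0, 2.0, 3.0], [1.3, 2.1, 3.0], [1.1, 2.2, 3.0])
# gains before sorting: 0.3−0.1 = 0.2, 0.1−0.2 = −0.1, 0.0
```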

Forecasting future events. Here, we aim to address research question RQ3. To make a more challenging evaluation of IMTPP against its competitors, we design a difficult event prediction task in which we predict the next n events given only the current event as input. To do so, we keep sampling events using the trained models p_θ and q_φ until the n-th prediction. Such an evaluation protocol effectively requires accurate inference of the missing-data distribution since, unlike during the training phase, the future observations are not fed into the missing-event model. To this end, we compare the forecasting performance of IMTPP against RMTPP, the most competitive baseline.

Figure 6: Runtime of PFPP and IMTPP on the Movies dataset. Panel (a) shows runtime vs. the length of the training sequence |S_K| and panel (b) shows runtime vs. the number of epochs.

Figure 5 summarizes the results, which show that (i) the performance of all the algorithms deteriorates as n increases and (ii) IMTPP achieves a 5.5% improvement in MPA and a significantly better 10.12% improvement in MAE over RMTPP on the Movies dataset.
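The n-step forecasting protocol described above can be sketched as follows, with a stub sampler standing in for the trained p_θ/q_φ pair (the gap and mark distributions here are placeholders, not the learned model):

```python
import numpy as np

rng = np.random.default_rng(5)

def next_event_sampler(history):
    """Stand-in for the trained model: returns the next (gap, mark).
    A real implementation would condition on the history embeddings."""
    gap = rng.lognormal(mean=-1.0, sigma=0.5)
    mark = int(rng.integers(0, 3))
    return gap, mark

def forecast(t0, n):
    """Autoregressively roll out n future events from time t0, without
    feeding back any ground-truth future observations."""
    history, t = [], t0
    for _ in range(n):
        gap, mark = next_event_sampler(history)
        t += gap
        history.append((t, mark))
    return history

preds = forecast(t0=10.0, n=5)
```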

Scalability analysis. Here, we address research question RQ4 by comparing the runtime of IMTPP with that of PFPP across the number of training epochs as well as the length of the training sequence |S_K|. Figure 6 summarizes the results, which show that IMTPP enjoys better latency than PFPP. In particular, we observe that the runtime of PFPP increases quadratically with respect to |S_K|, whereas the runtime of IMTPP increases linearly. The quadratic complexity of PFPP is due to the presence of a backward RNN, which requires a complete pass whenever a new event arrives.

6 Conclusion

In this paper, we provide a method for incorporating missing events when training marked temporal point processes, which simultaneously samples missing as well as observed events across continuous time. Most earlier methods relied on complete data, an ideal and rare setting. Experiments on several real datasets from diverse application domains show that our proposal outperforms several alternatives. Since including missing data improves over standard learning procedures, this observation opens avenues for further research that includes missing data. We plan to further extend our model to incorporate partially missing aspects, i.e., where either the time or the mark is missing from the event data, which may give even more flexibility to our model.


Acknowledgements

Abir De acknowledges the DST Inspire research grant.

References

Bacry, E., Mastromatteo, I., and Muzy, J.-F. (2015). Hawkes processes in finance. Market Microstructure and Liquidity, 1(01):1550005.

Bowman, S. R., Vilnis, L., Vinyals, O., Dai, A. M., Jozefowicz, R., and Bengio, S. (2015). Generating sentences from a continuous space. arXiv preprint arXiv:1511.06349.

Che, Z., Purushotham, S., Cho, K., Sontag, D., and Liu, Y. (2018). Recurrent neural networks for multivariate time series with missing values. Nature Scientific Reports.

Chung, J., Kastner, K., Dinh, L., Goel, K., Courville, A. C., and Bengio, Y. (2015). A recurrent latent variable model for sequential data. In NIPS.

Daley, D. J. and Vere-Jones, D. (2007). An Introduction to the Theory of Point Processes: Volume II: General Theory and Structure. Springer Science & Business Media.

De, A., Valera, I., Ganguly, N., Bhattacharya, S., and Gomez-Rodriguez, M. (2016). Learning and forecasting opinion dynamics in social networks. In NIPS.

Doersch, C. (2016). Tutorial on variational autoencoders. arXiv preprint arXiv:1606.05908.

Du, N., Dai, H., Trivedi, R., Upadhyay, U., Gomez-Rodriguez, M., and Song, L. (2016). Recurrent marked temporal point processes: Embedding event history to vector. In KDD.

Du, N., Farajtabar, M., Ahmed, A., Smola, A. J., and Song, L. (2015). Dirichlet-Hawkes processes with applications to clustering continuous-time document streams. In KDD.

Farajtabar, M., Yang, J., Ye, X., Xu, H., Trivedi, R., Khalil, E., Li, S., Song, L., and Zha, H. (2017). Fake news mitigation via point process based intervention. In ICML.

Guo, R., Li, J., and Liu, H. (2018). Initiator: Noise-contrastive estimation for marked temporal point process. In IJCAI.

Hawkes, A. G. (1971). Spectra of some self-exciting and mutually exciting point processes. Biometrika, 58(1).

Huang, H., Wang, H., and Mak, B. (2019). Recurrent Poisson process unit for speech recognition. In AAAI.

Jing, H. and Smola, A. J. (2017). Neural survival recommender. In WSDM.

Kumar, S., Zhang, X., and Leskovec, J. (2019). Predicting dynamic embedding trajectory in temporal interaction networks. In KDD.

Li, S., Xiao, S., Zhu, S., Du, N., Xie, Y., and Song, L. (2018). Learning temporal point processes via reinforcement learning. In NIPS.

Likhyani, A., Gupta, V., Srijith, P., Deepak, P., and Bedathur, S. (2020). Modeling implicit communities from geo-tagged event traces using spatio-temporal point processes. In WISE.

Liniger, T. J. (2009). Multivariate Hawkes processes. PhD thesis, ETH Zurich.

Little, R. J. and Rubin, D. B. (2019). Statistical Analysis with Missing Data, volume 793. John Wiley & Sons.

Lorch, L., De, A., Bhatt, S., Trouleau, W., Upadhyay, U., and Gomez-Rodriguez, M. (2018). Stochastic optimal control of epidemic processes in networks. arXiv preprint arXiv:1810.13043.

Mei, H. and Eisner, J. M. (2017). The neural Hawkes process: A neurally self-modulating multivariate point process. In NIPS.

Mei, H., Qin, G., and Eisner, J. (2019). Imputing missing events in continuous-time event streams. In ICML.

Ni, J., Li, J., and McAuley, J. (2019). Justifying recommendations using distantly-labeled reviews and fine-grained aspects. In EMNLP-IJCNLP, pages 188–197.

Rizoiu, M.-A., Mishra, S., Kong, Q., Carman, M., and Xie, L. (2018). SIR-Hawkes: On the relationship between epidemic models and Hawkes point processes. In The Web Conference.

Rizoiu, M.-A., Xie, L., Sanner, S., Cebrian, M., Yu, H., and Van Hentenryck, P. (2017). Expecting to be HIP: Hawkes intensity processes for social media popularity. In WWW.

Rumelhart, D. E., Hinton, G. E., and Williams, R. J. (1986). Learning representations by back-propagating errors. Nature, 323(6088):533–536.

Shchur, O., Biloš, M., and Günnemann, S. (2020). Intensity-free learning of temporal point processes. In ICLR.

Shelton, C. R., Qin, Z., and Shetty, C. (2018). Hawkes process inference with missing data. In AAAI.

Śmieja, M., Struski, L., Tabor, J., Zieliński, B., and Spurek, P. (2018). Processing of missing data by neural networks. In NIPS.

Tabibian, B., Upadhyay, U., De, A., Zarezade, A., Schölkopf, B., and Gomez-Rodriguez, M. (2019). Enhancing human learning via spaced repetition optimization. PNAS.

Tian, Y., Zhang, K., Li, J., Lin, X., and Yang, B. (2018). LSTM-based traffic flow prediction with missing data. Neurocomputing.

Valera, I., Gomez-Rodriguez, M., and Gummadi, K. (2014). Modeling diffusion of competing products and conventions in social media. arXiv preprint arXiv:1406.0516.

Wang, P., Fu, Y., Liu, G., Hu, W., and Aggarwal, C. (2017). Human mobility synchronization and trip purpose detection with mixture of Hawkes processes. In KDD.

Yang, D., Qu, B., Yang, J., and Cudre-Mauroux, P. (2019). Revisiting user mobility and social relationships in LBSNs: A hypergraph embedding approach. In WWW.

Yoon, J., Zame, W. R., and van der Schaar, M. (2019). Estimating missing data in temporal data streams using multi-directional recurrent neural networks. IEEE Trans. on Biomedical Engineering.

Zhang, Q., Lipani, A., Kirnap, O., and Yilmaz, E. (2020). Self-attentive Hawkes processes. In ICML.

Zhao, Q., Erdogdu, M. A., He, H. Y., Rajaraman, A., and Leskovec, J. (2015). SEISMIC: A self-exciting point process model for predicting tweet popularity. In KDD.

Zuo, S., Jiang, H., Li, Z., Zhao, T., and Zha, H. (2020). Transformer Hawkes process. In ICML.