Modern Stochastics: Theory and Applications (VMSTA), ISSN 2351-6054 (print), 2351-6046 (online). VTeX, Mokslininkų g. 2A, 08412 Vilnius, Lithuania. Research Article. DOI: 10.15559/15-VMSTA39

Fredholm representation of multiparameter Gaussian processes with applications to equivalence in law and series expansions

Tommi Sottinen (tommi.sottinen@iki.fi, corresponding author), Department of Mathematics and Statistics, University of Vaasa, P.O. Box 700, FIN-65101 Vaasa, Finland
Lauri Viitasaari (lauri.viitasaari@aalto.fi), Department of Mathematics and System Analysis, Aalto University School of Science, P.O. Box 11100, FIN-00076 Aalto, Finland

The authors thank the referees for their useful comments.

Corresponding author.

Lauri Viitasaari was partially funded by Emil Aaltonen Foundation.

We show that every multiparameter Gaussian process with integrable variance function admits a Wiener integral representation of Fredholm type with respect to the Brownian sheet. The Fredholm kernel in the representation can be constructed as the unique symmetric square root of the covariance. We analyze the equivalence of multiparameter Gaussian processes by using the Fredholm representation and show how to construct series expansions for multiparameter Gaussian processes by using the Fredholm kernel.

Keywords: Equivalence in law, Gaussian sheets, multiparameter Gaussian processes, representation of Gaussian processes, series expansions
MSC: 60G15, 60G60

Introduction

In this article, we consider multiparameter processes, that is, our time is multidimensional. Throughout the paper, the dimension of time n≥1 is arbitrary but fixed.

We use the following notation throughout this article: t,s,u∈Rn are n-dimensional multiparameters of time: t=(t1,…,tn), s=(s1,…,sn), u=(u1,…,un); 0 is an n-dimensional vector of 0s, and 1 is an n-dimensional vector of 1s. We denote s≤t if sk≤tk for all k≤n. For s≤t, the set [s,t]⊂Rn is the n-dimensional rectangle {u∈Rn;s≤u≤t}.

Let X=(Xt)t∈[0,1] be a real-valued centered Gaussian multiparameter process or field defined on some complete probability space (Ω,F,P). We assume that the Gaussian field X is separable, that is, its linear space, or the 1st chaos,
H1=cl(span{Xt;t∈[0,1]})
is separable. Here cl means closure in L2(Ω,F,P).

Our main result, Theorem 1, shows when the Gaussian field X can be represented in terms of the Brownian sheet. Recall that the Brownian sheet W=(Wt)t∈[0,1] is the centered Gaussian field with the covariance
E[WtWs]=∏k=1nmin(tk,sk).
The Brownian sheet can also be considered as the Gaussian white noise on [0,1] with the Lebesgue control measure. This means that dW is a random measure on ([0,1],B([0,1]),Leb([0,1])) characterized by the following properties:

∫AdWt∼N(0,Leb(A)),

∫AdWt and ∫BdWs are independent if A∩B=∅.

If f,g:[0,1]→R are simple functions, then we have the Wiener–Itô isometry
E[∫[0,1]f(t)dWt∫[0,1]g(s)dWs]=∫[0,1]f(t)g(t)dt.
Consequently, the integral ∫[0,1]f(t)dWt can be extended to all f∈L2([0,1]) by using the isometry (1), and the isometry (1) also holds for the extended integral.
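As a purely numerical illustration of the white-noise characterization above (not part of the analytic development), the following Python sketch discretizes [0,1]² into cells, draws the increments dW as independent N(0, Leb(cell)) variables, and checks the Wiener–Itô isometry (1) by Monte Carlo; the integrands f and g, the grid size, and the sample size are arbitrary illustrative choices of ours.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 10                          # grid points per axis on [0,1]^2 (n = 2)
h = 1.0 / m                     # cell side; Leb(cell) = h * h
grid = (np.arange(m) + 0.5) * h
T1, T2 = np.meshgrid(grid, grid, indexing="ij")

# Two illustrative deterministic integrands on [0,1]^2
f = T1 + T2
g = np.cos(np.pi * T1) * T2

# dW: independent N(0, Leb(cell)) increments, one (m, m) array per sample
n_mc = 50_000
dW = rng.normal(0.0, np.sqrt(h * h), size=(n_mc, m, m))
If = np.sum(f * dW, axis=(1, 2))    # ~ integral of f dW, sample by sample
Ig = np.sum(g * dW, axis=(1, 2))    # ~ integral of g dW

mc = np.mean(If * Ig)               # Monte Carlo estimate of E[(f.dW)(g.dW)]
exact = np.sum(f * g) * h * h       # Riemann sum of f*g over the same grid
assert abs(mc - exact) < 0.02       # isometry (1) holds up to MC error
```

Both sides are computed on the same midpoint grid, so the comparison is exact up to Monte Carlo error only.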

In this article, we show the Fredholm representation for Gaussian fields satisfying the trace condition (3) in Section 2, Theorem 1. In Section 3, we apply the Fredholm representation to give a representation for Gaussian fields that are equivalent in law, and in Section 4, we show how to generate series expansions for Gaussian fields by using the Fredholm representation. The Fredholm representation of Theorem 1 can also be used to provide a transfer principle that builds stochastic analysis and Malliavin calculus for Gaussian fields from the corresponding well-known theory for the Brownian sheet. We do not do that in this article, although it would be quite straightforward given the results for the one-dimensional case provided in [9].

Fredholm representation

Recall that X is a separable centered Gaussian field with covariance function R and W is a Brownian sheet. Suppose that X is defined on a complete probability space (Ω,F,P) that is rich enough to carry Brownian sheets.

The following theorem states that the field X can be realized as a Wiener integral with respect to a Brownian sheet. Let us note that it is not always possible to construct the Brownian sheet W directly from the field X. Indeed, consider the trivial field X≡0 to see this. As a consequence, the Karhunen representation theorem (see, e.g., [2, Thm. 41]) cannot be applied here. Consequently, the Brownian sheet in representation (2) is not guaranteed to exist on the same probability space with X.

In any case, representation (2) holds in law. This means that for a given Brownian sheet W, the field given by (2) is a Gaussian field with the same law as X.

Theorem 1 (Fredholm representation).

Let (Ω,F,P) be a probability space such that σ{ξk; k∈N}⊂F, where ξk, k∈N, are i.i.d. standard normal random variables. Let X be a separable centered Gaussian field defined on (Ω,F,P). Let R be the covariance of X.

Then there exist a kernel K∈L2([0,1]) and a Brownian sheet W, possibly defined on a larger probability space, such that the representation
Xt=∫[0,1]K(t,s)dWs
holds if and only if R satisfies the trace condition
∫[0,1]R(t,t)dt<∞.

Proof. From condition (3) it follows that the covariance operator
Rf(t)=∫[0,1]f(s)R(t,s)ds
is Hilbert–Schmidt. Indeed, the Hilbert–Schmidt norm of the operator R satisfies, by the Cauchy–Schwarz inequality,
‖R‖HS²=∫[0,1]∫[0,1]R(t,s)²dtds≤∫[0,1]∫[0,1]R(t,t)R(s,s)dtds=(∫[0,1]R(t,t)dt)².
Since Hilbert–Schmidt operators are compact operators, it follows from, for example, [7, p. 233] that the operator R admits the eigenfunction representation
Rf(t)=∑k=1∞λk∫[0,1]f(s)ϕk(s)dsϕk(t).
Here (ϕk)k=1∞, the eigenfunctions of R, form an orthonormal system on L2([0,1]). In particular, this means that
R(t,s)=∑k=1∞λkϕk(t)ϕk(s).
From this it follows that the square root of the covariance operator R admits a kernel K if and only if
∑k=1∞λk<∞.
Note that condition (6) is equivalent to condition (3): both state that R has finite trace, since ∑k=1∞λk=∫[0,1]R(t,t)dt. Consequently, we can define
K(t,s)=∑k=1∞√λk ϕk(t)ϕk(s)
since the series on the right-hand side of (7) converges in L2([0,1]), and the eigenvalues (λk)k=1∞ of the positive-definite operator R are nonnegative.

Now,
R(t,s)=∑k=1∞λkϕk(t)ϕk(s)=∑k=1∞∑ℓ=1∞√(λkλℓ) ϕk(t)ϕℓ(s)∫[0,1]ϕk(u)ϕℓ(u)du=∫[0,1](∑k=1∞√λk ϕk(t)ϕk(u))(∑ℓ=1∞√λℓ ϕℓ(s)ϕℓ(u))du=∫[0,1]K(t,u)K(s,u)du,
where the interchange of summation and integration is justified by the fact that series (7) converges in L2([0,1]). From this calculation and from the Wiener–Itô isometry (1) of the integrals with respect to the Brownian sheet it follows that the centered Gaussian processes on the left-hand side and the right-hand side of Eq. (2) have the same covariance function. Consequently, since they are Gaussian fields, they have the same law. This means that representation (2) holds in law.

Finally, we need to construct a Brownian sheet W associated with the field X such that representation (2) holds in L2(Ω,F,P). Let (ϕ˜k)k=1∞ be any orthonormal basis on L2([0,1]). Set
ϕk(t)=∫[0,1]ϕ˜k(s)K(t,s)ds.
Then (ϕk)k=1∞ is an orthonormal basis (possibly finite or even empty!) on the reproducing kernel Hilbert space (RKHS) of the Gaussian field X (see below for the definition). Let Θ be an isometry from the RKHS to L2(Ω,σ(X),P). Set ξk=Θ(ϕk). Then the ξk are i.i.d. standard normal random variables, and by the reproducing property we have that
Xt=∑k=1∞ϕk(t)ξk
in L2(Ω,F,P). Now, it may be that there are only finitely many ξk developed this way. If this is the case, then we augment the finite sequence (ξk)k=1n with independent standard normal random variables. Then set
Wt=∑k=1∞∫[0,t]ϕ˜k(s)dsξk.
For this Brownian sheet, representation (2) holds in L2(Ω,F,P). Indeed,
∫[0,1]K(t,s)dWs=∫[0,1]K(t,s)d∑k=1∞∫[0,t]ϕ˜k(s)dsξk=∑k=1∞∫[0,1]K(t,s)ϕ˜k(s)dsξk=∑k=1∞ϕk(t)ξk=Xt.
Here the interchange of summation, differentiation, and integration is justified by the fact that everything involved is square integrable. □
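On a finite grid, the construction in the proof reduces to linear algebra: the eigenfunction expansion becomes an eigendecomposition of the covariance matrix, and K is the symmetric matrix square root. The following Python sketch illustrates this for n=1 with a fractional-Brownian-motion covariance (H=0.75); the covariance, the grid, and the tolerance are our illustrative choices.

```python
import numpy as np

m = 30                                # grid points; n = 1 for simplicity
h = 1.0 / m
t = (np.arange(m) + 0.5) * h

# Illustrative covariance: fractional Brownian motion with H = 0.75
H = 0.75
T, S = np.meshgrid(t, t, indexing="ij")
R = 0.5 * (T**(2 * H) + S**(2 * H) - np.abs(T - S)**(2 * H))

# Discrete covariance operator (Rf)(t) = int R(t,s) f(s) ds  ->  (R * h) f
lam, phi = np.linalg.eigh(R * h)      # eigenvalues lam_k, eigenvectors phi_k
lam = np.clip(lam, 0.0, None)         # remove tiny negative round-off values

# Symmetric square-root kernel K(t,s) = sum_k sqrt(lam_k) phi_k(t) phi_k(s);
# discrete eigenvectors approximate phi_k up to a factor sqrt(h), hence / h
K = (phi * np.sqrt(lam)) @ phi.T / h

# Check the factorization R(t,s) = int K(t,u) K(s,u) du  (Riemann sum)
R_rec = K @ K.T * h
assert np.max(np.abs(R_rec - R)) < 1e-8
```

A sample of the field is then obtained as `K @ dW`, where `dW` is a vector of independent N(0, h) increments, in the spirit of representation (2).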

The eigenfunction expansion (7) of the kernel (t,s)↦K(t,s) is symmetric in t and s. Consequently, it is always possible to have a symmetric kernel in representation (2); that is, in principle, it is always possible to pass from a given representation
Xt=∫[0,1]K(t,s)dWs
to
Xt=∫[0,1]K˜(t,s)dW˜s,
where W˜ is some other Brownian sheet and the kernel K˜ is symmetric. Unfortunately, for a given kernel K and Brownian sheet W, the authors do not know how to perform this transition analytically.

In general, it is not possible to choose a Volterra kernel K in (2). By a Volterra kernel we mean a kernel that satisfies K(t,s)=0 if sk>tk for some k. To see why a Volterra representation is not always possible, consider the following simple counterexample: Xt≡ξ, where ξ is a standard normal random variable. This field cannot have a Volterra representation since Volterra fields vanish at the origin. A Fredholm representation for this field is simply Xt=∫[0,1]dWs (with a suitable Brownian sheet W depending on ξ).

For a more complicated counterexample (with X0=0) see [9, Example 3.2].

Consequently, in general, it is not possible to generate a Gaussian field X on the rectangle [0,t] from the noise W on the same rectangle [0,t]. Instead, the information of the noise on the whole cube [0,1] may be needed.

If the family {K(t,·);t∈[0,1]} is total in L2([0,1]), then a Brownian sheet in representation (2) exists on the same probability space (Ω,F,P). Moreover, in this case, it can be constructed from the Gaussian field X. Indeed, in this case, we can apply the Karhunen representation theorem [2, Thm. 41].

The reproducing kernel Hilbert space (RKHS) of the Gaussian field X is the Hilbert space H that is isometric to the linear space H1, and the defining isometry is R(t,·)↦Xt. In other words, the RKHS is the Hilbert space of functions over [0,1] extended and closed linearly by the relation
⟨R(t,·),R(s,·)⟩H=R(t,s).
The RKHS is of paramount importance in the analysis of Gaussian processes. In this respect, the Fredholm representation (2) is also very important. Indeed, if the kernel K of Theorem 1 is known, then the RKHS is also known as the following reformulation of Lifshits [6, Prop. 4.1] states.

Proposition 1. Let X admit representation (2). Then
H={f; f(t)=∫[0,1]f˜(s)K(t,s)ds, f˜∈L2([0,1])}.
Moreover, the inner product in H is given by
⟨f,g⟩H=inf f˜,g˜ ∫[0,1]f˜(t)g˜(t)dt,
where the infimum is taken over all f˜ and g˜ such that
f(t)=∫[0,1]f˜(s)K(t,s)ds and g(t)=∫[0,1]g˜(s)K(t,s)ds.
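The infimum above is attained at the minimum-norm preimages, so on a grid the RKHS inner product can be computed with a pseudoinverse. The following Python sketch (n=1, Brownian-motion covariance R(t,s)=min(t,s); all discretization choices are ours and only illustrative) verifies the reproducing property ⟨R(t,·),R(s,·)⟩H=R(t,s).

```python
import numpy as np

m = 40                                # grid points; n = 1 for simplicity
h = 1.0 / m
t = (np.arange(m) + 0.5) * h

# Brownian-motion covariance and its symmetric square-root kernel K
R = np.minimum.outer(t, t)
lam, phi = np.linalg.eigh(R * h)
K = (phi * np.sqrt(np.clip(lam, 0.0, None))) @ phi.T / h

def rkhs_inner(f, g):
    # Minimum-norm preimages with f(t) = int f~(s) K(t,s) ds ~ (K h) f~;
    # the infimum in the inner-product formula is attained at this pair
    f_pre = np.linalg.pinv(K * h) @ f
    g_pre = np.linalg.pinv(K * h) @ g
    return np.sum(f_pre * g_pre) * h

# Reproducing property: <R(t_i, .), R(t_j, .)>_H = R(t_i, t_j)
i, j = 10, 25
val = rkhs_inner(R[:, i], R[:, j])
assert abs(val - R[i, j]) < 1e-6
```

Here the minimum-norm preimage of R(t,·) is K(t,·) itself, which is exactly the factorization R(t,s)=∫K(t,u)K(s,u)du in disguise.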

Application to equivalence in law

Two random objects ξ and ζ are equivalent in law if their distributions satisfy P[ξ∈B]>0 if and only if P[ζ∈B]>0 for all measurable sets B. In contrast, the random objects ξ and ζ are singular in law if there exists a measurable set B such that P[ξ∈B]=1 but P[ζ∈B]=0. For centered Gaussian random objects, there is a well-known dichotomy: two centered Gaussian objects are either equivalent or singular in law; see [4, Thm. 6.1].

There is a complete characterization, due to Kallianpur and Oodaira, of the equivalence in law of any two Gaussian processes; see [5, Thms. 9.2.1 and 9.2.2]. It is possible to extend this characterization to Gaussian fields and formulate it in terms of the operator K. The result would remain quite abstract, though, so we do not pursue that direction. Instead, the following Proposition 2 gives a partial answer to the question of what Gaussian fields equivalent to a given Gaussian field X look like. Proposition 2 uses only the Hitsuda representation theorem, which, unlike the Kallianpur–Oodaira theorem, is quite concrete.

Proposition 2 (Representation of equivalent Gaussian fields). Let X˜=(X˜t)t∈[0,1] be a centered Gaussian field with covariance function R˜, and let X=(Xt)t∈[0,1] be a centered Gaussian field with covariance function R.

Suppose that X has representation (2) with kernel K and Brownian sheet W. If
X˜t=∫[0,1]K(t,s)dWs−∫[0,1]∫[0,s]K(t,s)L(s,u)dWuds
for some Volterra kernel L∈L2([0,1]), then X˜ is equivalent in law to X.

Proof. By [8, Prop. 4.2], we have the following multiparameter version of the Hitsuda representation theorem: a centered Gaussian field W˜=(W˜t)t∈[0,1] is equivalent in law to a Brownian sheet if and only if it admits the representation
W˜t=Wt−∫[0,t]∫[0,s]L(s,u)dWuds
for some Volterra kernel L∈L2([0,1]).

Let then X have the Fredholm representation
Xt=∫[0,1]K(t,s)dWs.
Then X˜ is equivalent to X if it admits the representation
X˜t=∫[0,1]K(t,s)dW˜s,
where W˜ is related to W by (9). But Eq. (8) implies precisely this. □

On the kernel level, Eq. (8) states that
K˜(t,s)=K(t,s)−∫[s,1]K(t,u)L(u,s)du
for some Volterra kernel L∈L2([0,1]).
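The passage from Eq. (8) to the kernel-level relation above is just an interchange of the ds and dWu integrations, and it can be checked exactly in a discretization: both formulas yield the same linear map of the noise. In the following Python sketch (n=1; the kernels K and L are arbitrary illustrative choices of ours, with L lower triangular to mimic the Volterra property), the two computations of X˜ agree pathwise.

```python
import numpy as np

rng = np.random.default_rng(1)
m = 25                                # grid points; n = 1 for simplicity
h = 1.0 / m
t = (np.arange(m) + 0.5) * h
T, S = np.meshgrid(t, t, indexing="ij")

# Illustrative kernels: K square integrable, L Volterra (L(u,s) = 0 for s > u)
K = np.sqrt(np.minimum(T, S))
L = np.tril(np.exp(-(T - S)))

dW = rng.normal(0.0, np.sqrt(h), size=m)   # Brownian increments, Var = h

# X~ via (8): int K(t,s) dW_s - int int_[0,s] K(t,s) L(s,u) dW_u ds
X1 = K @ dW - K @ (h * (L @ dW))

# X~ via the kernel level: K~(t,s) = K(t,s) - int_[s,1] K(t,u) L(u,s) du
K_tilde = K - h * (K @ L)
X2 = K_tilde @ dW

assert np.allclose(X1, X2)                 # the two formulas agree pathwise
```

In the discrete algebra the two expressions are literally the same matrix applied to dW, which is the content of the interchange of integrals.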

Application to series expansions

The Mercer square root (7) can be used to build the Karhunen–Loève expansion for the Gaussian process X. But the Mercer form (7) is seldom known. However, if we can somehow find any kernel K such that representation (2) holds, then we can construct a series expansion for X by using the Fredholm representation of Theorem 1 as follows.

Theorem 2 (Series expansion).

Let X be a separable Gaussian field with representation (2), and let (ϕk)k=1∞ be any orthonormal basis on L2([0,1]). Then X admits the series expansion
Xt=∑k=1∞∫[0,1]ϕk(s)K(t,s)ds·ξk,
where (ξk)k=1∞ is a sequence of independent standard normal random variables. The series (12) converges in L2(Ω,F,P), and it also converges almost surely uniformly if and only if X is continuous.

The proof below uses the reproducing kernel Hilbert space technique. For more details, we refer to [3], where a series expansion is constructed for fractional Brownian motion by using the transfer principle.

Proof. The Fredholm representation (2) implies immediately that the reproducing kernel Hilbert space of X is the image KL2([0,1]) and that K is actually an isometry from L2([0,1]) onto the reproducing kernel Hilbert space of X. Indeed, this is what Proposition 1 states.

The L2-convergence of the expansion (12) then follows from [1, Thm. 3.7], and the equivalence between the almost sure uniform convergence of (12) and the continuity of X follows from [1, Thm. 3.8]. □

References
[1] Adler, R.J.: An Introduction to Continuity, Extrema, and Related Topics for General Gaussian Processes. IMS Lecture Notes–Monograph Series, Vol. 12, Institute of Mathematical Statistics (1990)
[2] Berlinet, A., Thomas-Agnan, C.: Reproducing Kernel Hilbert Spaces in Probability and Statistics. Kluwer Academic Publishers, Boston (2004)
[3] Gilsing, H., Sottinen, T.: Power series expansions for fractional Brownian motions. Theory Stoch. Process. 9(25), 38–49 (2003)
[4] Hida, T., Hitsuda, M.: Gaussian Processes. Translations of Mathematical Monographs, Vol. 120, American Mathematical Society (1993)
[5] Kallianpur, G.: Stochastic Filtering Theory. Springer, New York (1980)
[6] Lifshits, M.: Lectures on Gaussian Processes. Springer (2012)
[7] Riesz, F., Sz.-Nagy, B.: Functional Analysis. Dover, New York (1990)
[8] Sottinen, T., Tudor, C.A.: On the equivalence of multiparameter Gaussian processes. J. Theoret. Probab. 19(2), 461–485 (2006)
[9] Sottinen, T., Viitasaari, L.: Stochastic analysis of Gaussian processes via Fredholm representation. Preprint, arXiv:1410.2230 (2014)