# Monte Carlo Simulation with Julia (Growth Model)

Categories: Statistics, Structural equation modeling, Simulation, Julia

A learning note on using Julia for simulation.

Author: Mark Lai

Published: February 23, 2024

In this note, I’m trying to use the Julia language to run some simulations.

```julia
using LinearAlgebra, Distributions, Random
using Optim, ForwardDiff
using Tables, MarkdownTables
using StatsPlots
```

First, here's an example of simulating multivariate normal random variables:

```julia
Random.seed!(1234)
μ = [0, 1, 2]
Σ = cov(randn(100, 3))  # a random positive definite covariance matrix
n = 200
y = rand(MvNormal(μ, Σ), n)  # each column is one draw
```

```
3×200 Matrix{Float64}:
 0.0540757  -0.102689  0.658599  0.611513  …  0.879624  0.605667  0.595295
 1.14527    -0.222159  2.24221   1.30428      0.254144  2.05194   2.90419
 2.6294      0.572331  1.47629   2.98324      4.10777   1.05924   1.56324
```
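As a quick check (my addition, not in the original note), the sample moments of the draws should approach `μ` and `Σ` as the number of draws grows:

```julia
using Distributions, LinearAlgebra, Random, Statistics

Random.seed!(1234)
μ = [0, 1, 2]
Σ = cov(randn(100, 3))
y = rand(MvNormal(μ, Σ), 10_000)  # more draws than above, for tighter moments

# Columns are observations, so compute sample moments along dims = 2
vec(mean(y, dims = 2))  # close to μ
cov(y, dims = 2)        # close to Σ
```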

## Growth Model Example

Specify the model-implied moments of a linear growth model:

```julia
growth_moments(Λ, α, Ψ, Θ) = (μ = Λ * α, Σ = Hermitian(Λ * Ψ * Λ') + Θ)
# Loadings for an intercept and a linear slope over four time points
Λ = [1 0
     1 1
     1 2
     1 3]
α = [0, 1]  # latent means (intercept, slope)
Ψ = Symmetric([1   0.2
               0.2 0.25])  # latent variances and covariance
Θ = Symmetric([1     0.2   0.04  0.008
               0.2   1     0.2   0.04
               0.04  0.2   1     0.2
               0.008 0.04  0.2   1])  # autoregressive error structure
gm = growth_moments(Λ, α, Ψ, Θ)
```

```
(μ = [0, 1, 2, 3], Σ = [2.0 1.4 1.44 1.608; 1.4 2.65 2.3000000000000003 2.59; 1.44 2.3000000000000003 3.8 3.6999999999999997; 1.608 2.59 3.6999999999999997 5.449999999999999])
```
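To see where the diagonal of the implied `Σ` comes from (my addition): for a linear growth model, the implied variance at time $t$ is $\psi_{11} + 2t\psi_{21} + t^2\psi_{22} + \theta_{tt}$, which can be checked directly against the output above:

```julia
using LinearAlgebra

growth_moments(Λ, α, Ψ, Θ) = (μ = Λ * α, Σ = Hermitian(Λ * Ψ * Λ') + Θ)
Λ = [1 0; 1 1; 1 2; 1 3]
α = [0, 1]
Ψ = Symmetric([1 0.2; 0.2 0.25])
Θ = Symmetric([1 0.2 0.04 0.008; 0.2 1 0.2 0.04; 0.04 0.2 1 0.2; 0.008 0.04 0.2 1])
gm = growth_moments(Λ, α, Ψ, Θ)

# Implied variance at time t: ψ₁₁ + 2t·ψ₂₁ + t²·ψ₂₂ + θ_tt
byhand = [Ψ[1, 1] + 2t * Ψ[1, 2] + t^2 * Ψ[2, 2] + Θ[t + 1, t + 1] for t in 0:3]
diag(gm.Σ) ≈ byhand  # true: matches [2.0, 2.65, 3.8, 5.45]
```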

Simulate data from the growth model (the named tuple returned by `growth_moments` is splatted into `MvNormal`):

```julia
sim_dat = rand(MvNormal(growth_moments(Λ, α, Ψ, Θ)...), 500)
```

```
4×500 Matrix{Float64}:
 0.629762  -0.205364   1.64222   0.486685  …  -0.292158  0.13341  -0.624102
 1.95646    3.54633   -0.797253  0.608302      1.88673   1.03129  -0.378568
 1.91765    2.56468    0.553998  3.44475       2.42096   3.01259   0.746786
 1.88494    2.93066    0.570765  1.94388       0.564698  3.71769   0.984581
```

Fit the growth model by maximum likelihood estimation with Optim.jl:

```julia
"""
    growth_ll(α, Ψ, Θ, Λ, y)

Compute the log-likelihood of the input parameters given data `y`.

- `α` = latent means
- `Ψ` = latent variances and covariances
- `Θ` = error covariances
- `Λ` = factor loadings
"""
function growth_ll(α, Ψ, Θ, Λ, y)
    μhat, Σhat = growth_moments(Λ, α, Ψ, Θ)
    if !isposdef(Σhat)
        return -Inf  # so that minimizing -growth_ll avoids invalid parameters
    end
    loglikelihood(MvNormal(μhat, Σhat), y)
end
growth_ll(α, Ψ, Θ, Λ, sim_dat)
```

```julia
"""
    growth_ll(θ, Λ, y)

Compute the log-likelihood of the input parameter vector given data `y`.
"""
function growth_ll(θ, Λ, y)
    α = view(θ, 1:2)
    Ψ = reshape(θ[[3, 4, 4, 5]], (2, 2))  # 2 × 2 symmetric from three unique elements
    Θ = [θ[6] * θ[7]^abs(i - j) for i in 1:4, j in 1:4]  # autoregressive structure
    growth_ll(α, Ψ, Θ, Λ, y)
end
```
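The θ-to-matrix mapping is worth checking in isolation (my addition): indexing with `[3, 4, 4, 5]` duplicates the covariance element into both off-diagonal slots of `Ψ`, and the comprehension reproduces the autoregressive error structure used to build `Θ` earlier:

```julia
θ = [0.0, 1.0, 1.0, 0.2, 0.25, 1.0, 0.2]  # the generating values

Ψ = reshape(θ[[3, 4, 4, 5]], (2, 2))
# [1.0 0.2; 0.2 0.25], symmetric by construction

Θ = [θ[6] * θ[7]^abs(i - j) for i in 1:4, j in 1:4]
Θ[1, :]  # ≈ [1.0, 0.2, 0.04, 0.008]: variance θ[6] scaled by θ[7]^|i-j|
```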
```julia
# Maximum likelihood estimation
# Starting values (here, the generating values):
θ₀ = vcat(α, Ψ[[1, 2, 4]], 1, 0.2)
opt = optimize(x -> -growth_ll(x, Λ, sim_dat), θ₀, Newton(); autodiff = :forward)
```
```
 * Status: success

 * Candidate solution
    Final objective value:     3.378396e+03

 * Found with
    Algorithm:     Newton's Method

 * Convergence measures
    |x - x'|               = 1.09e-07 ≰ 0.0e+00
    |x - x'|/|x'|          = 1.11e-07 ≰ 0.0e+00
    |f(x) - f(x')|         = 1.36e-12 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 4.04e-16 ≰ 0.0e+00
    |g(x)|                 = 5.12e-12 ≤ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    4
    f(x) calls:    12
    ∇f(x) calls:   12
    ∇²f(x) calls:  4
```
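`ForwardDiff` is loaded at the top but not used directly in the note; a natural follow-up (my addition, a sketch under the same model setup) is to compute standard errors from the inverse Hessian of the negative log-likelihood at the optimum. The `unpack` helper below is hypothetical, introduced only to keep the sketch self-contained; it mirrors the parameter mapping inside `growth_ll(θ, Λ, y)`:

```julia
using Distributions, ForwardDiff, LinearAlgebra, Optim, Random

growth_moments(Λ, α, Ψ, Θ) = (μ = Λ * α, Σ = Hermitian(Λ * Ψ * Λ') + Θ)

# Hypothetical helper: parameter vector -> (α, Ψ, Θ)
function unpack(θ)
    α = θ[1:2]
    Ψ = reshape(θ[[3, 4, 4, 5]], (2, 2))
    Θ = [θ[6] * θ[7]^abs(i - j) for i in 1:4, j in 1:4]
    (α, Ψ, Θ)
end

function growth_ll(θ, Λ, y)
    μhat, Σhat = growth_moments(Λ, unpack(θ)...)
    isposdef(Σhat) || return -Inf
    loglikelihood(MvNormal(μhat, Σhat), y)
end

Random.seed!(1)
Λ = [1 0; 1 1; 1 2; 1 3]
θtrue = [0.0, 1.0, 1.0, 0.2, 0.25, 1.0, 0.2]
y = rand(MvNormal(growth_moments(Λ, unpack(θtrue)...)...), 500)

nll(θ) = -growth_ll(θ, Λ, y)
opt = optimize(nll, θtrue, Newton(); autodiff = :forward)
θhat = Optim.minimizer(opt)

# Observed-information standard errors from the Hessian at the optimum
H = ForwardDiff.hessian(nll, θhat)
se = sqrt.(diag(inv(H)))
```

With a correctly specified model, these observed-information standard errors should be close to the Monte Carlo standard deviations of the estimates across replications.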

Wrap the simulation and estimation steps into a function:

```julia
function sim_θhat(n)
    ysim = rand(MvNormal(growth_moments(Λ, α, Ψ, Θ)...), n)
    opt = optimize(x -> -growth_ll(x, Λ, ysim), θ₀, Newton(); autodiff = :forward)
    opt.minimizer
end
```

```
sim_θhat (generic function with 1 method)
```

Repeat for $R$ replications:

```julia
num_obs = 250
num_rep = 500
# Initialize output
results = Matrix{Float64}(undef, length(θ₀), num_rep)
for i in 1:num_rep
    results[:, i] = sim_θhat(num_obs)
end
# Mean estimates across replications
mean(results, dims = 2)
```

```
7×1 Matrix{Float64}:
 0.0023653319339430197
 0.9993703468858799
 0.946467070696018
 0.20610432822890723
 0.24159316736779207
 1.0379116232819448
 0.20774282838802816
```
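Subtracting the generating values from the mean estimates gives the raw bias per parameter (my addition; the numbers below are copied, rounded, from the output above):

```julia
θtrue = [0.0, 1.0, 1.0, 0.2, 0.25, 1.0, 0.2]
# Mean estimates across the 500 replications, from the output above (rounded)
mean_est = [0.00237, 0.99937, 0.94647, 0.20610, 0.24159, 1.03791, 0.20774]
bias = round.(mean_est .- θtrue, digits = 4)
```

In this run, the latent means are nearly unbiased, while the intercept variance ψ₁₁ is somewhat underestimated and the error variance somewhat overestimated.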

### Plot

Box plots of the deviations of the estimates from the generating values. The original code referenced `parnames` without defining it; the labels below are my best reconstruction of the intended parameter names:

```julia
parnames = ["α₁", "α₂", "ψ₁₁", "ψ₂₁", "ψ₂₂", "θ", "ρ"]
boxplot(repeat(parnames, outer = num_rep), vec(results .- θ₀), legend = false)
```