rivDP {bayesm}                                                    R Documentation
Description:

rivDP is a Gibbs sampler for a linear structural equation with an arbitrary number of
instruments. The errors of the structural and reduced-form equations are modeled with a
mixture of normals, implemented with a Dirichlet Process prior.

Usage:

rivDP(Data, Prior, Mcmc)
Arguments:

  Data    list(z, w, x, y)
  Prior   list(md, Ad, mbg, Abg, lambda, Prioralpha)   (optional)
  Mcmc    list(R, keep, SCALE)   (R required)
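Only Data and Mcmc$R must be supplied; Prior is optional and defaults are used for any
elements that are omitted. A minimal call might look like the following sketch (the
variables z, w, x, y are assumed to already exist; see the Examples for a full
simulation):

## minimal call sketch (illustrative); Prior omitted, so default priors are used
out = rivDP(Data=list(z=z, w=w, x=x, y=y), Mcmc=list(R=2000, keep=1))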
Details:

Model:

  x = z'delta + e1
  y = beta*x + w'gamma + e2
  e1, e2 ~ N(theta_i), where theta_i denotes (mu_i, Sigma_i)

Note: the error terms have non-zero means. DO NOT include intercepts in the z or w
matrices. This is different from rivGibbs, which requires intercepts to be included
explicitly.
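For example, the Data list is assembled directly from the raw instruments, exogenous
variables, and outcomes, with no columns of ones. The variable names and values below are
purely illustrative:

## illustrative sketch: assemble Data without intercept columns
N = 500; k = 5
z = matrix(runif(N*k), ncol=k)       # instruments -- no intercept column
w = matrix(runif(N), ncol=1)         # exogenous variable(s) -- no intercept column
x = z %*% rep(0.5, k) + rnorm(N)     # endogenous regressor (reduced-form equation)
y = 4*x + 1.0*w + rnorm(N)           # outcome (structural equation), true beta = 4
Data = list(z=z, w=w, x=x, y=y)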
Priors:

  delta ~ N(md, Ad^{-1});  vec(beta,gamma) ~ N(mbg, Abg^{-1})
  theta_i ~ G
  G ~ DP(alpha, G_0)

G_0 is the natural conjugate prior for (mu, Sigma):
  Sigma ~ IW(nu, v*I)  and  mu | Sigma ~ N(0, Sigma/amu)
These hyper-parameters are collected together in the list lambda. It is highly
recommended that you use the default settings for them.

  alpha ~ (1 - (alpha - alpha_min)/(alpha_max - alpha_min))^power
where alpha_min and alpha_max are set using the arguments in the reference below. It is
highly recommended that you use the default values for the hyper-parameters of the prior
on alpha.
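If you do want to override the normal priors on delta and (beta,gamma), they can be passed
in the Prior list. The values below are purely illustrative; lambda and Prioralpha are left
at their defaults, as recommended above:

## illustrative sketch: user-supplied priors for delta and (beta,gamma)
k  = ncol(z)                           # number of instruments
nw = ncol(w)                           # number of exogenous variables
Prior = list(md  = rep(0, k),          # prior mean of delta
             Ad  = 0.01*diag(k),       # prior precision of delta
             mbg = rep(0, 1 + nw),     # prior mean of vec(beta,gamma)
             Abg = 0.01*diag(1 + nw))  # prior precision of vec(beta,gamma)
out = rivDP(Data=Data, Prior=Prior, Mcmc=list(R=2000))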
List arguments contain:

  z           matrix of instruments
  y           dependent variable in the structural equation
  x           endogenous variable in the structural equation
  w           matrix of exogenous variables in the structural equation
  md          prior mean of delta
  Ad          prior precision of delta
  mbg         prior mean of vec(beta,gamma)
  Abg         prior precision of vec(beta,gamma)
  lambda      list of hyper-parameters for the G_0 prior on (mu, Sigma)
  Prioralpha  list of hyper-parameters for the prior on alpha
  R           number of MCMC draws
  keep        MCMC thinning parameter -- keep every keepth draw
  SCALE       whether to scale the data
  gridsize    grid size used when drawing alpha
The output includes an object nmix of class "bayesm.nmix" which contains draws of the
predictive distribution of the errors (a Bayesian analogue of a density estimate for the
error terms).

nmix is a list containing probdraw, zdraw, and compdraw.
note: in the compdraw list, there is only one component per draw.
Value:

rivDP returns a list containing:

  deltadraw   R/keep x dim(delta) array of delta draws
  betadraw    R/keep x 1 vector of beta draws
  gammadraw   R/keep x dim(gamma) array of gamma draws
  Istardraw   R/keep x 1 array of draws of the number of unique normal components
  alphadraw   R/keep x 1 array of draws of the Dirichlet Process tightness parameter
  nmix        list of draws for the predictive distribution of the errors (see above)
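For example, the DP-related draws can be summarized directly; the snippet below assumes
out is the value returned by rivDP, as in the Examples:

## illustrative sketch: summarize DP-related draws
table(out$Istardraw)                       # posterior frequencies of the number of unique components
quantile(out$alphadraw, c(.05, .5, .95))   # posterior quantiles of the DP tightness parameter alpha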
Author(s):

Peter Rossi, Graduate School of Business, University of Chicago, Peter.Rossi@ChicagoGsb.edu

References:

For further discussion, see "A Semi-Parametric Bayesian Approach to the Instrumental
Variable Problem," by Conley, Hansen, McCulloch and Rossi, Journal of Econometrics (2008).

See Also:

rivGibbs
Examples:

##
if(nchar(Sys.getenv("LONG_TEST")) != 0) {R=2000} else {R=10}
##
## simulate scaled log-normal errors and run
##
set.seed(66)
k=10
delta=1.5
Sigma=matrix(c(1,.6,.6,1),ncol=2)
N=1000
tbeta=4
set.seed(66)
scalefactor=.6
root=chol(scalefactor*Sigma)
mu=c(1,1)
##
## compute interquartile ranges
##
ninterq=qnorm(.75)-qnorm(.25)
error=matrix(rnorm(100000*2),ncol=2)
error=t(t(error)+mu)
Err=t(t(exp(error))-exp(mu+.5*scalefactor*diag(Sigma)))
lnNinterq=quantile(Err[,1],prob=.75)-quantile(Err[,1],prob=.25)
##
## simulate data
##
error=matrix(rnorm(N*2),ncol=2)%*%root
error=t(t(error)+mu)
Err=t(t(exp(error))-exp(mu+.5*scalefactor*diag(Sigma)))
#
# scale appropriately
Err[,1]=Err[,1]*ninterq/lnNinterq
Err[,2]=Err[,2]*ninterq/lnNinterq
z=matrix(runif(k*N),ncol=k)
x=z%*%(delta*c(rep(1,k)))+Err[,1]
y=x*tbeta+Err[,2]
# set up Data and Mcmc lists
Data = list(); Mcmc = list()
Data$z = z; Data$x = x; Data$y = y
Mcmc$maxuniq = 100
Mcmc$R = R
begin = 100; end = Mcmc$R   # burn-in and last kept draw (for use in summaries/plots)
# run MCMC and keep results
out = rivDP(Data=Data, Mcmc=Mcmc)
cat("Summary of Beta draws",fill=TRUE)
summary(out$betadraw,tvalues=tbeta)
if(0){
## plotting examples
plot(out$betadraw,tvalues=tbeta)
plot(out$nmix) ## plot "fitted" density of the errors
##
}