Title: FAB p-Values and Confidence Intervals
Description: Frequentist assisted by Bayes (FAB) p-values and confidence interval construction. See Hoff (2019) <arXiv:1907.12589> "Smaller p-values via indirect information", Hoff and Yu (2019) <doi:10.1214/18-EJS1517> "Exact adaptive confidence intervals for linear regression coefficients", and Yu and Hoff (2018) <doi:10.1093/biomet/asy009> "Adaptive multigroup confidence intervals with constant coverage".
Authors: Peter Hoff
Maintainer: Peter Hoff <[email protected]>
License: GPL-3
Version: 0.1
Built: 2025-03-04 02:45:50 UTC
Source: https://github.com/pdhoff/fabinference
Computation of a 1-alpha FAB t-interval using the z-optimal spending function
fabtzCI(y, s, dof, alpha = 0.05, psi = list(mu = 0, tau2 = 1e+05, sigma2 = 1))
y: a numeric scalar, a normally distributed statistic
s: a numeric scalar, the standard error of y
dof: a positive integer, the degrees of freedom for s
alpha: the type I error rate, so 1-alpha is the coverage rate
psi: a list of parameters for the spending function, including mu, tau2 and sigma2
a two-dimensional vector of the left and right endpoints of the interval
Peter Hoff
n<-10
y<-rnorm(n)
fabtzCI(mean(y),sqrt(var(y)/n),n-1)
t.test(y)$conf.int
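Beyond the packaged example, a small simulation sketch (not from the package documentation; the psi settings below are illustrative choices rather than defaults) can be used to check the coverage of fabtzCI and compare its average width with the standard t-interval when the prior information is accurate:

set.seed(1)
nrep<-2000 ; n<-10 ; muTrue<-0.3 ; alpha<-0.05
psi<-list(mu=0, tau2=0.25, sigma2=1)   # illustrative prior settings, not the defaults
cover<-wFAB<-wT<-numeric(nrep)
for(r in 1:nrep){
  y<-rnorm(n, muTrue)
  ciF<-fabtzCI(mean(y), sqrt(var(y)/n), n-1, alpha=alpha, psi=psi)
  ciT<-t.test(y, conf.level=1-alpha)$conf.int
  cover[r]<-(ciF[1] < muTrue & muTrue < ciF[2])
  wFAB[r]<-diff(ciF) ; wT[r]<-diff(ciT)
}
mean(cover)                       # empirical coverage, expected to be near 1-alpha
c(FAB=mean(wFAB), t=mean(wT))     # FAB widths expected to be smaller on average when the prior is accurate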
Computation of a 1-alpha FAB z-interval
fabzCI(y, mu, t2, s2, alpha = 0.05)
y: a numeric scalar, the observed normally distributed statistic
mu: a numeric scalar, the prior expectation of the mean
t2: a positive numeric scalar, the prior variance of the mean
s2: a positive numeric scalar, the population variance
alpha: the type I error rate, so 1-alpha is the coverage rate
A FAB interval is the "frequentist" interval procedure that is Bayes optimal: it minimizes the prior expected interval width among all interval procedures with exact 1-alpha frequentist coverage. This function computes the FAB z-interval for the mean of a normal population with a known variance, given a user-specified prior distribution under which the population mean is normally distributed. The prior and population variance are determined by the arguments as follows:
mu is the prior expectation of the mean
t2 is the prior variance of the mean
s2 is the population variance
a two-dimensional vector of the left and right endpoints of the interval
Peter Hoff
y<-0
fabzCI(y,0,10,1)
fabzCI(y,0,1/10,1)
fabzCI(y,2,10,1)
fabzCI(y,0,1/10,1)
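For intuition, the FAB z-interval can be set beside the usual z-interval y ± z_{1-alpha/2} sqrt(s2). The sketch below is illustrative and uses only the documented arguments; the expectation is that a very diffuse prior (large t2) gives nearly the standard interval, while a prior concentrated near the truth pulls the interval toward mu and tends to shorten it.

y<-1 ; s2<-1 ; alpha<-0.05
y + c(-1,1)*qnorm(1-alpha/2)*sqrt(s2)   # standard z-interval
fabzCI(y, mu=0, t2=100, s2=1)           # diffuse prior: expected to be close to the standard interval
fabzCI(y, mu=0, t2=0.25, s2=1)          # concentrated prior at 0: expected to be shifted toward 0 and typically shorter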
Asymptotic FAB p-values and confidence intervals for parameters in generalized linear regression models
glmFAB(cformula, FABvars, lformula = NULL, alpha = 0.05, silent = FALSE, ...)
cformula: formula for the control variables
FABvars: matrix of regressors for which to make FAB p-values and CIs
lformula: formula for the linking model (just specify the right-hand side)
alpha: error rate for the CIs (1-alpha CIs will be constructed)
silent: suppress the progress report (TRUE) or show it (FALSE)
...: additional arguments to be passed to glm
an object of class glmFAB, which inherits from glm
Peter Hoff
# n observations, p FAB variables, q=2 control variables
n<-100 ; p<-25

# X is design matrix for params of interest
# beta is vector of true parameter values
# v a variable in the linking model - used to share info across betas
v<-rnorm(p) ; beta<-(2 - 2*v + rnorm(p))/3 ; X<-matrix(rnorm(n*p),n,p)/8

# control coefficients and variables
alpha1<-.5 ; alpha2<- -.5
w1<-rnorm(n)/8
w2<-rnorm(n)/8

# simulate data
lp<-1 + alpha1*w1 + alpha2*w2 + X%*%beta
y<-rpois(n,exp(lp))

# fit model
fit<-glmFAB(y~w1+w2,X,~v,family=poisson)
fit$FABpv
fit$FABci
summary(fit) # look at p-value column
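As a hedged follow-up sketch (reusing y, w1, w2, X and fit from the example above, and assuming fit$FABpv holds one p-value per column of X), the FAB p-values can be compared with the ordinary Wald p-values from a separate standard glm fit of the same model:

fit0<-glm(y ~ w1 + w2 + X, family=poisson)      # ordinary ML fit, for comparison only
pWald<-summary(fit0)$coefficients[ -(1:3), 4]   # drop the intercept and the two control coefficients
plot(pWald, fit$FABpv, xlab="standard p-value", ylab="FAB p-value")
abline(0, 1)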
FAB p-values and confidence intervals for parameters in linear regression models
lmFAB(cformula, FABvars, lformula = NULL, alpha = 0.05, rssSplit = TRUE, silent = FALSE)
cformula: formula for the control variables
FABvars: matrix of regressors for which to make FAB p-values and CIs
lformula: formula for the linking model (just specify the right-hand side)
alpha: error rate for the CIs (1-alpha CIs will be constructed)
rssSplit: use some residual degrees of freedom to help fit the linking model (TRUE/FALSE)
silent: suppress the progress report (TRUE) or show it (FALSE)
an object of class lmFAB, which inherits from lm
Peter Hoff
# n observations, p FAB variables, q=2 control variables
n<-100 ; p<-25

# X is design matrix for params of interest
# beta is vector of true parameter values
# v a variable in the linking model - used to share info across betas
v<-rnorm(p) ; beta<-(2 - 2*v + rnorm(p))/3 ; X<-matrix(rnorm(n*p),n,p)/8

# control coefficients and variables
alpha1<-.5 ; alpha2<- -.5
w1<-rnorm(n)/8
w2<-rnorm(n)/8

# simulate data
lp<-1 + alpha1*w1 + alpha2*w2 + X%*%beta
y<-rnorm(n,lp)

# fit model
fit<-lmFAB(y~w1+w2,X,~v)
fit$FABpv
fit$FABci
summary(fit) # look at p-value column
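A similar hedged sketch (reusing the simulated objects above, and assuming fit$FABci is a p-by-2 matrix of lower and upper endpoints, as its name suggests) compares the FAB intervals with the classical t-intervals from an ordinary least-squares fit:

fit0<-lm(y ~ w1 + w2 + X)            # classical OLS fit, for comparison only
ciOLS<-confint(fit0)[ -(1:3), ]      # drop the intercept and the two control coefficients
mean( (fit$FABci[,2]-fit$FABci[,1]) < (ciOLS[,2]-ciOLS[,1]) )   # fraction of FAB CIs that are narrower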
Marginal MLEs for the Fay-Herriot random effects model where the covariance matrix for the sampling model is known up to scale.
mmleFH(y, X, V, ss0 = 0, df0 = 0)
y: direct data following the normal sampling model
X: linking model predictors
V: covariance matrix for the sampling model, known up to scale
ss0: prior sum of squares for the estimate of the sampling variance scale
df0: prior degrees of freedom for the estimate of the sampling variance scale
a list of parameter estimates including
beta, the estimated regression coefficients
t2, the estimate of the linking model variance
s2, the estimate of the sampling variance scale
Peter Hoff
n<-30 ; p<-3
X<-matrix(rnorm(n*p),n,p)
beta<-rnorm(p)
theta<-X%*%beta + rnorm(n)
V<-diag(n)
y<-theta+rnorm(n)
mmleFH(y,X,V)
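In this simulation the true linking variance and the sampling variance scale are both 1, so the estimates can be checked against the truth. The sketch below is illustrative and assumes the returned list uses the element names given in the value description (beta, t2, s2):

fit<-mmleFH(y, X, V)
cbind(truth=beta, estimate=fit$beta)   # regression coefficients
c(t2=fit$t2, s2=fit$s2)                # both variances were simulated with true value 1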
Marginal MLEs for the Fay-Herriot random effects model where the covariance matrix for the sampling model is known
mmleFHP(y, X, Sigma)
y: direct data following the normal sampling model
X: linking model predictors
Sigma: known covariance matrix in the sampling model
a list of parameter estimates including
beta, the estimated regression coefficients
t2, the estimate of the linking model variance
Peter Hoff
n<-30 ; p<-3
X<-matrix(rnorm(n*p),n,p)
beta<-rnorm(p)
theta<-X%*%beta + rnorm(n)
Sigma<-diag(n)
y<-theta+rnorm(n)
mmleFHP(y,X,Sigma)
QR decomposition for lmFAB objects
## S3 method for class 'lmFAB'
qr(x, ...)
x: an object of class lmFAB
...: further arguments, see qr
the QR decomposition of the design matrix
Split residual sum of squares from normal linear regression
rssSplit(fit, df0 = max(1, floor(fit$df/10)), seed = -71407)
fit: an lm object
df0: degrees of freedom for the smaller of the two residual sums of squares
seed: random seed for constructing the basis vectors of the split
a two-dimensional vector of independent sums of squares
Peter Hoff
n<-30 ; p<-6 ; sigma2<-1.5
X<-matrix(rnorm(n*p),n,p)
y<-X%*%rnorm(6) + sqrt(sigma2)*rnorm(n)
ss<-rssSplit(lm(y~ -1+X))
df<-as.numeric( substring(names(ss),first=3))
ss/df
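As a small illustrative follow-up (reusing y and X from the example above; not from the package documentation), df0 can be set explicitly to reserve more residual degrees of freedom in the smaller piece; each piece divided by its degrees of freedom gives a separate estimate of sigma2:

fit0<-lm(y ~ -1+X)
ss<-rssSplit(fit0, df0=4)                      # reserve 4 residual degrees of freedom in the smaller piece
ss/as.numeric(substring(names(ss), first=3))   # two independent estimates of sigma2 (true value 1.5)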
Compute the Bayes-optimal spending function
sfabz(theta, psi, alpha = 0.05)
theta: value of theta being tested
psi: a list of parameters for the spending function, including mu, tau2 and sigma2
alpha: level of the test
This function computes the value of s that minimizes the acceptance probability of a biased level-alpha test for a normal population with known variance, under a specified prior predictive distribution.
a scalar value giving the optimal tail-area probability
Peter Hoff
thetas<-seq(-1,1,length=100)
s<-NULL
for(theta in thetas){
  s<-c(s,sfabz(theta,list(mu=0,tau2=1,sigma2=1)) )
}
plot(thetas,s,type="l")
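For additional intuition (an illustrative sketch; the expectation that a diffuse prior yields spending values near 1/2, corresponding to the usual equal-tailed procedure, is an interpretation rather than a statement from the package documentation), the spending function can be plotted for a concentrated and a diffuse prior:

thetas<-seq(-3, 3, length=200)
sInform<-sapply(thetas, sfabz, psi=list(mu=0, tau2=1, sigma2=1))      # concentrated prior
sDiffuse<-sapply(thetas, sfabz, psi=list(mu=0, tau2=1e5, sigma2=1))   # diffuse prior
plot(thetas, sInform, type="l", ylim=c(0,1), xlab="theta", ylab="s")
lines(thetas, sDiffuse, lty=2)    # expected to stay near 1/2
abline(h=0.5, col="gray")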
summary method for class glmFAB
## S3 method for class 'glmFAB'
summary(object, dispersion = NULL, correlation = FALSE, symbolic.cor = FALSE, ...)
object: an object of class glmFAB
dispersion: see summary.glm
correlation: see summary.glm
symbolic.cor: see summary.glm
...: see summary.glm
A modification of summary.glm that shows FAB p-values in the coefficient table.
A list of summary statistics of the fitted generalized linear model
# n observations, p FAB variables, q=2 control variables
n<-100 ; p<-25

# X is design matrix for params of interest
# beta is vector of true parameter values
# v a variable in the linking model - used to share info across betas
v<-rnorm(p) ; beta<-(2 - 2*v + rnorm(p))/3 ; X<-matrix(rnorm(n*p),n,p)/8

# control coefficients and variables
alpha1<-.5 ; alpha2<- -.5
w1<-rnorm(n)/8
w2<-rnorm(n)/8

# simulate data
lp<-1 + alpha1*w1 + alpha2*w2 + X%*%beta
y<-rpois(n,exp(lp))

# fit model
fit<-glmFAB(y~w1+w2,X,~v,family=poisson)
fit$FABpv
fit$FABci
summary(fit) # look at p-value column
summary method for class lmFAB
## S3 method for class 'lmFAB'
summary(object, correlation = FALSE, symbolic.cor = FALSE, ...)
object: an object of class lmFAB
correlation: see summary.lm
symbolic.cor: see summary.lm
...: see summary.lm
A modification of summary.lm that shows FAB p-values in the coefficient table.
A list of summary statistics of the fitted linear model
# n observations, p FAB variables, q=2 control variables
n<-100 ; p<-25

# X is design matrix for params of interest
# beta is vector of true parameter values
# v a variable in the linking model - used to share info across betas
v<-rnorm(p) ; beta<-(2 - 2*v + rnorm(p))/3 ; X<-matrix(rnorm(n*p),n,p)/8

# control coefficients and variables
alpha1<-.5 ; alpha2<- -.5
w1<-rnorm(n)/8
w2<-rnorm(n)/8

# simulate data
lp<-1 + alpha1*w1 + alpha2*w2 + X%*%beta
y<-rnorm(n,lp)

# fit model
fit<-lmFAB(y~w1+w2,X,~v)
fit$FABpv
fit$FABci
summary(fit) # look at p-value column