statsmodels.gam.generalized_additive_model.GLMGamResults

class statsmodels.gam.generalized_additive_model.GLMGamResults(model, params, normalized_cov_params, scale, **kwds)

Results class for generalized additive models, GAM.
GLMGamResults inherits from GLMResults. All methods related to the loglikelihood function return the penalized values.

Warning: some inherited methods might not correctly take account of the penalization.
Notes
status: experimental
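A GLMGamResults instance is normally obtained by fitting a GLMGam model rather than by constructing it directly. The sketch below is illustrative only: it assumes a pandas DataFrame df with a response column y, two columns x0 and x1 that enter through penalized splines, and a linear column x2; the column names, spline settings, and penalty weights are hypothetical.

    import statsmodels.api as sm
    from statsmodels.gam.api import GLMGam, BSplines

    # columns x0 and x1 enter through penalized B-splines (hypothetical names)
    x_spline = df[["x0", "x1"]]
    bs = BSplines(x_spline, df=[10, 10], degree=[3, 3])

    # x2 enters linearly; alpha gives one penalization weight per smooth term
    gam = GLMGam.from_formula("y ~ x2", data=df, smoother=bs, alpha=[0.1, 0.1],
                              family=sm.families.Gaussian())
    res = gam.fit()  # res is a GLMGamResults instance
    print(res.summary())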
Attributes

edf
    List of effective degrees of freedom for each column of the design matrix.
hat_matrix_diag
    Diagonal of the hat matrix.
gcv
    Generalized cross-validation criterion, computed as
    gcv = scale / (1. - hat_matrix_trace / nobs)**2
cv
    Cross-validation criterion, computed as
    cv = ((resid_pearson / (1 - hat_matrix_diag))**2).sum() / nobs
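Both criteria can be reproduced from public attributes of a fitted results instance. A minimal sketch, assuming res is a fitted GLMGamResults:

    # assumes `res` is a fitted GLMGamResults instance
    hat_diag = res.get_hat_matrix_diag()   # diagonal of the hat matrix
    hat_trace = hat_diag.sum()

    gcv = res.scale / (1. - hat_trace / res.nobs) ** 2
    cv = ((res.resid_pearson / (1. - hat_diag)) ** 2).sum() / res.nobs

    # these should agree with the stored criteria
    print(gcv, res.gcv)
    print(cv, res.cv)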
Methods

conf_int([alpha, cols])
    Construct confidence interval for the fitted parameters.
cov_params([r_matrix, column, scale, cov_p, …])
    Compute the variance/covariance matrix.
f_test(r_matrix[, cov_p, scale, invcov])
    Compute the F-test for a joint linear hypothesis.
get_hat_matrix_diag([observed, _axis])
    Compute the diagonal of the hat matrix.
get_influence([observed])
    Get an instance of GLMInfluence with influence and outlier measures.
get_prediction([exog, exog_smooth, transform])
    Compute prediction results.
initialize(model, params, **kwargs)
    Initialize (possibly re-initialize) a Results instance.
load(fname)
    Load a pickled results instance.
normalized_cov_params()
    See specific model class docstring.
partial_values(smooth_index[, include_constant])
    Contribution of a smooth term to the linear prediction (see the example after this table).
plot_added_variable(focus_exog[, …])
    Create an added variable plot for a fitted regression model.
plot_ceres_residuals(focus_exog[, frac, …])
    Produce a CERES (Conditional Expectation Partial Residuals) plot for a fitted regression model.
plot_partial(smooth_index[, plot_se, cpr, …])
    Plot the contribution of a smooth term to the linear prediction.
plot_partial_residuals(focus_exog[, ax])
    Create a partial residual, or 'component plus residual', plot for a fitted regression model.
predict([exog, exog_smooth, transform])
    Compute predicted values for the model.
remove_data()
    Remove data arrays, all nobs arrays, from result and model.
save(fname[, remove_data])
    Save a pickle of this instance.
summary([yname, xname, title, alpha])
    Summarize the regression results.
summary2([yname, xname, title, alpha, …])
    Experimental summary for regression results.
t_test(r_matrix[, cov_p, scale, use_t])
    Compute a t-test for each linear hypothesis of the form Rb = q.
t_test_pairwise(term_name[, method, alpha, …])
    Perform pairwise t_test with multiple testing corrected p-values.
test_significance(smooth_index)
    Hypothesis test that a smooth component is zero.
wald_test(r_matrix[, cov_p, scale, invcov, …])
    Compute a Wald-test for a joint linear hypothesis.
wald_test_terms([skip_single, …])
    Compute a sequence of Wald tests for terms over multiple columns.
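The GAM-specific methods above (partial_values, plot_partial, test_significance) are the main additions over GLMResults. A minimal sketch of how they might be used, assuming res is a fitted GLMGamResults with at least one smooth term (index 0 refers to the first smooth term):

    # contribution of the first smooth term to the linear predictor, with standard errors
    y_partial, se_partial = res.partial_values(0)

    # partial-effect plot for the first smooth term; cpr adds component-plus-residual points
    fig = res.plot_partial(0, cpr=True)

    # hypothesis test that the first smooth component is zero
    print(res.test_significance(0))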
Properties
aic
    Akaike Information Criterion: -2 * llf + 2 * (df_model + 1)
bic
    Bayes Information Criterion: deviance - df_resid * log(nobs)
bse
    The standard errors of the parameter estimates.
deviance
    See statsmodels.families.family for the distribution-specific deviance functions.
fittedvalues
    Linear predicted values for the fitted model.
llf
    Value of the loglikelihood function evaluated at params.
llnull
    Log-likelihood of the model fit with a constant as the only regressor.
mu
    See GLM docstring.
null
    Fitted values of the null model.
null_deviance
    The value of the deviance function for the model fit with a constant as the only regressor.
pearson_chi2
    Pearson's Chi-Squared statistic, defined as the sum of the squares of the Pearson residuals.
pvalues
    The two-tailed p-values for the t-stats of the params.
resid_anscombe
    Anscombe residuals.
resid_anscombe_scaled
    Scaled Anscombe residuals.
resid_anscombe_unscaled
    Unscaled Anscombe residuals.
resid_deviance
    Deviance residuals.
resid_pearson
    Pearson residuals.
resid_response
    Response residuals.
resid_working
    Working residuals.
tvalues
    Return the t-statistic for a given parameter estimate.
use_t
    Flag indicating to use the Student's distribution in inference.
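These properties are inherited from GLMResults; as noted above, the loglikelihood-related quantities (llf, and hence aic) use the penalized loglikelihood. A minimal access sketch, assuming res is a fitted GLMGamResults:

    # assumes `res` is a fitted GLMGamResults instance
    print(res.aic, res.bic)    # information criteria
    print(res.bse)             # standard errors of the parameter estimates
    print(res.pvalues)         # two-tailed p-values for the parameters
    resid = res.resid_pearson  # Pearson residuals; resid_deviance, resid_working, etc. also available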