
estimatePopsizeFit does for estimatePopsize what glm.fit does for glm: it is called internally by estimatePopsize. Since estimatePopsize does much more than just regression fitting, estimatePopsizeFit is much faster.

Usage

estimatePopsizeFit(
  y,
  X,
  family,
  control,
  method,
  priorWeights,
  coefStart,
  etaStart,
  offset,
  ...
)

Arguments

y

vector of dependent variables.

X

model matrix, in the VGLM form (see the example below).

family

same as the model argument in estimatePopsize().

control

control parameters created by controlMethod().

method

method of estimation, same as in estimatePopsize().

priorWeights

vector of prior weights; it is the same argument as weights in estimatePopsize().

etaStart, coefStart

initial values of the linear predictors (etaStart) or regression parameters (coefStart).

offset

offset, by default passed from estimatePopsize().

...

arguments to pass to other methods.

Value

A list with the regression parameters, the working weights (if the IRLS fitting method was chosen) and the number of iterations taken.

Details

If the method argument was set to "optim", the stats::optim function is used to fit the regression, with the analytically computed (minus) log-likelihood and gradient functions passed as the fn and gr arguments. Unfortunately optim does not allow a Hessian to be specified. More information about how to modify the optim fitting is included in controlMethod().
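To give a flavour of this kind of optim-based fitting, here is a minimal self-contained sketch on a plain Poisson model with simulated data (an illustration only, not the package's internal code):

set.seed(1)
x <- cbind(1, rnorm(100))                  # design matrix with intercept
y <- rpois(100, exp(x %*% c(0.5, 1)))      # simulated Poisson counts

# minus log-likelihood and its analytic gradient, passed as fn and gr
fn <- function(beta) -sum(dpois(y, drop(exp(x %*% beta)), log = TRUE))
gr <- function(beta) drop(-t(x) %*% (y - drop(exp(x %*% beta))))

optim(par = c(0, 0), fn = fn, gr = gr, method = "BFGS")$par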

If the method argument was set to "IRLS", iteratively reweighted least squares is used. The algorithm is well known for generalised linear models; Thomas W. Yee later extended it to vector generalised linear models, and in more general terms it can roughly be described as follows (this is Yee's description after changing some conventions):

  1. Initialize with:

    • converged <- FALSE

    • iter <- 1

    • β <- start

    • W <- prior

    • ℓ <- ℓ(β)

  2. If converged or iter > Maxiter move to step 7.

  3. Store values from the previous algorithm step:

    • W_- <- W

    • β_- <- β

    • ℓ_- <- ℓ

    and assign values at the current step:

    • η <- X_vlm β

    • Z_i <- η_i + (∂ℓ_i/∂η_i) E(-∂²ℓ_i/∂η_i^T∂η_i)^-1

    • W_ij <- E(-∂²ℓ/∂η_j^T∂η_i)

    where ℓ_i is the ith component of the log-likelihood function, η_i is the vector of linear predictors associated with the ith row, E(-∂²ℓ_i/∂η_i^T∂η_i) corresponds to the weights associated with the ith row, and W is a block matrix made of the diagonal matrices E(-∂²ℓ/∂η_j^T∂η_i).

  4. Regress Z on X_vlm to obtain β as: β <- (X_vlm^T W X_vlm)^-1 X_vlm^T W Z

  5. Assign:

    • converged <- ℓ(β) - ℓ_- < ε·ℓ_- or ||β - β_-||_∞ < ε

    • iter <- iter + 1

    where ε is the relative tolerance level, by default 1e-8.

  6. Return to step 2.

  7. Return β, W, iter.

In this package we use a different convention for the X_vlm matrix, hence slight differences are present in the algorithm description, but the results are identical. A toy one-predictor sketch of this loop is given below.
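As a rough illustration of the scheme above (a toy sketch, not the package's internal implementation), the loop below fits an ordinary one-predictor Poisson GLM, for which ∂ℓ_i/∂η_i = y_i - μ_i and E(-∂²ℓ_i/∂η_i²) = μ_i:

# toy IRLS loop for a Poisson GLM with log link (illustration only)
set.seed(1)
X <- cbind(1, rnorm(100))
y <- rpois(100, exp(X %*% c(0.5, 1)))

beta <- c(0, 0); converged <- FALSE; iter <- 1; epsilon <- 1e-8
while (!converged && iter <= 100) {
  betaPrev <- beta
  eta <- drop(X %*% beta)
  mu  <- exp(eta)
  Z   <- eta + (y - mu) / mu                          # working response
  beta <- drop(solve(t(X) %*% (mu * X), t(X) %*% (mu * Z)))
  converged <- max(abs(beta - betaPrev)) < epsilon    # sup-norm criterion
  iter <- iter + 1
}
beta  # agrees with glm.fit(x = X, y = y, family = poisson())$coefficients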

References

Yee, T. W. (2015). Vector Generalized Linear and Additive Models: With an Implementation in R. New York, USA: Springer. ISBN 978-1-4939-2817-0.

Author

Piotr Chlebicki, Maciej Beresewicz

Examples

# \donttest{
summary(farmsubmission)
#>    TOTAL_SUB        log_size       log_distance      C_TYPE    
#>  Min.   : 1.00   Min.   : 0.000   Min.   : 4.102   Beef :5336  
#>  1st Qu.: 1.00   1st Qu.: 4.673   1st Qu.:10.351   Dairy:6700  
#>  Median : 1.00   Median : 5.347   Median :10.778               
#>  Mean   : 2.34   Mean   : 5.259   Mean   :10.662               
#>  3rd Qu.: 3.00   3rd Qu.: 5.940   3rd Qu.:11.099               
#>  Max.   :47.00   Max.   :10.480   Max.   :12.097               

# construct the VGLM model matrix
X <- matrix(data = 0, nrow = 2 * NROW(farmsubmission), ncol = 7)
X[1:NROW(farmsubmission), 1:4] <- model.matrix(
  ~ 1 + log_size + log_distance + C_TYPE, 
  farmsubmission
)

X[-(1:NROW(farmsubmission)), 5:7] <- X[1:NROW(farmsubmission), c(1, 3, 4)]

# this attribute tells the function how many columns of the design matrix 
# correspond to each linear predictor 
attr(X, "hwm") <- c(4, 3)

# get starting points
start <- glm.fit(
  y = farmsubmission$TOTAL_SUB, 
  x = X[1:NROW(farmsubmission), 1:4], 
  family = poisson()
)$coefficients

res <- estimatePopsizeFit(
  y = farmsubmission$TOTAL_SUB, 
  X = X, 
  method = "IRLS", 
  priorWeights = 1, 
  family = ztoigeom(), 
  control = controlMethod(verbose = 5), 
  coefStart = c(start, 0, 0, 0),
  etaStart = matrix(X %*% c(start, 0, 0, 0), ncol = 2),
  offset = cbind(rep(0, NROW(farmsubmission)), rep(0, NROW(farmsubmission)))
)
#> Iteration number 1 log-likelihood: -17455.372
#> Parameter vector:  -2.255494347  0.521900283 -0.048255922  0.321168020 -1.297382847  0.049409082 -0.726587214
#> log-likelihood reduction:  Inf
#> Value of gradient at current step:
#>    639.53919  4035.62183  6732.72078   672.31223  -316.39394 -3358.16739  -229.50394
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 2.2554943
#> ----
#> Iteration number 2 log-likelihood: -17289.531
#> Parameter vector:  -2.859491920  0.627917105 -0.063516471  0.573159952 -2.327571214  0.074464787 -0.734988992
#> log-likelihood reduction:  165.84115
#> Value of gradient at current step:
#>    77.37365182  329.72667483  823.92206208    0.28711463  -53.27713944 -568.21808076  -29.69092406
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 1.0301884
#> ----
#> Iteration number 3 log-likelihood: -17279.272
#> Parameter vector:  -2.710874025  0.613067896 -0.069548715  0.537208696 -2.550651004  0.071488122 -0.939145580
#> log-likelihood reduction:  10.258788
#> Value of gradient at current step:
#>   35.9649654 218.1210745 385.4441808  32.8759028  -6.8429684 -71.8589804  -5.7178443
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.22307979
#> ----
#> Iteration number 4 log-likelihood: -17278.776
#> Parameter vector:  -2.78426085  0.61628333 -0.06440012  0.53843272 -3.10491635  0.12060422 -1.04138882
#> log-likelihood reduction:  0.49548535
#> Value of gradient at current step:
#>   1.67966432 11.13935043 16.42827781  1.15147049 -0.53787945 -5.57182317 -0.74695785
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.55426534
#> ----
#> Iteration number 5 log-likelihood: -17278.762
#> Parameter vector:  -2.77818073  0.61674992 -0.06504016  0.53517478 -3.10447410  0.12145802 -1.08163762
#> log-likelihood reduction:  0.014133264
#> Value of gradient at current step:
#>   0.232501334  2.204257359  2.694809270  0.131590616 -0.071608376 -0.605938875 -0.064356213
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.040248796
#> ----
#> Iteration number 6 log-likelihood: -17278.761
#> Parameter vector:  -2.78602286  0.61698772 -0.06441877  0.53495927 -3.19325142  0.12969021 -1.08314735
#> log-likelihood reduction:  0.00072206104
#> Value of gradient at current step:
#>  -0.0149254818 -0.0442854059 -0.2585639447 -0.0671445437 -0.0089103073 -0.1428452785 -0.0408361062
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.088777321
#> ----
#> Iteration number 7 log-likelihood: -17278.761
#> Parameter vector:  -2.783307402  0.617006834 -0.064659772  0.534565433 -3.160032510  0.126742033 -1.087078105
#> log-likelihood reduction:  0.000086867327
#> Value of gradient at current step:
#>   0.0219006963  0.2137543746  0.2878052108  0.0210725209 -0.0038258075 -0.0055588958  0.0088000042
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.033218913
#> ----
#> Iteration number 8 log-likelihood: -17278.761
#> Parameter vector:  -2.785132183  0.617029483 -0.064506904  0.534668321 -3.181900933  0.128724679 -1.085840518
#> log-likelihood reduction:  0.00002149725
#> Value of gradient at current step:
#>  -0.00917986045 -0.07823752458 -0.12764140882 -0.01528913765  0.00031447068 -0.01459257981 -0.00776040765
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.021868422
#> ----
#> Iteration number 9 log-likelihood: -17278.761
#> Parameter vector:  -2.784182050  0.617023943 -0.064588093  0.534586386 -3.170374939  0.127687086 -1.086727858
#> log-likelihood reduction:  0.0000064244196
#> Value of gradient at current step:
#>   0.00580810482  0.05244014319  0.07853720954  0.00778331025 -0.00057098473  0.00429221923  0.00373823937
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.011525994
#> ----
#> Iteration number 10 log-likelihood: -17278.761
#> Parameter vector:  -2.784725925  0.617028507 -0.064541975  0.534626993 -3.176948180  0.128280435 -1.086273279
#> log-likelihood reduction:  0.0000019945583
#> Value of gradient at current step:
#>  -0.00307033641 -0.02728421588 -0.04201045335 -0.00448214638  0.00022935424 -0.00326931976 -0.00221219809
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.006573241
#> ----
#> Iteration number 11 log-likelihood: -17278.761
#> Parameter vector:  -2.78442528  0.61702627 -0.06456754  0.53460327 -3.17330966  0.12795232 -1.08653540
#> log-likelihood reduction:  0.00000061988248
#> Value of gradient at current step:
#>   0.00175710833  0.01568170299  0.02390375498  0.00247668339 -0.00014882078  0.00161926855  0.00120723719
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.0036385179
#> ----
#> Iteration number 12 log-likelihood: -17278.761
#> Parameter vector:  -2.784593634  0.617027580 -0.064553239  0.534616288 -3.175346365  0.128136053 -1.086390890
#> log-likelihood reduction:  0.00000019314757
#> Value of gradient at current step:
#>  -0.000968762323 -0.008641334475 -0.013216142970 -0.001385713879  0.000077984443 -0.000953809240 -0.000679710377
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.0020367031
#> ----
#> Iteration number 13 log-likelihood: -17278.761
#> Parameter vector:  -2.784499807  0.617026860 -0.064561213  0.534608978 -3.174211159  0.128033659 -1.086471902
#> log-likelihood reduction:  0.000000060121238
#> Value of gradient at current step:
#>   0.000543879322  0.004849594735  0.007409081651  0.000772957763 -0.000044801248  0.000519451058  0.000377932800
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.0011352061
#> ----
#> Iteration number 14 log-likelihood: -17278.761
#> Parameter vector:  -2.784552191  0.617027265 -0.064556762  0.534613048 -3.174844941  0.128090828 -1.086426773
#> log-likelihood reduction:  0.000000018728315
#> Value of gradient at current step:
#>  -0.000302564127 -0.002699091075 -0.004124826709 -0.000431270327  0.000024661481 -0.000293282189 -0.000211223041
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.00063378216
#> ----
#> Iteration number 15 log-likelihood: -17278.761
#> Parameter vector:  -2.784522964  0.617027040 -0.064559245  0.534610775 -3.174491334  0.128058932 -1.086451974
#> log-likelihood reduction:  0.00000000583168
#> Value of gradient at current step:
#>   0.000169118292  0.001508137356  0.002304647172  0.000240720368 -0.000013855242  0.000162719781  0.000117791117
#> Algorithm will terminate if one of following conditions will be met:
#> The increase to minus log-likelihood will be bellow chosen value of epsilon 1e-08 
#> Maximum change to the vector of regression parameters will be bellow the chosen value of epsilon.
#> At current step the highest change was: 0.00035360721
#> ----
#> Value of analytically computed hessian at fitted regression coefficients:
#>             [,1]         [,2]         [,3]        [,4]       [,5]        [,6]
#> [1,]  -5533.0360  -31139.1628  -58786.3824  -3921.3952  407.46348  23473.8262
#> [2,] -31139.1628 -179723.7217 -330742.8394 -22956.5461 4362.01544   1060.0139
#> [3,] -58786.3824 -330742.8394 -627035.9660 -41504.3671  185.81715   4362.0154
#> [4,]  -3921.3952  -22956.5461  -41504.3671  -3921.3952 2193.72685  46845.5137
#> [5,]    407.4635     185.8172   46845.5137   1060.0139  -88.14760   -948.1115
#> [6,]   2193.7268    4362.0154    1971.9380   1971.9380 -948.11154 -10234.8910
#> [7,]   4362.0154   23473.8262     185.8172    185.8172  -28.97964   -312.6869
#>            [,7]
#> [1,] 1971.93804
#> [2,]  185.81715
#> [3,] 1971.93804
#> [4,]  185.81715
#> [5,]  -28.97964
#> [6,] -312.68692
#> [7,]  -28.97964
#> The matrix above has the following eigen values:
#>  -811159.7+0i -17207.51+0i -2161.919+9274.748i -2161.919-9274.748i 6057.218+0i 1581.239+0i -1513.504+0i 

# extract results

# regression coefficient vector
res$beta
#> [1] -2.78452296  0.61702704 -0.06455925  0.53461077 -3.17449133  0.12805893
#> [7] -1.08645197

# check likelihood
ll <- ztoigeom()$makeMinusLogLike(y = farmsubmission$TOTAL_SUB, X = X)

-ll(res$beta)
#> [1] -17278.76

# number of iterations
res$iter
#> [1] 15

# working weights
head(res$weights)
#>          lambda       mixed       mixed       omega
#> [1,] 0.22688312 -0.03207963 -0.03207963 0.006351956
#> [2,] 0.66642253 -0.03199578 -0.03199578 0.006238854
#> [3,] 0.08469406 -0.01202233 -0.01202233 0.001899477
#> [4,] 0.14084340 -0.01990317 -0.01990317 0.003397031
#> [5,] 0.34099080 -0.04762290 -0.04762290 0.012045206
#> [6,] 0.42290801 -0.05782236 -0.05782236 0.018478738

# Compare with optim call

res2 <- estimatePopsizeFit(
  y = farmsubmission$TOTAL_SUB, 
  X = X, 
  method = "optim", 
  priorWeights = 1, 
  family = ztoigeom(), 
  coefStart = c(start, 0, 0, 0),
  control = controlMethod(verbose = 1, silent = TRUE),
  offset = cbind(rep(0, NROW(farmsubmission)), rep(0, NROW(farmsubmission)))
)
#>   Nelder-Mead direct search function minimizer
#> function value for initial parameters = 18634.249443
#>   Scaled convergence tolerance is 0.000186342
#> Stepsize computed as 0.082584
#> BUILD              8 20695.993313 18634.249443
#> LO-REDUCTION      10 20521.906450 18634.249443
#> REFLECTION        12 19390.030449 18203.536957
#> LO-REDUCTION      14 18936.678302 18203.536957
#> LO-REDUCTION      16 18749.996003 18203.536957
#> LO-REDUCTION      18 18737.148786 18203.536957
#> LO-REDUCTION      20 18718.703486 18203.536957
#> EXTENSION         22 18670.005025 17895.311480
#> LO-REDUCTION      24 18634.249443 17895.311480
#> LO-REDUCTION      26 18560.534560 17895.311480
#> EXTENSION         28 18449.746468 17588.420920
#> LO-REDUCTION      30 18295.895431 17588.420920
#> LO-REDUCTION      32 18222.396059 17588.420920
#> LO-REDUCTION      34 18203.536957 17588.420920
#> LO-REDUCTION      36 18087.422161 17588.420920
#> LO-REDUCTION      38 18005.031702 17588.420920
#> REFLECTION        40 18001.913029 17584.422405
#> REFLECTION        42 17895.311480 17527.021914
#> LO-REDUCTION      44 17780.944181 17527.021914
#> LO-REDUCTION      46 17712.559384 17527.021914
#> HI-REDUCTION      48 17643.578220 17527.021914
#> EXTENSION         50 17621.199586 17473.997274
#> LO-REDUCTION      52 17602.520321 17473.997274
#> HI-REDUCTION      54 17594.799573 17473.997274
#> LO-REDUCTION      56 17588.420920 17473.997274
#> LO-REDUCTION      58 17584.422405 17473.997274
#> EXTENSION         60 17556.846275 17447.303959
#> EXTENSION         62 17555.104630 17408.736552
#> LO-REDUCTION      64 17528.536402 17408.736552
#> LO-REDUCTION      66 17527.021914 17408.736552
#> EXTENSION         68 17511.009990 17375.624732
#> LO-REDUCTION      70 17506.001244 17375.624732
#> LO-REDUCTION      72 17473.997274 17375.624732
#> LO-REDUCTION      74 17459.682283 17375.624732
#> REFLECTION        76 17447.303959 17374.438952
#> EXTENSION         78 17431.820752 17359.678457
#> REFLECTION        80 17410.856537 17352.512930
#> LO-REDUCTION      82 17408.736552 17352.512930
#> LO-REDUCTION      84 17392.430571 17352.512930
#> LO-REDUCTION      86 17383.843739 17352.512930
#> LO-REDUCTION      88 17375.624732 17352.512930
#> HI-REDUCTION      90 17374.438952 17352.512930
#> REFLECTION        92 17364.440518 17351.391717
#> LO-REDUCTION      94 17361.269788 17351.391717
#> HI-REDUCTION      96 17359.678457 17351.391717
#> LO-REDUCTION      98 17359.333428 17351.391717
#> EXTENSION        100 17359.195000 17340.073149
#> LO-REDUCTION     102 17355.280449 17340.073149
#> LO-REDUCTION     104 17354.737924 17340.073149
#> LO-REDUCTION     106 17352.512930 17340.073149
#> LO-REDUCTION     108 17352.283989 17340.073149
#> EXTENSION        110 17351.931319 17334.816229
#> LO-REDUCTION     112 17351.391717 17334.816229
#> LO-REDUCTION     114 17349.077477 17334.816229
#> LO-REDUCTION     116 17346.861518 17334.816229
#> EXTENSION        118 17344.774811 17327.221865
#> LO-REDUCTION     120 17344.768718 17327.221865
#> EXTENSION        122 17340.363390 17314.895018
#> LO-REDUCTION     124 17340.073149 17314.895018
#> LO-REDUCTION     126 17337.014069 17314.895018
#> LO-REDUCTION     128 17336.611433 17314.895018
#> LO-REDUCTION     130 17334.816229 17314.895018
#> EXTENSION        132 17330.280767 17308.580216
#> LO-REDUCTION     134 17327.221865 17308.580216
#> LO-REDUCTION     136 17324.616040 17308.580216
#> REFLECTION       138 17320.961818 17307.339443
#> LO-REDUCTION     140 17320.314061 17307.339443
#> REFLECTION       142 17316.412981 17307.294606
#> LO-REDUCTION     144 17314.895018 17307.294606
#> HI-REDUCTION     146 17311.221319 17307.294606
#> REFLECTION       148 17310.240780 17305.855235
#> LO-REDUCTION     150 17309.024781 17305.855235
#> HI-REDUCTION     152 17308.580216 17305.855235
#> REFLECTION       154 17307.909389 17304.751271
#> LO-REDUCTION     156 17307.607430 17304.751271
#> HI-REDUCTION     158 17307.339443 17304.751271
#> EXTENSION        160 17307.294606 17303.969236
#> LO-REDUCTION     162 17306.726648 17303.969236
#> LO-REDUCTION     164 17306.486915 17303.969236
#> LO-REDUCTION     166 17306.013269 17303.969236
#> LO-REDUCTION     168 17305.855235 17303.969236
#> REFLECTION       170 17305.566500 17303.365745
#> HI-REDUCTION     172 17304.769833 17303.365745
#> LO-REDUCTION     174 17304.751271 17303.365745
#> EXTENSION        176 17304.679762 17302.760257
#> LO-REDUCTION     178 17304.415232 17302.760257
#> LO-REDUCTION     180 17304.272835 17302.760257
#> EXTENSION        182 17304.232531 17301.382173
#> LO-REDUCTION     184 17303.969236 17301.382173
#> LO-REDUCTION     186 17303.740306 17301.382173
#> LO-REDUCTION     188 17303.561302 17301.382173
#> LO-REDUCTION     190 17303.365745 17301.382173
#> REFLECTION       192 17303.285629 17301.216054
#> EXTENSION        194 17302.760257 17300.566315
#> EXTENSION        196 17302.374881 17299.438766
#> EXTENSION        198 17302.365881 17298.338665
#> EXTENSION        200 17301.538607 17296.617465
#> LO-REDUCTION     202 17301.386247 17296.617465
#> LO-REDUCTION     204 17301.382173 17296.617465
#> LO-REDUCTION     206 17301.216054 17296.617465
#> EXTENSION        208 17300.566315 17294.656203
#> EXTENSION        210 17299.438766 17292.415140
#> LO-REDUCTION     212 17298.338665 17292.415140
#> LO-REDUCTION     214 17297.787001 17292.415140
#> REFLECTION       216 17297.283737 17292.315917
#> REFLECTION       218 17296.776034 17290.819569
#> EXTENSION        220 17296.617465 17289.284957
#> LO-REDUCTION     222 17294.656203 17289.284957
#> LO-REDUCTION     224 17294.447369 17289.284957
#> LO-REDUCTION     226 17292.619840 17289.284957
#> REFLECTION       228 17292.415140 17289.050671
#> LO-REDUCTION     230 17292.315917 17289.050671
#> HI-REDUCTION     232 17290.819569 17289.050671
#> LO-REDUCTION     234 17290.333239 17289.050671
#> HI-REDUCTION     236 17290.198503 17289.050671
#> LO-REDUCTION     238 17290.030554 17289.050671
#> HI-REDUCTION     240 17289.707101 17289.050671
#> REFLECTION       242 17289.446684 17289.049956
#> REFLECTION       244 17289.402572 17288.762931
#> LO-REDUCTION     246 17289.314216 17288.762931
#> LO-REDUCTION     248 17289.284957 17288.762931
#> LO-REDUCTION     250 17289.237020 17288.755874
#> REFLECTION       252 17289.231432 17288.591829
#> LO-REDUCTION     254 17289.050671 17288.591829
#> HI-REDUCTION     256 17289.049956 17288.591829
#> LO-REDUCTION     258 17288.965069 17288.591829
#> LO-REDUCTION     260 17288.779011 17288.591829
#> LO-REDUCTION     262 17288.766114 17288.591829
#> REFLECTION       264 17288.762931 17288.553874
#> LO-REDUCTION     266 17288.760612 17288.553874
#> LO-REDUCTION     268 17288.755874 17288.553874
#> REFLECTION       270 17288.712718 17288.530741
#> LO-REDUCTION     272 17288.644400 17288.530741
#> EXTENSION        274 17288.613052 17288.437051
#> HI-REDUCTION     276 17288.606964 17288.437051
#> EXTENSION        278 17288.591829 17288.369092
#> LO-REDUCTION     280 17288.588782 17288.369092
#> LO-REDUCTION     282 17288.553874 17288.369092
#> EXTENSION        284 17288.542682 17288.275532
#> LO-REDUCTION     286 17288.531198 17288.275532
#> LO-REDUCTION     288 17288.530741 17288.275532
#> LO-REDUCTION     290 17288.469459 17288.275532
#> EXTENSION        292 17288.461014 17288.127850
#> LO-REDUCTION     294 17288.437051 17288.127850
#> LO-REDUCTION     296 17288.369092 17288.127850
#> LO-REDUCTION     298 17288.347618 17288.127850
#> LO-REDUCTION     300 17288.313219 17288.127850
#> EXTENSION        302 17288.295935 17288.071792
#> EXTENSION        304 17288.275532 17287.915686
#> LO-REDUCTION     306 17288.214888 17287.915686
#> LO-REDUCTION     308 17288.211032 17287.915686
#> EXTENSION        310 17288.173100 17287.842214
#> LO-REDUCTION     312 17288.141906 17287.842214
#> LO-REDUCTION     314 17288.127850 17287.842214
#> EXTENSION        316 17288.071792 17287.733682
#> EXTENSION        318 17288.054470 17287.663282
#> EXTENSION        320 17287.989272 17287.507624
#> LO-REDUCTION     322 17287.919537 17287.507624
#> LO-REDUCTION     324 17287.915686 17287.507624
#> LO-REDUCTION     326 17287.856498 17287.507624
#> REFLECTION       328 17287.842214 17287.490634
#> LO-REDUCTION     330 17287.733682 17287.490634
#> REFLECTION       332 17287.663282 17287.438068
#> LO-REDUCTION     334 17287.555894 17287.438068
#> LO-REDUCTION     336 17287.551441 17287.438068
#> REFLECTION       338 17287.547247 17287.422322
#> REFLECTION       340 17287.542002 17287.418701
#> REFLECTION       342 17287.507624 17287.368868
#> LO-REDUCTION     344 17287.490634 17287.368868
#> LO-REDUCTION     346 17287.444486 17287.368868
#> LO-REDUCTION     348 17287.438762 17287.368868
#> HI-REDUCTION     350 17287.438068 17287.368868
#> EXTENSION        352 17287.422322 17287.329022
#> EXTENSION        354 17287.418701 17287.274393
#> LO-REDUCTION     356 17287.391509 17287.274393
#> LO-REDUCTION     358 17287.381359 17287.274393
#> LO-REDUCTION     360 17287.375030 17287.274393
#> LO-REDUCTION     362 17287.371807 17287.274393
#> LO-REDUCTION     364 17287.368868 17287.274393
#> REFLECTION       366 17287.329022 17287.274032
#> LO-REDUCTION     368 17287.327098 17287.274032
#> EXTENSION        370 17287.322765 17287.246248
#> LO-REDUCTION     372 17287.316574 17287.246248
#> EXTENSION        374 17287.308329 17287.210843
#> LO-REDUCTION     376 17287.304794 17287.210843
#> LO-REDUCTION     378 17287.279608 17287.210843
#> REFLECTION       380 17287.274393 17287.210366
#> REFLECTION       382 17287.274032 17287.209122
#> LO-REDUCTION     384 17287.264744 17287.209122
#> REFLECTION       386 17287.246248 17287.183411
#> LO-REDUCTION     388 17287.223877 17287.183411
#> LO-REDUCTION     390 17287.214413 17287.183411
#> LO-REDUCTION     392 17287.213088 17287.183411
#> EXTENSION        394 17287.210843 17287.178084
#> LO-REDUCTION     396 17287.210366 17287.178084
#> LO-REDUCTION     398 17287.209122 17287.178084
#> REFLECTION       400 17287.197803 17287.172953
#> EXTENSION        402 17287.196607 17287.164829
#> LO-REDUCTION     404 17287.192398 17287.164829
#> EXTENSION        406 17287.183411 17287.148789
#> LO-REDUCTION     408 17287.183094 17287.148789
#> LO-REDUCTION     410 17287.181404 17287.148789
#> LO-REDUCTION     412 17287.178084 17287.148789
#> LO-REDUCTION     414 17287.173517 17287.148789
#> LO-REDUCTION     416 17287.172953 17287.148789
#> LO-REDUCTION     418 17287.165027 17287.148789
#> EXTENSION        420 17287.164829 17287.142664
#> LO-REDUCTION     422 17287.164534 17287.142664
#> EXTENSION        424 17287.161998 17287.140674
#> LO-REDUCTION     426 17287.160519 17287.140674
#> EXTENSION        428 17287.154476 17287.127103
#> LO-REDUCTION     430 17287.153672 17287.127103
#> LO-REDUCTION     432 17287.148789 17287.127103
#> LO-REDUCTION     434 17287.147540 17287.127103
#> LO-REDUCTION     436 17287.146752 17287.127103
#> LO-REDUCTION     438 17287.142664 17287.127103
#> LO-REDUCTION     440 17287.140674 17287.127103
#> REFLECTION       442 17287.138106 17287.127061
#> EXTENSION        444 17287.131886 17287.110761
#> LO-REDUCTION     446 17287.130480 17287.110761
#> LO-REDUCTION     448 17287.129088 17287.110761
#> LO-REDUCTION     450 17287.128934 17287.110761
#> LO-REDUCTION     452 17287.128336 17287.110761
#> LO-REDUCTION     454 17287.127103 17287.110761
#> LO-REDUCTION     456 17287.127061 17287.110761
#> EXTENSION        458 17287.122784 17287.105926
#> LO-REDUCTION     460 17287.121627 17287.105926
#> EXTENSION        462 17287.120146 17287.101582
#> LO-REDUCTION     464 17287.118771 17287.101582
#> LO-REDUCTION     466 17287.117855 17287.101582
#> EXTENSION        468 17287.117130 17287.090721
#> LO-REDUCTION     470 17287.110761 17287.090721
#> LO-REDUCTION     472 17287.108359 17287.090721
#> LO-REDUCTION     474 17287.107134 17287.090721
#> EXTENSION        476 17287.105926 17287.081849
#> LO-REDUCTION     478 17287.104244 17287.081849
#> EXTENSION        480 17287.101582 17287.080130
#> LO-REDUCTION     482 17287.096149 17287.080130
#> EXTENSION        484 17287.095793 17287.074699
#> LO-REDUCTION     486 17287.093227 17287.074699
#> LO-REDUCTION     488 17287.090721 17287.074699
#> LO-REDUCTION     490 17287.086551 17287.074699
#> REFLECTION       492 17287.086412 17287.074418
#> LO-REDUCTION     494 17287.081849 17287.074418
#> LO-REDUCTION     496 17287.080130 17287.074418
#> REFLECTION       498 17287.077153 17287.073950
#> REFLECTION       500 17287.076483 17287.072432
#> LO-REDUCTION     502 17287.075898 17287.072432
#> LO-REDUCTION     504 17287.075565 17287.072432
#> LO-REDUCTION     506 17287.075074 17287.072432
#> LO-REDUCTION     508 17287.074699 17287.072432
#> LO-REDUCTION     510 17287.074418 17287.072432
#> HI-REDUCTION     512 17287.073950 17287.072432
#> LO-REDUCTION     514 17287.073404 17287.072432
#> EXTENSION        516 17287.072984 17287.071689
#> LO-REDUCTION     518 17287.072957 17287.071689
#> EXTENSION        520 17287.072911 17287.071281
#> LO-REDUCTION     522 17287.072872 17287.071281
#> EXTENSION        524 17287.072566 17287.070932
#> LO-REDUCTION     526 17287.072496 17287.070932
#> REFLECTION       528 17287.072432 17287.070705
#> LO-REDUCTION     530 17287.072040 17287.070705
#> EXTENSION        532 17287.071689 17287.069928
#> LO-REDUCTION     534 17287.071602 17287.069928
#> LO-REDUCTION     536 17287.071281 17287.069928
#> LO-REDUCTION     538 17287.071085 17287.069928
#> REFLECTION       540 17287.070932 17287.069704
#> LO-REDUCTION     542 17287.070818 17287.069704
#> REFLECTION       544 17287.070705 17287.069688
#> REFLECTION       546 17287.070279 17287.069497
#> REFLECTION       548 17287.070113 17287.069324
#> HI-REDUCTION     550 17287.070091 17287.069324
#> LO-REDUCTION     552 17287.070013 17287.069324
#> REFLECTION       554 17287.069928 17287.069224
#> LO-REDUCTION     556 17287.069704 17287.069224
#> EXTENSION        558 17287.069688 17287.069019
#> EXTENSION        560 17287.069605 17287.068864
#> LO-REDUCTION     562 17287.069502 17287.068864
#> LO-REDUCTION     564 17287.069497 17287.068864
#> EXTENSION        566 17287.069324 17287.068482
#> LO-REDUCTION     568 17287.069285 17287.068482
#> EXTENSION        570 17287.069224 17287.068173
#> EXTENSION        572 17287.069019 17287.067727
#> LO-REDUCTION     574 17287.068980 17287.067727
#> EXTENSION        576 17287.068959 17287.066896
#> EXTENSION        578 17287.068864 17287.065841
#> LO-REDUCTION     580 17287.068771 17287.065841
#> EXTENSION        582 17287.068482 17287.064747
#> LO-REDUCTION     584 17287.068173 17287.064747
#> EXTENSION        586 17287.067970 17287.063322
#> LO-REDUCTION     588 17287.067727 17287.063322
#> EXTENSION        590 17287.066896 17287.061580
#> LO-REDUCTION     592 17287.066709 17287.061580
#> LO-REDUCTION     594 17287.066510 17287.061580
#> EXTENSION        596 17287.065841 17287.057767
#> LO-REDUCTION     598 17287.064747 17287.057767
#> EXTENSION        600 17287.063788 17287.053859
#> LO-REDUCTION     602 17287.063322 17287.053859
#> LO-REDUCTION     604 17287.062792 17287.053859
#> LO-REDUCTION     606 17287.062126 17287.053859
#> EXTENSION        608 17287.061580 17287.050184
#> EXTENSION        610 17287.058435 17287.045782
#> LO-REDUCTION     612 17287.057767 17287.045782
#> LO-REDUCTION     614 17287.057359 17287.045782
#> EXTENSION        616 17287.057193 17287.037166
#> LO-REDUCTION     618 17287.053911 17287.037166
#> LO-REDUCTION     620 17287.053859 17287.037166
#> LO-REDUCTION     622 17287.050184 17287.037166
#> LO-REDUCTION     624 17287.048790 17287.037166
#> EXTENSION        626 17287.046966 17287.027006
#> LO-REDUCTION     628 17287.045782 17287.027006
#> EXTENSION        630 17287.041585 17287.015525
#> LO-REDUCTION     632 17287.037975 17287.015525
#> LO-REDUCTION     634 17287.037776 17287.015525
#> LO-REDUCTION     636 17287.037537 17287.015525
#> EXTENSION        638 17287.037166 17287.002045
#> LO-REDUCTION     640 17287.029130 17287.002045
#> LO-REDUCTION     642 17287.027006 17287.002045
#> EXTENSION        644 17287.021112 17286.982852
#> LO-REDUCTION     646 17287.020505 17286.982852
#> LO-REDUCTION     648 17287.020369 17286.982852
#> EXTENSION        650 17287.015525 17286.962921
#> LO-REDUCTION     652 17287.006234 17286.962921
#> LO-REDUCTION     654 17287.003316 17286.962921
#> LO-REDUCTION     656 17287.002045 17286.962921
#> LO-REDUCTION     658 17286.999773 17286.962921
#> EXTENSION        660 17286.994995 17286.933690
#> LO-REDUCTION     662 17286.982852 17286.933690
#> LO-REDUCTION     664 17286.977870 17286.933690
#> LO-REDUCTION     666 17286.975300 17286.933690
#> LO-REDUCTION     668 17286.974307 17286.933690
#> EXTENSION        670 17286.969127 17286.897589
#> LO-REDUCTION     672 17286.962921 17286.897589
#> LO-REDUCTION     674 17286.956414 17286.897589
#> EXTENSION        676 17286.943420 17286.853826
#> LO-REDUCTION     678 17286.937677 17286.853826
#> LO-REDUCTION     680 17286.935539 17286.853826
#> EXTENSION        682 17286.933690 17286.809961
#> LO-REDUCTION     684 17286.911841 17286.809961
#> LO-REDUCTION     686 17286.906427 17286.809961
#> EXTENSION        688 17286.897589 17286.764565
#> EXTENSION        690 17286.877846 17286.704552
#> LO-REDUCTION     692 17286.872431 17286.704552
#> EXTENSION        694 17286.853826 17286.604256
#> LO-REDUCTION     696 17286.822022 17286.604256
#> LO-REDUCTION     698 17286.811029 17286.604256
#> LO-REDUCTION     700 17286.809961 17286.604256
#> EXTENSION        702 17286.764565 17286.493497
#> LO-REDUCTION     704 17286.736274 17286.493497
#> EXTENSION        706 17286.704552 17286.355870
#> LO-REDUCTION     708 17286.657383 17286.355870
#> LO-REDUCTION     710 17286.628318 17286.355870
#> EXTENSION        712 17286.621506 17286.141094
#> LO-REDUCTION     714 17286.604256 17286.141094
#> LO-REDUCTION     716 17286.503365 17286.141094
#> LO-REDUCTION     718 17286.493497 17286.141094
#> EXTENSION        720 17286.414390 17285.877458
#> LO-REDUCTION     722 17286.366513 17285.877458
#> LO-REDUCTION     724 17286.355870 17285.877458
#> LO-REDUCTION     726 17286.251140 17285.877458
#> EXTENSION        728 17286.244105 17285.662233
#> EXTENSION        730 17286.223061 17285.393569
#> LO-REDUCTION     732 17286.141094 17285.393569
#> LO-REDUCTION     734 17285.992524 17285.393569
#> LO-REDUCTION     736 17285.964793 17285.393569
#> EXTENSION        738 17285.889261 17284.967393
#> LO-REDUCTION     740 17285.877458 17284.967393
#> EXTENSION        742 17285.662233 17284.823201
#> EXTENSION        744 17285.587027 17284.571260
#> EXTENSION        746 17285.430175 17284.208648
#> LO-REDUCTION     748 17285.411444 17284.208648
#> LO-REDUCTION     750 17285.393569 17284.208648
#> LO-REDUCTION     752 17285.061522 17284.208648
#> LO-REDUCTION     754 17284.967393 17284.208648
#> LO-REDUCTION     756 17284.823201 17284.208648
#> LO-REDUCTION     758 17284.663334 17284.208648
#> EXTENSION        760 17284.571260 17284.082158
#> EXTENSION        762 17284.471298 17283.933635
#> REFLECTION       764 17284.370237 17283.918499
#> LO-REDUCTION     766 17284.326906 17283.918499
#> LO-REDUCTION     768 17284.242090 17283.918499
#> LO-REDUCTION     770 17284.232223 17283.918499
#> EXTENSION        772 17284.208648 17283.684327
#> HI-REDUCTION     774 17284.082158 17283.684327
#> LO-REDUCTION     776 17284.006206 17283.684327
#> LO-REDUCTION     778 17283.954602 17283.684327
#> LO-REDUCTION     780 17283.943643 17283.684327
#> EXTENSION        782 17283.941150 17283.629046
#> LO-REDUCTION     784 17283.933635 17283.629046
#> EXTENSION        786 17283.918499 17283.432835
#> LO-REDUCTION     788 17283.855866 17283.432835
#> LO-REDUCTION     790 17283.737787 17283.432835
#> EXTENSION        792 17283.707141 17283.177089
#> LO-REDUCTION     794 17283.691961 17283.177089
#> LO-REDUCTION     796 17283.684327 17283.177089
#> LO-REDUCTION     798 17283.629046 17283.177089
#> EXTENSION        800 17283.480788 17283.019801
#> EXTENSION        802 17283.464836 17282.857567
#> LO-REDUCTION     804 17283.432835 17282.857567
#> LO-REDUCTION     806 17283.427379 17282.857567
#> REFLECTION       808 17283.273719 17282.810344
#> LO-REDUCTION     810 17283.192011 17282.810344
#> EXTENSION        812 17283.177089 17282.773026
#> LO-REDUCTION     814 17283.100216 17282.773026
#> EXTENSION        816 17283.019801 17282.431349
#> LO-REDUCTION     818 17282.924383 17282.431349
#> LO-REDUCTION     820 17282.882981 17282.431349
#> LO-REDUCTION     822 17282.857567 17282.431349
#> LO-REDUCTION     824 17282.826993 17282.431349
#> EXTENSION        826 17282.810344 17282.134774
#> LO-REDUCTION     828 17282.773026 17282.134774
#> LO-REDUCTION     830 17282.673630 17282.134774
#> EXTENSION        832 17282.551561 17281.917158
#> LO-REDUCTION     834 17282.507799 17281.917158
#> LO-REDUCTION     836 17282.441034 17281.917158
#> EXTENSION        838 17282.431349 17281.805083
#> EXTENSION        840 17282.328334 17281.485386
#> LO-REDUCTION     842 17282.207399 17281.485386
#> EXTENSION        844 17282.134774 17281.177471
#> LO-REDUCTION     846 17281.971962 17281.177471
#> LO-REDUCTION     848 17281.930907 17281.177471
#> LO-REDUCTION     850 17281.917158 17281.177471
#> LO-REDUCTION     852 17281.805083 17281.177471
#> LO-REDUCTION     854 17281.637799 17281.177471
#> LO-REDUCTION     856 17281.542150 17281.177471
#> LO-REDUCTION     858 17281.485386 17281.177471
#> LO-REDUCTION     860 17281.450381 17281.177471
#> EXTENSION        862 17281.448390 17280.945651
#> LO-REDUCTION     864 17281.369614 17280.945651
#> LO-REDUCTION     866 17281.303934 17280.945651
#> EXTENSION        868 17281.250802 17280.800450
#> LO-REDUCTION     870 17281.238533 17280.800450
#> LO-REDUCTION     872 17281.223261 17280.800450
#> EXTENSION        874 17281.177471 17280.488585
#> LO-REDUCTION     876 17281.024470 17280.488585
#> LO-REDUCTION     878 17280.950337 17280.488585
#> LO-REDUCTION     880 17280.945651 17280.488585
#> LO-REDUCTION     882 17280.939912 17280.488585
#> LO-REDUCTION     884 17280.816046 17280.488585
#> LO-REDUCTION     886 17280.800450 17280.488585
#> REFLECTION       888 17280.738456 17280.447101
#> REFLECTION       890 17280.664631 17280.390325
#> LO-REDUCTION     892 17280.577249 17280.390325
#> LO-REDUCTION     894 17280.566783 17280.390325
#> LO-REDUCTION     896 17280.538991 17280.390325
#> LO-REDUCTION     898 17280.511395 17280.390325
#> LO-REDUCTION     900 17280.488585 17280.390325
#> LO-REDUCTION     902 17280.459344 17280.390325
#> LO-REDUCTION     904 17280.447101 17280.390325
#> REFLECTION       906 17280.446459 17280.382631
#> LO-REDUCTION     908 17280.422497 17280.382631
#> LO-REDUCTION     910 17280.420917 17280.381469
#> LO-REDUCTION     912 17280.406656 17280.381469
#> LO-REDUCTION     914 17280.406429 17280.381469
#> LO-REDUCTION     916 17280.394455 17280.379401
#> LO-REDUCTION     918 17280.390325 17280.379401
#> HI-REDUCTION     920 17280.388261 17280.379401
#> REFLECTION       922 17280.387657 17280.375301
#> LO-REDUCTION     924 17280.387214 17280.375301
#> REFLECTION       926 17280.382631 17280.374067
#> HI-REDUCTION     928 17280.382496 17280.374067
#> REFLECTION       930 17280.381469 17280.373924
#> LO-REDUCTION     932 17280.379656 17280.373924
#> HI-REDUCTION     934 17280.379401 17280.373924
#> LO-REDUCTION     936 17280.377978 17280.373666
#> REFLECTION       938 17280.377712 17280.372196
#> EXTENSION        940 17280.375615 17280.367543
#> LO-REDUCTION     942 17280.375301 17280.367543
#> LO-REDUCTION     944 17280.374265 17280.367543
#> LO-REDUCTION     946 17280.374067 17280.367543
#> LO-REDUCTION     948 17280.373924 17280.367543
#> REFLECTION       950 17280.373666 17280.366638
#> REFLECTION       952 17280.372196 17280.366393
#> LO-REDUCTION     954 17280.370941 17280.366393
#> EXTENSION        956 17280.369039 17280.360078
#> LO-REDUCTION     958 17280.368943 17280.360078
#> LO-REDUCTION     960 17280.368258 17280.360078
#> LO-REDUCTION     962 17280.367543 17280.360078
#> LO-REDUCTION     964 17280.366638 17280.360078
#> LO-REDUCTION     966 17280.366395 17280.360078
#> EXTENSION        968 17280.366393 17280.358415
#> EXTENSION        970 17280.364882 17280.356231
#> EXTENSION        972 17280.364704 17280.352703
#> EXTENSION        974 17280.361998 17280.350130
#> EXTENSION        976 17280.360738 17280.346336
#> LO-REDUCTION     978 17280.360395 17280.346336
#> LO-REDUCTION     980 17280.360078 17280.346336
#> REFLECTION       982 17280.358415 17280.344675
#> LO-REDUCTION     984 17280.356231 17280.344675
#> LO-REDUCTION     986 17280.352703 17280.344675
#> LO-REDUCTION     988 17280.350130 17280.344675
#> LO-REDUCTION     990 17280.349130 17280.344675
#> EXTENSION        992 17280.348501 17280.341423
#> LO-REDUCTION     994 17280.347252 17280.341423
#> LO-REDUCTION     996 17280.346336 17280.341423
#> LO-REDUCTION     998 17280.346022 17280.341423
#> LO-REDUCTION    1000 17280.345873 17280.341423
#> Exiting from Nelder Mead minimizer
#>     1002 function evaluations used
# extract results

# regression coefficient vector
res2$beta
#> [1] -2.64077894  0.62582753 -0.08293688  0.53247068 -0.12437312 -0.16298836
#> [7] -1.10550227


# check likelihood
-ll(res2$beta)
#> [1] -17280.34

# number of calls to the log-likelihood function,
# since optim does not return the number of iterations
res2$iter
#> function gradient 
#>     1002       NA 

# optim does not calculate working weights
head(res2$weights)
#> [1] 1
# }