Computes forecast combination weights using the trimmed bias-corrected eigenvector approach of Hsiao and Wan (2014) and, if a test set is provided, produces forecasts for it.
comb_EIG4(x, ntop_pred = NULL, criterion = "RMSE")
x: An object of class foreccomb. Contains the training set (actual values + matrix of model forecasts) and optionally a test set.
ntop_pred: Specifies the number of retained predictors. If NULL (default), the inbuilt optimization algorithm selects this number.
criterion: If ntop_pred is not specified, a selection criterion is required for the optimization algorithm: one of "MAE", "MAPE", or "RMSE". If ntop_pred is selected by the user, criterion should be set to NULL (default).
Returns an object of class foreccomb_res with the following components:
Method: Returns the used forecast combination method.
Models: Returns the individual input models that were used for the forecast combinations.
Intercept: Returns the intercept (bias correction).
Weights: Returns the combination weights obtained by applying the combination method to the training set.
Top_Predictors: Number of retained predictors.
Ranking: Ranking of the predictors that determines which models are removed in the trimming step.
Fitted: Returns the fitted values of the combination method for the training set.
Accuracy_Train: Returns a range of summary measures of the forecast accuracy for the training set.
Forecasts_Test: Returns forecasts produced by the combination method for the test set. Only returned if the input included a forecast matrix for the test set.
Accuracy_Test: Returns a range of summary measures of the forecast accuracy for the test set. Only returned if the input included a forecast matrix and a vector of actual values for the test set.
Input_Data: Returns the data forwarded to the method.
The underlying methodology of the trimmed bias-corrected eigenvector approach by Hsiao and Wan (2014) is the same as that of their bias-corrected eigenvector approach. The only difference is that the trimmed variant pre-selects the models that serve as input for the forecast combination: only a subset of the available forecast models is retained, while the models with the worst performance are discarded.
The number of retained forecast models is controlled via ntop_pred. The user can either choose this number directly or leave the selection to the inbuilt optimization algorithm (by setting ntop_pred = NULL). If the optimization algorithm is to select the best number of retained models, the user must specify the optimization criterion: MAE, MAPE, or RMSE. After this trimming step, the weights, the intercept, and the combined forecast are computed in the same way as in the bias-corrected eigenvector approach.
The trimmed bias-corrected eigenvector approach thus combines the strengths of the bias-corrected eigenvector approach and the trimmed eigenvector approach.
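The trimming idea described above can be illustrated with a minimal, self-contained sketch. This mimics the ranking-and-retention step only and is not the package's internal code; RMSE is used here as the ranking criterion, as one of the three available options:

```r
# Illustrative sketch of the trimming step (not ForecastComb internals):
# rank the individual models by training-set RMSE and retain the best k.
obs   <- rnorm(100)                          # simulated actual values
preds <- matrix(rnorm(1000, 1), 100, 10)     # simulated forecasts of 10 models
rmse  <- apply(preds, 2, function(f) sqrt(mean((obs - f)^2)))
keep  <- order(rmse)[1:2]                    # indices of the 2 best models; the rest are trimmed
```

The eigenvector weights and the bias-correcting intercept are then estimated on the retained columns `preds[, keep]` only.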
Hsiao, C., and Wan, S. K. (2014). Is There An Optimal Forecast Combination? Journal of Econometrics, 178(2), 294--309.
obs <- rnorm(100)
preds <- matrix(rnorm(1000, 1), 100, 10)
train_o <- obs[1:80]
train_p <- preds[1:80, ]
test_o <- obs[81:100]
test_p <- preds[81:100, ]
## Number of retained models selected by the user:
data <- foreccomb(train_o, train_p, test_o, test_p)
comb_EIG4(data, ntop_pred = 2, criterion = NULL)
#> $Method
#> [1] "Trimmed Bias-Corrected Eigenvector Approach"
#>
#> $Models
#> [1] "Series 1" "Series 2" "Series 3" "Series 4" "Series 5" "Series 6"
#> [7] "Series 7" "Series 8" "Series 9" "Series 10"
#>
#> $Fitted
#> Time Series:
#> Start = 1
#> End = 80
#> Frequency = 1
#> [1] 1.335063601 -0.644402863 0.328956016 0.348816087 -0.622624239
#> [6] -0.158612192 -0.175035127 -0.310570441 0.115171899 0.306931366
#> [11] -1.079815471 0.017972635 -0.465935734 -0.871305224 -0.799094094
#> [16] -0.723282684 0.937945433 0.980787807 0.585427889 -0.505934682
#> [21] -1.037574059 -0.441399029 0.261242181 -0.206246816 -1.383117334
#> [26] 0.727166603 0.531836406 -0.563403972 -0.161774697 0.494777217
#> [31] 1.771562418 1.665781476 -0.612269599 0.493072774 1.198774526
#> [36] -0.023118604 -0.202766782 -0.559537861 -0.723417736 1.562707632
#> [41] 0.681430642 1.206483024 -1.109878045 0.507411927 0.587406573
#> [46] -0.261022043 0.024605345 0.566618079 0.056551758 1.201987126
#> [51] -0.201336091 -0.004775891 0.454740588 -0.477868146 0.815272943
#> [56] 1.446328166 0.239345776 -0.003389148 0.770581945 0.307590720
#> [61] 0.116559144 0.595001711 0.397686109 -0.256410329 0.354825958
#> [66] -0.508141108 0.045626799 -0.418520940 -0.509666637 -0.610703802
#> [71] -0.738245285 -0.582202875 1.827022151 -0.055176968 0.943166432
#> [76] 1.454457013 1.161106509 0.174711446 1.382156327 0.639326077
#>
#> $Accuracy_Train
#> ME RMSE MAE MPE MAPE ACF1
#> Test set -3.75134e-18 1.053826 0.8396734 76.56726 281.7393 -0.05380646
#> Theil's U
#> Test set 0.4137399
#>
#> $Input_Data
#> $Input_Data$Actual_Train
#> Time Series:
#> Start = 1
#> End = 80
#> Frequency = 1
#> [1] 0.313640277 0.099394436 0.571102495 -0.493303419 0.740020795
#> [6] -1.155857183 0.659237226 -0.244471592 -1.075666053 0.237334112
#> [11] -0.536922320 0.300644865 -0.682604279 -1.096695753 0.108056945
#> [16] 0.660196786 -0.159514144 0.924566330 0.847633466 0.240674695
#> [21] 0.661566069 -0.673336129 0.163818099 -0.728350759 0.763825227
#> [26] -1.369016542 0.669089877 0.240242204 0.222608890 -1.598929516
#> [31] 0.074311585 -0.283600634 0.524440134 0.587794507 -0.328265236
#> [36] 0.002751359 -0.593736447 -1.020164111 0.350226390 1.035387355
#> [41] 1.483234452 0.769098040 0.053119010 -1.015176799 1.890509601
#> [46] 1.177205664 0.696205297 -0.164212773 0.003891041 1.264764129
#> [51] 0.536097954 1.619151199 1.661912574 -0.141171950 -0.124441537
#> [56] 1.333175527 1.189194927 1.865422197 -1.411514849 0.170540146
#> [61] -0.345455850 0.191509854 -0.108988388 -2.150607787 0.531117718
#> [66] 1.700541897 -0.100690935 0.992127249 -0.018333892 0.448814282
#> [71] -0.951187908 -0.248644023 0.338648336 -0.486257148 -0.878755079
#> [76] 0.978506032 0.671988924 0.236001915 2.187088647 -1.189139992
#>
#> $Input_Data$Forecasts_Train
#> Time Series:
#> Start = 1
#> End = 80
#> Frequency = 1
#> Series 1 Series 2 Series 3 Series 4 Series 5
#> 1 1.96865312 2.2446423369 1.2317372938 1.073472202 2.736812526
#> 2 0.18020626 2.1014275251 2.6480743550 -0.379097382 -1.308020822
#> 3 0.60627049 1.7111920422 -0.2680904218 1.390321293 -0.044934878
#> 4 -1.16396541 -0.7602015024 1.8999077442 0.873260503 0.156264489
#> 5 1.97064910 1.0160730284 1.0961882925 -0.594987464 -0.179676073
#> 6 1.12730731 1.9409703402 0.5225027703 -0.486629375 0.788944918
#> 7 0.20303963 0.6928244798 1.6734746158 1.543237557 0.847169123
#> 8 0.41085597 0.7857560417 0.7613950961 0.853856739 0.593755513
#> 9 2.44459913 0.9145856463 0.7597119703 1.515236058 0.894626922
#> 10 1.79142336 0.8268979380 1.4158278656 1.476046632 0.850480623
#> 11 1.60694849 0.9745041985 1.1586077876 -0.229833435 0.202130597
#> 12 0.99886358 0.9512123796 1.8833879094 1.892031183 0.626046336
#> 13 2.37549513 0.6009851326 1.9897073164 2.247329194 0.245537312
#> 14 1.27539604 2.3365197787 1.2799171455 0.568533798 -0.258135951
#> 15 0.30864739 0.6483725880 1.9354254992 0.006062263 -0.034810489
#> 16 1.80521768 0.8801045812 -0.4285511338 0.937209829 0.436221414
#> 17 1.20214055 2.0480292237 2.7780004902 1.302754976 2.341576341
#> 18 1.12380019 2.3021164697 0.7591538794 -0.002338710 2.154222097
#> 19 1.99787188 1.5666581597 0.8573461962 1.308398985 0.628592573
#> 20 0.70203255 1.3487243227 2.6918088265 0.727337181 0.962864734
#> 21 -0.07425045 1.2172936453 2.4222605591 1.248447101 -0.617928028
#> 22 2.10163216 -0.0932411607 2.4687847038 1.176871161 -0.208053225
#> 23 0.60612170 0.6265697205 1.0162787649 -0.515818006 0.312705255
#> 24 1.55476638 1.1827660647 1.9173290051 0.100275464 -0.106083424
#> 25 1.11998851 0.9239717564 1.8291312939 1.875530351 -1.113776869
#> 26 1.51626857 0.4662993205 0.1977654583 1.420843492 1.781112348
#> 27 2.02704216 -0.1259466128 1.4499638612 2.287537376 1.882248022
#> 28 1.73166309 -0.8921976795 1.3051085721 1.061325881 0.098254422
#> 29 0.88754969 0.5395927407 2.2529292193 0.874989746 0.762082373
#> 30 2.42917302 -0.0004400262 -1.3415260845 1.140517911 2.398710985
#> 31 1.89530083 -0.9081040814 1.4082198511 0.407290765 2.598698114
#> 32 -0.50043104 1.5301821303 0.7579575260 2.701228886 2.884503036
#> 33 0.29265949 0.9536132918 1.5330827842 1.424448357 -0.778236014
#> 34 0.88444052 2.8730280890 2.1890457225 1.593755265 2.112988147
#> 35 1.08755882 1.9976233704 0.9618108293 0.851307434 2.105230384
#> 36 1.39420481 0.8999083061 1.8988936445 1.580500240 0.789172603
#> 37 0.39115956 1.3277712896 0.4772511053 0.303187650 0.419227073
#> 38 0.34886577 1.5405269503 0.2547668425 0.810555137 0.024491550
#> 39 2.48379535 1.6305111258 0.6624423417 2.189252033 -0.444738489
#> 40 1.26621809 1.4618450676 2.1506074866 1.378640532 2.483263658
#> 41 0.40012924 0.1024411143 1.6382207356 0.804581938 2.355249111
#> 42 2.46834368 1.5363210222 0.0532502269 -0.482625530 2.708018036
#> 43 0.38016321 0.8603891234 0.8612951529 0.127441212 -0.258808375
#> 44 1.56684614 0.6236072163 0.1791188111 3.221341483 0.047873051
#> 45 2.38543341 0.8405456105 2.7852701711 -0.157051448 1.545235153
#> 46 2.39847921 0.1898131114 0.5618313505 2.070486157 0.005157065
#> 47 -0.76108035 1.6326611309 0.3177582548 2.246061380 -0.234830507
#> 48 0.50721066 0.7485194848 0.3688252576 0.750203170 1.697818168
#> 49 1.85439446 1.8205041997 2.2535207590 1.363077849 0.579955971
#> 50 1.38592556 1.4281143006 0.1757042920 1.904834618 3.098480585
#> 51 3.61835203 0.0423233566 -1.2514922315 0.140416655 0.859823077
#> 52 0.88371788 -0.9194753961 1.1722346485 0.778447381 -0.259911900
#> 53 1.87622538 0.5676733473 2.0256499762 1.453850617 0.390422086
#> 54 2.09069477 -0.0589213876 2.0334431254 1.039029817 0.619152542
#> 55 2.89174186 1.1692968665 1.9513296525 1.375649585 1.118707642
#> 56 1.41127850 -0.7832504600 1.1503630497 0.697308384 2.593905788
#> 57 0.08029804 1.0357061652 1.0601284941 -1.773460943 1.461103185
#> 58 0.18281624 0.0205677124 -0.8465681148 0.737464444 1.107043141
#> 59 1.47891822 2.2542036442 1.0241911468 0.738292819 1.709232438
#> 60 0.53797561 -0.1315097093 1.9233125002 -0.275719119 0.790857565
#> 61 1.78143265 3.1078076233 1.1385579322 1.174430297 1.214125203
#> 62 0.84247034 0.1087309070 0.0001031517 2.470511612 1.820240635
#> 63 0.11578136 2.2806063620 1.1390665440 0.452119850 0.226161718
#> 64 2.30983189 0.5709780328 1.7744851363 0.020588019 -0.343105244
#> 65 0.25846317 -0.0270144236 1.6105194979 0.792629718 1.000195208
#> 66 -0.17288098 1.7844363375 1.1359310120 0.136547952 -0.604806873
#> 67 0.73415528 1.1267036790 2.1804394700 0.304999773 2.979400988
#> 68 2.11832689 1.0642447897 -1.2696083958 1.699632298 1.117316369
#> 69 -0.75666350 1.6424298326 1.9218732438 2.956360739 -0.347508097
#> 70 -0.45267787 1.0919274157 1.0327068023 1.072339490 0.935929620
#> 71 -0.17016771 2.0681802887 0.7823345855 -1.445122538 -0.554053290
#> 72 0.01747620 1.2765818487 2.3373263108 2.000235916 -0.284473481
#> 73 0.73615647 1.3129339008 -0.0361967214 1.505483338 2.714983645
#> 74 1.89144848 2.0994624751 -1.1164038186 1.120519237 -0.193269619
#> 75 1.18169908 2.0942126967 0.4357225764 1.350294546 1.760966084
#> 76 0.17057424 3.1868529745 0.4783049302 1.512846500 2.343785712
#> 77 1.40949797 -0.2479677883 0.3942210702 3.692051882 1.201280788
#> 78 0.81590048 1.0491689020 -0.8325568245 0.930075188 -0.054762689
#> 79 0.74052779 1.5045447613 -0.5274198716 -0.261517442 2.623496151
#> 80 -0.54151832 2.6777358200 1.8160125574 -0.235984454 1.959824434
#> Series 6 Series 7 Series 8 Series 9 Series 10
#> 1 1.40132604 0.37646842 -0.24411216 0.72323535 0.83093430
#> 2 1.89382073 2.48550434 -0.35000385 0.89983076 0.50879653
#> 3 2.52041601 1.99985895 0.63563039 0.86526860 0.62733796
#> 4 2.32323779 2.17264086 0.74924441 0.03061484 1.64291495
#> 5 0.59119133 2.42377756 1.79426445 1.87147108 1.49967663
#> 6 0.45125562 2.15075366 1.17485962 0.81116262 1.86533956
#> 7 0.34549028 1.54222019 1.24769208 2.84257824 1.33302189
#> 8 0.35103914 1.53515714 1.31831530 0.86489373 1.45650526
#> 9 0.92623487 1.92464301 0.56569279 1.11264989 2.10531543
#> 10 1.40034121 1.96977288 1.42623013 0.02850605 0.11583562
#> 11 -0.87016543 -0.26978483 3.13046361 2.38512199 1.19641173
#> 12 1.03415557 1.82679603 1.28619021 1.82974494 2.32082436
#> 13 0.42649235 0.18470645 1.04430375 2.53012021 1.96463209
#> 14 0.13877900 2.10815917 1.73244263 0.01250418 0.32283750
#> 15 0.03012699 -1.76382532 0.92127741 1.71381499 0.85274798
#> 16 -0.36708638 1.14275097 0.99242409 0.93363723 0.68281709
#> 17 1.00195540 0.26125687 1.30990868 1.10583897 0.14815402
#> 18 1.32031158 -0.10964448 1.70575961 1.78568171 0.66790601
#> 19 2.27772951 -0.24724444 0.20770516 0.92210780 1.86661050
#> 20 -0.51992249 1.05191535 2.25781734 -1.80435340 -0.51548398
#> 21 0.20413097 0.20057854 1.47378305 -0.28473533 1.49724359
#> 22 1.02328185 0.96559253 0.72949296 1.30166873 0.14825251
#> 23 1.94361176 1.25767581 1.38011371 0.88405716 0.23792503
#> 24 1.41783516 1.09144973 1.88185547 2.14286398 1.53770802
#> 25 0.03848286 3.79674885 1.54881250 1.69885978 1.16825697
#> 26 1.20970179 2.46182096 0.31200203 -0.16333250 0.40341920
#> 27 0.65954257 1.25865343 2.06783066 1.47945858 0.11993173
#> 28 0.38864542 2.37582304 1.45161615 1.87639430 0.67692068
#> 29 0.47645881 0.14290013 -0.14881513 1.83059466 0.67849630
#> 30 -0.04000679 -0.12952958 1.64756408 1.85208880 0.41335302
#> 31 2.52555816 1.02333506 0.52605308 0.47282302 4.10197629
#> 32 1.95110259 1.65314549 1.21423157 1.23292009 -0.54514600
#> 33 1.33033329 1.99429107 1.84778699 1.45245792 3.30585613
#> 34 0.29821987 0.32755356 1.19456816 0.46183082 1.92697061
#> 35 1.85783490 1.96473030 1.91344304 0.45173799 0.77989585
#> 36 0.74864395 4.45321707 1.61826960 -0.27377083 1.91080294
#> 37 0.79675563 -0.63342748 1.70621431 0.76663341 2.26803334
#> 38 0.48542279 0.86327978 0.82820022 0.18614931 1.47585548
#> 39 0.68700522 0.83154230 0.96514060 1.47302415 2.10212405
#> 40 2.20489204 1.30333270 0.74012145 0.79634941 1.14459244
#> 41 0.42206312 0.79945050 0.91997319 1.02554109 0.82517923
#> 42 1.15331519 2.43852476 0.40240700 -0.47943750 0.22569294
#> 43 -0.38452799 -0.10511903 0.11223024 1.84200869 0.43160438
#> 44 2.80138082 1.95627173 0.80953716 0.92592723 -0.47365249
#> 45 1.18498084 -0.86565201 0.12645138 1.24927217 2.29635646
#> 46 1.16436187 -1.10968041 1.27421090 0.86192134 0.67991440
#> 47 2.07907834 1.58948920 2.38556056 0.87653115 1.12599109
#> 48 0.95669041 1.77122633 0.08390879 0.58607518 0.27606396
#> 49 1.17407246 -0.28491169 1.88948625 0.12143076 1.45288752
#> 50 0.67610815 0.27065208 -0.09563775 0.68912011 -0.59154748
#> 51 0.27256562 0.95186291 -0.05525544 -0.31072884 0.96826032
#> 52 2.04455078 1.17949086 1.70347821 -0.10291258 -0.15996667
#> 53 2.27568485 0.48361356 1.00316205 1.23118055 2.95394186
#> 54 -0.04688776 3.67819315 0.12479238 0.37056467 -0.79987173
#> 55 2.19606702 -0.48970482 -0.13789391 2.08529626 0.89807211
#> 56 1.81679883 1.05969581 0.12193681 0.33935418 1.90432320
#> 57 0.52103340 0.98634290 -0.23184091 0.27824269 2.04921417
#> 58 0.41153939 0.02145690 0.65287291 1.56549907 1.44082486
#> 59 1.39110975 1.77920914 1.86562677 1.44530804 3.93591730
#> 60 1.47315035 2.00547855 -0.82146488 1.61267604 -0.66494640
#> 61 0.54688679 1.16981226 0.62379285 0.85495612 1.12991283
#> 62 0.87252244 0.31874027 3.15838990 1.72965561 0.51670322
#> 63 2.34694113 0.43107677 0.90353409 1.19591266 2.16167881
#> 64 1.59131543 0.01868282 1.08434945 0.31079101 1.68938236
#> 65 1.32637123 -0.52995077 2.14050509 0.10100857 1.38200514
#> 66 1.35151839 -0.89544284 3.01686481 -0.24661791 3.00677129
#> 67 -1.72173507 1.00690779 1.51823111 0.23330553 2.24764083
#> 68 -0.51274392 2.28109127 2.38252496 1.74700846 0.70127033
#> 69 1.04021564 -0.02291424 -1.22120021 2.69972293 1.42455172
#> 70 -0.71784825 1.01175988 0.68387929 0.67331275 0.07808633
#> 71 0.78526596 0.62858423 2.57948024 -0.46360508 -0.27906183
#> 72 0.80541967 0.63402075 0.33676302 1.53220604 0.61992347
#> 73 2.50821774 0.86671401 2.10322702 1.36935852 1.45239553
#> 74 1.85406484 0.63983599 -0.17100972 1.23896496 0.62001881
#> 75 1.70833614 1.26889896 0.79756314 1.34567917 0.91847009
#> 76 2.13401625 0.38806393 1.20918533 -1.38776611 0.04454325
#> 77 2.85698723 1.05960047 2.15901204 1.21242756 1.13071180
#> 78 2.19332437 1.07756764 0.23518878 1.35240629 0.89069058
#> 79 1.64040648 0.18655019 0.06948791 -0.31142217 1.25588509
#> 80 0.80283431 0.70677923 2.27564272 2.58733337 -0.10263718
#>
#> $Input_Data$Actual_Test
#> [1] 1.07920317 0.96931102 1.73325205 -0.47410753 0.94508983 0.84511079
#> [7] -0.39175260 1.30957807 0.01821672 1.30593031 0.72810676 -0.34879933
#> [13] -0.23721267 0.48875874 -0.30467485 -2.28620298 0.22212410 2.40701415
#> [19] 0.06481441 0.56561648
#>
#> $Input_Data$Forecasts_Test
#> Series 1 Series 2 Series 3 Series 4 Series 5 Series 6
#> [1,] 1.984272347 0.794773868 -0.54015484 0.7250048 1.4622399 1.89279736
#> [2,] 0.851731723 0.563581776 0.13426273 1.1676041 1.0179238 -0.31745893
#> [3,] 1.732314401 1.129545811 2.65635708 2.6594422 2.0339228 1.91299488
#> [4,] 0.454453529 1.219449923 -0.04809555 1.3317109 0.7721481 1.44789205
#> [5,] 0.649160585 2.266889027 3.20627387 0.4523562 0.9917039 1.08461088
#> [6,] 2.420573801 1.498699788 2.24958761 -0.3132371 1.1227509 2.27895661
#> [7,] -0.471834409 0.748986302 1.38226280 2.6683100 1.4408438 0.94082288
#> [8,] 0.514075927 0.434876791 0.59561271 1.5357740 1.2784780 2.05983277
#> [9,] -0.138952099 0.262255178 0.62670243 0.4617262 0.6916118 2.01786716
#> [10,] 0.977416164 -0.234632021 0.99381375 1.1811973 -0.4942440 -0.13207168
#> [11,] 0.297894946 2.739732557 0.27833893 -0.5643136 1.5833295 0.17056546
#> [12,] 0.779305155 0.540085154 1.24531348 1.1596313 2.2432492 0.25215397
#> [13,] -0.007125624 2.142645849 0.06367109 1.8406637 0.1617341 0.48938226
#> [14,] 1.453441611 1.866260418 1.19268557 -0.2106703 0.8030320 -0.70134046
#> [15,] 1.221838843 3.350539729 0.15891917 1.4221186 1.0278114 0.73627308
#> [16,] 1.193158404 -0.268983703 1.87936589 0.5067160 1.4897679 0.04707730
#> [17,] 0.338735658 0.018357106 -0.19420016 2.8498748 -0.1422574 -0.69178973
#> [18,] 1.165049917 0.273525057 0.17627604 -1.1859222 0.4932551 -0.02785455
#> [19,] 0.590124302 -0.008209334 1.98382686 2.0643086 0.5049323 0.78510630
#> [20,] 1.336731248 1.096160317 2.56604591 2.3477264 1.9847252 -0.61418763
#> Series 7 Series 8 Series 9 Series 10
#> [1,] 2.11020893 3.1337818 -0.9376121 0.292617966
#> [2,] 0.25687950 1.3513524 0.2744438 0.189390411
#> [3,] 0.75459301 1.4840987 1.5542175 0.901185544
#> [4,] 0.89884004 0.4272724 -0.5103968 2.280283150
#> [5,] 0.43233703 -0.1826533 1.4684864 2.097768540
#> [6,] 0.79596281 1.3317818 1.1851125 0.785521368
#> [7,] 0.71508369 0.8974147 0.1655054 1.519372269
#> [8,] 1.96279351 2.5461030 0.4282508 2.009634496
#> [9,] 1.31991054 0.6226223 1.2484283 2.673305166
#> [10,] 1.89192678 2.3868910 2.1213590 1.233097269
#> [11,] -0.30548479 0.6441404 1.9465880 -0.895011724
#> [12,] 0.99088793 0.1423767 1.4886162 2.035629695
#> [13,] 0.08189045 1.8914055 1.2038098 -1.527954719
#> [14,] 1.07327356 1.6954669 0.7343370 1.150991392
#> [15,] 1.86261842 3.4882202 2.2722534 1.476441181
#> [16,] 0.47839400 1.4602865 1.7296138 0.001524042
#> [17,] 0.68839009 1.4541595 1.9877548 2.088733438
#> [18,] 1.86585404 2.5515230 -0.7269225 0.929403183
#> [19,] 0.48945335 2.9685234 2.1755825 2.200515874
#> [20,] 0.23268610 1.3086785 2.1752767 -0.357645995
#>
#>
#> $Predict
#> function (x, newpreds)
#> {
#> pred <- as.vector(as.vector(x$Intercept) + newpreds %*% x$Weights)
#> return(pred)
#> }
#> <bytecode: 0x56048b16e578>
#> <environment: namespace:ForecastComb>
#>
#> $Intercept
#> [1] -0.7938428
#>
#> $Weights
#> [1] 0.0000000 0.0000000 0.0000000 0.0000000 0.5448055 0.4551945 0.0000000
#> [8] 0.0000000 0.0000000 0.0000000
#>
#> $Forecasts_Test
#> [1] 0.86438449 -0.38377790 1.18503430 0.28590024 0.24015185 0.85520661
#> [7] 0.41939423 0.84030365 0.50147321 -1.12322794 0.14640427 0.54307078
#> [13] -0.48296505 -0.67559288 0.10126195 0.03922026 -1.18624428 -0.53779396
#> [19] -0.16137684 0.00787153
#>
#> $Accuracy_Test
#> ME RMSE MAE MPE MAPE
#> Test set 0.3580339 1.212531 0.9268076 5.502702 281.264
#>
#> $Top_Predictors
#> [1] 2
#>
#> $Ranking
#> [1] 7 9 8 4 1 2 10 6 3 5
#>
#> attr(,"class")
#> [1] "foreccomb_res"
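The returned Predict function (printed in the output above) applies the estimated intercept and weights to a matrix of new forecasts. A minimal sketch, assuming the `data` object from the example above and a hypothetical matrix `new_p` of out-of-sample forecasts from the same 10 models:

```r
# Store the combination result instead of printing it, then predict on new data.
fit   <- comb_EIG4(data, ntop_pred = 2, criterion = NULL)
new_p <- matrix(rnorm(50, 1), 5, 10)   # hypothetical new forecasts: 5 periods, 10 models
fit$Predict(fit, new_p)                # same as: as.vector(fit$Intercept) + new_p %*% fit$Weights
```

Note that `new_p` must contain forecasts from all original models, since the trimmed models simply receive zero weight.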
## Number of retained models selected by algorithm:
data <- foreccomb(train_o, train_p, test_o, test_p)
comb_EIG4(data, ntop_pred = NULL, criterion = "RMSE")
#> Optimization algorithm chooses number of retained models for trimmed bias-corrected eigenvector approach...
#> Algorithm finished. Optimized number of retained models: 8
#> $Method
#> [1] "Trimmed Bias-Corrected Eigenvector Approach"
#>
#> $Models
#> [1] "Series 1" "Series 2" "Series 3" "Series 4" "Series 5" "Series 6"
#> [7] "Series 7" "Series 8" "Series 9" "Series 10"
#>
#> $Fitted
#> Time Series:
#> Start = 1
#> End = 80
#> Frequency = 1
#> [1] 0.31765970 -0.32167418 -0.05915056 -0.07438384 0.19858769 -0.07436145
#> [7] 0.43806593 -0.01913400 0.45428486 0.20796839 0.28658155 0.65600624
#> [13] 0.81719115 -0.19202560 -0.10367758 -0.19465197 0.52425971 0.31648793
#> [19] 0.40341484 -0.31250269 -0.10165390 0.28467341 -0.12746679 0.49304338
#> [25] 0.24707556 -0.05158888 0.66007885 0.27213125 0.11732777 0.22040195
#> [31] 0.82020655 0.31093213 0.48585613 0.45460299 0.35810562 0.35023340
#> [37] 0.03846067 -0.29619644 0.47449016 0.62010199 0.16449325 -0.15923834
#> [43] -0.42068944 0.30523285 0.56118951 0.31477092 0.16264000 -0.23084553
#> [49] 0.48968754 0.01150619 -0.31928100 -0.09296042 0.80752970 -0.15234782
#> [55] 0.71109241 0.34010814 -0.47550604 -0.19645100 0.83011157 -0.28843327
#> [61] 0.20898507 0.58614812 0.20436563 0.21864399 0.20054775 0.09873941
#> [67] 0.17085942 0.19229888 0.15545033 -0.43672311 -0.70206874 0.10354032
#> [73] 0.64455855 -0.17170809 0.31726976 -0.12320633 0.90827654 -0.15144813
#> [79] -0.28158889 0.20810608
#>
#> $Accuracy_Train
#> ME RMSE MAE MPE MAPE ACF1
#> Test set 1.064741e-16 0.941017 0.7435324 -200.8646 465.7979 0.00560363
#> Theil's U
#> Test set 1.021127
#>
#> $Input_Data
#> $Input_Data$Actual_Train
#> Time Series:
#> Start = 1
#> End = 80
#> Frequency = 1
#> [1] 0.313640277 0.099394436 0.571102495 -0.493303419 0.740020795
#> [6] -1.155857183 0.659237226 -0.244471592 -1.075666053 0.237334112
#> [11] -0.536922320 0.300644865 -0.682604279 -1.096695753 0.108056945
#> [16] 0.660196786 -0.159514144 0.924566330 0.847633466 0.240674695
#> [21] 0.661566069 -0.673336129 0.163818099 -0.728350759 0.763825227
#> [26] -1.369016542 0.669089877 0.240242204 0.222608890 -1.598929516
#> [31] 0.074311585 -0.283600634 0.524440134 0.587794507 -0.328265236
#> [36] 0.002751359 -0.593736447 -1.020164111 0.350226390 1.035387355
#> [41] 1.483234452 0.769098040 0.053119010 -1.015176799 1.890509601
#> [46] 1.177205664 0.696205297 -0.164212773 0.003891041 1.264764129
#> [51] 0.536097954 1.619151199 1.661912574 -0.141171950 -0.124441537
#> [56] 1.333175527 1.189194927 1.865422197 -1.411514849 0.170540146
#> [61] -0.345455850 0.191509854 -0.108988388 -2.150607787 0.531117718
#> [66] 1.700541897 -0.100690935 0.992127249 -0.018333892 0.448814282
#> [71] -0.951187908 -0.248644023 0.338648336 -0.486257148 -0.878755079
#> [76] 0.978506032 0.671988924 0.236001915 2.187088647 -1.189139992
#>
#> $Input_Data$Forecasts_Train
#> Time Series:
#> Start = 1
#> End = 80
#> Frequency = 1
#> Series 1 Series 2 Series 3 Series 4 Series 5
#> 1 1.96865312 2.2446423369 1.2317372938 1.073472202 2.736812526
#> 2 0.18020626 2.1014275251 2.6480743550 -0.379097382 -1.308020822
#> 3 0.60627049 1.7111920422 -0.2680904218 1.390321293 -0.044934878
#> 4 -1.16396541 -0.7602015024 1.8999077442 0.873260503 0.156264489
#> 5 1.97064910 1.0160730284 1.0961882925 -0.594987464 -0.179676073
#> 6 1.12730731 1.9409703402 0.5225027703 -0.486629375 0.788944918
#> 7 0.20303963 0.6928244798 1.6734746158 1.543237557 0.847169123
#> 8 0.41085597 0.7857560417 0.7613950961 0.853856739 0.593755513
#> 9 2.44459913 0.9145856463 0.7597119703 1.515236058 0.894626922
#> 10 1.79142336 0.8268979380 1.4158278656 1.476046632 0.850480623
#> 11 1.60694849 0.9745041985 1.1586077876 -0.229833435 0.202130597
#> 12 0.99886358 0.9512123796 1.8833879094 1.892031183 0.626046336
#> 13 2.37549513 0.6009851326 1.9897073164 2.247329194 0.245537312
#> 14 1.27539604 2.3365197787 1.2799171455 0.568533798 -0.258135951
#> 15 0.30864739 0.6483725880 1.9354254992 0.006062263 -0.034810489
#> 16 1.80521768 0.8801045812 -0.4285511338 0.937209829 0.436221414
#> 17 1.20214055 2.0480292237 2.7780004902 1.302754976 2.341576341
#> 18 1.12380019 2.3021164697 0.7591538794 -0.002338710 2.154222097
#> 19 1.99787188 1.5666581597 0.8573461962 1.308398985 0.628592573
#> 20 0.70203255 1.3487243227 2.6918088265 0.727337181 0.962864734
#> 21 -0.07425045 1.2172936453 2.4222605591 1.248447101 -0.617928028
#> 22 2.10163216 -0.0932411607 2.4687847038 1.176871161 -0.208053225
#> 23 0.60612170 0.6265697205 1.0162787649 -0.515818006 0.312705255
#> 24 1.55476638 1.1827660647 1.9173290051 0.100275464 -0.106083424
#> 25 1.11998851 0.9239717564 1.8291312939 1.875530351 -1.113776869
#> 26 1.51626857 0.4662993205 0.1977654583 1.420843492 1.781112348
#> 27 2.02704216 -0.1259466128 1.4499638612 2.287537376 1.882248022
#> 28 1.73166309 -0.8921976795 1.3051085721 1.061325881 0.098254422
#> 29 0.88754969 0.5395927407 2.2529292193 0.874989746 0.762082373
#> 30 2.42917302 -0.0004400262 -1.3415260845 1.140517911 2.398710985
#> 31 1.89530083 -0.9081040814 1.4082198511 0.407290765 2.598698114
#> 32 -0.50043104 1.5301821303 0.7579575260 2.701228886 2.884503036
#> 33 0.29265949 0.9536132918 1.5330827842 1.424448357 -0.778236014
#> 34 0.88444052 2.8730280890 2.1890457225 1.593755265 2.112988147
#> 35 1.08755882 1.9976233704 0.9618108293 0.851307434 2.105230384
#> 36 1.39420481 0.8999083061 1.8988936445 1.580500240 0.789172603
#> 37 0.39115956 1.3277712896 0.4772511053 0.303187650 0.419227073
#> 38 0.34886577 1.5405269503 0.2547668425 0.810555137 0.024491550
#> 39 2.48379535 1.6305111258 0.6624423417 2.189252033 -0.444738489
#> 40 1.26621809 1.4618450676 2.1506074866 1.378640532 2.483263658
#> 41 0.40012924 0.1024411143 1.6382207356 0.804581938 2.355249111
#> 42 2.46834368 1.5363210222 0.0532502269 -0.482625530 2.708018036
#> 43 0.38016321 0.8603891234 0.8612951529 0.127441212 -0.258808375
#> 44 1.56684614 0.6236072163 0.1791188111 3.221341483 0.047873051
#> 45 2.38543341 0.8405456105 2.7852701711 -0.157051448 1.545235153
#> 46 2.39847921 0.1898131114 0.5618313505 2.070486157 0.005157065
#> 47 -0.76108035 1.6326611309 0.3177582548 2.246061380 -0.234830507
#> 48 0.50721066 0.7485194848 0.3688252576 0.750203170 1.697818168
#> 49 1.85439446 1.8205041997 2.2535207590 1.363077849 0.579955971
#> 50 1.38592556 1.4281143006 0.1757042920 1.904834618 3.098480585
#> 51 3.61835203 0.0423233566 -1.2514922315 0.140416655 0.859823077
#> 52 0.88371788 -0.9194753961 1.1722346485 0.778447381 -0.259911900
#> 53 1.87622538 0.5676733473 2.0256499762 1.453850617 0.390422086
#> 54 2.09069477 -0.0589213876 2.0334431254 1.039029817 0.619152542
#> 55 2.89174186 1.1692968665 1.9513296525 1.375649585 1.118707642
#> 56 1.41127850 -0.7832504600 1.1503630497 0.697308384 2.593905788
#> 57 0.08029804 1.0357061652 1.0601284941 -1.773460943 1.461103185
#> 58 0.18281624 0.0205677124 -0.8465681148 0.737464444 1.107043141
#> 59 1.47891822 2.2542036442 1.0241911468 0.738292819 1.709232438
#> 60 0.53797561 -0.1315097093 1.9233125002 -0.275719119 0.790857565
#> 61 1.78143265 3.1078076233 1.1385579322 1.174430297 1.214125203
#> 62 0.84247034 0.1087309070 0.0001031517 2.470511612 1.820240635
#> 63 0.11578136 2.2806063620 1.1390665440 0.452119850 0.226161718
#> 64 2.30983189 0.5709780328 1.7744851363 0.020588019 -0.343105244
#> 65 0.25846317 -0.0270144236 1.6105194979 0.792629718 1.000195208
#> 66 -0.17288098 1.7844363375 1.1359310120 0.136547952 -0.604806873
#> 67 0.73415528 1.1267036790 2.1804394700 0.304999773 2.979400988
#> 68 2.11832689 1.0642447897 -1.2696083958 1.699632298 1.117316369
#> 69 -0.75666350 1.6424298326 1.9218732438 2.956360739 -0.347508097
#> 70 -0.45267787 1.0919274157 1.0327068023 1.072339490 0.935929620
#> 71 -0.17016771 2.0681802887 0.7823345855 -1.445122538 -0.554053290
#> 72 0.01747620 1.2765818487 2.3373263108 2.000235916 -0.284473481
#> 73 0.73615647 1.3129339008 -0.0361967214 1.505483338 2.714983645
#> 74 1.89144848 2.0994624751 -1.1164038186 1.120519237 -0.193269619
#> 75 1.18169908 2.0942126967 0.4357225764 1.350294546 1.760966084
#> 76 0.17057424 3.1868529745 0.4783049302 1.512846500 2.343785712
#> 77 1.40949797 -0.2479677883 0.3942210702 3.692051882 1.201280788
#> 78 0.81590048 1.0491689020 -0.8325568245 0.930075188 -0.054762689
#> 79 0.74052779 1.5045447613 -0.5274198716 -0.261517442 2.623496151
#> 80 -0.54151832 2.6777358200 1.8160125574 -0.235984454 1.959824434
#> Series 6 Series 7 Series 8 Series 9 Series 10
#> 1 1.40132604 0.37646842 -0.24411216 0.72323535 0.83093430
#> 2 1.89382073 2.48550434 -0.35000385 0.89983076 0.50879653
#> 3 2.52041601 1.99985895 0.63563039 0.86526860 0.62733796
#> 4 2.32323779 2.17264086 0.74924441 0.03061484 1.64291495
#> 5 0.59119133 2.42377756 1.79426445 1.87147108 1.49967663
#> 6 0.45125562 2.15075366 1.17485962 0.81116262 1.86533956
#> 7 0.34549028 1.54222019 1.24769208 2.84257824 1.33302189
#> 8 0.35103914 1.53515714 1.31831530 0.86489373 1.45650526
#> 9 0.92623487 1.92464301 0.56569279 1.11264989 2.10531543
#> 10 1.40034121 1.96977288 1.42623013 0.02850605 0.11583562
#> 11 -0.87016543 -0.26978483 3.13046361 2.38512199 1.19641173
#> 12 1.03415557 1.82679603 1.28619021 1.82974494 2.32082436
#> 13 0.42649235 0.18470645 1.04430375 2.53012021 1.96463209
#> 14 0.13877900 2.10815917 1.73244263 0.01250418 0.32283750
#> 15 0.03012699 -1.76382532 0.92127741 1.71381499 0.85274798
#> 16 -0.36708638 1.14275097 0.99242409 0.93363723 0.68281709
#> 17 1.00195540 0.26125687 1.30990868 1.10583897 0.14815402
#> 18 1.32031158 -0.10964448 1.70575961 1.78568171 0.66790601
#> 19 2.27772951 -0.24724444 0.20770516 0.92210780 1.86661050
#> 20 -0.51992249 1.05191535 2.25781734 -1.80435340 -0.51548398
#> 21 0.20413097 0.20057854 1.47378305 -0.28473533 1.49724359
#> 22 1.02328185 0.96559253 0.72949296 1.30166873 0.14825251
#> 23 1.94361176 1.25767581 1.38011371 0.88405716 0.23792503
#> 24 1.41783516 1.09144973 1.88185547 2.14286398 1.53770802
#> 25 0.03848286 3.79674885 1.54881250 1.69885978 1.16825697
#> 26 1.20970179 2.46182096 0.31200203 -0.16333250 0.40341920
#> 27 0.65954257 1.25865343 2.06783066 1.47945858 0.11993173
#> 28 0.38864542 2.37582304 1.45161615 1.87639430 0.67692068
#> 29 0.47645881 0.14290013 -0.14881513 1.83059466 0.67849630
#> 30 -0.04000679 -0.12952958 1.64756408 1.85208880 0.41335302
#> 31 2.52555816 1.02333506 0.52605308 0.47282302 4.10197629
#> 32 1.95110259 1.65314549 1.21423157 1.23292009 -0.54514600
#> 33 1.33033329 1.99429107 1.84778699 1.45245792 3.30585613
#> 34 0.29821987 0.32755356 1.19456816 0.46183082 1.92697061
#> 35 1.85783490 1.96473030 1.91344304 0.45173799 0.77989585
#> 36 0.74864395 4.45321707 1.61826960 -0.27377083 1.91080294
#> 37 0.79675563 -0.63342748 1.70621431 0.76663341 2.26803334
#> 38 0.48542279 0.86327978 0.82820022 0.18614931 1.47585548
#> 39 0.68700522 0.83154230 0.96514060 1.47302415 2.10212405
#> 40 2.20489204 1.30333270 0.74012145 0.79634941 1.14459244
#> 41 0.42206312 0.79945050 0.91997319 1.02554109 0.82517923
#> 42 1.15331519 2.43852476 0.40240700 -0.47943750 0.22569294
#> 43 -0.38452799 -0.10511903 0.11223024 1.84200869 0.43160438
#> 44 2.80138082 1.95627173 0.80953716 0.92592723 -0.47365249
#> 45 1.18498084 -0.86565201 0.12645138 1.24927217 2.29635646
#> 46 1.16436187 -1.10968041 1.27421090 0.86192134 0.67991440
#> 47 2.07907834 1.58948920 2.38556056 0.87653115 1.12599109
#> 48 0.95669041 1.77122633 0.08390879 0.58607518 0.27606396
#> 49 1.17407246 -0.28491169 1.88948625 0.12143076 1.45288752
#> 50 0.67610815 0.27065208 -0.09563775 0.68912011 -0.59154748
#> 51 0.27256562 0.95186291 -0.05525544 -0.31072884 0.96826032
#> 52 2.04455078 1.17949086 1.70347821 -0.10291258 -0.15996667
#> 53 2.27568485 0.48361356 1.00316205 1.23118055 2.95394186
#> 54 -0.04688776 3.67819315 0.12479238 0.37056467 -0.79987173
#> 55 2.19606702 -0.48970482 -0.13789391 2.08529626 0.89807211
#> 56 1.81679883 1.05969581 0.12193681 0.33935418 1.90432320
#> 57 0.52103340 0.98634290 -0.23184091 0.27824269 2.04921417
#> 58 0.41153939 0.02145690 0.65287291 1.56549907 1.44082486
#> 59 1.39110975 1.77920914 1.86562677 1.44530804 3.93591730
#> 60 1.47315035 2.00547855 -0.82146488 1.61267604 -0.66494640
#> 61 0.54688679 1.16981226 0.62379285 0.85495612 1.12991283
#> 62 0.87252244 0.31874027 3.15838990 1.72965561 0.51670322
#> 63 2.34694113 0.43107677 0.90353409 1.19591266 2.16167881
#> 64 1.59131543 0.01868282 1.08434945 0.31079101 1.68938236
#> 65 1.32637123 -0.52995077 2.14050509 0.10100857 1.38200514
#> 66 1.35151839 -0.89544284 3.01686481 -0.24661791 3.00677129
#> 67 -1.72173507 1.00690779 1.51823111 0.23330553 2.24764083
#> 68 -0.51274392 2.28109127 2.38252496 1.74700846 0.70127033
#> 69 1.04021564 -0.02291424 -1.22120021 2.69972293 1.42455172
#> 70 -0.71784825 1.01175988 0.68387929 0.67331275 0.07808633
#> 71 0.78526596 0.62858423 2.57948024 -0.46360508 -0.27906183
#> 72 0.80541967 0.63402075 0.33676302 1.53220604 0.61992347
#> 73 2.50821774 0.86671401 2.10322702 1.36935852 1.45239553
#> 74 1.85406484 0.63983599 -0.17100972 1.23896496 0.62001881
#> 75 1.70833614 1.26889896 0.79756314 1.34567917 0.91847009
#> 76 2.13401625 0.38806393 1.20918533 -1.38776611 0.04454325
#> 77 2.85698723 1.05960047 2.15901204 1.21242756 1.13071180
#> 78 2.19332437 1.07756764 0.23518878 1.35240629 0.89069058
#> 79 1.64040648 0.18655019 0.06948791 -0.31142217 1.25588509
#> 80 0.80283431 0.70677923 2.27564272 2.58733337 -0.10263718
#>
#> $Input_Data$Actual_Test
#> [1] 1.07920317 0.96931102 1.73325205 -0.47410753 0.94508983 0.84511079
#> [7] -0.39175260 1.30957807 0.01821672 1.30593031 0.72810676 -0.34879933
#> [13] -0.23721267 0.48875874 -0.30467485 -2.28620298 0.22212410 2.40701415
#> [19] 0.06481441 0.56561648
#>
#> $Input_Data$Forecasts_Test
#> Series 1 Series 2 Series 3 Series 4 Series 5 Series 6
#> [1,] 1.984272347 0.794773868 -0.54015484 0.7250048 1.4622399 1.89279736
#> [2,] 0.851731723 0.563581776 0.13426273 1.1676041 1.0179238 -0.31745893
#> [3,] 1.732314401 1.129545811 2.65635708 2.6594422 2.0339228 1.91299488
#> [4,] 0.454453529 1.219449923 -0.04809555 1.3317109 0.7721481 1.44789205
#> [5,] 0.649160585 2.266889027 3.20627387 0.4523562 0.9917039 1.08461088
#> [6,] 2.420573801 1.498699788 2.24958761 -0.3132371 1.1227509 2.27895661
#> [7,] -0.471834409 0.748986302 1.38226280 2.6683100 1.4408438 0.94082288
#> [8,] 0.514075927 0.434876791 0.59561271 1.5357740 1.2784780 2.05983277
#> [9,] -0.138952099 0.262255178 0.62670243 0.4617262 0.6916118 2.01786716
#> [10,] 0.977416164 -0.234632021 0.99381375 1.1811973 -0.4942440 -0.13207168
#> [11,] 0.297894946 2.739732557 0.27833893 -0.5643136 1.5833295 0.17056546
#> [12,] 0.779305155 0.540085154 1.24531348 1.1596313 2.2432492 0.25215397
#> [13,] -0.007125624 2.142645849 0.06367109 1.8406637 0.1617341 0.48938226
#> [14,] 1.453441611 1.866260418 1.19268557 -0.2106703 0.8030320 -0.70134046
#> [15,] 1.221838843 3.350539729 0.15891917 1.4221186 1.0278114 0.73627308
#> [16,] 1.193158404 -0.268983703 1.87936589 0.5067160 1.4897679 0.04707730
#> [17,] 0.338735658 0.018357106 -0.19420016 2.8498748 -0.1422574 -0.69178973
#> [18,] 1.165049917 0.273525057 0.17627604 -1.1859222 0.4932551 -0.02785455
#> [19,] 0.590124302 -0.008209334 1.98382686 2.0643086 0.5049323 0.78510630
#> [20,] 1.336731248 1.096160317 2.56604591 2.3477264 1.9847252 -0.61418763
#> Series 7 Series 8 Series 9 Series 10
#> [1,] 2.11020893 3.1337818 -0.9376121 0.292617966
#> [2,] 0.25687950 1.3513524 0.2744438 0.189390411
#> [3,] 0.75459301 1.4840987 1.5542175 0.901185544
#> [4,] 0.89884004 0.4272724 -0.5103968 2.280283150
#> [5,] 0.43233703 -0.1826533 1.4684864 2.097768540
#> [6,] 0.79596281 1.3317818 1.1851125 0.785521368
#> [7,] 0.71508369 0.8974147 0.1655054 1.519372269
#> [8,] 1.96279351 2.5461030 0.4282508 2.009634496
#> [9,] 1.31991054 0.6226223 1.2484283 2.673305166
#> [10,] 1.89192678 2.3868910 2.1213590 1.233097269
#> [11,] -0.30548479 0.6441404 1.9465880 -0.895011724
#> [12,] 0.99088793 0.1423767 1.4886162 2.035629695
#> [13,] 0.08189045 1.8914055 1.2038098 -1.527954719
#> [14,] 1.07327356 1.6954669 0.7343370 1.150991392
#> [15,] 1.86261842 3.4882202 2.2722534 1.476441181
#> [16,] 0.47839400 1.4602865 1.7296138 0.001524042
#> [17,] 0.68839009 1.4541595 1.9877548 2.088733438
#> [18,] 1.86585404 2.5515230 -0.7269225 0.929403183
#> [19,] 0.48945335 2.9685234 2.1755825 2.200515874
#> [20,] 0.23268610 1.3086785 2.1752767 -0.357645995
#>
#>
#> $Predict
#> function (x, newpreds)
#> {
#> pred <- as.vector(as.vector(x$Intercept) + newpreds %*% x$Weights)
#> return(pred)
#> }
#> <bytecode: 0x56048b16e578>
#> <environment: namespace:ForecastComb>
#>
#> $Intercept
#> [1] -0.8534763
#>
#> $Weights
#> [1] 0.1343898 0.0000000 0.1252685 0.1318725 0.1020282 0.1137443 0.0000000
#> [8] 0.1293228 0.1410618 0.1223121
#>
#> $Forecasts_Test
#> [1] 0.11441705 -0.26383180 1.00929832 -0.11717587 0.35972125 0.52157228
#> [7] 0.18740498 0.49296493 0.15093229 0.25143501 -0.42362722 0.29511368
#> [13] -0.30403312 -0.07073638 0.65900604 0.19948934 0.17426754 -0.44295022
#> [19] 0.84732467 0.52219533
#>
#> $Accuracy_Test
#> ME RMSE MAE MPE MAPE
#> Test set 0.2238294 1.085977 0.8182896 -10.46356 185.938
#>
#> $Top_Predictors
#> [1] 8
#>
#> $Ranking
#> [1] 7 9 8 4 1 2 10 6 3 5
#>
#> attr(,"class")
#> [1] "foreccomb_res"
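The `$Predict` function shown above computes a combined forecast as the intercept (bias correction) plus the weighted sum of the retained models' forecasts. As a sanity check, the sketch below reproduces the first entry of `$Forecasts_Test` by hand, using the `$Intercept` and `$Weights` values copied from the output above together with the first row of `$Input_Data$Forecasts_Test`. Note that the two models with zero weight (Series 2 and 7) are exactly those ranked worst in `$Ranking` and dropped by the trimming step, since `$Top_Predictors` is 8.

```r
# Values copied from the printed output above (rounded to 7 decimals there).
intercept <- -0.8534763
weights <- c(0.1343898, 0.0000000, 0.1252685, 0.1318725, 0.1020282,
             0.1137443, 0.0000000, 0.1293228, 0.1410618, 0.1223121)

# First row of $Input_Data$Forecasts_Test (Series 1 through 10):
newpreds <- c(1.984272347, 0.794773868, -0.54015484, 0.7250048, 1.4622399,
              1.89279736, 2.11020893, 3.1337818, -0.9376121, 0.292617966)

# Same computation as the $Predict function: intercept + newpreds %*% weights.
pred <- as.vector(intercept + newpreds %*% weights)
pred  # close to the first $Forecasts_Test value, 0.11441705
```

Because the printed weights are rounded, the result agrees with the stored `$Forecasts_Test[1]` only up to that rounding precision.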