Computes a combined forecast from a pool of individual model forecasts using the winsorized mean at each point in time.

comb_WA(x, trim_factor = NULL, criterion = "RMSE")

Arguments

x

An object of class foreccomb. Contains the training set (actual values and matrix of model forecasts) and, optionally, a test set.

trim_factor

numeric. The trim factor \(\lambda\); must be between 0 and 0.5.

criterion

If trim_factor is not specified, an optimization criterion for automated trimming needs to be defined. One of "MAE", "MAPE", or "RMSE" (default).

Value

Returns an object of class foreccomb_res with the following components:

Method

Returns the forecast combination method that was used.

Models

Returns the individual input models that were used for the forecast combinations.

Weights

Returns the combination weights obtained by applying the combination method to the training set.

Trim Factor

Returns the trim factor, \(\lambda\).

Fitted

Returns the fitted values of the combination method for the training set.

Accuracy_Train

Returns a range of summary measures of the forecast accuracy for the training set.

Forecasts_Test

Returns the forecasts produced by the combination method for the test set. Only returned if the input included a forecast matrix for the test set.

Accuracy_Test

Returns a range of summary measures of the forecast accuracy for the test set. Only returned if the input included a forecast matrix and a vector of actual values for the test set.

Input_Data

Returns the data forwarded to the method.

Details

Suppose \(y_t\) is the variable of interest and there are \(N\) not perfectly collinear predictors, \(\mathbf{f}_t = (f_{1t}, \ldots, f_{Nt})'\). For each point in time, the ordered forecasts are computed:

$$\mathbf{f}_t^{ord} = (f_{(1)t}, \ldots, f_{(N)t})'$$

Using a trim factor \(\lambda\) (i.e., the bottom and top \(100\lambda\%\) of the ordered forecasts are winsorized), and setting \(K = N\lambda\), the combined forecast is calculated as (Jose and Winkler, 2008):

$$\hat{y}_t = \frac{1}{N} \left[Kf_{(K+1)t} + \sum_{i=K+1}^{N-K} f_{(i)t} + Kf_{(N-K)t}\right]$$

Like the trimmed mean, the winsorized mean is a robust statistic that is less sensitive to outliers than the simple average. It handles outliers less drastically than the trimmed mean, which is why Jose and Winkler (2008) prefer it.
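The formula above can be sketched in a few lines of R. This is a minimal illustration of the winsorized combination at a single point in time, not the package's internal implementation; `wins_comb` and the example forecast vector are hypothetical.

```r
# Winsorized combination of N forecasts at one point in time:
# order the forecasts, replace the bottom K by f_(K+1) and the
# top K by f_(N-K), then take the simple average.
wins_comb <- function(f, lambda) {
  N <- length(f)
  K <- floor(N * lambda)
  s <- sort(f)
  if (K > 0) {
    lo <- s[K + 1]   # value that replaces the bottom K forecasts
    hi <- s[N - K]   # value that replaces the top K forecasts
    s[1:K] <- lo
    s[(N - K + 1):N] <- hi
  }
  mean(s)
}

# N = 10 forecasts with two outliers; lambda = 0.1 gives K = 1.
f <- c(5, 1.2, 0.8, 1.0, 0.9, 1.1, 1.3, 0.7, 1.05, -1)
wins_comb(f, 0.1)
#> [1] 1.005
mean(f)  # the simple average is pulled up by the outlier at 5
#> [1] 1.205
```

Only the two extreme forecasts are moved to the adjacent order statistics; all \(N\) values still enter the average, which is what distinguishes winsorizing from trimming.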

This method allows the user to select \(\lambda\) directly (by specifying trim_factor) or to leave the choice to an optimization algorithm, in which case the optimization criterion has to be selected (one of "MAE", "MAPE", or "RMSE").
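The automated selection can be pictured as a search over admissible trim factors that minimizes the chosen criterion on the training set. The sketch below uses a simple grid search with RMSE; the helper names (`wins_row`, `select_trim`) and the grid are hypothetical, not the package's actual optimizer.

```r
# wins_row() applies the winsorized mean to one row of a forecast matrix.
wins_row <- function(f, lambda) {
  N <- length(f); K <- floor(N * lambda); s <- sort(f)
  if (K > 0) {
    lo <- s[K + 1]; hi <- s[N - K]
    s[1:K] <- lo; s[(N - K + 1):N] <- hi
  }
  mean(s)
}

# Grid search for the lambda in [0, 0.5] minimizing training-set RMSE.
select_trim <- function(actual, forecasts, grid = seq(0, 0.5, by = 0.05)) {
  rmse <- sapply(grid, function(lambda) {
    fitted <- apply(forecasts, 1, wins_row, lambda = lambda)
    sqrt(mean((actual - fitted)^2))
  })
  grid[which.min(rmse)]
}

set.seed(1)
actual <- rnorm(80)
forecasts <- matrix(rnorm(800, 1), 80, 10)
select_trim(actual, forecasts)  # RMSE-minimizing lambda on the grid
```

Swapping RMSE for MAE or MAPE only changes the inner loss; the search itself is unchanged.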

References

Jose, V. R. R., and Winkler, R. L. (2008). Simple Robust Averages of Forecasts: Some Empirical Results. International Journal of Forecasting, 24(1), 163--169.

Author

Christoph E. Weiss and Gernot R. Roetzer

Examples

obs <- rnorm(100)
preds <- matrix(rnorm(1000, 1), 100, 10)
train_o <- obs[1:80]
train_p <- preds[1:80, ]
test_o <- obs[81:100]
test_p <- preds[81:100, ]

## User-selected trim factor:
data <- foreccomb(train_o, train_p, test_o, test_p)
comb_WA(data, trim_factor = 0.1)
#> $Method
#> [1] "Winsorized Mean"
#> 
#> $Models
#>  [1] "Series 1"  "Series 2"  "Series 3"  "Series 4"  "Series 5"  "Series 6" 
#>  [7] "Series 7"  "Series 8"  "Series 9"  "Series 10"
#> 
#> $Fitted
#> Time Series:
#> Start = 1 
#> End = 80 
#> Frequency = 1 
#>  [1] 1.1669180 1.0019612 0.7692210 1.2034736 1.4383147 0.9943968 1.0053337
#>  [8] 1.3880183 1.0052722 0.3688193 0.6690988 0.9741079 1.1546763 1.4541525
#> [15] 1.3214964 1.0164880 0.4889617 1.6389753 0.8953264 1.2463367 1.2282467
#> [22] 0.7520737 0.3795631 0.4874229 0.9711204 0.9716126 0.9497192 1.0432728
#> [29] 1.2574309 0.7451724 1.4319760 0.9757726 0.9681537 1.2020370 1.6214222
#> [36] 1.5563897 0.9932657 1.6524457 1.1154510 1.5524646 0.6295621 0.8240539
#> [43] 1.0720680 1.2149484 1.0602680 1.0021505 1.6245247 0.5031843 1.2838845
#> [50] 1.0692994 1.1126856 1.0123477 1.2833629 1.0050834 0.4358694 0.6154181
#> [57] 1.1326102 0.9260484 1.0639368 1.1633297 1.3322243 1.1220658 0.5341706
#> [64] 0.6116851 1.6723249 1.0012042 0.9224963 0.9983168 0.8988779 1.0390622
#> [71] 0.9116182 1.0414435 1.0748138 1.2687088 1.0659005 0.6448551 0.7576733
#> [78] 1.4219890 1.0216800 1.1898080
#> 
#> $Accuracy_Train
#>                  ME     RMSE     MAE      MPE     MAPE       ACF1 Theil's U
#> Test set -0.9710838 1.391661 1.11296 65.00653 297.1929 0.03806376  1.450962
#> 
#> $Input_Data
#> $Input_Data$Actual_Train
#> Time Series:
#> Start = 1 
#> End = 80 
#> Frequency = 1 
#>  [1] -0.06482357  1.88129795  0.37728673 -1.10012675  0.50884206  0.69353830
#>  [7] -0.11306173 -1.17045083  1.08342120 -0.60269190 -0.91345522  2.48855648
#> [13]  0.41491463 -0.81415778 -1.21587151  0.15772992 -0.05254640 -0.26792886
#> [19] -0.31142546 -0.16529183  1.54545509  0.72025696  0.98834802  0.11434685
#> [25]  0.69896061 -1.70419899  0.05524195  0.59505638  1.48438082 -0.40696587
#> [31]  0.17464530  0.07126289 -0.85003539 -1.72823591 -0.92625250  1.52284667
#> [37]  1.28665243  0.49189116 -1.21118134  0.48840810 -0.29288472  0.87975549
#> [43] -0.60751393 -0.30026917 -1.31669005 -0.44742476  1.12319146  0.85315910
#> [49] -0.82439587  0.30886125  1.64922707 -2.70983082 -0.28844978 -0.80996466
#> [55]  0.06519417  0.38344340  0.65049717  0.30230426 -0.56987071  0.10788262
#> [61] -0.74475684  0.10954229  0.22454548 -0.58129117 -0.85372262  0.97043107
#> [67]  1.51545114 -0.43641418  1.08614249 -0.17353115  0.50109335  1.07580101
#> [73] -0.50366128 -0.85830836  0.71387359  0.43098403  0.23865652 -0.78183050
#> [79]  0.62387562  0.99747504
#> 
#> $Input_Data$Forecasts_Train
#> Time Series:
#> Start = 1 
#> End = 80 
#> Frequency = 1 
#>       Series 1    Series 2    Series 3    Series 4   Series 5     Series 6
#>  1  1.92487156  1.21722543  1.33187154  0.63522291  1.3201490 -2.687852853
#>  2  1.97312426  0.50256843  0.63952496  3.17289325  1.2831147 -0.017280248
#>  3  1.88423088  0.63099031  1.15156283 -0.49033842 -0.3514359  1.333999211
#>  4  2.00191255  0.30747604 -0.66936490  1.81382117  1.8240394  1.098243586
#>  5  0.72807458  3.22789533  0.95650940  1.38463701  1.1931443  2.280547609
#>  6  1.00727894  1.02754419  0.97709784 -0.44168893  0.1756110  1.477165969
#>  7  0.65512661  1.65309632  3.29177375  1.57725756  1.2594985 -0.302134050
#>  8  3.67897225  0.50027979  0.96699094  0.65489577  1.3480945  2.156895501
#>  9  2.17809853  0.83583537 -1.03002804  0.38848920  2.1619113  0.532795792
#> 10 -0.49243039  2.18492344  1.06511038  1.86891447  0.2947084 -1.555182813
#> 11 -0.05219573  1.85938780 -0.22558141  0.29312333  2.0212860  1.290776955
#> 12  0.95833322  1.20778988  0.46680882  0.77225172  0.4898309  0.877059695
#> 13  1.09164125  2.32725734  0.70643503  0.43876854  1.6546114  1.126627550
#> 14 -0.15684746  2.21169157  1.04853711 -0.62328203  0.3683613  2.907418026
#> 15  1.23148074  0.98506319  1.92973803  2.60600393  1.8719228  1.856455814
#> 16  2.08053687  1.78428544  1.34814411  0.10067457  0.6888679  0.223074873
#> 17  0.71191750 -0.09336842  1.54967242  0.79808031  0.8232614  0.275300299
#> 18  1.73944557  2.19249758  1.67570991  0.75451218  1.9806513  0.416212941
#> 19  1.51536325 -0.58285000  2.93111420  0.14670968  0.2857912  1.176490641
#> 20  1.71876101  0.62342335  1.96952763  0.31710447  0.7063407  3.006826716
#> 21  0.20263809  0.12430823  0.76013484  0.74970432  1.9810263  0.937215100
#> 22  1.57743524  0.96865117  1.59153705  2.47181425  0.6680911  1.063409757
#> 23  0.87018148  2.34997625  0.36418532  0.69551518  0.4916691 -0.004210052
#> 24  0.56448835 -0.65463671  1.32595463  0.56765781  0.5525789 -0.304844337
#> 25  1.19670376 -0.06785420  2.41422447  1.58743421  1.4857096  0.641409468
#> 26  0.84829507 -0.66123321  1.81150849  0.70565752  0.1411519  2.207064370
#> 27  1.50720131  0.87582762  0.77533940  1.20452863  0.3916465  1.061888163
#> 28  1.20726159  0.67938125  1.82187865  0.67793815  1.5294137  0.832580590
#> 29  2.18544733 -0.71787263 -0.61261424  1.38442904  0.9934852  2.170863239
#> 30 -0.13270562  1.13068327  0.11671233 -0.83479802  0.2755754 -0.871770585
#> 31  2.36958658  0.34793583  1.79441415  0.56308458  3.2195360  0.930603757
#> 32  1.86026392  1.77768309  0.24389250  2.43425058  0.6255733 -0.234934888
#> 33  0.61018203  2.24074262 -0.47366266  0.65815157  1.7631750  0.489692838
#> 34  1.67668157  0.97701497  1.69949875  0.88060766  3.5060910  1.254729622
#> 35  3.00563700  0.59825979  2.38067126 -1.02656763  1.4249912  1.858787130
#> 36  2.23157900  1.13513296  0.85261678  1.65989919  1.5273931  2.762509156
#> 37  0.62829150  0.64880577  0.86918776  1.27985976  1.2043053  0.675373178
#> 38  2.21723307  1.53765853 -0.27184944  2.79515131  2.3734330  0.992023043
#> 39  0.40683486  2.77326065  0.74670975  2.05000703  0.3132890 -0.857674250
#> 40 -0.31600323  1.42547147  1.23715355  2.72306611  2.7467876  0.187283135
#> 41  0.87674098 -0.10806360  1.46172942  1.92496497  1.7097927 -0.911791434
#> 42  1.02364811  0.74523052 -0.81526017  0.33165095 -0.4841330  1.401887449
#> 43 -0.97119247  0.23849489 -0.41200938  1.88270214  0.7903222  2.768369737
#> 44  1.67183724  0.87385299  0.28586846  1.99872795  0.9808677  1.118930670
#> 45  1.44122538  0.23750062  0.64071171  2.09354082  1.3606246  0.372486008
#> 46  1.12508595  1.16761768 -1.08322992  1.77100948 -0.1691081  2.334084065
#> 47  2.69874625  1.93387523  1.91986765  1.90406068  2.6743441  1.204741266
#> 48  2.54599476  0.58429137 -0.80955993  1.18524392  0.3852273  0.910127001
#> 49  2.00387215  2.31837122 -0.88612274  1.71093535  0.8391844 -0.623543139
#> 50  1.95192697 -1.90531092 -0.14133003  0.06719821  1.3526236  1.970875571
#> 51  0.79108087 -0.01975667  0.31458365  1.11010792  2.7615806  1.802153678
#> 52  3.19564912  1.28322540  0.73561887  0.37356096  0.6267023  1.333967190
#> 53  2.23654035 -0.09044053  3.17933746  1.62058102  1.8718930  0.177874462
#> 54  0.62371824  3.84344991  0.05707987  1.42825460  1.6254567  0.651932208
#> 55  0.62952076 -0.68985091  0.82354172  1.68010943  1.3316294 -0.591909648
#> 56  1.11766764  2.78385088 -0.36001455  0.28594531  0.3027414  0.842162981
#> 57  1.91701175  1.15312507  2.31415896  1.56613691  1.7037662  1.064526951
#> 58  0.01462512  0.35705646  0.28868473  1.34735265  0.6747173  2.018465058
#> 59  1.67232984  0.61179014  0.88851667  1.03962614  0.6593965  1.678822673
#> 60  1.04489558  0.64085371  1.27324132  0.53404577  0.4604977  1.386435255
#> 61  1.09940317  1.42853105  1.53730967  1.68507801  1.2867005 -0.610036254
#> 62  1.08877144  0.55561864  1.34813037  1.22270694  1.1917969  2.760760568
#> 63  0.32211039 -1.51376781  0.91348131  1.69531722  0.4739936  0.013599416
#> 64  1.37888304  1.56687075 -0.74604087  0.28769048  1.4174980  2.996949570
#> 65  1.75393458  2.08086471  2.52069774  3.25580400 -0.6286073  1.044918486
#> 66  0.98314800  0.80134238 -0.10249433  2.98315637  1.7209896  1.725657444
#> 67  1.61804613  1.48496488  0.93742507  0.69178642  0.4806510  0.307480014
#> 68  2.24169576  0.15253689  0.15772503  2.54945889  0.4357431  0.264159842
#> 69 -0.02527255 -0.01336296  0.77737957  1.39573635  1.8883610  0.805261085
#> 70 -0.36974240  0.73373880  1.83051439  0.86605498 -0.3523638  1.502681535
#> 71  0.84129561  0.77287469  0.65133073  0.85530305  2.4867596  0.640831731
#> 72  0.97243441  0.31156236  0.96879300  0.83483191  1.5701992  0.292327626
#> 73  1.30548094  0.79992255  1.20739950  1.12895406  1.5764023  1.384591666
#> 74  1.42727915  1.79264278  2.00184687  2.94298966  2.1365397  0.605000559
#> 75  1.70613019  0.78129255  0.63660559  1.90974979  1.4196825  0.610954297
#> 76  1.28099778  0.45868158  1.11109443  0.57478861 -0.7853909 -0.373408759
#> 77  0.90820166 -0.80136296 -0.60409230 -0.53886035  2.7795145  0.964391045
#> 78  4.49502536  1.27649045  1.86393022  2.47642586  1.7377523  1.804649403
#> 79  1.16804843  0.02879095  1.36417015  0.53336626  0.3461475  1.022308070
#> 80  1.10908559  0.79040184  1.82322210  0.04545392  1.6623901  1.543399289
#>       Series 7    Series 8      Series 9   Series 10
#>  1  1.56574024 -0.23283371  1.5730967417  2.43108368
#>  2  0.62056002  1.44394857  1.5701286561 -0.09359431
#>  3  1.42527499  1.02459430 -0.1600108593  1.09879273
#>  4  1.48909431  0.28117964  0.9175354114  1.89639919
#>  5 -1.28573834  3.54506698  1.3300440677  0.40566514
#>  6  1.66180749  1.44955832  0.1791107473  2.33677885
#>  7  1.84909968  1.49438525 -1.6239569686 -0.14366008
#>  8  1.24713404  2.29570311  1.9341529780  0.41803589
#>  9  2.72097026  1.66544943  0.0168125504  0.26278574
#> 10 -0.24839112  0.58647533 -0.4433610901  0.31952856
#> 11 -0.59512320  0.86144530  0.0218477669  1.30398603
#> 12  1.17832360  1.49898870  2.3813667871  0.81028519
#> 13 -0.44398564  0.77216391  1.7679752879  1.67918761
#> 14  2.97297540  0.12196703  2.7823659929  2.34972616
#> 15  0.69980214  0.35034551 -1.5396985884  1.64716299
#> 16  1.36940634  1.31592763  1.1506047612  0.25159302
#> 17  1.12265149  0.30646341 -0.9834908512 -0.03261244
#> 18  1.44922961  1.28643098  2.0333249555  2.23708751
#> 19  0.10427354  2.11406087 -0.5891425730  2.40277222
#> 20 -0.36976328  1.07399928  1.5949926166  1.96654452
#> 21  1.81528141  1.38130203  1.9986716362  2.41160933
#> 22 -0.01560092 -0.60169936  0.0002993922  0.16276678
#> 23  0.70182437 -0.83154898 -0.1510323189  0.06837188
#> 24  0.42374257 -0.60866280  1.3784683436  1.84501959
#> 25  0.39218297  0.66791756  1.2121675855  0.58543770
#> 26  1.28491543  0.26112679  0.8793171853  1.84092856
#> 27  0.28603014  1.58581162  0.7192750986  1.06204654
#> 28  2.17105705  0.59628244  1.0014463696  0.36023018
#> 29  2.08022117  2.61084333  0.7173306210  1.14028511
#> 30  1.57478312  2.10146339  1.7296654752  3.19701012
#> 31  0.75293781  1.18190355  2.3605194831  1.50275834
#> 32  2.05348619  0.48041491  0.0319796084  0.73288711
#> 33  2.86998247  0.51760018  1.7748494147 -0.30916400
#> 34  1.23328187  0.24171252  1.1200975013  0.77438370
#> 35  1.24838246  0.43861547  2.0160329821  3.42368742
#> 36  2.29087958  2.24889124  0.5013334834  0.50472551
#> 37  1.66145997  0.97884272 -1.1529833512  1.89619294
#> 38  1.39311403  1.18217675  2.1270221174  1.39690496
#> 39  1.46810022  2.36052288  1.0102719231  0.56787263
#> 40  2.83279198  1.30525765  2.0415761518  0.75312085
#> 41 -0.53498460  0.22261627  1.8586668846 -0.45000131
#> 42  1.48392386  0.93100458  1.1592187009  1.77609738
#> 43  1.16683363  2.35113691  2.4546521120  0.10441147
#> 44  1.36159643 -1.82003426  3.2424097315  1.42790585
#> 45  1.08489916 -0.13289286  1.3591315844  1.98556510
#> 46  1.77142696  1.00322718  0.8367040433  0.51124068
#> 47  0.73261005  1.47816269  0.6590591393  1.14853603
#> 48  0.60331743 -1.09549610  1.6271159317 -0.46028904
#> 49  2.66500321  1.42522195  1.1090413632  1.48799241
#> 50  0.64308710  1.27542026  1.9094956709  1.49597322
#> 51  0.51067991  1.82161750  0.1837397243  2.36752192
#> 52  0.11459795  0.70410262  2.8742342338  0.16737013
#> 53  0.02565506  0.93131895  1.3610211804  2.04201910
#> 54 -0.13371694  0.49369346  1.6615737254  1.49895810
#> 55 -0.73522233  0.68816029  0.4661169932  0.82974690
#> 56  1.13333199  1.33165579  0.0063654902 -0.09652559
#> 57  0.51532075 -0.43050542  0.6826113533  0.45838273
#> 58  0.09826700  2.42525039  1.2402156175  1.38362847
#> 59 -1.41683774  0.97316135  2.1022044609  0.98785076
#> 60  0.90661508  2.51103757  1.1122272299  2.40832391
#> 61  1.20855016  0.95593155  1.4562898934  3.39257827
#> 62  1.51871843  1.12245168  0.5438905774  0.92833182
#> 63 -0.92374518  1.35299330  1.1354938342  0.98543836
#> 64 -0.07712457 -1.24395451  0.1720393603  0.89366480
#> 65  2.04764306  1.61866511  1.4342347236  0.87764107
#> 66  1.07439537  0.05191018  0.3731059655  1.27908470
#> 67  0.54793426  0.45281955  2.2147516382  1.16634273
#> 68  1.26473956  1.85907967  1.6108549019 -0.08983873
#> 69  1.06020521  0.72174386  0.5891932301  1.85486690
#> 70  1.42093280  0.56765418  2.0608715865  1.74328476
#> 71 -0.89727115  2.59335811 -0.1424226360  1.18697248
#> 72  0.33937343  1.20067954  2.4493537623  2.13367384
#> 73  0.32276389  1.82930801  0.5143504358  0.68140877
#> 74 -0.84934821  0.17414539  0.3810963950  1.63111935
#> 75  2.15785141 -0.44180627 -0.1329450863  1.59573431
#> 76  1.29107326  1.37390829 -0.3841768476  1.19979110
#> 77  1.17115304  1.01289684  0.6933548753  2.45434167
#> 78  0.96717987 -0.34308904  1.2895579242 -0.04007394
#> 79  0.46224701  1.98616068  1.3010264468  1.97612625
#> 80  1.15451087 -0.14862841  2.2949718153  1.39000022
#> 
#> $Input_Data$Actual_Test
#>  [1]  0.86598193 -0.71399881 -0.41780859  0.74130038  0.81740052  0.03525554
#>  [7] -0.01920845 -0.24672645 -0.91463278 -1.63470381  0.48350690 -1.13212227
#> [13]  1.46798670 -1.08554092  1.93647348 -1.26261590  0.82063058 -0.45867606
#> [19]  0.64051764  0.80317101
#> 
#> $Input_Data$Forecasts_Test
#>          Series 1    Series 2    Series 3    Series 4    Series 5    Series 6
#>  [1,]  0.62219413  0.13792682  1.91655984  1.23490626  0.80730673  1.60124514
#>  [2,]  0.52241542  0.88701578  1.82129036  0.11642475  1.81688367  1.78717226
#>  [3,]  1.22544134  0.04894114  1.34396526  3.30584020 -0.03149307 -0.10124913
#>  [4,] -0.07164555  0.20296333  1.64931930  1.44758313  1.41201381  0.04102607
#>  [5,] -0.49798634  0.69532162  0.90584421  1.28646384  1.06145854 -0.36595341
#>  [6,]  1.24126098  2.08061988  0.58639209  1.41419228  0.97280670  2.29267921
#>  [7,] -0.66294905 -0.60990615  1.08660292  1.26836002  0.86989556  0.67114003
#>  [8,]  2.47779305 -0.25745317  0.09532662  0.92298914 -0.47422912 -2.06079384
#>  [9,] -0.18751553  1.13487999  1.75699226  0.41892632 -0.40581280  1.79639841
#> [10,]  1.48159494  0.95996269  0.42495125  1.20459107  2.99138812  0.51345398
#> [11,]  2.96329848  1.07304714  2.19835379  0.96546902  0.59130370 -1.39155788
#> [12,] -0.03343207  0.57856211  1.20681867  0.42142036  0.33754895  2.18574923
#> [13,]  2.23345761  1.72116725  1.41108154  0.16583368  0.79787655  0.42722079
#> [14,]  0.92876348  1.64567938  0.50552787  0.54885071  1.29744403  2.51502591
#> [15,]  0.66638629  0.99740198  0.82597848  2.55390193  0.95654240  2.10707545
#> [16,]  1.24368090  0.76688061 -0.27219534 -1.27006310 -0.26703076  0.39028437
#> [17,]  0.42469558  2.93744487  1.42906622 -0.30696310  1.28202545  0.85038793
#> [18,]  0.30653425  1.46226648  1.37250351  0.11535864 -0.12636388  2.58956683
#> [19,] -0.62872346  1.24614624  0.43012596  2.79931809  0.91017594  1.71324307
#> [20,]  1.61787650  1.80274561  1.44592369 -0.04470975 -0.32150609  1.12678994
#>         Series 7   Series 8   Series 9    Series 10
#>  [1,]  0.7462466 1.42120923 0.87063518  0.713425140
#>  [2,]  1.7143826 0.96099381 1.70010505 -0.092220946
#>  [3,]  2.1385895 2.23299039 0.83507677  0.674831477
#>  [4,]  1.9445706 0.36561131 0.97782927  0.007520743
#>  [5,]  1.1938149 1.19027259 0.98234799  1.901098654
#>  [6,]  1.4284023 0.72982114 0.83458845  1.705261628
#>  [7,]  2.5157183 1.72265607 1.30372242  1.518522505
#>  [8,]  1.5220899 2.14468840 0.52605143  0.946906915
#>  [9,]  0.8911845 1.43016172 0.01990461  0.189704251
#> [10,]  1.5652505 1.81406832 1.20605089  1.970295087
#> [11,]  2.5326961 2.14454109 1.20559247 -1.146333996
#> [12,]  0.6247584 1.42884539 0.59871870  1.007310643
#> [13,]  2.2160399 0.92496832 1.85380106 -0.195081081
#> [14,]  1.6667064 3.32649442 2.62292824  3.345575721
#> [15,]  0.8494955 0.07467723 0.30383977  0.962763130
#> [16,] -0.7277950 1.95039753 1.38403538  2.452553027
#> [17,]  0.2186041 1.61394044 0.79312338  2.233892152
#> [18,]  0.6509573 0.79701321 1.11013871  0.520896152
#> [19,] -0.1396447 2.48673988 0.92089220  2.527638656
#> [20,]  0.1249421 0.53819906 0.55431168  1.592665735
#> 
#> 
#> $Predict
#> function (object, newpreds) 
#> {
#>     pred <- apply(newpreds, 1, function(x) mean(x, trim = object$Trim_Factor))
#>     return(pred)
#> }
#> <bytecode: 0x56049daaf248>
#> <environment: namespace:ForecastComb>
#> 
#> $Weights
#> [1] "Weights of the individual forecasts differ over time with winsorized mean"
#> 
#> $Forecasts_Test
#>  [1] 1.0021460 1.1881742 1.0585428 0.7629834 0.8686963 1.3008692 0.9788742
#>  [8] 0.6782963 0.7067798 1.3394084 1.1955837 0.7754979 1.1897486 1.8189866
#> [15] 0.9586854 0.5585322 1.1057169 0.7919585 1.2619147 0.8694999
#> 
#> $Accuracy_Test
#>                  ME    RMSE      MAE      MPE     MAPE
#> Test set -0.9842352 1.40918 1.109838 178.5164 568.7203
#> 
#> $Trim_Factor
#> [1] 0.1
#> 
#> attr(,"class")
#> [1] "foreccomb_res"

## Algorithm-optimized trim factor:
data <- foreccomb(train_o, train_p, test_o, test_p)
comb_WA(data, criterion = "RMSE")
#> Optimization algorithm chooses trim factor for winsorized mean approach...
#> Algorithm finished. Optimized trim factor: 0
#> $Method
#> [1] "Winsorized Mean"
#> 
#> $Models
#>  [1] "Series 1"  "Series 2"  "Series 3"  "Series 4"  "Series 5"  "Series 6" 
#>  [7] "Series 7"  "Series 8"  "Series 9"  "Series 10"
#> 
#> $Fitted
#> Time Series:
#> Start = 1 
#> End = 80 
#> Frequency = 1 
#>  [1] 0.9078574 1.1094988 0.7547660 1.0960336 1.3765846 0.9850264 0.9710487
#>  [8] 1.5201155 0.9733120 0.3580295 0.6778953 1.0641039 1.1120682 1.3982913
#> [15] 1.1638277 1.0313115 0.4477875 1.5765103 0.9504583 1.2607757 1.2361891
#> [22] 0.7886704 0.4554932 0.5089766 1.0115333 0.9318732 0.9469595 1.0877470
#> [29] 1.1952418 0.8286619 1.5023280 1.0005496 1.0141549 1.3364099 1.5368497
#> [36] 1.5714960 0.8689336 1.5742867 1.0839195 1.4936505 0.6049670 0.7553268
#> [43] 1.0373721 1.1141963 1.0442792 0.9268058 1.6354003 0.5475973 1.2049956
#> [50] 0.8619960 1.1643309 1.1409029 1.3355800 1.1750400 0.4431843 0.7347181
#> [57] 1.0944535 0.9848263 0.9196861 1.2278173 1.3440336 1.2281177 0.4454914
#> [64] 0.6646476 1.6005796 1.0890296 0.9902202 1.0446155 0.9054112 1.0003627
#> [71] 0.8989032 1.1073229 1.0750582 1.2243312 1.0243249 0.5747359 0.8039538
#> [78] 1.5527848 1.0188392 1.1664807
#> 
#> $Accuracy_Train
#>                  ME     RMSE      MAE      MPE     MAPE       ACF1 Theil's U
#> Test set -0.9685841 1.386412 1.105489 55.73535 292.2448 0.04332422  1.451069
#> 
#> $Input_Data
#> $Input_Data$Actual_Train
#> Time Series:
#> Start = 1 
#> End = 80 
#> Frequency = 1 
#>  [1] -0.06482357  1.88129795  0.37728673 -1.10012675  0.50884206  0.69353830
#>  [7] -0.11306173 -1.17045083  1.08342120 -0.60269190 -0.91345522  2.48855648
#> [13]  0.41491463 -0.81415778 -1.21587151  0.15772992 -0.05254640 -0.26792886
#> [19] -0.31142546 -0.16529183  1.54545509  0.72025696  0.98834802  0.11434685
#> [25]  0.69896061 -1.70419899  0.05524195  0.59505638  1.48438082 -0.40696587
#> [31]  0.17464530  0.07126289 -0.85003539 -1.72823591 -0.92625250  1.52284667
#> [37]  1.28665243  0.49189116 -1.21118134  0.48840810 -0.29288472  0.87975549
#> [43] -0.60751393 -0.30026917 -1.31669005 -0.44742476  1.12319146  0.85315910
#> [49] -0.82439587  0.30886125  1.64922707 -2.70983082 -0.28844978 -0.80996466
#> [55]  0.06519417  0.38344340  0.65049717  0.30230426 -0.56987071  0.10788262
#> [61] -0.74475684  0.10954229  0.22454548 -0.58129117 -0.85372262  0.97043107
#> [67]  1.51545114 -0.43641418  1.08614249 -0.17353115  0.50109335  1.07580101
#> [73] -0.50366128 -0.85830836  0.71387359  0.43098403  0.23865652 -0.78183050
#> [79]  0.62387562  0.99747504
#> 
#> $Input_Data$Forecasts_Train
#> Time Series:
#> Start = 1 
#> End = 80 
#> Frequency = 1 
#>       Series 1    Series 2    Series 3    Series 4   Series 5     Series 6
#>  1  1.92487156  1.21722543  1.33187154  0.63522291  1.3201490 -2.687852853
#>  2  1.97312426  0.50256843  0.63952496  3.17289325  1.2831147 -0.017280248
#>  3  1.88423088  0.63099031  1.15156283 -0.49033842 -0.3514359  1.333999211
#>  4  2.00191255  0.30747604 -0.66936490  1.81382117  1.8240394  1.098243586
#>  5  0.72807458  3.22789533  0.95650940  1.38463701  1.1931443  2.280547609
#>  6  1.00727894  1.02754419  0.97709784 -0.44168893  0.1756110  1.477165969
#>  7  0.65512661  1.65309632  3.29177375  1.57725756  1.2594985 -0.302134050
#>  8  3.67897225  0.50027979  0.96699094  0.65489577  1.3480945  2.156895501
#>  9  2.17809853  0.83583537 -1.03002804  0.38848920  2.1619113  0.532795792
#> 10 -0.49243039  2.18492344  1.06511038  1.86891447  0.2947084 -1.555182813
#> 11 -0.05219573  1.85938780 -0.22558141  0.29312333  2.0212860  1.290776955
#> 12  0.95833322  1.20778988  0.46680882  0.77225172  0.4898309  0.877059695
#> 13  1.09164125  2.32725734  0.70643503  0.43876854  1.6546114  1.126627550
#> 14 -0.15684746  2.21169157  1.04853711 -0.62328203  0.3683613  2.907418026
#> 15  1.23148074  0.98506319  1.92973803  2.60600393  1.8719228  1.856455814
#> 16  2.08053687  1.78428544  1.34814411  0.10067457  0.6888679  0.223074873
#> 17  0.71191750 -0.09336842  1.54967242  0.79808031  0.8232614  0.275300299
#> 18  1.73944557  2.19249758  1.67570991  0.75451218  1.9806513  0.416212941
#> 19  1.51536325 -0.58285000  2.93111420  0.14670968  0.2857912  1.176490641
#> 20  1.71876101  0.62342335  1.96952763  0.31710447  0.7063407  3.006826716
#> 21  0.20263809  0.12430823  0.76013484  0.74970432  1.9810263  0.937215100
#> 22  1.57743524  0.96865117  1.59153705  2.47181425  0.6680911  1.063409757
#> 23  0.87018148  2.34997625  0.36418532  0.69551518  0.4916691 -0.004210052
#> 24  0.56448835 -0.65463671  1.32595463  0.56765781  0.5525789 -0.304844337
#> 25  1.19670376 -0.06785420  2.41422447  1.58743421  1.4857096  0.641409468
#> 26  0.84829507 -0.66123321  1.81150849  0.70565752  0.1411519  2.207064370
#> 27  1.50720131  0.87582762  0.77533940  1.20452863  0.3916465  1.061888163
#> 28  1.20726159  0.67938125  1.82187865  0.67793815  1.5294137  0.832580590
#> 29  2.18544733 -0.71787263 -0.61261424  1.38442904  0.9934852  2.170863239
#> 30 -0.13270562  1.13068327  0.11671233 -0.83479802  0.2755754 -0.871770585
#> 31  2.36958658  0.34793583  1.79441415  0.56308458  3.2195360  0.930603757
#> 32  1.86026392  1.77768309  0.24389250  2.43425058  0.6255733 -0.234934888
#> 33  0.61018203  2.24074262 -0.47366266  0.65815157  1.7631750  0.489692838
#> 34  1.67668157  0.97701497  1.69949875  0.88060766  3.5060910  1.254729622
#> 35  3.00563700  0.59825979  2.38067126 -1.02656763  1.4249912  1.858787130
#> 36  2.23157900  1.13513296  0.85261678  1.65989919  1.5273931  2.762509156
#> 37  0.62829150  0.64880577  0.86918776  1.27985976  1.2043053  0.675373178
#> 38  2.21723307  1.53765853 -0.27184944  2.79515131  2.3734330  0.992023043
#> 39  0.40683486  2.77326065  0.74670975  2.05000703  0.3132890 -0.857674250
#> 40 -0.31600323  1.42547147  1.23715355  2.72306611  2.7467876  0.187283135
#> 41  0.87674098 -0.10806360  1.46172942  1.92496497  1.7097927 -0.911791434
#> 42  1.02364811  0.74523052 -0.81526017  0.33165095 -0.4841330  1.401887449
#> 43 -0.97119247  0.23849489 -0.41200938  1.88270214  0.7903222  2.768369737
#> 44  1.67183724  0.87385299  0.28586846  1.99872795  0.9808677  1.118930670
#> 45  1.44122538  0.23750062  0.64071171  2.09354082  1.3606246  0.372486008
#> 46  1.12508595  1.16761768 -1.08322992  1.77100948 -0.1691081  2.334084065
#> 47  2.69874625  1.93387523  1.91986765  1.90406068  2.6743441  1.204741266
#> 48  2.54599476  0.58429137 -0.80955993  1.18524392  0.3852273  0.910127001
#> 49  2.00387215  2.31837122 -0.88612274  1.71093535  0.8391844 -0.623543139
#> 50  1.95192697 -1.90531092 -0.14133003  0.06719821  1.3526236  1.970875571
#> 51  0.79108087 -0.01975667  0.31458365  1.11010792  2.7615806  1.802153678
#> 52  3.19564912  1.28322540  0.73561887  0.37356096  0.6267023  1.333967190
#> 53  2.23654035 -0.09044053  3.17933746  1.62058102  1.8718930  0.177874462
#> 54  0.62371824  3.84344991  0.05707987  1.42825460  1.6254567  0.651932208
#> 55  0.62952076 -0.68985091  0.82354172  1.68010943  1.3316294 -0.591909648
#> 56  1.11766764  2.78385088 -0.36001455  0.28594531  0.3027414  0.842162981
#> 57  1.91701175  1.15312507  2.31415896  1.56613691  1.7037662  1.064526951
#> 58  0.01462512  0.35705646  0.28868473  1.34735265  0.6747173  2.018465058
#> 59  1.67232984  0.61179014  0.88851667  1.03962614  0.6593965  1.678822673
#> 60  1.04489558  0.64085371  1.27324132  0.53404577  0.4604977  1.386435255
#> 61  1.09940317  1.42853105  1.53730967  1.68507801  1.2867005 -0.610036254
#> 62  1.08877144  0.55561864  1.34813037  1.22270694  1.1917969  2.760760568
#> 63  0.32211039 -1.51376781  0.91348131  1.69531722  0.4739936  0.013599416
#> 64  1.37888304  1.56687075 -0.74604087  0.28769048  1.4174980  2.996949570
#> 65  1.75393458  2.08086471  2.52069774  3.25580400 -0.6286073  1.044918486
#> 66  0.98314800  0.80134238 -0.10249433  2.98315637  1.7209896  1.725657444
#> 67  1.61804613  1.48496488  0.93742507  0.69178642  0.4806510  0.307480014
#> 68  2.24169576  0.15253689  0.15772503  2.54945889  0.4357431  0.264159842
#> 69 -0.02527255 -0.01336296  0.77737957  1.39573635  1.8883610  0.805261085
#> 70 -0.36974240  0.73373880  1.83051439  0.86605498 -0.3523638  1.502681535
#> 71  0.84129561  0.77287469  0.65133073  0.85530305  2.4867596  0.640831731
#> 72  0.97243441  0.31156236  0.96879300  0.83483191  1.5701992  0.292327626
#> 73  1.30548094  0.79992255  1.20739950  1.12895406  1.5764023  1.384591666
#> 74  1.42727915  1.79264278  2.00184687  2.94298966  2.1365397  0.605000559
#> 75  1.70613019  0.78129255  0.63660559  1.90974979  1.4196825  0.610954297
#> 76  1.28099778  0.45868158  1.11109443  0.57478861 -0.7853909 -0.373408759
#> 77  0.90820166 -0.80136296 -0.60409230 -0.53886035  2.7795145  0.964391045
#> 78  4.49502536  1.27649045  1.86393022  2.47642586  1.7377523  1.804649403
#> 79  1.16804843  0.02879095  1.36417015  0.53336626  0.3461475  1.022308070
#> 80  1.10908559  0.79040184  1.82322210  0.04545392  1.6623901  1.543399289
#>       Series 7    Series 8      Series 9   Series 10
#>  1  1.56574024 -0.23283371  1.5730967417  2.43108368
#>  2  0.62056002  1.44394857  1.5701286561 -0.09359431
#>  3  1.42527499  1.02459430 -0.1600108593  1.09879273
#>  4  1.48909431  0.28117964  0.9175354114  1.89639919
#>  5 -1.28573834  3.54506698  1.3300440677  0.40566514
#>  6  1.66180749  1.44955832  0.1791107473  2.33677885
#>  7  1.84909968  1.49438525 -1.6239569686 -0.14366008
#>  8  1.24713404  2.29570311  1.9341529780  0.41803589
#>  9  2.72097026  1.66544943  0.0168125504  0.26278574
#> 10 -0.24839112  0.58647533 -0.4433610901  0.31952856
#> 11 -0.59512320  0.86144530  0.0218477669  1.30398603
#> 12  1.17832360  1.49898870  2.3813667871  0.81028519
#> 13 -0.44398564  0.77216391  1.7679752879  1.67918761
#> 14  2.97297540  0.12196703  2.7823659929  2.34972616
#> 15  0.69980214  0.35034551 -1.5396985884  1.64716299
#> 16  1.36940634  1.31592763  1.1506047612  0.25159302
#> 17  1.12265149  0.30646341 -0.9834908512 -0.03261244
#> 18  1.44922961  1.28643098  2.0333249555  2.23708751
#> 19  0.10427354  2.11406087 -0.5891425730  2.40277222
#> 20 -0.36976328  1.07399928  1.5949926166  1.96654452
#> 21  1.81528141  1.38130203  1.9986716362  2.41160933
#> 22 -0.01560092 -0.60169936  0.0002993922  0.16276678
#> 23  0.70182437 -0.83154898 -0.1510323189  0.06837188
#> 24  0.42374257 -0.60866280  1.3784683436  1.84501959
#> 25  0.39218297  0.66791756  1.2121675855  0.58543770
#> 26  1.28491543  0.26112679  0.8793171853  1.84092856
#> 27  0.28603014  1.58581162  0.7192750986  1.06204654
#> 28  2.17105705  0.59628244  1.0014463696  0.36023018
#> 29  2.08022117  2.61084333  0.7173306210  1.14028511
#> 30  1.57478312  2.10146339  1.7296654752  3.19701012
#> 31  0.75293781  1.18190355  2.3605194831  1.50275834
#> 32  2.05348619  0.48041491  0.0319796084  0.73288711
#> 33  2.86998247  0.51760018  1.7748494147 -0.30916400
#> 34  1.23328187  0.24171252  1.1200975013  0.77438370
#> 35  1.24838246  0.43861547  2.0160329821  3.42368742
#> 36  2.29087958  2.24889124  0.5013334834  0.50472551
#> 37  1.66145997  0.97884272 -1.1529833512  1.89619294
#> 38  1.39311403  1.18217675  2.1270221174  1.39690496
#> 39  1.46810022  2.36052288  1.0102719231  0.56787263
#> 40  2.83279198  1.30525765  2.0415761518  0.75312085
#> 41 -0.53498460  0.22261627  1.8586668846 -0.45000131
#> 42  1.48392386  0.93100458  1.1592187009  1.77609738
#> 43  1.16683363  2.35113691  2.4546521120  0.10441147
#> 44  1.36159643 -1.82003426  3.2424097315  1.42790585
#> 45  1.08489916 -0.13289286  1.3591315844  1.98556510
#> 46  1.77142696  1.00322718  0.8367040433  0.51124068
#> 47  0.73261005  1.47816269  0.6590591393  1.14853603
#> 48  0.60331743 -1.09549610  1.6271159317 -0.46028904
#> 49  2.66500321  1.42522195  1.1090413632  1.48799241
#> 50  0.64308710  1.27542026  1.9094956709  1.49597322
#> 51  0.51067991  1.82161750  0.1837397243  2.36752192
#> 52  0.11459795  0.70410262  2.8742342338  0.16737013
#> 53  0.02565506  0.93131895  1.3610211804  2.04201910
#> 54 -0.13371694  0.49369346  1.6615737254  1.49895810
#> 55 -0.73522233  0.68816029  0.4661169932  0.82974690
#> 56  1.13333199  1.33165579  0.0063654902 -0.09652559
#> 57  0.51532075 -0.43050542  0.6826113533  0.45838273
#> 58  0.09826700  2.42525039  1.2402156175  1.38362847
#> 59 -1.41683774  0.97316135  2.1022044609  0.98785076
#> 60  0.90661508  2.51103757  1.1122272299  2.40832391
#> 61  1.20855016  0.95593155  1.4562898934  3.39257827
#> 62  1.51871843  1.12245168  0.5438905774  0.92833182
#> 63 -0.92374518  1.35299330  1.1354938342  0.98543836
#> 64 -0.07712457 -1.24395451  0.1720393603  0.89366480
#> 65  2.04764306  1.61866511  1.4342347236  0.87764107
#> 66  1.07439537  0.05191018  0.3731059655  1.27908470
#> 67  0.54793426  0.45281955  2.2147516382  1.16634273
#> 68  1.26473956  1.85907967  1.6108549019 -0.08983873
#> 69  1.06020521  0.72174386  0.5891932301  1.85486690
#> 70  1.42093280  0.56765418  2.0608715865  1.74328476
#> 71 -0.89727115  2.59335811 -0.1424226360  1.18697248
#> 72  0.33937343  1.20067954  2.4493537623  2.13367384
#> 73  0.32276389  1.82930801  0.5143504358  0.68140877
#> 74 -0.84934821  0.17414539  0.3810963950  1.63111935
#> 75  2.15785141 -0.44180627 -0.1329450863  1.59573431
#> 76  1.29107326  1.37390829 -0.3841768476  1.19979110
#> 77  1.17115304  1.01289684  0.6933548753  2.45434167
#> 78  0.96717987 -0.34308904  1.2895579242 -0.04007394
#> 79  0.46224701  1.98616068  1.3010264468  1.97612625
#> 80  1.15451087 -0.14862841  2.2949718153  1.39000022
#> 
#> $Input_Data$Actual_Test
#>  [1]  0.86598193 -0.71399881 -0.41780859  0.74130038  0.81740052  0.03525554
#>  [7] -0.01920845 -0.24672645 -0.91463278 -1.63470381  0.48350690 -1.13212227
#> [13]  1.46798670 -1.08554092  1.93647348 -1.26261590  0.82063058 -0.45867606
#> [19]  0.64051764  0.80317101
#> 
#> $Input_Data$Forecasts_Test
#>          Series 1    Series 2    Series 3    Series 4    Series 5    Series 6
#>  [1,]  0.62219413  0.13792682  1.91655984  1.23490626  0.80730673  1.60124514
#>  [2,]  0.52241542  0.88701578  1.82129036  0.11642475  1.81688367  1.78717226
#>  [3,]  1.22544134  0.04894114  1.34396526  3.30584020 -0.03149307 -0.10124913
#>  [4,] -0.07164555  0.20296333  1.64931930  1.44758313  1.41201381  0.04102607
#>  [5,] -0.49798634  0.69532162  0.90584421  1.28646384  1.06145854 -0.36595341
#>  [6,]  1.24126098  2.08061988  0.58639209  1.41419228  0.97280670  2.29267921
#>  [7,] -0.66294905 -0.60990615  1.08660292  1.26836002  0.86989556  0.67114003
#>  [8,]  2.47779305 -0.25745317  0.09532662  0.92298914 -0.47422912 -2.06079384
#>  [9,] -0.18751553  1.13487999  1.75699226  0.41892632 -0.40581280  1.79639841
#> [10,]  1.48159494  0.95996269  0.42495125  1.20459107  2.99138812  0.51345398
#> [11,]  2.96329848  1.07304714  2.19835379  0.96546902  0.59130370 -1.39155788
#> [12,] -0.03343207  0.57856211  1.20681867  0.42142036  0.33754895  2.18574923
#> [13,]  2.23345761  1.72116725  1.41108154  0.16583368  0.79787655  0.42722079
#> [14,]  0.92876348  1.64567938  0.50552787  0.54885071  1.29744403  2.51502591
#> [15,]  0.66638629  0.99740198  0.82597848  2.55390193  0.95654240  2.10707545
#> [16,]  1.24368090  0.76688061 -0.27219534 -1.27006310 -0.26703076  0.39028437
#> [17,]  0.42469558  2.93744487  1.42906622 -0.30696310  1.28202545  0.85038793
#> [18,]  0.30653425  1.46226648  1.37250351  0.11535864 -0.12636388  2.58956683
#> [19,] -0.62872346  1.24614624  0.43012596  2.79931809  0.91017594  1.71324307
#> [20,]  1.61787650  1.80274561  1.44592369 -0.04470975 -0.32150609  1.12678994
#>         Series 7   Series 8   Series 9    Series 10
#>  [1,]  0.7462466 1.42120923 0.87063518  0.713425140
#>  [2,]  1.7143826 0.96099381 1.70010505 -0.092220946
#>  [3,]  2.1385895 2.23299039 0.83507677  0.674831477
#>  [4,]  1.9445706 0.36561131 0.97782927  0.007520743
#>  [5,]  1.1938149 1.19027259 0.98234799  1.901098654
#>  [6,]  1.4284023 0.72982114 0.83458845  1.705261628
#>  [7,]  2.5157183 1.72265607 1.30372242  1.518522505
#>  [8,]  1.5220899 2.14468840 0.52605143  0.946906915
#>  [9,]  0.8911845 1.43016172 0.01990461  0.189704251
#> [10,]  1.5652505 1.81406832 1.20605089  1.970295087
#> [11,]  2.5326961 2.14454109 1.20559247 -1.146333996
#> [12,]  0.6247584 1.42884539 0.59871870  1.007310643
#> [13,]  2.2160399 0.92496832 1.85380106 -0.195081081
#> [14,]  1.6667064 3.32649442 2.62292824  3.345575721
#> [15,]  0.8494955 0.07467723 0.30383977  0.962763130
#> [16,] -0.7277950 1.95039753 1.38403538  2.452553027
#> [17,]  0.2186041 1.61394044 0.79312338  2.233892152
#> [18,]  0.6509573 0.79701321 1.11013871  0.520896152
#> [19,] -0.1396447 2.48673988 0.92089220  2.527638656
#> [20,]  0.1249421 0.53819906 0.55431168  1.592665735
#> 
#> 
#> $Predict
#> function (object, newpreds) 
#> {
#>     pred <- apply(newpreds, 1, function(x) mean(x, trim = object$Trim_Factor))
#>     return(pred)
#> }
#> <bytecode: 0x56049daaf248>
#> <environment: namespace:ForecastComb>
#> 
#> $Weights
#> [1] "Weights of the individual forecasts differ over time with trimmed mean"
#> 
#> $Forecasts_Test
#>  [1] 1.0071655 1.1234463 1.1672934 0.7976792 0.8352683 1.3286025 0.9683763
#>  [8] 0.5843369 0.7044824 1.4131607 1.1136410 0.8356300 1.1556366 1.8402996
#> [15] 1.0298062 0.5650748 1.1476217 0.8798871 1.2265912 0.8437238
#> 
#> $Accuracy_Test
#>                  ME     RMSE      MAE      MPE     MAPE
#> Test set -0.9920766 1.422945 1.113978 173.2581 569.3889
#> 
#> $Trim_Factor
#> [1] 0
#> 
#> attr(,"class")
#> [1] "foreccomb_res"
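The `$Predict` closure in the output above applies a mean with the stored trim factor row-wise to new forecasts. The winsorized combination from the Details section can be sketched outside of R as follows; this is a minimal, language-neutral illustration of the Jose and Winkler (2008) formula, not the package's implementation, and the function name `winsorized_mean` is ours:

```python
import math

def winsorized_mean(forecasts, trim_factor):
    """Jose & Winkler (2008) winsorized mean: with K = floor(N * lambda),
    the K smallest forecasts are replaced by f_(K+1) and the K largest
    by f_(N-K) before averaging."""
    f = sorted(forecasts)
    n = len(f)
    k = math.floor(n * trim_factor)
    # Winsorize: clamp the K extreme order statistics at each end
    w = [f[k]] * k + f[k : n - k] + [f[n - k - 1]] * k
    return sum(w) / n

# With trim_factor = 0 (as in the example output above, where
# $Trim_Factor is 0), K = 0 and the combined forecast reduces to the
# plain arithmetic mean, e.g. for row 1 of Forecasts_Test:
row1 = [0.62219413, 0.13792682, 1.91655984, 1.23490626, 0.80730673,
        1.60124514, 0.7462466, 1.42120923, 0.87063518, 0.713425140]
print(round(winsorized_mean(row1, 0.0), 7))  # 1.0071655
```

Note that a winsorized mean clamps the extreme order statistics rather than dropping them, which distinguishes it from R's `mean(x, trim = ...)`, a trimmed mean that discards the extremes outright.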