setwd("/Users/adrien/R/Data")
library(caschrono)
## Loading required package: zoo
##
## Attaching package: 'zoo'
## The following objects are masked from 'package:base':
##
## as.Date, as.Date.numeric
library(forecast)
X <- read.table("TP4dat.txt")
Box.test(X)
##
## Box-Pierce test
##
## data: X
## X-squared = 65.922, df = 1, p-value = 4.441e-16
acf(X)
pacf(X)
The Box-Pierce test (the default of Box.test) leads to rejecting the white-noise hypothesis (p-value ≈ 4.4e-16).
Inspection of the ACF shows that the series does not look like an MA model, but the PACF suggests an AR(2): its first two values clearly exceed the confidence band while the later ones do not.
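This cutoff signature can be checked against a synthetic series (a sketch; the coefficients below are arbitrary, chosen only to mimic an AR(2)):

```r
# Simulate an AR(2) and inspect its PACF: only the first two partial
# autocorrelations should stand out beyond the confidence band
set.seed(1)
sim <- arima.sim(model = list(ar = c(0.9, -0.6)), n = 300)
pacf(sim)
```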
We try fitting an AR(2):
arima(X,order=c(2,0,0))
##
## Call:
## arima(x = X, order = c(2, 0, 0))
##
## Coefficients:
## ar1 ar2 intercept
## 0.9379 -0.6286 0.1475
## s.e. 0.0548 0.0545 0.1086
##
## sigma^2 estimated as 1.12: log likelihood = -295.85, aic = 599.7
The suggested model is therefore \[ X_t - 0.1475 = 0.9379\,(X_{t-1} - 0.1475) - 0.6286\,(X_{t-2} - 0.1475) + \epsilon_t \] where \(\epsilon_t\) is a centred white noise with variance \(\sigma^2 = 1.12\) (note that the `intercept` reported by arima is the series mean, not the regression constant). We now examine the residuals of this fit:
res <- arima(X, order = c(2, 0, 0))$residuals
plot(res)
Box.test.2(res,1:200)
## Retard p-value
## [1,] 1 0.3680457
## [2,] 2 0.4416490
## [3,] 3 0.5233534
## [4,] 4 0.6868456
## [5,] 5 0.8100865
## [6,] 6 0.8812358
## [7,] 7 0.7671951
## [8,] 8 0.8194149
## [9,] 9 0.8816212
## [10,] 10 0.9187498
## [11,] 11 0.9427677
## [12,] 12 0.9631014
## [13,] 13 0.9784386
## [14,] 14 0.9878437
## [15,] 15 0.9762351
## [16,] 16 0.9799566
## [17,] 17 0.9873395
## [18,] 18 0.9895882
## [19,] 19 0.9778066
## [20,] 20 0.9851518
## [21,] 21 0.9034333
## [22,] 22 0.9134288
## [23,] 23 0.9238513
## [24,] 24 0.9388256
## [25,] 25 0.9556562
## [26,] 26 0.9683741
## [27,] 27 0.9632337
## [28,] 28 0.9733463
## [29,] 29 0.9811751
## [30,] 30 0.9703122
## [31,] 31 0.9767994
## [32,] 32 0.9790534
## [33,] 33 0.9846402
## [34,] 34 0.9834340
## [35,] 35 0.9754006
## [36,] 36 0.9818717
## [37,] 37 0.9857561
## [38,] 38 0.9788393
## [39,] 39 0.9532704
## [40,] 40 0.9619364
## [41,] 41 0.9629671
## [42,] 42 0.9658849
## [43,] 43 0.9717870
## [44,] 44 0.9779345
## [45,] 45 0.9822293
## [46,] 46 0.9837634
## [47,] 47 0.9877470
## [48,] 48 0.9906797
## [49,] 49 0.9912289
## [50,] 50 0.9931826
## [51,] 51 0.9945026
## [52,] 52 0.9959732
## [53,] 53 0.9969856
## [54,] 54 0.9949851
## [55,] 55 0.9940157
## [56,] 56 0.9955481
## [57,] 57 0.9966900
## [58,] 58 0.9956066
## [59,] 59 0.9957532
## [60,] 60 0.9963690
## [61,] 61 0.9940840
## [62,] 62 0.9954932
## [63,] 63 0.9965954
## [64,] 64 0.9971159
## [65,] 65 0.9957397
## [66,] 66 0.9967375
## [67,] 67 0.9971803
## [68,] 68 0.9978273
## [69,] 69 0.9982779
## [70,] 70 0.9985249
## [71,] 71 0.9988486
## [72,] 72 0.9985406
## [73,] 73 0.9986151
## [74,] 74 0.9986905
## [75,] 75 0.9989630
## [76,] 76 0.9989909
## [77,] 77 0.9992065
## [78,] 78 0.9993908
## [79,] 79 0.9995253
## [80,] 80 0.9996312
## [81,] 81 0.9996908
## [82,] 82 0.9997741
## [83,] 83 0.9997938
## [84,] 84 0.9998475
## [85,] 85 0.9998017
## [86,] 86 0.9998563
## [87,] 87 0.9998832
## [88,] 88 0.9998648
## [89,] 89 0.9998668
## [90,] 90 0.9998682
## [91,] 91 0.9998995
## [92,] 92 0.9999261
## [93,] 93 0.9999322
## [94,] 94 0.9999398
## [95,] 95 0.9998868
## [96,] 96 0.9998846
## [97,] 97 0.9999075
## [98,] 98 0.9999307
## [99,] 99 0.9999463
## [100,] 100 0.9999611
## [101,] 101 0.9999664
## [102,] 102 0.9999666
## [103,] 103 0.9999566
## [104,] 104 0.9999041
## [105,] 105 0.9999291
## [106,] 106 0.9999356
## [107,] 107 0.9999525
## [108,] 108 0.9999637
## [109,] 109 0.9999718
## [110,] 110 0.9999791
## [111,] 111 0.9999848
## [112,] 112 0.9999720
## [113,] 113 0.9999796
## [114,] 114 0.9999824
## [115,] 115 0.9999849
## [116,] 116 0.9999876
## [117,] 117 0.9999910
## [118,] 118 0.9999925
## [119,] 119 0.9999942
## [120,] 120 0.9999931
## [121,] 121 0.9999950
## [122,] 122 0.9999961
## [123,] 123 0.9999954
## [124,] 124 0.9999938
## [125,] 125 0.9999937
## [126,] 126 0.9999953
## [127,] 127 0.9999966
## [128,] 128 0.9999973
## [129,] 129 0.9999980
## [130,] 130 0.9999986
## [131,] 131 0.9999989
## [132,] 132 0.9999992
## [133,] 133 0.9999994
## [134,] 134 0.9999996
## [135,] 135 0.9999997
## [136,] 136 0.9999997
## [137,] 137 0.9999998
## [138,] 138 0.9999999
## [139,] 139 0.9999999
## [140,] 140 0.9999999
## [141,] 141 0.9999999
## [142,] 142 1.0000000
## [143,] 143 1.0000000
## [144,] 144 1.0000000
## [145,] 145 1.0000000
## [146,] 146 1.0000000
## [147,] 147 1.0000000
## [148,] 148 1.0000000
## [149,] 149 1.0000000
## [150,] 150 1.0000000
## [151,] 151 1.0000000
## [152,] 152 1.0000000
## [153,] 153 1.0000000
## [154,] 154 1.0000000
## [155,] 155 1.0000000
## [156,] 156 1.0000000
## [157,] 157 1.0000000
## [158,] 158 1.0000000
## [159,] 159 1.0000000
## [160,] 160 1.0000000
## [161,] 161 1.0000000
## [162,] 162 1.0000000
## [163,] 163 1.0000000
## [164,] 164 1.0000000
## [165,] 165 1.0000000
## [166,] 166 1.0000000
## [167,] 167 1.0000000
## [168,] 168 1.0000000
## [169,] 169 1.0000000
## [170,] 170 1.0000000
## [171,] 171 1.0000000
## [172,] 172 1.0000000
## [173,] 173 1.0000000
## [174,] 174 1.0000000
## [175,] 175 1.0000000
## [176,] 176 1.0000000
## [177,] 177 1.0000000
## [178,] 178 1.0000000
## [179,] 179 1.0000000
## [180,] 180 1.0000000
## [181,] 181 1.0000000
## [182,] 182 1.0000000
## [183,] 183 1.0000000
## [184,] 184 1.0000000
## [185,] 185 1.0000000
## [186,] 186 1.0000000
## [187,] 187 1.0000000
## [188,] 188 1.0000000
## [189,] 189 1.0000000
## [190,] 190 1.0000000
## [191,] 191 1.0000000
## [192,] 192 1.0000000
## [193,] 193 1.0000000
## [194,] 194 1.0000000
## [195,] 195 1.0000000
## [196,] 196 1.0000000
## [197,] 197 1.0000000
## [198,] 198 1.0000000
## [199,] 199 1.0000000
## [200,] 200 NA
Since the white-noise hypothesis is not rejected at any lag, the AR(2) model is adequate.
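Box.test.2 comes from the caschrono package; essentially the same table can be produced with base R's Box.test, passing fitdf = 2 to adjust the degrees of freedom for the two estimated AR coefficients (a sketch on simulated white noise standing in for the residuals):

```r
# Ljung-Box p-values over a range of lags, degrees of freedom adjusted
# for the two fitted AR coefficients (fitdf = 2)
set.seed(3)
res_sim <- rnorm(200)  # stand-in for the AR(2) residuals
pvals <- sapply(3:20, function(k)
  Box.test(res_sim, lag = k, type = "Ljung-Box", fitdf = 2)$p.value)
round(pvals, 3)
```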
shapiro.test(res)
##
## Shapiro-Wilk normality test
##
## data: res
## W = 0.99398, p-value = 0.5982
Since the Shapiro-Wilk test does not reject normality, it even seems reasonable to assume that the noise \(\epsilon_t\) of the AR model is Gaussian.
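A QQ-plot gives a complementary visual check of normality (a sketch; simulated data with the estimated variance 1.12 stands in for the residual series `res` so the snippet is self-contained):

```r
# QQ-plot of residuals against the normal distribution; points close to
# the reference line support the Gaussian assumption
set.seed(4)
res_chk <- rnorm(200, sd = sqrt(1.12))  # stand-in for res
qqnorm(res_chk)
qqline(res_chk)
```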
data <- read.csv("ARM2.csv", header = TRUE)
head(data)
## Date Open High Low Close Volume Adj.Close
## 1 2016-03-10 985 998.50 968.0 968 4524300 962.370
## 2 2016-03-09 984 988.00 978.5 982 3098300 976.289
## 3 2016-03-08 985 991.50 979.5 986 4659900 980.265
## 4 2016-03-07 1009 1012.26 996.5 1006 3194600 1000.149
## 5 2016-03-04 1023 1025.00 1007.0 1017 2970500 1011.085
## 6 2016-03-03 1015 1029.17 1012.0 1019 2384400 1013.073
plot(data$Close, type='l')
acf(data$Close, lag=100)
The initial series does not look stationary: it shows a clear additive trend.
plot(diff(data$Close), type='l')
acf(diff(data$Close), lag=100)
The differenced series, for its part, has a better-behaved ACF, but its time plot reveals a multiplicative trend.
plot(diff(log(data$Close)), type='l')
acf(diff(log(data$Close)), lag=100)
Of these three series, the log-differences appear the most stationary; this is the series we study from now on.
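The effect of log-differencing can be illustrated on a synthetic price path (a sketch; the drift and volatility below are arbitrary):

```r
# A multiplicative (exponential-like) trend becomes a roughly stationary
# series of returns after taking log-differences
set.seed(5)
price <- cumprod(1 + rnorm(500, mean = 0.001, sd = 0.02))  # price-like path
ret   <- diff(log(price))                                  # log-returns
c(mean(ret), sd(ret))  # small mean, stable dispersion
```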
Clo <- ts(diff(log(data$Close)), frequency = 365) # The analysis could also use frequency = 252, which better matches the markets' trading days
plot(Clo)
pacf(Clo)
ar(Clo)
##
## Call:
## ar(x = Clo)
##
## Coefficients:
## 1 2
## -0.0587 -0.0470
##
## Order selected 2 sigma^2 estimated as 0.000551
Box.test.2(ar(Clo)$resid,100:150)
## Retard p-value
## [1,] 100 0.05995026
## [2,] 101 0.05789882
## [3,] 102 0.05665369
## [4,] 103 0.06281479
## [5,] 104 0.06687777
## [6,] 105 0.07554365
## [7,] 106 0.07690252
## [8,] 107 0.06916886
## [9,] 108 0.06725926
## [10,] 109 0.06930430
## [11,] 110 0.07381283
## [12,] 111 0.07381966
## [13,] 112 0.08321743
## [14,] 113 0.06481233
## [15,] 114 0.07337126
## [16,] 115 0.07114192
## [17,] 116 0.07954462
## [18,] 117 0.08340825
## [19,] 118 0.09241964
## [20,] 119 0.08612950
## [21,] 120 0.08408561
## [22,] 121 0.09391194
## [23,] 122 0.04968134
## [24,] 123 0.05183029
## [25,] 124 0.05838065
## [26,] 125 0.06492013
## [27,] 126 0.07209301
## [28,] 127 0.06549218
## [29,] 128 0.06955888
## [30,] 129 0.06103513
## [31,] 130 0.04810365
## [32,] 131 0.04673885
## [33,] 132 0.05243088
## [34,] 133 0.04314509
## [35,] 134 0.04699471
## [36,] 135 0.05271467
## [37,] 136 0.05169637
## [38,] 137 0.04921490
## [39,] 138 0.05485274
## [40,] 139 0.06161050
## [41,] 140 0.06762331
## [42,] 141 0.07545565
## [43,] 142 0.07792642
## [44,] 143 0.08453565
## [45,] 144 0.07202630
## [46,] 145 0.08019849
## [47,] 146 0.08251920
## [48,] 147 0.08502475
## [49,] 148 0.09132801
## [50,] 149 0.09665676
## [51,] 150 0.09047751
The automatic AR fit suggests an AR(2), even though this is not obvious from the PACF. The Ljung-Box test rejects the white-noise hypothesis for the residuals around lag 122 and beyond (p-values dip below 5%). The fit is therefore not really convincing.
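For reference, ar() selects the order by minimizing the AIC (Yule-Walker estimation by default), and the AIC profile can be inspected directly (a sketch on a simulated AR(2) with weak coefficients mimicking those estimated above):

```r
# ar() chooses the order minimizing the AIC; fit$aic stores the AIC of each
# candidate order relative to the minimum (0 marks the selected order)
set.seed(6)
x   <- arima.sim(model = list(ar = c(-0.06, -0.05)), n = 2000)
fit <- ar(x, order.max = 10)
fit$aic
```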
auto.arima(Clo)
## Series: Clo
## ARIMA(1,0,1) with non-zero mean
##
## Coefficients:
## ar1 ma1 mean
## 0.5775 -0.6398 -8e-04
## s.e. 0.1665 0.1568 4e-04
##
## sigma^2 estimated as 0.0005508: log likelihood=6608.96
## AIC=-13209.91 AICc=-13209.9 BIC=-13186.12
Box.test.2(arima(Clo,order=c(1,0,1))$resid,1:200)
## Retard p-value
## [1,] 1 0.89847456
## [2,] 2 0.88162198
## [3,] 3 0.96322135
## [4,] 4 0.96489084
## [5,] 5 0.98787305
## [6,] 6 0.96355228
## [7,] 7 0.90949922
## [8,] 8 0.95057639
## [9,] 9 0.95884654
## [10,] 10 0.91544085
## [11,] 11 0.94834693
## [12,] 12 0.88072549
## [13,] 13 0.90848593
## [14,] 14 0.88824887
## [15,] 15 0.80447877
## [16,] 16 0.84372619
## [17,] 17 0.82888500
## [18,] 18 0.87119801
## [19,] 19 0.86910267
## [20,] 20 0.89764706
## [21,] 21 0.90548421
## [22,] 22 0.84369016
## [23,] 23 0.78180932
## [24,] 24 0.82004616
## [25,] 25 0.84155630
## [26,] 26 0.32513874
## [27,] 27 0.35896666
## [28,] 28 0.32779248
## [29,] 29 0.35365325
## [30,] 30 0.29575332
## [31,] 31 0.17341489
## [32,] 32 0.19698712
## [33,] 33 0.19366249
## [34,] 34 0.22593008
## [35,] 35 0.23366772
## [36,] 36 0.25456340
## [37,] 37 0.29298274
## [38,] 38 0.29417906
## [39,] 39 0.31728112
## [40,] 40 0.30898524
## [41,] 41 0.23548674
## [42,] 42 0.26574254
## [43,] 43 0.23817549
## [44,] 44 0.25158695
## [45,] 45 0.17027508
## [46,] 46 0.19704380
## [47,] 47 0.20689078
## [48,] 48 0.23530481
## [49,] 49 0.20176911
## [50,] 50 0.21959922
## [51,] 51 0.22793253
## [52,] 52 0.23537035
## [53,] 53 0.20682380
## [54,] 54 0.15077260
## [55,] 55 0.17126241
## [56,] 56 0.13314951
## [57,] 57 0.10953057
## [58,] 58 0.12741078
## [59,] 59 0.13969889
## [60,] 60 0.15499940
## [61,] 61 0.17545474
## [62,] 62 0.19506713
## [63,] 63 0.20645900
## [64,] 64 0.16501964
## [65,] 65 0.18539554
## [66,] 66 0.19733169
## [67,] 67 0.20299087
## [68,] 68 0.21246681
## [69,] 69 0.23719080
## [70,] 70 0.20796637
## [71,] 71 0.12994831
## [72,] 72 0.14552205
## [73,] 73 0.06201117
## [74,] 74 0.07156731
## [75,] 75 0.08228748
## [76,] 76 0.09477917
## [77,] 77 0.07664146
## [78,] 78 0.07963026
## [79,] 79 0.09168019
## [80,] 80 0.08619762
## [81,] 81 0.07739086
## [82,] 82 0.08700837
## [83,] 83 0.09189534
## [84,] 84 0.10357002
## [85,] 85 0.11675217
## [86,] 86 0.11681255
## [87,] 87 0.12933659
## [88,] 88 0.07036592
## [89,] 89 0.08008288
## [90,] 90 0.06023542
## [91,] 91 0.06334047
## [92,] 92 0.07267244
## [93,] 93 0.08304277
## [94,] 94 0.09102218
## [95,] 95 0.08224747
## [96,] 96 0.06424333
## [97,] 97 0.07258045
## [98,] 98 0.08258127
## [99,] 99 0.09152162
## [100,] 100 0.06689920
## [101,] 101 0.06495346
## [102,] 102 0.06388498
## [103,] 103 0.07108284
## [104,] 104 0.07433290
## [105,] 105 0.08387581
## [106,] 106 0.08444073
## [107,] 107 0.07560526
## [108,] 108 0.07376866
## [109,] 109 0.07549727
## [110,] 110 0.08065036
## [111,] 111 0.08214027
## [112,] 112 0.09240509
## [113,] 113 0.07312414
## [114,] 114 0.08251264
## [115,] 115 0.08004777
## [116,] 116 0.08968206
## [117,] 117 0.09440612
## [118,] 118 0.10438876
## [119,] 119 0.09537661
## [120,] 120 0.09271300
## [121,] 121 0.10319465
## [122,] 122 0.05356364
## [123,] 123 0.05588073
## [124,] 124 0.06282507
## [125,] 125 0.06972515
## [126,] 126 0.07743539
## [127,] 127 0.06938876
## [128,] 128 0.07342302
## [129,] 129 0.06427496
## [130,] 130 0.05098889
## [131,] 131 0.04981287
## [132,] 132 0.05593720
## [133,] 133 0.04669632
## [134,] 134 0.05015648
## [135,] 135 0.05616029
## [136,] 136 0.05554774
## [137,] 137 0.05204609
## [138,] 138 0.05772709
## [139,] 139 0.06476965
## [140,] 140 0.07101850
## [141,] 141 0.07915819
## [142,] 142 0.08174160
## [143,] 143 0.08868740
## [144,] 144 0.07635795
## [145,] 145 0.08489672
## [146,] 146 0.08755745
## [147,] 147 0.09146610
## [148,] 148 0.09861849
## [149,] 149 0.10519739
## [150,] 150 0.09712179
## [151,] 151 0.09619012
## [152,] 152 0.10327133
## [153,] 153 0.10527606
## [154,] 154 0.10921956
## [155,] 155 0.11691539
## [156,] 156 0.12442117
## [157,] 157 0.12206850
## [158,] 158 0.13358186
## [159,] 159 0.14529938
## [160,] 160 0.15235350
## [161,] 161 0.16055705
## [162,] 162 0.15693413
## [163,] 163 0.16960082
## [164,] 164 0.17501929
## [165,] 165 0.18739930
## [166,] 166 0.20023649
## [167,] 167 0.21547995
## [168,] 168 0.16244503
## [169,] 169 0.17489979
## [170,] 170 0.18687027
## [171,] 171 0.15871577
## [172,] 172 0.16378129
## [173,] 173 0.17399670
## [174,] 174 0.18472186
## [175,] 175 0.12729410
## [176,] 176 0.13682501
## [177,] 177 0.14668493
## [178,] 178 0.12010578
## [179,] 179 0.11690885
## [180,] 180 0.11986513
## [181,] 181 0.12407812
## [182,] 182 0.13438333
## [183,] 183 0.14145368
## [184,] 184 0.13697365
## [185,] 185 0.12000170
## [186,] 186 0.11896316
## [187,] 187 0.12878573
## [188,] 188 0.13311383
## [189,] 189 0.13500624
## [190,] 190 0.14108325
## [191,] 191 0.14577592
## [192,] 192 0.15088280
## [193,] 193 0.16292882
## [194,] 194 0.17077308
## [195,] 195 0.17887808
## [196,] 196 0.19212361
## [197,] 197 0.19622584
## [198,] 198 0.17696798
## [199,] 199 0.18696104
## [200,] 200 0.20002495
The auto.arima command suggests an ARMA(1,1), but the study of the residuals again shows that they fail the Ljung-Box test at some lags, so no satisfactory ARMA model is obtained.
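auto.arima automates a search over candidate orders (scored by AICc, with unit-root tests for the differencing order); the core of that comparison can be done by hand (a sketch on a simulated ARMA(1,1)):

```r
# Compare candidate ARMA orders by AIC on a common series; the order
# with the lowest AIC is preferred
set.seed(8)
x     <- arima.sim(model = list(ar = 0.5, ma = -0.6), n = 500)
cands <- list(c(1, 0, 1), c(2, 0, 0), c(0, 0, 2))
aics  <- sapply(cands, function(o) AIC(arima(x, order = o)))
aics
```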
Dec<- decompose(Clo, type='additive')
plot(Dec)
Res <- na.omit(Dec$random) # na.omit removes the missing values
acf(Res)
pacf(Res)
Box.test.2(ar(Res)$resid,1:200) # still rejected
## Retard p-value
## [1,] 1 0.96147930
## [2,] 2 0.99574734
## [3,] 3 0.99974887
## [4,] 4 0.98091773
## [5,] 5 0.95042118
## [6,] 6 0.89067401
## [7,] 7 0.67207795
## [8,] 8 0.55210596
## [9,] 9 0.61330288
## [10,] 10 0.59074098
## [11,] 11 0.66531436
## [12,] 12 0.61953262
## [13,] 13 0.66739158
## [14,] 14 0.71545673
## [15,] 15 0.70330337
## [16,] 16 0.75796001
## [17,] 17 0.74729262
## [18,] 18 0.78856358
## [19,] 19 0.81904970
## [20,] 20 0.84720819
## [21,] 21 0.88097749
## [22,] 22 0.84493187
## [23,] 23 0.82070870
## [24,] 24 0.85466433
## [25,] 25 0.87866542
## [26,] 26 0.31161711
## [27,] 27 0.35163202
## [28,] 28 0.32172631
## [29,] 29 0.24329671
## [30,] 30 0.20966632
## [31,] 31 0.15691137
## [32,] 32 0.17451428
## [33,] 33 0.15553526
## [34,] 34 0.17744093
## [35,] 35 0.17697853
## [36,] 36 0.18838159
## [37,] 37 0.20656947
## [38,] 38 0.23218218
## [39,] 39 0.25425447
## [40,] 40 0.25113510
## [41,] 41 0.20985242
## [42,] 42 0.23653828
## [43,] 43 0.12237191
## [44,] 44 0.10199165
## [45,] 45 0.07453414
## [46,] 46 0.07680399
## [47,] 47 0.08467948
## [48,] 48 0.08384760
## [49,] 49 0.09291377
## [50,] 50 0.09425767
## [51,] 51 0.10194859
## [52,] 52 0.11858634
## [53,] 53 0.09495854
## [54,] 54 0.06333098
## [55,] 55 0.06849828
## [56,] 56 0.04542537
## [57,] 57 0.04762340
## [58,] 58 0.05583176
## [59,] 59 0.05170240
## [60,] 60 0.06165501
## [61,] 61 0.06859229
## [62,] 62 0.08056393
## [63,] 63 0.07364631
## [64,] 64 0.05500304
## [65,] 65 0.06058222
## [66,] 66 0.06859365
## [67,] 67 0.07798204
## [68,] 68 0.08820371
## [69,] 69 0.09791671
## [70,] 70 0.10852294
## [71,] 71 0.03956005
## [72,] 72 0.04641693
## [73,] 73 0.02420514
## [74,] 74 0.02292904
## [75,] 75 0.02747562
## [76,] 76 0.03276698
## [77,] 77 0.02649334
## [78,] 78 0.02945212
## [79,] 79 0.03255649
## [80,] 80 0.02777251
## [81,] 81 0.02553813
## [82,] 82 0.02909025
## [83,] 83 0.03263561
## [84,] 84 0.02973430
## [85,] 85 0.03171710
## [86,] 86 0.03132043
## [87,] 87 0.03503802
## [88,] 88 0.02105447
## [89,] 89 0.02437425
## [90,] 90 0.02482175
## [91,] 91 0.02549361
## [92,] 92 0.02986278
## [93,] 93 0.03429056
## [94,] 94 0.03842445
## [95,] 95 0.02369502
## [96,] 96 0.01891957
## [97,] 97 0.02194613
## [98,] 98 0.02553292
## [99,] 99 0.02800026
## [100,] 100 0.02462307
## [101,] 101 0.01717121
## [102,] 102 0.01680556
## [103,] 103 0.01956436
## [104,] 104 0.01961432
## [105,] 105 0.01972846
## [106,] 106 0.02300993
## [107,] 107 0.01939491
## [108,] 108 0.01900149
## [109,] 109 0.01786954
## [110,] 110 0.01898932
## [111,] 111 0.01834488
## [112,] 112 0.02143065
## [113,] 113 0.01818546
## [114,] 114 0.02083369
## [115,] 115 0.01964327
## [116,] 116 0.02279050
## [117,] 117 0.02321490
## [118,] 118 0.02545089
## [119,] 119 0.02694339
## [120,] 120 0.01808688
## [121,] 121 0.02096986
## [122,] 122 0.00992568
## [123,] 123 0.01157827
## [124,] 124 0.01327102
## [125,] 125 0.01512837
## [126,] 126 0.01707892
## [127,] 127 0.01640668
## [128,] 128 0.01548433
## [129,] 129 0.00878357
## [130,] 130 0.00480702
## [131,] 131 0.00410454
## [132,] 132 0.00471351
## [133,] 133 0.00304067
## [134,] 134 0.00355771
## [135,] 135 0.00414191
## [136,] 136 0.00439954
## [137,] 137 0.00397438
## [138,] 138 0.00462219
## [139,] 139 0.00506412
## [140,] 140 0.00473569
## [141,] 141 0.00548653
## [142,] 142 0.00636111
## [143,] 143 0.00537169
## [144,] 144 0.00490679
## [145,] 145 0.00563438
## [146,] 146 0.00563820
## [147,] 147 0.00569254
## [148,] 148 0.00624518
## [149,] 149 0.00691504
## [150,] 150 0.00488190
## [151,] 151 0.00435737
## [152,] 152 0.00500726
## [153,] 153 0.00361159
## [154,] 154 0.00351320
## [155,] 155 0.00412640
## [156,] 156 0.00483323
## [157,] 157 0.00521381
## [158,] 158 0.00602516
## [159,] 159 0.00697823
## [160,] 160 0.00544638
## [161,] 161 0.00492895
## [162,] 162 0.00333684
## [163,] 163 0.00374455
## [164,] 164 0.00370494
## [165,] 165 0.00400565
## [166,] 166 0.00466962
## [167,] 167 0.00537525
## [168,] 168 0.00310929
## [169,] 169 0.00360137
## [170,] 170 0.00405655
## [171,] 171 0.00199722
## [172,] 172 0.00229825
## [173,] 173 0.00265184
## [174,] 174 0.00302291
## [175,] 175 0.00140215
## [176,] 176 0.00162847
## [177,] 177 0.00189139
## [178,] 178 0.00168732
## [179,] 179 0.00130907
## [180,] 180 0.00153070
## [181,] 181 0.00152632
## [182,] 182 0.00166951
## [183,] 183 0.00195852
## [184,] 184 0.00192815
## [185,] 185 0.00191434
## [186,] 186 0.00176264
## [187,] 187 0.00171492
## [188,] 188 0.00190513
## [189,] 189 0.00219087
## [190,] 190 0.00251058
## [191,] 191 0.00264085
## [192,] 192 0.00306788
## [193,] 193 0.00347431
## [194,] 194 0.00335357
## [195,] 195 0.00301918
## [196,] 196 0.00348939
## [197,] 197 0.00350253
## [198,] 198 0.00264027
## [199,] 199 0.00303716
## [200,] 200 0.00313740
Box.test.2(arima(Res,order=c(1,0,1))$resid,50:100)
## Retard p-value
## [1,] 50 0.09195897
## [2,] 51 0.10187243
## [3,] 52 0.11927692
## [4,] 53 0.09784426
## [5,] 54 0.06355495
## [6,] 55 0.06946390
## [7,] 56 0.04954589
## [8,] 57 0.05400456
## [9,] 58 0.06303309
## [10,] 59 0.05755680
## [11,] 60 0.06817554
## [12,] 61 0.07577271
## [13,] 62 0.08889657
## [14,] 63 0.08547464
## [15,] 64 0.06437615
## [16,] 65 0.06937277
## [17,] 66 0.07861724
## [18,] 67 0.08792158
## [19,] 68 0.09695675
## [20,] 69 0.10804832
## [21,] 70 0.12016434
## [22,] 71 0.04516518
## [23,] 72 0.05287867
## [24,] 73 0.02775560
## [25,] 74 0.02649802
## [26,] 75 0.03167021
## [27,] 76 0.03759975
## [28,] 77 0.03080117
## [29,] 78 0.03459086
## [30,] 79 0.03796039
## [31,] 80 0.03164272
## [32,] 81 0.02938890
## [33,] 82 0.03305021
## [34,] 83 0.03712003
## [35,] 84 0.03237220
## [36,] 85 0.03434463
## [37,] 86 0.03402969
## [38,] 87 0.03827337
## [39,] 88 0.02231196
## [40,] 89 0.02559204
## [41,] 90 0.02636362
## [42,] 91 0.02726794
## [43,] 92 0.03173811
## [44,] 93 0.03548638
## [45,] 94 0.04035232
## [46,] 95 0.02540018
## [47,] 96 0.02040251
## [48,] 97 0.02330549
## [49,] 98 0.02696766
## [50,] 99 0.02944756
## [51,] 100 0.02557426
Box.test.2(auto.arima(Res)$resid,50:100)
## Retard p-value
## [1,] 50 0.09346401
## [2,] 51 0.10336602
## [3,] 52 0.12091108
## [4,] 53 0.09927051
## [5,] 54 0.06457142
## [6,] 55 0.07043285
## [7,] 56 0.05009859
## [8,] 57 0.05450484
## [9,] 58 0.06355937
## [10,] 59 0.05803287
## [11,] 60 0.06872407
## [12,] 61 0.07632071
## [13,] 62 0.08950299
## [14,] 63 0.08600802
## [15,] 64 0.06499824
## [16,] 65 0.07002851
## [17,] 66 0.07933502
## [18,] 67 0.08871186
## [19,] 68 0.09781844
## [20,] 69 0.10897573
## [21,] 70 0.12125671
## [22,] 71 0.04566720
## [23,] 72 0.05344535
## [24,] 73 0.02811488
## [25,] 74 0.02690795
## [26,] 75 0.03214834
## [27,] 76 0.03815775
## [28,] 77 0.03117161
## [29,] 78 0.03495497
## [30,] 79 0.03839613
## [31,] 80 0.03209982
## [32,] 81 0.02981095
## [33,] 82 0.03353323
## [34,] 83 0.03764910
## [35,] 84 0.03283023
## [36,] 85 0.03480320
## [37,] 86 0.03447519
## [38,] 87 0.03878742
## [39,] 88 0.02266026
## [40,] 89 0.02598352
## [41,] 90 0.02676777
## [42,] 91 0.02771040
## [43,] 92 0.03224663
## [44,] 93 0.03607985
## [45,] 94 0.04096646
## [46,] 95 0.02565625
## [47,] 96 0.02052135
## [48,] 97 0.02346452
## [49,] 98 0.02715650
## [50,] 99 0.02967256
## [51,] 100 0.02578657
Using the decompose command, which isolates the deterministic trend and seasonality of the series, does not improve the fit: the Ljung-Box white-noise test on the residuals is again rejected, for the AR and even the ARMA models. An ARMA model is therefore not appropriate for this series, even after preprocessing and decomposition. Other classes of time-series models would be needed.
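For daily log-returns, a natural candidate among those other models is a GARCH, which targets the volatility clustering visible in the chronogram. A minimal base-R sketch of the GARCH(1,1) conditional-variance recursion (the parameter values below are illustrative, not fitted):

```r
# GARCH(1,1) conditional-variance recursion:
#   sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}
garch_var <- function(r, omega = 1e-5, alpha = 0.1, beta = 0.85) {
  s2 <- numeric(length(r))
  s2[1] <- var(r)  # initialize at the unconditional variance
  for (t in 2:length(r))
    s2[t] <- omega + alpha * r[t - 1]^2 + beta * s2[t - 1]
  s2
}
set.seed(7)
ret <- rnorm(500, sd = 0.02)  # stand-in for the log-returns Clo
s2  <- garch_var(ret)
range(sqrt(s2))  # the conditional volatility path
```

In practice the parameters would be estimated by maximum likelihood, e.g. with a dedicated package.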
We now consider the CAC40 index.
CAC <- EuStockMarkets[,3]
plot(CAC)
We perform simple exponential smoothing:
y <- window(CAC,end=c(1997,260)) # data up to the end of 1997
z <- window(CAC,start=c(1998,1)) # data from 1998 onwards
l <- length(z)
ysmooth <- HoltWinters(y,beta=FALSE,gamma=FALSE)
ypred <- predict(ysmooth,n.ahead=l)
plot(ysmooth,ypred,main = "prediction for 1998")
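Behind HoltWinters(beta = FALSE, gamma = FALSE) lies the simple exponential smoothing recursion \(l_t = \alpha y_t + (1-\alpha) l_{t-1}\), with \(\alpha\) chosen by minimizing the in-sample squared error; a base-R sketch of the recursion itself (with an arbitrary \(\alpha\), on dummy values):

```r
# Simple exponential smoothing by hand: the level is an exponentially
# weighted average of past observations; forecasts are flat at the last level
ses_level <- function(y, alpha) {
  Reduce(function(l, yt) alpha * yt + (1 - alpha) * l, y, accumulate = TRUE)
}
y  <- c(10, 12, 11, 13, 12)
lv <- ses_level(y, alpha = 0.3)
tail(lv, 1)  # flat n-step-ahead forecast
```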
We now apply Holt's linear method (Holt-Winters with gamma = FALSE, i.e. no seasonal component):
ysmooth <- HoltWinters(y,gamma=FALSE)
ypred <- predict(ysmooth,n.ahead=l)
plot(ysmooth,ypred)
For this last method we compute and plot the prediction error:
Error <- z[1:10]-ypred[1:10]
plot(1:10,Error)
We see that the error exceeds 100 points for forecast horizons of 6 or more.