Free Statistics


Author: *The author of this computation has been verified*
R Software Module: rwasp_regression_trees1.wasp
Title produced by software: Recursive Partitioning (Regression Trees)
Date of computation: Tue, 14 Dec 2010 08:57:51 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2010/Dec/14/t1292316962omo77gjztkqrcrd.htm/, Retrieved Thu, 02 May 2024 23:21:49 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=109271, Retrieved Thu, 02 May 2024 23:21:49 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 173
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Recursive Partitioning (Regression Trees)] [] [2010-12-05 18:59:57] [b98453cac15ba1066b407e146608df68]
-   PD    [Recursive Partitioning (Regression Trees)] [WS 10 - Recursive...] [2010-12-14 08:57:51] [89d441ae0711e9b79b5d358f420c1317] [Current]
Dataseries X:
100.44	1576.23	29.29	710.45
100.88	1546.37	28.99	720
101.42	1545.05	28.91	720
99.97	1552.34	29.29	720
100.56	1594.3	30.96	754.78
99.51	1605.78	30.57	802.73
98.96	1673.21	30.59	845.24
100.85	1612.94	31.39	893.91
100.66	1566.34	31.28	931.43
100.22	1530.17	31.1	940
100.30	1582.54	31.7	947.73
100.73	1702.16	32.57	960
101.46	1701.93	32.49	996.96
101.35	1811.15	32.46	1000
101.14	1924.2	32.3	1000
101.68	2034.25	32.97	1000
101.47	2011.13	32.9	1013.04
100.59	2013.04	32.93	1095.24
101.18	2151.67	33.72	1159.09
100.87	1902.09	33.33	1200
99.79	1944.01	33.44	1200
100.74	1916.67	33.89	1282.61
99.34	1967.31	34.34	1513.64
100.07	2119.88	33.56	1669.05
103.68	2216.38	32.67	1700
103.52	2522.83	32.57	1700
104.68	2647.64	33.23	1700
103.75	2631.23	32.85	1665.91
103.70	2693.41	32.61	1650
102.98	3021.76	32.57	1650
106.30	2953.67	32.98	1619.57
107.21	2796.8	31.33	1599.05
106.83	2672.05	29.8	1572.73
105.60	2251.23	28.06	1470
104.30	2046.08	25.47	1268
104.43	2420.04	24.65	1217.39
104.36	2608.89	23.94	1154.09
106.21	2660.47	23.89	984
107.34	2493.98	23.54	900
106.92	2541.7	24.28	900
104.80	2554.6	25.51	916.67
103.85	2699.61	27.03	957.73
103.39	2805.48	27.09	966.09
103.38	2956.66	27.3	980
103.93	3149.51	27.11	990.91
104.41	3372.5	26.39	1000.91
104.47	3379.33	27.54	1042.38
103.84	3517.54	26.85	1142.61
103.65	3527.34	26.82	1214.29
103.17	3281.06	25.9	1218
103.40	3089.65	24.96	1202.61
112.72	3222.76	25.4	1200
114.77	3165.76	24.38	1228.57
116.18	3232.43	24.73	1195.91
116.93	3229.54	25.43	1180
115.19	3071.74	26.04	1210.91
114.55	2850.17	25.59	1272.27
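A conditional inference tree (R package party, function ctree) is fitted with the first column of this series as the response; each observation's forecast is then the mean response of its terminal node. The core idea of recursive partitioning can be illustrated with a minimal pure-Python sketch of a single split (illustrative only: ctree selects splits via permutation tests, whereas this sketch simply minimizes squared error; the helper names are hypothetical):

```python
def best_split(x, y):
    """Find the threshold on predictor x that minimizes total
    within-leaf squared error of the response y (single split)."""
    pairs = sorted(zip(x, y))
    best_sse, best_thr = float("inf"), None
    for i in range(1, len(pairs)):
        left = [v for _, v in pairs[:i]]
        right = [v for _, v in pairs[i:]]
        # within-leaf sum of squared errors around each leaf mean
        sse = sum((v - sum(left) / len(left)) ** 2 for v in left) \
            + sum((v - sum(right) / len(right)) ** 2 for v in right)
        if sse < best_sse:
            best_sse = sse
            best_thr = (pairs[i - 1][0] + pairs[i][0]) / 2
    return best_thr

def leaf_mean_predict(x, y, thr, x_new):
    """Predict the mean response of the leaf that x_new falls into."""
    left = [v for xi, v in zip(x, y) if xi <= thr]
    right = [v for xi, v in zip(x, y) if xi > thr]
    leaf = left if x_new <= thr else right
    return sum(leaf) / len(leaf)
```

For example, with predictor [1, 2, 3, 10, 11, 12] and response [1, 1, 1, 5, 5, 5], the best threshold is 6.5, and the two leaf predictions are the leaf means 1.0 and 5.0 — the same mechanism that produces the two constant forecast values (100.85… and 106.66) in the residual table below.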




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 4 seconds
R Server: 'Sir Ronald Aylmer Fisher' @ 193.190.124.24

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 4 seconds \tabularnewline
R Server & 'Sir Ronald Aylmer Fisher' @ 193.190.124.24 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=109271&T=0


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=109271&T=0








Goodness of Fit
Correlation: 0.6612
R-squared: 0.4372
RMSE: 3.2822
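These three statistics are related: the R-squared reported by the module is the squared correlation between actuals and forecasts (0.6612² ≈ 0.4372), and RMSE is the root mean squared residual. A short Python sketch of the underlying formulas (generic helpers, not the module's own code):

```python
import math

def correlation(a, f):
    """Pearson correlation between actuals a and forecasts f."""
    n = len(a)
    ma, mf = sum(a) / n, sum(f) / n
    cov = sum((ai - ma) * (fi - mf) for ai, fi in zip(a, f))
    return cov / math.sqrt(sum((ai - ma) ** 2 for ai in a)
                           * sum((fi - mf) ** 2 for fi in f))

def rmse(residuals):
    """Root mean squared error of the residuals (actuals - forecasts)."""
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

# R-squared in the table is the squared correlation:
assert round(0.6612 ** 2, 4) == 0.4372
```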

\begin{tabular}{lllllllll}
\hline
Goodness of Fit \tabularnewline
Correlation & 0.6612 \tabularnewline
R-squared & 0.4372 \tabularnewline
RMSE & 3.2822 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=109271&T=1


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=109271&T=1








Actuals, Predictions, and Residuals
#	Actuals	Forecasts	Residuals
1	100.44	100.852307692308	-0.412307692307692
2	100.88	100.852307692308	0.0276923076923055
3	101.42	100.852307692308	0.567692307692312
4	99.97	100.852307692308	-0.882307692307691
5	100.56	100.852307692308	-0.292307692307688
6	99.51	100.852307692308	-1.34230769230768
7	98.96	100.852307692308	-1.89230769230770
8	100.85	100.852307692308	-0.00230769230769567
9	100.66	100.852307692308	-0.192307692307693
10	100.22	100.852307692308	-0.632307692307691
11	100.3	100.852307692308	-0.552307692307693
12	100.73	100.852307692308	-0.122307692307686
13	101.46	100.852307692308	0.607692307692304
14	101.35	100.852307692308	0.497692307692304
15	101.14	100.852307692308	0.287692307692311
16	101.68	100.852307692308	0.827692307692317
17	101.47	100.852307692308	0.617692307692309
18	100.59	100.852307692308	-0.262307692307687
19	101.18	100.852307692308	0.327692307692317
20	100.87	100.852307692308	0.0176923076923146
21	99.79	100.852307692308	-1.06230769230768
22	100.74	100.852307692308	-0.112307692307695
23	99.34	100.852307692308	-1.51230769230769
24	100.07	100.852307692308	-0.782307692307697
25	103.68	100.852307692308	2.82769230769232
26	103.52	106.66	-3.14
27	104.68	106.66	-1.97999999999999
28	103.75	106.66	-2.91000000000000
29	103.7	106.66	-2.95999999999999
30	102.98	106.66	-3.67999999999999
31	106.3	106.66	-0.359999999999999
32	107.21	106.66	0.549999999999997
33	106.83	106.66	0.170000000000002
34	105.6	106.66	-1.06000000000000
35	104.3	100.852307692308	3.44769230769231
36	104.43	106.66	-2.22999999999999
37	104.36	106.66	-2.30000000000000
38	106.21	106.66	-0.450000000000003
39	107.34	106.66	0.680000000000007
40	106.92	106.66	0.260000000000005
41	104.8	106.66	-1.86
42	103.85	106.66	-2.81
43	103.39	106.66	-3.27000000000000
44	103.38	106.66	-3.28
45	103.93	106.66	-2.72999999999999
46	104.41	106.66	-2.25
47	104.47	106.66	-2.19000000000000
48	103.84	106.66	-2.81999999999999
49	103.65	106.66	-3.00999999999999
50	103.17	106.66	-3.48999999999999
51	103.4	106.66	-3.25999999999999
52	112.72	106.66	6.06
53	114.77	106.66	8.11
54	116.18	106.66	9.52000000000001
55	116.93	106.66	10.27
56	115.19	106.66	8.53
57	114.55	106.66	7.89

\begin{tabular}{lllllllll}
\hline
Actuals, Predictions, and Residuals \tabularnewline
# & Actuals & Forecasts & Residuals \tabularnewline
1 & 100.44 & 100.852307692308 & -0.412307692307692 \tabularnewline
2 & 100.88 & 100.852307692308 & 0.0276923076923055 \tabularnewline
3 & 101.42 & 100.852307692308 & 0.567692307692312 \tabularnewline
4 & 99.97 & 100.852307692308 & -0.882307692307691 \tabularnewline
5 & 100.56 & 100.852307692308 & -0.292307692307688 \tabularnewline
6 & 99.51 & 100.852307692308 & -1.34230769230768 \tabularnewline
7 & 98.96 & 100.852307692308 & -1.89230769230770 \tabularnewline
8 & 100.85 & 100.852307692308 & -0.00230769230769567 \tabularnewline
9 & 100.66 & 100.852307692308 & -0.192307692307693 \tabularnewline
10 & 100.22 & 100.852307692308 & -0.632307692307691 \tabularnewline
11 & 100.3 & 100.852307692308 & -0.552307692307693 \tabularnewline
12 & 100.73 & 100.852307692308 & -0.122307692307686 \tabularnewline
13 & 101.46 & 100.852307692308 & 0.607692307692304 \tabularnewline
14 & 101.35 & 100.852307692308 & 0.497692307692304 \tabularnewline
15 & 101.14 & 100.852307692308 & 0.287692307692311 \tabularnewline
16 & 101.68 & 100.852307692308 & 0.827692307692317 \tabularnewline
17 & 101.47 & 100.852307692308 & 0.617692307692309 \tabularnewline
18 & 100.59 & 100.852307692308 & -0.262307692307687 \tabularnewline
19 & 101.18 & 100.852307692308 & 0.327692307692317 \tabularnewline
20 & 100.87 & 100.852307692308 & 0.0176923076923146 \tabularnewline
21 & 99.79 & 100.852307692308 & -1.06230769230768 \tabularnewline
22 & 100.74 & 100.852307692308 & -0.112307692307695 \tabularnewline
23 & 99.34 & 100.852307692308 & -1.51230769230769 \tabularnewline
24 & 100.07 & 100.852307692308 & -0.782307692307697 \tabularnewline
25 & 103.68 & 100.852307692308 & 2.82769230769232 \tabularnewline
26 & 103.52 & 106.66 & -3.14 \tabularnewline
27 & 104.68 & 106.66 & -1.97999999999999 \tabularnewline
28 & 103.75 & 106.66 & -2.91000000000000 \tabularnewline
29 & 103.7 & 106.66 & -2.95999999999999 \tabularnewline
30 & 102.98 & 106.66 & -3.67999999999999 \tabularnewline
31 & 106.3 & 106.66 & -0.359999999999999 \tabularnewline
32 & 107.21 & 106.66 & 0.549999999999997 \tabularnewline
33 & 106.83 & 106.66 & 0.170000000000002 \tabularnewline
34 & 105.6 & 106.66 & -1.06000000000000 \tabularnewline
35 & 104.3 & 100.852307692308 & 3.44769230769231 \tabularnewline
36 & 104.43 & 106.66 & -2.22999999999999 \tabularnewline
37 & 104.36 & 106.66 & -2.30000000000000 \tabularnewline
38 & 106.21 & 106.66 & -0.450000000000003 \tabularnewline
39 & 107.34 & 106.66 & 0.680000000000007 \tabularnewline
40 & 106.92 & 106.66 & 0.260000000000005 \tabularnewline
41 & 104.8 & 106.66 & -1.86 \tabularnewline
42 & 103.85 & 106.66 & -2.81 \tabularnewline
43 & 103.39 & 106.66 & -3.27000000000000 \tabularnewline
44 & 103.38 & 106.66 & -3.28 \tabularnewline
45 & 103.93 & 106.66 & -2.72999999999999 \tabularnewline
46 & 104.41 & 106.66 & -2.25 \tabularnewline
47 & 104.47 & 106.66 & -2.19000000000000 \tabularnewline
48 & 103.84 & 106.66 & -2.81999999999999 \tabularnewline
49 & 103.65 & 106.66 & -3.00999999999999 \tabularnewline
50 & 103.17 & 106.66 & -3.48999999999999 \tabularnewline
51 & 103.4 & 106.66 & -3.25999999999999 \tabularnewline
52 & 112.72 & 106.66 & 6.06 \tabularnewline
53 & 114.77 & 106.66 & 8.11 \tabularnewline
54 & 116.18 & 106.66 & 9.52000000000001 \tabularnewline
55 & 116.93 & 106.66 & 10.27 \tabularnewline
56 & 115.19 & 106.66 & 8.53 \tabularnewline
57 & 114.55 & 106.66 & 7.89 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=109271&T=2


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=109271&T=2




Parameters (Session):
par1 = 1 ; par2 = none ; par3 = 0 ; par4 = no ;
Parameters (R input):
par1 = 1 ; par2 = none ; par3 = 0 ; par4 = no ;
R code (references can be found in the software module):
library(party)  # ctree(): conditional inference trees
library(Hmisc)  # cut2(): quantile-based binning of the response
par1 <- as.numeric(par1)  # column index of the response variable
par3 <- as.numeric(par3)  # number of bins/clusters when the response is discretized
x <- data.frame(t(y))  # y is supplied by the server; transpose to observations-by-variables
is.data.frame(x)
x <- x[!is.na(x[,par1]),]
k <- length(x[1,])
n <- length(x[,1])
colnames(x)[par1]
x[,par1]
if (par2 == 'kmeans') {
cl <- kmeans(x[,par1], par3)
print(cl)
clm <- matrix(cbind(cl$centers,1:par3),ncol=2)
clm <- clm[sort.list(clm[,1]),]
for (i in 1:par3) {
cl$cluster[cl$cluster==clm[i,2]] <- paste('C',i,sep='')
}
cl$cluster <- as.factor(cl$cluster)
print(cl$cluster)
x[,par1] <- cl$cluster
}
if (par2 == 'quantiles') {
x[,par1] <- cut2(x[,par1],g=par3)
}
if (par2 == 'hclust') {
hc <- hclust(dist(x[,par1])^2, 'cen')
print(hc)
memb <- cutree(hc, k = par3)
dum <- c(mean(x[memb==1,par1]))
for (i in 2:par3) {
dum <- c(dum, mean(x[memb==i,par1]))
}
hcm <- matrix(cbind(dum,1:par3),ncol=2)
hcm <- hcm[sort.list(hcm[,1]),]
for (i in 1:par3) {
memb[memb==hcm[i,2]] <- paste('C',i,sep='')
}
memb <- as.factor(memb)
print(memb)
x[,par1] <- memb
}
if (par2=='equal') {
ed <- cut(as.numeric(x[,par1]),par3,labels=paste('C',1:par3,sep=''))
x[,par1] <- as.factor(ed)
}
table(x[,par1])
colnames(x)
colnames(x)[par1]
x[,par1]
if (par2 == 'none') {
m <- ctree(as.formula(paste(colnames(x)[par1],' ~ .',sep='')),data = x)
}
load(file='createtable')  # server-side helpers providing the table.* functions
if (par2 != 'none') {
m <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data = x)
if (par4=='yes') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'10-Fold Cross Validation',3+2*par3,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
a<-table.element(a,'Prediction (training)',par3+1,TRUE)
a<-table.element(a,'Prediction (testing)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Actual',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
a<-table.row.end(a)
for (i in 1:10) {
ind <- sample(2, nrow(x), replace=T, prob=c(0.9,0.1))  # random ~90/10 train/test assignment (repeated subsampling)
m.ct <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data =x[ind==1,])
if (i==1) {
m.ct.i.pred <- predict(m.ct, newdata=x[ind==1,])
m.ct.i.actu <- x[ind==1,par1]
m.ct.x.pred <- predict(m.ct, newdata=x[ind==2,])
m.ct.x.actu <- x[ind==2,par1]
} else {
m.ct.i.pred <- c(m.ct.i.pred,predict(m.ct, newdata=x[ind==1,]))
m.ct.i.actu <- c(m.ct.i.actu,x[ind==1,par1])
m.ct.x.pred <- c(m.ct.x.pred,predict(m.ct, newdata=x[ind==2,]))
m.ct.x.actu <- c(m.ct.x.actu,x[ind==2,par1])
}
}
print(m.ct.i.tab <- table(m.ct.i.actu,m.ct.i.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.i.tab[i,i] / sum(m.ct.i.tab[i,]))
numer <- numer + m.ct.i.tab[i,i]
}
print(m.ct.i.cp <- numer / sum(m.ct.i.tab))
print(m.ct.x.tab <- table(m.ct.x.actu,m.ct.x.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.x.tab[i,i] / sum(m.ct.x.tab[i,]))
numer <- numer + m.ct.x.tab[i,i]
}
print(m.ct.x.cp <- numer / sum(m.ct.x.tab))
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (jjj in 1:par3) a<-table.element(a,m.ct.i.tab[i,jjj])
a<-table.element(a,round(m.ct.i.tab[i,i]/sum(m.ct.i.tab[i,]),4))
for (jjj in 1:par3) a<-table.element(a,m.ct.x.tab[i,jjj])
a<-table.element(a,round(m.ct.x.tab[i,i]/sum(m.ct.x.tab[i,]),4))
a<-table.row.end(a)
}
a<-table.row.start(a)
a<-table.element(a,'Overall',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.i.cp,4))
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.x.cp,4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
}
}
m
bitmap(file='test1.png')
plot(m)
dev.off()
bitmap(file='test1a.png')
plot(x[,par1] ~ as.factor(where(m)),main='Response by Terminal Node',xlab='Terminal Node',ylab='Response')
dev.off()
if (par2 == 'none') {
forec <- predict(m)  # regression-tree forecasts: terminal-node means of the response
result <- as.data.frame(cbind(x[,par1],forec,x[,par1]-forec))
colnames(result) <- c('Actuals','Forecasts','Residuals')
print(result)
}
if (par2 != 'none') {
print(cbind(as.factor(x[,par1]),predict(m)))
myt <- table(as.factor(x[,par1]),predict(m))
print(myt)
}
bitmap(file='test2.png')
if(par2=='none') {
op <- par(mfrow=c(2,2))
plot(density(result$Actuals),main='Kernel Density Plot of Actuals')
plot(density(result$Residuals),main='Kernel Density Plot of Residuals')
plot(result$Forecasts,result$Actuals,main='Actuals versus Predictions',xlab='Predictions',ylab='Actuals')
plot(density(result$Forecasts),main='Kernel Density Plot of Predictions')
par(op)
}
if(par2!='none') {
plot(myt,main='Confusion Matrix',xlab='Actual',ylab='Predicted')
}
dev.off()
if (par2 == 'none') {
detcoef <- cor(result$Forecasts,result$Actuals)
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goodness of Fit',2,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Correlation',1,TRUE)
a<-table.element(a,round(detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'R-squared',1,TRUE)
a<-table.element(a,round(detcoef*detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'RMSE',1,TRUE)
a<-table.element(a,round(sqrt(mean((result$Residuals)^2)),4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Actuals, Predictions, and Residuals',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'#',header=TRUE)
a<-table.element(a,'Actuals',header=TRUE)
a<-table.element(a,'Forecasts',header=TRUE)
a<-table.element(a,'Residuals',header=TRUE)
a<-table.row.end(a)
for (i in 1:length(result$Actuals)) {
a<-table.row.start(a)
a<-table.element(a,i,header=TRUE)
a<-table.element(a,result$Actuals[i])
a<-table.element(a,result$Forecasts[i])
a<-table.element(a,result$Residuals[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
}
if (par2 != 'none') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Confusion Matrix (predicted in columns / actuals in rows)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
for (i in 1:par3) {
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
}
a<-table.row.end(a)
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (j in 1:par3) {
a<-table.element(a,myt[i,j])
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
}
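Note that the "10-Fold Cross Validation" table (produced when the response is discretized and par4 = 'yes') actually pools ten random ~90/10 splits rather than ten disjoint folds, because sample(2, nrow(x), replace=T, prob=c(0.9,0.1)) assigns each row to train or test independently on every repetition. That validation loop can be sketched in Python as follows (fit and predict are caller-supplied stand-ins for ctree; the helper name is hypothetical):

```python
import random

def repeated_holdout(rows, labels, fit, predict, reps=10, p_train=0.9, seed=0):
    """Pool test-set (actual, predicted) pairs over `reps` random splits and
    return overall accuracy, mirroring the module's loop: each row is assigned
    to the training set independently with probability p_train, so the test
    sets may overlap across repetitions (not strict 10-fold CV)."""
    rng = random.Random(seed)
    pooled = []
    for _ in range(reps):
        train_idx = [i for i in range(len(rows)) if rng.random() < p_train]
        test_idx = [i for i in range(len(rows)) if i not in train_idx]
        model = fit([rows[i] for i in train_idx], [labels[i] for i in train_idx])
        for i in test_idx:
            pooled.append((labels[i], predict(model, rows[i])))
    correct = sum(1 for actual, pred in pooled if actual == pred)
    return correct / len(pooled) if pooled else float("nan")
```

As a sanity check, a trivial majority-class "model" (fit returns the most common training label, predict ignores the row) scores perfect pooled accuracy when every label is identical.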