Author's title:
Author: (the author of this computation has been verified)
R Software Module: rwasp_regression_trees1.wasp
Title produced by software: Recursive Partitioning (Regression Trees)
Date of computation: Fri, 24 Dec 2010 15:51:31 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2010/Dec/24/t1293205769d7it04y3rygxzur.htm/, Retrieved Tue, 30 Apr 2024 06:42:29 +0000
Alternative citation: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=115153, Retrieved Tue, 30 Apr 2024 06:42:29 +0000

Original text written by user:
Is Private? No (this computation is public)
User-defined keywords:
Estimated Impact: 96
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Recursive Partitioning (Regression Trees)] [Recursive partiti...] [2010-12-24 15:51:31] [278a0539dc236556c5f30b5bc56ff9eb] [Current]

Dataseries X:
3.48	4143
3.6	4429
3.66	5219
3.45	4929
3.3	5761
3.14	5592
3.21	4163
3.12	4962
3.14	5208
3.4	4755
3.42	4491
3.29	5732
3.49	5731
3.52	5040
3.81	6102
4.03	4904
3.98	5369
4.1	5578
3.96	4619
3.83	4731
3.72	5011
3.82	5299
3.76	4146
3.98	4625
4.14	4736
4	4219
4.13	5116
4.28	4205
4.46	4121
4.63	5103
4.49	4300
4.41	4578
4.5	3809
4.39	5657
4.33	4248
4.45	3830
4.17	4736
4.13	4839
4.33	4411
4.47	4570
4.63	4104
4.9	4801
4.77	3953
4.51	3828
4.63	4440
4.36	4026
3.95	4109
3.74	4785
4.15	3224
4.14	3552
3.97	3940
3.81	3913
4.07	3681
3.84	4309
3.63	3830
3.55	4143
3.6	4087
3.63	3818
3.55	3380
3.69	3430
3.53	3458
3.43	3970
3.4	5260
3.41	5024
3.09	5634
3.35	6549
3.22	4676




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 3 seconds
R Server: 'RServer@AstonUniversity' @ vre.aston.ac.uk
Source: https://freestatistics.org/blog/index.php?pk=115153&T=0


Goodness of Fit
Correlation: 0.3889
R-squared: 0.1512
RMSE: 0.4249
Source: https://freestatistics.org/blog/index.php?pk=115153&T=1
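These statistics are computed from the actuals $y_i$ and the in-sample forecasts $\hat{y}_i$ listed in the table below (see the R code at the end of this page): the correlation $r$ between forecasts and actuals, its square, and the root mean squared residual,
\[
r = \operatorname{cor}(\hat{y}, y), \qquad R^2 = r^2, \qquad \mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}.
\]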



\begin{tabular}{lllllllll}
\hline
Actuals, Predictions, and Residuals \tabularnewline
# & Actuals & Forecasts & Residuals \tabularnewline
1 & 3.48 & 4.00282608695652 & -0.522826086956522 \tabularnewline
2 & 3.6 & 4.00282608695652 & -0.402826086956522 \tabularnewline
3 & 3.66 & 3.61619047619048 & 0.0438095238095237 \tabularnewline
4 & 3.45 & 3.61619047619048 & -0.166190476190476 \tabularnewline
5 & 3.3 & 3.61619047619048 & -0.316190476190477 \tabularnewline
6 & 3.14 & 3.61619047619048 & -0.476190476190476 \tabularnewline
7 & 3.21 & 4.00282608695652 & -0.792826086956522 \tabularnewline
8 & 3.12 & 3.61619047619048 & -0.496190476190476 \tabularnewline
9 & 3.14 & 3.61619047619048 & -0.476190476190476 \tabularnewline
10 & 3.4 & 4.00282608695652 & -0.602826086956522 \tabularnewline
11 & 3.42 & 4.00282608695652 & -0.582826086956522 \tabularnewline
12 & 3.29 & 3.61619047619048 & -0.326190476190476 \tabularnewline
13 & 3.49 & 3.61619047619048 & -0.126190476190476 \tabularnewline
14 & 3.52 & 3.61619047619048 & -0.0961904761904764 \tabularnewline
15 & 3.81 & 3.61619047619048 & 0.193809523809524 \tabularnewline
16 & 4.03 & 4.00282608695652 & 0.0271739130434785 \tabularnewline
17 & 3.98 & 3.61619047619048 & 0.363809523809524 \tabularnewline
18 & 4.1 & 3.61619047619048 & 0.483809523809523 \tabularnewline
19 & 3.96 & 4.00282608695652 & -0.0428260869565218 \tabularnewline
20 & 3.83 & 4.00282608695652 & -0.172826086956522 \tabularnewline
21 & 3.72 & 3.61619047619048 & 0.103809523809524 \tabularnewline
22 & 3.82 & 3.61619047619048 & 0.203809523809523 \tabularnewline
23 & 3.76 & 4.00282608695652 & -0.242826086956522 \tabularnewline
24 & 3.98 & 4.00282608695652 & -0.0228260869565218 \tabularnewline
25 & 4.14 & 4.00282608695652 & 0.137173913043478 \tabularnewline
26 & 4 & 4.00282608695652 & -0.00282608695652176 \tabularnewline
27 & 4.13 & 3.61619047619048 & 0.513809523809523 \tabularnewline
28 & 4.28 & 4.00282608695652 & 0.277173913043478 \tabularnewline
29 & 4.46 & 4.00282608695652 & 0.457173913043478 \tabularnewline
30 & 4.63 & 3.61619047619048 & 1.01380952380952 \tabularnewline
31 & 4.49 & 4.00282608695652 & 0.487173913043478 \tabularnewline
32 & 4.41 & 4.00282608695652 & 0.407173913043478 \tabularnewline
33 & 4.5 & 4.00282608695652 & 0.497173913043478 \tabularnewline
34 & 4.39 & 3.61619047619048 & 0.773809523809523 \tabularnewline
35 & 4.33 & 4.00282608695652 & 0.327173913043478 \tabularnewline
36 & 4.45 & 4.00282608695652 & 0.447173913043478 \tabularnewline
37 & 4.17 & 4.00282608695652 & 0.167173913043478 \tabularnewline
38 & 4.13 & 4.00282608695652 & 0.127173913043478 \tabularnewline
39 & 4.33 & 4.00282608695652 & 0.327173913043478 \tabularnewline
40 & 4.47 & 4.00282608695652 & 0.467173913043478 \tabularnewline
41 & 4.63 & 4.00282608695652 & 0.627173913043478 \tabularnewline
42 & 4.9 & 4.00282608695652 & 0.897173913043479 \tabularnewline
43 & 4.77 & 4.00282608695652 & 0.767173913043478 \tabularnewline
44 & 4.51 & 4.00282608695652 & 0.507173913043478 \tabularnewline
45 & 4.63 & 4.00282608695652 & 0.627173913043478 \tabularnewline
46 & 4.36 & 4.00282608695652 & 0.357173913043479 \tabularnewline
47 & 3.95 & 4.00282608695652 & -0.0528260869565216 \tabularnewline
48 & 3.74 & 4.00282608695652 & -0.262826086956522 \tabularnewline
49 & 4.15 & 4.00282608695652 & 0.147173913043479 \tabularnewline
50 & 4.14 & 4.00282608695652 & 0.137173913043478 \tabularnewline
51 & 3.97 & 4.00282608695652 & -0.0328260869565216 \tabularnewline
52 & 3.81 & 4.00282608695652 & -0.192826086956522 \tabularnewline
53 & 4.07 & 4.00282608695652 & 0.0671739130434785 \tabularnewline
54 & 3.84 & 4.00282608695652 & -0.162826086956522 \tabularnewline
55 & 3.63 & 4.00282608695652 & -0.372826086956522 \tabularnewline
56 & 3.55 & 4.00282608695652 & -0.452826086956522 \tabularnewline
57 & 3.6 & 4.00282608695652 & -0.402826086956522 \tabularnewline
58 & 3.63 & 4.00282608695652 & -0.372826086956522 \tabularnewline
59 & 3.55 & 4.00282608695652 & -0.452826086956522 \tabularnewline
60 & 3.69 & 4.00282608695652 & -0.312826086956522 \tabularnewline
61 & 3.53 & 4.00282608695652 & -0.472826086956522 \tabularnewline
62 & 3.43 & 4.00282608695652 & -0.572826086956522 \tabularnewline
63 & 3.4 & 3.61619047619048 & -0.216190476190476 \tabularnewline
64 & 3.41 & 3.61619047619048 & -0.206190476190476 \tabularnewline
65 & 3.09 & 3.61619047619048 & -0.526190476190477 \tabularnewline
66 & 3.35 & 3.61619047619048 & -0.266190476190476 \tabularnewline
67 & 3.22 & 4.00282608695652 & -0.782826086956522 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=115153&T=2




Parameters (Session):
par1 = 1 ; par2 = none ; par4 = yes ;
Parameters (R input):
par1 = 1 ; par2 = none ; par3 = ; par4 = yes ;
R code (references can be found in the software module):
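# party supplies ctree(), the conditional inference tree used below; Hmisc supplies cut2() for quantile binning of the response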
library(party)
library(Hmisc)
par1 <- as.numeric(par1)
par3 <- as.numeric(par3)
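# build a data frame from the transposed input series and drop rows where the selected (par1) column is missing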
x <- data.frame(t(y))
is.data.frame(x)
x <- x[!is.na(x[,par1]),]
k <- length(x[1,])
n <- length(x[,1])
colnames(x)[par1]
x[,par1]
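# if par2 is not 'none', discretize the par1 column into par3 classes:
# k-means, quantiles (cut2), hierarchical clustering, or equal-width intervals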
if (par2 == 'kmeans') {
cl <- kmeans(x[,par1], par3)
print(cl)
clm <- matrix(cbind(cl$centers,1:par3),ncol=2)
clm <- clm[sort.list(clm[,1]),]
for (i in 1:par3) {
cl$cluster[cl$cluster==clm[i,2]] <- paste('C',i,sep='')
}
cl$cluster <- as.factor(cl$cluster)
print(cl$cluster)
x[,par1] <- cl$cluster
}
if (par2 == 'quantiles') {
x[,par1] <- cut2(x[,par1],g=par3)
}
if (par2 == 'hclust') {
hc <- hclust(dist(x[,par1])^2, 'cen')
print(hc)
memb <- cutree(hc, k = par3)
dum <- c(mean(x[memb==1,par1]))
for (i in 2:par3) {
dum <- c(dum, mean(x[memb==i,par1]))
}
hcm <- matrix(cbind(dum,1:par3),ncol=2)
hcm <- hcm[sort.list(hcm[,1]),]
for (i in 1:par3) {
memb[memb==hcm[i,2]] <- paste('C',i,sep='')
}
memb <- as.factor(memb)
print(memb)
x[,par1] <- memb
}
if (par2=='equal') {
ed <- cut(as.numeric(x[,par1]),par3,labels=paste('C',1:par3,sep=''))
x[,par1] <- as.factor(ed)
}
table(x[,par1])
colnames(x)
colnames(x)[par1]
x[,par1]
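# par2 == 'none': fit a conditional inference regression tree with column par1 as the response and all other columns as predictors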
if (par2 == 'none') {
m <- ctree(as.formula(paste(colnames(x)[par1],' ~ .',sep='')),data = x)
}
load(file='createtable')
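# par2 != 'none': fit a classification tree on the binned response; if par4 == 'yes',
# repeat a random 90%/10% train/test split ten times and tabulate training and testing confusion matrices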
if (par2 != 'none') {
m <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data = x)
if (par4=='yes') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'10-Fold Cross Validation',3+2*par3,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
a<-table.element(a,'Prediction (training)',par3+1,TRUE)
a<-table.element(a,'Prediction (testing)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Actual',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
a<-table.row.end(a)
for (i in 1:10) {
ind <- sample(2, nrow(x), replace=T, prob=c(0.9,0.1))
m.ct <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data =x[ind==1,])
if (i==1) {
m.ct.i.pred <- predict(m.ct, newdata=x[ind==1,])
m.ct.i.actu <- x[ind==1,par1]
m.ct.x.pred <- predict(m.ct, newdata=x[ind==2,])
m.ct.x.actu <- x[ind==2,par1]
} else {
m.ct.i.pred <- c(m.ct.i.pred,predict(m.ct, newdata=x[ind==1,]))
m.ct.i.actu <- c(m.ct.i.actu,x[ind==1,par1])
m.ct.x.pred <- c(m.ct.x.pred,predict(m.ct, newdata=x[ind==2,]))
m.ct.x.actu <- c(m.ct.x.actu,x[ind==2,par1])
}
}
print(m.ct.i.tab <- table(m.ct.i.actu,m.ct.i.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.i.tab[i,i] / sum(m.ct.i.tab[i,]))
numer <- numer + m.ct.i.tab[i,i]
}
print(m.ct.i.cp <- numer / sum(m.ct.i.tab))
print(m.ct.x.tab <- table(m.ct.x.actu,m.ct.x.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.x.tab[i,i] / sum(m.ct.x.tab[i,]))
numer <- numer + m.ct.x.tab[i,i]
}
print(m.ct.x.cp <- numer / sum(m.ct.x.tab))
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (jjj in 1:par3) a<-table.element(a,m.ct.i.tab[i,jjj])
a<-table.element(a,round(m.ct.i.tab[i,i]/sum(m.ct.i.tab[i,]),4))
for (jjj in 1:par3) a<-table.element(a,m.ct.x.tab[i,jjj])
a<-table.element(a,round(m.ct.x.tab[i,i]/sum(m.ct.x.tab[i,]),4))
a<-table.row.end(a)
}
a<-table.row.start(a)
a<-table.element(a,'Overall',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.i.cp,4))
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.x.cp,4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
}
}
m
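# plot the fitted tree and the distribution of the response within each terminal node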
bitmap(file='test1.png')
plot(m)
dev.off()
bitmap(file='test1a.png')
plot(x[,par1] ~ as.factor(where(m)),main='Response by Terminal Node',xlab='Terminal Node',ylab='Response')
dev.off()
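# regression case: collect actuals, in-sample forecasts, and residuals;
# classification case: cross-tabulate actuals against predicted classes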
if (par2 == 'none') {
forec <- predict(m)
result <- as.data.frame(cbind(x[,par1],forec,x[,par1]-forec))
colnames(result) <- c('Actuals','Forecasts','Residuals')
print(result)
}
if (par2 != 'none') {
print(cbind(as.factor(x[,par1]),predict(m)))
myt <- table(as.factor(x[,par1]),predict(m))
print(myt)
}
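# diagnostic plots: kernel densities of actuals, residuals, and predictions plus actuals versus
# predictions (regression case), or a mosaic plot of the confusion matrix (classification case)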
bitmap(file='test2.png')
if(par2=='none') {
op <- par(mfrow=c(2,2))
plot(density(result$Actuals),main='Kernel Density Plot of Actuals')
plot(density(result$Residuals),main='Kernel Density Plot of Residuals')
plot(result$Forecasts,result$Actuals,main='Actuals versus Predictions',xlab='Predictions',ylab='Actuals')
plot(density(result$Forecasts),main='Kernel Density Plot of Predictions')
par(op)
}
if(par2!='none') {
plot(myt,main='Confusion Matrix',xlab='Actual',ylab='Predicted')
}
dev.off()
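# regression case: write the goodness-of-fit table (correlation, R-squared, RMSE)
# and the table of actuals, forecasts, and residuals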
if (par2 == 'none') {
detcoef <- cor(result$Forecasts,result$Actuals)
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goodness of Fit',2,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Correlation',1,TRUE)
a<-table.element(a,round(detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'R-squared',1,TRUE)
a<-table.element(a,round(detcoef*detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'RMSE',1,TRUE)
a<-table.element(a,round(sqrt(mean((result$Residuals)^2)),4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Actuals, Predictions, and Residuals',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'#',header=TRUE)
a<-table.element(a,'Actuals',header=TRUE)
a<-table.element(a,'Forecasts',header=TRUE)
a<-table.element(a,'Residuals',header=TRUE)
a<-table.row.end(a)
for (i in 1:length(result$Actuals)) {
a<-table.row.start(a)
a<-table.element(a,i,header=TRUE)
a<-table.element(a,result$Actuals[i])
a<-table.element(a,result$Forecasts[i])
a<-table.element(a,result$Residuals[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
}
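# classification case: write the confusion matrix table (predicted classes in columns, actuals in rows)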
if (par2 != 'none') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Confusion Matrix (predicted in columns / actuals in rows)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
for (i in 1:par3) {
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
}
a<-table.row.end(a)
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (j in 1:par3) {
a<-table.element(a,myt[i,j])
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
}
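
To rerun this analysis outside the FreeStatistics.org engine, the following is a minimal, illustrative sketch. It assumes the two columns of Dataseries X above have been saved to a tab-separated file named dataseries.txt (a hypothetical file name), takes the first column as the response (par1 = 1), and applies no binning of the response (par2 = 'none'), mirroring the parameters of this run; exact forecasts may differ slightly across versions of the party package.

library(party)   # ctree(): conditional inference trees, as used by the module above

# hypothetical input file holding the two tab-separated columns of Dataseries X
x <- read.table('dataseries.txt', header = FALSE, col.names = c('Y', 'X'))

# par1 = 1 and par2 = 'none': regression tree with the first column as response
m <- ctree(Y ~ ., data = x)

result <- data.frame(Actuals = x$Y, Forecasts = as.numeric(predict(m)))
result$Residuals <- result$Actuals - result$Forecasts

# goodness-of-fit statistics as reported in the table above
detcoef <- cor(result$Forecasts, result$Actuals)
round(c(Correlation = detcoef,
        'R-squared' = detcoef^2,
        RMSE = sqrt(mean(result$Residuals^2))), 4)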