Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_regression_trees1.wasp
Title produced by software: Recursive Partitioning (Regression Trees)
Date of computation: Tue, 14 Dec 2010 09:14:24 +0000
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2010/Dec/14/t129231792850g86buolxelp0p.htm/, Retrieved Thu, 02 May 2024 16:43:53 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=109290, Retrieved Thu, 02 May 2024 16:43:53 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 201
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Recursive Partitioning (Regression Trees)] [] [2010-12-05 19:35:21] [b98453cac15ba1066b407e146608df68]
-   PD  [Recursive Partitioning (Regression Trees)] [WS10] [2010-12-10 14:09:52] [c7506ced21a6c0dca45d37c8a93c80e0]
-   P     [Recursive Partitioning (Regression Trees)] [WS10 - Crossvalid...] [2010-12-10 15:01:44] [4a7069087cf9e0eda253aeed7d8c30d6]
F   P         [Recursive Partitioning (Regression Trees)] [w10] [2010-12-14 09:14:24] [9d72585f2b7b60ae977d4816136e1c95] [Current]
Feedback Forum
2010-12-19 12:43:39 [48eb36e2c01435ad7e4ea7854a9d98fe]
The student correctly constructs a cross-validation table here.
For this, the software draws a random sample of 90% of all available data and builds a tree structure from it. For the remaining 10%, the software produces a prediction.

For his interpretation the student relies mainly on the 'testing' part. The first part of the table can, for example, be read as follows: in the 10 experiments (simulated by the software) there were (1237 + 1128 =) 2365 values in the lowest category. Of these, 1237 were predicted correctly. The value in the CV column thus indicates how accurately the values can be predicted; we see here that roughly 52% of the low values can be predicted correctly. There were (349 + 1779 =) 2128 values in the highest category. Of these, 1779 were predicted correctly. This means that roughly 84% of the values in the highest category can be predicted correctly.
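As a small illustration of the arithmetic in this feedback, the per-category accuracies quoted above can be recomputed directly from the cited counts. The counts below are those mentioned in the feedback (they presumably stem from a different random resampling run than the cross-validation table shown further down, since the 90/10 splits are drawn at random):

# Per-category accuracy, using the counts cited in the feedback above
low  <- c(correct = 1237, wrong = 1128)  # lowest category (2365 values in total)
high <- c(correct = 1779, wrong = 349)   # highest category (2128 values in total)
low['correct'] / sum(low)    # approx. 0.52: share of low values predicted correctly
high['correct'] / sum(high)  # approx. 0.84: share of high values predicted correctly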

Dataseries X:
3.66356	7.74414	-4.4	4.2	0	18	19	116
3.04452	8.03398	-5.7	4.8	-0.3	69.1	9	506
3.71357	4.70048	-13.5	4.3	0.2	80	3	95
2.94444	7.5251	1.4	3	0.1	177	22	161
4.06044	7.7626	4.1	5.6	1.1	287	7	80
3.68888	7.88683	5.8	2.3	-0.1	200	9	33
3.3322	7.81521	2.7	1.9	0.4	228	7	129
3.3673	7.77779	7.1	8.9	0.2	220	15	155
2.07944	6.89163	4.1	2	0.1	183	9	132
1.94591	7.6774	1.1	5.2	0.1	43.1	10	480
3.3322	5.71373	-9	3.4	0	80	6	98
3.21888	8.33471	1.6	4.4	-0.3	78.1	16	558
1.09861	4.65396	-0.4	5.7	0.8	267	3	121
5.3845	7.38895	-4.8	6	0	81	20	122
3.93183	7.81763	-0.9	1.4	-0.3	236	15	51
3.3673	4.96981	-0.5	0.6	1.7	341.1	4	552
2.83321	7.80384	-1	1.8	0	16	7	150
3.04452	7.62168	-2.3	5.2	-0.1	66.1	12	428
2.99573	5.21494	5.8	2.6	-0.2	68.8	3	582
4.21951	7.64204	5.1	2.2	1.1	199	18	538
3.04452	7.77149	10.8	2.4	-1.5	329.8	13	579
3.7612	8.31385	12.2	4	-2.8	230.4	17	572
3.58352	7.63964	-2.8	2.1	-1.6	210.2	11	512
2.48491	7.05531	11.2	3.3	2.3	302.8	23	604
2.48491	7.38275	13.6	0.6	-0.8	159.8	11	602
3.3322	7.56786	-12.3	3.7	-0.2	70	12	460
3.13549	7.92226	4.2	1.4	0.2	161	7	127
2.07944	7.38771	-2.8	4.5	0.1	53.3	21	417
3.68888	7.4793	3.4	7.2	0.5	209.1	12	477
2.99573	3.97029	-11.7	2.7	1.7	81	4	87
3.98898	8.05102	-2.5	3.2	0.1	83	9	51
3.04452	6.67834	3.9	4.8	-0.6	66.8	8	573
3.55535	7.34601	3	1.4	1	200	21	177
1.79176	7.54062	0	6.1	-0.2	207.5	19	489
3.04452	4.82028	-2.6	2.1	0.7	72.8	4	530
2.56495	7.80057	1	2.2	0	99	7	65
4.36945	7.41878	-0.2	1.1	3.2	138	22	49
3.46574	7.96032	7.7	4.8	-0.2	66	17	196
4.60517	8.23669	10.7	5.3	-0.9	205	17	192
3.09104	8.30721	9.3	7.3	-0.5	206	17	212
4.15888	8.26333	9.5	5	-0.9	208	17	208
1.79176	7.50329	1.6	4.5	0.1	117	19	62
2.48491	6.49527	-3.4	0.5	1.1	217	6	135
3.21888	7.74414	-1.3	4.2	-0.2	59	15	149
3.73767	6.75577	-3.2	2.6	0.6	70.5	21	398
4.2485	6.51915	-1.8	2.3	1.1	81	23	174
3.8712	4.77068	-0.5	0.5	0.7	358	4	402
4.00733	7.88721	7.6	4	-0.6	180	13	208
2.94444	5.42053	3.4	2.9	0.4	74.7	3	567
3.09104	3.80666	7.4	3.2	-0.1	225	3	205
1.94591	4.8752	-3.5	3.8	-0.1	60.5	3	418
3.52636	8.25088	19.8	4.8	-1.9	260	17	206
3.8712	7.85516	16.3	3.4	-0.1	225.3	20	605
2.99573	8.25427	-7.3	5	0	60	8	506
4.07754	7.36201	-0.6	0.8	1.1	88.3	21	422
3.21888	7.59388	-0.4	4.5	-0.2	64.6	13	430
3.49651	8.35208	17.8	5	-2.5	219.6	16	599
3.2581	4.62497	-1.9	2.2	2.2	268	4	172
2.99573	7.39879	-10	3.8	-0.1	78.4	20	449
2.70805	6.48616	-1.3	2.9	0.7	63.3	24	397
3.09104	6.20456	4.7	1.5	1.1	104.4	1	588
4.06044	7.64348	0.3	1.5	-0.3	172	13	130
3.61092	7.59337	8.8	2.2	0	186	21	577
2.56495	6.43455	-9.3	3.6	0.1	71.2	9	450
1.60944	7.00033	-2.4	5.8	0	61	22	418
5.0876	7.49499	2.9	1.2	2.1	271.1	21	543
2.30259	7.80221	2.5	5	0.1	135	7	128
3.2581	7.55276	13.4	3.9	0.3	216.8	19	606
3.66356	6.82979	-15.5	3.1	0.2	70.3	22	460
3.04452	7.49053	-1.6	2.2	-0.9	211.7	12	501
3.4012	7.59035	0.1	2.6	-0.1	73	11	66
3.04452	5.51745	-2.8	0.5	2.2	5	5	155
3.4012	8.18619	-0.9	1.3	0.6	57	8	67
3.55535	5.95842	-10.3	1.2	2.5	86	24	86
3.97029	7.55276	-4.1	0.7	-0.7	264.8	13	401
4.72739	7.77022	-3.8	4.9	-0.2	68	7	166
4.30407	7.99834	8.2	5.6	-0.6	210	15	212
2.56495	7.15305	-4.8	4.7	-0.1	74.6	23	441
3.78419	7.95997	-4.3	6	0.1	18	13	82
4.95583	6.57508	6.2	1.3	1.1	226	24	200
3.66356	7.88231	3.3	5.8	-0.1	46.9	19	557
5.26786	7.43603	-1	1.8	1.4	89	20	44
3.46574	4.95583	-4	1.7	1	65	2	165
2.07944	5.1299	7.8	1.6	1.7	184.3	4	604
3.3673	7.20638	-5.3	1.8	-0.2	80	13	451
3.29584	7.46107	2.2	3.1	0.1	62.3	13	470
2.19722	7.53209	14.8	4.2	-4.5	227.7	11	588
3.43399	8.13359	-4.8	1.5	0	196	17	100
3.2581	6.91771	12.8	1	0.3	103.5	24	602
2.83321	7.12206	-3.2	4.6	-0.1	62.2	12	427
3.63759	8.2845	0.7	0.8	-0.8	86	8	541
2.70805	7.56528	3.5	6.9	0.2	204.4	15	482
2.99573	7.65112	-6.8	2.2	-0.4	78.8	16	517
4.00733	7.67183	8.3	0.7	-1.3	206.2	14	532
3.43399	6.31173	-19	2.3	0.8	60.9	8	461
0.69315	7.0775	7	4	-0.2	54.3	10	578
3.52636	7.34148	-14.6	1.6	-0.4	228.9	14	457
2.63906	7.96346	3.6	3.2	1.2	268	18	42
1.94591	7.35116	3.5	1.9	0.2	215	20	107
3.71357	7.54115	5.2	0.9	-1	280	11	155
2.83321	6.64249	-8.3	2.6	0.2	60	23	85
3.43399	7.56579	0.6	3.8	-0.4	64.6	12	507
3.29584	8.1806	-0.5	0.7	-0.2	248	15	124
3.17805	7.60738	1	2.4	-0.1	192	13	38
4.81218	7.54327	-8	0.6	0.5	88	13	94
4.63473	8.20985	-3	1.5	2.7	89.6	8	470
3.04452	5.76205	5	3.2	0	58	6	195
3.8712	6.91274	-5.1	0.8	1.4	241	23	137
3.68888	8.19781	0.2	2.4	0.5	80	8	72
4.33073	7.6024	12.9	5.6	-0.8	46	13	188
4.61512	7.54327	2.6	2.6	-1.9	243	11	45
4.43082	7.70661	7.8	8	-1.2	33.7	13	556
3.4012	5.54518	-0.7	2	-0.3	254	1	542
2.48491	7.68432	7.7	1.5	-0.1	89	16	209
3.29584	7.90064	1.5	1.3	0.9	334.7	18	422
2.94444	7.76684	15.8	3.2	-0.2	125.5	17	595
3.3673	7.97797	4.3	5.7	-1	232	9	156
2.89037	7.89767	8.1	6.4	-0.5	199.4	7	577
4.90527	8.24065	11.6	6	-1.3	213	16	193
2.63906	7.07581	2.8	3.2	0.3	223	10	69
3.29584	7.7424	-4.4	4.9	-0.1	23	10	82
3.71357	6.97915	-5.1	3.3	1.4	81	23	98
4.39445	8.27741	2	2.5	-0.3	207.2	16	536
3.82864	8.01268	10	4.8	-0.5	83.2	9	600
2.56495	5.78996	1.1	4	-0.1	79	3	70
3.3322	7.45934	6.1	2.5	0	92	22	210
3.13549	6.22851	0.1	1.9	1.1	208.8	6	538
0.69315	6.45362	-2.6	4.9	0.2	38.3	23	405
4.00733	7.7178	7.9	0.4	0.5	276	19	184
3.2581	7.92696	4.8	9.4	0.6	204	14	474
3.2581	6.56244	2.6	2.8	-0.7	224.1	8	545
4.84419	8.17273	-9.6	1.1	1.3	191	16	94
2.63906	5.1299	-1.5	2.5	-0.1	245.1	4	510
1.38629	4.54329	3.8	5.8	0.7	201.9	3	473
3.2581	5.71043	1.8	2.7	0.9	181.9	1	527
2.30259	8.19257	-0.7	4.1	0.6	252	8	121
3.09104	5.00395	3.6	2.7	1.2	344.7	4	533
2.94444	7.39817	3.9	6.2	0.7	206	20	155
3.4012	6.63988	4.7	3.6	0	59	24	196
2.99573	6.29342	-2.5	4.9	-0.2	84	7	167
3.82864	7.4378	-9.9	4.5	-0.1	55	11	89
2.30259	7.7977	1.7	2	0.3	38	15	63
4.29046	7.33629	-12.8	3.5	0	79	19	97
4.41884	7.51589	4.3	2.1	0.6	32.3	20	555
3.3673	7.70841	2.1	3.1	0	79	10	64
2.30259	4.64439	-2.2	4.9	-0.1	77	2	114
3.98898	7.77402	3.3	1.8	-0.3	220	10	155
4.12713	5.05625	5.6	3.5	-0.1	78	4	184
3.68888	5.86363	-0.6	0.6	1.7	142	1	79
2.30259	5.52943	3.4	4.7	0.8	220.1	5	474
3.29584	7.91754	0.2	2.8	0.4	335	15	53
3.13549	6.74052	-6.4	2.8	-0.1	77	22	84
2.07944	6.53669	8	1.9	-0.2	54.6	8	578
5.25227	7.82844	3.6	0.8	1.5	87.7	19	529
3.78419	7.74846	-13.4	4	0.2	74.1	13	448
3.97029	5.71043	-2.1	1.3	1.8	87.8	1	485
2.99573	5.64897	-8.7	4.3	-0.2	74.5	1	450
2.56495	6.42972	1.6	1.3	1	58	8	76
3.21888	6.51323	-3.9	2.8	0.4	71.3	8	398
1.38629	7.93057	1.4	2.5	0.2	210.5	19	525
3.13549	7.59337	3.3	4.2	1.2	235	20	138
4.06044	7.19818	3	1.8	1.1	191	22	192
4.27667	8.23377	7.2	3.8	-0.2	31	15	173
3.49651	8.13212	-1.7	1.5	-0.1	68	8	515
3.21888	7.87284	0.8	5	0.1	76.2	19	514
4.07754	7.53102	0.1	3.5	1.2	66.3	20	397
4.20469	7.49276	4.5	6.6	-0.1	210	12	156
4.18965	5.65599	0.3	1.5	1.7	303.4	1	529
2.48491	7.07327	-0.9	1.6	-0.5	256.5	9	510
1.09861	6.33683	2.2	2.8	0.1	142	1	63
2.48491	7.21229	-0.6	2.1	0.1	60	21	134
2.48491	7.86978	3.2	2.6	1.3	286	20	42
0.69315	7.84031	4.1	2.7	0.2	175	19	56
4.34381	7.98344	-4.8	0.8	1.1	230.1	17	445
2.63906	7.03086	7.1	1.6	0.5	66.4	22	593
3.29584	7.90175	-4.4	3.8	0.1	15	9	82
4.17439	7.72223	7.1	3.9	-0.9	90	10	197
3.71357	8.19451	0	0.8	-0.3	254.2	16	500
4.02535	7.84698	11.6	4.2	-2.7	247	14	200
4.85203	7.58731	3.1	2.2	-3.4	227	11	148
4.64439	6.47235	1.5	1.2	1.3	101	24	191
3.63759	7.74932	15	7.3	-2.2	229.9	14	584
3.4012	8.12711	3.6	4.9	1.2	238	18	138
1.79176	5.68358	5.7	1.9	0.3	49	6	189
4.11087	7.58274	3.4	6.9	-0.1	194.9	11	548
3.3673	8.01434	10.1	2.4	-0.3	44	17	189
4.44265	7.65539	-2.4	2.6	0.7	78.3	18	398
2.77259	7.83716	0.2	4.5	0	80	7	157
3.71357	6.9921	-15	3.3	0.6	66	23	462
5.17048	8.08209	5.2	7.2	-1.3	202.7	14	537
3.09104	7.85554	4.1	3.4	0.1	79	7	170
2.70805	7.68064	6.3	9.4	0.2	248	16	160
4.11087	7.5438	8.7	4	-1.2	216.3	14	538
4.79579	8.30598	2.4	4.2	-0.7	65.1	8	556
2.3979	5.95324	-5.3	1	2.2	60	9	77
2.99573	6.2634	2.5	4.3	-0.1	79	24	69
3.13549	8.25036	9.3	5.4	-0.8	213	16	212
3.43399	7.61776	-5.6	3.1	-0.5	77.4	11	519
3.4012	7.44425	5.6	7	1	248.1	12	471
3.09104	7.87169	-1.7	0.5	0	218	18	51
4.45435	7.64492	-5.6	4.9	-0.1	56.7	7	409
4.18965	7.46851	-3.5	0.7	0.2	165.1	20	520
3.73767	8.18256	-11.2	3.7	0.2	78	16	96
4.20469	5.10595	-11.9	0.6	0.8	286.5	4	489
2.56495	5.79301	2.1	2.6	3.1	241	1	32
2.56495	7.14125	1	4.5	1.3	232	21	145
4.85981	7.69303	8.8	3.6	-2.5	85.4	12	547
5.25227	7.23056	-4.6	6.5	0.1	80	21	122
2.63906	6.0845	-5.8	2	-0.3	107.5	1	518
3.13549	5.72031	1	1.8	0.6	92	3	76
1.94591	6.81892	0.1	1.7	-0.1	214	11	112
3.4012	5.67332	-1.9	0.9	3.7	109.2	5	397
4.72739	8.26178	1	3.4	1.3	248	8	43
3.3673	5.92959	2.1	3.8	0.5	64.2	1	531
3.2581	7.53262	6.1	2.5	-0.1	182.5	21	572
3.73767	8.11761	0.2	0.8	0.9	11	16	113
1.79176	7.98446	1.9	5.8	0.2	9	9	80
4.06044	7.7463	11.8	6	-0.7	212.7	14	582
3.91202	8.19229	-4.4	1.5	0.5	87.9	8	423
0.69315	6.43294	1.9	2.4	0.1	133	24	62
1.60944	5.2832	8.6	1.8	1.5	242	3	34
4.46591	8.22336	9.1	9.4	-0.5	39.4	16	556
3.17805	7.68983	13	0.3	0.4	114	22	602
2.89037	6.70564	1.8	0.8	0.1	269	6	141
4.17439	7.58426	4	3.1	-1.7	103.7	12	554
3.73767	8.04943	-7.8	0.6	-0.1	177	17	95
4.00733	7.72886	8.7	5.5	-0.4	132.3	14	573
3.04452	7.51806	-4.9	4.1	-0.2	73.8	13	436
2.63906	7.58172	-8.8	8.1	-0.1	53	18	90
4.11087	8.28425	2.9	3.3	0.8	22.1	17	397
4.47734	8.01731	0.4	3.9	0.7	67.4	18	507
3.17805	7.67786	-0.7	4.3	-0.4	59	13	151
3.3322	7.8598	4.2	2.7	1.1	230.7	18	526
0.69315	4.79579	2.7	2.8	0.2	187	4	131
4.58497	7.66528	7.4	0.9	-0.2	55	13	184
0.69315	4.36945	0.6	4.6	0.1	350	3	171
3.4012	7.5251	6	3.5	-0.2	78.1	21	571
2.07944	7.73849	1.6	1.6	0.3	211.3	21	525
4.55388	7.62071	3.4	5.9	-0.6	45	13	164
3.21888	5.39363	3.3	1.7	-0.1	187	4	202
2.77259	5.87774	-5.4	5.1	-0.2	72.2	2	441
3.29584	5.8522	4.2	1.7	1	175	5	587
3.3322	8.02027	2	6	-0.1	152	18	61
4.02535	7.23706	-14.2	1.4	-0.1	238.1	18	457
4.38203	7.83281	9.4	2.9	-0.3	248.4	15	540
2.99573	4.60517	1.5	1.1	0.2	233.7	3	480
3.55535	8.11073	0.6	6.8	0.3	10	8	54
3.2581	7.57917	2.5	3	-0.2	83	13	64
4.47734	8.31361	-0.3	3	-0.8	71.8	8	554
4.06044	4.86753	1.4	0.7	1.5	79.4	4	563
3.21888	7.50879	-0.7	4	-0.5	65.1	11	507
4.46591	6.51323	-8.1	0.7	1.2	111	24	143
3.13549	6.6107	-13.1	3.5	0	83	9	97
2.30259	4.70048	6.3	6	0.9	219.2	2	586
3.61092	6.94986	-3.8	0.7	-0.2	258	23	74
3.13549	5.72685	-3.6	0.8	4	302.4	5	470
3.89182	8.16337	-1.1	1.5	-0.2	157.4	16	521
3.21888	7.7012	15.8	3.9	-0.4	119.5	15	595
3.17805	4.45435	-5.1	3.4	0.4	58.4	3	507
3.73767	8.28248	8.5	3.1	-0.7	238	8	205
2.3979	8.05833	-2.1	4.1	-0.2	82	8	114
3.68888	7.90027	1.5	0.5	-0.6	99.2	13	509
4.26268	7.34148	3	0.8	0.6	86	21	32
4.94164	7.90027	-12.6	2.2	0.2	77.8	19	488
3.29584	7.65917	3	3.1	-0.1	221	14	169
2.07944	5.60947	-5.2	2.9	0.1	60	2	90
2.19722	7.15227	3.5	0.9	0.2	265	22	169
2.07944	5.42495	2.8	1.3	0	75.8	2	404
4.54329	7.8071	6.5	4.3	-1.1	44	11	173
2.07944	7.94839	2.5	3.6	-0.1	54.7	16	574
4.36945	8.34069	6.3	4.2	-0.1	76	16	166
3.97029	7.54062	-1.4	1.7	-0.2	86	12	51
3.98898	5.89164	1.7	1.1	2.4	173.1	1	533
2.99573	7.52402	2.4	7.9	-0.1	43.5	21	557
3.52636	6.43294	-0.1	1.6	-0.2	75	22	55
2.99573	7.50219	3.1	1.7	0.3	269	12	69
5.39363	8.04045	12.9	8.3	-1.1	357.5	14	551
3.63759	7.31986	-13.4	3.8	0.1	75	11	448
2.30259	5.43372	0.7	3.9	0.1	182.9	2	491
4.92725	7.94058	2.4	1	0.4	110.7	7	533
2.94444	8.17808	-1.3	5.6	-0.2	46	17	149
2.70805	7.65728	-6.3	4	-0.1	51.6	19	408
2.07944	7.56164	3.5	1.8	0	71.2	21	576
3.4012	7.98752	0	4	-0.3	78.5	9	430
5.05625	7.63143	11.8	2.4	-1.5	232.5	14	535
4.93447	7.63192	5.8	1.9	-3.1	260.8	12	541
2.56495	7.6324	3.1	4.7	-0.1	48	12	576
3.04452	7.9248	1.4	2.1	0	221	17	525
1.38629	4.30407	0.8	3.1	1.6	77	3	50
2.77259	7.67555	6.8	9.9	-0.2	249	15	160
3.91202	7.1025	1.3	8	0	61.6	23	557
4.35671	7.95437	6.8	1.5	-0.6	124.9	7	585
3.4012	7.87474	-6.8	1	0	207	15	84
4.02535	7.85166	10.5	4.4	-0.9	69	15	198
3.73767	8.26049	3.8	5.2	0	222	8	163
2.94444	7.7411	-12.8	3.6	0.1	68.4	14	460
4.09434	5.24702	1.3	0.8	1.1	72.3	5	563
3.04452	5.24175	-4.2	1.4	3.6	230	2	155
3.17805	7.55224	-0.6	2.5	-0.1	75	13	111
2.56495	7.96242	-6.1	5.2	0.4	27	18	82
2.63906	6.85751	5.1	5.5	0.1	230	10	126
3.09104	8.1191	0.8	3	-0.1	78	17	65
3.4012	8.00068	-4.4	3.7	-0.1	78	14	68
2.19722	4.11087	-6.5	2.1	-0.1	68.2	5	451
3.7612	4.81218	-3.7	0.9	-0.1	281.3	4	513
2.77259	6.42811	0.7	2.9	1.4	229.6	24	473
3.4012	4.95583	-5.7	3.3	0.6	72	2	401
3.61092	6.16121	8.4	3.3	0.7	80.5	24	564
3.3673	6.55251	-0.2	2.8	0.7	71	23	72
0.69315	5.1299	-1.9	4.5	0.4	17	2	40
2.94444	5.18739	-7.3	1.9	0.5	85	3	118
4.17439	7.91608	0.1	2.1	2	77	18	49
2.83321	7.33954	-5.9	2.9	0.2	64.6	22	406
3.04452	7.0076	4.7	5.6	0	69	23	211
3.2581	7.87664	4.2	4.4	0.6	189.8	15	472
3.21888	7.23778	2.1	5.4	0.1	159	21	61
4.23411	7.64348	-0.5	2.1	-2	247.8	14	513
4.81218	7.08339	-3.7	1.1	0.1	192	22	101
4.39445	7.95718	6.5	6.2	-0.1	210.8	18	547
2.63906	6.42162	2.1	4.1	0.7	260	24	43
3.2581	7.29029	-3.1	1	-0.2	211	21	102
1.94591	5.42495	-12.8	0.8	1.2	55	6	91
2.94444	7.29029	3.8	4.3	0.5	210	22	105
2.56495	7.52348	3.4	6	0.2	191	14	482
3.21888	5.63479	-4.2	2.1	1.1	91	2	68
2.77259	7.97694	-1.7	5.2	0	76	8	124
4.61512	7.78197	5.4	3.6	-1.1	44	10	173
1.60944	5.743	3.4	4.9	0.7	243	2	140
2.07944	5.743	-0.2	5.6	0.2	16.6	2	553
3.8712	5.62762	2.1	1.5	2	227.8	5	533
3.17805	7.85477	-12.5	3.2	0.2	82.1	15	456
3.58352	7.56941	-4.1	4.3	-0.1	44.7	14	409
4.68213	7.08087	-9.8	0.5	0.2	131	21	83
2.56495	7.43603	-4.9	4.7	-0.1	79.6	21	441
2.30259	6.11368	-5.2	3.5	0	59	1	90
3.55535	7.97488	-1.6	3.5	0.5	34.5	18	480
3.46574	6.69084	6.4	1.2	0	188	6	207
2.48491	7.79523	2	2.8	1.6	242	18	127
4.21951	5.21494	-8.5	1.5	2.8	279	10	93
2.30259	7.65112	3.6	7.6	0.5	199.3	17	482
2.70805	6.12905	0.4	0.8	0.1	297	7	524
2.99573	8.08979	1.2	3.6	0.1	63.9	15	411
2.48491	5.76519	-3.5	3.2	0	240.2	2	504
3.13549	8.06401	3.6	2.3	-0.3	9.4	9	549
4.15888	7.39265	-14.9	1.9	0.8	65	20	91
1.79176	6.43294	1	2.5	-0.2	75	10	70
3.29584	7.62462	-1.7	1.6	-0.1	172	19	51
2.83321	5.22036	3.4	1.1	1.4	89	6	565
3.52636	6.35611	6.9	7	0.8	220	24	34
3.2581	4.47734	-4.8	2.5	0.4	84	3	101
3.98898	6.50279	4.2	3.9	0.6	78	24	197
1.60944	6.88959	1.5	4	0.1	240.4	22	523
3.49651	7.32909	-2.9	1.9	1.7	337.8	19	468
3.17805	7.3601	-8.2	4.7	-0.2	76.5	11	408
4.67283	7.77107	7	3.1	-0.3	73	10	185
3.43399	6.16121	8.1	1.9	-0.1	211.9	1	578
2.07944	7.97281	3.7	4.2	0.2	172	9	128
3.29584	7.90323	0.7	2.1	1.3	207.1	18	473
2.30259	8.16223	0.4	8.6	0.1	34	17	39
2.07944	4.83628	-10.4	1.9	1.1	75	7	86
0.69315	4.91998	6.5	4.5	0.4	206	2	57
3.13549	7.47591	9	4.6	-0.9	208.7	10	587
3.82864	7.52941	10.1	1.1	-3.1	255	11	188
3.8712	7.88796	-0.3	1.7	0.4	26.9	15	416
2.99573	4.47734	-4.4	2.6	0.8	85	4	38
3.2581	7.73281	1.7	1.5	1.1	345	19	158
3.3322	6.93245	-3.2	0.8	-0.1	260	22	68
3.49651	7.65634	-5.9	3.1	-0.1	77	16	85
1.94591	7.0193	3.7	3.6	0.1	195	22	107
4.57471	8.0762	9.9	2.2	0	92	16	177
3.91202	7.90175	7.4	4.3	-0.3	178	18	190
3.80666	6.89669	5.1	2.5	0.7	80	23	197
3.93183	4.30407	3.2	1.5	0.4	75	3	194
2.48491	5.15906	0	4.3	0.1	102.7	3	477
3.13549	7.82764	-5.6	6.1	-0.2	79.5	17	441
2.94444	7.02554	0.7	2.4	-0.2	232.5	23	542
3.4012	7.90175	15.1	2.9	0.2	113.9	19	567
1.94591	5.73334	-2.2	4.3	-0.1	72	5	124
3.17805	5.34711	-3.8	2.7	-0.2	80.5	3	511
3.21888	7.58477	4.2	6.5	0.4	214.3	11	474
2.63906	6.84055	-13.1	4.8	-0.2	52.8	11	462
3.3322	7.55381	-0.2	3.9	-0.2	79.6	12	430
2.70805	4.78749	1.5	3.2	0.1	153	4	61
3.78419	6.86485	0.8	1.5	-0.1	231.5	23	540
3.78419	7.81197	-6.6	2.2	1.3	87	7	122
1.60944	7.89469	2.1	4	-0.1	47.8	14	574
3.3673	7.76132	1.7	2.9	-0.1	81	14	60
3.04452	6.07764	-4.1	5.4	-0.1	66.5	1	427
4.45435	7.80954	18.2	3.7	-2.7	250.7	15	568
1.94591	5.4848	2.2	2	1.9	253.8	5	473
2.63906	5.27811	-7.8	4.6	-0.1	71.5	19	450
3.3322	7.76684	-5.4	3	-0.2	77	14	85
3.43399	5.5835	-4.5	4.1	0.3	59.4	1	507
3.89182	7.86634	8	6.1	0.1	197	19	212
2.19722	8.19644	2.7	3.8	0.2	139	8	128
4.59512	7.59186	-5.6	3.9	-0.2	41	13	116
3.49651	5.54126	4.6	0.9	0.5	108	5	190
2.83321	7.55799	12	4.4	0.2	274.2	20	584
2.19722	6.51471	4.4	2	0.7	205	6	33
2.89037	6.21461	4.5	3.5	0	55	9	196
3.4012	6.63988	0.2	0.5	1.4	84.8	24	483
2.99573	6.02345	-18.6	2.3	0.1	79.4	7	457
3.73767	7.62413	0.1	2.6	1.5	230	19	175
2.94444	5.84064	7.4	4.3	0.3	189.6	1	590
3.58352	8.32579	7.6	4.2	0	224.1	8	583
3.29584	7.32053	8.6	2.6	-0.1	189.2	22	577
4.8752	7.91571	-2.5	3.6	0	64	15	143
4.33073	7.7424	10.6	1.8	-1.6	245	14	177
3.29584	6.91771	-1.6	1.1	0	282	23	51
3.78419	7.78239	5	3.1	-0.4	230	16	545
3.61092	5.18739	5.2	4.4	0.8	206	4	126
2.83321	6.14204	5.3	2.3	1.2	153.1	1	587
3.7612	5.48064	-13.1	1.1	1.3	62.9	5	464
3.58352	5.2933	3.3	2.2	0.5	77.1	2	562
3.17805	5.4848	-0.8	3.5	0	176.8	5	522
3.52636	5.03044	-4	1.5	-0.1	108.2	2	521
4.21951	7.6406	1.7	2.3	-0.3	85	15	167
3.93183	8.29255	6.1	4.2	0.3	23	17	173
2.30259	5.70378	5.8	2.8	0	43	7	189
3.2581	5.4848	5.7	3.3	0.5	228	5	204
2.70805	7.02997	14.9	2.8	-5	255.5	10	594
3.55535	5.31321	-5.7	2.2	0.6	82	5	165
1.94591	6.14419	0.2	3.1	0.1	73	9	56
4.36945	5.14749	-9	1.3	0.4	77	2	88
4.70048	7.66388	16.8	6	-1.1	272	14	205
2.48491	5.8944	2.8	4.7	0	66	2	574
1.60944	7.61085	-0.1	6.6	-0.3	205.2	14	489
3.78419	7.52294	-12.5	0.8	-0.4	232	19	463
1.94591	4.82831	-7.8	3.7	0.1	58.4	4	407
3.09104	7.54908	0.9	1	-0.6	213.1	12	484
3.21888	7.75833	-4.2	4.6	-0.1	78	10	120
2.48491	7.50659	21.9	3.6	-2.5	264.1	19	608
2.19722	6.07764	-10.1	2.8	0	73	1	89
4.93447	7.01571	-9.4	1.2	0	73.8	23	437
4.04305	7.59488	11.2	1.7	-2	236	12	188
4.51086	8.20576	1.8	3.2	0.2	216.6	8	551
1.94591	4.59512	0.5	5.5	0.1	354	4	171
3.09104	8.23297	-2.2	1.6	-0.3	211.7	16	501
4.60517	8.31532	-0.1	5.2	-0.8	79	8	555
3.63759	7.57096	8.4	5.4	-1.8	240	11	163
3.46574	7.61036	0.9	3	-0.7	340	12	142
3.8712	8.21528	-2.9	1.4	-0.2	113.4	8	521
5.15329	8.19146	2.3	2.5	0.6	16	17	44
3.04452	7.67322	-1.4	4.4	-0.1	38.1	14	410
3.49651	7.78655	0.4	1.5	-0.1	110	13	124
2.56495	4.69135	-11.3	2.5	1.2	82	5	87
3.85015	7.629	0.1	3.9	-0.2	70.7	14	506
2.48491	5.49717	-5	4	-0.1	75.4	5	440
2.07944	6.57368	1.6	2	1.2	212.7	6	472
2.99573	7.90618	-2.8	2.3	-0.1	205	20	511
1.38629	6.55251	-3.5	3.9	-0.1	46.3	24	409
3.17805	8.22013	2	3.5	-0.2	82	8	64
2.19722	5.42935	2.4	3.6	0.1	125	5	128
3.55535	7.89096	7.1	3.4	-0.3	82.5	19	571
3.8712	8.02453	-2.7	1.4	1.6	66	9	73
3.4012	7.14283	-1.3	1.2	0.6	66	21	113
3.04452	7.97039	5.5	3.9	-0.5	231	14	131
4.34381	6.65801	5.5	1.1	0.9	138	24	199
2.3979	7.10003	-4	5.8	-0.4	78.4	11	399
2.63906	7.65681	-0.2	2.4	0	41	19	157
2.56495	4.70953	-2.8	1.8	1.4	198.9	3	537
3.3322	5.60212	3.8	3.9	-0.2	85	5	187
2.56495	7.85127	0.2	2.8	0.1	78	17	112
3.68888	7.89655	12.8	6.8	-0.6	227.7	15	583
2.3979	7.20638	-0.8	1.6	1.7	192.6	22	469
2.3979	4.89035	-10.8	1.5	0.2	57	3	460
4.68213	7.56735	-10.2	0.6	2.4	110.8	10	465
3.04452	7.61135	0.8	3.1	-0.1	65.6	19	429
3.04452	6.66823	4.1	1.4	0.2	139.8	24	572
2.19722	5.0689	4.3	4.9	0.1	82	2	212
3.17805	7.46107	-2.8	1.4	-0.1	243	20	102
4.48864	7.68018	-0.4	0.8	2.6	147	21	49
2.30259	7.80344	2.4	3.9	0.2	14.3	20	574
3.29584	8.26256	3.4	0.9	0.3	228	16	109
3.17805	7.6695	12.1	2.8	-3.2	256	13	203
4.06044	5.33272	-2.5	2	1.3	80.7	3	532
4.15888	6.30992	-4.5	2	0.3	90	24	100
3.09104	6.67456	2.2	7.1	0.1	37.9	6	557
4.41884	8.09316	-4.7	1.4	1.2	276.8	18	445
2.89037	6.56667	0.1	2.1	-0.2	247.4	6	543
1.09861	5.83481	-5.8	7.7	0.1	47	7	90
3.43399	7.794	3.3	4.3	0.8	217	17	105
3.52636	7.76089	2	1.7	1.1	234	15	121
3.13549	7.67276	1.3	4.2	0.2	169	10	61
2.19722	7.979	1.4	5	-0.1	40.7	9	480
3.13549	7.78406	6.1	1.9	0.4	171	21	210
2.07944	5.95842	-3.1	4.2	-0.1	52.5	2	426
3.63759	4.54329	-11.5	1.7	3.7	90.4	4	465
3.73767	5.62762	-6.6	1	-0.1	200	2	445
2.48491	5.66643	-3.8	2.1	0.6	73	3	77
1.94591	7.54539	6.5	9.4	-0.9	250	11	160
1.94591	6.23832	-0.4	3.3	0.3	215.1	24	470
2.77259	4.56435	-6.3	2	2.3	223	3	148
2.70805	6.58203	2.2	1.8	0.1	64.2	6	549
1.79176	5.31321	-4.9	4.2	0.3	353	2	117
2.30259	5.61677	-1.3	2.8	-0.1	65.2	1	486
4.11087	7.7111	-5.1	0.7	0.3	60	10	99
3.4012	6.2519	0.1	1	0.2	87	24	111
3.68888	7.85516	6.5	5.2	-0.2	69	19	196
4.17439	8.24512	8.6	1.6	-1	258.8	15	530




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 16 seconds
R Server: 'George Udny Yule' @ 72.249.76.132

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 16 seconds \tabularnewline
R Server & 'George Udny Yule' @ 72.249.76.132 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=109290&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]16 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'George Udny Yule' @ 72.249.76.132[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=109290&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=109290&T=0








10-Fold Cross Validation
          Prediction (training)       Prediction (testing)
Actual    C1      C2      CV          C1     C2     CV
C1        1271    1059    0.5455      134    136    0.4963
C2        421     1738    0.805       52     189    0.7842
Overall   -       -       0.6703      -      -      0.6321

\begin{tabular}{lllllllll}
\hline
10-Fold Cross Validation \tabularnewline
 & Prediction (training) & Prediction (testing) \tabularnewline
Actual & C1 & C2 & CV & C1 & C2 & CV \tabularnewline
C1 & 1271 & 1059 & 0.5455 & 134 & 136 & 0.4963 \tabularnewline
C2 & 421 & 1738 & 0.805 & 52 & 189 & 0.7842 \tabularnewline
Overall & - & - & 0.6703 & - & - & 0.6321 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=109290&T=1

[TABLE]
[ROW][C]10-Fold Cross Validation[/C][/ROW]
[ROW][C][/C][C]Prediction (training)[/C][C]Prediction (testing)[/C][/ROW]
[ROW][C]Actual[/C][C]C1[/C][C]C2[/C][C]CV[/C][C]C1[/C][C]C2[/C][C]CV[/C][/ROW]
[ROW][C]C1[/C][C]1271[/C][C]1059[/C][C]0.5455[/C][C]134[/C][C]136[/C][C]0.4963[/C][/ROW]
[ROW][C]C2[/C][C]421[/C][C]1738[/C][C]0.805[/C][C]52[/C][C]189[/C][C]0.7842[/C][/ROW]
[ROW][C]Overall[/C][C]-[/C][C]-[/C][C]0.6703[/C][C]-[/C][C]-[/C][C]0.6321[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=109290&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=109290&T=1








Confusion Matrix (predicted in columns / actuals in rows)
          C1      C2
C1        123     137
C2        33      207

\begin{tabular}{lllllllll}
\hline
Confusion Matrix (predicted in columns / actuals in rows) \tabularnewline
 & C1 & C2 \tabularnewline
C1 & 123 & 137 \tabularnewline
C2 & 33 & 207 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=109290&T=2

[TABLE]
[ROW][C]Confusion Matrix (predicted in columns / actuals in rows)[/C][/ROW]
[ROW][C][/C][C]C1[/C][C]C2[/C][/ROW]
[ROW][C]C1[/C][C]123[/C][C]137[/C][/ROW]
[ROW][C]C2[/C][C]33[/C][C]207[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=109290&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=109290&T=2

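In the R code below, this confusion matrix is obtained by cross-tabulating the actual classes against the predictions of the tree fitted on the full sample (the object myt). A minimal sketch of that step and of the overall in-sample accuracy, assuming m, x and par1 are defined as in the module code:

# Sketch: confusion matrix and overall in-sample accuracy for the fitted tree m
pred <- predict(m)                          # predicted class for every observation
myt  <- table(as.factor(x[, par1]), pred)   # actuals in rows, predictions in columns
sum(diag(myt)) / sum(myt)                   # e.g. (123 + 207) / 500 = 0.66 for the matrix above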



Parameters (Session):
par1 = 1 ; par2 = quantiles ; par3 = 2 ; par4 = yes ;
Parameters (R input):
par1 = 1 ; par2 = quantiles ; par3 = 2 ; par4 = yes ;
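With par2 = quantiles and par3 = 2, the endogenous series is first discretised into two classes of roughly equal size (split near the sample median) using cut2 from the Hmisc package; the tree is then grown on the resulting factor. A minimal, self-contained sketch with made-up values (y and bins here are purely illustrative, not the data above):

# Illustration of the quantile binning applied before the tree is grown
library(Hmisc)
y <- c(3.66, 3.04, 3.71, 2.94, 4.06, 1.95, 2.08, 5.38)  # toy values
bins <- cut2(y, g = 2)  # two quantile groups, split near the median
table(bins)             # roughly equal class sizes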
R code (references can be found in the software module):
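# party provides ctree() (conditional inference trees); Hmisc provides cut2() for quantile binning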
library(party)
library(Hmisc)
par1 <- as.numeric(par1)
par3 <- as.numeric(par3)
x <- data.frame(t(y))
is.data.frame(x)
x <- x[!is.na(x[,par1]),]
k <- length(x[1,])
n <- length(x[,1])
colnames(x)[par1]
x[,par1]
if (par2 == 'kmeans') {
cl <- kmeans(x[,par1], par3)
print(cl)
clm <- matrix(cbind(cl$centers,1:par3),ncol=2)
clm <- clm[sort.list(clm[,1]),]
for (i in 1:par3) {
cl$cluster[cl$cluster==clm[i,2]] <- paste('C',i,sep='')
}
cl$cluster <- as.factor(cl$cluster)
print(cl$cluster)
x[,par1] <- cl$cluster
}
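# par2 == 'quantiles': bin the response into par3 groups of roughly equal size with cut2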
if (par2 == 'quantiles') {
x[,par1] <- cut2(x[,par1],g=par3)
}
if (par2 == 'hclust') {
hc <- hclust(dist(x[,par1])^2, 'cen')
print(hc)
memb <- cutree(hc, k = par3)
dum <- c(mean(x[memb==1,par1]))
for (i in 2:par3) {
dum <- c(dum, mean(x[memb==i,par1]))
}
hcm <- matrix(cbind(dum,1:par3),ncol=2)
hcm <- hcm[sort.list(hcm[,1]),]
for (i in 1:par3) {
memb[memb==hcm[i,2]] <- paste('C',i,sep='')
}
memb <- as.factor(memb)
print(memb)
x[,par1] <- memb
}
if (par2=='equal') {
ed <- cut(as.numeric(x[,par1]),par3,labels=paste('C',1:par3,sep=''))
x[,par1] <- as.factor(ed)
}
table(x[,par1])
colnames(x)
colnames(x)[par1]
x[,par1]
if (par2 == 'none') {
m <- ctree(as.formula(paste(colnames(x)[par1],' ~ .',sep='')),data = x)
}
load(file='createtable')
if (par2 != 'none') {
m <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data = x)
if (par4=='yes') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'10-Fold Cross Validation',3+2*par3,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
a<-table.element(a,'Prediction (training)',par3+1,TRUE)
a<-table.element(a,'Prediction (testing)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Actual',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,paste('C',jjj,sep=''),1,TRUE)
a<-table.element(a,'CV',1,TRUE)
a<-table.row.end(a)
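# 10 random 90/10 splits: fit the tree on the 90% training part, predict both parts, and pool predictions and actuals across iterations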
for (i in 1:10) {
ind <- sample(2, nrow(x), replace=T, prob=c(0.9,0.1))
m.ct <- ctree(as.formula(paste('as.factor(',colnames(x)[par1],') ~ .',sep='')),data =x[ind==1,])
if (i==1) {
m.ct.i.pred <- predict(m.ct, newdata=x[ind==1,])
m.ct.i.actu <- x[ind==1,par1]
m.ct.x.pred <- predict(m.ct, newdata=x[ind==2,])
m.ct.x.actu <- x[ind==2,par1]
} else {
m.ct.i.pred <- c(m.ct.i.pred,predict(m.ct, newdata=x[ind==1,]))
m.ct.i.actu <- c(m.ct.i.actu,x[ind==1,par1])
m.ct.x.pred <- c(m.ct.x.pred,predict(m.ct, newdata=x[ind==2,]))
m.ct.x.actu <- c(m.ct.x.actu,x[ind==2,par1])
}
}
print(m.ct.i.tab <- table(m.ct.i.actu,m.ct.i.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.i.tab[i,i] / sum(m.ct.i.tab[i,]))
numer <- numer + m.ct.i.tab[i,i]
}
print(m.ct.i.cp <- numer / sum(m.ct.i.tab))
print(m.ct.x.tab <- table(m.ct.x.actu,m.ct.x.pred))
numer <- 0
for (i in 1:par3) {
print(m.ct.x.tab[i,i] / sum(m.ct.x.tab[i,]))
numer <- numer + m.ct.x.tab[i,i]
}
print(m.ct.x.cp <- numer / sum(m.ct.x.tab))
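# One table row per class: training counts and accuracy, then testing counts and accuracy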
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (jjj in 1:par3) a<-table.element(a,m.ct.i.tab[i,jjj])
a<-table.element(a,round(m.ct.i.tab[i,i]/sum(m.ct.i.tab[i,]),4))
for (jjj in 1:par3) a<-table.element(a,m.ct.x.tab[i,jjj])
a<-table.element(a,round(m.ct.x.tab[i,i]/sum(m.ct.x.tab[i,]),4))
a<-table.row.end(a)
}
a<-table.row.start(a)
a<-table.element(a,'Overall',1,TRUE)
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.i.cp,4))
for (jjj in 1:par3) a<-table.element(a,'-')
a<-table.element(a,round(m.ct.x.cp,4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
}
}
m
bitmap(file='test1.png')
plot(m)
dev.off()
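# Plot the response grouped by the terminal node in which each observation ends up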
bitmap(file='test1a.png')
plot(x[,par1] ~ as.factor(where(m)),main='Response by Terminal Node',xlab='Terminal Node',ylab='Response')
dev.off()
if (par2 == 'none') {
forec <- predict(m)
result <- as.data.frame(cbind(x[,par1],forec,x[,par1]-forec))
colnames(result) <- c('Actuals','Forecasts','Residuals')
print(result)
}
if (par2 != 'none') {
print(cbind(as.factor(x[,par1]),predict(m)))
myt <- table(as.factor(x[,par1]),predict(m))
print(myt)
}
bitmap(file='test2.png')
if(par2=='none') {
op <- par(mfrow=c(2,2))
plot(density(result$Actuals),main='Kernel Density Plot of Actuals')
plot(density(result$Residuals),main='Kernel Density Plot of Residuals')
plot(result$Forecasts,result$Actuals,main='Actuals versus Predictions',xlab='Predictions',ylab='Actuals')
plot(density(result$Forecasts),main='Kernel Density Plot of Predictions')
par(op)
}
if(par2!='none') {
plot(myt,main='Confusion Matrix',xlab='Actual',ylab='Predicted')
}
dev.off()
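# Regression case (par2 == 'none'): report correlation, R-squared and RMSE of forecasts versus actuals, plus the full list of actuals, predictions and residuals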
if (par2 == 'none') {
detcoef <- cor(result$Forecasts,result$Actuals)
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goodness of Fit',2,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Correlation',1,TRUE)
a<-table.element(a,round(detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'R-squared',1,TRUE)
a<-table.element(a,round(detcoef*detcoef,4))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'RMSE',1,TRUE)
a<-table.element(a,round(sqrt(mean((result$Residuals)^2)),4))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Actuals, Predictions, and Residuals',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'#',header=TRUE)
a<-table.element(a,'Actuals',header=TRUE)
a<-table.element(a,'Forecasts',header=TRUE)
a<-table.element(a,'Residuals',header=TRUE)
a<-table.row.end(a)
for (i in 1:length(result$Actuals)) {
a<-table.row.start(a)
a<-table.element(a,i,header=TRUE)
a<-table.element(a,result$Actuals[i])
a<-table.element(a,result$Forecasts[i])
a<-table.element(a,result$Residuals[i])
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable.tab')
}
if (par2 != 'none') {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Confusion Matrix (predicted in columns / actuals in rows)',par3+1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'',1,TRUE)
for (i in 1:par3) {
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
}
a<-table.row.end(a)
for (i in 1:par3) {
a<-table.row.start(a)
a<-table.element(a,paste('C',i,sep=''),1,TRUE)
for (j in 1:par3) {
a<-table.element(a,myt[i,j])
}
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
}