41 统计学习介绍
统计学习介绍的主要参考书: (James et al. 2013): Gareth James, Daniela Witten, Trevor Hastie, Robert Tibshirani(2013) An Introduction to Statistical Learning: with Applications in R, Springer.
调入需要的扩展包:
library(leaps) # 全子集回归
library(ISLR) # 参考书对应的包
library(glmnet) # 岭回归和lasso
library(tree) # 树回归
library(randomForest) # 随机森林和装袋法
library(MASS)
library(gbm) # boosting
library(e1071) # svm
41.1 统计学习的基本概念和一般步骤
41.1.1 统计学习的基本概念和方法
统计学习(statistical learning), 也有数据挖掘(data mining),机器学习(machine learning)等称呼。 主要目的是用一些计算机算法从大量数据中发现知识。 方兴未艾的数据科学就以统计学习为重要支柱。 方法分为有监督(supervised)学习与无监督(unsupervised)学习。
无监督学习方法如聚类问题、购物篮问题、主成分分析等。
有监督学习即统计中回归分析和判别分析解决的问题, 现在又有树回归、树判别、随机森林、lasso、支持向量机、 神经网络、贝叶斯网络、排序算法等许多方法。
无监督学习在给了数据之后, 直接从数据中发现规律, 比如聚类分析是发现数据中的聚集和分组现象, 购物篮分析是从数据中找到更多的共同出现的条目 (比如购买啤酒的用户也有较大可能购买火腿肠)。
有监督学习方法众多。 通常,需要把数据分为训练样本和检验样本, 训练样本的因变量(数值型或分类型)是已知的, 根据训练样本中自变量和因变量的关系训练出一个回归函数, 此函数以自变量为输入, 可以输出因变量的预测值。
训练出的函数有可能是有简单表达式的(例如,logistic回归)、 有参数众多的表达式的(如神经网络), 也有可能是依赖于所有训练样本而无法写出表达式的(例如k近邻分类)。
41.1.2 偏差与方差折衷
对回归问题,经常使用均方误差\(E|Ey - \hat y|^2\)来衡量精度。 对分类问题,经常使用分类准确率等来衡量精度。 易见\(E|Ey - \hat y|^2 = \text{Var}(\hat y) + (E\hat y - E y)^2\),所以均方误差可以分解为 \[ \text{均方误差} = \text{方差} + \text{偏差}^2, \]
训练的回归函数如果仅考虑对训练样本解释尽可能好, 就会使得估计结果方差很大,在对检验样本进行计算时因方差大而导致很大的误差, 所以选取的回归函数应该尽可能简单。
如果选取的回归函数过于简单而实际上自变量与因变量关系比较复杂, 就会使得估计的回归函数偏差比较大, 这样在对检验样本进行计算时也会有比较大的误差。
所以,在有监督学习时, 回归函数的复杂程度是一个很关键的量, 太复杂和太简单都可能导致差的结果, 需要找到一个折衷的值。
复杂程度在线性回归中就是自变量个数, 在一元曲线拟合中就是曲线的不光滑程度。 在其它指标类似的情况下,简单的模型更稳定、可解释更好, 所以统计学特别重视模型的简化。
41.1.3 交叉验证
即使是在从训练样本中修炼(估计)回归函数时, 也需要适当地选择模型的复杂度。 仅考虑对训练数据的拟合程度是不够的, 这会造成过度拟合问题。
为了相对客观地度量模型的预报误差, 假设训练样本有\(n\)个观测, 可以留出第一个观测不用, 用剩余的\(n-1\)个观测建模,然后预测第一个观测的因变量值, 得到一个误差;对每个观测都这样做, 就可以得到\(n\)个误差。 这样的方法叫做留一法。
更常用的是五折或十折交叉验证。 假设训练集有\(n\)个观测, 将其均分成\(10\)份, 保留第一份不用, 将其余九份合并在一起用来建模,然后预报第一份; 对每一份都这样做, 也可以得到\(n\)个误差, 这叫做十折交叉验证(ten-fold cross validation)方法。
因为要预报的数据没有用来建模, 交叉验证得到的误差估计更准确。
rsample的vfold_cv
可以生成这样的划分,
并对每一份,
可以用analysis()
和assessment()
分别提取建模用部分和验证用部分。
机器学习算法函数一般都包含了用交叉验证方法调参的功能,
不需要用户自己去划分数据。
41.2 Hitters数据分析
用例子来演示统计学习的各种方法。
41.2.1 介绍
考虑ISLR包的Hitters数据集。 此数据集有322个运动员的20个变量的数据, 其中的变量Salary(工资)是我们关心的。 变量包括:
## [1] "AtBat" "Hits" "HmRun" "Runs" "RBI" "Walks" "Years" "CAtBat" "CHits" "CHmRun" "CRuns" "CRBI" "CWalks" "League" "Division" "PutOuts" "Assists" "Errors" "Salary" "NewLeague"
数据集的详细变量信息如下:
## 'data.frame': 322 obs. of 20 variables:
## $ AtBat : int 293 315 479 496 321 594 185 298 323 401 ...
## $ Hits : int 66 81 130 141 87 169 37 73 81 92 ...
## $ HmRun : int 1 7 18 20 10 4 1 0 6 17 ...
## $ Runs : int 30 24 66 65 39 74 23 24 26 49 ...
## $ RBI : int 29 38 72 78 42 51 8 24 32 66 ...
## $ Walks : int 14 39 76 37 30 35 21 7 8 65 ...
## $ Years : int 1 14 3 11 2 11 2 3 2 13 ...
## $ CAtBat : int 293 3449 1624 5628 396 4408 214 509 341 5206 ...
## $ CHits : int 66 835 457 1575 101 1133 42 108 86 1332 ...
## $ CHmRun : int 1 69 63 225 12 19 1 0 6 253 ...
## $ CRuns : int 30 321 224 828 48 501 30 41 32 784 ...
## $ CRBI : int 29 414 266 838 46 336 9 37 34 890 ...
## $ CWalks : int 14 375 263 354 33 194 24 12 8 866 ...
## $ League : Factor w/ 2 levels "A","N": 1 2 1 2 2 1 2 1 2 1 ...
## $ Division : Factor w/ 2 levels "E","W": 1 2 2 1 1 2 1 2 2 1 ...
## $ PutOuts : int 446 632 880 200 805 282 76 121 143 0 ...
## $ Assists : int 33 43 82 11 40 421 127 283 290 0 ...
## $ Errors : int 20 10 14 3 4 25 7 9 19 0 ...
## $ Salary : num NA 475 480 500 91.5 750 70 100 75 1100 ...
## $ NewLeague: Factor w/ 2 levels "A","N": 1 2 1 2 2 1 1 1 2 1 ...
希望以Salary为因变量,查看其缺失值个数:
## [1] 59
为简单起见,去掉有缺失值的观测:
## [1] 263 20
41.2.3 回归自变量选择
41.2.3.1 最优子集选择
用leaps包的regsubsets()
函数计算最优子集回归,
办法是对某个试验性的子集自变量个数\(\hat p\)值,
都找到\(\hat p\)固定情况下残差平方和最小的变量子集,
这样只要在这些不同\(\hat p\)的最优子集中挑选就可以了。
挑选可以用AIC、BIC等方法。
可以先进行一个包含所有自变量的全集回归:
regfit.full <- regsubsets(
Salary ~ ., data=hit_train, nvmax=19)
reg.summary <- summary(regfit.full)
reg.summary
## Subset selection object
## Call: regsubsets.formula(Salary ~ ., data = hit_train, nvmax = 19)
## 19 Variables (and intercept)
## Forced in Forced out
## AtBat FALSE FALSE
## Hits FALSE FALSE
## HmRun FALSE FALSE
## Runs FALSE FALSE
## RBI FALSE FALSE
## Walks FALSE FALSE
## Years FALSE FALSE
## CAtBat FALSE FALSE
## CHits FALSE FALSE
## CHmRun FALSE FALSE
## CRuns FALSE FALSE
## CRBI FALSE FALSE
## CWalks FALSE FALSE
## LeagueN FALSE FALSE
## DivisionW FALSE FALSE
## PutOuts FALSE FALSE
## Assists FALSE FALSE
## Errors FALSE FALSE
## NewLeagueN FALSE FALSE
## 1 subsets of each size up to 19
## Selection Algorithm: exhaustive
## AtBat Hits HmRun Runs RBI Walks Years CAtBat CHits CHmRun CRuns CRBI CWalks LeagueN DivisionW PutOuts Assists Errors NewLeagueN
## 1 ( 1 ) " " " " " " " " " " " " " " " " " " " " " " "*" " " " " " " " " " " " " " "
## 2 ( 1 ) " " "*" " " " " " " " " " " " " " " " " " " "*" " " " " " " " " " " " " " "
## 3 ( 1 ) " " "*" " " " " " " " " " " " " " " " " " " "*" " " " " "*" " " " " " " " "
## 4 ( 1 ) " " "*" " " " " " " " " " " " " " " " " " " "*" " " " " "*" "*" " " " " " "
## 5 ( 1 ) "*" "*" " " " " " " " " " " " " " " " " " " "*" " " " " "*" "*" " " " " " "
## 6 ( 1 ) "*" "*" " " " " " " "*" " " " " " " " " " " "*" " " " " "*" "*" " " " " " "
## 7 ( 1 ) "*" "*" " " " " " " "*" " " " " " " " " " " "*" "*" " " "*" "*" " " " " " "
## 8 ( 1 ) "*" "*" " " " " " " "*" " " " " " " " " "*" "*" "*" " " "*" "*" " " " " " "
## 9 ( 1 ) "*" "*" " " " " " " "*" " " " " " " "*" "*" " " "*" " " "*" "*" "*" " " " "
## 10 ( 1 ) "*" "*" " " " " " " "*" " " "*" " " " " "*" "*" "*" " " "*" "*" "*" " " " "
## 11 ( 1 ) "*" "*" " " " " " " "*" " " "*" " " " " "*" "*" "*" "*" "*" "*" "*" " " " "
## 12 ( 1 ) "*" "*" " " "*" " " "*" " " "*" " " " " "*" "*" "*" "*" "*" "*" "*" " " " "
## 13 ( 1 ) "*" "*" " " "*" " " "*" "*" "*" " " " " "*" "*" "*" "*" "*" "*" "*" " " " "
## 14 ( 1 ) "*" "*" " " "*" "*" "*" "*" "*" " " " " "*" "*" "*" "*" "*" "*" "*" " " " "
## 15 ( 1 ) "*" "*" " " "*" "*" "*" "*" "*" " " " " "*" "*" "*" "*" "*" "*" "*" "*" " "
## 16 ( 1 ) "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " " " "
## 17 ( 1 ) "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" " "
## 18 ( 1 ) "*" "*" " " "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
## 19 ( 1 ) "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*" "*"
这里用nvmax=
指定了允许所有的自变量都参加,
缺省行为是限制最多个数的。
上述结果表格中每一行给出了固定\(\hat p\)条件下的最优子集。
试比较这些最优模型的BIC值:
## [1] -63.90242 -86.59469 -90.68877 -93.51559 -96.29865 -96.35699 -95.24328 -94.33547 -91.79438 -89.31463 -85.07463 -80.40798 -75.33025 -70.12122 -64.82873 -59.53306 -54.25553 -48.92352 -43.58870
其中\(\hat p=5\)的值相近,都很低,
取\(\hat p=6\)。
用coef()
加id=6
指定第六种子集:
## (Intercept) AtBat Hits Walks CRBI DivisionW PutOuts
## 149.0951521 -2.1064928 8.2070703 3.2517011 0.6351933 -136.2935330 0.2646021
这种方法实现了选取BIC最小的自变量子集。
41.2.3.2 逐步回归方法
在用lm()
做了全集回归后,
把全集回归结果输入到stats::step()
函数中可以执行逐步回归。
如:
##
## Call:
## lm(formula = Salary ~ ., data = hit_train)
##
## Residuals:
## Min 1Q Median 3Q Max
## -918.96 -183.16 -35.62 138.30 1799.45
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 241.67291 109.57064 2.206 0.02862 *
## AtBat -2.48494 0.76899 -3.231 0.00145 **
## Hits 8.15485 2.84403 2.867 0.00461 **
## HmRun -0.37929 7.64779 -0.050 0.96050
## Runs -2.12109 3.59273 -0.590 0.55564
## RBI 0.76668 3.11770 0.246 0.80602
## Walks 6.27568 2.18144 2.877 0.00448 **
## Years -7.18987 15.10209 -0.476 0.63457
## CAtBat -0.14891 0.16372 -0.909 0.36425
## CHits 0.23486 0.78151 0.301 0.76411
## CHmRun 0.50158 1.97716 0.254 0.80002
## CRuns 1.11476 0.92330 1.207 0.22881
## CRBI 0.70183 0.84282 0.833 0.40606
## CWalks -0.83644 0.37968 -2.203 0.02881 *
## LeagueN 47.02170 94.26262 0.499 0.61848
## DivisionW -120.60207 48.51038 -2.486 0.01379 *
## PutOuts 0.26292 0.09121 2.883 0.00440 **
## Assists 0.38272 0.26915 1.422 0.15670
## Errors -1.28251 5.36074 -0.239 0.81118
## NewLeagueN -7.16809 94.61668 -0.076 0.93969
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 336.8 on 188 degrees of freedom
## Multiple R-squared: 0.5146, Adjusted R-squared: 0.4655
## F-statistic: 10.49 on 19 and 188 DF, p-value: < 2.2e-16
## Start: AIC=2439.89
## Salary ~ AtBat + Hits + HmRun + Runs + RBI + Walks + Years +
## CAtBat + CHits + CHmRun + CRuns + CRBI + CWalks + League +
## Division + PutOuts + Assists + Errors + NewLeague
##
## Df Sum of Sq RSS AIC
## - HmRun 1 279 21327132 2437.9
## - NewLeague 1 651 21327504 2437.9
## - Errors 1 6493 21333346 2437.9
## - RBI 1 6860 21333713 2438.0
## - CHmRun 1 7301 21334153 2438.0
## - CHits 1 10245 21337098 2438.0
## - Years 1 25712 21352565 2438.1
## - League 1 28228 21355081 2438.2
## - Runs 1 39540 21366393 2438.3
## - CRBI 1 78662 21405515 2438.7
## - CAtBat 1 93836 21420689 2438.8
## - CRuns 1 165367 21492220 2439.5
## <none> 21326853 2439.9
## - Assists 1 229372 21556225 2440.1
## - CWalks 1 550572 21877425 2443.2
## - Division 1 701147 22028000 2444.6
## - Hits 1 932679 22259532 2446.8
## - Walks 1 938864 22265716 2446.8
## - PutOuts 1 942588 22269441 2446.9
## - AtBat 1 1184571 22511424 2449.1
##
## Step: AIC=2437.89
## Salary ~ AtBat + Hits + Runs + RBI + Walks + Years + CAtBat +
## CHits + CHmRun + CRuns + CRBI + CWalks + League + Division +
## PutOuts + Assists + Errors + NewLeague
##
## Df Sum of Sq RSS AIC
## - NewLeague 1 566 21327698 2435.9
## - Errors 1 6443 21333575 2436.0
## - CHmRun 1 7539 21334671 2436.0
## - CHits 1 9986 21337118 2436.0
## - RBI 1 12495 21339627 2436.0
## - Years 1 25478 21352610 2436.1
## - League 1 27950 21355082 2436.2
## - Runs 1 53429 21380561 2436.4
## - CAtBat 1 94340 21421471 2436.8
## - CRBI 1 96689 21423821 2436.8
## - CRuns 1 185367 21512499 2437.7
## <none> 21327132 2437.9
## - Assists 1 235593 21562725 2438.2
## - CWalks 1 575407 21902539 2441.4
## - Division 1 720408 22047540 2442.8
## - PutOuts 1 947076 22274208 2444.9
## - Walks 1 1002501 22329633 2445.4
## - Hits 1 1073306 22400438 2446.1
## - AtBat 1 1185325 22512457 2447.1
##
## Step: AIC=2435.9
## Salary ~ AtBat + Hits + Runs + RBI + Walks + Years + CAtBat +
## CHits + CHmRun + CRuns + CRBI + CWalks + League + Division +
## PutOuts + Assists + Errors
##
## Df Sum of Sq RSS AIC
## - Errors 1 6155 21333853 2434.0
## - CHmRun 1 7339 21335037 2434.0
## - CHits 1 9541 21337239 2434.0
## - RBI 1 12817 21340515 2434.0
## - Years 1 25398 21353097 2434.2
## - Runs 1 53335 21381033 2434.4
## - League 1 75071 21402769 2434.6
## - CAtBat 1 93812 21421510 2434.8
## - CRBI 1 98282 21425981 2434.9
## - CRuns 1 190610 21518308 2435.8
## <none> 21327698 2435.9
## - Assists 1 236010 21563708 2436.2
## - CWalks 1 577288 21904986 2439.4
## - Division 1 720061 22047759 2440.8
## - PutOuts 1 948064 22275762 2442.9
## - Walks 1 1003786 22331484 2443.5
## - Hits 1 1091940 22419639 2444.3
## - AtBat 1 1223590 22551289 2445.5
##
## Step: AIC=2433.96
## Salary ~ AtBat + Hits + Runs + RBI + Walks + Years + CAtBat +
## CHits + CHmRun + CRuns + CRBI + CWalks + League + Division +
## PutOuts + Assists
##
## Df Sum of Sq RSS AIC
## - CHmRun 1 6724 21340577 2432.0
## - CHits 1 7824 21341677 2432.0
## - RBI 1 11220 21345072 2432.1
## - Years 1 24104 21357956 2432.2
## - Runs 1 57526 21391379 2432.5
## - League 1 70922 21404775 2432.7
## - CAtBat 1 90644 21424497 2432.8
## - CRBI 1 100984 21434837 2432.9
## - CRuns 1 201382 21535235 2433.9
## <none> 21333853 2434.0
## - Assists 1 313674 21647527 2435.0
## - CWalks 1 593539 21927392 2437.7
## - Division 1 722945 22056798 2438.9
## - PutOuts 1 942739 22276592 2440.9
## - Walks 1 1040700 22374553 2441.9
## - Hits 1 1161864 22495717 2443.0
## - AtBat 1 1281359 22615212 2444.1
##
## Step: AIC=2432.03
## Salary ~ AtBat + Hits + Runs + RBI + Walks + Years + CAtBat +
## CHits + CRuns + CRBI + CWalks + League + Division + PutOuts +
## Assists
##
## Df Sum of Sq RSS AIC
## - CHits 1 2192 21342770 2430.1
## - RBI 1 12586 21353163 2430.2
## - Years 1 24971 21365548 2430.3
## - Runs 1 63054 21403631 2430.6
## - League 1 71042 21411619 2430.7
## - CAtBat 1 86281 21426858 2430.9
## <none> 21340577 2432.0
## - Assists 1 306971 21647548 2433.0
## - CRuns 1 433335 21773912 2434.2
## - CWalks 1 631568 21972145 2436.1
## - Division 1 716579 22057157 2436.9
## - PutOuts 1 954537 22295114 2439.1
## - CRBI 1 1001899 22342476 2439.6
## - Walks 1 1036407 22376984 2439.9
## - Hits 1 1187105 22527683 2441.3
## - AtBat 1 1283747 22624325 2442.2
##
## Step: AIC=2430.05
## Salary ~ AtBat + Hits + Runs + RBI + Walks + Years + CAtBat +
## CRuns + CRBI + CWalks + League + Division + PutOuts + Assists
##
## Df Sum of Sq RSS AIC
## - RBI 1 13190 21355960 2428.2
## - Years 1 29638 21372407 2428.3
## - League 1 72742 21415512 2428.8
## - Runs 1 81521 21424290 2428.8
## <none> 21342770 2430.1
## - CAtBat 1 230265 21573034 2430.3
## - Assists 1 307170 21649939 2431.0
## - CRuns 1 713710 22056479 2434.9
## - Division 1 715586 22058356 2434.9
## - CWalks 1 929774 22272544 2436.9
## - PutOuts 1 978714 22321484 2437.4
## - CRBI 1 1002770 22345540 2437.6
## - Walks 1 1086910 22429680 2438.4
## - AtBat 1 1599684 22942453 2443.1
## - Hits 1 1779918 23122687 2444.7
##
## Step: AIC=2428.18
## Salary ~ AtBat + Hits + Runs + Walks + Years + CAtBat + CRuns +
## CRBI + CWalks + League + Division + PutOuts + Assists
##
## Df Sum of Sq RSS AIC
## - Years 1 26692 21382651 2426.4
## - League 1 70307 21426266 2426.9
## - Runs 1 73753 21429713 2426.9
## <none> 21355960 2428.2
## - CAtBat 1 249406 21605365 2428.6
## - Assists 1 295538 21651497 2429.0
## - CRuns 1 702284 22058244 2432.9
## - Division 1 734085 22090044 2433.2
## - CWalks 1 937348 22293308 2435.1
## - PutOuts 1 1002301 22358261 2435.7
## - Walks 1 1086003 22441962 2436.5
## - CRBI 1 1439193 22795152 2439.7
## - AtBat 1 1640165 22996124 2441.6
## - Hits 1 1787801 23143761 2442.9
##
## Step: AIC=2426.43
## Salary ~ AtBat + Hits + Runs + Walks + CAtBat + CRuns + CRBI +
## CWalks + League + Division + PutOuts + Assists
##
## Df Sum of Sq RSS AIC
## - Runs 1 69079 21451730 2425.1
## - League 1 87548 21470199 2425.3
## <none> 21382651 2426.4
## - Assists 1 314039 21696690 2427.5
## - CAtBat 1 492567 21875218 2429.2
## - Division 1 725175 22107827 2431.4
## - CRuns 1 880113 22262764 2432.8
## - CWalks 1 988001 22370652 2433.8
## - PutOuts 1 1049648 22432299 2434.4
## - Walks 1 1079896 22462547 2434.7
## - CRBI 1 1420036 22802687 2437.8
## - AtBat 1 1614330 22996981 2439.6
## - Hits 1 1772982 23155633 2441.0
##
## Step: AIC=2425.11
## Salary ~ AtBat + Hits + Walks + CAtBat + CRuns + CRBI + CWalks +
## League + Division + PutOuts + Assists
##
## Df Sum of Sq RSS AIC
## - League 1 113492 21565223 2424.2
## <none> 21451730 2425.1
## - Assists 1 399827 21851557 2426.9
## - CAtBat 1 428452 21880182 2427.2
## - Division 1 727359 22179089 2430.0
## - CRuns 1 811308 22263038 2430.8
## - CWalks 1 947776 22399506 2432.1
## - Walks 1 1029714 22481444 2432.9
## - PutOuts 1 1153252 22604982 2434.0
## - CRBI 1 1434607 22886337 2436.6
## - AtBat 1 1793723 23245454 2439.8
## - Hits 1 1825947 23277677 2440.1
##
## Step: AIC=2424.2
## Salary ~ AtBat + Hits + Walks + CAtBat + CRuns + CRBI + CWalks +
## Division + PutOuts + Assists
##
## Df Sum of Sq RSS AIC
## <none> 21565223 2424.2
## - CAtBat 1 366456 21931678 2425.7
## - Assists 1 423017 21988240 2426.2
## - CRuns 1 756041 22321264 2429.4
## - Division 1 762166 22327389 2429.4
## - CWalks 1 998625 22563847 2431.6
## - Walks 1 1124976 22690198 2432.8
## - PutOuts 1 1245275 22810497 2433.9
## - CRBI 1 1393594 22958817 2435.2
## - Hits 1 1785448 23350671 2438.8
## - AtBat 1 1830070 23395292 2439.2
##
## Call:
## lm(formula = Salary ~ AtBat + Hits + Walks + CAtBat + CRuns +
## CRBI + CWalks + Division + PutOuts + Assists, data = hit_train)
##
## Coefficients:
## (Intercept) AtBat Hits Walks CAtBat CRuns CRBI CWalks DivisionW PutOuts Assists
## 235.9278 -2.5863 7.7364 5.9827 -0.1210 1.2468 0.9302 -0.9100 -123.4092 0.2893 0.3770
最后保留了10个自变量。
41.2.3.4 用10折交叉验证方法选择最优子集
下列程序对数据中每一行分配一个折号:
下面,对10折中每一折都分别当作测试集一次, 得到不同子集大小的均方误差:
cv.errors <- matrix( as.numeric(NA), 10, 19, dimnames=list(NULL, paste(1:19)) )
for(j in 1:10){ # 折
d_ana <- analysis(hit_fold$splits[[1]])
d_ass <- assessment((hit_fold$splits[[1]]))
best.fit <- regsubsets(
Salary ~ .,
data = d_ana, nvmax=19)
for(i in 1:19){
pred <- predict(
best.fit, d_ass, id=i)
cv.errors[j, i] <- mean( (d_ass[["Salary"]] - pred)^2 )
}
}
head(cv.errors)
## 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19
## [1,] 277846.6 201416.7 292984.6 236472.5 250270.8 201979.7 197639.6 165602.5 178146.8 174549.5 178469.3 183560.7 185073.1 181778.8 182629.5 181893.4 181501.3 181587.7 181638.5
## [2,] 277846.6 201416.7 292984.6 236472.5 250270.8 201979.7 197639.6 165602.5 178146.8 174549.5 178469.3 183560.7 185073.1 181778.8 182629.5 181893.4 181501.3 181587.7 181638.5
## [3,] 277846.6 201416.7 292984.6 236472.5 250270.8 201979.7 197639.6 165602.5 178146.8 174549.5 178469.3 183560.7 185073.1 181778.8 182629.5 181893.4 181501.3 181587.7 181638.5
## [4,] 277846.6 201416.7 292984.6 236472.5 250270.8 201979.7 197639.6 165602.5 178146.8 174549.5 178469.3 183560.7 185073.1 181778.8 182629.5 181893.4 181501.3 181587.7 181638.5
## [5,] 277846.6 201416.7 292984.6 236472.5 250270.8 201979.7 197639.6 165602.5 178146.8 174549.5 178469.3 183560.7 185073.1 181778.8 182629.5 181893.4 181501.3 181587.7 181638.5
## [6,] 277846.6 201416.7 292984.6 236472.5 250270.8 201979.7 197639.6 165602.5 178146.8 174549.5 178469.3 183560.7 185073.1 181778.8 182629.5 181893.4 181501.3 181587.7 181638.5
cv.errors
是一个\(10\times 19\)矩阵,
每行对应一折作为测试集的情形,
每列是一个子集大小,
元素值是测试均方误差。
对每列的10个元素求平均, 可以得到每个子集大小的平均均方误差:
## 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19
## 277846.6 201416.7 292984.6 236472.5 250270.8 201979.7 197639.6 165602.5 178146.8 174549.5 178469.3 183560.7 185073.1 181778.8 182629.5 181893.4 181501.3 181587.7 181638.5
这样找到的最优子集大小是8。 注意, 一般不需要用户自己进行这种交叉验证调参, 机器学习的函数一般都集成了这个功能。
用这种方法找到最优子集大小后, 可以对全数据集重新建模但是选择最优子集大小为8:
## (Intercept) AtBat Hits Walks CHmRun CRuns CWalks DivisionW PutOuts
## 130.9691577 -2.1731903 7.3582935 6.0037597 1.2339718 0.9651349 -0.8323788 -117.9657795 0.2908431
事实上, 划分训练集和验证集与交叉验证方法经常联合运用。 取一个固定的较小规模的测试集, 此测试集不用来作子集选择, 对训练集用交叉验证方法选择最优子集, 然后在测试集上评估最终模型的性能。
41.2.4 岭回归
当自变量个数太多时,模型复杂度高, 可能有过度拟合, 模型不稳定。
一种方法是对较大的模型系数施加二次惩罚, 把最小二乘问题变成带有二次惩罚项的惩罚最小二乘问题: \[\begin{aligned} \min\; \sum_{i=1}^n \left( y_i - \beta_0 - \beta_1 x_{i1} - \dots - \beta_p x_{ip} \right)^2 + \lambda \sum_{j=1}^p \beta_j^2 . \end{aligned}\] 这比通常最小二乘得到的回归系数绝对值变小, 但是求解的稳定性增加了,避免了共线问题。
实际上, 与线性模型\(\boldsymbol Y = \boldsymbol X \boldsymbol\beta + \boldsymbol\varepsilon\) 的普通最小二乘解 \(\hat{\boldsymbol\beta} = (\boldsymbol X^T \boldsymbol X)^{-1} \boldsymbol X^T \boldsymbol Y\) 相比, 岭回归问题的解为 \[ \tilde{\boldsymbol\beta} = (\boldsymbol X^T \boldsymbol X + s \boldsymbol I)^{-1} \boldsymbol X^T \boldsymbol Y \] 其中\(\boldsymbol I\)为单位阵,\(s>0\)与\(\lambda\)有关。
\(\lambda\)称为调节参数,\(\lambda\)越大,相当于模型复杂度越低。 适当选择\(\lambda\)可以在方差与偏差之间找到适当的折衷, 从而减小预测误差。
由于量纲问题,在不同自变量不可比时,数据集应该进行标准化。
用R的glmnet包计算岭回归。
用glmnet()
函数,
指定参数alpha=0
时执行的是岭回归。
用参数lambda=
指定一个调节参数网格,
岭回归将在这些调节参数上计算。
用coef()
从回归结果中取得不同调节参数对应的回归系数估计,
结果是一个矩阵,每列对应于一个调节参数。
仍采用上面去掉了缺失值的Hitters数据集结果da_hit
。
如下程序把回归的设计阵与因变量提取出来:
岭回归涉及到调节参数\(\lambda\)的选择, 为了绘图, 先选择\(\lambda\)的一个网格:
用所有数据针对这样的调节参数网格计算岭回归结果,
注意glmnet()
函数允许调节参数\(\lambda\)输入多个值:
## [1] 20 100
glmnet()
函数默认对数据进行标准化。
coef()
的结果是一个矩阵,每列对应一个调节参数值。
41.2.4.1 用10折交叉验证选取调节参数
仍使用训练集,
但训练集再进行交叉验证。
cv.glmnet()
函数可以执行交叉验证。
这样获得了最优调节参数\(\lambda=\) 25.2283126。 用最优调节参数对测试集作预测, 得到预测均方误差:
ridge.pred <- predict(
ridge.mod, s = bestlam,
newx = model.matrix(Salary ~ ., hit_test)[,-1])
mean( (ridge.pred - hit_test$Salary)^2 )
## [1] 57954.62
最后,用选取的最优调节系数对全数据集建模, 得到相应的岭回归系数估计:
x <- model.matrix(Salary ~ ., da_hit)[,-1]
y <- da_hit$Salary
out <- glmnet(x, y, alpha=0)
predict(out, type='coefficients', s=bestlam)[1:20,]
## (Intercept) AtBat Hits HmRun Runs RBI Walks Years CAtBat CHits CHmRun CRuns CRBI CWalks LeagueN DivisionW PutOuts Assists Errors NewLeagueN
## 8.112693e+01 -6.815959e-01 2.772312e+00 -1.365680e+00 1.014826e+00 7.130224e-01 3.378558e+00 -9.066800e+00 -1.199478e-03 1.361029e-01 6.979958e-01 2.958896e-01 2.570711e-01 -2.789666e-01 5.321272e+01 -1.228345e+02 2.638876e-01 1.698796e-01 -3.685645e+00 -1.810510e+01
41.2.5 Lasso回归
另一种对回归系数的惩罚是\(L_1\)惩罚: \[\begin{align} \min\; \sum_{i=1}^n \left( y_i - \beta_0 - \beta_1 x_{i1} - \dots - \beta_p x_{ip} \right)^2 + \lambda \sum_{j=1}^p |\beta_j| . \tag{41.1} \end{align}\] 奇妙地是,适当选择调节参数\(\lambda\),可以使得部分回归系数变成零, 达到了即减小回归系数的绝对值又挑选重要变量子集的效果。
事实上,(41.1)等价于约束最小值问题 \[\begin{aligned} & \min\; \sum_{i=1}^n \left( y_i - \beta_0 - \beta_1 x_{i1} - \dots - \beta_p x_{ip} \right)^2 \quad \text{s.t.} \\ & \sum_{j=1}^p |\beta_j| \leq s \end{aligned}\] 其中\(s\)与\(\lambda\)一一对应。 这样的约束区域是带有顶点的凸集, 而目标函数是二次函数, 最小值点经常在约束区域顶点达到, 这些顶点是某些坐标等于零的点。 见图41.4。
对于每个调节参数\(\lambda\), 都应该解出(41.1)的相应解, 记为\(\hat{\boldsymbol\beta}(\lambda)\)。 幸运的是, 不需要对每个\(\lambda\)去解最小值问题(41.1), 存在巧妙的算法使得问题的计算量与求解一次最小二乘相仿。
通常选取\(\lambda\)的格子点,计算相应的惩罚回归系数。 用交叉验证方法估计预测的均方误差。 选取使得交叉验证均方误差最小的调节参数(一般R函数中已经作为选项)。
用R的glmnet包计算lasso。
用glmnet()
函数,
指定参数alpha=1
时执行的是lasso。
用参数lambda=
指定一个调节参数网格,
lasso将输出这些调节参数对应的结果。
对回归结果使用plot()
函数可以画出调节参数变化时系数估计的变化情况。
仍使用gmlnet包的glmnet()
函数计算Lasso回归,
指定一个调节参数网格(沿用前面的网格):
x <- model.matrix(Salary ~ ., hit_train)[,-1]
y <- hit_train$Salary
lasso.mod <- glmnet(x, y, alpha=1, lambda=grid)
plot(lasso.mod)
## Warning in regularize.values(x, y, ties, missing(ties), na.rm = na.rm): collapsing to unique 'x' values
对lasso结果使用plot()
函数可以绘制延调节参数网格变化的各回归系数估计,横坐标不是调节参数而是调节参数对应的系数绝对值和,
可以看出随着系数绝对值和增大,实际是调节参数变小,
更多地自变量进入模型。
41.2.5.1 用交叉验证估计调节参数
按照前面划分的训练集与测试集, 仅使用训练集数据做交叉验证估计最优调节参数:
## [1] 2.19423
得到调节参数估计后,对测试集计算预测均方误差:
lasso.pred <- predict(
lasso.mod, s = bestlam,
newx = model.matrix(Salary ~ ., hit_test)[,-1])
mean( (lasso.pred - hit_test$Salary)^2 )
## [1] 58582.15
这个效果比岭回归效果略差。
为了充分利用数据, 使用前面获得的最优调节参数, 对全数据集建模:
x <- model.matrix(Salary ~ ., da_hit)[,-1]
y <- da_hit$Salary
out <- glmnet(x, y, alpha=1, lambda=grid)
lasso.coef <- predict(
out, type='coefficients', s=bestlam)[1:20,]
lasso.coef
## (Intercept) AtBat Hits HmRun Runs RBI Walks Years CAtBat CHits CHmRun CRuns CRBI CWalks LeagueN DivisionW PutOuts Assists Errors NewLeagueN
## 1.348925e+02 -1.689582e+00 5.971182e+00 9.734402e-02 0.000000e+00 0.000000e+00 4.978211e+00 -1.019167e+01 -9.794493e-05 0.000000e+00 5.650266e-01 7.036826e-01 3.867695e-01 -5.851131e-01 3.305686e+01 -1.193420e+02 2.760478e-01 2.008473e-01 -2.277618e+00 0.000000e+00
## (Intercept) AtBat Hits HmRun Walks Years CAtBat CHmRun CRuns CRBI CWalks LeagueN DivisionW PutOuts Assists Errors
## 1.348925e+02 -1.689582e+00 5.971182e+00 9.734402e-02 4.978211e+00 -1.019167e+01 -9.794493e-05 5.650266e-01 7.036826e-01 3.867695e-01 -5.851131e-01 3.305686e+01 -1.193420e+02 2.760478e-01 2.008473e-01 -2.277618e+00
选择的自变量子集有15个自变量。
41.2.6 树回归的简单演示
决策树方法按不同自变量的不同值, 分层地把训练集分组。 每层使用一个变量, 所以这样的分组构成一个二叉树表示。 为了预测一个观测的类归属, 找到它所属的组, 用组的类归属或大多数观测的类归属进行预测。 这样的方法称为决策树(decision tree)。 决策树方法既可以用于判别问题, 也可以用于回归问题,称为回归树。
决策树的好处是容易解释, 在自变量为分类变量时没有额外困难。 但预测准确率可能比其它有监督学习方法差。
改进方法包括装袋法(bagging)、随机森林(random forests)、 提升法(boosting)。 这些改进方法都是把许多棵树合并在一起, 通常能改善准确率但是可解释性变差。
对Hitters数据,用Years和Hits作因变量预测log(Salaray)。
建立完整的树:
剪枝为只有3个叶结点:
显示树:
## node), split, n, deviance, yval
## * denotes terminal node
##
## 1) root 208 161.20 5.936
## 2) Years < 4.5 72 35.07 5.162 *
## 3) Years > 4.5 136 60.05 6.346
## 6) Hits < 117.5 70 23.60 5.986 *
## 7) Hits > 117.5 66 17.75 6.728 *
显示概括:
##
## Regression tree:
## snip.tree(tree = tr1, nodes = c(6L, 2L))
## Number of terminal nodes: 3
## Residual mean deviance: 0.3727 = 76.41 / 205
## Distribution of residuals:
## Min. 1st Qu. Median Mean 3rd Qu. Max.
## -2.2280 -0.3740 -0.0589 0.0000 0.3414 2.5010
做树图:
树的深度(depth)是指从根节点到最远的页节点经过的步数, 比如,上图的树的深度为2, 为了用叶结点给出因变量预测值, 最多需要2次判断。
41.2.7 树回归
对训练集上的未剪枝树用交叉验证方法寻找最优大小:
## $size
## [1] 9 8 6 5 4 3 2 1
##
## $dev
## [1] 79.61217 79.76379 78.84809 77.84420 78.42855 89.51855 101.22807 162.80732
##
## $k
## [1] -Inf 2.445601 2.639571 3.186007 4.133744 8.296626 18.711912 66.037022
##
## $method
## [1] "deviance"
##
## attr(,"class")
## [1] "prune" "tree.sequence"
plot(cv1$size, cv1$dev, type='b')
best.size <- cv1$size[which.min(cv1$dev)[1]]
abline(v=best.size, col='gray')
最优大小为5。 获得训练集上构造的树剪枝后的结果:
在测试集上计算预测均方误差:
pred.test <- predict(tr1b, newdata = hit_test)
test.mse <- mean( (hit_test$Salary - exp(pred.test))^2 )
test.mse
## [1] 72954.21
如果用训练集的因变量平均值估计测试集的因变量值, 均方误差为:
## [1] 170680.7
用所有数据来构造未剪枝树:
用训练集上得到的子树大小剪枝:
41.2.8 装袋法
判别树在不同的训练集、测试集划分上可以产生很大变化, 说明其预测值方差较大。 利用bootstrap的思想, 可以随机选取许多个训练集, 把许多个训练集的模型结果平均, 就可以降低预测值的方差。
办法是从一个训练集中用有放回抽样的方法抽取\(B\)个训练集, 设第\(b\)个抽取的训练集得到的回归函数为\(\hat f^{*b}(\cdot)\), 则最后的回归函数是这些回归函数的平均值: \[\begin{aligned} \hat f_{\text{bagging}}(x) = \frac{1}{B} \sum_{b=1}^b \hat f^{*b}(x) \end{aligned}\] 这称为装袋法(bagging)。 装袋法对改善判别与回归树的精度十分有效。
装袋法的步骤如下:
- 从训练集中取\(B\)个有放回随机抽样的bootstrap训练集,\(B\)取为几百到几千之间。
- 对每个bootstrap训练集,估计未剪枝的树。
- 如果因变量是连续变量,对测试样品,用所有的树的预测值的平均值作预测。
- 如果因变量是分类变量,对测试样品,可以用所有树预测类的多数投票决定预测值。
装袋法也可以用来改进其他的回归和判别方法。
装袋后不能再用图形表示,模型可解释性较差。 但是,可以度量自变量在预测中的重要程度。 在回归问题中, 可以计算每个自变量在所有\(B\)个树种平均减少的残差平方和的量, 以此度量其重要度。 在判别问题中, 可以计算每个自变量在所有\(B\)个树种平均减少的基尼系数的量, 以此度量其重要度。
除了可以用测试集、交叉验证方法以外, 还可以使用袋外观测预测误差。 用bootstrap再抽样获得多个训练集时每个bootstrap训练集总会遗漏一些观测, 平均每个bootstrap训练集会遗漏三分之一的观测。 对每个观测,大约有\(B/3\)棵树没有用到此观测, 可以用这些树的预测值平均来预测此观测,得到一个误差估计, 这样得到的均方误差估计或错判率称为袋外观测估计(OOB估计)。 好处是不用很多额外的工作。
对训练集用装袋法:
bag1 <- randomForest(
log(Salary) ~ .,
data = hit_train,
mtry=ncol(hit_train)-1,
importance=TRUE)
bag1
##
## Call:
## randomForest(formula = log(Salary) ~ ., data = hit_train, mtry = ncol(hit_train) - 1, importance = TRUE)
## Type of random forest: regression
## Number of trees: 500
## No. of variables tried at each split: 19
##
## Mean of squared residuals: 0.1997855
## % Var explained: 74.21
注意randomForest()
函数实际是随机森林法,
但是当mtry
的值取为所有自变量个数时就是装袋法。
对测试集进行预报:
pred2 <- predict(bag1, newdata = hit_test)
test.mse2 <- mean( (hit_test$Salary - exp(pred2))^2 )
test.mse2
## [1] 40493.2
在全集上使用装袋法:
##
## Call:
## randomForest(formula = log(Salary) ~ ., data = da_hit, mtry = ncol(da_hit) - 1, importance = TRUE)
## Type of random forest: regression
## Number of trees: 500
## No. of variables tried at each split: 19
##
## Mean of squared residuals: 0.1887251
## % Var explained: 76.04
变量的重要度数值和图形: 各变量的重要度数值及其图形:
## %IncMSE IncNodePurity
## AtBat 12.1847865 8.0539634
## Hits 9.3337219 8.8872063
## HmRun 3.1834789 1.9012234
## Runs 7.3505924 3.8938682
## RBI 6.5675751 5.5666456
## Walks 9.1539209 7.3512005
## Years 10.5282905 2.3558252
## CAtBat 27.7282364 80.8752219
## CHits 13.0681742 27.4949657
## CHmRun 6.0152055 4.2342280
## CRuns 13.5257425 30.6010931
## CRBI 14.7426270 10.3984912
## CWalks 7.1140115 5.6665591
## League -1.1679843 0.2463835
## Division 1.2910729 0.2785405
## PutOuts 4.7092044 3.7752935
## Assists -0.8721871 1.7583406
## Errors 1.4266974 1.6524380
## NewLeague -1.0968795 0.3831401
最重要的自变量是CAtBats, 其次有CRuns, CHits等。
如何计算变量重要度? 基于树的方法, 每个叶节点的纯度越高(叶结点中所有观测的标签相同,或者因变量值相等), 模型拟合优度越好。 所以, 对每一个变量, 可以计算其在作为分枝用的变量时, 对中间节点的纯度指标的改善量, 将这些改善量加起来。 对装袋法、随机森林、提升法(如GBM), 则是计算每个变量对损失函数的改善量。
不同的机器学习算法对变量重要程度有不同的定义, 比如, 广义线性模型(GLM)用标准化后的自变量的系数估计的绝对值大小作为重要程度度量。
41.2.9 随机森林
随机森林的思想与装袋法类似, 但是试图使得参加平均的各个树之间变得比较独立。 仍采用有放回抽样得到的多个bootstrap训练集, 但是对每个bootstrap训练集构造判别树时, 每次分叉时不考虑所有自变量, 而是仅考虑随机选取的一个自变量子集。
对判别树,每次分叉时选取的自变量个数通常取\(m \approx \sqrt{p}\)个。 比如,对Heart数据的13个自变量, 每次分叉时仅随机选取4个纳入考察范围。
随机森林的想法是基于正相关的样本在平均时并不能很好地降低方差, 独立样本能比较好地降低方差。 如果存在一个最重要的变量, 如果不加限制这个最重要的变量总会是第一个分叉, 使得\(B\)棵树相似程度很高。 随机森林解决这个问题的办法是限制分叉时可选的变量子集。
随机森林也可以用来改进其他的回归和判别方法。
装袋法和随机森林都可以用R扩展包randomForest的
randomForest()
函数实现。
当此函数的mtry
参数取为自变量个数时,执行的就是装袋法;
mtry
取缺省值时,执行随机森林算法。
执行随机森林算法时,
randomForest()
函数在回归问题时分叉时考虑的自变量个数取\(m \approx p/3\),
在判别问题时取\(m \approx \sqrt{p}\)。
对训练集用随机森林法:
##
## Call:
## randomForest(formula = log(Salary) ~ ., data = hit_train, importance = TRUE)
## Type of random forest: regression
## Number of trees: 500
## No. of variables tried at each split: 6
##
## Mean of squared residuals: 0.1961343
## % Var explained: 74.69
当mtry
的值取为缺省值时执行随机森林算法。
对测试集进行预报:
pred3 <- predict(rf1, newdata = hit_test)
test.mse3 <- mean( (hit_test$Salary - exp(pred3))^2 )
test.mse3
## [1] 39605.52
结果与装袋法相近。
在全集上使用随机森林:
##
## Call:
## randomForest(formula = log(Salary) ~ ., data = da_hit, importance = TRUE)
## Type of random forest: regression
## Number of trees: 500
## No. of variables tried at each split: 6
##
## Mean of squared residuals: 0.1819559
## % Var explained: 76.9
各变量的重要度数值及其图形:
## %IncMSE IncNodePurity
## AtBat 10.5831938 7.6803164
## Hits 6.8534500 8.4427137
## HmRun 2.9411205 2.6021258
## Runs 6.6085935 4.7906075
## RBI 7.6427518 6.0407086
## Walks 9.1235296 6.0742818
## Years 12.3990224 6.4031367
## CAtBat 18.1391649 43.9058026
## CHits 16.1265574 34.3144001
## CHmRun 8.6922887 6.0819271
## CRuns 15.0136882 32.8197588
## CRBI 13.8506429 20.4319951
## CWalks 10.7946944 16.8511024
## League 0.9803532 0.2773488
## Division -2.2734452 0.2644631
## PutOuts 1.4260960 3.1688382
## Assists -1.1869686 1.8335868
## Errors 0.6877809 1.6121681
## NewLeague 1.6258977 0.4034615
最重要的自变量是CAtBats, CRuns, CHits, CWalks, CRBI等。
41.3 Heart数据分析
Heart数据是心脏病诊断的数据, 因变量AHD为是否有心脏病, 试图用各个自变量预测(判别)。
读入Heart数据集,并去掉有缺失值的观测:
Heart <- read.csv(
"data/Heart.csv", header=TRUE, row.names=1,
stringsAsFactors=TRUE)
Heart <- na.omit(Heart)
str(Heart)
## 'data.frame': 297 obs. of 14 variables:
## $ Age : int 63 67 67 37 41 56 62 57 63 53 ...
## $ Sex : int 1 1 1 1 0 1 0 0 1 1 ...
## $ ChestPain: Factor w/ 4 levels "asymptomatic",..: 4 1 1 2 3 3 1 1 1 1 ...
## $ RestBP : int 145 160 120 130 130 120 140 120 130 140 ...
## $ Chol : int 233 286 229 250 204 236 268 354 254 203 ...
## $ Fbs : int 1 0 0 0 0 0 0 0 0 1 ...
## $ RestECG : int 2 2 2 0 2 0 2 0 2 2 ...
## $ MaxHR : int 150 108 129 187 172 178 160 163 147 155 ...
## $ ExAng : int 0 1 1 0 0 0 0 1 0 1 ...
## $ Oldpeak : num 2.3 1.5 2.6 3.5 1.4 0.8 3.6 0.6 1.4 3.1 ...
## $ Slope : int 3 2 2 3 1 1 3 1 2 3 ...
## $ Ca : int 0 3 2 0 0 0 2 0 1 0 ...
## $ Thal : Factor w/ 3 levels "fixed","normal",..: 1 2 3 2 2 2 2 2 3 3 ...
## $ AHD : Factor w/ 2 levels "No","Yes": 1 2 2 1 1 1 2 1 2 2 ...
## - attr(*, "na.action")= 'omit' Named int [1:6] 88 167 193 267 288 303
## ..- attr(*, "names")= chr [1:6] "88" "167" "193" "267" ...
##
## Age Min. :29.00 1st Qu.:48.00 Median :56.00 Mean :54.54 3rd Qu.:61.00 Max. :77.00
## Sex Min. :0.0000 1st Qu.:0.0000 Median :1.0000 Mean :0.6768 3rd Qu.:1.0000 Max. :1.0000
## ChestPain asymptomatic:142 nonanginal : 83 nontypical : 49 typical : 23
## RestBP Min. : 94.0 1st Qu.:120.0 Median :130.0 Mean :131.7 3rd Qu.:140.0 Max. :200.0
## Chol Min. :126.0 1st Qu.:211.0 Median :243.0 Mean :247.4 3rd Qu.:276.0 Max. :564.0
## Fbs Min. :0.0000 1st Qu.:0.0000 Median :0.0000 Mean :0.1448 3rd Qu.:0.0000 Max. :1.0000
## RestECG Min. :0.0000 1st Qu.:0.0000 Median :1.0000 Mean :0.9966 3rd Qu.:2.0000 Max. :2.0000
## MaxHR Min. : 71.0 1st Qu.:133.0 Median :153.0 Mean :149.6 3rd Qu.:166.0 Max. :202.0
## ExAng Min. :0.0000 1st Qu.:0.0000 Median :0.0000 Mean :0.3266 3rd Qu.:1.0000 Max. :1.0000
## Oldpeak Min. :0.000 1st Qu.:0.000 Median :0.800 Mean :1.056 3rd Qu.:1.600 Max. :6.200
## Slope Min. :1.000 1st Qu.:1.000 Median :2.000 Mean :1.603 3rd Qu.:2.000 Max. :3.000
## Ca Min. :0.0000 1st Qu.:0.0000 Median :0.0000 Mean :0.6768 3rd Qu.:1.0000 Max. :3.0000
## Thal fixed : 18 normal :164 reversable:115
## AHD No :160 Yes:137
数据下载:Heart.csv
41.3.1 树回归
41.3.1.1 划分训练集与测试集
简单地把观测分为一半训练集、一半测试集:
set.seed(1)
train <- sample(nrow(Heart), size=round(nrow(Heart)/2))
test <- (-train)
test.y <- Heart[test, 'AHD']
在训练集上建立未剪枝的判别树:
41.3.1.2 适当剪枝
用交叉验证方法确定剪枝保留的叶子个数,剪枝时按照错判率执行:
## $size
## [1] 12 9 6 4 2 1
##
## $dev
## [1] 42 44 47 44 57 69
##
## $k
## [1] -Inf 0.000000 1.666667 3.000000 7.000000 26.000000
##
## $method
## [1] "misclass"
##
## attr(,"class")
## [1] "prune" "tree.sequence"
最优的大小是12。但是从图上看,4个叶结点已经足够好,所以取为4。
对训练集生成剪枝结果:
注意剪枝后树的显示中, 内部节点的自变量存在分类变量, 这时按照这个自变量分叉时, 取指定的某几个分类值时对应分支Yes, 取其它的分类值时对应分支No。
41.3.2 用装袋法
对训练集用装袋法:
##
## Call:
## randomForest(formula = AHD ~ ., data = Heart, mtry = 13, importance = TRUE, subset = train)
## Type of random forest: classification
## Number of trees: 500
## No. of variables tried at each split: 13
##
## OOB estimate of error rate: 22.3%
## Confusion matrix:
## No Yes class.error
## No 71 12 0.1445783
## Yes 21 44 0.3230769
注意randomForest()
函数实际是随机森林法,
但是当mtry
的值取为所有自变量个数时就是装袋法。
袋外观测得到的错判率比较差。
对测试集进行预报:
## test.y
## pred2 No Yes
## No 66 17
## Yes 11 55
## [1] 0.1879195
测试集的错判率约为19%。
对全集用装袋法:
##
## Call:
## randomForest(formula = AHD ~ ., data = Heart, mtry = 13, importance = TRUE)
## Type of random forest: classification
## Number of trees: 500
## No. of variables tried at each split: 13
##
## OOB estimate of error rate: 20.88%
## Confusion matrix:
## No Yes class.error
## No 131 29 0.1812500
## Yes 33 104 0.2408759
各变量的重要度数值及其图形:
## No Yes MeanDecreaseAccuracy MeanDecreaseGini
## Age 6.5766876 5.12005531 8.7542379 12.2956568
## Sex 11.2077275 4.48390165 11.2853739 3.7278623
## ChestPain 13.0268932 17.89348038 20.4292863 23.3424850
## RestBP 2.6203153 0.05626759 2.0521195 9.7650173
## Chol -0.8712348 -4.23294461 -3.0733270 11.5911988
## Fbs -0.6941335 -1.16860850 -1.2288380 0.6775051
## RestECG -1.4881617 0.23292163 -0.8772267 1.8426038
## MaxHR 7.7625054 2.34660468 7.5122314 13.2101707
## ExAng 2.7926364 5.45108497 5.7525854 3.5491718
## Oldpeak 14.8193517 14.67748373 20.2425364 14.5480191
## Slope 2.5189935 5.73789018 5.9744484 4.2777028
## Ca 23.0513399 18.01671793 27.4320740 20.0564750
## Thal 20.1968435 18.74418431 25.0618361 28.2479833
最重要的变量是Thal, ChestPain, Ca。
41.3.3 用随机森林
对训练集用随机森林法:
##
## Call:
## randomForest(formula = AHD ~ ., data = Heart, importance = TRUE, subset = train)
## Type of random forest: classification
## Number of trees: 500
## No. of variables tried at each split: 3
##
## OOB estimate of error rate: 21.62%
## Confusion matrix:
## No Yes class.error
## No 71 12 0.1445783
## Yes 20 45 0.3076923
这里mtry
取缺省值,对应于随机森林法。
对测试集进行预报:
## test.y
## pred3 No Yes
## No 70 16
## Yes 7 56
## [1] 0.1543624
测试集的错判率约为15%。
对全集用随机森林:
##
## Call:
## randomForest(formula = AHD ~ ., data = Heart, importance = TRUE)
## Type of random forest: classification
## Number of trees: 500
## No. of variables tried at each split: 3
##
## OOB estimate of error rate: 16.5%
## Confusion matrix:
## No Yes class.error
## No 140 20 0.1250000
## Yes 29 108 0.2116788
各变量的重要度数值及其图形:
## No Yes MeanDecreaseAccuracy MeanDecreaseGini
## Age 7.2380857 5.4451404 9.1859647 12.908917
## Sex 10.1973138 8.0790483 12.6929315 4.938266
## ChestPain 10.4623927 16.7054395 18.8771946 18.218363
## RestBP 1.2157266 1.8875229 2.1025511 10.624864
## Chol -1.2630538 -0.4285615 -1.3028275 11.470420
## Fbs 0.4417651 -2.6574949 -1.4327524 1.418137
## RestECG -1.1149040 1.4649476 0.2220661 2.840670
## MaxHR 9.3788412 6.0542618 10.7139921 17.383623
## ExAng 3.3923281 9.8037831 9.3523828 6.715947
## Oldpeak 10.1617047 14.3372404 17.5616061 15.403935
## Slope 2.6703016 9.3147738 8.5774100 6.752552
## Ca 21.1750038 20.2285033 26.7114362 18.388524
## Thal 18.0250446 16.6731737 22.5365589 18.351175
最重要的变量是ChestPain, Thal, Ca。
41.4 汽车销量数据分析
Carseats是ISLR包的一个数据集,基本情况如下:
{rstatl-car-summ01, cache=TRUE} str(Carseats) summary(Carseats)
把Salses变量按照大于8与否分成两组, 结果存入变量High,以High为因变量作判别分析。
## [1] 400 12
41.4.1 判别树
41.4.1.1 全体数据的判别树
对全体数据建立未剪枝的判别树:
##
## Classification tree:
## tree(formula = High ~ . - Sales, data = d)
## Variables actually used in tree construction:
## [1] "ShelveLoc" "Price" "Income" "CompPrice" "Population" "Advertising" "Age" "US"
## Number of terminal nodes: 27
## Residual mean deviance: 0.4575 = 170.7 / 373
## Misclassification error rate: 0.09 = 36 / 400
41.4.1.2 划分训练集和测试集
把输入数据集随机地分一半当作训练集,另一半当作测试集:
set.seed(2)
train <- sample(nrow(d), size=round(nrow(d)/2))
test <- (-train)
test.high <- d[test, 'High']
用训练数据建立未剪枝的判别树:
##
## Classification tree:
## tree(formula = High ~ . - Sales, data = d, subset = train)
## Variables actually used in tree construction:
## [1] "Price" "Population" "ShelveLoc" "Age" "Education" "CompPrice" "Advertising" "Income" "US"
## Number of terminal nodes: 21
## Residual mean deviance: 0.5543 = 99.22 / 179
## Misclassification error rate: 0.115 = 23 / 200
用未剪枝的树对测试集进行预测,并计算误判率:
## test.high
## pred2 No Yes
## No 104 33
## Yes 13 50
## [1] 0.23
41.4.1.3 用交叉验证确定训练集的剪枝
## $size
## [1] 21 19 14 9 8 5 3 2 1
##
## $dev
## [1] 74 76 81 81 75 77 78 85 81
##
## $k
## [1] -Inf 0.0 1.0 1.4 2.0 3.0 4.0 9.0 18.0
##
## $method
## [1] "misclass"
##
## attr(,"class")
## [1] "prune" "tree.sequence"
用交叉验证方法自动选择的最佳树大小为21。
剪枝:
##
## Classification tree:
## tree(formula = High ~ . - Sales, data = d, subset = train)
## Variables actually used in tree construction:
## [1] "Price" "Population" "ShelveLoc" "Age" "Education" "CompPrice" "Advertising" "Income" "US"
## Number of terminal nodes: 21
## Residual mean deviance: 0.5543 = 99.22 / 179
## Misclassification error rate: 0.115 = 23 / 200
用剪枝后的树对测试集进行预测,计算误判率:
## test.high
## pred3 No Yes
## No 104 32
## Yes 13 51
## [1] 0.225
41.4.2 随机森林
对训练集用随机森林法:
##
## Call:
## randomForest(formula = High ~ . - Sales, data = d, importance = TRUE, subset = train)
## Type of random forest: classification
## Number of trees: 500
## No. of variables tried at each split: 3
##
## OOB estimate of error rate: 25.5%
## Confusion matrix:
## No Yes class.error
## No 102 17 0.1428571
## Yes 34 47 0.4197531
这里mtry
取缺省值,对应于随机森林法。
对测试集进行预报:
## test.high
## pred4 No Yes
## No 109 24
## Yes 8 59
## [1] 0.16
注意错判率结果依赖于训练集和测试集的划分, 另行选择训练集与测试集可能会得到很不一样的错判率结果。
对全集用随机森林:
##
## Call:
## randomForest(formula = High ~ . - Sales, data = d, importance = TRUE)
## Type of random forest: classification
## Number of trees: 500
## No. of variables tried at each split: 3
##
## OOB estimate of error rate: 18.25%
## Confusion matrix:
## No Yes class.error
## No 213 23 0.09745763
## Yes 50 114 0.30487805
各变量的重要度数值及其图形:
## No Yes MeanDecreaseAccuracy MeanDecreaseGini
## CompPrice 11.0998129 5.4168875 11.65469579 21.820876
## Income 3.2897388 4.7177705 5.75865440 20.384692
## Advertising 10.8093624 16.0263308 18.31175988 23.350563
## Population -3.1872660 -1.7367798 -3.63402082 15.670307
## Price 30.0864270 28.7929995 37.44125656 43.492787
## ShelveLoc 30.2789749 33.8109594 39.67983055 30.053785
## Age 9.7116826 9.0261373 12.78808426 22.578000
## Education 0.2214031 -0.3203644 0.06365633 9.899447
## Urban 1.3826674 1.4199879 1.98859615 2.128048
## US 3.7289827 5.1909662 6.83788775 3.405420
重要的自变量为Price, ShelfLoc, 其次有Age, Advertising, CompPrice, Income等。
41.5 波士顿郊区房价数据
MASS包的Boston数据包含了波士顿地区郊区房价的若干数据。 以中位房价medv为因变量建立回归模型。 首先把缺失值去掉后存入数据集d:
数据集概况:
## 'data.frame': 506 obs. of 14 variables:
## $ crim : num 0.00632 0.02731 0.02729 0.03237 0.06905 ...
## $ zn : num 18 0 0 0 0 0 12.5 12.5 12.5 12.5 ...
## $ indus : num 2.31 7.07 7.07 2.18 2.18 2.18 7.87 7.87 7.87 7.87 ...
## $ chas : int 0 0 0 0 0 0 0 0 0 0 ...
## $ nox : num 0.538 0.469 0.469 0.458 0.458 0.458 0.524 0.524 0.524 0.524 ...
## $ rm : num 6.58 6.42 7.18 7 7.15 ...
## $ age : num 65.2 78.9 61.1 45.8 54.2 58.7 66.6 96.1 100 85.9 ...
## $ dis : num 4.09 4.97 4.97 6.06 6.06 ...
## $ rad : int 1 2 2 3 3 3 5 5 5 5 ...
## $ tax : num 296 242 242 222 222 222 311 311 311 311 ...
## $ ptratio: num 15.3 17.8 17.8 18.7 18.7 18.7 15.2 15.2 15.2 15.2 ...
## $ black : num 397 397 393 395 397 ...
## $ lstat : num 4.98 9.14 4.03 2.94 5.33 ...
## $ medv : num 24 21.6 34.7 33.4 36.2 28.7 22.9 27.1 16.5 18.9 ...
## crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
## Min. : 0.00632 Min. : 0.00 Min. : 0.46 Min. :0.00000 Min. :0.3850 Min. :3.561 Min. : 2.90 Min. : 1.130 Min. : 1.000 Min. :187.0 Min. :12.60 Min. : 0.32 Min. : 1.73 Min. : 5.00
## 1st Qu.: 0.08205 1st Qu.: 0.00 1st Qu.: 5.19 1st Qu.:0.00000 1st Qu.:0.4490 1st Qu.:5.886 1st Qu.: 45.02 1st Qu.: 2.100 1st Qu.: 4.000 1st Qu.:279.0 1st Qu.:17.40 1st Qu.:375.38 1st Qu.: 6.95 1st Qu.:17.02
## Median : 0.25651 Median : 0.00 Median : 9.69 Median :0.00000 Median :0.5380 Median :6.208 Median : 77.50 Median : 3.207 Median : 5.000 Median :330.0 Median :19.05 Median :391.44 Median :11.36 Median :21.20
## Mean : 3.61352 Mean : 11.36 Mean :11.14 Mean :0.06917 Mean :0.5547 Mean :6.285 Mean : 68.57 Mean : 3.795 Mean : 9.549 Mean :408.2 Mean :18.46 Mean :356.67 Mean :12.65 Mean :22.53
## 3rd Qu.: 3.67708 3rd Qu.: 12.50 3rd Qu.:18.10 3rd Qu.:0.00000 3rd Qu.:0.6240 3rd Qu.:6.623 3rd Qu.: 94.08 3rd Qu.: 5.188 3rd Qu.:24.000 3rd Qu.:666.0 3rd Qu.:20.20 3rd Qu.:396.23 3rd Qu.:16.95 3rd Qu.:25.00
## Max. :88.97620 Max. :100.00 Max. :27.74 Max. :1.00000 Max. :0.8710 Max. :8.780 Max. :100.00 Max. :12.127 Max. :24.000 Max. :711.0 Max. :22.00 Max. :396.90 Max. :37.97 Max. :50.00
41.5.1 回归树
41.5.1.1 划分训练集和测试集
对训练集建立未剪枝的树:
##
## Regression tree:
## tree(formula = medv ~ ., data = d, subset = train)
## Variables actually used in tree construction:
## [1] "rm" "lstat" "crim" "age"
## Number of terminal nodes: 7
## Residual mean deviance: 10.38 = 2555 / 246
## Distribution of residuals:
## Min. 1st Qu. Median Mean 3rd Qu. Max.
## -10.1800 -1.7770 -0.1775 0.0000 1.9230 16.5800
用未剪枝的树对测试集进行预测,计算均方误差:
## [1] 35.28688
41.5.2 装袋法
用randomForest包计算。
当参数mtry
取为自变量个数时按照装袋法计算。
对训练集计算。
set.seed(1)
bag1 <- randomForest(
medv ~ ., data=d, subset=train,
mtry=ncol(d)-1, importance=TRUE)
bag1
##
## Call:
## randomForest(formula = medv ~ ., data = d, mtry = ncol(d) - 1, importance = TRUE, subset = train)
## Type of random forest: regression
## Number of trees: 500
## No. of variables tried at each split: 13
##
## Mean of squared residuals: 11.39601
## % Var explained: 85.17
在测试集上计算装袋法的均方误差:
## [1] 23.59273
比单棵树的结果有明显改善。
41.5.3 随机森林
用randomForest包计算。
当参数mtry
取为缺省值时按照随机森林方法计算。
对训练集计算。
##
## Call:
## randomForest(formula = medv ~ ., data = d, importance = TRUE, subset = train)
## Type of random forest: regression
## Number of trees: 500
## No. of variables tried at each split: 4
##
## Mean of squared residuals: 10.23441
## % Var explained: 86.69
在测试集上计算随机森林法的均方误差:
## [1] 18.11686
比单棵树的结果有明显改善, 比装袋法的结果也好一些。
各变量的重要度数值及其图形:
## %IncMSE IncNodePurity
## crim 15.372334 1220.14856
## zn 3.335435 194.85945
## indus 6.964559 1021.94751
## chas 2.059298 69.68099
## nox 14.009761 1005.14707
## rm 28.693900 6162.30720
## age 13.832143 708.55138
## dis 10.317731 852.33701
## rad 4.390624 162.22597
## tax 7.536563 564.60422
## ptratio 9.333716 1163.39624
## black 8.341316 355.62445
## lstat 27.132450 5549.25088
41.5.4 提升法
提升法(Boosting)也是可以用在多种回归和判别问题中的方法。 提升法的想法是,用比较简单的模型拟合因变量, 计算残差, 然后以残差为新的因变量建模, 仍使用简单的模型, 把两次的回归函数作加权和, 得到新的残差后,再以新残差作为因变量建模, 如此重复地更新回归函数, 得到由多个回归函数加权和组成的最终的回归函数。
加权一般取为比较小的值, 其目的是降低逼近速度。 统计学习问题中降低逼近速度一般结果更好。
提升法算法:
[(1)] 对训练集,设置\(r_i = y_i\),并令初始回归函数为\(\hat f(\cdot)=0\)。
[(2)] 对\(b=1,2,\dots,B\)重复执行:
- [(a)] 以训练集的自变量为自变量,以\(r\)为因变量,拟合一个仅有\(d\)个分叉的简单树回归函数, 设为\(\hat f_b\);
- [(b)] 更新回归函数,添加一个压缩过的树回归函数: \[\begin{aligned} \hat f(x) \leftarrow \hat f(x) + \lambda \hat f_b(x); \end{aligned}\]
- [(c)] 更新残差: \[\begin{aligned} r_i \leftarrow r_i - \lambda \hat f_b(x_i). \end{aligned}\]
[(3)] 提升法的回归函数为 \[\begin{aligned} \hat f(x) = \sum_{b=1}^B \lambda \hat f_b(x) . \end{aligned}\]
用多少个回归函数做加权和,即\(B\)的选取问题。 取得\(B\)太大也会有过度拟合, 但是只要\(B\)不太大这个问题不严重。 可以用交叉验证选择\(B\)的值。
收缩系数\(\lambda\)。 是一个小的正数, 控制学习速度, 经常用0.01, 0.001这样的值, 与要解决的问题有关。 取\(\lambda\)很小,就需要取\(B\)很大。
用来控制每个回归函数复杂度的参数, 对树回归而言就是树的大小, 用树的深度\(d\)表示。 深度等于1则仅使用一个自变量, 仅有一次分叉, 就是二叉树, 这样多棵树相加, 相当于各个变量的可加模型, 没有交互作用效应, 这样的可加模型往往就很好。 \(d>1\)时就加入了交互项, 比如\(d=2\), 就可以用两个变量, 用叶结点预测因变量时, 最多可以用两个自变量作两次判断, 因为树模型是非线性的, 将许多棵这样的深度为2的树相加, 就可以包含自变量两两之间的非线性的相互作用效应。
使用gbm包。
interaction.depth
表示树的深度(复杂度),
n.trees
表示用多少棵树相加。
shrinkage
表示学习速度,
即算法中的\(\lambda\)。
n.minobsinnode
表示每个叶结点至少应包含的观测点数,
可以设置这个参数,
以避免过少的训练样例也单独作为一个规则。
在训练集上拟合:
set.seed(1)
bst1 <- gbm(
medv ~ .,
data=d[train,],
distribution='gaussian',
n.trees=5000,
interaction.depth=4)
summary(bst1)
## var rel.inf
## rm rm 43.9919329
## lstat lstat 33.1216941
## crim crim 4.2604167
## dis dis 4.0111090
## nox nox 3.4353017
## black black 2.8267554
## age age 2.6113938
## ptratio ptratio 2.5403035
## tax tax 1.4565654
## indus indus 0.8008740
## rad rad 0.6546400
## zn zn 0.1446149
## chas chas 0.1443986
lstat和rm是最重要的变量。
在测试集上预报,并计算均方误差:
## [1] 18.84709
与随机森林方法结果相近。
如果提高学习速度:
bst2 <- gbm(
medv ~ .,
data=d[train,],
distribution='gaussian',
n.trees=5000,
interaction.depth=4,
shrinkage=0.2)
yhat <- predict(
bst2,
newdata=d[test,],
n.trees=5000)
mean( (yhat - d[test, 'medv'])^2 )
## [1] 18.33455
均方误差有改善。
41.6 支持向量机方法
支持向量机是1990年代有计算机科学家发明的一种有监督学习方法, 使用范围较广,预测精度较高。
支持向量机利用了Hilbert空间的方法将线性问题扩展为非线性问题。 线性的支持向量判别法, 可以通过\(\mathbb R^p\)的内积将线性的判别函数转化为如下的表示:
\[\begin{aligned} f(\boldsymbol x) = \beta_0 + \sum_{i=1}^n \alpha_i \langle \boldsymbol x, \boldsymbol x_i \rangle \end{aligned}\] 其中\(\beta_0, \alpha_1, \dots, \alpha_n\)是待定参数。 为了估计参数, 不需要用到各\(\boldsymbol x_i\)的具体值, 而只需要其两两的内积值, 而且在判别函数中只有支持向量对应的\(\alpha_i\)才非零, 记\(\mathcal S\)为支持向量点集, 则线性判别函数为 \[\begin{aligned} f(\boldsymbol x) = \beta_0 + \sum_{i \in \mathcal S} \alpha_i \langle \boldsymbol x, \boldsymbol x_i \rangle \end{aligned}\]
支持向量机方法将\(\mathbb R^p\)中的内积推广为如下的核函数值: \[\begin{aligned} K(\boldsymbol x, \boldsymbol x') \end{aligned}\] 核函数\(K(\boldsymbol x, \boldsymbol x')\), \(\boldsymbol x, \boldsymbol x' \in \mathbb R^p\) 是度量两个观测点\(\boldsymbol x, \boldsymbol x'\)的相似程度的函数。 比如, 取 \[\begin{aligned} K(\boldsymbol x, \boldsymbol x') = \sum_{j=1}^p x_j x_j' \end{aligned}\] 就又回到了线性的支持向量判别法。
核有多种取法。 例如, 取 \[\begin{aligned} K(\boldsymbol x, \boldsymbol x') = \left\{ 1 + \sum_{j=1}^p x_j x_j' \right\}^d \end{aligned}\] 其中\(d>1\)为正整数, 称为多项式核, 则结果是多项式边界的判别法, 本质上是对线性的支持向量方法添加了高次项和交叉项。
利用核代替内积后, 判别法的判别函数变成 \[\begin{aligned} f(\boldsymbol x) = \beta_0 + \sum_{i \in \mathcal S} K(\boldsymbol x, \boldsymbol x_i) \end{aligned}\]
另一种常用的核是径向核(radial kernel), 定义为 \[\begin{aligned} K(\boldsymbol x, \boldsymbol x') = \exp\left\{ - \gamma \sum_{j=1}^p (x_j - x_j')^2 \right\} \end{aligned}\] \(\gamma\)为正常数。 当\(\boldsymbol x\)和\(\boldsymbol x'\)分别落在以原点为中心的两个超球面上时, 其核函数值不变。
使用径向核时, 判别函数为 \[\begin{aligned} f(\boldsymbol x) = \beta_0 + \sum_{i \in \mathcal S} \exp\left\{ - \gamma \sum_{j=1}^p (x_{j} - x_{ij})^2 \right\} \end{aligned}\] 对一个待判别的观测\(\boldsymbol x^*\), 如果\(\boldsymbol x^*\)距离训练观测点\(\boldsymbol x_i\)较远, 则\(K(\boldsymbol x^*, \boldsymbol x_i)\)的值很小, \(\boldsymbol x_i\)对\(\boldsymbol x^*\)的判别基本不起作用。 这样的性质使得径向核方法具有很强的局部性, 只有离\(\boldsymbol x^*\)很近的点才对其判别起作用。
为什么采用核函数计算观测两两的\(\binom{n}{2}\)个核函数值, 而不是直接增加非线性项? 原因是计算这些核函数值计算量是确定的, 而增加许多非线性项, 则可能有很大的计算量, 而且某些核如径向核对应的自变量空间维数是无穷维的, 不能通过添加维度的办法解决。
支持向量机的理论基于再生核希尔伯特空间(RKHS), 可参见(Trevor Hastie 2009)节5.8和节12.3.3。
41.6.1 支持向量机用于Heart数据
考虑心脏病数据Heart的判别。 共297个观测, 随机选取其中207个作为训练集, 90个作为测试集。
set.seed(1)
Heart <- read.csv(
"data/Heart.csv", header=TRUE, row.names=1,
stringsAsFactors=TRUE)
d <- na.omit(Heart)
train <- sample(nrow(d), size=207)
test <- -train
d[["AHD"]] <- factor(d[["AHD"]], levels=c("No", "Yes"))
定义一个错判率函数:
classifier.error <- function(truth, pred){
tab1 <- table(truth, pred)
err <- 1 - sum(diag(tab1))/sum(c(tab1))
err
}
41.6.1.1 线性的SVM
支持向量判别法就是SVM取多项式核,
阶数\(d=1\)的情形。
需要一个调节参数cost
,
cost
越大,
分隔边界越窄,
过度拟合危险越大。
先随便取调节参数cost=1
试验支持向量判别法:
res.svc <- svm(AHD ~ ., data=d[train,], kernel="linear", cost=1, scale=TRUE)
fit.svc <- predict(res.svc)
summary(res.svc)
##
## Call:
## svm(formula = AHD ~ ., data = d[train, ], kernel = "linear", cost = 1, scale = TRUE)
##
##
## Parameters:
## SVM-Type: C-classification
## SVM-Kernel: linear
## cost: 1
##
## Number of Support Vectors: 79
##
## ( 38 41 )
##
##
## Number of Classes: 2
##
## Levels:
## No Yes
计算拟合结果并计算错判率:
## fitted
## truth No Yes
## No 105 9
## Yes 18 75
## SVC错判率: 0.13
e1071函数提供了tune()
函数,
可以在训练集上用十折交叉验证选择较好的调节参数。
set.seed(101)
res.tune <- tune(svm, AHD ~ ., data=d[train,], kernel="linear", scale=TRUE,
ranges=list(cost=c(0.001, 0.01, 0.1, 1, 5, 10, 100, 1000)))
summary(res.tune)
##
## Parameter tuning of 'svm':
##
## - sampling method: 10-fold cross validation
##
## - best parameters:
## cost
## 0.1
##
## - best performance: 0.1542857
##
## - Detailed performance results:
## cost error dispersion
## 1 1e-03 0.4450000 0.08509809
## 2 1e-02 0.1695238 0.07062868
## 3 1e-01 0.1542857 0.07006458
## 4 1e+00 0.1590476 0.07793796
## 5 5e+00 0.1590476 0.08709789
## 6 1e+01 0.1590476 0.08709789
## 7 1e+02 0.1590476 0.08709789
## 8 1e+03 0.1590476 0.08709789
找到的最优调节参数为0.1,
可以用res.tune$best.model
获得对应于最优调节参数的模型:
##
## Call:
## best.tune(method = svm, train.x = AHD ~ ., data = d[train, ], ranges = list(cost = c(0.001, 0.01, 0.1, 1, 5, 10, 100, 1000)), kernel = "linear", scale = TRUE)
##
##
## Parameters:
## SVM-Type: C-classification
## SVM-Kernel: linear
## cost: 0.1
##
## Number of Support Vectors: 90
##
## ( 44 46 )
##
##
## Number of Classes: 2
##
## Levels:
## No Yes
在测试集上测试:
pred.svc <- predict(res.tune$best.model, newdata=d[test,])
tab1 <- table(truth=d[test,"AHD"], predict=pred.svc); tab1
## predict
## truth No Yes
## No 43 3
## Yes 11 33
## SVC错判率: 0.16
41.6.1.2 多项式核SVM
res.svm1 <- svm(AHD ~ ., data=d[train,], kernel="polynomial",
order=2, cost=0.1, scale=TRUE)
fit.svm1 <- predict(res.svm1)
summary(res.svm1)
##
## Call:
## svm(formula = AHD ~ ., data = d[train, ], kernel = "polynomial", order = 2, cost = 0.1, scale = TRUE)
##
##
## Parameters:
## SVM-Type: C-classification
## SVM-Kernel: polynomial
## cost: 0.1
## degree: 3
## coef.0: 0
##
## Number of Support Vectors: 187
##
## ( 92 95 )
##
##
## Number of Classes: 2
##
## Levels:
## No Yes
## fitted
## truth No Yes
## No 114 0
## Yes 82 11
## 2阶多项式核SVM错判率: 0.4
尝试找到调节参数cost
的最优值:
set.seed(101)
res.tune2 <- tune(svm, AHD ~ ., data=d[train,], kernel="polynomial",
order=2, scale=TRUE,
ranges=list(cost=c(0.001, 0.01, 0.1, 1, 5, 10, 100, 1000)))
summary(res.tune2)
##
## Parameter tuning of 'svm':
##
## - sampling method: 10-fold cross validation
##
## - best parameters:
## cost
## 5
##
## - best performance: 0.2130952
##
## - Detailed performance results:
## cost error dispersion
## 1 1e-03 0.4500000 0.08022549
## 2 1e-02 0.4500000 0.08022549
## 3 1e-01 0.4111905 0.09215957
## 4 1e+00 0.2185714 0.09094005
## 5 5e+00 0.2130952 0.09790737
## 6 1e+01 0.2180952 0.07948562
## 7 1e+02 0.2807143 0.09539966
## 8 1e+03 0.2807143 0.09539966
fit.svm2 <- predict(res.tune2$best.model)
tab1 <- table(truth=d[train,"AHD"], fitted=fit.svm2); tab1
## fitted
## truth No Yes
## No 111 3
## Yes 4 89
## 2阶多项式核最优参数SVM错判率: 0.03
看这个最优调节参数的模型在测试集上的表现:
pred.svm2 <- predict(res.tune2$best.model, d[test,])
tab1 <- table(truth=d[test,"AHD"], predict=pred.svm2); tab1
## predict
## truth No Yes
## No 43 3
## Yes 10 34
## 2阶多项式核最优参数SVM测试集错判率: 0.14
在测试集上的表现与线性方法相近。
41.6.1.3 径向核SVM
径向核需要的参数为\(\gamma\)值。
取参数gamma=0.1
。
res.svm3 <- svm(AHD ~ ., data=d[train,], kernel="radial",
gamma=0.1, cost=0.1, scale=TRUE)
fit.svm3 <- predict(res.svm3)
summary(res.svm3)
##
## Call:
## svm(formula = AHD ~ ., data = d[train, ], kernel = "radial", gamma = 0.1, cost = 0.1, scale = TRUE)
##
##
## Parameters:
## SVM-Type: C-classification
## SVM-Kernel: radial
## cost: 0.1
##
## Number of Support Vectors: 179
##
## ( 89 90 )
##
##
## Number of Classes: 2
##
## Levels:
## No Yes
## fitted
## truth No Yes
## No 108 6
## Yes 26 67
## 径向核(gamma=0.1, cost=0.1)SVM错判率: 0.15
选取最优cost
, gamma
调节参数:
set.seed(101)
res.tune4 <- tune(svm, AHD ~ ., data=d[train,], kernel="radial",
scale=TRUE,
ranges=list(cost=c(0.001, 0.01, 0.1, 1, 5, 10, 100, 1000),
gamma=c(0.1, 0.01, 0.001)))
summary(res.tune4)
##
## Parameter tuning of 'svm':
##
## - sampling method: 10-fold cross validation
##
## - best parameters:
## cost gamma
## 100 0.001
##
## - best performance: 0.1492857
##
## - Detailed performance results:
## cost gamma error dispersion
## 1 1e-03 0.100 0.4500000 0.08022549
## 2 1e-02 0.100 0.4500000 0.08022549
## 3 1e-01 0.100 0.2235714 0.09912346
## 4 1e+00 0.100 0.1788095 0.08490543
## 5 5e+00 0.100 0.1835714 0.06267781
## 6 1e+01 0.100 0.1835714 0.07375788
## 7 1e+02 0.100 0.1933333 0.09294732
## 8 1e+03 0.100 0.1933333 0.09294732
## 9 1e-03 0.010 0.4500000 0.08022549
## 10 1e-02 0.010 0.4500000 0.08022549
## 11 1e-01 0.010 0.3147619 0.11998992
## 12 1e+00 0.010 0.1647619 0.06992960
## 13 5e+00 0.010 0.1547619 0.07819776
## 14 1e+01 0.010 0.1547619 0.08135598
## 15 1e+02 0.010 0.2126190 0.06443790
## 16 1e+03 0.010 0.2409524 0.08621108
## 17 1e-03 0.001 0.4500000 0.08022549
## 18 1e-02 0.001 0.4500000 0.08022549
## 19 1e-01 0.001 0.4500000 0.08022549
## 20 1e+00 0.001 0.2138095 0.11215945
## 21 5e+00 0.001 0.1695238 0.07062868
## 22 1e+01 0.001 0.1840476 0.08321647
## 23 1e+02 0.001 0.1492857 0.08228019
## 24 1e+03 0.001 0.1640476 0.07494392
fit.svm4 <- predict(res.tune4$best.model)
tab1 <- table(truth=d[train,"AHD"], fitted=fit.svm4); tab1
## fitted
## truth No Yes
## No 107 7
## Yes 18 75
## 径向核最优参数SVM错判率: 0.12
看这个最优调节参数的模型在测试集上的表现:
pred.svm4 <- predict(res.tune4$best.model, d[test,])
tab1 <- table(truth=d[test,"AHD"], predict=pred.svm2); tab1
## predict
## truth No Yes
## No 43 3
## Yes 10 34
## 径向核最优参数SVM测试集错判率: 0.14
与线性方法结果相近。
41.7 用H2O包进行统计学习计算
H2O是一个开源的、集成的机器学习环境, 基于Java语言开发, 支持并行处理, 支持大型数据。 R的H2O扩展包提供了对H2O软件的接口, 可以用比较统一的界面访问各种机器学习方法。
H2O使用自己的数据格式,
R的data.frame和data.table可以用as.h2o()
函数转换为H2O的H2OFrame格式。
H2O的R扩展包利用网络服务访问正在运行的H2O软件, R本身并不进行计算和数据存储。
41.7.1 安装
如果安装有旧的H2O版本, 应预先卸载。 H2O包还依赖RCurl包和jsonlite包, 应提前安装。
H2O需要使用Java语言, 所以应该先安装一个Java环境, 64位JRE即可(Java运行环境), 64位的JDK则可以支持Java源代码编译和H2O测试。 在Windows下, 下载链接为:
H2O需要进行源代码编译, 所以在Windows操作系统中使用需要安装RTools工具包。
从下列链接下载H2O的源代码形式的扩展包:
放在当前工作目录后, 用如下命令安装, 安装时需要进行编译:
41.7.2 启动和退出H2O
启动:
library(h2o)
h2o.init(
nthreads = -1, max_mem_size = '16g',
ip = "127.0.0.1", port = 54321)
h2o.no_progress()
因为启动了一个本地服务, 所以退出H2O时应该有一个关闭动作:
41.7.3 Hitters数据示例
转换数据格式为H2O格式:
拆分训练集、测试集:
splits <- h2o.splitFrame(
data = hf_hit,
ratios = c(0.60), seed = 1234)
train <- splits[[1]]
test <- splits[[2]]
设置自变量、因变量:
用GBM方法。 先人为指定调优参数进行测试:
gbm1 <- h2o.gbm(
y=y, x=x,
training_frame = train,
ntrees = 10,
max_depth = 2,
min_rows = 3,
learn_rate = 0.1,
distribution= "gaussian")
迭代过程的显示:
结果略。
训练集上的表现:
H2ORegressionMetrics: gbm
** Reported on training data. **
MSE: 98531.6
RMSE: 313.8974
MAE: 223.4093
RMSLE: 0.6764681
Mean Residual Deviance : 98531.6
训练集上的RMSE为314。
变量重要度的度量:
Variable Importances:
variable relative_importance scaled_importance percentage
1 CRBI 34232028.000000 1.000000 0.288900
2 CHits 22870448.000000 0.668101 0.193014
3 Walks 21664550.000000 0.632874 0.182837
4 Runs 11754988.000000 0.343392 0.099206
5 CAtBat 8579465.000000 0.250627 0.072406
6 AtBat 5039900.000000 0.147228 0.042534
7 Hits 4386617.000000 0.128144 0.037021
8 CHmRun 3640191.250000 0.106339 0.030721
9 CRuns 3588647.750000 0.104833 0.030286
10 RBI 1586647.750000 0.046350 0.013390
11 CWalks 1147642.875000 0.033525 0.009685
12 HmRun 0.000000 0.000000 0.000000
13 Years 0.000000 0.000000 0.000000
14 League 0.000000 0.000000 0.000000
15 Division 0.000000 0.000000 0.000000
16 PutOuts 0.000000 0.000000 0.000000
17 Assists 0.000000 0.000000 0.000000
18 Errors 0.000000 0.000000 0.000000
19 NewLeague 0.000000 0.000000 0.000000
可以用结果中scaled_importance
作为每个变量重要程度的度量。
可以用条形图显示:
图形略。
下面进行参数调优。 H2O有两种参数调优方法, 第一种方法是将每个参数的若干个可能的值进行完全组合, 形成一个完全设计试验方案, 称为一个网格, 然后对每一种参数组合训练一个模型, 用交叉验证或者验证集比较这些模型; 第二种方法是形成了网格后, 在网格中随机均匀抽取进行模型比较, 这种方法可以设置一个时间限制, 在限制时间内找到较优模型, 其网格可以密集一些。
例如, 若参数\(A\)可取\(0.5, 1.5\), \(B\)可取\(10, 20\), \(C\)可取\(0.01, 0.1\), 则网格(完全试验方案)为: \[ \begin{array}{rlll} \text{NO} & A & B & C \\ 1 & 0.5 & 10 & 0.01 \\ 2 & 0.5 & 10 & 0.1 \\ 3 & 0.5 & 20 & 0.01 \\ 4 & 0.5 & 20 & 0.1 \\ 5 & 1.5 & 10 & 0.01 \\ 6 & 1.5 & 10 & 0.1 \\ 7 & 1.5 & 20 & 0.01 \\ 8 & 1.5 & 20 & 0.1 \end{array} \]
先用一个较小的网格搜索。 用默认的交叉验证方法。 仅修改树棵数、树最大深度、学习率参数。
time0 <- proc.time()[3]
gbm_params1 <- list(
ntrees = c(10, 20, 30),
max_depth = c(3, 5, 10),
min_rows = c(3, 5, 10),
learn_rate = c(0.01, 0.1, 0.5))
gbm_grid1 <- h2o.grid(
"gbm",
x = x,
y = y,
grid_id = "gbm_grid1",
training_frame = train,
nfolds=5,
seed = 1,
hyper_params= gbm_params1)
time_search <- paste(
round((proc.time()[3] - time0)/60), "minuntes")
cat("Time used:", time_search, "\n")
gbm_gridperf1 <- h2o.getGrid(
grid_id = "gbm_grid1",
sort_by = "rmse",
decreasing = FALSE)
gbm_gridperf1@summary_table
Hyper-Parameter Search Summary: ordered by increasing rmse
learn_rate max_depth min_rows ntrees model_ids rmse
1 0.50000 10.00000 3.00000 10.00000 gbm_grid1_model_9 353.58941
2 0.50000 10.00000 3.00000 20.00000 gbm_grid1_model_36 355.47727
3 0.50000 10.00000 3.00000 30.00000 gbm_grid1_model_63 355.57809
4 0.10000 5.00000 3.00000 30.00000 gbm_grid1_model_59 356.84165
5 0.10000 10.00000 3.00000 30.00000 gbm_grid1_model_62 357.29143
---
learn_rate max_depth min_rows ntrees model_ids rmse
76 0.01000 3.00000 10.00000 10.00000 gbm_grid1_model_19 480.13855
77 0.01000 5.00000 3.00000 10.00000 gbm_grid1_model_4 480.30282
78 0.01000 5.00000 5.00000 10.00000 gbm_grid1_model_13 481.28910
79 0.01000 3.00000 5.00000 10.00000 gbm_grid1_model_10 481.49020
80 0.01000 10.00000 5.00000 10.00000 gbm_grid1_model_16 481.72192
81 0.01000 3.00000 3.00000 10.00000 gbm_grid1_model_1 481.72711
完成参数网格优化后,
可以用h2o.getGrid()
从优化结果中获取网格参数对应的各个模型,
并可以按RMSE、AOC等指标对模型排序显示。
可以用模型代码访问其中的具体模型。
最优参数组合为:
ntrees = 10
;max_depth = 10
;learn_rate = 0.5
;min_rows = 3
。
交叉验证的RMSE为354。
目前的最优模型:
此模型的变量重要度度量:
结果略, 与gbm1的排序有较大变化。
在最优组合附近再次进行搜索, 但使用离散随机化搜索策略, 取一个较密集的网格, 限制时间为5分钟:
time0 <- proc.time()[3]
gbm_params2 <- list(
ntrees = seq(5, 50, by=5),
max_depth = seq(1, 20, by=1),
min_rows = seq(2, 20, by=1),
learn_rate = c(0.01*(5:9), 0.1*(1:5)))
search_criteria2 <- list(
strategy = "RandomDiscrete",
max_runtime_secs = 300)
gbm_grid2 <- h2o.grid(
"gbm",
x = x,
y = y,
grid_id = "gbm_grid2",
training_frame = train,
nfolds = 5,
seed = 1,
hyper_params= gbm_params2,
search_criteria = search_criteria2)
time_search <- paste(
round((proc.time()[3] - time0)/60), "minuntes")
cat("Time used:", time_search, "\n")
gbm_gridperf2 <- h2o.getGrid(
grid_id = "gbm_grid2",
sort_by = "rmse",
decreasing = FALSE)
gbm_gridperf2@summary_table
Hyper-Parameter Search Summary: ordered by increasing rmse
learn_rate max_depth min_rows ntrees model_ids rmse
1 0.10000 3.00000 2.00000 30.00000 gbm_grid2_model_820 344.00923
2 0.05000 4.00000 2.00000 35.00000 gbm_grid2_model_1151 344.84749
3 0.50000 9.00000 3.00000 5.00000 gbm_grid2_model_1033 344.85596
4 0.07000 15.00000 3.00000 50.00000 gbm_grid2_model_884 346.27286
5 0.07000 18.00000 3.00000 40.00000 gbm_grid2_model_675 347.23740
---
learn_rate max_depth min_rows ntrees model_ids rmse
1353 0.05000 18.00000 6.00000 5.00000 gbm_grid2_model_1197 455.81170
1354 0.05000 2.00000 2.00000 5.00000 gbm_grid2_model_1121 456.76030
1355 0.05000 2.00000 3.00000 5.00000 gbm_grid2_model_1305 458.97621
1356 0.08000 1.00000 5.00000 5.00000 gbm_grid2_model_571 459.82785
1357 0.06000 1.00000 5.00000 5.00000 gbm_grid2_model_793 469.72791
1358 0.05000 1.00000 2.00000 5.00000 gbm_grid2_model_167 473.67680
最优参数组合:
ntrees = 30
;max_depth = 3
;learn_rate = 0.1
;min_rows = 2
。
交叉核实的RMSE为344。
提取调优结果的最优模型:
使用最后找到的最优模型在测试集上进行预测比较:
H2ORegressionMetrics: gbm
MSE: 77536.85
RMSE: 278.4544
MAE: 175.6084
RMSLE: 0.5015645
Mean Residual Deviance : 77536.85
测试集上的RMSE为278, 比较理想。
变量重要度分析:
Variable Importances:
Variable Relative Importance Scaled Importance Percentage
1 CRBI 132.466204 1.000000 0.222229
2 Walks 96.518919 0.728631 0.161923
3 CHits 59.025072 0.445586 0.099022
4 CHmRun 55.486735 0.418875 0.093086
5 Runs 50.785759 0.383387 0.085200
6 CRuns 38.727044 0.292354 0.064970
7 CAtBat 33.082905 0.249746 0.055501
8 Hits 25.749126 0.194383 0.043197
9 CWalks 20.566305 0.155257 0.034503
10 RBI 16.308211 0.123112 0.027359
11 Years 15.404481 0.116290 0.025843
12 AtBat 14.144269 0.106776 0.023729
13 Errors 14.052819 0.106086 0.023575
14 PutOuts 10.156549 0.076673 0.017039
15 HmRun 5.738121 0.043318 0.009626
16 Division 4.937515 0.037274 0.008283
17 NewLeague 1.975050 0.014910 0.003313
18 Assists 0.954782 0.007208 0.001602
重要度作图:
图形略。
在测试集上计算因变量预测值:
predict
1 423.0851
2 890.0124
3 160.4647
4 815.4056
5 1271.8789
6 181.5200
变量解释性分析:
这会产生多个关于每个变量的贡献的图形。 也有一些单个图形的函数, 比如SHAP概况图:
SHAP计算每个观测上每个变量的贡献值, 并对变量的总的贡献由大到小排序, 并用散点图绘制出这些贡献。 结果如:
变量重要度图:
41.7.4 AutoML
H2O提供了一个AutoML功能, 可以自动使用各个机器学习方法进行训练、参数调优、模型比较, 输出占优的多个模型。
用户仅需要指定训练数据集training_frame
、因变量y
、最多允许训练时间max_runtime_secs
,
自变量自动选择为因变量以外的所有变量,
参数调优自动使用交叉验证方法。
示例:
library(h2o)
h2o.init()
train <- h2o.importFile("https://s3.amazonaws.com/erin-data/higgs/higgs_train_10k.csv")
test <- h2o.importFile("https://s3.amazonaws.com/erin-data/higgs/higgs_test_5k.csv")
y <- "response"
x <- setdiff(names(train), y)
# 分类问题的因变量必须是因子
train[, y] <- as.factor(train[, y])
test[, y] <- as.factor(test[, y])
# 限制5分钟
aml <- h2o.automl(
x = x, y = y,
training_frame = train,
#max_models = 20,
max_runtime_secs = 300,
seed = 1)
# View the AutoML Leaderboard
lb <- aml@leaderboard
print(lb, n = nrow(lb))
model_id auc logloss aucpr
1 StackedEnsemble_AllModels_3_AutoML_1_20230717_82125 0.7896537 0.5492908 0.8084317
2 StackedEnsemble_AllModels_4_AutoML_1_20230717_82125 0.7888052 0.5503257 0.8076245
3 StackedEnsemble_AllModels_2_AutoML_1_20230717_82125 0.7874863 0.5515817 0.8072801
4 StackedEnsemble_AllModels_1_AutoML_1_20230717_82125 0.7867515 0.5522508 0.8069401
5 StackedEnsemble_BestOfFamily_4_AutoML_1_20230717_82125 0.7854556 0.5534061 0.8053178
6 StackedEnsemble_BestOfFamily_5_AutoML_1_20230717_82125 0.7847936 0.5542375 0.8050583
7 StackedEnsemble_BestOfFamily_3_AutoML_1_20230717_82125 0.7832484 0.5556922 0.8029427
8 StackedEnsemble_BestOfFamily_2_AutoML_1_20230717_82125 0.7819484 0.5568627 0.8017783
9 StackedEnsemble_AllModels_5_AutoML_1_20230717_82125 0.7817324 0.5638011 0.7997335
10 StackedEnsemble_BestOfFamily_1_AutoML_1_20230717_82125 0.7800970 0.5592433 0.7990314
11 GBM_grid_1_AutoML_1_20230717_82125_model_12 0.7800394 0.5595114 0.8014000
12 GBM_grid_1_AutoML_1_20230717_82125_model_9 0.7797381 0.5625036 0.7983718
13 GBM_1_AutoML_1_20230717_82125 0.7795121 0.5602557 0.7995356
14 GBM_2_AutoML_1_20230717_82125 0.7792939 0.5608256 0.7984392
15 GBM_grid_1_AutoML_1_20230717_82125_model_17 0.7790189 0.5649027 0.7959446
16 GBM_grid_1_AutoML_1_20230717_82125_model_16 0.7788996 0.5624376 0.7947606
17 GBM_5_AutoML_1_20230717_82125 0.7788048 0.5617556 0.7967867
18 GBM_grid_1_AutoML_1_20230717_82125_model_19 0.7786671 0.5639216 0.7971413
19 StackedEnsemble_BestOfFamily_6_AutoML_1_20230717_82125 0.7779028 0.5602988 0.7989296
20 GBM_grid_1_AutoML_1_20230717_82125_model_2 0.7778602 0.5646552 0.7953585
21 GBM_grid_1_AutoML_1_20230717_82125_model_14 0.7775555 0.5668371 0.7924693
22 GBM_grid_1_AutoML_1_20230717_82125_model_6 0.7772192 0.5642876 0.7954070
23 GBM_grid_1_AutoML_1_20230717_82125_model_7 0.7764426 0.5701478 0.7923477
24 GBM_3_AutoML_1_20230717_82125 0.7751876 0.5650460 0.7946101
25 GBM_4_AutoML_1_20230717_82125 0.7742870 0.5656442 0.7963992
26 GBM_grid_1_AutoML_1_20230717_82125_model_11 0.7734054 0.5716275 0.7919521
27 GBM_grid_1_AutoML_1_20230717_82125_model_3 0.7729262 0.5681808 0.7911955
28 GBM_grid_1_AutoML_1_20230717_82125_model_4 0.7705223 0.5692442 0.7890998
29 GBM_grid_1_AutoML_1_20230717_82125_model_5 0.7704555 0.5732127 0.7881083
30 XRT_1_AutoML_1_20230717_82125 0.7642216 0.5814393 0.7820797
31 DRF_1_AutoML_1_20230717_82125 0.7631956 0.5802385 0.7840833
32 GBM_grid_1_AutoML_1_20230717_82125_model_10 0.7603439 0.5805147 0.7762872
33 GBM_grid_1_AutoML_1_20230717_82125_model_8 0.7532375 0.5947734 0.7703927
34 GBM_grid_1_AutoML_1_20230717_82125_model_15 0.7532095 0.5887163 0.7719831
35 GBM_grid_1_AutoML_1_20230717_82125_model_1 0.7476579 0.5915102 0.7632106
36 GBM_grid_1_AutoML_1_20230717_82125_model_13 0.7426757 0.6044879 0.7619594
37 DeepLearning_grid_2_AutoML_1_20230717_82125_model_1 0.7297311 0.6137454 0.7358833
38 DeepLearning_grid_1_AutoML_1_20230717_82125_model_1 0.7265855 0.6634738 0.7275126
39 GBM_grid_1_AutoML_1_20230717_82125_model_18 0.7245035 0.6152842 0.7447474
40 DeepLearning_grid_3_AutoML_1_20230717_82125_model_1 0.7160532 0.6232399 0.7192921
41 DeepLearning_grid_1_AutoML_1_20230717_82125_model_2 0.7142102 0.6319313 0.7162592
42 DeepLearning_1_AutoML_1_20230717_82125 0.7081655 0.6274959 0.7123640
43 DeepLearning_grid_1_AutoML_1_20230717_82125_model_3 0.7042074 0.6544330 0.7070402
44 GLM_1_AutoML_1_20230717_82125 0.6826483 0.6385202 0.6807189
mean_per_class_error rmse mse
1 0.3281307 0.4317858 0.1864389
2 0.3212199 0.4322351 0.1868272
3 0.3315550 0.4328426 0.1873527
4 0.3280114 0.4331952 0.1876580
5 0.3354530 0.4338531 0.1882285
6 0.3293683 0.4341424 0.1884796
7 0.3363375 0.4347748 0.1890291
8 0.3316707 0.4353819 0.1895574
9 0.3214789 0.4371155 0.1910699
10 0.3486560 0.4362937 0.1903522
11 0.3367738 0.4365091 0.1905402
12 0.3330875 0.4371565 0.1911058
13 0.3275111 0.4366086 0.1906271
14 0.3278906 0.4367848 0.1907810
15 0.3285993 0.4379804 0.1918268
16 0.3347428 0.4372481 0.1911859
17 0.3343263 0.4371076 0.1910630
18 0.3363475 0.4377541 0.1916287
19 0.3301687 0.4370242 0.1909902
20 0.3337600 0.4380880 0.1919211
21 0.3236620 0.4387509 0.1925023
22 0.3248413 0.4380742 0.1919090
23 0.3332508 0.4402129 0.1937874
24 0.3302285 0.4388332 0.1925746
25 0.3456632 0.4393214 0.1930033
26 0.3288446 0.4411005 0.1945696
27 0.3228082 0.4399974 0.1935977
28 0.3497369 0.4407917 0.1942973
29 0.3286788 0.4424870 0.1957948
30 0.3474700 0.4457808 0.1987205
31 0.3492529 0.4455428 0.1985084
32 0.3560809 0.4456789 0.1986297
33 0.3445959 0.4515736 0.2039187
34 0.3537379 0.4498069 0.2023263
35 0.3594190 0.4510526 0.2034484
36 0.3540427 0.4563462 0.2082518
37 0.3674737 0.4596807 0.2113064
38 0.3713030 0.4701575 0.2210480
39 0.3957685 0.4618223 0.2132799
40 0.3822944 0.4643025 0.2155768
41 0.3894767 0.4655305 0.2167187
42 0.3786903 0.4666819 0.2177920
43 0.4008949 0.4726119 0.2233620
44 0.3972341 0.4726827 0.2234289
41.8 附录
41.8.1 Hitters数据
AtBat | Hits | HmRun | Runs | RBI | Walks | Years | CAtBat | CHits | CHmRun | CRuns | CRBI | CWalks | League | Division | PutOuts | Assists | Errors | Salary | NewLeague | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
-Andy Allanson | 293 | 66 | 1 | 30 | 29 | 14 | 1 | 293 | 66 | 1 | 30 | 29 | 14 | A | E | 446 | 33 | 20 | NA | A |
-Alan Ashby | 315 | 81 | 7 | 24 | 38 | 39 | 14 | 3449 | 835 | 69 | 321 | 414 | 375 | N | W | 632 | 43 | 10 | 475.000 | N |
-Alvin Davis | 479 | 130 | 18 | 66 | 72 | 76 | 3 | 1624 | 457 | 63 | 224 | 266 | 263 | A | W | 880 | 82 | 14 | 480.000 | A |
-Andre Dawson | 496 | 141 | 20 | 65 | 78 | 37 | 11 | 5628 | 1575 | 225 | 828 | 838 | 354 | N | E | 200 | 11 | 3 | 500.000 | N |
-Andres Galarraga | 321 | 87 | 10 | 39 | 42 | 30 | 2 | 396 | 101 | 12 | 48 | 46 | 33 | N | E | 805 | 40 | 4 | 91.500 | N |
-Alfredo Griffin | 594 | 169 | 4 | 74 | 51 | 35 | 11 | 4408 | 1133 | 19 | 501 | 336 | 194 | A | W | 282 | 421 | 25 | 750.000 | A |
-Al Newman | 185 | 37 | 1 | 23 | 8 | 21 | 2 | 214 | 42 | 1 | 30 | 9 | 24 | N | E | 76 | 127 | 7 | 70.000 | A |
-Argenis Salazar | 298 | 73 | 0 | 24 | 24 | 7 | 3 | 509 | 108 | 0 | 41 | 37 | 12 | A | W | 121 | 283 | 9 | 100.000 | A |
-Andres Thomas | 323 | 81 | 6 | 26 | 32 | 8 | 2 | 341 | 86 | 6 | 32 | 34 | 8 | N | W | 143 | 290 | 19 | 75.000 | N |
-Andre Thornton | 401 | 92 | 17 | 49 | 66 | 65 | 13 | 5206 | 1332 | 253 | 784 | 890 | 866 | A | E | 0 | 0 | 0 | 1100.000 | A |
-Alan Trammell | 574 | 159 | 21 | 107 | 75 | 59 | 10 | 4631 | 1300 | 90 | 702 | 504 | 488 | A | E | 238 | 445 | 22 | 517.143 | A |
-Alex Trevino | 202 | 53 | 4 | 31 | 26 | 27 | 9 | 1876 | 467 | 15 | 192 | 186 | 161 | N | W | 304 | 45 | 11 | 512.500 | N |
-Andy VanSlyke | 418 | 113 | 13 | 48 | 61 | 47 | 4 | 1512 | 392 | 41 | 205 | 204 | 203 | N | E | 211 | 11 | 7 | 550.000 | N |
-Alan Wiggins | 239 | 60 | 0 | 30 | 11 | 22 | 6 | 1941 | 510 | 4 | 309 | 103 | 207 | A | E | 121 | 151 | 6 | 700.000 | A |
-Bill Almon | 196 | 43 | 7 | 29 | 27 | 30 | 13 | 3231 | 825 | 36 | 376 | 290 | 238 | N | E | 80 | 45 | 8 | 240.000 | N |
-Billy Beane | 183 | 39 | 3 | 20 | 15 | 11 | 3 | 201 | 42 | 3 | 20 | 16 | 11 | A | W | 118 | 0 | 0 | NA | A |
-Buddy Bell | 568 | 158 | 20 | 89 | 75 | 73 | 15 | 8068 | 2273 | 177 | 1045 | 993 | 732 | N | W | 105 | 290 | 10 | 775.000 | N |
-Buddy Biancalana | 190 | 46 | 2 | 24 | 8 | 15 | 5 | 479 | 102 | 5 | 65 | 23 | 39 | A | W | 102 | 177 | 16 | 175.000 | A |
-Bruce Bochte | 407 | 104 | 6 | 57 | 43 | 65 | 12 | 5233 | 1478 | 100 | 643 | 658 | 653 | A | W | 912 | 88 | 9 | NA | A |
-Bruce Bochy | 127 | 32 | 8 | 16 | 22 | 14 | 8 | 727 | 180 | 24 | 67 | 82 | 56 | N | W | 202 | 22 | 2 | 135.000 | N |
-Barry Bonds | 413 | 92 | 16 | 72 | 48 | 65 | 1 | 413 | 92 | 16 | 72 | 48 | 65 | N | E | 280 | 9 | 5 | 100.000 | N |
-Bobby Bonilla | 426 | 109 | 3 | 55 | 43 | 62 | 1 | 426 | 109 | 3 | 55 | 43 | 62 | A | W | 361 | 22 | 2 | 115.000 | N |
-Bob Boone | 22 | 10 | 1 | 4 | 2 | 1 | 6 | 84 | 26 | 2 | 9 | 9 | 3 | A | W | 812 | 84 | 11 | NA | A |
-Bob Brenly | 472 | 116 | 16 | 60 | 62 | 74 | 6 | 1924 | 489 | 67 | 242 | 251 | 240 | N | W | 518 | 55 | 3 | 600.000 | N |
-Bill Buckner | 629 | 168 | 18 | 73 | 102 | 40 | 18 | 8424 | 2464 | 164 | 1008 | 1072 | 402 | A | E | 1067 | 157 | 14 | 776.667 | A |
-Brett Butler | 587 | 163 | 4 | 92 | 51 | 70 | 6 | 2695 | 747 | 17 | 442 | 198 | 317 | A | E | 434 | 9 | 3 | 765.000 | A |
-Bob Dernier | 324 | 73 | 4 | 32 | 18 | 22 | 7 | 1931 | 491 | 13 | 291 | 108 | 180 | N | E | 222 | 3 | 3 | 708.333 | N |
-Bo Diaz | 474 | 129 | 10 | 50 | 56 | 40 | 10 | 2331 | 604 | 61 | 246 | 327 | 166 | N | W | 732 | 83 | 13 | 750.000 | N |
-Bill Doran | 550 | 152 | 6 | 92 | 37 | 81 | 5 | 2308 | 633 | 32 | 349 | 182 | 308 | N | W | 262 | 329 | 16 | 625.000 | N |
-Brian Downing | 513 | 137 | 20 | 90 | 95 | 90 | 14 | 5201 | 1382 | 166 | 763 | 734 | 784 | A | W | 267 | 5 | 3 | 900.000 | A |
-Bobby Grich | 313 | 84 | 9 | 42 | 30 | 39 | 17 | 6890 | 1833 | 224 | 1033 | 864 | 1087 | A | W | 127 | 221 | 7 | NA | A |
-Billy Hatcher | 419 | 108 | 6 | 55 | 36 | 22 | 3 | 591 | 149 | 8 | 80 | 46 | 31 | N | W | 226 | 7 | 4 | 110.000 | N |
-Bob Horner | 517 | 141 | 27 | 70 | 87 | 52 | 9 | 3571 | 994 | 215 | 545 | 652 | 337 | N | W | 1378 | 102 | 8 | NA | N |
-Brook Jacoby | 583 | 168 | 17 | 83 | 80 | 56 | 5 | 1646 | 452 | 44 | 219 | 208 | 136 | A | E | 109 | 292 | 25 | 612.500 | A |
-Bob Kearney | 204 | 49 | 6 | 23 | 25 | 12 | 7 | 1309 | 308 | 27 | 126 | 132 | 66 | A | W | 419 | 46 | 5 | 300.000 | A |
-Bill Madlock | 379 | 106 | 10 | 38 | 60 | 30 | 14 | 6207 | 1906 | 146 | 859 | 803 | 571 | N | W | 72 | 170 | 24 | 850.000 | N |
-Bobby Meacham | 161 | 36 | 0 | 19 | 10 | 17 | 4 | 1053 | 244 | 3 | 156 | 86 | 107 | A | E | 70 | 149 | 12 | NA | A |
-Bob Melvin | 268 | 60 | 5 | 24 | 25 | 15 | 2 | 350 | 78 | 5 | 34 | 29 | 18 | N | W | 442 | 59 | 6 | 90.000 | N |
-Ben Oglivie | 346 | 98 | 5 | 31 | 53 | 30 | 16 | 5913 | 1615 | 235 | 784 | 901 | 560 | A | E | 0 | 0 | 0 | NA | A |
-Bip Roberts | 241 | 61 | 1 | 34 | 12 | 14 | 1 | 241 | 61 | 1 | 34 | 12 | 14 | N | W | 166 | 172 | 10 | NA | N |
-BillyJo Robidoux | 181 | 41 | 1 | 15 | 21 | 33 | 2 | 232 | 50 | 4 | 20 | 29 | 45 | A | E | 326 | 29 | 5 | 67.500 | A |
-Bill Russell | 216 | 54 | 0 | 21 | 18 | 15 | 18 | 7318 | 1926 | 46 | 796 | 627 | 483 | N | W | 103 | 84 | 5 | NA | N |
-Billy Sample | 200 | 57 | 6 | 23 | 14 | 14 | 9 | 2516 | 684 | 46 | 371 | 230 | 195 | N | W | 69 | 1 | 1 | NA | N |
-Bill Schroeder | 217 | 46 | 7 | 32 | 19 | 9 | 4 | 694 | 160 | 32 | 86 | 76 | 32 | A | E | 307 | 25 | 1 | 180.000 | A |
-Butch Wynegar | 194 | 40 | 7 | 19 | 29 | 30 | 11 | 4183 | 1069 | 64 | 486 | 493 | 608 | A | E | 325 | 22 | 2 | NA | A |
-Chris Bando | 254 | 68 | 2 | 28 | 26 | 22 | 6 | 999 | 236 | 21 | 108 | 117 | 118 | A | E | 359 | 30 | 4 | 305.000 | A |
-Chris Brown | 416 | 132 | 7 | 57 | 49 | 33 | 3 | 932 | 273 | 24 | 113 | 121 | 80 | N | W | 73 | 177 | 18 | 215.000 | N |
-Carmen Castillo | 205 | 57 | 8 | 34 | 32 | 9 | 5 | 756 | 192 | 32 | 117 | 107 | 51 | A | E | 58 | 4 | 4 | 247.500 | A |
-Cecil Cooper | 542 | 140 | 12 | 46 | 75 | 41 | 16 | 7099 | 2130 | 235 | 987 | 1089 | 431 | A | E | 697 | 61 | 9 | NA | A |
-Chili Davis | 526 | 146 | 13 | 71 | 70 | 84 | 6 | 2648 | 715 | 77 | 352 | 342 | 289 | N | W | 303 | 9 | 9 | 815.000 | N |
-Carlton Fisk | 457 | 101 | 14 | 42 | 63 | 22 | 17 | 6521 | 1767 | 281 | 1003 | 977 | 619 | A | W | 389 | 39 | 4 | 875.000 | A |
-Curt Ford | 214 | 53 | 2 | 30 | 29 | 23 | 2 | 226 | 59 | 2 | 32 | 32 | 27 | N | E | 109 | 7 | 3 | 70.000 | N |
-Cliff Johnson | 19 | 7 | 0 | 1 | 2 | 1 | 4 | 41 | 13 | 1 | 3 | 4 | 4 | A | E | 0 | 0 | 0 | NA | A |
-Carney Lansford | 591 | 168 | 19 | 80 | 72 | 39 | 9 | 4478 | 1307 | 113 | 634 | 563 | 319 | A | W | 67 | 147 | 4 | 1200.000 | A |
-Chet Lemon | 403 | 101 | 12 | 45 | 53 | 39 | 12 | 5150 | 1429 | 166 | 747 | 666 | 526 | A | E | 316 | 6 | 5 | 675.000 | A |
-Candy Maldonado | 405 | 102 | 18 | 49 | 85 | 20 | 6 | 950 | 231 | 29 | 99 | 138 | 64 | N | W | 161 | 10 | 3 | 415.000 | N |
-Carmelo Martinez | 244 | 58 | 9 | 28 | 25 | 35 | 4 | 1335 | 333 | 49 | 164 | 179 | 194 | N | W | 142 | 14 | 2 | 340.000 | N |
-Charlie Moore | 235 | 61 | 3 | 24 | 39 | 21 | 14 | 3926 | 1029 | 35 | 441 | 401 | 333 | A | E | 425 | 43 | 4 | NA | A |
-Craig Reynolds | 313 | 78 | 6 | 32 | 41 | 12 | 12 | 3742 | 968 | 35 | 409 | 321 | 170 | N | W | 106 | 206 | 7 | 416.667 | N |
-Cal Ripken | 627 | 177 | 25 | 98 | 81 | 70 | 6 | 3210 | 927 | 133 | 529 | 472 | 313 | A | E | 240 | 482 | 13 | 1350.000 | A |
-Cory Snyder | 416 | 113 | 24 | 58 | 69 | 16 | 1 | 416 | 113 | 24 | 58 | 69 | 16 | A | E | 203 | 70 | 10 | 90.000 | A |
-Chris Speier | 155 | 44 | 6 | 21 | 23 | 15 | 16 | 6631 | 1634 | 98 | 698 | 661 | 777 | N | E | 53 | 88 | 3 | 275.000 | N |
-Curt Wilkerson | 236 | 56 | 0 | 27 | 15 | 11 | 4 | 1115 | 270 | 1 | 116 | 64 | 57 | A | W | 125 | 199 | 13 | 230.000 | A |
-Dave Anderson | 216 | 53 | 1 | 31 | 15 | 22 | 4 | 926 | 210 | 9 | 118 | 69 | 114 | N | W | 73 | 152 | 11 | 225.000 | N |
-Doug Baker | 24 | 3 | 0 | 1 | 0 | 2 | 3 | 159 | 28 | 0 | 20 | 12 | 9 | A | W | 80 | 4 | 0 | NA | A |
-Don Baylor | 585 | 139 | 31 | 93 | 94 | 62 | 17 | 7546 | 1982 | 315 | 1141 | 1179 | 727 | A | E | 0 | 0 | 0 | 950.000 | A |
-Dann Bilardello | 191 | 37 | 4 | 12 | 17 | 14 | 4 | 773 | 163 | 16 | 61 | 74 | 52 | N | E | 391 | 38 | 8 | NA | N |
-Daryl Boston | 199 | 53 | 5 | 29 | 22 | 21 | 3 | 514 | 120 | 8 | 57 | 40 | 39 | A | W | 152 | 3 | 5 | 75.000 | A |
-Darnell Coles | 521 | 142 | 20 | 67 | 86 | 45 | 4 | 815 | 205 | 22 | 99 | 103 | 78 | A | E | 107 | 242 | 23 | 105.000 | A |
-Dave Collins | 419 | 113 | 1 | 44 | 27 | 44 | 12 | 4484 | 1231 | 32 | 612 | 344 | 422 | A | E | 211 | 2 | 1 | NA | A |
-Dave Concepcion | 311 | 81 | 3 | 42 | 30 | 26 | 17 | 8247 | 2198 | 100 | 950 | 909 | 690 | N | W | 153 | 223 | 10 | 320.000 | N |
-Darren Daulton | 138 | 31 | 8 | 18 | 21 | 38 | 3 | 244 | 53 | 12 | 33 | 32 | 55 | N | E | 244 | 21 | 4 | NA | N |
-Doug DeCinces | 512 | 131 | 26 | 69 | 96 | 52 | 14 | 5347 | 1397 | 221 | 712 | 815 | 548 | A | W | 119 | 216 | 12 | 850.000 | A |
-Darrell Evans | 507 | 122 | 29 | 78 | 85 | 91 | 18 | 7761 | 1947 | 347 | 1175 | 1152 | 1380 | A | E | 808 | 108 | 2 | 535.000 | A |
-Dwight Evans | 529 | 137 | 26 | 86 | 97 | 97 | 15 | 6661 | 1785 | 291 | 1082 | 949 | 989 | A | E | 280 | 10 | 5 | 933.333 | A |
-Damaso Garcia | 424 | 119 | 6 | 57 | 46 | 13 | 9 | 3651 | 1046 | 32 | 461 | 301 | 112 | A | E | 224 | 286 | 8 | 850.000 | N |
-Dan Gladden | 351 | 97 | 4 | 55 | 29 | 39 | 4 | 1258 | 353 | 16 | 196 | 110 | 117 | N | W | 226 | 7 | 3 | 210.000 | A |
-Danny Heep | 195 | 55 | 5 | 24 | 33 | 30 | 8 | 1313 | 338 | 25 | 144 | 149 | 153 | N | E | 83 | 2 | 1 | NA | N |
-Dave Henderson | 388 | 103 | 15 | 59 | 47 | 39 | 6 | 2174 | 555 | 80 | 285 | 274 | 186 | A | W | 182 | 9 | 4 | 325.000 | A |
-Donnie Hill | 339 | 96 | 4 | 37 | 29 | 23 | 4 | 1064 | 290 | 11 | 123 | 108 | 55 | A | W | 104 | 213 | 9 | 275.000 | A |
-Dave Kingman | 561 | 118 | 35 | 70 | 94 | 33 | 16 | 6677 | 1575 | 442 | 901 | 1210 | 608 | A | W | 463 | 32 | 8 | NA | A |
-Davey Lopes | 255 | 70 | 7 | 49 | 35 | 43 | 15 | 6311 | 1661 | 154 | 1019 | 608 | 820 | N | E | 51 | 54 | 8 | 450.000 | N |
-Don Mattingly | 677 | 238 | 31 | 117 | 113 | 53 | 5 | 2223 | 737 | 93 | 349 | 401 | 171 | A | E | 1377 | 100 | 6 | 1975.000 | A |
-Darryl Motley | 227 | 46 | 7 | 23 | 20 | 12 | 5 | 1325 | 324 | 44 | 156 | 158 | 67 | A | W | 92 | 2 | 2 | NA | A |
-Dale Murphy | 614 | 163 | 29 | 89 | 83 | 75 | 11 | 5017 | 1388 | 266 | 813 | 822 | 617 | N | W | 303 | 6 | 6 | 1900.000 | N |
-Dwayne Murphy | 329 | 83 | 9 | 50 | 39 | 56 | 9 | 3828 | 948 | 145 | 575 | 528 | 635 | A | W | 276 | 6 | 2 | 600.000 | A |
-Dave Parker | 637 | 174 | 31 | 89 | 116 | 56 | 14 | 6727 | 2024 | 247 | 978 | 1093 | 495 | N | W | 278 | 9 | 9 | 1041.667 | N |
-Dan Pasqua | 280 | 82 | 16 | 44 | 45 | 47 | 2 | 428 | 113 | 25 | 61 | 70 | 63 | A | E | 148 | 4 | 2 | 110.000 | A |
-Darrell Porter | 155 | 41 | 12 | 21 | 29 | 22 | 16 | 5409 | 1338 | 181 | 746 | 805 | 875 | A | W | 165 | 9 | 1 | 260.000 | A |
-Dick Schofield | 458 | 114 | 13 | 67 | 57 | 48 | 4 | 1350 | 298 | 28 | 160 | 123 | 122 | A | W | 246 | 389 | 18 | 475.000 | A |
-Don Slaught | 314 | 83 | 13 | 39 | 46 | 16 | 5 | 1457 | 405 | 28 | 156 | 159 | 76 | A | W | 533 | 40 | 4 | 431.500 | A |
-Darryl Strawberry | 475 | 123 | 27 | 76 | 93 | 72 | 4 | 1810 | 471 | 108 | 292 | 343 | 267 | N | E | 226 | 10 | 6 | 1220.000 | N |
-Dale Sveum | 317 | 78 | 7 | 35 | 35 | 32 | 1 | 317 | 78 | 7 | 35 | 35 | 32 | A | E | 45 | 122 | 26 | 70.000 | A |
-Danny Tartabull | 511 | 138 | 25 | 76 | 96 | 61 | 3 | 592 | 164 | 28 | 87 | 110 | 71 | A | W | 157 | 7 | 8 | 145.000 | A |
-Dickie Thon | 278 | 69 | 3 | 24 | 21 | 29 | 8 | 2079 | 565 | 32 | 258 | 192 | 162 | N | W | 142 | 210 | 10 | NA | N |
-Denny Walling | 382 | 119 | 13 | 54 | 58 | 36 | 12 | 2133 | 594 | 41 | 287 | 294 | 227 | N | W | 59 | 156 | 9 | 595.000 | N |
-Dave Winfield | 565 | 148 | 24 | 90 | 104 | 77 | 14 | 7287 | 2083 | 305 | 1135 | 1234 | 791 | A | E | 292 | 9 | 5 | 1861.460 | A |
-Enos Cabell | 277 | 71 | 2 | 27 | 29 | 14 | 15 | 5952 | 1647 | 60 | 753 | 596 | 259 | N | W | 360 | 32 | 5 | NA | N |
-Eric Davis | 415 | 115 | 27 | 97 | 71 | 68 | 3 | 711 | 184 | 45 | 156 | 119 | 99 | N | W | 274 | 2 | 7 | 300.000 | N |
-Eddie Milner | 424 | 110 | 15 | 70 | 47 | 36 | 7 | 2130 | 544 | 38 | 335 | 174 | 258 | N | W | 292 | 6 | 3 | 490.000 | N |
-Eddie Murray | 495 | 151 | 17 | 61 | 84 | 78 | 10 | 5624 | 1679 | 275 | 884 | 1015 | 709 | A | E | 1045 | 88 | 13 | 2460.000 | A |
-Ernest Riles | 524 | 132 | 9 | 69 | 47 | 54 | 2 | 972 | 260 | 14 | 123 | 92 | 90 | A | E | 212 | 327 | 20 | NA | A |
-Ed Romero | 233 | 49 | 2 | 41 | 23 | 18 | 8 | 1350 | 336 | 7 | 166 | 122 | 106 | A | E | 102 | 132 | 10 | 375.000 | A |
-Ernie Whitt | 395 | 106 | 16 | 48 | 56 | 35 | 10 | 2303 | 571 | 86 | 266 | 323 | 248 | A | E | 709 | 41 | 7 | NA | A |
-Fred Lynn | 397 | 114 | 23 | 67 | 67 | 53 | 13 | 5589 | 1632 | 241 | 906 | 926 | 716 | A | E | 244 | 2 | 4 | NA | A |
-Floyd Rayford | 210 | 37 | 8 | 15 | 19 | 15 | 6 | 994 | 244 | 36 | 107 | 114 | 53 | A | E | 40 | 115 | 15 | NA | A |
-Franklin Stubbs | 420 | 95 | 23 | 55 | 58 | 37 | 3 | 646 | 139 | 31 | 77 | 77 | 61 | N | W | 206 | 10 | 7 | NA | N |
-Frank White | 566 | 154 | 22 | 76 | 84 | 43 | 14 | 6100 | 1583 | 131 | 743 | 693 | 300 | A | W | 316 | 439 | 10 | 750.000 | A |
-George Bell | 641 | 198 | 31 | 101 | 108 | 41 | 5 | 2129 | 610 | 92 | 297 | 319 | 117 | A | E | 269 | 17 | 10 | 1175.000 | A |
-Glenn Braggs | 215 | 51 | 4 | 19 | 18 | 11 | 1 | 215 | 51 | 4 | 19 | 18 | 11 | A | E | 116 | 5 | 12 | 70.000 | A |
-George Brett | 441 | 128 | 16 | 70 | 73 | 80 | 14 | 6675 | 2095 | 209 | 1072 | 1050 | 695 | A | W | 97 | 218 | 16 | 1500.000 | A |
-Greg Brock | 325 | 76 | 16 | 33 | 52 | 37 | 5 | 1506 | 351 | 71 | 195 | 219 | 214 | N | W | 726 | 87 | 3 | 385.000 | A |
-Gary Carter | 490 | 125 | 24 | 81 | 105 | 62 | 13 | 6063 | 1646 | 271 | 847 | 999 | 680 | N | E | 869 | 62 | 8 | 1925.571 | N |
-Glenn Davis | 574 | 152 | 31 | 91 | 101 | 64 | 3 | 985 | 260 | 53 | 148 | 173 | 95 | N | W | 1253 | 111 | 11 | 215.000 | N |
-George Foster | 284 | 64 | 14 | 30 | 42 | 24 | 18 | 7023 | 1925 | 348 | 986 | 1239 | 666 | N | E | 96 | 4 | 4 | NA | N |
-Gary Gaetti | 596 | 171 | 34 | 91 | 108 | 52 | 6 | 2862 | 728 | 107 | 361 | 401 | 224 | A | W | 118 | 334 | 21 | 900.000 | A |
-Greg Gagne | 472 | 118 | 12 | 63 | 54 | 30 | 4 | 793 | 187 | 14 | 102 | 80 | 50 | A | W | 228 | 377 | 26 | 155.000 | A |
-George Hendrick | 283 | 77 | 14 | 45 | 47 | 26 | 16 | 6840 | 1910 | 259 | 915 | 1067 | 546 | A | W | 144 | 6 | 5 | 700.000 | A |
-Glenn Hubbard | 408 | 94 | 4 | 42 | 36 | 66 | 9 | 3573 | 866 | 59 | 429 | 365 | 410 | N | W | 282 | 487 | 19 | 535.000 | N |
-Garth Iorg | 327 | 85 | 3 | 30 | 44 | 20 | 8 | 2140 | 568 | 16 | 216 | 208 | 93 | A | E | 91 | 185 | 12 | 362.500 | A |
-Gary Matthews | 370 | 96 | 21 | 49 | 46 | 60 | 15 | 6986 | 1972 | 231 | 1070 | 955 | 921 | N | E | 137 | 5 | 9 | 733.333 | N |
-Graig Nettles | 354 | 77 | 16 | 36 | 55 | 41 | 20 | 8716 | 2172 | 384 | 1172 | 1267 | 1057 | N | W | 83 | 174 | 16 | 200.000 | N |
-Gary Pettis | 539 | 139 | 5 | 93 | 58 | 69 | 5 | 1469 | 369 | 12 | 247 | 126 | 198 | A | W | 462 | 9 | 7 | 400.000 | A |
-Gary Redus | 340 | 84 | 11 | 62 | 33 | 47 | 5 | 1516 | 376 | 42 | 284 | 141 | 219 | N | E | 185 | 8 | 4 | 400.000 | A |
-Garry Templeton | 510 | 126 | 2 | 42 | 44 | 35 | 11 | 5562 | 1578 | 44 | 703 | 519 | 256 | N | W | 207 | 358 | 20 | 737.500 | N |
-Gorman Thomas | 315 | 59 | 16 | 45 | 36 | 58 | 13 | 4677 | 1051 | 268 | 681 | 782 | 697 | A | W | 0 | 0 | 0 | NA | A |
-Greg Walker | 282 | 78 | 13 | 37 | 51 | 29 | 5 | 1649 | 453 | 73 | 211 | 280 | 138 | A | W | 670 | 57 | 5 | 500.000 | A |
-Gary Ward | 380 | 120 | 5 | 54 | 51 | 31 | 8 | 3118 | 900 | 92 | 444 | 419 | 240 | A | W | 237 | 8 | 1 | 600.000 | A |
-Glenn Wilson | 584 | 158 | 15 | 70 | 84 | 42 | 5 | 2358 | 636 | 58 | 265 | 316 | 134 | N | E | 331 | 20 | 4 | 662.500 | N |
-Harold Baines | 570 | 169 | 21 | 72 | 88 | 38 | 7 | 3754 | 1077 | 140 | 492 | 589 | 263 | A | W | 295 | 15 | 5 | 950.000 | A |
-Hubie Brooks | 306 | 104 | 14 | 50 | 58 | 25 | 7 | 2954 | 822 | 55 | 313 | 377 | 187 | N | E | 116 | 222 | 15 | 750.000 | N |
-Howard Johnson | 220 | 54 | 10 | 30 | 39 | 31 | 5 | 1185 | 299 | 40 | 145 | 154 | 128 | N | E | 50 | 136 | 20 | 297.500 | N |
-Hal McRae | 278 | 70 | 7 | 22 | 37 | 18 | 18 | 7186 | 2081 | 190 | 935 | 1088 | 643 | A | W | 0 | 0 | 0 | 325.000 | A |
-Harold Reynolds | 445 | 99 | 1 | 46 | 24 | 29 | 4 | 618 | 129 | 1 | 72 | 31 | 48 | A | W | 278 | 415 | 16 | 87.500 | A |
-Harry Spilman | 143 | 39 | 5 | 18 | 30 | 15 | 9 | 639 | 151 | 16 | 80 | 97 | 61 | N | W | 138 | 15 | 1 | 175.000 | N |
-Herm Winningham | 185 | 40 | 4 | 23 | 11 | 18 | 3 | 524 | 125 | 7 | 58 | 37 | 47 | N | E | 97 | 2 | 2 | 90.000 | N |
-Jesse Barfield | 589 | 170 | 40 | 107 | 108 | 69 | 6 | 2325 | 634 | 128 | 371 | 376 | 238 | A | E | 368 | 20 | 3 | 1237.500 | A |
-Juan Beniquez | 343 | 103 | 6 | 48 | 36 | 40 | 15 | 4338 | 1193 | 70 | 581 | 421 | 325 | A | E | 211 | 56 | 13 | 430.000 | A |
-Juan Bonilla | 284 | 69 | 1 | 33 | 18 | 25 | 5 | 1407 | 361 | 6 | 139 | 98 | 111 | A | E | 122 | 140 | 5 | NA | N |
-John Cangelosi | 438 | 103 | 2 | 65 | 32 | 71 | 2 | 440 | 103 | 2 | 67 | 32 | 71 | A | W | 276 | 7 | 9 | 100.000 | N |
-Jose Canseco | 600 | 144 | 33 | 85 | 117 | 65 | 2 | 696 | 173 | 38 | 101 | 130 | 69 | A | W | 319 | 4 | 14 | 165.000 | A |
-Joe Carter | 663 | 200 | 29 | 108 | 121 | 32 | 4 | 1447 | 404 | 57 | 210 | 222 | 68 | A | E | 241 | 8 | 6 | 250.000 | A |
-Jack Clark | 232 | 55 | 9 | 34 | 23 | 45 | 12 | 4405 | 1213 | 194 | 702 | 705 | 625 | N | E | 623 | 35 | 3 | 1300.000 | N |
-Jose Cruz | 479 | 133 | 10 | 48 | 72 | 55 | 17 | 7472 | 2147 | 153 | 980 | 1032 | 854 | N | W | 237 | 5 | 4 | 773.333 | N |
-Julio Cruz | 209 | 45 | 0 | 38 | 19 | 42 | 10 | 3859 | 916 | 23 | 557 | 279 | 478 | A | W | 132 | 205 | 5 | NA | A |
-Jody Davis | 528 | 132 | 21 | 61 | 74 | 41 | 6 | 2641 | 671 | 97 | 273 | 383 | 226 | N | E | 885 | 105 | 8 | 1008.333 | N |
-Jim Dwyer | 160 | 39 | 8 | 18 | 31 | 22 | 14 | 2128 | 543 | 56 | 304 | 268 | 298 | A | E | 33 | 3 | 0 | 275.000 | A |
-Julio Franco | 599 | 183 | 10 | 80 | 74 | 32 | 5 | 2482 | 715 | 27 | 330 | 326 | 158 | A | E | 231 | 374 | 18 | 775.000 | A |
-Jim Gantner | 497 | 136 | 7 | 58 | 38 | 26 | 11 | 3871 | 1066 | 40 | 450 | 367 | 241 | A | E | 304 | 347 | 10 | 850.000 | A |
-Johnny Grubb | 210 | 70 | 13 | 32 | 51 | 28 | 15 | 4040 | 1130 | 97 | 544 | 462 | 551 | A | E | 0 | 0 | 0 | 365.000 | A |
-Jerry Hairston | 225 | 61 | 5 | 32 | 26 | 26 | 11 | 1568 | 408 | 25 | 202 | 185 | 257 | A | W | 132 | 9 | 0 | NA | A |
-Jack Howell | 151 | 41 | 4 | 26 | 21 | 19 | 2 | 288 | 68 | 9 | 45 | 39 | 35 | A | W | 28 | 56 | 2 | 95.000 | A |
-John Kruk | 278 | 86 | 4 | 33 | 38 | 45 | 1 | 278 | 86 | 4 | 33 | 38 | 45 | N | W | 102 | 4 | 2 | 110.000 | N |
-Jeffrey Leonard | 341 | 95 | 6 | 48 | 42 | 20 | 10 | 2964 | 808 | 81 | 379 | 428 | 221 | N | W | 158 | 4 | 5 | 100.000 | N |
-Jim Morrison | 537 | 147 | 23 | 58 | 88 | 47 | 10 | 2744 | 730 | 97 | 302 | 351 | 174 | N | E | 92 | 257 | 20 | 277.500 | N |
-John Moses | 399 | 102 | 3 | 56 | 34 | 34 | 5 | 670 | 167 | 4 | 89 | 48 | 54 | A | W | 211 | 9 | 3 | 80.000 | A |
-Jerry Mumphrey | 309 | 94 | 5 | 37 | 32 | 26 | 13 | 4618 | 1330 | 57 | 616 | 522 | 436 | N | E | 161 | 3 | 3 | 600.000 | N |
-Joe Orsulak | 401 | 100 | 2 | 60 | 19 | 28 | 4 | 876 | 238 | 2 | 126 | 44 | 55 | N | E | 193 | 11 | 4 | NA | N |
-Jorge Orta | 336 | 93 | 9 | 35 | 46 | 23 | 15 | 5779 | 1610 | 128 | 730 | 741 | 497 | A | W | 0 | 0 | 0 | NA | A |
-Jim Presley | 616 | 163 | 27 | 83 | 107 | 32 | 3 | 1437 | 377 | 65 | 181 | 227 | 82 | A | W | 110 | 308 | 15 | 200.000 | A |
-Jamie Quirk | 219 | 47 | 8 | 24 | 26 | 17 | 12 | 1188 | 286 | 23 | 100 | 125 | 63 | A | W | 260 | 58 | 4 | NA | A |
-Johnny Ray | 579 | 174 | 7 | 67 | 78 | 58 | 6 | 3053 | 880 | 32 | 366 | 337 | 218 | N | E | 280 | 479 | 5 | 657.000 | N |
-Jeff Reed | 165 | 39 | 2 | 13 | 9 | 16 | 3 | 196 | 44 | 2 | 18 | 10 | 18 | A | W | 332 | 19 | 2 | 75.000 | N |
-Jim Rice | 618 | 200 | 20 | 98 | 110 | 62 | 13 | 7127 | 2163 | 351 | 1104 | 1289 | 564 | A | E | 330 | 16 | 8 | 2412.500 | A |
-Jerry Royster | 257 | 66 | 5 | 31 | 26 | 32 | 14 | 3910 | 979 | 33 | 518 | 324 | 382 | N | W | 87 | 166 | 14 | 250.000 | A |
-John Russell | 315 | 76 | 13 | 35 | 60 | 25 | 3 | 630 | 151 | 24 | 68 | 94 | 55 | N | E | 498 | 39 | 13 | 155.000 | N |
-Juan Samuel | 591 | 157 | 16 | 90 | 78 | 26 | 4 | 2020 | 541 | 52 | 310 | 226 | 91 | N | E | 290 | 440 | 25 | 640.000 | N |
-John Shelby | 404 | 92 | 11 | 54 | 49 | 18 | 6 | 1354 | 325 | 30 | 188 | 135 | 63 | A | E | 222 | 5 | 5 | 300.000 | A |
-Joel Skinner | 315 | 73 | 5 | 23 | 37 | 16 | 4 | 450 | 108 | 6 | 38 | 46 | 28 | A | W | 227 | 15 | 3 | 110.000 | A |
-Jeff Stone | 249 | 69 | 6 | 32 | 19 | 20 | 4 | 702 | 209 | 10 | 97 | 48 | 44 | N | E | 103 | 8 | 2 | NA | N |
-Jim Sundberg | 429 | 91 | 12 | 41 | 42 | 57 | 13 | 5590 | 1397 | 83 | 578 | 579 | 644 | A | W | 686 | 46 | 4 | 825.000 | N |
-Jim Traber | 212 | 54 | 13 | 28 | 44 | 18 | 2 | 233 | 59 | 13 | 31 | 46 | 20 | A | E | 243 | 23 | 5 | NA | A |
-Jose Uribe | 453 | 101 | 3 | 46 | 43 | 61 | 3 | 948 | 218 | 6 | 96 | 72 | 91 | N | W | 249 | 444 | 16 | 195.000 | N |
-Jerry Willard | 161 | 43 | 4 | 17 | 26 | 22 | 3 | 707 | 179 | 21 | 77 | 99 | 76 | A | W | 300 | 12 | 2 | NA | A |
-Joel Youngblood | 184 | 47 | 5 | 20 | 28 | 18 | 11 | 3327 | 890 | 74 | 419 | 382 | 304 | N | W | 49 | 2 | 0 | 450.000 | N |
-Kevin Bass | 591 | 184 | 20 | 83 | 79 | 38 | 5 | 1689 | 462 | 40 | 219 | 195 | 82 | N | W | 303 | 12 | 5 | 630.000 | N |
-Kal Daniels | 181 | 58 | 6 | 34 | 23 | 22 | 1 | 181 | 58 | 6 | 34 | 23 | 22 | N | W | 88 | 0 | 3 | 86.500 | N |
-Kirk Gibson | 441 | 118 | 28 | 84 | 86 | 68 | 8 | 2723 | 750 | 126 | 433 | 420 | 309 | A | E | 190 | 2 | 2 | 1300.000 | A |
-Ken Griffey | 490 | 150 | 21 | 69 | 58 | 35 | 14 | 6126 | 1839 | 121 | 983 | 707 | 600 | A | E | 96 | 5 | 3 | 1000.000 | N |
-Keith Hernandez | 551 | 171 | 13 | 94 | 83 | 94 | 13 | 6090 | 1840 | 128 | 969 | 900 | 917 | N | E | 1199 | 149 | 5 | 1800.000 | N |
-Kent Hrbek | 550 | 147 | 29 | 85 | 91 | 71 | 6 | 2816 | 815 | 117 | 405 | 474 | 319 | A | W | 1218 | 104 | 10 | 1310.000 | A |
-Ken Landreaux | 283 | 74 | 4 | 34 | 29 | 22 | 10 | 3919 | 1062 | 85 | 505 | 456 | 283 | N | W | 145 | 5 | 7 | 737.500 | N |
-Kevin McReynolds | 560 | 161 | 26 | 89 | 96 | 66 | 4 | 1789 | 470 | 65 | 233 | 260 | 155 | N | W | 332 | 9 | 8 | 625.000 | N |
-Kevin Mitchell | 328 | 91 | 12 | 51 | 43 | 33 | 2 | 342 | 94 | 12 | 51 | 44 | 33 | N | E | 145 | 59 | 8 | 125.000 | N |
-Keith Moreland | 586 | 159 | 12 | 72 | 79 | 53 | 9 | 3082 | 880 | 83 | 363 | 477 | 295 | N | E | 181 | 13 | 4 | 1043.333 | N |
-Ken Oberkfell | 503 | 136 | 5 | 62 | 48 | 83 | 10 | 3423 | 970 | 20 | 408 | 303 | 414 | N | W | 65 | 258 | 8 | 725.000 | N |
-Ken Phelps | 344 | 85 | 24 | 69 | 64 | 88 | 7 | 911 | 214 | 64 | 150 | 156 | 187 | A | W | 0 | 0 | 0 | 300.000 | A |
-Kirby Puckett | 680 | 223 | 31 | 119 | 96 | 34 | 3 | 1928 | 587 | 35 | 262 | 201 | 91 | A | W | 429 | 8 | 6 | 365.000 | A |
-Kurt Stillwell | 279 | 64 | 0 | 31 | 26 | 30 | 1 | 279 | 64 | 0 | 31 | 26 | 30 | N | W | 107 | 205 | 16 | 75.000 | N |
-Leon Durham | 484 | 127 | 20 | 66 | 65 | 67 | 7 | 3006 | 844 | 116 | 436 | 458 | 377 | N | E | 1231 | 80 | 7 | 1183.333 | N |
-Len Dykstra | 431 | 127 | 8 | 77 | 45 | 58 | 2 | 667 | 187 | 9 | 117 | 64 | 88 | N | E | 283 | 8 | 3 | 202.500 | N |
-Larry Herndon | 283 | 70 | 8 | 33 | 37 | 27 | 12 | 4479 | 1222 | 94 | 557 | 483 | 307 | A | E | 156 | 2 | 2 | 225.000 | A |
-Lee Lacy | 491 | 141 | 11 | 77 | 47 | 37 | 15 | 4291 | 1240 | 84 | 615 | 430 | 340 | A | E | 239 | 8 | 2 | 525.000 | A |
-Len Matuszek | 199 | 52 | 9 | 26 | 28 | 21 | 6 | 805 | 191 | 30 | 113 | 119 | 87 | N | W | 235 | 22 | 5 | 265.000 | N |
-Lloyd Moseby | 589 | 149 | 21 | 89 | 86 | 64 | 7 | 3558 | 928 | 102 | 513 | 471 | 351 | A | E | 371 | 6 | 6 | 787.500 | A |
-Lance Parrish | 327 | 84 | 22 | 53 | 62 | 38 | 10 | 4273 | 1123 | 212 | 577 | 700 | 334 | A | E | 483 | 48 | 6 | 800.000 | N |
-Larry Parrish | 464 | 128 | 28 | 67 | 94 | 52 | 13 | 5829 | 1552 | 210 | 740 | 840 | 452 | A | W | 0 | 0 | 0 | 587.500 | A |
-Luis Rivera | 166 | 34 | 0 | 20 | 13 | 17 | 1 | 166 | 34 | 0 | 20 | 13 | 17 | N | E | 64 | 119 | 9 | NA | N |
-Larry Sheets | 338 | 92 | 18 | 42 | 60 | 21 | 3 | 682 | 185 | 36 | 88 | 112 | 50 | A | E | 0 | 0 | 0 | 145.000 | A |
-Lonnie Smith | 508 | 146 | 8 | 80 | 44 | 46 | 9 | 3148 | 915 | 41 | 571 | 289 | 326 | A | W | 245 | 5 | 9 | NA | A |
-Lou Whitaker | 584 | 157 | 20 | 95 | 73 | 63 | 10 | 4704 | 1320 | 93 | 724 | 522 | 576 | A | E | 276 | 421 | 11 | 420.000 | A |
-Mike Aldrete | 216 | 54 | 2 | 27 | 25 | 33 | 1 | 216 | 54 | 2 | 27 | 25 | 33 | N | W | 317 | 36 | 1 | 75.000 | N |
-Marty Barrett | 625 | 179 | 4 | 94 | 60 | 65 | 5 | 1696 | 476 | 12 | 216 | 163 | 166 | A | E | 303 | 450 | 14 | 575.000 | A |
-Mike Brown | 243 | 53 | 4 | 18 | 26 | 27 | 4 | 853 | 228 | 23 | 101 | 110 | 76 | N | E | 107 | 3 | 3 | NA | N |
-Mike Davis | 489 | 131 | 19 | 77 | 55 | 34 | 7 | 2051 | 549 | 62 | 300 | 263 | 153 | A | W | 310 | 9 | 9 | 780.000 | A |
-Mike Diaz | 209 | 56 | 12 | 22 | 36 | 19 | 2 | 216 | 58 | 12 | 24 | 37 | 19 | N | E | 201 | 6 | 3 | 90.000 | N |
-Mariano Duncan | 407 | 93 | 8 | 47 | 30 | 30 | 2 | 969 | 230 | 14 | 121 | 69 | 68 | N | W | 172 | 317 | 25 | 150.000 | N |
-Mike Easler | 490 | 148 | 14 | 64 | 78 | 49 | 13 | 3400 | 1000 | 113 | 445 | 491 | 301 | A | E | 0 | 0 | 0 | 700.000 | N |
-Mike Fitzgerald | 209 | 59 | 6 | 20 | 37 | 27 | 4 | 884 | 209 | 14 | 66 | 106 | 92 | N | E | 415 | 35 | 3 | NA | N |
-Mel Hall | 442 | 131 | 18 | 68 | 77 | 33 | 6 | 1416 | 398 | 47 | 210 | 203 | 136 | A | E | 233 | 7 | 7 | 550.000 | A |
-Mickey Hatcher | 317 | 88 | 3 | 40 | 32 | 19 | 8 | 2543 | 715 | 28 | 269 | 270 | 118 | A | W | 220 | 16 | 4 | NA | A |
-Mike Heath | 288 | 65 | 8 | 30 | 36 | 27 | 9 | 2815 | 698 | 55 | 315 | 325 | 189 | N | E | 259 | 30 | 10 | 650.000 | A |
-Mike Kingery | 209 | 54 | 3 | 25 | 14 | 12 | 1 | 209 | 54 | 3 | 25 | 14 | 12 | A | W | 102 | 6 | 3 | 68.000 | A |
-Mike LaValliere | 303 | 71 | 3 | 18 | 30 | 36 | 3 | 344 | 76 | 3 | 20 | 36 | 45 | N | E | 468 | 47 | 6 | 100.000 | N |
-Mike Marshall | 330 | 77 | 19 | 47 | 53 | 27 | 6 | 1928 | 516 | 90 | 247 | 288 | 161 | N | W | 149 | 8 | 6 | 670.000 | N |
-Mike Pagliarulo | 504 | 120 | 28 | 71 | 71 | 54 | 3 | 1085 | 259 | 54 | 150 | 167 | 114 | A | E | 103 | 283 | 19 | 175.000 | A |
-Mark Salas | 258 | 60 | 8 | 28 | 33 | 18 | 3 | 638 | 170 | 17 | 80 | 75 | 36 | A | W | 358 | 32 | 8 | 137.000 | A |
-Mike Schmidt | 20 | 1 | 0 | 0 | 0 | 0 | 2 | 41 | 9 | 2 | 6 | 7 | 4 | N | E | 78 | 220 | 6 | 2127.333 | N |
-Mike Scioscia | 374 | 94 | 5 | 36 | 26 | 62 | 7 | 1968 | 519 | 26 | 181 | 199 | 288 | N | W | 756 | 64 | 15 | 875.000 | N |
-Mickey Tettleton | 211 | 43 | 10 | 26 | 35 | 39 | 3 | 498 | 116 | 14 | 59 | 55 | 78 | A | W | 463 | 32 | 8 | 120.000 | A |
-Milt Thompson | 299 | 75 | 6 | 38 | 23 | 26 | 3 | 580 | 160 | 8 | 71 | 33 | 44 | N | E | 212 | 1 | 2 | 140.000 | N |
-Mitch Webster | 576 | 167 | 8 | 89 | 49 | 57 | 4 | 822 | 232 | 19 | 132 | 83 | 79 | N | E | 325 | 12 | 8 | 210.000 | N |
-Mookie Wilson | 381 | 110 | 9 | 61 | 45 | 32 | 7 | 3015 | 834 | 40 | 451 | 249 | 168 | N | E | 228 | 7 | 5 | 800.000 | N |
-Marvell Wynne | 288 | 76 | 7 | 34 | 37 | 15 | 4 | 1644 | 408 | 16 | 198 | 120 | 113 | N | W | 203 | 3 | 3 | 240.000 | N |
-Mike Young | 369 | 93 | 9 | 43 | 42 | 49 | 5 | 1258 | 323 | 54 | 181 | 177 | 157 | A | E | 149 | 1 | 6 | 350.000 | A |
-Nick Esasky | 330 | 76 | 12 | 35 | 41 | 47 | 4 | 1367 | 326 | 55 | 167 | 198 | 167 | N | W | 512 | 30 | 5 | NA | N |
-Ozzie Guillen | 547 | 137 | 2 | 58 | 47 | 12 | 2 | 1038 | 271 | 3 | 129 | 80 | 24 | A | W | 261 | 459 | 22 | 175.000 | A |
-Oddibe McDowell | 572 | 152 | 18 | 105 | 49 | 65 | 2 | 978 | 249 | 36 | 168 | 91 | 101 | A | W | 325 | 13 | 3 | 200.000 | A |
-Omar Moreno | 359 | 84 | 4 | 46 | 27 | 21 | 12 | 4992 | 1257 | 37 | 699 | 386 | 387 | N | W | 151 | 8 | 5 | NA | N |
-Ozzie Smith | 514 | 144 | 0 | 67 | 54 | 79 | 9 | 4739 | 1169 | 13 | 583 | 374 | 528 | N | E | 229 | 453 | 15 | 1940.000 | N |
-Ozzie Virgil | 359 | 80 | 15 | 45 | 48 | 63 | 7 | 1493 | 359 | 61 | 176 | 202 | 175 | N | W | 682 | 93 | 13 | 700.000 | N |
-Phil Bradley | 526 | 163 | 12 | 88 | 50 | 77 | 4 | 1556 | 470 | 38 | 245 | 167 | 174 | A | W | 250 | 11 | 1 | 750.000 | A |
-Phil Garner | 313 | 83 | 9 | 43 | 41 | 30 | 14 | 5885 | 1543 | 104 | 751 | 714 | 535 | N | W | 58 | 141 | 23 | 450.000 | N |
-Pete Incaviglia | 540 | 135 | 30 | 82 | 88 | 55 | 1 | 540 | 135 | 30 | 82 | 88 | 55 | A | W | 157 | 6 | 14 | 172.000 | A |
-Paul Molitor | 437 | 123 | 9 | 62 | 55 | 40 | 9 | 4139 | 1203 | 79 | 676 | 390 | 364 | A | E | 82 | 170 | 15 | 1260.000 | A |
-Pete O’Brien | 551 | 160 | 23 | 86 | 90 | 87 | 5 | 2235 | 602 | 75 | 278 | 328 | 273 | A | W | 1224 | 115 | 11 | NA | A |
-Pete Rose | 237 | 52 | 0 | 15 | 25 | 30 | 24 | 14053 | 4256 | 160 | 2165 | 1314 | 1566 | N | W | 523 | 43 | 6 | 750.000 | N |
-Pat Sheridan | 236 | 56 | 6 | 41 | 19 | 21 | 5 | 1257 | 329 | 24 | 166 | 125 | 105 | A | E | 172 | 1 | 4 | 190.000 | A |
-Pat Tabler | 473 | 154 | 6 | 61 | 48 | 29 | 6 | 1966 | 566 | 29 | 250 | 252 | 178 | A | E | 846 | 84 | 9 | 580.000 | A |
-Rafael Belliard | 309 | 72 | 0 | 33 | 31 | 26 | 5 | 354 | 82 | 0 | 41 | 32 | 26 | N | E | 117 | 269 | 12 | 130.000 | N |
-Rick Burleson | 271 | 77 | 5 | 35 | 29 | 33 | 12 | 4933 | 1358 | 48 | 630 | 435 | 403 | A | W | 62 | 90 | 3 | 450.000 | A |
-Randy Bush | 357 | 96 | 7 | 50 | 45 | 39 | 5 | 1394 | 344 | 43 | 178 | 192 | 136 | A | W | 167 | 2 | 4 | 300.000 | A |
-Rick Cerone | 216 | 56 | 4 | 22 | 18 | 15 | 12 | 2796 | 665 | 43 | 266 | 304 | 198 | A | E | 391 | 44 | 4 | 250.000 | A |
-Ron Cey | 256 | 70 | 13 | 42 | 36 | 44 | 16 | 7058 | 1845 | 312 | 965 | 1128 | 990 | N | E | 41 | 118 | 8 | 1050.000 | A |
-Rob Deer | 466 | 108 | 33 | 75 | 86 | 72 | 3 | 652 | 142 | 44 | 102 | 109 | 102 | A | E | 286 | 8 | 8 | 215.000 | A |
-Rick Dempsey | 327 | 68 | 13 | 42 | 29 | 45 | 18 | 3949 | 939 | 78 | 438 | 380 | 466 | A | E | 659 | 53 | 7 | 400.000 | A |
-Rich Gedman | 462 | 119 | 16 | 49 | 65 | 37 | 7 | 2131 | 583 | 69 | 244 | 288 | 150 | A | E | 866 | 65 | 6 | NA | A |
-Ron Hassey | 341 | 110 | 9 | 45 | 49 | 46 | 9 | 2331 | 658 | 50 | 249 | 322 | 274 | A | E | 251 | 9 | 4 | 560.000 | A |
-Rickey Henderson | 608 | 160 | 28 | 130 | 74 | 89 | 8 | 4071 | 1182 | 103 | 862 | 417 | 708 | A | E | 426 | 4 | 6 | 1670.000 | A |
-Reggie Jackson | 419 | 101 | 18 | 65 | 58 | 92 | 20 | 9528 | 2510 | 548 | 1509 | 1659 | 1342 | A | W | 0 | 0 | 0 | 487.500 | A |
-Ricky Jones | 33 | 6 | 0 | 2 | 4 | 7 | 1 | 33 | 6 | 0 | 2 | 4 | 7 | A | W | 205 | 5 | 4 | NA | A |
-Ron Kittle | 376 | 82 | 21 | 42 | 60 | 35 | 5 | 1770 | 408 | 115 | 238 | 299 | 157 | A | W | 0 | 0 | 0 | 425.000 | A |
-Ray Knight | 486 | 145 | 11 | 51 | 76 | 40 | 11 | 3967 | 1102 | 67 | 410 | 497 | 284 | N | E | 88 | 204 | 16 | 500.000 | A |
-Randy Kutcher | 186 | 44 | 7 | 28 | 16 | 11 | 1 | 186 | 44 | 7 | 28 | 16 | 11 | N | W | 99 | 3 | 1 | NA | N |
-Rudy Law | 307 | 80 | 1 | 42 | 36 | 29 | 7 | 2421 | 656 | 18 | 379 | 198 | 184 | A | W | 145 | 2 | 2 | NA | A |
-Rick Leach | 246 | 76 | 5 | 35 | 39 | 13 | 6 | 912 | 234 | 12 | 102 | 96 | 80 | A | E | 44 | 0 | 1 | 250.000 | A |
-Rick Manning | 205 | 52 | 8 | 31 | 27 | 17 | 12 | 5134 | 1323 | 56 | 643 | 445 | 459 | A | E | 155 | 3 | 2 | 400.000 | A |
-Rance Mulliniks | 348 | 90 | 11 | 50 | 45 | 43 | 10 | 2288 | 614 | 43 | 295 | 273 | 269 | A | E | 60 | 176 | 6 | 450.000 | A |
-Ron Oester | 523 | 135 | 8 | 52 | 44 | 52 | 9 | 3368 | 895 | 39 | 377 | 284 | 296 | N | W | 367 | 475 | 19 | 750.000 | N |
-Rey Quinones | 312 | 68 | 2 | 32 | 22 | 24 | 1 | 312 | 68 | 2 | 32 | 22 | 24 | A | E | 86 | 150 | 15 | 70.000 | A |
-Rafael Ramirez | 496 | 119 | 8 | 57 | 33 | 21 | 7 | 3358 | 882 | 36 | 365 | 280 | 165 | N | W | 155 | 371 | 29 | 875.000 | N |
-Ronn Reynolds | 126 | 27 | 3 | 8 | 10 | 5 | 4 | 239 | 49 | 3 | 16 | 13 | 14 | N | E | 190 | 2 | 9 | 190.000 | N |
-Ron Roenicke | 275 | 68 | 5 | 42 | 42 | 61 | 6 | 961 | 238 | 16 | 128 | 104 | 172 | N | E | 181 | 3 | 2 | 191.000 | N |
-Ryne Sandberg | 627 | 178 | 14 | 68 | 76 | 46 | 6 | 3146 | 902 | 74 | 494 | 345 | 242 | N | E | 309 | 492 | 5 | 740.000 | N |
-Rafael Santana | 394 | 86 | 1 | 38 | 28 | 36 | 4 | 1089 | 267 | 3 | 94 | 71 | 76 | N | E | 203 | 369 | 16 | 250.000 | N |
-Rick Schu | 208 | 57 | 8 | 32 | 25 | 18 | 3 | 653 | 170 | 17 | 98 | 54 | 62 | N | E | 42 | 94 | 13 | 140.000 | N |
-Ruben Sierra | 382 | 101 | 16 | 50 | 55 | 22 | 1 | 382 | 101 | 16 | 50 | 55 | 22 | A | W | 200 | 7 | 6 | 97.500 | A |
-Roy Smalley | 459 | 113 | 20 | 59 | 57 | 68 | 12 | 5348 | 1369 | 155 | 713 | 660 | 735 | A | W | 0 | 0 | 0 | 740.000 | A |
-Robby Thompson | 549 | 149 | 7 | 73 | 47 | 42 | 1 | 549 | 149 | 7 | 73 | 47 | 42 | N | W | 255 | 450 | 17 | 140.000 | N |
-Rob Wilfong | 288 | 63 | 3 | 25 | 33 | 16 | 10 | 2682 | 667 | 38 | 315 | 259 | 204 | A | W | 135 | 257 | 7 | 341.667 | A |
-Reggie Williams | 303 | 84 | 4 | 35 | 32 | 23 | 2 | 312 | 87 | 4 | 39 | 32 | 23 | N | W | 179 | 5 | 3 | NA | N |
-Robin Yount | 522 | 163 | 9 | 82 | 46 | 62 | 13 | 7037 | 2019 | 153 | 1043 | 827 | 535 | A | E | 352 | 9 | 1 | 1000.000 | A |
-Steve Balboni | 512 | 117 | 29 | 54 | 88 | 43 | 6 | 1750 | 412 | 100 | 204 | 276 | 155 | A | W | 1236 | 98 | 18 | 100.000 | A |
-Scott Bradley | 220 | 66 | 5 | 20 | 28 | 13 | 3 | 290 | 80 | 5 | 27 | 31 | 15 | A | W | 281 | 21 | 3 | 90.000 | A |
-Sid Bream | 522 | 140 | 16 | 73 | 77 | 60 | 4 | 730 | 185 | 22 | 93 | 106 | 86 | N | E | 1320 | 166 | 17 | 200.000 | N |
-Steve Buechele | 461 | 112 | 18 | 54 | 54 | 35 | 2 | 680 | 160 | 24 | 76 | 75 | 49 | A | W | 111 | 226 | 11 | 135.000 | A |
-Shawon Dunston | 581 | 145 | 17 | 66 | 68 | 21 | 2 | 831 | 210 | 21 | 106 | 86 | 40 | N | E | 320 | 465 | 32 | 155.000 | N |
-Scott Fletcher | 530 | 159 | 3 | 82 | 50 | 47 | 6 | 1619 | 426 | 11 | 218 | 149 | 163 | A | W | 196 | 354 | 15 | 475.000 | A |
-Steve Garvey | 557 | 142 | 21 | 58 | 81 | 23 | 18 | 8759 | 2583 | 271 | 1138 | 1299 | 478 | N | W | 1160 | 53 | 7 | 1450.000 | N |
-Steve Jeltz | 439 | 96 | 0 | 44 | 36 | 65 | 4 | 711 | 148 | 1 | 68 | 56 | 99 | N | E | 229 | 406 | 22 | 150.000 | N |
-Steve Lombardozzi | 453 | 103 | 8 | 53 | 33 | 52 | 2 | 507 | 123 | 8 | 63 | 39 | 58 | A | W | 289 | 407 | 6 | 105.000 | A |
-Spike Owen | 528 | 122 | 1 | 67 | 45 | 51 | 4 | 1716 | 403 | 12 | 211 | 146 | 155 | A | W | 209 | 372 | 17 | 350.000 | A |
-Steve Sax | 633 | 210 | 6 | 91 | 56 | 59 | 6 | 3070 | 872 | 19 | 420 | 230 | 274 | N | W | 367 | 432 | 16 | 90.000 | N |
-Tony Armas | 16 | 2 | 0 | 1 | 0 | 0 | 2 | 28 | 4 | 0 | 1 | 0 | 0 | A | E | 247 | 4 | 8 | NA | A |
-Tony Bernazard | 562 | 169 | 17 | 88 | 73 | 53 | 8 | 3181 | 841 | 61 | 450 | 342 | 373 | A | E | 351 | 442 | 17 | 530.000 | A |
-Tom Brookens | 281 | 76 | 3 | 42 | 25 | 20 | 8 | 2658 | 657 | 48 | 324 | 300 | 179 | A | E | 106 | 144 | 7 | 341.667 | A |
-Tom Brunansky | 593 | 152 | 23 | 69 | 75 | 53 | 6 | 2765 | 686 | 133 | 369 | 384 | 321 | A | W | 315 | 10 | 6 | 940.000 | A |
-Tony Fernandez | 687 | 213 | 10 | 91 | 65 | 27 | 4 | 1518 | 448 | 15 | 196 | 137 | 89 | A | E | 294 | 445 | 13 | 350.000 | A |
-Tim Flannery | 368 | 103 | 3 | 48 | 28 | 54 | 8 | 1897 | 493 | 9 | 207 | 162 | 198 | N | W | 209 | 246 | 3 | 326.667 | N |
-Tom Foley | 263 | 70 | 1 | 26 | 23 | 30 | 4 | 888 | 220 | 9 | 83 | 82 | 86 | N | E | 81 | 147 | 4 | 250.000 | N |
-Tony Gwynn | 642 | 211 | 14 | 107 | 59 | 52 | 5 | 2364 | 770 | 27 | 352 | 230 | 193 | N | W | 337 | 19 | 4 | 740.000 | N |
-Terry Harper | 265 | 68 | 8 | 26 | 30 | 29 | 7 | 1337 | 339 | 32 | 135 | 163 | 128 | N | W | 92 | 5 | 3 | 425.000 | A |
-Toby Harrah | 289 | 63 | 7 | 36 | 41 | 44 | 17 | 7402 | 1954 | 195 | 1115 | 919 | 1153 | A | W | 166 | 211 | 7 | NA | A |
-Tommy Herr | 559 | 141 | 2 | 48 | 61 | 73 | 8 | 3162 | 874 | 16 | 421 | 349 | 359 | N | E | 352 | 414 | 9 | 925.000 | N |
-Tim Hulett | 520 | 120 | 17 | 53 | 44 | 21 | 4 | 927 | 227 | 22 | 106 | 80 | 52 | A | W | 70 | 144 | 11 | 185.000 | A |
-Terry Kennedy | 19 | 4 | 1 | 2 | 3 | 1 | 1 | 19 | 4 | 1 | 2 | 3 | 1 | N | W | 692 | 70 | 8 | 920.000 | A |
-Tito Landrum | 205 | 43 | 2 | 24 | 17 | 20 | 7 | 854 | 219 | 12 | 105 | 99 | 71 | N | E | 131 | 6 | 1 | 286.667 | N |
-Tim Laudner | 193 | 47 | 10 | 21 | 29 | 24 | 6 | 1136 | 256 | 42 | 129 | 139 | 106 | A | W | 299 | 13 | 5 | 245.000 | A |
-Tom O’Malley | 181 | 46 | 1 | 19 | 18 | 17 | 5 | 937 | 238 | 9 | 88 | 95 | 104 | A | E | 37 | 98 | 9 | NA | A |
-Tom Paciorek | 213 | 61 | 4 | 17 | 22 | 3 | 17 | 4061 | 1145 | 83 | 488 | 491 | 244 | A | W | 178 | 45 | 4 | 235.000 | A |
-Tony Pena | 510 | 147 | 10 | 56 | 52 | 53 | 7 | 2872 | 821 | 63 | 307 | 340 | 174 | N | E | 810 | 99 | 18 | 1150.000 | N |
-Terry Pendleton | 578 | 138 | 1 | 56 | 59 | 34 | 3 | 1399 | 357 | 7 | 149 | 161 | 87 | N | E | 133 | 371 | 20 | 160.000 | N |
-Tony Perez | 200 | 51 | 2 | 14 | 29 | 25 | 23 | 9778 | 2732 | 379 | 1272 | 1652 | 925 | N | W | 398 | 29 | 7 | NA | N |
-Tony Phillips | 441 | 113 | 5 | 76 | 52 | 76 | 5 | 1546 | 397 | 17 | 226 | 149 | 191 | A | W | 160 | 290 | 11 | 425.000 | A |
-Terry Puhl | 172 | 42 | 3 | 17 | 14 | 15 | 10 | 4086 | 1150 | 57 | 579 | 363 | 406 | N | W | 65 | 0 | 0 | 900.000 | N |
-Tim Raines | 580 | 194 | 9 | 91 | 62 | 78 | 8 | 3372 | 1028 | 48 | 604 | 314 | 469 | N | E | 270 | 13 | 6 | NA | N |
-Ted Simmons | 127 | 32 | 4 | 14 | 25 | 12 | 19 | 8396 | 2402 | 242 | 1048 | 1348 | 819 | N | W | 167 | 18 | 6 | 500.000 | N |
-Tim Teufel | 279 | 69 | 4 | 35 | 31 | 32 | 4 | 1359 | 355 | 31 | 180 | 148 | 158 | N | E | 133 | 173 | 9 | 277.500 | N |
-Tim Wallach | 480 | 112 | 18 | 50 | 71 | 44 | 7 | 3031 | 771 | 110 | 338 | 406 | 239 | N | E | 94 | 270 | 16 | 750.000 | N |
-Vince Coleman | 600 | 139 | 0 | 94 | 29 | 60 | 2 | 1236 | 309 | 1 | 201 | 69 | 110 | N | E | 300 | 12 | 9 | 160.000 | N |
-Von Hayes | 610 | 186 | 19 | 107 | 98 | 74 | 6 | 2728 | 753 | 69 | 399 | 366 | 286 | N | E | 1182 | 96 | 13 | 1300.000 | N |
-Vance Law | 360 | 81 | 5 | 37 | 44 | 37 | 7 | 2268 | 566 | 41 | 279 | 257 | 246 | N | E | 170 | 284 | 3 | 525.000 | N |
-Wally Backman | 387 | 124 | 1 | 67 | 27 | 36 | 7 | 1775 | 506 | 6 | 272 | 125 | 194 | N | E | 186 | 290 | 17 | 550.000 | N |
-Wade Boggs | 580 | 207 | 8 | 107 | 71 | 105 | 5 | 2778 | 978 | 32 | 474 | 322 | 417 | A | E | 121 | 267 | 19 | 1600.000 | A |
-Will Clark | 408 | 117 | 11 | 66 | 41 | 34 | 1 | 408 | 117 | 11 | 66 | 41 | 34 | N | W | 942 | 72 | 11 | 120.000 | N |
-Wally Joyner | 593 | 172 | 22 | 82 | 100 | 57 | 1 | 593 | 172 | 22 | 82 | 100 | 57 | A | W | 1222 | 139 | 15 | 165.000 | A |
-Wayne Krenchicki | 221 | 53 | 2 | 21 | 23 | 22 | 8 | 1063 | 283 | 15 | 107 | 124 | 106 | N | E | 325 | 58 | 6 | NA | N |
-Willie McGee | 497 | 127 | 7 | 65 | 48 | 37 | 5 | 2703 | 806 | 32 | 379 | 311 | 138 | N | E | 325 | 9 | 3 | 700.000 | N |
-Willie Randolph | 492 | 136 | 5 | 76 | 50 | 94 | 12 | 5511 | 1511 | 39 | 897 | 451 | 875 | A | E | 313 | 381 | 20 | 875.000 | A |
-Wayne Tolleson | 475 | 126 | 3 | 61 | 43 | 52 | 6 | 1700 | 433 | 7 | 217 | 93 | 146 | A | W | 37 | 113 | 7 | 385.000 | A |
-Willie Upshaw | 573 | 144 | 9 | 85 | 60 | 78 | 8 | 3198 | 857 | 97 | 470 | 420 | 332 | A | E | 1314 | 131 | 12 | 960.000 | A |
-Willie Wilson | 631 | 170 | 9 | 77 | 44 | 31 | 11 | 4908 | 1457 | 30 | 775 | 357 | 249 | A | W | 408 | 4 | 3 | 1000.000 | A |
41.8.2 Heart数据
Age | Sex | ChestPain | RestBP | Chol | Fbs | RestECG | MaxHR | ExAng | Oldpeak | Slope | Ca | Thal | AHD |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
63 | 1 | typical | 145 | 233 | 1 | 2 | 150 | 0 | 2.3 | 3 | 0 | fixed | No |
67 | 1 | asymptomatic | 160 | 286 | 0 | 2 | 108 | 1 | 1.5 | 2 | 3 | normal | Yes |
67 | 1 | asymptomatic | 120 | 229 | 0 | 2 | 129 | 1 | 2.6 | 2 | 2 | reversable | Yes |
37 | 1 | nonanginal | 130 | 250 | 0 | 0 | 187 | 0 | 3.5 | 3 | 0 | normal | No |
41 | 0 | nontypical | 130 | 204 | 0 | 2 | 172 | 0 | 1.4 | 1 | 0 | normal | No |
56 | 1 | nontypical | 120 | 236 | 0 | 0 | 178 | 0 | 0.8 | 1 | 0 | normal | No |
62 | 0 | asymptomatic | 140 | 268 | 0 | 2 | 160 | 0 | 3.6 | 3 | 2 | normal | Yes |
57 | 0 | asymptomatic | 120 | 354 | 0 | 0 | 163 | 1 | 0.6 | 1 | 0 | normal | No |
63 | 1 | asymptomatic | 130 | 254 | 0 | 2 | 147 | 0 | 1.4 | 2 | 1 | reversable | Yes |
53 | 1 | asymptomatic | 140 | 203 | 1 | 2 | 155 | 1 | 3.1 | 3 | 0 | reversable | Yes |
57 | 1 | asymptomatic | 140 | 192 | 0 | 0 | 148 | 0 | 0.4 | 2 | 0 | fixed | No |
56 | 0 | nontypical | 140 | 294 | 0 | 2 | 153 | 0 | 1.3 | 2 | 0 | normal | No |
56 | 1 | nonanginal | 130 | 256 | 1 | 2 | 142 | 1 | 0.6 | 2 | 1 | fixed | Yes |
44 | 1 | nontypical | 120 | 263 | 0 | 0 | 173 | 0 | 0.0 | 1 | 0 | reversable | No |
52 | 1 | nonanginal | 172 | 199 | 1 | 0 | 162 | 0 | 0.5 | 1 | 0 | reversable | No |
57 | 1 | nonanginal | 150 | 168 | 0 | 0 | 174 | 0 | 1.6 | 1 | 0 | normal | No |
48 | 1 | nontypical | 110 | 229 | 0 | 0 | 168 | 0 | 1.0 | 3 | 0 | reversable | Yes |
54 | 1 | asymptomatic | 140 | 239 | 0 | 0 | 160 | 0 | 1.2 | 1 | 0 | normal | No |
48 | 0 | nonanginal | 130 | 275 | 0 | 0 | 139 | 0 | 0.2 | 1 | 0 | normal | No |
49 | 1 | nontypical | 130 | 266 | 0 | 0 | 171 | 0 | 0.6 | 1 | 0 | normal | No |
64 | 1 | typical | 110 | 211 | 0 | 2 | 144 | 1 | 1.8 | 2 | 0 | normal | No |
58 | 0 | typical | 150 | 283 | 1 | 2 | 162 | 0 | 1.0 | 1 | 0 | normal | No |
58 | 1 | nontypical | 120 | 284 | 0 | 2 | 160 | 0 | 1.8 | 2 | 0 | normal | Yes |
58 | 1 | nonanginal | 132 | 224 | 0 | 2 | 173 | 0 | 3.2 | 1 | 2 | reversable | Yes |
60 | 1 | asymptomatic | 130 | 206 | 0 | 2 | 132 | 1 | 2.4 | 2 | 2 | reversable | Yes |
50 | 0 | nonanginal | 120 | 219 | 0 | 0 | 158 | 0 | 1.6 | 2 | 0 | normal | No |
58 | 0 | nonanginal | 120 | 340 | 0 | 0 | 172 | 0 | 0.0 | 1 | 0 | normal | No |
66 | 0 | typical | 150 | 226 | 0 | 0 | 114 | 0 | 2.6 | 3 | 0 | normal | No |
43 | 1 | asymptomatic | 150 | 247 | 0 | 0 | 171 | 0 | 1.5 | 1 | 0 | normal | No |
40 | 1 | asymptomatic | 110 | 167 | 0 | 2 | 114 | 1 | 2.0 | 2 | 0 | reversable | Yes |
69 | 0 | typical | 140 | 239 | 0 | 0 | 151 | 0 | 1.8 | 1 | 2 | normal | No |
60 | 1 | asymptomatic | 117 | 230 | 1 | 0 | 160 | 1 | 1.4 | 1 | 2 | reversable | Yes |
64 | 1 | nonanginal | 140 | 335 | 0 | 0 | 158 | 0 | 0.0 | 1 | 0 | normal | Yes |
59 | 1 | asymptomatic | 135 | 234 | 0 | 0 | 161 | 0 | 0.5 | 2 | 0 | reversable | No |
44 | 1 | nonanginal | 130 | 233 | 0 | 0 | 179 | 1 | 0.4 | 1 | 0 | normal | No |
42 | 1 | asymptomatic | 140 | 226 | 0 | 0 | 178 | 0 | 0.0 | 1 | 0 | normal | No |
43 | 1 | asymptomatic | 120 | 177 | 0 | 2 | 120 | 1 | 2.5 | 2 | 0 | reversable | Yes |
57 | 1 | asymptomatic | 150 | 276 | 0 | 2 | 112 | 1 | 0.6 | 2 | 1 | fixed | Yes |
55 | 1 | asymptomatic | 132 | 353 | 0 | 0 | 132 | 1 | 1.2 | 2 | 1 | reversable | Yes |
61 | 1 | nonanginal | 150 | 243 | 1 | 0 | 137 | 1 | 1.0 | 2 | 0 | normal | No |
65 | 0 | asymptomatic | 150 | 225 | 0 | 2 | 114 | 0 | 1.0 | 2 | 3 | reversable | Yes |
40 | 1 | typical | 140 | 199 | 0 | 0 | 178 | 1 | 1.4 | 1 | 0 | reversable | No |
71 | 0 | nontypical | 160 | 302 | 0 | 0 | 162 | 0 | 0.4 | 1 | 2 | normal | No |
59 | 1 | nonanginal | 150 | 212 | 1 | 0 | 157 | 0 | 1.6 | 1 | 0 | normal | No |
61 | 0 | asymptomatic | 130 | 330 | 0 | 2 | 169 | 0 | 0.0 | 1 | 0 | normal | Yes |
58 | 1 | nonanginal | 112 | 230 | 0 | 2 | 165 | 0 | 2.5 | 2 | 1 | reversable | Yes |
51 | 1 | nonanginal | 110 | 175 | 0 | 0 | 123 | 0 | 0.6 | 1 | 0 | normal | No |
50 | 1 | asymptomatic | 150 | 243 | 0 | 2 | 128 | 0 | 2.6 | 2 | 0 | reversable | Yes |
65 | 0 | nonanginal | 140 | 417 | 1 | 2 | 157 | 0 | 0.8 | 1 | 1 | normal | No |
53 | 1 | nonanginal | 130 | 197 | 1 | 2 | 152 | 0 | 1.2 | 3 | 0 | normal | No |
41 | 0 | nontypical | 105 | 198 | 0 | 0 | 168 | 0 | 0.0 | 1 | 1 | normal | No |
65 | 1 | asymptomatic | 120 | 177 | 0 | 0 | 140 | 0 | 0.4 | 1 | 0 | reversable | No |
44 | 1 | asymptomatic | 112 | 290 | 0 | 2 | 153 | 0 | 0.0 | 1 | 1 | normal | Yes |
44 | 1 | nontypical | 130 | 219 | 0 | 2 | 188 | 0 | 0.0 | 1 | 0 | normal | No |
60 | 1 | asymptomatic | 130 | 253 | 0 | 0 | 144 | 1 | 1.4 | 1 | 1 | reversable | Yes |
54 | 1 | asymptomatic | 124 | 266 | 0 | 2 | 109 | 1 | 2.2 | 2 | 1 | reversable | Yes |
50 | 1 | nonanginal | 140 | 233 | 0 | 0 | 163 | 0 | 0.6 | 2 | 1 | reversable | Yes |
41 | 1 | asymptomatic | 110 | 172 | 0 | 2 | 158 | 0 | 0.0 | 1 | 0 | reversable | Yes |
54 | 1 | nonanginal | 125 | 273 | 0 | 2 | 152 | 0 | 0.5 | 3 | 1 | normal | No |
51 | 1 | typical | 125 | 213 | 0 | 2 | 125 | 1 | 1.4 | 1 | 1 | normal | No |
51 | 0 | asymptomatic | 130 | 305 | 0 | 0 | 142 | 1 | 1.2 | 2 | 0 | reversable | Yes |
46 | 0 | nonanginal | 142 | 177 | 0 | 2 | 160 | 1 | 1.4 | 3 | 0 | normal | No |
58 | 1 | asymptomatic | 128 | 216 | 0 | 2 | 131 | 1 | 2.2 | 2 | 3 | reversable | Yes |
54 | 0 | nonanginal | 135 | 304 | 1 | 0 | 170 | 0 | 0.0 | 1 | 0 | normal | No |
54 | 1 | asymptomatic | 120 | 188 | 0 | 0 | 113 | 0 | 1.4 | 2 | 1 | reversable | Yes |
60 | 1 | asymptomatic | 145 | 282 | 0 | 2 | 142 | 1 | 2.8 | 2 | 2 | reversable | Yes |
60 | 1 | nonanginal | 140 | 185 | 0 | 2 | 155 | 0 | 3.0 | 2 | 0 | normal | Yes |
54 | 1 | nonanginal | 150 | 232 | 0 | 2 | 165 | 0 | 1.6 | 1 | 0 | reversable | No |
59 | 1 | asymptomatic | 170 | 326 | 0 | 2 | 140 | 1 | 3.4 | 3 | 0 | reversable | Yes |
46 | 1 | nonanginal | 150 | 231 | 0 | 0 | 147 | 0 | 3.6 | 2 | 0 | normal | Yes |
65 | 0 | nonanginal | 155 | 269 | 0 | 0 | 148 | 0 | 0.8 | 1 | 0 | normal | No |
67 | 1 | asymptomatic | 125 | 254 | 1 | 0 | 163 | 0 | 0.2 | 2 | 2 | reversable | Yes |
62 | 1 | asymptomatic | 120 | 267 | 0 | 0 | 99 | 1 | 1.8 | 2 | 2 | reversable | Yes |
65 | 1 | asymptomatic | 110 | 248 | 0 | 2 | 158 | 0 | 0.6 | 1 | 2 | fixed | Yes |
44 | 1 | asymptomatic | 110 | 197 | 0 | 2 | 177 | 0 | 0.0 | 1 | 1 | normal | Yes |
65 | 0 | nonanginal | 160 | 360 | 0 | 2 | 151 | 0 | 0.8 | 1 | 0 | normal | No |
60 | 1 | asymptomatic | 125 | 258 | 0 | 2 | 141 | 1 | 2.8 | 2 | 1 | reversable | Yes |
51 | 0 | nonanginal | 140 | 308 | 0 | 2 | 142 | 0 | 1.5 | 1 | 1 | normal | No |
48 | 1 | nontypical | 130 | 245 | 0 | 2 | 180 | 0 | 0.2 | 2 | 0 | normal | No |
58 | 1 | asymptomatic | 150 | 270 | 0 | 2 | 111 | 1 | 0.8 | 1 | 0 | reversable | Yes |
45 | 1 | asymptomatic | 104 | 208 | 0 | 2 | 148 | 1 | 3.0 | 2 | 0 | normal | No |
53 | 0 | asymptomatic | 130 | 264 | 0 | 2 | 143 | 0 | 0.4 | 2 | 0 | normal | No |
39 | 1 | nonanginal | 140 | 321 | 0 | 2 | 182 | 0 | 0.0 | 1 | 0 | normal | No |
68 | 1 | nonanginal | 180 | 274 | 1 | 2 | 150 | 1 | 1.6 | 2 | 0 | reversable | Yes |
52 | 1 | nontypical | 120 | 325 | 0 | 0 | 172 | 0 | 0.2 | 1 | 0 | normal | No |
44 | 1 | nonanginal | 140 | 235 | 0 | 2 | 180 | 0 | 0.0 | 1 | 0 | normal | No |
47 | 1 | nonanginal | 138 | 257 | 0 | 2 | 156 | 0 | 0.0 | 1 | 0 | normal | No |
53 | 0 | nonanginal | 128 | 216 | 0 | 2 | 115 | 0 | 0.0 | 1 | 0 | NA | No |
53 | 0 | asymptomatic | 138 | 234 | 0 | 2 | 160 | 0 | 0.0 | 1 | 0 | normal | No |
51 | 0 | nonanginal | 130 | 256 | 0 | 2 | 149 | 0 | 0.5 | 1 | 0 | normal | No |
66 | 1 | asymptomatic | 120 | 302 | 0 | 2 | 151 | 0 | 0.4 | 2 | 0 | normal | No |
62 | 0 | asymptomatic | 160 | 164 | 0 | 2 | 145 | 0 | 6.2 | 3 | 3 | reversable | Yes |
62 | 1 | nonanginal | 130 | 231 | 0 | 0 | 146 | 0 | 1.8 | 2 | 3 | reversable | No |
44 | 0 | nonanginal | 108 | 141 | 0 | 0 | 175 | 0 | 0.6 | 2 | 0 | normal | No |
63 | 0 | nonanginal | 135 | 252 | 0 | 2 | 172 | 0 | 0.0 | 1 | 0 | normal | No |
52 | 1 | asymptomatic | 128 | 255 | 0 | 0 | 161 | 1 | 0.0 | 1 | 1 | reversable | Yes |
59 | 1 | asymptomatic | 110 | 239 | 0 | 2 | 142 | 1 | 1.2 | 2 | 1 | reversable | Yes |
60 | 0 | asymptomatic | 150 | 258 | 0 | 2 | 157 | 0 | 2.6 | 2 | 2 | reversable | Yes |
52 | 1 | nontypical | 134 | 201 | 0 | 0 | 158 | 0 | 0.8 | 1 | 1 | normal | No |
48 | 1 | asymptomatic | 122 | 222 | 0 | 2 | 186 | 0 | 0.0 | 1 | 0 | normal | No |
45 | 1 | asymptomatic | 115 | 260 | 0 | 2 | 185 | 0 | 0.0 | 1 | 0 | normal | No |
34 | 1 | typical | 118 | 182 | 0 | 2 | 174 | 0 | 0.0 | 1 | 0 | normal | No |
57 | 0 | asymptomatic | 128 | 303 | 0 | 2 | 159 | 0 | 0.0 | 1 | 1 | normal | No |
71 | 0 | nonanginal | 110 | 265 | 1 | 2 | 130 | 0 | 0.0 | 1 | 1 | normal | No |
49 | 1 | nonanginal | 120 | 188 | 0 | 0 | 139 | 0 | 2.0 | 2 | 3 | reversable | Yes |
54 | 1 | nontypical | 108 | 309 | 0 | 0 | 156 | 0 | 0.0 | 1 | 0 | reversable | No |
59 | 1 | asymptomatic | 140 | 177 | 0 | 0 | 162 | 1 | 0.0 | 1 | 1 | reversable | Yes |
57 | 1 | nonanginal | 128 | 229 | 0 | 2 | 150 | 0 | 0.4 | 2 | 1 | reversable | Yes |
61 | 1 | asymptomatic | 120 | 260 | 0 | 0 | 140 | 1 | 3.6 | 2 | 1 | reversable | Yes |
39 | 1 | asymptomatic | 118 | 219 | 0 | 0 | 140 | 0 | 1.2 | 2 | 0 | reversable | Yes |
61 | 0 | asymptomatic | 145 | 307 | 0 | 2 | 146 | 1 | 1.0 | 2 | 0 | reversable | Yes |
56 | 1 | asymptomatic | 125 | 249 | 1 | 2 | 144 | 1 | 1.2 | 2 | 1 | normal | Yes |
52 | 1 | typical | 118 | 186 | 0 | 2 | 190 | 0 | 0.0 | 2 | 0 | fixed | No |
43 | 0 | asymptomatic | 132 | 341 | 1 | 2 | 136 | 1 | 3.0 | 2 | 0 | reversable | Yes |
62 | 0 | nonanginal | 130 | 263 | 0 | 0 | 97 | 0 | 1.2 | 2 | 1 | reversable | Yes |
41 | 1 | nontypical | 135 | 203 | 0 | 0 | 132 | 0 | 0.0 | 2 | 0 | fixed | No |
58 | 1 | nonanginal | 140 | 211 | 1 | 2 | 165 | 0 | 0.0 | 1 | 0 | normal | No |
35 | 0 | asymptomatic | 138 | 183 | 0 | 0 | 182 | 0 | 1.4 | 1 | 0 | normal | No |
63 | 1 | asymptomatic | 130 | 330 | 1 | 2 | 132 | 1 | 1.8 | 1 | 3 | reversable | Yes |
65 | 1 | asymptomatic | 135 | 254 | 0 | 2 | 127 | 0 | 2.8 | 2 | 1 | reversable | Yes |
48 | 1 | asymptomatic | 130 | 256 | 1 | 2 | 150 | 1 | 0.0 | 1 | 2 | reversable | Yes |
63 | 0 | asymptomatic | 150 | 407 | 0 | 2 | 154 | 0 | 4.0 | 2 | 3 | reversable | Yes |
51 | 1 | nonanginal | 100 | 222 | 0 | 0 | 143 | 1 | 1.2 | 2 | 0 | normal | No |
55 | 1 | asymptomatic | 140 | 217 | 0 | 0 | 111 | 1 | 5.6 | 3 | 0 | reversable | Yes |
65 | 1 | typical | 138 | 282 | 1 | 2 | 174 | 0 | 1.4 | 2 | 1 | normal | Yes |
45 | 0 | nontypical | 130 | 234 | 0 | 2 | 175 | 0 | 0.6 | 2 | 0 | normal | No |
56 | 0 | asymptomatic | 200 | 288 | 1 | 2 | 133 | 1 | 4.0 | 3 | 2 | reversable | Yes |
54 | 1 | asymptomatic | 110 | 239 | 0 | 0 | 126 | 1 | 2.8 | 2 | 1 | reversable | Yes |
44 | 1 | nontypical | 120 | 220 | 0 | 0 | 170 | 0 | 0.0 | 1 | 0 | normal | No |
62 | 0 | asymptomatic | 124 | 209 | 0 | 0 | 163 | 0 | 0.0 | 1 | 0 | normal | No |
54 | 1 | nonanginal | 120 | 258 | 0 | 2 | 147 | 0 | 0.4 | 2 | 0 | reversable | No |
51 | 1 | nonanginal | 94 | 227 | 0 | 0 | 154 | 1 | 0.0 | 1 | 1 | reversable | No |
29 | 1 | nontypical | 130 | 204 | 0 | 2 | 202 | 0 | 0.0 | 1 | 0 | normal | No |
51 | 1 | asymptomatic | 140 | 261 | 0 | 2 | 186 | 1 | 0.0 | 1 | 0 | normal | No |
43 | 0 | nonanginal | 122 | 213 | 0 | 0 | 165 | 0 | 0.2 | 2 | 0 | normal | No |
55 | 0 | nontypical | 135 | 250 | 0 | 2 | 161 | 0 | 1.4 | 2 | 0 | normal | No |
70 | 1 | asymptomatic | 145 | 174 | 0 | 0 | 125 | 1 | 2.6 | 3 | 0 | reversable | Yes |
62 | 1 | nontypical | 120 | 281 | 0 | 2 | 103 | 0 | 1.4 | 2 | 1 | reversable | Yes |
35 | 1 | asymptomatic | 120 | 198 | 0 | 0 | 130 | 1 | 1.6 | 2 | 0 | reversable | Yes |
51 | 1 | nonanginal | 125 | 245 | 1 | 2 | 166 | 0 | 2.4 | 2 | 0 | normal | No |
59 | 1 | nontypical | 140 | 221 | 0 | 0 | 164 | 1 | 0.0 | 1 | 0 | normal | No |
59 | 1 | typical | 170 | 288 | 0 | 2 | 159 | 0 | 0.2 | 2 | 0 | reversable | Yes |
52 | 1 | nontypical | 128 | 205 | 1 | 0 | 184 | 0 | 0.0 | 1 | 0 | normal | No |
64 | 1 | nonanginal | 125 | 309 | 0 | 0 | 131 | 1 | 1.8 | 2 | 0 | reversable | Yes |
58 | 1 | nonanginal | 105 | 240 | 0 | 2 | 154 | 1 | 0.6 | 2 | 0 | reversable | No |
47 | 1 | nonanginal | 108 | 243 | 0 | 0 | 152 | 0 | 0.0 | 1 | 0 | normal | Yes |
57 | 1 | asymptomatic | 165 | 289 | 1 | 2 | 124 | 0 | 1.0 | 2 | 3 | reversable | Yes |
41 | 1 | nonanginal | 112 | 250 | 0 | 0 | 179 | 0 | 0.0 | 1 | 0 | normal | No |
45 | 1 | nontypical | 128 | 308 | 0 | 2 | 170 | 0 | 0.0 | 1 | 0 | normal | No |
60 | 0 | nonanginal | 102 | 318 | 0 | 0 | 160 | 0 | 0.0 | 1 | 1 | normal | No |
52 | 1 | typical | 152 | 298 | 1 | 0 | 178 | 0 | 1.2 | 2 | 0 | reversable | No |
42 | 0 | asymptomatic | 102 | 265 | 0 | 2 | 122 | 0 | 0.6 | 2 | 0 | normal | No |
67 | 0 | nonanginal | 115 | 564 | 0 | 2 | 160 | 0 | 1.6 | 2 | 0 | reversable | No |
55 | 1 | asymptomatic | 160 | 289 | 0 | 2 | 145 | 1 | 0.8 | 2 | 1 | reversable | Yes |
64 | 1 | asymptomatic | 120 | 246 | 0 | 2 | 96 | 1 | 2.2 | 3 | 1 | normal | Yes |
70 | 1 | asymptomatic | 130 | 322 | 0 | 2 | 109 | 0 | 2.4 | 2 | 3 | normal | Yes |
51 | 1 | asymptomatic | 140 | 299 | 0 | 0 | 173 | 1 | 1.6 | 1 | 0 | reversable | Yes |
58 | 1 | asymptomatic | 125 | 300 | 0 | 2 | 171 | 0 | 0.0 | 1 | 2 | reversable | Yes |
60 | 1 | asymptomatic | 140 | 293 | 0 | 2 | 170 | 0 | 1.2 | 2 | 2 | reversable | Yes |
68 | 1 | nonanginal | 118 | 277 | 0 | 0 | 151 | 0 | 1.0 | 1 | 1 | reversable | No |
46 | 1 | nontypical | 101 | 197 | 1 | 0 | 156 | 0 | 0.0 | 1 | 0 | reversable | No |
77 | 1 | asymptomatic | 125 | 304 | 0 | 2 | 162 | 1 | 0.0 | 1 | 3 | normal | Yes |
54 | 0 | nonanginal | 110 | 214 | 0 | 0 | 158 | 0 | 1.6 | 2 | 0 | normal | No |
58 | 0 | asymptomatic | 100 | 248 | 0 | 2 | 122 | 0 | 1.0 | 2 | 0 | normal | No |
48 | 1 | nonanginal | 124 | 255 | 1 | 0 | 175 | 0 | 0.0 | 1 | 2 | normal | No |
57 | 1 | asymptomatic | 132 | 207 | 0 | 0 | 168 | 1 | 0.0 | 1 | 0 | reversable | No |
52 | 1 | nonanginal | 138 | 223 | 0 | 0 | 169 | 0 | 0.0 | 1 | NA | normal | No |
54 | 0 | nontypical | 132 | 288 | 1 | 2 | 159 | 1 | 0.0 | 1 | 1 | normal | No |
35 | 1 | asymptomatic | 126 | 282 | 0 | 2 | 156 | 1 | 0.0 | 1 | 0 | reversable | Yes |
45 | 0 | nontypical | 112 | 160 | 0 | 0 | 138 | 0 | 0.0 | 2 | 0 | normal | No |
70 | 1 | nonanginal | 160 | 269 | 0 | 0 | 112 | 1 | 2.9 | 2 | 1 | reversable | Yes |
53 | 1 | asymptomatic | 142 | 226 | 0 | 2 | 111 | 1 | 0.0 | 1 | 0 | reversable | No |
59 | 0 | asymptomatic | 174 | 249 | 0 | 0 | 143 | 1 | 0.0 | 2 | 0 | normal | Yes |
62 | 0 | asymptomatic | 140 | 394 | 0 | 2 | 157 | 0 | 1.2 | 2 | 0 | normal | No |
64 | 1 | asymptomatic | 145 | 212 | 0 | 2 | 132 | 0 | 2.0 | 2 | 2 | fixed | Yes |
57 | 1 | asymptomatic | 152 | 274 | 0 | 0 | 88 | 1 | 1.2 | 2 | 1 | reversable | Yes |
52 | 1 | asymptomatic | 108 | 233 | 1 | 0 | 147 | 0 | 0.1 | 1 | 3 | reversable | No |
56 | 1 | asymptomatic | 132 | 184 | 0 | 2 | 105 | 1 | 2.1 | 2 | 1 | fixed | Yes |
43 | 1 | nonanginal | 130 | 315 | 0 | 0 | 162 | 0 | 1.9 | 1 | 1 | normal | No |
53 | 1 | nonanginal | 130 | 246 | 1 | 2 | 173 | 0 | 0.0 | 1 | 3 | normal | No |
48 | 1 | asymptomatic | 124 | 274 | 0 | 2 | 166 | 0 | 0.5 | 2 | 0 | reversable | Yes |
56 | 0 | asymptomatic | 134 | 409 | 0 | 2 | 150 | 1 | 1.9 | 2 | 2 | reversable | Yes |
42 | 1 | typical | 148 | 244 | 0 | 2 | 178 | 0 | 0.8 | 1 | 2 | normal | No |
59 | 1 | typical | 178 | 270 | 0 | 2 | 145 | 0 | 4.2 | 3 | 0 | reversable | No |
60 | 0 | asymptomatic | 158 | 305 | 0 | 2 | 161 | 0 | 0.0 | 1 | 0 | normal | Yes |
63 | 0 | nontypical | 140 | 195 | 0 | 0 | 179 | 0 | 0.0 | 1 | 2 | normal | No |
42 | 1 | nonanginal | 120 | 240 | 1 | 0 | 194 | 0 | 0.8 | 3 | 0 | reversable | No |
66 | 1 | nontypical | 160 | 246 | 0 | 0 | 120 | 1 | 0.0 | 2 | 3 | fixed | Yes |
54 | 1 | nontypical | 192 | 283 | 0 | 2 | 195 | 0 | 0.0 | 1 | 1 | reversable | Yes |
69 | 1 | nonanginal | 140 | 254 | 0 | 2 | 146 | 0 | 2.0 | 2 | 3 | reversable | Yes |
50 | 1 | nonanginal | 129 | 196 | 0 | 0 | 163 | 0 | 0.0 | 1 | 0 | normal | No |
51 | 1 | asymptomatic | 140 | 298 | 0 | 0 | 122 | 1 | 4.2 | 2 | 3 | reversable | Yes |
43 | 1 | asymptomatic | 132 | 247 | 1 | 2 | 143 | 1 | 0.1 | 2 | NA | reversable | Yes |
62 | 0 | asymptomatic | 138 | 294 | 1 | 0 | 106 | 0 | 1.9 | 2 | 3 | normal | Yes |
68 | 0 | nonanginal | 120 | 211 | 0 | 2 | 115 | 0 | 1.5 | 2 | 0 | normal | No |
67 | 1 | asymptomatic | 100 | 299 | 0 | 2 | 125 | 1 | 0.9 | 2 | 2 | normal | Yes |
69 | 1 | typical | 160 | 234 | 1 | 2 | 131 | 0 | 0.1 | 2 | 1 | normal | No |
45 | 0 | asymptomatic | 138 | 236 | 0 | 2 | 152 | 1 | 0.2 | 2 | 0 | normal | No |
50 | 0 | nontypical | 120 | 244 | 0 | 0 | 162 | 0 | 1.1 | 1 | 0 | normal | No |
59 | 1 | typical | 160 | 273 | 0 | 2 | 125 | 0 | 0.0 | 1 | 0 | normal | Yes |
50 | 0 | asymptomatic | 110 | 254 | 0 | 2 | 159 | 0 | 0.0 | 1 | 0 | normal | No |
64 | 0 | asymptomatic | 180 | 325 | 0 | 0 | 154 | 1 | 0.0 | 1 | 0 | normal | No |
57 | 1 | nonanginal | 150 | 126 | 1 | 0 | 173 | 0 | 0.2 | 1 | 1 | reversable | No |
64 | 0 | nonanginal | 140 | 313 | 0 | 0 | 133 | 0 | 0.2 | 1 | 0 | reversable | No |
43 | 1 | asymptomatic | 110 | 211 | 0 | 0 | 161 | 0 | 0.0 | 1 | 0 | reversable | No |
45 | 1 | asymptomatic | 142 | 309 | 0 | 2 | 147 | 1 | 0.0 | 2 | 3 | reversable | Yes |
58 | 1 | asymptomatic | 128 | 259 | 0 | 2 | 130 | 1 | 3.0 | 2 | 2 | reversable | Yes |
50 | 1 | asymptomatic | 144 | 200 | 0 | 2 | 126 | 1 | 0.9 | 2 | 0 | reversable | Yes |
55 | 1 | nontypical | 130 | 262 | 0 | 0 | 155 | 0 | 0.0 | 1 | 0 | normal | No |
62 | 0 | asymptomatic | 150 | 244 | 0 | 0 | 154 | 1 | 1.4 | 2 | 0 | normal | Yes |
37 | 0 | nonanginal | 120 | 215 | 0 | 0 | 170 | 0 | 0.0 | 1 | 0 | normal | No |
38 | 1 | typical | 120 | 231 | 0 | 0 | 182 | 1 | 3.8 | 2 | 0 | reversable | Yes |
41 | 1 | nonanginal | 130 | 214 | 0 | 2 | 168 | 0 | 2.0 | 2 | 0 | normal | No |
66 | 0 | asymptomatic | 178 | 228 | 1 | 0 | 165 | 1 | 1.0 | 2 | 2 | reversable | Yes |
52 | 1 | asymptomatic | 112 | 230 | 0 | 0 | 160 | 0 | 0.0 | 1 | 1 | normal | Yes |
56 | 1 | typical | 120 | 193 | 0 | 2 | 162 | 0 | 1.9 | 2 | 0 | reversable | No |
46 | 0 | nontypical | 105 | 204 | 0 | 0 | 172 | 0 | 0.0 | 1 | 0 | normal | No |
46 | 0 | asymptomatic | 138 | 243 | 0 | 2 | 152 | 1 | 0.0 | 2 | 0 | normal | No |
64 | 0 | asymptomatic | 130 | 303 | 0 | 0 | 122 | 0 | 2.0 | 2 | 2 | normal | No |
59 | 1 | asymptomatic | 138 | 271 | 0 | 2 | 182 | 0 | 0.0 | 1 | 0 | normal | No |
41 | 0 | nonanginal | 112 | 268 | 0 | 2 | 172 | 1 | 0.0 | 1 | 0 | normal | No |
54 | 0 | nonanginal | 108 | 267 | 0 | 2 | 167 | 0 | 0.0 | 1 | 0 | normal | No |
39 | 0 | nonanginal | 94 | 199 | 0 | 0 | 179 | 0 | 0.0 | 1 | 0 | normal | No |
53 | 1 | asymptomatic | 123 | 282 | 0 | 0 | 95 | 1 | 2.0 | 2 | 2 | reversable | Yes |
63 | 0 | asymptomatic | 108 | 269 | 0 | 0 | 169 | 1 | 1.8 | 2 | 2 | normal | Yes |
34 | 0 | nontypical | 118 | 210 | 0 | 0 | 192 | 0 | 0.7 | 1 | 0 | normal | No |
47 | 1 | asymptomatic | 112 | 204 | 0 | 0 | 143 | 0 | 0.1 | 1 | 0 | normal | No |
67 | 0 | nonanginal | 152 | 277 | 0 | 0 | 172 | 0 | 0.0 | 1 | 1 | normal | No |
54 | 1 | asymptomatic | 110 | 206 | 0 | 2 | 108 | 1 | 0.0 | 2 | 1 | normal | Yes |
66 | 1 | asymptomatic | 112 | 212 | 0 | 2 | 132 | 1 | 0.1 | 1 | 1 | normal | Yes |
52 | 0 | nonanginal | 136 | 196 | 0 | 2 | 169 | 0 | 0.1 | 2 | 0 | normal | No |
55 | 0 | asymptomatic | 180 | 327 | 0 | 1 | 117 | 1 | 3.4 | 2 | 0 | normal | Yes |
49 | 1 | nonanginal | 118 | 149 | 0 | 2 | 126 | 0 | 0.8 | 1 | 3 | normal | Yes |
74 | 0 | nontypical | 120 | 269 | 0 | 2 | 121 | 1 | 0.2 | 1 | 1 | normal | No |
54 | 0 | nonanginal | 160 | 201 | 0 | 0 | 163 | 0 | 0.0 | 1 | 1 | normal | No |
54 | 1 | asymptomatic | 122 | 286 | 0 | 2 | 116 | 1 | 3.2 | 2 | 2 | normal | Yes |
56 | 1 | asymptomatic | 130 | 283 | 1 | 2 | 103 | 1 | 1.6 | 3 | 0 | reversable | Yes |
46 | 1 | asymptomatic | 120 | 249 | 0 | 2 | 144 | 0 | 0.8 | 1 | 0 | reversable | Yes |
49 | 0 | nontypical | 134 | 271 | 0 | 0 | 162 | 0 | 0.0 | 2 | 0 | normal | No |
42 | 1 | nontypical | 120 | 295 | 0 | 0 | 162 | 0 | 0.0 | 1 | 0 | normal | No |
41 | 1 | nontypical | 110 | 235 | 0 | 0 | 153 | 0 | 0.0 | 1 | 0 | normal | No |
41 | 0 | nontypical | 126 | 306 | 0 | 0 | 163 | 0 | 0.0 | 1 | 0 | normal | No |
49 | 0 | asymptomatic | 130 | 269 | 0 | 0 | 163 | 0 | 0.0 | 1 | 0 | normal | No |
61 | 1 | typical | 134 | 234 | 0 | 0 | 145 | 0 | 2.6 | 2 | 2 | normal | Yes |
60 | 0 | nonanginal | 120 | 178 | 1 | 0 | 96 | 0 | 0.0 | 1 | 0 | normal | No |
67 | 1 | asymptomatic | 120 | 237 | 0 | 0 | 71 | 0 | 1.0 | 2 | 0 | normal | Yes |
58 | 1 | asymptomatic | 100 | 234 | 0 | 0 | 156 | 0 | 0.1 | 1 | 1 | reversable | Yes |
47 | 1 | asymptomatic | 110 | 275 | 0 | 2 | 118 | 1 | 1.0 | 2 | 1 | normal | Yes |
52 | 1 | asymptomatic | 125 | 212 | 0 | 0 | 168 | 0 | 1.0 | 1 | 2 | reversable | Yes |
62 | 1 | nontypical | 128 | 208 | 1 | 2 | 140 | 0 | 0.0 | 1 | 0 | normal | No |
57 | 1 | asymptomatic | 110 | 201 | 0 | 0 | 126 | 1 | 1.5 | 2 | 0 | fixed | No |
58 | 1 | asymptomatic | 146 | 218 | 0 | 0 | 105 | 0 | 2.0 | 2 | 1 | reversable | Yes |
64 | 1 | asymptomatic | 128 | 263 | 0 | 0 | 105 | 1 | 0.2 | 2 | 1 | reversable | No |
51 | 0 | nonanginal | 120 | 295 | 0 | 2 | 157 | 0 | 0.6 | 1 | 0 | normal | No |
43 | 1 | asymptomatic | 115 | 303 | 0 | 0 | 181 | 0 | 1.2 | 2 | 0 | normal | No |
42 | 0 | nonanginal | 120 | 209 | 0 | 0 | 173 | 0 | 0.0 | 2 | 0 | normal | No |
67 | 0 | asymptomatic | 106 | 223 | 0 | 0 | 142 | 0 | 0.3 | 1 | 2 | normal | No |
76 | 0 | nonanginal | 140 | 197 | 0 | 1 | 116 | 0 | 1.1 | 2 | 0 | normal | No |
70 | 1 | nontypical | 156 | 245 | 0 | 2 | 143 | 0 | 0.0 | 1 | 0 | normal | No |
57 | 1 | nontypical | 124 | 261 | 0 | 0 | 141 | 0 | 0.3 | 1 | 0 | reversable | Yes |
44 | 0 | nonanginal | 118 | 242 | 0 | 0 | 149 | 0 | 0.3 | 2 | 1 | normal | No |
58 | 0 | nontypical | 136 | 319 | 1 | 2 | 152 | 0 | 0.0 | 1 | 2 | normal | Yes |
60 | 0 | typical | 150 | 240 | 0 | 0 | 171 | 0 | 0.9 | 1 | 0 | normal | No |
44 | 1 | nonanginal | 120 | 226 | 0 | 0 | 169 | 0 | 0.0 | 1 | 0 | normal | No |
61 | 1 | asymptomatic | 138 | 166 | 0 | 2 | 125 | 1 | 3.6 | 2 | 1 | normal | Yes |
42 | 1 | asymptomatic | 136 | 315 | 0 | 0 | 125 | 1 | 1.8 | 2 | 0 | fixed | Yes |
52 | 1 | asymptomatic | 128 | 204 | 1 | 0 | 156 | 1 | 1.0 | 2 | 0 | NA | Yes |
59 | 1 | nonanginal | 126 | 218 | 1 | 0 | 134 | 0 | 2.2 | 2 | 1 | fixed | Yes |
40 | 1 | asymptomatic | 152 | 223 | 0 | 0 | 181 | 0 | 0.0 | 1 | 0 | reversable | Yes |
42 | 1 | nonanginal | 130 | 180 | 0 | 0 | 150 | 0 | 0.0 | 1 | 0 | normal | No |
61 | 1 | asymptomatic | 140 | 207 | 0 | 2 | 138 | 1 | 1.9 | 1 | 1 | reversable | Yes |
66 | 1 | asymptomatic | 160 | 228 | 0 | 2 | 138 | 0 | 2.3 | 1 | 0 | fixed | No |
46 | 1 | asymptomatic | 140 | 311 | 0 | 0 | 120 | 1 | 1.8 | 2 | 2 | reversable | Yes |
71 | 0 | asymptomatic | 112 | 149 | 0 | 0 | 125 | 0 | 1.6 | 2 | 0 | normal | No |
59 | 1 | typical | 134 | 204 | 0 | 0 | 162 | 0 | 0.8 | 1 | 2 | normal | Yes |
64 | 1 | typical | 170 | 227 | 0 | 2 | 155 | 0 | 0.6 | 2 | 0 | reversable | No |
66 | 0 | nonanginal | 146 | 278 | 0 | 2 | 152 | 0 | 0.0 | 2 | 1 | normal | No |
39 | 0 | nonanginal | 138 | 220 | 0 | 0 | 152 | 0 | 0.0 | 2 | 0 | normal | No |
57 | 1 | nontypical | 154 | 232 | 0 | 2 | 164 | 0 | 0.0 | 1 | 1 | normal | Yes |
58 | 0 | asymptomatic | 130 | 197 | 0 | 0 | 131 | 0 | 0.6 | 2 | 0 | normal | No |
57 | 1 | asymptomatic | 110 | 335 | 0 | 0 | 143 | 1 | 3.0 | 2 | 1 | reversable | Yes |
47 | 1 | nonanginal | 130 | 253 | 0 | 0 | 179 | 0 | 0.0 | 1 | 0 | normal | No |
55 | 0 | asymptomatic | 128 | 205 | 0 | 1 | 130 | 1 | 2.0 | 2 | 1 | reversable | Yes |
35 | 1 | nontypical | 122 | 192 | 0 | 0 | 174 | 0 | 0.0 | 1 | 0 | normal | No |
61 | 1 | asymptomatic | 148 | 203 | 0 | 0 | 161 | 0 | 0.0 | 1 | 1 | reversable | Yes |
58 | 1 | asymptomatic | 114 | 318 | 0 | 1 | 140 | 0 | 4.4 | 3 | 3 | fixed | Yes |
58 | 0 | asymptomatic | 170 | 225 | 1 | 2 | 146 | 1 | 2.8 | 2 | 2 | fixed | Yes |
58 | 1 | nontypical | 125 | 220 | 0 | 0 | 144 | 0 | 0.4 | 2 | NA | reversable | No |
56 | 1 | nontypical | 130 | 221 | 0 | 2 | 163 | 0 | 0.0 | 1 | 0 | reversable | No |
56 | 1 | nontypical | 120 | 240 | 0 | 0 | 169 | 0 | 0.0 | 3 | 0 | normal | No |
67 | 1 | nonanginal | 152 | 212 | 0 | 2 | 150 | 0 | 0.8 | 2 | 0 | reversable | Yes |
55 | 0 | nontypical | 132 | 342 | 0 | 0 | 166 | 0 | 1.2 | 1 | 0 | normal | No |
44 | 1 | asymptomatic | 120 | 169 | 0 | 0 | 144 | 1 | 2.8 | 3 | 0 | fixed | Yes |
63 | 1 | asymptomatic | 140 | 187 | 0 | 2 | 144 | 1 | 4.0 | 1 | 2 | reversable | Yes |
63 | 0 | asymptomatic | 124 | 197 | 0 | 0 | 136 | 1 | 0.0 | 2 | 0 | normal | Yes |
41 | 1 | nontypical | 120 | 157 | 0 | 0 | 182 | 0 | 0.0 | 1 | 0 | normal | No |
59 | 1 | asymptomatic | 164 | 176 | 1 | 2 | 90 | 0 | 1.0 | 2 | 2 | fixed | Yes |
57 | 0 | asymptomatic | 140 | 241 | 0 | 0 | 123 | 1 | 0.2 | 2 | 0 | reversable | Yes |
45 | 1 | typical | 110 | 264 | 0 | 0 | 132 | 0 | 1.2 | 2 | 0 | reversable | Yes |
68 | 1 | asymptomatic | 144 | 193 | 1 | 0 | 141 | 0 | 3.4 | 2 | 2 | reversable | Yes |
57 | 1 | asymptomatic | 130 | 131 | 0 | 0 | 115 | 1 | 1.2 | 2 | 1 | reversable | Yes |
57 | 0 | nontypical | 130 | 236 | 0 | 2 | 174 | 0 | 0.0 | 2 | 1 | normal | Yes |
38 | 1 | nonanginal | 138 | 175 | 0 | 0 | 173 | 0 | 0.0 | 1 | NA | normal | No |
41.8.3 CarSeats数据
Sales | CompPrice | Income | Advertising | Population | Price | ShelveLoc | Age | Education | Urban | US |
---|---|---|---|---|---|---|---|---|---|---|
9.50 | 138 | 73 | 11 | 276 | 120 | Bad | 42 | 17 | Yes | Yes |
11.22 | 111 | 48 | 16 | 260 | 83 | Good | 65 | 10 | Yes | Yes |
10.06 | 113 | 35 | 10 | 269 | 80 | Medium | 59 | 12 | Yes | Yes |
7.40 | 117 | 100 | 4 | 466 | 97 | Medium | 55 | 14 | Yes | Yes |
4.15 | 141 | 64 | 3 | 340 | 128 | Bad | 38 | 13 | Yes | No |
10.81 | 124 | 113 | 13 | 501 | 72 | Bad | 78 | 16 | No | Yes |
6.63 | 115 | 105 | 0 | 45 | 108 | Medium | 71 | 15 | Yes | No |
11.85 | 136 | 81 | 15 | 425 | 120 | Good | 67 | 10 | Yes | Yes |
6.54 | 132 | 110 | 0 | 108 | 124 | Medium | 76 | 10 | No | No |
4.69 | 132 | 113 | 0 | 131 | 124 | Medium | 76 | 17 | No | Yes |
9.01 | 121 | 78 | 9 | 150 | 100 | Bad | 26 | 10 | No | Yes |
11.96 | 117 | 94 | 4 | 503 | 94 | Good | 50 | 13 | Yes | Yes |
3.98 | 122 | 35 | 2 | 393 | 136 | Medium | 62 | 18 | Yes | No |
10.96 | 115 | 28 | 11 | 29 | 86 | Good | 53 | 18 | Yes | Yes |
11.17 | 107 | 117 | 11 | 148 | 118 | Good | 52 | 18 | Yes | Yes |
8.71 | 149 | 95 | 5 | 400 | 144 | Medium | 76 | 18 | No | No |
7.58 | 118 | 32 | 0 | 284 | 110 | Good | 63 | 13 | Yes | No |
12.29 | 147 | 74 | 13 | 251 | 131 | Good | 52 | 10 | Yes | Yes |
13.91 | 110 | 110 | 0 | 408 | 68 | Good | 46 | 17 | No | Yes |
8.73 | 129 | 76 | 16 | 58 | 121 | Medium | 69 | 12 | Yes | Yes |
6.41 | 125 | 90 | 2 | 367 | 131 | Medium | 35 | 18 | Yes | Yes |
12.13 | 134 | 29 | 12 | 239 | 109 | Good | 62 | 18 | No | Yes |
5.08 | 128 | 46 | 6 | 497 | 138 | Medium | 42 | 13 | Yes | No |
5.87 | 121 | 31 | 0 | 292 | 109 | Medium | 79 | 10 | Yes | No |
10.14 | 145 | 119 | 16 | 294 | 113 | Bad | 42 | 12 | Yes | Yes |
14.90 | 139 | 32 | 0 | 176 | 82 | Good | 54 | 11 | No | No |
8.33 | 107 | 115 | 11 | 496 | 131 | Good | 50 | 11 | No | Yes |
5.27 | 98 | 118 | 0 | 19 | 107 | Medium | 64 | 17 | Yes | No |
2.99 | 103 | 74 | 0 | 359 | 97 | Bad | 55 | 11 | Yes | Yes |
7.81 | 104 | 99 | 15 | 226 | 102 | Bad | 58 | 17 | Yes | Yes |
13.55 | 125 | 94 | 0 | 447 | 89 | Good | 30 | 12 | Yes | No |
8.25 | 136 | 58 | 16 | 241 | 131 | Medium | 44 | 18 | Yes | Yes |
6.20 | 107 | 32 | 12 | 236 | 137 | Good | 64 | 10 | No | Yes |
8.77 | 114 | 38 | 13 | 317 | 128 | Good | 50 | 16 | Yes | Yes |
2.67 | 115 | 54 | 0 | 406 | 128 | Medium | 42 | 17 | Yes | Yes |
11.07 | 131 | 84 | 11 | 29 | 96 | Medium | 44 | 17 | No | Yes |
8.89 | 122 | 76 | 0 | 270 | 100 | Good | 60 | 18 | No | No |
4.95 | 121 | 41 | 5 | 412 | 110 | Medium | 54 | 10 | Yes | Yes |
6.59 | 109 | 73 | 0 | 454 | 102 | Medium | 65 | 15 | Yes | No |
3.24 | 130 | 60 | 0 | 144 | 138 | Bad | 38 | 10 | No | No |
2.07 | 119 | 98 | 0 | 18 | 126 | Bad | 73 | 17 | No | No |
7.96 | 157 | 53 | 0 | 403 | 124 | Bad | 58 | 16 | Yes | No |
10.43 | 77 | 69 | 0 | 25 | 24 | Medium | 50 | 18 | Yes | No |
4.12 | 123 | 42 | 11 | 16 | 134 | Medium | 59 | 13 | Yes | Yes |
4.16 | 85 | 79 | 6 | 325 | 95 | Medium | 69 | 13 | Yes | Yes |
4.56 | 141 | 63 | 0 | 168 | 135 | Bad | 44 | 12 | Yes | Yes |
12.44 | 127 | 90 | 14 | 16 | 70 | Medium | 48 | 15 | No | Yes |
4.38 | 126 | 98 | 0 | 173 | 108 | Bad | 55 | 16 | Yes | No |
3.91 | 116 | 52 | 0 | 349 | 98 | Bad | 69 | 18 | Yes | No |
10.61 | 157 | 93 | 0 | 51 | 149 | Good | 32 | 17 | Yes | No |
1.42 | 99 | 32 | 18 | 341 | 108 | Bad | 80 | 16 | Yes | Yes |
4.42 | 121 | 90 | 0 | 150 | 108 | Bad | 75 | 16 | Yes | No |
7.91 | 153 | 40 | 3 | 112 | 129 | Bad | 39 | 18 | Yes | Yes |
6.92 | 109 | 64 | 13 | 39 | 119 | Medium | 61 | 17 | Yes | Yes |
4.90 | 134 | 103 | 13 | 25 | 144 | Medium | 76 | 17 | No | Yes |
6.85 | 143 | 81 | 5 | 60 | 154 | Medium | 61 | 18 | Yes | Yes |
11.91 | 133 | 82 | 0 | 54 | 84 | Medium | 50 | 17 | Yes | No |
0.91 | 93 | 91 | 0 | 22 | 117 | Bad | 75 | 11 | Yes | No |
5.42 | 103 | 93 | 15 | 188 | 103 | Bad | 74 | 16 | Yes | Yes |
5.21 | 118 | 71 | 4 | 148 | 114 | Medium | 80 | 13 | Yes | No |
8.32 | 122 | 102 | 19 | 469 | 123 | Bad | 29 | 13 | Yes | Yes |
7.32 | 105 | 32 | 0 | 358 | 107 | Medium | 26 | 13 | No | No |
1.82 | 139 | 45 | 0 | 146 | 133 | Bad | 77 | 17 | Yes | Yes |
8.47 | 119 | 88 | 10 | 170 | 101 | Medium | 61 | 13 | Yes | Yes |
7.80 | 100 | 67 | 12 | 184 | 104 | Medium | 32 | 16 | No | Yes |
4.90 | 122 | 26 | 0 | 197 | 128 | Medium | 55 | 13 | No | No |
8.85 | 127 | 92 | 0 | 508 | 91 | Medium | 56 | 18 | Yes | No |
9.01 | 126 | 61 | 14 | 152 | 115 | Medium | 47 | 16 | Yes | Yes |
13.39 | 149 | 69 | 20 | 366 | 134 | Good | 60 | 13 | Yes | Yes |
7.99 | 127 | 59 | 0 | 339 | 99 | Medium | 65 | 12 | Yes | No |
9.46 | 89 | 81 | 15 | 237 | 99 | Good | 74 | 12 | Yes | Yes |
6.50 | 148 | 51 | 16 | 148 | 150 | Medium | 58 | 17 | No | Yes |
5.52 | 115 | 45 | 0 | 432 | 116 | Medium | 25 | 15 | Yes | No |
12.61 | 118 | 90 | 10 | 54 | 104 | Good | 31 | 11 | No | Yes |
6.20 | 150 | 68 | 5 | 125 | 136 | Medium | 64 | 13 | No | Yes |
8.55 | 88 | 111 | 23 | 480 | 92 | Bad | 36 | 16 | No | Yes |
10.64 | 102 | 87 | 10 | 346 | 70 | Medium | 64 | 15 | Yes | Yes |
7.70 | 118 | 71 | 12 | 44 | 89 | Medium | 67 | 18 | No | Yes |
4.43 | 134 | 48 | 1 | 139 | 145 | Medium | 65 | 12 | Yes | Yes |
9.14 | 134 | 67 | 0 | 286 | 90 | Bad | 41 | 13 | Yes | No |
8.01 | 113 | 100 | 16 | 353 | 79 | Bad | 68 | 11 | Yes | Yes |
7.52 | 116 | 72 | 0 | 237 | 128 | Good | 70 | 13 | Yes | No |
11.62 | 151 | 83 | 4 | 325 | 139 | Good | 28 | 17 | Yes | Yes |
4.42 | 109 | 36 | 7 | 468 | 94 | Bad | 56 | 11 | Yes | Yes |
2.23 | 111 | 25 | 0 | 52 | 121 | Bad | 43 | 18 | No | No |
8.47 | 125 | 103 | 0 | 304 | 112 | Medium | 49 | 13 | No | No |
8.70 | 150 | 84 | 9 | 432 | 134 | Medium | 64 | 15 | Yes | No |
11.70 | 131 | 67 | 7 | 272 | 126 | Good | 54 | 16 | No | Yes |
6.56 | 117 | 42 | 7 | 144 | 111 | Medium | 62 | 10 | Yes | Yes |
7.95 | 128 | 66 | 3 | 493 | 119 | Medium | 45 | 16 | No | No |
5.33 | 115 | 22 | 0 | 491 | 103 | Medium | 64 | 11 | No | No |
4.81 | 97 | 46 | 11 | 267 | 107 | Medium | 80 | 15 | Yes | Yes |
4.53 | 114 | 113 | 0 | 97 | 125 | Medium | 29 | 12 | Yes | No |
8.86 | 145 | 30 | 0 | 67 | 104 | Medium | 55 | 17 | Yes | No |
8.39 | 115 | 97 | 5 | 134 | 84 | Bad | 55 | 11 | Yes | Yes |
5.58 | 134 | 25 | 10 | 237 | 148 | Medium | 59 | 13 | Yes | Yes |
9.48 | 147 | 42 | 10 | 407 | 132 | Good | 73 | 16 | No | Yes |
7.45 | 161 | 82 | 5 | 287 | 129 | Bad | 33 | 16 | Yes | Yes |
12.49 | 122 | 77 | 24 | 382 | 127 | Good | 36 | 16 | No | Yes |
4.88 | 121 | 47 | 3 | 220 | 107 | Bad | 56 | 16 | No | Yes |
4.11 | 113 | 69 | 11 | 94 | 106 | Medium | 76 | 12 | No | Yes |
6.20 | 128 | 93 | 0 | 89 | 118 | Medium | 34 | 18 | Yes | No |
5.30 | 113 | 22 | 0 | 57 | 97 | Medium | 65 | 16 | No | No |
5.07 | 123 | 91 | 0 | 334 | 96 | Bad | 78 | 17 | Yes | Yes |
4.62 | 121 | 96 | 0 | 472 | 138 | Medium | 51 | 12 | Yes | No |
5.55 | 104 | 100 | 8 | 398 | 97 | Medium | 61 | 11 | Yes | Yes |
0.16 | 102 | 33 | 0 | 217 | 139 | Medium | 70 | 18 | No | No |
8.55 | 134 | 107 | 0 | 104 | 108 | Medium | 60 | 12 | Yes | No |
3.47 | 107 | 79 | 2 | 488 | 103 | Bad | 65 | 16 | Yes | No |
8.98 | 115 | 65 | 0 | 217 | 90 | Medium | 60 | 17 | No | No |
9.00 | 128 | 62 | 7 | 125 | 116 | Medium | 43 | 14 | Yes | Yes |
6.62 | 132 | 118 | 12 | 272 | 151 | Medium | 43 | 14 | Yes | Yes |
6.67 | 116 | 99 | 5 | 298 | 125 | Good | 62 | 12 | Yes | Yes |
6.01 | 131 | 29 | 11 | 335 | 127 | Bad | 33 | 12 | Yes | Yes |
9.31 | 122 | 87 | 9 | 17 | 106 | Medium | 65 | 13 | Yes | Yes |
8.54 | 139 | 35 | 0 | 95 | 129 | Medium | 42 | 13 | Yes | No |
5.08 | 135 | 75 | 0 | 202 | 128 | Medium | 80 | 10 | No | No |
8.80 | 145 | 53 | 0 | 507 | 119 | Medium | 41 | 12 | Yes | No |
7.57 | 112 | 88 | 2 | 243 | 99 | Medium | 62 | 11 | Yes | Yes |
7.37 | 130 | 94 | 8 | 137 | 128 | Medium | 64 | 12 | Yes | Yes |
6.87 | 128 | 105 | 11 | 249 | 131 | Medium | 63 | 13 | Yes | Yes |
11.67 | 125 | 89 | 10 | 380 | 87 | Bad | 28 | 10 | Yes | Yes |
6.88 | 119 | 100 | 5 | 45 | 108 | Medium | 75 | 10 | Yes | Yes |
8.19 | 127 | 103 | 0 | 125 | 155 | Good | 29 | 15 | No | Yes |
8.87 | 131 | 113 | 0 | 181 | 120 | Good | 63 | 14 | Yes | No |
9.34 | 89 | 78 | 0 | 181 | 49 | Medium | 43 | 15 | No | No |
11.27 | 153 | 68 | 2 | 60 | 133 | Good | 59 | 16 | Yes | Yes |
6.52 | 125 | 48 | 3 | 192 | 116 | Medium | 51 | 14 | Yes | Yes |
4.96 | 133 | 100 | 3 | 350 | 126 | Bad | 55 | 13 | Yes | Yes |
4.47 | 143 | 120 | 7 | 279 | 147 | Bad | 40 | 10 | No | Yes |
8.41 | 94 | 84 | 13 | 497 | 77 | Medium | 51 | 12 | Yes | Yes |
6.50 | 108 | 69 | 3 | 208 | 94 | Medium | 77 | 16 | Yes | No |
9.54 | 125 | 87 | 9 | 232 | 136 | Good | 72 | 10 | Yes | Yes |
7.62 | 132 | 98 | 2 | 265 | 97 | Bad | 62 | 12 | Yes | Yes |
3.67 | 132 | 31 | 0 | 327 | 131 | Medium | 76 | 16 | Yes | No |
6.44 | 96 | 94 | 14 | 384 | 120 | Medium | 36 | 18 | No | Yes |
5.17 | 131 | 75 | 0 | 10 | 120 | Bad | 31 | 18 | No | No |
6.52 | 128 | 42 | 0 | 436 | 118 | Medium | 80 | 11 | Yes | No |
10.27 | 125 | 103 | 12 | 371 | 109 | Medium | 44 | 10 | Yes | Yes |
12.30 | 146 | 62 | 10 | 310 | 94 | Medium | 30 | 13 | No | Yes |
6.03 | 133 | 60 | 10 | 277 | 129 | Medium | 45 | 18 | Yes | Yes |
6.53 | 140 | 42 | 0 | 331 | 131 | Bad | 28 | 15 | Yes | No |
7.44 | 124 | 84 | 0 | 300 | 104 | Medium | 77 | 15 | Yes | No |
0.53 | 122 | 88 | 7 | 36 | 159 | Bad | 28 | 17 | Yes | Yes |
9.09 | 132 | 68 | 0 | 264 | 123 | Good | 34 | 11 | No | No |
8.77 | 144 | 63 | 11 | 27 | 117 | Medium | 47 | 17 | Yes | Yes |
3.90 | 114 | 83 | 0 | 412 | 131 | Bad | 39 | 14 | Yes | No |
10.51 | 140 | 54 | 9 | 402 | 119 | Good | 41 | 16 | No | Yes |
7.56 | 110 | 119 | 0 | 384 | 97 | Medium | 72 | 14 | No | Yes |
11.48 | 121 | 120 | 13 | 140 | 87 | Medium | 56 | 11 | Yes | Yes |
10.49 | 122 | 84 | 8 | 176 | 114 | Good | 57 | 10 | No | Yes |
10.77 | 111 | 58 | 17 | 407 | 103 | Good | 75 | 17 | No | Yes |
7.64 | 128 | 78 | 0 | 341 | 128 | Good | 45 | 13 | No | No |
5.93 | 150 | 36 | 7 | 488 | 150 | Medium | 25 | 17 | No | Yes |
6.89 | 129 | 69 | 10 | 289 | 110 | Medium | 50 | 16 | No | Yes |
7.71 | 98 | 72 | 0 | 59 | 69 | Medium | 65 | 16 | Yes | No |
7.49 | 146 | 34 | 0 | 220 | 157 | Good | 51 | 16 | Yes | No |
10.21 | 121 | 58 | 8 | 249 | 90 | Medium | 48 | 13 | No | Yes |
12.53 | 142 | 90 | 1 | 189 | 112 | Good | 39 | 10 | No | Yes |
9.32 | 119 | 60 | 0 | 372 | 70 | Bad | 30 | 18 | No | No |
4.67 | 111 | 28 | 0 | 486 | 111 | Medium | 29 | 12 | No | No |
2.93 | 143 | 21 | 5 | 81 | 160 | Medium | 67 | 12 | No | Yes |
3.63 | 122 | 74 | 0 | 424 | 149 | Medium | 51 | 13 | Yes | No |
5.68 | 130 | 64 | 0 | 40 | 106 | Bad | 39 | 17 | No | No |
8.22 | 148 | 64 | 0 | 58 | 141 | Medium | 27 | 13 | No | Yes |
0.37 | 147 | 58 | 7 | 100 | 191 | Bad | 27 | 15 | Yes | Yes |
6.71 | 119 | 67 | 17 | 151 | 137 | Medium | 55 | 11 | Yes | Yes |
6.71 | 106 | 73 | 0 | 216 | 93 | Medium | 60 | 13 | Yes | No |
7.30 | 129 | 89 | 0 | 425 | 117 | Medium | 45 | 10 | Yes | No |
11.48 | 104 | 41 | 15 | 492 | 77 | Good | 73 | 18 | Yes | Yes |
8.01 | 128 | 39 | 12 | 356 | 118 | Medium | 71 | 10 | Yes | Yes |
12.49 | 93 | 106 | 12 | 416 | 55 | Medium | 75 | 15 | Yes | Yes |
9.03 | 104 | 102 | 13 | 123 | 110 | Good | 35 | 16 | Yes | Yes |
6.38 | 135 | 91 | 5 | 207 | 128 | Medium | 66 | 18 | Yes | Yes |
0.00 | 139 | 24 | 0 | 358 | 185 | Medium | 79 | 15 | No | No |
7.54 | 115 | 89 | 0 | 38 | 122 | Medium | 25 | 12 | Yes | No |
5.61 | 138 | 107 | 9 | 480 | 154 | Medium | 47 | 11 | No | Yes |
10.48 | 138 | 72 | 0 | 148 | 94 | Medium | 27 | 17 | Yes | Yes |
10.66 | 104 | 71 | 14 | 89 | 81 | Medium | 25 | 14 | No | Yes |
7.78 | 144 | 25 | 3 | 70 | 116 | Medium | 77 | 18 | Yes | Yes |
4.94 | 137 | 112 | 15 | 434 | 149 | Bad | 66 | 13 | Yes | Yes |
7.43 | 121 | 83 | 0 | 79 | 91 | Medium | 68 | 11 | Yes | No |
4.74 | 137 | 60 | 4 | 230 | 140 | Bad | 25 | 13 | Yes | No |
5.32 | 118 | 74 | 6 | 426 | 102 | Medium | 80 | 18 | Yes | Yes |
9.95 | 132 | 33 | 7 | 35 | 97 | Medium | 60 | 11 | No | Yes |
10.07 | 130 | 100 | 11 | 449 | 107 | Medium | 64 | 10 | Yes | Yes |
8.68 | 120 | 51 | 0 | 93 | 86 | Medium | 46 | 17 | No | No |
6.03 | 117 | 32 | 0 | 142 | 96 | Bad | 62 | 17 | Yes | No |
8.07 | 116 | 37 | 0 | 426 | 90 | Medium | 76 | 15 | Yes | No |
12.11 | 118 | 117 | 18 | 509 | 104 | Medium | 26 | 15 | No | Yes |
8.79 | 130 | 37 | 13 | 297 | 101 | Medium | 37 | 13 | No | Yes |
6.67 | 156 | 42 | 13 | 170 | 173 | Good | 74 | 14 | Yes | Yes |
7.56 | 108 | 26 | 0 | 408 | 93 | Medium | 56 | 14 | No | No |
13.28 | 139 | 70 | 7 | 71 | 96 | Good | 61 | 10 | Yes | Yes |
7.23 | 112 | 98 | 18 | 481 | 128 | Medium | 45 | 11 | Yes | Yes |
4.19 | 117 | 93 | 4 | 420 | 112 | Bad | 66 | 11 | Yes | Yes |
4.10 | 130 | 28 | 6 | 410 | 133 | Bad | 72 | 16 | Yes | Yes |
2.52 | 124 | 61 | 0 | 333 | 138 | Medium | 76 | 16 | Yes | No |
3.62 | 112 | 80 | 5 | 500 | 128 | Medium | 69 | 10 | Yes | Yes |
6.42 | 122 | 88 | 5 | 335 | 126 | Medium | 64 | 14 | Yes | Yes |
5.56 | 144 | 92 | 0 | 349 | 146 | Medium | 62 | 12 | No | No |
5.94 | 138 | 83 | 0 | 139 | 134 | Medium | 54 | 18 | Yes | No |
4.10 | 121 | 78 | 4 | 413 | 130 | Bad | 46 | 10 | No | Yes |
2.05 | 131 | 82 | 0 | 132 | 157 | Bad | 25 | 14 | Yes | No |
8.74 | 155 | 80 | 0 | 237 | 124 | Medium | 37 | 14 | Yes | No |
5.68 | 113 | 22 | 1 | 317 | 132 | Medium | 28 | 12 | Yes | No |
4.97 | 162 | 67 | 0 | 27 | 160 | Medium | 77 | 17 | Yes | Yes |
8.19 | 111 | 105 | 0 | 466 | 97 | Bad | 61 | 10 | No | No |
7.78 | 86 | 54 | 0 | 497 | 64 | Bad | 33 | 12 | Yes | No |
3.02 | 98 | 21 | 11 | 326 | 90 | Bad | 76 | 11 | No | Yes |
4.36 | 125 | 41 | 2 | 357 | 123 | Bad | 47 | 14 | No | Yes |
9.39 | 117 | 118 | 14 | 445 | 120 | Medium | 32 | 15 | Yes | Yes |
12.04 | 145 | 69 | 19 | 501 | 105 | Medium | 45 | 11 | Yes | Yes |
8.23 | 149 | 84 | 5 | 220 | 139 | Medium | 33 | 10 | Yes | Yes |
4.83 | 115 | 115 | 3 | 48 | 107 | Medium | 73 | 18 | Yes | Yes |
2.34 | 116 | 83 | 15 | 170 | 144 | Bad | 71 | 11 | Yes | Yes |
5.73 | 141 | 33 | 0 | 243 | 144 | Medium | 34 | 17 | Yes | No |
4.34 | 106 | 44 | 0 | 481 | 111 | Medium | 70 | 14 | No | No |
9.70 | 138 | 61 | 12 | 156 | 120 | Medium | 25 | 14 | Yes | Yes |
10.62 | 116 | 79 | 19 | 359 | 116 | Good | 58 | 17 | Yes | Yes |
10.59 | 131 | 120 | 15 | 262 | 124 | Medium | 30 | 10 | Yes | Yes |
6.43 | 124 | 44 | 0 | 125 | 107 | Medium | 80 | 11 | Yes | No |
7.49 | 136 | 119 | 6 | 178 | 145 | Medium | 35 | 13 | Yes | Yes |
3.45 | 110 | 45 | 9 | 276 | 125 | Medium | 62 | 14 | Yes | Yes |
4.10 | 134 | 82 | 0 | 464 | 141 | Medium | 48 | 13 | No | No |
6.68 | 107 | 25 | 0 | 412 | 82 | Bad | 36 | 14 | Yes | No |
7.80 | 119 | 33 | 0 | 245 | 122 | Good | 56 | 14 | Yes | No |
8.69 | 113 | 64 | 10 | 68 | 101 | Medium | 57 | 16 | Yes | Yes |
5.40 | 149 | 73 | 13 | 381 | 163 | Bad | 26 | 11 | No | Yes |
11.19 | 98 | 104 | 0 | 404 | 72 | Medium | 27 | 18 | No | No |
5.16 | 115 | 60 | 0 | 119 | 114 | Bad | 38 | 14 | No | No |
8.09 | 132 | 69 | 0 | 123 | 122 | Medium | 27 | 11 | No | No |
13.14 | 137 | 80 | 10 | 24 | 105 | Good | 61 | 15 | Yes | Yes |
8.65 | 123 | 76 | 18 | 218 | 120 | Medium | 29 | 14 | No | Yes |
9.43 | 115 | 62 | 11 | 289 | 129 | Good | 56 | 16 | No | Yes |
5.53 | 126 | 32 | 8 | 95 | 132 | Medium | 50 | 17 | Yes | Yes |
9.32 | 141 | 34 | 16 | 361 | 108 | Medium | 69 | 10 | Yes | Yes |
9.62 | 151 | 28 | 8 | 499 | 135 | Medium | 48 | 10 | Yes | Yes |
7.36 | 121 | 24 | 0 | 200 | 133 | Good | 73 | 13 | Yes | No |
3.89 | 123 | 105 | 0 | 149 | 118 | Bad | 62 | 16 | Yes | Yes |
10.31 | 159 | 80 | 0 | 362 | 121 | Medium | 26 | 18 | Yes | No |
12.01 | 136 | 63 | 0 | 160 | 94 | Medium | 38 | 12 | Yes | No |
4.68 | 124 | 46 | 0 | 199 | 135 | Medium | 52 | 14 | No | No |
7.82 | 124 | 25 | 13 | 87 | 110 | Medium | 57 | 10 | Yes | Yes |
8.78 | 130 | 30 | 0 | 391 | 100 | Medium | 26 | 18 | Yes | No |
10.00 | 114 | 43 | 0 | 199 | 88 | Good | 57 | 10 | No | Yes |
6.90 | 120 | 56 | 20 | 266 | 90 | Bad | 78 | 18 | Yes | Yes |
5.04 | 123 | 114 | 0 | 298 | 151 | Bad | 34 | 16 | Yes | No |
5.36 | 111 | 52 | 0 | 12 | 101 | Medium | 61 | 11 | Yes | Yes |
5.05 | 125 | 67 | 0 | 86 | 117 | Bad | 65 | 11 | Yes | No |
9.16 | 137 | 105 | 10 | 435 | 156 | Good | 72 | 14 | Yes | Yes |
3.72 | 139 | 111 | 5 | 310 | 132 | Bad | 62 | 13 | Yes | Yes |
8.31 | 133 | 97 | 0 | 70 | 117 | Medium | 32 | 16 | Yes | No |
5.64 | 124 | 24 | 5 | 288 | 122 | Medium | 57 | 12 | No | Yes |
9.58 | 108 | 104 | 23 | 353 | 129 | Good | 37 | 17 | Yes | Yes |
7.71 | 123 | 81 | 8 | 198 | 81 | Bad | 80 | 15 | Yes | Yes |
4.20 | 147 | 40 | 0 | 277 | 144 | Medium | 73 | 10 | Yes | No |
8.67 | 125 | 62 | 14 | 477 | 112 | Medium | 80 | 13 | Yes | Yes |
3.47 | 108 | 38 | 0 | 251 | 81 | Bad | 72 | 14 | No | No |
5.12 | 123 | 36 | 10 | 467 | 100 | Bad | 74 | 11 | No | Yes |
7.67 | 129 | 117 | 8 | 400 | 101 | Bad | 36 | 10 | Yes | Yes |
5.71 | 121 | 42 | 4 | 188 | 118 | Medium | 54 | 15 | Yes | Yes |
6.37 | 120 | 77 | 15 | 86 | 132 | Medium | 48 | 18 | Yes | Yes |
7.77 | 116 | 26 | 6 | 434 | 115 | Medium | 25 | 17 | Yes | Yes |
6.95 | 128 | 29 | 5 | 324 | 159 | Good | 31 | 15 | Yes | Yes |
5.31 | 130 | 35 | 10 | 402 | 129 | Bad | 39 | 17 | Yes | Yes |
9.10 | 128 | 93 | 12 | 343 | 112 | Good | 73 | 17 | No | Yes |
5.83 | 134 | 82 | 7 | 473 | 112 | Bad | 51 | 12 | No | Yes |
6.53 | 123 | 57 | 0 | 66 | 105 | Medium | 39 | 11 | Yes | No |
5.01 | 159 | 69 | 0 | 438 | 166 | Medium | 46 | 17 | Yes | No |
11.99 | 119 | 26 | 0 | 284 | 89 | Good | 26 | 10 | Yes | No |
4.55 | 111 | 56 | 0 | 504 | 110 | Medium | 62 | 16 | Yes | No |
12.98 | 113 | 33 | 0 | 14 | 63 | Good | 38 | 12 | Yes | No |
10.04 | 116 | 106 | 8 | 244 | 86 | Medium | 58 | 12 | Yes | Yes |
7.22 | 135 | 93 | 2 | 67 | 119 | Medium | 34 | 11 | Yes | Yes |
6.67 | 107 | 119 | 11 | 210 | 132 | Medium | 53 | 11 | Yes | Yes |
6.93 | 135 | 69 | 14 | 296 | 130 | Medium | 73 | 15 | Yes | Yes |
7.80 | 136 | 48 | 12 | 326 | 125 | Medium | 36 | 16 | Yes | Yes |
7.22 | 114 | 113 | 2 | 129 | 151 | Good | 40 | 15 | No | Yes |
3.42 | 141 | 57 | 13 | 376 | 158 | Medium | 64 | 18 | Yes | Yes |
2.86 | 121 | 86 | 10 | 496 | 145 | Bad | 51 | 10 | Yes | Yes |
11.19 | 122 | 69 | 7 | 303 | 105 | Good | 45 | 16 | No | Yes |
7.74 | 150 | 96 | 0 | 80 | 154 | Good | 61 | 11 | Yes | No |
5.36 | 135 | 110 | 0 | 112 | 117 | Medium | 80 | 16 | No | No |
6.97 | 106 | 46 | 11 | 414 | 96 | Bad | 79 | 17 | No | No |
7.60 | 146 | 26 | 11 | 261 | 131 | Medium | 39 | 10 | Yes | Yes |
7.53 | 117 | 118 | 11 | 429 | 113 | Medium | 67 | 18 | No | Yes |
6.88 | 95 | 44 | 4 | 208 | 72 | Bad | 44 | 17 | Yes | Yes |
6.98 | 116 | 40 | 0 | 74 | 97 | Medium | 76 | 15 | No | No |
8.75 | 143 | 77 | 25 | 448 | 156 | Medium | 43 | 17 | Yes | Yes |
9.49 | 107 | 111 | 14 | 400 | 103 | Medium | 41 | 11 | No | Yes |
6.64 | 118 | 70 | 0 | 106 | 89 | Bad | 39 | 17 | Yes | No |
11.82 | 113 | 66 | 16 | 322 | 74 | Good | 76 | 15 | Yes | Yes |
11.28 | 123 | 84 | 0 | 74 | 89 | Good | 59 | 10 | Yes | No |
12.66 | 148 | 76 | 3 | 126 | 99 | Good | 60 | 11 | Yes | Yes |
4.21 | 118 | 35 | 14 | 502 | 137 | Medium | 79 | 10 | No | Yes |
8.21 | 127 | 44 | 13 | 160 | 123 | Good | 63 | 18 | Yes | Yes |
3.07 | 118 | 83 | 13 | 276 | 104 | Bad | 75 | 10 | Yes | Yes |
10.98 | 148 | 63 | 0 | 312 | 130 | Good | 63 | 15 | Yes | No |
9.40 | 135 | 40 | 17 | 497 | 96 | Medium | 54 | 17 | No | Yes |
8.57 | 116 | 78 | 1 | 158 | 99 | Medium | 45 | 11 | Yes | Yes |
7.41 | 99 | 93 | 0 | 198 | 87 | Medium | 57 | 16 | Yes | Yes |
5.28 | 108 | 77 | 13 | 388 | 110 | Bad | 74 | 14 | Yes | Yes |
10.01 | 133 | 52 | 16 | 290 | 99 | Medium | 43 | 11 | Yes | Yes |
11.93 | 123 | 98 | 12 | 408 | 134 | Good | 29 | 10 | Yes | Yes |
8.03 | 115 | 29 | 26 | 394 | 132 | Medium | 33 | 13 | Yes | Yes |
4.78 | 131 | 32 | 1 | 85 | 133 | Medium | 48 | 12 | Yes | Yes |
5.90 | 138 | 92 | 0 | 13 | 120 | Bad | 61 | 12 | Yes | No |
9.24 | 126 | 80 | 19 | 436 | 126 | Medium | 52 | 10 | Yes | Yes |
11.18 | 131 | 111 | 13 | 33 | 80 | Bad | 68 | 18 | Yes | Yes |
9.53 | 175 | 65 | 29 | 419 | 166 | Medium | 53 | 12 | Yes | Yes |
6.15 | 146 | 68 | 12 | 328 | 132 | Bad | 51 | 14 | Yes | Yes |
6.80 | 137 | 117 | 5 | 337 | 135 | Bad | 38 | 10 | Yes | Yes |
9.33 | 103 | 81 | 3 | 491 | 54 | Medium | 66 | 13 | Yes | No |
7.72 | 133 | 33 | 10 | 333 | 129 | Good | 71 | 14 | Yes | Yes |
6.39 | 131 | 21 | 8 | 220 | 171 | Good | 29 | 14 | Yes | Yes |
15.63 | 122 | 36 | 5 | 369 | 72 | Good | 35 | 10 | Yes | Yes |
6.41 | 142 | 30 | 0 | 472 | 136 | Good | 80 | 15 | No | No |
10.08 | 116 | 72 | 10 | 456 | 130 | Good | 41 | 14 | No | Yes |
6.97 | 127 | 45 | 19 | 459 | 129 | Medium | 57 | 11 | No | Yes |
5.86 | 136 | 70 | 12 | 171 | 152 | Medium | 44 | 18 | Yes | Yes |
7.52 | 123 | 39 | 5 | 499 | 98 | Medium | 34 | 15 | Yes | No |
9.16 | 140 | 50 | 10 | 300 | 139 | Good | 60 | 15 | Yes | Yes |
10.36 | 107 | 105 | 18 | 428 | 103 | Medium | 34 | 12 | Yes | Yes |
2.66 | 136 | 65 | 4 | 133 | 150 | Bad | 53 | 13 | Yes | Yes |
11.70 | 144 | 69 | 11 | 131 | 104 | Medium | 47 | 11 | Yes | Yes |
4.69 | 133 | 30 | 0 | 152 | 122 | Medium | 53 | 17 | Yes | No |
6.23 | 112 | 38 | 17 | 316 | 104 | Medium | 80 | 16 | Yes | Yes |
3.15 | 117 | 66 | 1 | 65 | 111 | Bad | 55 | 11 | Yes | Yes |
11.27 | 100 | 54 | 9 | 433 | 89 | Good | 45 | 12 | Yes | Yes |
4.99 | 122 | 59 | 0 | 501 | 112 | Bad | 32 | 14 | No | No |
10.10 | 135 | 63 | 15 | 213 | 134 | Medium | 32 | 10 | Yes | Yes |
5.74 | 106 | 33 | 20 | 354 | 104 | Medium | 61 | 12 | Yes | Yes |
5.87 | 136 | 60 | 7 | 303 | 147 | Medium | 41 | 10 | Yes | Yes |
7.63 | 93 | 117 | 9 | 489 | 83 | Bad | 42 | 13 | Yes | Yes |
6.18 | 120 | 70 | 15 | 464 | 110 | Medium | 72 | 15 | Yes | Yes |
5.17 | 138 | 35 | 6 | 60 | 143 | Bad | 28 | 18 | Yes | No |
8.61 | 130 | 38 | 0 | 283 | 102 | Medium | 80 | 15 | Yes | No |
5.97 | 112 | 24 | 0 | 164 | 101 | Medium | 45 | 11 | Yes | No |
11.54 | 134 | 44 | 4 | 219 | 126 | Good | 44 | 15 | Yes | Yes |
7.50 | 140 | 29 | 0 | 105 | 91 | Bad | 43 | 16 | Yes | No |
7.38 | 98 | 120 | 0 | 268 | 93 | Medium | 72 | 10 | No | No |
7.81 | 137 | 102 | 13 | 422 | 118 | Medium | 71 | 10 | No | Yes |
5.99 | 117 | 42 | 10 | 371 | 121 | Bad | 26 | 14 | Yes | Yes |
8.43 | 138 | 80 | 0 | 108 | 126 | Good | 70 | 13 | No | Yes |
4.81 | 121 | 68 | 0 | 279 | 149 | Good | 79 | 12 | Yes | No |
8.97 | 132 | 107 | 0 | 144 | 125 | Medium | 33 | 13 | No | No |
6.88 | 96 | 39 | 0 | 161 | 112 | Good | 27 | 14 | No | No |
12.57 | 132 | 102 | 20 | 459 | 107 | Good | 49 | 11 | Yes | Yes |
9.32 | 134 | 27 | 18 | 467 | 96 | Medium | 49 | 14 | No | Yes |
8.64 | 111 | 101 | 17 | 266 | 91 | Medium | 63 | 17 | No | Yes |
10.44 | 124 | 115 | 16 | 458 | 105 | Medium | 62 | 16 | No | Yes |
13.44 | 133 | 103 | 14 | 288 | 122 | Good | 61 | 17 | Yes | Yes |
9.45 | 107 | 67 | 12 | 430 | 92 | Medium | 35 | 12 | No | Yes |
5.30 | 133 | 31 | 1 | 80 | 145 | Medium | 42 | 18 | Yes | Yes |
7.02 | 130 | 100 | 0 | 306 | 146 | Good | 42 | 11 | Yes | No |
3.58 | 142 | 109 | 0 | 111 | 164 | Good | 72 | 12 | Yes | No |
13.36 | 103 | 73 | 3 | 276 | 72 | Medium | 34 | 15 | Yes | Yes |
4.17 | 123 | 96 | 10 | 71 | 118 | Bad | 69 | 11 | Yes | Yes |
3.13 | 130 | 62 | 11 | 396 | 130 | Bad | 66 | 14 | Yes | Yes |
8.77 | 118 | 86 | 7 | 265 | 114 | Good | 52 | 15 | No | Yes |
8.68 | 131 | 25 | 10 | 183 | 104 | Medium | 56 | 15 | No | Yes |
5.25 | 131 | 55 | 0 | 26 | 110 | Bad | 79 | 12 | Yes | Yes |
10.26 | 111 | 75 | 1 | 377 | 108 | Good | 25 | 12 | Yes | No |
10.50 | 122 | 21 | 16 | 488 | 131 | Good | 30 | 14 | Yes | Yes |
6.53 | 154 | 30 | 0 | 122 | 162 | Medium | 57 | 17 | No | No |
5.98 | 124 | 56 | 11 | 447 | 134 | Medium | 53 | 12 | No | Yes |
14.37 | 95 | 106 | 0 | 256 | 53 | Good | 52 | 17 | Yes | No |
10.71 | 109 | 22 | 10 | 348 | 79 | Good | 74 | 14 | No | Yes |
10.26 | 135 | 100 | 22 | 463 | 122 | Medium | 36 | 14 | Yes | Yes |
7.68 | 126 | 41 | 22 | 403 | 119 | Bad | 42 | 12 | Yes | Yes |
9.08 | 152 | 81 | 0 | 191 | 126 | Medium | 54 | 16 | Yes | No |
7.80 | 121 | 50 | 0 | 508 | 98 | Medium | 65 | 11 | No | No |
5.58 | 137 | 71 | 0 | 402 | 116 | Medium | 78 | 17 | Yes | No |
9.44 | 131 | 47 | 7 | 90 | 118 | Medium | 47 | 12 | Yes | Yes |
7.90 | 132 | 46 | 4 | 206 | 124 | Medium | 73 | 11 | Yes | No |
16.27 | 141 | 60 | 19 | 319 | 92 | Good | 44 | 11 | Yes | Yes |
6.81 | 132 | 61 | 0 | 263 | 125 | Medium | 41 | 12 | No | No |
6.11 | 133 | 88 | 3 | 105 | 119 | Medium | 79 | 12 | Yes | Yes |
5.81 | 125 | 111 | 0 | 404 | 107 | Bad | 54 | 15 | Yes | No |
9.64 | 106 | 64 | 10 | 17 | 89 | Medium | 68 | 17 | Yes | Yes |
3.90 | 124 | 65 | 21 | 496 | 151 | Bad | 77 | 13 | Yes | Yes |
4.95 | 121 | 28 | 19 | 315 | 121 | Medium | 66 | 14 | Yes | Yes |
9.35 | 98 | 117 | 0 | 76 | 68 | Medium | 63 | 10 | Yes | No |
12.85 | 123 | 37 | 15 | 348 | 112 | Good | 28 | 12 | Yes | Yes |
5.87 | 131 | 73 | 13 | 455 | 132 | Medium | 62 | 17 | Yes | Yes |
5.32 | 152 | 116 | 0 | 170 | 160 | Medium | 39 | 16 | Yes | No |
8.67 | 142 | 73 | 14 | 238 | 115 | Medium | 73 | 14 | No | Yes |
8.14 | 135 | 89 | 11 | 245 | 78 | Bad | 79 | 16 | Yes | Yes |
8.44 | 128 | 42 | 8 | 328 | 107 | Medium | 35 | 12 | Yes | Yes |
5.47 | 108 | 75 | 9 | 61 | 111 | Medium | 67 | 12 | Yes | Yes |
6.10 | 153 | 63 | 0 | 49 | 124 | Bad | 56 | 16 | Yes | No |
4.53 | 129 | 42 | 13 | 315 | 130 | Bad | 34 | 13 | Yes | Yes |
5.57 | 109 | 51 | 10 | 26 | 120 | Medium | 30 | 17 | No | Yes |
5.35 | 130 | 58 | 19 | 366 | 139 | Bad | 33 | 16 | Yes | Yes |
12.57 | 138 | 108 | 17 | 203 | 128 | Good | 33 | 14 | Yes | Yes |
6.14 | 139 | 23 | 3 | 37 | 120 | Medium | 55 | 11 | No | Yes |
7.41 | 162 | 26 | 12 | 368 | 159 | Medium | 40 | 18 | Yes | Yes |
5.94 | 100 | 79 | 7 | 284 | 95 | Bad | 50 | 12 | Yes | Yes |
9.71 | 134 | 37 | 0 | 27 | 120 | Good | 49 | 16 | Yes | Yes |
41.8.4 Boston数据
crim | zn | indus | chas | nox | rm | age | dis | rad | tax | ptratio | black | lstat | medv |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.00632 | 18.0 | 2.31 | 0 | 0.5380 | 6.575 | 65.2 | 4.0900 | 1 | 296 | 15.3 | 396.90 | 4.98 | 24.0 |
0.02731 | 0.0 | 7.07 | 0 | 0.4690 | 6.421 | 78.9 | 4.9671 | 2 | 242 | 17.8 | 396.90 | 9.14 | 21.6 |
0.02729 | 0.0 | 7.07 | 0 | 0.4690 | 7.185 | 61.1 | 4.9671 | 2 | 242 | 17.8 | 392.83 | 4.03 | 34.7 |
0.03237 | 0.0 | 2.18 | 0 | 0.4580 | 6.998 | 45.8 | 6.0622 | 3 | 222 | 18.7 | 394.63 | 2.94 | 33.4 |
0.06905 | 0.0 | 2.18 | 0 | 0.4580 | 7.147 | 54.2 | 6.0622 | 3 | 222 | 18.7 | 396.90 | 5.33 | 36.2 |
0.02985 | 0.0 | 2.18 | 0 | 0.4580 | 6.430 | 58.7 | 6.0622 | 3 | 222 | 18.7 | 394.12 | 5.21 | 28.7 |
0.08829 | 12.5 | 7.87 | 0 | 0.5240 | 6.012 | 66.6 | 5.5605 | 5 | 311 | 15.2 | 395.60 | 12.43 | 22.9 |
0.14455 | 12.5 | 7.87 | 0 | 0.5240 | 6.172 | 96.1 | 5.9505 | 5 | 311 | 15.2 | 396.90 | 19.15 | 27.1 |
0.21124 | 12.5 | 7.87 | 0 | 0.5240 | 5.631 | 100.0 | 6.0821 | 5 | 311 | 15.2 | 386.63 | 29.93 | 16.5 |
0.17004 | 12.5 | 7.87 | 0 | 0.5240 | 6.004 | 85.9 | 6.5921 | 5 | 311 | 15.2 | 386.71 | 17.10 | 18.9 |
0.22489 | 12.5 | 7.87 | 0 | 0.5240 | 6.377 | 94.3 | 6.3467 | 5 | 311 | 15.2 | 392.52 | 20.45 | 15.0 |
0.11747 | 12.5 | 7.87 | 0 | 0.5240 | 6.009 | 82.9 | 6.2267 | 5 | 311 | 15.2 | 396.90 | 13.27 | 18.9 |
0.09378 | 12.5 | 7.87 | 0 | 0.5240 | 5.889 | 39.0 | 5.4509 | 5 | 311 | 15.2 | 390.50 | 15.71 | 21.7 |
0.62976 | 0.0 | 8.14 | 0 | 0.5380 | 5.949 | 61.8 | 4.7075 | 4 | 307 | 21.0 | 396.90 | 8.26 | 20.4 |
0.63796 | 0.0 | 8.14 | 0 | 0.5380 | 6.096 | 84.5 | 4.4619 | 4 | 307 | 21.0 | 380.02 | 10.26 | 18.2 |
0.62739 | 0.0 | 8.14 | 0 | 0.5380 | 5.834 | 56.5 | 4.4986 | 4 | 307 | 21.0 | 395.62 | 8.47 | 19.9 |
1.05393 | 0.0 | 8.14 | 0 | 0.5380 | 5.935 | 29.3 | 4.4986 | 4 | 307 | 21.0 | 386.85 | 6.58 | 23.1 |
0.78420 | 0.0 | 8.14 | 0 | 0.5380 | 5.990 | 81.7 | 4.2579 | 4 | 307 | 21.0 | 386.75 | 14.67 | 17.5 |
0.80271 | 0.0 | 8.14 | 0 | 0.5380 | 5.456 | 36.6 | 3.7965 | 4 | 307 | 21.0 | 288.99 | 11.69 | 20.2 |
0.72580 | 0.0 | 8.14 | 0 | 0.5380 | 5.727 | 69.5 | 3.7965 | 4 | 307 | 21.0 | 390.95 | 11.28 | 18.2 |
1.25179 | 0.0 | 8.14 | 0 | 0.5380 | 5.570 | 98.1 | 3.7979 | 4 | 307 | 21.0 | 376.57 | 21.02 | 13.6 |
0.85204 | 0.0 | 8.14 | 0 | 0.5380 | 5.965 | 89.2 | 4.0123 | 4 | 307 | 21.0 | 392.53 | 13.83 | 19.6 |
1.23247 | 0.0 | 8.14 | 0 | 0.5380 | 6.142 | 91.7 | 3.9769 | 4 | 307 | 21.0 | 396.90 | 18.72 | 15.2 |
0.98843 | 0.0 | 8.14 | 0 | 0.5380 | 5.813 | 100.0 | 4.0952 | 4 | 307 | 21.0 | 394.54 | 19.88 | 14.5 |
0.75026 | 0.0 | 8.14 | 0 | 0.5380 | 5.924 | 94.1 | 4.3996 | 4 | 307 | 21.0 | 394.33 | 16.30 | 15.6 |
0.84054 | 0.0 | 8.14 | 0 | 0.5380 | 5.599 | 85.7 | 4.4546 | 4 | 307 | 21.0 | 303.42 | 16.51 | 13.9 |
0.67191 | 0.0 | 8.14 | 0 | 0.5380 | 5.813 | 90.3 | 4.6820 | 4 | 307 | 21.0 | 376.88 | 14.81 | 16.6 |
0.95577 | 0.0 | 8.14 | 0 | 0.5380 | 6.047 | 88.8 | 4.4534 | 4 | 307 | 21.0 | 306.38 | 17.28 | 14.8 |
0.77299 | 0.0 | 8.14 | 0 | 0.5380 | 6.495 | 94.4 | 4.4547 | 4 | 307 | 21.0 | 387.94 | 12.80 | 18.4 |
1.00245 | 0.0 | 8.14 | 0 | 0.5380 | 6.674 | 87.3 | 4.2390 | 4 | 307 | 21.0 | 380.23 | 11.98 | 21.0 |
1.13081 | 0.0 | 8.14 | 0 | 0.5380 | 5.713 | 94.1 | 4.2330 | 4 | 307 | 21.0 | 360.17 | 22.60 | 12.7 |
1.35472 | 0.0 | 8.14 | 0 | 0.5380 | 6.072 | 100.0 | 4.1750 | 4 | 307 | 21.0 | 376.73 | 13.04 | 14.5 |
1.38799 | 0.0 | 8.14 | 0 | 0.5380 | 5.950 | 82.0 | 3.9900 | 4 | 307 | 21.0 | 232.60 | 27.71 | 13.2 |
1.15172 | 0.0 | 8.14 | 0 | 0.5380 | 5.701 | 95.0 | 3.7872 | 4 | 307 | 21.0 | 358.77 | 18.35 | 13.1 |
1.61282 | 0.0 | 8.14 | 0 | 0.5380 | 6.096 | 96.9 | 3.7598 | 4 | 307 | 21.0 | 248.31 | 20.34 | 13.5 |
0.06417 | 0.0 | 5.96 | 0 | 0.4990 | 5.933 | 68.2 | 3.3603 | 5 | 279 | 19.2 | 396.90 | 9.68 | 18.9 |
0.09744 | 0.0 | 5.96 | 0 | 0.4990 | 5.841 | 61.4 | 3.3779 | 5 | 279 | 19.2 | 377.56 | 11.41 | 20.0 |
0.08014 | 0.0 | 5.96 | 0 | 0.4990 | 5.850 | 41.5 | 3.9342 | 5 | 279 | 19.2 | 396.90 | 8.77 | 21.0 |
0.17505 | 0.0 | 5.96 | 0 | 0.4990 | 5.966 | 30.2 | 3.8473 | 5 | 279 | 19.2 | 393.43 | 10.13 | 24.7 |
0.02763 | 75.0 | 2.95 | 0 | 0.4280 | 6.595 | 21.8 | 5.4011 | 3 | 252 | 18.3 | 395.63 | 4.32 | 30.8 |
0.03359 | 75.0 | 2.95 | 0 | 0.4280 | 7.024 | 15.8 | 5.4011 | 3 | 252 | 18.3 | 395.62 | 1.98 | 34.9 |
0.12744 | 0.0 | 6.91 | 0 | 0.4480 | 6.770 | 2.9 | 5.7209 | 3 | 233 | 17.9 | 385.41 | 4.84 | 26.6 |
0.14150 | 0.0 | 6.91 | 0 | 0.4480 | 6.169 | 6.6 | 5.7209 | 3 | 233 | 17.9 | 383.37 | 5.81 | 25.3 |
0.15936 | 0.0 | 6.91 | 0 | 0.4480 | 6.211 | 6.5 | 5.7209 | 3 | 233 | 17.9 | 394.46 | 7.44 | 24.7 |
0.12269 | 0.0 | 6.91 | 0 | 0.4480 | 6.069 | 40.0 | 5.7209 | 3 | 233 | 17.9 | 389.39 | 9.55 | 21.2 |
0.17142 | 0.0 | 6.91 | 0 | 0.4480 | 5.682 | 33.8 | 5.1004 | 3 | 233 | 17.9 | 396.90 | 10.21 | 19.3 |
0.18836 | 0.0 | 6.91 | 0 | 0.4480 | 5.786 | 33.3 | 5.1004 | 3 | 233 | 17.9 | 396.90 | 14.15 | 20.0 |
0.22927 | 0.0 | 6.91 | 0 | 0.4480 | 6.030 | 85.5 | 5.6894 | 3 | 233 | 17.9 | 392.74 | 18.80 | 16.6 |
0.25387 | 0.0 | 6.91 | 0 | 0.4480 | 5.399 | 95.3 | 5.8700 | 3 | 233 | 17.9 | 396.90 | 30.81 | 14.4 |
0.21977 | 0.0 | 6.91 | 0 | 0.4480 | 5.602 | 62.0 | 6.0877 | 3 | 233 | 17.9 | 396.90 | 16.20 | 19.4 |
0.08873 | 21.0 | 5.64 | 0 | 0.4390 | 5.963 | 45.7 | 6.8147 | 4 | 243 | 16.8 | 395.56 | 13.45 | 19.7 |
0.04337 | 21.0 | 5.64 | 0 | 0.4390 | 6.115 | 63.0 | 6.8147 | 4 | 243 | 16.8 | 393.97 | 9.43 | 20.5 |
0.05360 | 21.0 | 5.64 | 0 | 0.4390 | 6.511 | 21.1 | 6.8147 | 4 | 243 | 16.8 | 396.90 | 5.28 | 25.0 |
0.04981 | 21.0 | 5.64 | 0 | 0.4390 | 5.998 | 21.4 | 6.8147 | 4 | 243 | 16.8 | 396.90 | 8.43 | 23.4 |
0.01360 | 75.0 | 4.00 | 0 | 0.4100 | 5.888 | 47.6 | 7.3197 | 3 | 469 | 21.1 | 396.90 | 14.80 | 18.9 |
0.01311 | 90.0 | 1.22 | 0 | 0.4030 | 7.249 | 21.9 | 8.6966 | 5 | 226 | 17.9 | 395.93 | 4.81 | 35.4 |
0.02055 | 85.0 | 0.74 | 0 | 0.4100 | 6.383 | 35.7 | 9.1876 | 2 | 313 | 17.3 | 396.90 | 5.77 | 24.7 |
0.01432 | 100.0 | 1.32 | 0 | 0.4110 | 6.816 | 40.5 | 8.3248 | 5 | 256 | 15.1 | 392.90 | 3.95 | 31.6 |
0.15445 | 25.0 | 5.13 | 0 | 0.4530 | 6.145 | 29.2 | 7.8148 | 8 | 284 | 19.7 | 390.68 | 6.86 | 23.3 |
0.10328 | 25.0 | 5.13 | 0 | 0.4530 | 5.927 | 47.2 | 6.9320 | 8 | 284 | 19.7 | 396.90 | 9.22 | 19.6 |
0.14932 | 25.0 | 5.13 | 0 | 0.4530 | 5.741 | 66.2 | 7.2254 | 8 | 284 | 19.7 | 395.11 | 13.15 | 18.7 |
0.17171 | 25.0 | 5.13 | 0 | 0.4530 | 5.966 | 93.4 | 6.8185 | 8 | 284 | 19.7 | 378.08 | 14.44 | 16.0 |
0.11027 | 25.0 | 5.13 | 0 | 0.4530 | 6.456 | 67.8 | 7.2255 | 8 | 284 | 19.7 | 396.90 | 6.73 | 22.2 |
0.12650 | 25.0 | 5.13 | 0 | 0.4530 | 6.762 | 43.4 | 7.9809 | 8 | 284 | 19.7 | 395.58 | 9.50 | 25.0 |
0.01951 | 17.5 | 1.38 | 0 | 0.4161 | 7.104 | 59.5 | 9.2229 | 3 | 216 | 18.6 | 393.24 | 8.05 | 33.0 |
0.03584 | 80.0 | 3.37 | 0 | 0.3980 | 6.290 | 17.8 | 6.6115 | 4 | 337 | 16.1 | 396.90 | 4.67 | 23.5 |
0.04379 | 80.0 | 3.37 | 0 | 0.3980 | 5.787 | 31.1 | 6.6115 | 4 | 337 | 16.1 | 396.90 | 10.24 | 19.4 |
0.05789 | 12.5 | 6.07 | 0 | 0.4090 | 5.878 | 21.4 | 6.4980 | 4 | 345 | 18.9 | 396.21 | 8.10 | 22.0 |
0.13554 | 12.5 | 6.07 | 0 | 0.4090 | 5.594 | 36.8 | 6.4980 | 4 | 345 | 18.9 | 396.90 | 13.09 | 17.4 |
0.12816 | 12.5 | 6.07 | 0 | 0.4090 | 5.885 | 33.0 | 6.4980 | 4 | 345 | 18.9 | 396.90 | 8.79 | 20.9 |
0.08826 | 0.0 | 10.81 | 0 | 0.4130 | 6.417 | 6.6 | 5.2873 | 4 | 305 | 19.2 | 383.73 | 6.72 | 24.2 |
0.15876 | 0.0 | 10.81 | 0 | 0.4130 | 5.961 | 17.5 | 5.2873 | 4 | 305 | 19.2 | 376.94 | 9.88 | 21.7 |
0.09164 | 0.0 | 10.81 | 0 | 0.4130 | 6.065 | 7.8 | 5.2873 | 4 | 305 | 19.2 | 390.91 | 5.52 | 22.8 |
0.19539 | 0.0 | 10.81 | 0 | 0.4130 | 6.245 | 6.2 | 5.2873 | 4 | 305 | 19.2 | 377.17 | 7.54 | 23.4 |
0.07896 | 0.0 | 12.83 | 0 | 0.4370 | 6.273 | 6.0 | 4.2515 | 5 | 398 | 18.7 | 394.92 | 6.78 | 24.1 |
0.09512 | 0.0 | 12.83 | 0 | 0.4370 | 6.286 | 45.0 | 4.5026 | 5 | 398 | 18.7 | 383.23 | 8.94 | 21.4 |
0.10153 | 0.0 | 12.83 | 0 | 0.4370 | 6.279 | 74.5 | 4.0522 | 5 | 398 | 18.7 | 373.66 | 11.97 | 20.0 |
0.08707 | 0.0 | 12.83 | 0 | 0.4370 | 6.140 | 45.8 | 4.0905 | 5 | 398 | 18.7 | 386.96 | 10.27 | 20.8 |
0.05646 | 0.0 | 12.83 | 0 | 0.4370 | 6.232 | 53.7 | 5.0141 | 5 | 398 | 18.7 | 386.40 | 12.34 | 21.2 |
0.08387 | 0.0 | 12.83 | 0 | 0.4370 | 5.874 | 36.6 | 4.5026 | 5 | 398 | 18.7 | 396.06 | 9.10 | 20.3 |
0.04113 | 25.0 | 4.86 | 0 | 0.4260 | 6.727 | 33.5 | 5.4007 | 4 | 281 | 19.0 | 396.90 | 5.29 | 28.0 |
0.04462 | 25.0 | 4.86 | 0 | 0.4260 | 6.619 | 70.4 | 5.4007 | 4 | 281 | 19.0 | 395.63 | 7.22 | 23.9 |
0.03659 | 25.0 | 4.86 | 0 | 0.4260 | 6.302 | 32.2 | 5.4007 | 4 | 281 | 19.0 | 396.90 | 6.72 | 24.8 |
0.03551 | 25.0 | 4.86 | 0 | 0.4260 | 6.167 | 46.7 | 5.4007 | 4 | 281 | 19.0 | 390.64 | 7.51 | 22.9 |
0.05059 | 0.0 | 4.49 | 0 | 0.4490 | 6.389 | 48.0 | 4.7794 | 3 | 247 | 18.5 | 396.90 | 9.62 | 23.9 |
0.05735 | 0.0 | 4.49 | 0 | 0.4490 | 6.630 | 56.1 | 4.4377 | 3 | 247 | 18.5 | 392.30 | 6.53 | 26.6 |
0.05188 | 0.0 | 4.49 | 0 | 0.4490 | 6.015 | 45.1 | 4.4272 | 3 | 247 | 18.5 | 395.99 | 12.86 | 22.5 |
0.07151 | 0.0 | 4.49 | 0 | 0.4490 | 6.121 | 56.8 | 3.7476 | 3 | 247 | 18.5 | 395.15 | 8.44 | 22.2 |
0.05660 | 0.0 | 3.41 | 0 | 0.4890 | 7.007 | 86.3 | 3.4217 | 2 | 270 | 17.8 | 396.90 | 5.50 | 23.6 |
0.05302 | 0.0 | 3.41 | 0 | 0.4890 | 7.079 | 63.1 | 3.4145 | 2 | 270 | 17.8 | 396.06 | 5.70 | 28.7 |
0.04684 | 0.0 | 3.41 | 0 | 0.4890 | 6.417 | 66.1 | 3.0923 | 2 | 270 | 17.8 | 392.18 | 8.81 | 22.6 |
0.03932 | 0.0 | 3.41 | 0 | 0.4890 | 6.405 | 73.9 | 3.0921 | 2 | 270 | 17.8 | 393.55 | 8.20 | 22.0 |
0.04203 | 28.0 | 15.04 | 0 | 0.4640 | 6.442 | 53.6 | 3.6659 | 4 | 270 | 18.2 | 395.01 | 8.16 | 22.9 |
0.02875 | 28.0 | 15.04 | 0 | 0.4640 | 6.211 | 28.9 | 3.6659 | 4 | 270 | 18.2 | 396.33 | 6.21 | 25.0 |
0.04294 | 28.0 | 15.04 | 0 | 0.4640 | 6.249 | 77.3 | 3.6150 | 4 | 270 | 18.2 | 396.90 | 10.59 | 20.6 |
0.12204 | 0.0 | 2.89 | 0 | 0.4450 | 6.625 | 57.8 | 3.4952 | 2 | 276 | 18.0 | 357.98 | 6.65 | 28.4 |
0.11504 | 0.0 | 2.89 | 0 | 0.4450 | 6.163 | 69.6 | 3.4952 | 2 | 276 | 18.0 | 391.83 | 11.34 | 21.4 |
0.12083 | 0.0 | 2.89 | 0 | 0.4450 | 8.069 | 76.0 | 3.4952 | 2 | 276 | 18.0 | 396.90 | 4.21 | 38.7 |
0.08187 | 0.0 | 2.89 | 0 | 0.4450 | 7.820 | 36.9 | 3.4952 | 2 | 276 | 18.0 | 393.53 | 3.57 | 43.8 |
0.06860 | 0.0 | 2.89 | 0 | 0.4450 | 7.416 | 62.5 | 3.4952 | 2 | 276 | 18.0 | 396.90 | 6.19 | 33.2 |
0.14866 | 0.0 | 8.56 | 0 | 0.5200 | 6.727 | 79.9 | 2.7778 | 5 | 384 | 20.9 | 394.76 | 9.42 | 27.5 |
0.11432 | 0.0 | 8.56 | 0 | 0.5200 | 6.781 | 71.3 | 2.8561 | 5 | 384 | 20.9 | 395.58 | 7.67 | 26.5 |
0.22876 | 0.0 | 8.56 | 0 | 0.5200 | 6.405 | 85.4 | 2.7147 | 5 | 384 | 20.9 | 70.80 | 10.63 | 18.6 |
0.21161 | 0.0 | 8.56 | 0 | 0.5200 | 6.137 | 87.4 | 2.7147 | 5 | 384 | 20.9 | 394.47 | 13.44 | 19.3 |
0.13960 | 0.0 | 8.56 | 0 | 0.5200 | 6.167 | 90.0 | 2.4210 | 5 | 384 | 20.9 | 392.69 | 12.33 | 20.1 |
0.13262 | 0.0 | 8.56 | 0 | 0.5200 | 5.851 | 96.7 | 2.1069 | 5 | 384 | 20.9 | 394.05 | 16.47 | 19.5 |
0.17120 | 0.0 | 8.56 | 0 | 0.5200 | 5.836 | 91.9 | 2.2110 | 5 | 384 | 20.9 | 395.67 | 18.66 | 19.5 |
0.13117 | 0.0 | 8.56 | 0 | 0.5200 | 6.127 | 85.2 | 2.1224 | 5 | 384 | 20.9 | 387.69 | 14.09 | 20.4 |
0.12802 | 0.0 | 8.56 | 0 | 0.5200 | 6.474 | 97.1 | 2.4329 | 5 | 384 | 20.9 | 395.24 | 12.27 | 19.8 |
0.26363 | 0.0 | 8.56 | 0 | 0.5200 | 6.229 | 91.2 | 2.5451 | 5 | 384 | 20.9 | 391.23 | 15.55 | 19.4 |
0.10793 | 0.0 | 8.56 | 0 | 0.5200 | 6.195 | 54.4 | 2.7778 | 5 | 384 | 20.9 | 393.49 | 13.00 | 21.7 |
0.10084 | 0.0 | 10.01 | 0 | 0.5470 | 6.715 | 81.6 | 2.6775 | 6 | 432 | 17.8 | 395.59 | 10.16 | 22.8 |
0.12329 | 0.0 | 10.01 | 0 | 0.5470 | 5.913 | 92.9 | 2.3534 | 6 | 432 | 17.8 | 394.95 | 16.21 | 18.8 |
0.22212 | 0.0 | 10.01 | 0 | 0.5470 | 6.092 | 95.4 | 2.5480 | 6 | 432 | 17.8 | 396.90 | 17.09 | 18.7 |
0.14231 | 0.0 | 10.01 | 0 | 0.5470 | 6.254 | 84.2 | 2.2565 | 6 | 432 | 17.8 | 388.74 | 10.45 | 18.5 |
0.17134 | 0.0 | 10.01 | 0 | 0.5470 | 5.928 | 88.2 | 2.4631 | 6 | 432 | 17.8 | 344.91 | 15.76 | 18.3 |
0.13158 | 0.0 | 10.01 | 0 | 0.5470 | 6.176 | 72.5 | 2.7301 | 6 | 432 | 17.8 | 393.30 | 12.04 | 21.2 |
0.15098 | 0.0 | 10.01 | 0 | 0.5470 | 6.021 | 82.6 | 2.7474 | 6 | 432 | 17.8 | 394.51 | 10.30 | 19.2 |
0.13058 | 0.0 | 10.01 | 0 | 0.5470 | 5.872 | 73.1 | 2.4775 | 6 | 432 | 17.8 | 338.63 | 15.37 | 20.4 |
0.14476 | 0.0 | 10.01 | 0 | 0.5470 | 5.731 | 65.2 | 2.7592 | 6 | 432 | 17.8 | 391.50 | 13.61 | 19.3 |
0.06899 | 0.0 | 25.65 | 0 | 0.5810 | 5.870 | 69.7 | 2.2577 | 2 | 188 | 19.1 | 389.15 | 14.37 | 22.0 |
0.07165 | 0.0 | 25.65 | 0 | 0.5810 | 6.004 | 84.1 | 2.1974 | 2 | 188 | 19.1 | 377.67 | 14.27 | 20.3 |
0.09299 | 0.0 | 25.65 | 0 | 0.5810 | 5.961 | 92.9 | 2.0869 | 2 | 188 | 19.1 | 378.09 | 17.93 | 20.5 |
0.15038 | 0.0 | 25.65 | 0 | 0.5810 | 5.856 | 97.0 | 1.9444 | 2 | 188 | 19.1 | 370.31 | 25.41 | 17.3 |
0.09849 | 0.0 | 25.65 | 0 | 0.5810 | 5.879 | 95.8 | 2.0063 | 2 | 188 | 19.1 | 379.38 | 17.58 | 18.8 |
0.16902 | 0.0 | 25.65 | 0 | 0.5810 | 5.986 | 88.4 | 1.9929 | 2 | 188 | 19.1 | 385.02 | 14.81 | 21.4 |
0.38735 | 0.0 | 25.65 | 0 | 0.5810 | 5.613 | 95.6 | 1.7572 | 2 | 188 | 19.1 | 359.29 | 27.26 | 15.7 |
0.25915 | 0.0 | 21.89 | 0 | 0.6240 | 5.693 | 96.0 | 1.7883 | 4 | 437 | 21.2 | 392.11 | 17.19 | 16.2 |
0.32543 | 0.0 | 21.89 | 0 | 0.6240 | 6.431 | 98.8 | 1.8125 | 4 | 437 | 21.2 | 396.90 | 15.39 | 18.0 |
0.88125 | 0.0 | 21.89 | 0 | 0.6240 | 5.637 | 94.7 | 1.9799 | 4 | 437 | 21.2 | 396.90 | 18.34 | 14.3 |
0.34006 | 0.0 | 21.89 | 0 | 0.6240 | 6.458 | 98.9 | 2.1185 | 4 | 437 | 21.2 | 395.04 | 12.60 | 19.2 |
1.19294 | 0.0 | 21.89 | 0 | 0.6240 | 6.326 | 97.7 | 2.2710 | 4 | 437 | 21.2 | 396.90 | 12.26 | 19.6 |
0.59005 | 0.0 | 21.89 | 0 | 0.6240 | 6.372 | 97.9 | 2.3274 | 4 | 437 | 21.2 | 385.76 | 11.12 | 23.0 |
0.32982 | 0.0 | 21.89 | 0 | 0.6240 | 5.822 | 95.4 | 2.4699 | 4 | 437 | 21.2 | 388.69 | 15.03 | 18.4 |
0.97617 | 0.0 | 21.89 | 0 | 0.6240 | 5.757 | 98.4 | 2.3460 | 4 | 437 | 21.2 | 262.76 | 17.31 | 15.6 |
0.55778 | 0.0 | 21.89 | 0 | 0.6240 | 6.335 | 98.2 | 2.1107 | 4 | 437 | 21.2 | 394.67 | 16.96 | 18.1 |
0.32264 | 0.0 | 21.89 | 0 | 0.6240 | 5.942 | 93.5 | 1.9669 | 4 | 437 | 21.2 | 378.25 | 16.90 | 17.4 |
0.35233 | 0.0 | 21.89 | 0 | 0.6240 | 6.454 | 98.4 | 1.8498 | 4 | 437 | 21.2 | 394.08 | 14.59 | 17.1 |
0.24980 | 0.0 | 21.89 | 0 | 0.6240 | 5.857 | 98.2 | 1.6686 | 4 | 437 | 21.2 | 392.04 | 21.32 | 13.3 |
0.54452 | 0.0 | 21.89 | 0 | 0.6240 | 6.151 | 97.9 | 1.6687 | 4 | 437 | 21.2 | 396.90 | 18.46 | 17.8 |
0.29090 | 0.0 | 21.89 | 0 | 0.6240 | 6.174 | 93.6 | 1.6119 | 4 | 437 | 21.2 | 388.08 | 24.16 | 14.0 |
1.62864 | 0.0 | 21.89 | 0 | 0.6240 | 5.019 | 100.0 | 1.4394 | 4 | 437 | 21.2 | 396.90 | 34.41 | 14.4 |
3.32105 | 0.0 | 19.58 | 1 | 0.8710 | 5.403 | 100.0 | 1.3216 | 5 | 403 | 14.7 | 396.90 | 26.82 | 13.4 |
4.09740 | 0.0 | 19.58 | 0 | 0.8710 | 5.468 | 100.0 | 1.4118 | 5 | 403 | 14.7 | 396.90 | 26.42 | 15.6 |
2.77974 | 0.0 | 19.58 | 0 | 0.8710 | 4.903 | 97.8 | 1.3459 | 5 | 403 | 14.7 | 396.90 | 29.29 | 11.8 |
2.37934 | 0.0 | 19.58 | 0 | 0.8710 | 6.130 | 100.0 | 1.4191 | 5 | 403 | 14.7 | 172.91 | 27.80 | 13.8 |
2.15505 | 0.0 | 19.58 | 0 | 0.8710 | 5.628 | 100.0 | 1.5166 | 5 | 403 | 14.7 | 169.27 | 16.65 | 15.6 |
2.36862 | 0.0 | 19.58 | 0 | 0.8710 | 4.926 | 95.7 | 1.4608 | 5 | 403 | 14.7 | 391.71 | 29.53 | 14.6 |
2.33099 | 0.0 | 19.58 | 0 | 0.8710 | 5.186 | 93.8 | 1.5296 | 5 | 403 | 14.7 | 356.99 | 28.32 | 17.8 |
2.73397 | 0.0 | 19.58 | 0 | 0.8710 | 5.597 | 94.9 | 1.5257 | 5 | 403 | 14.7 | 351.85 | 21.45 | 15.4 |
1.65660 | 0.0 | 19.58 | 0 | 0.8710 | 6.122 | 97.3 | 1.6180 | 5 | 403 | 14.7 | 372.80 | 14.10 | 21.5 |
1.49632 | 0.0 | 19.58 | 0 | 0.8710 | 5.404 | 100.0 | 1.5916 | 5 | 403 | 14.7 | 341.60 | 13.28 | 19.6 |
1.12658 | 0.0 | 19.58 | 1 | 0.8710 | 5.012 | 88.0 | 1.6102 | 5 | 403 | 14.7 | 343.28 | 12.12 | 15.3 |
2.14918 | 0.0 | 19.58 | 0 | 0.8710 | 5.709 | 98.5 | 1.6232 | 5 | 403 | 14.7 | 261.95 | 15.79 | 19.4 |
1.41385 | 0.0 | 19.58 | 1 | 0.8710 | 6.129 | 96.0 | 1.7494 | 5 | 403 | 14.7 | 321.02 | 15.12 | 17.0 |
3.53501 | 0.0 | 19.58 | 1 | 0.8710 | 6.152 | 82.6 | 1.7455 | 5 | 403 | 14.7 | 88.01 | 15.02 | 15.6 |
2.44668 | 0.0 | 19.58 | 0 | 0.8710 | 5.272 | 94.0 | 1.7364 | 5 | 403 | 14.7 | 88.63 | 16.14 | 13.1 |
1.22358 | 0.0 | 19.58 | 0 | 0.6050 | 6.943 | 97.4 | 1.8773 | 5 | 403 | 14.7 | 363.43 | 4.59 | 41.3 |
1.34284 | 0.0 | 19.58 | 0 | 0.6050 | 6.066 | 100.0 | 1.7573 | 5 | 403 | 14.7 | 353.89 | 6.43 | 24.3 |
1.42502 | 0.0 | 19.58 | 0 | 0.8710 | 6.510 | 100.0 | 1.7659 | 5 | 403 | 14.7 | 364.31 | 7.39 | 23.3 |
1.27346 | 0.0 | 19.58 | 1 | 0.6050 | 6.250 | 92.6 | 1.7984 | 5 | 403 | 14.7 | 338.92 | 5.50 | 27.0 |
1.46336 | 0.0 | 19.58 | 0 | 0.6050 | 7.489 | 90.8 | 1.9709 | 5 | 403 | 14.7 | 374.43 | 1.73 | 50.0 |
1.83377 | 0.0 | 19.58 | 1 | 0.6050 | 7.802 | 98.2 | 2.0407 | 5 | 403 | 14.7 | 389.61 | 1.92 | 50.0 |
1.51902 | 0.0 | 19.58 | 1 | 0.6050 | 8.375 | 93.9 | 2.1620 | 5 | 403 | 14.7 | 388.45 | 3.32 | 50.0 |
2.24236 | 0.0 | 19.58 | 0 | 0.6050 | 5.854 | 91.8 | 2.4220 | 5 | 403 | 14.7 | 395.11 | 11.64 | 22.7 |
2.92400 | 0.0 | 19.58 | 0 | 0.6050 | 6.101 | 93.0 | 2.2834 | 5 | 403 | 14.7 | 240.16 | 9.81 | 25.0 |
2.01019 | 0.0 | 19.58 | 0 | 0.6050 | 7.929 | 96.2 | 2.0459 | 5 | 403 | 14.7 | 369.30 | 3.70 | 50.0 |
1.80028 | 0.0 | 19.58 | 0 | 0.6050 | 5.877 | 79.2 | 2.4259 | 5 | 403 | 14.7 | 227.61 | 12.14 | 23.8 |
2.30040 | 0.0 | 19.58 | 0 | 0.6050 | 6.319 | 96.1 | 2.1000 | 5 | 403 | 14.7 | 297.09 | 11.10 | 23.8 |
2.44953 | 0.0 | 19.58 | 0 | 0.6050 | 6.402 | 95.2 | 2.2625 | 5 | 403 | 14.7 | 330.04 | 11.32 | 22.3 |
1.20742 | 0.0 | 19.58 | 0 | 0.6050 | 5.875 | 94.6 | 2.4259 | 5 | 403 | 14.7 | 292.29 | 14.43 | 17.4 |
2.31390 | 0.0 | 19.58 | 0 | 0.6050 | 5.880 | 97.3 | 2.3887 | 5 | 403 | 14.7 | 348.13 | 12.03 | 19.1 |
0.13914 | 0.0 | 4.05 | 0 | 0.5100 | 5.572 | 88.5 | 2.5961 | 5 | 296 | 16.6 | 396.90 | 14.69 | 23.1 |
0.09178 | 0.0 | 4.05 | 0 | 0.5100 | 6.416 | 84.1 | 2.6463 | 5 | 296 | 16.6 | 395.50 | 9.04 | 23.6 |
0.08447 | 0.0 | 4.05 | 0 | 0.5100 | 5.859 | 68.7 | 2.7019 | 5 | 296 | 16.6 | 393.23 | 9.64 | 22.6 |
0.06664 | 0.0 | 4.05 | 0 | 0.5100 | 6.546 | 33.1 | 3.1323 | 5 | 296 | 16.6 | 390.96 | 5.33 | 29.4 |
0.07022 | 0.0 | 4.05 | 0 | 0.5100 | 6.020 | 47.2 | 3.5549 | 5 | 296 | 16.6 | 393.23 | 10.11 | 23.2 |
0.05425 | 0.0 | 4.05 | 0 | 0.5100 | 6.315 | 73.4 | 3.3175 | 5 | 296 | 16.6 | 395.60 | 6.29 | 24.6 |
0.06642 | 0.0 | 4.05 | 0 | 0.5100 | 6.860 | 74.4 | 2.9153 | 5 | 296 | 16.6 | 391.27 | 6.92 | 29.9 |
0.05780 | 0.0 | 2.46 | 0 | 0.4880 | 6.980 | 58.4 | 2.8290 | 3 | 193 | 17.8 | 396.90 | 5.04 | 37.2 |
0.06588 | 0.0 | 2.46 | 0 | 0.4880 | 7.765 | 83.3 | 2.7410 | 3 | 193 | 17.8 | 395.56 | 7.56 | 39.8 |
0.06888 | 0.0 | 2.46 | 0 | 0.4880 | 6.144 | 62.2 | 2.5979 | 3 | 193 | 17.8 | 396.90 | 9.45 | 36.2 |
0.09103 | 0.0 | 2.46 | 0 | 0.4880 | 7.155 | 92.2 | 2.7006 | 3 | 193 | 17.8 | 394.12 | 4.82 | 37.9 |
0.10008 | 0.0 | 2.46 | 0 | 0.4880 | 6.563 | 95.6 | 2.8470 | 3 | 193 | 17.8 | 396.90 | 5.68 | 32.5 |
0.08308 | 0.0 | 2.46 | 0 | 0.4880 | 5.604 | 89.8 | 2.9879 | 3 | 193 | 17.8 | 391.00 | 13.98 | 26.4 |
0.06047 | 0.0 | 2.46 | 0 | 0.4880 | 6.153 | 68.8 | 3.2797 | 3 | 193 | 17.8 | 387.11 | 13.15 | 29.6 |
0.05602 | 0.0 | 2.46 | 0 | 0.4880 | 7.831 | 53.6 | 3.1992 | 3 | 193 | 17.8 | 392.63 | 4.45 | 50.0 |
0.07875 | 45.0 | 3.44 | 0 | 0.4370 | 6.782 | 41.1 | 3.7886 | 5 | 398 | 15.2 | 393.87 | 6.68 | 32.0 |
0.12579 | 45.0 | 3.44 | 0 | 0.4370 | 6.556 | 29.1 | 4.5667 | 5 | 398 | 15.2 | 382.84 | 4.56 | 29.8 |
0.08370 | 45.0 | 3.44 | 0 | 0.4370 | 7.185 | 38.9 | 4.5667 | 5 | 398 | 15.2 | 396.90 | 5.39 | 34.9 |
0.09068 | 45.0 | 3.44 | 0 | 0.4370 | 6.951 | 21.5 | 6.4798 | 5 | 398 | 15.2 | 377.68 | 5.10 | 37.0 |
0.06911 | 45.0 | 3.44 | 0 | 0.4370 | 6.739 | 30.8 | 6.4798 | 5 | 398 | 15.2 | 389.71 | 4.69 | 30.5 |
0.08664 | 45.0 | 3.44 | 0 | 0.4370 | 7.178 | 26.3 | 6.4798 | 5 | 398 | 15.2 | 390.49 | 2.87 | 36.4 |
0.02187 | 60.0 | 2.93 | 0 | 0.4010 | 6.800 | 9.9 | 6.2196 | 1 | 265 | 15.6 | 393.37 | 5.03 | 31.1 |
0.01439 | 60.0 | 2.93 | 0 | 0.4010 | 6.604 | 18.8 | 6.2196 | 1 | 265 | 15.6 | 376.70 | 4.38 | 29.1 |
0.01381 | 80.0 | 0.46 | 0 | 0.4220 | 7.875 | 32.0 | 5.6484 | 4 | 255 | 14.4 | 394.23 | 2.97 | 50.0 |
0.04011 | 80.0 | 1.52 | 0 | 0.4040 | 7.287 | 34.1 | 7.3090 | 2 | 329 | 12.6 | 396.90 | 4.08 | 33.3 |
0.04666 | 80.0 | 1.52 | 0 | 0.4040 | 7.107 | 36.6 | 7.3090 | 2 | 329 | 12.6 | 354.31 | 8.61 | 30.3 |
0.03768 | 80.0 | 1.52 | 0 | 0.4040 | 7.274 | 38.3 | 7.3090 | 2 | 329 | 12.6 | 392.20 | 6.62 | 34.6 |
0.03150 | 95.0 | 1.47 | 0 | 0.4030 | 6.975 | 15.3 | 7.6534 | 3 | 402 | 17.0 | 396.90 | 4.56 | 34.9 |
0.01778 | 95.0 | 1.47 | 0 | 0.4030 | 7.135 | 13.9 | 7.6534 | 3 | 402 | 17.0 | 384.30 | 4.45 | 32.9 |
0.03445 | 82.5 | 2.03 | 0 | 0.4150 | 6.162 | 38.4 | 6.2700 | 2 | 348 | 14.7 | 393.77 | 7.43 | 24.1 |
0.02177 | 82.5 | 2.03 | 0 | 0.4150 | 7.610 | 15.7 | 6.2700 | 2 | 348 | 14.7 | 395.38 | 3.11 | 42.3 |
0.03510 | 95.0 | 2.68 | 0 | 0.4161 | 7.853 | 33.2 | 5.1180 | 4 | 224 | 14.7 | 392.78 | 3.81 | 48.5 |
0.02009 | 95.0 | 2.68 | 0 | 0.4161 | 8.034 | 31.9 | 5.1180 | 4 | 224 | 14.7 | 390.55 | 2.88 | 50.0 |
0.13642 | 0.0 | 10.59 | 0 | 0.4890 | 5.891 | 22.3 | 3.9454 | 4 | 277 | 18.6 | 396.90 | 10.87 | 22.6 |
0.22969 | 0.0 | 10.59 | 0 | 0.4890 | 6.326 | 52.5 | 4.3549 | 4 | 277 | 18.6 | 394.87 | 10.97 | 24.4 |
0.25199 | 0.0 | 10.59 | 0 | 0.4890 | 5.783 | 72.7 | 4.3549 | 4 | 277 | 18.6 | 389.43 | 18.06 | 22.5 |
0.13587 | 0.0 | 10.59 | 1 | 0.4890 | 6.064 | 59.1 | 4.2392 | 4 | 277 | 18.6 | 381.32 | 14.66 | 24.4 |
0.43571 | 0.0 | 10.59 | 1 | 0.4890 | 5.344 | 100.0 | 3.8750 | 4 | 277 | 18.6 | 396.90 | 23.09 | 20.0 |
0.17446 | 0.0 | 10.59 | 1 | 0.4890 | 5.960 | 92.1 | 3.8771 | 4 | 277 | 18.6 | 393.25 | 17.27 | 21.7 |
0.37578 | 0.0 | 10.59 | 1 | 0.4890 | 5.404 | 88.6 | 3.6650 | 4 | 277 | 18.6 | 395.24 | 23.98 | 19.3 |
0.21719 | 0.0 | 10.59 | 1 | 0.4890 | 5.807 | 53.8 | 3.6526 | 4 | 277 | 18.6 | 390.94 | 16.03 | 22.4 |
0.14052 | 0.0 | 10.59 | 0 | 0.4890 | 6.375 | 32.3 | 3.9454 | 4 | 277 | 18.6 | 385.81 | 9.38 | 28.1 |
0.28955 | 0.0 | 10.59 | 0 | 0.4890 | 5.412 | 9.8 | 3.5875 | 4 | 277 | 18.6 | 348.93 | 29.55 | 23.7 |
0.19802 | 0.0 | 10.59 | 0 | 0.4890 | 6.182 | 42.4 | 3.9454 | 4 | 277 | 18.6 | 393.63 | 9.47 | 25.0 |
0.04560 | 0.0 | 13.89 | 1 | 0.5500 | 5.888 | 56.0 | 3.1121 | 5 | 276 | 16.4 | 392.80 | 13.51 | 23.3 |
0.07013 | 0.0 | 13.89 | 0 | 0.5500 | 6.642 | 85.1 | 3.4211 | 5 | 276 | 16.4 | 392.78 | 9.69 | 28.7 |
0.11069 | 0.0 | 13.89 | 1 | 0.5500 | 5.951 | 93.8 | 2.8893 | 5 | 276 | 16.4 | 396.90 | 17.92 | 21.5 |
0.11425 | 0.0 | 13.89 | 1 | 0.5500 | 6.373 | 92.4 | 3.3633 | 5 | 276 | 16.4 | 393.74 | 10.50 | 23.0 |
0.35809 | 0.0 | 6.20 | 1 | 0.5070 | 6.951 | 88.5 | 2.8617 | 8 | 307 | 17.4 | 391.70 | 9.71 | 26.7 |
0.40771 | 0.0 | 6.20 | 1 | 0.5070 | 6.164 | 91.3 | 3.0480 | 8 | 307 | 17.4 | 395.24 | 21.46 | 21.7 |
0.62356 | 0.0 | 6.20 | 1 | 0.5070 | 6.879 | 77.7 | 3.2721 | 8 | 307 | 17.4 | 390.39 | 9.93 | 27.5 |
0.61470 | 0.0 | 6.20 | 0 | 0.5070 | 6.618 | 80.8 | 3.2721 | 8 | 307 | 17.4 | 396.90 | 7.60 | 30.1 |
0.31533 | 0.0 | 6.20 | 0 | 0.5040 | 8.266 | 78.3 | 2.8944 | 8 | 307 | 17.4 | 385.05 | 4.14 | 44.8 |
0.52693 | 0.0 | 6.20 | 0 | 0.5040 | 8.725 | 83.0 | 2.8944 | 8 | 307 | 17.4 | 382.00 | 4.63 | 50.0 |
0.38214 | 0.0 | 6.20 | 0 | 0.5040 | 8.040 | 86.5 | 3.2157 | 8 | 307 | 17.4 | 387.38 | 3.13 | 37.6 |
0.41238 | 0.0 | 6.20 | 0 | 0.5040 | 7.163 | 79.9 | 3.2157 | 8 | 307 | 17.4 | 372.08 | 6.36 | 31.6 |
0.29819 | 0.0 | 6.20 | 0 | 0.5040 | 7.686 | 17.0 | 3.3751 | 8 | 307 | 17.4 | 377.51 | 3.92 | 46.7 |
0.44178 | 0.0 | 6.20 | 0 | 0.5040 | 6.552 | 21.4 | 3.3751 | 8 | 307 | 17.4 | 380.34 | 3.76 | 31.5 |
0.53700 | 0.0 | 6.20 | 0 | 0.5040 | 5.981 | 68.1 | 3.6715 | 8 | 307 | 17.4 | 378.35 | 11.65 | 24.3 |
0.46296 | 0.0 | 6.20 | 0 | 0.5040 | 7.412 | 76.9 | 3.6715 | 8 | 307 | 17.4 | 376.14 | 5.25 | 31.7 |
0.57529 | 0.0 | 6.20 | 0 | 0.5070 | 8.337 | 73.3 | 3.8384 | 8 | 307 | 17.4 | 385.91 | 2.47 | 41.7 |
0.33147 | 0.0 | 6.20 | 0 | 0.5070 | 8.247 | 70.4 | 3.6519 | 8 | 307 | 17.4 | 378.95 | 3.95 | 48.3 |
0.44791 | 0.0 | 6.20 | 1 | 0.5070 | 6.726 | 66.5 | 3.6519 | 8 | 307 | 17.4 | 360.20 | 8.05 | 29.0 |
0.33045 | 0.0 | 6.20 | 0 | 0.5070 | 6.086 | 61.5 | 3.6519 | 8 | 307 | 17.4 | 376.75 | 10.88 | 24.0 |
0.52058 | 0.0 | 6.20 | 1 | 0.5070 | 6.631 | 76.5 | 4.1480 | 8 | 307 | 17.4 | 388.45 | 9.54 | 25.1 |
0.51183 | 0.0 | 6.20 | 0 | 0.5070 | 7.358 | 71.6 | 4.1480 | 8 | 307 | 17.4 | 390.07 | 4.73 | 31.5 |
0.08244 | 30.0 | 4.93 | 0 | 0.4280 | 6.481 | 18.5 | 6.1899 | 6 | 300 | 16.6 | 379.41 | 6.36 | 23.7 |
0.09252 | 30.0 | 4.93 | 0 | 0.4280 | 6.606 | 42.2 | 6.1899 | 6 | 300 | 16.6 | 383.78 | 7.37 | 23.3 |
0.11329 | 30.0 | 4.93 | 0 | 0.4280 | 6.897 | 54.3 | 6.3361 | 6 | 300 | 16.6 | 391.25 | 11.38 | 22.0 |
0.10612 | 30.0 | 4.93 | 0 | 0.4280 | 6.095 | 65.1 | 6.3361 | 6 | 300 | 16.6 | 394.62 | 12.40 | 20.1 |
0.10290 | 30.0 | 4.93 | 0 | 0.4280 | 6.358 | 52.9 | 7.0355 | 6 | 300 | 16.6 | 372.75 | 11.22 | 22.2 |
0.12757 | 30.0 | 4.93 | 0 | 0.4280 | 6.393 | 7.8 | 7.0355 | 6 | 300 | 16.6 | 374.71 | 5.19 | 23.7 |
0.20608 | 22.0 | 5.86 | 0 | 0.4310 | 5.593 | 76.5 | 7.9549 | 7 | 330 | 19.1 | 372.49 | 12.50 | 17.6 |
0.19133 | 22.0 | 5.86 | 0 | 0.4310 | 5.605 | 70.2 | 7.9549 | 7 | 330 | 19.1 | 389.13 | 18.46 | 18.5 |
0.33983 | 22.0 | 5.86 | 0 | 0.4310 | 6.108 | 34.9 | 8.0555 | 7 | 330 | 19.1 | 390.18 | 9.16 | 24.3 |
0.19657 | 22.0 | 5.86 | 0 | 0.4310 | 6.226 | 79.2 | 8.0555 | 7 | 330 | 19.1 | 376.14 | 10.15 | 20.5 |
0.16439 | 22.0 | 5.86 | 0 | 0.4310 | 6.433 | 49.1 | 7.8265 | 7 | 330 | 19.1 | 374.71 | 9.52 | 24.5 |
0.19073 | 22.0 | 5.86 | 0 | 0.4310 | 6.718 | 17.5 | 7.8265 | 7 | 330 | 19.1 | 393.74 | 6.56 | 26.2 |
0.14030 | 22.0 | 5.86 | 0 | 0.4310 | 6.487 | 13.0 | 7.3967 | 7 | 330 | 19.1 | 396.28 | 5.90 | 24.4 |
0.21409 | 22.0 | 5.86 | 0 | 0.4310 | 6.438 | 8.9 | 7.3967 | 7 | 330 | 19.1 | 377.07 | 3.59 | 24.8 |
0.08221 | 22.0 | 5.86 | 0 | 0.4310 | 6.957 | 6.8 | 8.9067 | 7 | 330 | 19.1 | 386.09 | 3.53 | 29.6 |
0.36894 | 22.0 | 5.86 | 0 | 0.4310 | 8.259 | 8.4 | 8.9067 | 7 | 330 | 19.1 | 396.90 | 3.54 | 42.8 |
0.04819 | 80.0 | 3.64 | 0 | 0.3920 | 6.108 | 32.0 | 9.2203 | 1 | 315 | 16.4 | 392.89 | 6.57 | 21.9 |
0.03548 | 80.0 | 3.64 | 0 | 0.3920 | 5.876 | 19.1 | 9.2203 | 1 | 315 | 16.4 | 395.18 | 9.25 | 20.9 |
0.01538 | 90.0 | 3.75 | 0 | 0.3940 | 7.454 | 34.2 | 6.3361 | 3 | 244 | 15.9 | 386.34 | 3.11 | 44.0 |
0.61154 | 20.0 | 3.97 | 0 | 0.6470 | 8.704 | 86.9 | 1.8010 | 5 | 264 | 13.0 | 389.70 | 5.12 | 50.0 |
0.66351 | 20.0 | 3.97 | 0 | 0.6470 | 7.333 | 100.0 | 1.8946 | 5 | 264 | 13.0 | 383.29 | 7.79 | 36.0 |
0.65665 | 20.0 | 3.97 | 0 | 0.6470 | 6.842 | 100.0 | 2.0107 | 5 | 264 | 13.0 | 391.93 | 6.90 | 30.1 |
0.54011 | 20.0 | 3.97 | 0 | 0.6470 | 7.203 | 81.8 | 2.1121 | 5 | 264 | 13.0 | 392.80 | 9.59 | 33.8 |
0.53412 | 20.0 | 3.97 | 0 | 0.6470 | 7.520 | 89.4 | 2.1398 | 5 | 264 | 13.0 | 388.37 | 7.26 | 43.1 |
0.52014 | 20.0 | 3.97 | 0 | 0.6470 | 8.398 | 91.5 | 2.2885 | 5 | 264 | 13.0 | 386.86 | 5.91 | 48.8 |
0.82526 | 20.0 | 3.97 | 0 | 0.6470 | 7.327 | 94.5 | 2.0788 | 5 | 264 | 13.0 | 393.42 | 11.25 | 31.0 |
0.55007 | 20.0 | 3.97 | 0 | 0.6470 | 7.206 | 91.6 | 1.9301 | 5 | 264 | 13.0 | 387.89 | 8.10 | 36.5 |
0.76162 | 20.0 | 3.97 | 0 | 0.6470 | 5.560 | 62.8 | 1.9865 | 5 | 264 | 13.0 | 392.40 | 10.45 | 22.8 |
0.78570 | 20.0 | 3.97 | 0 | 0.6470 | 7.014 | 84.6 | 2.1329 | 5 | 264 | 13.0 | 384.07 | 14.79 | 30.7 |
0.57834 | 20.0 | 3.97 | 0 | 0.5750 | 8.297 | 67.0 | 2.4216 | 5 | 264 | 13.0 | 384.54 | 7.44 | 50.0 |
0.54050 | 20.0 | 3.97 | 0 | 0.5750 | 7.470 | 52.6 | 2.8720 | 5 | 264 | 13.0 | 390.30 | 3.16 | 43.5 |
0.09065 | 20.0 | 6.96 | 1 | 0.4640 | 5.920 | 61.5 | 3.9175 | 3 | 223 | 18.6 | 391.34 | 13.65 | 20.7 |
0.29916 | 20.0 | 6.96 | 0 | 0.4640 | 5.856 | 42.1 | 4.4290 | 3 | 223 | 18.6 | 388.65 | 13.00 | 21.1 |
0.16211 | 20.0 | 6.96 | 0 | 0.4640 | 6.240 | 16.3 | 4.4290 | 3 | 223 | 18.6 | 396.90 | 6.59 | 25.2 |
0.11460 | 20.0 | 6.96 | 0 | 0.4640 | 6.538 | 58.7 | 3.9175 | 3 | 223 | 18.6 | 394.96 | 7.73 | 24.4 |
0.22188 | 20.0 | 6.96 | 1 | 0.4640 | 7.691 | 51.8 | 4.3665 | 3 | 223 | 18.6 | 390.77 | 6.58 | 35.2 |
0.05644 | 40.0 | 6.41 | 1 | 0.4470 | 6.758 | 32.9 | 4.0776 | 4 | 254 | 17.6 | 396.90 | 3.53 | 32.4 |
0.09604 | 40.0 | 6.41 | 0 | 0.4470 | 6.854 | 42.8 | 4.2673 | 4 | 254 | 17.6 | 396.90 | 2.98 | 32.0 |
0.10469 | 40.0 | 6.41 | 1 | 0.4470 | 7.267 | 49.0 | 4.7872 | 4 | 254 | 17.6 | 389.25 | 6.05 | 33.2 |
0.06127 | 40.0 | 6.41 | 1 | 0.4470 | 6.826 | 27.6 | 4.8628 | 4 | 254 | 17.6 | 393.45 | 4.16 | 33.1 |
0.07978 | 40.0 | 6.41 | 0 | 0.4470 | 6.482 | 32.1 | 4.1403 | 4 | 254 | 17.6 | 396.90 | 7.19 | 29.1 |
0.21038 | 20.0 | 3.33 | 0 | 0.4429 | 6.812 | 32.2 | 4.1007 | 5 | 216 | 14.9 | 396.90 | 4.85 | 35.1 |
0.03578 | 20.0 | 3.33 | 0 | 0.4429 | 7.820 | 64.5 | 4.6947 | 5 | 216 | 14.9 | 387.31 | 3.76 | 45.4 |
0.03705 | 20.0 | 3.33 | 0 | 0.4429 | 6.968 | 37.2 | 5.2447 | 5 | 216 | 14.9 | 392.23 | 4.59 | 35.4 |
0.06129 | 20.0 | 3.33 | 1 | 0.4429 | 7.645 | 49.7 | 5.2119 | 5 | 216 | 14.9 | 377.07 | 3.01 | 46.0 |
0.01501 | 90.0 | 1.21 | 1 | 0.4010 | 7.923 | 24.8 | 5.8850 | 1 | 198 | 13.6 | 395.52 | 3.16 | 50.0 |
0.00906 | 90.0 | 2.97 | 0 | 0.4000 | 7.088 | 20.8 | 7.3073 | 1 | 285 | 15.3 | 394.72 | 7.85 | 32.2 |
0.01096 | 55.0 | 2.25 | 0 | 0.3890 | 6.453 | 31.9 | 7.3073 | 1 | 300 | 15.3 | 394.72 | 8.23 | 22.0 |
0.01965 | 80.0 | 1.76 | 0 | 0.3850 | 6.230 | 31.5 | 9.0892 | 1 | 241 | 18.2 | 341.60 | 12.93 | 20.1 |
0.03871 | 52.5 | 5.32 | 0 | 0.4050 | 6.209 | 31.3 | 7.3172 | 6 | 293 | 16.6 | 396.90 | 7.14 | 23.2 |
0.04590 | 52.5 | 5.32 | 0 | 0.4050 | 6.315 | 45.6 | 7.3172 | 6 | 293 | 16.6 | 396.90 | 7.60 | 22.3 |
0.04297 | 52.5 | 5.32 | 0 | 0.4050 | 6.565 | 22.9 | 7.3172 | 6 | 293 | 16.6 | 371.72 | 9.51 | 24.8 |
0.03502 | 80.0 | 4.95 | 0 | 0.4110 | 6.861 | 27.9 | 5.1167 | 4 | 245 | 19.2 | 396.90 | 3.33 | 28.5 |
0.07886 | 80.0 | 4.95 | 0 | 0.4110 | 7.148 | 27.7 | 5.1167 | 4 | 245 | 19.2 | 396.90 | 3.56 | 37.3 |
0.03615 | 80.0 | 4.95 | 0 | 0.4110 | 6.630 | 23.4 | 5.1167 | 4 | 245 | 19.2 | 396.90 | 4.70 | 27.9 |
0.08265 | 0.0 | 13.92 | 0 | 0.4370 | 6.127 | 18.4 | 5.5027 | 4 | 289 | 16.0 | 396.90 | 8.58 | 23.9 |
0.08199 | 0.0 | 13.92 | 0 | 0.4370 | 6.009 | 42.3 | 5.5027 | 4 | 289 | 16.0 | 396.90 | 10.40 | 21.7 |
0.12932 | 0.0 | 13.92 | 0 | 0.4370 | 6.678 | 31.1 | 5.9604 | 4 | 289 | 16.0 | 396.90 | 6.27 | 28.6 |
0.05372 | 0.0 | 13.92 | 0 | 0.4370 | 6.549 | 51.0 | 5.9604 | 4 | 289 | 16.0 | 392.85 | 7.39 | 27.1 |
0.14103 | 0.0 | 13.92 | 0 | 0.4370 | 5.790 | 58.0 | 6.3200 | 4 | 289 | 16.0 | 396.90 | 15.84 | 20.3 |
0.06466 | 70.0 | 2.24 | 0 | 0.4000 | 6.345 | 20.1 | 7.8278 | 5 | 358 | 14.8 | 368.24 | 4.97 | 22.5 |
0.05561 | 70.0 | 2.24 | 0 | 0.4000 | 7.041 | 10.0 | 7.8278 | 5 | 358 | 14.8 | 371.58 | 4.74 | 29.0 |
0.04417 | 70.0 | 2.24 | 0 | 0.4000 | 6.871 | 47.4 | 7.8278 | 5 | 358 | 14.8 | 390.86 | 6.07 | 24.8 |
0.03537 | 34.0 | 6.09 | 0 | 0.4330 | 6.590 | 40.4 | 5.4917 | 7 | 329 | 16.1 | 395.75 | 9.50 | 22.0 |
0.09266 | 34.0 | 6.09 | 0 | 0.4330 | 6.495 | 18.4 | 5.4917 | 7 | 329 | 16.1 | 383.61 | 8.67 | 26.4 |
0.10000 | 34.0 | 6.09 | 0 | 0.4330 | 6.982 | 17.7 | 5.4917 | 7 | 329 | 16.1 | 390.43 | 4.86 | 33.1 |
0.05515 | 33.0 | 2.18 | 0 | 0.4720 | 7.236 | 41.1 | 4.0220 | 7 | 222 | 18.4 | 393.68 | 6.93 | 36.1 |
0.05479 | 33.0 | 2.18 | 0 | 0.4720 | 6.616 | 58.1 | 3.3700 | 7 | 222 | 18.4 | 393.36 | 8.93 | 28.4 |
0.07503 | 33.0 | 2.18 | 0 | 0.4720 | 7.420 | 71.9 | 3.0992 | 7 | 222 | 18.4 | 396.90 | 6.47 | 33.4 |
0.04932 | 33.0 | 2.18 | 0 | 0.4720 | 6.849 | 70.3 | 3.1827 | 7 | 222 | 18.4 | 396.90 | 7.53 | 28.2 |
0.49298 | 0.0 | 9.90 | 0 | 0.5440 | 6.635 | 82.5 | 3.3175 | 4 | 304 | 18.4 | 396.90 | 4.54 | 22.8 |
0.34940 | 0.0 | 9.90 | 0 | 0.5440 | 5.972 | 76.7 | 3.1025 | 4 | 304 | 18.4 | 396.24 | 9.97 | 20.3 |
2.63548 | 0.0 | 9.90 | 0 | 0.5440 | 4.973 | 37.8 | 2.5194 | 4 | 304 | 18.4 | 350.45 | 12.64 | 16.1 |
0.79041 | 0.0 | 9.90 | 0 | 0.5440 | 6.122 | 52.8 | 2.6403 | 4 | 304 | 18.4 | 396.90 | 5.98 | 22.1 |
0.26169 | 0.0 | 9.90 | 0 | 0.5440 | 6.023 | 90.4 | 2.8340 | 4 | 304 | 18.4 | 396.30 | 11.72 | 19.4 |
0.26938 | 0.0 | 9.90 | 0 | 0.5440 | 6.266 | 82.8 | 3.2628 | 4 | 304 | 18.4 | 393.39 | 7.90 | 21.6 |
0.36920 | 0.0 | 9.90 | 0 | 0.5440 | 6.567 | 87.3 | 3.6023 | 4 | 304 | 18.4 | 395.69 | 9.28 | 23.8 |
0.25356 | 0.0 | 9.90 | 0 | 0.5440 | 5.705 | 77.7 | 3.9450 | 4 | 304 | 18.4 | 396.42 | 11.50 | 16.2 |
0.31827 | 0.0 | 9.90 | 0 | 0.5440 | 5.914 | 83.2 | 3.9986 | 4 | 304 | 18.4 | 390.70 | 18.33 | 17.8 |
0.24522 | 0.0 | 9.90 | 0 | 0.5440 | 5.782 | 71.7 | 4.0317 | 4 | 304 | 18.4 | 396.90 | 15.94 | 19.8 |
0.40202 | 0.0 | 9.90 | 0 | 0.5440 | 6.382 | 67.2 | 3.5325 | 4 | 304 | 18.4 | 395.21 | 10.36 | 23.1 |
0.47547 | 0.0 | 9.90 | 0 | 0.5440 | 6.113 | 58.8 | 4.0019 | 4 | 304 | 18.4 | 396.23 | 12.73 | 21.0 |
0.16760 | 0.0 | 7.38 | 0 | 0.4930 | 6.426 | 52.3 | 4.5404 | 5 | 287 | 19.6 | 396.90 | 7.20 | 23.8 |
0.18159 | 0.0 | 7.38 | 0 | 0.4930 | 6.376 | 54.3 | 4.5404 | 5 | 287 | 19.6 | 396.90 | 6.87 | 23.1 |
0.35114 | 0.0 | 7.38 | 0 | 0.4930 | 6.041 | 49.9 | 4.7211 | 5 | 287 | 19.6 | 396.90 | 7.70 | 20.4 |
0.28392 | 0.0 | 7.38 | 0 | 0.4930 | 5.708 | 74.3 | 4.7211 | 5 | 287 | 19.6 | 391.13 | 11.74 | 18.5 |
0.34109 | 0.0 | 7.38 | 0 | 0.4930 | 6.415 | 40.1 | 4.7211 | 5 | 287 | 19.6 | 396.90 | 6.12 | 25.0 |
0.19186 | 0.0 | 7.38 | 0 | 0.4930 | 6.431 | 14.7 | 5.4159 | 5 | 287 | 19.6 | 393.68 | 5.08 | 24.6 |
0.30347 | 0.0 | 7.38 | 0 | 0.4930 | 6.312 | 28.9 | 5.4159 | 5 | 287 | 19.6 | 396.90 | 6.15 | 23.0 |
0.24103 | 0.0 | 7.38 | 0 | 0.4930 | 6.083 | 43.7 | 5.4159 | 5 | 287 | 19.6 | 396.90 | 12.79 | 22.2 |
0.06617 | 0.0 | 3.24 | 0 | 0.4600 | 5.868 | 25.8 | 5.2146 | 4 | 430 | 16.9 | 382.44 | 9.97 | 19.3 |
0.06724 | 0.0 | 3.24 | 0 | 0.4600 | 6.333 | 17.2 | 5.2146 | 4 | 430 | 16.9 | 375.21 | 7.34 | 22.6 |
0.04544 | 0.0 | 3.24 | 0 | 0.4600 | 6.144 | 32.2 | 5.8736 | 4 | 430 | 16.9 | 368.57 | 9.09 | 19.8 |
0.05023 | 35.0 | 6.06 | 0 | 0.4379 | 5.706 | 28.4 | 6.6407 | 1 | 304 | 16.9 | 394.02 | 12.43 | 17.1 |
0.03466 | 35.0 | 6.06 | 0 | 0.4379 | 6.031 | 23.3 | 6.6407 | 1 | 304 | 16.9 | 362.25 | 7.83 | 19.4 |
0.05083 | 0.0 | 5.19 | 0 | 0.5150 | 6.316 | 38.1 | 6.4584 | 5 | 224 | 20.2 | 389.71 | 5.68 | 22.2 |
0.03738 | 0.0 | 5.19 | 0 | 0.5150 | 6.310 | 38.5 | 6.4584 | 5 | 224 | 20.2 | 389.40 | 6.75 | 20.7 |
0.03961 | 0.0 | 5.19 | 0 | 0.5150 | 6.037 | 34.5 | 5.9853 | 5 | 224 | 20.2 | 396.90 | 8.01 | 21.1 |
0.03427 | 0.0 | 5.19 | 0 | 0.5150 | 5.869 | 46.3 | 5.2311 | 5 | 224 | 20.2 | 396.90 | 9.80 | 19.5 |
0.03041 | 0.0 | 5.19 | 0 | 0.5150 | 5.895 | 59.6 | 5.6150 | 5 | 224 | 20.2 | 394.81 | 10.56 | 18.5 |
0.03306 | 0.0 | 5.19 | 0 | 0.5150 | 6.059 | 37.3 | 4.8122 | 5 | 224 | 20.2 | 396.14 | 8.51 | 20.6 |
0.05497 | 0.0 | 5.19 | 0 | 0.5150 | 5.985 | 45.4 | 4.8122 | 5 | 224 | 20.2 | 396.90 | 9.74 | 19.0 |
0.06151 | 0.0 | 5.19 | 0 | 0.5150 | 5.968 | 58.5 | 4.8122 | 5 | 224 | 20.2 | 396.90 | 9.29 | 18.7 |
0.01301 | 35.0 | 1.52 | 0 | 0.4420 | 7.241 | 49.3 | 7.0379 | 1 | 284 | 15.5 | 394.74 | 5.49 | 32.7 |
0.02498 | 0.0 | 1.89 | 0 | 0.5180 | 6.540 | 59.7 | 6.2669 | 1 | 422 | 15.9 | 389.96 | 8.65 | 16.5 |
0.02543 | 55.0 | 3.78 | 0 | 0.4840 | 6.696 | 56.4 | 5.7321 | 5 | 370 | 17.6 | 396.90 | 7.18 | 23.9 |
0.03049 | 55.0 | 3.78 | 0 | 0.4840 | 6.874 | 28.1 | 6.4654 | 5 | 370 | 17.6 | 387.97 | 4.61 | 31.2 |
0.03113 | 0.0 | 4.39 | 0 | 0.4420 | 6.014 | 48.5 | 8.0136 | 3 | 352 | 18.8 | 385.64 | 10.53 | 17.5 |
0.06162 | 0.0 | 4.39 | 0 | 0.4420 | 5.898 | 52.3 | 8.0136 | 3 | 352 | 18.8 | 364.61 | 12.67 | 17.2 |
0.01870 | 85.0 | 4.15 | 0 | 0.4290 | 6.516 | 27.7 | 8.5353 | 4 | 351 | 17.9 | 392.43 | 6.36 | 23.1 |
0.01501 | 80.0 | 2.01 | 0 | 0.4350 | 6.635 | 29.7 | 8.3440 | 4 | 280 | 17.0 | 390.94 | 5.99 | 24.5 |
0.02899 | 40.0 | 1.25 | 0 | 0.4290 | 6.939 | 34.5 | 8.7921 | 1 | 335 | 19.7 | 389.85 | 5.89 | 26.6 |
0.06211 | 40.0 | 1.25 | 0 | 0.4290 | 6.490 | 44.4 | 8.7921 | 1 | 335 | 19.7 | 396.90 | 5.98 | 22.9 |
0.07950 | 60.0 | 1.69 | 0 | 0.4110 | 6.579 | 35.9 | 10.7103 | 4 | 411 | 18.3 | 370.78 | 5.49 | 24.1 |
0.07244 | 60.0 | 1.69 | 0 | 0.4110 | 5.884 | 18.5 | 10.7103 | 4 | 411 | 18.3 | 392.33 | 7.79 | 18.6 |
0.01709 | 90.0 | 2.02 | 0 | 0.4100 | 6.728 | 36.1 | 12.1265 | 5 | 187 | 17.0 | 384.46 | 4.50 | 30.1 |
0.04301 | 80.0 | 1.91 | 0 | 0.4130 | 5.663 | 21.9 | 10.5857 | 4 | 334 | 22.0 | 382.80 | 8.05 | 18.2 |
0.10659 | 80.0 | 1.91 | 0 | 0.4130 | 5.936 | 19.5 | 10.5857 | 4 | 334 | 22.0 | 376.04 | 5.57 | 20.6 |
8.98296 | 0.0 | 18.10 | 1 | 0.7700 | 6.212 | 97.4 | 2.1222 | 24 | 666 | 20.2 | 377.73 | 17.60 | 17.8 |
3.84970 | 0.0 | 18.10 | 1 | 0.7700 | 6.395 | 91.0 | 2.5052 | 24 | 666 | 20.2 | 391.34 | 13.27 | 21.7 |
5.20177 | 0.0 | 18.10 | 1 | 0.7700 | 6.127 | 83.4 | 2.7227 | 24 | 666 | 20.2 | 395.43 | 11.48 | 22.7 |
4.26131 | 0.0 | 18.10 | 0 | 0.7700 | 6.112 | 81.3 | 2.5091 | 24 | 666 | 20.2 | 390.74 | 12.67 | 22.6 |
4.54192 | 0.0 | 18.10 | 0 | 0.7700 | 6.398 | 88.0 | 2.5182 | 24 | 666 | 20.2 | 374.56 | 7.79 | 25.0 |
3.83684 | 0.0 | 18.10 | 0 | 0.7700 | 6.251 | 91.1 | 2.2955 | 24 | 666 | 20.2 | 350.65 | 14.19 | 19.9 |
3.67822 | 0.0 | 18.10 | 0 | 0.7700 | 5.362 | 96.2 | 2.1036 | 24 | 666 | 20.2 | 380.79 | 10.19 | 20.8 |
4.22239 | 0.0 | 18.10 | 1 | 0.7700 | 5.803 | 89.0 | 1.9047 | 24 | 666 | 20.2 | 353.04 | 14.64 | 16.8 |
3.47428 | 0.0 | 18.10 | 1 | 0.7180 | 8.780 | 82.9 | 1.9047 | 24 | 666 | 20.2 | 354.55 | 5.29 | 21.9 |
4.55587 | 0.0 | 18.10 | 0 | 0.7180 | 3.561 | 87.9 | 1.6132 | 24 | 666 | 20.2 | 354.70 | 7.12 | 27.5 |
3.69695 | 0.0 | 18.10 | 0 | 0.7180 | 4.963 | 91.4 | 1.7523 | 24 | 666 | 20.2 | 316.03 | 14.00 | 21.9 |
13.52220 | 0.0 | 18.10 | 0 | 0.6310 | 3.863 | 100.0 | 1.5106 | 24 | 666 | 20.2 | 131.42 | 13.33 | 23.1 |
4.89822 | 0.0 | 18.10 | 0 | 0.6310 | 4.970 | 100.0 | 1.3325 | 24 | 666 | 20.2 | 375.52 | 3.26 | 50.0 |
5.66998 | 0.0 | 18.10 | 1 | 0.6310 | 6.683 | 96.8 | 1.3567 | 24 | 666 | 20.2 | 375.33 | 3.73 | 50.0 |
6.53876 | 0.0 | 18.10 | 1 | 0.6310 | 7.016 | 97.5 | 1.2024 | 24 | 666 | 20.2 | 392.05 | 2.96 | 50.0 |
9.23230 | 0.0 | 18.10 | 0 | 0.6310 | 6.216 | 100.0 | 1.1691 | 24 | 666 | 20.2 | 366.15 | 9.53 | 50.0 |
8.26725 | 0.0 | 18.10 | 1 | 0.6680 | 5.875 | 89.6 | 1.1296 | 24 | 666 | 20.2 | 347.88 | 8.88 | 50.0 |
11.10810 | 0.0 | 18.10 | 0 | 0.6680 | 4.906 | 100.0 | 1.1742 | 24 | 666 | 20.2 | 396.90 | 34.77 | 13.8 |
18.49820 | 0.0 | 18.10 | 0 | 0.6680 | 4.138 | 100.0 | 1.1370 | 24 | 666 | 20.2 | 396.90 | 37.97 | 13.8 |
19.60910 | 0.0 | 18.10 | 0 | 0.6710 | 7.313 | 97.9 | 1.3163 | 24 | 666 | 20.2 | 396.90 | 13.44 | 15.0 |
15.28800 | 0.0 | 18.10 | 0 | 0.6710 | 6.649 | 93.3 | 1.3449 | 24 | 666 | 20.2 | 363.02 | 23.24 | 13.9 |
9.82349 | 0.0 | 18.10 | 0 | 0.6710 | 6.794 | 98.8 | 1.3580 | 24 | 666 | 20.2 | 396.90 | 21.24 | 13.3 |
23.64820 | 0.0 | 18.10 | 0 | 0.6710 | 6.380 | 96.2 | 1.3861 | 24 | 666 | 20.2 | 396.90 | 23.69 | 13.1 |
17.86670 | 0.0 | 18.10 | 0 | 0.6710 | 6.223 | 100.0 | 1.3861 | 24 | 666 | 20.2 | 393.74 | 21.78 | 10.2 |
88.97620 | 0.0 | 18.10 | 0 | 0.6710 | 6.968 | 91.9 | 1.4165 | 24 | 666 | 20.2 | 396.90 | 17.21 | 10.4 |
15.87440 | 0.0 | 18.10 | 0 | 0.6710 | 6.545 | 99.1 | 1.5192 | 24 | 666 | 20.2 | 396.90 | 21.08 | 10.9 |
9.18702 | 0.0 | 18.10 | 0 | 0.7000 | 5.536 | 100.0 | 1.5804 | 24 | 666 | 20.2 | 396.90 | 23.60 | 11.3 |
7.99248 | 0.0 | 18.10 | 0 | 0.7000 | 5.520 | 100.0 | 1.5331 | 24 | 666 | 20.2 | 396.90 | 24.56 | 12.3 |
20.08490 | 0.0 | 18.10 | 0 | 0.7000 | 4.368 | 91.2 | 1.4395 | 24 | 666 | 20.2 | 285.83 | 30.63 | 8.8 |
16.81180 | 0.0 | 18.10 | 0 | 0.7000 | 5.277 | 98.1 | 1.4261 | 24 | 666 | 20.2 | 396.90 | 30.81 | 7.2 |
24.39380 | 0.0 | 18.10 | 0 | 0.7000 | 4.652 | 100.0 | 1.4672 | 24 | 666 | 20.2 | 396.90 | 28.28 | 10.5 |
22.59710 | 0.0 | 18.10 | 0 | 0.7000 | 5.000 | 89.5 | 1.5184 | 24 | 666 | 20.2 | 396.90 | 31.99 | 7.4 |
14.33370 | 0.0 | 18.10 | 0 | 0.7000 | 4.880 | 100.0 | 1.5895 | 24 | 666 | 20.2 | 372.92 | 30.62 | 10.2 |
8.15174 | 0.0 | 18.10 | 0 | 0.7000 | 5.390 | 98.9 | 1.7281 | 24 | 666 | 20.2 | 396.90 | 20.85 | 11.5 |
6.96215 | 0.0 | 18.10 | 0 | 0.7000 | 5.713 | 97.0 | 1.9265 | 24 | 666 | 20.2 | 394.43 | 17.11 | 15.1 |
5.29305 | 0.0 | 18.10 | 0 | 0.7000 | 6.051 | 82.5 | 2.1678 | 24 | 666 | 20.2 | 378.38 | 18.76 | 23.2 |
11.57790 | 0.0 | 18.10 | 0 | 0.7000 | 5.036 | 97.0 | 1.7700 | 24 | 666 | 20.2 | 396.90 | 25.68 | 9.7 |
8.64476 | 0.0 | 18.10 | 0 | 0.6930 | 6.193 | 92.6 | 1.7912 | 24 | 666 | 20.2 | 396.90 | 15.17 | 13.8 |
13.35980 | 0.0 | 18.10 | 0 | 0.6930 | 5.887 | 94.7 | 1.7821 | 24 | 666 | 20.2 | 396.90 | 16.35 | 12.7 |
8.71675 | 0.0 | 18.10 | 0 | 0.6930 | 6.471 | 98.8 | 1.7257 | 24 | 666 | 20.2 | 391.98 | 17.12 | 13.1 |
5.87205 | 0.0 | 18.10 | 0 | 0.6930 | 6.405 | 96.0 | 1.6768 | 24 | 666 | 20.2 | 396.90 | 19.37 | 12.5 |
7.67202 | 0.0 | 18.10 | 0 | 0.6930 | 5.747 | 98.9 | 1.6334 | 24 | 666 | 20.2 | 393.10 | 19.92 | 8.5 |
38.35180 | 0.0 | 18.10 | 0 | 0.6930 | 5.453 | 100.0 | 1.4896 | 24 | 666 | 20.2 | 396.90 | 30.59 | 5.0 |
9.91655 | 0.0 | 18.10 | 0 | 0.6930 | 5.852 | 77.8 | 1.5004 | 24 | 666 | 20.2 | 338.16 | 29.97 | 6.3 |
25.04610 | 0.0 | 18.10 | 0 | 0.6930 | 5.987 | 100.0 | 1.5888 | 24 | 666 | 20.2 | 396.90 | 26.77 | 5.6 |
14.23620 | 0.0 | 18.10 | 0 | 0.6930 | 6.343 | 100.0 | 1.5741 | 24 | 666 | 20.2 | 396.90 | 20.32 | 7.2 |
9.59571 | 0.0 | 18.10 | 0 | 0.6930 | 6.404 | 100.0 | 1.6390 | 24 | 666 | 20.2 | 376.11 | 20.31 | 12.1 |
24.80170 | 0.0 | 18.10 | 0 | 0.6930 | 5.349 | 96.0 | 1.7028 | 24 | 666 | 20.2 | 396.90 | 19.77 | 8.3 |
41.52920 | 0.0 | 18.10 | 0 | 0.6930 | 5.531 | 85.4 | 1.6074 | 24 | 666 | 20.2 | 329.46 | 27.38 | 8.5 |
67.92080 | 0.0 | 18.10 | 0 | 0.6930 | 5.683 | 100.0 | 1.4254 | 24 | 666 | 20.2 | 384.97 | 22.98 | 5.0 |
20.71620 | 0.0 | 18.10 | 0 | 0.6590 | 4.138 | 100.0 | 1.1781 | 24 | 666 | 20.2 | 370.22 | 23.34 | 11.9 |
11.95110 | 0.0 | 18.10 | 0 | 0.6590 | 5.608 | 100.0 | 1.2852 | 24 | 666 | 20.2 | 332.09 | 12.13 | 27.9 |
7.40389 | 0.0 | 18.10 | 0 | 0.5970 | 5.617 | 97.9 | 1.4547 | 24 | 666 | 20.2 | 314.64 | 26.40 | 17.2 |
14.43830 | 0.0 | 18.10 | 0 | 0.5970 | 6.852 | 100.0 | 1.4655 | 24 | 666 | 20.2 | 179.36 | 19.78 | 27.5 |
51.13580 | 0.0 | 18.10 | 0 | 0.5970 | 5.757 | 100.0 | 1.4130 | 24 | 666 | 20.2 | 2.60 | 10.11 | 15.0 |
14.05070 | 0.0 | 18.10 | 0 | 0.5970 | 6.657 | 100.0 | 1.5275 | 24 | 666 | 20.2 | 35.05 | 21.22 | 17.2 |
18.81100 | 0.0 | 18.10 | 0 | 0.5970 | 4.628 | 100.0 | 1.5539 | 24 | 666 | 20.2 | 28.79 | 34.37 | 17.9 |
28.65580 | 0.0 | 18.10 | 0 | 0.5970 | 5.155 | 100.0 | 1.5894 | 24 | 666 | 20.2 | 210.97 | 20.08 | 16.3 |
45.74610 | 0.0 | 18.10 | 0 | 0.6930 | 4.519 | 100.0 | 1.6582 | 24 | 666 | 20.2 | 88.27 | 36.98 | 7.0 |
18.08460 | 0.0 | 18.10 | 0 | 0.6790 | 6.434 | 100.0 | 1.8347 | 24 | 666 | 20.2 | 27.25 | 29.05 | 7.2 |
10.83420 | 0.0 | 18.10 | 0 | 0.6790 | 6.782 | 90.8 | 1.8195 | 24 | 666 | 20.2 | 21.57 | 25.79 | 7.5 |
25.94060 | 0.0 | 18.10 | 0 | 0.6790 | 5.304 | 89.1 | 1.6475 | 24 | 666 | 20.2 | 127.36 | 26.64 | 10.4 |
73.53410 | 0.0 | 18.10 | 0 | 0.6790 | 5.957 | 100.0 | 1.8026 | 24 | 666 | 20.2 | 16.45 | 20.62 | 8.8 |
11.81230 | 0.0 | 18.10 | 0 | 0.7180 | 6.824 | 76.5 | 1.7940 | 24 | 666 | 20.2 | 48.45 | 22.74 | 8.4 |
11.08740 | 0.0 | 18.10 | 0 | 0.7180 | 6.411 | 100.0 | 1.8589 | 24 | 666 | 20.2 | 318.75 | 15.02 | 16.7 |
7.02259 | 0.0 | 18.10 | 0 | 0.7180 | 6.006 | 95.3 | 1.8746 | 24 | 666 | 20.2 | 319.98 | 15.70 | 14.2 |
12.04820 | 0.0 | 18.10 | 0 | 0.6140 | 5.648 | 87.6 | 1.9512 | 24 | 666 | 20.2 | 291.55 | 14.10 | 20.8 |
7.05042 | 0.0 | 18.10 | 0 | 0.6140 | 6.103 | 85.1 | 2.0218 | 24 | 666 | 20.2 | 2.52 | 23.29 | 13.4 |
8.79212 | 0.0 | 18.10 | 0 | 0.5840 | 5.565 | 70.6 | 2.0635 | 24 | 666 | 20.2 | 3.65 | 17.16 | 11.7 |
15.86030 | 0.0 | 18.10 | 0 | 0.6790 | 5.896 | 95.4 | 1.9096 | 24 | 666 | 20.2 | 7.68 | 24.39 | 8.3 |
12.24720 | 0.0 | 18.10 | 0 | 0.5840 | 5.837 | 59.7 | 1.9976 | 24 | 666 | 20.2 | 24.65 | 15.69 | 10.2 |
37.66190 | 0.0 | 18.10 | 0 | 0.6790 | 6.202 | 78.7 | 1.8629 | 24 | 666 | 20.2 | 18.82 | 14.52 | 10.9 |
7.36711 | 0.0 | 18.10 | 0 | 0.6790 | 6.193 | 78.1 | 1.9356 | 24 | 666 | 20.2 | 96.73 | 21.52 | 11.0 |
9.33889 | 0.0 | 18.10 | 0 | 0.6790 | 6.380 | 95.6 | 1.9682 | 24 | 666 | 20.2 | 60.72 | 24.08 | 9.5 |
8.49213 | 0.0 | 18.10 | 0 | 0.5840 | 6.348 | 86.1 | 2.0527 | 24 | 666 | 20.2 | 83.45 | 17.64 | 14.5 |
10.06230 | 0.0 | 18.10 | 0 | 0.5840 | 6.833 | 94.3 | 2.0882 | 24 | 666 | 20.2 | 81.33 | 19.69 | 14.1 |
6.44405 | 0.0 | 18.10 | 0 | 0.5840 | 6.425 | 74.8 | 2.2004 | 24 | 666 | 20.2 | 97.95 | 12.03 | 16.1 |
5.58107 | 0.0 | 18.10 | 0 | 0.7130 | 6.436 | 87.9 | 2.3158 | 24 | 666 | 20.2 | 100.19 | 16.22 | 14.3 |
13.91340 | 0.0 | 18.10 | 0 | 0.7130 | 6.208 | 95.0 | 2.2222 | 24 | 666 | 20.2 | 100.63 | 15.17 | 11.7 |
11.16040 | 0.0 | 18.10 | 0 | 0.7400 | 6.629 | 94.6 | 2.1247 | 24 | 666 | 20.2 | 109.85 | 23.27 | 13.4 |
14.42080 | 0.0 | 18.10 | 0 | 0.7400 | 6.461 | 93.3 | 2.0026 | 24 | 666 | 20.2 | 27.49 | 18.05 | 9.6 |
15.17720 | 0.0 | 18.10 | 0 | 0.7400 | 6.152 | 100.0 | 1.9142 | 24 | 666 | 20.2 | 9.32 | 26.45 | 8.7 |
13.67810 | 0.0 | 18.10 | 0 | 0.7400 | 5.935 | 87.9 | 1.8206 | 24 | 666 | 20.2 | 68.95 | 34.02 | 8.4 |
9.39063 | 0.0 | 18.10 | 0 | 0.7400 | 5.627 | 93.9 | 1.8172 | 24 | 666 | 20.2 | 396.90 | 22.88 | 12.8 |
22.05110 | 0.0 | 18.10 | 0 | 0.7400 | 5.818 | 92.4 | 1.8662 | 24 | 666 | 20.2 | 391.45 | 22.11 | 10.5 |
9.72418 | 0.0 | 18.10 | 0 | 0.7400 | 6.406 | 97.2 | 2.0651 | 24 | 666 | 20.2 | 385.96 | 19.52 | 17.1 |
5.66637 | 0.0 | 18.10 | 0 | 0.7400 | 6.219 | 100.0 | 2.0048 | 24 | 666 | 20.2 | 395.69 | 16.59 | 18.4 |
9.96654 | 0.0 | 18.10 | 0 | 0.7400 | 6.485 | 100.0 | 1.9784 | 24 | 666 | 20.2 | 386.73 | 18.85 | 15.4 |
12.80230 | 0.0 | 18.10 | 0 | 0.7400 | 5.854 | 96.6 | 1.8956 | 24 | 666 | 20.2 | 240.52 | 23.79 | 10.8 |
10.67180 | 0.0 | 18.10 | 0 | 0.7400 | 6.459 | 94.8 | 1.9879 | 24 | 666 | 20.2 | 43.06 | 23.98 | 11.8 |
6.28807 | 0.0 | 18.10 | 0 | 0.7400 | 6.341 | 96.4 | 2.0720 | 24 | 666 | 20.2 | 318.01 | 17.79 | 14.9 |
9.92485 | 0.0 | 18.10 | 0 | 0.7400 | 6.251 | 96.6 | 2.1980 | 24 | 666 | 20.2 | 388.52 | 16.44 | 12.6 |
9.32909 | 0.0 | 18.10 | 0 | 0.7130 | 6.185 | 98.7 | 2.2616 | 24 | 666 | 20.2 | 396.90 | 18.13 | 14.1 |
7.52601 | 0.0 | 18.10 | 0 | 0.7130 | 6.417 | 98.3 | 2.1850 | 24 | 666 | 20.2 | 304.21 | 19.31 | 13.0 |
6.71772 | 0.0 | 18.10 | 0 | 0.7130 | 6.749 | 92.6 | 2.3236 | 24 | 666 | 20.2 | 0.32 | 17.44 | 13.4 |
5.44114 | 0.0 | 18.10 | 0 | 0.7130 | 6.655 | 98.2 | 2.3552 | 24 | 666 | 20.2 | 355.29 | 17.73 | 15.2 |
5.09017 | 0.0 | 18.10 | 0 | 0.7130 | 6.297 | 91.8 | 2.3682 | 24 | 666 | 20.2 | 385.09 | 17.27 | 16.1 |
8.24809 | 0.0 | 18.10 | 0 | 0.7130 | 7.393 | 99.3 | 2.4527 | 24 | 666 | 20.2 | 375.87 | 16.74 | 17.8 |
9.51363 | 0.0 | 18.10 | 0 | 0.7130 | 6.728 | 94.1 | 2.4961 | 24 | 666 | 20.2 | 6.68 | 18.71 | 14.9 |
4.75237 | 0.0 | 18.10 | 0 | 0.7130 | 6.525 | 86.5 | 2.4358 | 24 | 666 | 20.2 | 50.92 | 18.13 | 14.1 |
4.66883 | 0.0 | 18.10 | 0 | 0.7130 | 5.976 | 87.9 | 2.5806 | 24 | 666 | 20.2 | 10.48 | 19.01 | 12.7 |
8.20058 | 0.0 | 18.10 | 0 | 0.7130 | 5.936 | 80.3 | 2.7792 | 24 | 666 | 20.2 | 3.50 | 16.94 | 13.5 |
7.75223 | 0.0 | 18.10 | 0 | 0.7130 | 6.301 | 83.7 | 2.7831 | 24 | 666 | 20.2 | 272.21 | 16.23 | 14.9 |
6.80117 | 0.0 | 18.10 | 0 | 0.7130 | 6.081 | 84.4 | 2.7175 | 24 | 666 | 20.2 | 396.90 | 14.70 | 20.0 |
4.81213 | 0.0 | 18.10 | 0 | 0.7130 | 6.701 | 90.0 | 2.5975 | 24 | 666 | 20.2 | 255.23 | 16.42 | 16.4 |
3.69311 | 0.0 | 18.10 | 0 | 0.7130 | 6.376 | 88.4 | 2.5671 | 24 | 666 | 20.2 | 391.43 | 14.65 | 17.7 |
6.65492 | 0.0 | 18.10 | 0 | 0.7130 | 6.317 | 83.0 | 2.7344 | 24 | 666 | 20.2 | 396.90 | 13.99 | 19.5 |
5.82115 | 0.0 | 18.10 | 0 | 0.7130 | 6.513 | 89.9 | 2.8016 | 24 | 666 | 20.2 | 393.82 | 10.29 | 20.2 |
7.83932 | 0.0 | 18.10 | 0 | 0.6550 | 6.209 | 65.4 | 2.9634 | 24 | 666 | 20.2 | 396.90 | 13.22 | 21.4 |
3.16360 | 0.0 | 18.10 | 0 | 0.6550 | 5.759 | 48.2 | 3.0665 | 24 | 666 | 20.2 | 334.40 | 14.13 | 19.9 |
3.77498 | 0.0 | 18.10 | 0 | 0.6550 | 5.952 | 84.7 | 2.8715 | 24 | 666 | 20.2 | 22.01 | 17.15 | 19.0 |
4.42228 | 0.0 | 18.10 | 0 | 0.5840 | 6.003 | 94.5 | 2.5403 | 24 | 666 | 20.2 | 331.29 | 21.32 | 19.1 |
15.57570 | 0.0 | 18.10 | 0 | 0.5800 | 5.926 | 71.0 | 2.9084 | 24 | 666 | 20.2 | 368.74 | 18.13 | 19.1 |
13.07510 | 0.0 | 18.10 | 0 | 0.5800 | 5.713 | 56.7 | 2.8237 | 24 | 666 | 20.2 | 396.90 | 14.76 | 20.1 |
4.34879 | 0.0 | 18.10 | 0 | 0.5800 | 6.167 | 84.0 | 3.0334 | 24 | 666 | 20.2 | 396.90 | 16.29 | 19.9 |
4.03841 | 0.0 | 18.10 | 0 | 0.5320 | 6.229 | 90.7 | 3.0993 | 24 | 666 | 20.2 | 395.33 | 12.87 | 19.6 |
3.56868 | 0.0 | 18.10 | 0 | 0.5800 | 6.437 | 75.0 | 2.8965 | 24 | 666 | 20.2 | 393.37 | 14.36 | 23.2 |
4.64689 | 0.0 | 18.10 | 0 | 0.6140 | 6.980 | 67.6 | 2.5329 | 24 | 666 | 20.2 | 374.68 | 11.66 | 29.8 |
8.05579 | 0.0 | 18.10 | 0 | 0.5840 | 5.427 | 95.4 | 2.4298 | 24 | 666 | 20.2 | 352.58 | 18.14 | 13.8 |
6.39312 | 0.0 | 18.10 | 0 | 0.5840 | 6.162 | 97.4 | 2.2060 | 24 | 666 | 20.2 | 302.76 | 24.10 | 13.3 |
4.87141 | 0.0 | 18.10 | 0 | 0.6140 | 6.484 | 93.6 | 2.3053 | 24 | 666 | 20.2 | 396.21 | 18.68 | 16.7 |
15.02340 | 0.0 | 18.10 | 0 | 0.6140 | 5.304 | 97.3 | 2.1007 | 24 | 666 | 20.2 | 349.48 | 24.91 | 12.0 |
10.23300 | 0.0 | 18.10 | 0 | 0.6140 | 6.185 | 96.7 | 2.1705 | 24 | 666 | 20.2 | 379.70 | 18.03 | 14.6 |
14.33370 | 0.0 | 18.10 | 0 | 0.6140 | 6.229 | 88.0 | 1.9512 | 24 | 666 | 20.2 | 383.32 | 13.11 | 21.4 |
5.82401 | 0.0 | 18.10 | 0 | 0.5320 | 6.242 | 64.7 | 3.4242 | 24 | 666 | 20.2 | 396.90 | 10.74 | 23.0 |
5.70818 | 0.0 | 18.10 | 0 | 0.5320 | 6.750 | 74.9 | 3.3317 | 24 | 666 | 20.2 | 393.07 | 7.74 | 23.7 |
5.73116 | 0.0 | 18.10 | 0 | 0.5320 | 7.061 | 77.0 | 3.4106 | 24 | 666 | 20.2 | 395.28 | 7.01 | 25.0 |
2.81838 | 0.0 | 18.10 | 0 | 0.5320 | 5.762 | 40.3 | 4.0983 | 24 | 666 | 20.2 | 392.92 | 10.42 | 21.8 |
2.37857 | 0.0 | 18.10 | 0 | 0.5830 | 5.871 | 41.9 | 3.7240 | 24 | 666 | 20.2 | 370.73 | 13.34 | 20.6 |
3.67367 | 0.0 | 18.10 | 0 | 0.5830 | 6.312 | 51.9 | 3.9917 | 24 | 666 | 20.2 | 388.62 | 10.58 | 21.2 |
5.69175 | 0.0 | 18.10 | 0 | 0.5830 | 6.114 | 79.8 | 3.5459 | 24 | 666 | 20.2 | 392.68 | 14.98 | 19.1 |
4.83567 | 0.0 | 18.10 | 0 | 0.5830 | 5.905 | 53.2 | 3.1523 | 24 | 666 | 20.2 | 388.22 | 11.45 | 20.6 |
0.15086 | 0.0 | 27.74 | 0 | 0.6090 | 5.454 | 92.7 | 1.8209 | 4 | 711 | 20.1 | 395.09 | 18.06 | 15.2 |
0.18337 | 0.0 | 27.74 | 0 | 0.6090 | 5.414 | 98.3 | 1.7554 | 4 | 711 | 20.1 | 344.05 | 23.97 | 7.0 |
0.20746 | 0.0 | 27.74 | 0 | 0.6090 | 5.093 | 98.0 | 1.8226 | 4 | 711 | 20.1 | 318.43 | 29.68 | 8.1 |
0.10574 | 0.0 | 27.74 | 0 | 0.6090 | 5.983 | 98.8 | 1.8681 | 4 | 711 | 20.1 | 390.11 | 18.07 | 13.6 |
0.11132 | 0.0 | 27.74 | 0 | 0.6090 | 5.983 | 83.5 | 2.1099 | 4 | 711 | 20.1 | 396.90 | 13.35 | 20.1 |
0.17331 | 0.0 | 9.69 | 0 | 0.5850 | 5.707 | 54.0 | 2.3817 | 6 | 391 | 19.2 | 396.90 | 12.01 | 21.8 |
0.27957 | 0.0 | 9.69 | 0 | 0.5850 | 5.926 | 42.6 | 2.3817 | 6 | 391 | 19.2 | 396.90 | 13.59 | 24.5 |
0.17899 | 0.0 | 9.69 | 0 | 0.5850 | 5.670 | 28.8 | 2.7986 | 6 | 391 | 19.2 | 393.29 | 17.60 | 23.1 |
0.28960 | 0.0 | 9.69 | 0 | 0.5850 | 5.390 | 72.9 | 2.7986 | 6 | 391 | 19.2 | 396.90 | 21.14 | 19.7 |
0.26838 | 0.0 | 9.69 | 0 | 0.5850 | 5.794 | 70.6 | 2.8927 | 6 | 391 | 19.2 | 396.90 | 14.10 | 18.3 |
0.23912 | 0.0 | 9.69 | 0 | 0.5850 | 6.019 | 65.3 | 2.4091 | 6 | 391 | 19.2 | 396.90 | 12.92 | 21.2 |
0.17783 | 0.0 | 9.69 | 0 | 0.5850 | 5.569 | 73.5 | 2.3999 | 6 | 391 | 19.2 | 395.77 | 15.10 | 17.5 |
0.22438 | 0.0 | 9.69 | 0 | 0.5850 | 6.027 | 79.7 | 2.4982 | 6 | 391 | 19.2 | 396.90 | 14.33 | 16.8 |
0.06263 | 0.0 | 11.93 | 0 | 0.5730 | 6.593 | 69.1 | 2.4786 | 1 | 273 | 21.0 | 391.99 | 9.67 | 22.4 |
0.04527 | 0.0 | 11.93 | 0 | 0.5730 | 6.120 | 76.7 | 2.2875 | 1 | 273 | 21.0 | 396.90 | 9.08 | 20.6 |
0.06076 | 0.0 | 11.93 | 0 | 0.5730 | 6.976 | 91.0 | 2.1675 | 1 | 273 | 21.0 | 396.90 | 5.64 | 23.9 |
0.10959 | 0.0 | 11.93 | 0 | 0.5730 | 6.794 | 89.3 | 2.3889 | 1 | 273 | 21.0 | 393.45 | 6.48 | 22.0 |
0.04741 | 0.0 | 11.93 | 0 | 0.5730 | 6.030 | 80.8 | 2.5050 | 1 | 273 | 21.0 | 396.90 | 7.88 | 11.9 |