The cumulative observed model quality of the top-ranked models produced by each method. The McGuffin group methods are indicated in bold. The ModFOLD servers did not participate in the TS category; however, the cumulative scores for their top-ranked models are shown for comparison with the server methods. The McGuffin results are from our manual TS predictions, which were made using a beta version of the ModFOLD v2.0 method to select models. Methods are ranked by the combined score, which is the mean of the cumulative TM-score, MaxSub and GDT-TS scores.
Method TM-score MaxSub GDT-TS Combined
Zhang-Server 34.686 28.759 30.208 31.218
ModFOLDclust 34.539 28.687 30.202 31.143
McGuffin 34.316 28.363 29.898 30.859
Phyre_de_novo 33.808 27.695 29.565 30.356
pro-sp3-TASSER 33.923 27.694 29.357 30.324
RAPTOR 33.327 27.392 28.930 29.883
FAMSD 33.442 27.249 28.821 29.837
HHpred4 33.446 27.180 28.850 29.825
HHpred5 33.238 27.207 28.831 29.759
METATASSER 33.289 27.370 28.595 29.752
MULTICOM-CLUSTER 33.078 27.288 28.792 29.719
HHpred2 33.269 26.833 28.649 29.584
MUProt 33.005 27.028 28.628 29.554
BAKER-ROBETTA 33.074 26.949 28.579 29.534
MULTICOM-REFINE 32.879 26.989 28.569 29.479
MUSTER 32.977 26.738 28.456 29.391
MULTICOM-RANK 32.662 26.748 28.405 29.272
Phyre2 32.812 26.662 28.270 29.248
Phragment 32.760 26.519 28.230 29.169
Pcons_multi 32.722 26.542 28.186 29.150
SAM-T08-server 32.614 26.650 28.164 29.143
Poing 32.706 26.452 28.169 29.109
circle 32.541 26.608 28.097 29.082
PS2-server 32.387 26.412 28.193 28.997
COMA-M 32.397 26.295 28.194 28.962
GS-KudlatyPred 32.196 26.290 28.002 28.829
MULTICOM-CMFR 32.338 26.197 27.953 28.829
FFASsuboptimal 32.229 26.244 27.901 28.791
COMA 32.241 26.017 27.651 28.636
ModFOLDv1_1 31.917 26.190 27.647 28.585
mGenTHREADER 31.706 26.183 27.739 28.542
3D-JIGSAW_V3 31.936 26.001 27.562 28.500
GeneSilicoMetaServer 31.822 25.883 27.537 28.414
FFASstandard 31.612 25.839 27.373 28.275
FALCON 31.976 25.521 27.312 28.270
3DShot2 31.962 25.593 27.022 28.192
GS-MetaServer2 31.385 25.808 27.289 28.161
3D-JIGSAW_AEP 31.401 25.766 27.266 28.144
nFOLD3 31.418 25.644 27.234 28.099
FFASflextemplate 31.378 25.493 27.067 27.979
FEIG 31.708 25.311 26.882 27.967
PSI 31.534 25.122 26.859 27.838
SAM-T06-server 31.368 24.902 26.932 27.734
CpHModels 30.898 25.153 26.591 27.547
BioSerf 30.160 24.261 26.075 26.832
fais-server 30.341 23.947 25.807 26.698
FALCON_CONSENSUS 30.578 23.676 25.608 26.621
keasar-server 29.782 24.052 25.827 26.554
3Dpro 29.870 24.116 25.561 26.516
SAM-T02-server 29.515 24.164 25.648 26.442
Pcons_dot_net 29.145 23.688 25.196 26.010
panther_server 29.223 23.725 25.018 25.989
pipe_int 28.952 23.477 25.006 25.812
Pcons_local 28.905 23.266 24.940 25.704
FUGUE_KM 28.758 23.044 24.699 25.500
forecast 28.674 22.573 24.280 25.176
rehtnap 26.582 21.092 22.600 23.424
MUFOLD-Server 26.982 19.995 22.304 23.094
ACOMPMOD 25.888 19.481 21.438 22.269
Frankenstein 25.162 19.795 21.464 22.140
Distill 26.033 19.244 20.743 22.007
LEE-SERVER 23.712 19.957 20.924 21.531
Fiser-M4T 23.195 19.663 20.615 21.158
Pushchino 23.483 19.279 20.703 21.155
LOOPP_Server 22.686 17.353 19.009 19.683
YASARA 21.093 18.248 18.950 19.431
FOLDpro 20.821 14.316 16.635 17.258
mariner1 19.666 13.551 15.465 16.227
OLGAFS 17.599 13.880 15.273 15.584
MUFOLD-MD 15.353 9.419 11.643 12.138
huber-torda-server 12.429 9.848 10.852 11.043
RBO-Proteus 11.564 6.717 8.700 8.994
schenk-torda-server 7.090 3.017 4.557 4.888
mahmood-torda-server 3.827 1.862 2.887 2.858
Jiang_Zhu 2.110 1.656 1.845 1.870
LEE 0.849 0.769 0.799 0.805
DistillSN 1.023 0.330 0.650 0.668
HCA 0.413 0.246 0.351 0.337
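As an illustrative check of the combined score defined in the caption, assuming it is the simple arithmetic mean of the three cumulative scores listed for each method, the top-ranked entry gives:

\[
\text{Combined}_{\text{Zhang-Server}} = \frac{34.686 + 28.759 + 30.208}{3} = \frac{93.653}{3} \approx 31.218
\]

This agrees with the tabulated combined score for Zhang-Server, and the same calculation reproduces the remaining rows.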