Showing 36,641 - 36,660 results of 37,890 for search '"forestal"', query time: 0.34s
  1. 36641
    “…Moreover, 36, 44, and 8 SNPs were selected as the minimum numbers of markers by the AdaBoost (AB), Random Forest (RF), and Decision Tree (DT) machine learning classification models, which had accuracy rates of 99.6%, 98.0%, and 97.9%, respectively. …”
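    The snippet above describes choosing minimal SNP marker sets with AdaBoost, Random Forest, and Decision Tree classifiers and reporting their accuracies. A minimal sketch of that kind of three-model accuracy comparison in scikit-learn follows; the synthetic genotype-like data and default settings are illustrative assumptions, not the study's pipeline.

      # Compare the three classifier families named in the abstract on a
      # synthetic, SNP-like matrix (rows = samples, columns = markers).
      from sklearn.datasets import make_classification
      from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      X, y = make_classification(n_samples=500, n_features=200, n_informative=40,
                                 random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      models = {
          "AdaBoost (AB)": AdaBoostClassifier(random_state=0),
          "Random Forest (RF)": RandomForestClassifier(random_state=0),
          "Decision Tree (DT)": DecisionTreeClassifier(random_state=0),
      }
      for name, model in models.items():
          model.fit(X_tr, y_tr)
          print(name, "accuracy:", round(accuracy_score(y_te, model.predict(X_te)), 3))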
  2. 36642
    “…Multiple risk prediction models were developed using parametric models (elastic net, least absolute shrinkage and selection operator, and ridge regression) and nonparametric models (random forest and gradient boosting). The models were assessed using holdout data with area under the receiver operating characteristic curve (AUROC), percentage of calibration, and calibration curve belts. …”
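    The snippet above names penalized regressions (elastic net, lasso, ridge) and tree ensembles assessed on holdout data with AUROC and calibration. A minimal sketch of that holdout check with scikit-learn follows; the data, solvers, and bin count are illustrative assumptions.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score
      from sklearn.calibration import calibration_curve

      X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
      X_tr, X_ho, y_tr, y_ho = train_test_split(X, y, test_size=0.25, random_state=0)

      models = {
          "elastic net LR": LogisticRegression(penalty="elasticnet", solver="saga",
                                               l1_ratio=0.5, max_iter=5000),
          "lasso LR": LogisticRegression(penalty="l1", solver="liblinear"),
          "ridge LR": LogisticRegression(penalty="l2", max_iter=5000),
          "random forest": RandomForestClassifier(random_state=0),
          "gradient boosting": GradientBoostingClassifier(random_state=0),
      }
      for name, model in models.items():
          prob = model.fit(X_tr, y_tr).predict_proba(X_ho)[:, 1]
          frac_pos, mean_pred = calibration_curve(y_ho, prob, n_bins=10)
          print(name,
                "AUROC:", round(roc_auc_score(y_ho, prob), 3),
                "mean calibration gap:", round(np.mean(np.abs(frac_pos - mean_pred)), 3))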
  3. 36643
    “…METHODS: We employed multiple machine learning prediction algorithms (least absolute shrinkage and selection operator, random forest, deep neural network, and support vector machine) to assess factors associated with ceasing opioid use in a sample of 1,192 African Americans (AAs) and 2,557 individuals of European ancestry (EAs) who met Diagnostic and Statistical Manual of Mental Disorders, 5th Edition criteria for OUD. …”
  4. 36644
    “…In the TCGA-GBM data, modelBuildR allowed the best prognostic separation of patients, with the highest median overall survival difference (7.51 months), followed by a difference of 6.04 months for a random forest-based method. CONCLUSIONS: The proposed heuristic is beneficial for the retrieval of features associated with two true groups classified with errors. …”
  5. 36645
    “…Then, logistic regression (LR) with LASSO (least absolute shrinkage and selection operator) regularization, support vector machine (SVM), random forest (RF), neural network (NN), and k-nearest neighbor (kNN) were used as classification algorithms. …”
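    The snippet above lists five classification algorithms, including logistic regression with LASSO regularization. A minimal cross-validated sketch of that classifier suite follows; the scaling step, 5-fold CV, AUC scoring, and synthetic data are illustrative assumptions.

      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.svm import SVC
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.neural_network import MLPClassifier
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import cross_val_score

      X, y = make_classification(n_samples=600, n_features=40, random_state=0)

      classifiers = {
          "LASSO-LR": LogisticRegression(penalty="l1", solver="liblinear"),
          "SVM": SVC(),
          "RF": RandomForestClassifier(random_state=0),
          "NN": MLPClassifier(max_iter=2000, random_state=0),
          "kNN": KNeighborsClassifier(),
      }
      for name, clf in classifiers.items():
          pipe = make_pipeline(StandardScaler(), clf)  # scaling matters for SVM/NN/kNN
          scores = cross_val_score(pipe, X, y, cv=5, scoring="roc_auc")
          print(name, "mean AUC:", round(scores.mean(), 3))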
  6. 36646
  7. 36647
    “…To address variation of the patterns, a flexible and robust machine learning workflow was set up, based on random forest classifiers, and comprising three steps: variable selection, parameter optimization, and classification. …”
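    The snippet above outlines a three-step random-forest workflow: variable selection, parameter optimization, and classification. A minimal scikit-learn sketch of those three steps chained in a pipeline follows; the selection method, parameter grid, and data are illustrative assumptions.

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_selection import SelectFromModel
      from sklearn.model_selection import GridSearchCV, train_test_split
      from sklearn.pipeline import Pipeline

      X, y = make_classification(n_samples=800, n_features=100, n_informative=15,
                                 random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      pipe = Pipeline([
          # Step 1: variable selection by importance-based filtering.
          ("select", SelectFromModel(RandomForestClassifier(random_state=0))),
          # Step 3: the final random forest classifier.
          ("rf", RandomForestClassifier(random_state=0)),
      ])
      # Step 2: parameter optimization over a small illustrative grid.
      grid = GridSearchCV(pipe, {"rf__n_estimators": [200, 500],
                                 "rf__max_features": ["sqrt", 0.3]}, cv=5)
      grid.fit(X_tr, y_tr)
      print("best parameters:", grid.best_params_)
      print("holdout accuracy:", round(grid.score(X_te, y_te), 3))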
  8. 36648
    “…The diagnostic performance of the nearest neighbor algorithm (NNA) and support vector machine (SVM) was better than that of random forest (RF) in the training cohort. The AUC, sensitivity, and specificity of NNA were 0.872 (95% CI: 0.750-0.994), 0.967, and 0.778, respectively. …”
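    The snippet above reports AUC, sensitivity, and specificity for a nearest neighbor classifier. A minimal sketch of computing those three quantities follows; the data and the 0.5 threshold are illustrative assumptions, and the abstract's 95% confidence intervals would additionally need bootstrapping.

      from sklearn.datasets import make_classification
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score, confusion_matrix

      X, y = make_classification(n_samples=400, n_features=20, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0,
                                                stratify=y)

      nna = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
      prob = nna.predict_proba(X_te)[:, 1]
      pred = (prob >= 0.5).astype(int)

      tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
      print("AUC        :", round(roc_auc_score(y_te, prob), 3))
      print("sensitivity:", round(tp / (tp + fn), 3))  # true-positive rate
      print("specificity:", round(tn / (tn + fp), 3))  # true-negative rate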
  9. 36649
    “…Features were selected in the training cohorts using recursive feature elimination with repeated 5-fold cross-validation, followed by the development of random forest models. The performance of the models was assessed using the area under the curve (AUC). …”
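    The snippet above describes recursive feature elimination with repeated 5-fold cross-validation followed by random forest models scored by AUC. A minimal scikit-learn sketch follows; the number of repeats, the elimination step size, and the data are illustrative assumptions.

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_selection import RFECV
      from sklearn.model_selection import RepeatedStratifiedKFold, train_test_split
      from sklearn.metrics import roc_auc_score

      X, y = make_classification(n_samples=500, n_features=60, n_informative=10,
                                 random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=2, random_state=0)
      selector = RFECV(RandomForestClassifier(random_state=0), step=5, cv=cv,
                       scoring="roc_auc")
      selector.fit(X_tr, y_tr)

      rf = RandomForestClassifier(random_state=0).fit(selector.transform(X_tr), y_tr)
      prob = rf.predict_proba(selector.transform(X_te))[:, 1]
      print("features kept:", selector.n_features_)
      print("holdout AUC  :", round(roc_auc_score(y_te, prob), 3))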
  10. 36650
    “…A large proportion of LTS cases exhibited tertiary lymphoid tissue (TLT) formation, which has been observed to be a positive prognostic marker in a number of tumor types. Using a Random-Forest variable selection approach, we identified the density of stromal iNOS(+) cells and CD68(+) cells as strong positive and negative prognostic variables, respectively. …”
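    The snippet above uses a random forest variable selection approach to identify the strongest prognostic variables. A minimal sketch of ranking variables by permutation importance follows; importance alone does not give the positive or negative direction reported in the abstract, and the data are illustrative, not the study's cell densities.

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.inspection import permutation_importance
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=300, n_features=25, n_informative=5,
                                 random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      rf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
      imp = permutation_importance(rf, X_te, y_te, n_repeats=20, random_state=0)

      # Rank variables by mean importance, strongest first.
      for i in imp.importances_mean.argsort()[::-1][:5]:
          print(f"feature {i}: importance = {imp.importances_mean[i]:.4f}")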
  11. 36651
    “…We developed 5 ML-assisted models from 22 clinical features using logistic regression (LR), LR optimized by least absolute shrinkage and selection operator (Lasso) regularization (Lasso-LR), support vector machine (SVM), extreme gradient boosting (XGBoost), and random forest (RF). The area under the curve (AUC) was applied to determine the model with the highest discrimination. …”
  12. 36652
  13. 36653
    “…Every morning BCAPS generated forecasts of salbutamol sulfate (e.g., Ventolin) inhaler dispensations for the upcoming days in 16 Health Service Delivery Areas (HSDAs) using random forest machine learning. These forecasts were compared with observations over a 63-day study period using different methods including the index of agreement (IOA), which ranges from 0 (no agreement) to 1 (perfect agreement). …”
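    The snippet above compares random-forest forecasts with observations using the index of agreement (IOA), which runs from 0 (no agreement) to 1 (perfect agreement). A minimal sketch of Willmott's index of agreement follows; the sample numbers are illustrative, not BCAPS output.

      import numpy as np

      def index_of_agreement(obs, pred):
          obs, pred = np.asarray(obs, float), np.asarray(pred, float)
          num = np.sum((pred - obs) ** 2)
          den = np.sum((np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
          return 1.0 - num / den

      observed = [12, 15, 9, 20, 18, 14, 11]   # e.g. daily inhaler dispensations
      forecast = [10, 16, 11, 18, 19, 13, 12]
      print("IOA:", round(index_of_agreement(observed, forecast), 3))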
  14. 36654
  15. 36655
    “…Correlation analysis showed that the expression of stx2 was negatively correlated with the expression of MS4A1 (R=-0.56, P=0.05) and positively correlated with the expression of LTB (R=0.60, P=0.05). The random forest model and Boruta method revealed that expression of selected immune genes could be predictive indicators of stx2 expression with prediction accuracy of MS4A1 > LTB > CCL21 > CD19. …”
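    The snippet above applies a random forest model and the Boruta method to rank immune genes as predictors. A minimal sketch of the core Boruta idea, comparing real-feature importances against permuted "shadow" copies, follows; a full Boruta run iterates this test with statistical correction, and the data here are illustrative, not the study's expression matrix.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      X, y = make_classification(n_samples=300, n_features=20, n_informative=4,
                                 random_state=0)

      # Shadow features: each column permuted independently, destroying any signal.
      shadows = np.column_stack([rng.permutation(col) for col in X.T])
      rf = RandomForestClassifier(n_estimators=500, random_state=0)
      rf.fit(np.hstack([X, shadows]), y)

      real_imp = rf.feature_importances_[: X.shape[1]]
      shadow_max = rf.feature_importances_[X.shape[1]:].max()
      print("features beating the best shadow:", np.where(real_imp > shadow_max)[0])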
  16. 36656
    “…Patient and clinical characteristics available preoperatively, intraoperatively, and a combination of both were used as inputs for 5 candidate ML models: logistic regression, support vector machine, random forest, gradient boosting tree (GBT), and deep neural network (DNN). …”
  17. 36657
    “…We also assessed the heterogeneity among studies and publication bias via the I-squared index and forest plots. RESULTS: There was no significant difference between arthroplasty and internal fixation groups in patient mortality at both short-term and long-term points. …”
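    The snippet above assesses between-study heterogeneity with the I-squared index. A minimal sketch of computing I-squared from Cochran's Q under fixed-effect inverse-variance weighting follows; the effect sizes and variances are illustrative, not the review's data.

      import numpy as np

      def i_squared(effects, variances):
          effects, variances = np.asarray(effects, float), np.asarray(variances, float)
          w = 1.0 / variances                       # inverse-variance weights
          pooled = np.sum(w * effects) / np.sum(w)  # fixed-effect pooled estimate
          q = np.sum(w * (effects - pooled) ** 2)   # Cochran's Q
          df = len(effects) - 1
          return max(0.0, (q - df) / q) * 100.0     # I^2 as a percentage

      log_or = [0.10, -0.05, 0.30, 0.12, -0.20]     # per-study log odds ratios (illustrative)
      variance = [0.04, 0.06, 0.05, 0.03, 0.08]
      print(f"I^2 = {i_squared(log_or, variance):.1f}%")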
  18. 36658
  19. 36659
    “…Novel PPIs were predicted by applying the HiPPIP algorithm, which computes features of protein pairs such as cellular localization, molecular function, biological process membership, genomic location of the gene, and gene expression in microarray experiments, and classifies the pairwise features as interacting or non-interacting based on a random forest model. We validated five novel predicted PPIs experimentally. …”
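    The snippet above describes HiPPIP classifying pairwise protein features as interacting or non-interacting with a random forest. A minimal sketch of that general pattern, building one feature vector per protein pair and classifying it, follows; the feature construction and synthetic labels are illustrative assumptions, not the HiPPIP implementation.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      n_pairs, n_per_protein = 1000, 8
      # Per-protein descriptors (e.g. localization, function, expression) for each
      # member of a pair, concatenated into one pairwise feature vector.
      feat_a = rng.normal(size=(n_pairs, n_per_protein))
      feat_b = rng.normal(size=(n_pairs, n_per_protein))
      X = np.hstack([feat_a, feat_b])
      # Synthetic interaction labels loosely tied to the first descriptor pair.
      y = (feat_a[:, 0] * feat_b[:, 0] + rng.normal(scale=0.5, size=n_pairs) > 0).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      rf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
      print("AUC on held-out pairs:",
            round(roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1]), 3))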
  20. 36660
Search tools: RSS