Showing 221 - 240 of 272 results for search '"Gigabyte"', query time: 0.35s
  1. 221
    “…The world is awash in data—by 2020 it is expected that there will be approximately 40 trillion gigabytes of data in existence, with that number doubling every 2 to 3 years. …”
  2. 222
“…However, like other conifers, the members of Cupressaceae have extremely large genomes (> 8 gigabytes), which has limited research on these taxa. …”
  3. 223
    “…The datasets from these samples are typically large, with file sizes ranging from gigabytes to terabytes and the number of image slices within the three-dimensional stack in the hundreds. …”
  4. 224
    “…Each project has already generated assemblies of hundreds of gigabytes on disk, greatly impeding the distribution of and access to such rich datasets. …”
  5. 225
    “…The software stacks easily accumulate to many gigabytes for a single release. New releases are often produced on a daily basis. …”
  6. 226
    “…FastTree 2 inferred a topology and likelihood-based local support values for 237,882 distinct 16S ribosomal RNAs on a desktop computer in 22 hours and 5.8 gigabytes of memory. CONCLUSIONS/SIGNIFICANCE: FastTree 2 allows the inference of maximum-likelihood phylogenies for huge alignments. …”
  7. 227
“…We show that the compression factor of the algorithm ranges from 16 to several hundred, which potentially allows SNP data of hundreds of gigabytes to be stored in hundreds of megabytes. We provide a C++ implementation of the algorithm, which supports direct loading and parallel loading of the compressed format without requiring extra time for decompression. …”
  8. 228
    “…This conceptually simple strategy encounters the following computational issues: a large number of tests and very large genotype files (many Gigabytes) which cannot be directly loaded into the software memory. …”
  9. 229
    “…Although high-throughput sequencing has made it possible to examine the genome and transcriptome at unprecedented resolution, extracting useful information from gigabytes of sequencing data still requires substantial computational skills. …”
  10. 230
  11. 231
“…Bottom-up proteomics, the most common MS-based proteomics approach, has always been challenging in terms of data management, processing, analysis and visualization, with modern instruments capable of producing several gigabytes of data from a single experiment. Here, we present ProteoSign, a freely available web application dedicated to allowing users to perform proteomics differential expression/abundance analysis in a user-friendly and self-explanatory way. …”
  12. 232
    “…A human genome of 100 kilobase resolution would involve ∼30,000 loci, requiring gigabytes just in storing the matrices. RESULTS: We propose a succinct representation of the distance matrices which tremendously reduces the space requirement. …”
  13. 233
  14. 234
    “…HPC is a key tool for processing and analyzing the constantly growing volume of data, from 64.2 zettabytes in 2020 to an expected 180 zettabytes in 2025 (1 zettabyte is equal to 1 trillion gigabytes). As such, HPC has a large number of application areas that range from climate change, monitoring and mitigating planning to the production of safer and greener vehicles and treating COVID-19 pandemic to the advancement of knowledge in almost every scientific field and industrial domain. …”
  15. 235
    “…One of the widely used imaging techniques that generates data at tens of gigabytes per second is computed tomography (CT). Although CT experiments result in rapid data generation, the analysis and reconstruction of the collected data may require hours or even days of computation time with a medium-sized workstation, which hinders the scientific progress that relies on the results of analysis. …”
  16. 236
    “…We performed searches for 231 world countries (including independent sovereign states, dependent areas, and disputed territories) and common misspellings in >14 gigabytes of data including >13 billion characters of clinical text. …”
  17. 237
“…However, the data generated by NGS technology are usually on the order of hundreds of gigabytes per experiment, thus requiring efficient and scalable programs to perform data analysis quickly. …”
  18. 238
  19. 239
“…There are a large number of whole-slide imaging scanners, and the resulting images are frequently larger than 100,000 × 100,000 pixels, typically imaging 100,000 to one million cells and ranging from several hundred megabytes to many gigabytes in size. AIMS AND OBJECTIVES: Provide HTTP access over the web to Whole Slide Image tiles that do not have localized tiling servers but only basic HTTP access. …”
  20. 240
    “…The proposed model is able to segment several hundreds of gigabytes of data in a few minutes and could be applied to other materials and tomography techniques. …”