Knowledge Base Resources

These resources are contributed by researchers, facilitators, engineers, and HPC admins. Please upvote resources you find useful!

Topics

  • machine-learning (50)
  • ai (45)
  • training (41)
  • data-analysis (40)
  • deep-learning (28)
  • documentation (28)
  • big-data (26)
  • neural-networks (24)
  • workforce-development (21)
  • professional-development (18)
  • visualization (18)
  • parallelization (16)
  • community-outreach (14)
  • programming (14)
  • image-processing (13)
  • cybersecurity (12)
  • gpu (12)
  • r (12)
  • pytorch (11)
  • slurm (10)
  • c (9)
  • cloud-computing (9)
  • compiling (9)
  • mpi (9)
  • plotting (9)
  • administering-hpc (8)

Training an LSTM Model in PyTorch
  • Tutorial Link
  • Airline Data Link
This Google Colab notebook tutorial demonstrates how to create and train an LSTM model in PyTorch to predict time series data. An airline passenger dataset is used as an example.
Topics: ai, supervised-learning, machine-learning
Type: learning
Level: Intermediate
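As a rough illustration of what the tutorial covers (a hypothetical sketch, not the notebook's own code), a one-step-ahead time-series forecaster built around torch.nn.LSTM might look like this:

    # Hypothetical sketch: a minimal PyTorch LSTM for one-step-ahead forecasting,
    # assuming passenger counts have already been scaled and windowed into
    # (batch, seq_len, 1) tensors. Placeholder data stands in for the real dataset.
    import torch
    import torch.nn as nn

    class PassengerLSTM(nn.Module):
        def __init__(self, hidden_size=32):
            super().__init__()
            self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
            self.head = nn.Linear(hidden_size, 1)

        def forward(self, x):
            out, _ = self.lstm(x)            # out: (batch, seq_len, hidden_size)
            return self.head(out[:, -1, :])  # predict the next value from the last step

    model = PassengerLSTM()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    x = torch.randn(16, 12, 1)  # 16 windows of 12 time steps (placeholder data)
    y = torch.randn(16, 1)
    for _ in range(10):         # training loop, trimmed for brevity
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()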
Machine Learning with scikit-learn
  • scikit-learn tutorial
In the realm of Python-based machine learning, scikit-learn stands out as one of the most powerful and versatile tools available. This introductory post serves as a gateway to understanding scikit-learn through explanations of introductory ML concepts along with implementation examples in Python.
Topics: ai, big-data, machine-learning
Type: learning
Level: Beginner
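As a rough illustration of the kind of workflow the post introduces (a hypothetical example, not the post's own code), a basic scikit-learn train-and-evaluate loop looks like this:

    # Hypothetical example: fit a classifier on a bundled dataset with scikit-learn
    # and evaluate it on held-out data.
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0
    )

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))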
An Introduction to the Julia Programming Language
  • An Introduction to Julia
  • The Julia Computing Language
The Julia Programming Language is one of the fastest-growing software languages for AI/ML development. It reads much like Python while running nearly as fast as C++, and it is open source and reproducible across platforms and environments. The following links provide an introduction to using Julia, including basic syntax, data structures, key functions, and a few key packages.
Topics: ai, data-analysis, machine-learning, julia
Type: learning
Level: Beginner
RMACC Website
  • RMACC.org
Rocky Mountain Advanced Computing Consortium Website
Topics: community-outreach
Type: website
Level: Beginner, Intermediate, Advanced
Campus Research Computing Consortium (CaRCC)
  • CaRCC
CaRCC – the Campus Research Computing Consortium – is an organization of dedicated professionals developing, advocating for, and advancing campus research computing and data and associated professions. Vision: CaRCC advances the frontiers of research by improving the effectiveness of research computing and data (RCD) professionals, including their career development and visibility, and their ability to deliver services and resources for researchers. CaRCC connects RCD professionals and organizations around common objectives to increase knowledge sharing and enable continuous innovation in research computing and data capabilities.
Topics: community-outreach, professional-development, research-facilitation, workforce-development
Type: website
Level: Beginner, Intermediate, Advanced
GDAL Multi-threading
  • GDAL Multi-threading
Multi-threading guidance when using GDAL.
Topics: parallelization, gis
Type: learning
Level: Intermediate
Jetstream Home
  • Jetstream Website
Jetstream2 makes cutting-edge high-performance computing and software easy to use for your research regardless of your project's scale, even if you have limited experience with supercomputing systems. Cloud-based and on-demand, the 24/7 system includes discipline-specific apps. You can even create virtual machines that look and feel like your lab workstation or home machine, with thousands of times the computing power.
Topics: jetstream
Type: website
Level: Beginner, Intermediate, Advanced
Using Dask on HPC Systems
  • Dask Tutorial Github Page
  • Video Recording of Tutorial - Part 1
  • Video Recording of Tutorial - Part 2
A tutorial on the effective use of Dask on HPC resources. The four-hour tutorial is split into two sections, with early topics focused on novice Dask users and later topics focused on intermediate usage on HPC and associated best practices. The knowledge areas covered include (but are not limited to):
Beginner section
  • High-level collections, including dask.array and dask.dataframe
  • Distributed Dask clusters using HPC job schedulers
  • Earth Science data analysis using Dask with Xarray
  • Using the Dask dashboard to understand your computation
Intermediate section
  • Optimizing the number of workers and memory allocation
  • Choosing appropriate chunk shapes and sizes for Dask collections
  • Querying resource usage and debugging errors
Topics: training, jupyterhub, python
Type: learning
Level: Beginner, Intermediate
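As a rough sketch of the style of computation the tutorial builds up to (illustrative only; the array sizes, chunking, and the commented dask_jobqueue parameters are placeholder assumptions, not values from the tutorial):

    # Hypothetical example: a chunked dask.array reduction built lazily,
    # then computed in parallel across whatever workers are available.
    import dask.array as da

    # A 10000 x 10000 array split into 1000 x 1000 chunks; nothing runs yet.
    x = da.random.random((10000, 10000), chunks=(1000, 1000))
    result = (x - x.mean(axis=0)).std()  # builds a lazy task graph
    print(result.compute())              # executes the graph in parallel

    # On an HPC system one would typically attach a distributed cluster first,
    # e.g. via dask_jobqueue (placeholder, site-specific parameters):
    # from dask.distributed import Client
    # from dask_jobqueue import SLURMCluster
    # cluster = SLURMCluster(cores=4, memory="16GB", queue="normal")
    # cluster.scale(jobs=2)
    # client = Client(cluster)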
iOS CoreML + SwiftUI Image Classification Model
  • Document Tutorial
This tutorial teaches, step by step, how to create an image classification model using Core ML in Xcode and integrate it into an iOS app that uses the iPhone camera to scan objects and make predictions with the image classification model.
Topics: ai, machine-learning
Type: documentation
Level: Beginner
Numba: Compiler for Python
  • Numba Compiler
Numba is a Python compiler designed for accelerating numerical and array operations, enabling users to enhance their application's performance by writing high-performance functions in Python itself. It utilizes LLVM to transform pure Python code into optimized machine code, achieving speeds comparable to languages like C, C++, and Fortran. Noteworthy features include dynamic code generation during import or runtime, support for both CPU and GPU hardware, and seamless integration with the Python scientific software ecosystem, particularly NumPy.
Topics: vectorization, optimization, performance-tuning, parallelization
Type: documentation
Level: Intermediate, Advanced
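As a rough illustration of the pattern Numba enables (a hypothetical function, not taken from the documentation), decorating a numerical loop with @njit lets Numba compile it to machine code:

    # Hypothetical example: JIT-compile a plain-Python numerical loop with Numba.
    import numpy as np
    from numba import njit

    @njit
    def sum_of_squares(a):
        total = 0.0
        for i in range(a.shape[0]):   # explicit loops are fine under @njit
            total += a[i] * a[i]
        return total

    x = np.random.rand(1_000_000)
    print(sum_of_squares(x))          # first call compiles; later calls run at machine speed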
Working with Python on HPC Clusters
  • Working with Python on HPC Clusters
This tutorial series and documentation covers topics on using Python on HPC clusters. The specific steps are based on the HOPPER cluster at George Mason University in Fairfax, VA. They should be implementable on most HPC clusters that have the Slurm scheduler installed, the Environment Modules system for managing packages, and Open OnDemand for a web-based GUI to access the cluster resources.
Topics: pytorch, batch-jobs, job-submission, scheduling, slurm, modules, scripting, conda, python
Type: documentation
Level: Beginner, Intermediate
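One small step such workflows usually involve is making a Python script aware of the resources Slurm granted it. The sketch below is hypothetical (not from the tutorial series), though the environment variable names are standard ones Slurm sets inside a job allocation:

    # Hypothetical snippet for a Python script launched as a Slurm batch job:
    # size thread pools or worker counts to match the allocation.
    import os

    def allocated_cpus(default=1):
        # SLURM_CPUS_PER_TASK is set by Slurm inside a job allocation;
        # fall back to a default when running interactively.
        return int(os.environ.get("SLURM_CPUS_PER_TASK", default))

    if __name__ == "__main__":
        job_id = os.environ.get("SLURM_JOB_ID", "interactive")
        print(f"Job {job_id} has {allocated_cpus()} CPUs per task")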
High performance computing 101
  • High performance computing 101
An introductory guide to High Performance Computing.
Topics: administering-hpc
Type: website
Level: Beginner
Introduction to Linux CLI for Researchers
  • Intro Linux Tutorial for researchers
The goal of this video is to give researchers and students who have recently received allocations on High Performance Computing resources a basic introduction to Linux commands to help them get started. These are a few of the most fundamental commands for navigating the system and getting started. If you find this video helpful or would like me to continue this series, let me know!
Topics: bash, ssh, research-facilitation, training
Type: learning
Level: Beginner
Use Windows Subsystem for Linux for HPC Command Line Access from Windows
  • Install Linux on Windows with WSL
Windows Subsystem for Linux (WSL) provides a Linux environment that lets Windows users access HPC resources quickly and efficiently.
Topics: workflow, ssh
Type: tool
Level: Beginner
Introduction to OpenMP
  • A “Hands-on” Introduction to OpenMP*
OpenMP (Open Multi-Processing) is an API designed to simplify the integration of parallelism in software development, particularly for applications running on multi-core processors and shared-memory systems. This resource is important because it covers what OpenMP is and ways to work with it. OpenMP provides a straightforward way to express parallelism in code through pragma directives, making it easy to create parallel regions, parallelize loops, and define critical sections. Its key benefits are ease of use, automatic thread management, and portability across compilers and platforms. For application development, especially mobile or desktop applications, OpenMP can enhance performance by leveraging the capabilities of modern multi-core processors. By parallelizing computationally intensive tasks such as image processing, data analysis, or simulations, applications run faster and more efficiently, providing a smoother user experience and taking full advantage of the available hardware resources. OpenMP's scalability allows applications to adapt to different hardware configurations, making it a valuable tool for developers aiming to optimize their software for a range of devices and platforms.
Topics: expanse, faster, c, c++, compiling, openmp, programming
Type: presentation
Level: Intermediate
Language models and using HPC resources
  • AI-Generated Text Detection In 2023
Documentation and research based on the latest NLP text generation detection methods for 2023.
Topics: natural-language-processing
Type: learning
Level: Intermediate
Neurodesk
  • Neurodesk
Neurodesk provides a containerised data analysis environment to facilitate reproducible analysis of neuroimaging data. Analysis pipelines for neuroimaging data typically rely on specific versions of packages and software, and are dependent on their native operating system. These dependencies mean that a working analysis pipeline may fail or produce different results on a new computer, or even on the same computer after a software update. Neurodesk provides a platform in which anyone, anywhere, using any computer can reproduce your original research findings given the original data and analysis code.
Topics: psychology, containers, software-installation, version-control
Type: website
Level: Beginner, Intermediate, Advanced
phenoACCESS-24 workshop program materials
  • phenoACCESS-24: Workshop on Research Computing and Plant Phenotyping
High-throughput plant phenotyping is computationally intensive, requiring data storage, data processing and analysis, research computing expertise, and mechanisms for data sharing. This workshop is aimed at research computing workforce development by addressing questions such as: what is plant phenotyping; what types of data are collected; what are the preprocessing and analytical needs; what tools and platforms exist for data capture, management, analysis, and storage; and how best to collaborate and engage with phenotyping researchers. The full-day agenda will include speakers (scientists and research compute staff), panel discussions (how to work with research computing staff and facilities; how to engage with phenotyping scientists), and networking opportunities (meet-and-greet, ice breakers, small group discussions). The videos and slide decks for the talks are included on the linked page.
Topics: big-data, data-management, metadata, biology, professional-development, workforce-development
Type: website
Level: Intermediate
Docker Tutorial for Beginners
  • Docker Tutorial for Beginners
A beginner-level course that teaches the basics of Docker, a containerization platform that lets you package your application and its dependencies into a standardized unit for development, shipment, and deployment.
Topics: docker
Type: video_link
Level: Beginner, Intermediate, Advanced
Bridges-2 Home Page
  • Bridges 2 Home Page
Landing page for Bridges-2 information.
Topics: matlab
Type: website
Level: Beginner, Intermediate, Advanced
Slurm Tutorials
  • Slurm Tutorials
Introduction to the Slurm Workload Manager for users and system administrators, plus some material for Slurm programmers.
Topics: administering-hpc, cluster-management, hpc-cluster-architecture, training
Type: learning
Level: Beginner
What are LSTMs?
  • Introduction to LSTMs
This reading explains what a long short-term memory (LSTM) neural network is. LSTMs are a type of neural network that relies on both past and present data to make decisions about future data, looping back to previous states to do so. This makes LSTMs very good at predicting time-dependent behavior.
Topics: ai, deep-learning, machine-learning, neural-networks
Type: learning
Level: Intermediate, Advanced
Metadata Systems
  • Metadata Systems
Metadata is a vital topic in libraries and librarianship, encompassing structured information used for accessing digital resources. The definition of metadata varies but is essentially data about data. It has evolved beyond simply describing metadata schemas and now focuses on topics like interoperability, non-descriptive metadata (administrative and preservation metadata), and the effective application of metadata schemas for user discovery. Interoperability, the ability to seamlessly exchange metadata between systems, is a major concern. Different levels of interoperability are examined, including schema-level, record-level, and repository-level. Challenges to interoperability include variations in standards, collaboration barriers, and costs.
Metadata management is discussed in terms of the holistic management of metadata across an entire library. Steps include analyzing metadata requirements, adopting schema, creating metadata content, delivery/access, evaluation, and maintenance. Administrative metadata, which encompasses ownership and production information, is becoming more critical, particularly for electronic resource licensing. Preservation metadata is also gaining importance in ensuring the long-term viability of digital objects.
Topics: metadata
Type: learning
Level: Intermediate
