BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Drupal//recurring_events_ical//2.0//EN
BEGIN:VEVENT
UID:c88d2307-d441-47c7-ba86-853bef39b555@support.access-ci.org
DTSTAMP:20251121T123746Z
DTSTART:20260312T180000Z
DTEND:20260312T193000Z
SUMMARY:COMPLECS: Data Transfer
DESCRIPTION:Whether analyzing experimental data collected from devic
 es in the field on a laptop or generating simulated data from large-scale 
 numerical calculations performed on high-performance computing (HPC) syste
 ms, how you move your data to where you need it, when you need it, is one 
 of the most important aspects of creating your research workflows. And the
 re are many ways to transfer data between the data storage and the file sy
 stems you interact with. However, which transfer method is right for you w
 ill depend on the answers to a few key questions about the data: Where is 
 the data located? How is the data organized? How much data is there? And w
 here is the data going?\n\nIn this second part of our series on Data Managem
 ent, we introduce you to the essential concepts and command-line tools you
  should learn when you first begin transferring data to and from HPC (or a
 ny remote) systems regularly. You will learn how to check the integrity of
  your data after a transfer has completed, how to utilize file compression
 , and how to choose the right data transfer tool for different situations.
  We also introduce you to the common data storage and file systems your da
 ta may encounter, their advantages and limitations, and how their differen
 t characteristics may affect data transfer performance on one end or the o
 ther. Additional topics about data transfer will be covered as time permit
 s.\n\nInstructor: Marty Kandes is a Senior Computational and Data Science Rese
 arch Specialist at the San Diego Supercomputer Center (SDSC). As part of t
 he High-Performance Computing (HPC) User Services Group within the Data-En
 abled Scientific Computing Division, he provides technical user support an
 d services to the national research community leveraging the Advanced Cybe
 rinfrastructure (CI) and HPC resources designed, built, and operated by SDSC
  on behalf of the U.S. National Science Foundation (NSF). Marty is also a 
 member of the National Artificial Intelligence (AI) Research Institute for
  Intelligent CI with Computational Learning in the Environment (ICICLE). H
 is current research interests include problems in distributed AI inference
  over wireless networks, data privacy in natural language processing, and 
 secure interactive computing. He also contributes to many of the education
 , outreach, and training initiatives at SDSC, including serving as a Co-PI
  for the COMPrehensive Learning for end-users to Effectively utilize Cyber
 infraStructure (COMPLECS) CyberTraining program and as a mentor for the Res
 earch Experience for High School Students (REHS) program.\n\nMarty received h
 is Ph.D. in Computational Science from the Computational Science Research 
 Center (CSRC) at San Diego State University (SDSU), where he studied quant
 um systems in rotating frames of reference through the use of numerical si
 mulations. He also holds an M.S. in Physics from SDSU and dual B.S. degree
 s in Applied Mathematics and Physics from the University of Michigan, Ann 
 Arbor.\n\nSee a complete list of SDSC's upcoming training and events here.
 \n\nCOMPLECS (COMPrehensive Learning for end-users to Effectively utilize Cyb
 erinfraStructure) is an SDSC training program covering the non-programming s
 kills needed to use supercomputers effectively. Topics include parallel comp
 uting concepts, Linux tools and bash scripting, security, batch computing, h
 ow to get help, data management, and interactive computing. Each session off
 ers 1 hour of instruction followed by a 30-minute Q&A. COMPLECS is supporte
 d by NSF award 2320934.
URL:https://support.access-ci.org/events/8714
END:VEVENT
END:VCALENDAR