Upcoming Maintenance May 27-29, 2025 - See our status page for more details.

Filesystem Retirement - July 1, 2025 - More information on the Common Retirement FAQ page

Upcoming Introductory Trainings Are Now Available - Details on our Upcoming Events Page

    • HCC Documentation
    • Introduction to HPC
    • Connecting to HCC Clusters
      • Basic Linux commands
      • How to setup X11 forwarding
      • Connecting with MobaXterm
      • Connecting with PuTTY (Windows)
      • Reusing SSH connections
      • Connecting with Terminal
    • Creating an Account
      • Changing Your Password
      • Setting Up and Using Duo
    • Handling Data
      • Sharing data on Swan
      • Data storage
        • Using Scratch
        • NRDSTOR
          • MacOS
          • Smbclient
          • Swan
          • Windows
        • Using NU's GitLab Instance
          • Setting up GitLab on HCC Clusters
        • Linux File Permissions
        • Using Attic
        • Using the /common File System
        • Preventing File Loss
        • Integrating Box with HCC
      • Data transfer
        • File Transfer with CyberDuck
        • File Transfer with scp
        • File Transfer with WinSCP
        • Globus connect
          • Activating HCC Cluster Collections
          • File Transfers Between Collections
          • File Transfers to and from Personal Workstations
          • File Sharing
          • Creating Globus Groups
          • Globus Command Line Interface
          • Activating UNL OneDrive on Globus
        • High Speed Data Transfers
        • Using Rclone with UNL's OneDrive
        • Connecting to CB3 iRODS
    • Running Applications
      • App specific
        • Building LIS
        • Building WRF
        • DMTCP Checkpointing
        • Fortran/C on HCC
        • MPI Jobs on HCC
        • Running Gaussian at HCC
        • Installing and Running MATLAB CobraToolbox, Gurobi, and IBM ILOG CPLEX
        • Running OLAM at HCC
        • Running Paraview
        • Running PostgreSQL
        • Running SAS on HCC
        • Running Theano
        • Visual Studio Code on HCC resources
        • Allinea profiling and debugging
          • Using Allinea Forge via Reverse Connect
          • Allinea performance reports
            • BLAST with Allinea Performance Reports
            • LAMMPS with Allinea Performance Reports
            • Ray with Allinea Performance Reports
        • Bioinformatics tools
          • Biodata Module
          • QIIME
          • Alignment tools
            • BLAT
            • Bowtie
            • Bowtie2
            • Clustal Omega
            • TopHat/TopHat2
            • BLAST
              • Create Local BLAST Database
              • Running BLAST Alignment
            • BWA
              • Running BWA Commands
          • Data manipulation tools
            • SRAtoolkit
            • Bamtools
              • Running BamTools Commands
            • Samtools
              • Running SAMtools Commands
          • De novo assembly tools
            • Oases
            • Ray
            • SOAPdenovo2
            • Trinity
              • Running Trinity in Multiple Steps
            • Velvet
              • Running Velvet with Paired-End Data
              • Running Velvet with Single-End and Paired-End Data
              • Running Velvet with Single-End Data
          • Pre-processing tools
            • Cutadapt
            • PRINSEQ
            • Scythe
            • Sickle
            • TagCleaner
          • Reference based assembly tools
            • Cufflinks
          • Removing/detecting redundant sequences
            • CAP3
            • CD-HIT
      • Running JupyterLab Notebooks with Slurm
      • Modules
        • Available Software for Swan
      • User software
        • Using R Libraries
        • Using Anaconda Package Manager
        • Compiling an OpenMP Application
        • Using Apptainer and Docker Containers
        • Installing Perl modules
    • Submitting Jobs
      • Creating an Interactive Job
      • Submitting a Job Array
      • Submitting GPU Jobs
      • Submitting an MPI Job
      • Submitting an OpenMP Job
      • Job Dependencies
      • Monitoring Jobs
      • GPU Monitoring and Optimizing
      • Partitions
        • Available Partitions for Swan
      • HCC Acknowledgment Credit
      • App specific
        • Submitting ANSYS Jobs
        • Submitting MATLAB Jobs
        • Submitting R Jobs
    • HCC OnDemand
      • Connecting to HCC OnDemand
      • Managing and Transferring Files with HCC OnDemand
      • Job Management and Submission with HCC OnDemand
      • Shell Access with HCC OnDemand
      • Virtual Desktop and Interactive Apps with HCC OnDemand
      • CryoSPARC Interactive App
    • Anvil: HCC's Cloud
      • Adding SSH Key Pairs
      • Anvil Instance Types
      • Available images
      • Connecting to Linux Instances from Mac
      • Connecting to Linux Instances from Windows
      • Connecting to Linux Instances using X2Go
      • Connecting to the Anvil VPN
      • Connecting to Windows Instances
      • Creating an Instance
      • Creating and attaching a volume
      • Creating SSH key pairs on Mac
      • Creating SSH key pairs on Windows
      • Formatting and mounting a volume in Linux
      • Formatting and mounting a volume in Windows
      • Resizing an instance
      • Using MySQL instances
      • What are the per-group resource limits?
    • NRP
      • Quick Start
      • Basic Kubernetes
      • GPU Pods
      • Batch Jobs
      • Deployments
      • Storage
      • JupyterHub Service
    • The OSG Consortium
      • Characteristics of an OSG friendly job
    • FAQ
      • HCC Class Info for Instructors
      • HCC Class Info for Students
      • Common Retirement
      • I have an HCC account, now what?
      • SSH host keys
    • Good HCC Practices
      • Attic Guidelines and Best Practices
    • Contact Us

    Contact Us

    If you have questions, please contact us at hcc-support@unl.edu, join one of our Remote Open Office Hours, or email us to schedule a remote session.

    Lincoln
    118 SHOR
    1100 T St
    Lincoln, NE 68588-0150

    Omaha
    1110 S 67th St
    Omaha, NE 68106

    Or contact one of us directly.

    Holland Computing Center | 118 Schorr Center, Lincoln NE 68588