Revision as of 13:57, 8 March 2010
Grant Information
In order to assist faculty in preparing grants, the CNRG maintains this page of descriptions of our services, written in the format we are told grant applications most often need.
Computation Cluster
The computation cluster consists of ten worker nodes, each containing two quad-core Intel E5530 2.4 GHz CPUs and 24 GB of RAM. The nodes are connected via 1 Gb/s Ethernet to a Cisco 3650 switch, which in turn is connected to the head node via 10 Gb/s Ethernet. The system is attached to 100 TB of disk space, which is backed up daily. The cluster runs the ROCKS 5.2 cluster distribution.
EBI Cluster
The EBI cluster consists of 25 worker nodes, each containing two quad-core Intel E5440 2.8 GHz processors and 16 GB of RAM. The nodes are connected via 1 Gb/s Ethernet to a Cisco 3650 switch, which in turn is connected to the head node. The head node is attached to an Infortrend A24G-G2430 system providing 33 TB of storage, which is backed up daily. The cluster runs the ROCKS 5.0 cluster distribution.
Large Memory Cluster
Web Server
Web Application Server
File Server
The new file server, which will be in place in June 2010, provides 100 TB of disk space and will share its disk with other systems managed by the IGB. This disk space will be backed up every evening via the backup service.
Backup Server
The backup server consists of 200 TB of disk space and a Qualstar 50-tape LTO-4 library. Using Zmanda, we back up all of the servers every evening; once every six months we write this data out to tape and begin a new cycle.
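The nightly dump described above is typically driven from cron. As a rough sketch only, the entries below show how such a schedule might look with the Amanda tools that Zmanda is built on (the backup-set name `nightly` and the paths are illustrative, not the CNRG's actual configuration):

```shell
# Hypothetical crontab entries for a nightly Zmanda/Amanda backup run.
# "nightly" is an assumed backup-set name, not the real one.
0 1 * * *  /usr/sbin/amdump nightly     # run the nightly dump at 1:00 AM
30 6 * * * /usr/sbin/amreport nightly   # mail a summary report after the run
```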
Computer Lab
The IGB computer lab consists of 48 Dell OptiPlex FX160 computers with 19-inch screens. These machines use the Intel Atom processor and are meant for everyday tasks such as web browsing and word processing. Using a software package provided by Citrix, we can rapidly reconfigure the operating system and software on these machines. The bulk of the high-performance processing is expected to be done on the cluster dedicated to classroom use, described below. Additionally, the lab has a file/web/database server that can host information to support any scheduled event.
Computer Lab Cluster
The IGB computer lab cluster consists of 24 Dell PowerEdge 2950 III servers, each with two quad-core processors, connected via 1 Gb/s Ethernet. The intent of the cluster is to teach students how to use their applications in a high-performance computing (HPC) environment: gaining experience with the Linux operating system, writing scripts for non-interactive jobs, and submitting and monitoring those jobs through the Sun Grid Engine (SGE). With this setup, each student in the classroom can control up to 4 processors in the cluster at the same time.
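A non-interactive job of the kind students learn to write might look like the sketch below (the job name, parallel environment name `smp`, and job body are illustrative assumptions; the `#$` lines are SGE directives, which the shell treats as ordinary comments). The script would be submitted with `qsub` and monitored with `qstat`:

```shell
# Hypothetical SGE batch script -- names are illustrative, not a real course exercise.
#$ -N classroom_job     # job name shown in qstat output
#$ -cwd                 # run the job from the submission directory
#$ -pe smp 4            # request 4 slots, matching the per-student limit above

# Job body: report where the scheduler placed the job.
# SGE sets $NSLOTS at run time; default to 4 here so the script also runs standalone.
msg="Running on $(hostname) with ${NSLOTS:-4} slots"
echo "$msg"
```

Submission and monitoring would then be `qsub classroom_job.sh` followed by `qstat`.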