Teams across campus are using cloud services to deliver solutions. Check out these examples of successful projects in the cloud:
Launch of Web-based Math Assessment Tool
Weinberg College IT Solutions
Weinberg College IT Solutions (WITS) was approached by a faculty member in the statistics department looking to host IMathAS (Internet Mathematics Assessment System), a web-based tool for delivering and automatically grading math homework and tests. When security issues arose with the vendor-hosted application's CAESAR integration, WITS began to consider hosting the application internally rather than relying on the vendor.
When weighing where to host the application, WITS investigated whether IMathAS could be run in AWS in an automated and relatively inexpensive manner. The approach proved resoundingly successful: Northwestern IMathAS currently runs on an EC2 instance, using Puppet for configuration management and Terraform to configure the AWS environment and launch instances. EC2 makes it possible to scale up or down and replace instances as needed, and development instances can be provisioned on demand via Terraform. Crucially, hosting the application on EC2 also allowed WITS to run the Qualys agent and validate that the app met CAESAR's security requirements, allowing the two applications to connect and interact.
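A minimal Terraform sketch of this pattern might look like the following. Every identifier here (region, AMI, instance type, Puppet server hostname, tags) is an illustrative placeholder, not the actual WITS configuration.

```hcl
# Sketch: provisioning an app server on EC2 with Terraform, then handing
# configuration over to Puppet. All values below are placeholders.

provider "aws" {
  region = "us-east-2" # placeholder region
}

resource "aws_instance" "imathas" {
  ami           = "ami-0123456789abcdef0" # placeholder AMI
  instance_type = "t3.medium"             # placeholder size

  # Bootstrap the Puppet agent so configuration management takes over
  # once the instance is up.
  user_data = <<-EOF
    #!/bin/bash
    yum install -y puppet-agent
    /opt/puppetlabs/bin/puppet agent --server puppet.example.edu
  EOF

  tags = {
    Name = "imathas-dev"
  }
}
```

With a configuration like this, `terraform apply` launches or replaces an instance, and a development instance that is no longer needed can be torn down with a single `terraform destroy`.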
NUANCE Data Workflow
Northwestern IT Research Computing Services
The Northwestern University Atomic and Nanoscale Characterization Experimental Center (NUANCE) generates large amounts (terabytes daily) of raw data in the form of images, video, and configuration files from nearly 100 scientific instruments. Saving this data locally is impossible, given local computer storage limitations, data security concerns, and the need for users of the instruments to be able to access their data after their time in the lab. NUANCE needed a data workflow that would give users time to retrieve their data from a centralized storage location, then archive the data to meet grant requirements for long-term data retention.
For this workflow, we used a combination of local networked storage, the Research Data Storage Service (RDSS), Komprise (an automated data lifecycle tool), and AWS S3 and Glacier. RDSS was a natural choice for storage: it is intended for research data and can be mounted as a network drive from Windows, Mac, or Linux. Komprise automates moving data to other file systems on a schedule. In this case, we created a Komprise policy that archives data from RDSS to AWS S3, where an S3 lifecycle policy immediately moves it to Glacier for long-term cold storage.
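The Glacier half of that hand-off can be expressed as an S3 lifecycle rule. The Terraform sketch below shows one way to do it; the bucket name and rule ID are illustrative placeholders, not the actual NUANCE configuration.

```hcl
# Sketch: an S3 bucket whose lifecycle rule moves newly archived objects
# straight to Glacier for long-term cold storage. Names are placeholders.

resource "aws_s3_bucket" "archive" {
  bucket = "nuance-archive-example" # placeholder bucket name
}

resource "aws_s3_bucket_lifecycle_configuration" "archive" {
  bucket = aws_s3_bucket.archive.id

  rule {
    id     = "to-glacier"
    status = "Enabled"

    # days = 0 transitions objects to Glacier as soon as possible
    # after they land in the bucket.
    transition {
      days          = 0
      storage_class = "GLACIER"
    }
  }
}
```

Because the rule lives on the bucket, the archiving tool only has to write objects into S3; the transition to Glacier happens without any further orchestration.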
The major challenges were providing users access to their data and minimizing costs. For security and cost reasons, RDSS permissions make data available only on shared workstations in the lab, and only for a maximum of 30 days. Because data moves out to AWS quickly, this keeps the cost of holding many terabytes in RDSS low: AWS Glacier costs $2/TB/month, versus approximately $11/TB/month for RDSS. We had to balance the need for cost savings against providing users self-service access to their data. Data is retrievable from AWS if needed, but users quickly adapted to the 30-day window, and requests for data retrieval have been rare. This data workflow has been easy for users to understand and straightforward to support.