# Setting Up Your Laptop for the GLC16 Workshop

The 2016 Great Lakes Cosmology Conference will begin on Sunday, June 19, with a graduate student workshop on scientific python, and two popular analysis toolkits: pynbody and yt. Pynbody is a simulation analysis tool for working (primarily) with particle-based datasets, such as the outputs of SPH or N-body simulations. yt is a community-developed analysis and visualization toolkit for volumetric data. yt has been applied mostly to astrophysical simulation data, but it can be applied to many different types of data including seismology, radio telescope data, weather simulations, and nuclear engineering simulations.

The workshop will begin with an introduction to python and the Jupyter notebook, followed by two in-depth presentations on yt and pynbody. To make things easy, it would help if you brought a laptop with some tools pre-installed so you can follow along with the tutorials. This post is a brief set of instructions on how I'd recommend you set things up. If you are a super class-A hacker and can do all of this in your sleep, feel free to skip these instructions.
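The short version is that you'll want a scientific Python stack plus yt and pynbody. One way to set this up is sketched below, assuming you use Miniconda (any Python distribution will do; the environment name is arbitrary, and `yt` and `pynbody` are the standard conda/PyPI package names):

```shell
# Create an isolated environment for the workshop
conda create -n glc16 python numpy scipy matplotlib jupyter
source activate glc16

# Both analysis toolkits are installable from PyPI
pip install yt pynbody

# Launch the notebook server to check that everything works
jupyter notebook
```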

# Building Minimal Cosmological Zoomin ICs

## Cubes, Ellipsoids, and Convex Hulls are non-optimal

When building a cosmological zoom-in simulation, the strategy is to run a low-resolution, dark-matter-only simulation, select halos of interest, trace the particles that form each halo back to the initial conditions, and then build a new set of ICs in which the regions that form the halo are given higher resolution. To minimize the number of high-resolution elements needed in the zoom-in IC, the refined volume should hug the particles of interest as tightly as possible. Unfortunately, because the cosmic web is composed of sheets and filaments, the regions in an IC that need refinement can have complex shapes (they often look like prawn crackers). This means that simple shapes enclosing the region (cubes, ellipsoids, and convex hulls are often used) will frequently contain many times the volume actually traced by the particles, making the zoom-in simulation much larger (and more computationally expensive) than it needs to be. THERE HAS TO BE ANOTHER WAY!
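To see the scale of the problem, here's a toy illustration in plain NumPy (not the actual IC-generation pipeline): we take particles tracing a thin sheet, count the IC grid cells they actually touch, and compare that to the volume of the bounding cube we'd refine instead. All the numbers here (grid size, particle count, sheet thickness) are made up for illustration.

```python
import numpy as np

rng = np.random.RandomState(42)

# Pretend these are the IC positions of the particles that end up in
# our halo: a thin "sheet", the kind of shape the cosmic web produces.
n = 5000
pos = np.empty((n, 3))
pos[:, 0] = rng.uniform(0.2, 0.8, n)
pos[:, 1] = rng.uniform(0.2, 0.8, n)
pos[:, 2] = 0.5 + 0.01 * rng.randn(n)

# Volume fraction of the IC grid cells that actually contain a
# traced particle.
ngrid = 64
cells = np.unique((pos * ngrid).astype(int), axis=0)
traced_volume = cells.shape[0] / float(ngrid**3)

# Volume fraction of the bounding cube enclosing the full particle
# extent, which is what a cubic refinement region would use.
side = (pos.max(axis=0) - pos.min(axis=0)).max()
cube_volume = side**3

print("traced fraction: %.4f, cube fraction: %.4f, ratio: %.1f"
      % (traced_volume, cube_volume, cube_volume / traced_volume))
```

Even for this mild example the cube refines an order of magnitude more volume than the particles actually trace; for a real prawn-cracker-shaped Lagrangian region the waste is worse.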

# RAMSES on the Scinet GPC

RAMSES is an AMR MHD cosmological code. This is a short guide on how to run it on the Scinet GPC.
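To give a flavor of what a run looks like, here is a sketch of a GPC submission script. The GPC uses the Torque/Moab batch system, but the module names and resource line below are assumptions you should check against the current SciNet documentation, and `ramses3d` and `cosmo.nml` are placeholders for your own RAMSES build and namelist file:

```shell
#!/bin/bash
#PBS -l nodes=4:ppn=8
#PBS -l walltime=24:00:00
#PBS -N ramses_run

cd $PBS_O_WORKDIR

# Hypothetical module names -- check `module avail` on the GPC
module load intel openmpi

# ramses3d is the executable produced by the RAMSES build;
# cosmo.nml is the namelist of run parameters
mpirun ./ramses3d cosmo.nml
```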

# Making Mock Observations with Sunrise & Docker

In a previous post, I described how I built a Docker image with a working install of sunrise.

## Setting up the docker instance

### Data volumes for file sharing

We will need somewhere to place our simulation outputs, configuration files, and ultimately, the output of sunrise. We will also need to let our docker container read and write to this directory. All you need to do is pass the `-v host_directory:container_directory` switch when starting your container. This mounts host_directory in the docker container at the container_directory mountpoint. In other words, we start our docker image with something like this command:

```shell
docker run -t -i -v ~/sunrise_data:/sunrise_data bwkeller/sunrise:latest /bin/bash
```


# Sunrise on Docker

Sunrise is a popular "Monte-Carlo Radiation Transfer code for calculating absorption and scattering of light in astrophysical situations" (it's hosted on bitbucket here). Unfortunately, it is notoriously finicky to install, as it relies on specific point-release versions of nearly a dozen different libraries. This is exactly the sort of problem that Docker is supposed to solve, at scale: distributing packages along with all of the library infrastructure they require in one self-contained image. If you are skeptical about Docker's performance, check out the paper IBM Research published: in nearly every metric, Docker performance is within 5% of native bare-metal.

I'm going to build a docker image with a working install of sunrise, which should save me and my fellow grad students days of wrestling with angry, old C++ libraries. Details below the fold.
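The basic idea is that a Dockerfile freezes the exact environment sunrise needs. The sketch below is illustrative, not my actual build recipe: the base image and package names are assumptions, sunrise's real dependency list is longer and pickier, and the clone URL and build steps are elided.

```dockerfile
# Pin a fixed base image so library versions never drift
FROM ubuntu:14.04

# Illustrative dependencies -- the real recipe pins specific
# point releases of about a dozen libraries
RUN apt-get update && apt-get install -y \
    build-essential mercurial \
    libcfitsio3-dev libboost-dev

# Clone and build sunrise inside the image (URL and build steps elided)
RUN hg clone <sunrise bitbucket url> /opt/sunrise
WORKDIR /opt/sunrise
```

Once the image builds, anyone can `docker run` it and get the identical, working library stack with no installation fight at all.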