Working with Haskell on a Shared Host (like Uberspace)
When you develop a Haskell project, the easiest way is currently to use stack for setting up a development environment, handling dependencies, and building and running your project. In this post, I will briefly explain how I installed and now use stack on Uberspace. This will probably work with similar shared hosts as well, perhaps with some small changes.
Usually, you don’t have root or sudo access on a shared server, so you can neither install a distribution package for stack nor use the generic installer script, which needs sudo. Instead, you can use one of the available binary builds. For me (Uberspace with CentOS 6), the 64-bit static binary worked fine.
Just unpack the archive and move the stack binary to ~/bin. On Uberspace, the directory ~/bin is added to $PATH by default. If this is different for you, either add it to the path yourself (e.g. in your .bashrc or equivalent):

export PATH="$HOME/bin:$PATH"

or put stack in a different location on your path. Remember to log out and log in again if you modified your .bashrc.
After installing stack, proceed as you would on your local machine. By installing stack first, you get a complete Haskell environment without using any distribution packages.
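A quick sanity check can confirm the setup before you start building. The following sketch only inspects $PATH and looks for the stack executable; nothing in it is Uberspace-specific:

```shell
# Check whether ~/bin is on the PATH (it is by default on Uberspace).
case ":$PATH:" in
  *":$HOME/bin:"*) echo "~/bin is on PATH" ;;
  *)               echo "~/bin is NOT on PATH" ;;
esac

# Print the stack version if the binary is found, otherwise a hint.
if command -v stack >/dev/null 2>&1; then
  stack --version
else
  echo "stack not found; check the install location"
fi
```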
Limiting Memory Usage during Compilation
When working on a shared host, however, chances are that there is a limit on how much memory your processes may use.
Since Haskell compilation is quite resource-intensive, this can become a problem for larger projects or dependencies.
I recently ran into this problem when compiling a project using hakyll,
which depends on pandoc.
Uberspace has a memory limit of about 600MB (which may be exceeded up to 1200MB for less than 10 minutes), and compiling pandoc easily exceeds it.
Fortunately, every Haskell program linked with RTS options enabled (the -rtsopts flag) can be told to limit its heap size by adding +RTS -M... -RTS to its command line. This also works for GHC itself!
In my case, I used a heap limit of 768MB (RTS options are 1,000-based),
resulting in a total maximum RAM usage of around 900MiB according to htop.1
In theory this means something like:
$ ghc ... +RTS -M768m -RTS
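As a concrete sketch, the snippet below compiles a throwaway module with GHC's own heap capped at 768MB; it is guarded so it degrades gracefully when ghc is not on the path, and the file names are arbitrary placeholders:

```shell
# Compile a trivial module while capping GHC's own heap at 768MB.
# Falls back to a message if ghc is not installed.
if command -v ghc >/dev/null 2>&1; then
  workdir=$(mktemp -d)
  printf 'main :: IO ()\nmain = putStrLn "built under a heap limit"\n' \
    > "$workdir/Main.hs"
  ghc "$workdir/Main.hs" -o "$workdir/demo" +RTS -M768m -RTS
  "$workdir/demo"
else
  echo "ghc not available here"
fi
```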
Unfortunately, with stack things look a bit different:
While you can specify options for GHC with
--ghc-options "...", this only works for the local project,
not for the dependencies
(or snapshot projects in general).
Adding --ghc-options "+RTS -M768m -RTS" to your build command therefore does not affect the compilation of a dependency like pandoc.
As described in the stack FAQ, the solution is to add a
ghc-options entry to your stack.yaml.
GHC options added there are applied to the dependencies as well.
Usually, you should be careful with this setting since compiling a dependency with different GHC options can lead to unexpected behaviour.
In this case, however, it is completely safe since the RTS options only affect the runtime behaviour of GHC, not its output.
In my case, the entry looks like this:

ghc-options:
  "*": "\"+RTS -M768m -RTS\""
The memory limit is usually not required on your local machine.
Instead of adding and removing this option depending on whether you compile locally or remotely, you can create a second stack configuration file
remote.yaml containing the additional
ghc-options for the remote machine.
When needed (e.g., on the server), you can use it with
$ stack --stack-yaml remote.yaml ...
but it does not affect your workflow on the local machine at all.
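A minimal remote.yaml might look like the following sketch; the resolver line and package list are assumptions here, so mirror whatever your regular stack.yaml uses and add only the ghc-options entry on top:

```yaml
# remote.yaml -- second stack config for the shared host.
# Keep resolver and packages in sync with your normal stack.yaml.
resolver: lts-9.21          # assumption: use your project's resolver
packages:
- .
ghc-options:
  "*": "\"+RTS -M768m -RTS\""
```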
Alternatively, add the
ghc-options to your
~/.stack/config.yaml to enable it globally on your remote machine.
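A sketch of the user-global variant, which is the same entry placed in ~/.stack/config.yaml on the remote machine:

```yaml
# ~/.stack/config.yaml -- applies to every stack project of this user.
ghc-options:
  "*": "\"+RTS -M768m -RTS\""
```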
GHC and stack cannot be installed globally (e.g., via distribution packages) on a machine with limited access. Additionally, compiling larger Haskell libraries can use more memory than allowed on a shared host.
- Install stack by putting a precompiled build in a location on the path.
- The heap size of GHC can be limited using +RTS -M... -RTS.
- To pass this option to your dependencies, use the ghc-options setting in your stack.yaml, an alternative remote.yaml, or the user-global ~/.stack/config.yaml (on the remote machine).
If you stay at that level for more than 10 minutes, it is still too high on Uberspace. In this case, try a lower limit. ↩