Random number generation with Rcpp and OpenMP

The following code shows how to draw random numbers from a normal and a binomial distribution. Notice that instead of declaring A as a numeric matrix…

Serial double loop

```cpp
#include <Rcpp.h>
using namespace Rcpp;

// [[Rcpp::export]]
NumericMatrix my_matrix(int I) {
  NumericMatrix A(I, 2);
  for(int i = 0; i < I; i++){
    A(i, 0) = R::rnorm(2, 1);
    A(i, 1) = R::rbinom(1, 0.5);
  }
  colnames(A) = CharacterVector::create("Normal", "Bernoulli");
  return A;
}
```

set.…
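As a point of comparison, here is a plain-C++ sketch of the same matrix (my illustration, not from the post; the function name and seed are made up) using the standard `<random>` library. Unlike R's global RNG, these engines can later be made thread-local, which is what an OpenMP version needs:

```cpp
#include <array>
#include <random>
#include <vector>

// Plain-C++ analogue of my_matrix(): column 0 holds N(2, 1) draws,
// column 1 holds Bernoulli(0.5) draws. Each call owns its engine,
// so giving every OpenMP thread its own generator is straightforward.
std::vector<std::array<double, 2>> my_matrix_std(int I, unsigned seed = 42) {
  std::mt19937 gen(seed);
  std::normal_distribution<double> rnorm(2.0, 1.0);
  std::bernoulli_distribution rbinom(0.5);
  std::vector<std::array<double, 2>> A(I);
  for (int i = 0; i < I; i++) {
    A[i][0] = rnorm(gen);
    A[i][1] = rbinom(gen) ? 1.0 : 0.0;
  }
  return A;
}
```

The key design difference from `R::rnorm` is that the generator is an explicit local object rather than hidden global state.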
Hello Rcpp

This past weekend I discovered the wonders of C++ thanks to this DataCamp course. Although C++ syntax is different, knowing Fortran made this much easier.

Filling a matrix with C++

The following code creates a function that can be called from R to fill a matrix. One difference from Fortran is that to make the loops efficient you have to nest them from the right index (j) to the left index (i), instead of left to right.
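A minimal sketch of that loop-order point in plain C++ (my illustration, not the post's code), using a flat buffer indexed column-major the way R and Fortran store matrices:

```cpp
#include <cstddef>
#include <vector>

// Fill an nrow x ncol matrix stored column-major (R/Fortran layout):
// element (i, j) lives at index i + j * nrow. With the column loop (j)
// outside and the row loop (i) inside, the inner loop walks contiguous
// memory, which is what makes this nesting order fast.
std::vector<double> fill_colmajor(std::size_t nrow, std::size_t ncol) {
  std::vector<double> A(nrow * ncol);
  for (std::size_t j = 0; j < ncol; j++)      // right index: outer loop
    for (std::size_t i = 0; i < nrow; i++)    // left index: inner loop
      A[i + j * nrow] = static_cast<double>(i + j);
  return A;
}
```

Reversing the nesting still gives the same matrix, but the inner loop then jumps `nrow` elements per step, defeating the cache.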
Hello World: R + Fortran + OpenMP

Why? I want to fill up a big matrix, and I care about speed and, to a lesser degree, memory efficiency. In practice the matrix will have 4000 rows and K columns, where K is the number of observations for which I want to run my predictive model. For this exercise I will keep K at just 500 because my R approach eats a ton of memory. For this simple exercise, I will compute \(A_{ik} = 1 / (1 + \exp(i^2 + i^3 + k^2 + k^3))\); in practice the operation I need to do is much more complicated, which will make the difference in run time even bigger.
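The fill operation can be sketched in plain C++ like this (my sketch, not the post's Fortran; the OpenMP pragma is simply ignored when you compile without -fopenmp):

```cpp
#include <cmath>
#include <vector>

// A[i-1][k-1] = 1 / (1 + exp(i^2 + i^3 + k^2 + k^3)) for 1-based i, k.
// Every entry is independent of the others, so the row loop can be
// split across threads with a single OpenMP pragma.
std::vector<std::vector<double>> fill_logistic(int I, int K) {
  std::vector<std::vector<double>> A(I, std::vector<double>(K));
  #pragma omp parallel for
  for (int i = 1; i <= I; i++) {
    double ii = double(i) * i + double(i) * i * i;     // i^2 + i^3
    for (int k = 1; k <= K; k++) {
      double kk = double(k) * k + double(k) * k * k;   // k^2 + k^3
      A[i - 1][k - 1] = 1.0 / (1.0 + std::exp(ii + kk));
    }
  }
  return A;
}
```

Because the exponent grows so fast, most entries underflow toward zero; only the smallest indices produce values visibly above it.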
cast-web-api in a snap

After the SD card on my Raspberry Pi died, the prospect of creating a new one so I could connect my Google Cast devices to SmartThings did not sound like fun. Installing cast-web-api is not trivial, so I decided this was a good opportunity to create my first snap. Creating a snap to wrap a Node.js app was super easy. I shared my code on GitHub, but this is the whole thing:

nvidia-docker + greta

Goal: use greta with nvidia-docker.

Dockerfile:

```dockerfile
## Based on work by https://github.com/earthlab/dockerfiles/blob/master/r-greta/Dockerfile
## https://github.com/rocker-org/ml
## rocker
FROM nvidia/cuda:9.0-cudnn7-runtime
MAINTAINER "Ignacio Martinez" ignacio@protonmail.com

RUN echo 'debconf debconf/frontend select Noninteractive' | debconf-set-selections

## Prepare R installation from
RUN sh -c 'echo "deb https://cloud.r-project.org/bin/linux/ubuntu xenial-cran35/" >> /etc/apt/sources.list' \
  && apt-key adv --keyserver keyserver.ubuntu.com --recv-keys E298A3A825C0D65DFD57CBB651716619E084DAB9

RUN apt-get update \
  && apt-get upgrade -y -q \
  && apt-get install -y --no-install-recommends \
    libapparmor1 \
    r-base \
    r-base-dev \
    littler \
    r-cran-littler \
    libxml2-dev \
    libxt-dev \
    libssl-dev \
    libcurl4-openssl-dev \
    imagemagick \
    python-pip \
    libpython2.…
```

Cloud computing with R and AWS

Why? You want to run R code on the cloud. For whatever reason, you don't want to use Google or Azure. Credit: I took most of the code from this gist. The code: this function takes a list with your instances and the path to your private key, and returns a cluster object that can be used with the future package. I was told that this function will be part of a new package soon.

Embarrassingly Parallel Computing with doAzureParallel

Why? You want to run 100 regressions, they each take one hour, and the only difference between them is the data set they use. This is an embarrassingly parallel problem. For whatever reason, you want to use Azure instead of Google Compute Engine… Before you start I will assume that: you have an Azure account, and you have correctly installed and configured doAzureParallel.

Create some fake data

```r
library(dplyr)
library(stringr)
set.…
```

Embarrassingly Parallel Computing with googleComputeEngineR

Why? You want to run 100 regressions, they each take one hour, and the only difference between them is the data set they use. This is an embarrassingly parallel problem. Before you start I will assume that: you have a Google Compute Engine account, and you have correctly installed and configured googleComputeEngineR.

Create some fake data

```r
library(googleComputeEngineR)
library(dplyr)
library(stringr)
library(future)
library(future.apply)
set.seed(12618)
n <- 10000
fakeData <- list()
for(ii in 1:100){
  fakeData[[ii]] <- future({
    fakeDF <- data.…
```

My highlights from StanCon 2018

Source: THE JUMPING RIVERS BLOG. During my econ PhD I learned a lot about frequentist statistics. Alas, my training in Bayesian statistics was limited. Three years ago, I joined @MathPolResearch and started delving into this whole new world. Two weeks ago, thanks to @jumping_uk, I was able to attend StanCon. This was an amazing experience, which allowed me to meet some great people and learn a lot from them. These are my highlights from the conference:
Send emails from R with mailgun

Why? Until now I've been sending emails from R using my Gmail account. This works, but configuring mailR for the first time is always a pain. A few days ago @marked told me about mailgun and how to use it. The great thing is that you only need httr to use it. How? Using @marked's code as my base, I created a tiny R package to make using mailgun even easier: