 # 18 - Cluster computing

1. Write a MapReduce function for the following situations. In each case, write the corresponding R code to understand how MapReduce and conventional programming differ.
1a. Compute the square and cube of the numbers in the range 1 to 25. Display the results in a data frame.
```r
# R
# create a vector of the integers 1 to 25
ints <- 1:25
# conventional R: compute the squares and cubes and display them in a data frame
result <- data.frame(n = ints, square = ints^2, cube = ints^3)
result

# MapReduce
require(rmr2)
rmr.options(backend = "local") # local or hadoop
# load the integers 1 to 25 into HDFS
hdfs.ints <- to.dfs(1:25)
# mapper emits each integer as the key with its square and cube as values
mapper <- function(k, v) {
  key <- v
  value <- c(key^2, key^3)
  keyval(key, value)
}
# run the MapReduce job
out <- mapreduce(input = hdfs.ints, map = mapper)
# retrieve the results and convert to a data frame
df <- as.data.frame(from.dfs(out))
# reshape so each row holds n, n^2, and n^3
# add identifiers for each row, as the values alternate square then cube
require(reshape)
df$powers <- c('n^2', 'n^3')
output <- cast(df, key ~ powers, value = "val")
output
```
1c. Using the average monthly temperatures for New York's Central Park, compute the max, mean, and min for August.
```r
# R
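# A sketch only: `temps` is assumed to be a data frame with a `month`
# column ('Jan' ... 'Dec') and a `temperature` column of monthly averages;
# the actual Central Park data must be loaded separately.
# conventional R equivalent:
aug <- temps$temperature[temps$month == 'Aug']
c(max(aug), mean(aug), min(aug))

# MapReduce
require(rmr2)
rmr.options(backend = "local") # local or hadoop
# load the August temperatures into HDFS
hdfs.temps <- to.dfs(temps$temperature[temps$month == 'Aug'])
# mapper keys every temperature with the same constant so the
# reducer receives all of August's values as one group
mapper <- function(k, v) {
  keyval('Aug', v)
}
# reducer computes the max, mean, and min over the grouped values
reducer <- function(k, v) {
  keyval(k, c(max(v), mean(v), min(v)))
}
out <- mapreduce(input = hdfs.temps, map = mapper, reduce = reducer)
from.dfs(out)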