
<aside> 📦
You first have to install and load these packages in R.
</aside>
## sf

Our first exercise is to draw the famous and beautiful Wytham Woods near us, which hosts the world's longest-running ecological experiment. Wytham Woods is a semi-natural woodland whose different parts were established at different times. The stands can be classified as N (natural), M (mixed), P-E19 (early 19th-century plantation), P-M19 (mid 19th-century plantation), and P-M20 (mid 20th-century plantation).
<aside> 📥
Download
</aside>
- Use `st_read` to load the correct shapefile and save it as `ww`. How many features does it store?
- Subset `ww` (an `sf` object) using the square bracket notation, and use the `drop` argument to drop geometries.
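A minimal sketch of the loading step (the file name `wytham_woods.shp` is a placeholder; use whatever file you downloaded):

```r
library(sf)

# Read the shapefile into an sf object
# ("wytham_woods.shp" is a placeholder name)
ww <- st_read("wytham_woods.shp")

# The header printed by st_read reports the number of features;
# nrow() gives the same count
nrow(ww)
```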
Try these subsetting expressions and compare the results (note that `drop = TRUE` drops the geometry and returns a plain data frame):

```r
ww[1, ]                                        # the first feature
ww[ww$stand == "P-M19", ]                      # features where stand is "P-M19"
ww[ww$stand == "P-M19", "stand"]               # also select a column
ww[ww$stand == "P-M19", "stand", drop = TRUE]  # drop the geometry
```

- Use [`st_geometry`](https://r-spatial.github.io/sf/reference/st_geometry.html)`(ww)[[1]]` to extract the geometry of the first feature.
- Use `plot(ww)` to quickly visualise the maps of the first few attributes.
- Plot only the `stand` attribute using the square bracket notation.
- Use `ggplot2` to produce a fancier, publication-ready figure.
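For instance, selecting a single column with square brackets still returns an `sf` object (the geometry is kept), so base `plot()` draws one map coloured by that attribute. A sketch, assuming `ww` was loaded as instructed:

```r
# A one-attribute subset of an sf object plots as a single coloured map
plot(ww["stand"])
```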
- `ggplot2` uses layer-based aesthetics, so we will start with an empty grid: `ggplot(ww)`.
- Add the `sf` object onto it, and specify `stand` for the colour coding. Hint: use `geom_sf()` and `aes()`.
- Add `coord_sf()` to make sure the coordinate grid is drawn.
- Use `mutate` and `st_area` to calculate the area of each stand; name it `area_m2`.
- Plot the areas with `ggplot`. They carry units; it might sound counterproductive, but use `drop_units` to get rid of them.
- Download `oaks.csv`. It is a simple CSV with latitudinal and longitudinal coordinates.
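The `ggplot2` and area steps might look like this sketch (`drop_units` is from the `units` package; the geometry column is assumed to be named `geometry`):

```r
library(sf)
library(ggplot2)
library(dplyr)
library(units)

# Layered map: polygons colour-coded by stand, with a graticule
ggplot(ww) +
  geom_sf(aes(fill = stand)) +
  coord_sf()

# st_area() returns values carrying units (m^2); drop_units() strips
# them so the values behave as plain numerics downstream
ww <- ww %>%
  mutate(area_m2 = drop_units(st_area(geometry)))

ggplot(ww) + geom_sf(aes(fill = area_m2))
```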
- Convert the CSV into `sf` points using `st_as_sf`. Remember to set the correct `coords =` and `crs =` arguments.
- Use `geom_sf` to add the trees onto the map of Wytham Woods as black dots.
- Use `st_intersects` to match the points with the polygons. What is the returned object?
- Use `lengths` to calculate the number of points in each polygon.

## terra

We will explore the temperature data across the UK with the famous and useful WorldClim database.
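Before moving on to rasters, the oak-point steps above can be sketched as follows (the column names `lon` and `lat` in `oaks.csv` are assumptions; check the actual header):

```r
library(sf)
library(ggplot2)

oaks <- read.csv("oaks.csv")

# Convert coordinates to sf points; EPSG:4326 is plain lon/lat (WGS84).
# Column names "lon"/"lat" are assumed here.
oaks_sf <- st_as_sf(oaks, coords = c("lon", "lat"), crs = 4326)

# Reproject to the woodland's CRS in case they differ
oaks_sf <- st_transform(oaks_sf, st_crs(ww))

# Overlay the trees as black dots
ggplot(ww) +
  geom_sf() +
  geom_sf(data = oaks_sf, colour = "black", size = 0.5)

# st_intersects() returns a sparse index list: one element per polygon,
# containing the indices of the points that fall inside it
hits <- st_intersects(ww, oaks_sf)
lengths(hits)  # number of oaks per stand polygon
```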
<aside> 📥
Download
</aside>
<aside> 💡
You can actually use the geodata package to directly download WorldClim data like this:
```r
geodata::worldclim_country(country = "United Kingdom", var = "tavg", res = 2.5, path = ".")
```
</aside>
Use `rast()` to read the raster and save it as `tavg`.
Inspect the objects:
- Convert `ww` to a `SpatVector` with `vect()` and save it as `ww.v`.
- What are the classes of `tavg` and `ww.v`?
- Run `res()`, `ext()`, and `crs()` on `tavg`. What do they tell you?
- Use `summary(ww.v)` to inspect the attribute table.
- Use `plot()` to quickly visualise the temperature raster.

It is a multi-layer `SpatRaster` that stores one layer per month. Let's use `mean(tavg)` to create an annual average, `tavg.ann`. Visualise it again.
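A sketch of these raster steps (the file path passed to `rast()` is a placeholder for wherever the WorldClim download ended up):

```r
library(terra)

# Path is a placeholder; point it at the downloaded WorldClim file
tavg <- rast("GBR_wc2.1_2.5m_tavg.tif")

res(tavg)  # cell size (x, y) in map units
ext(tavg)  # spatial extent (bounding box)
crs(tavg)  # coordinate reference system

# tavg has 12 layers (one per month); mean() averages across layers
tavg.ann <- mean(tavg)
plot(tavg.ann)
```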
We can then also download the map of the UK with the rnaturalearth package (South 2017), using the function `ne_states`.
```r
uk <- rnaturalearth::ne_states("United Kingdom", returnclass = "sf")
```
Use `ggplot() + geom_sf()` to visualise this UK map.
You will notice that the extents of the `tavg.ann` raster and the `uk` map are slightly different. We will use `crop()` to crop the raster to the extent of `uk`.
There is a bit of France in `tavg.ann`. We can use `mask()` to convert all values outside the `uk` map to `NA`.
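Cropping trims the rectangular extent; masking then blanks out the cells not covered by the polygons. A sketch:

```r
library(terra)

# crop() only shrinks the extent to match uk; mask() then sets every
# cell outside the uk polygons (e.g. the bit of France) to NA
tavg.ann <- crop(tavg.ann, vect(uk))
tavg.ann <- mask(tavg.ann, vect(uk))
plot(tavg.ann)
```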
The `aggregate()` function can be used to aggregate groups of cells of a raster in order to create a new raster with a lower resolution (thus larger cells). It takes two arguments: `fact =`, the aggregation factor expressed as the number of cells in each direction (both horizontally and vertically), or as two integers giving the horizontal and vertical factors separately; and `fun =`, the function used to aggregate the values.
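For example, to coarsen the raster by a factor of 10 in each direction, averaging the grouped cells (the factor here is an arbitrary choice for illustration):

```r
# Each new cell covers a 10 x 10 block of original cells;
# fun = mean averages each block, na.rm = TRUE ignores missing cells
tavg.coarse <- aggregate(tavg.ann, fact = 10, fun = mean, na.rm = TRUE)
plot(tavg.coarse)
```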
We can use `centroids()` to calculate the centroid of each district in the UK (each polygon), but we first have to convert `uk` to a vectorised form with `vect()`. It is then a `SpatVector` object.
We can then extract the raster values at a set of points using the `extract()` function. Here we will extract the mean annual temperature at the centroid of each district.
- Plot `tavg.ann` again.
- Plot the centroids of `vect(uk)` on top with `add = TRUE`.
- Use `crds()` to extract the coordinates of those centroids, and add them to the plot with `points()`.
- Use `extract()` with the raster as the first argument and the centroids (a `SpatVector` or a data frame of coordinates) as the second argument.

We can also use the same `extract()` function to obtain the average raster values of `tavg.ann` by polygons in `uk`. We will save this as a feature inside `uk`.
```r
uk$tavg.ann <- extract(tavg.ann, vect(uk), mean, na.rm = TRUE)$mean
ggplot(uk) + geom_sf(aes(fill = tavg.ann))
```
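The centroid steps above might look like this sketch:

```r
library(terra)

uk.v <- vect(uk)          # convert sf to SpatVector
cent <- centroids(uk.v)   # one point per district polygon

plot(tavg.ann)            # the raster...
plot(cent, add = TRUE)    # ...with the centroids drawn on top

xy <- crds(cent)          # matrix of centroid coordinates
points(xy, pch = 16, cex = 0.5)

# Raster first, then the points; returns a data frame of values
extract(tavg.ann, cent)
```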
We could also have computed the average with each cell weighted by the fraction of its area covered by the polygon; simply set `weights = TRUE`.
```r
uk$tavg.ann.weighted <- extract(tavg.ann, vect(uk), mean, weights = TRUE, na.rm = TRUE)$mean
ggplot(uk) + geom_sf(aes(fill = tavg.ann.weighted))
```
<aside> 🎉
Congratulations! You have now learned basic manipulation and visualisation of vector and raster data with the two most important packages, `sf` and `terra`. Drawing a map in R is no longer a challenge!
</aside>