Overview

WPS Endpoint

https://cloud.csiss.gmu.edu/smap_service

WPS GetCapabilities

https://cloud.csiss.gmu.edu/smap_service?service=WPS&version=1.0.0&request=GetCapabilities
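The GetCapabilities response is XML listing the processes the service offers. A minimal Python sketch of fetching it and printing the process identifiers (it assumes the standard `requests` library and the usual WPS 1.0.0 OWS namespace for `Identifier` elements):

```python
import requests
import xml.etree.ElementTree as ET

OWS_NS = '{http://www.opengis.net/ows/1.1}'  # namespace used by WPS 1.0.0

def list_process_identifiers(capabilities_xml):
    """Extract all ows:Identifier values from a GetCapabilities response."""
    root = ET.fromstring(capabilities_xml)
    return [e.text for e in root.iter(OWS_NS + 'Identifier')]

url = ('https://cloud.csiss.gmu.edu/smap_service'
       '?service=WPS&version=1.0.0&request=GetCapabilities')
try:
    r = requests.get(url, timeout=30)
    for name in list_process_identifiers(r.content):
        print(name)
except requests.RequestException as err:
    print('Request failed:', err)
```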

Use Examples

Python code example

import requests

# Execute the GetFileByFips process: layer names the product and date to
# retrieve, and fips selects the region (06 = California). The response is
# XML containing a link to the resulting GeoTIFF.
url = 'https://cloud.csiss.gmu.edu/smap_service?service=WPS&version=1.0.0&request=Execute&identifier=GetFileByFips&DataInputs=layer=SMAP-9KM-DAILY-SUB_2020.01.01_013000;fips=06'

r = requests.get(url)

r.content  # the raw XML response body
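The Execute response does not contain the raster itself; it contains a URL pointing to a GeoTIFF, which must be fetched in a second request (the R example below does the same with `grep` and `curl_download`). A sketch of the full round trip in Python, assuming `requests` and a simple regex to locate the `.tif` link in the XML:

```python
import re
import requests

base = ('https://cloud.csiss.gmu.edu/smap_service'
        '?service=WPS&version=1.0.0&request=Execute'
        '&identifier=GetFileByFips'
        '&DataInputs=layer={layer};fips={fips}')

def find_geotiff_url(execute_xml):
    """Pull the GeoTIFF URL out of the Execute response XML, if present."""
    m = re.search(r'https://\S+\.tif', execute_xml)
    return m.group(0) if m else None

layer = 'SMAP-9KM-DAILY-SUB_2020.01.01_013000'
fips = '06'  # California
try:
    r = requests.get(base.format(layer=layer, fips=fips), timeout=60)
    tif_url = find_geotiff_url(r.text)
    if tif_url:
        # Save the GeoTIFF under its original file name
        with open(tif_url.rsplit('/', 1)[-1], 'wb') as f:
            f.write(requests.get(tif_url, timeout=60).content)
except requests.RequestException as err:
    print('Request failed:', err)
```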

R code example

An example of R code for retrieving data from the Crop-CASMA web service:

library(curl) # R package required for downloading the GeoTIFF correctly

year <- 2020 # Specify the year of interest
res <- "SMAP-9KM-DAILY-TOP_%s_AVERAGE" # See name convention to download other variables (keep the DATE as '%s')
DOWN_PATH <- "C:/my/folder/where/to/save/the/files"
stateFIPS <- "-1" # Download at the national level
# stateFIPS <- c("01", "04", "06") # To download separately each state (county FIPS can be listed as well)

init <- as.Date(paste0(year, "-04-01")) # Set the initial date to download
init <- as.character(init + 0:75) # Generate dates for the following 75 days
init <- sprintf(res, gsub("-", ".", init)) # Format the dates to conform to the API's naming convention
grd <- expand.grid(time = init, fip = stateFIPS) # Combine dates and FIPS
url <- 'https://cloud.csiss.gmu.edu/smap_service?service=WPS&version=1.0.0&request=Execute&identifier=GetFileByFips&DataInputs=layer=%s;fips=%s'
url <- unname(mapply(sprintf, url, grd$time, grd$fip)) # List all the urls to download
for (myURL in url) { # Loop over the URLs
  r <- readLines(myURL, warn = FALSE) # Download the XML file
  r <- unlist(strsplit(r, "(<|>)")) # Process the XML file
  r <- grep("https://.*.tif", r, value = TRUE) # Find the url of the GeoTIFF file to download
  curl_download(r, file.path(DOWN_PATH, gsub("^h.*/", "", r))) # Download the file into a specified location
}