Mapping traceroutes with Perl

I was playing around on my smartphone, looking for interesting apps, and found one that displays the trace of a request to a server (e.g. www.google.com). Since I didn’t really need it, though, it was too expensive, and I thought: “OK, let’s do it myself!” So I wrote a little Perl script and uploaded it to GitHub. The script traces the route of a request and either produces a static Google map as PNG or writes a GPX file, which can be used in GIS software or Google Maps.

Continue reading “Mapping traceroutes with Perl”

Retrieving and reprojecting MODIS satellite products

Data from the MODIS satellites are a great source for comparison with vegetation models and also for teaching. However, they come in a format I am not really used to and in a strange projection. The format is HDF, of which several versions even exist. There are several tools to convert it to a file format that suits me better, but HDF needs much less space than other formats, so it makes sense to store satellite data as HDF.

First I need to know which data I want and for which region. Just recently a colleague asked me to get the fire product (MCD45A1) for Vietnam for him, so I use that as the example here.

On the map provided on the MODIS overview webpage one can try to select the correct row (v) and column (h) values. However, as I already said, it is a strange projection, so I reprojected the grid to a standard equirectangular projection:

[Map: the MODIS sinusoidal grid, reprojected to equirectangular]

Even with that map it is not easy to select the right h/v values, but with both maps together it worked. I set up a Perl script using my module WebService::MODIS, which is also available on CPAN. Here is an example script showing how to download the desired satellite data:

#!/usr/bin/env perl
use warnings;
use strict;
use WebService::MODIS;
readCache; # or "initCache;" if not done yet
my $fire = WebService::MODIS->new();
$fire->product("MCD45A1");
$fire->version('051');
$fire->h([27,28]);
$fire->v([6,7,8]);
$fire->dates(["2007.01.01", "2014.12.31"]);
$fire->createUrl;
#print "$_\n" foreach $fire->url;
$fire->download();
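WebService::MODIS cannot yet pick the tile numbers from coordinates (that is on my to-do list), but the sinusoidal tiling math is simple enough to sketch. Here is a quick Python sketch of it; the function name is my own, and the constants are my reading of the MODIS grid definition (36 × 18 tiles on a sphere of radius 6371007.181 m):

```python
import math

# Assumed MODIS sinusoidal grid constants:
R = 6371007.181            # sphere radius used by the MODIS projection [m]
T = 1111950.5196666666     # width/height of one tile [m] (36 x 18 tiles)
XMIN = -20015109.354       # western edge of the grid [m]
YMAX = 10007554.677        # northern edge of the grid [m]

def modis_tile(lat, lon):
    """Return the (h, v) tile indices containing the given lat/lon in degrees."""
    x = R * math.radians(lon) * math.cos(math.radians(lat))  # sinusoidal x
    y = R * math.radians(lat)                                # sinusoidal y
    h = int(math.floor((x - XMIN) / T))
    v = int(math.floor((YMAX - y) / T))
    return h, v

print(modis_tile(21.0, 105.0))   # northern Vietnam -> (27, 6)
print(modis_tile(-10.5, -62.0))  # Rondonia, Brazil -> (11, 10)
```

The two test points agree with the h/v ranges used in the scripts here (h 27–28, v 6–8 for Vietnam; h 11–12, v 9–10 for Rondônia), which is a good sanity check for the constants.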

Now the files are in the current working directory. I do the mosaicking and reprojecting with ModisDownload.R. You may ask: “Why not also download the data with the R script?” I did that in the beginning and it was very frustrating, since it crashed or did not start downloading at all, and resuming a download was not possible; hence the Perl module. To allow the downloaded files and the reprojected output to live in different directories, I modified the original ModisDownload.R. Here is the diff of lines 184 and 185 (version 3.2, 13 Jan. 2014):

- write(paste(getwd(),"/",hdfNames[1], sep=""), mosaicname)
- for (j in 2:length(hdfNames)) write(paste(getwd(),"/",hdfNames[j], sep=""),mosaicname,append=T)
+ write(hdfNames[1], mosaicname)
+ for (j in 2:length(hdfNames)) write(hdfNames[j],mosaicname,append=T)

Now the filenames given to the mosaicHDF function must contain the full path to the files if it is called from another directory. ModisDownload.R uses the MODIS Reprojection Tool (MRT), which is written in Java. Last year we used it for teaching; however, installing it under Windows is a pain in the … Therefore we preprocessed the data this year and supplied the students with GeoTiffs. Here is the R script that processes the downloaded HDF files and creates one GeoTiff per time slice in UTM projection in the current working directory:

MRT.PATH <- "/path/to/MRT/bin"
MODISDOWNLOAD.DIR <- "/path/to/ModisDownload.R"
source(paste(MODISDOWNLOAD.DIR, "ModisDownload.R", sep="/"))
MODIS <- list(product="MCD45A1", bands="1 0 0 0 0 0 0 0", pixel.size=500)
DOWNLOAD.DIR <- "/path/to/downloaded/MODIS/files"
OUTPUT.DIR <- "/path/to/large/disk/for/GeoTIFFs"
setwd(OUTPUT.DIR)
downloaded <- list.files(DOWNLOAD.DIR, pattern="hdf$", full.names=TRUE)
if (length(downloaded) != 0) {
  yyyyddd <- sub("^.*\\.A", "", downloaded)
  yyyyddd <- unique(sub("\\..*$", "", yyyyddd))
  year <- trunc(as.integer(yyyyddd) / 1000)
  day.of.year <- as.integer(yyyyddd) - year * 1000
  dates <- as.Date(paste(year,"-01-01", sep=""), format="%Y-%m-%d") + day.of.year - 1
  for (j in 1:length(dates)) {
    in.hdf <- downloaded[grepl(paste("A", yyyyddd[j], sep=""), downloaded)]
    out.hdf <- paste(MODIS[["product"]], "_", dates[j], ".hdf", sep="")
    out.tif <- paste(MODIS[["product"]], "_", dates[j], ".tif", sep="")
    if (length(in.hdf)>1) {
      mosaicHDF(in.hdf, out.hdf, MRTpath=MRT.PATH)
    } else {
      file.symlink(in.hdf, out.hdf)
    }
    reprojectHDF(out.hdf,
      out.tif,
      MRTpath=MRT.PATH,
      proj_type="UTM",
      proj_params="0 0 0 0 0 0 0 0 0 0 0 0 0 0 0",
      utm_zone=0,
      datum="WGS84",
      bands_subset=MODIS[["bands"]],
      pixel_size=MODIS[["pixel.size"]])
    ## unlink(out.hdf)
  }
}

Now I have a time series of GeoTiffs in UTM projection that I can work with.

Downloading MODIS data with WebService::MODIS

For teaching “Remote sensing in global ecology” with R at the University of Frankfurt, a colleague and I need a lot of different MODIS products for several regions of the world. Last year we used ModisDownload.R. It is a great tool: it can download the data and, if the MODIS Reprojection Tool (MRT) is installed, it merges several tiles together and transforms them to other geographical projections. However, it was a bit frustrating, since it stopped downloading quite often and does not support resuming downloads. So I decided to write my own routine for downloading the MODIS products. For inexperienced users, like our students last year, it is quite difficult to install and run MRT under Windows, so this year we will supply them with ready-made GeoTiffs. For this preprocessing I still use ModisDownload.R as an interface to MRT.

After downloading the module from CPAN and installing it and its prerequisites with the usual

perl Makefile.PL
make
make test
make install

you can choose which MODIS products you need for which regions and start writing the Perl program to download the data. In the Perl program you first need to load the module with

use WebService::MODIS;

What can my module do?

The module first needs to retrieve the directory structure of the server where the MODIS data reside. This is done with the function

initCache();

This takes a few seconds to minutes. If you want, you can write the cached server structure to disk for future use, since it is otherwise lost once you close the current Perl process. The following function writes the cache to disk:

writeCache();

This function is also home to one of the bugs I am aware of: so far it only works under Linux, because I do not know which environment variable to use for user-space configuration files under macOS or even Windows. I would be happy if someone could help me here.
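To sketch a possible answer to my own question: the usual conventions are %APPDATA% on Windows, ~/Library/Application Support on macOS, and $XDG_CONFIG_HOME (falling back to ~/.config) on Linux and other Unixes. Here is the idea in Python (function name and fallbacks are my own choices; the same logic should port to Perl, e.g. with the help of File::HomeDir):

```python
import os
import sys

def user_config_dir(app="WebService-MODIS"):
    """Return a per-user configuration directory for the current platform."""
    home = os.path.expanduser("~")
    if sys.platform.startswith("win"):
        # Windows: roaming application data
        base = os.environ.get("APPDATA", os.path.join(home, "AppData", "Roaming"))
    elif sys.platform == "darwin":
        # macOS: no environment variable, a fixed per-user location
        base = os.path.join(home, "Library", "Application Support")
    else:
        # Linux and other Unixes: XDG base directory specification
        base = os.environ.get("XDG_CONFIG_HOME", os.path.join(home, ".config"))
    return os.path.join(base, app)
```

With such a helper the cache file would land in a sensible place on each platform instead of being hard-coded for Linux.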

Once you start a new Perl process, you can load the data from the saved configuration files with

readCache();

Optionally you can supply the writeCache and readCache functions with your own path if you are not happy with the default one.

If you know which MODIS products, which tiles, and which time period you want, you can now either create a download list for use in other download managers or use the built-in download function. There are two ways to create a new object:


my $lct = WebService::MODIS->new(product => "MCD12Q1",
                                 dates => ['2001-01-01', '2010-01-01'],
                                 ifExactDates => 1,
                                 h => [11,12], v => [9,10]);

or

my $lct = WebService::MODIS->new();
$lct->product("MCD12Q1");
$lct->h([11,12]);
$lct->v([9,10]);
$lct->dates(["2001-01-01", "2010-01-01"]);
$lct->ifExactDates(1);

Either variant creates an object for the MODIS land cover product MCD12Q1 covering Rondônia, Brazil, for the two years 2001 and 2010. The object does not yet contain the details about the URLs; for that, the server needs to be contacted again with

$lct->createUrl;

Now the link list for the $lct object is available and can be used either to print the list of URLs

print "$_\n" foreach $lct->url;

or to download the data (>700MB!):

$lct->download();

The download function can take the destination path as an optional argument.

About my first experience with “The Comprehensive Perl Archive Network” (CPAN)

After reading two other webpages I got a rough picture of what I had to do to publish a module on CPAN. I used

h2xs -AXc -n ModisDownload

for creating a skeleton of the module, which I then filled with my code. As you can see, I initially called my module “ModisDownload”, like the R library that inspired it. The default documentation h2xs generates is a real motivation to write your own, since it would not have reflected well on me had I left it unedited.

After that I applied for a Perl Authors Upload Server (PAUSE) account, which I got really quickly, with the hint to first publish my module on PrePAN, a prerelease server where experienced CPAN users help improve other people’s modules. I got one comment: rename the module, and the API is not very “perlish”. The first point was fine and quickly applied. The second part was not very constructive; still, I tried my best to make the API more perlish by giving it an object-oriented design. Constructive comments for further improvements are welcome. The cache functions in particular still have the old, unperlish(?) API.

Then I finally uploaded the module to CPAN. The module is first sent out to “CPAN Testers”, a group of volunteers who test new modules. The next day I got the report from these testers and, what the f…, all tests had status “FAIL”. I must shamefully admit that only then did I run “make test” myself for the first time, and it failed: it still tried to load the module “ModisDownload”, but since I had renamed it to “WebService::MODIS”, it had to fail. I fixed it, extended the test a little, and tested the test myself before I submitted a new version of my module to CPAN.

Now I hope there are not that many bugs left in the code. I might extend the functionality somewhat, so that GDAL can be used for the transformation, or so that the horizontal and vertical sinusoidal tile numbers can be selected by latitude and longitude; and if I get other suggestions, I will see what is possible.

Two hours later: in the meantime it has already reached version 1.2. Looking at the public CPAN page, I realized that the dependencies on other modules were missing. So I added these missing dependencies to Makefile.PL, which has to be done by hand.