5. Geo Services Guide

This guide covers the details of geo data management in rasdaman, which is supported through a separate component called petascope. Further components are concerned with data ingestion and CRS definition management.

petascope implements the following OGC interface standards:

For this purpose, petascope maintains additional metadata (such as georeferencing), which is kept in separate relational tables. Note that not all rasdaman raster objects and collections are available through petascope by default: essentially, only those that have been ingested via WCS-T are.

Petascope is implemented as a war file of servlets that give access to coverages (in the OGC sense) stored in rasdaman. Internally, incoming requests that require coverage evaluation are translated by petascope into rasql queries. These queries are passed on to rasdaman, which constitutes the central workhorse; finally, the results returned from rasdaman are forwarded to the client.

5.1. Servlet endpoints

Once the petascope servlet is deployed (see the installation guide), the following service endpoints are available:

  • /rasdaman: context path

    • rasdaman/ows: serving OGC Web Services (OWS) like WCS, WCPS, WMS and WCS-T;
    • rasdaman/rasql: direct rasql queries.

For example, assuming that the service’s IP address is 123.456.789.1 and the service port is 8080, the following request URLs would deliver the Capabilities documents for OGC WMS and WCS, respectively:
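Assuming the endpoints above, such Capabilities requests might look like the following (the service, version and request parameters are as defined by the respective OGC standards):

```
http://123.456.789.1:8080/rasdaman/ows?service=WMS&version=1.3.0&request=GetCapabilities
http://123.456.789.1:8080/rasdaman/ows?service=WCS&version=2.0.1&request=GetCapabilities
```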



5.2. The world of coverages

5.2.1. Offset vectors and coefficients

In ISO and OGC specifications, a coverage is defined as a function from a spatial and/or temporal domain to an attribute range. Such a domain can represent either vector or raster datasets, covering a variety of topologies like images, point clouds, polygons, triangulated irregular networks, etc. In the Earth Observation (EO) domain, coverages are usually modeled as a geometric grid of points, a.k.a. grid coverages.

Grid coverages are a network of points connected by lines, retaining a gridded structure. Their geometric domain can either be expressed in an analytic form ((geo)rectified grids), or they need non-affine transforms to translate the internal indexed grid to the external (geo) Coordinate Reference System (CRS). Grid coverages inherently have a dimensionality, determined by the number of their axes. These axes are not to be confused with the axes of the CRS, which define the dimensionality of the coordinate tuples of each grid point. Indeed, the dimensionality of a grid is not necessarily equal to the dimensionality of its CRS (it is, however, never greater, for geometric reasons): a 2D grid, possibly oblique, can always be embedded in a 3D CRS, for instance.

Petascope currently supports grid topologies whose axes are aligned with the axes of the CRS. Such kind of grids are a subset of GML rectified grids, with the constraint that the offset vectors – the vectors which determine the (fixed) relative geometric distance between grid points along a grid axis – need to be parallel to an axis of the (external) CRS. In such cases, an offset vector can be regarded as resolution of the grid along an axis.

Rectified grids with non-aligned grid axis / offset vectors are not (yet) supported.

▶ show

In addition (starting from version 9.0.0), Petascope supports aligned grids with arbitrary (irregular) spacing between points along one or more (or all) grid axes. This kind of geometry is a subtype of (geo)referenceable grids and can be defined by attaching a set of coefficients (or weights) to an offset vector. The series of coefficients determines how many offset vectors away from the grid origin each grid point lies, and their cardinality must coincide with the number of grid points along that axis. Rectified grids (conceivable as a subset of referenceable grids in an Euler diagram) have an inherent series of incremental integer coefficients attached to each offset vector, so that e.g. the third point along axis 0 is computed as [GO + 2*v0] (indexes start from 0).

A graphical example:

▶ show

In this example, the grid is still aligned with the CRS axes E/N, but the spacing is irregular along grid axis 0. We then need to explicitly define a series of 4 coefficients (one for each grid point along axis 0) that weight their distance from the grid origin (in terms of v0): in our case the weights are c0={0, 1, 1.5, 3.5}. Indeed, the point P in the graphical example above – which has internal (rasdaman) grid coordinates {3,2} (the origin being {0,0}) – can hence be geometrically expressed as (GO + c0[3]*v0 + 2*v1) = (GO + 3.5*v0 + 2*v1).
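The computation of point P can be sketched as follows (the grid origin and offset vectors are made-up values for illustration; only the coefficients c0 come from the example above):

```python
# Geo position of a grid point on a grid that is irregular along axis 0
# (with coefficients c0) and regular along axis 1.
GO = (10.0, 50.0)       # hypothetical grid origin (E, N)
v0 = (1.0, 0.0)         # offset vector along grid axis 0
v1 = (0.0, -1.0)        # offset vector along grid axis 1
c0 = [0, 1, 1.5, 3.5]   # coefficients for the irregular axis 0

def geo_position(i, j):
    """Geo coordinates of grid point {i, j}: GO + c0[i]*v0 + j*v1."""
    return (GO[0] + c0[i] * v0[0] + j * v1[0],
            GO[1] + c0[i] * v0[1] + j * v1[1])

print(geo_position(3, 2))  # point P -> (13.5, 48.0)
```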

Note that the irregular spacing must be fixed for each grid line along a certain grid axis. Otherwise, the referenceable grid becomes warped and the domain needs to be addressed with explicit CRS coordinates for each single grid point (look-up tables).


In petascope, only grids whose lines are rectilinear and aligned with a Cartesian CRS are supported. This means: no rotated or warped (curvilinear) grids.

5.2.2. Grid axis labels and CRS axis labels

Now that the difference between a grid axis and a CRS axis has been clarified, we address the issue of determining (and customizing) the axis labels of a coverage in Petascope.

When importing a coverage, a spatio-temporal CRS needs to be assigned to it, in order to give a meaning to its domain. Composition of CRSs is possible via the OGC SECORE CRS resolver. For instance a time-series of WGS84 images can have the following native CRS:
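For example, such a compound CRS could be expressed as follows (the crs-compound URI scheme as used by SECORE; the specific components, a temporal AnsiDate CRS plus EPSG:4326, are illustrative):

```
http://www.opengis.net/def/crs-compound?
    1=http://www.opengis.net/def/crs/OGC/0/AnsiDate&
    2=http://www.opengis.net/def/crs/EPSG/0/4326
```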


Note: currently gml:CompoundCRS is not supported (#679), so, for example, http://www.opengis.net/def/crs/EPSG/0/7415 would have to be represented by composing its components using the same format as above, i.e.
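e.g., assuming EPSG:7415 is decomposed into its horizontal and vertical components (EPSG:28992 and EPSG:5709):

```
http://www.opengis.net/def/crs-compound?
    1=http://www.opengis.net/def/crs/EPSG/0/28992&
    2=http://www.opengis.net/def/crs/EPSG/0/5709
```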


In order to verify the CRS assigned to a coverage offered by Petascope, there are several ways:

  1. check the wcs:CoverageSummary/ows:BoundingBox@crs attribute in a WCS GetCapabilities response;
  2. check the @srsName attribute in the @{gml:SRSReferenceGroup} attributes group in WCS DescribeCoverage response (gml:domainSet);
  3. use the WCPS function crsSet();

It is important to understand that the assigned CRS automatically determines the CRS axis labels (and all other axis semantics like direction and unit of measure), and these are the same labels targeted in the subsets of WCS and WCPS requests. Such labels correspond to the gml:axisAbbrev elements in the CRS definition (mind that ellipsoidal Coordinate Systems (CS) do not count in the case of projected CRSs, which build a further CS on top of them).

This excerpt from the CRS definition of the WGS84 / UTM zone 33N projection shows how the first axis defined by this CRS is the easting, with label E and metres m as Unit of Measure (UoM, see gml:CoordinateSystemAxis@uom link):

▶ show

Since only aligned grids are supported, we decided to assign the same CRS axes labels to the grid axes. Such labels are listed in the gml:domainSet/gml:axisLabels element of a WCS coverage description, and are not to be confused with the labels of the CRS axes, which are instead listed in the @{gml:SRSReferenceGroup} attributes group, as said.

Indeed, although the labels of grid and CRS axes are the same, their order can differ. Many geographic CRSs (like the well-known WGS84 / EPSG:4326) define latitude first, whereas it is GIS practice to always place longitude first, just like rasdaman does when storing the multidimensional arrays (marrays).

With regard to this long-standing issue, Petascope strictly keeps the CRS axis order of the CRS definition when it comes to GML, whereas GIS order (longitude first) is kept for binary encodings like GeoTIFF or NetCDF, so as to keep metadata consistent with common GIS libraries (e.g. GDAL). On the other hand, the order of grid axis labels needs to follow the internal grid topology of the marrays inside rasdaman.

To make things clearer, here is an excerpt of the GML domain of our 3D systemtest coverage eobstest (a regular time series of EO imagery):


The CRS of the coverage is an (ordered) composition of a temporal CRS (a linear count of days [d] from the epoch 1950-01-01T00:00:00) and a geospatial CRS where latitude is defined first (the well-known EPSG:4326). This means that every tuple of spatio-temporal coordinates in the coverage’s domain will be a 3D tuple listing the count of days from the 1st of January 1950, then latitude degrees, then longitude degrees, as shown in the gml:origin/gml:pos element: the origin of the 3D grid is set on the 1st of January 1950, 75.5 degrees north and 25 degrees east (with respect to the origin of the Cartesian CS defined in EPSG:4326).

Grid coordinates instead follow the internal grid space, which is not aware of any spatio-temporal attributes and follows the order of the axes as they are stored in rasdaman: in the example, the collection is a 6x101x232 marray, having t (time) as first axis, then Long, then Lat. The spatio-temporal coordinates are instead expressed following the order of the CRS definition, hence with latitude degrees before longitudes.

A final remark goes to the customization of CRS (and consequently grid) axis labels, which can be particularly needed for temporal CRSs, especially in case of multiple time axes in the same CRS. Concrete CRS definitions are a static XML tree of GML elements defining axes, geographic coordinate systems, datums, and so on. The candidate standard OGC CRS Name-Type Specification offers a new kind of CRS, a parameterized CRS, which can be bound to a concrete definition (a CRS template) and which offers customization of one or more GML elements directly via key-value pairs in the query component of the HTTP URL identifying the CRS.

As a practical example, we propose the complete XML definition of the parameterized CRS defining ANSI dates, identified by the URI http://rasdaman.org:8080/def/crs/OGC/0/AnsiDate:

▶ show

This single-parameter definition allows customization of the concrete CRS template OGC:.AnsiDate-template (identified by http://rasdaman.org:8080/def/crs/OGC/0/.AnsiDate-template) on its unique axis label (crsnts:parameter/crsnts:target), via a parameter labeled axis-label with default value ansi.

This way, when we assign this parameterized CRS to a coverage, we can either leave the default ansi label to the time axis, or change it to some other value by setting the parameter in the URL query:

  • default ansi axis label: http://rasdaman.org:8080/def/crs/OGC/0/AnsiDate
  • custom ansi_date axis label: http://rasdaman.org:8080/def/crs/OGC/0/AnsiDate?axis-label="ansi_date"

5.2.3. Coverage Implementation Schema (CIS 1.0 and CIS 1.1) in petascope

CIS specifies the OGC coverage model by establishing a concrete, interoperable, conformance-testable coverage structure, regardless of data format encoding, down to the level of single “pixels” or “voxels”.

Coverages can be encoded in any suitable format (such as GML, JSON, GeoTIFF, or netCDF). Coverages are independent from service definitions and, therefore, can be accessed through a variety of OGC services types, such as the Web Coverage Service (WCS) Standard.

Since rasdaman version 9.7, besides CIS 1.0 for WCS version 2.0.1, petascope also supports CIS 1.1 for WCS version 2.1.0 with these conformance classes:

  • Class coverage.
  • Class grid-regular (in CIS 1.0: GridCoverage and RectifiedGridCoverage coverage types).
  • Class grid-irregular (only supports CIS::IrregularAxis, in CIS 1.0: ReferenceableGridCoverage coverage type).
  • Class gml-coverage: for WCS version 2.1.0 only, petascope allows transforming CIS 1.0 coverage types to CIS 1.1 in GML format via a new non-standard extra request parameter outputType=GeneralGridCoverage.
  • Class other-format-coverage.
  • Class multipart-coverage.

5.2.4. Subsets in Petascope

We will describe how subsets (trims and slices) are treated by Petascope. Before this, you will have to understand how the topology of a grid coverage is interpreted with regard to its origin, its bounding box and the assumptions on the sample spaces of the points. Some practical examples will be proposed.

Geometric interpretation of a coverage

This section focuses on how the topology of a grid coverage is stored and how Petascope interprets it. When it comes to the so-called domainSet of a coverage (hereby also called domain, topology or geometry), Petascope follows pretty much the GML model for rectified grids: the grid origin and one offset vector per grid axis are enough to deduce the full domainSet of such (regular) grids. When it comes to referenceable grids, the domainSet is still kept in a compact vectorial form by adding weighting coefficients to one or more offset vectors.

As per the GML standard (http://www.opengeospatial.org/standards/gml), a grid is a “network composed of two or more sets of curves in which the members of each set intersect the members of the other sets in an algorithmic way”. The intersections of the curves are represented by points: a point is 0D and is defined by a single coordinate tuple.

A first question arises on where to put the grid origin. The GML and GMLCOV standards say that the mapping from the domain to the range (feature space, payload, values) of a coverage is specified through a function, formally a gml:coverageFunction. From the GML standard: “If the gml:coverageFunction property is omitted for a gridded coverage (including rectified gridded coverages) the gml:startPoint is considered to be the value of the gml:low property in the gml:Grid geometry, and the gml:sequenceRule is assumed to be linear and the gml:axisOrder property is assumed to be +1 +2”.


In the image, it is assumed that the first grid axis (+1) is the horizontal axis, while the second (+2) is the vertical axis; the grid starting point is the full diamond. Rasdaman uses its own grid function when listing cell values, linearly spanning the outer dimensions first, then proceeding to the innermost ones. To make it clearer, this means column-major order.

In order to have a coherent GML output, a mapping coverage function is then declared. In a hypothetical 3D response, it can look like this:

▶ show
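A sketch of such a declaration (GML 3.2 element names; the axis order and start point values are illustrative):

```xml
<gml:coverageFunction>
  <gml:GridFunction>
    <!-- linear sequence rule spanning the outer grid dimension first -->
    <gml:sequenceRule axisOrder="+3 +2 +1">Linear</gml:sequenceRule>
    <gml:startPoint>0 0 0</gml:startPoint>
  </gml:GridFunction>
</gml:coverageFunction>
```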

Coming back to the question of where to put the origin of our grid coverages, we have to make it coincide with what the starting value represents in rasdaman: the marray origin. As often done in GIS applications, the origin of an image is set to be its upper-left corner: this means that the origin of our rectified and referenceable grid coverages shall be there too, in order to provide a coherent GML/GMLCOV coverage. Note that placing the origin in the upper-left corner of an image means that the offset vector along the northing axis will point south, hence will have a negative norm (in case the direction of the CRS axis points north!).

When it comes to further dimensions (a third elevation axis, time, etc.), the position of the origin depends on the way the data has been ingested. Taking the example of a time series: if the marray origin (which we can denote as [0:0:…:0], though it is more precisely described as [dom.lo[0]:dom.lo[1]:…:dom.lo[n]]) is the earliest moment in time, then the grid origin will be the earliest moment in the series too, and the offset vector in time will point to the future (positive norm); in the other case, the origin will be the latest time in the series, and its vector will point to the past (negative norm).

To summarize, in any case the grid origin must point to the marray origin. This is important in order to properly implement our linear sequence rule.

A second question arises on how to treat coverage points: are they points or are they areas? The formal ISO term for the area of a point is sample space; we will also refer to it as footprint or area. The GML standard provides guidance on the way to interpret a coverage: “When a grid point is used to represent a sample space (e.g. image pixel), the grid point represents the center of the sample space (see ISO 19123:2005, 8.2.2)”.

In spite of this, there is no formal way to describe GML-wise the footprint of the points of a grid. Our current policy applies distinct choices separately for each grid axis, in the following way:

  • regular axis: when a grid axis has equal spacing between each of its points, then it is assumed that the sample space of the points is equal to this spacing (resolution) and that the grid points are in the middle of this interval.
  • irregular axis: when a grid axis has an uneven spacing between its points, then there is no (currently implemented) way to either express or deduce its sample space, hence 0D points are assumed here (no footprint).

It is important to note that sample spaces are only meaningful when areas are legal in the Coordinate Reference System (CRS): this is not the case for Index CRSs, where the allowed values are integers only. Even on regular axes, points in an Index CRS can only be points, and hence will have a 0D footprint. In practice, this policy translates to a point-is-pixel-center interpretation of regular rectified images.

The following art explains it visually:

▶ show

The left-side grid is the GML coverage model for a regular grid: it is a network of (rectilinear) curves, whose intersections determine the grid points ‘+’. The description of this model is what petascopedb knows about the grid.

The right-hand grid is instead how Petascope interprets the information in petascopedb, and hence is the coverage that is seen by the end user. You can see that, since this is a regular grid, sample spaces (pixels) are added in the perception of the coverage, causing an extension of the bbox (gml:boundedBy) of half a pixel on all sides. The width of a pixel is assumed to be equal to the (regular) spacing of the grid points, hence each pixel is of size |v_0| x |v_1|, where |·| is the norm operator.
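The half-pixel extension of the bbox can be sketched numerically (a hypothetical regular axis with 4 grid points; the values are illustrative):

```python
# Point-is-pixel-center: a regular axis with 4 grid points spaced by
# `res`; the reported bbox extends half a pixel beyond the outermost
# grid point centres on each side.
res = 0.05
points = [112.0 + i * res for i in range(4)]   # grid point centres

bbox_lo = points[0] - res / 2    # 111.975
bbox_hi = points[-1] + res / 2   # 112.175
print(bbox_lo, bbox_hi)
```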

As a final example, imagine that we take this regular 2D pattern and we build a stack of such images on irregular levels of altitude:

▶ show

In petascopedb we will need to add another axis to the coverage topology, assigning a vector ‘v_2’ to it (we support gmlrgrid:ReferenceableGridByVectors only, hence each axis of any kind of grid will have a vector). Weighting coefficients will then determine the height of each new z-level of the cube: such heights are encoded as distances from the grid origin ‘#’, normalized by the offset vector v_2. Please note that the vector of northings v_1 is not visible due to the 2D perspective: the image shows the XZ plane.

Regarding the sample spaces, while petascope will still assume the points are pixels on the XY plane (eastings/northings), it will instead assume 0D footprint along Z, that is along height: this means that the extent of the cube along height will exactly fit to the lowest and highest layers, and that input Z slices will have to select the exact value of an existing layer.

The latter would not hold on regular axes: this is because input subsets target the sample spaces, and not just the grid points; this is covered more deeply in the following section.

Input and output subsettings

This section covers two different facets of the interpretation and usage of subsets: how they are formalized by Petascope and how they are adjusted. Trimming subsets ‘lo,hi’ are mainly covered here: slices pose fewer interpretative issues.

A first point is whether an interval (a trim operation) should be (half) open or closed. Formally speaking, this determines whether the extremes of the subset should or shouldn’t be considered part of it: (lo,hi) is an open interval, [lo,hi) is a (right) open interval, and [lo,hi] is a closed interval. Requirement 38 of the WCS Core standard (OGC 09-110r4) specifies that a subset is a closed interval.

A subsequent question is whether to apply the subsets to the coverage points or to their footprints. While the WCS standard does not provide recommendations, we decided to target the sample spaces, as this is a much more intuitive behavior for users who may not know the internal representation of an image and do not want to lose the “half pixel” that would inevitably get lost if footprints were ignored.

We also consider here “right-open sample spaces”, so the borders of the footprints are not all part of the footprint itself: this means that two adjacent footprints will not share the border, which will instead belong to the greater point (so typically on the right side in the CRS space). A slice exactly on that border will then pick the right-hand “greater” point only. Border-points instead always include the external borders of the footprint: slices right on the native BBOX of the whole coverage will pick the border points and will not return an exception.
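A sketch of this point-selection rule (a hypothetical regular axis; the resolution and subset values are made up):

```python
# Regular axis: points at centres 112.0, 112.05, ...; the footprint of
# each point spans [centre - res/2, centre + res/2), i.e. right-open.
# A trim [lo, hi] selects every point whose footprint intersects the
# closed subset interval.
res = 0.05
centres = [112.0 + i * res for i in range(4)]

def trim(lo, hi):
    return [c for c in centres
            if c - res / 2 <= hi and c + res / 2 > lo]

print(trim(112.06, 112.06))   # slice inside one pixel -> [112.05]
```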

Having clarified this, the last point is how the coverage bounds are set before shipping, with respect to the input subsets, i.e. whether our service should return the requested bounding box or the minimal bounding box.

Following the (strong) encouragement in the WCS standard itself (requirement 38, WCS Core), Petascope fits the input subsets to the extents of the sample spaces (e.g. to the pixel areas), thus returning the minimal bounding box. This means that the input bbox will usually be extended to the next footprint border. This is also a consequence of our decision to apply subsets on footprints: a value which lies inside a pixel will always select the associated grid point, even if the position of the grid point is actually outside of the subset interval.

Examples

In this section we examine the interpretation of subsets by petascope by applying different subsets to a single dimension of a 2D coverage. To appreciate the effect of sample spaces, we will first assume regular spacing on the axis, and then an irregular axis with 0D footprints.

Test coverage information:

mean_summer_airtemp (EPSG:4326)
Size is 886, 711
Pixel Size = (0.050000000000000,-0.050000000000000)
Upper Left  ( 111.9750000,  -8.9750000)
Lower Left  ( 111.9750000, -44.5250000)
Upper Right ( 156.2750000,  -8.9750000)
Lower Right ( 156.2750000, -44.5250000)

From this geo-information we deduce that the grid origin, which has to be set in the upper-left corner of the image, in the centre of the pixel area, will be:

origin(mean_summer_airtemp) = [ (111.975 + 0.025) ,  (-8.975 - 0.025) ]
                            = [  112.000          ,   -9.000          ]
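In general, starting from the upper-left corner and pixel sizes as reported by e.g. gdalinfo, the origin can be computed by moving half a pixel inwards:

```python
# Values for mean_summer_airtemp as listed above
ul_e, ul_n = 111.975, -8.975   # upper-left corner (E, N)
res_e, res_n = 0.05, -0.05     # pixel size (negative along northing)

# Centre of the upper-left pixel = grid origin
origin_e = ul_e + res_e / 2    # 112.0
origin_n = ul_n + res_n / 2    # -9.0
print(origin_e, origin_n)
```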

Regular axis: point-is-area

▶ show

Applying these subsets to mean_summer_airtemp will produce the following responses:

▶ show

Irregular axis: point-is-point

▶ show

Applying these subsets to mean_summer_airtemp will produce the following responses:

▶ show

5.2.5. CRS management

Petascope relies on a SECORE Coordinate Reference System (CRS) resolver that can provide proper metadata on coverages’ native CRSs. One can either deploy a local SECORE instance, or use the official OGC SECORE resolver (http://www.opengis.net/def/crs/). CRS resources are then identified by HTTP URIs, following the related OGC policy document of 2011, based on the White Paper ‘OGC Identifiers - the case for http URIs’. These HTTP URIs must resolve to GML resources that describe the CRS, such as http://rasdaman.org:8080/def/crs/EPSG/0/27700, and must themselves contain only resolvable HTTP URIs pointing to additional definitions within the CRS. For example, http://www.epsg-registry.org/export.htm?gml=urn:ogc:def:crs:EPSG::27700 is not allowed because, though it is a resolvable HTTP URI pointing at a GML resource that describes the CRS, internally it uses URNs, which SECORE is unable to resolve.

5.3. OGC Web Services

5.3.1. WCS

“The OpenGIS Web Coverage Service Interface Standard (WCS) defines a standard interface and operations that enables interoperable access to geospatial coverages.” (WCS standards)

Metadata regarding the range (feature space) of a coverage is a fundamental part of the GMLCOV coverage model. Responses to WCS DescribeCoverage and GetCoverage show such information in the gmlcov:rangeType element, encoded as fields of the OGC SWE data model. For instance, the range type of a test coverage mr, associated with a primitive quantity of unsigned char values, is the following:

▶ show
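Such a rangeType could look like the following sketch (SWE Common element names; the exact definition URI and uom code are illustrative):

```xml
<gmlcov:rangeType>
  <swe:DataRecord>
    <swe:field name="value">
      <swe:Quantity definition="http://www.opengis.net/def/dataType/OGC/0/unsignedByte">
        <swe:uom code="10^0"/>
      </swe:Quantity>
    </swe:field>
  </swe:DataRecord>
</gmlcov:rangeType>
```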

The set of standard rasdaman data types materializes the base types defined in the ODMG standard; they are converted to the definition attributes of SWE Quantity elements as per the table below:

Table 5.1 rasdaman base types to Quantity’s definition types

rasdaman type            size     Quantity’s definition type
boolean                  8 bit    unsignedByte
octet                    8 bit    signedByte
char                     8 bit    unsignedByte
short                    16 bit   signedShort
unsigned short / ushort  16 bit   unsignedShort
long                     32 bit   signedInt
unsigned long / ulong    32 bit   unsignedInt
float                    32 bit   float32
double                   64 bit   float64
complex                  64 bit   cfloat32
complexd                 128 bit  cfloat64

Note that a quantity can be associated with multiple allowed intervals, as by SWE specifications.

Declarations of NIL values are also possible: one or more values representing not available data or which have special meanings can be declared along with related reasons, which are expressed via URIs (see http://www.opengis.net/def/nil/OGC/0/ for official NIL resources provided by OGC).

You can use http://yourserver/rasdaman/ows as the service endpoint to which to send WCS requests, e.g.
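For instance (the coverage name is illustrative):

```
http://yourserver/rasdaman/ows?service=WCS&version=2.0.1&request=GetCapabilities
http://yourserver/rasdaman/ows?service=WCS&version=2.0.1&request=DescribeCoverage&coverageId=mr
```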


See example queries in the WCS systemtest, which send KVP (key-value pair) GET requests and XML POST requests to Petascope.

5.3.2. WCPS

“The OpenGIS Web Coverage Processing Service Interface Standard (WCPS) defines a protocol-independent language for the extraction, processing, and analysis of multi-dimensional gridded coverages representing sensor, image, or statistics data. Services implementing this language provide access to original or derived sets of geospatial coverage information, in forms that are useful for client-side rendering, input into scientific models, and other client applications.” Further information about WCPS can be found at the WCPS Service page of the OGC Network (http://www.opengeospatial.org/standards/wcps).

The WCPS language is independent from any particular request and response encoding, allowing embedding of WCPS into different target service frameworks like WCS and WPS. The following documents are relevant for WCPS; they can be downloaded from www.opengeospatial.org/standards/wcps:

  • OGC 08-068r2: The protocol-independent (“abstract”) syntax definition; this is the core document. Document type: IS (Interface Standard).
  • OGC 08-059r3: This document defines the embedding of WCPS into WCS by specifying a concrete protocol which adds an optional ProcessCoverages request type to WCS. Document type: IS (Interface Standard).
  • OGC 09-045: This draft document defines the embedding of WCPS into WPS as an application profile by specifying a concrete subtype of the Execute request type.

There are an online demo and an online tutorial; see also the WCPS manual and tutorial.

The petascope implementation supports both the Abstract (example) and XML (example) syntaxes. For guidelines on how to safely build and troubleshoot WCPS queries with Petascope, see this topic in the mailing list.

The standard form of a WCPS GET request is:
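A sketch of such a request via the WCS processing extension (the coverage name and query are illustrative; in practice the query must be URL-encoded):

```
http://your.server/rasdaman/ows?service=WCS&version=2.0.1&request=ProcessCoverages
    &query=for c in (mr) return encode(c, "png")
```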


You can use http://your.server/rasdaman/ows/wcps as a shortcut service endpoint to which to send WCPS requests. This is not an OGC standard endpoint, but it is kept for testing purposes. The following form is equivalent to the previous one:
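For example (the query is illustrative; in practice it must be URL-encoded):

```
http://your.server/rasdaman/ows/wcps?query=for c in (mr) return encode(c, "png")
```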


5.3.3. WMS

“The OpenGIS Web Map Service Interface Standard (WMS) provides a simple HTTP interface for requesting geo-registered map images from one or more distributed geospatial databases. A WMS request defines the geographic layer(s) and area of interest to be processed. The response to the request is one or more geo-registered map images (returned as JPEG, PNG, etc) that can be displayed in a browser application. The interface also supports the ability to specify whether the returned images should be transparent so that layers from multiple servers can be combined or not.”

Petascope supports WMS 1.3.0. Some resources:

Administration

The WMS 1.3 service is, for all intents and purposes, self-administered: the database schema is created automatically and updated if necessary each time the Petascope servlet starts. The only input needed from the administrator is the service information, which should be filled in $RMANHOME/etc/wms_service.properties before the servlet is started.

Layer creation & removal

Layers can be easily created from existing coverages in WCS. This has several advantages:

  • Creating the layer is extremely simple and can be done by both humans and machines.
  • The possibilities of inserting data into WCS are quite advanced (see wiki:WCSTImportGuide).
  • Data is not duplicated among the services offered by Petascope.

There are 2 ways of publishing a new WMS layer from an imported geo-referenced coverage:

  • By setting "wms_import": true in the ingredients file when importing with wcst_import.
  • By sending an InsertWCSLayer HTTP request manually to petascope.

Possible WMS requests:

  • The InsertWCSLayer request creates a new WMS layer from an existing coverage (one without an associated WMS layer) served by the web coverage service offered by petascope. Example:

  • To update an existing WMS layer from an existing coverage with an associated WMS layer, use the UpdateWCSLayer request. Example:

  • To remove a layer, just delete the associated coverage. Example:

            &request=DeleteCoverage&coverageId=MyCoverage

Transparent nodata value

By adding the parameter transparent=true to WMS requests, the returned image will have NoData Value=0 in the bands’ metadata, so the WMS client will consider all pixels with value 0 as transparent. E.g.:
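For instance (layer name, bbox and image size are illustrative; note that WMS 1.3.0 uses the CRS axis order, hence latitude first for EPSG:4326):

```
http://yourserver/rasdaman/ows?service=WMS&version=1.3.0&request=GetMap
    &layers=mean_summer_airtemp&bbox=-44.525,111.975,-8.975,156.275
    &crs=EPSG:4326&width=886&height=711&format=image/png&transparent=true
```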

▶ show

Interpolation value

Since v9.8, when output CRS is different from the native CRS in a GetMap request, the WMS service will reproject the result to the requested output CRS. The interpolation / resampling algorithm used during the reprojection can be controlled with a non-standard parameter interpolation=<method> added to the GetMap request. Valid values for <method> are documented in the rasql project() function, cf. Geographic projection; by default, nearest-neighbour is used (near).

Example request that changes the default interpolation method:
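For instance (layer, bbox and image size are illustrative; the output CRS differs from the native one):

```
http://yourserver/rasdaman/ows?service=WMS&version=1.3.0&request=GetMap
    &layers=mean_summer_airtemp&bbox=12465000,-5550000,17400000,-1000000
    &crs=EPSG:3857&width=800&height=600&format=image/png&interpolation=bilinear
```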

▶ show

Style creation

Styles can be created for layers using rasql and WCPS query fragments. This allows users to define several visualization options for the same dataset in a flexible way. Examples of such options would be color classification, NDVI detection etc. The following HTTP request will create a style with the name, abstract and layer provided in the KVP parameters below:
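Such a request might look like the following sketch (the parameter name rasqlTransformFragment is an assumption based on this guide’s examples; the fragment is shown unencoded for readability, but must be URL-encoded in practice):

```
http://yourserver/rasdaman/ows?service=WMS&version=1.3.0&request=InsertStyle
    &name=FireMarkup&layer=dessert_area
    &abstract=Fire+detection+style
    &rasqlTransformFragment=case when $Iterator > 200 then {255,0,0} else {0,255,0} end
```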


For Tomcat version 7+, the query (WCPS/rasql fragment) needs to be URL-encoded correctly. You can use a URL encoder such as http://meyerweb.com/eric/tools/dencoder/ to encode your query first:

  • WCPS query fragment example (since rasdaman 9.5):

    ▶ show

    The variable $c will be replaced by a layer name when sending a GetMap request containing this layer’s style.

  • Rasql query fragment examples:

    ▶ show

    The variable $Iterator will be replaced with the actual name of the rasdaman collection and the whole fragment will be integrated inside the regular GetMap request.

  • Since v9.8.1, it is possible to use multiple layers in a style definition. Besides the iterators $c in WCPS query fragments and $Iterator in rasql query fragments, which always refer to the current layer, other layers can be referenced by name using an iterator of the form $LAYER_NAME in the style expression.

    Example: create a WCPS query fragment style referencing 2 layers ($c refers to layer sentinel2_B4 which defines the style):

    ▶ show

    Then, in any GetMap request using this style, the result will be obtained from the combination of the 2 layers: sentinel2_B4 and sentinel2_B8:

    ▶ show

  • Since v10.0, a WMS style supports a ColorTable definition, which colorizes the result of a WMS GetMap request when the style is requested. A style can contain a query fragment, a color table definition, or both. The InsertStyle request supports two new non-standard parameters: colorTableType (valid values: ColorMap, GDAL and SLD) and colorTableDefinition containing the corresponding definition. Example:

    ▶ show

    Below the supported color table definitions for each color table type are explained:

    • Rasdaman ColorMap: check Coloring Arrays for more details. The color table definition must be a JSON object, for example:

      ▶ show

    • GDAL ColorPalette: check encode for more details. The color table definition must be a JSON object whose colorTable array contains 256 color arrays, for example:

      ▶ show

    • WMS Styled Layer Descriptor (SLD): The color table definition must be valid XML containing a ColorMap element. Check Coloring Arrays for details about the supported types (ramp (default), values, intervals); example ColorMap with type="values":

      ▶ show


To remove a particular style you can use a DeleteStyle request. Note that this is a non-standard extension of WMS 1.3.

    &request=DeleteStyle&layer=dessert_area&style=FireMarkup

3D+ coverage as WMS layer

Petascope allows importing a 3D+ coverage as a WMS layer. To do so, specify "wms_import": true in the ingredients file when importing data with wcst_import.sh using the regular_time_series, irregular_time_series or general_coverage recipes for 3D+ coverages. For example, see the use case of building an irregular_time_series 3D coverage from 2D GeoTIFF files.

Once the data coverage is ingested, the user can send GetMap requests on non-geo-referenced axes according to the OGC WMS 1.3.0 standard. The table below shows the subset parameters for different axis types:

Axis Type Subset parameter
Time time=…
Elevation elevation=…
Other dim_AxisName=… (e.g dim_pressure=…)

According to the WMS 1.3.0 specification, the subset for non-geo-referenced axes can have these formats:

  • Specific value (value1): time='2012-01-01T00:01:20Z', dim_pressure=20, …
  • Range values (min/max): time='2012-01-01T00:01:20Z'/'2013-01-01T00:01:20Z', dim_pressure=20/30, …
  • Multiple values (value1,value2,value3,…): time='2012-01-01T00:01:20Z','2013-01-01T00:01:20Z', dim_pressure=20,30,60,100, …
  • Multiple range values (min1/max1,min2/max2,…): dim_pressure=20/30,40/60,…


A GetMap request is always 2D, so if a non-geo-referenced axis is omitted from the request it will be considered as a slice on the upper bound of this axis (e.g. in a time-series it will return the slice for the latest date).
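Such a request can be assembled programmatically. In the sketch below the layer name, bounding box, and endpoint are illustrative assumptions:

```python
# Assemble a GetMap request slicing a 4D layer on time and pressure.
# Layer name, bbox, and endpoint are illustrative.
from urllib.parse import urlencode

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "my4DLayer",            # hypothetical 4D layer
    "bbox": "-44.5,112.0,-9.0,156.3",
    "crs": "EPSG:4326",
    "width": 800,
    "height": 600,
    "format": "image/png",
    "time": "'2012-01-01T00:01:20Z'",  # specific time slice
    "dim_pressure": "20/30",           # min/max range on a custom dimension
}
url = "http://localhost:8080/rasdaman/ows?" + urlencode(params)
print(url)
```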

GetMap request examples:

  • Multiple values on the time and dim_pressure axes of a 4D coverage.

Testing the WMS

You can test the service using your favorite WMS client or directly through a GetMap request like the following:

▶ show

Errors and Workarounds

Cannot load new WMS layer in QGIS
In this case the problem is that QGIS cached the WMS GetCapabilities response from an earlier request, so the new layer does not appear. See this thread for how to clear the cache: http://osgeo-org.1560.x6.nabble.com/WMS-provider-Cannot-calculate-extent-td5250516.html

5.3.4. WCS-T

The WCS Transaction extension (WCS-T) defines a standard way of inserting, deleting and updating coverages via a set of web requests. This guide describes the request types introduced by WCS-T and shows the steps necessary to import coverage data into a rasdaman server; the data is then available in the server’s WCS offerings.

Supported coverage data format

Currently, WCS-T supports coverages in GML format for importing. The metadata of the coverage is thus explicitly specified, while the raw cell values can be stored either explicitly in the GML body, or in an external file linked in the GML body, as shown in the examples below. The format of the file storing the cell values must be one supported by the GDAL library (http://www.gdal.org/formats_list.html), such as TIFF / GeoTIFF, JPEG, JPEG2000, PNG etc.


Besides the standard HTTP GET requests, petascope supports key-value parameters sent as HTTP POST requests to insert/update coverages.

Inserting coverages

Inserting a new coverage into the server’s WCS offerings is done using the InsertCoverage request.

Standard parameters:

Request Parameter Value Description Required
service WCS   Yes
version 2.0.1 or later   Yes
request InsertCoverage   Yes
inputCoverageRef a valid URL. URL pointing to the GML coverage to be inserted. One of inputCoverageRef or inputCoverage is required
inputCoverage a coverage in GML format The coverage to be inserted, in GML format. One of inputCoverageRef or inputCoverage is required
useId new or existing Indicates whether to use the coverage id from the coverage body, or tells the server to generate a new one. No

Vendor specific parameters:

Request Parameter Value Description Required
pixelDataType any GDAL supported data type (e.g: Float32) or concatenated data types by commas, (e.g: Float32,Int32,Float32). In cases where cell values are given in the GML body, the datatype can be indicated through this parameter. If omitted, it defaults to Byte. No
tiling same as rasdaman tiling clause wiki:Tiling Indicates the tiling of the array holding the cell values. No

The response of a successful coverage request is the coverage id of the newly inserted coverage.
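As a sketch, such a request can be built as a KVP URL; the endpoint is illustrative, and the parameter name coverageRef follows the request example later in this section:

```python
# Build an InsertCoverage KVP request referencing a GML coverage by URL.
# The endpoint is illustrative.
from urllib.parse import urlencode

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "InsertCoverage",
    # parameter name as used in the request example later in this section
    "coverageRef": "http://schemas.opengis.net/gmlcov/1.0/examples/"
                   "exampleRectifiedGridCoverage-1.xml",
    "useId": "new",
}
url = "http://localhost:8080/rasdaman/ows?" + urlencode(params)
print(url)
```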


The following example shows how to insert the coverage available at: http://schemas.opengis.net/gmlcov/1.0/examples/exampleRectifiedGridCoverage-1.xml. The tuple list is given in the GML body.


The following example shows how to insert a coverage stored on the server on which rasdaman runs. The cell values are stored in a TIFF file (attachment:myCov.gml), the coverage id is generated by the server and aligned tiling is used for the array storing the cell values.

    &coverageRef=file:///etc/data/myCov.gml&useId=new&tiling=aligned [0:500, 0:500]

Deleting coverages

To delete a coverage (along with the corresponding rasdaman collection), use the standard DeleteCoverage WCS-T request. For example, the coverage test_mr can be deleted with a request like the following:


Deleting coverages is also possible from the WS-client frontend available at http://yourserver/rasdaman/ows (WCS > DeleteCoverage tab).

Non-standard requests


The following requests are used to create/delete downscaled coverages. Internally they are used for efficient zooming in/out in WMS, and downscaling when using the scale() function in WCPS or scaling extension in WCS.

  • InsertScaleLevel: create a downscaled collection for a specific coverage and given level; e.g. to create a downscaled coverage of test_world_map_scale_levels that is 4x smaller:

    ▶ show

  • DeleteScaleLevel: delete an existing downscaled coverage at a given level; e.g. to delete downscaled level 4 of coverage test_world_map_scale_levels:

    ▶ show

wcst_import can send InsertScaleLevel requests automatically when importing data with the scale_levels option in the ingredients file; more details here.

5.4. Non-standard functionality

5.4.1. Update coverage’s metadata from WSClient

Since v9.8, a coverage’s metadata can be updated from the WSClient by selecting a text file (mime type: text/xml | application/json | text/plain) containing the new metadata and uploading it to petascope. Petascope then reads the content of the text file and updates the corresponding coverage’s metadata.


This feature is only available in the WSClient (OGC WCS > Describe Coverage tab) when logged in as the petascope admin user in the Admin tab.

The endpoint for this feature is http://your-server/rasdaman/ows/UpdateCoverageMetadata, which requires “multipart/form-data” POST requests. The request should contain 2 parts: the first part is the coverageId to update, the second part is the text file to be uploaded to the server.

5.4.2. Transform CIS 1.0 coverages to CIS 1.1 coverages in petascope

Since rasdaman v9.7, the WCS and WCPS services in petascope allow transforming a coverage imported in CIS 1.0 into CIS 1.1, with output in application/gml+xml format, via a new non-standard parameter outputType=GeneralGridCoverage.


This feature only applies to WCS version 2.1.0 and WCPS.

WCS

With WCS version 2.1.0 DescribeCoverage/GetCoverage requests, one can transform a coverage imported in CIS 1.0 into CIS 1.1 by adding the extra request parameter outputType=GeneralGridCoverage, as in the example below:

▶ show

WCPS

For WCPS requests, the same can be achieved using the extra parameter outputType=GeneralGridCoverage in encode() with application/gml+xml. Example:

for c in (test_irr_cube_2)
return encode(c, "application/gml+xml",

5.4.3. Clipping in petascope

WCS and WCPS services in Petascope support the WKT format for clipping with MultiPolygon (2D), Polygon (2D) and LineString (1D+). The result of MultiPolygon and Polygon is always a 2D coverage, and LineString results in a 1D coverage.

Petascope also supports curtain and corridor clipping on 3D+ coverages by Polygon (2D) and LineString (1D). The result of curtain clipping has the same dimensionality as the input coverage, while the result of corridor clipping is always a 3D coverage whose first axis is, by convention, the trackline of the corridor.

Below you find the documentation for WCS and WCPS with a few simple examples; an interactive demo is available here.

WCS

Clipping can be done by adding a &clip= parameter to the request. If the subsettingCRS parameter is specified then this CRS applies to the clipping WKT as well, otherwise it is assumed that the WKT is in the native coverage CRS.


  • Polygon clipping on coverage with nativeCRS EPSG:4326.

    ▶ show

  • Polygon clipping with coordinates in EPSG:3857 (from subsettingCRS parameter) on coverage with nativeCRS EPSG:4326.

    ▶ show

  • Linestring clipping on a 3D coverage (axes: X, Y, ansidate).

    ▶ show

  • Multipolygon clipping on 2D coverage

    ▶ show

  • Curtain clipping by a Linestring on 3D coverage

    ▶ show

  • Curtain clipping by a Polygon on 3D coverage

    ▶ show

  • Corridor clipping by a Linestring on 3D coverage

    ▶ show

  • Corridor clipping by a Polygon on 3D coverage

    ▶ show

WCPS

A special function that works similarly to the WCS case is provided, with the following signature:

clip( coverageExpression, wkt [, subsettingCrs ] )


  • coverageExpression is a coverage variable like cov, or an expression that results in a coverage, like cos(cov+10)
  • wkt is a valid WKT construct, e.g. POLYGON((...)), LineString(...)
  • subsettingCrs is an optional parameter specifying the CRS of the coordinates in wkt (e.g. "http://opengis.net/def/crs/EPSG/0/4326").
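For instance, a clip query string can be composed as follows; the coverage name, polygon coordinates, and CRS below are illustrative assumptions:

```python
# Compose a WCPS query clipping a coverage by a polygon given in a
# subsetting CRS. Coverage name and coordinates are illustrative.
wkt = "POLYGON((55.8 -96.6, 15.2 -27.9, 55.8 -27.9, 55.8 -96.6))"
subsetting_crs = "http://opengis.net/def/crs/EPSG/0/4326"

query = (
    "for $c in (mycoverage) "
    f'return encode(clip($c, {wkt}, "{subsetting_crs}"), "image/png")'
)
print(query)
```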


  • Polygon clipping with coordinates in EPSG:4326 on coverage with nativeCRS EPSG:3857:

    ▶ show

  • Linestring clipping on 3D coverage (axes: X, Y, datetime).

    ▶ show

  • Linestring clipping on 2D coverage with coordinates (axes: X, Y).

    ▶ show

  • Multipolygon clipping on 2D coverage.

    ▶ show

  • Curtain clipping by a Linestring on 3D coverage

    ▶ show

  • Curtain clipping by a Polygon on 3D coverage

    ▶ show

  • Corridor clipping by a Linestring on 3D coverage

    ▶ show

  • Corridor clipping by a Polygon on 3D coverage (geo CRS: EPSG:4326) with input geo coordinates in EPSG:3857.

    ▶ show

5.4.4. Auto-ratio for scaling X or Y axis in WCPS

Since v9.8, the scale function in WCPS allows specifying the target extent of only one of the spatial X/Y axes (e.g. only Long). In this case, the extent of the other axis is determined automatically so as to preserve the original ratio between the two spatial axes.

For example in the request below, petascope will automatically set the extent of Lat to a value that preserves the ratio in the output result:

for c in (test_mean_summer_airtemp)
return encode(scale( c, { Long:"CRS:1"(0:160) } ), "png" )

5.4.5. Extract domain interval from domain()/imageCrsdomain() in WCPS

Since v9.8, one can extract a domain interval (lowerBound:upperBound) or an individual bound from the result of the domain and imageCrsdomain operators on a specific coverage axis. The syntax is operator(.lo|.hi)?, where .lo and .hi return the lower and upper bound of the interval, respectively.

For example, coverage test_eobstest has 3 dimensions. By the standard, imageCrsdomain(c) returns (0:5,0:29,0:39). With this extension, imageCrsdomain(c,Long) returns 0:39 and imageCrsdomain(c,Long).hi returns 39.

Also, the third argument (CRS URI) of the domain() operator is now optional. If it is not specified, domain() uses the CRS of the selected axis (second argument) instead.
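The .lo/.hi semantics can be illustrated with a toy helper; this is not petascope code, just a demonstration of the bounds of an interval string such as 0:39:

```python
# Toy helpers mirroring the .lo/.hi accessors on an interval string
# such as the "0:39" returned by imageCrsdomain(c, Long).
def lo(interval: str) -> str:
    """Lower bound of a lowerBound:upperBound interval."""
    return interval.split(":")[0]

def hi(interval: str) -> str:
    """Upper bound of a lowerBound:upperBound interval."""
    return interval.split(":")[1]

print(lo("0:39"), hi("0:39"))  # 0 39
```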

5.4.6. Resample a projected output in WMS request

This is achieved by adding the optional interpolation parameter to the GetMap request; see the details in the WMS section above.

5.4.7. LET clause in WCPS

Since v10.0, an optional LET clause is supported in WCPS queries. It allows binding alias variables to valid WCPS query sub-expressions and then using the variables in the RETURN clause instead of repeating the aliased sub-expressions.

The syntax is

LET $variable1 := coverageExpression,
    $variable2 := coverageExpression,

For example

for $c in (test_mr)
let $a := $c[i(0:50), j(0:40)],
    $b := avg($c) * 2
return encode(scale($c, { imageCrsdomain($a) }) + $b, "png")

Note that there is a special case for shorthand subset expressions. A variable in the LET clause can have this syntax:

LET $variable1 := [dimensionalIntervalList]

And a shorthand subset expression can use this variable directly with this syntax


For example

for $c in (test_mr)
let $a := [i(20), j(40)],
    $b := 10
return encode($c[$a] + $b, "json")

5.4.8. SWITCH in WCPS


The syntax is

SWITCH
  CASE condExp1 RETURN resultExp1
  (CASE condExpI RETURN resultExpI)*
  DEFAULT RETURN resultExpDefault

where condExp and resultExp are either scalar returning expressions, or coverage returning expressions.


  • all condition expressions must return either boolean values or boolean coverages
  • all result expressions must return either scalar values, or coverages
  • the domain of all condition expressions must be the same
  • the domain of all result expressions must be the same (that means same extent, resolution/direct positions, crs)

Evaluation rules:

If the result expressions return scalar values, the scalar returned on a branch is used wherever the condition expression of that branch evaluates to true. If the result expressions return coverages, the values of the coverage returned on a branch are copied into the result coverage wherever the condition coverage of that branch contains cells with value true.

The conditions of the statement are evaluated in a manner similar to the IF-THEN-ELSE statement in programming languages such as Java or C++. This implies that the conditions must be specified in order of generality, starting with the least general and ending with the default result, which is the most general one. A less general condition specified after a more general condition will be ignored, since any value meeting the less general condition will already have met the more general one.
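The first-match evaluation order can be sketched per cell in Python; this is a toy model of the semantics, not petascope's implementation:

```python
# Toy model of SWITCH evaluation: branches are tried in order,
# least general first, and the first matching branch wins per cell.
def switch_cell(value, branches, default):
    """branches: list of (condition, result) pairs, most specific first."""
    for condition, result in branches:
        if condition(value):
            return result
    return default

branches = [
    (lambda v: v < 10, "blue"),
    (lambda v: v < 20, "green"),
    (lambda v: v < 30, "red"),
]
colors = [switch_cell(v, branches, "black") for v in [5, 15, 25, 99]]
print(colors)  # ['blue', 'green', 'red', 'black']
```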

Furthermore, the following hold:

  • domainSet(result) = domainSet(condExp1)
  • metadata(result) = metadata(condExp1)
  • rangeType(result) = rangeType(resultExp1). In case resultExp1 is a scalar, the result range type is the range type describing the coverage containing the single pixel resultExp1.


switch
  case $c < 10 return {red: 0; green: 0; blue: 255}
  case $c < 20 return {red: 0; green: 255; blue: 0}
  case $c < 30 return {red: 255; green: 0; blue: 0}
  default return {red: 0; green: 0; blue: 0}

The above example assigns blue to all pixels in the $c coverage having a value less than 10, green to the ones having values at least equal to 10, but less than 20, red to the ones having values at least equal to 20 but less than 30 and black to all other pixels.

switch
  case $c > 0 return log($c)
  default return 0

The above example computes log of all positive values in $c, and assigns 0 to the remaining ones.

switch
  case $c < 10 return $c * {red: 0; green: 0; blue: 255}
  case $c < 20 return $c * {red: 0; green: 255; blue: 0}
  case $c < 30 return $c * {red: 255; green: 0; blue: 0}
  default return {red: 0; green: 0; blue: 0}

The above example multiplies the original pixel value by blue ({red: 0; green: 0; blue: 255}) for all pixels in $c with a value less than 10, by green for values at least 10 but less than 20, by red for values at least 20 but less than 30, and assigns black to all other pixels.

5.5. Data import

Raster data (tiff, netCDF, grib, …) can be imported in petascope through its WCS-T standard implementation. For convenience rasdaman provides the wcst_import.sh tool, which hides the complexity of building WCS-T requests for data import. Internally, WCS-T ingests the coverage geo-information into petascopedb, while the raster data is ingested into rasdaman.

Building large timeseries/datacubes, mosaics, etc. and keeping them up-to-date as new data becomes available is supported even for complex data formats and file/directory organizations. The systemtest contains many examples for importing different types of data. What follows is detailed documentation on how to set up an ingredients file for your dataset.

5.5.1. Introduction

The wcst_import.sh tool introduces two concepts:

  • Recipe - a class implementing BaseRecipe that, based on a set of parameters (ingredients), can import a set of files into WCS, forming a well-defined coverage (image, regular timeseries, irregular timeseries, etc.);
  • Ingredients - a JSON file containing a set of parameters that define how the recipe should behave (e.g. the WCS endpoint, the coverage name, etc.)

To execute an ingredients file in order to import some data:

$ wcst_import.sh path/to/my_ingredients.json

Alternatively, wcst_import.sh tool can be started as a daemon as follows:

$ wcst_import.sh path/to/my_ingredients.json --daemon start

or as a daemon that is “watching” for new data at some interval (in seconds):

$ wcst_import.sh path/to/my_ingredients.json --watch <interval>

For further information on wcst_import.sh commands and usage:

$ wcst_import.sh --help

The overall workflow is depicted in Figure 5.1.


Figure 5.1 Ingestion process with wcst_import.sh

An ingredients file with all possible options can be found here; in the same directory you will find several examples for different recipes.

5.5.2. Recipes

As of now, these recipes are provided:

For each of these there is an ingredients example under the ingredients/ directory, together with an example of the available parameters. Each recipe type is described in turn below.

Common options

Some options are commonly applicable to all recipes.

config section

  • service_url - The endpoint of the WCS service with the WCS-T extension enabled

  • mock - Print WCS-T requests but do not execute anything if set to true. Set to false by default.

  • automated - Set to true to avoid any interaction during the ingestion process. Useful in production environments for automated deployment for example. By default it is false, i.e. user confirmation is needed to execute the ingestion.

  • blocking (since v9.8) - Set to false to analyze and import each file separately (non-blocking mode). By default blocking is set to true, i.e. wcst_import will analyze all input files first to create corresponding coverage descriptions, and only then import them. The advantage of non-blocking mode is that the analyzing and importing happens incrementally (in blocking mode the analyzing step can take a long time, e.g. days, before the import can even begin).


    When importing coverages with irregular axes in non-blocking mode, wcst_import relies only on files sorted by file name; the import can fail if the axis coefficients are collected from the input files’ metadata (e.g. a DateTime value in a TIFF tag or in GRIB metadata), as they might not be consecutive. Unlike the default blocking mode, wcst_import will not analyze all files upfront to collect and sort the metadata by DateTime.

  • default_null_values - This parameter adds default null values for bands that do not have a null value provided by the file itself. The value for this parameter should be an array containing the desired null value either as a closed interval low:high or single values. E.g. for a coverage with 3 bands

    ▶ show

    Note, if set this parameter will override the null/nodata values present in the input files.

  • tmp_directory - Temporary directory in which gml and data files are created; should be readable and writable by rasdaman, petascope and current user. By default this is /tmp.

  • crs_resolver - The crs resolver to use for generating WCS-T request. By default it is determined from the petascope.properties setting.

  • url_root - In case the files are exposed via a web-server and not locally, you can specify the root file url here; the default value is "file://".

  • skip - Set to true to ignore files that failed to import; by default it is false, i.e. the ingestion is terminated when a file fails to import.

  • retry - Set to true to retry a failed request. The number of retries is either 5, or the value of setting retries if specified. This is set to false by default.

  • retries - Control how many times to retry a failed WCS-T request; set to 5 by default.

  • retry_sleep - Set the number of seconds to wait before retrying after an error; a floating-point number can also be specified for sub-second precision. Default value is 1.

  • track_files - Set to true to allow files to be tracked in order to avoid reimporting already imported files. This setting is enabled by default.

  • resumer_dir_path - The directory in which to store the track file. By default it will be stored next to the ingredients file.

  • slice_restriction - Limit the slices that are imported to the ones that fit in a specified bounding box. Each subset in the bounding box should be of form { "low": 0, "high": <max> }, where low/high are given in the axis format. Example:

    ▶ show

  • description_max_no_slices - maximum number of slices (files) to show for preview before starting the actual ingestion.

  • subset_correction (deprecated since rasdaman v9.6) - In some cases the resolution is small enough to affect the precision of the transformation from domain coordinates to grid coordinates. To allow for corrections that will make the import possible, set this parameter to true.

  • insitu - Set to true to register files in-situ, rather than ingest them in rasdaman. Note: only applicable to rasdaman enterprise.
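Taken together, the retry, retries, and retry_sleep options above amount to a loop like the following sketch (illustrative, not wcst_import's actual code):

```python
# Sketch of the retry semantics: re-attempt a failed request up to
# `retries` times, sleeping `retry_sleep` seconds between attempts.
import time

def send_with_retry(send, retries=5, retry_sleep=1.0):
    """send: a callable performing one WCS-T request; raises on failure."""
    last_error = None
    for attempt in range(retries + 1):
        try:
            return send()
        except Exception as error:
            last_error = error
            if attempt < retries:
                time.sleep(retry_sleep)
    raise last_error
```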

recipes/options section

  • import_order - Allows sorting the input files in ascending (default) or descending order. Currently sorting is done by datetime, which allows importing a coverage starting from either the earliest or the most recent date. Example:

    ▶ show

  • tiling - Specifies the tile structure to be created for the coverage in rasdaman. You can set arbitrary tile sizes for the tiling option only if the tile name is ALIGNED. Example:

    ▶ show

    For more information on tiling please check the Storage Layout Language

  • wms_import - If set to true, after importing the data to a coverage, a WMS layer is also created from the imported coverage and its metadata is populated. The layer is then available from the WMS GetCapabilities request. Example:

    ▶ show

  • scale_levels - Enable the WMS pyramids feature. Each level must be a positive number greater than 1. Syntax:

    ▶ show

hooks section

Since v9.8, wcst_import allows running bash commands before/after ingestion by adding an optional hooks configuration in the ingredients file. There are 2 types of ingestion hooks:

  • before_ingestion: run bash commands before analyzing the input file(s) (e.g. using gdalwarp to reproject input file(s) from EPSG:3857 to EPSG:4326 before importing them to a coverage).
  • after_ingestion: run bash commands after importing the input file(s) to the coverage (e.g. clean up all files projected by the gdalwarp command above).

When the import mode is non-blocking ("blocking": false), wcst_import runs the before/after hook(s) for each file as it is imported into the coverage, while the default blocking mode runs the before/after hook(s) once for all input files, before/after they are imported into the coverage. Parameters are explained below.

▶ show

Example: Import GDAL subdatasets

The example ingredients below contains a pre-hook which replaces the collected file path into a GDAL subdataset form; in this particular case, with the GDAL driver for NetCDF a single variable from the collected NetCDF files is imported.

▶ show

Mosaic map

Well suited for importing a tiled map, not necessarily contiguous; it will place all given input files under a single coverage and handle their position in space. Parameters are explained below.

▶ show

Regular timeseries

Well suited for importing multiple 2-D slices created at regular intervals of time (e.g. sensor data, satellite imagery, etc.) as a 3-D cube with the third axis being a temporal one. Parameters are explained below.

▶ show

Irregular timeseries

Well suited for importing multiple 2-D slices created at irregular intervals of time into a 3-D cube with the third axis being a temporal one. There are two types of time parameters in "options"; one needs to be chosen according to the particular use case:

▶ show

General coverage

The general recipe aims to be a highly flexible recipe that can handle any kind of data files (be they 2D, 3D or n-D) and model them as coverages of any dimensionality. It does that by allowing users to define their own coverage models, with any number of bands and axes, and to fill in the necessary coverage information through so-called ingredient sentences inside the ingredients.

Ingredient Sentences

An ingredient expression can be of multiple types:

  • Numeric - e.g. 2, 4.5
  • Strings - e.g. 'Some information'
  • Functions - e.g. datetime('2012-01-01', 'YYYY-mm-dd')
  • Expressions - allow collecting information from inside the ingested file using a specific driver. An expression has the form ${driverName:driverOperation} - e.g. ${gdal:minX}, ${netcdf:variable:time:min}. You can find all the possible expressions here.
  • Any valid python expression - the types above can be combined into a python expression; this allows mathematical operations, some string parsing, etc. - e.g. ${gdal:minX} + 1/2 * ${gdal:resolutionX} or datetime(${netcdf:variable:time:min} * 24 * 3600)

Parameters

Using ingredient sentences we can define any coverage model directly in the options of the ingredients file. Each coverage model contains the following sections:

  • global - specifies fields which should be saved (e.g. the licence, the creator etc) once for the whole coverage. Example:

    ▶ show

  • local - specifies fields which are fetched from each input file to be stored in the coverage’s metadata. Then, when subsetting the output coverage, only the associated local metadata will be added to the result. Example:

    ▶ show

  • colorPaletteTable - specifies the path to a Color Palette Table (.cpt) file which can be used internally when encoding coverage to PNG to colorize result. Example:

    ▶ show

    Since v10, the general recipe with the gdal slicer reads the colorPaletteTable automatically if the first input file (TIFF format with a Color Table (RGB with 256 entries)) contains this metadata, when colorPaletteTable is set to auto or not specified in the ingredients file. If colorPaletteTable is set to empty, this metadata is ignored when creating the coverage’s global metadata.

  • slicer - specifies the driver (netcdf, gdal or grib) to use to read from the data files and for each axis from the CRS how to obtain the bounds and resolution corresponding to each file.


    "type": "gdal" is used for TIFF, PNG, and other 2D formats.

An example for the netCDF format can be found here and for PNG here. Here’s an example ingredient file for grib data:

▶ show

Possible Expressions

Each driver allows various expressions to extract information from the input files. Parts that vary in an expression are marked with capital letters; e.g. ${gdal:metadata:YOUR_FIELD} means that you can replace YOUR_FIELD with any valid gdal metadata tag (e.g. TIFFTAG_DATETIME).


Take a look at this NetCDF example for a general recipe ingredient file that uses many netcdf expressions.

Type Description Examples
Metadata information ${netcdf:metadata:YOUR_METADATA_FIELD} ${netcdf:metadata:title}
Variable information ${netcdf:variable:VARIABLE_NAME:MODIFIER} where VARIABLE_NAME can be any variable in the file and MODIFIER can be one of: first|last|max|min; Any extra modifiers will return the corresponding metadata field on the given variable ${netcdf:variable:time:min} ${netcdf:variable:t:units}
Dimension information ${netcdf:dimension:DIMENSION_NAME} where DIMENSION_NAME can be any dimension in the file. This will return the value on the selected dimension. ${netcdf:dimension:time}


For TIFF, PNG, JPEG, and other 2D data formats we use GDAL. Take a look at this GDAL example for a general recipe ingredient file that uses many GDAL expressions.

Type Description Examples
Metadata information ${gdal:metadata:METADATA_FIELD} ${gdal:metadata:TIFFTAG_NAME}
Geo Bounds ${gdal:BOUND_NAME} where BOUND_NAME can be one of the minX|maxX|minY|maxY ${gdal:minX}
Geo Resolution ${gdal:RESOLUTION_NAME} where RESOLUTION_NAME can be one of the resolutionX|resolutionY ${gdal:resolutionX}
Origin ${gdal:ORIGIN_NAME} where ORIGIN_NAME can be one of the originX|originY ${gdal:originY}


Take a look at this GRIB example for a general recipe ingredient file that uses many grib expressions.

Type Description Examples
GRIB Key ${grib:KEY} where KEY can be any of the keys contained in the GRIB file ${grib:experimentVersionNumber}


Type Description Examples
File Information ${file:PROPERTY} where PROPERTY can be one of path|name|dir_path|original_path|original_dir_path. The original_* variants give the original input file’s path/directory (used only with a pre-hook with replace_path that replaces original input file paths with customized ones). ${file:path}

Special Functions

A couple of special functions are available to deal with some more complicated cases:

Function Name Description Examples
grib_datetime(date,time) This function helps to deal with the usual grib date and time format. It returns back a datetime string in ISO format. grib_datetime(${grib:dataDate}, ${grib:dataTime})
datetime(date, format) This function helps to deal with strange date time formats. It returns back a datetime string in ISO format. datetime("20120101:1200", "YYYYMMDD:HHmm")
regex_extract(input, regex, group) This function extracts information from a string using regex; input is the string you parse, regex is the regular expression, group is the regex group you want to select datetime(regex_extract('${file:name}', '(.*)_(.*)_(.*)_(\\d\\d\\d\\d-\\d\\d) (.*)', 4), 'YYYY-MM')
replace(input, old, new) Replaces all occurrences of a substring with another substring in the input string replace('${file:path}','.tiff', '.xml')
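The behaviour of regex_extract and replace can be checked against a plain-Python re-implementation (illustrative only, not petascope's code; the file name below is hypothetical):

```python
# Illustrative re-implementations of the regex_extract and replace
# ingredient functions described in the table above.
import re

def regex_extract(text, regex, group):
    """Return the given regex group matched in text."""
    return re.search(regex, text).group(group)

def replace(text, old, new):
    """Replace all occurrences of old with new."""
    return text.replace(old, new)

# e.g. extract the YYYY-MM part of a (hypothetical) file name
name = "sat_scene_area_2012-06 v2"
print(regex_extract(name, r"(.*)_(.*)_(.*)_(\d\d\d\d-\d\d) (.*)", 4))  # 2012-06
print(replace("/data/a.tiff", ".tiff", ".xml"))  # /data/a.xml
```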

Band’s unit of measurement (uom) code for netCDF and GRIB recipes

  • In netCDF recipes you can add uom for each band by referencing the metadata key of the specific variable. For example, for variable LAI:

▶ show

  • In GRIB recipes adding uom for bands is the same as for netCDF, except that a GRIB expression is used to fetch this information from the metadata in the GRIB file. Example:

    ▶ show

Local metadata from input files

Besides the global metadata of a coverage, you can add local metadata for each file which is part of the whole coverage (e.g. a 3D time-series coverage mosaicked from 2D GeoTIFF files).

In ingredient file of general recipe, under the metadata section add a “local” object with keys and values extracted by using format type expression. Example of extracting an attribute from a netCDF input file:

▶ show

Afterwards, each file’s envelope (geo domain) and its local metadata will be added to the coverage metadata under a <slice>...</slice> element, if the coverage metadata is imported in XML format. Example of a coverage containing local metadata in XML from 2 netCDF files:

▶ show

Since v10.0, local metadata for input files can be fetched from corresponding external text files using the optional metadata_file setting. For example:

▶ show

When subsetting a coverage that contains local metadata from input files (via WC(P)S requests), only the local metadata of those input files whose envelopes intersect the geo domain of the subsetted coverage will be added to the output coverage metadata.
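The selection rule can be sketched in Python as follows; the data structures here are hypothetical stand-ins, not petascope’s internal representation:

```python
# Keep only the local metadata of input files whose envelope intersects the
# requested subset on every axis (simple per-axis interval intersection).

def intersects(envelope, subset):
    """envelope and subset are dicts mapping axis name -> (lo, hi)."""
    return all(env_lo <= subset[axis][1] and subset[axis][0] <= env_hi
               for axis, (env_lo, env_hi) in envelope.items())

slices = [
    {"file": "file1.nc", "envelope": {"ansi": (0, 10),  "Lat": (-90, 0)}},
    {"file": "file2.nc", "envelope": {"ansi": (11, 20), "Lat": (-90, 0)}},
]
subset = {"ansi": (2, 8), "Lat": (-45, -10)}  # trim within file1's envelope only

kept = [s["file"] for s in slices if intersects(s["envelope"], subset)]
print(kept)  # ['file1.nc']
```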

For example, consider a GetCoverage request with a trim whose CRS axis subsets fall within netCDF file 1:

▶ show

The coverage’s metadata result will contain only local metadata from netCDF file 1:

▶ show

Customized axis labels in coverage

This feature is available since rasdaman version 9.8 for the general recipe. Previously, axis labels for a coverage had to match the axis abbreviations in the CRS’s GML definition when configured in the ingredient file under the "slicer"/"axes" section. With this feature, one can set an arbitrary name for each axis label by adding the optional "crsOrder" configuration to each axis, holding the position index (starting from 0) of the axis in the coverage’s CRS.

For example, with the configuration below a coverage will be created with 3 customized axes MyDateTimeAxis, MyLatAxis and MyLongAxis, based on the coverage’s CRS (AnsiDate (1 DateTime axis) and EPSG:4326 (Lat and Long axes)):

▶ show

Grouping several coverage slices

Since v9.8, wcst_import allows grouping input files on irregular axes (with "dataBound": false) via the optional sliceGroupSize setting (a positive integer). E.g.:

▶ show

If each input slice corresponds to index X, and one wants slice groups of size N, then with this option the index is translated to X - (X % N).
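This translation amounts to snapping each index to the start of its group:

```python
# sliceGroupSize translation: index X maps to the start of its group of size N.

def group_index(x, n):
    return x - (x % n)

# e.g. daily slices grouped into 7-day windows ("sliceGroupSize": 7):
print([group_index(x, 7) for x in range(15)])
# [0, 0, 0, 0, 0, 0, 0, 7, 7, 7, 7, 7, 7, 7, 14]
```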

A typical use case is importing a 3D coverage from 2D satellite imagery, where the time axis is irregular and its values are fetched from the input files via a regex expression. All input files belonging to one time window (e.g. "sliceGroupSize": 7, i.e. 7 days in the AnsiDate CRS) will then get the same value, namely the first date of that week.

Band and dimension metadata in netCDF

Metadata can be individually specified for each band and axis in the ingredient file. This metadata is automatically added to the result output when encoding to netCDF. Example:

▶ show

Since v9.7, this metadata can be automatically derived from the input netCDF files.

  • band metadata:

    • If "bands" is set to "auto" or does not exist under "metadata" in the ingredient file, all user-specified bands will have metadata fetched directly from the netCDF file.
    • Otherwise, the user can specify metadata explicitly as a dictionary of keys/values. Metadata for a band is still collected automatically if: 1) the band is not added, or 2) the band is set to "auto".
  • axis metadata:

    • If "axes" is set to "auto" or does not exist under "metadata" in the ingredient file, all user-specified axes will have metadata fetched directly from the netCDF file. The variable corresponding to an axis label is detected from the min or max values of the CRS axis configuration under the "slicer/axes" section. For example:

      ▶ show

    • Otherwise, the user can specify metadata explicitly as a dictionary of keys/values. Metadata for an axis is still collected automatically if: 1) the axis is not added; 2) the axis is set to "auto"; or 3) the axis is set with ${netcdf:variable:DimensionName:metadata}.

Import from external WCS

This recipe allows importing a coverage from a remote petascope endpoint into the local petascope. Parameters are explained below.

▶ show

Import Sentinel 1 data

This is a convenience recipe for importing Sentinel 1 data in particular; currently only the GRD/SLC product types are supported, and only geo-referenced tiff files. Below is an example:

▶ show

The recipe extends general_coverage so the "recipe" section has the same structure. However, a lot of information is automatically filled in by the recipe now, so the ingredients file is much simpler as the example above shows.

The other obvious difference is that the "coverage_id" is templated with several variables enclosed in ${ and } which are automatically replaced to generate the actual coverage name during import:

  • modebeam - the mode beam of input files, e.g. IW/EW.
  • polarisation - single polarisation of input files, e.g: HH/HV/VV/VH

If the files collected by "paths" vary in any of these parameters, the corresponding variables must appear somewhere in the "coverage_id" (as a separate coverage is constructed for each combination). Otherwise, the ingestion will either fail or result in invalid coverages. E.g. if all data has mode beam IW but varying polarisations, the "coverage_id" could be "MyCoverage_${polarisation}".

In addition, the data to be ingested can be optionally filtered with the following options in the "input" section:

  • modebeams - specify a subset of mode beams to ingest from the data, e.g. only the IW mode beam; if not specified, data of all supported mode beams will be ingested.
  • polarisations - specify a subset of polarisations to ingest, e.g. only the HH polarisation; if not specified, data of all supported polarisations will be ingested.


  • Only GRD/SLC products are supported.
  • Data must be geo-referenced.
  • Filenames are assumed to be of the format: s1[ab]-(.*?)-grd(.?)-(.*?)-(.*?)-(.*?)-(.*?)-(.*?)-(.*?).tiff or s1[ab]-(.*?)-slc(.?)-(.*?)-(.*?)-(.*?)-(.*?)-(.*?)-(.*?).tiff.

Import Sentinel 2 data

This is a convenience recipe for importing Sentinel 2 data in particular. It relies on support for Sentinel 2 in more recent GDAL versions. Importing zipped Sentinel 2 is also possible and automatically handled.

Below is an example:

▶ show

The recipe extends general_coverage so the "recipe" section has the same structure. However, a lot of information is automatically filled in by the recipe now, so the ingredients file is much simpler as the example above shows.

The other obvious difference is that the "coverage_id" is templated with several variables enclosed in ${ and } which are automatically replaced to generate the actual coverage name during import:

  • crsCode - the CRS EPSG code of the imported files, e.g. 32757 for WGS 84 / UTM zone 57S.
  • resolution - Sentinel 2 products bundle several subdatasets of different resolutions:
    • 10m - bands B4, B3, B2, and B8 (base type unsigned short)
    • 20m - bands B5, B6, B7, B8A, B11, and B12 (base type unsigned short)
    • 60m - bands B1, B8, and B10 (base type unsigned short)
    • TCI - True Color Image (red, green, blue char bands); also 10m as it is derived from the B2, B3, and B4 10m bands.
  • level - L1C or L2A

If the files collected by "paths" vary in any of these parameters, the corresponding variables must appear somewhere in the "coverage_id" (as a separate coverage is constructed for each combination). Otherwise, the ingestion will either fail or result in invalid coverages. E.g. if all data is level L1C with CRS 32757 but varying resolutions, the "coverage_id" could be "MyCoverage_${resolution}"; the other variables can still be specified though, so "MyCoverage_${resolution}_${crsCode}" is valid as well.

In addition, the data to be ingested can be optionally filtered with the following options in the "input" section:

  • resolutions - specify a subset of resolutions to ingest from the data, e.g. only the “10m” subdataset; if not specified, data of all supported resolutions will be ingested.
  • levels - specify a subset of levels to ingest, so that files of other levels will be fully skipped; if not specified, data of all supported levels will be ingested.
  • crss - specify a list of CRSs (EPSG codes as strings) to ingest; if not specified or empty, data of any CRS will be ingested.

Image pyramids

This feature (v9.7+) allows creating downscaled versions of a given coverage, eventually forming an image pyramid, in order to enable faster WMS requests when zooming in/out.

By using the scale_levels option of wcst_import when importing a coverage with WMS enabled, petascope will create downscaled collections in rasdaman following the pattern coverageId_<level>. If the level is a float, the dot is replaced with an underscore, as dots are not permitted in collection names. Some examples:

  • MyCoverage, level 2 -> MyCoverage_2
  • MyCoverage, level 2.45 -> MyCoverage_2_45
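The naming rule can be sketched as a one-line transformation:

```python
# Downscaled-collection naming: append the level, replacing "." with "_"
# since dots are not permitted in rasdaman collection names.

def downscaled_collection_name(coverage_id, level):
    return "{}_{}".format(coverage_id, str(level).replace(".", "_"))

print(downscaled_collection_name("MyCoverage", 2))     # MyCoverage_2
print(downscaled_collection_name("MyCoverage", 2.45))  # MyCoverage_2_45
```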

Example ingredients specification to create two downscaled levels which are 8x and 32x smaller than the original coverage:

▶ show

Two new non-standard WCS-T requests are utilized by wcst_import for this feature; see here for more information.

Creating your own recipe

The recipes above cover a frequent but limited subset of what can be modeled as a coverage. WCSTImport allows defining your own recipes in order to fill these gaps. In this tutorial we will create a recipe that constructs a 3D coverage from 2D georeferenced files. The 2D files we target all have the same CRS and cover the same geographic area. The time information we want to retrieve is stored in each file in a GDAL-readable tag. The tag name and time format differ from dataset to dataset, so we take this information as recipe options. We also want to be flexible about the required time CRS, so we add that as an option as well.

Based on this use case, the following ingredient file fulfills our needs:

▶ show

To create a new recipe, start by creating a new folder in the recipes folder. Let’s call our recipe my_custom_recipe:

▶ show
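The collapsed commands presumably amount to something like the following; the paths are illustrative, as the actual recipes directory depends on your wcst_import installation:

```shell
# Create the recipe folder and mark it as a Python package.
mkdir -p recipes/my_custom_recipe
touch recipes/my_custom_recipe/__init__.py   # tells Python this folder is a package
```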

The last command is needed to tell Python that this folder contains Python sources; if you forget it, your recipe will not be automatically detected. The example ingredients file above gives a feeling for what we will be dealing with in the recipe: it requests just two parameters from the user. Let’s now create our recipe, in a file called recipe.py:

▶ show

Use your favorite editor or IDE to work on the recipe (there are type annotations for most WCSTImport classes, so an IDE like PyCharm gives out-of-the-box completion support). First, let’s add the skeleton of the recipe (note that in this tutorial we omit the import sections of the files; your IDE will help you auto-import them):

▶ show

The first thing you need to do is to make sure the get_name() method returns the name of your recipe. This name is used to determine whether an ingredient file should be processed by your recipe. Next, you will need to focus on the constructor. Let’s examine it. We get a single parameter called session, which contains all the information collected from the user plus a couple more useful things. You can check all the available methods of the class in the session.py file; for now we will just save the options provided by the user, available via session.get_recipe(), in a class attribute.
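A minimal skeleton along these lines might look as follows; the BaseRecipe and Session classes here are simplified stand-ins for the actual WCSTImport classes, whose exact signatures may differ:

```python
# Simplified stand-ins for the framework classes (illustration only).
class BaseRecipe:
    def __init__(self, session):
        self.session = session

class Session:
    def __init__(self, recipe_options):
        self._recipe = recipe_options

    def get_recipe(self):
        return self._recipe

class Recipe(BaseRecipe):
    def __init__(self, session):
        # keep the user-provided options around for validate() / ingest()
        super(Recipe, self).__init__(session)
        self.options = session.get_recipe()["options"]

    @staticmethod
    def get_name():
        # must match the "name" field in the ingredients file
        return "my_custom_recipe"

# Hypothetical option names, mirroring the ingredients example:
session = Session({"name": "my_custom_recipe",
                   "options": {"time_tag": "TIFFTAG_DATETIME",
                               "time_format": "YYYY:MM:DD HH:mm:ss"}})
recipe = Recipe(session)
print(Recipe.get_name())           # my_custom_recipe
print(recipe.options["time_tag"])  # TIFFTAG_DATETIME
```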

In the validate() method, you validate the options for the recipe provided by the user. It is generally a good idea to call the super method to validate general things like WCST service availability, although this is not mandatory. We also want to validate our custom recipe options here. This is how the recipe looks now:

▶ show

Now that our recipe can validate the recipe options, let’s move to the describe() method. This method lets you tell your users any relevant information about the ingestion before it actually starts. The irregular_timeseries recipe, for instance, prints the timestamps of the first couple of slices so the user can check that they are correct. Your recipe should do something similar, based on what it has to do.

Next, we should define the ingest behaviour. The framework does not make any assumptions about what the correct method of ingesting is; however, it offers a lot of utility functionality that helps you do it in a more standardized way. We will continue this tutorial by describing how to take advantage of this functionality; note, however, that this is not required for the recipe to work. The first thing you need to do is define an importer object. This importer object takes a coverage object and ingests it using WCS-T requests. The object has two public methods: ingest(), which ingests the coverage into the WCS-T service (note: ingest can be an insert operation when the coverage is not yet defined, or an update if the coverage exists; the importer handles both cases, so you don’t have to worry whether the coverage already exists), and get_progress(), which returns a tuple containing the number of imported slices and the total number of slices. After adding the importer, the code should look like this:

▶ show

In order to build the importer, we need to create a coverage object. Let’s see how we can do that. The coverage constructor requires:

  • coverage_id: the id of the coverage
  • slices: a list of slices that compose the coverage. Each slice defines the position in the coverage and the data that should be defined at the specified position
  • range_fields: the range fields for the coverage
  • crs: the crs of the coverage
  • pixel_data_type: the type of the pixel in gdal format, e.g. Byte, Float32 etc

You can construct the coverage object in many ways; here we present one specific way of doing it. Let’s start with the CRS of the coverage. For our recipe, we want a 3D CRS, composed of the CRS of the 2D images and a time CRS as indicated by the user. These two lines of code give us exactly that:

▶ show

Let’s also get the range fields for this coverage. We can extract them from the 2D image using a helper class that uses GDAL to get the relevant information:

▶ show

Let’s also get the pixel base type, again using the gdal helper:

pixel_type = gdal_dataset.get_band_gdal_type()

Let’s see what we have so far:

▶ show

As you can see, the only thing left to do is to implement the _get_slices() method. To do so, we need to iterate over all the input files and create a slice for each. Here’s an example of how we could do that:

▶ show

And we are done: we now have a valid coverage object. The last thing needed is to define the status method. This method needs to provide a status update to the framework so it can be displayed to the user. We need to return the number of finished work items and the total number of work items. In our case we can measure this in terms of slices, and the importer already provides this for us. So all we need to do is the following:

▶ show

We now have a functional recipe. You can try the ingredients file against it and see how it works.

▶ show

5.6. Data export

WCS formats are requested via the format KVP key (<gml:format> elements for XML POST requests), which takes a valid MIME type as value. Output encoding is passed on to the GDAL library, so the supported output formats are determined by the raster formats GDAL supports. The valid MIME types which petascope may support can be checked in the WCS 2.0.1 GetCapabilities response:

▶ show

In encode processing expressions, besides MIME types, WCPS (and rasql) also accept GDAL format identifiers or other commonly-used format abbreviations, such as “CSV” for Comma-Separated Values.

5.7. Configuration

The petascope configuration can be found in $RMANHOME/etc/petascope.properties; editing this file requires restarting Tomcat.

5.7.1. Database connection

Below is a list of settings for the different databases that have been tested successfully with rasdaman for holding the array geo-metadata (not the array data!); replace db-username and db-password with your individual values (keeping the default values from installation is a major security risk). Note that only a single datasource can be specified in petascope.properties at any given time; simultaneous use of more than one is not supported. Note also that changing to another database system requires more than just changing these entries; a migration process is involved.

▶ show

For a non-PostgreSQL DBMS like H2 or HSQLDB, edit petascope.properties to add the path to its JDBC jar driver on your system.

▶ show

When migrating to a different database system (e.g. PostgreSQL to HSQLDB), you need to specify connection details for the existing database like this:

▶ show

5.7.2. Standalone deployment


petascope can run as a standalone web application with embedded tomcat by running:

$ java -jar rasdaman.war

In this case, the port for the embedded Tomcat is fetched from the server.port setting in petascope.properties (e.g. 9009). petascope can then be accessed via URL:


One can also run embedded petascope with its own dedicated petascope.properties, by adding an option pointing to a folder containing this properties file, e.g.:

$ java -jar rasdaman.war --petascope.confDir=/opt/rasdaman/new_etc/


In the same way as petascope, one can run secore as a standalone web application with embedded Tomcat by running:

$ java -jar def.war


The secoredb.path setting must first be set in secore.properties to a folder in which the system user can create XML database files, e.g. secoredb.path=/tmp/

The port for the embedded Tomcat is fetched from the server.port setting in secore.properties (e.g. 9010). secore can then be accessed via URL:


Start/stop embedded petascope/secore

Each standalone application needs a unique port on which to listen (e.g petascope on port 8080 and secore on port 8081). This should be configured in the properties files as below:

  • $RMANHOME/etc/petascope.properties

    ▶ show

  • $RMANHOME/etc/secore.properties

    ▶ show

With these settings, petascope and secore will be started/stopped when rasdaman is started or stopped in the usual way. Specific applications, e.g. only petascope, can also be selectively started/stopped:

$ start_rasdaman.sh --service (secore | petascope)
$ stop_rasdaman.sh --service (secore | petascope)

Please see the --help of start_rasdaman.sh and stop_rasdaman.sh for more details.

5.7.3. Logging

At the end of petascope.properties you will find the logging configuration. It is recommended to adjust this, and make sure that Tomcat has permissions to write the petascope.log file.

5.7.4. Static content via petascope

Since v9.8, external static content (HTML/CSS/Javascript) residing outside of rasdaman.war can be served by petascope at http://petascope-endpoint/rasdaman/, if the static_html_dir_path setting in petascope.properties points to an existing absolute directory path, readable by the user running Tomcat, that contains an index.html entry page.

5.8. Database migration

5.8.1. Upgrade

Below we outline the steps for migrating petascopedb (from vX.X to vY.Y, or from one DBMS to another, like PostgreSQL to HSQLDB):

  1. If using an embedded database like HSQLDB, which does not support multiple connections from different applications, make sure that the (new) petascope 9.5 is stopped.
  2. Execute the migration script: ./migrate_petascopedb.sh
  3. All coverages in the pre-9.5 petascopedb will be read with the old CoverageMetadata model, which is included in the new petascope as a legacy package.
  4. If a coverage id does not yet exist in the new petascopedb, the old CoverageMetadata model is translated to the CIS coverage data model and then persisted in petascopedb.
  5. While the migration is running, all services of the new petascope web application (WCS, WCPS, WMS, and WCS-T) will be unavailable, to make sure the data is migrated safely.


Migrating from v9.4 to v9.5 creates a new petascopedb database; the existing petascopedb is not modified, just renamed to petascopedb_94_backup.

5.8.2. Rollback

The following parts of petascope are saved to allow for rollback in the unlikely case of an unsuccessful migration.

  • The old petascopedb is preserved (renamed to petascopedb_94_backup) to allow the user to test the new petascope version first. It is their decision, ultimately, whether to drop the previous database or not.
  • The old petascope web application will be moved from rasdaman.war to rasdaman.war.old in the Tomcat webapps directory
  • The petascope.properties is backed up to petascope.properties.DATE.bak by the update_properties.sh script.

To rollback:

  1. rename the new petascope (rasdaman.war and petascope.properties to rasdaman.war.new and petascope.properties.new)
  2. rename the old petascope back to the original (rasdaman.war.old and petascope.properties.DATE.bak to rasdaman.war and petascope.properties).
  3. rename the backed-up database from petascopedb_94_backup to petascopedb.
  4. restart Tomcat

5.9. 3rd party libraries

Petascope uses various 3rd party libraries, documented on the overall rasdaman code provenance page.