[Geoserver-users] Passing params in WMS getMap for a Feature Query

Hi All,

I am writing a datastore to work on a set of tab-delimited files
representing density layers, which I have almost working.
I have a feature type which effectively names the file, but am stuck with a
really stupid problem... I can't work out how to pass the file name in
through the WMS URL (at the moment, I just hard-code it).

Can someone please tell me what I need to do to this openlayers code to
pass in the file="a.txt" query:

var fileLayer = new OpenLayers.Layer.WMS(
    "fileLayer",
    "http://localhost:8080/geoserver/wfs?",
    { layers: "gbif:TestFile", version: "1.0.0",
      transparent: "true", format: "image/png" }
);

Or even if someone could tell me how to pass in a filter in the WMS url.

I am assuming it will come through as a Query or Filter somewhere in the
code (I must admit that I am hacking and debugging a little without full
understanding...)

Cheers

Tim
(For a more complete background: I already have a backend system that can
produce density layers in tab-delimited format, and I am trying to get
GeoServer to render maps from them by basically mapping a callback file URL
to the feature and then crafting an AttributeReader to transform the file
into a format GeoServer understands.)


GeoServer uses the FeatureType names your datastore provides, but you
have to register each feature type by hand in the GeoServer configuration
system; you cannot simply tell it "load file xxx.dat".
Is this the problem you are facing?
As for passing a filter, you can use the FILTER=XXX or CQL_FILTER=XXX
parameters: the first accepts an XML-encoded OGC Filter, the second a
simpler syntax known as CQL. See:
http://docs.codehaus.org/display/GEOSDOC/WMS+vendor+parameters
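For example, CQL_FILTER is just one more key/value pair on the GetMap query
string; a quick sketch (the layer name comes from your mail, the attribute
name "file" is only illustrative):

```javascript
// Build a WMS GetMap URL carrying a CQL_FILTER vendor parameter.
// The filter must be URL-encoded like any other query parameter.
function getMapUrl(base, params) {
  var parts = [];
  for (var key in params) {
    parts.push(key + "=" + encodeURIComponent(params[key]));
  }
  return base + "?" + parts.join("&");
}

var url = getMapUrl("http://localhost:8080/geoserver/wms", {
  service: "WMS",
  request: "GetMap",
  layers: "gbif:TestFile",
  CQL_FILTER: "file = 'a.txt'"   // illustrative attribute name
});
// the filter ends up as ...CQL_FILTER=file%20%3D%20'a.txt'
```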

Cheers
Andrea

Hi Andrea

I have it now working - thanks for your help.

I have registered a feature type, "density layer", which takes a filter
"fileURL = some_url" that must be present or the datastore returns no
records. The datastore uses the passed-through file URL and iterates
over the results, creating a FeatureCollection.
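For reference, the mandatory filter simply rides in the layer's params
object, so every GetMap request carries it automatically (the fileURL value
below is just a placeholder):

```javascript
// Params for an OpenLayers WMS layer whose requests must always carry the
// mandatory fileURL filter; OpenLayers appends each key to every GetMap call.
var wmsParams = {
  layers: "gbif:TestFile",
  transparent: "true",
  format: "image/png",
  CQL_FILTER: "fileURL = 'http://example.org/density/a.txt'"  // placeholder URL
};

// What the filter looks like once URL-encoded on the wire:
var encoded = encodeURIComponent(wmsParams.CQL_FILTER);
```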
I am likely to deploy it to the public servers sometime this week and will
post on our wiki what we did. It will serve density layers (counts of
occurrences per 1x1 and 0.1x0.1 degree cell) of species distributions by
name, country and data provider. In actual fact, our server could
render any density layer if we open up the firewall to allow the callback
URL - we serve maps for other people already, and will probably offer this
service for people sharing biodiversity data.

Best wishes,

Tim



Aaah, nice solution! So all of the files have the same structure,
and thus the set of files can be thought of as a partition of a bigger table.


Very nice indeed, good work :-)
Cheers
Andrea

Hi Andrea et al,

Many thanks. I have put up a demo at http://geoserver.gbif.org, and attach
an OpenLayers client to demo it. This is likely to be dropped in the near
future as it does not implement correct caching (on our part, not
GeoServer's) and thus will blow up with memory issues.

The background:
GBIF collates many datasets of biodiversity information into a single
searchable index (MySQL) and has a data portal at http://data.gbif.org.
Currently our mapping is very primitive and we are investigating how to
better serve our data following correct standards. One of our issues is the
quantity of data (85 million records and increasing).

This is a demo of a proposed stopgap solution. We already have the ability
to serve tab-delimited data of counts per 1x1 degree cell and 0.1x0.1 degree
cell (i.e. density counts), but we were not serving this using open
standards. This implementation is a custom DataStore which takes an OGC
filter that must contain the "path" - this path is a URL to the density
file that our subsystems can already serve. The DataStore gets this file
and then transforms it into features that GeoServer can render.
It is primitive - it only does the 1x1 degree cells right now - but it is
standards-compliant and allows passing in SLDs etc., and it was a quick
way of getting an 85-million-record dataset online and of getting into the
GeoServer architecture.

What I would ideally like to have done is write a plugin for GeoServer which
allowed you to define some "dynamic catalogue" - e.g. the request comes in
naming a dynamic layer - rather than having to add the "path" to the
FeatureType. This could then be used for named partitions, for example.
Speaking of which - I am wondering how GeoServer would cluster...

Best wishes,

Tim

gbif.html (2.91 KB)


I noticed that you load two cell layers, superimposed, in your attached
demo.


Ah ha, seems like a good solution to me.


Hmm... frankly I would do the opposite, that is, keep the files like
you are doing. To make this standards-compliant, WFS GetCapabilities would
have to report back all of the files you are serving, and as a result the
capabilities document would be very, very large, putting most clients into
trouble.

Speaking of which - I am wondering how Geoserver would cluster...

Hmm, it should not be a problem if you have the same configuration on
all machines and it's stable. If the config changes, you would have
to replicate the configuration on all machines, or store it on a network
disk, elect one instance as the one you use for changing the files, and
then issue a set of HTTP requests mimicking a user to the other machines
so that the config is re-read and updated to the latest version.
I know it's clunky, but it's what we've got now :-(

Cheers
Andrea