Updating CITE tests

Hi all, especially @aaime @groldan @jive and those who have knowledge about CITE tests.

I have spent a few days getting a feel for the CITE (Compliance & Interoperability Testing Evaluation) tests. This is what I have found so far, and I would appreciate guidance on where to go next. I see there are marked differences between running the CITE tests in the following different ways:

  1. the official TEAM Engine
    1b. beta: TEAM Engine
  2. GitHub actions
  3. Jenkins
  4. local Makefile

The differences are in:

a. the version of TeamEngine
b. the number of tests run/passed
c. the success logging

The table below shows (a) the version of TEAM Engine and (b) the number of tests passed/run per suite:

| Way of running | TEAM Engine version | features10 | wms13 | wms11 | wfs11 | wfs10 | wcs11 |
|---|---|---|---|---|---|---|---|
| 1. the official TEAM Engine | 5.6.1 | 260/278 + 411/411 | 228/228 + 0/0 + 16/16 + 7/7 | old report format | old report format | old report format | |
| 1b. beta: TEAM Engine | 5.7 | 260/278 + 411/411 | 228/228 + 0/0 + 16/16 + 7/7 | | | | |
| 2. GitHub actions | 1.7.1-teamengine-5.4.1 | 2284/2381 | ?585/585 | ?263/263 | ?291/291 | ?897/897 | ?303/303 * |
| 3. Jenkins | last run approx. 3 years 8 months ago | | | | | | |
| 4. local Makefile | ogccite/ets-ogcapi-features10:1.8-teamengine-5.7 | 2284/2381 | can't run | can't run | can't run | can't run | can't run ** |

* The file logs/testng-results.xml does not exist. Skipping the report generation.
** Can’t run locally on Windows, still trying on Ubuntu WSL

Observations

  1. It would appear that the GitHub actions (example) are the most recent and complete. All 6 tests pass. However, for the original 5 suites there is no summary of tests (make print-full-failures → "The file logs/testng-results.xml does not exist. Skipping the report generation."), so I simply counted the number of "Passed" strings in the logs (see the sketch after this list).

  2. The number of tests in the GitHub actions appears to be higher than on TEAM Engine. Does the TE version have anything to do with this?

  3. I have not attempted any WFS (destructive) tests yet. I also note our minutes from 2024-11-19 saying that WCS 1.0 version negotiation does not work, so we need a cut-down WAR.

  4. For the old report format in TEAM Engine, there appears to be no indication of whether a test is successful or not. Is this expected? How does certification work in this case?
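Something along these lines reproduces that "Passed" count on a downloaded log (a sketch only: the file name depends on how the workflow archives the output, and counting "Failed" the same way assumes failures are reported symmetrically):

grep -c " Passed$" teamengine.log   # lines reporting an individual test as Passed
grep -c " Failed$" teamengine.log   # assumption: failures use a matching "... Failed" marker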

Actions

Please guide me where to go from here. Should the old tests be changed to use the REST API similar to ogcapi-features10 or should their logs be fixed to report Pass/Fail properly?

Should I try to certify the current main branch (incl. the ogcapi-features10 extension)? If so, how does point 4 (old report format) above impact it?

How do I prepare a WCS 1.0 WAR file? I assume I need to exclude the later WCS 2.0?

Thanks

Peter

Hi, Peter,

About a year ago, I did a bunch of little things to get the CITE tests passing.

I was using a local TEAM Engine - which is really quite difficult to get set up. I made some PRs to fix some of the tests.

I didn't complete the WCS tests, if I recall correctly - there were some problems that I didn't think could be solved easily. Some of that had to do with version negotiation and inconsistencies between the various versions of the spec on that point.

Hope that helps…
Dave


I was using a local TEAM Engine - which is really quite difficult to get set up. I made some PRs to fix some of the tests.

We've been playing with it a bit recently too, and eventually found an easy way:

docker run -p 8081:8080 --rm ogccite/teamengine-production

On port 8081 you have a teamengine like the one on the production site: log in as ogccite/ogccite and start testing.
If you're on Linux and want to test a GeoServer running on the host, use 172.17.0.1 as the IP; if on Mac or Windows, use host.docker.internal instead.
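For example, with a GeoServer running on the host's standard port, the capabilities URL to paste into the test suite form would look something like this (port and context path are whatever your local install uses):

# as seen from inside the teamengine container on Linux
http://172.17.0.1:8080/geoserver/wms?service=WMS&version=1.3.0&request=GetCapabilities
# on Mac or Windows
http://host.docker.internal:8080/geoserver/wms?service=WMS&version=1.3.0&request=GetCapabilities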

See also:

Cheers
Andrea


I have spent a few days getting a feel for the CITE (Compliance & Interoperability Testing Evaluation) tests. This is what I have found so far, and I would appreciate guidance on where to go next. I see there are marked differences between running the CITE tests in the following different ways:

  1. the official TEAM Engine
    1b. beta: TEAM Engine
  2. GitHub actions
  3. Jenkins
  4. local Makefile

You can safely ignore Jenkins for the moment, it's still based on old machinery.
The GitHub actions simply run the Makefile, executing a few commands against it in sequence.

The differences are in:

a. the version of TeamEngine

Actually, also the version of the test suite being run. If you compare the production and beta team engines,
you may find a different version of the teamengine itself, but most importantly, a different version of the test suite.
Teamengine is like a servlet container: it does nothing on its own, but offers an environment to execute the test suite.

b. the number of tests run/passed
c. the success logging

The table below shows (a) the version of TEAM Engine and (b) the number of tests passed/run per suite:

| Way of running | TEAM Engine version | features10 | wms13 | wms11 | wfs11 | wfs10 | wcs11 |
|---|---|---|---|---|---|---|---|
| 1. the official TEAM Engine | 5.6.1 | 260/278 + 411/411 | 228/228 + 0/0 + 16/16 + 7/7 | old report format | old report format | old report format | |
| 1b. beta: TEAM Engine | 5.7 | 260/278 + 411/411 | 228/228 + 0/0 + 16/16 + 7/7 | | | | |
| 2. GitHub actions | 1.7.1-teamengine-5.4.1 | 2284/2381 | ?585/585 | ?263/263 | ?291/291 | ?897/897 | ?303/303 * |
| 3. Jenkins | last run approx. 3 years 8 months ago | | | | | | |
| 4. local Makefile | ogccite/ets-ogcapi-features10:1.8-teamengine-5.7 | 2284/2381 | can't run | can't run | can't run | can't run | can't run ** |

* The file logs/testng-results.xml does not exist. Skipping the report generation.
** Can’t run locally on Windows, still trying on Ubuntu WSL

No attempt has been made to run the makefiles on anything but *nix environments.
Certification-wise, you don't need to bother with anything but option number 1: we need a record of a successful
run against production to ask for certification, all other methods are for our own internal testing only.

It would appear that GitHub actions (example) are the most recent and complete. All 6 tests pass.

With the exception of ogcapi-features-10, which is the most recent, all other test suites are outdated by several years.
Gabriel made them run again, leveraging the work done 6 years ago by GeoSolutions to have the tests runnable with docker.
Back then, the docker version was only runnable via the HTML interface, the CLI was not dockerized,
and the REST interface was available only for a few test suites (see also a discussion one year down the line,
and links to work to be done in order to get them going). Eventually, in time, most suites became runnable via
REST, with the notable exception of WCS 1.0, which is not runnable even today. The work needed to get one suite
runnable via REST is not trivial, see the WCS 1.1 example.

To get things going, my colleagues forked the teamengine docker project and made it work with the CLI as well.
You can find the actual test suite version numbers used for the "classic suites" here. As you can see, we're several
versions behind. But even like this, some testing is better than no testing at all.
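To give an idea of what "runnable via REST" means in practice, a suite exposed that way can be driven with plain HTTP calls, roughly like the sketch below. The suite id, parameter name and endpoint shape are assumptions here and vary per suite and per teamengine version, so check what your engine actually lists under rest/suites:

# list the suites this engine exposes over REST (credentials as per the docker image above)
curl -u ogccite:ogccite http://localhost:8081/teamengine/rest/suites

# run one suite against a capabilities document and save the raw result
curl -u ogccite:ogccite -G http://localhost:8081/teamengine/rest/suites/wms13/run \
  --data-urlencode "capabilities-url=http://172.17.0.1:8080/geoserver/wms?service=WMS&request=GetCapabilities" \
  -o results.xml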

In the meantime, I've started an exploratory branch to see if we can get wcs20 to run with the REST API, and I'm waiting for some feedback from Gabriel:
https://github.com/geoserver/geoserver/pull/8095

The idea of the branch is to figure out how to run an old test suite, still written in XML rather than TestNG, and process its results
(which come in a different format). Specifically for WCS 2.0 it's going to take a while: there are fixes to be done in GeoServer,
but also in the test suite itself (I have a tentative fix for it going, but I'm blocked on executing the test suite).

However, for the original 5 suites, there is no summary of tests (make print-full-failures → "The file logs/testng-results.xml does not exist. Skipping the report generation."), so I simply counted the number of "Passed" strings in the logs.

That one should just print the failures, and it's designed only for ogcapi-features. However, if you skip to "print teamengine logs" you'll find something like this:

teamengine-1 | Testing suite wms:main in Test Mode with defaultResult of Pass ...
teamengine-1 | ******************************************************************************************************************************
teamengine-1 | Testing wms:wms_main type Mandatory in Test Mode with defaultResult Pass (s0001)...
teamengine-1 | Assertion: This WMS is valid
teamengine-1 | ******************************************************************************************************************************
teamengine-1 | Testing wms:wms-no-profile type Mandatory in Test Mode with defaultResult Pass (s0001/d68e11628_1)...
teamengine-1 | Assertion: This WMS is valid against 'no' profile
teamengine-1 | ******************************************************************************************************************************
teamengine-1 | Testing wms:basic_elements-param_rules-order_and_case-3 type Mandatory in Test Mode with defaultResult Pass (s0001/d68e11628_1/d68e12128_1)...
teamengine-1 | Assertion: When a GetCapabilities request contains a parameter which is not defined by the spec, the result is valid.
teamengine-1 | Test wms:basic_elements-param_rules-order_and_case-3 Passed
teamengine-1 | ******************************************************************************************************************************
teamengine-1 | Testing wms:basic_elements-param_rules-order_and_case-4 type Mandatory in Test Mode with defaultResult Pass (s0001/d68e11628_1/d68e12137_1)...
teamengine-1 | Assertion: When a GetMap request contains a parameter which is not defined by the spec, the result is valid.
teamengine-1 | Test wms:basic_elements-param_rules-order_and_case-4 Passed
teamengine-1 | ******************************************************************************************************************************
teamengine-1 | Testing wms:basic_elements-param_rules-order_and_case-5 type Mandatory in Test Mode with defaultResult Pass (s0001/d68e11628_1/d68e12151_1)...
teamengine-1 | Assertion: When a GetFeatureInfo request contains a parameter which is not defined by the spec, the result is valid.
teamengine-1 | Test wms:basic_elements-param_rules-order_and_case-5 Passed
teamengine-1 | ******************************************************************************************************************************
teamengine-1 | Testing wms:basic_elements-version-negotiation-2 type Mandatory in Test Mode with defaultResult Pass (s0001/d68e11628_1/d68e12165_1)...
teamengine-1 | Assertion: When a GetCapabilities request is made for a supported version, then the response is the requested version.
teamengine-1 | Test wms:basic_elements-version-negotiation-2 Passed
teamengine-1 | ******************************************************************************************************************************
teamengine-1 | Testing wms:basic_elements-version-negotiation-4 type Mandatory in Test Mode with defaultResult Pass (s0001/d68e11628_1/d68e12172_1)...
teamengine-1 | Assertion: When a GetCapabilities request is made for version 0.0.0, the response is between 0.0.0 and [[VAR_WMS_VERSION]], inclusive.
teamengine-1 | Test wms:basic_elements-version-negotiation-4 Passed
teamengine-1 | ******************************************************************************************************************************
teamengine-1 | Testing wms:dims-declaring-3 type Mandatory in Test Mode with defaultResult Pass (s0001/d68e11628_1/d68e12179_1)...
teamengine-1 | Assertion: All declarations for the time dimension use 'ISO8601' for units.
teamengine-1 | VAR_LAYERS_WITH_NONSTANDARD_TIME_UNITS:
teamengine-1 | Test wms:dims-declaring-3 Passed
...

This is the actual, complete test report offered by a classic XML-based test suite.
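If you only want to pull the problems out of that wall of text, a plain grep over the captured output is enough; a sketch, assuming you redirected the teamengine logs to a file and that failures carry a trailing "Failed" marker symmetric to the "Passed" one above:

grep -B 3 " Failed$" teamengine.log   # show the test id and assertion of each failing test
grep -c " Failed$" teamengine.log     # and count them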

  2. The number of tests in the GitHub actions appears to be higher than on TEAM Engine. Does the TE version have anything to do with this?

Depends on the test suite. Classic test suites like WMS 1.1 or WFS 1.1 require specific data and should run the same set of tests, assuming they are run with the
same configuration. The configuration controls which parts of the test suite are run (e.g., we don't run the WMS dimension tests for time and elevation).
Newer suites do not make assumptions about the dataset, meaning they will test what they find, and since they run a number of tests on each offered layer, the number
of tests run varies depending on the test data.

  3. I have not attempted any WFS (destructive) tests yet. I also note our minutes from 2024-11-19 saying that WCS 1.0 version negotiation does not work, so we need a cut-down WAR.
  4. For the old report format in TEAM Engine, there appears to be no indication of whether a test is successful or not. Is this expected? How does certification work in this case?

You look at the “detailed old test report” and find the answer there.

Actions

Please guide me where to go from here. Should the old tests be changed to use the REST API similar to ogcapi-features10 or should their logs be fixed to report Pass/Fail properly?

Attacking this problem IMHO requires a core developer: it's not just a matter of machinery, but oftentimes of making fixes both in GeoServer and in the test suite, which in turn requires good
knowledge of the specification and some discussion with the spec leads.

Working on certification, instead, is a more attainable goal:

  1. Stand up a public GeoServer (see the sketch after this list)
  2. Run a test suite
  3. If it works, ask for certification (we need to figure out the process here)
  4. If not, maybe it’s a matter of changing the test data fed to the suite, or reducing the number of optional classes. Having something certified is better than where we stand now,
    and we’ll tackle what’s missing in a future iteration.
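For step 1, the quickest route is probably the official docker image; a sketch only (the tag is just an example, pick a current stable release, and remember that the classic suites also expect their specific test data to be loaded):

docker run -d -p 8080:8080 docker.osgeo.org/geoserver:2.26.2
# then expose it publicly (reverse proxy, cloud VM, ...) so the production teamengine can reach it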

Should I try to certify the current main branch (incl. the ogcapi-features10 extension)? If so, how does point 4 (old report format) above impact it?

I don’t think we can do it, not until the test suite fixes made by Gabriel, and currently found in tag 1.8,
are made available in the production engine (currently providing 1.6).

There is a good chance we could pass the tests in WMS 1.1 and 1.3, WCS 1.1, as well as WFS 1.0 and 1.1,
even in their "read only" version (hopefully one can choose not to test transactions), and then get certified again based on those results.
That would already be quite the achievement, considering we have been without certification for a long while.

How do I prepare a WCS 1.0 WAR file? I assume I need to exclude the later WCS 2.0?

You need to exclude both WCS 1.1 and WCS 2.0 (unless they fixed the test suite in the meantime, I have not tried).
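A crude way to do that, assuming the standard release WAR and the usual module jar naming (the jar names below are an assumption, double check what actually sits in WEB-INF/lib), is to unpack the WAR, drop the WCS 1.1 and 2.0 jars, and repack it:

mkdir wcs10-war && cd wcs10-war
unzip -q ../geoserver.war
rm WEB-INF/lib/gs-wcs1_1-*.jar WEB-INF/lib/gs-wcs2_0-*.jar
zip -qr ../geoserver-wcs10.war .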

Cheers
Andrea


Many thanks indeed, Andrea, for the comprehensive response. I will see what progress I can make today, and then discuss in the PSC meeting tonight.

Peter