Elastic field limit exceeded harvesting large catalogs (GeoNetwork 4.4 / Elastic 8)

With GeoNetwork 4.4.9 and Elasticsearch 8.15.3 the field limit of 4000 is exceeded when harvesting a large catalog (EMODnet in this case).

When adding harvested records to the local node, eventually this error appears for every document that tries to add new fields:

```
geonetwork-1        | 2026-02-19T18:11:03,733 ERROR [geonetwork.index] - Document with error #5c9baab1-04c3-4caa-9196-da5611d1a910: [1:1684] failed to parse: Limit of total fields [4000] has been exceeded while adding new fields [17].
```

Records that do not add new fields are indexed just fine.

With GeoNetwork 4.2 and Elasticsearch 7 the issue did not appear. Both are running from official docker images with minimal, presumably unrelated configuration changes.

Arbitrarily increasing the total field limit in Elasticsearch is discouraged as it may decrease performance: Mapping limit settings | Reference
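If raising the limit is still the pragmatic stopgap, it can be done per index; a minimal sketch, assuming the default GeoNetwork 4 index name `gn-records` and an arbitrary new limit of 6000 (both would need adjusting to the actual setup):

```json
PUT gn-records/_settings
{
  "index.mapping.total_fields.limit": 6000
}
```

The mapping still grows with every harvested catalog that introduces new fields, so this only postpones the problem rather than solving it.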

Using a flattened field type can reduce the number of fields in dynamic mapping but might affect search functionality: Flattened field type | Reference
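As a sketch of what that could look like, assuming a hypothetical object field `otherProperties` that accounts for most of the dynamically created fields:

```json
PUT gn-records/_mapping
{
  "properties": {
    "otherProperties": { "type": "flattened" }
  }
}
```

A flattened field counts as a single entry in the mapping no matter how many keys it contains, but all leaf values are treated as keywords, so numeric or date queries and per-sub-field analyzers are not available. Note also that an existing field's type cannot be changed in place; applying this would require a new index and a reindex/reharvest.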

Using dynamic templates or an explicit mapping could also reduce the number of dynamically mapped fields, as far as I understand, but fully covering the metadata standard used in the catalog might require a lot of work.
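A minimal sketch of such a dynamic template, assuming a hypothetical field-name pattern `cl_*` under which many of the new fields appear; each matching object would be mapped as a single flattened field instead of every leaf getting its own mapping entry:

```json
PUT gn-records/_mapping
{
  "dynamic_templates": [
    {
      "codelists_flattened": {
        "path_match": "cl_*",
        "match_mapping_type": "object",
        "mapping": { "type": "flattened" }
      }
    }
  ]
}
```

Dynamic templates only apply to fields that are not yet in the mapping, so this would have to be in place (or the index rebuilt) before harvesting, and the pattern would need to be derived from the metadata standard actually used in the catalog.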

Has anyone already experienced this issue and can recommend a suitable approach for Elasticsearch and GeoNetwork?