I have been trying to figure out the best way to perform a bulk update on the
data. The process that I was expecting to work is the following.
- Perform a query to identify the records to be changed.
- Select all records from the query.
- Select "Action on Selection" -> "Export Zip", which seems to be a MEF2
format.
- Uncompress the MEF file.
- Perform the bulk update on the metadata.xml records (see the sketch after
this list).
- Recompress the MEF file.
- Upload the MEF file to GeoNetwork.
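For what it's worth, here is a rough Python sketch of how I've been scripting the uncompress/edit/recompress steps (the names "export.zip" and "mef_work" and the simple text substitution are just placeholders for my actual files and bulk change):

import shutil
from pathlib import Path

MEF_ZIP = Path("export.zip")   # the zip downloaded via "Export Zip" (placeholder name)
WORKDIR = Path("mef_work")     # scratch directory (placeholder name)

# 1. Uncompress the exported MEF
shutil.unpack_archive(MEF_ZIP, WORKDIR)

# 2. Bulk update every metadata/metadata.xml; the text substitution below
#    is only a stand-in for the real edit
for md in WORKDIR.rglob("metadata/metadata.xml"):
    text = md.read_text(encoding="utf-8")
    text = text.replace("Old Organisation Name", "New Organisation Name")
    md.write_text(text, encoding="utf-8")

# 3. Recompress into a new zip ready to import back into GeoNetwork
shutil.make_archive("export_modified", "zip", WORKDIR)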
At first I was having a problem with MCP records causing issues (thanks
Simon for fixing this), but now I'm having an issue where the zip file
contains metadata.iso19139.xml files in the metadata folder. Because this
file exists, when loading the data GeoNetwork chooses it, as it is the
preferred schema.
The metadata that I'm performing updates on contains a mix of schemas.
I don't have a preferred schema; I just want to get the modified
metadata.xml reloaded with the changes.
The only way I was able to do this was to add another step to the process,
in which I remove all files from the metadata folder that are not called
metadata.xml, but this does not seem right!
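The cleanup step itself is easy enough to script; something like this (again just a sketch, reusing the placeholder working directory from the snippet above) is what I'm doing at the moment:

import shutil
from pathlib import Path

WORKDIR = Path("mef_work")   # uncompressed MEF from the earlier sketch (placeholder name)

# Delete every file in each record's metadata folder except metadata.xml,
# so the import has nothing else to prefer
for f in WORKDIR.rglob("metadata/*"):
    if f.is_file() and f.name != "metadata.xml":
        f.unlink()

# Recompress the cleaned-up tree for import
shutil.make_archive("export_cleaned", "zip", WORKDIR)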
Suppose I simply wanted to migrate data between two GeoNetwork nodes. I
would expect the process to be the same as above, minus the editing of the
files. It seems weird to have an exported zip file that cannot simply be
loaded on the new site.
1 - Is it possible to create a MEF file without creating the other
metadata.iso19139.xml files?
2 - Is it possible to give higher priority to the original schema (i.e. the
metadata.xml file) during the import?
Maybe I'm doing it all wrong and there is a better way...
Testing was done with GeoNetwork 2.8RC2.