Releases: IQSS/dataverse
v4.11
This release adds OAI-ORE and BagIT for archival submissions (development led by the Qualitative Data Repository), additional custom homepage options, custom analytics, and file hierarchy support for zip files.
For the complete list of issues, see the 4.11 milestone in GitHub.
For help with upgrading, installing, or general questions please post to the Dataverse Google Group or email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
- Undeploy the previous version.
- <glassfish install path>/glassfish4/bin/asadmin list-applications
- <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, then start glassfish again.
- service glassfish stop
- rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Install and configure Solr v7.3.1.
See http://guides.dataverse.org/en/4.11/installation/prerequisites.html#installing-solr
- Deploy this version.
- <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.11.war
- Run the db update script
psql -U <db user> -d <db name> -f upgrade_v4.10.1_to_v4.11.sql
- Restart glassfish
- Index all metadata
curl http://localhost:8080/api/admin/index
- If you have Google Analytics or Piwik analytics configured, remove the deprecated :GoogleAnalyticsCode, :PiwikAnalyticsId, :PiwikAnalyticsHost, and :PiwikAnalyticsTrackerFileName settings and use :WebAnalyticsCode instead. The new setting works like the custom HTML files used for branding, giving you more control over your analytics and making it easier to customize what you track. See Web Analytics Code in the Guides for more details.
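As a sketch, this settings migration can be done through the admin Settings API. The HTML fragment path below is a placeholder, and the `|| true` guards let the script continue past settings that were never set; adjust both for your installation:

```shell
# Remove the deprecated analytics settings, then point :WebAnalyticsCode at an
# HTML fragment file (path below is a placeholder -- use your own).
BASE="http://localhost:8080"
DEPRECATED=":GoogleAnalyticsCode :PiwikAnalyticsId :PiwikAnalyticsHost :PiwikAnalyticsTrackerFileName"

for s in $DEPRECATED; do
  curl -X DELETE "$BASE/api/admin/settings/$s" || true
done

# :WebAnalyticsCode takes the path to an HTML fragment, like the branding files
curl -X PUT -d '/var/www/dataverse/branding/analytics-code.html' \
  "$BASE/api/admin/settings/:WebAnalyticsCode" || true
```

If your installation blocks the admin API from outside localhost (the recommended configuration), run this on the server itself.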
A note on upgrading from older versions:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version with the exception of db updates as noted.
We now offer an EXPERIMENTAL database upgrade method allowing users to skip over a number of releases. E.g., it should be possible now to upgrade a Dataverse database from v4.8.6 directly to the current release, without having to deploy the war files for the 5 releases between these 2 versions and manually running the corresponding database upgrade scripts.
The upgrade script, dbupgrade.sh, is provided in the scripts/database directory of the Dataverse source tree. See the file README_upgrade_across_versions.txt for instructions.
v4.10.1
This is a patch release that fixes an issue where datasets sometimes had trouble publishing when file DOI minting was enabled and DataCite was configured as the PID provider. This issue was a latent bug that existed in earlier versions but was revealed by a recent change in DataCite API behavior.
Thanks to Jim Myers (@qqmyers) for troubleshooting this and providing a solution!
See #5427 for more details.
Upgrade instructions from v4.10:
- Undeploy current war file
- Stop Glassfish
- Remove the /usr/local/glassfish4/glassfish/domains/domain1/generated directory
- Start Glassfish
- Deploy new war file
v4.10
This release includes support for large data transfers and storage, a simplified upgrade process, and internationalization.
All installations will be able to use Dataverse's integration with the Data Capture Module, an optional component for deposition of large datasets (both large number of files and large file size). Specific support for large datasets includes client-side checksums, non-http uploads (currently supporting rsync via ssh), and preservation of in-place directory hierarchy. This expands Dataverse to other disciplines and allows project installations to handle large-scale data.
Administrators will be able to configure a Dataverse installation to allow datasets to be mirrored to multiple locations, allowing faster data transfers from closer locations, access to more efficient or cost effective computation, and other benefits.
Internationalization features provided by Scholar's Portal are now available in Dataverse.
Dataverse Installation Administrators will be able to upgrade from one version to another without the need to step through each incremental version.
Configuration options for custom S3 URLs of Amazon S3-compatible storage are now available. See the configuration documentation for details.
For the complete list of issues, see the 4.10 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
- Undeploy the previous version.
- <glassfish install path>/glassfish4/bin/asadmin list-applications
- <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, then start glassfish again.
- service glassfish stop
- rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.10.war
- Run db update script
psql -U <db user> -d <db name> -f upgrade_v4.9.4_to_v4.10.sql
- Restart glassfish
- Update citation metadata block
curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"
- Restart glassfish
- Replace Solr schema.xml; optionally replace solrconfig.xml to change the search results boost logic
- Stop the solr instance (service solr stop, depending on solr installation/OS; see http://guides.dataverse.org/en/4.10/installation/prerequisites.html#solr-init-script)
- Replace schema.xml and, optionally, solrconfig.xml:
cp /tmp/dvinstall/schema.xml /usr/local/solr/solr-7.3.0/server/solr/collection1/conf
cp /tmp/dvinstall/solrconfig.xml /usr/local/solr/solr-7.3.0/server/solr/collection1/conf
- Start the solr instance (service solr start, depending on solr/OS)
- Kick off an in-place reindex
http://guides.dataverse.org/en/4.9.3/admin/solr-search-index.html#reindex-in-place
curl -X DELETE http://localhost:8080/api/admin/index/timestamps
curl http://localhost:8080/api/admin/index/continue
- Retroactively store original file size
Starting with release 4.10, the size of the saved original file (for an ingested tabular datafile) is stored in the database. We provide the following API that retrieves and permanently stores the sizes for any already existing saved originals: /api/admin/datafiles/integrity/fixmissingoriginalsizes (see the documentation note in the Native API guide, under "Datafile Integrity"). Later versions (specifically 5.0) will require these sizes in the database. In this version, having them makes certain operations more efficient (for example, a user downloading the saved originals for multiple files or an entire dataset). Also, if present in the database, the size is added to the file information in the output of /api/datasets, which can be useful for some users.
- Run ReExportall to generate JSON-LD exports in the new format added in 4.10: http://guides.dataverse.org/en/4.10/admin/metadataexport.html?highlight=export#batch-exports-through-the-api
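For reference, the last two admin API calls above might look like the following. The reExportAll endpoint name is taken from the linked batch-export guide; verify it against your version's documentation before running:

```shell
BASE="http://localhost:8080"
FIX_URL="$BASE/api/admin/datafiles/integrity/fixmissingoriginalsizes"
REEXPORT_URL="$BASE/api/admin/metadata/reExportAll"

# store the missing saved-original sizes first ...
curl "$FIX_URL" || true
# ... then regenerate all metadata exports in the new JSON-LD format
curl "$REEXPORT_URL" || true
```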
A note on upgrading from older versions:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version with the exception of db updates as noted.
We now offer an EXPERIMENTAL database upgrade method allowing users to skip over a number of releases. E.g., it should be possible now to upgrade a Dataverse database from v4.8.6 directly to v4.10, without having to deploy the war files for the 5 releases between these 2 versions and manually running the corresponding database upgrade scripts.
The upgrade script, dbupgrade.sh, is provided in the scripts/database directory of the Dataverse source tree. See the file README_upgrade_across_versions.txt for instructions.
v4.9.4
This release addresses a bug introduced in 4.9.3 which prevented users from logging in with an email address (#5129).
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
- Undeploy the previous version.
- <glassfish install path>/glassfish4/bin/asadmin list-applications
- <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish and remove the generated directory, start
- service glassfish stop
- remove the generated directory: rm -rf <glassfish install path>glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.9.4.war
- Restart glassfish
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version.
v4.9.3
Note: We recommend upgrading to 4.9.4, which includes a fix for a bug that prevented users from logging in using an email address. Learn more in the 4.9.4 Release Notes.
This release is focused on expanded options for handling datasets and files in Dataverse. The dataset linking feature is now available to all users, not just superusers. It is now accessible through the UI in addition to the API. Users now have the option of downloading all files in a dataset in their original file format via the Download All button, in addition to the already available "archival format" option. Installations can now configure whether or not PIDs will be minted for files. We have also made the application more stable by addressing leaks.
For the complete list of issues, see the 4.9.3 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
- Undeploy the previous version.
- <glassfish install path>/glassfish4/bin/asadmin list-applications
- <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, then start glassfish again.
- service glassfish stop
- rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.9.3.war
- Run db update script
psql -U <db user> -d <db name> -f upgrade_v4.9.2_to_v4.9.3.sql
- Restart glassfish
- Replace Solr schema.xml
- Stop the solr instance (service solr stop, depending on solr installation/OS; see http://guides.dataverse.org/en/4.9.3/installation/prerequisites.html#solr-init-script)
- Replace schema.xml:
cp /tmp/dvinstall/schema.xml /usr/local/solr/solr-7.3.0/server/solr/collection1/conf
- Start the solr instance (service solr start, depending on solr/OS)
- Kick off an in-place reindex
http://guides.dataverse.org/en/4.9.3/admin/solr-search-index.html#reindex-in-place
curl -X DELETE http://localhost:8080/api/admin/index/timestamps
curl http://localhost:8080/api/admin/index/continue
- If you are running TwoRavens as part of your Dataverse installation, please find the following line in your TwoRavens app_ddi.js file:
dataurl = dataurl+"?key="+apikey+"&gbrecs=true";
(or, if you are using an older version, it may look like this:)
dataurl = dataurl+"?key="+apikey;
and change it as follows:
dataurl = dataurl+"?key="+apikey+"%26gbrecs=true";
(this ensures that the download counts are properly incremented for TwoRavens explore sessions, and eliminates the confusing "Warning: The request is not valid json. Check for special characters" messages that some users were seeing after the 4.8.6 upgrade)
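If you prefer to script the edit, a sed sketch like the following handles both variants shown above. The app_ddi.js path is an assumption (adjust to your TwoRavens installation), and the command keeps a .bak backup:

```shell
# Rewrites both the 4.8.6-style line ("&gbrecs=true") and the older line
# (no gbrecs parameter) to use the URL-encoded %26gbrecs=true form.
SED_PROG='s/"?key="+apikey+"&gbrecs=true"/"?key="+apikey+"%26gbrecs=true"/; s/"?key="+apikey;/"?key="+apikey+"%26gbrecs=true";/'
APP_DDI="/var/www/html/dataexplore/app_ddi.js"   # placeholder path

sed -i.bak "$SED_PROG" "$APP_DDI" || true
```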
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version.
v4.9.2
This release is focused on ingest upgrades, new import APIs, and infrastructure upgrades.
Stata 14, Stata 15, and .tsv files will now be ingested by Dataverse. New APIs will allow datasets with existing DOIs to be imported into Dataverse. Bootstrap and Primefaces, which power the Dataverse front end, have been updated.
For the complete list of issues, see the 4.9.2 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
- Undeploy the previous version.
- <glassfish install path>/glassfish4/bin/asadmin list-applications
- <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, then start glassfish again.
- service glassfish stop
- rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.9.2.war
- Run db update script
psql -U <db user> -d <db name> -f upgrade_v4.9.1_to_v4.9.2.sql
- Restart glassfish
If your Dataverse is configured to use R, you need to install one additional R library (haven); Dataverse now uses it to export tabular datafiles as RData. Install the package with the following R command (for example):
install.packages("haven", repos="https://cloud.r-project.org/", lib="/usr/lib64/R/library" )
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version.
v4.9.1
This release contains a patch for a permissions issue introduced in the last release, where a contributor's permissions were not preserved when they added a file while creating a new dataset (#4783). It also includes an updated PostgreSQL driver in order to support PostgreSQL v9.6. Since this driver is universal, older drivers have been removed, but earlier PostgreSQL versions such as v9.3 should continue to work.
For the complete list of issues, see the 4.9.1 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
- Undeploy the previous version.
- <glassfish install path>/glassfish4/bin/asadmin list-applications
- <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, then start glassfish again.
- service glassfish stop
- rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.9.1.war
- Restart glassfish
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version.
v4.9
Note: We recommend upgrading to 4.9.1, which includes a patch to address a high impact bug. Learn more in the 4.9.1 Release Notes.
This release introduces new features: File PIDs and Provenance. A new metrics API has been included. We have updated the Solr version used for search, improved error handling for file upload, fixed memory leaks in Export, and added several more useful APIs: move dataverse, link dataset and dataverse, and uningest a tabular data file. Numerous bug fixes and documentation improvements have been made.
- File PIDs
- Provenance
- Metrics API
- Update Solr to v7.3
- Move Dataverse API
- Link Dataset and Dataverse APIs
- Uningest tabular file
- Make file upload more robust by improving error handling
- Fix memory leak in Export
- Fix issues with contact us email, make from address Dataverse server, reply to address requestor
- Change the way DOIs and Handles are stored in the database to be more flexible with respect to format.
- Add Mixtepec Mixtec language to metadata list of languages.
- Make metadata URLs clickable (e.g., Alternative URL)
For the complete list of issues, see the 4.9 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version.
This release has a number of extra steps, most notably upgrading Solr, migrating DOIs to a new storage format, and reindexing. This will require a brief downtime and a period of incomplete search results while the index rebuilds after the Solr upgrade. It is strongly recommended that you test this upgrade in a test environment and back up your database before deploying to production.
When upgrading from the previous version, you will need to do the following:
- Shut down access to the production service; you do not want users interacting with the site during the upgrade.
- Undeploy current version of Dataverse from each web server.
- <glassfish install path>/glassfish4/bin/asadmin list-applications
- <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, then restart glassfish
- service glassfish stop
- rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Back up production database
- Install and configure Solr v7.3
See http://guides.dataverse.org/en/4.9/installation/prerequisites.html#installing-solr
- Deploy v4.9 to web servers
- <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.9.war
- Upgrade the database by running the update script.
Once again, we STRONGLY RECOMMEND taking a full backup of the database before proceeding with the upgrade. Among other changes in this release, we are rearranging the way DOI identifiers are stored in the database. While your existing persistent identifiers stay the same (as the name suggests!), the update script will modify the database entries (it has to do with how the "authority" and "shoulder" suffix are stored). And since we are modifying something as important as the identifiers of your datasets, it's a great idea to have a handy way to restore your database as it was, in the unlikely event anything goes wrong.
pg_dump --clean <db name> is a good way to save the entire database as an importable .sql file.
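As a concrete sketch of the backup step (the database user, database name, and file name below are placeholders; substitute your own):

```shell
DB_USER="dvnapp"                       # placeholder -- your database user
DB_NAME="dvndb"                        # placeholder -- your database name
BACKUP="dvndb_pre_v4.9_upgrade.sql"

# save the entire database as an importable .sql file
pg_dump --clean -U "$DB_USER" "$DB_NAME" > "$BACKUP" || true

# if the upgrade goes wrong, restore with:
#   psql -U "$DB_USER" -d "$DB_NAME" -f "$BACKUP"
```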
Run the upgrade script:
- psql -U <db user> -d <db name> -f upgrade_v4.8.6_to_v4.9.0.sql
- (Optionally) Enable Provenance
curl -X PUT -d 'true' http://localhost:8080/api/admin/settings/:ProvCollectionEnabled
- Update metadata languages list
curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"
- Restart glassfish
- Clear index, then index all metadata
curl http://localhost:8080/api/admin/index/clear
curl http://localhost:8080/api/admin/index
Please note: Do not run the registerDataFileAll command below if you do not plan to give your files persistent identifiers, which are no longer required in 4.9.3 or later (#4929).
- Run the retroactive file PID registration via the registerDataFileAll endpoint
Note: if you have a large number of files to register, you may want to contact your DOI provider in advance to determine whether this level of traffic will cause a problem for their service.
curl http://localhost:8080/api/admin/registerDataFileAll?key=<super user api token>
This utility logs its progress to server.log, ending with a completion message that reports a total and any failures.
- When file registration completes, perform an in-place reindex.
curl -X DELETE http://localhost:8080/api/admin/index/timestamps
curl http://localhost:8080/api/admin/index/continue
If you are upgrading from v3.x, you will need to perform a migration to v4.x, since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google Group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
IMPORTANT: If you are running TwoRavens as part of your Dataverse installation:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.9/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
And, also, remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
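The comparison-and-sync described above can be scripted. This is a sketch with placeholder root variables (set TWORAVENS_DIR, GLASSFISH_DIR, and DATAVERSE_FILES_DIR to your actual locations; the applications path under domain1 is a typical Glassfish layout, not a guarantee):

```shell
TR_COPY="$TWORAVENS_DIR/dataexplore/rook/preprocess/preprocess.R"
DV_COPY="$GLASSFISH_DIR/glassfish/domains/domain1/applications/dataverse-4.9/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R"
FILES_DIR="$DATAVERSE_FILES_DIR"       # your Dataverse files directory

if cmp -s "$TR_COPY" "$DV_COPY"; then
  echo "preprocess.R copies match; nothing to do"
else
  cp "$TR_COPY" "$DV_COPY" || true                       # the TwoRavens version wins
  find "$FILES_DIR" -name '*.prep' -delete 2>/dev/null || true   # drop stale fragments
fi
```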
v4.8.6
This release introduces a modular Explore feature to support external tools. It includes performance enhancements for S3 storage, a new API endpoint for moving datasets, documentation improvements, and a number of bug fixes.
- Modular Explore (enables tools such as Data Explorer and Two Ravens)
- Redirect to the S3 location, instead of streaming.
- New API: Add api end point to move datasets
- Terms of use for Native API
- Developer Guides - As a new developer, I want a single page set of quick and easy instructions for installing Dataverse
- Fix for: Guestbook - Downloads via API are not counted (if your installation uses TwoRavens, please see the note below!)
- Fix/Clean up handling of 403 and 404 exit codes in data access API
- Fix for: Dataset page: Page fails to load when it cannot understand an image file to generate thumbnail
- Fix for: When setting harvesting schedule to weekly, settings don't persist
- Fix for a memory leak in IndexAll;
- Improvements for: Slow Page Load: Some dataset pages are slow to load, resulting in a timeout error.
For the complete list of issues, see the 4.8.6 milestone in GitHub.
ATTENTION: If you are running TwoRavens as part of your Dataverse installation, the bug fix for the counting of file downloads via the API has a side effect of having users' data explore sessions counted as two downloads. To avoid this, a change in the TwoRavens configuration is required. Find the following line in the TwoRavens app_ddi.js file:
dataurl = dataurl+"?key="+apikey;
and change it as follows:
dataurl = dataurl+"?key="+apikey+"&gbrecs=true";
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- <glassfish install path>/glassfish4/bin/asadmin list-applications
- <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, then start glassfish again.
- service glassfish stop
- rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.8.6.war
- Run db update script
psql -U <db user> -d <db name> -f upgrade_v4.8.5_to_v4.8.6.sql
- Restart glassfish
- Configure TwoRavens as an External Tool
Because the Explore button is now modular, the previous way of enabling TwoRavens using settings will no longer work. To ensure TwoRavens Explore remains available, follow the instructions here:
http://guides.dataverse.org/en/4.8.6/installation/r-rapache-tworavens.html#e-enable-tworavens-button-in-dataverse
- Reindex to update the DOI link format in citations
http://guides.dataverse.org/en/4.8.6/admin/solr-search-index.html#reindex-in-place
- Run ReExportall to update the DOI link format in citations
http://guides.dataverse.org/en/4.8.6/admin/metadataexport.html?highlight=export#batch-exports-through-the-api
If you are upgrading from v3.x, you will need to perform a migration to v4.x, since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google Group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
IMPORTANT: If you are running TwoRavens as part of your Dataverse installation:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.6/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
And, also, remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
v4.8.5
This release fixes issues with IP Groups and Guestbook. It improves the Download All behavior and introduces an experimental backup to secondary storage utility. It also supports AWS IAM role and multiple regions. Other changes support future functionality.
- IP Group access permissions were not activated.
- Guestbook entry form validation was not working, and overwritten prepopulated field values were not being saved.
- Download All check box now downloads all files, regardless of scroll bar position.
- An experimental backup to secondary storage utility is provided.
- A docker container with a standalone Dataverse instance is provided to support automatic integration tests.
- Support for AWS IAM role for S3 driver and multiple regions.
- Modular Configure
- Some infrastructure to support differential privacy.
For the complete list of issues, see the 4.8.5 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.harvard.edu.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- /glassfish4/bin/asadmin list-applications
- /glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, then start glassfish again.
- service glassfish stop
- rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- /glassfish4/bin/asadmin deploy dataverse-4.8.5.war
- Restart glassfish
If you are upgrading from v3.x, you will need to perform a migration to v4.x, since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google Group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
IMPORTANT: If you are running TwoRavens as part of your Dataverse installation:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.3/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
And, also, remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
TROUBLESHOOTING NOTE:
Potential issue with PostgreSQL JDBC driver version incompatibility causing the Dataverse timer to malfunction.
Dataverse uses the EJB timer to perform scheduled tasks: harvesting from remote servers and keeping the local OAI sets and metadata exports updated. We've discovered that this timer may stop working if the version of the JDBC driver used by Glassfish is no longer in sync with the version of the PostgreSQL database. The symptoms: your scheduled harvests are no longer running, and there are error messages in the server.log with the following lines in them:
Internal Exception: java.io.StreamCorruptedException: invalid stream header ...
Exception Description: Could not deserialize object from byte array ...
This most likely means that your JDBC driver needs to be upgraded. For example, if you are running version 9.3 of PostgreSQL, make sure you have the driver postgresql-9.3-1104.jdbc4.jar in your /glassfish/lib directory. The correct version of the driver for your version of PostgreSQL can be found at https://jdbc.postgresql.org/download.html. If you have an older driver in glassfish/lib, stop Glassfish, remove the old driver, and replace it with the new version. We recommend that you remove the entire contents of /glassfish/domains/domain1/generated/ before starting Glassfish again.
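A quick way to check what is currently installed before replacing anything. The Glassfish lib path is a typical default and the jar name is the example from the note above; both are assumptions to adjust for your installation:

```shell
GF_LIB="/usr/local/glassfish4/glassfish/lib"           # placeholder Glassfish lib dir
PG_MAJOR="9.3"                                         # your PostgreSQL version
EXPECTED_JAR="postgresql-${PG_MAJOR}-1104.jdbc4.jar"   # example name from the note above

# list the driver(s) Glassfish will load, and the database version to match
ls "$GF_LIB"/postgresql-*.jar 2>/dev/null || echo "no PostgreSQL JDBC driver found in $GF_LIB"
psql --version 2>/dev/null || true
echo "expected driver for PostgreSQL $PG_MAJOR: $EXPECTED_JAR"
```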