Releases: IQSS/dataverse
v4.8.4
Overview:
This release adds schema.org metadata to dataset pages for better indexing by search engines, allows downloading of dataset metadata in schema.org format, fixes a bug that prevented publishing a dataset when an author affiliation is not set, and resolves several ORCID-related issues.
- Add schema.org markup to dataset pages
- Export dataset metadata in schema.org format and add it to the download list (see the example after this list)
- Fix navbar search box that was throwing errors
- Fix issue where dataset without an author affiliation could not be published
- Fix OAuth button labels: the connect button was hard-coded to ORCID rather than reflecting the configured authentication provider (thanks to Ruben Andreassen and Pete Meyer for the fix)
- Fix internal server error when logging in using Google
- Fix the URL provided in the sample orcid.json file; the incorrect URL prevented user account info from being prepopulated on account creation
- Remove default gray background in dataverse theme
- Reword the failed-ingest error message to emphasize that the upload completed and only the ingest failed
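A quick way to try the new export from the command line, assuming the metadata export API is reachable on localhost (the exporter name "schema.org" and the example DOI below are illustrative; check the API Guide for this release for the exact exporter name):
curl "http://localhost:8080/api/datasets/export?exporter=schema.org&persistentId=doi:10.5072/FK2/EXAMPLE"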
For the complete list of issues, see the 4.8.4 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.harvard.edu.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- /glassfish4/bin/asadmin list-applications
- /glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, and start it again
- service glassfish stop
- remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- /glassfish4/bin/asadmin deploy dataverse-4.8.4.war
- Run the database update script
psql -U <db user> -d <db name> -f upgrade_v4.8.3_to_v4.8.4.sql
- Restart glassfish
Note: the URL provided in the v2.0 orcid.json file was slightly in error, requiring an update.
Using the v2.0 orcid.json file (http://guides.dataverse.org/en/4.8.4/_downloads/orcid.json), enter the client id and secret and then update the provider information:
curl -X POST -H 'Content-type: application/json' --upload-file orcid.json http://localhost:8080/api/admin/authenticationProviders
After updating this information, restart glassfish.
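Optionally, you can confirm the provider information was updated by listing the configured providers with the same admin API:
curl http://localhost:8080/api/admin/authenticationProviders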
If you are upgrading from v3.x, you will need to perform a migration to v4.x since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
IMPORTANT: If you are running TwoRavens with your dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.4/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
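If both copies are reachable from one host, a plain diff is a quick way to compare them ("/path/to" below is a placeholder; substitute the real locations on your servers):
diff /path/to/dataexplore/rook/preprocess/preprocess.R /path/to/applications/dataverse-4.8.4/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R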
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
TROUBLESHOOTING NOTE:
Potential issue: a PostgreSQL JDBC driver version incompatibility can cause the Dataverse timer to malfunction.
Dataverse uses the EJB timer to perform scheduled tasks: harvesting from remote servers and keeping the local OAI sets and metadata exports updated. We've discovered that this timer may stop working if the version of the JDBC driver used by Glassfish is no longer in sync with the version of the PostgreSQL database. The symptoms: your scheduled harvests are no longer running and there are error messages in the server.log with the following lines in them:
Internal Exception: java.io.StreamCorruptedException: invalid stream header ...
Exception Description: Could not deserialize object from byte array ...
This most likely means that your JDBC driver needs to be upgraded. For example, if you are running version 9.3 of PostgreSQL, make sure you have the driver postgresql-9.3-1104.jdbc4.jar in your /glassfish/lib directory. The correct version of the driver for your version of PostgreSQL can be found at https://jdbc.postgresql.org/download.html. If you have an older driver in glassfish/lib, stop Glassfish, remove the old driver and replace it with the new version. We recommend that you remove the entire contents of /glassfish/domains/domain1/generated/ before starting Glassfish again.
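A sketch of that driver swap, assuming the default Glassfish layout used elsewhere in these notes and PostgreSQL 9.3 (adjust the driver filenames to the versions you actually have and downloaded):
service glassfish stop
# remove the old driver jar, then drop in the one matching your PostgreSQL version
rm /usr/local/glassfish4/glassfish/lib/postgresql-*.jar
cp postgresql-9.3-1104.jdbc4.jar /usr/local/glassfish4/glassfish/lib/
# clear the generated directory, as recommended above, then start Glassfish
rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
service glassfish start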
v4.8.3
Overview:
This release supports ORCiD schema v2.0 in accordance with their roadmap: https://members.orcid.org/api/news/xsd-20-update. It also fixes a bug on the file landing page so that users can request access to restricted files.
- Support ORCiD schema v2.0
- Fix a bug on the file landing page preventing users from requesting access to restricted files from that page
- Include additional Postgres drivers to support newer versions of Postgres with recommended drivers
For the complete list of issues, see the 4.8.3 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.harvard.edu.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- /glassfish4/bin/asadmin list-applications
- /glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, and start it again
- service glassfish stop
- remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- /glassfish4/bin/asadmin deploy dataverse-4.8.3.war
If you previously had ORCiD authentication configured, please note that we have upgraded to the v2.0 API in this release. To be in sync with their upgrade policy, you must update the ORCiD authentication provider information; see http://guides.dataverse.org/en/4.8.3/installation/oauth2.html#dataverse-side
Using the v2.0 orcid.json file (http://guides.dataverse.org/en/4.8.3/_downloads/orcid.json), enter the client id and secret and then update the provider information:
curl -X POST -H 'Content-type: application/json' --upload-file orcid.json http://localhost:8080/api/admin/authenticationProviders
After updating this information, restart glassfish.
If you are upgrading from v3.x, you will need to perform a migration to v4.x since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
IMPORTANT: If you are running TwoRavens with your dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.3/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
TROUBLESHOOTING NOTE:
Potential issue: a PostgreSQL JDBC driver version incompatibility can cause the Dataverse timer to malfunction.
Dataverse uses the EJB timer to perform scheduled tasks: harvesting from remote servers and keeping the local OAI sets and metadata exports updated. We've discovered that this timer may stop working if the version of the JDBC driver used by Glassfish is no longer in sync with the version of the PostgreSQL database. The symptoms: your scheduled harvests are no longer running and there are error messages in the server.log with the following lines in them:
Internal Exception: java.io.StreamCorruptedException: invalid stream header ...
Exception Description: Could not deserialize object from byte array ...
This most likely means that your JDBC driver needs to be upgraded. For example, if you are running version 9.3 of PostgreSQL, make sure you have the driver postgresql-9.3-1104.jdbc4.jar in your /glassfish/lib directory. The correct version of the driver for your version of PostgreSQL can be found at https://jdbc.postgresql.org/download.html. If you have an older driver in glassfish/lib, stop Glassfish, remove the old driver and replace it with the new version. We recommend that you remove the entire contents of /glassfish/domains/domain1/generated/ before starting Glassfish again.
v4.8.2
Overview:
In this release we have fixed an issue so curators may edit datasets while in review. We also provide an OpenShift template for Dataverse and a Docker image for experimenting with this configuration, and have made several additional bug fixes. For a complete list of changes, see the closed issues in this milestone.
- Allow curators to edit datasets while in review
- Provide OpenShift template and Docker image
- Document how one can experiment with Dataverse and OpenShift
- Fix index exception when there are Null values in TextBox-type metadata fields
- Fix missing thumbnail images for map files when stored on S3
- When Terms of Use and Terms of Access are defined, expand all populated fields on Dataset Terms tab
- Mitigate password guessing by adding a math question to the form after 2 failed attempts
- Fix displayed navigation URLs to use Pretty URLs format
For the complete list of issues, see the 4.8.2 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.harvard.edu.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- /glassfish4/bin/asadmin list-applications
- /glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, and start it again
- service glassfish stop
- remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- /glassfish4/bin/asadmin deploy dataverse-4.8.2.war
If you are upgrading from v3.x, you will need to perform a migration to v4.x since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
IMPORTANT: If you are running TwoRavens with your dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.2/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
v4.8.1
Overview:
This release improves performance for the dataset page, especially those with many files. It also includes a usability improvement to the verify email link.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- /glassfish4/bin/asadmin list-applications
- /glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, and start it again
- service glassfish stop
- remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- /glassfish4/bin/asadmin deploy dataverse-4.8.1.war
If you are upgrading from v3.x, you will need to perform a migration to v4.x since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
IMPORTANT: If you are running TwoRavens with your dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.1/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
v4.8
Overview:
In this release we introduce support for AWS S3 file storage, providing Dataverse installations with a cloud option. We also include support for large data upload via rsync and integration with an external application, the Data Capture Module (DCM). Other enhancements include an improved Swift object storage driver, CSV file ingest improvements, support for increased password complexity, downloading large guestbooks, removal of a user's roles, improved documentation, and various bug fixes.
- Provide S3 storage driver (see the configuration sketch after this list)
- Improved Swift storage driver
- Support for large data upload and download workflows using rsync and DCM
- Improved CSV ingest.
- Configurable password complexity
- Fix and improve downloading of data for large guestbooks
- Fixed a problem publishing a dataset with unescaped characters in the title
- Disable a user via a button that removes all assigned roles
- Submit for review API endpoint
- Return to author API endpoint
- Create public-only configuration setting
- Restrict file API endpoint
- Improve migration documentation
- Improve User Guide section on permissions
- Provide installation-wide setting to control which metadata fields appear at top of dataset page
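A minimal sketch of pointing an installation at the new S3 driver; the JVM option name below mirrors how the file and Swift drivers are selected and is an assumption here, so confirm it (and the required bucket and credential settings) in the Configuration section of the Installation Guide for 4.8:
./asadmin create-jvm-options "-Ddataverse.files.storage-driver-id=s3"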
As always, thanks to all members of the Dataverse Community who contributed to this release by submitting suggestions, code, or other changes. Special thanks to Brian Silverstein, Oscar Smith, Rohit Bhattacharjee, and Sarah Ferry for their work on CSV and S3/Swift. Thanks to Pete Meyer for all the work on Rsync. Thanks to Don Sizemore and Akio Sone from Odum for fixes having to do with Glassfish and SPSS. Thanks to Solomon HM for improving migration documentation. Thanks to Jacob Makar-Limanov for work on UI accessibility.
For the complete list of issues, see the 4.8 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- /glassfish4/bin/asadmin list-applications
- /glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, and start it again
- service glassfish stop
- remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- /glassfish4/bin/asadmin deploy dataverse-4.8.war
- Run the database update script.
psql -U <db user> -d <db name> -f upgrade_v4.7.1_to_v4.8.sql
- Run the workflow script.
psql -U <db user> -d <db name> -f 3561-update.sql
- Update citation.tsv to add grant agency and number to facet and advanced search.
curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"
Change in behavior note:
In this release, when a dataset is submitted for review by an author, the dataset may no longer be edited by the curator while it is in review. It must be returned to the author before it can be modified.
If you are upgrading from v3.x, you will need to perform a migration to v4.x since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
IMPORTANT: If you are running TwoRavens with your dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
v4.7.1
Overview:
This release introduces a user management view to the administrator dashboard, listing relevant user information and providing search and super user promotion/demotion functionality. A few other community-requested fixes and modifications are also provided.
- Display installation users in table format.
- Provide super user toggle functionality.
- Add additional information to the user table: creation time, last login, last API use.
- Open the Search API so it no longer requires an API key (see the example after this list).
- Properly validate username field on account creation to prevent non-functioning accounts.
- Add Department as a dataverse category.
- Prevent the Files facet on the MyData page from hanging.
- On the dataset page, autofill the Identifier Scheme field with the ORCID iD if available for the logged-in user.
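For example, with this change a simple query needs no key (the query term below is illustrative):
curl "http://localhost:8080/api/search?q=finch"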
As always, thanks to all members of the Dataverse Community who contributed to this release by submitting suggestions, code, or other changes.
For the complete list of issues, see the 4.7.1 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- /glassfish4/bin/asadmin list-applications
- /glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, and start it again
- service glassfish stop
- remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- /glassfish4/bin/asadmin deploy dataverse-4.7.1.war
- Run the database update script.
psql -U <db user> -d <db name> -f upgrade_v4.7_to_v4.7.1.sql
Note: this script will also remove any partially created, non-functioning user accounts that may have resulted from the account-creation bug (usernames were not validated) fixed in this release.
- Optionally restore the old behavior of requiring API tokens to use the Search API.
The Search API no longer requires a token; if you want to preserve the old behavior, run:
curl -X PUT -d true http://localhost:8080/api/admin/settings/:SearchApiRequiresToken
If you are upgrading from v3.x, you will need to perform a migration to v4.x since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
IMPORTANT: If you are running TwoRavens with your dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.7.1/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
v4.7
Overview:
This release provides more customization and branding options for installations, improves documentation, provides better interoperability with citation tools, and incorporates code and bug fixes contributed by the Dataverse developer community during our recent community meeting hackathon.
- Allow creating a custom homepage, header, footer, and navbar logo (see the example after this list).
- Remove the system-generated word "Dataverse" from dataverse names, making it optional.
- Make all system notifications use the name of the root dataverse in place of the word Dataverse.
- Make the About link optional: off by default, on if a link is specified.
- Make the default guides link an entry point to the User Guide only, to allow for other installation-specific structures.
- Allow specifying guides URL and overriding guides URL versioning mechanism.
- User and API Guide improvements contributed during the community meeting. Thank you, @acme146 for your help proofreading the User Guide!
- Developer Guide improvements, updates to README.md and CONTRIBUTING.md
- Add metadata tags to the dataset page to improve interoperability with citation tools: Zotero, EndNote, Altmetrics
- Redirect after log in to page of origin for Shibboleth users. Thank you @aivanov100 for pull request #3910 and @donsizemore for testing it!
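For example, the new branding options are driven by admin settings; the setting name below (:HomePageCustomizationFile) and the file path are assumptions for illustration, so check the branding section of the 4.7 Installation Guide for the exact names:
curl -X PUT -d '/var/www/dataverse/branding/custom-homepage.html' http://localhost:8080/api/admin/settings/:HomePageCustomizationFile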
Big thanks to all members of the Dataverse Community who contributed to this release by submitting suggestions, code, or other changes.
For the complete list of issues, see the 4.7 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- /glassfish4/bin/asadmin list-applications
- /glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, and start it again
- service glassfish stop
- remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- /glassfish4/bin/asadmin deploy dataverse-4.7.war
- Optionally run the database update script.
Run this script if you want to preserve the word Dataverse after your current dataverse names. Uncomment the UPDATE line before running it.
psql -U <db user> -d <db name> -f upgrade_v4.6.2_to_v4.7.sql
- Optionally reindex all
If you did not run the update script above (because you do not wish to keep "Dataverse" in existing dataverse names), you should reindex so that search cards reflect the new names.
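A minimal sketch of that reindex, assuming the admin API is reachable on localhost (the in-place reindex endpoint shown here is an assumption; the Admin Guide for your version documents the exact call):
curl http://localhost:8080/api/admin/index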
If you are upgrading from v3.x, you will need to perform a migration to v4.x since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
IMPORTANT: If you are running TwoRavens with your dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.7/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
v4.6.2
Overview:
This release includes improvements to mapping tabular data via WorldMap, support for object storage using Swift, support for Handles as persistent identifiers, improvements to the guides and various bug fixes:
- Fixed classification for latitude/longitude maps
- Restored WorldMap preview modal
- Added map thumbnail to dataset page
- Remove map data that subsequently becomes restricted
- Verify WorldMap links remain valid
- Improved Geoconnect documentation based on UX review
- Support for Handles as persistent identifiers
- Support for optional cloud-based object storage using Swift, including download URLs
- Support for alternative local identifier schemes, e.g. sequential numbers rather than random strings
- Allow users to upload a dataset thumbnail
- Support for non-interactive installation
- A new Style Guide section for developers, Patterns
- Updated TwoRavens documentation
- Updated Developer Guide to reflect current process
- Fixed Guestbook Response Download to properly handle records for deleted files
Big thanks to all members of the Dataverse Community who contributed to this release by submitting suggestions, code, or other changes. Special thanks to the groups from DANS and CIMMYT that worked to restore the Handles functionality in the application.
For the complete list of issues, see the 4.6.2 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- {glassfish install path}/glassfish4/bin/asadmin list-applications
- {glassfish install path}/glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish, remove the generated directory, and start it again
- service glassfish stop
- remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
- service glassfish start
- Deploy this version.
- {glassfish install path}/glassfish4/bin/asadmin deploy {path}/dataverse-4.6.2.war
- Run the database update script.
psql -U {db user} -d {db name} -f upgrade_v4.6.1_to_v4.6.2.sql
- Update the Citation metadata block:
We have made a minor update to the Citation metadata configuration: the "Related Publication" fields will now be included on the "Create New Dataset" page (and not just on the "Edit Metadata" page for existing datasets). To keep your Citation metadata configuration up to date, re-ingest the metadata block file supplied with the Dataverse distribution, as follows:
curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @data/metadatablocks/citation.tsv -H "Content-type: text/tab-separated-values"
If you are upgrading from v3.x, you will need to perform a migration to v4.x since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
IMPORTANT: If you are running TwoRavens with your dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.6.2/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
v4.6.1
Overview:
This release introduces support for ORCID authentication, a file replace feature and several important enhancements:
- Support for OAuth2 remote authentication using ORCID, Google, or GitHub
- File replace functionality makes updating files easier
- A file version history on the file landing page
- A download URL for public files
- A reworked Log in page includes support for Remote Authentication Only mode
- Support uploading and replacing files with Native API
- Include file tags in metadata returned through the API
- Fixed a mapping bug introduced in v4.6 that prevents editing an existing map
- Fixed a request access bug affecting access to a single restricted file
- Some native API endpoints support CORS, allowing client-side JavaScript code to use them (CORS-enabled endpoints are labeled in the Native API section of the API Guide).
For the complete list of issues, see the 4.6.1 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- /glassfish4/bin/asadmin list-applications
- /glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish and remove the generated directory
- service glassfish stop
- remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
- If not yet present, add a jvm option to /usr/local/glassfish4/glassfish/domains/domain1/config/domain.xml to support timers, then start glassfish
- -Ddataverse.timerServer=true (note, see http://guides.dataverse.org/en/latest/admin/timers.html if using more than one server)
- service glassfish start
- Verify the two jhove files are in the config directory:
For file uploads to work properly, please follow the instructions in the README.txt file in https://github.com/IQSS/dataverse/tree/master/conf/jhove (in a future version of Dataverse the installer script will take care of this).
- Deploy this version.
- {glassfish install path}/glassfish4/bin/asadmin deploy {path}/dataverse-4.6.1.war
- Run the database update script.
psql -U {db user} -d {db name} -f upgrade_v4.6_to_v4.6.1.sql
- If using Shibboleth, update the configuration to work with the new Log In page.
Refer to the "Add the Shibboleth Authentication Provider to Dataverse" instructions on adding an authentication provider:
http://guides.dataverse.org/en/4.6.1/installation/shibboleth.html
- Remove the old Shibboleth configuration setting:
From the console of a server running Dataverse,
curl -X DELETE http://localhost:8080/api/admin/settings/:ShibEnabled
- You might want to make use of the new :DefaultAuthProvider setting if you want an auth provider other than Username/Email to be the default: http://guides.dataverse.org/en/4.6.1/installation/config.html#defaultauthprovider
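For example, to make ORCID the default (the provider id "orcid" below assumes the id used in the sample orcid.json; substitute the id of whichever provider you configured):
curl -X PUT -d orcid http://localhost:8080/api/admin/settings/:DefaultAuthProvider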
If you are upgrading from v3.x, you will need to perform a migration to v4.x since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
Please note: v4.x does not currently support creating new handles, though it does support existing ones. We intend to add this feature but have not yet scheduled this work.
IMPORTANT: If you are running TwoRavens with your dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.6.1/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
v4.6
Overview:
This release introduces a new File Landing Page and several important enhancements:
- Introduce a new File Landing Page
- Improve Deaccession Behavior
- API clean up
- OAI-PMH compliance improvements
- Optionally support SHA-1 in place of MD5 checksums (see the example after this list)
- Fix an important issue with Request Access workflow
- Improve File Upload behavior, particularly drag and drop
- Document how to run Dataverse with SELinux enabled
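A hedged example of switching new uploads to SHA-1 checksums; the setting name and value below are assumptions drawn from later configuration guides, so verify them for 4.6 before running:
curl -X PUT -d SHA-1 http://localhost:8080/api/admin/settings/:FileFixityChecksumAlgorithm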
For the complete list of issues, see the 4.6 milestone in GitHub.
For help with upgrading, installing, or general questions please email support@dataverse.org.
Installation:
If this is a new installation, please see our Installation Guide.
Upgrade:
If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:
- Undeploy the previous version.
- /glassfish4/bin/asadmin list-applications
- /glassfish4/bin/asadmin undeploy dataverse
- Stop glassfish and remove the generated directory
- service glassfish stop
- remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
- If not yet present, add a jvm option to /usr/local/glassfish4/glassfish/domains/domain1/config/domain.xml to support timers, start glassfish
- -Ddataverse.timerServer=true (note, see http://guides.dataverse.org/en/latest/admin/timers.html if using more than one server)
- service glassfish start
- Deploy this version.
- /glassfish4/bin/asadmin deploy dataverse-4.6.war
- Run the database update script.
psql -U <db user> -d <db name> -f upgrade_v4.5.1_to_v4.6.sql
- Update tool tip in metadata block.
- Download the attached, latest social science metadata block file (social_science.tsv) to your glassfish server:
- Update metadata block with latest .tsv file. From glassfish server in directory where .tsv was downloaded:
- curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @./social_science.tsv -H "Content-type: text/tab-separated-values"
- Update schema.xml.
- Stop the running solr process (kill -9 <process id>)
- Replace current schema.xml file with latest attached version.
- Restart solr (java -jar start.jar &)
- Run incremental index.
- Clear timestamps, from glassfish server run:
- curl -X DELETE http://localhost:8080/api/admin/index/timestamps
- Run incremental index, from glassfish server run:
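(The incremental index command itself is not included in these notes; the endpoint below is an assumption, so confirm it in the Admin Guide before running.)
- curl http://localhost:8080/api/admin/index/continue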
If you are upgrading from v3.x, you will need to perform a migration to v4.x since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg
Please note: v4.x does not currently support creating new handles, though it does support existing ones. We intend to add this feature but have not yet scheduled this work.
IMPORTANT: If you are running TwoRavens with your dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the 2 files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.6/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R
If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:
cd [files directory]
rm -f `find . -name '*.prep'`
If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.