- Goal – Enrich sets in Alma with Wikidata URIs. Release of the Wikidata Primo implementation is now expected at the beginning of 2024
DEMO AND OBSERVATIONS (BENN)
- Alma Refine documentation on their GitHub - https://github.com/ExLibrisGroup/alma-refine/wiki/Help
- Load a demo set for architecture monographs
- Preview option (for context). Sometimes you don't get a preview (why?)
- By default, the Wikidata service only targets the 100 fields
- Changes can be reviewed immediately in the Metadata Editor
- Settings
- User-based, not system-wide. Settings persist across sessions, but are lost with each sandbox refresh
- It is good that they are user-based, but that might be a problem for maintaining consistent practices
- You can configure Alma Refine to target other MARC fields. This is done after you select a reconciliation service, since you might want different target fields depending on the source dataset (see the example after this list)
- Correct term option – might need more exploration
- You can use Alma Refine with Sets, Search Results, and single records opened in the Metadata Editor
- Large datasets can take a lot of time to reconcile
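For context on target fields and subfields: in MARC 21, $0 carries an authority record identifier, while $1 carries a real-world-object (RWO) URI, which is where a Wikidata entity URI belongs. An illustrative enriched heading (the Wikidata item Q42, Douglas Adams, is real but arbitrary; the id.loc.gov number is a placeholder, and the parenthetical notes are annotations, not part of the record):

    100 1_ $a Adams, Douglas, $d 1952-2001
           $0 http://id.loc.gov/authorities/names/nXXXXXXXX   (authority record URI)
           $1 http://www.wikidata.org/entity/Q42              (Wikidata RWO URI)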
OBSERVATIONS (PALOMA)
- Main concern about settings has been resolved by Benn's demo. Happy to see that you can add target fields
- It seems to also allow you to choose the target subfield for the URI ($0 or $1), which can be an issue if folks are not aware of whether the target dataset describes real-world objects (RWO) or authority records
- It was impossible to do the testing without constantly comparing with OpenRefine
- The screen only displays up to 25 items per page, which is an issue for big datasets
- Reconciliation results have to be selected manually, one by one
- It doesn't seem to allow selecting more than two data points for the reconciliation, and it does not show confidence levels the way OR does
- Something else OR does that AR does not: it lets you filter labels that have something in common (e.g., labels that include dates) and then work just with those. There is no way to select items based on label characteristics
- The URI is not linkable in the Record view, while the value of the 856 $u is (not sure if that matters in any way, but I thought it was curious)
- How can we leverage the power of OpenRefine to get a dataset enriched with URIs, and then use Alma Refine to import the URIs into Alma? Should we try to develop a local reconciliation service, something like what OR does with ReconcileCSV? (See the sketch below)
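A minimal sketch of what such a local reconciliation service could look like, following the reconciliation API that OpenRefine speaks: a small Flask app that answers a bare GET with a service manifest and a POST of form-encoded "queries" with exact matches from a CSV. The file name and column names ("names_wikidata.csv", "name", "uri") are assumptions; also, since Alma Refine runs in the cloud, the service would need a publicly reachable URL, and whether Alma Refine accepts an arbitrary endpoint is something we would still have to verify.

    import csv
    import json
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    # Load the existing name -> Wikidata URI dataset into memory.
    # "names_wikidata.csv" and its column names are hypothetical.
    LOOKUP = {}
    with open("names_wikidata.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            LOOKUP[row["name"].strip().lower()] = row["uri"].strip()

    MANIFEST = {
        "name": "Local EAD creators reconciliation (sketch)",
        "identifierSpace": "http://www.wikidata.org/entity/",
        "schemaSpace": "http://www.wikidata.org/prop/direct/",
    }

    def reconcile_one(query):
        """Exact-match one query string against the CSV lookup."""
        key = query.get("query", "").strip().lower()
        uri = LOOKUP.get(key)
        if uri is None:
            return {"result": []}
        return {"result": [{
            "id": uri.rsplit("/", 1)[-1],  # e.g. the QID part of the URI
            "name": query.get("query", ""),
            "score": 100,      # exact match, so full score
            "match": True,     # candidate safe to auto-match
            "type": [],
        }]}

    @app.route("/reconcile", methods=["GET", "POST"])
    def reconcile():
        # Clients send a batch as a form-encoded "queries" JSON object.
        queries = request.form.get("queries") or request.args.get("queries")
        if queries:
            batch = json.loads(queries)
            return jsonify({qid: reconcile_one(q) for qid, q in batch.items()})
        # A bare GET returns the service manifest.
        return jsonify(MANIFEST)

    if __name__ == "__main__":
        app.run(port=8000)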
NEXT STEPS
- Each institution creates a set of bib records that they want to enhance with Wikidata items
- HRC is going to create its set based on the archival collections. It will require a lot of cleanup, but this has been a pending project for a while. The goal is to cross-reference the 100 field with the EAD creators for which we already have a Wikidata ID. We already have a dataset of names and Wikidata URIs; we just need to get it into the system. Paloma would like to work on a local reconciliation service to use with Alma Refine (see the test query after this list)
- Architecture also wants to focus on its archival collections. Katie would like to set up a working meeting before the next regular meeting to do some testing as a group
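Before pointing any client at the sketch above, a quick smoke test from Python: post a one-item batch the way reconciliation clients do and inspect the candidates. The sample name, URL, and port are all assumptions carried over from the sketch.

    import json
    import requests

    # One-query batch in the shape reconciliation clients send.
    queries = {"q0": {"query": "Adams, Douglas"}}  # hypothetical creator name
    resp = requests.post("http://localhost:8000/reconcile",
                         data={"queries": json.dumps(queries)})
    print(resp.json()["q0"]["result"])  # candidate list for q0; empty if no match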
SIDE NOTE
- UTL is starting to think about implementing more efficient workflows for archival collections discovery by harvesting the metadata directly from TARO