Collection Anywhere Minutes - December 01, 2016
Submitted by sarah.peterson on Fri, 12/09/2016 - 13:42

Meeting Date / Time:
Thursday, December 1, 2016 - 8:00am
Meeting Location:
Service Center
Attendees:
Sarah Peterson-Chair, Janet Brooks, Kellie Delaney, Kristin Hill, Heidi Johnson, Pauline Rodriguez-Atkins, Tim Rogers
Guests:
Anne Fischer
Discussion of what has happened at these meetings in the past and how we will use them going forward:
- Share what we discuss with all staff so everyone knows what is going on in the various areas
- Determine priorities and activities for Collection Anywhere
- Talk about issues that impact multiple areas
- Let each other know what else is going on in each area
- Discuss other things happening in the library as a whole
Agenda issues
Ebook MARC records
- Can we change matchpoint to the OCLC number?
- Anne noted that the first matchpoint is the control field, but that since we are not currently getting records from OCLC, the match will be on the ISBN, which may cause some issues with duplicates.
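A minimal sketch of how falling back from a control-number matchpoint to ISBN can create duplicates, assuming a toy in-memory catalog; the field names ("oclc", "isbn") and the load logic are hypothetical simplifications, not TLC's actual load profile:

```python
# Toy illustration of matchpoint-based record loading. Field names and
# catalog structure are hypothetical.

def load_record(incoming, catalog):
    """Overlay on the first matchpoint that hits; otherwise add a new record."""
    for key in ("oclc", "isbn"):  # control number first, then ISBN fallback
        value = incoming.get(key)
        if value:
            for rec in catalog:
                if rec.get(key) == value:
                    rec.update(incoming)  # overlay the existing record
                    return "overlaid"
    catalog.append(incoming)
    return "added"

catalog = [{"oclc": "ocm100", "isbn": "9780001", "title": "Example"}]

# A vendor e-book record typically has no OCLC number and often a
# different (e-book) ISBN, so neither matchpoint hits and a second
# record for the same title is created.
print(load_record({"isbn": "9780002", "title": "Example"}, catalog))  # added
print(len(catalog))  # 2 -- a duplicate of the same title
```

The sketch shows why the ISBN fallback is fragile: a match only as good as the ISBNs the vendor supplies.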
EDS soft launch
- Our EDS discovery platform will be made available to the public on Monday, December 12th. We are calling it a soft launch because while we will mention it on the website and social media, we will not be promoting it heavily at this point. We will consider user feedback as we go forward and will promote the service more completely at a later date.
- Anne will check with TLC to see if new records are currently being loaded for the discovery platform.
- Digital Library will get out information to the staff about the change.
- The search box on the website will change, but the link to the catalog will remain the same.
- Suppressed records are not sent to the discovery platform.
Other discussion
- Requested that Anne check with TLC to find out if we can prevent records with all items “in process” from being suppressed. This would help prevent the extra purchase requests and interlibrary loan requests received when new titles disappear from the catalog temporarily.
- Collection Anywhere space
- Sarah is meeting with Morgan Jones tomorrow to find out what she needs to know as she begins to think about this project
- Tim suggested considering visits to peer libraries who are changing the way they use space in their buildings – Denver, Cincinnati, check the list of other peer libraries
- Need to start putting pencil to paper and making plans for what this center will be over the next 6 months
- May want to consider more space than we need, and then leasing it out to other libraries
- Anne reported that the vendor is working on self-checks at Edmond and they are great and easy to use. A rollout of additional self-checks will be planned once we see how these work. The sorter is also in process, but not all the parts are here yet.
- Big Bin discussion – Big Bin has not been working properly for some time. The retrieval process is now a two-step process, with more manual work, and we are lacking statistics previously available that tracked not only the amount of materials, but also how long materials had been in Big Bin. We are working with Southwest Solutions on a fix for the problems.
Processing report
- Working to get out holiday materials, reserves, and everything else.
- Do need to be able to capture statistics on how much work is getting done
- Jones transfers were rushed – we’ll look around to make sure everything has left the building
Cataloging and ILL report
- All holiday magazines came in the 3rd week of November, and they were sent to Processing by Thanksgiving – they also have gotten through Processing and should be out to the libraries.
- Brandon is now handling the process of purchase requests that come to ILL
- Catalogers are learning to catalog periodicals
- The whole team did a great job while Pauline was away.
Digital Library report
- Consultant is here doing UX focus sessions – she has a script and is recording how people use our site for various tasks so we can figure out how to make it work better for them. Considering additional UX sessions, potentially for people in each audience cluster, for mobile testing. This would also be extremely helpful for catalog usability.
- Kellie, Janet, and Sadie are working on use cases for digital experiences so we check on not only the content but the overall digital experience before selecting a resource. We’re using this for the first time as we evaluate Hoopla.
- Breck, Bobby, and Buddy are working together on using the folklore collection to present music at the Jones opening – they wrote music to accompany some of the (pre-1923) lyrics in that collection.
Materials Selection report
- Basement materials are still a problem. The fire marshal came through and required that items stored under the stairs be moved immediately. Matthew from IT helped with the pallet jack and the team got everything moved – thank you to everyone for that! Janet and Sarah are meeting on Monday to take a look at the basement space and discuss processes down there. Kristine and Marlene have cleared off pallets that were in the way and we are sending more from the book sale.
- Susan is meeting with downtown staff regarding the Holocaust Center – Susan, Risa, and Dave are working with the Jewish Federation to develop new guidelines to shape the future of that collection. Tim noted that we may be considering other supported collections and may want to talk to the development group as we build these guidelines, so they can be used in the future as well.
- Materials Selection and the Ralph Ellison Library will also be taking another look at the Black History Collection for updates and changes that will increase that collection’s use and value to the community.
- Janet will now be doing quarterly statistical analyses for the libraries, rather than annual, which should be very helpful with goal setting and business plans. Tim suggests we find a place for those statistics on the Intranet. We also need to keep John in the loop. Janet attended the Savannah training and we should also be able to analyze data against those of our peer libraries. We need to look into more use of Orangeboy and the Savannah interface to create our own analytical elements and see more clearly how changes here impact what happens in the library.
Sarah will set a date for the next meeting.
Comments
Will this include putting links in the 856 field? We have Zinio, PressReader, Consumer Reports, et cetera. I think that this would be a great advantage to having a digital catalog. It doesn't seem to look bad in the Library Corporation's new catalog.
Will this include alternative titles with "magazine" appended? E.g. "Time" becomes "Time Magazine". I know that this probably isn't to standards, but it is likely how the public would search for Time magazine.
Will there be a better way to find a listing of all of our magazines? I know that in the ILS Q&A we have a procedure that finds 459 results for a listing of all of our magazines, but I don't think it is 100% accurate. A search of the collection, MZ, yields 541 results. Some of these are obviously errant because any location can change the location code. This leads me to think that we have more results for magazines than the method in the ILS Q&A produces. All of the formats in the facets are created by the Library Corporation, so I don't know how they are chosen, but the 22 "books" in the MZ collection won't appear in the results because the O.P.A.C. has them in book format for the facets.
These are all good questions, and thank you so much for asking! I always appreciate the opportunity to share information about how the catalog works, how staff and members may use it more effectively, and to get ideas about how we in Cataloging can make changes to make it work even better.
First, a little background. The statement that “Catalogers are learning to catalog periodicals” unfortunately does not mean that we will be taking any action on records already in the catalog. I don’t know exactly when you joined MLS, but prior to our migration to CarlX, there were no records for magazines in the catalog. All magazine information was contained in a separate database and had to be searched separately. The records in that database were not in MARC format, which controls how records display in an OPAC.
When we migrated to the TLC system, the periodical database went away, and we were required to have MARC records for periodicals. Given the large number of titles and the limited time, Jimmy Welch created a program that took data in the periodical database records and converted it to a quasi-MARC format. It would be great if Cataloging had time to go back and clean up all those records, but we don’t. As MLS adds new magazine titles to its collections, Cataloging adds records for those titles to the catalog. I’ve been doing it, but am passing that responsibility on to the Catalogers. We don’t add new titles all that often; we have added fewer than 100 new magazine titles since August 2014, and some of them have been magazines that we already received, but which changed titles.
With that basic information out of the way, the answers to your questions are broken down into separate comments below.
I wasn't a big fan of CyberMARS, but I do remember the Magazine Database that had to be paged through one page at a time and didn't have filtering for subjects or anything else. It really is surprisingly similar to the current situation, other than I assume it had a complete list at the time, subjects have been added to some magazines, and patrons don't know how to search for magazines.
Unfortunately, linking from catalog records for magazines to databases that contain those magazines is not as simple as it seems. Databases like Zinio, PressReader, etc., contain thousands of magazine titles, and MLS does not subscribe to all those titles in paper format. Cataloging staff would have to go through the entire list of periodicals in a database, determine whether MLS owns that periodical title, and insert a link to the database. It would be highly time consuming and would significantly delay Cataloging’s primary goal of getting new materials into users’ hands as quickly as possible.
A secondary issue is that these databases change the titles that they offer frequently, and without notice to libraries subscribing to the database. This would mean that even if we were able to add links to the databases to catalog records, there would be no guarantee that the links would work. I can only imagine the number of frustrated members that the broken links would cause! Just maintaining those links would be almost a full-time job.
In the future, if MLS subscribes to electronic versions of specific periodical titles, there is no reason that those links could not be added to catalog records. Also, Cataloging is beginning to explore adding catalog records for entire databases, as a means to inform users of their availability. I’m not sure that all databases are good candidates for that, but some may be.
If Materials Selection provides a list of all magazines, could general staff add links to a spreadsheet so that all of the work wouldn't be confined to Cataloging? I know this would be a lot of work, but I believe it is important. If it is too much for public floor or behind-the-scenes staff to do, then the Membership certainly won't do it.
When I read Materials Services Division Minutes - December 03, 2015 I was prompted to create a spreadsheet to figure out how many of our magazines we had as electronic offerings. I failed because I couldn't be certain that I had a list of every magazine, but I was able to work backwards. Of our Zinio holdings (way back then...) 157 of 293 entries had a physical counterpart in our collection – meaning that I found an entry in LS2OPAC. I know how frustrating broken links are because I find them, but not having a link and having a broken link lead to the same place. Maybe we could try with just Zinio as a pilot?
Alternate title entries can quickly become a slippery slope. One thing that I have discovered in my years working in Cataloging and Interlibrary Loan is that no one can anticipate all the ways that users will search for a given title. It seems easy enough to add an entry for “Time magazine” on the assumption that users might search for that title, but when you start to consider all the possible variations of a title like “Consumer reports”, things get complicated.
I am also not sure whether this is a major problem for users. Online catalogs make it easier for users to find what they are looking for even with an inexact search. In your example, a search for “time magazine” in the OPAC will actually retrieve the record for the magazine. A search for “people magazine” will not, but a search for “people”, filtered by magazine, will. Without hard evidence that users are consistently or frequently unable to get to records for materials that they want or need, I prefer not to insert added entries.
I started to go through the list of magazines that I received from Greg and add tags to them with their titles appended by "Magazine". This was very tedious, and I learned that we were gaining a federated search, so my efforts may have been for naught. I did not make it to Time, or even near the letter T going alphabetically, but while preparing this comment I searched for "time magazine", filtered, and clicked it, which either automatically created the tag or I added it manually; I can't remember. (That is not to be read as me thinking that automatic tagging is an entirely positive thing...) I have now manually added the entry "people magazine" as a tag, which makes searching for "people magazine" no longer a false negative search. I also added subjects to the tags for magazines, but I learned that at least one subject, Lesbian, is not allowed as a tag for Out or Advocate, which further defeats the purpose of going through and tagging the magazines to improve their searchability, since I couldn't do it for all of them anyway.
It seems that the methods we currently have work fairly well; your count of 541 titles is accurate. I can understand your concern about the fact that items for some issues of magazines have been coded with the media code “book”; and that there are some “book” records in the catalog that actually represent magazine issues. Luckily, these few items don’t throw off the total count. As long as at least one issue of a magazine has the media code “magazine”, and at least one record for the magazine title is created as such, a search for the MZ collection will retrieve the title.
I am not sure why some issues have been coded as “books”, but I am sure it is an easy error to make. The “book” records for individual titles were mostly created in the early days of CarlX, when magazine checkin was still being figured out. If a member wanted to check out an issue, and the issue did not appear in CarlX, staff sometimes created temporary records to allow the issue to check out. That should no longer be happening.
You are correct that the search provided in the ILS Q&A does not retrieve all magazine titles. It is a flawed search in that it is based on the assumption that the term “frequency” appears in all records for magazine titles; it does not. It appears in records that were machine-generated from the old system, but not in new records generated by Cataloging. To locate records added by Cataloging, replace “frequency” in the search with “periodicals”. Unfortunately, I have not been able to come up with a method to combine both searches; LS2PAC does not seem to offer an “or” search option.
The CarlX Staff Client contains many more records for magazine titles than are retrieved using either the MZ collection search or the OPAC. Approximately half the titles listed no longer have active holdings; no MLS library currently subscribes to the title, and all issues have been withdrawn. The advantage of using the OPAC to search for periodicals is that it only displays results for titles with active holdings.
I would think that MSL could provide a list of all magazine titles that our libraries currently subscribe to, if that would be helpful to you.
no:"frequency" && facets.collection:"MAGAZINE" && facets.format:"Magazine" yields 51 results. su:"periodicals" && facets.collection:"MAGAZINE" && facets.format:"Magazine" yields 450. Combining them with (no:"frequency" || su:"periodicals") && facets.collection:"MAGAZINE" && facets.format:"Magazine" does yield 501 results. I don't know that this is an exact list, but it is likely closer. Out of curiosity, what makes LS2OPAC categorize Popular Mechanics as a Book? The record is very short, not that that helps, since I don't understand 3 of the 4 fields.
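The counts add exactly (51 + 450 = 501) only if no record matches both searches; a quick sanity check of that union logic, using hypothetical record-ID sets standing in for the two searches:

```python
# Hypothetical record IDs standing in for the two searches above.
machine_generated = set(range(51))        # records hit by no:"frequency"
cataloger_created = set(range(100, 550))  # records hit by su:"periodicals"

combined = machine_generated | cataloger_created  # the || (OR) search
assert not (machine_generated & cataloger_created)  # no overlap
print(len(machine_generated), len(cataloger_created), len(combined))
# prints: 51 450 501
```

If any record carried both "frequency" and "periodicals", the combined count would be less than the sum, so 501 is itself evidence the two groups of records are distinct.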
The record for Popular Mechanics that displays as a book is a temporary record created by a Del City staff member. The "record" is very short because there really is no record, just a title. Temporary records are always created as books, no matter what the media actually is. I'm not able to say why a staff member created a temporary record, unfortunately, but the record will show in the OPAC as long as it has an active item attached to it. When/if DC withdraws the item, the record will no longer appear in the OPAC.
I must have had a real lapse in intelligence because I swapped location codes with format... I guess because they are both facets...
The best examples of what I was trying to ask are probably Il sorpasso and Dogs on the inside. Why are these artifacts? How does LS2OPAC decide formats?
I'm curious because I have tried to use the facets to filter results, but I have found them to be erroneous. The biggest error is DVD because it is such an attractive click away. DVD only contains a fraction of the Visual Materials facet (which doesn't contain all of the DVDs either). Of course I run into a similar issue with location codes, as evidenced in your previous comment.
LS2PAC determines facets such as format and collection based on codes in four MARC record fields: 000, 006, 007, and 008. Fields 000 and 008 are present in all records; 007 is present only in records for nonprint (which includes all AV and e-media); 006 is present if a record describes material in multiple formats (such as a book with a CD).
Each of these fields consists of multiple codes. Each code is a single alphanumeric character. The meaning of each code is determined not only by the character, but by its position (byte) in the field. The fields look completely confusing when you see them in a record, like a string of random characters and/or spaces. Even people familiar with MARC records need a translation device to make sure that all characters are in the right places!
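As a rough illustration (not TLC's actual logic), a discovery layer's format decision might key on leader byte 06 and, for nonprint, on 007 byte 00. The byte positions and code values below are standard MARC 21; the label tables are partial, simplified stand-ins:

```python
# Simplified sketch of deriving a format facet from MARC fixed fields.
# Byte positions and codes are standard MARC 21; the label mappings are
# partial and illustrative, not LS2PAC's actual tables.

LEADER_06 = {            # leader byte 06: type of record
    "a": "Language material",
    "g": "Projected medium",         # includes videorecordings
    "j": "Musical sound recording",
    "r": "Three-dimensional artifact",
}

FIELD_007_00 = {         # 007 byte 00: category of material
    "v": "Videorecording",
    "s": "Sound recording",
    "c": "Electronic resource",
}

def format_facet(leader, field_007=None):
    """Derive a coarse format label; fall back to the leader's broad type."""
    broad = LEADER_06.get(leader[6], "Unknown")
    if field_007:        # nonprint records carry an 007 that refines the type
        return FIELD_007_00.get(field_007[0], broad)
    return broad

# A DVD: leader type "g" refined by an 007 beginning "v"
print(format_facet("00000cgm a2200000 a 4500", "vd cvaizq"))  # Videorecording
# The same record with its 007 missing falls back to the broad type,
# which is how records end up displaying with very generic formats.
print(format_facet("00000cgm a2200000 a 4500"))  # Projected medium
```

The second call shows the failure mode described below: a missing or corrupted 007 leaves only the leader's broad type, and a single wrong byte (e.g. "r" instead of "g" in the leader) flips a DVD into an artifact.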
There are several problems with our catalog data and the way that LS2PAC interprets it. A fair number of the records imported from our old catalog were missing essential fields, usually 007. Our former catalog system did not accept 007 fields until about 10 years ago, so any records added before that time don’t have the field. Many of the 007 fields that existed in our catalog records were corrupted during import into CarlX; the field is present, but the coding doesn’t correlate to what the material actually is. Both these situations contribute to records displaying with very generic formats, like Visual Materials; or as being in formats that MLS does not own, like laser disc.
These records can be updated with the correct field coding, but it takes time. Cataloging does not have a complete list of the affected records. As we locate them or they are reported to us, we make corrections. However, we cannot spend so much time on corrections that we fail to get new materials out to libraries.
Another issue is that LS2PAC does not always interpret data correctly. A great example is that LS2PAC displays 4 records with the format type Blu-Ray, even though MLS does not own or circulate Blu-Ray discs. Each record describes a DVD with an accompanying CD. For some unknown reason, LS2PAC translates the codes describing the CD as referring to a Blu-Ray. TLC is working to resolve the problem. Fortunately, it only impacts a small number of records.
It appears that CarlX sometimes randomly changes coding in records. In the case of the records for the two titles that you cited, “Il sorpasso” and “Dogs on the inside”, a single alphanumeric code was changed. This led to the records displaying as artifacts rather than DVDs. The records were coded correctly when they were uploaded, and I have no idea what happened to them after that.
In our old system, MARC records were sometimes corrupted. I used to say that Cataloging tried to give them a good start in life, but sometimes they just fell in with the wrong crowd, and were exposed to corrupting influences. Now it seems that on some days Carl just doesn’t like Cataloging; at least, that’s the only explanation that I have come up with.
Please feel free to let Cataloging know when you encounter records that display with incorrect formats. We’ll do our best to get them corrected quickly.