Dec 5, 2011

Are Library Degrees Becoming Computer Science 'Light'?

I'd like to thank Professor Lankes for introducing the term 'computer science light'. He brought it up last year when a student in our class asked him, essentially, "We keep hearing about all these encoding languages, programming languages, schemes, systems, etc... should I be learning all of these?" His response argued that, no, we shouldn't treat the field of study like "computer science light", especially because: why wouldn't you then just take Computer Science? You would basically be setting yourself up to be not as good as a Computer Science undergraduate, let alone a C.S. graduate student. He then explained that, yes, it is valuable to gain some degree of expertise in these technologies, but at the end of the day, they shouldn't be our primary focus.

Now, a year removed from that lecture, which I fully agreed with at the time, I'm not sure it holds true anymore. Sure, his argument is sound and describes the ideal situation, but I'm finding it doesn't really hold in the library professional world. Budgets are tight. Libraries can no longer afford a fully-staffed IT services department to handle the 'heavy lifting'. Learning these technologies not only increases your worth to potential employers, it immediately qualifies you for job postings that ask for 3-5 years of experience, because, guess what? In 3 to 5 years of digital librarianship, you can't help but learn many of these technologies. To be a digital librarian these days, you are expected to know an alphabet soup of metadata languages, XML, TEI, Web design, databases, and programming. That's the stark reality of the profession right now.

This goes back to the fundamental friction between librarianship as a professional practice and librarianship as a scholarly pursuit. We still can't really decide whether the degree should work almost like a trade school; after all, librarianship, although varied, is ultimately a fairly homogeneous profession, and Professor Lankes himself argues that we shouldn't splinter ourselves into over-specialized groups (unfortunately, though, positions such as 'Web services librarian', 'reference librarian', and 'metadata librarian' are the norm). SU's approach of having a 'core' program that you then adapt to a certain specialization (which I think is a good approach) is turning out not to reflect the marketplace. These specializations, like Digital Libraries, School Media, and eScience, are all under attack and in danger of being abandoned. They're at a crossroads: either return to one broad umbrella of a program, or splinter even further into highly specialized sub-degrees.

The response may be to make the Digital Libraries and eScience programs even more involved, with more mandatory technology courses, in order to produce qualified grads. But then we truly are making the profession 'computer science light'. I'm not sure what the solution is.

Video Games are Hard to Archive

Today I read an interesting article that took a critical look at the Smithsonian's exhibit "The Art of Video Games". The article argues that many 'classic' games were omitted, because the poll conducted to choose what to include favored only hugely popular home console games.

I've dealt with the question of how to archive video games before. The medium has a fairly rich history, and from a certain viewpoint archiving it seems simple, since it is so recent and many of the 'originals' still exist. But as I found out, it is very hard to get the history straight, organize the items, and above all else describe them. I attempted to do just that in a metadata schema I adapted from Dublin Core and EAD (here), but I frequently ran into difficult decisions. Would I include arcade games? Ultimately I decided to, because arcade games are vital to video game history, despite their insignificance today. I also included PC games, despite their relative obscurity.

Underlying all this is the question of whether video games are art in the first place. I've argued in the past that they are, but not in the same way others do. That is, many people want to compare video games to existing forms of art in order to find some legitimacy. I am of the opinion that this comparison can only fail. The way to do it is to recognize video games as a new form of art, instead of awkwardly comparing them to completely different media. Was it important to describe how a photograph was similar to a painting? Or was the real breakthrough to develop photography as a wholly different art medium? I would argue the latter, because initially every comparison between a photograph and a painting resulted in the photograph coming up short.

I'm still skeptical that video games will see legitimacy in the world of archives and the arts, but there is hope, simply because of the popular growth of video games, with some grossing more in their first days than some movies make in an entire year. However, as the article points out, that legitimacy starts to evaporate if we judge the artistic merits of a game by its sales. There is significant overlap, of course: the NES Super Mario Bros. is both an artistic high water mark and a commercial blockbuster. But much as with independent and foreign films, which see little commercial success yet receive critical acclaim, we must document the video games that had something to say, or presented something significant, despite being commercially insignificant.