Horn Library Content Strategy

Content Assessment

Assessment of information resources is a fundamental function of Horn Library and part of the content management life cycle.  How information sources fit our undergraduate and graduate curricula, how well they meet information and research needs, their ease of use and accessibility, and their actual usage all contribute to assessing the impact and value of resources to the college.  Content assessment is highly collaborative, drawing on the expertise of the library's research and instruction team and the librarians who serve as liaisons to academic divisions.

E-Content Evaluation

Content

  • Breadth and depth of coverage (incl. interdisciplinary, global, multicultural)
  • Uniqueness and value-add
  • Quality & authority of sources/information
  • Academic and/or popular content
  • Types of content (e.g. articles, papers, book reviews, reports, company profiles, market research, statistics, etc.)
  • Textual and non-textual content/formats (e.g. video, audio)
  • Selective vs. complete source coverage
  • Supports curriculum and assignments
  • Supports faculty research
  • Supports job or career search & business planning
  • Up-to-date content (update frequency, embargos)
  • Completeness of archive
  • Accuracy of indexing, categorization
  • Full-text vs. abstract/indexed
  • Images, charts, graphs included in articles
  • Full content vs. limited content for academic institutions
  • Comparison with print counterpart
  • Comparison with vendor's content offerings in aggregator services

Comparison with Other Sources

  • Does the source fill a gap in our collection
  • Is it duplicative; if yes, of which sources
  • How does it compare with similar sources

Usability & Search

  • Ease of use/interface
  • Search functionality, e.g. simple, advanced, filters, etc.
  • Relevance of results
  • Unique search features or limitations
  • Integration with other databases by the same vendor
  • Saving searches
  • Help features and tutorials on website

Output Options

  • Options for presentation and customization of results
  • Variety of download, export, and print formats
  • Image availability for charts/graphs/pictures
  • Availability of image of original article

Personalization

  • “My Briefcase” type features
  • Customized alerting & RSS capabilities
  • Creation of own content subsets

Technical

  • Robustness, reliability
  • IP authentication (not username- and password-based); SSO
  • Referring URL
  • Federated search compatibility
  • OpenURL compliant, e.g. Article Linker, Fulltext Finder
  • Browser compatibility (multiple preferred)
  • Usage tracking/statistics
  • Response time
  • Availability of MARC records, if applicable
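
The OpenURL compliance item above can be illustrated with a short sketch.  The resolver base URL and article metadata below are hypothetical assumptions; the key/value pairs follow the OpenURL 1.0 (ANSI/NISO Z39.88-2004) KEV format that link resolvers such as Article Linker consume.

```python
from urllib.parse import urlencode

# Hypothetical link-resolver base URL; a real one is supplied by the
# library's link-resolver product.
RESOLVER_BASE = "https://resolver.example.edu/openurl"

def build_openurl(issn, volume, issue, spage, atitle):
    """Build an OpenURL 1.0 (Z39.88-2004) KEV query for a journal article."""
    params = {
        "url_ver": "Z39.88-2004",
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",
        "rft.genre": "article",
        "rft.issn": issn,
        "rft.volume": volume,
        "rft.issue": issue,
        "rft.spage": spage,
        "rft.atitle": atitle,
    }
    return RESOLVER_BASE + "?" + urlencode(params)

# Illustrative metadata, not a real citation.
link = build_openurl("0001-4273", "64", "2", "435", "A Sample Article")
print(link)
```

A vendor platform that emits links in this shape for each record is what "OpenURL compliant" means in practice: the resolver can then route the user to our licensed full text.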

Usage

  • Number of users overall and simultaneous
  • Number of searches or sessions
  • Content viewed or downloaded
  • Academic and non-curricular applications
  • User group, e.g. students, faculty, staff, Exec Ed, alumni, walk-ins
  • On-site vs. remote
  • Access vs. ownership
  • Account set-up and privacy (by librarian, by user)
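
A minimal sketch of how the usage dimensions above might be tallied from raw session records.  The field names and sample records are illustrative assumptions, not an actual vendor report format; real figures would come from vendor-supplied (e.g. COUNTER) reports or proxy-server logs.

```python
from collections import Counter

# Illustrative session records (assumed schema, for the sketch only).
sessions = [
    {"user_group": "students", "location": "remote", "downloads": 3},
    {"user_group": "students", "location": "on-site", "downloads": 1},
    {"user_group": "faculty", "location": "remote", "downloads": 5},
    {"user_group": "alumni", "location": "on-site", "downloads": 0},
]

by_group = Counter()      # sessions per user group
by_location = Counter()   # on-site vs. remote
total_downloads = 0       # content viewed or downloaded

for s in sessions:
    by_group[s["user_group"]] += 1
    by_location[s["location"]] += 1
    total_downloads += s["downloads"]

print(dict(by_group))
print(dict(by_location))
print(total_downloads)
```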

Vendor

  • Trusted vendor/long-term viability of vendor
  • Customer Service/Help Desk support
  • Training provided by vendor
  • Provision of usage statistics
  • Contractual issues
  • Billing issues
  • Responsiveness of rep
  • Works with consortia

Cost

  • Cost model, i.e. overall price or price per session or per content view/download
  • Pricing model, e.g. based on FTE, Carnegie Classification, usage
  • Hosting fees and frequency
  • Back files, archive
  • Vendor discounts
  • Consortia discounts
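
The cost and pricing models above are often compared on a cost-per-use basis.  The figures below are illustrative assumptions, not actual prices or usage counts.

```python
# Illustrative annual figures for comparing cost models.
annual_price = 12000.00   # flat subscription price (assumed)
sessions = 1500           # sessions in the same year (assumed)
downloads = 4800          # full-text views/downloads (assumed)

# Cost per use under the flat-rate model.
cost_per_session = annual_price / sessions
cost_per_download = annual_price / downloads

print(f"Cost per session:  ${cost_per_session:.2f}")
print(f"Cost per download: ${cost_per_download:.2f}")

# A per-use pricing offer can be compared against the flat rate:
per_download_rate = 3.00  # hypothetical pay-per-view rate
breakeven_downloads = annual_price / per_download_rate
print(f"Flat rate is cheaper above {breakeven_downloads:.0f} downloads/year")
```

Running both models against actual usage statistics makes it clear whether a flat subscription or a per-use arrangement is the better value for a given resource.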

External Input

  • Product reviews
  • Subscriptions at comparable libraries
  • Subscriptions at libraries in consortia or professional associations we belong to, e.g. Academic Business Library Directors (ABLD)

Initiate Trial (if new resource)

  • Initiate a personal trial with the vendor or, on request, a trial for a limited audience
    • Update Electronic Resource Management System (ERMS) with contact information and resource status information
    • Get ballpark pricing information up-front
    • Find out up-front whether the source supports IP-based authentication
  • Assess whether to roll out product trial to larger group
    • If yes, provide trial details and evaluation worksheet to research librarians
    • Worksheet is on Google Drive and includes five major categories for commentary:
      • Content (coverage, quality, etc.)
      • Curriculum & faculty research support
      • Usability
      • Comparison with other sources
      • Anything else
    • If not, document preliminary assessment and file on Share drive; update ERMS accordingly

Product Evaluations/Comparisons

  • Gather available information, e.g. brochures, reviews (if new source under consideration), usage statistics (if subscribed-to source), cost information, etc. 
  • If possible, obtain license agreement to aid in evaluation process
  • Identify deal breakers that will make further analysis unnecessary, e.g. if passwords are required, authorized users are too limited, costs are too high, etc.
  • Ensure feedback form includes the views of:
    • Research & Instruction librarians for classroom & curricular perspective, subject expertise, and user perspective
    • Faculty, in some cases, for their assessment and buy-in
    • The Electronic Resources Librarian, for technical considerations
  • Arrive at decision based on feedback and availability of funds (if new resource)

Documentation

  • Ensure ERMS and Share drive are updated with relevant details and documents (copy Google document into Word and file on Share drive)