Abstract

How we measure success

This issue, volume 15, number 2, is certainly one of our most substantial. Over the past couple of volumes we've averaged about 85 pages per issue, and this one clocks in at over 150. We'd like to say that we're simply clearing a backlog of accepted manuscripts, but the truth is that we're receiving more and more quality submissions. Since the date of the earliest submission included in this issue, we've received almost 200 manuscripts for consideration.

And this uptick in quality is not going unnoticed. We've had over 46,000 articles downloaded since the publication of our last issue. We've averaged almost 104,000 article downloads per year for the past five years, and we're on pace to exceed that number this year. Although this means more work for our tireless editorial board, it's a happy problem to have. If you're interested in contributing to the field by joining our Editorial Board, you can complete a self-nomination form at this address: https://forms.gle/4VT2hbcW33uLJEEh9.

Since moving to our current platform, we've had almost 700,000 articles downloaded, with another 71,000 complete articles read online. We now have 595 articles online (this issue will make it 610). And while the math here doesn't account for the relative popularity of each article, a simple back-of-the-envelope calculation (sketched in code below) says that articles are downloaded an average of about 1,000 times in a decade.

We would like to be able to offer authors a more concrete and reliable methodology for determining the penetration and effectiveness of their work. We could be very precise and offer cross-discipline comparisons with metrics like a journal's impact factor, a citation analysis for an individual article, or a particular author's h-index. But while the SoTL field has continued to gain practitioners and garner attention from scholars, it still has difficulty articulating its own legitimacy, and therefore its justification for inclusion in the analytic tools that sit behind those metrics. The Social Sciences Citation Index (SSCI), which the Web of Science uses in determining a journal's impact factor, indexes 1,645 journals on teaching and learning, yet fewer than a dozen of these are not specific to a particular discipline. Scopus (run by Elsevier) provides three separate metrics: CiteScore, SJR (SCImago Journal Rank), and SNIP (Source Normalized Impact per Paper). Its database contains over 22,000
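For concreteness, here is a minimal sketch of that back-of-the-envelope calculation. The figures are the download and article counts quoted above; the roughly-a-decade platform lifetime is taken from the text, and the result is a crude average that ignores each article's age and relative popularity.

```python
# Back-of-the-envelope average downloads per article.
# Figures come from the editorial above; the decade-long platform
# lifetime is an assumption used only to frame the average.

total_downloads = 700_000   # downloads since moving to the current platform
articles_online = 595       # articles online before this issue

avg_downloads_per_article = total_downloads / articles_online
print(f"Average downloads per article: {avg_downloads_per_article:,.0f}")
# -> Average downloads per article: 1,176 (i.e., "about 1,000" in a decade)
```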
