Over the past couple of years, digital badges have become a hot topic in higher education. In his poem “Thirteen Ways of Looking at a Blackbird,” Wallace Stevens writes that he was “of three minds / Like a tree / In which there are three blackbirds.” Me, I’m of many minds, like a professor in a university in which there are proliferating ways of recognizing student learning and experience. Still, I think that badges—as a flexible way of recognizing learning, if not as a term—are going to persist.

Let’s begin with a round-up of what badges are and do.

Badge enthusiasts have articulated multiple ways to approach digital badges—as credentials, as motivators, and as portable records of learning.

Badges, of course, have their skeptics. Among the most-cited reasons for skepticism is that, as Henry Jenkins points out, badges emphasize extrinsic motivation over intrinsic and may impose hierarchy, structure and system in informal learning environments where system is not a desirable feature. Badges are the wrong way to motivate students because students might, in Mitchel Resnick’s words:

focus on accumulating badges rather than making connections with the ideas and material associated with the badges—the same way that students too often focus on grades in a class rather than the material in the class, or the points in an educational game rather than the ideas in the game.

Another criticism arises from the “wild west” nature of badging, and represents an anxiety about credentialing free-choice, informal learning experiences. Anyone can award badges. Jackie Gerstein writes:

Another possible flaw in and potential downfall of this system revolve[s around] the difficulties and dilemmas of deciding what the badges represent, how one earns the badges, and how badges will be standardized for recognition of “institutions” of learning and of employment. This lack of consensus about the meaning of badges will create further problems once the learner leaves that learning platform. What value will the badges have in unrelated institutions?

When I bring up the topic with my colleagues, their response is skepticism—and almost inevitably a rendition of “Badges? We don’t need no steeenkin’ badges!” But maybe we do need badges—just not in the form they have thus far been presented to us. “Badges” is a useful term, I think, for describing interdisciplinary learning experiences that form a more-or-less coherent cluster of related knowledge and skills—much the way a major or minor traditionally has within disciplines and departments. And unlike the interdisciplinary concentrations that have sprung up at both undergraduate and graduate levels—I myself earned a “designated emphasis” in Feminist Theory and Research—badges might allow us to indicate that learning transgresses the boundaries of formal and informal learning.


Badges aren’t, of course, the first alternative scheme for acknowledging student learning and skill acquisition, but they are a part of a cultural moment where skill development is being emphasized even within intellectual environments. (Witness the exhortation that everyone should learn to code, including humanities scholars like myself.) Eric Landrum writes:

we need an enhanced emphasis on pedagogical practices to help students acquire skills. We need to devote expertise and resources to develop multiple measures of skill competency to assess and document both student achievement and institutional performance.

Landrum points out that students don’t retain knowledge long after they finish their courses (or even long after each class meeting, I might add), and thus, he argues, it doesn’t make sense to keep knowledge development and retention at the core of the undergraduate curriculum. I’m not entirely persuaded by Landrum’s argument—I don’t see the knowledge/skills dichotomy being as clear-cut as his article suggests—but I think we as faculty might trust students (at least some students) to map their own archipelago of courses and learning experiences.

Does this mean I think universities should adopt the model proposed for the New University of California, a “university” offering no instruction but conferring degrees if students demonstrate proficiency through tests? Not at all. Nor does it mean I’m ready to embrace Landrum’s every-skill-is-measurable approach to assessing students’ readiness for graduation.

When I say I want faculty to grant more autonomy to students in designing their degrees, I’m drawing on my own experience as an undergraduate. Classes like landscape architecture, detective fiction, the history of photojournalism, museum literacy, French poetry, collaborative video art, a cultural studies methods course on television studies and others may look like a random assemblage of whatever-was-offered-that-quarter, but taken collectively they constitute a meaningful constellation of knowledge and skills.

At the same time, I was doing a lot of work that wasn’t formally recognized on my transcript, but which accelerated both my knowledge development and skill acquisition. I worked at a science center, first as a classroom outreach specialist, then exhibition developer, then program evaluator. I started a couple of blogs, one personal, one on museums. I attended museum conferences and met all kinds of interesting people. This work helped me cinch my first post-Ph.D. job, in academic technology—again, a subject in which I had no formal training. That job flowed into one in higher ed pedagogy—again, something I had not studied formally. Those two jobs, along with a notion that I could write a dissertation based on archival material—for which, surprise!, I had no formal training—helped land me my current position teaching public and digital history in a history department.

My experience may look atypical (at least for an academic), but increasingly it is—or will become—the norm. It’s hard to capture all these experiences, however, on a traditional one- or even two-page résumé. Might there be a way for institutions of higher ed to formally recognize this cluster of skills that I developed half in the classroom, half out of it?


I think what most disappoints me about badges is that too often their implementation is facile; it smacks of the every-kid-gets-a-trophy soccer tournament. Attend a conference plenary session? You get a badge. Get a D- or better on an exam at the end of an online course? You get a badge. Complete one week of a MOOC? You get a badge.

I suspect I’m not alone among faculty in seeing badges as little more than a not particularly meaningful gamification of learning. That doesn’t mean, however, that the concept of the badge is completely useless. In fact, we might take advantage of the current trendiness of badges to sell university leaders on investing in a true restructuring of the curriculum that benefits both students and faculty.

What if, instead, we thought of badges as a variation of the minor or even the major? Many colleges and universities offer programs where students can design their own majors or create custom interdisciplinary concentrations that serve much the same function as minors. Traditionally, these paths required intensive faculty advising and mentoring of students. In a digital age, however, it’s possible to assign keywords, tags, or categories to courses. Students could propose a course of study from the offered classes, with the database suggesting courses that might be compatible based on these tags and a recommendation engine that recognizes patterns of student enrollment; students also might be limited from taking too many courses in the same category. As part of this system, faculty could designate a cluster of courses across disciplines that could function as an alternative to a major or minor. These clusters become badges, and students could be both creative and strategic about which combination of badges they pursue.
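The tag-based matching I have in mind could be sketched, in a minimal way, like this. Everything here is hypothetical—the course IDs, the tags, and the per-category cap are illustrative stand-ins, not drawn from any real registrar system:

```python
from collections import Counter

# Hypothetical catalog: course IDs mapped to descriptive tags.
CATALOG = {
    "HIST301": {"research", "writing", "archives"},
    "ENGL210": {"writing", "analysis"},
    "STAT120": {"quantitative", "research"},
    "ART250":  {"visual", "digital-media"},
    "COMM340": {"public-communication", "digital-media"},
}

MAX_PER_TAG = 2  # limit on courses sharing the same category


def suggest_courses(chosen, catalog=CATALOG, max_per_tag=MAX_PER_TAG):
    """Rank unchosen courses by tag overlap with the student's proposal,
    skipping any course whose tags have already hit the per-tag cap."""
    # Count how many chosen courses carry each tag.
    tag_counts = Counter(t for c in chosen for t in catalog[c])
    proposal_tags = set(tag_counts)

    scored = []
    for course, tags in catalog.items():
        if course in chosen:
            continue
        # Enforce the "not too many courses in one category" limit.
        if any(tag_counts[t] >= max_per_tag for t in tags):
            continue
        overlap = len(tags & proposal_tags)
        if overlap:
            scored.append((overlap, course))
    return [c for _, c in sorted(scored, reverse=True)]
```

A real recommendation engine would also weigh historical enrollment patterns, but even this toy version shows the shape of the idea: a student proposing `HIST301` would be nudged toward the writing-tagged literature seminar and the research-tagged statistics course, while unrelated studio offerings stay out of the list.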

Let’s consider some possibilities in a field like history.

A traditional course sequence for history majors includes several foundational survey courses based on geography and/or era, followed by more in-depth seminars on particular regions (e.g. North Africa), time periods (the Reformation), or themes (war and genocide). Students also might take one or two courses that teach the conventions of writing in the discipline. If faculty structure their courses thoughtfully and students actually do the (often voluminous) reading and participate fully in class, the history major will graduate with the ability to conduct research across multiple media, analyze primary and secondary sources, synthesize ideas and evidence into an argument and communicate effectively in writing. That’s the ideal, however. In an era of growing class sizes, adjunctification (in which faculty teach ever-greater numbers of students as they piece together full-time work across several colleges and universities) and MOOCs, students aren’t held to the same levels of accountability as they might have been in the past. If courses focus on retention of knowledge—as demonstrated, for example, on multiple-choice or short-answer exams—rather than student research and writing, students’ critical thinking and communication skills might be underdeveloped at graduation. Furthermore, students may lose interest in engaging intellectually with history if they are subjected to too many required courses that just aren’t interesting to them.

Instead, students interested in history might pursue a series of badges that emphasize complementary knowledge and skills. These may or may not be related to employment. Say a student wanted to pursue a career in history outside the classroom—a field typically termed “public history,” with jobs traditionally in museums, archives, and government agencies, but also increasingly encompassing game design, film consulting and other digital content development. In lieu of a traditional history degree, she might pursue badges in Original Research (Humanities), Communicating Knowledge to the Public and Digital Media Production. The cluster of courses for the Original Research badge, for example, might include several history or literature seminars that emphasize research and writing, a statistics course, an internship in the state archives and a capstone course in which the student produces a significant work of original research—perhaps a traditional essay, a mini documentary, or a carefully curated and interpreted repository of digitized primary-source documents and artifacts.

A student who would have traditionally majored in English or a foreign language might pursue badges in Cross-Cultural Literacy, Digital Publishing and Translation. A gender studies major might opt instead to pursue badges in Original Research (Social Science), Advocacy and Activism and Cross-Cultural Literacy. A studio art major might seek badges in Visual Literacy, Digital Media Production and Theory and Practice of Creativity. The last of these might include courses representing traditional studio practice but also philosophy, psychology, cognitive science and arts and humanities pedagogy, as well as observation of K-12 students in music, art, lab science and creative writing classes. A student who wants to start up her own fashion design studio might pursue courses in drawing, graphic design, accounting, marketing and textiles to earn a badge in Arts Entrepreneurship.

Badges might thus comprise knowledge and skill development appropriate for employment, yes, but also emphasize the kinds of broad liberal-arts-and-sciences training and cultural literacy essential for contributing fully to civic life. Students with traditional majors and general ed courses on their transcripts often have difficulty articulating to employers and others why their particular degrees are useful. If a student can’t explain how her knowledge and academic experiences transfer to the workplace, a nonprofit’s director might not see the immediate benefit in hiring a history or anthropology or philosophy major. Rethinking traditional majors as badges like those I describe above might help students, employers and other stakeholders better understand a student’s areas of experience and expertise.


All of this is, of course, merely a rough sketch, but it draws on and takes advantage of several cultural, economic, and technological trends:

  • a decline, especially in regions like Idaho, in middle-class jobs, due to automation and overseas outsourcing
  • the availability online of all kinds of more or less formalized learning, including MOOCs, language-acquisition software, digital community-development, entrepreneurial opportunities, and more
  • low graduation rates at many universities, and an insistence from parents, students, and state legislators that these rates improve
  • an increasing interest in competencies instead of credit hours
  • “big data” computing capacity on campus that might be used to recognize patterns of student course-taking; track the placement of graduates, the courses they pursued, the skills they use in their new jobs, and their satisfaction in their lives and careers, and match these to current students’ (as yet unrecognized) interdisciplinary curricular clusters; and the development of recommendation engines that could identify course clusters that students might find useful in good careers

Does this mean a ton of extra work for faculty? It shouldn’t. Many of us already help students figure out which courses will most help them secure jobs in their chosen fields, and determine which courses from other colleges might transfer for credit. Student advising staff, the registrar’s office, departments and deans might all work together to draft guidelines for what constitutes sufficient quantity and quality of work for a badge.

Perhaps the meaning of “digital badges” has already been too diluted by being issued in so many varying contexts. Still, I foresee a paradigm shift in which universities will increasingly need to recognize different forms of learning. Students, who more and more perceive themselves as consumers, will choose universities that are less constraining about general education and major requirements and offer a more flexible path to graduation.


The views and opinions expressed here are those of the writer and do not necessarily reflect those of Boise State University or the School of Public Service.

  • It’s a relief to see a measured critique of badges that is not totally dismissive. We do need alternative forms of assessment, and we need those assessments to be meaningful.

    My personal hunch is that the employment marketplace will make them meaningful. Perhaps your colleagues who say “we” don’t need badges are right about academic settings. But with the credentialing bubble, enrollment bottlenecks and out-of-control costs in higher education, employers do need some other form of assessment besides the degree.

    Robert McGuire
    Editor, MOOC News and Reviews

  • I agree…an excellent, well-balanced article! I also think that badge-based training courses, MOOCs and other alternative “skill validation” approaches will be extremely beneficial for niche, professional learning scenarios. For example, we offer eLearning and skill development for project, innovation and product management professionals or seekers. There really aren’t many traditional, formal education opportunities for these learners, so having an option to recognize and validate self-learning via online lectures, homework in the form of follow-up reading, concept development/skill practice via templates, assessments, work processes and simulation-based experiential learning becomes a valuable option to both the learner and potential employer.

  • Thanks for your comments. I agree about needing alternative forms of assessment because higher ed institutions are experiencing the issues Robert describes: enrollment bottlenecks and escalating costs in particular.

    I think the challenge for providers of alternative assessments will remain quality control; there are plenty of badges for completion, and those are easy enough to automate, but awarding badges based on quality of work or extent of learning is another beast altogether. It’s akin to the difference between taking class attendance and evaluating students’ final projects for a course. The first is simple; the second takes some savvy and—speaking from experience—a good deal of time.

    • James

      Are you interested in connecting with me? 🙂