Contributor: Jenae Druckman Cohn
Affiliation: Stanford University
Email: jdcohn at stanford.edu
Released: 14 August 2018
Published: Fall 2018 (Issue 23.1)
When students compose in digital, public-facing spaces, they take on the dual role of writer and designer by considering how the visual and spatial design of web-based content may shape a reader’s understanding of their work. As Kristin Arola, Jennifer Sheppard, and Cheryl Ball (2014) attested, “both design and content influence how audiences respond to a text’s message, so developing familiarity with design practices as well as textual composition is critical for successful communication” (p. v). Considering the audience’s responses to the combination of textual composition and visual design is a key component of multimodal pedagogy, yet conversation about how the combination of text and design influences readers has not fully made its way into ePortfolio pedagogy.
Scholars interested in ePortfolios have explored the ePortfolio’s role in the composition classroom from a variety of perspectives, including student-centered assessment practices (Elliot et al., 2016; Kelly-Riley, Elliot, & Rudniy, 2016; Silva, Delaney, Cochran, Jackson, & Olivares, 2015; White & Wright, 2016), the development of metacognitive thought (Bokser et al., 2016; Bowman, Lowe, Sabourin, & Sweet, 2016; Colby, 2005; Munday, Rowley, & Polly, 2017; Yancey, 2016), the development of audience awareness (Benander & Refaei, 2016; Cambridge, 2008; Clark, 2010; Gallagher & Poklop, 2014; Jones & Leverenz, 2017), and the cultivation of digital composing skills for visual design and accessibility (Klein, 2013; Oswal, 2013; Wills & Rice, 2013). What I am interested in exploring is a more deliberate engagement with how composing as writers and designers in the ePortfolio space can impact the user experience, whether for the instructor as assessor or for a broader public. Indeed, if we are going to take seriously the “e” side of constructing an ePortfolio, it is valuable to consider the ways in which a public-facing web presence may impact how users engage with and understand the experience of reading an ePortfolio.
User testing is already a popular activity in technical communication courses (Bartolotta, Newmark, & Bourelle, 2017; Blythe, 2001; Miller-Cochran & Rodrigo, 2006; Salvo, 2001), but there is little documentation of how user testing exercises could be implemented to help students assess the efficacy of ePortfolio projects. By employing the language of user testing to understand the impact of their ePortfolio on an audience of their choosing, students may better understand how to make effective web design choices and how to identify those choices in their peers’ work. In so doing, the reflective exercises that are already part of ePortfolio construction become manifest at the level of design; students reflect not only on their writing but also on their design choices and on how others have responded to their design. Composition scholars argue that metacognition facilitates transfer; therefore, the value of constructing an ePortfolio may become even more transferable to contexts where students continue to curate and design content for public-facing digital audiences (Jarratt, Mack, Sartor, & Watson, 2009; Nelms & Dively, 2007; Peters & Robertson, 2007; Wardle, 2007). Activating awareness of how design and navigability choices shape communication also advances the affordances of the ePortfolio itself, making more visible the ways in which the digital space may help students reach their intended audiences (Klein, 2013; Wills & Rice, 2013).
This webtext details a series of user testing activities that can be implemented in classes that employ ePortfolio pedagogy and where the final ePortfolio product is intended to showcase the student’s work. This activity could also be used in classes where the final ePortfolio is intended as evidence of a student’s learning, but it may have greater value for students who intend to show their ePortfolios to audiences outside of the classroom. The intention of this user testing activity is to offer a peer- or self-assessment framework that can give students a clearer sense of how navigating an ePortfolio may impact a reader’s ability to understand the ePortfolio’s content and the story that it tells about a student and their writing experiences.
The following exercises were developed in the context of two upper-division ePortfolio preparation classes within Stanford University’s Notation in Science Communication (NSC) program. The NSC allows students to pursue a “mini-minor”: they take three classes within Stanford’s Program in Writing and Rhetoric and two writing-intensive classes within the sciences. In addition, their required capstone project is to create a showcase ePortfolio that reflects how they see themselves as science communicators. This showcase ePortfolio is introduced as a way for students to communicate what they’ve done in the NSC program to outside stakeholders like graduate admissions committees and future employers.
By encouraging students to create a showcase ePortfolio, the program aims to help students develop agency over their writing, understand questions of digital identity, and consider how the curation of their writing may change for different audiences, per the work of other ePortfolio initiatives (Clark, 2010; Gallagher & Poklop, 2014; Jones & Leverenz, 2017). To advance these goals, students choose which work they’d like to include in their ePortfolios, and many include a variety of works, ranging from research papers and scientific posters to podcasts and policy briefs, to show the breadth of work they’ve done to communicate both with specialist scientific audiences and with popular audiences who may not have a scientific background. The program encourages the inclusion of artifacts across this range of modes and genres with the awareness that scientific discourse communities are particularly interested in producing, consuming, and circulating multimodal content (Reid, Snead, Pettiway, & Simoneaux, 2016).
As an Academic Technology Specialist at Stanford University, my role is to offer undergraduate students and faculty alike guidance on how best to leverage digital tools for learning. Because the ePortfolio is such a central component of the NSC, I frequently plan lessons within the portfolio preparation classes to help students think carefully about the choices they make in designing and building their ePortfolios. I developed these activities to ensure that our students’ work in the ePortfolio creation software our campus uses (Digication) was not just focused on aesthetics for aesthetics’ sake, but also considered how design was interlinked with content and with the experiences of the multiple audiences for whom they are creating their ePortfolios in the first place. As a technologist, it is always my goal to link the use of tools with digital pedagogies. I see metacognitive engagement with the affordances and limitations of particular tools, like our ePortfolio platform, as essential to the work of helping students become more critically engaged with how the technologies they use impact the communication of their ideas. Indeed, were we to use another ePortfolio platform or allow students to choose their own platform altogether, the same metacognitive principles would apply.
Figure 1 (below) offers a visual guide to the six steps I’ll discuss in detail with text and image in the two following sections. The first section, “Scaffolding the User Testing Exercise,” explores steps 1–3, while the second section, “Conducting the User Testing Exercise,” explores steps 4–6.
In this section, I detail the first three of six steps for this user testing exercise, exploring the ways in which the experience of user testing students’ ePortfolios can be scaffolded. Before students begin considering the design of their own ePortfolios, I put them in the position of curating another student’s content so that they can make choices at a critical distance from their own work. This critical distance, which comes from organizing someone else’s artifacts and thinking through their experience as both a potential author and a user, allows them to conceptualize more clearly what they will eventually have to do with their own ePortfolio. Additionally, organizing someone else’s artifacts gives students the opportunity to develop metacognitive awareness around the task of curation itself. By justifying organizational choices, students practice reflecting on the curation choices they will soon make in their own ePortfolios.
For the first part of the user testing exercise, students are organized into small groups of three to four students. Each group receives a stack of papers; each piece of paper has a description of an artifact that a student has put into an ePortfolio. For example, one piece of paper might say, “Literature review about genomics research” or “Podcast on the effects of climate change on a small town in Ecuador.”
The students are then given the task of grouping the artifacts into thematic clusters that will make up the navigational menus of the ePortfolio. That is, the students must decide how, if they were the curators of the content, they would help a reader understand and navigate the relationships between the artifacts or the stories the artifacts might tell about who the student is as a science communicator (see Figure 2 for a clearer illustration of this activity).
The small groups of students then share the logic of their artifacts’ organization. For example, they answer questions like, “Why did you group together a literature review essay with a scientific poster?” and “Why did you group together a podcast with a blog post?” The result of this exercise is that many students wind up curating content into categories based on audience, genre, or topical theme. This exercise reveals to students some of the different ways that their own audiences will read and understand the organization of their artifacts; it is up to them as designers of that content to help their audience understand what the groupings of artifacts attempt to say about their work.
After deciding on a way to group the artifacts, the students then have to write a one-sentence narrative that explains the “story” of the person whose artifacts they’ve grouped. One example of a story might be, “This person’s ePortfolio reflects how they are able to communicate scientific topics to specialist, popular, and business-minded audiences.”
Once students come up with their one-sentence narrative, they move to identifying a color scheme and a visual metaphor to help tell the ePortfolio’s story in a visual, rather than simply a textual, form. While the organization of content is an important component of user experience, the visual design is also central to helping the viewer understand precisely what the ePortfolio aims to communicate (Munday et al., 2017). Students have a limited period of time to accomplish these tasks, and they can either draw what they think the visual metaphor should be or find an image online. Similarly, for the color palette, they can find a palette online to share or they can use colored markers or pens to illustrate what they think the color palette should be. The results tend to be creative: one group, deciding that the ePortfolio they were curating showed a science communicator’s journey to becoming an international activist, chose a globe as their visual metaphor and an earthy green color palette to reflect the communicator’s dedication to environmental causes.
At the end of this exercise, the instructor facilitates a conversation that ties the work back to user experience. This opening content curation and visualization exercise helps students gain some critical distance from their own ePortfolios and recognize the importance of finding pithy and visual ways to represent the experiences they aim to chronicle.
As a follow-up to the content curation activity, students then engage in some user testing of their own ePortfolios. Before students dive into accessing the ePortfolios and testing out the navigation and experience of each one, they review a handout that breaks down several categories imperative for user testing (see Figure 3). These handouts were developed primarily from my experiences doing web development and user testing work for my program’s own websites. As a web developer, I knew that it was vital to assess ease of use alongside the rhetorical functionality of visuals and text on a website. My background in computers and composition helped me bridge my understanding of web development practice with questions that could evoke metacognitive thought and reflective practice.
It was important to me that students use industry-specific terminology to 1) give the task greater authenticity and 2) help students see how the work that they are doing for the ePortfolio overlaps with the work of professional digital communicators. Therefore, the language of the headers in each of the tables on the handout is drawn from Jakob Nielsen (1994) and usability.gov, a resource affiliated with the Digital Communications Division in the U.S. Department of Health and Human Services (2017).
Once students receive the handout (Figure 3), they are advised to use terms and language from it to ensure that the conversation about the ePortfolios does not turn into personal attacks or critiques of design choices but is instead framed by observations of the experience. The handout also allows some authentic engagement with the multiple audiences that will encounter the ePortfolio and helps situate ePortfolios as products that could be encountered serendipitously in online spaces (Cambridge, 2008). Students are, of course, given some guidance to respond to their peers’ work respectfully, knowing that their peers, the creators of the ePortfolios, will be with them; these guidelines do not deviate much from best practices in peer review policy (Corbett, LaFrance, Decker, Smith, & Smith, 2014).
When the class is ready to begin the user testing, the room is divided into peer review groups of three, and each group has access to a computer workstation for accessing the ePortfolios. At the start of the user testing process, students are informed that each group will work through three rounds of review, with group members switching roles each round. The roles are as follows:
- User: One student takes on the role of navigating through an ePortfolio (as long as it is not the user’s own ePortfolio), talking aloud about their experience of finding the information that’s on the ePortfolio, and commenting throughout on how easily they are able to find the information and how much their experience of accessing the information aligns with their expectations of what they would find in the ePortfolio. They use the handout distributed at the start of the activity to ensure that they’ve answered the questions that will be useful for helping the student make improvements to the ePortfolio.
- Note-taker: One student takes notes about what the user is saying as they navigate through the ePortfolio. The student in this role is typically the student who has authored the ePortfolio that the user is reviewing. The note-taker can use the handout distributed at the start of the activity but may also take separate notes; the handout is meant to be a guide rather than a strict heuristic. The note-taker keeps a record of the observations so that the team can then study and analyze the comments later.
- Analyst: One student analyzes the user’s experience while the user is navigating through the ePortfolio and, after an initial round of navigation, the analyst asks the user any follow-up questions that the analyst may have about the user’s experience, including questions on the handout distributed at the start of class that may not have been answered during the user test itself. The analyst is typically not the student who has authored the ePortfolio that the user is reviewing.
By giving each member of the peer review team the opportunity to be a “user,” each group member gets to remember what it is like to be on the outside of the design process looking in. By assigning the author of the ePortfolio to the role of note-taker, the exercise also lets the author see whether the user’s experiences or the analyst’s observations align with their own intentions and goals for what they want the ePortfolio to communicate. As the note-taker, the author of the ePortfolio may also notice concerns with inclusion in their web design practices; seeing a user operate the ePortfolio in real time could make visible the different pathways that non-normative audiences could potentially take. Rotating roles throughout this exercise thus allows students to reexamine their positionality and to consider what their work is communicating from multiple, including non-normative, perspectives.
Once the user testing process is complete, students have the opportunity to examine the feedback they received from the user and from the analyst to create a roadmap for revising their ePortfolio. They may determine that the feedback means they have a different story to tell than the one they originally conceived. Alternatively, they may discover that the order and navigability of their artifacts prevent the user from understanding the narrative they hope to tell within their ePortfolio. Either way, this is a moment in the exercise when students may engage in metacognitive work, as they will need not only to justify the work they’ve done but also to consider how their choices might change in light of seeing their audiences’ interactions.
Given the importance of responding quickly to user feedback, each student has an opportunity after the user testing is over to create a list of the next steps that they hope to take for their ePortfolio revisions. They are instructed to list out at least three changes they hope to make to their ePortfolio before the next class deadline.
After the class session is over, students are encouraged to revise and redesign their ePortfolios in response to the user feedback before the final ePortfolio is submitted.
Many of the students' final ePortfolios reflect the feedback that they've received from their peers. To see some examples of what final ePortfolios look like, check out the screencast video below.
The language of user experience gives students a clear, concrete, and transferable way to discuss the work that they do in developing ePortfolios. As more students begin to create web-based, public-facing, multimodal content, introducing them to language about revision and development from professional contexts helps them to understand not only how an audience will receive their work, but also how their work for class might be part of a larger professional conversation. Many of the students in the science communication program go on to compose other kinds of public-facing, multimodal work; their experiences of user testing may allow them to consider the positionality of the audience, including disabled and non-normative audiences, in receiving and understanding the kinds of born-digital texts they may continue to create.
A conversation about user design is useful beyond the context of ePortfolio pedagogy. Any student composing for the Web today must think critically about the ways users might navigate a webtext. Given that public-facing webtexts tend not to encourage any single navigational pathway for users to explore, students need to consider how all of the possible pathways available in a webtext may impact a user’s understanding of a website’s purpose and meaning. Even for students who are not designing webtexts, it is vital to develop a vocabulary for identifying how audiences might engage with other forms of interactive, multimodal work (e.g., interactive maps, robotic devices). User testing gives students an awareness of how a webtext can be inclusive and accessible by revealing that there may be numerous ways that audiences engage with their work.
The intersection between rhetoric and user design is an emerging field of inquiry, one that could continue to be explored with concrete connections between the language of industry web development and that of the rhetorical situation. User experience research is a field committed to understanding how and why users make particular choices when navigating an interactive space, while rhetorical research is a field committed to understanding how and why rhetors make particular language choices when constructing an argument. As our students move into the roles of content creators and curators, it would behoove us to make the connections between user experience and rhetoric concrete, so that students can approach their future composing experiences with the understanding that the work of understanding users is also the work of understanding successful communication strategies.
Arola, Kristin L., Sheppard, Jennifer, & Ball, Cheryl E. (2014). Writer/Designer: A guide to making multimodal projects. Boston, MA: Bedford/St. Martin’s.
Bartolotta, Joseph, Newmark, Julianne, & Bourelle, Tiffany. (2017). Engaging with online design: Undergraduate user-testers and the practice-level struggles of usability learning. Communication Design Quarterly, 5(2).
Benander, Ruth, & Refaei, Brenda. (2016). How authors and readers of ePortfolios make collaborative meaning. International Journal of ePortfolio, 6(2), 71–84.
Blythe, Stuart. (2001). Designing online courses: User-centered practices. Computers and Composition, 18(3), 329–346.
Bokser, Julie A., Brown, Sarah, Chaden, Caryn, Moore, Michael, Cleary, Michelle Navarre, Reed, Susan, Seifert, Eileen, Zecker, Liliana Barro, & Wozniak, Kathryn. (2016). Finding common ground: Identifying and eliciting metacognition in ePortfolios across contexts. International Journal of ePortfolio, 6(1), 33–44.
Bowman, Jim, Lowe, Barbara J., Sabourin, Katie, & Sweet, Catherine S. (2016). The use of ePortfolios to support metacognitive practice in a First-Year Writing program. International Journal of ePortfolio, 6(1), 1–22.
Cambridge, Darren. (2008). Audience, integrity, and the living document: eFolio Minnesota and lifelong and lifewide learning with ePortfolios. Computers and Education, 51(3), 1227–1246.
Clark, J. Elizabeth. (2010). The digital imperative: Making the case for a 21st-century pedagogy. Computers and Composition, 27(1), 27–35.
Colby, Richard. (2005). Digital portfolio sensibility: An interview with Kathleen Blake Yancey. Computers and Composition Online, 5(2).
Corbett, Steven J., LaFrance, Michelle, Decker, Teagan E., Smith, Allison D., & Smith, Trixie G. (2014). Peer pressure, peer power: Theory and practice in peer review and response for the writing classroom. Southlake, TX: Fountainhead Press.
Elliot, Norbert, Rudniy, Alex, Deess, Perry, Klobucar, Andrew, Collins, Regina, & Sava, Sharla. (2016). ePortfolios: Foundational measurement issues. Journal of Writing Assessment, 9(2).
Gallagher, Chris W., & Poklop, Laurie L. (2014). ePortfolios and audience: Teaching a critical twenty-first century skill. International Journal of ePortfolio, 4(1), 7–20.
Jarratt, Susan C., Mack, Katherine, Sartor, Alexandra, & Watson, Shevaun E. (2009). Pedagogical memory: Writing, mapping, translating. Writing Program Administration, 33(1–2), 46–73.
Jones, Beata, & Leverenz, Carrie. (2017). Building personal brands with digital storytelling portfolios. International Journal of ePortfolio, 7(1), 67–91.
Kelly-Riley, Diane, Elliot, Norbert, & Rudniy, Alex. (2016). An empirical framework for ePortfolio assessment. International Journal of ePortfolio, 6(2), 95–116.
Klein, Lauren F. (2013). The social ePortfolio: Integrating social media and models of learning in academic ePortfolios. In Katherine V. Wills & Rich Rice (Eds.), ePortfolio performance support systems: Constructing, presenting, and assessing portfolios (Perspectives on Writing). Fort Collins, CO: The WAC Clearinghouse and Parlor Press.
Miller-Cochran, Susan, & Rodrigo, Rochelle. (2006). Determining effective distance learning designs through usability testing. Computers and Composition, 23(1), 91–107.
Munday, Jennifer, Rowley, Jennifer, & Polly, Patsie. (2017). The use of visual images in building professional self-identities. International Journal of ePortfolio, 7(1), 53–65.
Nelms, Gerald, & Dively, Ronda L. (2007). Perceived roadblocks to transferring knowledge from First-Year Composition to writing-intensive major courses: A pilot study. Writing Program Administration, 31(1–2), 214–240.
Nielsen, Jakob. (1994). Enhancing the explanatory power of usability heuristics. CHI ’94 Conference Proceedings.
Oswal, Sushil K. (2013). Accessible ePortfolios for visually-impaired users: Interfaces, designs, and infrastructures. In Katherine V. Wills & Rich Rice (Eds.), ePortfolio performance support systems: Constructing, presenting, and assessing portfolios (Perspectives on Writing). Fort Collins, CO: The WAC Clearinghouse and Parlor Press.
Peters, Brad, & Robertson, Julie F. (2007). Portfolio partnerships between faculty and WAC: Lessons from disciplinary practice, reflection, and transformation. College Composition and Communication, 59(2), 206–236.
Reid, Gwendolynne, Snead, Robin, Pettiway, Keon, & Simoneaux, Brent. (2016). Multimodal communication in the university: Surveying faculty across disciplines. Across the Disciplines, 13(1). Retrieved from http://wac.colostate.edu/atd/articles/reidetal2016.cfm
Salvo, Michael J. (2001). Ethics of engagement: User-centered design and rhetorical methodology. Technical Communication Quarterly, 10(3), 273–290.
Silva, Mary L., Delaney, Susan A., Cochran, Jolene, Jackson, Ruth, & Olivares, Cory. (2015). Institutional assessment and the integrative core curriculum: Involving students in the development of the ePortfolio system. International Journal of ePortfolio, 5(2), 155–167.
Usability.gov. (2017). User research basics. Retrieved from https://www.usability.gov/what-and-why/user-research.html.
Wardle, Elizabeth. (2007). Understanding ‘Transfer’ from FYC: Preliminary results from a longitudinal study. Writing Program Administration, 31(1–2), 65–85.
White, Edward, & Wright, Cassie A. (2016). Assigning, responding, evaluating: A writing teacher’s guide. Boston, MA: Bedford/St. Martin’s.
Wills, Katherine V., & Rice, Rich. (Eds.). (2013). ePortfolio performance support systems: Constructing, presenting, and assessing portfolios. Perspectives on Writing. Fort Collins, CO: The WAC Clearinghouse and Parlor Press.
Yancey, Kathleen Blake. (Ed.). (2016). A rhetoric of reflection. Logan, UT: Utah State University Press.