Data-Based Decisions: Technical and Professional Communication



2020-21

Data-based decisions correlated to direct measures
The assessment tool is still quite new, so we expect to revise it based on our experience this year and on this year's assessment discussion. Despite the need for tweaks, we found the new tool extremely useful for curricular assessment.

Our assessment shows that students are consistently strong in research, project management, and accessibility, but, as we noted in our last assessment, many students still spoke about social justice in terms of accessibility only rather than in terms of diversity and inclusion.

Proposed Solution 1: Integrate Social Justice Concerns. Regarding the assessment tool question on social justice, we will revisit the matrix, the assignment design, and the curriculum map to articulate approaches to social justice that go beyond accessibility in course topics and assignments. One way to do this is to integrate conversations on race, gender/sexual minorities, class, and ability into existing course material so students can experience these topics in the context of professional and technical writing—as integral to the practice of technical communication rather than as stand-alone topics of concern.

Proposed Solution 2: Tweak the Assessment Tool. We will also revisit the assessment tool itself. Granular data show gaps that we believe result from ambiguity in the tool, this being our first real run with it. At our next assessment, we will drop the separate consideration of holistic document design and use only “information design,” because the two categories overlap and it is unclear how we might assess “document design” separately. Further, in keeping with the spirit of the assignment (oral interview preparation), we do not need to assess the design of the document students produce. In other words, assessing “document design” does not match the “rhetorical situation” of the document, and students are not clearly told that they are being assessed on it. Instructors of the capstone are encouraged to design the final assignment around the assessment tool.

2019-20

Data-based decisions correlated to direct measures

The assessment tool is still quite new, so we expect to revise it based on our experience this year and on this year's assessment discussion. Despite the need for tweaks, we found the new tool extremely useful for curricular assessment.

Our assessment shows that students are consistently strong in research, project management, and accessibility, but many students spoke about social justice in terms of accessibility only rather than in terms of diversity and inclusion.

Proposed Solution 1: Integrate Social Justice Concerns. Regarding the assessment tool question on social justice, we will revisit the matrix, the assignment design, and the curriculum map to articulate approaches to social justice that go beyond accessibility in course topics and assignments. One way to do this is to integrate conversations on race, gender/sexual minorities, class, and ability into existing course material so students can experience these topics in the context of professional and technical writing—as integral to the practice of technical communication rather than as stand-alone topics of concern.

Proposed Solution 2: Tweak the Assessment Tool. We will also revisit the assessment tool itself. Granular data show gaps that we believe result from ambiguity in the tool, this being our first real run with it. At our next assessment, we will drop the separate consideration of holistic document design and use only “information design,” because the two categories overlap and it is unclear how we might assess “document design” separately. Further, in keeping with the spirit of the assignment (oral interview preparation), we do not need to assess the design of the document students produce. In other words, assessing “document design” does not match the “rhetorical situation” of the document, and students are not clearly told that they are being assessed on it. Instructors of the capstone are encouraged to design the final assignment around the assessment tool.

Data-based decisions that rely on indirect measures
Building on the collaboration reported in last year’s assessment, the indirect measures suggest that our continued work with Susie Parkinson (the adviser) to balance courses that emphasize critical thinking (more theory-heavy courses) with those that emphasize building, creating, and making (more application-heavy courses)* is successful. Taking into account feedback from Susie and from students, along with industry trends, we will continue to consult the new curriculum map as we design our individual courses each semester to ensure not only that we address course-specific learning objectives but also that each course reinforces broader curricular learning objectives and that the curriculum provides multiple opportunities to develop relevant literacies. All instructors are encouraged to consult the course objective map when designing their courses so that all program objectives are addressed over the course of the program.

*To be clear, all courses involve some amount of critical thinking and application, but the proportions vary.

2018-19

Data-based decisions correlated to direct measures

Technological literacy continues to be the strongest area, with the majority of students demonstrating at least an “acceptable” level of literacy. Rhetorical literacies were strong as well, but ethical literacy appeared considerably weaker. We suspect that what appears to be a weakness may be in part due to a mismatch between the assignments evaluated and our evaluation tool. Last spring the curriculum committee updated the curricular learning objectives, mapping them to each undergraduate course. We also revised the capstone assignment that is evaluated for our undergraduate assessment; this revised assignment should better map to student motivations and better align with the updated curricular map. But the new assignment has not yet been used; the timing of annual program assessment has us evaluating old assignments with a new tool this year. Next year we expect a more accurate assessment. However, ethical literacy has been a programmatic weakness in the past, so we are not discounting this year’s pattern. We will continue to incorporate considerations of justice into our curriculum, and, building upon last year’s feedback, we will design more assignments that provide opportunities for students to intervene in communicative situations and apply ethical literacies. Last year, we designated ENGL 5400 as a course that fulfills the department’s new diversity requirement, and this year we are also designating ENGL 4200 as fulfilling the diversity requirement. Ensuring that these courses meet the criteria of diversity courses will also bolster our students’ opportunities to develop ethical literacy.

Data-based decisions that rely on indirect measures 

The indirect measures suggest that our collaboration with Susie Parkinson (the adviser) to balance courses that emphasize critical thinking (more theory-heavy courses) with those that emphasize building, creating, and making (more application-heavy courses)* is successful. We will continue to consult the new curriculum map as we design our individual courses each semester to ensure not only that we address course-specific learning objectives but also that each course reinforces broader curricular learning objectives and that the curriculum provides multiple opportunities to develop relevant literacies.

*To be clear, all courses involve some amount of critical thinking and application, but the proportions vary.

2017-18

Data-based decisions correlated to direct measures

Technological literacy was the strongest area, with every student but one demonstrating at least an “acceptable” level of literacy. Ethical and rhetorical literacies were slightly more mixed, with 3 students demonstrating marginal levels of literacy for each of these categories. We noted that students who scored particularly strongly in ethical literacy mentioned Digital Writing classes in which they intervened specifically to make existing digital “texts” (including videos) more accessible to users with a wider range of abilities. This change was implemented last year in response to mixed ethical literacy levels, though perhaps not in time for all capstone students to take such courses. We will continue to incorporate into our curriculum (including more classes than just one advanced course) assignments that provide opportunities for students to intervene in communicative situations and to apply the ethical literacies addressed in theory courses and other places across the curriculum. We are also designating ENGL 5400 Social Justice in Technical Communication and ENGL 4200 History of Diverse Englishes as courses that fulfill the department’s new diversity requirement. Ensuring that these courses meet the criteria of diversity courses will also bolster our students’ opportunities to develop ethical literacy. We are surprised by the slightly mixed performance in rhetorical literacy, as our undergraduate curriculum heavily emphasizes customizing documents to unique stakeholders and situations. In response to a potential misalignment of our curriculum and the literacies we assess, we will revisit and update our curriculum map. This document maps each course in the undergraduate TCR curriculum to the list of literacies we want to help students develop and demonstrate. Updating this document will allow us to revisit potentially outdated conceptions of technological literacy (which offers one possible reason why student perspectives and our assessment misalign), as well as to inform coordinated, intentional course design to ensure that all literacies are addressed in multiple courses and multiple ways across the curriculum.

Data-based decisions that rely on indirect measures
To mediate the apparent tension in student needs re: technological competency, the STC student club has been holding learning hours in the smart classroom Ray B West 101, allowing a dedicated time and space for students to share and develop their technology skills. In addition, we have increased the emphasis on digital editing in ENGL 4400 and made the technologies course, ENGL 3410, a prerequisite for editing (ENGL 4400). Finally, we continue working with Susie Parkinson (the adviser) to balance courses that emphasize critical thinking (more theory-heavy courses) with those that emphasize building, creating, and making (more application-heavy courses).*

*To be clear, all courses involve some amount of critical thinking and application, but the proportions vary.

2016-17

Data-based decisions correlated to direct measures

In response to indirect measures (see below), we changed the measure of basic literacy to technological literacy in our assessment plan. The three measures now include the following: technological, rhetorical, and ethical literacies. Surprisingly, and in contrast to the students’ concerns expressed below, technological literacy was the strongest area (every student assessed demonstrated at least an “acceptable” level). This was followed by rhetorical and then ethical literacy. One pattern we noticed re: ethical literacy is that the portfolios produced with a certain web design technology continued (see 2015-16) to lack accessibility measures such as alt text for images (which enables screen-reading technologies to read descriptions of images). Now that this type of assessment has occurred twice, as mentioned below, we suspect that students may be largely unaware of how to use this particular tool to enact the accessible communication practices they have learned in classes. Thus, this is not just an ethical literacy issue but also a technological literacy issue, one that our technological literacy rubric is not assessing. Based on this data, we will adjust the technological literacy assessment rubric to include the ability to demonstrate use of accessibility technologies. We will continue to maintain the existing emphasis on the ethics of accessible design, but we will place a greater emphasis on learning accessibility technologies. If one contributor to lower ethical literacy is ignorance of how to use accessibility technology tools, then technological literacy and ethical literacy must overlap at this point. In concert with decisions that rely on indirect measures, then, we have implemented semester-long technology projects—where students evaluate and teach the class about a technology—into four courses (ENGL 5420, 4410, 3410, and 5410). These projects will include accessibility technologies.

Data-based decisions that rely on indirect measures

Past responses from students revealed that they appreciate the mix of theoretical and applied approaches to the field that they experienced in the undergraduate program. However, this feedback had been so uniformly positive that the faculty tested a new approach to garnering indirect measures. Rather than adding a few custom questions to the IDEA evaluation for the undergraduate capstone course (as we stated we would do in the 2015-16 assessment below), we solicited anonymous feedback in a planning meeting with the students in the PTW capstone class. This approach allowed graduating seniors an anonymous way to provide reflective feedback and suggestions to the professors who grade their work. We asked the following questions: In reflecting on your experience in the Professional and Technical Writing program, what feedback do you have? What would you like to see more of? What would you like to see less of? Students still appreciated the opportunity to shape their projects to their own professional interests, but many graduating students reported feeling unsure of their next professional goal, be it grad school or a career in industry. The majority of the respondents said they would feel more confident if they had more technology training. As a result, we have implemented semester-long technology projects—where students evaluate and teach the class about a technology—into four courses (ENGL 5420, 4410, 3410, and 5410). We have also implemented a regular technology-based workshop. Finally, we have changed the measure of basic literacy to technological literacy in our assessment plan so that we may evaluate that data in relation to these indirect measures.

2015-16

Data-based decisions correlated to direct measures

In sum, this group of Professional and Technical Writing graduates scored highly on all three measures: basic, rhetorical, and ethical literacies. Rhetorical literacy was the strongest area, followed by basic and then ethical. One pattern we noticed re: ethical literacy is that the portfolios produced with a certain online tool lacked accessibility measures such as alt text for images. We suspect that students may be largely unaware of how to use this particular tool to enact the accessible communication practices they have learned in classes. Based on this data, we will 1) maintain the existing emphasis on ethical practices such as citation and respect for confidentiality, 2) increase the emphasis on accessible design, and 3) explicitly include accessibility in the portfolio peer review activities to catch these issues earlier. If one contributor to inaccessible design is ignorance of how to use popular tools, we can address this contributor with click-along exercises in class, as well as student presentations of technology skills focused on enacting accessible design.
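To illustrate the kind of check that could support the portfolio peer review step described above, the short Python sketch below flags img elements that lack alt text in an exported portfolio page. This is only a minimal sketch of one possible in-class exercise, not part of our assessment tool or curriculum; the file contents, function names, and sample markup are hypothetical.

    # Minimal sketch (hypothetical): flag <img> tags without alt text in an exported portfolio page.
    from html.parser import HTMLParser

    class AltTextChecker(HTMLParser):
        """Collects the sources of <img> tags that are missing or have empty alt text."""
        def __init__(self):
            super().__init__()
            self.missing = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                attributes = dict(attrs)
                alt = attributes.get("alt")
                if alt is None or not alt.strip():
                    self.missing.append(attributes.get("src", "(unknown source)"))

    def report_missing_alt(html_text):
        """Return a list of image sources that still need alt text."""
        checker = AltTextChecker()
        checker.feed(html_text)
        return checker.missing

    if __name__ == "__main__":
        # Hypothetical sample markup for demonstration only.
        sample = '<img src="chart.png"> <img src="logo.png" alt="Program logo">'
        print(report_missing_alt(sample))  # prints ['chart.png']

In a peer review session, students could run a check like this on their exported pages before exchanging portfolios, making missing alt text visible early rather than at assessment time.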

Data-based decisions that rely on indirect measures

The responses from students revealed that they appreciate the mix of theoretical and applied approaches to the field that they experienced in the undergraduate program. However, this feedback has been so uniformly positive that the faculty will be testing a new approach to garnering indirect measures. This approach involves adding a few custom questions to the IDEA evaluation for the undergraduate capstone course: e.g., In reflecting on your experience in the Professional and Technical Writing program, what feedback do you have? What would you like to see more of? What would you like to see less of? This approach allows graduating seniors an anonymous way to provide reflective feedback and suggestions to the professors who grade their work. Further, we are maintaining relationships with a group of strong alumni to provide feedback re: the fit of the undergraduate curriculum with their initial experiences in the workplace or graduate school. This, too, provides an opportunity for feedback without power dynamics that could potentially reduce constructive criticism and suggestions.

2014-15

Data-based decisions correlated to direct measures

Demonstration of the three literacies was consistently stronger than the performance reported last year. The stronger performance in basic and rhetorical literacies noted in last year’s assessment continued this year. Compared to previous years, performance in ethical and critical literacies showed marked improvement, which we largely attribute to redesigned courses (ENGL 4410, 5400, 5410, 5490) in which students have more numerous and in-depth opportunities to develop these literacies. We expect this improvement to continue as students presently engaging in the redesigned curriculum go on to develop portfolios for assessment in their senior capstone class.

Overall, we are pleased with the improvements reflected in the portfolios assessed this year.

2013-14

Data-based decisions correlated to direct measures

Based on our data regarding the three literacies, we are incorporating more explicit references to the literacies in our courses, starting with ENGL 3400 in academic year 2015-16. This will improve the students' experience in two ways:

  1. Students will already understand how those literacies manifest in their work by the time they are asked to focus on the literacies in the capstone class (ENGL 5430).
  2. Being able to discuss their work in terms of the literacies may give our students an advantage when they compete for jobs. It will give them a specific, substantive, and fresh way to answer the classic job interview questions: “Why should we hire you?” and “What do you think makes you better than our other applicants?” The job applicant can reply, “USU’s program has trained me in different literacies, all of them vital to your workplace. Would you like me to explain?”

Almost all assessed portfolios adequately or excellently demonstrated that students know how to apply visual design principles, engage in primary and secondary research, select appropriate technologies, and select communication strategies appropriate for the needs of particular situations and users. Technological skills were especially strong, with nine of the twelve portfolios ranking as exemplary.

However, ethical literacy was poor or missing in approximately half of the portfolios. The faculty have identified ethical literacy as an area of focus for the program. Efforts are underway to more explicitly and critically address ethics in upper-division undergraduate courses, such as document design (ENGL 4410) and digital media production (ENGL 5410). In the fall of 2014, both of these courses involved viewing technical communication topics through the lens of social justice, incorporating critical reflections and ethics-relevant readings into course design. Faculty members are discussing how to better assess, teach, and/or emphasize grammar, style, and mechanics, as well as students’ understanding of how stakeholders shape effective discourse.

2012-13

Data-based decisions correlated to direct measures

During 2012-13, the Technical Communication emphasis focused on re-organizing its PhD program and developing new curriculum and procedures for it. Therefore, it did not conduct an undergraduate assessment.

2011-12

Data-based decisions correlated to direct measures

During 2011-12, the Technical Communication emphasis focused on re-organizing its PhD program and developing new curriculum and procedures for it. Therefore, it did not conduct an undergraduate assessment.

2010-11

Data-based decisions correlated to direct measures

During 2010-11, the Technical Communication emphasis experienced personnel changes affecting the assessment process, so no data were collected for this year.