Teaching and Learning Forum 2013 [Refereed papers]

Evaluating the impact of the Learning Centre on student learning and satisfaction
Fernando F. Padró and Anita Frederiks
University of Southern Queensland
Email: firstname.lastname@example.org, email@example.com
This paper discusses the importance of evaluating the impact of the learning centre on student learning and satisfaction at a regional university with a significant online presence. The foci of this article are the aspirations and challenges in creating a database to begin a formal self-evaluation process to help determine the benefits that student learning support programs have on student learning accomplishments in academic programs. An argument is made for how to evaluate a mature student learning support program in an era of change and high accountability expectations, and how this framework will shape the creation and use of a database using existing data heretofore not collected, with potential capacity for linkages to other campus student record databases.
From an institutional perspective, the issue becomes one of how to utilise student-focused data in evaluating academic and support programs targeted at improving student learning (or at least performance). The challenge and potential limitation is that '[s]ome of the more difficult to measure aspects of student success are the degree to which students are satisfied with their experience and feel comfortable and affirmed in their learning environment' (Kuh et al., 2007, p. 8). To mitigate some of the difficulties of measuring satisfaction, Krause and Coates (2008), suggestive of Tinto's (1987) view of formal and informal systems, argue for a broad-based, multi-dimensional definition of student engagement. Gray and Daymond (2010) argue for expanding the definition even further by adding a holistic student-campus engagement dimension that promotes connections to the university which stimulate personal development and student motivation, viewed from a customer satisfaction prism. Their rationale is that student satisfaction cannot be measured in a transaction-specific analysis; student engagement adds a dimension of the student in the roles of learner and member of the university community, linking to the institution's service quality dimension. In contrast, Hu and Kuh (2002) proposed a more limited academic performance-focused definition of student engagement: 'the quality of effort students themselves devote to educationally purposeful activities that contribute directly to desired outcomes ...' (p. 555).
Coates and Ransom (2011) define support 'broadly as the university's interaction with a student, whether it be with academic or service professional staff, that enhances the study experience' (p. 2). Drilling down to a more specific level, the definition of a learning assistance centre within a learning assistance program (LAP) environment has changed over the years (Truschel & Reedy, 2009). According to the Council for the Advancement of Standards in Higher Education [CAS] (2012), LAPs help students to succeed academically, facilitate student development, and develop in students appropriate strategies to increase learning efficiency. These programs usually provide individualised instruction in the form of tutoring, mentoring, academic coaching, and counselling, thus operating at the crossroad of academic affairs and student services (CAS, 2012). Understanding what learning centre activities, as distinguished from other LAP activities or units, entail as part of instilling knowledge requires an understanding of the broader learning environment (institutional, social, and educational) in which learning takes place (Entwistle, 2009). Areas needing further research are how and what type of data has to be gathered to measure student learning and improve LAPs because, as Trammell (2005, as cited in CAS, 2012) points out, LAPs have to demonstrate effectiveness and not only that they are providing services to students. Effectiveness needs to be viewed as to how well the LAP aligns with and supports the institutional mission and, in so doing, complements the teaching happening in the university's programs.
An emphasis on online course offerings provides the additional challenge of demonstrating LAP effectiveness for online learners as well as face-to-face ones. A further challenge is added when the online program is international in nature. A third challenge may also come from pursuing an OpenCourseWare (OCW) strategy in online learning (cf. Huijser, Bedford, & Bull, 2008). Learning and learning assistance can be seen from the lens of student issues in general, as per Krause, Coates and James (2005, as cited in Msweli, 2012, p. 98), and/or more specifically from a support of international students perspective. Online distance learning (ODL) goals converge with those of internationalisation (Msweli, 2012), even if Elkin, Devjee, and Farnsworth's (2005) and Elkin, Farnsworth, and Templer's (2008) dimensions of internationalisation place the support of international students as a dimension occupying lesser importance in terms of rank order of investment and strategic prioritisation. Their findings regarding the lower priority may reflect the more traditional thinking evidenced in practice that LAP activities are primarily based on tutorial and workshop programs (Truschel & Reedy, 2009), be these face-to-face or online generic skills development workshops, subject specific task-based assistance/support, ancillary subject tutoring, or home-made or off-the-shelf web-based tutorial modules and programs. If the assumptions made by Elkin, Devjee, and Farnsworth (2005) and Elkin, Farnsworth and Templer (2008) are true, the premise that LAP is of secondary importance is myopic because AUSSE data suggest that '[i]ndividualisation is a key component of successful support – students' perceptions that the assistance meets their specific needs increases student satisfaction and consequently retention' (Coates & Ransom, 2011, p. 2).
Equally important is Coates and Ransom's (2011) observation that disjuncts between the support students need to meet their goals and what universities provide may have adverse consequences, ranging from unstated dissatisfaction to levying formal complaints to 'voting with their feet' and moving elsewhere or leaving higher education entirely. All three challenges identified above have potential adverse consequences for a university. At the least, dissatisfaction has the potential to create negative 'word-of-mouth'. This can impact future recruitment and create negative data points in quality assurance (QA) reporting of institutional performance. In the worst case scenario, students permanently leave the institution, costing revenue and adding to the cost of doing business because of the need to find replacement students.
The one-to-one (or small group) consultations with an academic staff member are delivered face-to-face, over the phone or via email. At the Toowoomba campus, bookings by on-campus, online and external students can be made online. Bookings can be made in person at all three University sites. The consultations are only available during standard working hours, local time. Bookings are encouraged, but drop-ins are catered for if time is available. This encourages students who are studying off campus to book phone consultations during their "lunch hours" or book asynchronous email consultations.
There are approximately eight workshops each for academic learning skills and mathematics, covering topics such as time management, writing academically, grammar, using a scientific calculator and mathematics refreshers, just to name a few. The online resources include mathematics self-tests that allow students to self-diagnose their mathematical ability and then develop a study plan to build the knowledge required. Other online resources include short "quick tips" flyers, larger self-paced content documents and short multimedia presentations for troublesome concepts.
At present, USQ, as part of a University Participation & Partnership Program (UPPP) grant, is looking at its support program offerings as a means of improving student retention rates and increasing the number of domestic students from low socio-economic status (SES) backgrounds. The framework of this program, called the Student Personalised Academic Road to Success (SPARS):
... facilitates student academic success and experiences by connecting and formalising essential informal academic learning support, non-academic student support, administrative and strategic quality enhancement processes to a single support point. This is not a 'one-size fits all' solution but one of many in USQ's suite of student support and services initiated to increase student retention/progression as well as to enhance students' experience throughout their journey in the university. (Kek, 2012, p. 1)

SPARS' objective is one of integrating programs such as those provided by TLC, the Library and various elements from the University's student services sector (Figure 1). One of its two outcomes is enhanced measurable student engagement in academic study skills development (Kek, 2012). To meet this outcome, it is in the University's best interest to be able to measure the impact and influence these different non-academic support components have on student learning as measured by the Course Experience Questionnaire (CEQ).
Figure 1: Conceptual framework for USQ student retention (Source: Kek, 2012)
The rationale for the changes occurring at USQ echoes Holt, Palmer, and Challis' (2011) views regarding the profound change occurring for academic program development from the student support perspective. This is because at USQ both elements are housed within the same Teaching and Learning unit. Therefore, changes are part of our 'search for long-term strategic benefits' (Holt, Palmer, & Challis, 2011). From a standards-based evaluation approach based on explicitness and commitment to procedures and values (Stake, 2004), the question is how to identify indicators and measures that are able to provide meaningful information. The CAS (2012, p. 8) standards dictate that LAPs must be intentionally designed; guided by theories and knowledge of learning and development; integrated into the life of the institution; reflective of developmental and demographic profiles of the student population; responsive to needs of individuals, populations with distinct needs, and relevant constituencies; and delivered using multiple formats, strategies, and contexts.
An evaluation framework for a maturing or mature learning centre, given the CAS standards and the views of Kuh et al. (2007), Scott (2003), and Meyer (2002, 2006), frames intentional, interconnected, and diversified learning support activities through the institutional lenses of student engagement and satisfaction, meeting individual needs of domestic and international students, value to the university and alignment to mission/vision, and TEQSA standards, along with the larger social and policy concerns driving change. This is the goal driving the creation of the database and identification of indicators and metrics. Figure 2 demonstrates the evaluative framework once fully developed. At this point, the TLC can only be described as having a 'developing' framework precisely because we are at the identifying and developing antecedent and transactional data levels (Stake, 2004), beginning to form the processes and values that generate meaningful data for the unit.
Figure 2: Where USQ has to go: Evaluation framework for a mature LAP
LTS, as the provider of academic and student support programs, has been receiving data in electronic and paper forms for a minimum of three years to track support activities in academic language and learning and mathematics, but the data were not formally collected for analysis and evaluation purposes until this past year. During this past year, all disparate data from the three campuses were collected and placed in an Excel spreadsheet. One academic collected the data and cleaned it: she ensured accuracy by cross-checking entries and, as needed, using other data to confirm them; established consistency in reporting from all three campuses; created algorithms to generate results from the collected data; and determined what data are available for comparative purposes and/or trend analysis.
At this time, constructing the database has become a gap analysis to identify holes and limitations that need to be addressed. In other words, creating and performing due diligence on existing data provides a means to determine what other connections to databases are needed and/or what additional data can be captured to meet needs.
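The consolidation and cross-checking steps described above were performed in Excel; as a purely illustrative sketch of the same logic, they could be automated along the following lines. The function and column names here are hypothetical, not the actual USQ spreadsheet headings:

```python
import pandas as pd

def consolidate(frames):
    """Merge per-campus consultation records (e.g. read via pd.read_excel)
    into one master table and flag entries needing manual cross-checking.
    Illustrative only: column names are assumptions, not the real headings."""
    master = pd.concat(frames, ignore_index=True)

    # Establish consistency in reporting: normalise free-text campus labels.
    campus_map = {"twb": "Toowoomba", "spr": "Springfield", "fc": "Fraser Coast"}
    master["campus"] = (master["campus"].str.strip().str.lower()
                        .map(campus_map).fillna(master["campus"]))

    # Flag records for manual cross-checking against other data sources:
    # a missing consultation duration, or an apparent duplicate booking.
    master["needs_check"] = (master["minutes"].isna()
                             | master.duplicated(["student_id", "date_time"],
                                                 keep=False))
    return master
```

A gap analysis of the flagged rows then shows where the paper-based records diverge from the online booking system's records.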
Data regarding one-on-one (or small group) sessions comes from the online booking system at the Toowoomba campus. Data from the other two locations are recorded manually, on paper. Currently this booking system is connected to the learning management system (LMS), only allowing access to certain parts of each student record.
When making a booking with Toowoomba staff, students are required to be logged into the LMS. Using the LMS, students are asked to indicate (voluntarily) the course for which they need assistance, give their contact number (the only mandatory field) and register. Students booking an academic language consultation have to identify what specific type of support they want.
Paper forms from the other locations have been inputted manually into the Excel master file. Figure 3 below identifies the data fields used to categorise these data. As a result of this exercise, we have found that data for these categories from the three campuses are available from semester 2, 2009 onward while data for the Toowoomba campus is available from semester 2, 2008 onward.
Semester and year (e.g. S1, 2012)
Type of week (e.g. end of semester break)
Broad support area (Academic Language and Learning, Mathematics)
Type of contact (drop in, booked)
Actual time spent with student (in minutes)
Contact type (face-to-face, phone, email)
Academic Language and Learning specific assistance option (e.g. logical presentation of ideas)
Learner advisor comment (optional)
Course level (undergraduate or postgraduate)
Faculty from which the course is offered
Enrolment type (on-campus, external or online)
Enrolment location (Toowoomba, Springfield, Fraser Coast)
Date and time
Appointment status (complete, did not show)
Student first name
Student last name
Contact details (supplied by students in online booking system)
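As a purely illustrative sketch, the categorical fields in Figure 3 could be expressed as a small validation table so that out-of-category values are surfaced for scrubbing. The field names below are hypothetical renderings of the Figure 3 labels, not the actual database schema:

```python
# Allowed values taken from the Figure 3 field list above.
ALLOWED = {
    "broad_support_area": {"Academic Language and Learning", "Mathematics"},
    "type_of_contact": {"Drop in", "Booked"},
    "contact_type": {"Face to face", "Phone", "Email"},
    "course_level": {"Undergraduate", "Postgraduate"},
    "enrolment_type": {"On-campus", "External", "Online"},
    "enrolment_location": {"Toowoomba", "Springfield", "Fraser Coast"},
    "appointment_status": {"Complete", "Did not show"},
}

def invalid_fields(record):
    """Return the names of categorical fields in a consultation record whose
    values fall outside the Figure 3 categories (targets for data scrubbing)."""
    return [field for field, allowed in ALLOWED.items()
            if field in record and record[field] not in allowed]
```

Running such a check over the manually entered paper records is one way to locate the inconsistencies that hand transcription introduces.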
Creating a database from these sources has presented major challenges in data scrubbing to assure the accuracy and completeness of the data. The major concerns that have arisen are:
More cannot be done yet because of the changing context of LAP as envisioned in the SPARS Project and its impact on TLC activities. As the project matures, Patton's framework for clarifying goals (1997, as cited in Stufflebeam & Shinkfield, 2007) will help provide the needed focus with respect to outcomes, analytics and indicators, and data collection and analysis plans. The mid-term plan is to link the larger data collection and analysis requirements of the new SPARS Program to be able to generate evidence of the benefits to student persistence and enhanced student access opportunities, so that we can see the strengths of the interconnections of this approach as envisioned in Figure 2.
This paper brings together two issues: the impact LAPs have on student learning and the ability to measure that impact. TEQSA makes clear that expectations for performance have to be positivistic and somewhat framed from a cost-benefit view. However, it does take LAP activity for granted in the sense that the important things are the outcomes of learning, persistence, and graduation. Any evaluation process of LAP activities must keep this in mind. The challenge for universities is that student engagement is multi-dimensional and that each dimension has a direct or indirect impact on student learning as measured by grades, engagement, AND satisfaction. Ergo the need to identify and use these interconnections to identify analytics that add to what these activities provide for students. USQ's exercise in creating a database to initiate a formalised evaluation structure of its LAP is an example of the challenges and steps required to understand and work through Donaldson's (2009) five points of what makes for credible evidence along with sound decision-making, feedback and reporting loops.
Australian Council for Educational Research [ACER] (n.d.). Broadening staff involvement in student learning. http://www.acer.edu.au/documents/aussereports/AUSSE_EG_Broadening.pdf
Baizerman, M. (2009). Deepening understanding of managing evaluation. In D.W. Compton & M. Baizerman (Eds.), Managing program evaluation: Towards explicating a professional practice. New Directions for Evaluation, 121, 87-98. http://au.wiley.com/WileyCDA/WileyTitle/productCd-0470482346.html
Challis, D., Holt, D. & Palmer, S. (2009). Teaching and learning centres: Towards maturation. Higher Education Research & Development, 28(4), 371-383. http://dx.doi.org/10.1080/07294360903067021
Coates, H. & Ransom, L. (2011). Dropout DNA, and the genetics of effective support. AUSSE Research Briefings, vol. 11. Melbourne: Australian Council for Educational Research. http://research.acer.edu.au/cgi/viewcontent.cgi?article=1000&context=ausse
Coates, H. (2006). Excellent measures precede measures of excellence. Proceedings of AUQF2006 Quality Outcomes and Diversity. Melbourne: Australian Universities Quality Agency. http://www.auqa.edu.au/files/publications/proceedings_2006_final.pdf
Commonwealth of Australia (2011). Tertiary Education Quality Standards Agency Act of 2011, Higher Education Standards Framework (Threshold Standards). Canberra: Author. http://www.comlaw.gov.au/Details/F2012L00003/Download
Council for the Advancement of Standards in Higher Education [CAS] (2012). CAS professional standards for higher education. (8th ed.). Washington, DC: Author.
Donaldson, S. I. (2009). Epilogue: A practitioner's guide for gathering credible evidence in the evidence-based global society. In S. I. Donaldson, C. A. Christie & M. M. Mark (Eds.), What counts as credible evidence in applied research and evaluation practice? Los Angeles: SAGE.
Elkin, G., Devjee, F. & Farnsworth, J. (2005). Visualising the "internationalisation" of universities. International Journal of Educational Management, 19(4), 318-329. http://dx.doi.org/10.1108/09513540510599644
Elkin, G., Farnsworth, J., & Templer, A. (2008). Strategy and the internationalisation of universities. International Journal of Educational Management, 22(3), 239-250. http://dx.doi.org/10.1108/09513540810861874
Entwistle, N. (2009). Teaching for understanding at university: Deep approaches and distinctive ways of thinking. Houndmills Basingstoke, UK: Palgrave Macmillan.
Gray, D. & Daymond, J. (2010). The influence of student engagement levels on satisfaction and behavioural intentions. In P. Ballantine & J. Finsterwalder (Eds.), Proceedings of the Australian and New Zealand Marketing Academy Annual Conference, 29 November - 1 December 2010. Christchurch, NZ: University of Canterbury, Department of Management, College of Business and Economics. http://anzmac.info/conference/2010/pdf/anzmac10Final00406.pdf
Holt, D., Palmer, S. & Challis, D. (2011). Changing perspectives: Teaching and learning centres' strategic contributions to academic development in Australian higher education. International Journal for Academic Development, 16(1), 5-17. http://dx.doi.org/10.1080/1360144X.2011.546211
Hu, S. & Kuh, G. D. (2002). Being (dis)engaged in educationally purposeful activities: The influences of student and institutional characteristics. Research in Higher Education, 43(5), 555-575. http://dx.doi.org/10.1023/A:1020114231387
Huijser, H., Bedford, T. & Bull, D. (2008). OpenCourseWare, global access, and the right to education: Real access or marketing ploy? International Review of Research in Open and Distance Learning, 9(1), 1-13. http://www.irrodl.org/index.php/irrodl/article/view/446
Kek, Y. C. M. A. (2012, 2 August). Integrated Student Learning Journey Initiative (ISLJI) Final Paper: The Integrated Student Learning Journey - Student Personalised Academic Road to Success (SPARS): A Framework for the Provision of Adaptive and Student-Directed, On-line, On-demand, Integrated Study Support to Students. Paper submitted to the Director, Learning and Teaching Support, Office of Pro-Vice Chancellor (Learning, Teaching and Quality), University of Southern Queensland.
Krause, K. L. & Coates, H. (2008). Students' engagement in first-year university. Assessment and Evaluation in Higher Education, 33(5), 493-505. http://dx.doi.org/10.1080/02602930701698892
Kuh, G. D., Kinzie, J., Buckley, J. A., Bridges, B. K. & Hayek, J. C. (2007). Special issue: Piecing together the student success puzzle: Research, propositions, and recommendations. ASHE Higher Education Report, Volume 32, Issue 5. San Francisco: Jossey-Bass. http://www.josseybass.com/WileyCDA/WileyTitle/productCd-0787997765.html
Mark, M. M. (2012). Program life cycle stage as a guide to evaluation decision making: Benefits, limits, alternatives, and future directions. American Journal of Evaluation, 33(2), 277-280. http://intl-aje.sagepub.com/content/33/2/263.full.pdf+html
Marshall, S. J., Orrell, J., Cameron, A., Bosanquet, A. & Thomas, S. (2011). Leading and managing learning and teaching in higher education. Higher Education Research & Development, 30(2), 87-103. http://dx.doi.org/10.1080/07294360.2010.512631
Meyer, K. A. (2002). Quality in distance education: Focus on online learning. ASHE-ERIC Higher Education Report, Vol. 29, Number 4. San Francisco: Jossey-Bass. http://au.wiley.com/WileyCDA/WileyTitle/productCd-0787963496.html
Meyer, K. A. (2006). The road map to cost-efficiencies of online learning. ASHE Higher Education Report, Vol. 32, Issue 1. San Francisco: Jossey-Bass. http://dx.doi.org/10.1002/aehe.3201
Msweli, P. (2012). Mapping the interplay between open distance learning and internationalisation principles. The International Review of Research in Open and Distance Learning, 13(3), 97-116. http://www.irrodl.org/index.php/irrodl/article/view/1182
Padró, F. F. (2012). STEM: A job churning perspective. In C. P. Veenstra, F. F. Padró & J. Furst-Bowe (Eds.), Advancing the STEM agenda: Quality Improvement supports STEM, 193-204. Milwaukee, WI: ASQ Press.
Pascarella, E. T. & Terenzini, P. T. (2005). How college affects students. Volume 2: A third decade of research. San Francisco: Jossey-Bass.
Reid, I. (2005). Quality assurance, open and distance learning, and Australian universities. International Review of Research in Open & Distance Learning, 6(1), 1-11. http://www.irrodl.org/index.php/irrodl/article/view/222/305
Scott, G. (2003). Effective change management in higher education. EDUCAUSE Review, 38(6), 64-80. http://www.educause.edu/ero/article/effective-change-management-higher-education
Stake, R. E. (2004). Standards-based & responsive evaluation. Thousand Oaks, CA: SAGE.
Stufflebeam, D. L. (2001). Evaluation models: New directions for evaluation, no. 89. San Francisco: Jossey-Bass.
Stufflebeam, D. L. & Shinkfield, A. J. (2007). Evaluation theory, models, & applications. San Francisco: Jossey-Bass.
Taylor, K. L. (2005). Academic development as institutional leadership: An interplay of person, role, strategy, and institution. International Journal for Academic Development, 10(1), 31-46. http://dx.doi.org/10.1080/13601440500099985
Tinto, V. (1987). Leaving college: Rethinking the causes and cures of student attrition. Chicago: The University of Chicago Press.
Truschel, J. & Reedy, D. L. (2009). National survey - What is a learning center in the 21st century? Learning Assistance Review, 14(1), 9-22. http://www.eric.ed.gov/ERICWebPortal/contentdelivery/servlet/ERICServlet?accno=EJ839147
University of Southern Queensland (n.d.). Strategic Plan 2009-2013: Creating sustainable futures ... Embracing the digital education revolution. Toowoomba, QLD: Author.
Volkov, B. B. (2011). Beyond being an evaluator: The multiplicity of roles of the internal evaluator. In B. B. Volkov & M. E. Baron (Eds.), Internal evaluation in the 21st century. New Directions for Evaluation, 132, 25-42. http://www.josseybass.com/WileyCDA/WileyTitle/productCd-1118204301.html
Please cite as: Padró, F. F. & Frederiks, A. (2013). Evaluating the impact of the Learning Centre on student learning and satisfaction. In Design, develop, evaluate: The core of the learning environment. Proceedings of the 22nd Annual Teaching Learning Forum, 7-8 February 2013. Perth: Murdoch University. http://ctl.curtin.edu.au/professional_development/conferences/tlf/tlf2013/refereed/padro.html
Copyright 2013 Fernando F. Padró and Anita Frederiks. The authors assign to the TL Forum and not for profit educational institutions a non-exclusive licence to reproduce this article for personal use or for institutional teaching and learning purposes, in any format, provided that the article is used and cited in accordance with the usual academic conventions.