Institutional Effectiveness and Assessment Framework

Institutional Effectiveness and Assessment Framework at the American University of Madaba (AUM) for 2024

 

Preface: 

“Institutional effectiveness is the capacity of an institution to assess, verify, and enhance the fulfillment of its mission and purposes, giving primary focus to the attainment of its educational objectives.”

Assessment of learning is based on what students are expected to gain by the time they complete their academic program. The process of understanding what and how students are learning focuses on the course, competency, program, and institutional levels.

 

  • Understanding student learning is based on direct and indirect methods, applied both to current students and to alumni. 

  1. Direct methods measure the knowledge, abilities, skills, attitudes, and values that students acquire through their studies. 

  2. Indirect methods infer what students have learned from the opinions and reflections of others, including alumni.

 

The direct and indirect methods together tell the university what students are learning. The following are examples of the different kinds of student learning evidence collected by the university. 

 

Methods that can provide DIRECT EVIDENCE of student learning 

  • Evaluation of course learning outcomes  

  • Locally developed tests (e.g., competency tests) 

  • Tests 

  • Internally and externally juried review of student projects

  • Performance on national licensure examinations 

  • Collections of student work 

  • Graduation projects 

 

Direct methods are implemented through the CLO and PLO analysis and assessment forms.

Methods that can provide INDIRECT EVIDENCE of student learning 

  • Alumni and employer surveys 

  • Student, faculty, and staff surveys; focus groups 

  • Exit surveys 

  • Study abroad experience 

  • Graduate follow-up studies; percentage of students who go on to graduate school 

  • Student performance in competitions 

  • Retention and transfer studies; employment statistics 

 

Indirect methods are implemented through the program annual monitoring form and the program review assessment form.

 

A. Course Learning Outcome Analysis and Assessment Guidelines

Article 1: The words and phrases below shall have, whenever they appear in these regulations, the meanings indicated below:

 

CLOs: Course Learning Outcomes.

LoC: List of Competencies.

PLOs: Program Learning Outcomes.

Assessment tools: The methods used to evaluate students’ performance, such as exams, in-class assignments, homework, projects, etc.

CLO analysis and assessment form: An Excel sheet designed to be used by instructors to analyze and evaluate students’ achievement of the course learning outcomes. 

 

Article 2: The student learning outcomes of every course are evaluated through the CLO analysis and assessment form. 

 

Article 3: Every academic department has defined a list of competencies (LoC) that students will develop or strengthen by completing the courses offered in their academic programs.

 

Article 4: In every course syllabus, instructors should define related CLOs that contribute to the achievement of their department’s LoC and that can be assessed through the covered topics and the planned assessment tools.

 

Article 5: Each CLO of Article 4 should be matched to one or more competencies of Article 3 in a CLO-LoC matrix for every course in the CLO analysis and assessment form.

 

Article 6: The assigned grade in every assessment tool should be distributed over one or more CLOs. If an assessment tool covers more than one CLO, it should be divided into sections, where each section relates to a single CLO.  

Article 7: Upon the completion of every CLO analysis, students’ grades in each section should be entered into the grade sheet. Instructors shall not enter total grades; instead, the grade sheet calculates total grades from the grades entered in each section of the assessment tool. 

 

Article 8: Upon the completion of the course, the grade sheet automatically evaluates the CLO achievement ratios.

 

Article 9: The average achievement of the CLOs will be reflected in the CLO-LoC matrix defined in Article 5 to evaluate the students’ achievement of each of the competencies of Article 3.
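
Articles 6 through 9 describe what is, in effect, a weighted-average computation. The short Python sketch below is only an illustration of that arithmetic under assumed inputs: the section names, grades, and CLO-LoC mapping are hypothetical, and the actual calculation is performed by the Excel-based CLO analysis and assessment form.

```python
# Hypothetical illustration of the CLO computation described in Articles 6-9.
# Each assessment-tool section relates to a single CLO (Article 6); the grade
# sheet sums section grades into totals (Article 7), derives CLO achievement
# ratios (Article 8), and reflects them onto competencies via the CLO-LoC
# matrix (Article 9). All names and numbers below are invented examples.

# Section -> (CLO, maximum grade): e.g., a midterm split into two sections.
sections = {
    "midterm_q1": ("CLO1", 10),
    "midterm_q2": ("CLO2", 10),
    "project":    ("CLO2", 20),
    "final_q1":   ("CLO1", 30),
    "final_q2":   ("CLO3", 30),
}

# One student's grades per section (hypothetical numbers).
grades = {"midterm_q1": 8, "midterm_q2": 6, "project": 15,
          "final_q1": 24, "final_q2": 21}

# Article 8: CLO achievement ratio = earned grade / maximum grade, per CLO.
earned, maximum = {}, {}
for section, (clo, out_of) in sections.items():
    earned[clo] = earned.get(clo, 0) + grades[section]
    maximum[clo] = maximum.get(clo, 0) + out_of
clo_ratio = {clo: earned[clo] / maximum[clo] for clo in earned}

# Article 9: reflect CLO achievement onto competencies via the CLO-LoC matrix
# (each CLO maps to one or more competencies, per Article 5).
clo_loc = {"CLO1": ["C1"], "CLO2": ["C1", "C2"], "CLO3": ["C2"]}
per_competency = {}
for clo, comps in clo_loc.items():
    for c in comps:
        per_competency.setdefault(c, []).append(clo_ratio[clo])
competency_avg = {c: sum(v) / len(v) for c, v in per_competency.items()}

print(clo_ratio)       # {'CLO1': 0.8, 'CLO2': 0.7, 'CLO3': 0.7}
print(competency_avg)  # {'C1': 0.75, 'C2': 0.7}
```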

 

Article 10: The results of the activities defined in Articles 8 and 9 shall be analyzed by the instructor to identify areas for improving the course design.

 

Article 11: The cumulative average achievement of each competency across all courses will be reflected in the PLO analysis and assessment form to evaluate the cumulative achievement of the LoC at the program level.

 

Article 12: Every academic program has defined program learning outcomes (PLOs) that contribute to the achievement of the mission and vision of the Department.

 

Article 13: All PLOs in Article 12 should be matched in a matrix to the related competencies in the LoC of Article 3.

 

Article 14: The average achievement of each competency of Article 11 will be reflected in the PLO-LoC matrix defined in Article 13 to evaluate the students’ cumulative achievement of the PLOs and, in turn, of the vision and mission of the academic program.
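
The aggregation in Articles 11 through 14 can likewise be pictured as two averaging steps. The sketch below is a minimal illustration under assumed data: the course names, achievement ratios, and PLO-LoC mapping are hypothetical, and the authoritative computation lives in the PLO analysis and assessment form.

```python
# Hypothetical illustration of Articles 11-14: cumulative competency
# achievement across courses, then PLO achievement via the PLO-LoC matrix.
# All course names, ratios, and mappings below are invented examples.

# Per-course competency achievement ratios produced by the CLO forms
# (Article 11 input).
course_competency = {
    "MATH101": {"C1": 0.75, "C2": 0.70},
    "MATH202": {"C1": 0.80},
    "MATH303": {"C2": 0.60},
}

# Article 11: cumulative average achievement of each competency.
totals = {}
for ratios in course_competency.values():
    for comp, r in ratios.items():
        totals.setdefault(comp, []).append(r)
loc_avg = {comp: sum(v) / len(v) for comp, v in totals.items()}

# Article 13: each PLO is matched to related competencies (hypothetical).
plo_loc = {"PLO1": ["C1"], "PLO2": ["C1", "C2"]}

# Article 14: PLO achievement as the average of its related competencies.
plo_avg = {plo: sum(loc_avg[c] for c in comps) / len(comps)
           for plo, comps in plo_loc.items()}

print(loc_avg)  # {'C1': 0.775, 'C2': 0.65}
print(plo_avg)  # {'PLO1': 0.775, 'PLO2': 0.7125}
```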

 

Article 15: The results of the activities defined in the previous articles should be analyzed and discussed by the related academic department council and shall be maintained at the department for improvement.

 

Article 16: Each academic department is responsible for implementing these regulations. 

 

B. Program Learning Outcome Analysis and Assessment Guideline

 

Article 1: The words and phrases below shall have, whenever they appear in these regulations, the meanings indicated below:

 

CLOs: Course Learning Outcomes.

LoC: List of Competencies.

PLOs: Program Learning Outcomes.

Assessment tools: The methods used to evaluate students’ performance, such as exams, in-class assignments, homework, projects, etc.

PLO analysis and assessment form: An Excel sheet designed to be used by instructors to analyze and evaluate the program learning outcomes.

 

 

Article 2: These regulations shall serve as guidelines that contribute to the program assessment framework used by AUM.  

 

Article 3: The Head of Department is responsible for collecting the CLO analysis forms from all instructors and reflecting them in the PLO analysis and assessment form.

 

Article 4: Each Department council shall discuss the PLO analysis findings.

 

Article 5: The PLO analysis and assessment form findings are utilized to review the program, identify areas of improvement, and recommend changes to the curriculum if required.

 

Article 6: Program assessment is conducted every year.  

 

Article 7: Each academic department is responsible for implementing these guidelines.


Article 8: Formative assessment procedures: the formative assessment rubric should be applied at least twice, once before the midterm exam (week 4 or 5) and once after it (week 10 or 11). In each assessment, the instructor should mark the criteria according to the form below to recommend a grade for each criterion.

 

Rating scale: Excellent (85% and above); Good Pass (75-84%); Satisfactory Pass (65-74%); Barely Pass (55-64%); Critical (40-54%); Fail (0-39%).

Criteria and what the instructor marks for each:

  • Attendance: percentage of attendance.

  • Students’ awareness of relation to other courses in the study plan: prerequisite, post-requisite, and elective courses related to the course topics.

  • Course-related topics: the topics are clearly understood.

  • Awareness of real-world implications: awareness of related news, events, companies, career opportunities, and/or any related trends.

  • Rationale: an excellent and convincing rationale.

  • Course difficulty: students’ remarks about difficult-to-comprehend topics.

  • Other observations.
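
The band boundaries above amount to a simple threshold lookup. The following Python sketch only restates the rubric’s cut-offs as code; the helper function itself is an illustration, not part of the official procedure.

```python
# Map a percentage score to the formative assessment rubric bands above.
# The cut-offs follow the rubric; the helper itself is only an illustration.
def rubric_band(percentage: float) -> str:
    if percentage >= 85:
        return "Excellent"
    if percentage >= 75:
        return "Good Pass"
    if percentage >= 65:
        return "Satisfactory Pass"
    if percentage >= 55:
        return "Barely Pass"
    if percentage >= 40:
        return "Critical"
    return "Fail"

print(rubric_band(88))  # Excellent
print(rubric_band(52))  # Critical
```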

 

 

Student success comprehensive rubric: The University Assessment Committee (UAC) compiles readings from various assessment tools into a unified rubric, a strategic approach to evaluating student success at AUM. This process aims to create a holistic understanding of student achievement across multiple dimensions: 

  • Unified Rubric Development:
    • By integrating readings from different assessment tools, the UAC aims to develop a comprehensive evaluation rubric that captures the multifaceted nature of student success. The proposed rubric monitors and evaluates students’ success along the dimensions of academic success, community engagement, infrastructure, satisfaction, and alumni success.
  • Holistic Evaluation of Student Success:
    • The move towards a holistic evaluation rubric recognizes that student success is not solely based on academic achievements but also includes personal development, social engagement, and preparedness for post-graduation challenges. This approach aligns with contemporary educational objectives that emphasize the development of well-rounded individuals.
  • Benefits of a Unified Rubric:
    • A unified rubric offers several benefits, including standardized evaluation criteria, easier comparison of data across different cohorts, and the ability to identify trends and areas for improvement. It can also facilitate targeted interventions by highlighting specific areas where students may need additional attention.
  • Implications for AUM:
    • A unified rubric enhances the institution's ability to monitor and improve student outcomes effectively. It could also contribute to accreditation processes and institutional reputation by demonstrating a commitment to comprehensive quality assurance and student-centred education.
  • Challenges and Considerations:
    • Implementing a unified rubric involves challenges such as ensuring the inclusion of diverse assessment measures, aligning them with the institution's educational goals, and managing the data effectively. Furthermore, it requires ongoing review and adaptation to remain relevant and reflective of the evolving educational landscape.

The UAC’s effort to consolidate assessment tools into a unified rubric reflects a strategic and innovative approach to quality assurance in education, and underscores AUM's commitment to understanding and enhancing student success in a holistic manner, which is crucial for preparing students for the complexities of the modern world. The UAC will review and update the proposed rubric at the end of every academic year.
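
As an illustration of how readings from several tools might be consolidated, the sketch below computes a weighted composite across the five dimensions named above. The weights and readings are invented placeholders; the actual unified rubric and its weighting are defined and reviewed by the UAC.

```python
# Hypothetical sketch of consolidating assessment readings into one composite
# score across the dimensions named above; the weights and readings are
# invented placeholders, not the UAC's actual rubric.
dimension_weights = {
    "academic_success":     0.35,
    "community_engagement": 0.15,
    "infrastructure":       0.15,
    "satisfaction":         0.15,
    "alumni_success":       0.20,
}

# Normalized readings (0-1) compiled from the various assessment tools.
readings = {
    "academic_success":     0.78,
    "community_engagement": 0.64,
    "infrastructure":       0.71,
    "satisfaction":         0.69,
    "alumni_success":       0.75,
}

composite = sum(dimension_weights[d] * readings[d] for d in dimension_weights)
print(f"Unified rubric composite: {composite:.3f}")  # 0.729
```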

 

C. Program Annual Monitoring Report

 

Article 1: The program annual monitoring report (program annual monitoring form) aims to:

  1. Set, control and maintain program academic standards.

  2. Monitor, enhance and manage students’ program assessment with feedback. 

  3. Monitor students’ program performance and progression.

  4. Evaluate the effectiveness of program learning and teaching resources and identify matters requiring attention. 

  5. Identify, promote, and disseminate good practice. 

  6. Gather evidence of local initiatives and progress in relation to the faculty and University strategic plan.

 

Article 2: By the end of every academic year, the Head of Department is responsible for preparing the program annual monitoring report.

 

Article 3: Each Department council shall discuss the program annual monitoring report findings.

 

Article 4: The program annual monitoring report findings are utilized to review the program, identify areas of improvement and recommend changes to the curriculum if required.

 

D. Program Review Assessment Guideline

 

Program Review Assessment (PRA) demonstrates the capability of the academic program to improve academic offerings, student learning, and the student experience through a systematic feedback process involving students, former students, employers, benchmarking, and other relevant constituencies.

 

PRA is conducted every four to five years (using the Program Review Assessment form) by the academic department in collaboration with the Accreditation and QA Department and other units; however, data and information are collected through an ongoing cycle of data collection and analysis.

 

Program Review Assessment results are maintained in the related department. 

D.1 Program Assessment Tools 

  • Program Learning Assessment (refer to program learning assessment guideline)

  • Exit surveys filled in by students who have completed their studies and acquired the degree

  • Drop-out forms filled in by students who decide to leave prior to obtaining their degree

  • Surveys/focus groups conducted by the related academic and administrative units or the Deanship of Students Affairs  

  • Employer surveys related to the market study  

  • Alumni surveys  

  • Evaluation of faculty members by students 

  • Attrition, retention, graduation, and growth rates

  • Graduates’ information: the employment status of alumni, in addition to enrollment in graduate studies, collected by the Deanship of Students Affairs

  • AQACHEI competency test results 

  • Benchmarking against leading universities’ programs

  • Extra-curricular activities assessment 

  • General education assessment 

When: 

  • Exit surveys are filled in by the end of each semester and are kept with the related academic department and/or the Deanship of Students Affairs.

  • Surveys/brainstorming sessions are conducted by the related academic units and/or the Deanship of Students Affairs at the end of the second semester. 

  • Employer surveys related to the market study: once every year, conducted by the Alumni Office and maintained in the Alumni Office and the related academic department (ongoing). 

  • Alumni surveys: once every year, conducted by the Alumni Office and maintained in the Alumni Office and the related academic department.

  • Faculty evaluation by students: online survey completed by all students every semester. 

  • Graduates’ information: information on the employment status of alumni, in addition to enrollment in graduate studies, collected by the Deanship of Students Affairs every semester. 

  • Benchmarking against leading universities’ programs: as part of the Program Review Assessment guidelines. 

 

Services/facilities provided to students 

Surveys to acquire students’ feedback on the services provided and on the infrastructure. 

 

When: Conducted by the Deanship of Students Affairs/relevant department every academic year, at the end of the second semester. 

 

Extra-Curricular Activities

Extra-curricular activities contribute to forming students’ character and developing their skills by providing extra-curricular opportunities for students to participate in.

 

These activities, whether they are clubs, sports clubs, community services, voluntary work, etc., are all assessed to ensure that the activity goals are met efficiently and to inform future improvement.  

 

When: At the end of the activity, by the organizer.

 

D.2 General Education 

 

The general education (GE) component of undergraduate programs reflects the institution’s mission and values, embodies the institution’s definition of an educated person, and prepares students for the world in which they will live. 

 

The GE courses are diverse and extend across arts, humanities, sciences, and the social sciences. 

 

These GE courses fall within four different categories in the study plan: University Requirements, Faculty Requirements, Department Requirements, and Ancillary. 

 

The purpose of the University Requirements is to develop students’ knowledge, skills, habits and capabilities that they will use throughout their lives, to enable them to make well-informed choices that lead to productive lives and responsible participation in society.

 

D.2.1 General Education Assessment

General Education Courses (GE) are identified by the different academic departments. 

 

Courses that are considered part of the GE offering shall clearly state their general education learning outcomes and undergo course learning outcome analysis and assessment. Some GE courses are also part of the subject matter of certain programs; these courses are assessed both as part of the program they belong to and as part of the GE program.

 

Forms

Course Learning Outcome Analysis and Assessment form

Program Learning Outcome Analysis and Assessment form

Program Annual Monitoring form 

Program Review Assessment form

 
