SACS Principles of Accreditation

Section III: COC Core Requirement

3.3.1 The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.



OFF-SITE REVIEW COMMITTEE COMMENTS

The institution indicates that it is in partial compliance. It has established processes for identifying expected outcomes for its educational programs and its administrative and educational support services, assessing whether it achieves these outcomes, and providing evidence of improvement based on analysis of those results. However, the institution indicates that it has not yet matured to the point where it can document continuous improvement as a result of those processes. The institution needs to provide such documentation.



JUDGMENT OF COMPLIANCE

Compliance.

NARRATIVE/JUSTIFICATION FOR JUDGMENT OF COMPLIANCE

The University has implemented an institution-wide process of assessment and evaluation that engages all degree programs and administrative/educational support units. In July 2002, the University President established the University Assessment Committee (UAC) with the charge of assisting all degree programs and administrative/educational support units to initiate and use the assessment process effectively. In addition, the committee reviews outcomes and measurements for all programs and units. The UAC consists of fourteen faculty and administrators from across the University.

University Assessment Committee Members
2003-2005

  • Dr. Carol Waters: Committee Chair; Associate Professor of Political Science and Public Administration; Associate Dean, College of Arts and Sciences
  • Dr. Ron Anderson: Associate Professor of Education, College of Education
  • Dr. Susan Baker: Professor of Nursing; Director, Dr. F.M. Canseco School of Nursing
  • Dr. Jeffrey Cass: Associate Professor of English, College of Arts and Sciences; Associate Provost
  • Conchita Hickey: Executive Director, Programs for Academic Support and Enrichment
  • Dr. Ned Kock: Associate Professor of International Business, College of Business Administration; Interim Chair, Department of Management Information Systems and Decision Science
  • Dr. Juan Lira: Regents Professor of Education, College of Education; Chair, Department of Curriculum and Instruction
  • Veronica Martinez: Director of Institutional Effectiveness
  • Betty Momayezi: Executive Director of Student Life
  • Dr. Dan Mott: Associate Professor of Zoology, College of Arts and Sciences; Associate Dean, College of Arts and Sciences; Chair, Department of Biology and Chemistry
  • Dr. Bonnie Rudolph: Associate Professor of Psychology, College of Arts and Sciences; Director of Counseling Psychology
  • Dr. Chen-Han Sung: Professor of Mathematics, College of Arts and Sciences; Chair, Department of Mathematical and Physical Sciences
  • Mary Treviño: Associate Vice President for Academic Affairs; Director of Title V Program
  • Rodney Webb: Director of Killam Library

The committee is further divided into four subcommittees so that members have the time to review the reports and can develop expertise in specific areas. The four subcommittees are:

Subcommittee 1:
Degree Programs of the College of Arts and Sciences, as well as General Education.
Subcommittee 2:
Degree Programs of the College of Business Administration, the College of Education and the Canseco School of Nursing.
Subcommittee 3:
Administrative Support Units, which include the divisions of Academic Affairs, Finance and Administration, Student Affairs, Institutional Advancement, Public Affairs, and Special Programs, as well as research centers such as the Center for the Study of Western Hemispheric Trade and the Texas Center for Border Economics and Enterprise Development.
Subcommittee 4:
Educational Support Units, which include the Killam Library, the division of International Programs, and the division of Programs for Academic Support and Enrichment (Developmental Studies, Testing, Writing Center, etc.).

A uniform assessment reporting process and template were introduced in Fall 2002 to allow for the efficient use of time and to standardize both information and resources. Informational sessions were conducted with all employees involved in the assessment process to introduce the new format and explain the report submission process. Academic programs submit their assessment plans each semester, while administrative/educational support units submit their plans on an annual (September to August) basis. At the beginning of the academic year, all academic departments meet to review their educational programs, and faculty members of each department lead the assessment efforts in their specific areas. In a similar fashion, all administrative/educational support units annually review and revise their assessment procedures. All assessment plans align with the Institutional Mission as well as with the college/school and department unit missions.

From the initial implementation of the uniform assessment reporting process to the present time, a great deal of progress has been made in the development of effective assessment plans. The UAC has worked diligently to provide guidance and leadership to the University community and to foster a better understanding and appreciation of the importance of the assessment process. Each year, the UAC reviews the effectiveness of the assessment process as well as the templates used for reporting, and makes modifications and improvements to the process as needed.

The current method for submitting assessment plans is outlined below:

  • Each degree program/support unit visits the Institutional Effectiveness (IE) web page to obtain the appropriate template: the Program Assessment Form for academic programs or the Unit Assessment Form for administrative/educational support units. The template is a Microsoft Word document that is easy to access and complete.

  • Once the template is completed, the degree program or administrative/educational support unit coordinator sends the assessment plan as an e-mail attachment. All e-mail correspondence pertaining to assessment is sent to a general e-mail address, assessment@tamiu.edu, which routes a copy of each message to all UAC members, ensuring that every member receives the same information.

  • Once the assessment plans are submitted, they are uploaded to WebCT. A WebCT directory identifies each plan as either an academic program plan or an administrative/educational support unit plan.

  • Once the plan is on WebCT, the UAC subcommittee members review the plans and post their comments.

  • After reviewing the format and content of each plan, the UAC either recommends posting the plan on the IE web page or returns it to the appropriate coordinator for revisions. Once all plans have been reviewed by the UAC, they are posted on the IE web page so that departments have easy access to them.

  • At the end of the semester or fiscal year, depending on which timeline applies, the departments submit their completed plans (with use of results) to the UAC for review. Upon final review of the completed plans by the UAC, all department plans are posted on the IE web page. This process allows departments to view their current plans as well as all previously completed assessment reports.

The IE web page now serves as the central repository of all assessment materials, including forms, plans, and reports. This approach provides the opportunity to learn about assessment methods and instruments used by other University departments.

The Assessment Plan Reporting Process is illustrated in the chart below:

[Chart: Assessment Plan Reporting Process]

The UAC distributes a timeline to the University community that specifies each activity and its due date. A sample of the timeline is provided below:

Texas A&M International University
University Assessment Committee (UAC)
Fiscal Year 2005 Assessment Plans Timeline

Academic Reports

Fall 2004 Semester
  • Preliminary Plans (Objectives, 3 minimum; Means of Assessment, 1 or 2 per objective; Criteria for Success/Benchmark): by October 15, 2004
  • Review of Plans by UAC Subcommittees: October 20 to 28, 2004
  • UAC Subcommittees Feedback to UAC and Departments: by October 29, 2004
  • Revised Plan from Departments Submitted to UAC Subcommittees: by November 15, 2004
  • Final Plan Submitted to UAC (Summary of Data; Use of Results): by January 31, 2005
  • UAC Reviews Final Plans: first week of February 2005

Spring 2005 Semester
  • Preliminary Plans (Objectives, 3 minimum; Means of Assessment, 1 or 2 per objective; Criteria for Success/Benchmark): by March 15, 2005
  • Review of Plans by UAC Subcommittees: March 22 to 29, 2005
  • UAC Subcommittees Feedback to UAC and Departments: by March 30, 2005
  • Revised Plan from Departments Submitted to UAC Subcommittees: by April 15, 2005
  • Final Plan (Summary of Data; Use of Results): by September 15, 2005
  • UAC Reviews Final Plans: last week of September 2005

Administrative/Educational Support Units

Fiscal Year 2004-2005
  • Preliminary Plans (Objectives, 3 minimum; Means of Assessment, 1 or 2 per objective; Criteria for Success/Benchmark): by October 15, 2004
  • Review of Plans by UAC Subcommittees: October 20 to 28, 2004
  • UAC Subcommittees Feedback to UAC and Departments: by October 29, 2004
  • Revised Plan from Departments Submitted to UAC Subcommittees: by November 15, 2004
  • Final Plan (Summary of Data; Use of Results): by September 15, 2005
  • UAC Reviews Final Plans: last week of September 2005

The implementation of this online method has given the UAC the flexibility to review the plans on its own time and has increased its productivity when it meets as a group to discuss issues. It has also reduced the need to print the plans for distribution to committee members. Training is provided to the UAC members on the use of WebCT so that the group can review the plans efficiently. Training sessions are also provided to the University community on the uniform assessment reporting process.

The University is committed to the development of an effective uniform assessment process. This initiative has involved the entire institution, and the process has been enhanced by the use of technology. As indicated earlier in this report, a general e-mail address (assessment@tamiu.edu) was created to simplify the submission of reports and to ensure that all assessment reports are received by all members of the University Assessment Committee (UAC). All assessment reports are submitted and reviewed electronically by the UAC. Upon approval, the final plan is posted to the Institutional Effectiveness (IE) web page and is available to the entire campus community.

While there is evidence of significant improvement in many areas, time is needed to achieve a higher rate of success in the use of data for improvement in other areas. Some of the newer and smaller academic degree programs need a sufficient number of students and graduates to generate adequate data and support constructive recommendations for program change.

To demonstrate the use of assessment data, assessment outcomes matrices have been compiled that outline the progress and improvements made by degree programs and administrative/educational support units. The assessments of degree programs are inherently different from those of administrative/educational support units: the outcomes of degree programs reflect the expectations for graduates of those programs, while the outcomes of administrative/educational support unit assessments are expressed in terms of service and service satisfaction. This difference is directly related to the mission of the administrative/educational support units, which is to provide the resources students and faculty need to accomplish the Institutional Mission.

The outcomes matrices provide the following information:

  • Degree Program or Administrative/Educational Support Unit Name
  • Person Responsible for Assessment
  • Method of Assessment
  • Frequency of Administration/Assessment
  • Objective
  • Criteria or Benchmark
  • Summary of Data and Results Achieved
  • Use of Results and Action Taken

The University community uses the data compiled in the assessment outcomes matrices to document positive change. The information is presented in table format to allow for easier identification of the data and the use of results. Some examples of the use of assessment data for improvement of programs and services include:

  • The Grant Resources web page was enhanced to assist faculty and staff in their search efforts for external funding. The Director of Grant Resources meets with each department to develop a grant-seeking plan. A faculty survey is also distributed each semester to determine factors related to participation or non-participation in grant applications.

  • The Property Management and Receivables Department changed its process for notifying departments of merchandise arrivals from paper-based notification to e-mail. The new process has reduced the turnaround time for merchandise pick-up from seven days to two to four days, increasing efficiency.

  • In the area of Student Activities, a Student Programming Advisory Board was created to provide the campus with diverse entertainment and quality, student-oriented events. Events are now planned under the auspices of the new board, which encourages more active participation from students. Recent events include a Pep Rally, Rush Week, and Movie Night.

  • The Texas Center for Border Economic and Enterprise Development provides high school students with the opportunity to explore their entrepreneurial interests through a special program entitled Youth Entrepreneur Summer (YES) Camp. Although students expressed strong satisfaction with the program overall, a number of constructive comments were used to improve it. The results of the survey led to the following adjustments to the program: (a) community business leaders were invited to speak to the class, and (b) visits were conducted to businesses in the community.

  • Concerns about low police visibility on campus led the University Police Department to purchase a patrol vehicle, golf carts, and patrol bikes to increase patrol activity and visibility on campus.

  • To address deficiencies identified in the results of the Texas Examination of Educator Standards (TExES) Examination for English Language Arts and Reading for Secondary School Certification, the Department of Language and Literature has adopted a multifaceted approach. In the Spring 2004 semester, for example, a new text, Teaching Reading in High School English Classes, was added to the ENGL 4390 (Problems in Teaching English) course reading list when TExES score reports suggested that some students were testing poorly in areas related to reading instruction. Spring 2004 results show slightly stronger performance on questions pertaining to reading instruction, but weaknesses on the written component of the examination remain. In the future, ENGL 4390 will integrate more practice in timed literary explanation and writing exercises.

  • The Bachelor of Arts in History degree plan was revised beginning with the 2003-2004 University Catalog in response to concerns about student pass rates on the ExCET/TExES teacher certification examinations. By placing greater emphasis on students taking a wide variety of courses from different areas (European, World, U.S. Chronological, and U.S. Thematic history courses), the revised plan is intended to broaden student knowledge of history. Particular emphasis was placed on the U.S. Chronological courses, and two new courses (HIST 4317, American Revolution/Early National Era, 1763-1815, and HIST 4318, The Age of Jackson to the Civil War, 1815-1865) were created to complete the coverage of U.S. history in upper-division courses. HIST 4318 was taught for the first time in Spring 2004; HIST 4317 will be taught for the first time in Spring 2005. The department continues to monitor the subscale scores for student progress as the degree plan matures.

  • For the Bachelor of Science in Criminal Justice, the following actions have been taken: (1) the rating form used by external internship managers has been revised to include additional criteria that generate more specific information on each intern, and a five-point Likert scale is now used; (2) to improve student writing within the program, CRIJ 1301, CRIJ 1306, CRIJ 3305, CRIJ 3306, and CRIJ 4340 now include more writing assignments (for example, in CRIJ 4340 students are required to write a paper discussing school delinquency programs, including specific crime prevention models and criminological theories); and (3) faculty identified a need for additional professionals to meet the demands of the expanding undergraduate program.

  • In the Master of Science in Criminal Justice program, to further strengthen student writing and critical thinking, the graduate course CRIJ 5303 (Law and Criminal Justice) now emphasizes writing in its syllabus, and two new writing assignments critiquing criminal justice films/videos are required. Faculty now meet as a group to review comprehensive exams, making the process a collective one rather than a polling of individual assessments.

  • For the Master of Arts in Spanish, faculty met on April 21, 2004, and discussed the importance of encouraging qualified graduates to apply to doctoral programs and of preparing thoughtful, meticulous graduate applications. The Spanish faculty will hold an annual seminar to assist individuals applying to the collaborative Ph.D. program in their pursuit of the doctorate.

  • In the Master of Business Administration program, the graduate business test produced and graded by the Educational Testing Service is used to determine mastery of common business concepts. Overall, our students did not place at the 50th percentile of all institutions that administered the test during Spring 2003. The faculty recommendation was to make appropriate modifications to the content and coverage of the common business core of the MBA program.

  • In the Master of Science in Information Systems program, an objective test developed by members of the department is given to all students in a required graduate course. The instrument measures students' knowledge of the following areas: General Systems Concepts, Networking Concepts, Systems Management, Database Concepts, and Programming Concepts. The test is administered as part of the CIS 5390 capstone course, and students were given three twenty-minute review sessions as part of the three lectures preceding the test. The area with the lowest score was Networking Concepts, and new faculty with expertise in networking were recruited for the department in Fall 2004.

  • In the Bachelor of Science in Interdisciplinary Studies in Early Childhood Education Reading, the average score of students who took the Pedagogy and Professional Responsibilities (PPR) Texas Examination of Educator Standards (TExES) in Fall 2003 on Domain I (Designing Instruction and Assessment to Promote Student Learning) was 76.83%. Because the goal of over 70% passing on Domain I was achieved, the next goal was to address Domain III (Implementing Effective, Responsive Instruction and Assessment), which was chosen because Spring 2004 data indicated below-average scores on this domain.

  • The goal of the Bachelor of Science in All-Level Special Education is that students will compare favorably with their peers statewide in their understanding of learners with special needs. Based on the assessment data collected for Domain I (Understanding Learners with Special Needs), this objective was successfully achieved. The department will now focus on Domain III because the analysis indicated that this domain could be a potential area of weakness for post-baccalaureate students and students on deficiency plans. Changes to the undergraduate degree that align it with the new domains and standards of the Texas Examinations of Educator Standards (TExES) examination were approved and implemented in Fall 2004.

  • For the Master of Science in Education with a major in Educational Administration, after a review of the results of the Texas Examination of Educator Standards (TExES) state examination for principal certification, the Educational Administration faculty agreed on the following strategies: (1) Increase the field activities and course activities that help students apply their knowledge concerning school community leadership, and (2) Increase research activities in the classroom in order to emphasize learning in school and community.

  • In an effort to increase the retention rate of students enrolled in developmental courses, the Office of Developmental Studies implemented a pilot Learning Communities cohort. Starting in Spring 2005, all first-time freshmen developmental students must take GENU 1300 (Theories and Applications of Learning) whether or not they have met the minimum SAT or ACT entrance requirement.

  • To increase the effectiveness of the Study Abroad Fair and the usefulness of the study abroad library materials, the Office of International Education implemented the following: a two-step training program was established (one part for interoffice procedures, the other for the study abroad library); part one of the training was completed in October 2004, and part two (study abroad library training) will take place in late January 2005. Information for updating the new OIE web page was submitted to the Webmaster in mid-Fall 2004. Two new handouts were created: (1) How to Research Study Abroad and (2) Frequently Asked Questions. Previous study abroad students were recruited and participated in the Fall 2004 Study Abroad Fair.

  • In order to increase the retention of participants in the TRIO Student Support Services program, the Program Director and Academic Coordinator met in September 2004 to discuss the retention results and made the following changes: (1) a mentoring program that pairs freshmen with upper-level students in similar majors was initiated in Fall 2004; the students meet to discuss issues that pertain to their academic goals and to identify any difficulties that may be affecting their success. (2) Telephone calls were made by all TRIO staff to encourage students to attend tutoring sessions, and the Program Director and Academic Coordinator decided that all freshmen enrolled in developmental courses must attend weekly tutoring sessions. (3) Students classified as seniors were strongly encouraged by TRIO staff to enroll on a full-time basis. The position of Academic Coordinator was changed to a full-time position, allowing greater flexibility in scheduling career and academic advising appointments for all graduating seniors; the change will also enable the program to offer evening hours by appointment.

  • In General Education, actions have been taken to address the core areas of Reading, Writing, and Mathematics. In Reading, to determine TAMIU students' entry-level reading preparation as measured by their ACT reading scores, the University requested the ACT-CAAP linkage report, which compares students' ACT scores with their CAAP scores. In summary, when the percentage differences of the lower-than-expected and higher-than-expected categories are summed, 10% of this cohort did not score as high as expected. Although this cohort may not represent the entire student body, it does indicate that TAMIU students enter with lower reading scores than their national peers and that their cumulative progress is 10% lower than expected for this group of students. The data will be shared with the University Assessment Committee for recommendations. In Writing, the English curriculum was changed to a year-long sequence of continuous composition courses, ENGL 1301 and ENGL 1302, taken before the student takes the University Writing Assessment (UWA). An analytical scoring of UWA essays was conducted for the third semester to identify areas of student weakness in writing that can be used to formulate suggestions for changes in both curriculum and test administration. Of the five categories, (1) organization and focus, (2) development, (3) logic and coherence, (4) syntax and style, and (5) mechanics, Spring 2004 students were most deficient in logic and coherence, development, and mechanics. The analytical rubric data indicate that while grammar and usage remains a dominant category for evaluating student writing, the development of ideas and arguments appears to be the most important analytical category in predicting student success on the UWA. In Mathematics, the results for the Fall 2003 administration of the common algebra examination improved significantly. Some of the suggestions for improvement made in unit services for the spring semester have been implemented, particularly in the areas of tutoring and early intervention. More emphasis is being placed on traditional college algebra content (including general problem solving, inequalities, quadratic optimization problems, and inverse functions) in order to meet or surpass the benchmark.

All academic and administrative/educational support unit assessment reports may be found on the Office of Institutional Effectiveness web page. To make the use of assessment data easier to identify, assessment matrices have been compiled that outline the progress and improvements made by degree programs and administrative/educational support units.

The Assessment Outcomes Matrices provide detailed information on all degree programs and administrative/educational support units and can be found online (Assessment Outcomes Matrix - College of Arts and Sciences, Assessment Outcomes Matrix - College of Business Administration, Assessment Outcomes Matrix - College of Education, Assessment Outcomes Matrix - Dr. F. M. Canseco School of Nursing, Assessment Outcomes Matrix - General Education, Assessment Outcomes Matrix - Athletics, Assessment Outcomes Matrix - Administrative Units, Assessment Outcomes Matrix - Educational Support Units).

The implementation of a uniform assessment reporting process has been beneficial in a number of ways, including:

  • It provides a better understanding of the importance of assessment throughout the University.

  • It allows all departments/units to participate in discussions about their goals and the information they acquire through their assessment methods.

  • It allows students to participate through various types of assessment methods and to provide information to the University on how to make programs and services more beneficial to them.

  • It serves as a basis for institutional improvement by using the results of the assessment reports for improving programs and services.

  • It makes assessment an integral part of the planning and budgeting process, supporting data-based decision making and identifying the resources needed to implement improvements.

Establishing the current mode of uniform assessment has been challenging; however, the strong support of the University community and the education and training provided by the University Assessment Committee and the Office of Institutional Effectiveness have allowed for a steadily maturing, collegial process. The process in place at our institution has grown out of our own experiences and works well on our campus. It is a model that can be easily managed and modified as the need arises, allowing for continued growth and technological advances.





SUPPORT DOCUMENTATION


SOURCE: LOCATION/Special Instructions

Institutional Mission: http://www.tamiu.edu/general.shtml - mission
Institutional Effectiveness web page: http://www.tamiu.edu/adminis/ie/assessment.shtml
Assessment Outcomes Matrix - College of Arts and Sciences: http://www.tamiu.edu/archives/matrix/coas.pdf (Adobe PDF file)
Assessment Outcomes Matrix - College of Business Administration: http://www.tamiu.edu/archives/matrix/coba.pdf (Adobe PDF file)
Assessment Outcomes Matrix - College of Education: http://www.tamiu.edu/archives/matrix/coed.pdf (Adobe PDF file)
Assessment Outcomes Matrix - Dr. F. M. Canseco School of Nursing: http://www.tamiu.edu/archives/matrix/cson.pdf (Adobe PDF file)
Assessment Outcomes Matrix - General Education: http://www.tamiu.edu/archives/matrix/gened.pdf (Adobe PDF file)
Assessment Outcomes Matrix - Athletics: http://www.tamiu.edu/archives/matrix/ath.pdf (Adobe PDF file)
Assessment Outcomes Matrix - Administrative Units: http://www.tamiu.edu/archives/matrix/admin.pdf (Adobe PDF file)
Assessment Outcomes Matrix - Educational Support Units: http://www.tamiu.edu/archives/matrix/edusup.pdf (Adobe PDF file)

