a. | General Systems Concepts | [67.41%] | 72.25% | (58.04%) +++
b. | Network Concepts | [45.19%] | 34.21% | (33.08%)
c. | Systems Management | [59.54%] | 77.73% | (56.21%) ++++
d. | Database Concepts | [61.73%] | 75.79% | (66.15%) ++
e. | Programming Concepts | [76.77%] | 84.69% | (74.83%) ++
Overall, the average score on the test was 69.158%, compared to 60.92%
in Fall 2003. This represents a 13.5% relative improvement in overall performance,
but the result is still slightly below the acceptable standard. All knowledge
areas except Network Concepts showed improvement, while networking remained
consistently well below standard. However, the small sample size (19)
suggests that caution should be used when implementing program changes
based solely upon test results.
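For reference, the improvement figure above is a relative change rather than a percentage-point difference; a minimal sketch of the calculation, using only the two overall averages reported here, is shown below.

```python
# Reproduce the reported improvement from the two overall averages above.
fall_2003_avg = 60.92     # overall average score, Fall 2003 (%)
spring_2004_avg = 69.158  # overall average score, Spring 2004 (%)

absolute_gain = spring_2004_avg - fall_2003_avg        # about 8.2 percentage points
relative_gain = 100 * absolute_gain / fall_2003_avg    # about 13.5% relative improvement

print(f"Absolute gain: {absolute_gain:.2f} percentage points")
print(f"Relative improvement: {relative_gain:.1f}%")
```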
__1__a. Use of Results to Improve Instructional Program:
The results will be reviewed by the Department at the first Departmental
meeting in Fall 2004, as specified by unanimous vote of the faculty.
Several changes in faculty composition will impact the quality and structure
of the program in the future. These changes include, but are not limited
to: the recruiting of two new Assistant Professors, a change in the
department chairmanship, and the additional offerings required
to support the new doctoral program.
While the performance in all areas except networking improved, and the overall score improved, two factors were identified as major concerns:
[1] International students, especially those from certain parts of Asia, do not fully comprehend the American definition of plagiarism, and consequently are often discovered submitting work that is not their own. When discovered, grades suffer; but prior to discovery, learning has not occurred.
[2] Most international students do not understand the necessity of reading the assigned texts prior to class, nor do they complete the optional exercises from the textbooks, because they simply do not own the required texts. An informal survey of two Spring 2004 classes suggests that over 90% of the international students in the MS-IS program neither own the required textbook nor understand the necessity of taking lecture notes, relying instead upon copies of lecture slides and/or transparencies.
Master of Science in Information Systems (MS-IS)
Instructional Degree Program
Spring 2004
Assessment Period Covered
July 1, 2004
Date Submitted
Intended Educational (Student) Outcome:
__2__ Graduates will demonstrate competency in communication skills
First Means of Assessment for Outcome Identified Above:
__2__a. Means of Program Assessment & Criteria for Success:
To allow the students to demonstrate their communication skills, the
products from the MIS 5390 Project Management course will be presented
in a public forum, and the Departmental faculty, as well as College and
University administrators, will be asked to evaluate the apparent
functionality of the products. Participants were asked to evaluate each
product on the following characteristics: Apparent Quality and Presentation
of the product. It was determined that no evaluation should be below 3.5.
__2__a. Summary of Assessment Data Collected:
Fourteen (14) people evaluated the projects in addition to the client.
They were four MIS faculty, five members of the University staff, and
five guests who were local business people. [Note: in the following summary
of data, the staff and guest evaluations are combined as “Guest Evaluation”].
Faculty Evaluation | Invincible | Midnight Skippers | Waves | Triumphs |
Quality of product | 3.75 | 3.70 | 4.20 | 4.50 |
Presentation | 3.45 | 3.20 | 4.25 | 4.75 |
Guest Evaluation | | | | |
Quality of product | 4.10 | 3.90 | 4.20 | 5.00 |
Presentation | 3.80 | 4.00 | 4.05 | 4.70 |
Overall | | | | |
Quality of product | 3.88 | 3.44 | 3.87 | 4.60 |
Presentation | 3.66 | 3.35 | 3.92 | 4.60 |
__2__a. Use of Results to Improve Instructional Program:
The results were acceptable overall. One group received overall mean
evaluations below the 3.5 standard (3.44 for quality and 3.35 for
presentation); however, the class as a whole exceeded the standard. During
the first Fall 2004 faculty meeting, the department will determine whether
this approach will be continued or another outcome will be specified
and studied. In part, the re-evaluation of this outcome and its assessment
will be driven by the changes in the department’s faculty composition.
Master of Science in Information Systems (MS-IS)
Instructional Degree Program
Spring 2004
Assessment Period Covered
July 1, 2004
Date Submitted
Intended Educational (Student) Outcome:
__3__ Students completing the master’s program will
demonstrate their knowledge of theories, models, and tools relevant to
the field of Information Systems through the development of a fully functional
software product designed to meet a specific client’s needs. This product
is one of the outputs of the CIS 5390 “Capstone” course, and its development
provides the students the opportunity to experience “service learning,”
or the completion of learning objectives while providing a benefit to
a community member or entity.
First Means of Assessment for Outcome Identified Above:
__3__a. Means of Program Assessment & Criteria for Success:
The client will be asked to accept the product as being in compliance with
the client’s new system specifications, and the client will be asked to adopt
the product for use in their organization. However, to provide us with a higher
degree of granularity in our assessment, it has been decided that the clients
would be asked to provide answers on a 5-point scale of acceptance, with
the following items: 5 = Excellent, 4 = Very Good, 3 = Good, 2 = Average,
1 = Poor. The participants were asked to evaluate each product on the following
characteristics: Function, Suitability, Quality, Presentation, and Overall
(please see attached evaluation form for definitions). It was determined
that no evaluation should be below 3.5, except for the Overall characteristic,
which was used to comparatively rank all of the products.
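As an illustration of how these criteria might be applied in practice, the following is a minimal sketch that averages individual evaluator ratings for each product and characteristic, flags any mean that falls below the 3.5 standard, and ranks products by their mean Overall rating. The product names and rating values in the sketch are placeholders; only the 5-point scale, the 3.5 threshold, and the use of the Overall characteristic for ranking come from the assessment plan above.

```python
# Hypothetical sketch of applying the assessment criteria described above.
# Ratings use the stated 5-point scale (5 = Excellent ... 1 = Poor); any mean
# below the 3.5 standard is flagged, and the "Overall" characteristic is used
# only to rank the products against one another.
from statistics import mean

# product -> characteristic -> individual evaluator ratings (placeholder values)
ratings = {
    "Product A": {"Function": [4, 5, 4], "Quality": [4, 4, 5], "Overall": [4, 5, 4]},
    "Product B": {"Function": [3, 3, 4], "Quality": [3, 4, 3], "Overall": [3, 3, 4]},
}

STANDARD = 3.5

for product, characteristics in ratings.items():
    for characteristic, scores in characteristics.items():
        avg = mean(scores)
        if characteristic != "Overall" and avg < STANDARD:
            print(f"{product}: {characteristic} mean {avg:.2f} is below the {STANDARD} standard")

# Comparative ranking by mean "Overall" rating, highest first
ranking = sorted(ratings, key=lambda p: mean(ratings[p]["Overall"]), reverse=True)
print("Ranking by Overall:", ranking)
```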
For the Spring 2004 semester’s CIS 5390 project, the University Registrar requested help with the development of an automated Commencement Control system that would provide, among other items, correctly ordered seating charts, check-out forms, mailing labels for degrees, and processional order rosters. The class was divided into four teams; each team selected a name; data are reported for each team and then summarized. The teams were: Waves, Triumphs, Invincible, and Midnight Skippers.
The completed projects were presented in open forum to the client (the University Registrar and selected members of the Registrar’s staff), MIS/DS faculty, University Administrators and Staff, and invited guests. Everyone who attended the presentation was given an opportunity to review the software, question the student participants, and review system documentation. The presentations were evaluated by the students in the class (whose evaluations were not included in this assessment), four (4) MIS/DS faculty, five (5) members of the Staff and Administration, and five (5) guests, as well as the client.
__3__a. Summary of Assessment Data Collected:
Client’s evaluation of product:
Waves – Acceptable as is; not perfect, but usable. This product has been institutionalized and was used in conjunction with the May 2004 Commencement.
Triumphs – Acceptable, but required minor modifications.
Invincible – Acceptable, but required major modifications.
Midnight Skippers – Not usable as submitted at deadline.
Fourteen (14) people evaluated the projects in addition to the client. They were four MIS faculty, five members of the University staff, and five guests who were local business people. [Note: in the following summary of data, the staff and guest evaluations are combined as “Guest Evaluation”].
Faculty Evaluation | Invincible | Midnight Skippers | Waves | Triumphs |
Functionality | 3.63 | 3.38 | 4.10 | 4.50 |
Suitability of Purpose | 4.00 | 3.80 | 4.50 | 4.50 |
Quality of product | 3.75 | 3.70 | 4.20 | 4.50 |
Presentation | 3.45 | 3.20 | 4.25 | 4.75 |
Guest Evaluation | | | | |
Functionality | 4.10 | 4.10 | 4.40 | 5.00 |
Suitability of Purpose | 3.80 | 3.70 | 4.20 | 4.90 |
Quality of product | 4.10 | 3.90 | 4.20 | 5.00 |
Presentation | 3.80 | 4.00 | 4.05 | 4.70 |
Overall | | | | |
Functionality | 3.96 | 3.52 | 4.07 | 4.69 |
Suitability of Purpose | 3.70 | 3.40 | 4.00 | 4.60 |
Quality of product | 3.88 | 3.44 | 3.87 | 4.60 |
Presentation | 3.66 | 3.35 | 3.92 | 4.60 |
__3__a. Use of Results to Improve Instructional Program:
The overall result of the projects’ evaluations was acceptable. A
similar evaluation technique may be used in the future; however, during
the first Fall 2004 faculty meeting, the department will determine whether
this approach will be continued or another outcome will be specified
and studied. In part, the re-evaluation of this outcome and its assessment
will be driven by the changes in the department’s faculty composition.
Second Means of Assessment for Outcome Identified Above:
__3__b. Means of Program Assessment & Criteria for Success:
No second means of assessment was used.
SUPPORT DOCUMENTATION
MIS 5390 – Project Evaluations – Spring 2004
Presentations:
Project presentations were evaluated by three different groups: peers (i.e., other MIS 5390 students), MIS/DS faculty, and TAMIU Administrators, Staff, and other visitors. The evaluations contributed by each of these groups were considered, weighted based upon project development experience, and then combined to determine an overall presentation evaluation (a minimal sketch of this combination appears after the rating scale below). In each case (i.e., functionality, suitability, quality, and presentation), the following 5-point scale was used:
5 = Excellent
4 = Very Good
3 = Good
2 = Average
1 = Poor
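The report states that the group evaluations were weighted based upon project development experience but does not give the weights. The following is a minimal sketch of the combination step; the equal weights used here are an assumption, chosen because they approximately reproduce the Overall figures reported for each team below.

```python
# Minimal sketch of combining group-level mean ratings into an overall
# presentation evaluation. The actual weights ("based upon project development
# experience") are not stated in the report; equal weights are assumed here,
# and they approximately reproduce the Overall figures listed below.
GROUP_WEIGHTS = {"peer": 1.0, "faculty": 1.0, "guest": 1.0}  # assumed weights

def overall_rating(group_means):
    """Weighted mean of the per-group mean ratings for one characteristic."""
    total_weight = sum(GROUP_WEIGHTS[g] for g in group_means)
    weighted_sum = sum(GROUP_WEIGHTS[g] * m for g, m in group_means.items())
    return weighted_sum / total_weight

# Example: Invincible, Functionality (group means taken from the data below)
print(round(overall_rating({"peer": 4.14, "faculty": 3.63, "guest": 4.10}), 2))  # 3.96
```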
Invincible
Peer Evaluation
Functionality 4.14
Suitability of Purpose 3.20
Quality of product 3.80
Presentation 3.73
Faculty Evaluation
Functionality 3.63
Suitability of Purpose 4.00
Quality of product 3.75
Presentation 3.45
Guest Evaluation
Functionality 4.10
Suitability of Purpose 3.80
Quality of product 4.10
Presentation 3.80
Overall
Functionality 3.96
Suitability of Purpose 3.70
Quality of product 3.88
Presentation 3.66
Midnight Skippers
Peer Evaluation
Functionality 3.07
Suitability of Purpose 2.90
Quality of product 2.71
Presentation 2.86
Faculty Evaluation
Functionality 3.38
Suitability of Purpose 3.80
Quality of product 3.70
Presentation 3.20
Guest Evaluation
Functionality 4.10
Suitability of Purpose 3.70
Quality of product 3.90
Presentation 4.00
Overall
Functionality 3.52
Suitability of Purpose 3.40
Quality of product 3.44
Presentation 3.35
Waves
Peer Evaluation
Functionality 3.71
Suitability of Purpose 3.30
Quality of product 3.21
Presentation 3.46
Faculty Evaluation
Functionality 4.10
Suitability of Purpose 4.50
Quality of product 4.20
Presentation 4.25
Guest Evaluation
Functionality 4.40
Suitability of Purpose 4.20
Quality of product 4.20
Presentation 4.05
Overall
Functionality 4.07
Suitability of Purpose 4.00
Quality of product 3.87
Presentation 3.92
Triumphs
Peer Evaluation
Functionality 4.57
Suitability of Purpose 4.30
Quality of product 4.29
Presentation 4.36
Faculty Evaluation
Functionality 4.50
Suitability of Purpose 4.50
Quality of product 4.50
Presentation 4.75
Guest Evaluation
Functionality 5.00
Suitability of Purpose 4.90
Quality of product 5.00
Presentation 4.70
Overall
Functionality 4.69
Suitability of Purpose 4.60
Quality of product 4.60
Presentation 4.60
OVERALL PRESENTATION EVALUATION (and comments):
1st - Triumphs (A)
· Nice search of database
· Promotional brochure is a nice touch
· Nice use of multi-media
· Single student report is a nice touch
2nd - Waves (A-)
· Nice report format
· Nice use of web and multi-media (should check for spelling)
· Nice offer of choices of dates (etc.)
· Should there be a pop-up for student editing?
3rd - Invincible (B+)
· Did anyone besides the Manager (?) work on this project (“my project,” “my software,” “MY system”)?
· Why was the entire team in the front of the room when only one person presented?
· Cannot add a college? Or a program? What about the new Ph.D. program?
· Too much time spent on reports, too little on why this is “the right solution”
· Speak, pause; speak, pause; and so on; the presentation must have wasted at least 5 or 6 minutes of available time saying nothing
· On-line help was apparently missing
4th - Midnight Skippers (B+)
· Security is a nice touch
· Why only one tie? Isn’t this a professional presentation?
· Too hard to see
· Row length not adjustable
· Too quick, not enough detail; how do we know this is the solution?
· Typos in Ceremony
CLIENT’S REPORT ON ACCEPTABILITY
WAVES
Acceptable – not perfect, but will be used as is: A
TRIUMPHS
Acceptable with minor modifications: A-
INVINCIBLE
Acceptable with major modifications: B+
MIDNIGHT SKIPPERS
Not usable as submitted: B