With help from a Friend of McHenry County Blog.
McHenry County College recently completed the accreditation process, conducted by the Higher Learning Commission, a commission of the North Central Association. The same agency has handled the past three or four accreditation cycles.
The report is available to the public and is being distributed through the college's Professional Development office. The accrediting body conducts a series of interviews and reviews documents presented to it by College personnel, then prepares a report evaluating what it calls a series of portfolios. The report is literally mind-boggling.
Below you see a summary of part of the report. Low marks are not good.
Here’s some weekend reading about this part of the evaluation:
Here is a direct quote from page 10 of the report. Not one word has been changed or omitted in the following quotation as well as the quotations which are shown later in this email.
“The portfolio has created an appearance of not being fully committed to the process of continuous improvement.
“Little evidence has been put forth to show any serious attempts have been initiated to resolve issues presented in the college’s 2010 Systems Appraisal Feedback Report.
“Many of the opportunities and strategic issues presented in the 2010 Systems Appraisal Feedback Report remain as both opportunities and strategic issues in this current portfolio.
“Key issues of process creation and data understanding remain unresolved and unaddressed, with the appearance of a serious lack of understanding of how data is identified, collected, and analyzed for use in the control and improvement of processes.
“Most category questions have not been sufficiently answered as to indicate that the college understands the category, let alone is putting efforts in place, beyond superficial attempts, at implementing processes that lead to effective and deployable processes within each category.
“Based on this apparent lack of commitment to continuous improvement, it may be of benefit for the college to reassess as to whether the AQIP accrediting process is the path that should be pursued by the institution.”
The Portfolio analysis then comments specifically on nine categories, and the summary results of the data presented by the AQIP committee are shown in the spreadsheet above.
Again, the results are “appalling,” the Blog Friend concludes.
Here are some selected comments from the detailed Portfolio evaluations, each of which earned low marks in the Portfolio. The key for the evaluation of the various portfolio categories is as follows: SS = Outstanding strength; S = Strength; O = Opportunity for Improvement; OO = Outstanding Opportunity for Improvement.
1. Under AQIP Category 1: Helping Students Learn, point number 1R1-6, MCC has received a rank of O:
“. . . However, the use of student perceptions of their own competency–whether as students or as graduates–with respect to general education or program goals is likely to be unreliable either to establish attainment of institutional goals or to identify specific gaps. . . MCC is utilizing data in an effort to assess institutional performance in processes for helping students learn but may want to consider what these measures are actually describing, what broader measures may be included to focus on specific objectives, and how current measures can be utilized for improvement.”
2. Under AQIP Category 2: Accomplishing Other Distinctive Objectives, point number 2P2, MCC has received a rank of O:
“Although MCC delineates four significant non-instructional objectives guided by MCC’s Master Plan and by community involvement through advisory groups, industry focus groups and survey responses, MCC has no formal process for designing processes, setting outcomes, or seeking stakeholder input. MCC may consider opportunities to review its objectives with a broad set of stakeholders on a regular basis to more effectively determine non-instructional objectives.”
As you might recall, the MCC administration has on a number of occasions mentioned their formal processes and Strategic Plan as the basis for a number of their decisions, including the plans for new building additions. The AQIP report seems to indicate that these plans and processes are not fully integrated. Is it any wonder so many people have questioned the plans for expansion that were raised by the College?
3. Under AQIP Category 3: Understanding Students’ and Other Stakeholders’ Needs, point number 3I2, MCC has received a rank of O:
“It is not clear how MCC comprehensively and systematically uses data and information in process design to inform and select improvements. Without established targets and goals for all measures of performance, it may be difficult for the institution to effectively evaluate processes and determine needed improvements. Defining and determining optimal performance targets may assist MCC with demonstrating alignment of activities and strategies with performance results. Being able to identify that certain actions were determined because of performance at or below a desired target is an indicator of a high performing organization focused on continuous improvement.”
This comment again points to a disconnect between what MCC has told the public about their processes (think of the millions spent on an ERP system as well as the “study” for new facilities) and what actually has been observed by AQIP.
4. Under the AQIP Category 6: Supporting Institutional Operations, point number 6R1-5, MCC has received a mark of OO:
“MCC has noted eighteen measures of student, administrative, and institutional support service process collected and administered by various departments at various frequencies. The lack of consistent frequency of measurement may inhibit the institution’s ability to effectively evaluate performance in a systematic and holistic manner. Further, it is not clear how the data shown in the results categories relate to improvements in this category. The data show retention of various student demographics, but it is not clear how these data are analyzed and used for improvement of any student, faculty, and administrative support processes. Establishing high-level key measures of performance collected and analyzed in a systematic, collective, and regular manner may enable the institution to evaluate overall performance to inform process improvements.”
5. Under the AQIP Category 6: Supporting Institutional Operations, point number 6P4, MCC has received a mark of O:
“While McHenry College identifies the Program Review Process for long-term improvements, it is not evident how MCC identifies the day-to-day overseeing of key student, administrative and institutional support service processes. It may be beneficial for MCC to identify and implement more short-term processes in the areas of support.”
6. Under the AQIP Category 7: Measuring Effectiveness, point number 7P1-2, MCC has received a mark of O:
“While MCC conducted a SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis in 2012, and identifies three priorities that guide continuous improvement efforts, it is not clear that a process exists to select, manage, and distribute the data. MCC discusses the mechanisms that drive these processes, and gives examples of data collected; however, there is no information given on what the processes are. It may prove beneficial for MCC to clarify or create explicit processes by with [which] data on overall institutional performance is selected, managed, analyzed and made available.”
7. Under AQIP Category 9: Building Collaborative Relationships, point number 9P3, MCC has received a mark of O:
“It is not clear how MCC determines and prioritizes which services to outsource and which services to keep in order to build relationships that enhance services to students. Establishing a process to measure effectiveness of outsourced and in-house services may assist MCC with identifying those services most effective and may assist with prioritizing which services are most important for determining whether services are outsourced or kept in-house.”
If you remember, there was a meeting this past summer where the Board was presented with a proposal to outsource janitorial services.
When questioned, the Administration insisted that they went through a rigorous process to determine that a great deal of money could be saved by outsourcing these services.
Of course, no follow-up report of actual money saved has ever been given by the administration. The AQIP report seems to indicate that the process might not have been as rigorous as the Board was led to believe.
As you might imagine, the College is scrambling to put a spin on these results.
They have a two-week window for responding to the AQIP report, and they will try to make these issues go away.
When the report is viewed in total, it presents a picture of a College which
- collects data,
- doesn’t know how to interpret or use the data,
- uses data inconsistently, and
- does not have robust processes by which to understand or use the information it says it requires to effectively run the institution.
In addition, the report indicates that many of the processes and studies conducted by the college may not be as robust or as “buttoned down” as the college has led the public to believe.
Finally, this report points to a lack of leadership and vision which has plagued this college far too long.
No matter how much the administration has attempted to convey a change in attitude and professionalism (see Vicki Smith’s comments about this over and over again), the fact remains that the college is floundering.
The last accreditation cycle actually showed improvements over the prior cycle, but then again, Cathy Plinsky was in charge and she was run out of town.