The framing that hit hardest in this module: most boards assume their value rather than measure it. I've been running one of those programs.
I track program health — enrollment, credential attainment, Skilled Trades Fair participation, pathway-level data. What I don't have is a systematic record of what the board recommended, what got implemented, and what the cumulative impact of those recommendations looks like over time. Without that record, the IAC's value is something I feel, not something I can prove.
Three things I'm building from this module.
A recommendation tracker. Every time the board identifies a gap, flags a problem, or proposes a direction — that gets logged. Implemented or not, and why. The record becomes the evidence.
Board quality metrics alongside program metrics. Meeting attendance, membership breakdown against bylaws standards, whether leadership rotations are actually happening. I have the sign-in sheets. I haven't been reading them as data. I will now.
Equity disaggregation with the board, not just internally. Completion, placement, and credential attainment broken down by demographics — gender, race, socioeconomic status — shared with members as part of the State of METT snapshot. The board can't advocate for equitable outcomes it hasn't seen.
How I'm applying it: a recommendation log built into the shared drive before Q1 SY27, a board quality dashboard added to the quarterly council meeting agenda, and equity metrics folded into the next State of METT presentation.
The goal is a board whose value I can prove — to leadership, to funders, and to the members themselves.