10 Comments

My sense as a former CSU Dean is that this is 100% about getting around the CFA. Note that the trainers are not in a union.

Thanks for that perspective!

It is hard to read the politics from the other side of the continent, but I am interested in seeing what happens when the reality of what's happening runs into the CFA and is subject to other forms of faculty governance review.

If the Fullerton rollout of TitanGPT is any indication, there is no there there. I know Gertrude Stein was talking about Oakland when she penned that famous line, but it so fits the idea that giving everyone a ChatGPT subscription at no cost and a few LinkedIn Learning videos is an answer to the educational challenges raised by generative AI.

I just posted the rather poorly thought-out CFA response on another thread, but I'll note it here. I was at Sonoma State, which just imploded dramatically and so very tragically for many faculty who just lost their jobs. Faculty need to look ahead. I've been writing about this on my Substack for the last six months: https://www.calfac.org/wp-content/uploads/2024/10/Resolution-for-a-New-CBA-Article-Governing-the-Use-of-AI-with-Friendly-Amendments-10.2024.pdf

I'm so glad you commented here. I was an early subscriber to Anecdotal Value, but it doesn't seem to be making it into my feed. I'll try to correct that.

We are coming at these questions from a very different perspective, which is helpful. So much of the discourse in the academy around AI is in the arena of what Henry Farrell calls AI Fight Club, which requires you to pick a side as either an enthusiast or a resister. Not a context that allows for nuance or exploration.

I swear, the only reason they are making these deals with publishers, governments, and institutions is the hope that it will advance their mission of creating AGI/ASI. They don't care one iota about improving our lives, and they don't really expect long-term profits from subscriptions and licensing deals. The objective is to subsume as many human minds as possible to feed their god incubator. Once they have what they need, they'll pull the plug and we will be waiting in darkness for what comes next. I sincerely hope they never achieve their goal. I wish so many humans were not unwittingly helping them.

This is definitely a score for the pro-AGI, anti-human agenda. The idea that a subscription for everyone is actual progress on figuring out what to do about this new technology is sad. That said, the awakening of faculty governance groups and widespread frustration about blowing $16.9 million will have consequences.

This was a real headscratcher, particularly in light of budget cuts. What drove this decision? Was it in the name of equity, to make the more advanced model available to everyone for free along the lines of the CSUCCESS program? On the student side, has every CSU faculty member instituted clear guidelines for AI use in their courses?

Not 23 for long, by the way. It will be 22 campuses this summer when Cal Maritime merges into Cal Poly SLO.

Given what I'm hearing from folks close to the situation (and it sounds like you may be among them?), I think the CFA is going to be asking those questions angrily, as there does not seem to have been much in the way of consultation. My sense is that there is an equity dimension: $20 a month for a subscription is a lot for many students, so giving it away to everyone at no cost "solves" that problem. Certainly, that's how institutions describe the reasons they sign enterprise (institution-wide) agreements with Grammarly and other companies. CSU officials gave that reason as well.

I try to give everyone the benefit of the doubt, but as I said in my piece, the main reason executives blunder into these sorts of decisions is pressure from their board and the overall discourse to do something about AI. Due to the complexity of the issues involved, there are no moves here that are good and easy and fast, so they go with easy and fast.

For context, I work on the student side of admissions, based in CA. CA students are fortunate to have many institutions of higher education and many pathways to choose from, at an in-state cost. I try to stay on top of trends and happenings so I know what options are available to students. AI use is also a big topic: who uses it, what do they use it for, should you use it, what shouldn't you use it for, etc. (If you're a student coming across this comment, the answer is no, don't use it for your application essays.) I'm a big fan of CSUs, but, like I said, this was a headscratcher. If students are going to have that access, every class needs to have clear guidelines in the syllabus.

Agreed, easy and fast; but is it good?

From what I hear from folks and read in The Mercury News, the CSU faculty are upset about how all this is unfolding. There is little clarity about what's happening at the institutional level. And, yes, faculty should have clear guidelines about using AI in their classes.

The lack of course information, not just about AI but about what is expected of students and what they should expect in return, is a long-standing problem in many institutions of higher ed. Faculty should themselves be responsible for establishing that as a norm and enforcing it among their peers.
