this post was submitted on 15 Oct 2024
544 points (96.9% liked)

Fuck AI

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 8 months ago

A Massachusetts couple claims that their son's high school attempted to derail his future by giving him detention and a bad grade on an assignment he wrote using generative AI.

An old and powerful force has entered the fraught debate over generative AI in schools: litigious parents angry that their child may not be accepted into a prestigious university.

In what appears to be the first case of its kind, at least in Massachusetts, a couple has sued their local school district after it disciplined their son for using generative AI tools on a history project. Dale and Jennifer Harris allege that the Hingham High School student handbook did not explicitly prohibit the use of AI to complete assignments and that the punishment visited upon their son for using an AI tool—he received Saturday detention and a grade of 65 out of 100 on the assignment—has harmed his chances of getting into Stanford University and other elite schools.

Yeah, I'm 100% with the school on this one.

[–] brucethemoose@lemmy.world 8 points 1 month ago* (last edited 1 month ago) (3 children)

When I was a kid, we had a stretch of repetitive math work I got sick of, so I wrote a TI-84 program to automate it, one that even generated the intermediate work I would write down.

I wasn't really supposed to do that, but my teacher had no problem with it. I clearly understood the work, and it's not the same as just punching the equation into WolframAlpha.

It would be awesome if there was an AI "equivalent" to that: some really primitive offline LLM you were allowed to use in school for basic automation and assistance, but one that requires a lot of work to set up and is totally useless without that setup. I can already envision ways to build this with BERT or Llama 3B.
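(For illustration only: the commenter's TI-84 program isn't shown anywhere in the thread, but the same idea can be sketched in Python. A hypothetical script that automates the drudgework while still printing every intermediate step you'd have written down, using a quadratic solver as a stand-in, might look like:)

```python
import math

def solve_quadratic(a, b, c):
    """Solve ax^2 + bx + c = 0, printing each step as 'shown work'."""
    print(f"Equation: {a}x^2 + {b}x + {c} = 0")
    disc = b * b - 4 * a * c
    print(f"Discriminant: b^2 - 4ac = ({b})^2 - 4*{a}*{c} = {disc}")
    if disc < 0:
        print("Discriminant is negative: no real roots.")
        return []
    root = math.sqrt(disc)
    x1 = (-b + root) / (2 * a)
    x2 = (-b - root) / (2 * a)
    print(f"x = (-({b}) +/- {root}) / {2 * a}  ->  x1 = {x1}, x2 = {x2}")
    return sorted({x1, x2})  # de-duplicate the double root case

solve_quadratic(1, -3, 2)  # x^2 - 3x + 2 = 0  ->  roots 1.0 and 2.0
```

The point isn't the solver itself; it's that writing it forces you to know the formula cold, which is exactly the teacher's reasoning below.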

[–] ptz@dubvee.org 15 points 1 month ago* (last edited 1 month ago) (3 children)

It would be awesome if there was an AI "equivalent" to that

It's called your brain / learning. That's why you're there. If the specifics of the curriculum are too tedious, that's on the school to address.

Learning how to parse and comprehend information to find an answer is just as important as the answer.

[–] groucho@lemmy.sdf.org 7 points 1 month ago

As a survivor of homeschooling, this is the one thing I wish more people understood: school is not about cramming enough data into a kid until they magically evolve into an adult. School is supposed to teach you how to think.

Not in an Orwellian sense, but in a "here's how to approach a problem, here's how to get the data you need, here's how to keep track of it all, here's how to articulate your thoughts, here's how to ask useful questions...." sense. More broadly, it should also teach you how to handle failure and remind you that you'll never know everything.

Abstracting that away, either by giving kids AI crutches or -- in my case -- handing them the teacher's textbook and telling them to figure it out, causes a lot of damage once they're out of the school bubble and have to solve big, knotty problems.

[–] CarbonIceDragon@pawb.social 4 points 1 month ago (1 children)

To be fair, understanding something well enough to automate it probably requires learning it in the first place. Obviously an AI that just tells you the answer isn't going to get you anywhere, but it sounds more like the user you were replying to was suggesting an AI limited enough that it couldn't really tell you the answer to something unless you yourself went through the effort of teaching it that concept first.

I'm not sure how doable that is in practice. My suspicion is that to actually be useful in that regard, the AI would have to be fairly advanced and just pretend not to understand a concept until adequately "taught" by the student, if only so it could tell whether it was taught accurately and tell the student they got it wrong and need to try again, rather than reinforce an incomplete or wrong understanding. There's also a risk that current AI used for this could be "tricked" by clever wording into revealing answers it's supposed to act like it doesn't know yet (on top of the existing issues with AI spitting out false information by making associations it shouldn't actually make). But if someone actually made such a thing successfully, I could see it helping with some subjects.

I'm reminded of my college physics professors, who would let my class bring a full page of notes and the class textbook to refer to during tests, on the reasoning that a person who didn't understand how to use the formulas in the text wouldn't be able to apply them anyway, while someone who did understand but misremembered a formula would be able to look it up again in the real world. These were by far some of the toughest tests I ever had. Half the credit also came from being given a copy of the test to redo as homework over the following week, where we were encouraged as a class to collaborate and teach each other how to solve the problems, again on the logic that explaining something to someone else helps teach the explainer too.

[–] brucethemoose@lemmy.world 1 points 1 month ago* (last edited 1 month ago)

You worded this much better than I could.

Yes, I was thinking in two directions:

  • A "smarter" AI, though a better term might be "customized": one specifically tailored to only help with knowledge the student has already "learned" in its context.

  • A "dumb" AI that's too unreliable for lazy ChatGPT-style answers, but can serve as a primitive assistant to bounce ideas off of, or to help with phrasing, wording, formatting, and basic tasks too onerous or trivial to ask a human for help with.

Not many people are familiar with the latter because, well, they only use ChatGPT in the cloud, but I already find small LLMs useful as a kind of autocomplete or sanity check when my brain is stuck (much like it was before my TI-84 BASIC program), and the experience is totally different because the response is instant (as the context is cached on your machine).

[–] captainlezbian@lemmy.world 5 points 1 month ago

Yeah, I got sick of manually processing my physics lab data in college. My TA had absolutely no problem with me handing in a Python script as my work instead of a bunch of handwritten formulas.
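(The actual script isn't shown, but that kind of lab automation can be trivially small. A hypothetical version, assuming made-up voltage/current readings and Ohm's law, just to show the shape of it:)

```python
# Hypothetical stand-in for a physics lab script: take raw
# (voltage, current) readings, compute resistance via Ohm's law
# R = V/I for each, then the mean, instead of grinding it out by hand.
readings = [(1.5, 0.30), (3.0, 0.61), (4.5, 0.89)]  # (volts, amps), made-up data

resistances = [v / i for v, i in readings]
mean_r = sum(resistances) / len(resistances)

for (v, i), r in zip(readings, resistances):
    print(f"V = {v:.2f} V, I = {i:.2f} A  ->  R = V/I = {r:.2f} ohm")
print(f"Mean resistance: {mean_r:.2f} ohm")
```

As with the TI-84 story, you can only write this if you already know which formula applies and why.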

But this is writing. The thing about writing is that it is the critical skill being taught here. Most classes that involve much writing see it as the crucial element. The student is being taught to gather information, process concepts, and effectively communicate reasonable conclusions from all of it in a way that others can understand. And ideally in a way that’s pleasant to read.

I get it, I fucking hated writing in school. I thought it was pointless and frustrating and that I’d never benefit from it. But it turned out to be one of the most critical skills I was taught. It made me an effective communicator and taught me to better organize my thoughts when attempting to express them, or to understand them. I struggle to think of a way any generative tool could take some of the load without taking a large portion of the lesson away from the student in the process.

[–] halcyoncmdr@lemmy.world 5 points 1 month ago

I wasn't really supposed to do that, but my teacher had no problem with it. I clearly understood the work, and it's not the same as just punching the equation into WolframAlpha.

This is the way it should be. If you created the program on your own, as opposed to copying it from elsewhere, you had to know how to do the work correctly in the first place. You've already demonstrated that you understand the process beyond just being able to solve a single equation. You then aren't wasting time "learning" something you've already learned just to finish an otherwise arbitrary number of problems.