We're Training Students to Write Worse to Prove They're Not Robots
- kayo_20211030 - 9102 seconds ago
Perhaps we should not grade students on weekly, or other occasional, writing during the term or semester.
How about going back to the old system where, apart from experimental lab work, nothing is graded until the end of the term?
All weekly assignments should just be considered prep for one exam at the end of the term where the student has an opportunity to demonstrate mastery of the course's subject matter. They can prepare as they wish, use AI, and even cheat on the homework, but there will be a revelation at the end of the term.
That final test can be proctored, monitored, and audited to ensure that the words used are indeed the student's own. The resulting grade depends on that, and that alone.
The approach of continuous assessment, which to me always seemed suspect and ripe for abuse, was completely broken by the AI tools that are now available.
- with - 6009 seconds ago
nobody's asking who profits from false positives. these AI detection vendors have a direct financial incentive to flag aggressively. more flags = "more value" = more school contracts renewed. same playbook as selling antivirus to your grandma. sell fear, charge per seat, and make the false positive rate someone else's problem.
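The base-rate arithmetic behind this incentive problem is worth making concrete. A minimal sketch with illustrative numbers (none of these rates come from any real detector; they are assumptions for the example):

```python
# Hedged sketch: illustrative numbers only, not measurements of a real product.
# Suppose 10% of 1000 submissions are AI-written, and a detector has a
# 90% true positive rate but also a 5% false positive rate.
honest = 900           # human-written submissions
cheating = 100         # AI-written submissions
tpr = 0.90             # fraction of AI-written work correctly flagged
fpr = 0.05             # fraction of human work wrongly flagged

true_flags = cheating * tpr        # 90 correct flags
false_flags = honest * fpr         # 45 honest students flagged
precision = true_flags / (true_flags + false_flags)
print(f"{false_flags:.0f} of {true_flags + false_flags:.0f} flags "
      f"({1 - precision:.0%}) hit honest students")
```

Even with a modest-sounding 5% false positive rate, a third of all flags land on honest students, and a vendor paid per flag has no reason to drive that number down.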
- Buttons840 - 10134 seconds ago
Tangent:
I've noticed I write a lot differently because of combative online arguments. I have a problem.
So much of my communication is directed to people who don't want to hear me or understand me. So I've become very punchy and repetitive, trying to hammer home ideas that people are either unable or unwilling to understand.
I need to find ways to talk to people who want to hear and understand me.
It's hard to find other people who actually want to hear and understand though. People have different interests, and even when people appear to be working towards the same goal, they often aren't; like a boss who just won't understand the bad news, because it's easier to ignore the problem.
- Zigurd - 7684 seconds ago
The core of the problem the article describes isn't AI or LLMs; it's scam software that claims to catch cheating. It's crap for the same reasons that crime prediction software is crap. It's selling a panacea, and that kind of product inherently attracts scammers.
If your school uses software to detect AI writing, that's a problem with the quality of your school. The people choosing that software are too stupid to be running a school. The software isn't going to get any better.
- Paracompact - 12354 seconds ago
Grade school has never been kind to genuine writers. It reminds me of SAT essays that favored formulaic writing, because guess what: the grading criteria were formulaic!
I think grading in general can be stymying for students' motivation and creative drives.
- softwaredoug - 10056 seconds ago
Maybe I’m less worried. Teachers seem to have adapted.
In my experience educators no longer use AI detectors given the risk of false positives. But some work is obviously lazy AI content. When that happens, educators talk to the student to see if they understand what they wrote.
Teachers cope with more in-person writing, oral presentations, and defense of what’s been written.
If you think about it, the pre-AI computing generation is itself anomalous for having ubiquitous access to efficient human-only writing tools. We probably wrote more than previous generations. Early Internet / blogging culture bears this out.
- zahlman - 10069 seconds ago
I object to the idea that the LLM writing these students are trying to distinguish themselves from is actually good in the first place. Although students might well end up writing worse because people are trusting the detection of LLM content to other LLMs. (And really, it's bizarre that these massively complex systems, required to produce roughly human-like output, apparently offer such simplistic reasoning for what they detect as non-human.)
Honestly, I lean towards shaming educators who do that. If you can't detect the whiff of LLM with your own senses, then it has been used properly and shouldn't be faulted. If that premise invalidates your assignment, change the assignment. It's not as if you're assigning this work to test the basic mechanics of writing (grammar, sentence/paragraph structure, parallelism, whatever) — I mean, how much of that did you consciously try to teach? My recollection is, not an awful lot; and I can only imagine it's gotten worse since I was in K-12 (and I went to pretty darn good K-12).
- etempleton - 8241 seconds ago
When I was in high school I was a better writer when I had time (versus in class), and generally a better writer than I was a student. The net result was fairly often being accused of plagiarism. Not because the teacher had proof (I never plagiarized), but because the teacher couldn’t believe I could write at the level I sometimes wrote at on take-home assignments. Admittedly, I was a wildly inconsistent student.
This reminds me a bit of that. AI writing is—in many ways—objectively very good, but that doesn’t matter if no one thinks you wrote it. AI writing is boring exactly because it is consistent and like any art form people want to see something original.
- jupp0r - 7359 seconds ago
Sounds like a great opportunity for kids in high school to learn how to feed the AI detection results back into the model and have this process be automated. Next level would be fine-tuning the model via reinforcement learning and sharing it with your friends via Hugging Face.
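The feedback loop described here is trivial to automate. A minimal sketch, where `detect_ai` and `paraphrase` are hypothetical placeholders for whatever detector score and LLM rewrite pass a student might wire up (neither is a real API):

```python
def adversarial_rewrite(text, detect_ai, paraphrase, threshold=0.5, max_rounds=5):
    """Rewrite `text` until the detector's AI score drops below `threshold`,
    giving up after `max_rounds` passes. Both callables are placeholders:
    `detect_ai` returns an assumed P(AI-written) in [0, 1], and
    `paraphrase` performs one LLM rewrite pass."""
    for _ in range(max_rounds):
        if detect_ai(text) < threshold:   # detector no longer flags it
            return text
        text = paraphrase(text)           # one more rewrite pass
    return text                           # best effort after max_rounds
```

Automating this turns every detector deployment into a free training signal for evading that same detector, which is the commenter's point.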
- teo_zero - 10181 seconds ago
One of the skills teachers have always demonstrated is the ability to detect when students copy. This has never pushed students to artificially add mistakes to their essays.
If teachers now abdicate this judgment to software, students should be allowed to abdicate their duties to a computer as well.
- carcabob - 9611 seconds ago
A few times in some Discord communities, I've been accused of being A.I. because of how I write. Kind of sad and a bit annoying. I also quite like em dashes, but have felt the need to reduce how much I use them.
Glad to see some schools and teachers teach how to use them well, rather than ban them outright.
- themafia - 12133 seconds ago
If you're just going to use software to judge the output of students, then why don't we all just keep them at home? I have a computer at home, and it seems like everyone from the teachers to the school board has just abdicated their responsibility. This doesn't sound like a system that needs to be maintained.
- Someone1234 - 11667 seconds ago
I've started doing this on social media. I got "called out" after using big words or using a - in a sentence. So now I write less good on purpose, so whatever I comment doesn't get drawn into a sidetracked, off-topic witch-hunt.
As soon as someone yells "witch" you cannot prove you're not one, and I've even had people put my handwritten comments through "AI detector" websites that "proved" they were AI (they weren't). It literally just highlighted two popular English phrases.
LLMs were trained on sites like HN and Reddit, so now if you write like a HN or Reddit commentator, you sound like AI...
- j45 - 11077 seconds ago
The more students read, and the more variety they read, the better they will write.
This will likely be valuable for AI skills too.
- throw73838 - 9629 seconds ago
> The assignment had been to write an essay about Kurt Vonnegut’s Harrison Bergeron—a story about a dystopian society that enforces “equality” by handicapping anyone who excels
Didn't this self-censorship process start decades ago? There are certain answers expected in academia; arguing for anything else would get you in trouble. Not using “devoid” seems like a pretty minor inconvenience.
For me the biggest wtf is why students are still expected to write graded essays, and to keep up the make-believe that it is somehow a useful and applicable skill.
- theptip - 10375 seconds ago
This is what terrifies me about the public school system. A revolution has occurred, but it’s unevenly distributed.
The schools simply don’t have the flexibility, agility, or frankly it seems motivation to adapt to what has already happened.
The ship has sailed; essay writing is no longer a viable form of assessment.
The idea to try to build a reliable AI detector is asinine, and fundamentally misunderstands how any of this works now, let alone the very obvious trend-lines.
Stop with the lazy, half-baked solutions, get your head out of the sand, and rethink the whole curriculum. This is an emergency; we needed to be urgently attending to it years ago.
- jmyeet - 9795 seconds ago
The profit motive is corrupting and polluting every level of the education space.
Teachers are being hamstrung on curriculum. The districts enter into contracts that require the use of certain programs for certain amounts of time. We've known for decades (if not a century) that direct instruction works [1] but you can't sell devices, platforms and consulting services that way.
We're literally at the point in education we were at in the 1950s, when ads touting the health benefits of nicotine for your T-zone were lighting up the airwaves.
And generative AI means it's all but impossible to have take-home writing assignments. But hey, this is another opportunity to sell AI or cheating detection software that's often just an em-dash detector [2].
We have a generation that gets to college quite possibly never having read a book, thanks to social promotion through grades and the constant distraction of electronic devices in classroom settings. I don't even necessarily blame the parents entirely, either, because we've constructed a society where two people need five jobs to make ends meet.
And while all this is going on, we have a coordinated and well-funded effort to defund public education and move government funds to private schools, based on the failures of a public education system that's failing because we defunded it. This is usually backed up by some baloney study showing charter schools produce better results, which really comes down to charter schools being able to be selective with enrollments while public schools cannot be. Plus, special education kids are mingled into public education classrooms because those programs got defunded too.
And really, that's just a bunch of already affluent people who want a tax break for doing something they were going to do anyway: send their kids to private schools so they don't have to mingle with the poors and aren't taught inconvenient things like human reproduction, critical thinking, and self-determination.
And after all of that we just end up teaching kids how to pass standardized tests.
[1]: https://marginalrevolution.com/marginalrevolution/2018/02/di...
[2]: https://medium.com/@brentcsutoras/the-em-dash-dilemma-how-a-...
- botbotfromuk - 6994 seconds ago
[dead]
- noemit - 11877 seconds ago
What would assessment look like if we started from "how do humans actually learn and communicate" rather than "how do we catch cheaters"?
- dawatchusay - 10603 seconds ago
Did they not even test their AI detection tool to verify that it can detect when something is human-written? That should have been exactly as important as the opposite. Maybe a tool that checked that would be equally ineffective, and we’d move on from the subject entirely.