It feels like the negative impact AI will have on young people is far greater than what social media has already done in the past 15 years.
The big tech companies of today may be regarded in generations to come like the factories a hundred years ago that had kids working in dangerous conditions. Except this time the kids aren’t making a product, they are the product.
Maybe, but at the very least, this time we seem to be a lot more aware of it, and lots of schools are grappling with how to adapt in this age. We definitely have a lot to figure out still, and there will unfortunately be a lot of broken eggs in this omelette, but at least it's not something everybody is just sleeping on (this time around).
I also think it's a little too easy to throw the baby out with the bathwater. For students who are genuinely curious, AI can be a major facilitator of learning. I've seen several students make massive gains in subjects when they can converse with AI and test their knowledge. The key difference is motivation/incentive. If a student just wants to check the homework box and be done, AI is a major net negative. It doesn't have to be, though (hallucination problems aside).
> this time we seem to be a lot more aware of it
I dunno mate. From my perspective, Facebook, Snapchat, Instagram et al. are all late entries in this game. I recall 'everyone' being very aware of this in the mid-aughts as well. But these newer Big Tech networks rode the wave of the mobile internet, where the phone became the primary, and in most cases only, device people use. This was the age of scaling. When the critical voices are drowned out a few times over, the message gets lost. Speaking of Eternal September...

Now everybody is hooked to the Big Tech firehose, so voices critical of AI can be drowned out at will. I have a hard time believing this round will be different, especially since it is the stated goal of Big Tech's financiers to take control like this. This isn't just happening...
The problem is we're not grappling with the underlying issue which is the ability of big companies and rich individuals to spew slop at the world. Crafting a policy specifically to address AI will just result in some other form of harmful junk being created and foisted upon us.
yeah man, children being permanently maimed in 19th-century factories during 20-hour shifts, children using chatbots and subsequently (and therefore causally) getting bad grades... who can even begin to tell the difference??
A hypothetical scenario, but what if the consequences are a severe drop in overall intelligence? These tools create an inflated sense of "knowing."
AI that teaches you? Cool, it is great you learned. But generative AI giving you full essays you can hand in without even learning a topic? It's not difficult to see how this could have negative impacts.
Right. People who rely on AI to do everything, quite literally don’t know what they don’t know.
It's muscle: use it or lose it.
Yeah, bad grades will be the only consequence. Of course that’s what I meant.
I would be shocked if oral exams didn't make a comeback across all levels of education soon.
Exams are just a good idea. We sacrifice a huge amount in the name of "some people get overly stressed about exams", when really we just need to accept that some people can't cope with very much stress and will probably get a fairly stress-free job, like burger-flipper or project coordinator or VP of HR.
It really doesn’t make sense to waste tons of potential in order to preserve uncomfortable, high stress environments and systems. Seems like a loss for the sake of a loss.
It's not to preserve them. It's to preserve being pretty sure that someone knows what their qualification claims they know.
Are there methods we could use which avoid filtering out people for lacking skills unrelated to those they’ll deploy after graduation? Certainly I’ve seen examples of simple changes that do a lot of good, like single user exam spaces (with a camera). Even basic changes like that can seriously improve exams, but often require resource allocation. If people think things like, “exams being stressful is good, these students just need to toughen up,” then we won’t allocate those resources and we’ll miss out on a good investment.
The school I work with is considering a return to handwritten papers, and the more I dug into the topic, the more I think it makes sense.
I made a text editor that detects whether something has been AI-generated by analyzing the keystrokes and edit history: www.collie.ink
Haven’t really told anyone about it so no one is using it, but I think it’s a good idea!
Very interesting! I can see how useful it would be, though I’m hoping the adversarial “AI learns how a specific individual types” and just dumps its output in there doesn’t become a thing.
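For anyone curious how a keystroke-based check like that might work, here's a minimal sketch in Python. To be clear, the `Edit` shape, the thresholds, and both heuristics are my own guesses about the general approach, not how collie.ink actually does it: flag a session if a single edit dumps in a large block of text (a paste), or if the typing cadence is implausibly uniform (scripted "typing").

```python
from dataclasses import dataclass
from statistics import mean, pstdev

@dataclass
class Edit:
    t: float      # timestamp of the edit, in seconds
    chars: int    # number of characters inserted by this edit

def looks_pasted(edits, paste_threshold=80, min_cv=0.25):
    """Heuristic sketch (thresholds are illustrative, not tuned):
    - any single edit inserting a big block of text suggests a paste;
    - inter-edit gaps with a very low coefficient of variation suggest
      machine-uniform "typing" rather than a human's bursty cadence."""
    if any(e.chars >= paste_threshold for e in edits):
        return True
    gaps = [b.t - a.t for a, b in zip(edits, edits[1:])]
    if len(gaps) >= 5:  # need a few samples before judging cadence
        m = mean(gaps)
        if m > 0 and pstdev(gaps) / m < min_cv:
            return True
    return False
```

A real detector would obviously need much more signal (revision patterns, deletions, per-user baselines), and as the parent notes, an adversary replaying human-like keystrokes defeats simple versions of this.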
What's so hard about hand copying to paper the essay the AI wrote?
Presumably this would also be joined with in-person proctoring, with a blue book, or its equivalent, and a writing utensil. Not much room to copy down an AI-hallucinated answer when all you have on your person is pen and paper.
that's all well and good for a subset of essays, but there are others, like research reports, that take much longer than two hours to complete.
Pair it with an in-depth oral examination based on the report. Hell, even have AI conduct the oral examination, since it can read the report thoroughly much quicker than any teacher or TA and draft good questions. The teacher can grade for correctness and whatever other factors they already grade for.
can an LLM do citations and a bibliography without hallucinating? This seems pretty easy to verify and hard to cheat on.
Have you tried Gemini Deep Research?
Can’t I just do it on a computer and then copy it over?
Yes for homework papers you could, though you'll still be learning (and often retaining) a lot in the process of handwriting a copy :-D
There's also classroom time being given to students for their writing, with no computers available (or with just monitored access with much AI stuff blocked)
Why not pencil-and-paper? Oral is vastly less efficient, and also problematic for ESL and many other types of students.
Bring back cursive!
I remember when I was a kid people would just copy other people’s essays and just reword the sentences. So people who want to cheat will figure out ways to cheat on homework, etc. AI is just the new vehicle.
I think grades should be based 90% on exams, and 10% or less on homework. Homework should be practice not what your grades are based on.
I am all for in-person written and/or oral exams. We still do it for PhD classes. But we don't have people to conduct oral exams for large classes like in undergrad or even some masters level courses. In my university, even getting a TA is challenging for many faculty due to cost cutting.