• 2pEXgD0fZ5cF an hour ago

    I don't see the whole AI topic as a large crisis. As others have mentioned: put more emphasis on in-person tests and exams. Make it clear that homework assignments are for practice, learning, and feedback. If a person thinks that copy/pasting helps them, give them the freedom to do so, but if as a result they fail the exams and similar in-person evaluations, then so be it. Let them fail.

    I would like to hire students who actually have skills and know their material. Or, even better: if AI is actually the amazing learning tool many claim, then it should enhance their learning and, as a result, help them succeed in tests without any AI assistance. If they can't, then clearly AI was a detriment to them and their learning, and they lack the ability to think critically about their own abilities.

    If everyone is supposed to use AI anyway, why should I ever prefer a candidate who is not able to do anything without AI assistance over someone who can? And if you actually hold the opinion that proper AI-independent knowledge is not required, then why should I hire a student at all instead of buying software solutions from AI companies (and maybe putting a random person without a relevant degree in front of them)?

    • apatheticonion an hour ago

      It's a huge problem. I have several friends in university who have had assignments flagged as AI. They have had entire units failed and been forced to retake semesters, which is not cheap.

      Even if you fight it, the challenge drags into the next semester, pushing out your study timeline and adding to the associated costs.

      > put more emphasis on in-person tests and exams. Make it clear that homework assignments are for practice, learning, and feedback. If a person thinks that copy/pasting helps them

      Works for high school, not so much for university degrees. What's crazy is that universities have an incentive to flag your work as AI-generated, since it forces the student to pay more money and is difficult to challenge.

      One friend now uses a dashcam to record themselves when writing an assignment so they can prove no AI was used when they are eventually flagged.

      • 2pEXgD0fZ5cF an hour ago

        Yeah, bad choice of words on my part, I apologize. I can imagine that things are pretty chaotic right now and that there are quite a few problems like the one you describe. When I said I don't see a crisis here, I meant it in a more overarching sense: I see this as solvable.

        > Works for high school, not so much for university degrees.

        I don't know about that. I can't speak for the US, but at the university where I got my degrees (Math & CS) and later worked, prerequisite in-person tests to be allowed to take a given exam were not rare. Most modules had lectures (professor), tutorials (voluntary in-person bonus exercises, with tutors to ask questions), and exercise groups where solutions to mandatory exercises were discussed. In the latter, an additional exam requirement was sometimes to present and explain a solution at least once or twice over the course of the semester. And some modules had small, mandatory bi-weekly tests as part of the requirements too.

        Obviously I can understand that this would not work equally well in each kind of academic programme.

        • apatheticonion 28 minutes ago

          > Yeah bad choice of words on my part, I apologize.

          All good!

          > I can't speak for the US

          I just had to respond to this as the implication of being American touched a nerve, haha. Australian here.

    • threemux an hour ago

      In-person, proctored blue book exams are back! Sharpen those pencils, kids.

      I've been wondering lately if one of the good things to come out of heavy LLM use will be a return to mostly in-person interactions once nothing that happens online is trustworthy anymore.

      • bambax an hour ago

        Yes! This "problem" is really easy to fix with in-person exams and no computers in class, ever.

        • meroes an hour ago

          This is the “back to office” of education. It is not a one-size-fits-all solution. There are so many remote and hybrid classes now that you guys sound outdated.

          • analog31 an hour ago

            That’s fair, but at the same time, expecting any learning to occur in remote classes, when fair evaluation is impossible, may also be outdated.

            • kbelder an hour ago

              Learning is just as easy remotely and with AI, maybe easier. It's the testing and evaluation of that learning that's difficult.

              Universities make money not by teaching, but by testing and certifying. That's why AI is so disruptive in that space.

              • analog31 18 minutes ago

                Universities don’t make money.

                Granted, I’m 62, so I’m from the old world. I attended college, and taught a couple of college classes, before the AI revolution. There was definitely a connection between learning and evaluation for most students. In fact, most students preferred more evaluation, not less, such as graded quizzes and homework rather than just one great big exam at the end. Among other things, the deadlines and feedback helped them budget their efforts. Also, the exercise of getting something right and hitting a deadline is not an overt purpose of education, but it has a certain pragmatic value.

                Again, showing my age: in the pre-AI era, the technology of choice was cheating. But vanishingly few of the students who used cheating to circumvent the evaluations actually learned anything from their courses.

                If teaching and certifying could be separated, they would be. In fact, it has happened to some extent for computer programming, hence the “coding interview” and so forth. But computer programming is also an unusual occupation in that it’s easy to be self-taught, and it’s questionable whether it needs to be taught at the college level.

          • seanmcdirmid an hour ago

            Until they need to start learning how to use them to get a job in the modern world?

              There should be a class that teaches you how to use AI to get things done, especially judging by how many people even on HN admit they aren’t good at it.

            • idle_zealot an hour ago

              But is that webscale?

          • gotrythis an hour ago

              I put a day of careful thought into writing a cover letter for a job a few weeks ago. Knowing there was the potential of AI screening, I checked whether it would get flagged.

            Every detection program I tried said the letter that I personally wrote by hand was 100% AI generated!

              So, I looked for humanizer programs and ran my cover letter through a couple. Without the results in front of me at the moment, I can only offer my judgemental conclusion instead of solid observations...

            You need to write like an idiot to pass AI detection algorithms. The rewritten cover letter was awful, unprofessional, and embarrassing.

            • OsamaJaber 2 hours ago

                AI detectors punishing non-native English speakers for writing too cleanly is the part nobody talks about enough -_-

              • Rexxar an hour ago

                For example, native English speakers often make phonetic spelling errors (such as its/it’s, your/you’re) that non-native English speakers usually avoid. It’s probably a sign that someone speaks more fluently when he starts making these types of mistakes from time to time.

                • Tade0 an hour ago

                  Or picked up English before they learned to read and write properly.

                  I'm cursed with this as I was put in an international environment right before turning five, went back to my home country to start grade school and only in fifth grade started having English classes.

              • HeavyStorm 2 hours ago

                This ship has sailed.

                  It's how it was with the internet. I grew up in the 90s, and teachers didn't know how to deal with the fact that we no longer had to go through multiple books in the library to get the information we needed. We barely needed to write it down ourselves.

                  Now nobody expects students to not use the internet. Same here: teachers must accept that AI can and will write papers, answer questions, and do homework. How you test students must be reinvented.

                • xeromal 2 hours ago

                    I know a lot of teachers are reverting back to handwritten papers. People can still generate it, but at least you're doing something.

                  • randall 2 hours ago

                    this is like irl cryptographic signatures for content lol

                    • idiotsecant 2 hours ago

                        This is just about the worst possible response, it seems. It manages to probably hurt some wrists not used to long handwriting sessions, completely avoid learning how to use and attribute AI responsibly, and still probably just result in kids handwriting AI-generated slop anyway.

                      • singpolyma3 2 hours ago

                          While I don't think it's the right solution, it will force them to at least read what they're submitting, which means some learning :)

                        • ethin an hour ago

                            It also disadvantages people with disabilities. How exactly are they supposed to do these papers and tests? Dictate everything to someone else, using blindness as an example? Because that seems very, very inefficient and extremely error-prone.

                          • ThrowawayR2 an hour ago

                              As someone with an actual visual impairment, please do not attempt to use my affliction to justify generalized use of AI. Educational assistance for those with disabilities is not a new thing; AI is likely going to have a role, but exactly how remains to be seen.

                          • BugsJustFindMe 2 hours ago

                            > It manages to probably hurt some wrists not used to long handwriting sessions

                            I'm sorry but, lmao. You cannot be serious.

                            > attribute AI

                            Oh no!

                            > still probably just results in kids handwriting AI generated slop

                            Not if they're doing it in person. And at least they then need to look at it.

                            • jxf an hour ago

                              We've been writing with our hands for thousands of years. I suspect that on balance a Butlerian Jihad against AI slop would be perfectly fine for our hands.

                          • smoyer 2 hours ago

                              When I was in high school, we were not allowed to use a calculator for most science classes ... And certainly not for math class. In ten years, will you want to hire a student who is coming out of college without considerable experience and practice with AI?

                            • AlotOfReading an hour ago

                              LLMs work best when the user has considerable domain knowledge of their own that they can use to guide the LLM. I don't think it's impossible to develop that experience if you've only used LLMs, but it requires a very unusual level of personal discipline. I wouldn't bet on a random new grad having that. Whereas it's pretty easy to teach people to use LLMs.

                              • paulryanrogers 19 minutes ago

                                Kids need to learn the fundamentals first and best. They can learn the tools near the end of school or even on the job.

                                I loved computer art and did as many technical art classes at university as I could. At the beginning of the program I was the fastest in the class, because we were given reference art to work from to learn the tools. By the end of the class I couldn't finish assignments because I wasn't creative enough to work from scratch. Ultimately I realized art wasn't my calling, despite some initial success.

                                Other kids blew me away with the speed of their creations. And how they could detach emotionally from any one piece, to move on to the next.

                                • nkrisc an hour ago

                                  If all they know is AI, and they supplanted all their learning with AI, why even hire them? Just use the AI.

                                  • hazbot 32 minutes ago

                                    Yes, it is much easier to train someone to use AI than to train them to have sufficiently baked-in math and language skills to be able to leverage the AI.

                                    • ThrowawayR2 an hour ago

                                      Should I, by some miracle, be hiring, I'd be hiring those who come out of college with a solid education. As many have pointed out, AI is not immune to the "garbage in, garbage out" principle and it's education that enables the user to ask informed and precisely worded questions to the AI to get usable output instead of slop.

                                      • croes an hour ago

                                          Why would I want to hire such a student? What makes him the better pick than all the other students using AI or all the non-students using AI?

                                      • croes an hour ago

                                        This is not how AI will be in the future.

                                          At some point they will have to make a profit, and that will shape AI.

                                          Either by higher prices or by ads. Both will change how AI is used.

                                        • AndrewKemendo 2 hours ago

                                          I remember when websites couldn’t be considered valid sources for graded assignments

                                          • MattGaiser 2 hours ago

                                            I was dealing with this even in 2014 when I was in high school. Even then, entire classes of government data weren’t published in some print volume.

                                            • AndrewKemendo an hour ago

                                              In my case at least there was some validity to it in 1995

                                        • zkmon 2 hours ago

                                              Teachers are also heavy users of AI. The entire academic business staff is using AI.

                                              The goals of academic assessment need to change. What are they assessing, and why? Knowledge retention skills? Knowledge correlation or knowledge expression skills? None of these are going to be useful or required from humans, just like mental arithmetic stopped being required once school kids were allowed to use calculators in the exam halls.

                                              The academic industry needs to redefine its purpose: identify the human abilities that will be needed for a future filled with AI and devices, then teach that and assess that.

                                          • Espressosaurus an hour ago

                                            A calculator is more consistent and faster at calculating than I am, but I still need to understand how to multiply, divide, add, and subtract before I can move on to more complicated math. I need to intuitively understand when I'm getting a garbage result because I did an operation wrong, moved a decimal place by accident, or other problem.

                                            Memorization has a place, and is a requirement for having a large enough knowledge base that you can start synthesizing from different sources and determining when one source is saying something that is contradicted by what should be common knowledge.

                                            Unless your vision of the future is the humans in WALL-E sitting in chairs while watching screens without ever producing anything, you should care about education.

                                            • zkmon 40 minutes ago

                                              > A calculator is more consistent and faster at calculating than I am, but I still need to understand how to multiply, divide, add, and subtract..

                                                  Exactly. If the calculator knows what to do and how to do it, you just need to be able to specify a high-level goal, instead of worrying about whether to add or multiply.

                                            • kyykky an hour ago

                                              Teaching is about moving our knowledge (the stuff we’ve collectively learned ourselves, from others and our parents [instead of everyone needing to find out on their own]) to the next generation. While some skills may become obsolete in some parts of professional life due to AI, the purpose of academia does not change much.

                                              • zkmon 36 minutes ago

                                                > the purpose of academia does not change much.

                                                    The purpose did change a lot. In the Greeks' time, the purpose was pure knowledge or geometry skills. The industrial revolution and office work changed the purpose of education to producing clerical staff. With AI, the purpose changes again: you need skills in using AI and devices.

                                              • ThrowawayR2 an hour ago

                                                > "Knowledge retention skills? Knowledge correlations or knowledge expression skills? None of these going to be useful or required from humans."

                                                I'm fascinated by these claims from some LLM advocates that people will no longer need to know things, think, or express themselves properly. What value then will such individuals bring to the table to justify their pay? Will they be like Sigourney Weaver's character in Galaxy Quest whose sole function was to repeat what the computer says verbatim? Will they be like Tom Smykowski in Office Space indignantly saying "I have people skills; I am good at dealing with people! Can't you understand that?!" Somebody, please explain.

                                                [EDIT] The other funny aspect about these claims is, given that such an individual's skills are mainly in using an AI, that they can simply be outspent by their peers on AI usage. "Wally got the job instead of me because he paid for a premium LLM to massage his application and I could only afford the basic one because I'm short on money."

                                                • zkmon 35 minutes ago

                                                  > What value then will they bring to the table to justify their pay?

                                                  They are skillful in shepherding a population of AI agents towards goals.

                                                • croes an hour ago

                                                      Teachers are also heavy users of solution books, but that is no reason to give them to students.

                                                • ashleyn 3 hours ago

                                                        I'm guessing this "humanizer" actually does two things (rough sketch below):

                                                        * grep to remove em dashes and emojis

                                                        * re-run through another LLM with a prompt to remove excessive sycophancy and invalid URL citations
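
                                                        Purely as a guess at the mechanics, a two-pass pipeline like that might look something like this. Everything below is hypothetical: the regex is only a crude emoji filter, and the rewrite step assumes an OpenAI-compatible client with a placeholder model name.

                                                          import re
                                                          from openai import OpenAI  # assumption: an OpenAI-compatible client is available

                                                          EMOJI = re.compile(r"[\U0001F300-\U0001FAFF\u2600-\u27BF]")  # crude emoji ranges

                                                          def humanize(text: str) -> str:
                                                              # pass 1: character-level cleanup (em dashes, emojis)
                                                              text = text.replace("\u2014", ", ")  # em dash -> comma
                                                              text = EMOJI.sub("", text)
                                                              # pass 2: rewrite in a plainer voice with another model
                                                              client = OpenAI()
                                                              resp = client.chat.completions.create(
                                                                  model="gpt-4o-mini",  # placeholder model name
                                                                  messages=[
                                                                      {"role": "system", "content": "Rewrite in a plain, terse student voice; drop flattery and unverifiable citations."},
                                                                      {"role": "user", "content": text},
                                                                  ],
                                                              )
                                                              return resp.choices[0].message.content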

                                                  • emmp 2 hours ago

                                                          For student assignment cheating, really only the em dashes would still be in the output. But there are specific words and turns of phrase, specific constructions (e.g., 'it's not just x, but y'), and commonly used word choices. Really it's just a prim and proper corporate press release style voice -- this is not a usual university student's writing voice. I'm actually quite sure that you'd be able to easily pick out a first-pass AI-generated student assignment with em dashes removed from a set of legitimate assignments, especially if you are a native English speaker. You may not be able to systematically explain it, but your native speaker intuition can do it surprisingly well.

                                                    What AI detectors have largely done is try to formalize that intuition. They do work pretty well on simple adversaries (so basically, the most lazy student), but a more sophisticated user will do first, second, third passes to change the voice.

                                                    • dbg31415 2 hours ago

                                                      You’re absolutely right!

                                                      Ha. Every time an AI passionately agrees with me, after I’ve given it criticism, I’m always 10x more skeptical of the quality of the work.

                                                      • glitchcrab 2 hours ago

                                                        Why? The AI is just regurgitating tokens (including the sycophancy). Don't anthropomorphise it.

                                                        • 20260126032624 2 hours ago

                                                          Because of the way regurgitation works. "You're absolutely right" primes the next tokens to treat whatever preceded that as gospel truth, leaving no room for critical approaches.

                                                          • otikik an hour ago

                                                            Because I was only 55% sure my comment was correct and the AI made it sound like it was the revelation of the century

                                                        • the_fall 2 hours ago

                                                          No. No one is looking for em-dashes, except for some bozos on the internet. The "default voice" of all mainstream LLMs can be easily detected by looking at the statistical distribution of word / token sequences. AI detector tools work and have very low false negatives. They have some small percentage of false positives because a small percentage of humans pick up the same writing habits, but that's not relevant here.

                                                          The "humanizer" filters will typically just use an LLM prompted to rewrite the text in another voice (which can be as simple as "you're a person in <profession X> from <region Y> who prefers to write tersely"), or specifically flag the problematic word sequences and ask an LLM to rephrase.

                                                          They most certainly don't improve the "correctness" and don't verify references, though.
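
                                                            To make the statistical-distribution point concrete, here's a crude sketch of one such signal: text that an open language model finds unusually predictable (low perplexity) tends to look machine-generated. Real detectors combine many more features than this; GPT-2 is used below only because it's small and freely available.

                                                              import torch
                                                              from transformers import GPT2LMHeadModel, GPT2TokenizerFast

                                                              tok = GPT2TokenizerFast.from_pretrained("gpt2")
                                                              model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

                                                              def perplexity(text: str) -> float:
                                                                  # average per-token surprise under GPT-2; lower = closer to the "default voice"
                                                                  ids = tok(text, return_tensors="pt").input_ids
                                                                  with torch.no_grad():
                                                                      loss = model(ids, labels=ids).loss  # mean cross-entropy over tokens
                                                                  return torch.exp(loss).item()

                                                              # a naive classifier would flag texts whose perplexity falls below some threshold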

                                                          • smrtinsert an hour ago

                                                            providers are also adding hidden characters and attempting to watermark if memory serves.

                                                            • the_fall 32 minutes ago

                                                                It's more complex than that. It's called SynthID-text, and it biases the probabilities of token generation in a way that can be recovered down the line.
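
                                                                For the curious, here's a toy sketch of the simpler "green list" watermarking idea from the literature (not SynthID's actual tournament-sampling scheme): a pseudo-random subset of the vocabulary, derived from the previous token, gets its probability boosted during generation, and a detector later checks whether suspiciously many tokens landed in that subset.

                                                                  import hashlib, random

                                                                  VOCAB = [f"tok{i}" for i in range(1000)]  # toy vocabulary

                                                                  def green_set(prev_token: str, frac: float = 0.5) -> set:
                                                                      # derive the boosted subset deterministically from the previous token
                                                                      seed = int(hashlib.sha256(prev_token.encode()).hexdigest(), 16) % (2**32)
                                                                      return set(random.Random(seed).sample(VOCAB, int(len(VOCAB) * frac)))

                                                                  def sample_watermarked(prev_token: str, probs: list, bias: float = 2.0) -> str:
                                                                      # boost the probability mass of "green" tokens before sampling the next token
                                                                      green = green_set(prev_token)
                                                                      weights = [p * (bias if t in green else 1.0) for t, p in zip(VOCAB, probs)]
                                                                      return random.choices(VOCAB, weights=weights, k=1)[0]

                                                                  def green_fraction(tokens: list) -> float:
                                                                      # detector: unwatermarked text hovers near 0.5, watermarked text sits well above
                                                                      hits = sum(t in green_set(prev) for prev, t in zip(tokens, tokens[1:]))
                                                                      return hits / max(len(tokens) - 1, 1)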

                                                        • grahamburger an hour ago

                                                              I've heard some teachers are assigning their students to 'grade' a paper written by an LLM. The students use an LLM to generate a paper on the topic, print it out, then notate in the margins by hand where it's right and wrong, including checking the sources.

                                                          • postepowanieadm 2 hours ago

                                                            The Washing-Machine Tragedy was a prophecy.

                                                            • falloutx 2 hours ago

                                                              At some point, writing 2 sentences by hand will become more acceptable than this.

                                                              • vyskocilm 16 minutes ago

                                                                    We were banned from using Ami Pro for writing technical reports at my high school in the late 90s and required to write them by hand, as an attempt to combat copy&pasting. We "shared" all the calculations in the class anyway, so it didn't work well.

                                                                • pinnochio 2 hours ago

                                                                  Shortly after, AI-powered prosthetic hands that mimic your handwriting will write those 2 sentences for you.

                                                                • yarrowy 2 hours ago

                                                                      Just move to 2-hour in-class writing blocks.

                                                                  • mc32 2 hours ago

                                                                        I get that students are using the LLM crutch - and who wouldn’t?

                                                                        What I don’t get is why they wouldn’t act like an editor and add their own voice to the writing. The heavy lifting was done; now you just have to polish it by hand. Is that too hard to do?

                                                                    • BugsJustFindMe 2 hours ago

                                                                          Humans tend to be both lazy and stupid and are always looking for ways to get by with minimal effort. Kids aren't different just because they're in school.

                                                                      • MattGaiser an hour ago

                                                                        It would be dull to do. Being a tone scribe would be terrible.

                                                                      • tgrowazay 2 hours ago

                                                                          Everyone knows about em dashes, but there is so much more!

                                                                        Here is a wiki article with all common tell-tales of AI writing: https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing

                                                                        • jurgenaut23 2 hours ago

                                                                          I used em dashes heavily 15 years ago when writing my PhD thesis.

                                                                          • singpolyma3 2 hours ago

                                                                            So did every author of classic literature. People who think they can spot AI writing by simple stylistic indicators alone are fooling themselves and hurting real human authors

                                                                            • A_D_E_P_T an hour ago

                                                                              It's because LLMs were trained on classic literature that they began to use em-dashes in their now-famous manner.

                                                                              Seriously, highbrow literature is heavily weighted in their training data. (But the rest is Reddit, etc.) This really explains a lot, I think.

                                                                              • zeroonetwothree an hour ago

                                                                                  Let’s just say that when my coworkers, who had never done that before, started sending emails with tons of bold and bullet points, I felt pretty justified in assuming they used AI.

                                                                              • SecretDreams 2 hours ago

                                                                                Same, but 5 years ago. Now they're ruined for me lol.

                                                                              • kbelder an hour ago

                                                                                My fear is people will actually take that article to heart, and begin accusing people of posting AI simply for using all sorts of completely valid phrases in their writing. None of those AI tells originated with AI.

                                                                                • AstroBen 2 hours ago

                                                                              I saw that someone created a skill to weaponize that exact list to humanize the AI's output.

                                                                              There are no clear signs, at least not for anyone who cares to hide them.