Looks like it is happening
- Chinjut - 3902 seconds ago
Note the following comment by Jerry Ling: "The effect goes away if you search properly using the original submission date instead of the most recent submission date. By using most recent submission date, your analysis is biased because we’re so close to the beginning of 2026 so ofc we will see a peak that’s just people who have recently modified their submission."
- sixtyj - 7075 seconds ago
Well… it is happening. You can’t put spilled milk back in the bottle. You can add future requirements that try to stop this behaviour.
E.g., the submission form could include a mandatory field: “I hereby confirm that I wrote the paper personally.” The terms would note that violating this rule can lead to a temporary or permanent ban of the authors. In a world where research success is measured by points in WOS, this could slow the rise of LLM-generated papers.
- wmf - 7184 seconds ago
I assume hep = high energy physics in this context. PI = professor who received a government grant.
Peer review has never really been blind and I suspect PIs will reject papers from "outsiders" even if they are higher quality. This already happens to some extent today when the stakes are lower.
- dang - 5972 seconds ago
> submission numbers in the last couple months have nearly doubled with respect to the stable numbers of previous years
This is showing up (no pun intended) on HN as well. The # of submissions and # of submitters, which traditionally had been surprisingly stable—fluctuating within a fixed range for well over 10 years—have recently been reaching all-time highs. Not double, though...yet.
- 8organicbits - 4549 seconds ago
> when AI agents started being able to write papers indistinguishable in quality from [...]
Given that arXiv lacks peer review, I'm not clear what quality bar is being referenced here.
- zoogeny - 6837 seconds ago
One thing I have been guilty of, even though I am an AI maximalist, is asking the question: "If AI is so good, why don't we see X?", where X might be (in the context of vibe coding) the next redis, nginx, sqlite, or even linux.
But I really have to remember, we are at the leading edge here. Things take time. There is an opening (generation) and a closing (discernment). Perhaps AI will first generate a huge amount of noise and then whittle it down to the useful signal.
If that view is correct, then this is solid evidence of the amplification of possibility. People will decry the increase of noise, perhaps feeling swamped by it. But the next phase will be separating the wheat from the chaff. It is only in that second phase that we will really know the potential impact.
- general_reveal - 6014 seconds ago
“And further, by these, my son, be admonished: of making many books there is no end; and much study is a weariness of the flesh.” - Ecclesiastes 12:12 (KJV)
I suppose we’re entering TURBO mode for ‘of making many books there is no end’.
- tombert - 3106 seconds ago
I like AI, I use Codex and ChatGPT like most people do, but I have to say that I am pretty tired of low-effort crap taking over everything, particularly YouTube.
There have always been content mills, but there was still some cost to producing the low-effort "Top 10" or "Iceberg Examination" videos. Now I will turn on a video about any topic, watch it for three minutes, immediately get a kind of uncanny vibe, and then the AI voice will make a pronunciation mistake (e.g. confusing "wind" as in the weather effect with "wind" as in the winding of a spring), or the script starts getting redundant or repetitive in ways that are common with AI.
And I suspect these kinds of videos will become more common as time goes on. The cost of producing these videos is getting close to "free", meaning that it doesn't take much to make a profit on them, even if their views are relatively low per-video.
If AI has taught me anything, it's that there still is no substitute for effort. I'm sure AI is used in plenty of places where I don't notice it, because the people who used it still put in the effort to make a good product. There are people who don't just write a prompt like "make me a fifteen minute video about Chris Chan" and "generate me a thumbnail with Chris Chan with the caption 'he's gone too far'", but instead use AI as a tool to make something neat.
Genuine effort is hard, and rare, and these AI videos can give the facsimile of something that prior to 2023 was high effort. I hate it.
- sealeck - 7289 seconds ago
There are many really excellent papers out there - the kind which will save you hours/months of work (or even make things that were previously unviable to build viable).
That said, it is amazing how terrible a lot of papers are; people are pressured to publish and therefore seem to get into weird ruts trying to do what they think will be published, rather than what is intellectually interesting...
- hhsuey - 2331 seconds ago
What's happening? I hate clickbait titles like these.
- bitbytebane - 3143 seconds ago
STOP CITING YOUTUBERS AS A CREDIBLE SOURCE OF ANYTHING.
- mianos - 2746 seconds ago
This title should have been editorialised. It's like a headline from the Daily Mirror.
- hmokiguess - 5767 seconds ago
I think this is solid proof that the bedrock of academia is deeply motivated by money and still defaults to optimizing where it impacts its bottom line. If professors can get more grants and more publications in less time with less spending, of course they are going to be doing that. This isn't just because of AI, but also because of how this system is designed in the first place.
- pavel_lishin - 5172 seconds ago
Apparently "hep-th" stands for "High Energy Physics - Theory".
- guerrilla - 5358 seconds ago
Website's down. What was it about?
- sidrag22 - 6384 seconds ago
Noise is going to be the coming years' biggest issue for so many fields. It's a losing battle, like arguing with a conspiracy-minded relative: you can slowly and clearly address one conspiracy and disprove it, but by the time you do, they are deep into 8 new ones.
- mclau153 - 6931 seconds ago
What is happening?
- selridge - 6838 seconds ago
Honestly, this is good. We were already in a completely unsustainable system. Nobody had an alternative. We still don't have one, but at least now it's not merely unsustainable; it is completely fucked in half.
This kind of pattern is gonna get repeated in a lot of sectors as previous practices that were merely unsustainable become unsustained.
- ModernMech - 4882 seconds ago
I mean... I dunno, I wish the AI could write my papers. I ask it to and it's just bad. The research models return research that doesn't look anything like the research I do on my own -- half of it is wrong, the rest is shallow, and it's hardly comprehensive despite having access to everything (it will fail to find things unless you specifically prompt for them, and even then if the signal is too low it'll be wrong about it). So I can't even trust it to do something as simple as a literature review.
Insofar as most research is awful, it's true that the AI is producing research that looks and sounds like most of it out there today. But common-case research is not what propels society forward. If we try to automate research with the mediocrity machine, we'll just get mediocre research.
- hxbdg - 4719 seconds ago
[dead]