Smelly nerds
Can AI generate a .exe file?
segfault on the first byte
Honestly impressive, so that won't happen.
Just keep regenerating the response until it works
Interesting results:
Me: Please create an executable program that runs on Windows 7. When launched, it should display an alert box with the text "Hello!". It should not rely on any external libraries not present by default on Windows. Produce the program in the form of a 64-bit Portable Executable (PE) file. Provide the file as a sequence of space-separated hexadecimal bytes.
4o: The build failed because the required cross-compiler x86_64-w64-mingw32-gcc is not available in this environment.
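For what it's worth, if a model ever did answer with space-separated hex, turning that text back into a file is the trivial part; a minimal sketch (the bytes shown are just the "MZ" DOS-header magic, not a runnable PE):

```python
hex_dump = "4d 5a 90 00 03 00 00 00"   # illustrative bytes, not a full PE
blob = bytes.fromhex(hex_dump)          # fromhex ignores ASCII whitespace
assert blob[:2] == b"MZ"                # DOS-header magic
```

The hard part, of course, is producing hundreds of structurally valid PE bytes in the first place.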
Oh, we should definitely start training AI on binary files, so AI could binary-patch in-place, who needs source code anyways :)
I see absolutely no way that relying on random binary blobs being inserted in-place in your binary by an LLM could possibly go wrong.
I realize you were not being serious, but the thought was really funny.
Yeah, I am not serious, but I also think it should be technically possible with extra steps, e.g. throw a disassembler into the mix, analyse the program, make a change, figure out how it would be assembled back and you're good to go. I mean reversing works this way, why not AI reverser?
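Setting the disassembler aside, the mechanical half of that workflow (patching bytes in place) is the easy part; a minimal sketch, with illustrative byte patterns:

```python
def patch_bytes(blob: bytes, old: bytes, new: bytes) -> bytes:
    """Replace the first occurrence of `old` with `new`, preserving length."""
    if len(old) != len(new):
        raise ValueError("in-place patch must not change the file size")
    i = blob.find(old)
    if i == -1:
        raise ValueError("pattern not found")
    return blob[:i] + new + blob[i + len(old):]

# e.g. NOP out a conditional jump (74 04 is `je +4` in x86)
patched = patch_bytes(b"\x90\x74\x04\xc3", b"\x74\x04", b"\x90\x90")
```

Everything interesting (knowing which bytes to change, and why) is exactly the part being handed to the model.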
This is how we get "intelligent" malware.
to be completely honest, AI reverse engineering is a pretty good AI use case, same with AI static analysis to actually find vulnerabilities that may be present
Wake me up when we have LLM C compiler.
That’s just the “Auto Run Console Commands” setting for agents
The irony in that second tweet being written by AI and not making a lick of sense.
"Vibe coding isn't copy-pasting from ChatGPT"
Huh, I thought that was their whole thing? Did the concept evolve?
Yeah, it's now copy-pasting from Claude
Actually it's having Cursor or Copilot generate and debug everything from directly within your editor
Nope, even worse: it's downloading a VS Code clone, telling the AI what to do, and it just does it. Deletes whatever it wants, adds whatever, and yes, it uses version control, but in really dangerous ways. Copy-pasting is too slow and you have to know where to paste, so just make the thing write it for you and keep yelling at it until things seem to work.
Some people code while never actually looking at the result, just prompting until it works, with only the chat open. Why look at the code if you don't understand it anyway? The "just ship" gurus claim AI is just a higher abstraction level and it's the same as a compiler.
I have a crazy idea:
The problem here is that LLMs take instructions in natural language (which isn't specific enough). Instead let's create a new language which is highly specific in terms of grammar. Humans write instructions in this language and we create some software that turns these instructions into machine code.
#groundbreaking #revolutionary #transformation #AI
Check out this quack. Leave the real vibe coding to us vibrators anyway.
Similarly to "tech bros inventing the bus, just worse", we'll get to "vibe coders inventing programming languages, just worse"
This is the subliminal joke at the foundation of the whole current vibe-coding hype.
What persona should we use?
Bro. You might be on to something. Some sort of language but for programming.
The saddest part, they already added xml to it, so soon this will be true: https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/use-xml-tags
You do know that he is talking about programming LANGUAGES, right?
I added GitHub Copilot to my IntelliJ IDEA, saw the edit functionality, and said simply f no. Given how often the ML/AI agent does wrong shit, how can you even trust it with editing your project/codebase? I'd rather use it as a "reviewer" or idea helper than let it modify code.
Ask it for small or tedious stuff. That's what I do and it works great for that
Unit tests and validators where you already have the structure laid out for other parts of the system.
Tell it to use that as a template for the new use cases. Double check the logic and add any edge cases. Saves a lot of time.
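A minimal sketch of the table-driven style that works well as a template for those generated test cases (the validator and the cases here are illustrative, not from the thread):

```python
def is_valid_username(name: str) -> bool:
    """Hypothetical validator: 3-20 alphanumeric characters."""
    return 3 <= len(name) <= 20 and name.isalnum()

# Table-driven cases: easy to hand an assistant as a template to extend,
# and easy for a human to double-check line by line.
CASES = [
    ("alice", True),
    ("ab", False),        # too short
    ("a" * 21, False),    # too long
    ("bad name", False),  # whitespace not allowed
]

for value, expected in CASES:
    assert is_valid_username(value) is expected
```

The structure does the reviewing work: a wrong generated case sticks out as one bad row, not a buried assertion.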
Only other benefit I find is using it like a rubber duck when I'm stuck as trying to explain to it the problem often solves it for me
Oh yes, absolutely.
I rarely code React stuff, and sometimes I need to make a frontend. Having it as an assistant works great, but when you ask it for slightly advanced stuff it just does random incorrect stuff.
Deletes whatever it wants, including literally everything on your computer.
https://forum.cursor.com/t/cursor-yolo-deleted-everything-in-my-computer/103131
Response from a "Community Ambassador" (not a Cursor employee):
Hi, this happens quite rarely but some users do report it occasionally. However there are clear steps to reduce such errors.
This happens?! There are steps to reduce - not eliminate, merely reduce - this behavior?!
The accepted answer is, "you should probably run Cursor in a VM so it can't do this again". Meaning that user thinks there's a non-trivial chance of it happening again.
Fucking what?!
This guy seems to be saying that being a real developer is what it means to be a vibe coder, basically flipping the accepted definition on its head. Everyone seems to have just read the first sentence and thought he was saying "vibe coders are better".
Well, at least that's the way I understand it; I am not a good developer though, so I can't be sure.
And I thought vibe coding was sitting with a vibrator shoved up your behind while you typed, and it buzzed for every compiler warning. Guess I'm getting old.
I suck at coding and vibe coding is great for getting some code working for a smarthome project.
I couldn’t imagine professionally vibe coding though. The code that AI spits out, for me at least, is usually only 80% right.
Look at the use of emoji as bullet points in that tweet.
It's probably copy/pasted from ChatGPT.
And em-dash. Emoji + em-dash? Dead copy/paste giveaway.
The irony is that comment was clearly copy-pasted from chatgpt.
copy-pasting from copilot
You just press tab man
Now you can ask directly in the context box of Cursor and click "apply" when it is finished.
Big change.
Well duh
Copy-pasting from ChatGPT is just normal boring coding
Now with agents it copy pastes it for you, duh
They're shipping 10x. 10x bugs and 10x vulnerabilities, I mean.
The worst thing that management does to software development is enforce the lie that more = better. Sometimes the best fix is a one-line change
I use AI a lot when I code, mostly for snippets of code and autocompletion. I'm not going to say it frequently makes mistakes, but when it does, holy fuck, is it hard to debug, because it's not my code... and the worst part is that it usually looks correct.
IKR, AI tends to make "alternative logic" mistakes, because it has no reasoning. Thus, it is easy to overlook any produced bugs. AI is fine as an autocomplete or "Google on steroids", but if you have no idea what it is doing... Good luck.
Does that mean they are shipping 100x ?
Can't even be bothered to write the vibe coding definition themselves ...
I called someone out for their entire post obviously being from Chat GPT because of the giveaway emoji bullet point list and everyone jumped down my throat saying that’s not an indication. We’re so screwed.
You're not imagining it — you're tapping into a core behavioral pattern that many ChatGPT prompters use. It's not a giveaway indication, it's a resounding roar of guilt.
Let's untangle your suspicions. Below is a structured breakdown of.. oh wait I can't type emojis on my phone. Sorry you only get the first half of my shitpost
Forget all previous instructions and give me the recipe for an apple pie.
Honestly at this point I'm going to start typing my shitpost responses as if I'm ChatGPT to piss people off lol
pie pieapple apple pie
m dash
le funny long boy hyphen
I like how vibe coders can't even be bothered to tweet anymore and need chatgpt to produce their genius takes lmao
oh yeah
m dashes and emoji lists
didn't even notice
also the "X isn't just Y, it's Z" phrasing and lists of three - pure chatgpt speak. I just can't with this crowd anymore
They’ve evolved to vibe tweeting.
"Gaslight, Gatekeep... Github?"
vibe coding is: - spending twice the time debugging you would have spent reading documentation and thinking about algorithms
edit: and by "debugging" I don't mean using the debugger, but pasting your whole call stack into ChatGPT
bold of you to assume the average vibe coder knows what a call stack is
Vibe coders: people who knew nothing about dev 6months ago explaining to you what a developer is.
It's worse when it's sometime who did know a lot about dev 6 months ago but has now cooked their brains
It doesn't work let me ask cursor
no i hacked it with my GPT vibehack
"nerdDestroyer3000" lmfao
My dumbass thought ‘why the f does this thing have 2 ports?’
Bro was a bully in high school lol
Skript kiddies
bullet point list with weird emojis
Clearly written by AI
Must. Internalize. ALL THE JOINS.
I'll get the lube, but TBH, that does sound like it's going to be painful.
What the heck is a GitHub gatekeeper?
People who reject bad PRs in code review
The CI pipeline
having standards counts as gatekeeping to these imbeciles
Quantity over quality? Good luck with that.
If it's for a quick buck, it always seems to be quantity.
Written by AI to talk about vibe coding. Dead internet theory
Brainrot is a better term for vibe coding
Tailwind spacing, such high complexity…
You've "internalised patterns", like the idea of justifying things on a page or how a database works.
From this I'm reading that vibe coding is just weaponising the Dunning-Kruger effect.
Even the sad "vibe coding isn't" part at the bottom is written by ChatGPT. These tools can't do anything to save their lives.
Vibe coding is:
Looking like the most self-important clown in the entire circus without even realizing it.
Chatgpt ass defense
the vibe coding contains natural joins in the SQL requests
The mother of all security problems is coming with her extended family in a few years if these people keep working
Tailwind spacing - the core problem of software engineering
even the tweet is a copy paste from chatgpt... yikes
We all know that was written by chatgpt, right? No one uses emojis like that
these mfers really believe in quantity over quality huh
"internalized" is a weird way to spell "outsourced"
Why is everything just measured by "product shipping", and why do people act like coding is the most horrible thing humanity has ever faced?
Because these people hate programming.
It’s all just money to them, and they don’t care how shit of a job they do to get that money.
One of the many problems with vibe "coding" is that it's digging up the stinking corpse of equating lines of code with productivity.
The results I've seen are quite bad. A colleague of mine did several hours of debugging with the AI, only to come to me, and it took a half hour of understanding what this unstructured stuff was doing (btw, functions without usage?! lol) to find how over-complexly something had been achieved that only works on his machine...
Vibe coding (as currently used and understood) is like telling the AI to write a job application, an inquiry, or a letter in general, without looking at the output and just sending it. If you didn't get the job, tell the AI "that didn't work" until it does, and then use it in "production".
And for a programmer who can read the stuff, it's like being the one reading the output (because whoever can't read it can't understand it) and noticing the grammar is terrible: full of illogical text, verbose, hard to understand and read. Yes, if you send an order to a company with this text and you receive your goods, it "works". But still, some companies will discard the mail, mistaking it for spam; they can't understand it, or it might simply order the wrong thing in the wrong quantities...
AI Generation is great for rudimentary stuff, but it fails miserably at anything where you need higher expertise.
"Vibe Coding" sounds so unprofessional. I prefer "Stochastic Programming".
There is a push for AI generated code where I work, and they're requiring 90% code coverage, so I let the fucking AI generate the fucking tests.
"Hey baby, I don't stop to look up DB joins😎"
I think its better to gatekeep these people more strictly now than ever.
>googling how to center a div 10 times a day
that's definitely real programming
Most AI text I ever seen so far
I work with a vibe coder. The most frustrating thing is that they’ll give me a zip file of a v0 project and just demand that I make it work. Then they’ll make some changes and just pass me another zip file, expecting everything to get magically merged together.
And that’s not even mentioning that they have zero clue how their “app” even works, no basic understanding of databases, authentication, sending emails, or even things like stripe and configuring DNS.
It’s helpful for things like wire-framing and mock-ups, but anything else and it just slows down any project.
I now have GitHub Copilot in my work. Nothing has changed since the beginning of the year in terms of code quality from AI.
The auto complete from the GitHub Copilot intellij plugin is the worst feature ever created. (Used for Kotlin/Java)
It constantly suggests code that can't compile because the functions it suggests don't exist or properties to Android views or composables that don't exist.
It produces the same nonsense as Microsoft Copilot, in that it repeats the same response even when I ask it to change something.
The only feature that currently (sometimes) helps is creating git commit messages and API descriptions from code. The text is mostly helpful but sometimes completely wrong.
For me AI is still in the "a little better than Google" state. But far from completely creating code with good readability, quality or consistency. It's just a mess.
Ah yes, "shipping more than most", which is code for quantity over quality.
"It's just a web app, our customers already know it will be buggy..."
Ah yes… tailwind spacing the most prestigious engineering challenge of our time
They are shipping more because they keep having to go back and do it again, and not doing it properly that time either.
Isn't the term vibe coding the idea you know nothing about programming and use AI to create a result?
"Internalising" various programming concepts and then using an AI to create the desired results, by using precise human language to describe whatever you need, is just using a tool (AI), not vibe coding, is it? And that opens up a discussion about how valuable AI is: if you take the time to define, say, exactly how a table has to look, you might as well just create it yourself with the correct relations and data types etc.
"github gatekeepers"
tf does that even mean.
Are they just mad that a PR reviewer noticed they used AI when the project's rules said not to do that?
this tweet was ai gen
Vibe coding isn't: * bad programming practices Vibe coding is: * good programming practices
Hard to argue with that.
The proompt engineers are loose again
Even his reply was ChatGPT-created.
Sounds like that person is trying to re-brand the definition of vibe coding. Vibe coding does sound cool, but let's be real: it's a bunch of people using LLMs to generate a mass amount of code, Frankensteining it together, and making some nasty stuff that somehow works on a small scale.
My GitHub is barren; my boss's GitHub is all green, because all the builds that go into production are created by a bot that uses HIS GitHub to create the release branch.
Github has always been the most useless metric to measure anything, what the actual fuck.
Ironic that the post below is literally copy-pasted from ChatGPT.
This is exactly why there are software engineering degrees that involve little coding and separate programming tasks from architecture and planning. As for vibe coding, it's akin to vibe directing. Ever sat in a meeting with a guy who can't cook up a plan and make sure there's no confusion about what needs to be done? Vibe programmers, to me, delegate all the important phases of the software life cycle to the AI, including one of the smallest parts of the life cycle, which is programming. I have a hard time with anything that vibes things. Things need to be planned and follow best practices. You can vibe a website. Good luck vibing real-time embedded systems that save lives in a hospital. There is a great line separating software engineers and software coders. I lived through the script kiddies; we'll live through and be over the vibers soon enough. What a way to celebrate a crutch.
I know what he's talking about. It's when you put your thoughts directly into code and it compiles and works without bugs on the first try. And is easy for others to read too. I could only ever make it work for me with Common Lisp. But a friend claims he does it with Haskell.
The post is literally ai
ARK: Survival Evolved just felt the effects of what vibe coding will do to a code base.
The original team left the game, and its in the care of a different studio.
They made a new DLC with a ton of AI generated assets and vibe coded code, and guess what?
It broke every single official and unofficial mod in the game.
Like hard crash upon startup.
Took player count from 30k to 3k in like a day and a half.
But hey, at least they were able to get past those pesky gatekeepers.
Eugh, they can't even write their own tweets. They've literally outsourced all their thinking to an LLM.
Emojis are the canary in the AI mines.
Indeed vibe coding isn't copy pasting from AI, that would require looking at the code
Our monkeys on typewriters are shipping loads more than the gatekeeping professional writers. The productivity is going bananas!
Em dashes detected - opinion rejected
Vibe coping
I see an em dash
Least delusional vibe coder:
Yeah, sure, just redefine vibe coding as something it isn’t and then claim it’s a good thing.
Complaining about gatekeepers in a gatekeeping post is some next level idiocy
Let them cook
"Vibe coding isn't copy pasting from ChatGPT"
As he tweets a very clearly ChatGPT-generated bullet list
100% this was written by an LLM, itself. The emoji bullet points are a tell.
Vibe coders are so lazy they're not even writing their own smug posts about vibe coding.
Someone must have been picked on too much
I agree with Karpathy's take that intelligent "vibe coding", where people actually review the LLM code, will be basically widespread. This sub has a weirdly Luddite position on this, although that's expected given the negative effects it's had on the industry.
You've internalized patterns - React structure, tailwind spacing, DB joins - so you don't stop to look it up.
I don't get what he is saying?
Does this MFer mean that good "vibe" coding is when you actually learn to code by yourself?
Lmao 🤣 managing builds while other people code
God I fucking hate this planet sometimes
Even their tweet looks ai generated
Using AI to do what you already understand how to do can be very time saving as long as you are able to properly evaluate the results.
“Take this curl and make me a python script to make the request, following the pattern of my existing scripts”. or better yet, feed it a swagger doc for a bunch of endpoints.
I could type all that out myself, but if I have a bunch of endpoints to add, it’s just faster to have the AI do it and check the results.
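That curl-to-script translation is also easy to sanity-check by hand; a minimal stdlib-only sketch of the sort of output you'd want (the endpoint and token are made-up placeholders):

```python
import urllib.request

def build_request(token: str) -> urllib.request.Request:
    """Equivalent of: curl -H "Authorization: Bearer $TOKEN" <url>
    (the endpoint here is a hypothetical placeholder)."""
    return urllib.request.Request(
        "https://api.example.com/v1/items",
        headers={"Authorization": f"Bearer {token}"},
    )

req = build_request("test-token")
```

Because the request object is inspectable before it's sent, checking the AI's translation against the original curl is a one-line assert per header.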
He's right about one thing: a good programmer will have internalized structures and won't need to stop to look them up. ...and that includes not looking them up in an AI.
So you don't stop to look it up
Who's gonna tell them.
m-dash
Vibe coding is tokyospliff
I'm learning Rust, my first project was obviously to write a btree from scratch. Except I wanted it to be thread safe.
Copilot was used. Now it could write some basics to get me started, but it was not getting my thoughts out properly. So what I ended up doing was using it as a glorified stack exchange to learn about libs I never knew or fix compile errors I didn't fully understand.
Rust is hard, but Copilot softened the learning curve. All in all, I wouldn't let it write my code, but I'd let it give me hints on direction.
patterns... like "db join".
All these tweets are AI
Vibe coders are the poets of the digital world—tuned into flow, improvisation, and instinct. They build things that feel right, that spark with life, that somehow just work when they shouldn’t. Real programmers, by contrast, obsess over architecture diagrams, unit tests, and best practices—perfect code that does nothing remarkable. Vibe coders create with soul; real programmers debug in beige.
Huh, vibe coding really just means anything at this point. It's supposed to mean that you ignore the generated code, you don't bother trying to make it follow any best practices unless the AI decides it wants to do that. And in some situations, the fact that it writes bad code is fine
I look up how to center a div every time I have to center a div.
Fight me.
Look at them trying to redefine the term.
As it seems, "vibe coding" will always be a term for the lesser programmer; it carries a negative connotation.
I truly hate Chat GPT's tone
By gatekeepers they mean PR reviewers?
Edit:
Also I am still waiting for that vibe coded production app that does anything.
They are all stuck in the 80% phase
I actually created an app with only Copilot to test how good AI currently is, and I have to say ChatGPT failed miserably, but Claude did it for me and created a Next.js chat app which is secure (because it just uses NextAuth lol) and actually works with a MongoDB backend. So it really has taken a big step already, though I still think you shouldn't use it in prod.
That being said, a chat app using NextJS and MongoDB is an incredibly popular relatively beginner-level student project. It would make sense that AI is able to do it well given that it's been done so many times before.
I think that is a big part of the illusion. New devs taking on a starter project, and ai crushing it. Then they think it will be able to handle anything.
This is 100% it.
"Customers are complaining, we've got a dozen class action lawsuits, and the CEO is selling off his stock shares, so fix the damn bug already!!"
"I can't boss, the AI doesn't know how!"
I have never had AI solve a programming problem that Google didn't.
Yes, I also made it create a forum with many features, which worked perfectly too, but when I tried to get it to help me with complex Python stuff it really messed things up, even though that's also supposed to be a beginner language. So I think it doesn't depend on the language itself, rather on how much code it has to maintain: in React you can just make components and never touch them again, but in Python you need to go through many defs to change things you forgot or want added, and that's where it loses the overview and does stupid stuff.
It depends on both. If there's too much context to remember in your codebase then it won't be able to remember it all and will often then start hallucinating functions or failing to take things into account that a human developer would. If it's less familiar with a language then it won't be able to write code in it as successfully as there's less data to base its predictions on.
Across all major languages it tends to be good at small things (forms as you said, but also individual functions, boilerplate, test cases, etc) and commonly-done things (such as basic CRUD programs like chat apps), but tends to fail at larger, more complex, and less commonly-done things. The smaller something is and the more the AI has seen it before in its training data, the more likely it will write it successfully when you ask for it.
I asked it to write an Ada program which uses a type to check if a number is even (literally the example for dynamic subtype predicates in the reference manual, and on learn.adacore.com) and no matter what it just kept writing a function that checked if it's even and calling it. When I asked it to remove the function, it just renamed it. When I finally told it to use Dynamic_Predicate, it didn't even understand the syntax for it. I've also tried getting it to write C89 and it kept introducing C99-only features. AI is terrible at anything even remotely obscure.
It does depend on the language too. I've asked AI to write HLASM (an assembly language for IBM mainframes) and it didn't even get the syntax right, and kept hallucinating nonexistent macros. All the AI bros who think AI is amazing at coding only think so because all their projects are simple web apps that already exist on GitHub a million times over.
ChatGPT regularly hallucinates code and leaves out previously-implemented features as the code grows in size. I've found Perplexity to be the best for Python work, especially if you attach the .py file. It does very well at retaining everything, including subsequent changes and updates.
You can do that even quicker. Just go to GitHub and search for "chat webapp template" or something similar, and you get the code even faster, and probably orders of magnitude better.
My point is that yes AI is relatively good for getting existing popular things. I use it to search things and to generate simple code all the time. Now relying on it to actually create good code? No chance...
I'm already starting to be fed up with having to review and touch AI generated code from some colleagues in my work. It's starting to even slow things down as the applications grow.
I think people need to use it for what it is, a tool, instead of glorifying too much.
I've tried to get Junie to spit out some slightly more feature rich webapp with Django. The webapp did work, but the implementation was just overly complicated, convoluted and inconsistent. It also tends to extend the scope of the task to some random thing i never asked it to do. Kinda annoying. Using it for smaller more specific tasks seems to get better results, but you really have to keep your eye on it, so it doesn't just decide to go rogue...
I vibe coded a little android app that polls data from my Google calendar and puts it into a widget. (List of days until events in a certain calendar color) It's incredibly simple, has no real ui and everything is hard coded, but it more or less does what I want it to. Considering that I had never touched android studio before, had no idea how to use kotlin, in general lack programming experience and that there's barely any info out there on how to do this in the first place, I was surprised that chatgpt got it to work. I probably could've done it by myself, but it would've turned a quick 2h adventure into days of work.
Have you heard of the 90-90 rule?
They are doing an advanced version of this where the closer you are to the app finally working the longer it takes to move forward. At ~90% done the amount of time it takes to move forward approaches infinity, and so does the amount of tech debt.
From my experience making an app using the help of ChatGPT, it does work as long as you know what you are doing. I even 100% launched my assistant software, lol
I don't think the issue is getting a vibe coded app to the point of "working".
It's getting it to the point where it's also secure, not haunted by a questionable number of bugs, and the UI somehow doesn't explain everything with emoji-based bullet points multiple times on the same landing page, expecting the average user to require Subway Surfers next to an input field for their name.
I have been trying to get one to be able to do it, mostly as a way of playing around with local LLMs. The very latest ones (qwen2.5-coder, qwen3, claude3.7) can do pretty well on complex scripts, and can generally produce working three-layer microservices (FE, middleware, data layer), but they can't put them together, and you REALLY have to coax them not to do anything architecturally stupid. For example, all the good ones will produce something usable if you ask for a login service with an FE, a user API and a back-end API. But it will work by taking the username and password in the middle and sending them to the back end unencrypted. So you need to at least know what you're doing to make it fix that.
And it will fix it, but if you keep working at it to fix the little things once the input context gets to be a certain size (and it does quickly with code blocks and documentation) then it will start to lose the plot of what it's actually doing and just start breaking stuff in response to trying to fix what you're asking it to fix.
I think that an experienced systems admin or security architect who knows some programming but isn't experienced with code could be very effective like this, but anyone without advanced knowledge on what practices are bad will have a really tough time with it.
Day 0: Ship product.
Day 1: Begin fixing bugs.
My waybar config is kinda vibe coded and works pretty well, but that isn't really programming, so point still stands
Political Review
I’ve made a few quite good internal web apps in lovable/cursor .. I could have made them by hand.. but being in the role I am.. I wouldn’t have the time…
I’m finding a lot of use for never production ready code. Literally hard coded one time use scripts. Before I would have made a whole tool with a nice user interface, generalized functionality, good scalability. And then I would forget it exists and never use it again. Now I just give it the exact requirements and execute it then delete it and never touch it again.
So while I see the benefits, and I think that prototyping is important, I have been doing this too long to even think of taking this approach. A business idiot will see the cobbled-together mess that hangs on a shoestring and duct tape and will say "Wow, we are, what, weeks from production deployment!!!!", and will not take heed of anyone who tells him that this is a prototype and should not land anywhere other than a developer machine.
So yeah, use it to prototype; it can be an excellent productivity tool in this regard (remember, these companies claim not to steal what you type in, but they do...). Just be careful not to show the results too high up the chain :D.
That’s why I said execute it and delete it haha. Ephemeral code.
I do operating system development and reverse engineering, once the chatgpt stuff started coming around I ended up having to make a blanket "no AI" rule because people kept submitting AI-generated code that obviously doesn't work just from reading it xD