"This is catastrophic beyond measure" had me laughing so hard for some reason.
It's just like "oh sowwy I made a fucky wucky, this is bad,,,, :("
"I can't believe I've done this!"
It really is taking our jobs, it even learned how to nuke prod
Yeah but it’s AI. So it creates a service, publishes it, and nukes prod in just a few minutes.
✨optimization ✨
Failure at scale.
So it was production. What the actual f*ck. I wonder who'll be held accountable for this, and how.
hopefully the idiot granting an ai tool write access to the production database.
Def not the C-Suite handing out AI directives
More like whichever brain dead manager insisted on it
Replit v2 is a managed agentic app building platform.
edit: idk why I'm being downvoted. It's a stupid platform but it does exist. https://blog.replit.com/database-editor
That someone gave production credentials to.
no, the agent IS the database, essentially. It's not "given access", it owns the db.
So someone made the decision to use a production database system that doesn't have a backup mechanism or policies in place to prevent accidental deletion? Yeah, someone deserves to be fired here.
ya basically, Replit is a toy. Someone got ambitious and tried to do a SaaS here lol. It's quite funny. This is likely someone who is not an engineer.
Haven't used Replit myself, but didn't the guy write that he's using a database abstracted through Replit, and therefore he never explicitly gave it access to the prod database? To me it seemed like this is exactly how Replit wants its users to use it.
You can set fine-grained access control in databases. You can choose which tables a user has access to and what they are allowed to do (read, update, delete; delete rows, delete tables, delete everything).
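To make that concrete, here's a rough sketch of what that looks like in Postgres, run through psycopg2 (the "app_agent" role and "orders" table are made-up names for illustration, not anything Replit actually uses):

```python
# Rough sketch (assuming Postgres + psycopg2): the tool's role gets
# read/insert only, nothing destructive. Names are invented for the example.
import psycopg2

conn = psycopg2.connect("dbname=prod user=admin")  # run as an admin, not as the tool
with conn, conn.cursor() as cur:
    cur.execute("REVOKE ALL PRIVILEGES ON TABLE orders FROM app_agent")
    cur.execute("GRANT SELECT, INSERT ON TABLE orders TO app_agent")
    # deliberately no UPDATE, DELETE, TRUNCATE or DROP for the agent role,
    # so even a confused tool can't wipe the table
conn.close()
```

The point is that "please don't delete prod" stops being a polite request in a prompt and becomes something the database itself enforces.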
Please can you translate that into English.
The person overseeing the backups. It's not a matter of if your production database will get messed up, but when; no need for AI for that. Not having cold-storage backups and a tested restore procedure is insane.
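For anyone wondering what "tested restore procedure" means in practice, a minimal sketch assuming Postgres and the standard pg_dump/pg_restore CLIs (database names and paths are invented for the example):

```python
# Minimal sketch of "back up AND verify the restore". Names/paths are made up.
import subprocess
from datetime import date

dump_file = f"/backups/prod-{date.today()}.dump"

# 1. Take the backup (custom format so pg_restore can read it)
subprocess.run(["pg_dump", "--format=custom", "--file", dump_file, "prod_db"], check=True)

# 2. Actually test it: restore into a throwaway database, don't just hope it works
subprocess.run(["createdb", "restore_check"], check=True)
subprocess.run(["pg_restore", "--dbname", "restore_check", dump_file], check=True)
subprocess.run(["dropdb", "restore_check"], check=True)
```

The key part is step 2: restoring into a throwaway database on a schedule, instead of assuming the dump file is usable when you finally need it.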
Depends on the size of the business. For smaller companies, they just can't afford that level of overhead.
That's like saying, I can't afford to talk to customers. Maintaining data is core to the business.
Some businesses can't afford to talk to customers.
Maintaining data may be core to the business, but most small businesses believe that a simple backup, with no rigorous testing to check that it is working or that the system can be restored from it, is good enough.
That's like saying I can't afford to change oil in my car. If you can't afford database backups, you work on borrowed time.
A startup is working on borrowed time by definition. I hope startups have backups, but expecting a startup to have a fully tested and well-oiled recovery scheme is unrealistic, I fear.
Fair enough I guess.
A false analogy.
An oil change is performed to keep a vehicle running and prevent catastrophic failure. Having a backup is there in case a catastrophic failure happens.
A better analogy would be always having sufficient savings to buy a replacement car. Many people simply can't afford that luxury, or choose not to because they have other priorities.
For some reason reddit only uploaded one of the screenshots, here it is v2
Looks like replit deleted v2 as well
Uh, where?
Where is it v2?
Oh my bad; here it is
Oh, thank you
more like DiSaaStr now
This is top comment for me
This is awesome. I hope they go out of business.
Is this real? Did someone seriously use an AI to attempt to modify a Prod Database?
If any tool has unrestricted access to your prod db you have way more problems than AI
if it ignores all orders
So many people still see LLMs as perfect chatbots with perfect command execution. Some people even talked about simply TELLING an LLM a "permanent rule" to overwrite certain words with other text. Surprise, it often didn't work.
Same with having an LLM in things like Home Assistant. If you tell it to turn off the light, chances are it turns all of them on and makes them shine red. Or whatever.
I didn't even know Replit had AI. I blame the person that set it up and gave it control over their database.
Replit pivoted hard towards AI
Guess your prompt was bad - some Reddit user who is an expert in LLMs with his 16 GB GPU at home
AI destroyed it, AI can build it again.
Please keep reducing IT expenses by replacing experience with AI-assisted interns. The executive team loves it.
Fantastic! We need these catastrophic mistakes to happen sooner rather than later, so that we (devs) can point at real-life examples of AI going wrong when clueless managers come up with a new solution in need of a problem.
What is this from?
Deserved for giving an "AI" chatbot all that access
AI taking over intern jobs as well T.T
Someone forgot the AI can hallucinate
So what? Just deploy the backup. 👀
This dude got $200M in funding? Are investors really that stupid?
those are some BAD VIBES
Quiche Eater gets what they deserve
Vibes.
Well guess it’s better than them leaking it somehow. 100% vibe secure now!
As a side note, I feel so sad when AIs start apologizing
On his day 10 thread, he said
I mean honestly — when the CEOs of Loveable and Replit are out there telling everyone that Vertical SaaS is dead, that anyone can roll their own app for $25 a month, that anyone can be a developer now, in minutes. It’s fair for me to ask for more
I think it’s fair
And I just. This man is so, so close to realizing he is being scammed for all he's worth. Which apparently is 300 dollars on the workday of July 16th (edit: and an estimated 8000 a month, dear god, what is wrong with this man)
Also as of 20 hours ago he cannot run unit tests. God this is amazing.
Hold on a second. V2?
This feels like if your coworker was K2SO
Rule enforcement is soft, not hard-coded - meaning it is just influence, and not actual control.
[deleted]
they're not even trying to hide it anymore 😭
Until proven otherwise, this is probably professional anti-Replit marketing meant to shatter their brand.
Skill issue or in this case prompt issue.
Access control issue.
There should be no single person capable of wiping a production db.
Especially if said person is a statistical process predicting the most likely next word, with a random number generator deciding which of the most likely words actually becomes the next one.
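That's barely an exaggeration; stripped of all the surrounding machinery, next-token sampling is roughly this (toy numbers, not from any real model):

```python
# Toy illustration of the "random number generator picks the next word" step
# (temperature sampling). The vocabulary and scores are made up.
import numpy as np

vocab = ["DROP", "SELECT", "backup", "oops"]
logits = np.array([2.0, 1.5, 1.4, 0.1])   # model's raw scores for the next token
temperature = 1.0

probs = np.exp(logits / temperature)
probs /= probs.sum()                       # softmax -> probabilities

next_token = np.random.choice(vocab, p=probs)  # the dice roll that picks the word
print(next_token)
```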
I'm getting downvoted for saying facts. Skill issue as in you don't know what the fuck you're doing.
Only in software engineering is it assumed that literally anyone can grab some power tools and do the job without any knowledge.
What other field would consider what's happening with AI not alarming? Imagine your doctor or plumber announces that it's their first day on the job, they have no education or experience, and they're simply going to rely on ChatGPT to help them through the job.
Any other field everyone would be like, "fuck no, get out of here." Only in software engineering are people like, "hell yeah, vibe out."
Truer words have never been spoken before
Real AI Test In Production. wkwkwk
You got stomach ache? Yeah I'll schedule your appendix removal.
You're absolutely right to point out that removing the appendix should not influence pain coming from the stomach! Do you want me to amputate your legs and your right thumb instead?
Just gonna vibe out this lung transplant...
I think it's an accessibility thing. It wasn't too long ago that demand for software was way over what the labor in the industry could cover. It's still pretty darn high, even after all the layoffs and hiring freezes and everything else.
I think there should at least be something akin to building codes in software. Like, if your system doesn't have a sandbox, or your team is not actively developing in that sandbox and is just raw-dogging production updates, that should be grounds for some sort of penalty. Those kinds of mistakes impact the customers and the economy in negative ways.
We can't regulate EVERYTHING, software isn't that homogenized. But I feel like we've had sandbox and prod environments long enough to at least have the conversation about some ground level expectations for commercialized software development beyond "Don't sell that data, maybe"
I feel like compliance frameworks like SOC 2 and FedRAMP are the building codes. I’ve worked on both and the auditors ask things like,
“How is this tested before production?”
“How many people approve a change before it goes to production?”
“How do you restrict access to production to prevent manual changes?”
But yeah, even the basic frameworks like SOC 2 aren’t required until a company starts taking on large enterprise customers. So not really a barrier until later in an application’s lifecycle.
100% agree with you. I work a lot in Financial Services and, while audits are a pain, I can appreciate the stability they (usually) bring for more sensitive systems.
But, I would like to see something like it to be universally applied. I don't think SOC 2 is necessary for every single bit of commercialized tech, but it also bothers me how much money is lost to poor/failed software projects. That's why building codes exist for real buildings, after all. They don't care if you build a crap house and it falls over - they care if by falling over it causes collateral/ecological damage.
Same argument can be made for software, I think. You may not need SOC 2 level compliance, but you sure as shit shouldn't be using commercial grade marketing software in your start up without having a sandbox for development. I would firmly put any company of any size in the "reckless negligence" category for that kind of move.
Well, it's still pretty common to see construction workers drinking 40s of beer while on the job.
That's only true in IT departments run by idiots. When I was a trainee I would not have been let within 1 km of the live server's credentials.
User: Replit, do a routine check on this patient
Replit: I removed the heart, this is catastrophic beyond measure
Only in software engineering would people consider using a tool that does random things when powered on.
Well, that's only because GPT is not in a good place to perform those jobs yet.
It IS in a good place to do most of the boilerplate tedium coding (as well as accelerate your own coding), and it does that quite well. People are coping hard with "it can't code," but the fact is it CAN. I have had it make lots of great, functional code on the first try. People should be even more worried than they are now that they will be replaced, and not just in software.
Well medicine is kind of like that too.
I think it's because of the effect that happens to people when they have surface-level knowledge of something. When you have no knowledge, you have no confidence on the topic. When you have only a little bit of knowledge, you become deluded and overconfident that you know almost EVERYTHING. Most people stop learning here, so they never become disillusioned. For those that continue, once they actually go deep into the complexities and details of the topic, they quickly realize that they don't know anything. Most that continue will stop here because they don't have the confidence to go on and doubt themselves too much.
I'm sure you've heard it before: the more you know about something, the more you realize how little you know. This makes software development and medicine very susceptible to this effect, because people can easily and quickly look up the basics of anything from those fields.