If you care to read more of what's written on the left, he goes on to tell you that above 60 fps the game runs faster. As in, physics is tied to fps in the game, in the year 2025.
GameMaker still ties game logic, physics, and rendering to the same loop, which is unfortunately a relic of its own past. You can use delta time and the new time sources to make things like movement and scheduling run consistently at different framerates, but you can't really decouple those three things from each other.
One story I like to tell is how Hyper Light Drifter (made with GameMaker) was initially hardcoded to run at 30FPS. When they had to update the game to run at 60FPS, they basically had to manually readjust everything (movement, timings, etc.) to get it to work.
it's actually a very common implementation in game engines. decoupling physics from fps is a bit more complicated… the naive thing is to use the system clock, but you quickly find it has very poor precision for action games. so you need a high-resolution timer. but then you have to deal with scheduling imprecision and put safety clamps around your physics, or things blow up the moment you get a little lag from discord or antivirus, etc. (basically your jump suddenly gets handed a 2-second delta and you get a bigger jump than the designers planned for. so you clamp everything, but then you aren't really running realtime physics.)
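A minimal GML sketch of that clamping idea, with made-up numbers and a hypothetical fall_speed variable (nothing here is from any particular game):

```
// Step event. delta_time is GameMaker's built-in microseconds-since-last-frame.
// Clamp the per-frame delta so a 2-second stall (Discord, antivirus, alt-tab)
// doesn't hand the physics one huge step and launch the player into orbit.
var dt = min(delta_time / 1000000, 1 / 20); // never simulate more than 50 ms at once
y += fall_speed * dt;                       // fall_speed: hypothetical instance variable
```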
there can be legit reasons to lock it to fps.
You get almost all the benefits by locking your physics to a fixed rate. That rate doesn't have to have any connection to your frame rate. For example you can run physics at a fixed 75Hz while your fps floats anywhere between 20 and 500.
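In GML terms that can look roughly like this; time_acc would be initialised to 0 in the Create event and sim_step() is a made-up script that holds all the movement/collision logic (a sketch, not anything from an actual project):

```
// Step event: accumulate real frame time, then run zero or more fixed physics steps.
// Rendering happens at whatever fps the player gets; the simulation always advances
// in fixed 1/75 s increments, so results don't depend on framerate.
var fixed_dt = 1 / 75;
time_acc += min(delta_time / 1000000, 0.25); // clamp spikes so we don't spiral
while (time_acc >= fixed_dt) {
    sim_step(fixed_dt); // hypothetical script holding all movement/collision logic
    time_acc -= fixed_dt;
}
```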
If physics is paused between frames, wouldn't the GPU just render the same frame multiple times?
Physics doesn't necessarily get paused; rather, it accounts for variable frame time to produce the expected results.
Imagine the first 3 frames take 0.12s, 0.13s, and 0.12s.
If your game logic is Move(1) every frame, you've now moved 3 units in 0.37s.
If the same 3 frames took 0.01s, 0.01s, 0.01s, it's still 3 units but now in 0.03s (much faster motion).
If your game logic said Move(1 * deltaTime), now no matter how long each frame takes, you're going to move 1 unit per second.
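In GameMaker terms the Move(1 * deltaTime) version of that example looks roughly like this (just a sketch):

```
// Step event: framerate-independent movement.
// delta_time is built in and measured in microseconds, so convert to seconds first.
var dt = delta_time / 1000000;
x += 100 * dt; // 100 pixels per second, whether the frame took 1 ms or 120 ms
```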
No, you have to process animations and effects too.
The rendering can assume things will keep moving the same way for the next few milliseconds. The slight flaws won't be any worse than the flaws you get from a fixed framerate.
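A hedged sketch of that, building on the fixed-step loop above; vx and vy are hypothetical velocity variables maintained by the physics step:

```
// Draw event: extrapolate a little past the last physics step, assuming the object
// keeps moving at its last known velocity. time_acc is the leftover time from the
// fixed-step loop sketched earlier; vx/vy are hypothetical, updated by sim_step().
var draw_x = x + vx * time_acc;
var draw_y = y + vy * time_acc;
draw_sprite(sprite_index, image_index, draw_x, draw_y);
```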
This is really just async programming in general. Any time you introduce parallelism or concurrency, you get issues with accurately splitting up time quanta for whatever process is running at really high throughput. If there's lag (a process taking a long time while other processes wait to use the CPU/GPU), you have to essentially backtrack processes or force them to wait, and if all of this is queued with similar lag it can quickly become a decoherent, smeary mess running into race conditions, or slow to a halt.
One of the best ways to handle this is to force everything to only process for a certain amount of time before it’s forced to wait for the rest to be processed, which is typically how concurrency works, but this, again, only really works until you end up with enough processes to cause all of them to slow down until enough threads are killed. Either that or you can split the work across cores and just have them all run independently of each other but this will obviously also cause problems if they depend on each other.
Then there's the problem of who keeps track of the time. As you mentioned, you could use fps and just run everything in the render pipeline every 1/60th of a second, but if your logic requires that to be fixed you end up with issues when it changes (e.g. if there's a 1/60th-of-a-second buffer for an input/response but the system runs at 30fps, you might drop the input because the game expects it to last twice as long as it actually can). You can tie it to system time, but machines have issues managing time too, causing clocks to drift after a while, leading to the same problems.
This is such a huge fundamental problem that even reality itself seems not to have figured it out, splitting clocks relative to scale and velocity (i.e. a fixed frame rate at quantum scales and a dynamic frame rate at relativistic scales), and preventing both from being rendered faster than the speed of light.
Oh so this is my usual fear of "I've been floating for 5 seconds on this rock and the game thinks I'm falling continuously, I'm gonna die"... except rather than me glitching myself into a falling state, it's a 3-second lagspike as I'm descending from a jump.
I'm not a games programmer so maybe I'm missing some nuance - but you don't actually *care* about the precision of the time itself, right? You're not looking for subsecond precision or trying to implement local NTP. You only care about ticks?
Can't you just tie it to CPU cycles with something like QueryPerformanceCounter? Which can be precise down to microseconds?
Right, you want the simulation to target say 60 ticks/sec, but if the CPU maxes out and starts lagging, you can slow down the simulation. Nothing inside the simulation should care about how much real/wall time has passed. Stopping the ticks running is how you gracefully pause the game, while keeping things outside the simulation like the UI and input still working.
At any point inside the simulation you know how much time has passed by counting ticks, which are the atomic unit of time.
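A tiny sketch of that "ticks are the only clock" idea in GML; sim_tick, paused, and simulate_one_tick() are all made-up names:

```
// Step event of a hypothetical controller object.
// Nothing inside the simulation reads wall-clock time; it only counts ticks.
if (!paused) {
    sim_tick += 1;        // the atomic unit of simulation time
    simulate_one_tick();  // hypothetical script: all game logic for exactly one tick
}
var sim_seconds = sim_tick / 60; // elapsed time as the simulation sees it (60 ticks/sec target)
// UI and input handling live outside this block, so they keep working while paused.
```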
make sure to use Delta time right too
It's a shame there's no possible way to make games on anything other than game maker
Technically the game is from 2015; it's over 10 years old while not even being released yet
Alright bud, you're banned from chat, hope you're happy
"Hope it was worth it bud" stretches
"yeah the game is not finished yet, i work hard on it every day *wink, what am i supposed to do for you?"
"Yeah, Animus is 99% percent complete"
- PeePeeSoftware for half a year
I actually saw a compilation of him saying almost that very same sentence for months lmao
10 years without releasing? Riot might as well buy the IP and restart with a new engine.
We're here to chew gum and code games, and we're all out of code.
That makes no difference; even in the 2000s framerate-independent physics was a thing, maybe earlier but I'm not sure.
That's not necessarily an issue. It is a very common, easy way to solve these things, and frankly, for an indie game, it is more than fine. Not the first one, not the last one. Some shit is tied to FPS in games like Destiny for god's sake.
That being said, if you did decide to do it like that, just fucking cap the FPS.
I mean, based on how arrogant he is, I'm not giving him the benefit of the doubt. This man boasted about how good of a software person he is (I say person because he claims to be not just a dev).
Yeah it’s a common way to do it, yet still a bad way to do it, and definitely not the way to do it if you claim you are good at what you do.
I'm not sure I'd hold Destiny up as a shining example of what to do. Didn't it take the devs 12+ hours just to open a level for editing in the first one? At one point they basically said they fucked up by making their Tiger engine by hacking in pieces from their old Halo engine but they did it because they didn't know how to go about recreating the feel. Bungie isn't what it once was.
Counterpoint: it works just fine.
Their game functioning doesn't mean you shouldn't try to avoid the bad practices that caused them issues along the way and there are a lot of things they did wrong that caused them pain later. I get that you like the game but don't let that stop you from seeing the flaws.
Just run it in DOSBox
bro doesn't know what delta time is in 2025 😔
I would like to remind you that he's worked at Blizzard Entertainment; he's the only second-gen Blizzard Entertainment worker in his family, so we all should be more respectful towards him, because Blizzard Entertainment experience is really important.
Note: His position was in QA.
Some big studios are also guilty of this. Risk of Rain 2's new update by Gearbox tied everything, and I mean everything, to FPS, even mob spawns. It was as bizarre as it sounds lol.
No, it's for the ARG, obviously. What, you think he'd make a mistake???
That is due to the engine. In game maker, everything is run once per frame, and it defaults to running 60 frames per second.
Lots of modern games have physical actions linked to fps. The knife in the RE2 remake does more damage if you are over a certain fps.
They do, they shouldn’t.
To be honest it isn't a problem for retro style games. I don't mind stable 60 fps in pixel art titles, animations are hand drawn anyway at like 4 fps and whether you have 16ms or 4ms latency is effectively irrelevant. More FPS to reduce your input lag kinda... does nothing.
So if someone takes this shortcut (or uses a game engine that just does it by default) I wouldn't really hold it against them. As long as it's running consistently at 60 fps and it's properly locked to that value. Now, if your game looks like it should run on a GeForce 3 and a 1.2GHz Pentium 4 and yet it drops to 45 fps on a PC 100x more powerful, then it's a very different story.
Admittedly some larger studios still do it to this day too and they probably shouldn't. The funniest example I know of is Dark Souls 2: the console version runs at 30 fps, the PC version at 60. And so the PC release was way harder than the console one: your weapons broke all the time, dodging certain attacks was near impossible, you got fewer iframes. In the newer games From Software just upped the default to 60, but you will still get glitches if you go beyond it. For those cases I 100% agree, physics and logic should have been decoupled ages ago.
FromSoftware is notorious for this, and it's probably my biggest complaint that their games are locked to 60fps.
yup, tried going above 60fps on ds3 and my running speed was super slow lmao
Paging Space Engineers, paging...Space Engineers. The Clang is calling.
Wait so he is not using delta time in his calculations xD?
Every beginner yt tutorial teaches you that.
delta time is not a realtime guarantee… the scheduler may or may not hand you a sane delta (i.e. system lag or stutter)… and then your physics blows up.
this can be harder than it looks.
I’m not a game dev, but I know that you don’t tie your physics to your frame rate. I’ve heard that based on the tools you have, it’s rather easy to handle it as well.
Yes, GameMaker Studio, his engine of choice, has a built-in delta time variable.
It holds the time between frames in microseconds; you can use that to make your game framerate-independent.
Did it have delta time available when he started making the game a decade ago? It wouldn't surprise me if it did, I'm just not very familiar with Game Maker.
Lol, lmao even.
ah, yes, that’s more stable. it’s basically a scalar on the physics which is constant based on the chosen fps, so it doesn’t suffer from lag spikes.
When you look at his code, you would think the guy has never looked at the docs or any tutorial in his life.
lol took me right back to 90s
I mean it would take you to 90s because the idea that you should decouple game logic from the framerate is old enough to vote and pay a mortgage.
That's pretty common in a lot of low-barrier-to-entry game engines. Terraria, Hyper Light Drifter, etc. Back when PC settings weren't great and we had to fight for an FOV slider, unlocking the fps past 60 could cause some crazy behavior.
I mean there are quite a few modern games where you can get fucked by having lower or higher fps
so?
Yes, because that's how it works in GameMaker Studio
I love how much this dude is getting ripped on in every subreddit.
I'm out of the loop. What happened?
There was a petition to save abandonware games and this dude came out against it.
He regularly suggests he's some beast developer or hacker or something, so when he pissed off the community they looked into his background as well as the code for his game, and suddenly it looks like he may have been exaggerating a bit
I only know him as a QA tester from blizzard, not sure why he suggested he was a great coder
Because he loves to lie.
I only know him as that furry guy who was embroiled in a gay furry RP cheating ordeal
I keep seeing this one be mentioned but haven't been able to find any actual posts with information, you got any links that cover this?
that furry ordeal, i think the search terms for it are maldavius (jason hall aka pirate software), second life, and woodbury university
The gay furry erp part of the video is crazy, watch at your own risk or skip that part, here is the video covering it all: https://www.youtube.com/watch?v=rWoxDoDn44I
If you're just interested in his other side, where he claims to be a hacker and a good game developer, then watch this 2-part series: https://www.youtube.com/watch?v=0jGrBXrftDg&t=10s
None of those words are in the bible.
Well duh it wasn't written in English
Ordeal
Didn't he brag about working as a "hacker" for some govt nuclear plant or something? Or was it someone else?
Yeah, he did!
His LinkedIn shows he *did* work with energy companies, and he probably *did* do pen-testing, but from the look of that same LinkedIn, it's pretty clear his skillset is *phishing* and *social manipulation*.
Basically, he was probably one of the guys who would show up at the front door going "Hey man, my car broke down; I already called a tow, but do you have somewhere I can come inside and rest?", a couple minutes later it's "Hey dude, can I borrow your bathroom?", and next thing you let him out of your sight and he's walked somewhere unsecured and is making notes your manager is going to write you up about later.
...in other words, nothing at all involving software, let alone "hacking".
His GitHub repo is a wasteland.
Just heard of this guy now. I'm curious why he was against it?
He has his own publishing company. Simple conflict of interest.
a bit is an understatement
"he may have been exaggerating a bit"
you may have been understating a bit
He seems to lack programming fundamentals, not knowing basic concepts such as magic numbers or for loops, and he uses about the worst possible way to keep track of events in his game: one massive array of 500-some indexes, which he sets in one file with a comment behind each entry saying what it's for and what can go into that particular index (imagine having to add another index at 215 to keep related story events grouped, and then having to change all the subsequent entries and their calls). When called out on this he copes by saying it's so his ARG is easier and more doable, or so the save file is more easily editable. I've seen him say both; both sound like massive cope. (There's a rough sketch of the usual alternative below.)
All these freshman-level mistakes while he claims to be a game dev with 20 years of experience who worked at Blizzard for 7 years. After some research, people found he did QA (he equates this to being a dev because it's part of the development process) and what amounts to social engineering. His dad was also a founder at Blizzard, which is how he got the job; he's Blizzard's first nepo baby.
The worst part is he's just a smug, arrogant person who condescends to everyone because he worked at Blizzard and can never accept that he's wrong or made a mistake, no matter what. He gets into a lot of unnecessary drama because of that personality trait. He did an interview with a YouTube therapist where this also came up; he denied it and blamed others for not understanding how he was right.
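For anyone wondering what "don't hard-code the indexes" looks like in practice, here's a rough GML sketch of the usual alternative; every name here is invented, nothing is from Heartbound's actual flag list:

```
// Name each story flag with an enum instead of remembering that index 215 means
// "fed the baron". Adding a flag is one new line; call sites stay readable and
// nothing gets renumbered by hand.
enum StoryFlag {
    met_binder,
    opened_basement,
    fed_baron,
    count // keep this last: it doubles as the array length
}

global.story = array_create(StoryFlag.count, 0);
global.story[StoryFlag.fed_baron] = 1;
```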
I think the ARG is to find out how bad his code is and fix it.
That, or it's the sound any actual dev makes when they see his code.
He's a guy who basically claims to be an expert game developer, which he is not. He's worked on the same game for about 8 years, and it isn't complete, still stuck in early access at around 20 USD. People dug into his code, and found that it sucked, and was likely the reason why it was taking so long. The delay is not from a lack of capital, he makes a decent amount of money livestreaming himself coding and playing games, and actually earns enough to pay a sound engineer and artist. So the game's code is very likely the thing standing in the way of completion. He mostly scoffs at criticism, and just does his own thing regardless. He's sort of like an "Anti Jonathan Blow"; same egotistical bullshit, but without any real skill to back it up.
The actual reason anyone even cared to begin with is that he publicly doesn't like Ross Scott's "Stop Killing Games" movement. It's not the first time he's been wrapped up in drama. Discussion now is basically just about his hubris and lack of actual programming skill, more than his opinions about this or that thing.
IIRC, it took him 2 years to make the first 2 chapters, and then he was "86% done with chapter 3" for another 6 years with no updates to show for it... because once he made it big, he started streaming 8-16 hours a day and never actually doing any dev work. Not that his dev work was that great to begin with, but at least Heartbound chapters 1 and 2 *run*, which is more than can be said of the rest of the game.
It still baffles me that he managed to drag out the making of a simple GameMaker game for 8 years and make basically zero progress with it. I know some ero game devs who will finish this shit within a year or less and it will still have more content.
TLDR he's just a massive asshole
Also he didn't press his mana gem
He made up some shit, framed other shit in a way to make himself look cool and knowledgeable, and rode the fame wave exploiting his relationship with Blizzard (while bashing them), so in general he's kind of a shady person. Some of his takes were objectively good (which I can say from experience), but generally one should take what he says with a fistful of salt.
Dude postures as an ex-Blizzard master game developer on YouTube and streams. He speaks like he has authority on everything and even uses a voice changer to make his voice sound deeper and more authoritative on a subject. Eventually he made a video series on something that was blatantly incorrect, spreading misinformation and putting down a pretty reasonable cause.
People looked into his background and it turns out he was just a QA tester at Blizzard who got the position due to family connections. It turns out he knows very little about coding, game dev, etc and he's being exposed for speaking out his ass.
He didn't admit to being wrong in World of Warcraft; the rest is history
I love him as he got me into finally starting to learn game development. But the current release of him has some definite bugs and needs to go under review.
I actually don't. As a society we have this tendency to really engage in character assassination, cancelling people, witch hunts, whatever you want to call it, and I think it is wrong and dangerous. No one is perfect, everyone has issues. Think about the stuff people could theoretically rip you a new asshole for if the problem was on public display.
The top comment here is some genius who figured out that the game physics must be tied to fps because they said the game runs faster above 60fps. As in, that's an implication they had to draw; they didn't get it from the second paragraph, where it's stated outright.
I really don’t know where people are getting their high horses from.
We're really beating a dead horse here...do it some more.
How much more? Three nested loops worth?
Nah, this guy is bad and everyone needs to know how bad he is. I don't want to ever work with or encounter anyone that learned any coding from his videos. There's so many better options out there.
Yes, that's the "do it some more" part. Keep beating the dead horse.
Does he even teach coding?
I only ever heard him talk about logistics of game development.
Yandev? Is that you?
I just can't get over the loop condition: why not simply xx < sprite_width ?
I mean, he could also just start the variable itself at 1, xx = 1; is totally fine.
Who's "we" ? Doesnt he develop this game by himself?
What is that language?
GML (GameMakerLanguage). Scripting language for the GameMaker engine
It's GML, GameMaker's custom language. It's similar to JavaScript.
Pretty sure it's a proprietary GameMaker Studio language
But! What's with that code!? This can't be real!
You don’t run nested for loops straight from a switch statement? Are you okay? /s
It is real, and there is a video of a game dev comparing this implementation to a better one
https://www.youtube.com/watch?v=jDB49s7Naww
... tying logic to fps? even 13yo me wouldn't do such a thing
It's a pretty common pattern in historical game dev. Used less now, but it's not as crazy as it sounds. You essentially couple your logic/physics to your drawing/rendering logic. Everything gets done in the same loop: you calculate your player's position in the same loop that you draw them in the new position, and you do this loop 60 times per second. You don't calculate anything before it's necessary, never wasting CPU cycles, and certain behaviours become more predictable. This makes building an engine a lot simpler.
It's a lesser-used pattern for modern games because they're played on a variety of platforms with different hardware. You can no longer predict a player's FPS precisely like you could back when games were only ever played on one generation of console (or on an arcade machine). You can lock a player's FPS at 60 of course, but then if their hardware is worse than you expected and their framerate drops, you'll have a bad time in the other direction.
For modern games, handling differing framerates is usually more complex than just decoupling your game logic from your rendering logic. So now game logic tends to run on a fixed timestep/interval, or is entirely event-based, regardless of whether the player is rendering the updates or not. Some big AAA games still use engines with logic tied to FPS though. Notably Bethesda's Creation Engine games (Fallout 4, Skyrim, Starfield, etc.) all still use the player's FPS for physics.
Back in ye olden days, any given cpu instruction took literally the exact same number of clock cycles no matter when you ran it. Nowadays with hardware level branch prediction and speculative execution there is no way you can know how many clock cycles anything takes. Not to mention software level thread context switches that make timing anything impossible.
even in cases where logic isn't fully tied to fps, many games have frame-rate-dependent quirks or glitches
Yeah, I’m a second year student studying game development and pretty much everyone defaults to that because it’s already the default function in unity when you start a script.
While this can be fine, you see a lot of people who have it running fine in the editor, then wait too long to test it in a build and everything breaks.
Using FixedUpdate instead was one of the most useful lessons I learnt in my first year.
For most tasks, it's easily patchable by multiplying by time deltas. In engines like Unity, it's still a pretty common practice to make most of the logic in Update() (tied to framerate) and use FixedUpdate() (mostly untied) only for things that strictly require a somewhat stable tick rate, like physics calculations
A) he is not a coder B) deltas exist
Tell that to console game developers.
It used to be common practice, even massive games like Bloodborne do it. It's just the most straightforward way to manage time in games with the FPS as a sort of global way to tie everything to, otherwise keeping everything in sync is difficult. Obviously it has many downsides and is a dying practice, but especially on older consoles and such where FPS was usually capped to 30 or 60 anyway, it was "okay" to do.
Pretty sure Fallout was tied to it in some parts until recently.
The Creation Engine would get weird if you uncapped your FPS as recently as Skyrim if I remember correctly (the normal edition, not the special one, at least). I was always told to cap it at 60. With Starfield, and the new version of the engine it uses, this is not necessary anymore (or at least, I've not noticed anything strange).
Yeah the carriage in Skyrim's intro would start doing flips and flying if you had an FPS above 60.
Eh, there are use cases. I don't mind it in Noita, for example. Better than the alternative.
Most beginner gamedevs in their 30s still do that (like me). Like, I know it's bad, but it's just so easy to do.
If you're using Unity you can switch to FixedUpdate, which works almost the same except it uses fixed time instead of frames
I use ebitengine, and the fact that the engine uses tick-based updates by default (it doesn't even pass delta time as an update parameter) just makes me not use that method by default. Some ECS frameworks built on top of ebitengine help with this issue a bit.
And TBF, from my limited experience the engine is pretty performant; it still runs 55-60 FPS on some logic-heavy, non-optimized scenes on OpenGL. But when I need to export it to WASM, the framerate drop feels abysmal and beyond obvious.
It's really not the end of the world. Tons of games still do it and it works. There are better designs to follow, but your game is probably fine.
For work I do a lot of cloud. When I was in school I was taught that monolithic architecture was archaic, dead, and overall just terrible design, and that OOP was god's gift. Now monolithic is making a comeback as companies want more control and vertical scalability, and OOP is running into limitations as it's not as performant.
Not that monolithic architecture is better than cloud or vice versa, or that OOP is worse than functional patterns, or any of that. They're tools in our toolbox. There will be trade-offs, there will be advantages, and there will be times where it doesn't matter, so pick whichever one you feel comfortable with.
it's the worst fucking thing. the definition of "it works on my machine". it's a single division, not exactly rocket science
It's not like "it works on my machine", it's more like "it works on our machine, just slightly differently". It's just division, but if you have a lot of things moving at different speeds, they all need their own division and this can add up fast. Add to that some things such as:
- the engine has v-sync or tries its best at 60fps
- the game is not too demanding (like basic 2D PNG sprites slapped on the screen with no fancy shaders or other stuff)
- the inconsistency is hard to perceive on modern hardware.
It makes tying logic to the frame tick just so convenient.
Yeah, using ticks is pretty common.
Ticks and fps are different things. Ticks are a sensible design pattern that solves problems; fps just makes a huge mess of everything. A game tick is a fixed interval to repeat logic on. A frame is an extremely variable interval that is different on every single machine, and even on the same machine from run to run, and is also highly dependent on what exactly is going on in the game
I’m not sure about other engines, but in UE ticks are tied directly to FPS. You have to manipulate your values to decouple it from your frame rate, but inherently it ticks once for every frame.
FWIW it's how games used to be architected. A tick was a tick was a tick. Everything in a single thread.
But now we have more sophisticated methods of structuring code to work asynchronously so generally we avoid this to make it more resilient to different runtimes.
For a modern 2d game like this I'm frankly surprised it's having such a hard time on modern hardware, so it's quite apparent the code is just slow garbage.
Heartbound has game logic tied to the FPS so that certain parts of the ARG can work.
you can't make this shit up...
All Game Maker games are tied to the FPS
At this point... I believe this whole 10-year development is some form of passive income while he streams video games. I mean, with his poor coding skills I can only imagine what debugging is like for him, and then there's that fake Smart Fridge scam he's got going.
At 10 years of dev time I would have assumed the game was a major money loser, but according to some analytics sites he has sold around 100k copies and earned somewhere between $700k and $1.2M.
Hell he made 5.5k in the last week alone on it.
For a game he has barely touched in the last 3 - 4 years he is pocketing a gross amount of money.
Next to his streamer income it might not be considered much to him though.
Does anyone know if the game is actually good? Willing to believe it maybe has shit code but is a good game.
Undertale was an absolute mess as I recall (but at least that guy isn't going around masquerading as a good coder).
The reviews I actually believe are the ones saying that it basically doesn't exist: it has a decent idea and okay writing, but it's full of holes and, all told, maybe 4 hours long after 10 years.
It's in the same vein as vaporware when compared to what it is supposed to be.
It's not that the game itself is horrible, but the reality surrounding it: "Kickstarted some 5 years ago", "less than half done", "10 years in production", "no real content updates in 2 years", "missing major game elements". It's a clear indication that it will never be a fully released game under the current production.
And when faced with the real concerns about the game from people who actually had faith in it, he lumps them in with those who clearly have ill intent so that he never needs to address any of the complaints at all.
Like "Code Jesus" video IS grifter content made to please the masses who already hate Pirate Software, and that helps Pirate deflect reality even more, because now he has a new flood of hate to shield his real issues.
Coding Jesus made a video about the performance of his lighting (which is horrible)
You remember GTAV taking forever to load? Yeah a modder solved it after nearly 10 years. The game was downloading a json file for the game store. They were parsing through the JSON multiple times. So as the store grew so did the loading times. Rockstar claimed for years they couldn't figure it out lol.
Nah no way this is real, can you show the source vid?
This guy has been on my “I know something’s off but I can’t prove it” list since he talked about single handedly tracking down some hacker at defcon. But other dev YouTubers seemed to like him so I said nothing. So glad people are roasting him now.
Interested to know if the claim "Game Maker Studio 2.3 has a ton of bugs" is legit.
I mean historically based on the person saying it, it's a lie, but that's not real fact checking.
I’ve been using 2.3 for a while now. Zero issues so far. Any issues he has is due to him trying to do stuff in backwards and inane ways
Maybe just like a "I copy pasted my entire game into the new version and things broke" type thing.
Some people had issues with working with larger projects in the IDE and the legacy JS-based web runner (the "HTML5 export module") had a lot of bugs and parity issues, though it has improved a bit after they open-sourced it. They'll probably sunset it some time in the future in favour of their newer WASM-based runner. Nowadays, GameMaker (as it is called now) receives much more frequent updates (a major update once every 2 months or so vs. whenever they felt like it) and major bugs are usually resolved faster in minor updates.
GameMaker has seen a lot of improvements since the 2.3 days, the biggest among them is probably making it free for non-commercial use. Frankly, the subscription model made it much less attractive to new users compared to Unity and was really holding it back.
https://gamemaker.io/en/blog/gamemaker-studio-2-dot-3-new-gml-features
nothing breaking, a lot of syntactic sugar. He's just saying it as an excuse lmao
What I find so weird about all of this is the fact that he's been streaming himself coding the game for a LONG time, yet people only started giving him shit for it now. Like, yes, this code is dogshit, but I don't remember it looking bad in the past (or at least no one gave him shit for it before). Makes me wonder about people's criticisms. Is he just updating code he wrote early in the game's development? Is he fixing code an intern gave him? Are people just now noticing how bad his code has been the whole time and only now deciding to point it out? I am so confused
People didn't really give a shit until first the WoW hardcore drama and then more recently the Stop Killing Games drama. But now he has angered the internet and people are digging through everything he ever did in order to hate him more.
I haven’t seen the whole thing, but I think what he’s doing could be done better with a shader. I’m pretty sure shaders were around in GameMaker when he started this project.
It’s absolutely what a shader should be used for.
Something I haven't seen addressed is whether this script is even used in-game / whether it was ever intended to run in real time. It's on the order of being so slow the game would not run at all. I've written really dogshit code to try things out before. CJ presented it as just a file in the demo.
I'm tired of seeing this dude's face just because he has a radio voice
Gamemaker 2.3 has a lot of bugs
Lmao bro, even platform bugs would be preferable to poor coding and performance issues from your own code.
If only there was a way to check what code was causing issues. Must not have been invented yet
I have never heard of a game released and updated in this DECADE that ties game logic/physics to the user's FPS; truly incredible, he keeps outdoing himself
This decade as in the 2020s or this decade as in the last 10 years? Fallout 76 was a thing.
I feel so elite for understanding what’s happening here
Nier Automata had the same issue, its 60 fps locked! I swear!!! *puts in jail*
It's kinda funny how his code looks exactly like YandereDev's
Ehhh, I'm conflicted. On one hand I get why the guy is getting criticised due to the Stop Killing Games petition and the WOW debacle, but on the other hand I feel for him because it can't be nice having the whole of the internet pick apart everything that you've ever done just to take the piss out of you.
I get that the things that he's getting ripped about are out in public (like his code) but at what point does memeing him cross the line into bullying and canceling him? I've seen witch hunts on the internet end badly and I'd hate that in this instance.
I used to enjoy posts from this sub when they popped up from time to time.
But now every time I see a post recommended from this sub, it's this kind of post. Wth happened
this sub was making absolutely braindead, cs-freshman-level jokes, yet now you're mad that it actually makes good, albeit still elementary, jokes.
wait a moment, is this code real? looks like he checks collision for every point of the sprite twice? once is stupid, but twice? dude doubles down even in code
It's used for some gradient objects and lighting effects in Heartbound. And yes, those are collision checks happening for every pixel across the sprite; a 100x100 sprite becomes 10,000 collision checks every frame
What in the actual fuck
Lmao what's optimization
I optimized it by throwing more hardware at the issue
It runs slowly on some machines. We have no idea why. Don't ask us what a profiler is.
What does O(n) even mean... Do you think I would get hired at Blizzard?
Spoiler alert: he wasn't a dev, he was one of the guys who would ban you for cheating in WoW
It's called a janitor
At least he didn't do it for free (I hope)
does your father work at blizzard and is he one of the founders? if the answer is no, then no
for a total noob like me, what would an optimization for this look like?
I would put a bigger bounding box around the entire sprite; no need to check for collisions if other objects are not close (see the sketch below)
Then maybe I would devise a way to figure out where the other object is coming from, and I would only test pixels that are close to it
Also I would create a map that only has the outline of the sprite so I only test against the border
So I would reduce 10,000 checks to maybe 30 per frame
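A rough GML sketch of that first step, the bounding-box broad phase, using GameMaker's built-in bbox_* instance variables (the function and the margin parameter are made up, not the game's code):

```
// Broad phase: O(1) overlap test between two instances' bounding boxes, so the
// expensive per-pixel work only runs when the boxes actually touch.
// margin is the "bigger bounding box" padding described above.
function boxes_overlap(inst, other, margin) {
    return (inst.bbox_left   - margin <= other.bbox_right)
        && (inst.bbox_right  + margin >= other.bbox_left)
        && (inst.bbox_top    - margin <= other.bbox_bottom)
        && (inst.bbox_bottom + margin >= other.bbox_top);
}
```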
I could be wrong, but likely there is already a bounding box check done and this is run on the objects that passed the bounding box check, hence the object_index value.
And this part of the code is just doing a line scan along the x axis until it collides with said object, to light that object with a set gradient so that the object can cast a shadow and only one face of the object is lit.
Looking at the code available for working with sprites in Game Maker Studio, it honestly looks like these checks are being done on pixels close to the sprite to begin with. The only optimization that could really be done at this point is to make sure there isn't any padding on the sprites, and maybe have multiple box colliders if possible.
But if Game Maker Studio had a proper sprite API where you could get the color of a pixel in a sprite, I would probably include an additional sprite for each render sprite that stores the coordinate of the first non-transparent pixel for each line along the x axis, and then just read that pixel for each line to get the coordinate instead of having to do collision tests. This second sprite could be created during loading, during sprite creation, or at build time.
But Game Maker Studio doesn't have a performant way to fetch pixel data from sprites. So my optimization, and yours with the map of the outline of the sprite, isn't possible. Mind you, as I said, it looks like your optimization is what is actually being run in Thor's code, though it has to check against a collision mask instead of the actual pixels of the sprite.
That's actually interesting. I wasn't aware this is how GameMaker works
Maybe the only optimization possible then is quadrant subdivision (quadtree-style), so you test progressively smaller quadrants until you get a hit? So you do 4 checks, then another 4 only if you get a hit in any of them, and so on... on a 100x100 sprite you would narrow down to the exact collision pixel in roughly 7 rounds of at most 4 checks each
Well thinking about it, it seems that the lighting algorithm is a left or right edge detector based on light direction. I see what you are saying, essentially a binary search per line until the pixel collision is found. That is one way to do it.
Personally, I would move it fully onto the GPU, and in the fragment/pixel shader for the sprite do 5 samples (or however many samples are required for the shading) to the left or right of the pixel depending on the light direction, count how many samples have color in them, and then do the lighting falloff based on the count.
Bit heavier computationally, but it is highly parallelized and on the GPU which is what the GPU is for.
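And a rough GML sketch of the "binary search per line" idea from this exchange; it assumes each scanline is empty-then-solid in the search direction, uses GameMaker's built-in collision_point, and everything else (the function name, the arguments) is made up:

```
// Find the left-most colliding x on row yy against obj in about log2(width)
// collision_point calls, instead of one call per pixel.
function first_hit_x(x_left, x_right, yy, obj) {
    var lo = x_left;
    var hi = x_right;
    while (lo < hi) {
        var mid = (lo + hi) div 2;
        if (collision_point(mid, yy, obj, true, false)) {
            hi = mid;      // hit: the edge is here or further left
        } else {
            lo = mid + 1;  // miss: the edge is further right
        }
    }
    return lo; // left-most hit in [x_left, x_right); x_right means no hit in that range
}
```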
Narrow bounding boxes make for simple tests. Then you use small triangles. Center of triangle to center of triangle is just simple math. Then see if any edges of close-enough triangles intersect. Works well.
There are other ways. You can also divide the object into a tree-like structure and compare node distances; for example, if the forearm node exceeds a certain distance from another tree, forget the whole branch and its sub-branches, or both trees entirely. When key nodes get close enough that there could be a collision, look at the lines between the nodes of both objects, like halfway down the branch. A few calculations later, compare the lines between relevant nodes. Nodes with overlapping lines are collisions. Every calculation is really efficient; it's line math via coordinates
Both methods are built on how blazing easy and fast it is to compare line intersections after using a quick test, again with lines to narrow down the relevant lines.
An easy optimisation would be to use GameMaker's built-in lighting and shader functions
He is doing everything manually in an insanely stupid way
Wow. This, kids, is why you don't try to reinvent the wheel unless you really, really know what you're doing.
Nonsense, reinventing the wheel is a great way to learn how the wheel works and what goes into making one.
A handmade wheel probably isn't the best thing to attach to your car in production though...
The thing is, GameMaker Studio doesn't have a built-in lighting engine, so u/StrangeCurry1 is wrong that he could use the built-in lighting, since it doesn't actually exist.
BUT IM A LEET CODE HACKER. IM DEFINITELY GOING TO WRITE MY OWN SORT FUNCTION.
Game maker studio doesn't have a built in lighting engine.
If you can express a good enough approximate collision area as a bounding box, all you need to do is compare the x and y ranges of possible collisions. A couple of checks per axis.
>It's used for some gradient objects and lighting effects in Heartbound.
Also for shadows.
Close, it runs the check along the x axis until it finds a collision and then moves to the next line. This is done for shadows. Not the best solution for it. But the solution given in the Coding Jesus video doesn't do shadows; it just handles lighting things inside the light bounds.
Holy shit that is completely insane
Reviewed here with alternative implementations proposed: https://youtu.be/jDB49s7Naww
Not 100% sure if that code is real, but I would not be surprised. Every single time I have seen code from this "20-year industry veteran" it has been this kind of intern-tier quality.
I just saw this snippet in a youtube video and apparently it is a part of an open source demo he released a while back
Literally the worst code I have ever seen from a big-name social media personality who casts themselves as an expert. There are so many better influencers. Someone like Primeagen is much better; I don't agree with everything he says, but you can tell that he knows how to code and I'm interested in his opinion.
This guy is just like toilet code; you could actually get worse at programming by listening to him.
Me neither. I was making negative comments about him even before SKG and people were downvoting or arguing with me constantly. Some things he was saying about gamedev or programming were not true; some of his tips could even be harmful.
his l33t h@xX0r experience, if there was any, is just stone soup: absolutely 0 relevant information coming from him and 100% coming from the chat, who actually have an idea of the field, and most of what he says is incorrect bait to draw out those people with actual knowledge
Yeah, everyone's mocking his personality, meanwhile I'm staring in horror at this quadruply-nested loop for something which I think doesn't need to be a loop at all. If I understand this correctly, you're checking whether two rectangles overlap, which can be done in O(1) time.
It seems like he's also manually implementing some sort of shader at the same time instead of using GameMaker's built-in shader functions.
This is multiple levels of shitty code
His excuse is he didn't use shaders so it would run better with integrated graphics. Not even joking.
No excuse when GameMaker supports GLSL. Pick a version to make your shader in. OpenGL 2.0 shaders were released on 30 April 2004 and OpenGL 4.6 was released on 14 June 2018.
In my games I use 3.3 just so that I know it will work on nearly everything.
Aaaand now it’s using 8x cpu
We’ve had one collision, yes. But what about second collision?
maybe anticheat
From what I see, it looks like a code that could've been a shader in Unity.
But I'm not familiar with Game Maker.
Ah just to be sure you know how it is