Timing Runs
keap3 · Super moderator · He/Him, They/Them · Kentucky, USA · 2 years ago

Hello Gamers™,

Sations mentioned that starting a thread about this topic might be helpful, giving us a long-form space to explain the theories and methodologies, as well as giving the community a space to publicly voice their opinions. I think this is a good idea as well, so I wanted to start one. Truthfully, these concepts still confuse me, but I'll give it my best shot. Hopefully sations can clarify or supplement where needed.

First and foremost, "time" is a real concept... though perspective can change the reality of time, and it's for this reason that it has been a struggle to find a consistent method of determining "real time" in the context of this game. Thankfully, the game does give us an in-game timer to work with, but considering the extremely fast nature of some of these runs, having milliseconds present in the final real time is crucial for maintaining the competitive edge of the game and providing more accurate results. Unfortunately, the in-game timer doesn't provide that level of granularity, and with its variable FPS it could STILL produce a difference between IGT and RTA. So real time should remain the standard timing method for this game.

But what does real time mean? Well, it's the actual real-life duration of the gameplay, not necessarily bound to the game's own measurement of in-game time. Seems simple. But it becomes complicated once recording, rendering, uploading, transferring, downloading, re-rendering, and frame counting begin. The way frames are defined, both in the rendered footage and in the software/hardware used to count those frames during verification, can vary simply depending on which roads the footage takes between gameplay and verification.

Number 1, the game itself has a variable FPS. It tries hard (and mostly succeeds!) to run at 60 FPS, and graphically it's not the most complex piece of software ever made, so it isn't rocket science to make this game run smoothly. But, really, a perfectly consistent FPS in ANY game is a bit of a myth. Even in my experience running Super Mario World - which runs off a cartridge on proprietary hardware - the FPS varies. It claims to run at 60 FPS, but it's really more like 60.098813897441... and even that is the shortened version of the number. And even in those cases, the FPS can vary.
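To give a feel for how much that fractional FPS matters, here's a back-of-the-envelope check (my own arithmetic, not anything from the game): if a timer counts frames and divides by a nominal 60 FPS, how far off is it after one real hour running at the SNES rate quoted above?

```python
# How much does 60 vs ~60.0988 FPS matter over an hour?
ACTUAL_FPS = 60.098813897441   # the NTSC SNES rate quoted above
NOMINAL_FPS = 60.0

real_seconds = 3600
frames_rendered = ACTUAL_FPS * real_seconds        # frames shown in one real hour
reported_seconds = frames_rendered / NOMINAL_FPS   # what a naive 60 FPS frame count implies

drift = reported_seconds - real_seconds
print(f"{drift:.2f} s of drift per hour")          # roughly 5.9 s
```

So a frame-counting timer that assumes a flat 60 FPS reads almost six seconds long per hour of SNES gameplay, which is why "60 FPS" is a myth worth caring about.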

Every time the FPS varies, the concept of time changes for in-game time. The game can only be rendered in frames, and if FPS is the unit of measurement here, in-game time is inherently tied to it.

What makes it more complicated, though, is that collecting, rendering, and exporting footage is then bound by the capabilities of the capture setup. Recording at 60 FPS is possible (albeit still variable), but it can take a significant amount of resources. Delivering raw 60 FPS footage to a moderator can take even more, and at that point the conversation becomes about access.

For most games on Speedrun.com, runners record footage and upload it to a service like YouTube or Twitch, then submit the link to be considered. In that case, we as moderators extract that footage as a file and run it through frame-counting software.

The problem with this is that there are tons of frame-counting tools out there, some built better than others! Depending on your knowledge of frame counting in general and which software is being used, your real time will vary!

In our experience and to the best of our knowledge, that's why we specifically state in the rules that runs will be timed using Avidemux: software that is free, available to everyone, and has consistent, understandable methods of rendering recorded footage and stepping through frames. Other methods in different software or browser extensions seem to be less reliable.

We could go through every iteration of this argument and find a different answer for each one, but without landing on a method, no runs would ever get retimed with certainty. There are a lot of arguments to be had, but we are landing on Avidemux to create a single, consistent reference point. A given piece of rendered footage should time out the same in Avidemux no matter where you are.
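For anyone unfamiliar with how retiming from frame counts works, this is the usual arithmetic: count the frames between the start and end points and divide by the footage's frame rate. The function below is my own illustrative sketch, not Avidemux's internals; you'd read the start/end frame numbers off the tool and plug them in.

```python
def retime(start_frame: int, end_frame: int, fps: float = 30.0) -> str:
    """Turn a frame span into a m:ss.mmm time string.

    Standard speedrun retiming arithmetic: frames elapsed divided by the
    footage's frame rate. Name and signature are illustrative only.
    """
    total = (end_frame - start_frame) / fps
    minutes, seconds = divmod(total, 60)
    return f"{int(minutes)}:{seconds:06.3f}"

# e.g. a run spanning frames 120..2375 in 30 FPS footage:
print(retime(120, 2375))  # 1:15.167
```

The key point is that the same two frame numbers always produce the same time, which is exactly the consistency being argued for here: disagreement can only come from tools disagreeing about which frame is which.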

Though this is the decision we are landing on, it is not necessarily the perfectly "correct" one. It's simply the approach we know, to the best of our knowledge, that provides consistency.

I post this thread as a way of explaining the method, but also as a way to give the community a voice. Feel free to chime in here with thoughts, questions, complaints, praise, suggestions, etc. Surely I'm forgetting parts of this argument, but I've gone on for too long and should probably stop. Please post here or DM the mods for further discussion if you want.

thanks,

peachy


I'd like to chime in and clarify a bit why Avidemux and yt-frame-timer specifically vary in their results, what exactly this means in the context of timing runs, why neither of them is perfectly accurate, and ultimately why Avidemux is the timing method of choice.

yt-frame-timer is a great tool and would be ideal for our purposes, but it ultimately relies on the frame-scrubbing feature YouTube provides, which is inaccurate and can produce varying results even on the same system, in the same browser, during the same session. What this means is that the start or end frame times can be off by anywhere from a few thousandths of a second to multiple frames.
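To put a rough number on that: the errors at the start and end points are independent, so they can add. A quick worked example (the "2 frames per endpoint" figure is my own assumption, standing in for the "multiple frames" above):

```python
# Worst-case compounding of scrubber error at both endpoints of a run.
FPS = 60.0
frames_off_per_endpoint = 2                   # assumed, for illustration
worst_case = 2 * frames_off_per_endpoint / FPS  # start error + end error
print(f"up to {worst_case:.3f} s of error")     # 0.067 s
```

For runs where leaderboard positions are separated by hundredths of a second, an error bar that size is a real problem.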

Avidemux, on the other hand, relies on a local file, so a moderator/reviewer has to extract the video from YouTube or Twitch. There are ways of doing this and pulling the raw video, but there's a catch (one I'm very surprised hasn't been worked around yet): when we pull the raw video from a YouTube stream, the server auto-selects the video quality. Unfortunately, the videos that are pulled are almost always 30 FPS, even when a 1080p instance of the video has been selected.

The reason we've opted for Avidemux is simply that YouTube's scrubbing feature is inaccurate, and those inaccuracies vary wildly. At least with a 30 FPS video we know each frame is always within +/- 0.01666 s.
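For anyone wondering where that +/- 0.01666 figure comes from, it's just half of a 30 FPS frame (my reading of the bound above): a frame lasts 1/30 s, so the true instant an event happened is at most half a frame away from the frame you land on.

```python
# Deriving the +/- 0.01666 s bound for 30 FPS footage.
frame_duration = 1 / 30            # ~0.03333 s per frame
max_error = frame_duration / 2     # true event is within half a frame
print(f"+/- {max_error:.5f} s")    # +/- 0.01667 s
```

Unlike the scrubber error, this bound is fixed and known, which is the whole appeal: a small, predictable error beats a large, unpredictable one.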

I think there are definitely moves that can be made to solve this timing issue, especially in the speedrunning community as a whole. We've tossed around the idea of working on an auto-splitter (something I don't really know much about yet), and hopefully that would take some of the headache out of timing these.

I hope this helps explain where our heads are at. Please post thoughts, questions, suggestions, ideas, memes. -sations

Edit: I also forgot to mention that the IGT for this game appears to floor the end time; that is, it always rounds down to the nearest whole second. Keap has mentioned that the IGT sometimes doesn't reset properly, and it's possible there are fluctuations in its starting/ending frames (I have yet to reach out to the game's creator to ask about this).
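A tiny sketch of why that flooring matters (my own illustration of the behavior described above, not the game's actual code): two runs almost a full second apart can display the identical IGT.

```python
import math

def igt_display(raw_seconds: float) -> int:
    """Sketch of the flooring behavior described above: the displayed
    IGT simply drops the fractional part of the elapsed seconds."""
    return math.floor(raw_seconds)

# Runs ending at 65.017 s and 65.983 s both show as 65:
print(igt_display(65.017), igt_display(65.983))  # 65 65
```

That up-to-0.999 s of hidden difference is another reason real time with milliseconds, rather than IGT, is the timing standard here.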

Edited by the author 29 days ago