Does anyone know of a particular equation that can give me an estimate of approximate encoding FPS based on hardware setup? For example, on my computer, Profile A may usually result in 0.86 FPS during the encode, causing a 3½-minute video to take about an hour and a half to encode. Profile B may usually result in 0.11 FPS, thereby taking about 12 hours for the same video. The resolution is the same in both cases.
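To make the arithmetic behind those numbers explicit (assuming a 24 fps source clip, which the post doesn't actually state), the time calculation is just frames divided by encode speed:

```python
# Rough encode-time arithmetic from the numbers above.
# Assumption: a 24 fps source clip (not stated in the post).
def encode_time_hours(duration_min, source_fps, encode_fps):
    """Wall-clock time to encode the whole clip, in hours."""
    frames = duration_min * 60 * source_fps
    return frames / encode_fps / 3600

# Profile A: ~0.86 encode FPS on a 3.5-minute clip
print(round(encode_time_hours(3.5, 24, 0.86), 1))  # ~1.6 hours
# Profile B: ~0.11 encode FPS on the same clip
print(round(encode_time_hours(3.5, 24, 0.11), 1))  # ~12.7 hours
```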
If I wanted to estimate how fast it would be on my grandparents' computer, for example, is there a way I could figure it out based solely on their hardware specs? I know that the amount of action in the video can change the speed even on the same computer, but this is simply meant as a rough comparison for a single given video.
I'm not looking for suggestions on how to speed up encoding, as I'm perfectly happy with my excessively insane settings. I'm looking at this purely from a hardware perspective, as pretty much any computer nowadays would be able to encode faster than mine. I just want an idea of how much faster a different setup is without going out and physically testing it.
