It's complicated. (Gaming)

by Cody Miller @, Music of the Spheres - Never Forgot, Tuesday, July 03, 2018, 10:49 (2135 days ago) @ uberfoop
edited by Cody Miller, Tuesday, July 03, 2018, 10:55

This isn't a matter that you can address with a simple acuity argument, because undersampling produces artifacts that can be detected much more easily. For instance, it's easy to make out specular flicker from normal-map aliasing in some seventh-gen games even when you're playing them on a 40" screen from some ridiculous distance like 50 feet.

You never see artifacts in a movie, even one shot in 2K, and there is no supersampling going on. There is a set number of photosites on the imaging chip. Some cameras, like the Sony F65, do supersample, but others, like the ARRI Alexa, don't: you really are imaging at 2K. You don't see artifacts because the cameras have great post-processing to eliminate them.

You could say that reality is analog, so all the 'effects' are done by the time the light hits the sensor. Which is exactly the point I'm trying to make: reality is the ultimate 'resolution doesn't matter as much' argument, because even a 640 x 480 photograph still looks photorealistic.
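
To make that concrete, here's a rough Python sketch (my own illustration, not anything from the thread; it assumes Pillow and NumPy are installed, and "photo.jpg" is just a placeholder filename). A photograph downsampled to 640 x 480 with a proper filter keeps its photographic look, because the lens and sensor already band-limited the scene; point-sampling a synthetic high-frequency pattern at the same resolution has no such prefilter, so it aliases.

    # Sketch only: Pillow + NumPy assumed; "photo.jpg" is a placeholder filename.
    from PIL import Image
    import numpy as np

    # 1) A real photo, downsampled with a filtered resize, still looks photographic:
    #    the optics and sensor band-limited the scene before it ever became pixels.
    photo = Image.open("photo.jpg")
    photo.resize((640, 480), Image.LANCZOS).save("photo_640x480.jpg")

    # 2) Point-sampling a synthetic pattern whose frequency keeps rising (a zone
    #    plate) has no prefilter, so everything past Nyquist folds back as moire rings.
    xx, yy = np.meshgrid(np.arange(640), np.arange(480))
    zone_plate = 0.5 + 0.5 * np.sin(0.05 * (xx ** 2 + yy ** 2))
    Image.fromarray((zone_plate * 255).astype(np.uint8)).save("aliased_640x480.png")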

The whole point behind anti-aliasing algorithms was that they were cheaper than brute-force supersampling; if they weren't good 'bang for the buck', they wouldn't exist. In your example, the developers could have lowered the resolution and spent the freed-up GPU cycles on heavier filtering to eliminate the flicker.
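
A quick back-of-the-envelope sketch of that trade-off (my numbers, purely illustrative):

    # Back-of-the-envelope shading budget (illustrative numbers only).
    budget = 1920 * 1080 * 1      # ~2.07M shaded samples: native 1080p, 1 sample per pixel
    w, h = 1600, 900              # drop the output resolution...
    spp = budget / (w * h)
    print(round(spp, 2))          # 1.44 -> the freed-up cycles buy ~44% more samples/filtering
    # Brute-force 4x supersampling of the 1080p frame would cost 4 * budget,
    # which is exactly why the cheaper approximations exist in the first place.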

Likewise, render farm time is not cheap. You aren't going to render at a higher resolution than you need to get the job done. As for your example of using more rays for the ray tracing: the number of rays per pixel is completely independent of the resolution, which actually supports my claim that beyond a certain point other things matter much more than the final pixel count.
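
The ray budget works out the same way (again an illustrative sketch with my own numbers): samples per pixel is a knob that is separate from the output resolution, and past a point it matters far more.

    # Ray-budget sketch (illustrative numbers only).
    def total_rays(width, height, samples_per_pixel):
        return width * height * samples_per_pixel

    # The same primary-ray budget can go to a 2K frame at 256 samples per pixel
    # or a 4K frame at 64; which looks cleaner is mostly about noise (samples
    # per pixel), not about the final pixel count.
    print(total_rays(2048, 1080, 256))   # 566,231,040
    print(total_rays(4096, 2160, 64))    # 566,231,040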

