
It's complicated. (Gaming)

by Cody Miller @, Music of the Spheres - Never Forgot, Tuesday, July 03, 2018, 07:30 (2340 days ago) @ uberfoop
edited by Cody Miller, Tuesday, July 03, 2018, 07:34

> Movies aren't rendered at 2K, they're mastered at 2K.

The CG work is in fact rendered at 2K if the film is finished in 2K.

> When using a physical camera, a ton of photons are involved in determining the color of each pixel. When rendering CGI, a ton of samples are taken at every pixel.
> It's easy to blow off this issue by saying "just use better antialiasing", but that's not really a complete solution. There are a lot of types of detail that contribute to a game scene, and within current rendering methodology, very few of them are actually being filtered correctly down from high sample rates. And that produces macro-scale inaccuracies that can't simply be swept under the rug.
> Even if we're playing on a 1080p display from a considerable distance, there are a ton of respects in which games do look cleaner if you boost the rendering resolution beyond 1080p and do a good resample to produce the final output.
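For reference, the "render above 1080p and do a good resample" step the quote describes is plain ordered-grid supersampling. A minimal sketch in Python/NumPy, assuming a hypothetical render_scene(w, h) callback that returns an (h, w, 3) float image; the box filter here is the simplest reasonable downsample, not the only choice:

    import numpy as np

    def supersample(render_scene, width, height, factor=2):
        # Render at factor x the target resolution.
        hi = render_scene(width * factor, height * factor)
        # Box filter: average each factor x factor block into one output pixel.
        return hi.reshape(height, factor, width, factor, 3).mean(axis=(1, 3))

Film renderers do something similar with many jittered samples per pixel and a proper filter kernel; either way, detail is integrated down rather than point-sampled.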

I am not doubting this, but the boost to image quality is slight, whereas adding better shadows, lighting, shaders, etc. creates a massive jump. Again, Quake 1 looks like ass at ANY resolution, because the renderer is so primitive.

When we are talking about SD, resolution matters a great deal. It's why Crash Bandicoot used the PS1's 512 x 240 mode: had they rendered at 320 x 240, Crash's single-pixel eye, among other details, would often have vanished. But in full HD you've got enough resolution to satisfy human visual acuity in most cases; VR, large cinema screens, and IMAX excepted.
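The vanishing-eye effect is easy to model. The sketch below point-samples a made-up 512-pixel scanline down to 320 the way a nearest-neighbor rasterizer effectively would; the real mechanism on the PS1 was sub-pixel polygon coverage rather than a literal resample, but the principle is the same:

    import numpy as np

    # Made-up 512-pixel scanline with one lit pixel standing in for the eye.
    line_512 = np.zeros(512)
    line_512[258] = 1.0

    # Nearest-neighbor "render" at 320 wide: each output pixel reads one
    # source position, so 3 of every 8 source columns are never sampled.
    line_320 = line_512[np.arange(320) * 512 // 320]

    print(line_320.max())  # 0.0 -- column 258 is never sampled; the eye is gone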

> Typical values for healthy young people are more in the ballpark of 20/15. (So a lot of people do even better than that.)

20/15 in my right eye, 20/10 in my left, checked six months ago. And to me, 4K on a small screen is such a minor difference.
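The acuity numbers are easy to sanity-check: 20/20 vision resolves roughly 1 arcminute, 20/10 roughly 0.5. Comparing that with the angle a single pixel subtends gives a feel for when extra resolution stops mattering. A back-of-the-envelope sketch, assuming flat-screen geometry and a 16:9 aspect ratio:

    import math

    def pixel_arcmin(diagonal_in, horiz_pixels, distance_in, aspect=16/9):
        # Screen width from the diagonal, then pixel pitch, then the
        # angle one pixel subtends at the given viewing distance.
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)
        pitch = width_in / horiz_pixels
        return math.degrees(2 * math.atan(pitch / (2 * distance_in))) * 60

    # A 24-inch monitor viewed from 24 inches:
    print(pixel_arcmin(24, 1920, 24))  # ~1.56 arcmin per 1080p pixel
    print(pixel_arcmin(24, 3840, 24))  # ~0.78 arcmin per 4K pixel

By the 1-arcminute rule of thumb, 1080p pixel structure is just resolvable in that setup while 4K's mostly isn't; 20/10 eyes can technically still resolve the 4K pixel, but the margin is small, which squares with 4K reading as a minor difference on a small screen.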

