<?xml version="1.0" encoding="utf-8"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/">
<channel>
<title>DBO Forums - Resolution vs Effects</title>
<link>https://destiny.bungie.org/forum/</link>
<description>Bungie.Org talks Destiny</description>
<language>en</language>
<item>
<title>Resolution vs Effects (reply)</title>
<content:encoded><![CDATA[<p><iframe style="border:none;" width="852" height="480" src="https://www.youtube.com/embed/SWcRtzjyH-c?autoplay=0&start="></iframe></p>
<p>The same folks that made the video in the original post seem to think that even the next gen of consoles won't try to target native 4K, but instead upscale in order to spend that extra GPU power on better images and frame rates.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=153237</link>
<guid>https://destiny.bungie.org/forum/index.php?id=153237</guid>
<pubDate>Sat, 14 Jul 2018 17:16:00 +0000</pubDate>
<category>Gaming</category><dc:creator>Cody Miller</dc:creator>
</item>
<item>
<title>It&#039;s complicated. (reply)</title>
<content:encoded><![CDATA[<blockquote><p>You never see artifacts in a movie, even when they are shot in 2K. There is no supersampling going on.</p>
</blockquote><p>Yes there is. Huge numbers of photons strike each photosensor element to contribute to the final image. In graphics rendering terms, that's basically supersampling.</p>
<p>The photosensor grids used in digital cameras often <em>do</em> have elements with a very narrow view, which results in aliasing if the cameras are used without a low-pass filter. Hence, low-pass filters are usually used in circumstances where aliasing is a notable problem with the scenes and cameras involved. </p>
<p>Although some post-processing is always done on digital imagery, the low-pass filters in question are usually optical filters that slightly blur the incoming light <em>before</em> the photosensor grid, causing light that might otherwise have &quot;missed&quot; a photosensor to strike it.<br />
This is the most correct way to antialias a signal: blurring before sampling prevents aliases from being introduced in the first place, whereas sampling and then blurring just blurs the aliases that result from the sampling process.</p>
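<p>A toy 1-D illustration of why the ordering matters (made-up frequencies, nothing from a real camera pipeline):</p>

```python
import math

def sample(f_signal_hz, f_sample_hz, n):
    """Point-sample a pure sine at the given rate, with no prefilter."""
    return [math.sin(2 * math.pi * f_signal_hz * i / f_sample_hz)
            for i in range(n)]

# A 9 Hz sine sampled at 10 Hz: Nyquist is 5 Hz, so the tone aliases to 1 Hz.
aliased = sample(9, 10, 20)
true_1hz = sample(1, 10, 20)

# The aliased samples exactly match a genuine 1 Hz tone (with flipped sign):
# sin(2*pi*9*i/10) == -sin(2*pi*1*i/10) at every sample point.
alias_error = max(abs(a + b) for a, b in zip(aliased, true_1hz))

# "Blur after sampling": averaging neighbouring samples only attenuates the
# 1 Hz alias; it can't remove it, because the alias IS a 1 Hz signal now.
post_blur = [(aliased[i] + aliased[i + 1]) / 2 for i in range(len(aliased) - 1)]

# "Blur before sampling": an ideal low-pass at Nyquist removes the 9 Hz tone
# before the sampler ever sees it, so the sampled signal is simply zero.
pre_blur = [0.0] * 20

print(alias_error < 1e-9)                    # True
print(max(abs(x) for x in post_blur) > 0.5)  # True: the alias survives
```

<p>Once the 9 Hz tone has been folded down to 1 Hz by the sampler, no amount of after-the-fact smoothing can tell it apart from real 1 Hz content; that's why the optical filter sits in front of the sensor.</p>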
<p>Anyway, the cleanliness of digital film isn't just a matter of post-process AA.</p>
<blockquote><p>The whole point behind anti-aliasing was that it was cheaper than supersampling. If it weren't a good 'bang for the buck' the algorithms wouldn't exist. In your example, the developers could have lowered the resolution, and increased the filtering to eliminate the flicker with the extra GPU cycles that freed up.</p>
</blockquote><p>Yes, but as I stated earlier, there are numerous respects in which the only current way to eliminate the artifacts while producing accurate results <em>is to supersample</em>. Saying that you just need to &quot;increase the filtering&quot; ignores that rendering is far from an easy and solved problem.</p>
<p>So, take the example of specular shimmer. It's very easy to reduce specular shimmer <em>inaccurately</em>.<br />
For example, Halo 1's normal maps undergo trilinear texture filtering, preventing sharp changes in surface normal from flickering in and out of existence. However, this also visually flattens surfaces at a distance. That isn't a big issue for Halo 1, because the only materials in the game with sharp normal maps are smooth aside from some large cuts, so flattening the normals doesn't harm the perceived material types. But it can be a problem for things like micro-smooth surfaces whose complex macro-roughness is represented in the normal map; they go from being chunky up close to mirror-like at a distance. Consequently, games sometimes choose to filter the normal maps sharply, prioritizing material accuracy at the cost of lots of flicker.<br />
Nowadays there are some techniques available to combat this, like Toksvig mapping, which essentially transforms normal map content into material roughness as the normals flatten out at a distance. But there are plenty of circumstances where such techniques are still far from a ground-truth render.</p>
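<p>A rough sketch of the Toksvig idea (the formula follows Toksvig's paper; the function and variable names here are just illustrative):</p>

```python
import math

def toksvig_exponent(avg_normal, spec_power):
    """Scale a Blinn-Phong specular exponent down by the Toksvig factor.

    avg_normal: the (unnormalized) average of the normals covered by a
    texel's mip footprint; its length drops below 1 as the normals diverge.
    spec_power: the material's original specular exponent.
    """
    na = math.sqrt(sum(c * c for c in avg_normal))
    ft = na / (na + spec_power * (1.0 - na))  # Toksvig factor
    return ft * spec_power

# Flat footprint: all normals agree, the average keeps unit length,
# and the exponent is left alone.
print(toksvig_exponent((0.0, 0.0, 1.0), 64.0))  # 64.0

# Bumpy footprint: mip-averaging divergent normals shortens the vector
# (length 0.8 here), and that shortening is converted into a much rougher
# (smaller) exponent instead of a falsely mirror-like surface.
print(toksvig_exponent((0.0, 0.0, 0.8), 64.0))  # ~3.76
```

<p>The shortened averaged normal is exactly the information ordinary mip filtering throws away; Toksvig mapping recycles it as roughness so distant surfaces get duller instead of shimmering or turning into mirrors.</p>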
<blockquote><p>Your example of using more rays for the ray tracing… this is completely independent of the resolution and actually supports my claim that beyond a certain point other things matter much more than resolution.</p>
</blockquote><p>In terms of combating artifacts relating to inadequate sampling, using more rays is pretty much the same exact approach as rendering at a higher resolution. Both are an increase to the number of point-samples being taken to create the final image.</p>
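<p>A minimal sketch of that equivalence (toy Python, made-up geometry): estimating one pixel's partial coverage of a curved edge. Whether each sample is called a &quot;ray&quot; or a &quot;subpixel&quot;, only the sample count drives the error down.</p>

```python
import random

TRUE_COVERAGE = 3.141592653589793 / 4  # exact quarter-disc area in a unit pixel

def estimate_coverage(n_samples, seed=0):
    """Estimate how much of a unit pixel a curved edge covers by point sampling.

    Each sample plays the same role whether you'd call it an extra ray cast
    from the pixel or an extra subpixel rendered for a later downsample:
    it's one more point estimate of the same integral.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y < 1.0:  # sample landed on the covered side of the edge
            hits += 1
    return hits / n_samples

# One sample per pixel: the answer is all-or-nothing, i.e. aliasing.
# More samples per pixel: the estimate converges on the true ~78.5% coverage.
for n in (1, 16, 256, 4096, 65536):
    print(n, round(abs(estimate_coverage(n) - TRUE_COVERAGE), 4))
```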
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152971</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152971</guid>
<pubDate>Tue, 03 Jul 2018 18:42:08 +0000</pubDate>
<category>Gaming</category><dc:creator>uberfoop</dc:creator>
</item>
<item>
<title>It&#039;s complicated. (reply)</title>
<content:encoded><![CDATA[<blockquote><p>This isn't a matter that you can address with a simple acuity argument, because undersampling produces artifacts that can be detected much more easily. For instance, it's easy to make out specular flicker on normal map aliasing in some seventh-gen games even if you're playing them on a 40&quot; screen from some ridiculous distance like 50 feet.</p>
</blockquote><p>You never see artifacts in a movie, even when they are shot in 2K. There is no supersampling going on. There are a set number of photosensors on the imaging chip. Some cameras like the F65 did supersample. But others like the Alexa don't. You are actually imaging at 2K. You don't see artifacts because the cameras have great post-processing to eliminate them. </p>
<p>You can say that reality is analog, so all the 'effects' are done by the time it hits the sensor. Which is the point I'm trying to make. Reality is the ultimate 'resolution doesn't matter as much' argument because a 640 x 480 photograph still looks photorealistic.</p>
<p>The whole point behind anti-aliasing was that it was cheaper than supersampling. If it weren't a good 'bang for the buck' the algorithms wouldn't exist. In your example, the developers could have lowered the resolution, and increased the filtering to eliminate the flicker with the extra GPU cycles that freed up.</p>
<p>Likewise render farm time is not cheap. You aren't going to render at a higher resolution than you need to get the job done. Your example of using more rays for the ray tracing… this is completely independent of the resolution and actually supports my claim that beyond a certain point other things matter much more than final pixel count.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152970</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152970</guid>
<pubDate>Tue, 03 Jul 2018 17:49:59 +0000</pubDate>
<category>Gaming</category><dc:creator>Cody Miller</dc:creator>
</item>
<item>
<title>How very, &quot;I bet I have better Vitals than you&quot;, so-to-speak (reply)</title>
<content:encoded><![CDATA[- No text -]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152969</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152969</guid>
<pubDate>Tue, 03 Jul 2018 17:11:20 +0000</pubDate>
<category>Gaming</category><dc:creator>Pyromancy</dc:creator>
</item>
<item>
<title>It&#039;s complicated. (reply)</title>
<content:encoded><![CDATA[<blockquote><p>The CG work is in fact rendered at 2K if the film is finished in 2K.</p>
</blockquote><p>I'm not sure to what extent rectangular-grid buffers larger than 2K are used, but regardless, prerendered CGI is generally <em>highly</em> supersampled. If you're rasterizing, you use lots of &quot;pixels&quot; to produce the color for each final pixel; if you're using a technique along the lines of ray tracing, you cast lots of rays from each pixel.</p>
<blockquote><p>I am not doubting this, but the boost to image quality is slight.</p>
</blockquote><p>I'm not arguing otherwise, I'm just explaining why people aren't necessarily unreasonable to disagree.</p>
<p>This isn't a matter that you can address with a simple acuity argument, because undersampling produces artifacts that can be detected much more easily. For instance, it's easy to make out specular flicker on normal map aliasing in some seventh-gen games even if you're playing them on a 40&quot; screen from some ridiculous distance like 50 feet.</p>
<p>People's sensitivity to these sorts of things also need to be weighed against the extent to which they agree with the importance of more complex rendering. I mean, you keep bringing up Quake as this obviously-hideous game; Quake is graphically very simple compared with modern games, but whether it actually <em>looks bad</em> as a result is more subjective and not everyone agrees.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152967</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152967</guid>
<pubDate>Tue, 03 Jul 2018 16:42:21 +0000</pubDate>
<category>Gaming</category><dc:creator>uberfoop</dc:creator>
</item>
<item>
<title>It&#039;s complicated. (reply)</title>
<content:encoded><![CDATA[<blockquote><p>Movies aren't <em>rendered</em> at 2K, they're <em>mastered</em> at 2K. </p>
</blockquote><p>The CG work is in fact rendered at 2K if the film is finished in 2K.</p>
<blockquote><p>When using a physical camera, a ton of photons are involved in determining the color of each pixel. When rendering CGI, a ton of samples are taken at every pixel.<br />
It's easy to blow off this issue by saying &quot;just use better antialiasing&quot;, but that's not really a complete solution. There are a lot of types of detail that contribute to a game scene, and within current rendering methodology, very few of them are actually being filtered correctly down from high sample rates. And that produces macro-scale inaccuracies that can't simply be swept under the rug.<br />
Even if we're playing on a 1080p display from a considerable distance, there are a ton of respects in which games <em>do look cleaner</em> if you boost the rendering resolution beyond 1080p and do a good resample to produce the final output.</p>
</blockquote><p>I am not doubting this, but the boost to image quality is slight, whereas adding better shadows, lighting, shaders, etc. creates a massive jump in image quality. Again, Quake 1 looks like ass at ANY resolution, because the renderer is so primitive.</p>
<p>When we are talking about SD level, then resolution can have a great deal of importance. It's why Crash Bandicoot on the PS1 used the 512 x 240 mode: if they had rendered at 320 x 240, Crash's single-pixel eye would often have vanished, among other things. But in full HD, you've got enough resolution to satisfy human visual acuity in most cases. VR, large cinema screens, and IMAX excepted.</p>
<blockquote><p>Typical values for healthy young people are more in the ballpark of 20/15. (So a lot of people do even better than that.)</p>
</blockquote><p>20/15 in my right eye, 20/10 in my left. Checked 6 months ago. And to me 4K on a small screen is such a minor difference.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152966</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152966</guid>
<pubDate>Tue, 03 Jul 2018 14:30:14 +0000</pubDate>
<category>Gaming</category><dc:creator>Cody Miller</dc:creator>
</item>
<item>
<title>I agree with the idea behind Cody&#039;s argument, but... (reply)</title>
<content:encoded><![CDATA[<blockquote><blockquote><blockquote><p>Tl;dr, player choice is the ideal solution (everyone, be more like DE!), but right now, 1080p is the gold standard. 4k is the future, but the race to jump on it comes as a detriment to the player experience.</p>
</blockquote></blockquote></blockquote><blockquote><blockquote><p><br />
Also think about the fact that 2K (2048 x 1080) has long been, and still is a cinema standard, where screens are 25 FEET or more. The CGI work in the Martian looked pretty damn realistic on the big screen right? Guess what, it was 2K.</p>
</blockquote></blockquote><blockquote><p><br />
I dunno about all of that tangent, since it has absolutely nothing to do with games or modern gaming hardware, making it moot and dumb.</p>
</blockquote><p>It has everything to do with what I am saying. 2K is 'good enough', where passing it creates sharply diminishing returns on normal sized screens, especially if you really use great rendering techniques.</p>
<p>The exception is of course VR. VR games absolutely need about 10x the resolution they currently have.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152965</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152965</guid>
<pubDate>Tue, 03 Jul 2018 14:24:06 +0000</pubDate>
<category>Gaming</category><dc:creator>Cody Miller</dc:creator>
</item>
<item>
<title>There&#039;s collateral damage. (reply)</title>
<content:encoded><![CDATA[- No text -]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152959</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152959</guid>
<pubDate>Tue, 03 Jul 2018 10:53:06 +0000</pubDate>
<category>Gaming</category><dc:creator>Vortech</dc:creator>
</item>
<item>
<title>Resolution vs Effects (reply)</title>
<content:encoded><![CDATA[<blockquote><p>LOL YES, it did. </p>
<p>The Elites for example were ripped out of Reach! </p>
</blockquote><p>Sharing some assets doesn't mean the style is the same. If anything, sharing assets from multiple games with different styles was a common complaint about CEA.<br />
 </p>
<blockquote><p>CEA is arguably in the same 'art' style as the Bungie era games. H4 and H5 are not, at all.</p>
</blockquote><p>CEA was starting to go in the direction of Halo's 4 and 5 in some respects. It has a lot of disruptive macrodetailing, and makes considerable use of the sorts of teal-and-orange-esque palettes that 343i often likes. It also pretty much dropped Bungie's snazzy specular emphasis.</p>
<p>But also, while there are some common themes through the Bungie Halo games, they don't all share the same art style. Like, Halo 1 is extremely juicy and also oozes with a bit of an almost 1990s Sega visual attitude, while Halo's 2 and 3 have an almost stop-motion-action-figures thing going on. </p>
<blockquote><p>From a technical perspective, H4 had a wayy higher level of detail and effects than the previous entries</p>
</blockquote><p>Not really. It's got a lot of very polished asset work, and it does make extensive use of techniques that make the environments look like they contain lots of detail. But it also makes a lot of graphical compromises compared with its predecessors.</p>
<p>Although the game supports large numbers of dynamic lights, the quality of dynamic lights is perhaps the worst in the entire series; dynamic lights have no specular reflections, and they're all point lights. So vehicles don't have actual spotlight headlights, there's no flashlight, and there's no gloss to the reflections.</p>
<p>The environmental baked lighting is also compromised in some respects. The game doesn't appear to have the &quot;area specular&quot; that Bungie used for indirect specular in their 360 Halo games, causing snazzy materials in complex lighting environments to often exhibit issues like false rim lighting.</p>
<p>A lot of effects like explosions are just plain toned down, including having less debris from vehicles and such.</p>
<p>A lot of special-case stuff is poor. Water is often very low-quality, not just in terms of splash interactivity but also lighting (a lot of the water doesn't really react to environment lighting <em>at all</em>). And the game is never really asked to do much in terms of weather.</p>
<p>Halo 4 is a very graphically <em>refined</em> game, and it cleverly goes about achieving its visual goals, but it's not the huge-leap-in-every-way that it's sometimes made out to be.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152958</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152958</guid>
<pubDate>Tue, 03 Jul 2018 07:15:48 +0000</pubDate>
<category>Gaming</category><dc:creator>uberfoop</dc:creator>
</item>
<item>
<title>It&#039;s complicated. (reply)</title>
<content:encoded><![CDATA[<p>Games need to be rendered at whatever sample rates and output resolutions necessary to pull off the desired visuals. And this is not only subjective, but also sometimes counter-intuitive.</p>
<p>Take Ridge Racer 7. It embraced the whole &quot;HD console&quot; thing, and went 1080p60 at PS3 launch. In order to hit that target, its graphical makeup is - besides texture quality and resolution - mostly not very interesting even by PS2-era standards. But artistically it was designed around that razor-sharp hyper-speed look... it looks great on my HD LCD, but despite the visual simplicity, it looks poor on my SD CRT; all of its luster is just <em>gone</em> at low res.</p>
<p>Take The Last of Us. Very visually complex, very richly detailed. <em>But</em>, it's got a smudgy visual composition, and assets are generally authored in such a way that they still convey their intent and are readable when heavily filtered down. And consequentially, when I played it in a ridiculous 16:9 360i window on my SD CRT, I was surprised to find a game that still looked excellent and remained very visually readable.</p>
<blockquote><p>Also think about the fact that 2K (2048 x 1080) has long been, and still is a cinema standard, where screens are 25 FEET or more. The CGI work in the Martian looked pretty damn realistic on the big screen right? Guess what, it was 2K. It looked so good because they had the ability to spend an hour per frame to render graphical effects.</p>
</blockquote><p>Movie comparisons tend to be misleading in the context of game rendering, for two main reasons.</p>
<p>First, depth of field tends to increase perceived clarity on unblurred objects. That's fine for movies, but with games, having ultra-clarity everywhere is often desirable in and of itself.</p>
<p>Second, we need to separate output reconstruction from sample rate.<br />
Movies aren't <em>rendered</em> at 2K, they're <em>mastered</em> at 2K. When using a physical camera, a ton of photons are involved in determining the color of each pixel. When rendering CGI, a ton of samples are taken at every pixel.<br />
It's easy to blow off this issue by saying &quot;just use better antialiasing&quot;, but that's not really a complete solution. There are a lot of types of detail that contribute to a game scene, and within current rendering methodology, very few of them are actually being filtered correctly down from high sample rates. And that produces macro-scale inaccuracies that can't simply be swept under the rug.<br />
Even if we're playing on a 1080p display from a considerable distance, there are a ton of respects in which games <em>do look cleaner</em> if you boost the rendering resolution beyond 1080p and do a good resample to produce the final output.</p>
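<p>As a toy sketch of that last point (a made-up binary-coverage renderer, not any real engine), rendering at 4x the sample density and box-filtering down turns a staircased edge into fractional coverage values:</p>

```python
def render_half_plane(w, h):
    """A toy binary 'renderer': 1.0 where a diagonal edge covers the sample
    point, 0.0 elsewhere. Stand-in for any aliasing-prone render."""
    return [[1.0 if (x + 0.5) / w < (y + 0.5) / h else 0.0 for x in range(w)]
            for y in range(h)]

def box_downsample(img, factor):
    """Average each factor-by-factor block of samples into one output pixel."""
    return [[sum(img[y * factor + j][x * factor + i]
                 for j in range(factor) for i in range(factor)) / factor ** 2
             for x in range(len(img[0]) // factor)]
            for y in range(len(img) // factor)]

# Rendering natively at the output size: every pixel is fully on or off,
# so the edge is a hard staircase.
native = render_half_plane(4, 4)

# Rendering the same scene at 4x the resolution and box-filtering down:
# pixels the edge crosses get fractional grey values, i.e. a clean edge.
ssaa = box_downsample(render_half_plane(16, 16), 4)

print(any(0.0 < p < 1.0 for row in native for p in row))  # False: staircase only
print(any(0.0 < p < 1.0 for row in ssaa for p in row))    # True: partial coverage
```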
<p>Oh, also:</p>
<blockquote><p>Assuming he has 20/20 vision</p>
</blockquote><p>There's a very high probability that he has better than 20/20 vision.</p>
<p>20/20 vision isn't good visual acuity, it's the boundary of what's considered healthy. That is, having acuity worse than 20/20 can be indicative of a problem.</p>
<p>Typical values for healthy young people are more in the ballpark of 20/15. (So a lot of people do even better than that.)</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152957</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152957</guid>
<pubDate>Tue, 03 Jul 2018 06:29:02 +0000</pubDate>
<category>Gaming</category><dc:creator>uberfoop</dc:creator>
</item>
<item>
<title>I agree with the idea behind Cody&#039;s argument, but... (reply)</title>
<content:encoded><![CDATA[<blockquote><blockquote><p>Tl;dr, player choice is the ideal solution (everyone, be more like DE!), but right now, 1080p is the gold standard. 4k is the future, but the race to jump on it comes as a detriment to the player experience.</p>
</blockquote></blockquote><blockquote><p><br />
Also think about the fact that 2K (2048 x 1080) has long been, and still is a cinema standard, where screens are 25 FEET or more. The CGI work in the Martian looked pretty damn realistic on the big screen right? Guess what, it was 2K.</p>
</blockquote><p>I dunno about all of that tangent, since it has absolutely nothing to do with games or modern gaming hardware, making it moot and dumb.</p>
<p>...but I will say that if you look at Fury Road, one of the best-looking films in recent memory, the 4K release is an upscaled version of a 3K film with 2K effects. Doesn't matter, fantastic-looking film.</p>
<p>And a film like Cloverfield, which recently got a 4K release, looks absolutely terrible, and is at its most enjoyable when you're watching it at DVD quality.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152952</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152952</guid>
<pubDate>Tue, 03 Jul 2018 03:04:16 +0000</pubDate>
<category>Gaming</category><dc:creator>Korny</dc:creator>
</item>
<item>
<title>I agree with the idea behind Cody&#039;s argument, but... (reply)</title>
<content:encoded><![CDATA[<blockquote><p>Tl;dr, player choice is the ideal solution (everyone, be more like DE!), but right now, 1080p is the gold standard. 4k is the future, but the race to jump on it comes as a detriment to the player experience.</p>
</blockquote><p>Also think about the fact that 2K (2048 x 1080) has long been, and still is a cinema standard, where screens are 25 FEET or more. The CGI work in the Martian looked pretty damn realistic on the big screen right? Guess what, it was 2K. It looked so good because they had the ability to spend an hour per frame to render graphical effects.</p>
<p><img src="http://s3.carltonbale.com/resolution_chart.png" alt="[image]" /></p>
<p>cheapLEY said he sits 4 feet from his 4K TV set. Assuming he has 20/20 vision, he's just on the threshold of actually being able to see the additional detail. I'm not saying I don't see the difference between 1080p and 4K, but the difference is so incredibly minor for a massive amount of additional work on the GPU.</p>
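<p>Back-of-envelope math for that threshold, assuming a hypothetical 55&quot; 16:9 set and the usual one-arcminute rule of thumb for 20/20 acuity:</p>

```python
import math

ARCMIN = math.pi / (180 * 60)  # one arcminute in radians, the 20/20 cutoff

def pixel_limit_distance_ft(diagonal_in, horizontal_pixels):
    """Distance beyond which a 20/20 eye can no longer resolve single pixels
    on a 16:9 screen (the pixel pitch subtends less than one arcminute)."""
    width_in = diagonal_in * 16 / math.sqrt(16 ** 2 + 9 ** 2)
    pitch_in = width_in / horizontal_pixels
    return (pitch_in / math.tan(ARCMIN)) / 12

# Hypothetical 55" set: inside ~3.6 ft a 20/20 eye resolves 4K pixels;
# between ~3.6 ft and ~7.2 ft it resolves 1080p pixels but not 4K ones.
print(round(pixel_limit_distance_ft(55, 3840), 1))  # 3.6
print(round(pixel_limit_distance_ft(55, 1920), 1))  # 7.2
```

<p>Four feet from a 55&quot; set is right around the ~3.6 ft mark where 4K's extra pixels start to be individually resolvable, which is the &quot;just on the threshold&quot; situation; better-than-20/20 acuity pushes both distances out proportionally.</p>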
<p>We could probably have true ray tracing if 1080p was the focus for upcoming hardware.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152951</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152951</guid>
<pubDate>Tue, 03 Jul 2018 02:42:22 +0000</pubDate>
<category>Gaming</category><dc:creator>Cody Miller</dc:creator>
</item>
<item>
<title>I agree with the idea behind Cody&#039;s argument, but... (reply)</title>
<content:encoded><![CDATA[<p>The example he gives is extremely dumb.</p>
<p>Resolution is important, but should not be the priority. But... eh. I think it's hard to explain my position, so I've made a quick video to set the foundation of my perspective:</p>
<p><iframe style="border:none;" width="852" height="480" src="https://www.youtube.com/embed/zu-nhrPIN9A?autoplay=0&start="></iframe><br />
Okay, so yeah, resolution matters, and 4K is the future. But does it really matter that much in the current generation? Nope, and even less so when you're talking about handhelds.</p>
<p>The biggest priority for developers should be performance. The game needs to run well, impress with the effects second, and look sharp third.</p>
<p>-When your effects impress and you focus on resolution, you end up with something like Destiny 2 on Xbox (and yeeeeees Cheap and Kermit, occasionally in certain spots on PS4, no need to nag), with slowdown galore. Even with Xbone X &quot;enhanced&quot; games, like Destiny 2 and State of Decay 2, performance suffers, as neither example can run above 30fps, and often, the &quot;enhanced&quot; 4K games struggle to even hit that framerate consistently (Battlegrounds, anyone?).</p>
<p>-When you focus on resolution and performance, you end up with visually limited games, like Sea of Thieves, which has flawless water effects and lighting, and 4k textures... but little else in the way of effects or environmental... life?</p>
<p>-The ideal middle ground for this generation, IMO, is games that favor performance and effects over resolution. You end up with games like Horizon Zero Dawn, God of War, Hellblade, and Gears 4. All games that perform at their best at 1080p even on Xbone X, and which are renowned as some of the best looking (and performing) games of this generation, whether or not you play them on the &quot;premiere&quot; consoles.</p>
<p>So yeh.</p>
<p><br />
Tl;dr, player choice is the ideal solution (everyone, be more like DE!), but right now, 1080p is the gold standard. 4k is the future, but the race to jump on it comes as a detriment to the player experience.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152950</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152950</guid>
<pubDate>Tue, 03 Jul 2018 01:57:25 +0000</pubDate>
<category>Gaming</category><dc:creator>Korny</dc:creator>
</item>
<item>
<title>I just hit &quot;open thread&quot; (reply)</title>
<content:encoded><![CDATA[<p>Cause then I can hit the back button and pretend like I read this entire thread without actually having to think about reading it. It works great.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152915</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152915</guid>
<pubDate>Mon, 02 Jul 2018 12:36:46 +0000</pubDate>
<category>Gaming</category><dc:creator>MacAddictXIV</dc:creator>
</item>
<item>
<title>Resolution vs Effects (reply)</title>
<content:encoded><![CDATA[<p>So, I finally took the time to actually watch the video.</p>
<p>Now I disagree with you even more.  Wolfenstein on Switch looks like blurry dogshit.  It doesn't matter how good the lighting or effects are when it looks like I'm wearing the wrong prescription in my glasses.   The actual footage refutes the point you're trying to make.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152902</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152902</guid>
<pubDate>Mon, 02 Jul 2018 00:57:21 +0000</pubDate>
<category>Gaming</category><dc:creator>cheapLEY</dc:creator>
</item>
<item>
<title>AHHHHHHHHQILYG! (reply)</title>
<content:encoded><![CDATA[<p>You're operating under the very mistaken assumption that any given conversation with Cody is an attempt to have a productive conversation, rather than just trying to make Cody irritated.   (:</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152900</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152900</guid>
<pubDate>Sun, 01 Jul 2018 23:32:25 +0000</pubDate>
<category>Gaming</category><dc:creator>cheapLEY</dc:creator>
</item>
<item>
<title>Indeed. (reply)</title>
<content:encoded><![CDATA[- No text -]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152899</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152899</guid>
<pubDate>Sun, 01 Jul 2018 22:56:41 +0000</pubDate>
<category>Gaming</category><dc:creator>Ragashingo</dc:creator>
</item>
<item>
<title>AHHHHHHHHQILYG! (reply)</title>
<content:encoded><![CDATA[<p>I'm really tired of everyone arguing past each other every time Cody brings this up.</p>
<p>I'm going to try to review and hopefully we can all get on the same page so at least we can argue the things people are really claiming:</p>
<p>Cody's claims:</p>
<p>1. All other things being equal, better graphics make for a better game.  (I suspect he would also admit that this is based only on the fact that graphics are an aspect of a game that can materially influence the quality of the game.  That means the same could be said of all aspects of a game, making this claim almost uselessly obvious (the fallacy of composition notwithstanding).  However, that really means people should not be lining up to argue the point.)<br />
No examples of a bad game with good graphics will address this claim.  Establishing a &quot;value&quot; for 4K resolution will not address this claim.  The only way to argue this is by claiming either a) that graphics have no effect on the quality of a game whatsoever, or b) that you somehow played the exact same game with better and worse graphics and are prepared to claim that the better graphics made the game objectively worse (though apparently even that will not work because of the subjectivity of &quot;better&quot;, as we have now seen with the Halo argument line, which must be as close to this as is possible).  Bottom line, there's no reason for arguing this point with him.</p>
<p>2. System resources are better spent on other things than increasing resolution once you hit 1080, because of the increasing cost and diminishing returns past that point.  If you must argue something, argue this claim, but please avoid drifting into Claim 1 or listing specific examples of games.  This all boils down to Cody's personal preferences — real or imagined since there is no real opportunity to do A/B/X testing on this — and trying to argue that someone does not really like the things they say they like sounds like a phenomenal waste of time to me.  It seems like a plausible point, but the biggest weakness (besides the subjectivity and pointlessness mentioned earlier) is the idea that graphics are some sort of zero-sum system that can be evaluated so mathematically.  Maybe not going 4K would enable a massive increase in other aspects of the graphics, or maybe it would just mean that they could not justify the development work cost for things that can't be put on a badge on the back of the box, or maybe the development schedule means they have plenty of graphic artists who can make higher-res textures, but not the coders/artists needed to do other kinds of work.  If I were to guess, I would say there is a fuzzy truth in the claim, but that it does not work well as a guiding principle in the real world because of difficulties in implementation and because most games are not art projects funded through a patron of infinite wealth and minimal interest, but commercial enterprises, so giving people what they want to buy is an essential design step or else the whole thing is moot.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152896</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152896</guid>
<pubDate>Sun, 01 Jul 2018 22:35:35 +0000</pubDate>
<category>Gaming</category><dc:creator>Vortech</dc:creator>
</item>
<item>
<title>Handheld... Stop muddying the water. (reply)</title>
<content:encoded><![CDATA[<blockquote><p>The difference is that the resolution matters far less as the physical screen size decreases.  The Switch can be on a large screen for docked mode, but that's not the only way it will be played, and possibly the less common way.  And, when they say it's the best HANDHELD shooter they have seen they could just as likely be referring to the smaller screen of handheld as the lower computational power of handheld.    I think that is what Deep means.</p>
</blockquote><p>They specifically detail the differences between the game in dock mode and mobile mode, they did in fact play it on a TV too.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152895</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152895</guid>
<pubDate>Sun, 01 Jul 2018 22:20:59 +0000</pubDate>
<category>Gaming</category><dc:creator>Cody Miller</dc:creator>
</item>
<item>
<title>Handheld... Stop muddying the water. (reply)</title>
<content:encoded><![CDATA[<p>The difference is that the resolution matters far less as the physical screen size decreases.  The Switch can be on a large screen for docked mode, but that's not the only way it will be played, and possibly the less common way.  And, when they say it's the best HANDHELD shooter they have seen they could just as likely be referring to the smaller screen of handheld as the lower computational power of handheld.    I think that is what Deep means.</p>
]]></content:encoded>
<link>https://destiny.bungie.org/forum/index.php?id=152894</link>
<guid>https://destiny.bungie.org/forum/index.php?id=152894</guid>
<pubDate>Sun, 01 Jul 2018 22:11:32 +0000</pubDate>
<category>Gaming</category><dc:creator>Vortech</dc:creator>
</item>
</channel>
</rss>
