
Realism Versus Art: Fight!

What we call "the next generation of gaming" is upon us, and perhaps the greatest and most noticeable change we'll witness is the leap in graphical fidelity. But not every developer has the budget for big-ticket visuals; some prefer to pursue their own art styles, opting out of the popular "photo-realistic" textures most studios use and choosing a more pixelated or cartoony (albeit unrealistic) approach instead. That leads me to a few questions: if every developer went down this route, would the next generation of games cease to be appealing? Are photo-textured graphics really "more realistic" than pixelated or comic-book-styled graphics? And beyond the quantifiable improvements next-gen consoles will bring - the ability to create bigger worlds, for example - how are the individual parts of a game's presentation actually improving?


These questions may seem to have very simple answers. You might even think I've lost my mind just for asking them. But if I present my thesis successfully, you'll see that the topic of realism in games has been widely misunderstood, and that we aren't progressing in quite the way most people have been led to believe. In reality, artists have been wrapping photographs of real-life surfaces around 2D and 3D objects since the '90s (even earlier, in fact). The environments, weapons, and characters in the games we play today don't have more realistic textures than they did five years ago; it's just that modelers can now spend more polygons on each model, and motion capture produces a very convincing mimicry of facial expressions and body movements... and of course the resolution of the textures themselves gets better and better with each passing year... right?
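
For anyone who has never peeked under the hood, texture mapping really is just looking up colors in a grid of squares. Here's a minimal, purely illustrative sketch of nearest-neighbor texture sampling (not any particular engine's code; the function name and the tiny "photo" are invented for the example):

```python
def sample_texture(texture, u, v):
    """Nearest-neighbor lookup: the 'photo' is just a grid of colored squares.

    texture: 2D list of colors, indexed texture[row][col]
    u, v:    coordinates in [0, 1) across the surface being textured
    """
    height = len(texture)
    width = len(texture[0])
    col = min(int(u * width), width - 1)    # snap to the nearest texel column
    row = min(int(v * height), height - 1)  # snap to the nearest texel row
    return texture[row][col]

# A tiny 2x2 "photograph": zoom in far enough and these four squares are all there is.
photo = [["brick_red", "mortar_gray"],
         ["mortar_gray", "brick_red"]]
print(sample_texture(photo, 0.1, 0.1))  # brick_red
print(sample_texture(photo, 0.9, 0.1))  # mortar_gray
```

A 4096x4096 "photo-realistic" texture goes through exactly the same kind of lookup; there are just more squares to snap to.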

Just hold on a second. Aren't all photographs made up of pixels? Even at the "highest" resolution, aren't all textures really just countless little squares squished up against each other to create an illusion of detail, one that falls apart the moment you look up close? "Well sure, Luke; but like you said, resolution gets better every day." Hold your horses. It's subjective even to say that the resolution of in-game textures "gets better," since it's all a matter of how far the camera was zoomed in on the surface when the photo was taken. Really, the effective "resolution" is determined by how close the player is standing to the texture! How can we objectively say textures are "getting better" when pictures of real-world objects will always be pictures of real-world objects, and whether you can see the pixels is left up to whether the player takes a close look or not? And let's take this all the way: even if a developer shipped a low-res texture on an in-game surface, wouldn't its apparent resolution depend on how close you're sitting to the TV, and even on how big your TV is? If I wanted to, I could sit far enough away that my human eyes couldn't pick out the individual pixels.
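
To put some made-up numbers behind that claim, here's a quick back-of-the-envelope sketch (assuming a 60-degree vertical field of view, a 1080-pixel-tall screen, and a texture packing 2048 texels per meter of surface; every one of those figures is hypothetical) showing how the very same texture reads as "sharp" or "blocky" depending on nothing but how close you stand:

```python
import math

def pixels_per_texel(texels_per_meter, distance_m, vertical_fov_deg=60.0,
                     screen_height_px=1080):
    """Rough estimate of how many screen pixels one texel covers when a
    surface is viewed head-on from a given distance (illustrative only)."""
    # Height of the world slice visible at this distance, in meters.
    visible_height_m = 2.0 * distance_m * math.tan(math.radians(vertical_fov_deg) / 2.0)
    # Screen pixels spent on each meter of that slice.
    screen_px_per_meter = screen_height_px / visible_height_m
    # Screen pixels spent on each texel of the texture.
    return screen_px_per_meter / texels_per_meter

# From 10 meters away, each texel covers a tiny fraction of a screen pixel:
# no squares visible, the texture reads as "sharp."
print(round(pixels_per_texel(2048, 10.0), 3))   # ~0.046
# From 10 centimeters away, each texel sprawls across several screen pixels:
# hello, squares.
print(round(pixels_per_texel(2048, 0.1), 3))    # ~4.567
```

Nothing about the texture changed between those two lines; only the player's distance did.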

[Image: Welcome to Minecraft]

"Okay, Luke. You've taken this too far. What's your point?" Simply put: when it comes to how textures look in games, there is no way to objectively identify what's good and what's bad, because there is no such thing as a texture that isn't pixelated. Likewise, textures are not magically improving over time, since we'll never truly be free of those pesky little single-color squares. The industry ought to come to terms with the fact that games always have been, and always will be, made of pixels. So why not embrace it? To criticize or praise a game's graphics over something as slippery as "resolution," or a lack thereof (the term "muddy textures" comes to mind), is thoroughly subjective, because no part of a game is truly real; it's all just art.

But it doesn't end there. There's another problem facing us in this whole "realism versus art" debacle. While many video game enthusiasts have set out on a quest to prove that video games are an art form, we gamers have given ourselves the freedom to complain about a lack of "realism" in certain games, as if there were a specific type of game that isn't really art; to us, it's "a realistic experience" (let's use Battlefield as an example). We hold these games to an even higher standard of realism whenever we spot a lack of detail in them, since realism is what the developers were shooting for... right? But then there's that whole other spectrum of "artsy games" (like Borderlands, The Legend of Zelda, and even BioShock Infinite) that couldn't care less whether they're using photo textures or flashy lighting effects. These games pursue their own art style, and strangely enough, gamers don't hesitate to call them "beautiful" as well. So here's a new question: is there really a difference between "realistic" and "artsy"? And since beauty is in the eye of the beholder (and if we agree that video games are art), can we objectively say one game is "better-looking" than another when the two are pursuing separate visions?

[Image: I should think not!]

Going back to my earlier point: if we're going to accept that video game textures are made up of single-color pixels (and really, a game's textures are usually the foremost factor in deciding whether it's considered "realistic"), then let's face it: Minecraft uses the same method to achieve an illusion of "detail" that modern photographs use. Minecraft can show thousands of blocks on-screen at once, and each of those blocks has more than a hundred individual pixels scattered across each of its faces. This illusion of "detail" is the very same one the developers of "realistic games" use to convey their realism! You know what that means, right? There's no such thing as a game that doesn't have its own art style!
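
To put a rough number on that, here's the arithmetic, assuming Minecraft's default 16x16 block-face textures and an entirely made-up count of visible faces:

```python
# Back-of-the-envelope math: 16x16 is Minecraft's default block-face texture size;
# the 5,000 visible faces figure is arbitrary and purely illustrative.
texels_per_face = 16 * 16                       # 256 single-color squares per face
visible_faces = 5_000
print(texels_per_face)                          # 256 -- "more than a hundred" indeed
print(texels_per_face * visible_faces)          # 1280000 little squares on screen
```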

This may come as a shock to a lot of people, but games aren't capable of mimicking reality anywhere close to perfectly; all they can do is give an illusion of realism (which is essentially all that art is: mimicking what you see or think about). Developers know this too, which is why they've been spurred on to "cheat," in a sense, by using advanced technology as a template for their art (motion capture, 3D scanning of real-world objects, and photo textures come to mind). By rejecting "raw art," as it were, they've pulled us into this mindset of objective realism and made us think we really are making games "closer to real life." But what they forget is that they're still just mimicking reality; those photo-realistic textures, motion-captured animations, and scanned 3D objects are still translated through a digital medium, which converts them from a "realistic" thing into an imperfect piece of digital art. That's why photographs become pixelated, why scanned rocks get reduced to polygons, and why motion capture is constrained to tracking reflective dots instead of actual human expressions.


And let me clarify: it's not that I disapprove of these methods of creation. I think my initial disdain for that sort of technology has more to do with the fact that, when I was little, I was always discouraged from "tracing" real objects when I drew pictures, because it held back my own creativity; arranging my art according to someone else's template only taught me that there was a "correct" way to be artistic. I think developers have tricked themselves into the same wrong mindset. Don't worry, though: I understand that the definition of art only requires arrangement, and video games meet that criterion (even while "tracing"). The problem is that developers now boast that their new-found power to arrange art according to physical templates makes it better than other developers' art, and even reviewers fail to recognize that all video game graphics are subjective art, and therefore can't (and shouldn't) be critiqued as flawed. I would even venture that one of the biggest reasons an increasing number of developers are opting for a plainer art style is that they'd rather not take flak for a limited graphics budget, so they try to fit in with the crowd of "artsy" games to avoid criticism. And no journalist would ever criticize art, would they?...

"All right, Luke... make up your mind! Should games try to be realistic or artsy?" Well, hopefully I've argued for neither. Beauty is in the eye of the beholder, after all, and a game's graphics can never be true to life. I support progress in our industry, and I would never fault gamers for expecting their gaming experiences to get better over time. But at what cost do we demand improvement, and by what criteria? Surely no one should be put in the position of faulting art simply because it isn't fashioned realistically, should they? That's not progress at all, especially when we're trying to prove this digitally interactive medium is an art form. It's time we gave equal credit to all games and developers, even if the scope of their vision isn't as grand and detailed as another's.

(This blog was originally published on Noob Magazine.)

