Good afternoon, class. Please, take a seat.
I wanted to talk a little today about the importance of direction in design and, specifically, the age-old argument of graphics vs. gameplay.
Before we get all analytical and historical and all the 'cals', I want to make my position clear: as someone who is slightly older than the average user of this forum, I've lived through a great deal of modern gaming's history. My earliest console was the NES (or perhaps our household had its Game Boys first), and I've owned most consoles released since then. One of the greatest joys I got out of PC gaming was researching and emulating many of the great games I missed out on before consumer capitalism became what it is today: as beautiful as it is monstrous. So, like many of us here, I'm sure, I consider myself something of an expert in the field of videogames, having consumed an ample amount of content and, more importantly, picked up the analytical toolset provided by first-hand experience of game design.
I'll save the tl;dr folks the effort of scrolling down: graphics are important to games. It's obvious. Stop being silly.
Anyone who's familiar with my posts around the forum will probably be surprised by that statement: I'm not only firmly rooted in the gameplay camp, but my reason for being there comes from form theory. Videogames are videogames because of gameplay; it's a popular idea. But it's also a very limiting idea. After all, graphics are undeniably part of videogames; they're the delivery method of the gameplay experience.
It's pretty common to hear people harking back to the golden ages, claiming that those games didn't rely on graphics to sell or that graphics weren't as expensive a part of the gamemaking process. Well, they did. And they were. And that's just how it was, so there's not really room for debate there.
"B-But, they didn't have the same technological capabilities back then so there was more focus on gameplayyy." This train of thought is erroneous as the industry has always been pushing its limits graphically, and specialists always require specialist paycheques. Obviously, in ten or twenty years our machines will look even more archaic than the NES looks to us now as technology continues to develop faster and faster.
What's worse about this kind of mindset, though, is the unabashed wishful thinking, the rose-tinted glasses as they say. There are a lot of now-outdated games I've enjoyed over the last few decades, games I was fortunate enough not only to live through but to be conscious and mentally independent for. But they are outdated. They may possess the most incredible design decisions, the most poignant narrative moments, and ofc the most exquisite graphics of their time. But even the absolute best will also be burdened with gameplay decisions designed around accommodating primitive hardware, and simply the poor logic of a medium in its infancy (for example, falling down a hole in the early LoZ games causes the player to respawn before the heart is removed, leaving Link spinning around at the last doorway he entered, a rather bizarre sequence for falling to your death). And superior gameplay is what you're all about, right?
Now, don't misunderstand me; enjoy your nostalgia to your heart's content (I do!). But this kind of thinking has little place in critical discussion.
So what does belong in the discussion? I'm so glad I asked.
We are so incredibly lucky to be on the cusp of gaming's renaissance. No, really, we are. It's like we're in Victorian England at the introduction of the novel, or at Woodstock in the Sixties. There's a whole lot of mud and grime, but some people are doing some really amazing things, and I would put most developers into one of three tiers (regardless of their motives):
-Developers that want to make a game, any game, and are willing to lean on existing conventions.
-Developers that understand existing gaming conventions, and are attempting to manipulate them in new ways.
-Developers that are attempting to establish entirely new gaming conventions, either internally or through borrowing from other mediums.
Why do I bring this up? Because gaming is unlike most artistic mediums in that it encapsulates most existing mediums, and the visual ones, arguably, most heavily. Making individual graphical assets and composing them on-screen are both very artistic endeavours. Creating cutscenes and effectively directing the player for maximum impact are very much like choosing the mise-en-scène or handling the editing process in film-making.
"But Tarq, time and money is spent making things look more realistic than they need to be!"
Is that strictly true? I would say that for a great many games intended to recognisably take place in the real world, or a variant of it, replicating reality is a worthwhile effort. Even most games that aren't aiming for this will still desire immersion, something that is easier to achieve with a less jarring visual transition between our world and the game world.
There are, absolutely, games that have little to no reliance on graphics (a text-based adventure, for example), but their developers are still, or should be, redirecting those efforts elsewhere. There is no shortcut or loophole in gamemaking.
Which leads nicely into my next point: we are seeing some serious min-maxing in game design nowadays. And it's awesome. Why read a book when you can play it? Why watch a movie when you can walk through one? Obviously, the quality of the content itself will be the deciding factor for a lot of people, but considering we're seeing the form of storytelling taken to these new and exciting levels, is it really that surprising that the visually creative want to leave their mark on our medium?
Wow, I've been writing this for a while and I've still yet to scratch the surface. Let me know if you've enjoyed reading this, or perhaps why you didn't. Let me know if you'd like to read more stuff like this, or mayhaps we can just continue it in the comments. Also, look out for an update here in the next couple of days. Going to be pretty cool, I promise! :3