Out of all the reasons people give for not playing video games, the most common excuse is that gaming is simply too expensive a hobby to break into and maintain. This sentiment is echoed not just by newcomers but by longtime gamers as well, with many expressing grievances over the financial cost of being part of video game culture. Indeed, there are quite a few upfront and ongoing costs involved, including the purchase of a modern system (PC or console), accessories, a good internet connection, and new games. At first, the concern over expenses seems quite understandable, as video games can lead to hundreds of dollars spent on mere entertainment. However, what many people seem to miss in this discussion is just how subjective the idea of cost and value is. The problem isn’t that gaming is ‘too expensive,’ but that the greater public doesn’t know how to value the art form.
Lately, in the video game community, I’ve been seeing a similar problem brewing between two different sets of gamers. One side is troubled by the fact that they don’t have enough time to enjoy gaming, while the other side is struggling to stop playing, going so far as to call it an addiction. These issues are two sides of the same coin: both concern how our favourite hobbies fit into adult life. Personally, I fall into the category of not being able to play games as much as I would like to; however, I’ve dabbled on the other end of things as well.
Unlike many other popular superheroes in the ‘mainstream’, Spider-Man’s traits extend far beyond his two identities of civilian and crime fighter. For an early comic book hero, there is a surprising amount of depth to the character when he is written well. Over the last fifteen years or so, we’ve seen three different actors portray Spidey across three different film franchises: Spider-Man (Tobey Maguire), The Amazing Spider-Man (Andrew Garfield), and Spider-Man: Homecoming (Tom Holland). From these three interpretations of the same hero, audiences can effectively ‘observe’ what makes the character engaging.
After four to five surprisingly short years, it appears that once again we are at the end of a console generation. Even though not all of us own or play on a console, the systems themselves and the brands they represent are often good indicators of the state of the gaming industry. With one console cycle finished, a bookmark is left in the pages of video game history, which brings us to where we are now.
What sucks about Wonder Woman has nothing to do with the quality of the movie itself but with the expectations that have surrounded it. As a female-led action film, it wasn’t predicted by its studio to be especially profitable, and in the years before its release, there was doubt about whether a Wonder Woman film would even get made. But now that it’s out there, the movie seems to be suffering from a different problem. Don’t be fooled by the raving critics and the high review-aggregate scores: Wonder Woman is quite mediocre as a product on its own, an average superhero movie. But with all the hype surrounding it, I couldn’t help but feel as if it were almost worse than standard.
The relationship between the video game industry and the gaming media has always been quite tenuous. Even before things started to really get rocky, as a long-time gamer, I could never shake the feeling that the two parties only remained acquainted because of a few mutual benefits.
For quite some time now, the gaming and cinema industries have been engulfed in a phenomenon of bringing old ideas back from the dead. Nowadays, it seems like any past franchise, presumed to be forever dormant, has the chance of being revived through a reboot, remake, or sequel. Of course, there’s no problem with a studio or a group of creatives going back to a series to realise unfulfilled potential, especially if there’s an audience for it, but I wouldn’t be mentioning any of this if there weren’t some kind of inherent problem.