It’s been a while since my previous post. I’ve been quite busy with the renovation of what became my new home a few days ago. In fact, I’m still pretty busy with the finishing touches and, frankly, a bit overwhelmed by all the work that’s still pending, especially considering I’m only a few months away from becoming a father again. Of course, I still find time here and there to play some games and relax. I’m currently enjoying Hollow Knight, but I’m far from ready to review it. Before that, I played Deus Ex: Mankind Divided and I want to share some thoughts about it.
Mankind Divided was generally well received by critics and fans, but it was specifically criticized for its microtransactions and supposedly short campaign length. You can count me in the group of series fans and my perspective may be a bit rosy, but I think it’s a great game. I disliked the whole microtransactions affair, the in-game “triangle codes” tied to a mobile app and other stuff like Breach mode, but the really good news is that you can ignore all of that (it never gets in your way) and still enjoy a great game. It’s not revolutionary like the original Deus Ex and, in my opinion, it’s not as good as Human Revolution, but it’s still pretty good.
The graphics and attention to detail in most environments are amazing, even if a few characters still look a bit “deusexy”, as I commented in my previous post. The gameplay is good and very similar to Human Revolution’s, with similar achievements for a Pacifist approach or Foxiest of the Hounds if you want to go that route (I did). I welcomed most gameplay changes. The return of Multitools from the original Deus Ex is a nice, flexible addition: they let you bypass specific locks when you lack the skill, at the price of not earning any experience points. Disabling robots and turrets permanently with EMP grenades is no longer possible, which favors other game mechanics like the remote hacking augmentation, the specific computer hacking skills and armor-piercing rounds. I disliked the changes to the computer hacking screen. Its GUI is a bit more cumbersome, and the location of the “Stop Worm” button made me click it by mistake many times when trying to reach an object in the upper part of the screen. The increased level-decoration detail in modern games sometimes makes it hard to spot interesting items and objects, but the problem is elegantly solved here and integrated in-game through the Magpie augmentation.
As for complaints about the game’s length, I believe it ended right where it was supposed to end. It didn’t catch me by surprise. I may be wrong, but it seemed clear to me the game was about to end and leave some plot lines unresolved. It reminded me of the original Star Wars and Matrix film trilogies. The first film in each stands alone. Due to their success, more films were planned, and both second films (The Matrix Reloaded and The Empire Strikes Back) ended with cliffhangers and unresolved plot lines. I believe something similar is bound to happen with this hypothetical “Jensen Trilogy”. Rumors with somewhat credible sources circulating on the Internet put the blame for the microtransactions and for splitting the story in two on the game’s publisher, Square Enix.
The story does have less meat, but it’s interesting to see it slowly tie into elements from the original Deus Ex. We’ve got almost every character there already: Lucius DeBeers, Beth DuClare, Morgan Everett, Stanton Dowd, Bob Page, Joseph Manderley, etc. Even a prototype of the Morpheus AI is mentioned in one of the in-game pocket secretaries. The side missions are very interesting and provide much-needed support to the main story. A fast player will easily get 20 to 30 hours of gameplay, and a slower, completionist player like myself may get over 40 or 50 hours out of a first playthrough, so I refuse to call it a short game.
Another question is whether the maps are less varied and whether that contributes to the game feeling smaller or shorter than Human Revolution. I don’t think there’s an easy answer. The game only has one big hub, located in the city of Prague, but it’s so big it had to be divided in two: a small starting area and a larger one. Each of those has several buildings that can be visited, as well as a moderately large sewer system and a number of sub-areas. For example, the large Prague hub includes the Dvali apartment complex and theater as well as the Palisade Bank. Additional maps take you to different areas, including the relatively large and complex Golem City, which is as large as a hub by itself. I’ve praised multilayered maps with interconnected architecture in the past, and this is one of those games featuring amazing level design, with the Augmented Rights Coalition headquarters in Golem City and the Church of the MachineGod apartment complex deserving special mention.
To me, the game’s graphics, sound and technological aspects are worth a 9 and the gameplay is easily worth an 8.5. The overall score could be an 8.5, capped by its gameplay.
Regarding the future of the series, I’d like to see one more Jensen game to resolve part of the remaining open plot lines. After that, the franchise should probably move on to other characters. The Deus Ex universe is rich and complex, so it has room for telling many stories. I read someone on Reddit say they should make a game about Paul Denton. It certainly has potential and could, for example, explain in detail how UNATCO was formed. Eventually, someone should remake or, to be precise, re-imagine the original Deus Ex (probably my favorite PC game of all time). Of course, I don’t mean a mere remake. Human Revolution proved it’s possible to take a franchise and update its core gameplay to modern standards. I read a Human Revolution review that said the game played like a love letter to the original Deus Ex. Not as revolutionary, but a worthy homage to the original game and a nice update to the franchise. That’s what I would expect from a Deus Ex remake: a big game following roughly the same plot as the original, with varied locations and updated gameplay, directed with a firm but reasonable hand and without fear of taking risks to make the necessary changes.
I’ll have mine without microtransactions, thanks.
It’s been a few months since my last game review and I wanted to write about the last batch of games I’ve been playing and how they use different anti-aliasing and lighting techniques to improve the way they look on screen.
A few months ago I played Doom, the 2016 reboot. It was one of the most praised games that year and I can certainly see its virtues. However, I think gamers in general, and PC gamers in particular, were blinded by its brilliant technology and performance on different systems and overlooked its somewhat basic gameplay.
To me, Doom is a solid 8.5 game but it’s far from a GOTY award. It’s fun, it’s replayable and it’s divided into maps that are themselves divided, for the most part, into arenas where you fight hordes of monsters. This simple concept, coupled with a simple plot, makes it easy to enjoy the game in short gaming sessions, clearing arena after arena. For obsessive-compulsive completionists like me, this division makes the game less addictive, which is arguably a good thing compared to other games where there’s always a pending secondary quest, a question mark on a map or a new item to get. Doom provides great replay value and a quite difficult Ultra-Nightmare mode with permadeath. I intended to complete the game in that mode but, despite making steady progress across my attempts, I dropped out because it was going to take too long and I wasn’t enjoying the ride.
How does Doom manage to run so well compared to other games? The underlying technology is fantastic, but I’d add that it’s not an open-world game, and it doesn’t have vegetation or dozens of NPCs to control. Maps are full of small rooms and corridors, and even the semi-open spaces are relatively simple. In-game 3D objects don’t rely on high polygon counts; they depend on good textures and other effects to look good. The textures themselves are detailed but not in excess, and they’re used wisely: interactive elements get more detailed textures while ornamental items get simpler ones. In general, it performs well for the same reasons Prey, for example, also performs well.
Doom offers several interesting choices as anti-aliasing options, many of them cheap and effective for that specific game, including several ones with a temporal anti-aliasing (TAA) component. TAA offers the best quality anti-aliasing in modern games, as of the time I’m writing this, if used well. It can be cheap and very effective, but the drawbacks may include ghosting and an excessively blurred image.
I experienced TAA extensively for the first time when I played Fallout 4 some months ago (I reviewed it here). In this game, the ghosting effect of TAA is very noticeable the first time you’re out in the open and move among trees but, still, it’s a small sacrifice to make compared to a screen full of visible “jaggies”. I believe TAA is the best option when playing Fallout 4 despite the ghosting and the increased image blurriness.
In Doom, it may or may not be the best option. The lack of vegetation and trees and the relatively simple geometry mean the game doesn’t look bad at all without any temporal component, if you don’t mind a few jaggies here and there, but I still recommend enabling some form of TAA if you can. If you find the image too blurry, try to compensate with the in-game sharpening setting. Ghosting is barely noticeable. It happens when enemies move very quickly right in front of the camera and, very visibly, when Samuel Hayden appears on screen, around his whole body and, in particular, his fingers and legs. Disabling all temporal components is the only way to see Hayden as intended. The ghosting effect around him is so visible that, the first time I saw it, I thought it was a deliberate, weird artistic effect. Fortunately, both ghosting situations happen once in a blue moon, which is why I still recommend TAA for this game.
Batman: Arkham Knight
I also played Arkham Knight and it’s a good game. It offers plenty of challenges for a completionist but it’s a bit repetitive. Like I said when I reviewed Arkham Origins, I still think Asylum is the best in the series. Arkham Knight features graphical improvements, a darker plot that brings the game closer to Nolan’s film trilogy, and too many gadgets. The number of key combinations and enemy types reaches an overwhelming level, and you need to add the Batmobile on top of that. Don’t get me wrong, it’s a solid 8, and its performance problems at launch are mostly moot now thanks to software changes and the arrival of the GeForce 10xx series, which handles the game reasonably well. However, it’s not a game I see myself replaying in the future.
Arkham Knight’s urban environments do not need temporal anti-aliasing that much, which is good because the game doesn’t have that option and still manages to look reasonably good. The game suffers from small performance problems when you enable every graphics option and ride in the Batmobile or glide high above the city, but frame rate drops are not severe.
Rise of the Tomb Raider
Finally, a few days ago I finished playing Rise of the Tomb Raider. I liked it the same way I liked the 2013 reboot, with a score of 8 or 8.5, though other reviewers have been more positive. Some elements have been improved and others have gotten worse. The plot, atmosphere and characters are better. Keyboard and mouse controls have received more attention, and many game mechanics have been expanded without making them too complicated. On the other hand, completing the game to 100% is now harder but not more fun. The series is starting to join the trend of adding more collectibles and things to find just to make the player spend more time playing, without actually being more fun or rewarding. Still, I completed the game to 100%.
With my GTX 1070 on a 1080p60 monitor the game runs perfectly fine with everything turned up to 11, except when it doesn’t. There are a few places in the game where the frame rate tanks, sometimes for no obvious reason. One of the most noticeable places I remember was an underground tunnel with relatively simple geometry where I was able to make it drop to 42 FPS.
The way to fix that is very simple. The first thing to go is VXAO, an ambient occlusion technique. Dropping that to HBAO+ or simply “On” gives back a good amount of frames. In general, ambient occlusion is something that improves the overall look of the image. It’s heavily promoted by NVIDIA but, in my opinion, the difference between its basic forms and the most expensive ones doesn’t have a dramatic impact on image quality. If you have the power to run a game maxed out, be my guest. If you’re trying to find a compromise, ambient occlusion should be one of the first things to go.
To get back more frames you can also turn shadow quality down a notch or two. In Rise of the Tomb Raider, the difference between “very high” and “high” is noticeable but not dramatic, and going down to “high” avoided many frame rate drops for me.
Unlike ambient occlusion, shadow quality has, in my experience, a very perceptible visual impact in most games. Reducing shadow quality normally means shadows look blockier and weirder and, if they move, they tend to do so in leaps instead of smoothly. I distinctly remember playing Max Payne 3 some years ago (that uninterruptible movie with some interactive segments interleaved, if you recall): it featured a closeup of Max Payne in the main game menus with visible jaggy shadows across his cheek that drove me nuts (why would game developers choose to place that shadow there, in such a controlled situation?).
Contrary to both previous games, Rise of the Tomb Raider features lots of vegetation, trees, small details and shiny surfaces, but it doesn’t offer a temporal anti-aliasing option. The result is a picture full of jaggies at times that, no doubt, will bother a few gamers.
In general, recent games tend to include more geometry, vegetation and lighting details that exacerbate every aliasing problem. At the same time, textures are more detailed and you don’t want to blur them, especially when you’re looking at something up close. This is why, in many recent games, FXAA, the poster child of cheap and effective post-processing anti-aliasing, is a bad option. It’s very cheap, but it doesn’t do a very good job and it blurs the image a lot. If you’re playing, say, Fallout 3 (a 2008 title, time flies!), FXAA is an excellent choice. The relatively simple geometry, compared to today’s games, makes FXAA effective at removing jaggies, and the lack of detail means textures don’t look blurrier than usual with it.
Moving forward in time, Crysis 3 was one of the first mainstream games to feature SMAA prominently, another form of post-processing anti-aliasing that is very fast on modern cards and improved the situation a bit. It attempts to fix jaggies like FXAA does, but without blurring textures. It’s still very cheap, though slightly more expensive than FXAA, and not as easily injected from outside the game (FXAA can generally be activated from NVIDIA’s control panel and used to improve the visuals of many old games that don’t support it directly). SMAA did a good job for many years and was my preferred choice for a long time. I still choose it depending on the game.
These days, omitting a TAA option can be a significant mistake for certain titles like Rise of the Tomb Raider. In contrast, I’ve just started playing Mankind Divided, which offers a TAA option, and graphically it’s quite impressive (except for the way people look, a bit DeusEx-y if you know what I mean). Its developers did make some incomprehensible decisions when choosing settings for the predefined quality levels. In my opinion, you should fine-tune most parameters by hand, reading some online guides and experimenting. They also included a very visible option to turn MSAA on, without any warning about its performance cost.
In any case, TAA in Mankind Divided is complemented by a sharpening filter that prevents excessive blurring if you’re inclined to activate it. Granted, the game doesn’t feature much vegetation but it looks remarkably good in any case. Both features complement each other rather well and give you a sharp image mostly free of notable jaggies.
The cost of TAA varies with the specific implementation. In general, it costs more than FXAA and SMAA but is very far from costing as much as other less-effective techniques like MSAA.
What I did was simple: I wrote a well-documented header file with a minimal set of functions, inspired by one of the Python implementations, and a single implementation file that, combined with Solar Designer’s code, generates a static library you can link into your programs. Later, I fixed some stupid coding mistakes I made, despite its small size, and forgot about it.
The project, as of the time I’m writing this, has 95 stars and 35 forks on GitHub (not many, but more than others), and not long ago I realized it’s one of the first Google search results when trying to find a “bcrypt library”. So it seems my small experiment has been promoted and I have to answer to a social contract!
In the last couple of weeks, I’ve been spending a few minutes almost every day polishing the library: improving its documentation, reading other people’s code and documentation, and adding some functionality. You can take a look at the results in the project’s future branch. Summary of changes from master:
The main implementation has been changed from a static to a dynamic library so it’s easier to update the implementation if a problem is found, without recompiling everything. I use -fvisibility=hidden to hide internal symbols and speed up link time. A static library is also provided, just in case you need it.
The function to generate salts has been changed from reading /dev/urandom to using getentropy. That means the library will probably only compile on a modern Linux and maybe OpenBSD, and this is the main reason these changes are still not merged into the master branch. With no disrespect to the BSDs, let’s be practical: Linux is the most widely used Unix-like system for servers, and getentropy, introduced by OpenBSD, is simply better than /dev/urandom because it’s simpler and safer to use, works inside a chroot, etc. With Linux now implementing it, there are not many reasons to use anything else.
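The library itself is C and I’m not reproducing its actual API here, but the idea is easy to sketch in Python, whose `os.getrandom` wraps the same Linux `getrandom(2)` system call that backs glibc’s `getentropy` (the function name below is mine, purely illustrative):

```python
import os

def generate_salt(length: int = 16) -> bytes:
    """Return `length` random bytes suitable for a salt.

    os.getrandom() wraps the Linux getrandom(2) system call, the same
    kernel interface behind glibc's getentropy(); os.urandom() serves
    as a portable fallback on systems without it.
    """
    try:
        return os.getrandom(length)
    except (AttributeError, OSError):
        return os.urandom(length)
```

Unlike opening /dev/urandom, this needs no file descriptor, so it keeps working inside a chroot or under a tight descriptor limit.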
I have added a manpage documenting everything better and emphasizing the 72-byte implementation limit in password length.
I have added functions and documentation explaining the rationale for pre-hashing passwords before sending them to bcrypt, which works around the previous limitation in part.
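The pre-hashing technique is simple to sketch. This is my own illustrative Python version, not the library’s C API: hash the password with SHA-256 and base64-encode the digest, so an arbitrarily long password becomes a fixed 44-byte, NUL-free string well under bcrypt’s 72-byte limit. The base64 step matters because a raw digest may contain NUL bytes, which C bcrypt implementations would treat as string terminators:

```python
import base64
import hashlib

def prehash_password(password: bytes) -> bytes:
    """Map a password of any length to a short, NUL-free byte string.

    SHA-256 produces a 32-byte digest; base64 expands it to 44 ASCII
    bytes, comfortably under bcrypt's 72-byte limit and guaranteed
    free of NUL bytes.
    """
    digest = hashlib.sha256(password).digest()
    return base64.b64encode(digest)
```

The resulting bytes would then be fed to bcrypt instead of the raw password, on both registration and verification.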
There’s now an install target.
I have added a pkg-config file to ease finding out compilation and linkage flags in the installed library.
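For illustration, a minimal pkg-config file for a library like this could look as follows (the name, paths and version here are invented for the example, not copied from the actual project):

```
# bcrypt.pc -- illustrative example, not the project's real file
prefix=/usr/local
exec_prefix=${prefix}
libdir=${exec_prefix}/lib
includedir=${prefix}/include

Name: bcrypt
Description: Small bcrypt password hashing library
Version: 1.0.0
Libs: -L${libdir} -lbcrypt
Cflags: -I${includedir}
```

With such a file installed, a client program can be compiled with something like `cc app.c $(pkg-config --cflags --libs bcrypt)` and never hard-code paths or flags.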
Tests have been separated to their own file and made a bit more practical.
I’ll merge these changes to master after things calm down for a while, I process any feedback I receive (if any) and after Red Hat, Debian and Ubuntu have a stable or long-term support version with glibc >= 2.25, which introduced getentropy. The next Ubuntu LTS will have it, the next Debian stable will have it and RHEL 8 will probably have it.
I may also try to package the library for Fedora, which eventually should make it land in Red Hat. I’m not a Fedora packager yet and this may be a good use case to try to become one and learn about the process.
If anyone’s interested in the project, please take a look at the future branch, comment here, open issues in GitHub, mail me, etc. Any feedback is appreciated because this was just a small experiment and I’m not a user of my own library. Also, I don’t recall ever publishing a shared library before. If I’m doing something wrong and you have experience with that, feedback is appreciated too.
What about Argon2?
Argon2 is another password hashing algorithm that won the Password Hashing Competition in 2015. The competition tried to find an algorithm that was better than bcrypt and scrypt. Its official implementation is released under the terms of the CC0 license, it works under Linux and many other platforms, and builds a shared and static library featuring both high-level and low-level functions. In other words, Argon2 already has a pretty good official, easy-to-use, API and implementation.
In my opinion, if you want to use Argon2, you should be using its official library or libsodium. The latter has packages for most distributions and systems. Argon2 is the best option if you want to move away from bcrypt, but there is no need to do it as of the time I’m writing this. The benefits are mostly mathematical and theoretical. Argon2 is much better, but bcrypt is still very secure.
Argon2 has three parameters allowing you to control the amount of time, memory and threads used to hash the password, as well as a command-line tool to experiment and find out the best values for those parameters in your use case. The libsodium documentation also has a guide to help you choose the right values.
The official library only contains the password hashing functions and leaves out details like generating a random salt or pre-hashing the password to harmonize password processing times. A few quick tests with the argon2 command-line tool from the official implementation revealed an almost insignificant processing time difference between a small 5-byte password and a large 1 MB one, so I conclude Argon2 doesn’t need pre-hashing, although I don’t know the underlying details. You can also choose the size of the generated password hash.
If you’re using libsodium, it includes several functions to generate high quality random bytes, needed for salts. A plain call to getentropy in Linux should also be trivial.
What about scrypt?
If you’re currently using scrypt you already have an implementation, and if you’re not using it yet but you’re considering using it in the future, you could skip it and jump straight to Argon2 (see the previous section). It’s a better option, in my opinion.
If you insist on using scrypt, there’s already a libscrypt project that’s packaged for Debian, Fedora, Arch Linux and others. It takes most of its code from the official scrypt repository to create a library separated from the official command-line tool.
The library also covers obtaining random data to generate salts (it uses /dev/urandom), but not password pre-hashing. As with Argon2, a small test with a custom command-line tool revealed an almost insignificant processing time difference between a small and a very large password, so again I conclude no password pre-hashing is needed for libscrypt.
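libscrypt is C, but the cost parameters are easy to demonstrate with Python’s `hashlib.scrypt`, which exposes the same N/r/p knobs. The values below are a commonly cited interactive-login baseline, not a recommendation taken from the libscrypt project:

```python
import hashlib
import os

def scrypt_hash(password: bytes, salt: bytes) -> bytes:
    """Derive a 64-byte scrypt hash.

    n is the CPU/memory cost (a power of two), r the block size and
    p the parallelism factor; memory use is roughly 128 * n * r bytes,
    about 16 MiB with these values.
    """
    return hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1,
                          dklen=64)

salt = os.urandom(16)
hashed = scrypt_hash(b"correct horse battery staple", salt)
```

Raising `n` makes attacks cost more CPU and memory simultaneously, which is the main idea behind scrypt compared to older CPU-only schemes.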
This blog is ultimately hosted in my personal web space at FastMail. I’ve mentioned FastMail several times in the past, especially when I blogged about their CardDAV support. FastMail handles all my personal email for around 40€ a year. In addition to email, they also provide a few other services, like my contact list and my calendar (both accessible from my phone through the standard apps).
More importantly, FastMail also gives me a fair chunk of web space to upload files to. I can upload them through their web interface or using WebDAV, and I can upload HTML files and decide to publish them in a chosen domain, subdomain and path under my control. In any case, only static files are supported. FastMail is not a dedicated web hosting company. You can’t run PHP or CGI scripts, and you can’t touch the web server in any significant way. For example, I can’t configure a TLS certificate.
A nice side-effect of the new “Load comments” button is that, if you visit this blog with uBlock Origin or a similar add-on installed, you’ll see post pages don’t need to have any element blocked until you press the button. This means users with or without an ad blocker will benefit from increased privacy when passing by the blog.
While we’re at it, I put Cloudflare in front of the blog. I know we probably shouldn’t rely on a few CDNs to serve half the contents of the web, but Cloudflare was the easiest way for me to add TLS support to this blog. You may have noticed the (probably) green padlock in the address bar. The site is now served over HTTPS, scoring an A+ in the SSL Server Test from SSL Labs. I’m using Cloudflare in its Flexible mode. That means Cloudflare retrieves my content over plain HTTP but caches it and serves it to you over HTTPS. As I don’t host any service with personal information here, this mode just means visitors get added privacy at no cost thanks to the opportunistic encryption. I also get some peace of mind: if any page hosted here is eventually hit with a lot of traffic (legit or not), I won’t hit FastMail’s strict bandwidth limits, which would make the whole site unavailable.
If you ever comment, your experience should also be marginally better. Previously, it was an HTTP page with an embedded HTTPS comments iframe, but what you saw in the address bar was the plain text connection. As you logged in through Disqus, Google, Facebook or whatever account you may have been using to comment, it wasn’t crystal clear that it was a secure connection. With the site now being served through HTTPS, the absence of a mixed content warning should inspire more confidence. By the way, thanks to Mozilla for including an insecure login warning in Firefox. It’s a bit of a shame that searching for “firefox insecure password warning” leads you mostly to pages explaining how to disable it.
Kudos to Cloudflare for providing free accounts and making the integration process incredibly easy. You only have to create the account and tell them about the domain you want them to proxy. Then, you review the entries they import from your existing name servers, adding things they may have missed, removing some entries and deciding what Cloudflare will cache, and finally you switch your existing name servers to theirs. For simple cases like mine your site will experience no downtime. I did the switch in about one hour, including reviewing the DNS entries, reading a bit of documentation and flipping some switches in the Cloudflare control panel.
Edit: I forgot to mention the site is also available over IPv6 as another nice side effect of using Cloudflare.
When I upgraded to Fedora 26 I mentioned it had been the smoothest Fedora upgrade I had experienced, but Fedora 27 has broken that record. It was short and completely painless, so there’s nothing more to say. Congratulations to everyone working on Fedora for their stellar job!