Game Review: Severance: Blade Of Darkness

Posted on .

Severance: Blade Of Darkness, or Blade: The Edge Of Darkness (its original title), is a third-person RPG from Spanish studio Rebel Act, released in 2001. I’ve played and enjoyed the game many times over the years, and I wanted to write a small homage review before its 15th anniversary at the end of this month.

Severance was recently featured in the “Have You Played?” section of Rock, Paper, Shotgun. This prompted me to download my copy and play it one more time. Or, more precisely, four more times, once for each main character you can select. Nowadays, if you want to play it and you don’t already have a copy, your options are severely limited. As of the time I’m writing this, Severance is no longer present in the GOG catalog (probably due to their license expiring). It can be downloaded if you’ve already purchased it, but it’s not available for new sales. This situation may change in the future.

History and piracy

While I own a legal copy nowadays, I must admit I pirated the game back when it was released. I’m far from proud of pirating games in my youth, but when I was a student I simply didn’t have the money to buy every game I wanted to play. I owned quite a few games legally, but not every one of them. If I had been unable to pirate those games, I simply wouldn’t have played them. Other games released in 2001 include Serious Sam, Max Payne and Return to Castle Wolfenstein.

I’m sidetracking a bit here, but I think one of the problems in the video game industry, and also one of the reasons DRM is used so aggressively and sometimes so obnoxiously, is that video games today are very expensive to produce, yet they’re sold to people who don’t have much money themselves: students, teenagers and kids. Moreover, parents from previous generations have always seen video games as a less-than-desirable hobby. I’m sure many would prefer their kids to play sports outdoors instead of gaming for hours in the basement, so they don’t tend to buy their children every game they want, even though, in contrast, they will take them to the movies to see most films they ask for. When gamers like myself grow up and get a job, we can finally buy plenty of games, but what we start lacking is the time to play them. I see a lot of people and friends pursuing other hobbies and dropping gaming once they grow up, get a job and start earning real money.

Rebel Act Studios closed not long after the game was published, and I’m pretty sure more sales would’ve helped keep the studio alive. Part of the team later founded MercurySteam, and they’ve recently been developing a few Castlevania-universe games for Konami. Rebel Act have also admitted that they ran out of money developing the game and had to publish it in a somewhat unfinished state, wrapping up what they had into as cohesive a gameplay experience as possible. This can be perceived in some gameplay aspects, but the game is fine as it is. The essentials are there. I sincerely apologize and regret not having bought the game at its full price when it was released.


Obviously, a game from 2001 is in a totally different league from what you see nowadays. If you approach it with the mindset of a 2001 gamer, you’ll see some impressive features, like the water reflection effects and the dynamic lights and shadows. I remember being impressed at the time by its graphics and animations, and by how the shadows were integrated into the gameplay beyond being a mere technical feature: they are often projected on walls to give you hints and clues of enemies nearby, or to scare you by making them look bigger than they are.

The sounds were unique and good too, and the soundtrack was skillfully integrated in-game to set the mood of every area and moment. Surprisingly, it was composed of several pieces of freely available music.

Remarkably, the game assets were not packed in large custom binary files and could be found in the game data directories as plain files. This also revealed that much of the game logic actually resides in Python scripts (version 1.5 at the time!). Even savegames themselves are stored as Python scripts.
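To illustrate the general idea (this is a made-up fragment, not Severance’s actual savegame format), a savegame stored as a script is just code that rebuilds the game state when executed:

```python
# Hypothetical sketch: Severance's real savegame scripts look different,
# but the principle is the same, since loading a save means running code.
save_script = """
player_name = "Amazon"
player_level = 7
inventory = ["Sword of Ianna", "healing potion"]
"""

state = {}
exec(save_script, state)  # "loading" the save is just executing the script
print(state["player_level"])  # -> 7
```

The upside is that saves are trivially inspectable and tweakable with a text editor; the downside, of course, is that loading one means running arbitrary code.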

On a side note, if you want to run the game nowadays I recommend the following setup. Configure the game for a 640x480 screen resolution using the Voodoo 1 and 2 renderer, which is based on Glide, the 3dfx API. The GOG release ships with the latest version of nGlide, a library that provides a Glide API implementation using modern Direct3D as the backend. Dive into the game directory and find the nGlide setup tool. Configure nGlide with it to use a 4:3 resolution that maximizes the vertical space of your screen (I chose 1440x1080 as I have a 1080p monitor) and force 4:3 as the screen aspect ratio. If you use an NVIDIA card like I do, in the NVIDIA control panel choose “No scaling” as the scaling mode and have scaling performed by the GPU. This will make sure the game is rendered in the center of your monitor with the proper aspect ratio and black bands on the left and right.

Using the previous setup, the game menus, HUD and help screens will be as big as they were intended to be in the original game, while the game world itself is transparently rendered at 1440x1080 with 32 bits per pixel. The only drawback of using nGlide is a glitch when rendering the help screens, in which the red and blue RGB channels are swapped. I contacted the nGlide author and he told me it’s a problem in the game code and its usage of the Glide API. In any case, they’re still perfectly usable, and you won’t have to use a mod manager or mess with the OpenGL renderer and its major flaws rendering the game fog, which affect whole levels like the Gorge of Orlok. The original Direct3D renderer is too buggy on modern systems and fails to display many things properly. I ran the game on Windows 10 without any compatibility settings and only got a few crashes here and there. I think most of them were present in the original release, except for a game-saving bug I hit a couple of times that made me replay some level sections.


Gameplay is very entertaining, even if some people find the game too hard. However, it’s much simpler and easier than, say, Dark Souls, a game it’s often compared with even though the two are quite different.

At the start of the game you can choose between four characters: an amazon, a knight, a barbarian or a dwarf. Each of them prefers a specific subset of the available weapons and features a unique introduction level before playing a common set of levels. While some weapons are generic and could, in theory, be used by any character, your best bet is to use your character’s favorite weapons, which can be seen on the help screen. That’s a range of pole weapons for the amazon, two-handed swords and axes for the barbarian, single-handed swords and maces for the knight, and single-handed axes and hammers for the dwarf.

Your character’s level determines your amount of health and stamina. You hit the enemy by using the attack button combined with a direction key or a dodge movement. Every character has a set of basic combos available that can be used with any weapon, provided you have the required level. Higher levels give access to more powerful combos. The character’s favorite weapons also have one unique combo each that can be performed if your level is high enough. In general, unique combos are the most powerful ones and can be used when enemies attack, miss, and are left exposed to your attacks. You can start using most weapons before you reach the required level for their combo, and you can continue to use them after surpassing that level if you don’t like the combos of the (in theory) better weapons you may have available.
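The level rules above can be sketched in a few lines of Python. Note that every name and level number here is invented for illustration; none of it comes from the game’s real data files:

```python
# Hypothetical data: basic combos shared by all weapons, unlocked by level.
BASIC_COMBOS = {1: "side slash", 4: "overhead chop", 8: "spin attack"}

# Hypothetical favorite weapons: (unique combo, level required to perform it).
UNIQUE_COMBOS = {"light halberd": ("heron strike", 6)}

def available_combos(level, weapon):
    """Return the combos a character of `level` can perform with `weapon`."""
    combos = [c for lvl, c in BASIC_COMBOS.items() if level >= lvl]
    if weapon in UNIQUE_COMBOS:
        unique, required = UNIQUE_COMBOS[weapon]
        # The weapon itself is usable earlier; only its unique combo is gated.
        if level >= required:
            combos.append(unique)
    return combos

print(available_combos(5, "light halberd"))  # -> ['side slash', 'overhead chop']
```

The same call at level 6 or above would also include the weapon’s unique combo, which is why sticking with a favorite weapon past its requirement remains attractive.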

The game has no difficulty settings and is played as it is. In general, I’d say the game is easiest with the amazon. You can tell they didn’t finish polishing that character, given the low number of general combos she has and how unbalanced the game is with her. The unique combo of her best weapon is powerful enough to tear bosses and minibosses apart in two, three or four attacks. From there, the standard and most polished character is the knight, with many combos, armor and a generally balanced design. The barbarian and the dwarf make the game harder for those inclined to face a small challenge. The barbarian is simply too slow and his combos are easily interrupted by enemies, and this happens throughout the whole game. The dwarf suffers from a very short attack range, but I think playing him gets progressively easier as the game advances. Some of his weapons and combos are fast, cover a wide angle and allow the character to move two or three steps forward, reaching unsuspecting enemies. I kept using his third-best weapon in the final levels just because of its special combo, combining it with the Sword of Ianna.

After playing the game with all four characters in a row, I think its low points are enemy and combat variety. The number of different enemies is alright for a 2001 game, but the vast majority of the game is spent fighting about five types of enemies: imps, orcs, knights, skeletons and zombies. Combat-wise, each of them has a very specific set of movements that are easy to memorize, anticipate and provoke (knights being the least predictable and most cerebral). I found that combat usually boils down to waiting for the enemy to start a combo, dodging it, and attacking the exposed enemy with a combo of your own. In other words, while your available combat movements are many and varied, in practice you only use a small subset of them.


I don’t think there’s much point in giving scores to a 2001 game, but I’ll try.

From the perspective of 2001, the technical score is easily a 9; from today’s, there’s no point in giving one. Gameplay could easily be an 8 or 9 in 2001, and nowadays the level variety, geometry, puzzles and the use of sound and music can keep it as high as a 7.

So you think you can program an elevator

Posted on .

So You Think You Can Program An Elevator is a very interesting programming exercise that reached the front page of Hacker News a couple of weeks ago.

It’s about programming the logic controlling an elevator, responding to passengers inside the elevator and people outside it pressing buttons by deciding whether you must stop when you reach a given floor, or start going up or down.

The exercise itself requires some basic Python knowledge, but the interesting part is how it may surprise you and prove much harder to solve than you expect (or maybe you nail it on the first try, who knows?). There’s also some difficulty in distilling the logic requirements from the long description and from the unit tests themselves, which contain cases the main description lacks. In that sense, it’s a wonderful insight into capturing software requirements from a client, for example.
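To give a flavor of the kind of logic involved (this is a toy sketch of my own, not the exercise’s actual API, and certainly not a passing solution), the core decision might start out looking like this:

```python
UP, DOWN = 1, -1

class ToyElevator:
    """Toy single-elevator logic: stop at requested floors, keep moving in
    the current direction while requests remain that way, then turn around."""

    def __init__(self):
        self.requests = set()   # floors with a pending button press
        self.direction = None   # UP, DOWN or None (idle)

    def press(self, floor):
        self.requests.add(floor)

    def arrive(self, floor):
        """Return True if the car should stop at `floor`; update direction."""
        stop = floor in self.requests
        self.requests.discard(floor)
        above = any(f > floor for f in self.requests)
        below = any(f < floor for f in self.requests)
        if self.direction == UP and not above:
            self.direction = DOWN if below else None
        elif self.direction == DOWN and not below:
            self.direction = UP if above else None
        elif self.direction is None:
            self.direction = UP if above else (DOWN if below else None)
        return stop

elevator = ToyElevator()
elevator.press(3)
print(elevator.arrive(1), elevator.arrive(2), elevator.arrive(3))
# -> False False True
```

Even this toy version ignores the edge cases (direction-specific call buttons, requests arriving mid-travel, ties), and those are exactly where the real exercise gets tricky.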

So if you have some spare minutes or hours, I recommend you fork the repository and try to follow the instructions and code snippets carefully. You’ll probably have some frustrating fun!

Letting other local users connect to your PulseAudio instance

Posted on .

In the past, I mentioned I like to run my web browser as an alternative user, so if the web browser is compromised, it will not gain direct access to my user files. When I was using ALSA with the dmix plugin, this choice had no audio-related drawbacks as long as the web browser user had permission to access the sound devices. However, I also recently mentioned I migrated to PulseAudio, using it as the default ALSA backend, which has a problem in this situation.

The PulseAudio daemon is running as my normal user and depends on the ALSA devices being available. The web browser user will try to use PulseAudio first, since it is the default ALSA backend. This will fail, because PulseAudio is not running as that user, and it will either fall back to the default system device (if so configured in /etc/asound.conf) or fail to produce sound. In addition, if the web browser user grabs the audio device, my PulseAudio instance will not be able to use it (in other words, trying to play a sound as my normal user while a web video is paused will fail, like when using ALSA without the dmix plugin).

The solution I adopted, as per the post title, is similar to using ALSA with dmix: simply allow any local user to connect to PulseAudio. This “recipe” is available all over the web (I took it from another blog), but mentioning it one more time doesn’t hurt. It also avoids running PulseAudio in system mode, which is less well supported upstream.

In /etc/pulse/, make sure you load the module-native-protocol-tcp module with the following parameters:

load-module module-native-protocol-tcp listen= auth-ip-acl=

This way, the PulseAudio daemon will load a module that will allow it to accept TCP connections from the loopback interface and the loopback interface only.

For every user except the one running the PulseAudio daemon, you need to configure PulseAudio so it won’t try to launch a new instance and will instead communicate with the existing daemon over TCP by default. This is simply a matter of editing ~/.config/pulse/client.conf and using the following line:

default-server =

Note the user running the PulseAudio daemon cannot have that client configuration option, or else PulseAudio will refuse to launch a new daemon and will instead try to connect to a server that isn’t running.
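For completeness, the commonly published form of this recipe binds the daemon to the IPv4 loopback address. The file names and values below are the usual ones from that recipe, not necessarily identical to my exact setup, so adjust them if yours differs:

```ini
# Daemon side, usually in /etc/pulse/ accept TCP connections
# only from the local loopback address.
load-module module-native-protocol-tcp listen= auth-ip-acl=

# Client side, in ~/.config/pulse/client.conf of every user except the one
# running the daemon (the default TCP port, 4713, is implied):
default-server =
```

With both pieces in place, clients running as other local users talk to the one daemon over loopback TCP instead of trying to spawn their own.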

Caveats and problems

I tried to use UNIX sockets with module-native-protocol-unix and an authorised user group, and for some reason it didn’t seem to work. Any input on that is appreciated; if you have a working configuration with UNIX sockets, let me know.

The web browser user will have unlimited access to your sound devices. It can output sound and it can also listen to your microphone, should you have one permanently attached. In my case, this is a desktop computer and it doesn’t normally have one connected. Make no mistake: this is the same situation as with ALSA and dmix, with one minor variation. If the PulseAudio daemon has a vulnerability, the web browser user may be able to make it run code as your user, so PulseAudio needs to be kept secure.

Nox Hummer TX is my new computer case

Posted on .

Another small technology change in my life last year was switching my computer case to a Nox Hummer TX. Normally, a case change is a minor thing, but this one left me thinking I should be building my own computers from now on.

Until now, I have always bought desktop computers by choosing the main components (processor, type and amount of memory, graphics card and hard drive model) and leaving myself open to suggestions about motherboards and other components (typically with a strong preference for Asus, though my current motherboard is a Gigabyte one). I’d go to my computer shop of choice with the component list and buy everything there, and they would assemble the computer for me.

However, something happened last year that I didn’t like. I tend to prefer large CPU fans because they can move more air while spinning more slowly, making less noise, and my computer has a Nox Hummer H-300 CPU cooler, a 12 cm fan I had successfully used in my previous computer too (which my wife now uses). The case the computer shop chose for me was obviously too narrow for the cooler, and its upper part made contact with the side panel. I didn’t think too much about it, and the computer worked fine for months.

Still, one day I gently hit the case with my knee. It wasn’t a big hit by any means, but the computer instantly powered off and, from then on, I had problems turning it on, and sometimes it would turn itself off abruptly. After testing the hard drive (with the badblocks command) and the memory (with Memtest86+), nothing showed up. I stressed the computer for a bit and it didn’t turn off, with temperatures staying fine, so I figured the power supply and fans were probably fine and crossed my fingers it was not the CPU failing. Everything pointed to a problem with the computer case, the way everything was assembled, or a bad contact or short circuit somewhere, caused by the small hit it took.

I set out to choose a better case with ample room for the cooler, taking the change as an opportunity to rearrange the cables and connect everything again. Initially my budget was 60 to 70 euros, but I ended up choosing the Nox Hummer TX, as I said, which retails here for close to 100 euros instead. It’s a very big, nice case with good airflow and many small details that make working with it very easy, including screwless hard drive bays. I had been thinking about doing the case switch myself, but I decided not to risk my everyday computer, and scheduled the change at my computer shop.

But, while going through dozens of cases with no clear idea of whether the cooler would fit in them or whether the cases were any good (I absolutely hated the experience of choosing a case), I also watched a few computer building tutorials on YouTube. And, while choosing the case was horrible, I must say I’m impressed with how far building computers has come. It’s now much simpler, with far fewer chances of making terrible mistakes. In particular, I liked the EasyPCbuilder tutorial. It shows how to apply thermal paste to the CPU (covering the center, the part that gets really hot under load), gives ideas on working around static charges, and offers advice on the proper order for every step.

In other words, and like I said in the first paragraph, this whole issue got me thinking that the next time I buy or upgrade my computer, I’ll buy the parts online, save a few euros I can invest in better hardware, and build the computer myself. It should be a much more satisfying experience, and any mistakes will be mine to learn from.

Seat allocation in the 2015 Spanish general election

Posted on .

On December 20 a general election was held in Spain, and it’s been a very active topic in local news and social networks. As usual, many people discussed seat allocation after the results were published, and more so in this case: two new parties got a lot of votes and the parliament is very fragmented. I had a rough idea of the whole seat allocation process, but in the last few weeks, thanks to dozens of published articles and hundreds of comments, I learned quite a lot about it, including the specifics of the D’Hondt method used to allocate seats.

The process is well explained in the Wikipedia article about the election. For the parliament (the senate uses a different election mechanism), there are a total of 52 voting districts. These correspond to the 50 Spanish provinces into which the 17 autonomous communities are divided (autonomous communities are roughly similar, large differences aside, to states in the United States). Additionally, the cities of Ceuta and Melilla are voting districts too, choosing one seat each.

The number of seats chosen in each district is listed in the Wikipedia article, for a total of 350. This division into voting districts is often cited as the cause of the disproportional distribution of seats. There’s a separate Wikipedia article discussing the election results, where you can see the number of seats allocated to each party in the official results. In each voting district, the D’Hondt method was applied to allocate seats to each party, as required by election law.

Some weeks ago, I didn’t know exactly how the D’Hondt method worked. I read Wikipedia’s entry on it and found it surprisingly simple. In fact, I wrote a bit of Python code to calculate seat allocations as if a single district had been used for the whole country. The code implements the D’Hondt method as well as the Sainte-Laguë method and the modified Sainte-Laguë method, just to compare several highest-quotient methods. The code has been uploaded to GitHub if you want to play with it. It takes voting results in a specific format described in the code comments (with an example from the general election) and generates an HTML page with the results, which is the best format for result analysis I could think of. In the generated results, DH refers to D’Hondt, SL to Sainte-Laguë and MSL to modified Sainte-Laguë.

I don’t think I can describe the D’Hondt method better than the Wikipedia article does, but in short: you divide each party’s total number of votes by an increasing sequence of divisors, related to the total number of seats to be allocated, producing a long list of quotients. From all the calculated quotients, the biggest ones are taken, and each one awards a seat to the party it corresponds to.
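As a minimal sketch of that description (my own illustration, not the code from my repository), the whole family of highest-quotient methods fits in a few lines, with only the divisor sequence changing between methods:

```python
from collections import Counter

def highest_quotient(votes, seats, divisor):
    """Allocate `seats` among parties using a highest-quotient method.

    votes: dict mapping party -> vote count.
    divisor: maps seats-already-won to the divisor for the next quotient.
    Ties are broken arbitrarily here; real election law specifies tie rules.
    Parties with zero seats are omitted from the result.
    """
    won = Counter()
    for _ in range(seats):
        best = max(votes, key=lambda p: votes[p] / divisor(won[p]))
        won[best] += 1
    return dict(won)

# D'Hondt divides by 1, 2, 3...; Sainte-Lague by 1, 3, 5...
dhondt = lambda v, s: highest_quotient(v, s, lambda k: k + 1)
sainte_lague = lambda v, s: highest_quotient(v, s, lambda k: 2 * k + 1)

# Wikipedia's worked D'Hondt example: 8 seats among four parties.
print(dhondt({"A": 100000, "B": 80000, "C": 30000, "D": 20000}, 8))
# -> {'A': 4, 'B': 3, 'C': 1}
```

Picking the overall maximum quotient one seat at a time, as done here, is equivalent to computing the full quotient table and taking the largest entries.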

As you can see in the results page generated by my code, seat allocation varies quite a bit compared to the official results. It shows how the two biggest parties benefit from having many voting districts while other parties, especially UP (Unidad Popular), suffer major seat losses (apart from having lost a lot of votes compared to the previous election, I mean). I was tempted to inline some results here, but the tables are too big. Simply observe that the election winner, Partido Popular, got 123 seats and would have gotten closer to 100 with a single district. The two trailing parties, Partido Socialista and Podemos, got 90 and 69 seats respectively; in a single voting district, they would be much closer to one another, at around 80 and 75 seats.

The results also show D’Hondt and Sainte-Laguë to be very similar, so choosing another highest-quotient method isn’t going to alter the results significantly. Comparing the real vote proportions to the seat proportions shows the modified Sainte-Laguë method, at least in this case, would arguably produce more disproportion than the normal version.

Another factor to take into account is that election law in Spain requires a party to get a minimum of 3% of the valid votes in a given district to be eligible for seats in that district. Valid votes are votes for any party plus blank votes. Some people in Spain are confused about how blank votes are taken into account and how they differ from invalid votes. They are not added to the winning party in each district, as some people believe. They are not used for anything, and seats are not left unallocated on their behalf; that’s the specific purpose of the Escaños en Blanco party (roughly translated as Blank Seats). But they do make it less likely for a small party to win a seat, because they raise the bar for the minimum 3%. In a single nationwide voting district, it wouldn’t make sense to keep the 3% barrier: only 5 parties would be represented, while the others, totalling a nontrivial 30 seats under the D’Hondt method, would get none.
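That rule is easy to express in code. A small sketch (my own, with made-up numbers) shows how blank votes raise the bar without winning anything themselves:

```python
def eligible_parties(party_votes, blank_votes, threshold=0.03):
    """Drop parties below `threshold` of the valid votes.

    Valid votes = votes for any party + blank votes, as in Spanish law.
    """
    valid = sum(party_votes.values()) + blank_votes
    return {p: v for p, v in party_votes.items() if v / valid >= threshold}

votes = {"A": 9000, "B": 700, "C": 300}

# Without blanks, C sits exactly at 3% of 10000 valid votes and stays in.
print(eligible_parties(votes, blank_votes=0))    # -> {'A': 9000, 'B': 700, 'C': 300}

# Add 500 blank votes: C's share drops below 3% of 10500 and C is excluded,
# yet the blanks win no seats for anyone.
print(eligible_parties(votes, blank_votes=500))  # -> {'A': 9000, 'B': 700}
```

Only the parties surviving this filter would then enter the D’Hondt allocation for the district.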

Now, having a single voting district in Spain would require changing the Spanish constitution, which is not legally easy. The constitution does not specify the exact proportional method used to allocate seats; that’s part of the election law, not the constitution. However, it does explicitly state that voting districts correspond to provinces, as they do now. Others argue the current voting districts prevent politicians from focusing only on the 15 or 20 biggest Spanish cities, which hold the majority of the population. In theory, voting districts give minorities in small provinces representation. In practice, though, people in small provinces choose representatives from the major parties, and smaller, rising parties are the ones losing seats right now. In other words, no system is perfect.