Counting Triangles (3)

Posted on .

A few days ago, a familiar problem turned up in a Spanish online newspaper, promoted as a mind game that was trending on Twitter.

Apart from the bogus IQ claim, it gave me the chance to revisit my old piece of code, which brute-forced the problem easily in a few lines thanks to the amazing speed of today’s computers.

I reworked the code to use itertools.combinations instead of custom routines, which removed a lot of boilerplate and improved the code in general, and I ported it from Python 2 to Python 3. The result, in the public domain like the previous versions:

#!/usr/bin/env python3
from itertools import combinations
import sys

lines = set()   # each drawing line, as a tuple of the vertices it contains
vertex = set()  # every vertex name that appears in the drawing

if len(sys.argv) != 2:
    sys.exit('Usage: %s INPUT_FILE' % (sys.argv[0], ))

try:
    with open(sys.argv[1], 'r') as stream:
        for input_line in stream:
            # Skip comment lines; store each remaining non-empty line as a
            # tuple of vertex names.
            if not input_line.startswith('#'):
                drawing_line = tuple(input_line.split())
                if len(drawing_line) > 0:
                    lines.add(drawing_line)
except (IOError, OSError):
    sys.exit('ERROR: unable to read input file')

# Collect the complete set of vertices from the drawing lines.
for l in lines:
    vertex.update(set(l))

def tuple_in_line(t, lines):
    # True if all the vertices in t lie on a single drawing line.
    return any(all(e in l for e in t) for l in lines)

def valid_triangle(trio, lines):
    # Three vertices form a triangle if every pair of them is connected by
    # some line, but the three of them are not collinear.
    return (all(tuple_in_line(pair, lines) for pair in combinations(trio, 2))
            and (not tuple_in_line(trio, lines)))

triangles = []
for trio in combinations(vertex, 3):
    trio = sorted(trio)
    if valid_triangle(trio, lines):
        triangles.append(trio)

for t in sorted(triangles):
    print(' '.join(t))
print('Total: %s' % (len(triangles), ))

If problems were larger, we could generate graphs with all existing 2-vertex and 3-vertex connections beforehand and enumerate triangles using that knowledge instead of just brute-forcing the problem by generating all possible triangles and checking if they’re valid for the given drawing.
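Just as an illustration, and not part of the script above, a rough sketch of that idea could look like this; the function takes the same lines set the script builds, and all the names here are mine:

from collections import defaultdict
from itertools import combinations

def count_triangles_from_graph(lines):
    # Sketch only: precompute the connectivity information up front.
    adjacent = defaultdict(set)   # vertex -> vertices sharing some line with it
    collinear = set()             # sorted trios lying on a single line
    for line in lines:
        for a, b in combinations(line, 2):
            adjacent[a].add(b)
            adjacent[b].add(a)
        for trio in combinations(line, 3):
            collinear.add(tuple(sorted(trio)))
    # Walk the adjacency sets instead of testing every possible trio.
    triangles = set()
    for a in adjacent:
        for b, c in combinations(adjacent[a], 2):
            trio = tuple(sorted((a, b, c)))
            if c in adjacent[b] and trio not in collinear:
                triangles.add(trio)
    return sorted(triangles)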

Reminder: to feed data to the program you need to enumerate every vertex in the image by hand (a vertex is any point where two or more lines meet) and then list the lines in the drawing, one per input line, as a list of vertices separated by spaces. If in doubt, read my two previous posts on the problem.
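As a quick, made-up example, a drawing consisting of a single triangle with vertices A, B and C (three lines, each passing through two vertices) would look like this:

# a single triangle
A B
B C
A C

Saving the script as, say, triangles.py and running it with that file as its only argument should print the trio A B C followed by Total: 1.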

In the Twitter image, as other replies pointed out, there are actually 24 triangles.

Upgraded to Fedora 25

Posted on .

Just a quick post to say I upgraded to Fedora 25 a couple of weeks ago. The process was painless and nothing major broke for me. When booting, I get the following error on the console and, I suppose because of it, the boot splash screen is not shown.

ERROR: Unable to locate IOAPIC for GSI 37

If you get a similar error, most web searches return outdated information about similar messages from several years ago. The only relevant result I found was a thread in the Arch Linux forums. The error seems completely harmless otherwise, and a fix has been committed for Linux 4.9. Kernel 4.9 hasn’t been officially released yet as I write this, and Fedora typically takes a few weeks to ship a new kernel after it’s released, so in the meantime I’ll have to live with the error message.

Classic C++ problem with #include directives

Posted on .

A few days ago we hit a perplexing bug at my day job that turned out to be related to a classic C++ problem with #include directives. Hopefully, C++ modules will solve part of the mess that #include directives are, but no official specification exists yet and they didn’t make it into C++17.

One of the classic questions about #include directives is: what is the difference between an #include directive using double quotes and one using angles? The C++ standard mentions that headers included using angles may not be implemented as actual files. For example, there’s no need for a cstdio file to exist when someone uses #include <cstdio>. Double quotes, on the other hand, refer to files, and in both cases the implementation is free to choose how to do a lot of things. In practice, different compilers do things differently. From now on, I’ll be talking about Unix systems and GCC/Clang.

GCC and Clang both do a very simple thing. Most standard headers (every single one?) are actual files that exist on disk. The difference between double quotes and angles is that double quotes make the preprocessor look for the file first in the same directory where the source file including it is located. If it’s not found there, with both angles and double quotes, the compiler proceeds to search for the header in the list of directories passed using the -I flag. If it’s still not found, a list of standard system directories is used.

Combining the theory and practice above, some people use angles for every standard or system header, and quotes for every “user header” that belongs to the project they’re building, even if the included header is not in the same directory as the source file. Others, like myself, prefer to use double quotes only when the file is actually in the same directory as the source file including it, which usually means the header belongs to the same module. Headers belonging to other modules, even if local to the project, are included using angles and passing the corresponding -I option to the compiler, which would be needed in any case. Of course, there are people who don’t fall into either camp.

A second typical question would be: is there any difference between using angles and passing the current directory (dot) using -I., and using quotes without passing the -I option? Yes, there is, when the file you’re compiling is not in the directory you’re calling the compiler from. In that case, angles combined with -I. will search for the header in the directory you’re calling the compiler from (the current working directory), while double quotes will make the preprocessor search for the header in the source file’s directory. For example, imagine the following hierarchy:

header.h
subdir/main.c
subdir/header.h

When compiling subdir/main.c from the base directory, if subdir/main.c uses #include <header.h> and you pass the -I. option, header.h, in the base directory, will be included. If subdir/main.c uses #include "header.h", subdir/header.h will be included instead.
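If you want to check it yourself, a minimal test could look like the following; the file contents and the printed message are just an illustration:

/* header.h, in the base directory */
#define WHERE "base directory"

/* subdir/header.h */
#define WHERE "subdir"

/* subdir/main.c */
#include <stdio.h>
#include "header.h"   /* change to <header.h> to test the other case */

int main(void)
{
    printf("header.h was included from the %s\n", WHERE);
    return 0;
}

Compiling with gcc -I. -o main subdir/main.c from the base directory and running the program should report “subdir” with the double-quoted directive, and “base directory” after switching it to angles.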

Subtle but important. A small part of our build system used an auxiliary tool that generated a temporary file in /tmp and ran the preprocessor on it. When that temporary file used #include directives with double quotes, the included file was searched for in /tmp first. Normally, it didn’t exist there. But one day it did, because someone had put a temporary copy in /tmp to do some tests. The included file had outdated contents in /tmp and, furthermore, everything happened to compile and build just fine. Problems only appeared at runtime, and it was very hard to find out what was happening because the original source code didn’t match what was being compiled into the binary. It took us a whole day, and the solution, in our case, involved using a temporary directory inside /tmp instead of creating the temporary files directly there.

GTX 760 upgraded to a GTX 1070

Posted on . Updated on .

It’s been a long time since my last post, so here’s a new one to tell you I switched graphics cards from a GTX 760 to a GTX 1070.

It all started when NVIDIA launched their first Pascal cards at the start of this past summer. I was pretty serious about getting the amazing GTX 1070 (as powerful as the previous-generation GTX 980 Ti, no less!) once aftermarket cards became available and AMD had released their Polaris cards too, which I hoped would bring NVIDIA prices down a bit. Taking into account VAT, import taxes and everything, I was hoping GTX 1070 cards would be available for 400-ish euros. Around 410 or 420 was what I expected, and the lower the better, of course.

I had never owned a card that expensive, but I thought the improvements this generation brought were worth it. I also planned to sell my old card on eBay to cover some of the cost (which I did, for around 75 euros). When the time came, the cards were much more expensive (they still are) and everyone seemed to be permanently short on stock. Prices were actually closer to 500 euros in many cases, at least in Spain, with some specific cards going over that figure.

Time passed and AMD released their RX480 cards, while NVIDIA released the GTX 1060 model to compete with them. Both were much more affordable while being incredibly powerful too. A GTX 1060 has similar performance to a GTX 980. RX480 cards had mixed performance. In many interesting old DX11 games performance was below a GTX 1060 and closer to a GTX 970 (which is still not bad at all). In some DX11 titles performance was more or less the same. In a few selected DX11 and most DX12 and Vulkan games performance was clearly superior to a GTX 1060 and closer to the GTX 1070.

So at the end of the summer I had the same doubt many other gamers were having, from what I’ve been reading: whether to go for the RX480 (8GB version) or the GTX 1060 (6GB version). In Spain both cards were priced similarly, with some stores having cheaper 1060 cards and others having cheaper RX480 cards, never more than 20 euros apart. I read a lot about async compute, the state of Linux drivers, DX12, Vulkan, DX11 drivers, etc. My take from everything I read? NVIDIA, for all practical purposes, supports async compute, even if their solution relies a bit more on software than hardware at this point. We still need more samples from games, and comparisons are not totally fair at the time I’m writing this.

For example, Vulkan is noticeably faster than OpenGL in Doom using a GTX 1070, but only when the frame rate is incredibly high and you start to be constrained by CPU performance. Back to the RX480 and GTX 1060: OpenGL performance with NVIDIA drivers was already stellar, so it’s harder to notice the difference. Vulkan performance also improved when NVIDIA released new drivers supporting a newer version of the Vulkan API. And Doom still does not support NVIDIA shader intrinsics while it supports AMD’s, so it’s hard to tell which card is better. Games implementing DX12 so far have been problematic too, with many of them being traditionally AMD-centric or having inferior performance compared to DX11. Again, hard to compare. Also, NVIDIA cards in general have less compute performance than AMD cards, but more texel performance. In other words, NVIDIA cards should be slower at shaders but noticeably faster at geometry and texturing. It’s hard to find the sweet spot, and performance varies from game to game.

I finally went back to my initial plan and got a GTX 1070, for several reasons. First off, a decent aftermarket GTX 1070 was temporarily available in Spain for 420 euros for several days, so I took the chance to grab one. I’m going to use it to play a lot of already-released OpenGL and DX11 games, like The Witcher 3, Arkham Knight, Far Cry 4 and Primal, Fallout 4, etc. Support for the card was already available in the NVIDIA drivers for Linux, while AMD support has been arriving more slowly, albeit in a better way (kernel 4.8 was the best choice but was only released for Fedora 24 last week). 3D performance with the open source drivers will be much better with Mesa 12.1, but that’s yet to be released officially and pushed to Fedora. Also, a GTX 1070 uses about the same amount of power as an RX480; in that respect, NVIDIA cards are more efficient.

That’s more or less why I stayed on “Team Green” like I’ve done since the days of Voodoo cards, but I’m not an NVIDIA fan by any means. In general, I’m not a fan of any brand at all, in any market. I don’t look at the brand stickers on the things I buy, but I do mind their performance, features and price. In fact, being a FOSS enthusiast, I can only congratulate AMD for releasing a phenomenal card like the RX480, improving Mesa and getting support for their cards integrated in the Linux kernel as fast as they did. It’s really amazing. I’m crossing my fingers for them and I sincerely hope Zen ends up being a very nice CPU line that brings some needed competition to Intel. And the same goes for Vega cards and NVIDIA. I’ve owned several AMD Athlon processors in the past, and I look forward to building an all-AMD computer in the future with nice FOSS drivers for Linux and very decent drivers for Windows. They’re definitely on the right path as of now and it’s better for everybody if they stay on it.

From a Nexus 4 to a Moto G4

Posted on .

I recently switched mobile phones, getting rid of my old Nexus 4 and buying a brand new Moto G4. The reasons for the switch more or less follow from what I’ve been saying about mobile phones in a few other posts in the past.

Basically, and repeating myself, I’m disappointed with mobile. I really thought that with the arrival of the Nexus 4 not too long ago (November 2012 according to Wikipedia) some things would start to be different about smartphones. They wouldn’t be disposable products like most other mobile phones were, and hardware would be supported and patched for many years to come as long as it made sense. I missed that prediction by a long shot. Android updates are a mess and we don’t get long-term support on any device I know of, if one even exists. Google, ever changing its policy, seems to have settled for 2 years of system updates after the device is introduced and a third year of security updates. For me, system updates are irrelevant (and, in my opinion, a frequent source of bugs and instability), but knowing you could get security updates for the next, say, 5 years (sort of like in enterprise Linux distributions) would be a big advance, and nobody does that.

So when I got tired of having problems with my Nexus 4 and security support for it had, in theory, expired, I looked for alternatives. I must say I was a bit disappointed with this year’s range of Moto devices, the first one after Lenovo took over. The higher prices, the reduced availability of the Moto G4 Play version (which will not be available in Spain until at least September 1st) and the long delay in launching a new version of the Moto E, which is still not available, forced me to spend more money than expected. I actually bought the phone from Amazon.co.uk not long after the Brexit vote, when the pound sterling lost value, and despite shipping I still managed to save about 20 euros.

Anyway, this time I’m psychologically prepared to dump it after two years if needed, and I still plan to go even cheaper as time passes. Certainly, the 200-euro Moto G4 is definitely much better than the 300-euro Nexus 4, and in two or three years I’m sure a 100- or 150-euro phone will cut it.

Since I refused to waste time doing a factory reset on my Nexus 4, these were the problems I was having. I know a factory reset might have fixed some of them, but not all.

  • Inability to recover mobile data connectivity after being on WiFi for a long time. Fixed by going into airplane mode and back.

  • Phone ringtone and notification ringtone resetting to weird and faulty values from time to time. I almost always noticed this when the phone rang with a weird tone.

  • There’s a bug that reboots the phone when connecting to some WiFi routers. This happened to me with a specific router.

  • Flashlight mode or airplane mode activating by themselves from time to time.

  • The phone rebooting itself or powering off for no apparent reason from time to time. This impacted me more than any other problem when I had to use the phone as an alarm clock: it powered off overnight and I overslept on two occasions.

What I love about my new Moto G4, apart from getting rid of the previous problems obviously:

  • Improved battery life.

  • The ability to see pending notifications and bring up the unlock screen by tilting the phone. This guarantees minimal use of the physical buttons.

  • 4G support, like most modern phones have. It’s not really about download speeds; latency is much lower on 4G, and I notice this especially when browsing the web and tapping a link to open a new page.

  • It’s still a vanilla Android experience.

  • Much stronger vibration that can be felt and heard easily without disturbing anyone. The Nexus 4 was incredibly weak in this aspect.

  • The camera and flashlight gestures.

  • The phone in general being more powerful, which again can be noticed when using a web browser.

  • The improved screen resolution is nice too, but it’s not a game changer.

I’ll put my old Nexus 4 up on eBay soon with a starting price of 50 euros, but contact me privately if you’re interested and that price plus shipping works for you. I don’t expect anyone to bid much higher. Battery life is not great, but with light use I was still getting more than a day on a full charge.