Sound output from the wrong jack

Debian recently released an update to their stable release, version 8.7, and with it an update to a slightly more recent Linux kernel (3.16, up from 3.2). Well, that would be nice to have, I thought, so I updated my office workstation and rebooted. Everything looked fine; it even picked up and updated the Nvidia graphics driver that I always have problems with. But then, when I tried to play radio over the Internet, the sound suddenly started blaring out from a speaker inside the chassis that I didn't even know the machine had, instead of from my properly connected speakers.

At first I thought the new driver was broken, so I rebooted back into the old kernel. Still the wrong output. Then I turned the power off and on again and started the old kernel; still the wrong output. Strange.

I have an HP Z220 Workstation (from 2013) at the office, with an "Intel Corporation 7 Series/C210 Series Chipset Family High Definition Audio Controller (rev 04)" and a Realtek ALC221 codec chip (as per the output from lspci -v and /proc/asound/card0/codec#0). It took me an hour of intense googling to find the right set of keywords to turn up anything useful; apparently most English-language threads use "jack" for the outputs. I should have known that.
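
For reference, this is how the hardware identifies itself (the card number is whatever ALSA assigned, card0 in my case):

    # The PCI audio controller
    lspci -v | grep -i audio
    # The codec chip behind it
    head -1 /proc/asound/card0/codec#0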

I eventually stumbled on this Arch Linux thread from 2014, which mentioned a tool called hdajackretask that can be used to rearrange the outputs of HDA cards. Debian distributes this utility in the alsa-tools-gui package. After installing the package and changing the output type, I managed to get sound playing through my speakers again.
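
For anyone else hitting this, the steps boil down to:

    sudo apt-get install alsa-tools-gui
    hdajackretask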

Screenshot of hdajackretask, used to select output devices on an HDA audio card, with "Green Line Out, Rear side" set to "Line out (back)"

Now to actually get some work done. That is Mondays for you.

The futility of OSX parental control and web browsers

I have kids. Two of them, the youngest is five and the oldest is about to turn eight years old. Since they see me and my wife use a computer regularly, they of course also want to use it. The oldest has access to computers at school, and if they are going to be proficient with computers, they need to start using them at an early age. I have a MacBook Pro that they both have accounts on, both set up with OSX’s default “Parental Control” feature.

That works fairly well when they use local applications (Photo Booth is a favourite; if I hadn't blocked it, their little clips would probably have ended up on YouTube had they known how to upload them). Well, before even getting to the applications, there are all these pesky little pieces of software that phone home on every start-up, under the guise of doing software updates. No matter how many times I block "Google Software Update" or "Paragon Updater" and the like, every time the kids log in to their accounts they get a message that they cannot run them. Well, they learn to click "OK" and go on with their lives. Using a web browser is a lot more hassle, though.

I had initially set up a whitelist in the Parental Control settings, to only allow them access to certain web sites. That doesn't work, since every site in the universe now includes content from other places, be it CDNs, Google's web tracking or some JavaScript library the authors can't be bothered to copy to their own domain. I could live with that; a lot of it can be blocked with Ghostery or similar, but that assumes you can even get that far.

Just running a web browser on an account that has Parental Control enabled is a chapter in itself. First there is the phone-home auto-update stuff that kicks in every few moments. Then there are the pre-installed shortcuts (at least in Opera) that want to download screenshots to display on the Speed Dial screen (why can't they just ship with default images?). Then, merely typing a web address sends every single keystroke to Google, which means having to close a blocked-URL dialog after every single letter of the URL. In Google Chrome, it seems utterly and completely impossible to disable this behaviour. Opera can do it, hidden deep inside its configuration options, but then I have to enter a magic key combination to remove the Search field, and fight the blocked-URL pop-ups to remove the pre-installed Speed Dials.

I need to try out Vivaldi for the kids’ accounts. I know it can be configured to be less intrusive, and it doesn’t send all keystrokes to the search engine. When I set up the account for my oldest daughter there wasn’t a stable version around, but it should be fine now.

End of an era

The day had to come; I knew it, I just postponed it for as long as possible. But now it is time to move on, time to shut down my Fidonet system for good, over twenty years after setting up my first one. My Fidonet history spans a lot of different setups: starting out reading mail off BBSes using Blue Wave, through a simple point setup with Terminate on MS-DOS, moving on to an OS/2-based system built on SquishMail with timEd and Fleet Street as readers, and even serving as the Swedish shareware registration agent for Fleet Street for a few years at the Fidonet peak in the late 1990s.

I then moved to a Linux-based system using CrashMail II (for a while running timEd through an MS-DOS emulator under Linux, before GoldEd was ported), and lately a Usenet news reader talking to the JAMNNTPd software. During my tenure as a Debian developer I had a lot of this stuff packaged for Debian, but I haven't checked whether it is still there; I have just been using the binaries I compiled several years ago. Lately, the setup has simply stopped working. Maybe my message bases have broken completely, I don't know, and considering how seldom I read them, I figured now was the time to shut the system down for good.

It is still a bit sad. I remember the peak around 1996–1998, when I moderated a chat area and had to enforce a limit of 50 posts per day per author, or it would overflow completely (remember, this was at a time when it could take a day or three for messages to propagate). Now I don't know how many years it has been since anyone posted a single message in any of the derelict Swedish areas. There is still some activity in the international areas, but not enough to keep my system running.

Good-bye, Fidonet!

OS X Time Machine recovery does not find my USB disk

Today the root file system on my MacBook developed an "Invalid index key" error that I was unable to fix, neither by booting into recovery mode and using Disk Utility, nor by booting into single-user mode and using the fsck_hfs tool, no matter what flags I threw at it. Paragon HFS for Windows could still read (and write) the partition from my Windows installation, so the file system itself was readable, but OS X refused to boot from it.
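
For the record, these are the kinds of invocations I tried from single-user mode (the device name is an example; it depends on the partition layout):

    # Force a full check, answering yes to all repairs
    /sbin/fsck_hfs -fy /dev/disk0s2
    # Rebuild the catalog B-tree
    /sbin/fsck_hfs -r /dev/disk0s2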

After a few hours of trying to fix the problem, I simply gave up. I saw several mentions of a tool called Disk Warrior that can supposedly fix a lot of the problems fsck cannot, but I was reluctant to throw over 100 US dollars at a tool without knowing whether it would make any difference.

I do have backups, even if the MacBook isn't set up to do daily backups like most of my machines are (I never got the Time Machine interface in my Synology NAS to work with it), so the last backup I had was from December last year. Better than nothing, and I don't really keep that many important files on the laptop – most of the important ones are shared with other computers (synchronized using Git version control) or live in Dropbox.

So I booted from the recovery partition, selected Restore from Time Machine and … my backup didn’t appear.

So I rebooted. Still nothing.

Rebooting again, this time from the backup disk itself (which has a convenient OS image installed on it). Still no disk. The only thing listed was my (failed) attempt at a backup node on the Synology NAS (and I was unable to connect to it, just as Time Machine itself had been).

Meh.

Then it struck me: what if I powered off the Synology and then opened the recovery program? That is what I tried, and there it was! The recovery tool finally let me select the disk that was physically connected to the machine, rather than the network share over WiFi. (Still, it is quite impressive that it finds the share even when booting from the recovery partition on the backup disk; I must say Apple are rather good at making those things just work, even when it failed at what I really wanted to do.)

Now the backup is finally restoring. The clock is approaching half past midnight and it is at 7.5 % restored, so I guess I will have to wait until the morning to see whether it actually worked, but at least it is trying now…

Time to go to sleep.

Watching the WWE Network on Linux

Okay, I confess, I am a fan of pro wrestling. You know, that weird US-American show-style wrestling where people pretend to beat each other up? Hulk Hogan and Ric Flair? No, okay, then you don’t need to continue reading.

Anyway, I am a fan, I even have a website dedicated to it, and I subscribe to the WWE Network, an on-line channel where WWE broadcast their live events and make their back archive available. I subscribed when they opened international subscriptions back in August 2014, and among other devices, I have watched it on my PCs running Linux. It worked flawlessly until a few weeks ago, when it started throwing error messages and then stopped playing completely.

Contacting their technical support didn't help; once they heard I was running Linux they simply stopped responding, both on Facebook and over e-mail. Despite it having worked perfectly before, since the platform is unsupported they apparently do not want to look into fixing it. So, what to do?

I ended up with a workaround: installing the Windows (32-bit) versions of Firefox and Flash Player under WINE. While it was easy enough to find the download link for Firefox, finding a working installer for the Flash plug-in was a bit more difficult. The normal plug-in download page didn't work, as its installer is just a placeholder that downloads the real installer, which fails under WINE. I eventually found a page with an off-line installer, a page headed by a big warning that it will be taken away next year.
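
Roughly, the setup looked like this (the installer file names are placeholders; they vary with the downloaded versions):

    # Keep the browser in a separate 32-bit WINE prefix
    export WINEARCH=win32 WINEPREFIX=$HOME/.wine-wwe
    winecfg
    # Install the 32-bit Windows Firefox, then the off-line Flash installer
    wine "Firefox Setup.exe"
    wine install_flash_player.exe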

Installing those and launching the Windows Firefox, I am able to play videos again. There are a few issues – the audio is not 100 % synchronized with the video – but it is at least better than not playing at all.

I now have a workaround, but I still hope they will fix it properly soon.

Making OVF images using Packer

At my $DAYJOB, the need recently arose to not only make our software available as an installer that end-users can run on their own machines, but also to provide pre-built OVF (Open Virtualization Format) images, mainly targeted at customers who run VMware vSphere and prefer not to put the software on bare metal. They can of course run the regular installer, but providing a pre-installed image cuts deployment time considerably and eliminates many of the mistakes that can be made while performing the installation.

Hunting around for a way to actually generate these images – through some kind of automated procedure, since we will regenerate them several times and in slightly different configurations – I eventually landed on Packer. Packer lets me drive VMware Workstation by writing a configuration file that points at an ISO image to install from and gives the commands necessary to run the installation automatically.
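
A stripped-down version of such a template looks something like this (every value here is a placeholder, not our actual configuration):

    {
      "builders": [{
        "type": "vmware-iso",
        "iso_url": "build/install.iso",
        "iso_checksum_type": "sha256",
        "iso_checksum": "<checksum of the ISO>",
        "ssh_username": "installer",
        "ssh_password": "secret",
        "boot_command": ["<enter>"],
        "shutdown_command": "sudo shutdown -h now"
      }],
      "provisioners": [{
        "type": "shell",
        "script": "cleanup.sh"
      }]
    }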

One of the issues with doing this is that most installations embed unique identifiers in the image, which we do not want. For instance, SSH host keys are generated, as are MAC addresses for the network cards, along with various other per-machine state. Fortunately, I was not the first one to face this problem, so it was fairly easy to find a solution for cleaning up the generated image. In addition, I had the post-install script install VMware Tools in the virtual image, and then remove the various UUIDs and MAC addresses from the generated VMware configuration file.
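
The cleanup amounts to a shell provisioner along these lines (a sketch; the exact set of files to remove depends on the distribution inside the image):

    # cleanup.sh - run inside the guest as the last provisioning step
    # Remove identifiers that must be unique per installed machine
    rm -f /etc/ssh/ssh_host_*_key /etc/ssh/ssh_host_*_key.pub
    rm -f /etc/udev/rules.d/70-persistent-net.rules
    # Emptied here, regenerated on first boot
    truncate -s 0 /etc/machine-id

    # Afterwards, on the host: strip per-machine identifiers from the
    # generated .vmx file (path is an example)
    sed -i '/^uuid\./d;/^ethernet0\.generatedAddress/d' output-vmware/image.vmx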

The result of running Packer is, however, still a VMware image. Packer does have a builder that outputs OVF, but it runs on Oracle VirtualBox instead. OVF is supposed to be platform-independent, but there are enough differences in how the images are built to create trouble if we use the wrong build platform. Instead, we landed on running VMware OVF Tool on the generated VMware image, converting it into an OVF archive (.ova). This is the longest step of our build process, which starts out by generating the installation ISO on the fly. But in the end we have an OVA file that can be imported into VMware (vSphere, Workstation and Player all work fine) and be up and running in under two minutes.
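
The conversion itself is a single (slow) command; the paths are examples:

    ovftool output-vmware/image.vmx image.ova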

Fixing battery drain on my Samsung Galaxy Note 3

I have a Samsung Galaxy Note 3 that I am quite happy with. I have had it for a couple of years now and have no plans of switching to a newer one for a while. Recently, however, it started acting up and draining the battery very rapidly. I thought the upgrade to Lollipop (Android 5) would fix that, but it just made things worse. After trying quite a few things, I started researching and found that an app called "Unified Daemon(EUR)" was using up a lot of battery power as well as network bandwidth.

Screenshot of the battery usage screen, listing "Unified Daemon(EUR)" among the top consumers

Searching for more information on the 'net, I came up with a forum thread describing this very issue. Apparently, the app's purpose is to update the weather forecast, news and stock tickers. I don't really use any of those, although I did have a weather symbol on the S Cover screen. How it could end up downloading hundreds of megabytes of data I have no idea; something must be very wrong with it.

To test whether it was indeed the culprit, I went into the settings and disabled the daemon in the App Manager yesterday. And indeed, today battery usage was back to normal: when I came home from work I had 50 % battery left, despite having placed a couple of phone calls, played some games and browsed some web sites. Had I tried that yesterday, the battery would either have run out long before, or I would have had to recharge it during the day.
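
For what it is worth, the same thing should be doable over adb instead of through the App Manager; I believe the package name is the one below, but verify it on your own phone first:

    adb shell pm disable-user --user 0 com.sec.android.daemonapp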

A replacement for the Opera IRC client

After transitioning from Opera to Vivaldi as my primary browser, one of the features I have been missing is the IRC client. Granted, I am not a heavy IRC user, but there is this one channel I monitor where some of my friends and former Opera colleagues hang out. I liked the simplicity of the Opera IRC client and I am not quite a fan of the terminal-based ones.

One of my friends pointed me towards WeeChat, an extensible chat client. In its basic configuration it runs in a terminal and looks like any old IRC client. However, it supports plug-ins that let it connect to many different systems (although so far I have only set up IRC), as well as relays, which make it possible to use other front-ends.

One such front-end is Glowing Bear, which is web-based. It connects to a WeeChat instance that has a relay set up. By default that relay is unencrypted, which is not very safe, but it does support SSL, and I found this wonderful guide describing how to set that up with a proper certificate. I configured that and dropped a copy of the Glowing Bear files onto a web site of my own (not really necessary, since the connection goes directly from the browser to WeeChat, but it is nice to know exactly what I am connecting to). With the certificate I got by following the guide above, I could also serve it over HTTPS.
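
The WeeChat side of it boils down to a handful of commands (the port, password and certificate path here are examples; the guide goes into the details):

    /set relay.network.password "somesecret"
    /set relay.network.ssl_cert_key "%h/ssl/relay.pem"
    /relay sslcertkey
    /relay add ssl.weechat 9001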

Now I have a replacement for the IRC client. Next I just need to replace the mail client…

Just a simple mail server installation

So, the mail server at work died on Wednesday. It was running Microsoft Exchange and died so utterly and completely that even several hours of premium support from Microsoft could not get it up and running again. Being one who comes in fairly early in the morning, and who already manages a few internal servers, I was asked to set up a new box using Linux or whatever.

Can’t be too difficult, huh?

Well, that depends. In this case, it needed to authenticate users against an Active Directory server and support the mail aliases set up in its user database. After a fair amount of googling around, I found a few guides that helped me along the way. I started out with iRedMail and continued by configuring it to talk to the Active Directory server. Never having worked with AD or Kerberos before, it took me quite some time to get Kerberos working (tip: have a look at what DNS thinks the domain name of the KDC is; in our case it was "BT.LOCAL", in all uppercase, and using anything else as the Kerberos realm only produced cryptic error messages).
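
In other words, the realm in /etc/krb5.conf has to match what DNS reports, uppercase and all. Something like this (the server name is a placeholder):

    [libdefaults]
        default_realm = BT.LOCAL

    [realms]
        BT.LOCAL = {
            kdc = ad-server.bt.local
            admin_server = ad-server.bt.local
        }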

I had some hurdles to overcome, but getting Postfix to authenticate against Active Directory's LDAP server was fairly easy once I a) had an unprivileged account that could do LDAP lookups (using the "Administrator" account for that does not work), and b) had reduced the LDAP query so that it would actually find the users I was looking for (tip: make a dump of the LDAP directory and look for the lowest common denominator among the lookup keys).
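
The lookup table ended up along these lines (a sketch with placeholder names and a simplified filter, not our exact query):

    # /etc/postfix/ldap-aliases.cf
    server_host      = ad-server.bt.local
    search_base      = dc=bt,dc=local
    bind             = yes
    bind_dn          = cn=lookup,cn=Users,dc=bt,dc=local
    bind_pw          = secret
    # Match both the primary address and any alias in proxyAddresses
    query_filter     = (&(objectClass=user)(|(mail=%s)(proxyAddresses=smtp:%s)))
    result_attribute = sAMAccountName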

Then there was the problem that Dovecot, which handles local mail delivery and IMAP/POP, could not read the mail it had stored in the mailboxes. It turned out that since I had set up Kerberos so that the AD users were available as Unix users, and had the recipient domain ("bt.local" from above) in "mydestination", Postfix would always run the LDA setuid to the receiving user. I had to remove the domain from mydestination and instead add it to the list of virtual domains for delivery to work.
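
In Postfix terms, the fix looks roughly like this (an excerpt, with the domain from above):

    # /etc/postfix/main.cf
    # The AD domain must NOT be listed in mydestination...
    mydestination = localhost
    # ...but be handled as a virtual mailbox domain instead
    virtual_mailbox_domains = bt.local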

All in all, it took me about a day and a half to get the thing set up. Not bad for a first time. I also set up Git to version-control all the important configuration files, so that I can track my future mistakes and revert to a working configuration.

Now to get the SMTP SASL configuration working…