Welcome to my web log. See the first post for an introduction. See the archive page for all posts, and comments for a feed of comments only. (There is an English language feed if you don't want to see Finnish.)


All content outside of comments is copyrighted by Lars Wirzenius, and licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License. Comments are copyrighted by their authors.


I gave a talk about the early days of Linux at the jubilee symposium arranged by the University of Helsinki CS department. Below is an outline of what I meant to speak about, but the actual talk didn't follow it exactly. You can compare these to the video once it comes online.

  • Linus and I met at uni, the only two Swedish-speaking new students that year, so we naturally migrated towards each other.
  • After a year away for military service, got back in touch in the summer of 1990.
  • C & Unix course fall of 1990; Minix.
  • Linus didn't think atime updates in real time were plausible, but I showed him; funnily enough, atime updates have been an issue in Linux until fairly recently, since they slow things down (without being particularly useful)
  • On Jan 5, 1991, he bought his first PC (i386 + i387 + 4 MiB RAM and a small hard disk); he had a Sinclair QL before that.
  • Played Prince of Persia for a couple of months.
  • Then wanted to learn i386 assembly and multitasking.
  • A/B threading demo.
  • Terminal emulation, Usenet access from home.
  • Hard disk driver, mistaking hard disk for a modem.
  • More ambition; announced Linux to the world for the first time.
  • First ever Linux installation.
  • Upload to ftp.funet.fi, directory name by Ari Lemmke.
  • Originally not free software, licence changed early 1992.
  • First mailing list was created and introduced me to a flood of email (managed with VAX/VMS MAIL and later mush on Unix).
  • I talked a lot with Linus about design at this time, but never really participated in the kernel work (partly because disagreeing with Linus is a high-stress thing).
  • However, I did write the first sprintf for the kernel, since Linus hadn't learnt about varargs functions in C; he then ruined it and added the comment "Wirzenius wrote this portably..." (add google hit count for wirzenius+fucked).
  • During 1992 Linux grew fast, and distros happened, and a lot of packaging and porting of software; porting was easier because Linus was happy to add/change things in the kernel to accommodate software.
  • A lot of new users during 1992 as well.
  • End of 1992 I and a few others founded the Linux Documentation Project to help all the new users, some of whom didn't come from a Unix background.
  • In fact, things progressed so fast in 1992 that Linus thought he'd release 1.0 very soon, resulting in a silly sequence of version numbers: 0.12, 0.95, 0.96, 0.96b, 0.96c, 0.96c++2.
  • X server ported to Linux; almost immediate prediction of the year of the Linux desktop never happening unless ALL the graphics cards were supported immediately.
  • Linus was of the opinion that you needed one process (not thread) per window in X; I taught him event driven programming.
  • Bug in network code, resulting in ban on uni network.
  • Pranks in the shared office room.
  • We released 1.0 in an event at the CS dept in March, 1994; this included some talks and a ritual compilation of the release version during the event.
Posted Mon Aug 22 15:00:43 2016

Today it is 23 years since Ian Murdock published his intention to develop a new Linux distribution, Debian. It is also about 20 years since I became a Debian developer and made my first package upload.

In the time since:

  • I've retired a couple of times, to pursue other interests, and then un-retired.

  • I've maintained a bunch of different packages, most importantly the PGP2 software in the 90s. (I now only maintain software for which I'm also upstream, in order to make jokes about my upstream being an unco-operative jerk, and my packager being unhelpful in the extreme.)

  • Got kicked off the Debian mailing lists for insulting another developer. Not my proudest moment. I was allowed back later, and I've tried to be polite ever since. (See also rule 6.)

  • I've been to a few Debconfs (3, 5, 6, 9, 10, 15). I'm looking forward to going to many more in the future. It's clear that seeing many project members at least every now and then has a very big impact on project cohesion.

  • I had a gig where I was paid to improve the technical quality of Debian. After a few months of bug fixing (which isn't my favourite pastime), I wrote piuparts in order to find new bugs. (I gave that project away many years ago, but it seems to still be going strong.)

  • I've almost run for DPL twice, but I'm glad I didn't actually do it. I've carefully avoided any positions of power or responsibility in the project. (I live in fear that someone will decide to nominate me for something where I'd actually have to make important decisions.)

    Not being responsible means I can just ignore the project for a while when something annoying happens. (Or retire again.) With such a large project, eventually something really annoying does happen.

  • Came up with the DEP process with Zack and Dato. I also ran the second half of the DEP5 process to get the machine-readable debian/copyright format accepted. (I'm no longer involved, though, and I don't think the DEP process is used much now.)

  • I've taught several workshops about Debian packaging, including online for Debian-Women. It's always fun when others "get" how easy packaging really is, despite all the effort that the large variety of tooling and random web pages put into obscuring the fundamental simplicity.

  • Over the years I've enjoyed many of the things developed within Debian (without claiming any credit for myself):

    • the policy manual, perhaps the most important technical achievement of the project

    • the social contract and Debian free software guidelines, unarguably the most important non-technical achievements of the project

    • the whole package management system, but especially apt

    • debhelper's dh, which made the work of packaging simple cases so easy it's nearly a no-brainer

    • d-i made me not hate installing Debian (although I think the time is getting ripe to replace d-i with something new; catch me in a talkative mood at a party to hear more)

    • Debian-Women made an almost immediate improvement to the culture of the larger project (even if there are still far too few women developers)

    • the diversity statement made me a lot happier about being a project member.

    I'd like to thank everyone who's worked on these and made them happen. These are important milestones in Debian.

  • I've opened my mouth in a lot of places over the years, which means a lot of people know of me, but nobody can actually point at anything useful I've actually done. Which is why when I've given talks at, say, FOSDEM, I get introduced as "the guy who shared an office with Linus Torvalds a long time ago".

  • I've made a number of friends via participation in Debian. I've found jobs via contacts in Debian, and have even started a side business with someone.

It's been a good twenty years. And the fun ain't over yet.

Posted Tue Aug 16 15:42:44 2016

I write free software and I have some users. My primary support channels are over email and IRC, which means I do not have direct access to the system where my software runs. When one of my users has a problem, we go through one or more cycles of them reporting what they see and me asking them for more information, or asking them to try this thing or that thing and report results. This can be quite frustrating.

I want, nay, need to improve this. I've been thinking about this for a while, and talking with friends about it, and here are my current ideas.

First idea: have a script that gathers as much information as possible, which the user can run. For example, log files, full configuration, full environment, etc. The user would then mail the output to me. The information will need to be anonymised suitably so that no actual secrets are leaked. This would be similar to Debian's package specific reportbug scripts.
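To make this concrete, here is a minimal sketch (in Python) of what such a gathering script might look like. The file paths, the program name, and the anonymisation rules are made up for illustration; a real script would know the program's actual configuration and log locations, and would need anonymisation rules suited to its own data.

    #!/usr/bin/env python
    # Hypothetical sketch of a "gather debug info" script. The paths and the
    # anonymisation rules are examples only, not taken from any real program.

    import os
    import re
    import subprocess
    import sys


    def anonymise(text):
        # Crude anonymisation: hide the home directory path and anything that
        # looks like a password assignment. Real rules would depend on the
        # program's own configuration format.
        text = text.replace(os.path.expanduser('~'), '$HOME')
        return re.sub(r'(?i)(password\s*[:=]\s*)\S+', r'\1REDACTED', text)


    def main():
        report = []

        report.append('=== Environment ===')
        for name in sorted(os.environ):
            report.append('%s=%s' % (name, os.environ[name]))

        report.append('=== Configuration ===')
        config = os.path.expanduser('~/.config/example/example.conf')  # made-up path
        if os.path.exists(config):
            with open(config) as f:
                report.append(f.read())

        report.append('=== Log ===')
        log = os.path.expanduser('~/.cache/example/example.log')  # made-up path
        if os.path.exists(log):
            with open(log) as f:
                report.append(f.read())

        report.append('=== Versions ===')
        report.append(subprocess.check_output(['uname', '-a']).decode().strip())

        sys.stdout.write(anonymise('\n'.join(report)) + '\n')


    if __name__ == '__main__':
        main()

The user would run this and mail me the output, after reading it through to check that nothing sensitive remains.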

Second idea: make it less likely that the user needs help solving their issue, by providing better error messages. This requires error messages to carry enough explanation that a user can solve their problem on their own. That doesn't necessarily mean a lot of text; it also means code that analyses the situation when the error happens, includes the details relevant to resolving the problem, and gives error messages that are as specific as possible. Example: don't just fail saying "write error", but make the code find out why writing caused an error.
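As a sketch of what I mean, here is roughly how a bare "write error" could be turned into a message that says what actually went wrong and what to do about it. The helper and its wording are illustrative only, not code from any of my programs.

    # Sketch: analyse a failed write and explain it, instead of just
    # saying "write error". Illustrative only.

    import errno
    import os


    def explain_write_error(exc, filename):
        # Map the most common causes of a failed write to specific advice.
        if exc.errno == errno.ENOSPC:
            st = os.statvfs(os.path.dirname(filename) or '.')
            free = st.f_bavail * st.f_frsize
            return ('Could not write %s: the filesystem is full '
                    '(only %d bytes free). Free some space and retry.'
                    % (filename, free))
        if exc.errno == errno.EACCES:
            return ('Could not write %s: permission denied. Check the '
                    'ownership and permissions of the directory.' % filename)
        if exc.errno == errno.EROFS:
            return 'Could not write %s: the filesystem is read-only.' % filename
        return 'Could not write %s: %s' % (filename, os.strerror(exc.errno))


    def save(filename, data):
        try:
            with open(filename, 'w') as f:
                f.write(data)
        except OSError as exc:
            raise SystemExit(explain_write_error(exc, filename))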

Third idea: in addition to better error messages, the software might provide diagnostic tools as well.

A friend suggested having a script that sets up a known good set of operations and verifies they work. This would establish a known-working baseline, or smoke test, so that we can rule out things like "the software isn't completely installed".
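A rough sketch of such a smoke test, with the program name and the checked operations as placeholders (the right checks depend on the actual software):

    # Sketch of a smoke-test script that verifies a known-good baseline.
    # The program name ("example") and the operations are placeholders.

    import shutil
    import subprocess
    import sys
    import tempfile


    def check(description, func):
        try:
            func()
        except Exception as e:
            print('FAIL: %s: %s' % (description, e))
            return False
        print('ok:   %s' % description)
        return True


    def main():
        ok = True
        # Is the program installed and on $PATH at all?
        ok &= check('program is installed',
                    lambda: subprocess.check_call(['example', '--version']))
        # Can it do the simplest possible real operation?
        tmp = tempfile.mkdtemp()
        try:
            ok &= check('trivial operation works',
                        lambda: subprocess.check_call(['example', 'init', tmp]))
        finally:
            shutil.rmtree(tmp)
        sys.exit(0 if ok else 1)


    if __name__ == '__main__':
        main()

If all the checks pass, the problem is more likely in the user's configuration or data than in the installation itself.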

Do you have ideas? Mail me (liw@liw.fi) or tell me on identi.ca (@liw) or Twitter (@larswirzenius).

Posted Tue Jul 19 14:57:36 2016

Warning: This blog post includes instructions for a procedure that can lead you to lock yourself out of your computer. Even if everything goes well, you'll be hunted by dragons. Keep backups, have a rescue system on a USB stick, and wear flameproof clothing. Also, have fun, and tell your loved ones you love them.

I've recently gotten two U2F keys. U2F is an open standard for authentication using hardware tokens. It's probably mostly meant for website logins, but I wanted to have it for local logins on my laptop running Debian. (I also offer a line of stylish aluminium foil hats.)

Having two-factor authentication (2FA) for local logins improves security if you need to log in (or unlock a screen lock) in a public or potentially hostile place, such as a cafe, a train, or a meeting room at a client. If they have video cameras, they can film you typing your password, and get the password that way.

If you set up 2FA using a hardware token, your enemies will also need to lure you into a cave, where a dragon will use a precision flame to incinerate you in a way that leaves the U2F key intact, after which your enemies steal the key, log into your laptop and leak your cat GIF collection.

Looking up information for how to set this up, I found a blog post by Sean Brewer, for Ubuntu 14.04. That got me started. Here's what I understand:

  • PAM is the technology in Debian for handling authentication for logins and similar things. It has a plugin architecture.

  • Yubico (maker of Yubikeys) have written a PAM plugin for U2F. It is packaged in Debian as libpam-u2f. The package includes documentation in /usr/share/doc/libpam-u2f/README.gz.

  • By configuring PAM to use libpam-u2f, you can require both password and the hardware token for logging into your machine.

Here are the detailed steps for Debian stretch, with minute differences from those for Ubuntu 14.04. If you follow these, and lock yourself out of your system, it wasn't my fault, you can't blame me, and look, squirrels! Also not my fault if you don't wear sufficient protection against dragons.

  1. Install pamu2fcfg and libpam-u2f.
  2. As your normal user, mkdir ~/.config/Yubico. The list of allowed U2F keys will be put there.
  3. Insert your U2F key and run pamu2fcfg -u$USER > ~/.config/Yubico/u2f_keys, and press the button on your U2F key when the key is blinking.
  4. Edit /etc/pam.d/common-auth and append the line auth required pam_u2f.so cue.
  5. Reboot (or at least log out and back in again).
  6. Log in, type in your password, and when prompted and the U2F key is blinking, press its button to complete the login.

pamu2fcfg reads the hardware token and writes out its identifying data in a form that the PAM module understands; see the pam-u2f documentation for details. The data can be stored in the user's home directory (my preference) or in /etc/u2f_mappings.

Once this is set up, anything that uses PAM for local authentication (console login, GUI login, sudo, desktop screen lock) will need to use the U2F key as well. ssh logins won't.

Next, add a second key to your u2f_keys. This is important, because if you lose your first key, or it's damaged, you'll otherwise have no way to log in.

  1. Insert your second U2F key and run pamu2fcfg -n > second, and press the second key's button when prompted.
  2. Edit ~/.config/Yubico/u2f_keys and append the contents of second to the line with your username.
  3. Verify that you can log in using your second key as well as the first key. Note that you should have only one of the keys plugged in at a time when logging in: the PAM module uses the first key it finds, so you can't test both keys plugged in at once.

This is not too difficult, but it is rather fiddly, and it'd be nice if someone wrote a tool to manage the list of U2F keys in a nicer way.
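For what it's worth, here is a rough, untested sketch of what such a helper might look like; it merely automates the manual second-key steps above (run pamu2fcfg -n, append its output to your line in ~/.config/Yubico/u2f_keys). Read it carefully and keep a backup of u2f_keys before trying anything like it.

    # Untested sketch: register an additional U2F key by appending the
    # output of "pamu2fcfg -n" to the current user's line in u2f_keys,
    # exactly as in the manual steps above. Keep a backup of the file.

    import os
    import subprocess

    U2F_KEYS = os.path.expanduser('~/.config/Yubico/u2f_keys')


    def add_key_for(user):
        print('Insert the new U2F key and press its button when it blinks...')
        # pamu2fcfg -n prints the key data without a leading username;
        # see the pam-u2f documentation for the exact output format.
        new_key = subprocess.check_output(['pamu2fcfg', '-n']).decode().strip()

        with open(U2F_KEYS) as f:
            lines = f.read().splitlines()

        for i, line in enumerate(lines):
            if line.startswith(user + ':'):
                lines[i] = line + new_key  # append, as in step 2 above
                break
        else:
            raise SystemExit('No line for %s in %s; register the first key '
                             'with pamu2fcfg -u%s first.' % (user, U2F_KEYS, user))

        with open(U2F_KEYS, 'w') as f:
            f.write('\n'.join(lines) + '\n')


    if __name__ == '__main__':
        add_key_for(os.environ['USER'])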

Posted Fri Jul 15 08:57:07 2016

For those who use my code.liw.fi/debian APT repository, please be advised that I've today replaced the signing key for the repository. The new key has the following fingerprint:

8072 BAD4 F68F 6BE8 5F01  9843 F060 2201 12B6 1C1F

I've signed the key with my primary key and sent the new key with signature to the key servers. You can also download it at http://code.liw.fi/apt.asc.

Posted Sun Jun 19 10:44:28 2016

In March we started a new company, to develop and support the software whose development I led at my previous job. The software is Qvarn, and it's fully free software, licensed under AGPL3+. The company is QvarnLabs (no website yet). Our plan is to earn a living from this, and our hope is to provide software that is actually useful for helping various organisations handle data securely.

The first press release about Qvarn was sent out today. We're still setting up the company and getting operational, but a little publicity never hurts. (Even if it is more marketing-speak and self-promotion than I would normally put on my blog.)

So this is what I do for a living now.



The development of the open source Qvarn Platform was led by QvarnLabs CEO Kaius Häggblom (left) and CTO Lars Wirzenius.

Helsinki, Finland 10.05.2016 – With Privacy by Design, integrated Gluu access management and comprehensive support for regulatory data compliance, Qvarn is set to become the Europe-wide platform of choice for managing workforce identities and providing associated value-added services.

Construction industry federations in Sweden, Finland and the Baltic States have been using the Qvarn Platform (http://www.qvarn.org) since October 2015 to securely manage the professional digital identities of close to one million construction workers. Developed on behalf of these same federations, Qvarn is now free and open source software, making it a compelling solution for any organization that needs to manage a secure register of workers’ data.

"There is something universal and fundamental at the core of the Qvarn platform. And that’s trust," said Qvarn evangelist Kaius Häggblom. "We decided to make it free, open source and include Gluu access management because we wanted all those using Qvarn or contributing to its continued development to have the freedom to work with the platform in whatever way is best for them."

Qvarn has been designed to meet the requirements of the European Union’s new General Data Protection Regulation (GDPR), enabling organizations that use the platform to ensure their compliance with the new law. Qvarn has also incorporated the principles of Privacy by Design to minimize the disclosure of non-essential personal information and to give people more control over their data.

"Today, Qvarn is used by the construction industry as a way to manage the data of employees, many of whom frequently move across borders. In this way the platform helps to combat the grey economy in the building sector, thereby improving quality and safety, while simultaneously protecting the professional identity data of almost a million individuals," said Häggblom. "Qvarn is so flexible and secure that we envision it becoming the preferred platform for the provision of any value-added services with an identity management component, eventually even supporting monetary transactions."

Qvarn is a cloud based solution supported to run on both Amazon Web Services (AWS) and OpenStack. In partnership with Gluu, the platform delivers an out-of-the-box solution that uses open and standard protocols to provide powerful yet flexible identity and access management, including mechanisms for appropriate authentication and authorization.

"Qvarn's identity management and governance capabilities perfectly compliment the Gluu Server's access management features," said Founder and CEO of Gluu, Michael Schwartz. "Free open source software (FOSS) is essential to the future of identity and access management. And the FOSS development methodology provides the transparency that is needed to foster the strong sense of community upon which a vibrant ecosystem thrives."

Qvarn’s development team continues to be led by recognized open source developer and platform architect Lars Wirzenius. He has been developing free and open source software for 30 years and is a renowned expert in the Linux environment, with a particular focus on the Debian distribution. Lars works at all levels of software development – from writing code to designing system architecture.

About the Qvarn Platform:

The Qvarn Platform is free and open source software for managing workforce identities. Qvarn is integrated with the Gluu Server’s access management features out of the box, using open and standard protocols to provide the platform with a single common digital identity and mechanisms for appropriate authentication and authorization. A cloud based solution, Qvarn is supported to run on both Amazon Web Services (AWS) and OpenStack. Privacy by Design is central to the architecture of Qvarn and the platform has been third party audited to a security level of HIGH.

http://www.qvarn.org

For more information, please contact:
Andrew Flowers
andrew.flowers@ellisnichol.com
+358 40 161 5668

Posted Tue May 10 13:16:49 2016

Today was my last day at Suomen Tilaajavastuu, where I worked on Qvarn. Tomorrow is my first day at my new job. The new job is for a new company, tentatively named QvarnLabs (registration is in process), to further develop and support Qvarn. The new company starts operation tomorrow, so you'll have to excuse me that there isn't a website yet.

Qvarn provides a secure, RESTful JSON HTTP API for storing and retrieving data, with detailed access control (and I can provide more buzzwords if necessary). If you operate in the EU, and store information about people, you might want to read up about the General Data Protection Regulation, and Qvarn may be a possible part of a solution you want to look into, once we have the website up.
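As a purely illustrative sketch, a client talking to a RESTful JSON HTTP API like this might look roughly as follows. The host name, resource path, fields, and token handling here are invented for the example; they are not Qvarn's actual API, so consult the real documentation once the website is up.

    # Illustrative only: the endpoint, fields, and authentication scheme
    # are made up; see the real Qvarn documentation for its actual API.

    import json

    import requests  # third-party HTTP library

    API = 'https://qvarn.example.com'   # made-up host
    HEADERS = {
        'Authorization': 'Bearer ' + 'access-token-obtained-elsewhere',
        'Content-Type': 'application/json',
    }

    # Create a resource by POSTing a JSON document...
    resp = requests.post(API + '/examples', headers=HEADERS,
                         data=json.dumps({'name': 'test resource'}))
    resp.raise_for_status()
    resource_id = resp.json()['id']

    # ...and read it back with GET.
    resp = requests.get(API + '/examples/' + resource_id, headers=HEADERS)
    resp.raise_for_status()
    print(resp.json())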

Posted Thu Mar 31 10:26:28 2016

In January and February of 2016 I ran an Obnam user survey. I'm not a statistician, but here is my analysis of the results.

Executive summary: Obnam is slow, buggy, and the name is bad. But they'd like to buy stickers and t-shirts.

Method

I wrote up a long list of questions about things I felt were of interest to me. I used Google Forms to collect responses, and exported them as a CSV file, and analysed based on that.

I used Google Forms, even though it is not free software, as it was the easiest service I got to work that also seemed it'd be nice for people to use. I could have run the survey using Ikiwiki, but it wouldn't have been nearly as nice. I could have found and hosted some free software for this, but that would have been much more work.

Most questions had free form text responses, and this was both good and bad. It was good, because many of the responses included things I could never have expected. It was bad, because it took me a lot more time and effort to process those. I think next time I'll keep the number of free text responses down.

For some of the questions, I hand-processed the responses into a more or less systematic form, in order to count things with a bit of code. For others, I did not, and I show the full list of responses instead (I'm lazy; we don't need a survey to determine that).

The responses

See http://code.liw.fi/obnam/survey-2016.html for the responses, after hand-processing.

For the questions for which it makes sense, a script has tabulated the various responses and calculated percentages. I haven't produced graphs, as I don't know how to do that easily. (Maybe next time I'll enlist the help of statisticians.)
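For the curious, the tabulation amounts to something like the following sketch; the file name and the column name are placeholders, not the actual survey export.

    # Sketch of the kind of tabulation used for the survey results: count
    # the answers in one column of the exported CSV and print percentages.
    # The file name and column name are placeholders.

    import collections
    import csv

    counts = collections.Counter()
    with open('survey.csv') as f:
        for row in csv.DictReader(f):
            counts[row['How did you hear about Obnam?'].strip()] += 1

    total = sum(counts.values())
    for answer, n in counts.most_common():
        print('%5.1f%%  %4d  %s' % (100.0 * n / total, n, answer))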

Conclusions

  • There were 263 responses in total. I have no way of knowing if the total number of Obnam users is about that, but the number correlates fairly well with the Debian popcon numbers, so I'm assuming Obnam has on the order of a few hundred users total.

    A larger number might be more impressive, but it'd also mean that I would be responsible for much more data loss if I make a horrible mistake. That said, it is probably time to start spending some effort on growing the developer base of Obnam.

  • People seem to hear about Obnam primarily from my blog posts, or by searching the web for backup software. Also, from the Arch Linux or Gentoo wikis, or Joey Hess.

  • People use Obnam mostly for personal machines, but also at work.

  • Those who have tried Obnam, but don't use it, rejected it primarily for speed or because it's unstable or buggy. I hope that the bad bugs have mostly been fixed, and I'm working on improving the speed.

  • People seem to use either the latest version, or the version included in the release of their operating system (e.g., Debian jessie). Other versions are relatively rare.

  • Most people started using Obnam in the past two years.

  • People use Obnam on a variety of Linux based operating systems, but also others. Obnam users are especially skewed towards Debian and Ubuntu, which is not surprising, as I'm involved in Debian and have been publicising it there, and provide packages for Debian myself.

  • About half the people have at least hundreds of thousands of files, containing hundreds of gigabytes of data. All extremes (very few or very many files, very little or very much data) are represented, though. A couple of people have at least a hundred million files, or at least ten terabytes of data.

  • Most people don't have a backup strategy, or at least not a documented one, and if they do, it's not regularly tested.

    This isn't a good thing.

  • Most people had backed up within the past week as of the time of filling in the survey. This hopefully indicates that they back up frequently. Only one respondent said they'd never backed up.

    Rather more people hadn't tested their backups, however, with about a fifth of the people having never tested their backup. This is also not good.

  • Most people only back up one machine to each repository, or at most a few. A total of 17 respondents reported that they don't have a backup, and do not fear clowns.

  • About half the people back up to a local drive, and nearly two thirds to an SFTP server.

    People ask for more remote storage options, such as support for services like Amazon S3.

  • The things people like most about Obnam are on its list of core features: de-duplication, encryption, and ease of use / simplicity. FUSE is also well-liked, as are snapshot backups.

  • I didn't tabulate the reasons why people don't like Obnam, but performance and stability seem to be the most common reasons. My favourite response to this question is "the name obnam, does not sounds like a backup program".

  • Speed is also the pet bug people seem to have.

  • People seem to generally find Obnam documentation adequate. There's room for improvement, of course.

  • Nearly everyone finds it easy to get help if they have a problem with Obnam, but almost no-one uses the Obnam support mailing list or IRC channel.

  • Some people read the NEWS file, others do not. Few have sent patches, but some would like to. There's a bunch of suggestions for new features.

    None of this is surprising to me, except perhaps that so many Obnam users actually do read the NEWS file, as it's been my experience in other projects that that's rare.

  • About half the people have heard of the green albatross. It's the name of the new way in which Obnam will be storing data on disk, which is a big factor in how fast or slow Obnam is. When the green albatross soars, Obnam will fly faster.

  • People use other backup software as well, which is sensible: no point in having all one's eggs in one basket. The top choices are rsync, duplicity, attic, and rsnapshot, but the list seems to mention most free backup software.

  • There's some interest in helping Obnam development, either by direct contributions, donations, paying for support or development, or by buying merchandise. Nearly no-one wants a printed version of the manual, but stickers and t-shirts might sell well enough.

    A lot of people don't really want to, or are not able to, contribute, especially not by doing things, and that's OK. (They did contribute, however, by filling in the survey.)

  • When given an opportunity to say whatever they want to Obnam developers, most people say "thank you" in some form or another. This was very heartwarming.

Posted Sun Mar 27 11:47:45 2016

After some serious thinking, I've decided not to nominate myself in the Debian project leader elections for 2016. While I was doing that thinking, I wrote the beginnings of a platform, below. I'm publishing it to have a record of what I was thinking, in case I change my mind in the future, and perhaps it can inspire other people to do something I would like to happen.

Why not run? I don't think I want to deal with the stress. I already have more than enough stress in my life, from work. I enjoy my obscurity in Debian. It allows me to go away for long periods of time, and to ignore any discussions, topics, and people that annoy or frustrate me, if I don't happen to want to tackle them at any one time. I couldn't do that if I was DPL.

NOT a platform for Debian project leader election, 2016

Apart from what the Debian constitution formally specifies, I find that the important duties of the Debian project leader are:

  • Inspire, motivate, and enable the Debian community to make Debian better: to be the grease that makes the machinery run smoother. It is not important that the DPL do things themselves, except to make sure other people can get things done.
  • Delegate what work can be delegated. The DPL is only one person, and should not be a bottleneck. Anything that can reasonably be delegated should be delegated.
  • Deal with mundane management tasks. This includes spending Debian money when it makes things easier, such as by funding sprints.
  • Represent the project in public, or find people to do that in specific cases.
  • Help resolve conflicts within the project.

I do not feel it is the job of the DPL to set goals for the project, technical or otherwise, any more than it is the job of any other member of the project. Such goals tend to come best from enthusiastic individual developers who want something and are willing to work on it. The DPL should enable such developers, and make sure they have what they need to do the work.

My plan, if elected

  1. Keep Debian running. Debian can run for a long time effectively on autopilot, even if the DPL vanishes, but not indefinitely. At minimum, the DPL should delegate the secretary and technical committee members, and decide on how money should be spent. I will make sure this minimum level is achieved.

  2. While I have no technical goals to set for the project, I have an organisational one. I believe it is time for the project to form a social committee whose mandate is to step in and help resolve conflicts in their early stages, before they grow big enough that the DPL, the tech-ctte, listmasters, or the DAM needs to be involved. See below for more details on this. If I am elected, I will do my best to get a social committee started, and I will assume that any vote for me is also a vote for a social committee.

Social committee

(Note: It's been suggested that this is a silly name, but I haven't had time to come up with anything better. I already rejected "nanny patrol".)

We are a big project now. Despite our reputation, we are a remarkably calm project, but there are still occasional conflicts, and some of them spill out into our big mailing lists. We are not very good, as a project, in handling such situations.

It is not a new idea, but I think its time has come, and I propose that we form a new committee, a social committee, whose job is to help de-escalate conflict situations while they are still small conflicts, to avoid them growing into big problems, and to help resolve big conflicts if they still happen.

This is something the DPL has always been doing. People write to the DPL to ask for mediation, or other help, when they can't resolve a situation by themselves. We also have the technical committee, listmasters, GRs, and the expulsion process defined by the DAM. These are mostly heavy-weight tools and by the time it's time to consider their use, it's already too late to find a good solution.

Having the DPL do this alone puts too much pressure on one person. We've learnt that important tasks should generally be handled by teams rather than just one person.

Thus, I would like us to have a social committee that:

  • is reasonably small, like the technical committee
  • attempts to resolve conflicts at an early stage
  • is delegated by the DPL to have authority to do this
  • doesn't necessarily have direct authority to remove people from mailing lists or IRC channels, or to expel them from the project, but has the authority to suggest such measures to the appropriate team
  • may do other things, such as educate people on how to resolve conflicts constructively themselves, or deal with chronic conflicts in addition to acute flare-ups

About me

I've been a Debian developer since 1996. I've retired twice, while I spent large amounts of time on other things. I haven't been a member of any important team in Debian, but I've been around long enough that I know many people, and have a reasonable understanding of how the project works.

Posted Sat Mar 12 07:39:55 2016

At Debconf15 I gave a talk on topic of having backups be a default service on Debian machines. In that talk, I proposed that we create infrastructure to be included in a default Debian install to manage backups.

I still think this is a good idea, but over the past several months, I've had nearly no time at all to actually do this.

I'm afraid I have to say now that I won't be able to work on this in time for the stretch release. I would be very happy for others to do that, however.

The Debian wiki page https://wiki.debian.org/Backup acts as a central point of information for this. If you're interested in working on this, you can just do it.

Posted Sun Jan 31 10:50:17 2016

For more, see the archive.