Hypertext and software patents

While I was visiting my family in the UK recently, I went through some old paperwork. One of the documents was my undergraduate dissertation from 1989/1990. My project was to build a graphical hypertext browser and editor which could link existing text files. I’d been experimenting with hypertext on and off since the mid 80s, when I had learned about it from Ted Nelson’s book “Dream Machines”.

My browser used a two-pane interface resembling a frameset, with links in a list on the right of the document being viewed. It wasn’t terribly elaborate — after all, it was only an undergraduate project — but it’s worth remembering that this was before Tim Berners-Lee developed his first web browser, and several years before Netscape introduced framesets.

I’ve scanned the document to PDF and filed it away, mostly because there’s a chance it might one day be useful as a demonstration of prior art to invalidate a software patent. I don’t plan on publishing it here, because it really isn’t anything special; I’m just writing this note for the benefit of anyone who suddenly needs prior art around framesets, hyperlinks and extrinsic links and might want to contact me.

Security practices: experts vs non-experts

Google recently published the results of a survey of computer users to see what security practices they follow. The paper splits the users into two groups — security experts (people who deal with computer security as part of their job, like me), and non-expert users.

For each of the two groups, Google assembled a list of the top five security practices followed. You probably won’t be surprised to learn that the two groups had almost completely different lists.

I thought I’d write and explain why the experts make the choices they do, and what it is about the non-experts’ choices that makes them relatively poor ones.

First on the experts’ list is install software updates.

I do this. It’s part of my job, as a server sysadmin, but I do it at home as well. As soon as a software update is available which includes any bug fixes, I install it. This morning I upgraded WordPress, for example. In fact, I don’t just wait to be told about bug fixes — I look out for announcements of security issues on prominent tech news sites, and go find the fixes.

Most other people aren’t as diligent. My spouse has a bad habit of leaving software updates for weeks. Yes, updating often requires rebooting, but really, it’s important! The overwhelming majority of successful attacks exploit known security flaws for which a fix was already available but never installed. That’s why Microsoft is going to be requiring that users accept automatic updates for Windows 10.


Second on the experts’ security practices list is use unique passwords. This is absolutely crucial. Every few months a web site gets hacked and the passwords are stolen and leaked. If it happens to a site you use and you’ve used the same password everywhere, you have to hurry and change that password on every other site before some bad guy cross-references your accounts. Worst case for me, an attacker gets the password to the one site that was hacked.

I go a step further. For accounts where security is particularly important — like bank accounts — I use a unique user ID different from the one I use anywhere else. A criminal won’t be able to find out my bank login just because some forum got hacked.

Of course, using unique passwords means having too many passwords to memorize. That’s why an important step towards better online security is using some sort of password manager — whether it’s a paper notebook you keep in your office, an application, or a fancy online service like LastPass. That’s item 5 on the experts’ list.

Item 3 on the experts’ list is use two-factor authentication. I do this too, using the free open source FreeOTP. It implements TOTP, which is the Internet standard for time-dependent two-factor authentication. TOTP is the basis of Google Authenticator, so anywhere Google Authenticator works, FreeOTP works.


Two-factor authentication is pretty simple once you set it up. When you log in, you’re asked for the secret code. You run the app on your phone, and it shows you a 6-digit number, which you enter into the login form. That’s it, you’re done. The clever parts that make this good for security are:

  1. The 6-digit code changes every 30 seconds, and
  2. Each code only works once.
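The scheme is simple enough to sketch in a few lines of Python. Here’s an illustrative implementation of the TOTP algorithm (RFC 6238), using only the standard library; the secret shown is the RFC’s published test seed, not a real one:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32.upper())
    counter = int(for_time if for_time is not None else time.time()) // step
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: seed "12345678901234567890", time 59 -> "94287082"
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59, digits=8))
```

Your phone and the server both run this computation from a shared secret; because they agree on the clock, they agree on the code, and nothing secret ever crosses the network at login time.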

This immediately helps with all kinds of security problems:

  • A web site hack exposes my password? Not a big deal, without the code generator it’s useless, so there’s no hurry to change it.

  • I log in using someone else’s computer, and they have keylogger malware on it? Not a major problem — the code only works once, so the logged password and code can’t be used again.

  • Someone shoulder-surfs me and watches me type my password and sees the code number on my phone screen? Not a problem, the code number only works once, and when I’m done logging in I lock the phone so they can’t get a fresh number, even assuming they could hurry to use the information within a minute.

Two-factor authentication doesn’t prevent all attacks, of course, but it’s a massive improvement over just a login and password.

Unfortunately, there are a few sites that still don’t support TOTP — including Apple, eBay and PayPal. I don’t use SMS-based two-factor authentication because SMS isn’t reliable and doesn’t necessarily work if you’re overseas, making it all too easy to get locked out or to have to wait half an hour to log in.

Item 4 on the experts’ list is use strong passwords. You might wonder why it’s so far down the list. Well, using a strong password is less important than using a unique one. If you use unique weak passwords, well, it probably won’t be that big of a deal — most web sites limit the number of attempts to log in remotely or how quickly you can try again if you get the password wrong. The real danger of a weak password is that any hacker getting the encrypted password file will be able to crack the password immediately. As long as I use unique passwords, that only lets them get into my account on the site that was hacked.

Contrast that with the situation if you use the same strong password everywhere. Eventually a web site will be hacked that doesn’t properly encrypt its passwords, and the hackers will now have your password to every site you use. Much, much worse. So unique passwords are much more important than strong ones — not that I’m encouraging you to use weak passwords, by any means.

As an aside, you might be surprised by just how long and complicated a password has to be in order to count as strong. The bare minimum password length you should be using is 12 characters, and that’s assuming you use random combinations of upper- and lower-case letters, numbers and symbols.
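Generating a password like that is trivial with a password manager, or with Python’s `secrets` module; the 12-character figure corresponds to roughly 79 bits of entropy over the full printable-ASCII set:

```python
import math
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation  # 94 symbols

def random_password(length=12):
    """A uniformly random password drawn from the full 94-symbol printable set."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

entropy_bits = 12 * math.log2(len(ALPHABET))
print(random_password())           # random every run
print(f"{entropy_bits:.1f} bits")  # 78.7 bits
```

Use `secrets`, not `random` — the latter isn’t cryptographically secure.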

Now let’s go through the non-experts’ list of security practices, and consider why they aren’t very useful.

First on the non-expert list is to use antivirus software. I’m honestly not sure how important that is on Windows these days, but I’ll confess that I haven’t used Mac or Linux antivirus software in years. I haven’t heard of anyone I know getting a Mac or Linux virus in years either. Trojans, yes, but not viruses.

Second on the non-experts’ list is strong passwords, and I’ve already discussed why that’s less important than you think — but I’ll point out that it’s also trivial to use strong passwords anyway if you do what the experts do, and use a password manager.

Third on the non-experts’ list is to change passwords frequently. A few web sites seem to think this is good practice, but it usually isn’t.

First of all, if you have a strong password, there is nothing to be gained by changing it. If it’ll take a hundred years to crack it, changing it to another strong password gains you nothing.
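The “hundred years” figure is easy to sanity-check. A rough back-of-the-envelope estimate, assuming an offline attacker testing ten billion guesses per second (the hardware figure is illustrative, on the generous side):

```python
GUESSES_PER_SECOND = 10**10   # an aggressive offline cracking rig (illustrative)
combinations = 94 ** 12       # 12 random characters from a 94-symbol set
seconds = combinations / GUESSES_PER_SECOND
years = seconds / (60 * 60 * 24 * 365)
print(f"about {years:,.0f} years")  # on the order of 1.5 million years
```

Even if the attacker’s hardware were a thousand times faster, you’d still be looking at centuries, so rotating one strong random password for another buys you nothing.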

Even if you don’t have a strong password, there’s the downside that frequently changed passwords are harder to remember. That means people are more likely to write them down or leave them on a Post-It note stuck to the monitor.

But the big reason why I don’t change passwords often is that the real reason for changing them isn’t to make them harder to guess. Rather, it’s to limit the amount of time for which a stolen password can be used. If someone steals my password and I don’t change it for a year, they can use that password for a year to snoop on my account.

Except… I use two-factor authentication. But even ignoring that, most hackers today aren’t going to snoop quietly. If you read stories about victims of hacks, you’ll learn that today’s crooks generally steal a password and then immediately change it, locking the rightful owner out of the account while they steal whatever they’re after.

So password changing is really only protecting you against a particular class of attacker — say, the jealous ex-boyfriend who wants to read your e-mail, or the corporate competitor who wants to spy on your sales prospects. For things like your Amazon account or your bank account, that’s not the kind of attack you need to worry about.


Number 4 on the non-experts’ lists is to only visit web sites you know.

Well, that’s an easy guideline to evaluate — it’s completely useless. Consider all the well-known companies whose systems have been hacked: Home Depot, Target, Sony, AT&T, Neiman Marcus, Michaels, Yahoo!, eBay, Evernote, Apple, JPMorgan Chase, Snapsave… and that’s just a partial list of the hacks in 2014 alone!

Now consider how many of the rest of the sites you know and visit run ad banners, and ponder the fact that in the same year, Google’s DoubleClick ad network exposed millions of computers to malware.

No, restricting yourself only to familiar big name web sites isn’t going to do anything for your security. In fact, since the big names are big targets and seem to be doing a terrible job of protecting you, you’re probably safer visiting some obscure web forum nobody has heard of. In a 2012 study by Symantec, the most dangerous sites to visit were religious sites — you were safer visiting sleazy porn sites than the site of your well-known local church.

The final non-expert security practice is don’t share personal information. That’s not a terrible principle to follow, but it’s way down the list of effective security measures. The problem is that you need to share personal information with many companies in order to do business with them, and criminals don’t get your personal information by asking you to share it — they get it by hacking the database where it has been collected.

The most recent example is the data breach at the Office of Personnel Management. That’s the agency that performs 90% of US government background checks. So sorry, but you don’t get to decide whether to share your personal information with the OPM — at least, not if you want a government job. And that database got stolen because as a cost-saving measure, the OPM had decided to use offshore contractors to manage their systems — so as part of that deal, they handed root access to their systems to a Chinese national living in China and another person living in Argentina. Remember, this is the database that contains all the background check information for US spies.

Similarly, the Hollywood stars who had their personal lives exposed by the Sony hack didn’t really have any choice about whether to share information with Sony, if they wanted a career in Hollywood. And while you might have read that the “hack” was the work of North Korea, the evidence suggests that it was the work of Russian hackers tipped off by a disgruntled insider.

Finally, let me throw in a vital security practice which was missing from the experts’ list: backing up.

As more and more of our data gets stored in the cloud, or on Internet-connected systems, it becomes more and more likely that a security incident will result in data being wiped. Last year, a source code hosting system called Code Spaces was hacked, and the attackers wiped all of its data. Plenty of small web sites get hacked too, and forums get wiped.

WordPress is a notorious target; you don’t have to go far to find stories of sites being wiped by hackers. That’s why I keep backups of everything I post. Not just copies, either — rolling incremental backups which will let me roll back some number of days, in case it takes a while for me to notice that something is wrong. The backups are stored on a machine in my office, which you can’t get to from the web server.

So, back up your data. Use Apple Time Machine or something like it. Keep backups of everything you store in the cloud.
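A minimal sketch of the rolling-snapshot idea in Python, using only the standard library (the directory layout and retention count are illustrative; a real setup would use rsync hard-links or a proper backup tool):

```python
import shutil
from datetime import date
from pathlib import Path

def rolling_backup(source, dest_root, keep=7):
    """Copy `source` into a dated snapshot folder, keeping only the newest `keep`."""
    root = Path(dest_root)
    root.mkdir(parents=True, exist_ok=True)
    snapshot = root / date.today().isoformat()
    if snapshot.exists():
        shutil.rmtree(snapshot)  # re-running on the same day replaces that snapshot
    shutil.copytree(source, snapshot)
    # ISO dates sort lexically, so the oldest snapshots come first.
    for old in sorted(p for p in root.iterdir() if p.is_dir())[:-keep]:
        shutil.rmtree(old)
    return snapshot
```

Run something like this daily from cron (or Task Scheduler) against the directories you care about, and you can roll back up to a week when you notice something is wrong.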

Why the mobile web sucks

Over at The Verge, Nilay Patel writes:

I hate browsing the web on my phone.

I do it all the time, of course — we all do. Just looking at the stats for The Verge, our mobile traffic is up 70 percent from last year, while desktop traffic is up only 11 percent. That trend isn’t going back; phones are just too convenient, beckoning us to waste more and more of our time gazing at their ever-larger screens.

But man, the web browsers on phones are terrible. They are an abomination of bad user experience, poor performance, and overall disdain for the open web that kicked off the modern tech revolution. […]

Now, I happen to work at a media company, and I happen to run a website that can be bloated and slow. Some of this is our fault: The Verge is ultra-complicated, we have huge images, and we serve ads from our own direct sales and a variety of programmatic networks. Our video player is annoying. (I swear a better one is coming, for real this time.) We could do a lot of things to make our site load faster, and we’re doing them.

I couldn’t resist doing a quick analysis of page speed for that very article:

  • 22 seconds to fully load on desktop
  • 2.6MB of data
  • 81 separate JavaScripts
  • …plus 12 more JavaScripts blocked for XHR violations
  • …plus chartbeat.net trackers giving 503 errors (site over capacity)
  • …plus 3 Flash movies which would have bloated things up much further if I didn’t have Flash disabled

And yes, if I emulate a mobile device, it’s the same 2.6MB of data and 81 JavaScripts.

Gosh, I wonder why The Verge is miserable to read on mobile, eh?

Expecting mobile browser makers to magically solve your site’s performance problems is unrealistic. Mobile network data has high latency and (comparatively) low speed, and that’s inherent to the technology. You simply must engineer your web site with mobile browsers in mind.

For those who have ignored web performance, the current trend of more and more traffic coming from mobile browsers is going to lead to a painful reckoning.