Countering Trusting Trust through Diverse Double-Compiling, ACSAC 2005
Something new: I have a section about my work to counter the “Trusting Trust” computer security attack. The “Trusting Trust” attack is a very old and incredibly nasty attack in computer security. Karger and Schell published information about this attack in 1974, and Ken Thompson (of Unix fame) made it much more widely known in 1984 in his Turing award speech “Reflections on Trusting Trust.” Ken Thompson even demonstrated it; he gained complete control over another system, and that system’s owners never detected the subversion. Up to now it’s been presumed that the “Trusting Trust” attack is the essential uncounterable attack.
What exactly is the trusting trust attack? Basically, if an attacker can get a Trojan Horse into the binary of a compiler, at any time, you’re essentially doomed. The subverted compiler can subvert itself, indefinitely into the future, as well as anything else it compiles.
I’ve worried about this attack for a long time, essentially since Thompson made his report. If there’s a known attack that cannot be effectively countered, even in theory, should we really be using computers at all? My hope is that my work in this area aids the computer security field writ large.
The reason I note this in my blog is that I’ve finally formally published my paper that describes a technique for countering this attack. The paper is Countering Trusting Trust through Diverse Double-Compiling (DDC), and it was published by ACSAC 2005. Here’s a local copy, along with more info and material. Here’s the abstract of that paper:
An Air Force evaluation of Multics, and Ken Thompson’s famous Turing award lecture “Reflections on Trusting Trust,” showed that compilers can be subverted to insert malicious Trojan horses into critical software, including themselves. If this attack goes undetected, even complete analysis of a system’s source code will not find the malicious code that is running, and methods for detecting this particular attack are not widely known. This paper describes a practical technique, termed diverse double-compiling (DDC), that detects this attack and some compiler defects as well. Simply recompile the source code twice: once with a second (trusted) compiler, and again using the result of the first compilation. If the result is bit-for-bit identical with the untrusted binary, then the source code accurately represents the binary. This technique has been mentioned informally, but its issues and ramifications have not been identified or discussed in a peer-reviewed work, nor has a public demonstration been made. This paper describes the technique, justifies it, describes how to overcome practical challenges, and demonstrates it.
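The core of DDC is simple enough to sketch. Below is a minimal, hypothetical illustration in Python; the compiler names, paths, and command-line convention are invented for the example, and it assumes the compilations are deterministic (the paper discusses the real-world complications, such as controlling the environment and handling non-deterministic output):

```python
# Hypothetical sketch of diverse double-compiling (DDC).
# Assumptions: compilers are invoked as "<compiler> <source> -o <output>",
# and compilation is deterministic (same input -> bit-identical output).
import filecmp
import subprocess

def compile_with(compiler, source, output):
    """Compile 'source' with the given compiler binary, producing 'output'."""
    subprocess.run([compiler, source, "-o", output], check=True)

def ddc_check(untrusted_binary, compiler_source, trusted_compiler):
    # Stage 1: compile the compiler's own source with a second, trusted compiler.
    compile_with(trusted_compiler, compiler_source, "stage1")
    # Stage 2: compile the same source again, this time using the stage-1 result.
    compile_with("./stage1", compiler_source, "stage2")
    # If stage 2 is bit-for-bit identical to the untrusted binary, the source
    # code accurately represents that binary (no hidden self-perpetuating Trojan).
    return filecmp.cmp("stage2", untrusted_binary, shallow=False)

if __name__ == "__main__":
    if ddc_check("./cc-untrusted", "cc.c", "/usr/bin/trusted-cc"):
        print("bit-for-bit match: source accurately represents the binary")
    else:
        print("mismatch: possible subversion or compiler defect")
```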
I just got back from the ACSAC 2005 computer security conference. Several interesting papers there, on a variety of topics.
An aside: At ACSAC 2005, Aleks Kissinger (from the University of Tulsa) also presented work that he and I had done on micro-tainting. This was the presentation “Fine-Grained Taint Analysis using Regular Expressions,” which was part of the Works in Progress. Basically, we noted that instead of assigning “taint” to a whole value, such as a string, you could assign taint on subcomponents, such as each character. Then you could assign rules that identified the input paths and what could come in — typically zero or more tainted characters — and rules on output paths. We concentrated on defining regular expressions for what is legal, though any other notation for patterns, such as BNFs, would be fine too. We noted that you could then check statically or dynamically. For the static case, when you work backwards, if the check “fails” you can even trivially derive the input patterns that cause security failures (and from that information it should be easy to figure out how to fix it). Aleks has recently made some good progress by transforming the regular expressions into DFAs. There was another ACSAC presentation on doing taint analysis with Java, but that used the traditional “whole variable” approach found in many languages, through which many vulnerabilities slip by. We hope this micro-tainting approach will lead to improved tools for detecting security vulnerabilities in software, before that software is delivered to end-users.
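To make the character-level idea concrete, here is a minimal, hypothetical sketch in Python. The helper names and the purely dynamic check are my own illustration (the actual work also covered static, backwards analysis and DFA-based checking, which this sketch omits):

```python
# Hypothetical sketch of character-level ("micro") tainting.
# Each character carries its own taint bit; an output rule is a regular
# expression that every maximal run of tainted characters must match.
import re

def tainted(s):
    """Mark every character of untrusted input as tainted."""
    return [(ch, True) for ch in s]

def trusted(s):
    """Program-supplied text: every character untainted."""
    return [(ch, False) for ch in s]

def check(chars, tainted_run_pattern):
    """Dynamically verify an output path: each run of tainted characters
    must match the regular expression describing what is legal there."""
    run = ""
    for ch, is_tainted in chars + [("", False)]:   # sentinel flushes the last run
        if is_tainted:
            run += ch
        elif run:
            if not re.fullmatch(tainted_run_pattern, run):
                return False
            run = ""
    return True

# Output rule for a SQL string literal: tainted runs may not contain a quote.
legal = r"[^']*"
ok  = trusted("SELECT * FROM users WHERE name = '") + tainted("alice") + trusted("'")
bad = trusted("SELECT * FROM users WHERE name = '") + tainted("alice' OR '1'='1") + trusted("'")
print(check(ok, legal), check(bad, legal))   # True False
```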
path: /security | Current Weblog | permanent link to this entry
A relative gave me an old laptop with Windows 98, which led me to the question: can you take a castoff laptop and, with zero or very little money, improve it so it’s more useful and can run more “modern” software? The answer is: yes! In fact, this turned into a little game/project for me, and I learned a few things along the way. So I wrote down what I decided to do, in the hopes that you may find these ideas useful for reviving an old laptop yourself:
I think it’s a shame that older machines sometimes rot in closets instead of helping people, and I hope that this document will help change that. With a little elbow grease (and adjusted expectations), you can still get mileage out of an older laptop.
I talk about getting new hardware (keep it cheap!), buying a wireless card, making backups and moving Windows to a new disk, installing GNU/Linux, updating a Windows 98 system, and making a boot floppy for Windows.
path: /misc | Current Weblog | permanent link to this entry
Ever since Richard Stallman wrote his article Free But Shackled - The Java Trap, many developers have avoided using Java. Why? At the time, there was no practical way to deliver fully free-libre / open source software (FLOSS) using Java while still being fully functional. Not because it was illegal to have a FLOSS Java implementation, but simply because the FLOSS tools and libraries weren’t available.
But things have been moving quickly; many developers have been working hard to develop an implementation of Java that doesn’t depend on proprietary software. The problem is that there hasn’t been a simple way to understand what’s going on — unless you’re an “insider”.
Thankfully, that’s changed. Escaping the Java Trap: A practical road map to the Free Software and Open Source alternatives is a simple 3-page summary that surveys the many different FLOSS projects that are building, testing, and distributing a complete FLOSS Java implementation (including mountains of libraries). As the roadmap notes, “Important large applications like JOnAS, OpenOffice.org 2, Eclipse 3 and Tomcat 5 are known to work. This document provides a road map of the various projects; how they work together, where they are, where they’re going, and how we make sure that they work well and are compatible.”
This is the roadmap I noted earlier as part of my FISL 2005 travelogue. Although I helped the other authors write it, I really operated as a ghost writer rather than speaking with my own voice. Basically, I really wanted to know what the state of FLOSS Java implementations was, and I was fortunate to be able to talk with the top experts at FISL. I promised them if they told me about the various parts, I would in turn help them describe it in a simple way. So the material is really all theirs — I was just lucky enough to be the first recipient of it.
Other articles help give more perspectives on the topic, too. The state of Java on Linux by Tom Tromey has some interesting material, for example. But I know of no other document that gives such a wide overview of how a full FLOSS implementation of Java (TM) is getting built, tested, and distributed.
Again, take a peek: Escaping the Java Trap: A practical road map to the Free Software and Open Source alternatives
path: /oss | Current Weblog | permanent link to this entry
There has been a lot of virtual ink spent on OpenDocument accessibility. I’ve written up a short essay on OpenDocument accessibility, where I point to some other resources that talk about OpenDocument accessibility, and point out that there are lots of ways to get it. For a vast number of cases, products that natively support OpenDocument do just fine today. For some cases, just use Microsoft Office with an OpenDocument plug-in; you already have to use a third-party plug-in to add accessibility in those cases, so saying that you can’t add a third-party plug-in for OpenDocument as well is simply hypocritical.
I also post a lengthy letter from Wesley Parish, who is disabled and yet is a strong supporter of OpenDocument. The article has more, but here are a few quotes: “It is necessary for the disabled to have access to all government information relevant to them, in a file format that is readily available for as many different applications as wish it, one that does not insist that one jump through licensing hoops in order to implement it, one that can be readily extended in the future according to need - and one that can not be used as an excuse by lazy bureaucrats to deny me my rights! The question currently buzzing in Massachusetts is, “Does Open Document Format limit accessibility?” For myself, I find it does not. [In Computer Science] I found one of the most persistent concepts was a strict separation between data and executable code. ODF provides that strict separation, defining data separately from the code. … An open specification that allows ANYONE to implement accessibility solutions is the way to solve the problems of access by the blind and other disabled. Otherwise, government data will be tied to specific programs and NOT accessible to all, and in time, NOT accessible at all.”
So go take a peek at my short essay on OpenDocument accessibility.
path: /misc | Current Weblog | permanent link to this entry
November 2005 release of “Why OSS/FS? Look at the Numbers!”
It’s November, and I’m putting out another release of “Why Open Source Software / Free Software (OSS/FS, FLOSS, FOSS)? Look at the Numbers!” This paper continues to provide “quantitative data that, in many cases, using open source software / free software (abbreviated as OSS/FS, FLOSS, or FOSS) is a reasonable or even superior approach to using their proprietary competition according to various measures. This paper’s goal is to show that you should consider using OSS/FS when acquiring software.”
The big news is that I’m releasing a presentation based on this report. The presentation is at http://www.dwheeler.com/numbers — and you can use it as-is or as the starting point for your own presentations. The presentation is being released in two formats, PDF (for reading) and OpenDocument (for presenting or editing). I’m hoping that many other people will be willing to create translations of this presentation. The presentation is much smaller, and thus much easier to translate, than my thorough (but much larger) work.
I’ve made a number of changes since May as well. Here are some of the highlights:
Were I to start now, I think I’d use the term “FLOSS” (Free-Libre / Open Source Software) as my all-encompassing term, so I mention that at the beginning. FLOSS is much easier to say than some of the alternatives. The term “Free Software” is widely misunderstood as being “no cost”, so by itself I find that it’s not helpful for explaining things. The term Free-Libre is a big improvement because it at least hints at what its promulgators intended the term to mean. However, I’ve used the term OSS/FS all over, and it’s awkward to change now (and people might not find the document they were looking for), so I haven’t changed my own documents.
Enjoy!
path: /oss | Current Weblog | permanent link to this entry
Internet Explorer: So insecure, it’s only safe 7 days a year?!?
I recently learned some amazing — unbelievable — shocking data. It turns out that there were only 7 days in 2004 that you could have somewhat safely used Internet Explorer (it was October 12-17), even assuming that attackers only used publicly-known attacks, and that you were only worried about the worst kind of attacks. What does that mean? Let me set the stage first… and I’ll conclude what to do at the end.
In my article how to secure Microsoft Windows (for home and small business users), I give advice for people who want to keep using Windows but have some security. One piece of advice: stop using some of the most vulnerable programs, such as Internet Explorer (IE) and Outlook, and instead use more secure alternatives (such as the freely-available Firefox and Thunderbird). It should be self-evident that replacing insecure programs with more secure programs will make you more secure! But let me deal with two complaints: (1) why should I change, and (2) is Internet Explorer (IE) really so much worse?
First - why should you change to using more secure software? Because if you’re not willing to select a more secure program, then you are part of the problem — you are causing everyone to have insecure programs, as well as causing your own misfortune. Why? Because vendors will not make secure products unless customers prefer them. “The marketplace” decides what’s successful, and you are part of it. I’m tired of hearing “my machine is full of spyware”; if you chose to use a product that is known to have that problem, then you need to accept the consequences of your choices. You can’t claim ignorance at this point; the news has been circulating for a long time. Sure, the attackers should be convicted. But since there are prowlers in the alleyway, please don’t invite them into your house, and then act surprised when they take the silverware. Yes, you can’t spend all your time on securing things, and you need to have useful (not just secure) products, but it’s easy to replace these programs with perfectly good alternatives.
And second — IE really is worse. This isn’t just a random opinion, and it’s not Microsoft-bashing. There is lots of evidence that, in particular, Internet Explorer has become a malware delivery system. See, for example, David Hammond’s comments on Internet Explorer.
But I’m blown away by one particular study I just learned about, which shows the problem is even more serious than I thought. Scanit’s Browser Security Test group, in its report “A Year of Bugs”, analyzed the vulnerability reports in 2004 for three popular browsers: Microsoft’s Internet Explorer, Mozilla-based browsers (including Firefox and Netscape), and Opera. Since not all vulnerabilities are equal, they only considered the especially dangerous “remote code execution” vulnerabilities, i.e., defects that allow a “malicious web page or e-mail message to execute arbitrary code or OS commands on the viewer’s computer.” They then compared the time from the “public announcement of the vulnerability to the time when the fix is available to the general user population.” They had an incredibly simple metric: every day there’s a publicly-known vulnerability, for which there is no patch available from the vendor, is an unsafe day. That’s a metric anyone can understand: how many days are you vulnerable to the worst attacks that are (1) known worldwide but (2) there’s nothing you can do about it?
Their results: there were only 7 days Internet Explorer was safe to use in the entire year of 2004. That means that 98% of the year, Internet Explorer was not safe to use. Is it any wonder people like me say “please don’t use it”?
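(As a quick arithmetic check of that figure, a couple of lines suffice; 2004 was a leap year:)

```python
# 2004 had 366 days; only 7 of them had no known, unpatched
# remote-code-execution vulnerability in Internet Explorer.
unsafe_days = 366 - 7
print(f"{unsafe_days / 366:.0%}")   # 98%
```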
Let me quote their study: “there was only one period in 2004 when there were no publicly known remote code execution bugs - between the 12th and the 19th of October - 7 days in total.” That means that someone who diligently kept their installation patched every day of the year (do you install the latest patches every day?) was still known to be vulnerable 98% of the time in 2004. The ridiculous excuse “well, it wasn’t exploitable” doesn’t work, either; they found that for “200 days (that is, 54% of the time) there was a [known] worm or virus in the wild exploiting one of those unpatched vulnerabilities.” And that only counts known attacks. Frankly, 2004 was a disturbing year for IE; at the beginning of the year there were two known unpatched vulnerabilities, and 2004 ended with an “unpatched HTML Help ActiveX control vulnerability and [the worm] Trojan.Phel using it to install a backdoor.” And remember, these are only the publicly-known attacks, of the worst kind.
Now let’s not let alternatives off the hook; Mozilla-based programs and Opera had unsafe days too. But compared to IE’s “98% unsafe” value, Opera had unsafe days only 17% of the time, and Mozilla/Firefox was only unsafe 15% of the time (and about half of that 15% only affected MacOS users). Let’s look at the details:
On June 28, 2004, Microsoft’s Bill Gates told Australians that while other operating system vendors took 90-100 days to release a security patch, Microsoft had this time “down to less than 48 hours.” And Microsoft has clearly stated that IE is part of their operating system. Yet ZDNet found that Microsoft had failed to fix a critical known IE vulnerability for nearly nine months. Things got so bad that in late June 2004, the U.S. Department of Homeland Security’s Computer Emergency Readiness Team (CERT) recommended using browsers other than Microsoft Corp.’s Internet Explorer (IE) for security reasons. (That’s not exactly how they officially worded it… but I think many people correctly realized that that was the subtext). And even after all that, IE still had unpatched vulnerabilities of the worst kind through most of the rest of the year.
Let me throw in an aside about reporting vulnerabilities. Some companies try to convince vulnerability reporters to “keep quiet” until they fix the problem… and then just never fix it. The vulnerability is still there, though it’s officially not publicly known… and if one person can find it, others will too. That head-in-the-sand approach used to be common, but our systems are just too important to allow that to continue. That’s why I think it’s a good idea for vulnerability reporters to give suppliers 14 days to fix the problem, with a few more days if there’s a really good reason to allow unusual delays. Fourteen days should be more than enough time to fix a critical problem in the vast number of cases, but it puts the supplier on notice that leaving its customers permanently vulnerable to a known weakness is unacceptable. Certainly 30 days should be plenty for even complex problems. If your supplier can’t normally turn around patches for critical fixes in 14 days or less — and certainly by 30 days — perhaps you need a new supplier. Gates says 48 hours is enough, half of the Mozilla problems had one-day turnaround times, and all the Mozilla problems (even the complex ones) were fixed within 30 days of a confirming report.
I will say, with relief, that Microsoft is finally going to release a new version of Internet Explorer, with some attempt at fixing the security problems. But the reports worry me. CERT’s July 2, 2004, notification noted some of the major design decisions that make Internet Explorer so easy to exploit: “There are a number of significant vulnerabilities in technologies relating to the IE domain/zone security model, the DHTML object model, MIME type determination, and ActiveX.” Yet everything I’ve read suggests that they will not fundamentally change all of these major design decisions, so at least some of their fundamental design weaknesses will probably still be there. Disabling ActiveX completely by default for all sites would help, for example: the “zone” model doesn’t work (it’s too easy to fool), a massive number of signed or pre-installed ActiveX components are vulnerable, and people just click “OK” whenever yet another ActiveX component is sent, to the point that ActiveX has become a synonym for “send me malware”. I really hope that IE is much more secure, but we’ll see. The past does not necessarily predict the future… but it’s usually a good way to bet.
And the next version of Internet Explorer will still not support the Internet standards. This was reported by Paul Thurrott in Windows IT Pro, among others. So many standards-compliant web sites will still be inaccessible to Internet Explorer users.
But even worse… the next version of Internet Explorer is only going to go to XP Service Pack 2 users. Microsoft has various excuses for this. That’s ridiculous; why does everyone else, who already paid for Internet Explorer, have to suffer? Unless you pirated Windows, Internet Explorer was part of the purchase price of your machine or included in a separate license; it’s actually possible you paid for Windows several times. But most Microsoft Windows users don’t use XP Service Pack 2; even many XP users haven’t installed Service Pack 2 because of the legion of incompatible changes and machine lockups it caused many users. A vast number of people do not have Windows XP; Windows 2000 is in widespread use, and even Windows 98/ME have significant use (25% by some measures). It’s not true that a secure browser requires Service Pack 2; other browser makers manage it.
Don’t use the current versions of Internet Explorer normally, and wait a number of months before thinking about using the new version. In particular:
Note: I don’t make any money no matter what web browser or operating system you choose. I suggest preferring advice about this topic from others who can say the same. And obviously I’m speaking only for myself, not anyone else, though it’s clear that many, many others have come to the same conclusions.
path: /security | Current Weblog | permanent link to this entry
Travelogue Available of International Free Software Forum (FISL), Brazil
As I noted earlier, I spoke at the “6th International Free Software Forum” / Fórum Internacional Software Livre (FISL). The conference was 1-4 June, 2005.
I’ve just posted my travelogue of the 6th International Free Software Forum in Porto Alegre, Brazil. If you didn’t get to go, this may give you an idea what it was like. I also try to make some observations along the way, which hopefully you’ll find interesting. For example, I comment about Brazil’s relationship with open source software / Free software (OSS/FS), which I found very interesting. I also try to explain how I ended up helping to document the complicated inter-relationships between some of the many OSS/FS Java projects.
path: /oss | Current Weblog | permanent link to this entry
I’ll be speaking 3 June at the International Free Software Forum (FISL), Brazil
I’ll be speaking at the “6th International Free Software Forum” in Porto Alegre, Brazil. Its Portuguese name is 6° Fórum Internacional Software Livre, so this is abbreviated as “FISL”. The conference itself is 1-4 June, 2005.
I’ll be speaking on June 3, 2005, from 17:15-18:15. I’ll be presenting in room 41A (in the 41 building). This is their biggest room, with a 1,000-person capacity and simultaneous translation of sessions. That may sound like a lot, but as of May 27 there were 3,180 attendees registered, and I’m sure there will be more at the door. So if you’re interested, please come early!
My presentation will summarize my work, Why Open Source Software / Free Software (OSS/FS)? Look at the Numbers! Here’s the summary:
“The goal of this lecture is to convince you to consider using OSS/FS when you’re looking for software, using quantitative measures. Some sites provide a few anecdotes on why you should use OSS/FS, but for many that’s not enough information to justify using OSS/FS. Instead, this paper emphasizes quantitative measures (such as experiments and market studies) to justify why using OSS/FS products is in many circumstances a reasonable or even superior approach.”
I hope to see you there!
path: /oss | Current Weblog | permanent link to this entry
My long-time boss, mentor, and friend, Dr. Dennis W. Fife, passed away on April 28, 2005. His funeral was held on May 2, 2005.
Dennis was a good man, who dearly loved truth. On many occasions he spoke the truth to those he worked for, even when the truth was very unpopular. Yet he never did this maliciously; he always did this gently, trying to help people move to a more productive path.
He was smart, and enjoyed learning new things. He wrote documents on approaches to managing software development that were important in their time; many of his ideas are still applied today (though not always acknowledged!). He later helped many others apply computing to solve real problems. Indeed, he was always willing to share his knowledge with others, including the mentoring of many who are very grateful for his guidance. When asked a question, he’d think about it for a moment, and afterwards he’d often reply with a helpful insight into the situation.
Dennis also had a dry, subtle wit that many of us grew to love. Dennis enjoyed comics like the “Far Side”, with its often twisted view of the world. He would often say subtle things with a twinkle in his eye… it might take you a moment to get it, and then you’d laugh out loud.
I will greatly miss him.
path: /misc | Current Weblog | permanent link to this entry
Trend: Simple, readable text markup languages
Here’s a new(?) trend that shows that everything old really is sometimes new again. What’s the trend? Simple, highly readable markup languages.
In some situations, typical document formats (such as OpenDocument or Word .doc format) simply don’t work well. This includes massive collaboration over an internet, or creating relatively simple/short documents. Although existing markup languages like DocBook, HTML/XHTML, LaTeX, and nroff/man all work, they’re often complicated to write and read. You could use SGML or XML to create your own markup language, but that doesn’t really address the need for simplicity. None of these work very well if you expect to have many users who don’t really understand computers deeply (HTML comes closest, but complicated HTML documents become unreadable in a hurry).
Thus, there’s been a resurgence of new markup languages that are really easy to read and write, which can then be automatically translated to other formats. Two especially capable examples of this trend seem to be AsciiDoc and MediaWiki:
The various Wiki languages, such as MoinMoin’s, etc., are also examples of this. But there are a lot of different ones, all incompatible. Here’s some text on StructuredText, ReStructuredText, and WikiText. Many Wiki languages use CamelCase to create links, unfortunately; a lot of people (including me) find that convention ugly and awkward (MediaWiki dumped CamelCase years ago; MediaWiki internal links look like this: [[MediaWiki link]]). Most Wiki languages are too limiting for wider use.
No doubt there are others. One I learned about recently is Markdown. Markdown is a notation for simply writing text and generating HTML or XHTML; it seems to be focused on helping bloggers.
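As a tiny illustration of the write-readable-text, generate-HTML workflow that these tools share, here is a sketch using Python and the third-party markdown package (an assumption for the example; any of the tools above would serve equally well):

```python
# Convert a small, readable Markdown document to HTML.
# Requires the third-party package:  pip install markdown
import markdown

text = """# Trip report

Markdown lets you write *readable* plain text and still get:

* headings,
* emphasis, and
* lists.
"""

print(markdown.markdown(text))   # prints the corresponding HTML fragment
```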
Anyway, it’s an interesting trend! I’ve created a new essay about this at http://www.dwheeler.com/essays/simple-markup.html; if I learn about interesting new links related to this, I’ll add them there.
path: /misc | Current Weblog | permanent link to this entry
Updated: Comments on OSS/FS Software Configuration Management (SCM) Systems
I’ve updated my paper Comments on Open Source Software / Free Software (OSS/FS) Software Configuration Management (SCM) Systems. This is basically a review of several of these systems. I can’t possibly look at them all, but I intend for it to be a useful place to start. Given the recent issues with BitMover (maker of BitKeeper), causing Linus Torvalds to look at other SCM tools to manage Linux kernel development, this seems pretty timely.
path: /oss | Current Weblog | permanent link to this entry
April 2, 2005 release of “Why OSS/FS? Look at the Numbers!”
I’ve posted an update of “Why Open Source Software / Free Software (OSS/FS, FLOSS, FOSS)? Look at the Numbers!”
The biggest change? I’ve added a large set of studies about the market penetration of the Mozilla web browsers (primarily the newer Mozilla Firefox, but the older Mozilla suite is also in use), as compared to Internet Explorer (IE). A multitude of studies show that IE is losing market share, while OSS/FS web browsers (particularly Firefox) are gaining market share. Sources of data include general market surveys like WebSideStory, OneStat, Information Week/Net Applications, thecounter.com, and quotationspage.com, as well as more specialized sources such as W3Schools (web developers) and Ars Technica (computer technologists). The figure below extracts data from several sources (there are far more in my paper than I can legibly show), but they all show the market trend over time. The red squares are Internet Explorer’s market share (all versions), and the blue circles are the combination of the older Mozilla suite and the newer Mozilla Firefox web browser (both of which are OSS/FS):
The data seems to show a small, gradual shift toward Mozilla-based browsers in the general web browsing community, with a much larger and more rapid move towards Mozilla and Mozilla Firefox in the home user, technical, web development, and blogging communities. In some cases (such as the Ars Technica technical site and the Boing Boing blog site), Firefox has become the leading web browser! That’s particularly interesting because it can be easily argued that the technical, web development, and blogging communities are leading indicators; these are the developers of the web sites you’ll see tomorrow and some of the heaviest users of the web, all making a switch.
One study not shown in the figure above (because it’s a single point of data) is from XitiMonitor. They surveyed a sample of websites used on a Sunday (March 6, 2005), totalling 16,650,993 visits, and categorized various European users. By surveying on Sunday, they intended to primarily find out what people choose to use, from their homes. Of the German users, an astonishing 21.4% were using Firefox. The other countries surveyed were France (12.2%), England (10.9%), Spain (9%), and Italy (8.6%). Here is the original XitiMonitor study of 2005-03-06, an automated translation of the XitiMonitor study, and a blog summary of the XitiMonitor study observing that, “Web sites aiming at the consumer have [no] other choice but [to make] sure that they are compatible with Firefox … Ignoring compatibility with Firefox and other modern browsers does not make sense business-wise.”
I analyzed this data to determine that 13.3% of European home users were using Firefox on this date in March 2005. How can I justify that figure? Well, we can use these major European countries as representatives of Europe as a whole; they’re certainly representative of western Europe, since they’re the most populous countries. Presuming that the vast majority of Sunday users are home users is quite reasonable for Europe. We can then make the reasonable presumption that the number of web browser users is proportional to the general population. Then we just need to get the countries’ populations; I used the CIA World Fact Book updated to 2005-02-10. These countries’ populations (in millions) are, in the same order as above, 82, 60, 60, 40, and 58; calculating (21.4%*82 + 12.2%*60 + 10.9%*60 + 9%*40 + 8.6%*58) / (82+60+60+40+58) yields 13.3%. This is something you won’t find on other pages; this is new analysis unique to my paper.
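Here is that calculation spelled out, using the country shares and populations quoted above (a minimal reproduction of the arithmetic, not part of the original study):

```python
# Population-weighted average of Firefox share across the surveyed countries.
shares      = [0.214, 0.122, 0.109, 0.090, 0.086]  # DE, FR, UK, ES, IT
populations = [82, 60, 60, 40, 58]                  # millions (CIA World Fact Book)

weighted = sum(s * p for s, p in zip(shares, populations)) / sum(populations)
print(f"{weighted:.1%}")   # about 13.3%
```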
For all the detail on web browser market surveys, see http://www.dwheeler.com/oss_fs_why.html#browser-marketshare.
And yes, I’ve made lots of other improvements to the paper. Here are a few examples:
path: /oss | Current Weblog | permanent link to this entry
E-Password comment deadline (April 4) looms - COMMENT NOW
As noted in a Wired article, the U.S. Department of State plans to issue U.S. passports that can be read wirelessly (remotely), and it won’t even encrypt this extremely personal data. This plan is absurd; it appears to give terrorists and organized crime a way to remotely identify U.S. citizens (for murder or kidnapping) and to provide enough detailed personal information to significantly aid identity theft.
The Department of State claims that the new passports can only be read from 10 centimeters and that fibers will prevent any reading while closed. However, most security experts scoff at these claims, noting that people have to open their passports eventually, and doubting that the fibers’ protection will be perfect in real life anyway. Lee Tien, an attorney at the Electronic Frontier Foundation, reports the reading distance as more like 10-30 feet. Bruce Schneier, who just renewed his passport to make sure he will not have an unencrypted passport for another 10 years, says he has yet to hear a good argument as to why the government is requiring remotely readable chips instead of a contact chip — which could hold the same information but would not be skimmable. “A contact chip would be so much safer.”
I think this Department of State plan is going to kill people. There are people in this world who want to hurt or kill Americans, or citizens of some other countries — now we’re giving them an easy tool to help them find Americans (or citizens of some other countries) in foreign countries so that they can be murdered, tortured, raped, or kidnapped for ransom. The ransom stuff alone would fund huge efforts to use this technology in foreign countries to target victims, because it’d be insanely profitable for the immoral.
In my mind, the real problem is the use of wireless technology. This is an area where the convenience of wireless is far outweighed by the disadvantages of getting murdered. Frankly, for data storage, a 2D barcode (which is MUCH cheaper) would have all the advantages of permitting quick storage of a lot of data. If the purpose of the chip is to make forgery harder, then requiring contact would be sufficient.
Is the lack of encryption a problem? Not necessarily, as long as contact is required. After all, if there’s no encryption, then it’s easier to see exactly what data is on the passport (e.g., to verify that it’s correct for you), and the data is supposed to be the same as what’s already on the passport. But it’s a disaster if it’s wireless, because then people who have no business getting the data will be able to retrieve it. Indeed, it’s a disaster that this is wireless at all.
Those who wish to protest this plan have until April 4, 2005, to send their comments to PassportRules@state.gov. I urge you to send in emails asking State to abandon this wireless approach, and that they instead use a system that requires contact. Do it, before someone dies.
path: /security | Current Weblog | permanent link to this entry
March 2005 release of “Why OSS/FS? Look at the Numbers!”
For March - another release of “Why Open Source Software / Free Software (OSS/FS, FLOSS, FOSS)? Look at the Numbers!” I made many changes; here are some of the highlights:
path: /oss | Current Weblog | permanent link to this entry
OWASP Legal Project - Secure Software Development Contract Annex
The Open Web Application Security Project (OWASP) Legal Project has just announced the “Secure Software Development Contract Annex”. This is basically a starting point for a contract to do software development; it tries to spell out exactly what’s required so that the results are secure.
I didn’t develop this text, but I’m glad to see that some people are working on it. In the contracting world, if you don’t specifically ask for it, you don’t get it. Since most contracts today don’t specifically say that a secure result is needed (and what that means), currently the person paying for the software isn’t getting a secure product. Hopefully this sort of thing will solve the problem.
Personally, I think this is a “first draft”; there are things I’d like to see made more specific. For example, I think it should clearly state that in the development environment it should be possible to determine specifically, by name, who wrote any given line of code. And there are many other issues (like automated examination of code) that aren’t covered. In particular, there are many more common vulnerabilities than those in OWASP’s top ten list. But this is a very interesting and encouraging start, and I’m glad to see it.
path: /security | Current Weblog | permanent link to this entry
January 2005 release of “Why OSS/FS? Look at the Numbers!”
I’ve made another release of my paper “Why Open Source Software / Free Software (OSS/FS, FLOSS, FOSS)? Look at the Numbers!” I made many changes; here are some of the highlights:
path: /oss | Current Weblog | permanent link to this entry