I’m currently scheduled to be a guest on FLOSS Weekly on Wednesday, 2020-12-16, at 12:30pm Eastern Time (9:30am Pacific, 17:30 UTC). The general topic will be about Linux Foundation work on improving Open Source Software security.
Please join the live audience or listen later. I expect it will be interesting. I expect that we’ll discuss the Open Source Security Foundation (OpenSSF), the Report on the 2020 FOSS Contributor Survey, the free edX trio of courses on Secure Software Development Fundamentals, and the CII Best Practices Badge program.
path: /security | Current Weblog | permanent link to this entry
Report on the 2020 FOSS Contributor Survey
It’s here! You can now see the Report on the 2020 Free and Open Source Software (FOSS) Contributor Survey! This work was done by the Linux Foundation under the Core Infrastructure Initiative (CII) and later the Open Source Security Foundation (OpenSSF), along with Harvard University.
path: /security | Current Weblog | permanent link to this entry
Secure Software Development Fundamentals
If you develop software, please consider taking the free trio of courses Secure Software Development Fundamentals on edX that I recently created for the Linux Foundation’s Open Source Security Foundation (OpenSSF). The trio of courses is free; if you want to get a certificate to prove you learned it, you can pay to take some tests to earn the certificate (this is how many edX courses work).
Here’s a brief summary:
Almost all software is under attack today, and many organizations are unprepared in their defense. This professional certificate program, developed by the Open Source Security Foundation (OpenSSF), a project of the Linux Foundation, is geared towards software developers, DevOps professionals, software engineers, web application developers, and others interested in learning how to develop secure software. It focuses on practical steps that can be taken, even with limited resources, to improve information security. The program enables software developers to create and maintain systems that are much harder to successfully attack, reduce the damage when attacks are successful, and speed the response so that any latent vulnerabilities can be rapidly repaired. The best practices covered in the courses apply to all software developers, and they include information especially useful to those who use or develop open source software.
The program discusses risks and requirements, design principles, and evaluating code (such as packages) for reuse. It then focuses on key implementation issues: input validation (such as why allowlists, not denylists, should be used), processing data securely, calling out to other programs, sending output, cryptography, error handling, and incident response. This is followed by a discussion of verification issues, including tests (such as security testing and penetration testing) and security tools. It ends with a discussion of deployment and handling vulnerability reports.
The training courses included in this program focus on practical steps that you (as a developer) can take to counter the most common kinds of attacks. They do not cover how to attack systems, how attacks work, or longer-term research.
Modern software development depends on open source software, with open source now being pervasive in data centers, consumer devices, and services. It is important that those responsible for cybersecurity are able to understand and verify the security of the open source chain of contributors and dependencies. Thanks to the involvement of the OpenSSF, a cross-industry collaboration that brings together leaders to improve the security of open source software by building a broader community with targeted initiatives and best practices, this program provides specific tips on how to use and develop open source software securely.
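To illustrate the allowlist advice from the course summary above, here is a minimal sketch in Python. The pattern and field are invented for illustration; real validation rules depend entirely on your application.

```python
import re

# Allowlist validation: accept only input that matches a known-good pattern,
# instead of trying to enumerate every bad input (a denylist).
# This particular pattern (a simple username rule) is illustrative only.
USERNAME_RE = re.compile(r'\A[a-z][a-z0-9_-]{2,31}\Z')

def is_valid_username(name: str) -> bool:
    """Return True only if the name matches the allowlist pattern."""
    return bool(USERNAME_RE.match(name))

print(is_valid_username("alice_01"))        # accepted
print(is_valid_username("bob; rm -rf /"))   # rejected: shell metacharacters
```

The key point is that anything not explicitly permitted is rejected, so inputs you never anticipated fail safe.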
I also teach a graduate course on the design and implementation of secure software. As you might expect, a graduate course isn’t the same thing. But please, if you’re a software developer, take the free edX courses, take my class, or in some other way learn how to develop secure software. The software that society depends on needs to be more secure than it is today. Having software developers know how to develop secure software is a necessary step towards creating the secure software we all need.
path: /security | Current Weblog | permanent link to this entry
If you contribute to Free/Open Source Software, please take the FOSS Contributor Survey!
This survey is a collaboration between the Linux Foundation’s Core Infrastructure Initiative and the Laboratory for Innovation Science at Harvard. Some of the questions are specific to those who write software; if you contribute but don’t write software, just skip those questions. The goal is to get a better understanding of FOSS development so that we can best work out how to improve its security and sustainability.
Also: please tell others who develop this software about the survey!
One interesting complication about this survey is that it’s difficult to get the word out about such a general survey. People talk about the “open source software community”, but in practice there isn’t one such community, there are many communities with some overlap. I don’t want to spam people who have never expressed any interest in information like this.
I’m currently talking with some folks in the Linux Foundation leadership about sending a one-time email only to developers who are already signed up for Linux Foundation mailing lists that are focused on developing open source software. We don’t want to spam people, but I think it’s reasonable to believe that people on those mailing lists are interested in information related to the development of open source software. One problem with sending to multiple mailing lists is that we don’t want to annoy people by having them receive multiple copies, so we want to work out a way for each individual to get only one copy.
I’ve never done this before, and I hate spam myself. So I’m first checking with Linux Foundation leaders and program managers to see if they think this is reasonable. I think it is, but it’s easy to justify anything to yourself, so I’m waiting to hear from others about what they think.
So getting back to the point - if you contribute to Free/Open Source Software, please take the FOSS Contributor Survey!
path: /oss | Current Weblog | permanent link to this entry
The Linux kernel has earned the CII Best Practices gold badge. The CII Best Practices badge has three levels: passing, silver, and gold. Gold badges are especially hard to get, and I congratulate the Linux kernel developers! More info here: Linux kernel earns CII best practices gold badge
path: /oss | Current Weblog | permanent link to this entry
Verizon still failing to support RPKI
On 2019-06-24 parts of the Internet became inaccessible because Verizon failed to implement a key security measure called Resource Public Key Infrastructure (RPKI). Here’s a brief story about the 2019 failure by Verizon, with follow-on details.
What’s shocking is that Verizon is still failing to implement RPKI. This failure continues to make it trivial for both accidents and malicious actors (including governments) to shut down large swathes of the Internet, including networks around the US capital. That’s especially absurd because during the COVID-19 pandemic we have become even more dependent on the Internet. There have been many routine failures by accident or on purpose; it’s past time to deploy the basic countermeasure (RPKI) to deal with them. Verizon needs to implement RPKI, as many other operators already have.
The fundamental problem is that the Internet depends on a routing system called Border Gateway Protocol (BGP), which never included a (useful) security mechanism. Resource Public Key Infrastructure (RPKI) provides an important security mechanism to counter certain kinds of BGP problems (whether caused by accident or on purpose). “Why it’s time to deploy RPKI” (RIPE NCC, 2019-05-17) is a short 2-minute video that explains why it’s past time to deploy RPKI.
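The core idea of RPKI route origin validation can be sketched in a few lines of Python. The prefix and AS numbers below are documentation/example values, not real ROAs, and real validators handle much more (certificate validation, many ROAs, IPv6, and so on).

```python
import ipaddress

# A Route Origin Authorization (ROA) says: this Autonomous System may
# originate this prefix, up to this maximum prefix length. (Example values.)
ROAS = [
    (ipaddress.ip_network("192.0.2.0/24"), 24, 64500),
]

def validate_announcement(prefix: str, origin_as: int) -> str:
    """Classify a BGP announcement as 'valid', 'invalid', or 'not-found'."""
    net = ipaddress.ip_network(prefix)
    covered = False
    for roa_net, max_len, roa_as in ROAS:
        if net.version == roa_net.version and net.subnet_of(roa_net):
            covered = True  # some ROA covers this address space
            if net.prefixlen <= max_len and origin_as == roa_as:
                return "valid"
    # Covered by a ROA, but no authorized match: likely a hijack or mistake.
    return "invalid" if covered else "not-found"

print(validate_announcement("192.0.2.0/24", 64500))  # valid
print(validate_announcement("192.0.2.0/24", 64511))  # invalid: wrong origin
```

An ISP that enforces RPKI simply drops the "invalid" announcements instead of propagating them.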
Verizon already knows that they’re failing to support RPKI; here’s a complaint posted on 2020-04-19 7:16AM that Verizon wasn’t supporting RPKI. It’s clear RPKI is useful; “Visualizing the Benefits of RPKI” by Kemal Sanjta (2019-07-19) shows how RPKI really does help.
If you’re a Verizon customer, you can easily verify Verizon’s status via Is BGP safe yet?. The answer for Verizon users is “no”.
If your Internet Service Provider (ISP) doesn’t support RPKI, please nag them to do so. If you’re a government, and your ISPs won’t yet support RPKI, ask when they’re going to secure their networks with this basic security measure. Yes, it will take work, and no, it won’t solve every problem, but that’s true of anything worth doing; those objections are excuses for inaction, not reasons. RPKI is an important minimum part of securing the Internet, and it’s time to ensure that every ISP supports it.
path: /security | Current Weblog | permanent link to this entry
Software Bill of Materials (SBOM) work at NTIA
Modern software systems contain many components, which themselves contain components, which themselves contain components. This raises some important questions. For example, when a vulnerability is publicly identified, how do you know whether your system is affected? Another issue involves licensing: how can you be confident that you are meeting all your legal obligations? This is getting harder as systems get bigger, and also because software development is a global activity.
On July 19, 2018, the US National Telecommunications and Information Administration (NTIA) “convened a meeting of stakeholders from across multiple sectors to begin a discussion about software transparency and the proposal being considered for a common structure for describing the software components in a product containing software.” [Framing Software Component Transparency: Establishing a Common Software Bill of Material (SBOM)]
A key part of this is to make it much easier to define and exchange a “Software Bill of Materials” (SBOM). You can see a lot of their information at the Community-Drafted Documents on Software Bill of Materials. If you’re interested in this topic, that’s a decent place to start.
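As a toy illustration of why an SBOM helps (the component names are made up, and real SBOM formats carry much more metadata), a recursive walk over declared components answers the “am I affected?” question:

```python
# Toy SBOM: each component lists the components it directly includes.
SBOM = {
    "my-app": ["libA", "libB"],
    "libA":   ["libC"],
    "libB":   [],
    "libC":   [],
}

def transitive_components(name, sbom):
    """Return every component reachable from `name`, at any depth."""
    seen, stack = set(), [name]
    while stack:
        for dep in sbom.get(stack.pop(), []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

def is_affected(vulnerable_components, name, sbom):
    """When a vulnerability is announced, check whether it reaches us."""
    return bool(vulnerable_components & transitive_components(name, sbom))

print(is_affected({"libC"}, "my-app", SBOM))  # True: libC is two levels deep
```

Without a machine-readable component list, that two-levels-deep dependency is exactly the kind of exposure that goes unnoticed.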
path: /security | Current Weblog | permanent link to this entry
Initial Analysis of Underhanded Source Code
Announcing - a newly-available security paper I wrote! It’s titled “Initial Analysis of Underhanded Source Code” (by David A. Wheeler, IDA Document D-13166, April 2020). Here’s what it’s about, from its executive summary:
“It is possible to develop software source code that appears benign to human review but is actually malicious. In various competitions, such as the Obfuscated C Contest and Underhanded C Contest, software developers have demonstrated that it is possible to solve a data processing problem “with covert malicious behavior [in the] source code [that] easily passes visual inspection.” This is not merely an academic concern; in 2003, an attacker attempted to subvert the widely used Linux kernel by inserting underhanded software (this attack inserted code that used = instead of ==, an easily missed, one-character difference). This paper provides a brief initial look at underhanded source code, with the intent to eventually help develop countermeasures against it. …
This initial work suggests that countering underhanded code is not an impossible task; it appears that a relatively small set of simple countermeasures can significantly reduce the risk from underhanded code. I recommend examining more samples, identifying a recommended set of underhanded code countermeasures, and applying countermeasures in situations where countering underhanded code is important and the benefits exceed their costs.”
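The kernel incident exploited C’s `=`/`==` confusion, but underhanded code is possible in any language. Here is a hypothetical Python example (my own illustration, not taken from the paper): an authorization check that reads plausibly in review yet grants access to unintended directories.

```python
import os.path

def can_read_subtle(path: str) -> bool:
    # Reads like "only files under /home/alice" -- but a bare string prefix
    # also matches sibling directories such as /home/alicesmith.
    return path.startswith("/home/alice")

def can_read_fixed(path: str) -> bool:
    # Normalize first (defeats ".." tricks), then require the directory
    # itself or something strictly inside it.
    norm = os.path.normpath(path)
    return norm == "/home/alice" or norm.startswith("/home/alice/")

print(can_read_subtle("/home/alicesmith/secret"))  # True -- oops
print(can_read_fixed("/home/alicesmith/secret"))   # False
```

The flaw is one missing path separator, which is easy to skim past in review; that is exactly the class of problem the paper examines.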
In my experience there are usually ways to reduce security risks, once you know about them. This is another case in point; once you know that this is a potential attack, there are a variety of ways to reduce its effectiveness. I don’t think this is the last word on the topic, but I hope it can be immediately applied and that others can build on it.
This was the last paper I wrote when I worked at IDA (I now work at the Linux Foundation). My thanks to IDA for releasing it! My special thanks go to Margaret Myers, Torrance Gloss, and Reginald N. Meeson, Jr., who all worked to make this paper possible.
So if you’re interested in the topic, you can view the Landing page for IDA Document D-13166 or go directly to the PDF for IDA Document D-13166, “Initial Analysis of Underhanded Source Code”. (If that doesn’t work, use this Perma.cc link to paper D-13166.) Enjoy!
path: /security | Current Weblog | permanent link to this entry
COVID-19/Coronavirus and Computer Attacks
Sadly, attackers have been exploiting the COVID-19 pandemic (caused by Coronavirus SARS-CoV-2) to cause problems via computers around the world. Modern Healthcare notes that hospitals are seeing active attacks: emails where a sender (pretending to be from the Centers for Disease Control and Prevention) asks the receiver to open a link that is actually malware, and other scams that claim to track COVID-19 cases but actually steal personal information. Many official government COVID-19 mobile applications have threats (ranging from malware to incredibly basic security problems). For example, in Colombia the government released a mobile app called CoronApp-Colombia to help people track potential COVID-19 symptoms; the intention is great, but as of March 25 it failed to use HTTPS (secure communication), and instead used HTTP (insecure) to relay personal data (including health data).
In the long term, the solution is for software developers and operators to do a much better job in creating and deploying secure applications. In the short term, we need to take extra care about our computer security.
path: /security | Current Weblog | permanent link to this entry
On April 1, 2020, I started working at the Linux Foundation!
My new title is “Director, Open Source Supply Chain Security”. I’ll be working to improve the security of open source software. I look forward to working with many others on this important problem.
So please wish me luck… and stay tuned for more.
path: /oss | Current Weblog | permanent link to this entry
Census II Report on Open Source Software
The Linux Foundation and the Laboratory for Innovation Science at Harvard have just released a new report: “Vulnerabilities in the Core: Preliminary Report and Census II of Open Source Software” by Frank Nagle, Jessica Wilkerson, James Dana, and Jennifer L. Hoffman, 2020-02-14. Just click on “Download Report” when you get there. A summary is available from Harvard. Here’s a quick introduction to the paper.
Their long-term goal is to figure out what FOSS packages are most critical through data analysis. This turns out to be extremely difficult, as discussed in the paper, and they expressly state that their current results “cannot - and do not purport to - be a definitive claim of which FOSS packages are the most critical”. That said, they have developed a method as a “proof of concept” to start working towards that answer.
They describe their approach in detail; here’s a quick summary. First they use data from Software Composition Analysis (SCA) and application security companies, including Snyk and the Synopsys Cybersecurity Research Center, to identify components used in actual systems. They then use dependency analysis (via libraries.io) to identify indirect (transitive) dependencies. Finally, they average the Z-scores of the metrics to produce normalized rankings.
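For readers unfamiliar with the normalization step, averaging Z-scores looks roughly like this. The metric values are made up, and the report combines many more metrics than two; this just shows why Z-scores make differently-scaled metrics comparable.

```python
from statistics import mean, pstdev

def z_scores(values):
    """Rescale a metric to mean 0 / stddev 1 so different metrics compare."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

# Two hypothetical metrics for three packages (invented numbers):
usage_counts = [100, 50, 10]   # e.g., how often the package appears in scans
dependents   = [3, 20, 7]      # e.g., how many packages depend on it

combined = [mean(pair)
            for pair in zip(z_scores(usage_counts), z_scores(dependents))]
# A higher combined score means the package ranks higher across the metrics.
```

Because each metric is rescaled before averaging, a metric measured in thousands can’t drown out one measured in single digits.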
The report also presents a number of key lessons learned (see its Chapter 7).
Also, here’s an interesting nugget: “These statistics illustrate an interesting pattern: a high correlation between being employed and being a top contributor to one of the FOSS packages identified as most used.”
I’m on the CII Steering Committee, so I did comment on an earlier draft, but credit goes to the actual authors.
path: /oss | Current Weblog | permanent link to this entry
Gource visualization (including set.mm)
Software and mathematics are often difficult for others to visualize. Computer hardware engineers can often have cool props to distribute during their talks, but software developers and mathematicians work with ideas of the mind - no physical objects involved.
This can sometimes make it difficult to explain important ideas like open source software (OSS). The idea of “people collaborating to produce something” is easy enough, but getting a true visceral understanding of what happens can be hard.
Gource is a cool visualization tool that makes it easy to see “collaboration in action”. The Gource project even has a web page showing some examples of Gource visualization.
I recently created a Gource visualization of the Metamath set.mm project. Some context is important here. In mathematics, claims are supposed to be rigorously proven, but humans are fallible; they make mistakes, and others often miss those mistakes. The solution to this problem is to rigorously describe mathematics in a formal way so that every step can be rigorously and automatically checked by a computer. This turns out to be difficult, and requires that a lot of people work together. Now… how can you visualize people working together to rigorously prove mathematical claims? One way is to use Gource… because while it doesn’t show everything, you at least get a sense of the collaboration. In this case, 48 people have contributed so far.
This visualization shows a common feature: in many cases, a single person starts and makes all the contributions for a while. The same thing happens if you view, for example, a Gource visualization of the Python programming language.
Gource is itself OSS, so you can download it and use it to create your own visualizations. I strongly recommend automating as much of the process as possible. For example, if you preprocess the data, use a script to automate that processing. You’ll need to give Gource various options; store the options in its config file or a script.
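For example, a small script can pin down the options so every regeneration is identical. This sketch just builds the command line (the option values are ones I’d consider reasonable starting points; see Gource’s own documentation for the full list):

```python
import shlex

def gource_command(repo_dir: str, title: str, out: str = "gource.ppm"):
    """Build a repeatable gource invocation as an argument list."""
    return [
        "gource", repo_dir,
        "--title", title,
        "--seconds-per-day", "0.2",
        "--auto-skip-seconds", "1",
        "-1280x720",
        "--output-ppm-stream", out,
    ]

cmd = gource_command(".", "My Project")
print(shlex.join(cmd))  # paste into a shell, or hand `cmd` to subprocess.run
```

Keeping the options in one script (or in Gource’s config file) means you can tweak one value and regenerate the whole video without trying to remember last month’s flags.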
If you create a Gource video, I strongly recommend adding some music or at least an audio commentary. If you add music, make sure it’s legal to add; the safe route is to use music released under open licenses such as Creative Commons Attribution (CC-BY) or CC0 Public Domain Dedication (CC0). Beware of the “non-commercial use” licenses - your releases might count as “commercial” even if you don’t think they do (talk to a lawyer if you want to go down that path). A great place to start for Gource music is audionautix.com, which has released lots of music under the Creative Commons Attribution 3.0 Unported License; you can select from lots of different styles and get some great options. Improving Gource Videos with Background and Audio has some tips and instructions.
In conclusion: enjoy my Gource visualization of the Metamath set.mm project… and perhaps it will inspire you to do something similar. I’ve embedded the video below so you can easily view it (if you like):
path: /oss | Current Weblog | permanent link to this entry
In case you weren’t aware of it, there is now a 2019 version of the CWE Top 25 list. This list attempts to rank the most important kinds of software vulnerabilities (what they call “weaknesses”).
Their new approach is to directly use the National Vulnerability Database (NVD) to score various kinds of vulnerabilities. There are a number of limitations with this approach, and they discuss many of them in the cited page.
Their approach does have some oddities. For example, their #1 worst problem (CWE-119, Improper restriction of operations within the bounds of a memory buffer) is itself the parent of items #5 (CWE-125, out-of-bounds read) and #12 (CWE-787, out-of-bounds write).
Another oddity: they rank Cross-Site Request Forgery (CSRF) quite high (#9). CSRF doesn’t even appear in the 2017 (latest) OWASP Top 10 list, even though the OWASP Top 10 focuses on websites (where CSRF can occur). I think this happens because the CWE folks are using a large dataset from 2017-2018, when there were still a large number of CSRF vulnerabilities.

But the impact of those remaining vulnerabilities has been going down, due to changes to frameworks, standards, and web browsers. Most sites use a pre-existing framework, and frameworks have been increasingly adding on-by-default CSRF countermeasures. The “SameSite” cookie attribute, which provides an easy countermeasure against CSRF, was implemented in most browsers around 2016-2018 (depending on the browser), but having it take effect required that websites make changes, and during that 2017-2018 timeframe websites were only starting to deploy those changes. As of late 2019 several browsers are in the process of switching their SameSite defaults so that they counter CSRF by default, without requiring sites to do anything. (In particular, see the announcement for Chrome and the change log for Mozilla Firefox.) These changes to the SameSite defaults implement the security improvements proposed in Incrementally Better Cookies by M. West in May 2019. This change in the security default could not realistically have been made before 2019 because of a bug in the Apple Safari browser that was only fixed in 2019.

As more browsers self-protect against CSRF by default, without requiring sites or developers to do anything, CSRF vulnerabilities will become dramatically less likely. This once again shows the power of defaults; systems should be designed to be secure by default whenever possible, because normally people simply accept the defaults.
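Sites don’t have to wait for browser defaults, because the attribute is easy to set explicitly. Here’s a sketch using Python’s standard library (`http.cookies` has accepted the `samesite` attribute since Python 3.8; the cookie value is illustrative):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "opaque-token"      # value is illustrative
cookie["session"]["samesite"] = "Lax"   # the CSRF countermeasure
cookie["session"]["httponly"] = True    # deny script access to the cookie
cookie["session"]["secure"] = True      # send over HTTPS only

print(cookie.output())  # a Set-Cookie header including "SameSite=Lax"
```

Setting `SameSite` explicitly protects your users even on browsers that haven’t yet changed their default.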
That said, having a top 25 list based on quantitative analysis is probably for the best long-term, and the results appear to be (mostly) very reasonable. I’m glad to see it!
path: /security | Current Weblog | permanent link to this entry
Metamath book for 2019 available!
One of my hobbies is playing with Metamath, a tiny language that can express theorems in abstract mathematics, accompanied by proofs that can be verified by a computer program. I find it absolutely fascinating.
I’m happy to announce that the 2019 hardcover version of the so-called Metamath book is now available! You can even watch me unbox a proof copy. If you’re thinking about getting your own copy, please go to Lulu here: Metamath: A Computer Language for Mathematical Proofs by Norman Megill & David A. Wheeler, 2019, ISBN 9780359702237.
path: /misc | Current Weblog | permanent link to this entry
GitHub Maintainer Security Advisories
GitHub just made a change that I think will make a big improvement to the security of open source software (OSS). It’s now possible to privately report vulnerabilities to OSS projects on GitHub via maintainer security advisories! This wasn’t possible before, and you can blame me (in part), because I’m the one who got this ball rolling. I also want to give a big congrats to the GitHub team, who actually made it happen.
Here are some details, in case you’re curious.
As you probably know, there are more OSS projects on GitHub than on any other hosting service. However, until now there was no way to privately report security vulnerabilities to OSS projects there. It’s hard to fault GitHub too much (they’re providing a service for free!), yet because so much software is maintained on GitHub this led to widespread problems in reporting and handling vulnerabilities. It could be worked around, but it was a long-standing systemic problem with GitHub.
Why is this a problem? In a word: attackers. Ideally software would have no defects, including vulnerabilities. Since vulnerabilities can harm users, developers should certainly be using a variety of techniques to limit the number and impact of vulnerabilities in the software they develop. If you’re developing OSS, a great way to see if you’re doing that (and show others the same) is to get a CII Best Practices badge from the Linux Foundation’s Core Infrastructure Initiative (I lead this effort). But mistakes sometimes happen, no matter what you do, so you need to be prepared for them. It’s hard to respond to vulnerability reports if it’s hard to receive them or discuss them within a project. Of course, a project needs to rapidly fix a vulnerability once it is reported, but we need to make that first step easy.
In September 2018 I went to a meeting at Harvard to discuss OSS security (in support of the Linux Foundation). There I met Devon Zuegel, who was helping Microsoft with their recently-announced acquisition of GitHub. I explained the problem to her, and she agreed that it needed to be fixed. She shared it with Nat Friedman (who was expected to become the GitHub CEO), who also agreed that it made sense. They couldn’t do anything until the acquisition was complete, but they planned to make the change once it was. The acquisition did complete, so the obvious question is, did they make the change? Well…
I am very happy to report that GitHub has just announced the beta release of maintainer security advisories, which allow people to privately report vulnerabilities without immediately alerting every attacker out there. My sincere thanks to Devon Zuegel, Nat Friedman, and the entire team of developers at GitHub for making this happen.
This seems to be part of a larger effort by GitHub to support security (including for OSS). GitHub’s security alerts make it easy for GitHub-hosted projects to learn about vulnerable dependencies (that is, a version of a software component that you depend on but is vulnerable).
It’s easy to get discouraged about software security, because the vulnerabilities keep happening. Part of the problem is that most software developers know very little about developing secure software. After all, almost no one is teaching them how to do it (I teach a graduate class at George Mason University to try to counter that problem). I hope that over time more developers will learn how to do it. I also hope that more and more developers will use tools that help them create secure software, such as my flawfinder and Railroader tools. Tools can’t replace knowledge, but they are a necessary piece of the puzzle; putting tools into a CI/CD pipeline (and an auditing process if you can afford one) can eliminate a vast number of problems.
These changes show that it is possible to make systemic changes to improve security. Let’s keep at it!
path: /oss | Current Weblog | permanent link to this entry
The year of Linux on the desktop
For those who know their computer history, wild things are going on regarding Linux this year.
Linux is already in widespread use. For years the vast majority of smartphones have run Android, and Android runs on Linux, so most smartphones run on Linux. As of November 2018, 100% of the top 500 supercomputers worldwide run Linux. Best estimates for servers using Linux are around 66.7%, and Linux is widely used in the cloud and in embedded devices.
But something different is going on in 2019. All Chromebooks are also going to be Linux laptops going forward. Later this year Microsoft will include the Linux kernel as a component in Windows. In a sense, 2019 is the year of the Linux desktop. It is not the year of the Linux desktop as envisioned in the past, but perhaps that’s what makes it most interesting. No, it does not mean that everyone is interacting directly with Linux as their main laptop OS, so you can certainly argue that this doesn’t count. But increasingly that measurement is less important; people today access computers via browsers, not the underlying OS, and that system is often running and/or developed using Linux.
path: /oss | Current Weblog | permanent link to this entry
A malicious backdoor has been found in the popular open source software library bootstrap-sass. Its impact was limited - but the next attack might not be. Thankfully, there are things we can learn and do to reduce those risks… but that requires people to think them through.
See my essay Subversion of bootstrap-sass for more about that!
path: /oss | Current Weblog | permanent link to this entry
No one thing creates secure software, so you need to do a set of things to make adequately secure software. But no one has infinite resources; how can you have confidence that you are doing the right set? Many experts (including me) have recommended creating an assurance case to connect the various approaches together into an efficient, cohesive whole. It can be hard to start an assurance case, though, because there are few public examples.
So I am pleased to report that you can now freely get my paper A Sample Security Assurance Case Pattern by David A. Wheeler, December 2018. This paper discusses how to create secure software by applying an assurance case, and uses the Badge Application’s assurance case as an example. If you are trying to create a secure application, I hope you will find it useful.
path: /security | Current Weblog | permanent link to this entry
Don’t Use ISO/IEC 14977 Extended Backus-Naur Form (EBNF)
Sometimes people want to do something, find a standard, and do not realize the downsides of using that standard. I have an essay in that genre titled Don’t Use ISO/IEC 14977 Extended Backus-Naur Form (EBNF). The problem is that although there is a ISO/IEC 14977:1996 specification, in most cases you should not use it. If you have to write a specification for a programming language or complex data structure, please take a look at why I think that!
path: /misc | Current Weblog | permanent link to this entry
Railroader: Security static analysis tool for Ruby on Rails (Brakeman fork)
I’ve kicked off the Railroader project to maintain a security static analysis tool for Ruby on Rails that is open source software. If you are developing with Ruby on Rails, please consider using Railroader. We would also really love contributions, so please contribute!
A security static analysis tool (analyzer) examines software to help you identify vulnerabilities (without running the possibly-vulnerable program). This helps you find and fix vulnerabilities before you field your web application. Ruby on Rails is a popular framework for developing web applications; sites that use Rails include GitHub, Airbnb, Bloomberg, Soundcloud, Groupon, Indiegogo, Kickstarter, Scribd, MyFitnessPal, Shopify, Urban Dictionary, Twitch.tv, GitLab, and the Core Infrastructure Initiative (CII) Best Practices Badge.
In the past the obvious tool for this purpose was Brakeman. However, Brakeman has switched to the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 Public License (CC-BY-NC-SA-4.0). This is not an open source software license since it cannot be used commercially (an OSS license cannot discriminate against a field of endeavor). Similarly, it is not a free software license (since you cannot run the program as you wish / for any purpose). You can verify this by looking at the Brakeman 4.4.0 release announcement, the SPDX license list, Debian’s “The Debian Free Software Guidelines (DFSG) and Software Licenses”, Various Licenses and Comments about Them (Free Software Foundation), and Fedora’s Licensing:Main (Bad Licenses list). Railroader continues using the original licenses: MIT for code and CC-BY-3.0 for the website. MIT, of course, is a very well-known and widely-used open source software license.
If you are currently using Brakeman, do not update to Brakeman version 4.4.0 or later until you first talk with your lawyer. At the very least, if you plan to use newer versions of Brakeman, check their new license carefully to make sure that there is no possibility of a legal issue. This license change was part of a purchase of Brakeman by Synopsys. Synopsys is a big company, and they definitely have the resources to sue people who don’t obey their legal terms. Even if they didn’t, it is not okay to use software when you don’t have the right to do so. Either make sure that you have no legal issues… or just switch to Railroader, where nothing has changed.
Unfortunately, it is really easy to “just upgrade to the latest release” of Brakeman without realizing that this is a major license change. I suspect a lot of people will just automatically download and run the latest version, and have no idea that this is happening. I only noticed because I routinely use software license checkers (license_finder in my case) so that I immediately notice license changes in a newer version. I strongly recommend adding static source code analyzers and license checkers as part of your continuous integration (CI).
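A license gate in CI can be as simple as an allowlist check. This is only a sketch (the package names and the project’s allowlist are invented); real tools such as license_finder handle multi-licensed packages, license detection, and much more.

```python
# SPDX license identifiers this (hypothetical) project is willing to accept.
ALLOWED = {"MIT", "Apache-2.0", "BSD-3-Clause"}

def license_violations(dependencies):
    """dependencies: {package: SPDX id}. Return the disallowed entries."""
    return {pkg: lic for pkg, lic in dependencies.items()
            if lic not in ALLOWED}

deps = {"somelib": "MIT", "othertool": "CC-BY-NC-SA-4.0"}
bad = license_violations(deps)
if bad:
    print("License check failed:", bad)  # a CI job would exit nonzero here
```

Run on every build, a check like this turns a silent license change in a new release into an immediate, visible failure.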
We assume that “Brakeman” is now a trademark of Synopsys, Inc., so we’ve tried to rename everything so that the projects are clearly distinct. If we’ve missed something, please let us know and we’ll fix it. The term “Railroader” is a play on the word Rails, but it is obviously a completely different word. Railroader shares a common code base historically with Brakeman, and that’s important to explain, but they are not the same project and we are expressly trying not to infringe on any Brakeman trademark. It’s obviously legal to copy and modify materials licensed under the MIT and CC-BY-3.0 licenses (that’s the purpose of those licenses), so we believe there is no legal problem.
I think I have a reasonable background for starting this project. I created flawfinder, a security static analysis tool for C/C++, in 2001, and have maintained it ever since. I literally wrote the book on developing secure software; see my book Secure Programming HOWTO. I even teach a graduate class at George Mason University (GMU) on how to develop secure software. For an example of how I approach securing software in an affordable way, see my video How to Develop Secure Applications: The BadgeApp Example (2017-09-18) or the related document BadgeApp Security: Its Assurance Case. I have also long analyzed software licenses, e.g., see The Free-Libre / Open Source Software (FLOSS) License Slide, Free-Libre / Open Source Software (FLOSS) is Commercial Software, and Publicly Releasing Open Source Software Developed for the U.S. Government.
While Railroader is a project fork, we hope that this is not a hostile fork. We will not accept software licensed only under CC-BY-NC-SA-4.0, since that is not an OSS license. But we’ll gladly accept good contributions from anyone if they are released under the original OSS licenses (MIT for software, CC-BY-3.0 for website content). If the Brakeman project wants to cooperate in some way, we’d love to talk! We are all united in our desire to squash out vulnerabilities before they are deployed. In addition, we’re grateful for all the work that the Brakeman community has done.
So, again: If you are developing with Ruby on Rails, please consider using Railroader. We would also really love contributions, so please contribute!
path: /oss | Current Weblog | permanent link to this entry