On this page we will curate a growing bestiary of examples of real-world reuse failure caused by traditional development techniques, including, but not limited to, "information hiding", "encapsulation", and the structuring of large-scale artifacts using programming language code.
The node.js "internal" debacle
A well-respected member of the node.js community, isaacs, wrote the widely-used library graceful-fs as an improvement in portability and resilience over fs, the built-in node.js filesystem library. In the course of doing so, he discovered that he needed to reuse the module source for fs in a particular way, which suddenly broke when node.js was updated to version 6.x: the developers had decided to follow "industry best practices" by hiding the module sources for built-in libraries in a freshly created package, "internal", that was invisible to userland code. The author started node issue #8149, asking the core node developers to explain the benefits of this encapsulation approach.

The thread makes highly interesting reading, since the points of view on each side are very clearly articulated. There is a clear division between the "elites" on the core team, who stick to their abstract best practices regardless of what evidence is produced, and the "ordinary users" on the thread, who report that this encapsulation simply interferes with their ability to get value from the node.js codebase. isaacs also, interestingly, draws attention to what he sees as "drift" in the core values (both articulated and implicit) held by the core node.js team (with whom he has been familiar since the project's start) between the 0.x/1.x days and the io.js/4.x days. He states that a core value held by the former team was that "the user should not be protected from themselves". Some commenters suggest that this drift in values could be explained by the numerous unskilled users who have joined the node.js community, and who are more likely to need protecting from themselves.
I am unconvinced, and instead see this as a very traditional "senescence" phase of a project, in which the core team comes to attract dogmatists and rule-followers, after an initial phase in which the team consisted of enthusiasts who simply wanted to make things that worked, and who saw no fundamental distinction between themselves and their users.
nyc's hard-wired exclusion of node_modules
The invaluable and currently maintained coverage checking tool nyc depends on a tower of libraries to select and traverse the files requiring instrumentation. In the GPII, we are following what has come to be called the "Alle proposal" for operating micromodules, which involves the use of an internal node_modules directory holding the module root. It turns out that one of nyc's dependencies, "test-exclude", includes a hard-wired regex check for node_modules anywhere in the path of a file to be instrumented, as follows: index.js#L35. As usual, fixing a problem hidden in a nested closure like this requires forking the entire tree of dependencies.
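The kind of check described can be illustrated with a hedged sketch (this is not the actual test-exclude source; the names and regex are illustrative of the pattern, not copied from the library):

```javascript
// Illustrative sketch of a hard-wired node_modules exclusion of the kind
// described above; NOT the actual test-exclude implementation.
const NODE_MODULES_RE = /(^|\/)node_modules\//;

function shouldInstrument(filePath) {
    // Any file with node_modules anywhere in its path is silently skipped
    return !NODE_MODULES_RE.test(filePath.replace(/\\/g, "/"));
}

// An ordinary source file is instrumented:
shouldInstrument("src/index.js");                              // true
// ...but under the Alle layout, the module's own source lives inside an
// internal node_modules directory, and so is wrongly excluded:
shouldInstrument("my-package/node_modules/my-module/impl.js"); // false
```

Because the check is buried inside a dependency's closure rather than exposed as configuration, no amount of userland option-passing can reach it, which is what forces the fork.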
Unifying input device tracking and browser events
The DOM event model does not associate user input events with their source device, only with the abstract type of device and the 'target' DOM elements. This presents a design challenge for multi-user or bimanual interfaces that may, e.g., want to track multiple mice, each with its own associated cursor and behavior. ptchernavskij has attempted to get around this by using node-usb and node-hid on a locally-running server to identify devices and forward their state to the browser, thus maintaining models of actual input devices. However, at this point, there is a difficult design trade-off:
- emit only "in-house" events generated by the device server. This requires significant re-implementation of processes such as acquiring targets of pointing events.
- adopt DOM events and forget about decorating them with source devices. This strongly limits the possible bimanual interface designs.
- combine DOM events with device information. This requires some sort of asynchronous event synthesis module that reliably matches DOM events and "in-house" events.
- use both kinds of events without attempting to match them up. This appears to be the most "clumsy" design, but at least makes the desired designs possible.
This is an interesting reuse problem because, in addition to the traditional closed nature of the DOM event system, it is particularly difficult to coordinate two asynchronous information sources.
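As a hedged sketch of the third option above, one way such an asynchronous event synthesis module might pair DOM events with device-server events is by timestamp proximity. All names and the time window here are illustrative assumptions, not part of node-usb, node-hid, or any existing code:

```javascript
// Sketch: match each DOM pointer event to the most recent device-server
// event within a small time window (illustrative heuristic, not a real API).
const WINDOW_MS = 25;
const recentDeviceEvents = []; // would be filled from a socket to the device server

function recordDeviceEvent(evt) {
    recentDeviceEvents.push(evt);
    // Drop device events too old to match any future DOM event
    while (recentDeviceEvents.length &&
           evt.time - recentDeviceEvents[0].time > WINDOW_MS) {
        recentDeviceEvents.shift();
    }
}

function matchDomEvent(domTime) {
    // Attribute the DOM event to the device event closest in time
    let best = null;
    for (const evt of recentDeviceEvents) {
        const delta = Math.abs(evt.time - domTime);
        if (delta <= WINDOW_MS && (!best || delta < Math.abs(best.time - domTime))) {
            best = evt;
        }
    }
    return best; // null => fall back to an unattributed DOM event
}
```

Any choice of window is a heuristic: too small and genuine pairs are missed, too large and events from two devices moving simultaneously may be cross-attributed, which is precisely the reliability problem with this option.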
In their study of a volunteer community working to distribute locally sourced organic vegetables in Aarhus, Denmark (AOFF), Bødker and colleagues describe the following development around the community's website. The group began by using a Facebook group both as its public face and as its means of communication, but members subsequently operated a wiki and several different websites. The first of these websites was created and hosted by a volunteer web developer, who eventually abandoned it: "the initial web developer became less and less involved with AOFF, resulting in minimal development, slow communication and lack of access to the basic configuration on the back-end, forcing the community to 'invent' alternatives around the website." While the abandoned website was still in use, another member with development skills implemented a calendar feature on the website essentially by hacking it. Paul, the member who later went on to develop the second website, "went into the database and put in an iframe as a content element ... that's not done through the CMS at all, that's just some SQL injected into the database, which [gave] the calendar feature [...] but I mean, that's what we had, that's what we could do, it's the only possibility".