DALI

Programmer, writer, introvert. Unstructured thoughts and prose.

Curmudgeon's Guide to REST

This is an opinion piece about the state of APIs on the web. Caveat emptor.

Every so often the phrase “REST API” is used to advertise a service, and someone is irritated that it is not really RESTful. Indeed, REST is misused by people who don’t understand it or who simply repeat false information. The common pitfalls of APIs marketed as RESTful are not trivial to solve, and they matter much more than one might think.

Fragmentation Hell

If you’ve ever used a web API out in the wild, chances are that you’ve had to read some technical specification, which is out-of-band information, in order to use it. This is usually because there are no hypermedia controls or links in the payload, either for humans or for robots. If links are present, they are often not encoded as links, leaving clients to construct URIs themselves. API design is typically bike-shedded (or worse, afflicted by not-invented-here syndrome). Clients have to assume, sometimes incorrectly, a priori knowledge of what exists on the server.

The widely implemented solution to documentation goes something like this: use comments and annotations in implementation code to generate documentation in a serializable format (such as Swagger, RAML, etc.), or worse, have a human update technical documentation manually. The problem with either approach is that it relies heavily on humans to do busy work. This busy work extends to maintaining API-specific clients in multiple programming languages, or else passing it on to consumers who must write their own clients. APIs on the web are in a fragmentation hell in which every implementor acts like a special snowflake, deserving of special attention to the intricacies of their particular syntax.

Hypermedia Matters

Web APIs are usually delivered through HTTP (Hypertext Transfer Protocol). What is crucial to the concept of hypertext is that it contains hyperlinks, and if you are reading this, you probably followed hyperlinks to get here. The media type “HTML” facilitates the following of hyperlinks in a document. We take this for granted, as even non-technical users are able to follow links on a page and know where they are based on the current URI.

What most people pass off as RESTful is using URIs and HTTP verbs, but this is simply not enough. There is a better approach, though it is not easy: using hypertext as the engine of application state (HATEOAS for short). This is rare to see in practice, as it is rather difficult to implement. The cowboy approach is far easier, as the cowboy coder gets to build things in an idiosyncratic way while breaking client implementations and creating busy work.

What hypertext imposes is that links to other resources are embedded in the payload; in the case of HTML, that is the humble anchor tag <a>. It is important to note that links in HTML documents are not necessarily bi-directional, so dead-ends are possible everywhere, since pages do not necessarily record what links to them. This sounds redundant to anyone who browses the Internet, but in the case of APIs, it is important that a machine client be able to traverse links from any given state; this is what promotes self-discoverability.
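
As a sketch, a JSON API response might embed its links the same way an HTML page does. The resource, link relations, and URIs below are hypothetical, and conventions such as HAL formalize this kind of structure:

{
  "id": 42,
  "title": "Curmudgeon's Guide to REST",
  "_links": {
    "self":     { "href": "/posts/42" },
    "author":   { "href": "/authors/7" },
    "comments": { "href": "/posts/42/comments" }
  }
}

A client only needs to know the entry point and the meaning of each link relation; the URIs themselves can change without breaking anyone.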

REST Is Mostly a Buzzword

Too often, an API that operates over HTTP using HTTP semantics is confused with a RESTful API, typically by marketers but also by developers themselves. Instead of building core functionality and writing prose rather than syntax, API developers and the companies that employ them are overly concerned with short-term goals rather than longevity. It’s the “move fast and break things” motto applied to API design, with sub-par results, as expected.

The Cult of Futurology

The technological advancements that futurologists tout have consistently failed to improve people’s lives. A naive explanation might be that they exaggerate their findings or blame external factors, but I think there is a more realistic reason. They are more interested in believing pretty lies packaged by the faux-science/news industry than in any transformative technology that could actually affect their lives.

If we were to believe the hype, humans will attain near-immortality in the future because aging will have been solved and cancer cured. It is the wishful thinking of every solipsist that their own universe will last forever. However, attempting to solve a mystery with plausible, pre-determined methods will only lead to failure.

Protip: Use Arrow Functions

There is hardly any reason anymore to use var _this = this (or some other name) to store the value of this for later use. Before ES6, code had to be littered with such aliases because every function expression binds its own this. For example:

FrameBuffer.prototype.invert = function () {
  var _this = this;

  this.memory.map(function (location) {
      return _this._invertLocation(location);
  });
};

FrameBuffer.prototype._invertLocation = function () { ... };

Arrow functions do not bind their own this; instead, this inside an arrow function resolves lexically to the enclosing scope. The relevant portion of the above example could be rewritten as:

this.memory.map(location => this._invertLocation(location));

The line in the first example declaring _this may be removed entirely.

ES6 Modules in Node.js / io.js

Updated to reflect name change from 6to5 to Babel.

Not many people know that it is already possible to use ES6 modules in Node.js and io.js (neither of which supports ES6 modules natively at the moment) without messing with build tools or installing anything globally, by using the Babel transpiler, which transpiles ES6 code into ES5 at runtime. This makes it possible to write ES6 and target the current stable Node v0.12, or the current io.js release for that matter. First, install Babel as a dependency of your package:

$ npm install --save babel

Babel provides a require hook that makes subsequent calls to require run ES6 code, including modules. For example, given an entry point index.js:

require('babel/register');
module.exports = require('./lib/main');

The required file main.js then runs in a transpiled environment. Stack traces also work without the need for source maps, albeit with extra compilation steps in between. The main.js file can import and export modules in ES6 style, for example:

import { randomBytes as rng } from 'crypto';
export default (...args) => rng(...args);

There is a helpful guide to ES6 modules here.

Being an Alienated Human Being

A large part of what makes contemporary living so different from the past is that its surroundings are transient and inconsequential. There is no regard for the people around us, just another relocation away from becoming irrelevant again. A person might move for college, then for work, more work, or a loved one. With careers being obsoleted and jobs filled by ever more replaceable workers, there is not even the security of a fixed location.

No wonder that San Francisco is deemed a transient city, with people moving in and out because that’s where the jobs are. The transience is visible in all walks of life, from the homeless who were evicted or relocated to the Tenderloin, to the tech workers who just moved into the city for work. The people who will remain are those who have lived for generations in their deeply entrenched communities.

Implementing a Word Filter for the Browser

At first, this may seem like a trivial task. A naive approach would be to simply find and replace on an HTML document, but it is immediately apparent that non-textual data may be overwritten, causing unexpected problems. A proper implementation needs to iterate over text nodes and watch for changes to the DOM, the latter being especially necessary for single-page apps. This is made possible by the MutationObserver and TreeWalker APIs.

The first thing we can do is iterate over the text nodes of the DOM when the document is loaded:

var node;
var walk = document.createTreeWalker(
  document.body, NodeFilter.SHOW_TEXT, null, false);

while (node = walk.nextNode()) {
  replaceWords(node);
}

This creates a TreeWalker over the entire body of the document, and the while loop iterates over all of its text nodes. For static pages this is sufficient, but for the filter to really work, it needs to observe changes to the DOM, which is where MutationObserver comes in.

var mutationFilter = new MutationObserver(function (mutations) {
  mutations.forEach(mutationHandler);
});

mutationFilter.observe(document.body, {
  childList: true,
  characterData: true,
  subtree: true
});

In the above example, an instance of MutationObserver observes the entire body, triggering whenever any node in the subtree changes. The MutationObserver constructor accepts a callback whose argument is an array of MutationRecord objects. A MutationRecord contains relevant data on the type of the change and what was changed. We are mainly interested in added or modified text nodes, so that we can run the replacement function on them.

function mutationHandler (mutation) {
  var i;
  var node;
  var textNode;
  var walk;

  if (mutation.type === 'childList') {
    for (i = 0; i < mutation.addedNodes.length; i++) {
      node = mutation.addedNodes[i];
      if (node.nodeType === 1) {
        // Element node: walk its subtree and filter each text node.
        walk = document.createTreeWalker(
          node, NodeFilter.SHOW_TEXT, null, false);
        while (textNode = walk.nextNode()) {
          replaceWords(textNode);
        }
      } else if (node.nodeType === 3) {
        // Text node: filter it directly.
        replaceWords(node);
      }
    }
  } else if (mutation.type === 'characterData') {
    // An existing text node changed in place.
    replaceWords(mutation.target);
  }
}

The above example tries to find all of the relevant text nodes of the mutation. If the mutation is of type characterData, there is only one text node, mutation.target. Otherwise, we have to iterate over the addedNodes property, check whether each node is a text node, and walk over it if it is not.

The mechanics of running a regular expression on a text node are not as interesting, so I will end at this point. A working UserScript is available here.
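
For completeness, a minimal sketch of what replaceWords might look like, assuming a hypothetical word list; the pattern and replacement below are placeholders rather than part of the actual script:

function replaceWords (node) {
  // Hypothetical filter: censor the word "frak" wherever it appears.
  var pattern = /\bfrak\b/gi;
  var filtered = node.nodeValue.replace(pattern, '****');

  // Only write back when something changed, so the assignment does not
  // needlessly retrigger the characterData observer.
  if (filtered !== node.nodeValue) {
    node.nodeValue = filtered;
  }
}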

No Justification to Live

The year is 2014, and the vast majority of people are compelled to do something they would rather not do for half of their waking hours, in exchange for money that is largely spent on shelter and food, thus securing their existence. Despite approaching a post-scarcity economy in which the cost of manufacturing goods sinks due to automation, society demands that everyone must work. For what? The march of technological progress is no match for the cruelty of the human condition.

What Is a Wage?

A wage is not something to be proud of, unless your sense of self-worth is defined by how much money other people are willing to pay you. Those who actually create value have the lowest social status, and this excludes people whose jobs involve mentioning the phrase “creating value.” Creators are often unable to extract value from their own creations, though becoming wealthy or famous is hardly their motivation. Society would rather idolize those who extract value from the work of others than those who create it. The sooner humanity comes to the realisation that the wage is a rather contrived means of control, the better life would be for the majority of people.

The economics of the future are somewhat different. You see, money doesn’t exist in the 24th century … The acquisition of wealth is no longer the driving force of our lives. We work to better ourselves and the rest of humanity. — Captain Jean-Luc Picard

Silicon Valley’s Latest “Innovation” Is Enabling Plutocracy

A disingenuous meme about technology is that it enables many tasks to be automated, thus freeing humans from work and giving them more time for leisure. This is at least partly true: startups are offering de facto taxi services, groceries hand-picked and delivered to your door, laundry pickup, parking spots, restaurant reservations, and more. All of this is facilitated by fluffy and visually appealing “apps” (I loathe this terminology) that help automate daily chores for a fee. The audience for these kinds of apps often neglects the human costs of these services and, more glaringly, the low-tech nature of it all.

The average provider in the poorly named sharing economy offers their labor below minimum wage. As independent contractors rather than employees of a company, they are not subject to minimum-wage laws and do not receive any of the benefits an employee would, such as insurance. In terms of career, it is invariably a dead-end job, doing menial tasks for the bourgeoisie. The only things that Silicon Valley has managed to disrupt lately are labor laws and the lives of the formerly middle class, now turned peasants. Those who do these small jobs to “pull themselves up by their bootstraps,” as conservatives like to parrot, belong to the new underclass: the sharers, also known as suckers.

The startups that facilitate the sharing economy are technically accomplishing nothing new: reading from and writing to a database through a user interface, as if this alone qualified them as tech companies. Their biggest and most often discussed problem is scaling, rather than any more challenging technical obstacle. Human labor is far more difficult to scale than computing power, so they stay confined to large cities where the affluent can afford their services, perpetuating the erosion of the middle class.

But what of the software engineers who are working hard (overtime, actually) on these “apps”? They must belong to the elite class, with their high salaries and expensive consumer tech products, according to protestors. However, they are merely the favored pawns of the real elite: well-connected founders and venture capitalists. Thanks to large swaths of entirely incompetent programmers and relatively limited immigration, programmers hold a middle-class standing. When given the opportunity, the elites can and do screw over everyone below them, and this includes programmers. Programmers can only lose in a race to the bottom against an endless supply of outsourced employees, largely from India, who are willing to work for a fraction of the cost, providing a new and larger base of people for the elites to screw over.

Towards a Future Informed by the Past

Ultimately, the new wave of Silicon Valley startups is about fulfilling retro-futuristic visions in the same vein as The Jetsons, in which people can relax and not work hard because nearly everything, from food preparation to transportation to cleaning, is automated. Except that everything is actually powered by humans, and is therefore only accessible to an elite.

We should do away with the absolutely specious notion that everybody has to earn a living. It is a fact today that one in ten thousand of us can make a technological breakthrough capable of supporting all the rest. The youth of today are absolutely right in recognizing this nonsense of earning a living. We keep inventing jobs because of this false idea that everybody has to be employed at some kind of drudgery because, according to Malthusian-Darwinian theory, he must justify his right to exist. So we have inspectors of inspectors and people making instruments for inspectors to inspect inspectors. The true business of people should be to go back to school and think about whatever it was they were thinking about before somebody came along and told them they had to earn a living. — Richard Buckminster Fuller