👯 ReST and modular evolution
Among the key (and contentious) tenets of ReST are versioning and hypermedia as the engine of application state (HATEOAS). HATEOAS is not exactly the core idea here, but notice the complexity of the response JSON and consider how much complexity programming modal responses would add to the code itself. More precisely, HATEOAS involves “control data” that manipulates other resources via link relations, going beyond mere CRUD operations into higher-order or bespoke application semantics, some of which are derived from more basic action semantics. I must emphasize that standard link relations and actions are only standards; users are more variegated and multifaceted than any standards document, living or not, can practically anticipate, so your project may discover a novel kind of user with a special need or capability that you want to reflect in the response so as to cultivate better frontends.
Web developers want to know: even when evaluations of the API from the frontend side (mobile clients, browser clients, CLI clients, etc.) align or diverge, in consumption habits or in recommendations for changing an endpoint’s response, what is the best strategy for managing the changes that constitute evolution of the API? Should the API use a subdomain (e.g., http://v1.example.com/api/...) or a subdirectory approach (e.g., http://api.example.com/v1/...), with clients choosing the “base URL”?
Well, even if ReST’s code-on-demand constraint is “optional” (implementing it implies at least some opinionated commitment to a versioning strategy), the question remains whether it names a desirable property anyway. Pursuing it would require some combination of co-existing API endpoints/bases, information hiding, the costs of side-by-side releases and coordination with internal, external, and public consumers, and “safety net” and integration-testing code for catching errors, not only in use-case workflows but also in edge cases that reveal themselves under different software-development-life-cycle environments and differing database expectations (i.e., unmet replication, consistency, or boundary-crossing requirements mean data may not be available in one environment, which can fail integration tests). In a word: how ought we enable co-evolution of client and server, or even peer-to-peer, such that code-on-demand is achieved as well?
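To make the “base URL” question concrete, here is a minimal sketch of a client that keeps the versioning strategy in one configurable place; the hostnames and version strings are illustrative, not prescriptions:
// Hypothetical client-side config: the versioning strategy lives in one place,
// so switching between a subdomain base and a subdirectory base is a config change.
const baseUrlStrategies = {
  subdomain: (version) => `http://${version}.example.com/api`,
  subdirectory: (version) => `http://api.example.com/${version}`,
};

function makeBaseUrl(strategy, version) {
  return baseUrlStrategies[strategy](version);
}

// makeBaseUrl('subdomain', 'v1')    -> 'http://v1.example.com/api'
// makeBaseUrl('subdirectory', 'v2') -> 'http://api.example.com/v2'
Either way, the version ends up baked into URLs; the rest of this piece looks at loosening that coupling through headers instead.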
Of course, abstractions are the go-to solution to questions of coordination, but at what moment in the code ought such abstractions touch base, hopefully without introducing code smell? Here I suggest we substitute a certain moment in the code where we use if-else statements to facilitate conditional behavior based on user input. Say, e.g., we capture input from a <select> that is used for setting the gender, role, country, or what have you of some entity, a user. Depending on the role, say “editor” or “user” or “buyer”, we might send the user down a workflow that asks them to create a “payment method”, which would involve writing additional values to a database. So, in CQRS terms, a certain “role” would involve more DB writes relative to DB reads than an “admin”, whose account-creation process would not involve setting up a “payment method”, or at least not through the same workflow as a non-admin user. So here is an example; this code seems pretty straightforward:
// Assumes a `models` module whose User model exposes a chainable,
// framework-style query/update API (a convention of this hypothetical codebase).
const models = require('./models');

function setUserRole({ payload }) {
  // non-admin roles also get their payment profile flagged for re-verification
  if (payload.role !== 'admin') {
    return models.User
      .find({ id: payload.userId })
      .setPaymentProfileVerifiedStatus(false)
      .updateRole(payload.role);
  }
  // admins simply have their role updated
  return models.User
    .find({ id: payload.userId })
    .updateRole(payload.role);
}
This code is probably too simplistic to reflect anything actually found in a production environment, but it’s sufficient to get the point across, and it also presumes some framework conventions about how models are configured, which static methods are available, and so on. Basically, something additional happens for a non-admin user, and the code involved might well become more complex in the future: e.g., we might add another method that implies some other workflow of the same order as a “payment method”. Say we integrate a third-party service and we want non-admin users to be able to link their account to that service (really the same kind of action as creating a payment method; e.g., linking to Stripe via its API).
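Here is a sketch of how that growth might look in the if-else style; linkThirdPartyAccount and payload.thirdPartyToken are invented for illustration, not part of any framework:
// Hypothetical future version of the same function: every new non-admin
// workflow adds another branch and another chained call.
function setUserRole({ payload }) {
  if (payload.role !== 'admin') {
    const query = models.User.find({ id: payload.userId });
    if (payload.role === 'buyer') {
      // buyers additionally link an external account (a Stripe-like service)
      return query
        .setPaymentProfileVerifiedStatus(false)
        .linkThirdPartyAccount(payload.thirdPartyToken)
        .updateRole(payload.role);
    }
    return query
      .setPaymentProfileVerifiedStatus(false)
      .updateRole(payload.role);
  }
  return models.User
    .find({ id: payload.userId })
    .updateRole(payload.role);
}
The branching is still readable here, but each new role or workflow multiplies the paths through a single function.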
So let’s look at this idea of modular, or “mosaic”, evolution toward the goal of co-evolution. First, what’s the gist? That the server and the client “think together” in cyberspace; sounds pretty Spinozistic, right? This all turns on Spinoza’s assertion that the order and connection of ideas is the same as the order and connection of things, and that a true idea corresponds to its object. In this case, the “object” is some code we write to function as a pattern-matching contract strategy, allowing client and server to evolve independently while HTTP Accept and Content-Type headers facilitate the common ground through which evolutionary events take place. The server “accepts” application/json-version-*, for instance, and clients ask for a content type/MIME type that specifies which version of a certain response they want to receive. None of this is a new idea, but the code we write is a matter of how we allow for “graceful degradation”.
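As a rough sketch of that handshake (an Express-style server, placeholder buildV1User/buildV2User builders, and a vendor media type that mirrors the keys used later in this piece):
const express = require('express');
const app = express();

// Placeholder representations for this sketch; a real v2 might add links,
// rename fields, etc., which is exactly what clients negotiate over.
const buildV1User = (id) => ({ id, name: 'Ada' });
const buildV2User = (id) => ({ id, profile: { displayName: 'Ada' }, _links: { self: `/users/${id}` } });

// Server side: read the Accept header and let it guide which "generation"
// of the response gets produced, falling back to v1 as the default.
app.get('/users/:id', (req, res) => {
  const accepted = req.get('Accept') || 'application/vnd.example.v1+json';
  if (accepted.includes('v2')) {
    res.type('application/vnd.example.v2+json').json(buildV2User(req.params.id));
  } else {
    res.type('application/vnd.example.v1+json').json(buildV1User(req.params.id));
  }
});

// Client side: explicitly ask for the representation it understands.
fetch('http://api.example.com/users/42', {
  headers: { Accept: 'application/vnd.example.v2+json' },
});
With that handshake in place, the next question is how the server organizes the code it dispatches to: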
// This is a server function that replaces the if-else dispatch
// with a dictionary of role-keyed functions.
function setUserRole({ payload }) {
  const dictOfFunctions = {};
  const param = payload.role;
  const mimeType = payload.contentType; // not used yet; see the next iteration
  dictOfFunctions['admin'] = () => { /* admin workflow */ };
  dictOfFunctions['buyer'] = () => { /* buyer workflow */ };
  dictOfFunctions['seller'] = () => { /* seller workflow */ };
  dictOfFunctions['editor'] = () => { /* editor workflow */ };
  return dictOfFunctions[param]();
}
This is the first iteration of the idea, and it’s unfinished; the point is simply to get the if-else paradigm out of the way. No big deal, right? Into each function we can pass a config object that looks at the “content-type”. But what good is that? What’s the point? What we want to do is use the “content-type” explicitly given in the request to guide our code on the server, as a commitment to “thinking together”:
dictOfFunctions['application/vnd.example.v1+json/buyer'] = () => ...
dictOfFunctions['application/vnd.example.v2+json/buyer'] = () => ...
...
Now, of course, we have introduced some complexity, but at the same time, whatever is to the right of the => can be a modular function we set up as an import or a require('...'). To the left, though, the payload.role we pass in will not directly match these keys unless we use something like indexOf or includes on the key string, as if it were an array we are searching through. And beyond this, we may want to generalize the idea: a semantics for these dictionary keys that is minimally “propositional” or “grammatical”, meaning special characters like /, $, or the dash take on structural meaning, facilitating an “architecture of distinction” between semantic items. “buyer” corresponds to a workflow, while “application/…json” corresponds to something else altogether: it informs the developer themself how to treat the code during refactoring, whether to touch it or not, and so on. What’s more, a given key might actually entail writing to a different database, or some other programmable behavior sufficiently different from that of other functions, versioned along an evolution path or destined to die off once enough clients have evolved far enough to warrant deprecating older generations of features (known through their functions).
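Here is a minimal sketch of that lookup, under the assumption that / is the structural separator and that a crude includes-based match is acceptable; matchHandler is an invented helper:
// Keys follow a small "grammar": <mime type>/<role>, with "/" as the separator.
const dictOfFunctions = {
  'application/vnd.example.v1+json/buyer': () => 'v1 buyer workflow',
  'application/vnd.example.v2+json/buyer': () => 'v2 buyer workflow',
  'application/vnd.example.v2+json/admin': () => 'v2 admin workflow',
};

// Scan the keys for the parts we recognize rather than demanding an exact hit.
// A production version would want a stricter parse than substring matching.
function matchHandler(dict, mimeType, role) {
  const key = Object.keys(dict).find(
    (k) => k.includes(mimeType) && k.includes(role)
  );
  return key ? dict[key] : undefined;
}

// matchHandler(dictOfFunctions, 'application/vnd.example.v2+json', 'buyer')
//   -> the v2 buyer workflow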
Aside from special characters like /, we have a wealth of semantic properties to think about, from environment settings (staging, prod, etc.) to date/timestamp settings, trace IDs, cryptographic grouping (for mesh or P2P organization), and semantic versioning. We can also specify “default” behavior. Both server and clients can use this idea and pattern-match programmatically in order to prioritize (think order of operations) which semantic properties ought to have greater or lesser “specificity” in determining which modular function gets used in the running program, whether API or client. Think of it as “CSS specificity for co-evolving clients and servers”; in a way, it’s thinking in CSS conceptually in order to write scalable, evolvable JS. To say the least, the abstraction used to determine default behavior relative to semantic properties allows server and client developers to manually configure higher-priority functions even without algorithmically pattern-matching.
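To close, here is a minimal sketch of that specificity idea; the properties, the resolve helper, and the scoring scheme (more properties matched wins) are all inventions for illustration:
// Each handler declares the semantic properties it cares about; matching more
// of them means higher "specificity", much like a more specific CSS selector.
const handlers = [
  { match: { role: 'buyer' }, fn: () => 'default buyer workflow' },
  { match: { role: 'buyer', version: 'v2' }, fn: () => 'v2 buyer workflow' },
  { match: { role: 'buyer', version: 'v2', env: 'staging' }, fn: () => 'v2 buyer workflow (staging)' },
];

// Keep only handlers whose declared properties all match the request,
// then prefer the one that matched the most properties.
function resolve(request) {
  const applicable = handlers.filter(({ match }) =>
    Object.entries(match).every(([key, value]) => request[key] === value)
  );
  applicable.sort((a, b) => Object.keys(b.match).length - Object.keys(a.match).length);
  return applicable[0]; // undefined means "no default configured"
}

// resolve({ role: 'buyer', version: 'v2', env: 'prod' }).fn()
//   -> 'v2 buyer workflow' (the staging-specific handler loses on env)
The least specific entry plays the role of the manually configured default, so a client or server that has not evolved yet still lands on something sensible.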