This is Part 3 of a three-part guide on refactoring JavaScript from imperative and/or object-oriented patterns to declarative functional ones. For a background on the concepts used in this tutorial, refer to Part 1. In Part 2 we applied the concepts of currying, partial application and pointfree style.

Makeover Time (continued)

Recap from Part 2

We have some books:

const books = [
  {
    title: "To Kill A Mockingbird",
    author: "Harper Lee",
    year: 1960
  },
  {
    title: "The Secret History",
    author: "Donna Tartt",
    year: 1992
  },
  {
    title: "Infinite Jest",
    author: "David Foster Wallace",
    year: 1996
  },
  {
    title: "Fight Club",
    author: "Chuck Palahniuk",
    year: 1996
  },
  {
    title: "Harry Potter and the Sorcerer's Stone",
    author: "J.K. Rowling",
    year: 1997
  }
]

And we have some helper functions that wrap imperative code in a functional way:

const filter = predicate => collection => collection.filter(predicate)
const prop = propName => object => object[propName]
const equals = a => b => a === b

Our filtering function is now entirely built upon plain ol’ functions:

const whereYear = year => filter(item => equals(year)(prop("year")(item)))

As mentioned at the conclusion of Part 2, this is a clunky, awkward parentheses salad. Our final step for refactoring this function is to apply automated function composition. The overall strategy for this is to:

  1. Provide any number of functions to create a “pipeline” of functions.
  2. Provide an initial input.
  3. Apply the first function from step 1 to the initial input (step 2), then apply the next function from step 1 to the return value of the first function. This process repeats for however many functions are supplied in step 1. Inputs become outputs, outputs become inputs (see the sketch just below).
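
Before automating anything, it helps to spell the idea out by hand. Here is a minimal sketch using three hypothetical toy functions (not part of our book example):

const addOne = n => n + 1
const double = n => n * 2
const square = n => n * n

// Manual composition: each output becomes the next input.
square(double(addOne(3))) // => 64, i.e. ((3 + 1) * 2) ** 2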

In this tutorial, we will call the function for automated function composition pipe. This function may also be seen in JavaScript and other languages as compose or flow, and even as an operator like |>.

Introducing pipe

First, let’s review how Array.prototype.reduce works. It takes two parameters: a reducer function and an initial value.

The reducer function also takes two parameters, an accumulating value (idiomatically referred to as acc) and an element (el). The return value of the reducer function becomes the accumulator value for the next round of the iteration. The initial accumulator is the initial value that we pass as the second parameter to reduce. Here are some examples.

const sum = numbers => numbers.reduce((acc, el) => acc + el, 0)
sum([1, 2, 4]) // => 7

const fromPairs = pairs =>
  pairs.reduce((acc, el) => Object.assign({}, acc, { [el[0]]: el[1] }), {})
fromPairs([["a", 1], ["b", 2]]) // => { a: 1, b: 2 }

reduce itself does exactly what we want for function composition. It takes an initial value, applies a reducer function to each element in the array, and sends the return value of that reducer function in as the accumulator for the next iteration. All we have to do to write pipe is add an interface that takes any number of functions and turns them into an array:

const pipe = (...funs) => startingValue =>
  funs.reduce((returnValue, fun) => fun(returnValue), startingValue);

Using the ES2015 (ES6) rest parameter syntax (...), we gather any number of functions into an array of functions. startingValue is passed into the first function, and the result of that function becomes the input for the next function in funs. Once the last function is applied, pipe is done and returns the final value, which will be whatever type the last function in funs returns.

const exclaim = str => `${str}!!!`;
const toUpper = str => str.toUpperCase();
const repeat = str => `${str} ${str}`;

const freakout = pipe(
  repeat,
  toUpper,
  exclaim
);
freakout("hey"); // => "HEY HEY!!!"

const freakyFreakout = pipe(
  exclaim,
  repeat,
  toUpper
);
freakyFreakout("hey"); // => "HEY!!! HEY!!!"

Note that pipe composes functions from left-to-right, top-to-bottom. In the first pipeline, exclaim is applied last, so the exclamations only appear once. In the second, exclaim is the first function applied. Some implementations of function composition run right-to-left, bottom-to-top, which comes from the way compositions are described in mathematics.
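
For reference, a right-to-left compose can be sketched by swapping reduce for reduceRight. This helper is just an illustration; we will stick with pipe for the rest of the tutorial.

// The last function listed runs first.
const compose = (...funs) => startingValue =>
  funs.reduceRight((returnValue, fun) => fun(returnValue), startingValue);

const freakoutAgain = compose(
  exclaim,
  toUpper,
  repeat
);
freakoutAgain("hey"); // => "HEY HEY!!!" (same as freakout)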

whereYear with pipe

Now that we have automated function composition at our disposal, let’s clean up the filtering function.

const whereYear = year => filter(item => equals(year)(prop("year")(item)))

Currently, the sequence of our manual composition goes:

  1. Apply equals to year
  2. Apply the resulting function of step 1 to prop("year")(item), which evaluates prop("year") first, then applies the returned function to item.

With that in mind, it is clearer that the only thing we need item for is as an argument to the result of prop("year"). The result of that then gets applied to equals(year). Manual composition generally reads inner-to-outer, right-to-left. To switch to pipe, we'll write these steps from left-to-right, top-to-bottom.

const whereYear = year =>
  filter(item =>
    pipe(
      prop("year"),
      equals(year)
    )(item)
  )

The predicate function takes an item and passes it into a pipeline. First we get the value of the year property on the item, then pass that value to equals(year), which returns true or false based on the values of year and item.year. Now that item is simply handed by the anonymous predicate function to the function pipe returns, we can remove the concrete reference to it and use pointfree style.

const whereYear = year =>
  filter(
    pipe(
      prop("year"),
      equals(year)
    )
  )

Now our only concrete reference is to year. Let’s take this one step further into abstraction land and make this function work for any property name.

Removing year

To make the property we want to compare dynamic, we have to add another parameter to our function. Let’s have it come first.

const wherePropEq = propName => value =>
  filter(
    pipe(
      prop(propName),
      equals(value)
    )
  )

const whereYear = wherePropEq("year")
const in97 = whereYear(1997)
in97(books)
/* => [
  {
    title: "Harry Potter and the Sorcerer's Stone",
    author: "J.K. Rowling",
    year: 1997
  }
]; */

And since checking a property's value against a given value is a common task, let's pull the pipe out into a named helper (most uses of pipe encapsulate a discrete unit of transformation, so it is often helpful to assign that pipe to a named variable for a more declarative style).

const propEquals = propName => value =>
  pipe(
    prop(propName),
    equals(value)
  )

const wherePropEq = propName => value => filter(propEquals(propName)(value))

Using that helper exposes some more obvious inner-to-outer manual function composition, so let's refactor with pipe.

const wherePropEq = propName => value =>
  pipe(
    propEquals(propName),
    filter
  )(value)

Now value is in a position to be substituted for pointfree style.

const wherePropEq = propName =>
  pipe(
    propEquals(propName),
    filter
  )

After receiving its first argument (propName), wherePropEq returns another function that takes a value to load into propEquals, whose result gets passed into filter. At this point, propEquals is only waiting for one more argument, an object, so it serves perfectly as a predicate function for filter. After the first two applications of wherePropEq, we are left with a function that takes an array and applies the partially applied filter function to it. Once the array is supplied, the whole computation finally runs.
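
Concretely, the staged application mirrors the earlier usage example:

const whereYear = wherePropEq("year") // waiting for a value
const in96 = whereYear(1996)          // waiting for an array
in96(books)
/* => [
  {
    title: "Infinite Jest",
    author: "David Foster Wallace",
    year: 1996
  },
  {
    title: "Fight Club",
    author: "Chuck Palahniuk",
    year: 1996
  }
]; */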

You may have noticed that the function signature for wherePropEq is the same as propEquals. That is a hint at an opportunity to refactor to pointfree style. The following implementation is definitely over the top, but a fun demonstration of how lean your code can get using a functions first, data last pattern.

const toArray = x => [x]
const append = x => xs => [...xs, x]
const apply = fn => args => fn.apply(null, args)

const wherePropEq = pipe(
  propEquals,
  toArray,
  append(filter),
  apply(pipe)
)

This function takes an initial input (e.g., 'year'). propEquals gets preloaded with 'year' and is then wrapped in an array. filter is appended to that array, and applying pipe to the array returns another pipeline function.

pipe(
  propEquals('year' /* preloaded */),
  filter
)

At this point the function is exactly the same as the previous implementation, where we had an explicit reference to the propName argument.
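
Tracing a call makes each stage visible; here is a sketch of what each step returns (the intermediate name step1 is for illustration only):

// 1. propEquals("year")  => value => pipe(prop("year"), equals(value))
// 2. toArray(step1)      => [step1]
// 3. append(filter)(...) => [step1, filter]
// 4. apply(pipe)(...)    => pipe(step1, filter)
wherePropEq("year")(1996)(books) // => the two books from 1996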

Review

Let’s take a look at all of our unit level helper functions.

const filter = predicate => filterable => filterable.filter(predicate)
const prop = name => obj => obj[name]
const equals = a => b => a === b
const pipe = (...funs) => startingValue =>
  funs.reduce(
    (returnValue, fun) => fun(returnValue), 
    startingValue
  )
const propEquals = propName => value =>
  pipe(
    prop(propName),
    equals(value)
  )

All of these are pure functions, completely independent from context. Luckily, instead of having to write these wrapper functions by hand, they are readily provided by libraries like Ramda. While it is helpful to write our simple versions of these functions, library code like Ramda's is battle tested, and even auto-curried. That means you can supply a multi-argument function with all of its arguments at once (if you have them), or one at a time.

const R = require("ramda")

const book = { year: 1992 }
R.equals(
  R.prop("year")(book),
  R.prop("year", book)
) // true

These low-level wrapper functions opened us up to composing them into solutions for larger problems.

const wherePropEq = propName =>
  pipe(
    propEquals(propName),
    filter
  )

Now that is functional. More than just refactoring, function composition brought us to rethinking. But was all that effort worth the outcome? The for loop did work, after all, and with far fewer lines of code. This is where I ask you to have faith in planting the seeds of functional programming. The one-off examples (including this series) might seem pointless, but I have found that they pay incredible dividends as anything from a function to a feature grows in size and/or complexity.

By being disciplined at a low level with functional programming patterns, adding or adjusting behavior is much easier. Good code is measured by its ability to change easily over time as requirements change. While the upfront cognitive load of FP can appear excessively clever, it pays off at the first moment things need to be changed or extended. Each building block function being used can be plugged in and out or rearranged at will, all while being extremely testable and predictable.
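
To make the testability claim concrete, here is a sketch of unit checks against the helpers defined above, using nothing but console.assert:

// Pure functions need no setup, mocks, or shared state to test.
console.assert(equals(1996)(1996) === true)
console.assert(prop("year")({ year: 1996 }) === 1996)
console.assert(propEquals("year")(1996)({ year: 1996 }) === true)
console.assert(filter(equals(2))([1, 2, 3]).length === 1)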

Imagine we have some new requirements for our filtering case. Instead of returning entire objects, we want to filter and then only return the titles of the books. Oh, and they should be lowercase titles, because that’s hip. We already have the filtering aspect done, we just need to pass the result of filtering into a mapping function.

const map = mapper => mappable => mappable.map(mapper) // library code
const toLower = str => str.toLowerCase() // library code

const whereYear = wherePropEq("year")
const in96 = whereYear(1996)

pipe(
  in96,
  map(
    pipe(
      prop("title"),
      toLower
    )
  )
)(books)
/* => [ "infinite jest", "fight club" ]; */

This is all pure and declarative. By writing this way, the default task for the developer reading the code is to interpret what it does, not how it works. After learning the concepts introduced in this series, it is much faster to look at and understand declarative code than it is to parse through imperative code. There is of course always an option to dive deeper into implementation details, declarative code just makes that extra effort voluntary instead of required.

Remembering that individual pipes can often be extracted and named for their overall behavior, we could even increase readability by naming map's mapping function.

const lowerTitle = pipe(
  prop("title"),
  toLower
)

pipe(
  in96,
  map(lowerTitle)
)(books)
/* => [ "infinite jest", "fight club" ]; */

With these curried functions that take critical data last, it becomes easy (and fun) to use these functions on any inputs that meet the interface contract. In the case of:

pipe(
  in96,
  map(lowerTitle)
)

the interface requires the input to be an array of objects with year and title properties (at a minimum). So if we all of a sudden have to support collections of movies, paintings, articles, etc., nothing in the function has to change, because it all depends on an interface. While the original for loop solution also only requires an interface, its generalizability is less clear because of variable names and the clutter of imperative filtering. It would just feel weird to pass an array of movie objects into booksInYear, right? Functionally styled code expresses generalizability up front, focuses on the flow of logic, and primes us to recognize patterns in data transformations across domains.
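
For instance, the exact same pipeline works unchanged on a hypothetical collection of movies, as long as each object carries year and title properties:

// Hypothetical data; the pipeline itself is untouched.
const movies = [
  { title: "Fargo", year: 1996 },
  { title: "Trainspotting", year: 1996 },
  { title: "Titanic", year: 1997 }
]

pipe(
  in96,
  map(lowerTitle)
)(movies)
/* => [ "fargo", "trainspotting" ]; */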

Conclusion

By thinking about data transformations functionally, we focus primarily on the transformation, not the particulars of the data being processed. Complex functions emerge directly from composing smaller functions, which can be plugged in and out and rearranged with ease. All that matters is that data types and shapes line up with each input and output of the composed functions.

By applying currying and partial application, pointfree style, and moving our arguments to provide functions first and data last, we implemented a declarative filtering function that is made up entirely of smaller unit functions. At the lowest level, these units depend on imperative code (e.g., equals uses ===), but by wrapping the imperative implementation in a declarative one, we are free to program to, and interpret programs as, an interface instead of an implementation.

While we went through a whole lot of work and added a whole lot of code to get the same result as the for loop implementation, the for loop solves the problem at hand in a one-off way and is not poised to change easily over time as new requirements come in. Solving the problem with function composition results in expressive, intentional, and declarative code now, and positions us to systematically manage complexity in the future.

Thank you for reading. Have FUN out there 😉.
