I'm a huge fan of functional programming. The JavaScript language treats functions as first-class-citizens (meaning, they're just another variable) and so there are ample opportunities for coding in a functional style.

Some of my favourite functions are on the Array prototype: filter, map, reduce. I like that these are chainable, meaning that it's possible to create a pipeline of functions to execute for a dataset. The more formal name for this pattern is Fluent Interface. In this post, I cover these functions and provide some examples of how to use them.
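As a quick preview of that chaining, here's what a small pipeline looks like (the numbers are purely illustrative; each function is covered in detail below):

```javascript
// Sum the doubled values of every even number in the list.
const numbers = [1, 2, 3, 4, 5, 6];

const result = numbers
  .filter((n) => n % 2 === 0)      // keep the even numbers: [2, 4, 6]
  .map((n) => n * 2)               // double each one: [4, 8, 12]
  .reduce((acc, n) => acc + n, 0); // sum them: 24
```

Because filter and map each return a new array, the next call in the chain can be tacked straight on, which is what makes the Fluent Interface pattern work.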

Digging Into the Functions

filter Copies All Array Entries That Pass a Test

Use filter when you need only those elements of an array for which a condition evaluates to true.

Filtering a dataset before transforming the relevant data minimises the number of operations performed. This is of particular importance for large datasets or for long chains.
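The ordering point can be demonstrated by counting how often the transform callback actually runs (a small sketch with an illustrative array and counter):

```javascript
const values = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];

// Filtering first means the (potentially expensive) transform
// only runs for the elements that survive the filter.
let transformCalls = 0;
const squaresOfSmall = values
  .filter((x) => x < 5) // 4 elements remain
  .map((x) => {
    transformCalls += 1; // count how often the transform runs
    return x * x;
  });
// squaresOfSmall is [1, 4, 9, 16]; transformCalls is 4, not 10
```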

The C# version of this function is more readable to me (it's .Where) as the filter operation retains the items that satisfy the test. It filters out the items that don't!

A filter statement requires a callback function that returns a boolean - true to keep the element in the result, false to discard it.

[1, 2, 3, 4, 5, 6, 7, 8, 9, 10].filter(x => x < 5);
// returns [1, 2, 3, 4]

map Transforms Each Value in the Array

Use map to create a new array of transformed elements.

I use mapper functions often when handling data from an API call, typically to transform a collection of data into domain models that I control. I frequently find myself then mapping from domain model objects into another format specific to a UI component.

As with filter, map accepts a callback function, executed for each element in the array, and whose return value becomes an element in the resultant array.

[1, 2, 3, 4, 5, 6, 7, 8, 9, 10].map(x => x * x);
// returns [1, 4, 9, 16, 25, 36, 49, 64, 81, 100]
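The API-to-domain-model mapping I described might look something like this (the payload shape and field names here are hypothetical, for illustration only):

```javascript
// A hypothetical API response with snake_case fields.
const apiUsers = [
  { user_id: 7, full_name: "Ada Lovelace", created_at: "1815-12-10" },
  { user_id: 8, full_name: "Grace Hopper", created_at: "1906-12-09" },
];

// Map the raw payload into domain models under my control.
const users = apiUsers.map((u) => ({
  id: u.user_id,
  name: u.full_name,
  joined: new Date(u.created_at),
}));
// users[0] is { id: 7, name: "Ada Lovelace", joined: <Date> }
```

Note the `({ ... })` wrapping in the arrow function: without the parentheses, the braces would be parsed as a function body rather than an object literal.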

reduce Consolidates All the Array Values to a Single Value

Use reduce when the array elements need to be combined into a single value. The reduce function has a slightly different signature to map and filter, because it works slightly differently.

It's best to read the call signature back-to-front. The second argument is the initial value for the result. It could be 0 if the reducer is summing all the array elements, or an empty object {} if the result is an object.

The callback function takes two main arguments - the current value of the accumulator and the current array element. Its return value becomes the accumulator argument passed to the next callback.

[1, 2, 3, 4, 5, 6, 7, 8, 9, 10].reduce((acc, x) => acc + x, 0);
// returns 55, whereas
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10].reduce((acc, x) => acc + x, 100);
// returns 155, and
[["eggs", 1], ["ham", 2], ["cheese", 3]].reduce((acc, x) => {
  acc[x[0]] = x[1];
  return acc;
}, {});
// returns { "eggs": 1, "ham": 2, "cheese": 3 }

The reduceRight variant applies from right-to-left.
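The direction only matters when the reducing operation is not commutative. String concatenation makes the difference visible (a minimal illustrative sketch):

```javascript
const letters = ["a", "b", "c"];

// reduce folds left-to-right, reduceRight folds right-to-left.
const leftToRight = letters.reduce((acc, s) => acc + s, "");      // "abc"
const rightToLeft = letters.reduceRight((acc, s) => acc + s, ""); // "cba"
```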

Putting It All Together

Let's put this learning into practice with some real-world code.

For a recent side project, I needed to query the JIRA API to list all the completed and active sprints for a project. The JIRA API returns a lot of information, not all of which my application needs. I used the filter and map functions to exclude any future sprints, then transform the returned objects to include only the properties I need, in the format I want:

const data = await getFromJiraAPI(boardId);
return data
  .filter((item) => item.state !== "future")
  .map((item) => ({
    id: `${item.id}`,
    name: item.name,
    state: item.state,
    goal: item.goal,
    started: toMs(item.activatedDate),
    finished: toMs(item.completeDate || item.endDate),
  }));

In another part of the code, I have an array of objects containing timing data and need to produce a sum total of all the timings. This calls for a reduce.

return flowTimes.reduce(
  (acc, entry) => {
    acc.leadTime += entry.leadTime;
    acc.cycleTime += entry.cycleTime;
    acc.activeTime += entry.activeTime;
    acc.idleTime += entry.idleTime;
    acc.blockedTime += entry.blockedTime;
    return acc;
  }, {
    leadTime: 0,
    cycleTime: 0,
    activeTime: 0,
    idleTime: 0,
    blockedTime: 0,
  }
);

I wanted to visualise the proportion of stories versus defects and technical-debt tickets in a sprint as a stacked bar chart. Given an object listing each ticket type and the count of that type:

const ticketTypesCount = { "Story": 10, "Defect": 4, "TechDebt": 2 }

I can generically calculate the total number of tickets:

const total = Object.keys(ticketTypesCount)
  .reduce((acc, k) => acc + ticketTypesCount[k], 0);

Object.keys is a useful function for producing an array containing each property name of an object. For the object above, it outputs ["Story", "Defect", "TechDebt"]. Each invocation of the reduce callback then receives one key with which to index the ticketTypesCount object.

Dynamic access to an object's property uses the square-bracket syntax, just like accessing an array element at an index. When k = "Story", ticketTypesCount[k] is functionally equivalent to ticketTypesCount.Story, except that the property name did not need to be hard-coded!
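An alternative worth knowing is Object.entries, which yields [key, value] pairs and so avoids the bracket lookup entirely. A small variation on the same total (the variable name is mine, purely illustrative):

```javascript
const ticketTypesCount = { Story: 10, Defect: 4, TechDebt: 2 };

// Object.entries yields pairs like ["Story", 10], so the reducer
// can destructure the count directly instead of indexing by key.
const totalTickets = Object.entries(ticketTypesCount)
  .reduce((acc, [, count]) => acc + count, 0);
// totalTickets is 16
```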

To 'stack' the data, transform each data point to the proportion of the total it accounts for. This requires a map:

const band = Object.keys(ticketTypesCount)
  .map((k) => ({
    type: k,
    value: ticketTypesCount[k] / total
  }));
// returns [
//   { type: "Story", value: 0.625 },
//   { type: "Defect", value: 0.25 },
//   { type: "TechDebt", value: 0.125 }
// ]

These three functions, map, filter, and reduce, together with supporting functions such as Object.keys, cover the vast majority of data manipulation needs. I also find the code cleaner than the alternative of doing all the operations over the array in one large for statement, as there is less state to maintain overall. Each operation remains nicely contained in its own callback function in the pipeline.

Why not give this style a go? See if it makes a difference!