Normalize your complex JS objects

Data normalization

The process of normalization is heavily used in software design because normalizing data has a big impact on reducing data redundancy.

When to normalize your data?

Suppose we received the following data from an API:

const apiData = [
    {
        id: 1,
        name: "Daniel Popa",
        siteUrl: "danielpdev.io"
    },
    {
        id: 2,
        name: "Other Name",
        siteUrl: "danielpdev.io"
    }
];

Now, you get a task: find the item with an id of 1.

How will you solve it?

1. Naive solution with complexity O(n):

Iterate over the whole collection using find and return the result.

const findId = (apiData, id) => apiData.find(el => el.id === id);
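
For example (a quick usage sketch; the result shown in the comment assumes the apiData array above):

findId(apiData, 1); // => { id: 1, name: "Daniel Popa", siteUrl: "danielpdev.io" }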

You finished quite fast and went for a coffee, but over the next few months the data grows: instead of only two elements you now have 10000. The time spent searching for elements will increase considerably.

2. Normalized solution with complexity O(1):

Transform the data from [object, object, ...] => { id: object }:

const apiData = [
    {
        id: 1,
        name: "Daniel Popa",
        siteUrl: "danielpdev.io"
    },
    {
        id: 2,
        name: "Other Name",
        siteUrl: "danielpdev.io"
    }
];

// Returns a reducer that indexes each item by the given key
function assignBy(key) {
    return (data, item) => {
        data[item[key]] = item;
        return data;
    };
}

const optimizedData = apiData.reduce(assignBy("id"), {});
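
As a side note, the same transformation can also be written with Object.fromEntries and map (a modern alternative, not part of the original snippet):

// Build the same id => object lookup in one line
const optimizedDataAlt = Object.fromEntries(apiData.map(item => [item.id, item]));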

The optimizedData variable now looks like this:

{
  "1": {
    "id": 1,
    "name": "Daniel Popa",
    "siteUrl": "danielpdev.io"
  },
  "2": {
    "id": 2,
    "name": "Other Name",
    "siteUrl": "danielpdev.io"
  }
}

Now, searching for an element becomes really easy: just optimizedData[id] and your data is ready.
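
For example (a small sketch using the normalized object above; findByIdFast is just an illustrative name):

const findByIdFast = (data, id) => data[id];
findByIdFast(optimizedData, 1); // => { id: 1, name: "Daniel Popa", siteUrl: "danielpdev.io" }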

Conclusion:

Normalize your data only when you are dealing with complex objects and searching for them starts to take too long.

Article first posted on danielpdev.io

Follow me on Twitter
