Comparing JavaScript Frameworks to rebuild the Freshworks help widget [Part 2]

[Editor's Note: Late last year we relaunched the Freshworks help widget, which sits on a website and helps businesses support their customers better. In this multi-part blog written by our engineering team, we talk about how we picked the components and technologies that went into it and the improvements we made to it. Part 1 of the series is here.]

As we hunkered down to rebuild the Freshworks help widget from scratch, we evaluated many technologies and components. Our goal, as outlined in the previous post, was to improve the user experience, performance, and accessibility, to decouple certain components, and to enable integrations in the future.

As proponents of Indian Democratic Design, we needed it to be simple, scalable, self-reliant, well-crafted, and affordable. In this article, we will take a look at the metrics we used to compare six different JavaScript frameworks to rebuild our help widget.

Why were we evaluating newer frameworks?

All our Freshworks products are built using the Ember.js framework. While Ember helps us build scalable, dynamic enterprise web applications, it was not well suited to building embeddable widgets, because it consumes a lot of memory and leads to performance issues. We wanted a lightweight alternative for building simple widget-like components that could be embedded onto our customers' websites. We also wanted to explore other industry-grade frameworks that we could later adopt to build our products.

JS Frameworks that we evaluated

Before we began exploring different frameworks, we were already inclined toward a few popular frameworks like React and Preact. These are major players in the industry with wide adoption, community support, and detailed documentation. But we also wanted to explore some newer frameworks and standalone libraries to better validate our preferences. After a thorough look at the stats we had collected, we ended up with the following frameworks: React, Preact, Vue, Svelte, and Glimmer. In addition to this list of frameworks, we also wanted to try out the Web Components specification offered natively by browsers, so that we could avoid framework overhead altogether.
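
As a quick illustration of the native option, here is a minimal sketch of a framework-free widget launcher using the Custom Elements and Shadow DOM APIs. The tag name, markup, and styles are only illustrative, not our actual widget code:

```js
// Minimal sketch of a framework-free custom element using the native
// Custom Elements and Shadow DOM APIs (tag name and markup are illustrative).
class HelpLauncher extends HTMLElement {
  connectedCallback() {
    // Shadow DOM keeps the widget's markup and styles isolated from the host page.
    const shadow = this.attachShadow({ mode: 'open' });
    shadow.innerHTML = `
      <style>
        button { border-radius: 16px; padding: 8px 16px; }
      </style>
      <button type="button">Need help?</button>
    `;
  }
}

// Register the element so <help-launcher> can be dropped into any page.
customElements.define('help-launcher', HelpLauncher);
```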

Metrics used to evaluate

We looked at the following metrics to evaluate each of the frameworks and asked ourselves many questions to better prioritize what we needed.

1) Ease of Use

  • CLI Conventions: Does the framework provide a CLI (Command Line Interface) tool in the first place? How exhaustive are the features provided by the CLI? Is it just an application scaffolding tool, or more than that? How flexibly or rigidly does the CLI enforce the framework's conventions? We were looking for something along the lines of Ruby on Rails for client-side JavaScript applications.
  • Developer Productivity: What level of abstraction does the framework provide to improve developer productivity? With integrated templates that automatically re-render when the underlying data changes, developers don't have to write redundant DOM-manipulation code (see the sketch after this list).
  • Build Pipeline: Does the framework provide tools that assemble the final build file from resources like markup, styles, and JavaScript? Additional packages shouldn't be necessary to complete the build; the CLI should encompass all the tools and libraries needed to compile, transpile, concatenate, and minify the assets.
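
To make the "integrated templates" point concrete, here is a minimal sketch using Vue, one of the frameworks on our shortlist (the mount point and field names are purely illustrative). Updating the data property is all that's needed; the template re-renders itself with no manual DOM code:

```js
// A minimal sketch (not our actual widget code) of an integrated, reactive template.
import Vue from 'vue/dist/vue.js'; // full build of Vue 2, which includes the template compiler

new Vue({
  el: '#help-widget', // illustrative mount point
  data: {
    unreadCount: 0,   // illustrative widget state
  },
  template: `
    <button @click="unreadCount = 0">
      Help ({{ unreadCount }} unread)
    </button>
  `,
  created() {
    // Simulate replies arriving; the rendered count updates automatically.
    setInterval(() => { this.unreadCount += 1; }, 5000);
  },
});
```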

2) Industry Adoption

Are companies using the framework in production-grade applications? Is the framework backed and promoted as part of a larger organization's open-source initiative?

3) Community

How big is the framework's community? Is the community united or fragmented? Are documentation, plugins, add-ons, npm packages, boilerplates, tutorials, and guides available to help developers build the widget?

4) Support

How big is the team behind the framework? Is there a company or organization that owns the framework?

5) Low Memory Footprint

Which framework has the lowest memory footprint? We were looking for a framework that is cheap to download and parse.

6) Performance

How well do apps built with the framework perform? We wanted to ensure that once the application boots, the framework updates and modifies the DOM quickly.

How big is the package, and what is its memory usage like? The framework should keep the memory it allocates in the browser under control, so the application does not leak memory over time.

7) Conceptual Integrity

The framework should support a consistent model of implementation with respect to both architecture and design. In other words, it should do one task and do it well. Since we were primarily looking for a framework based on the component paradigm, we wanted solid support for nested components, inter-component communication, an event model, template interpolation, and state management (see the sketch below).
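
As an illustration of what we were checking for, here is a hypothetical parent/child pair sketched in Vue (component names, props, and events are made up): the child is nested inside the parent, receives data via props, and communicates back up via a custom event, while the parent owns the shared state that the template interpolates.

```js
// Hypothetical sketch of the component features we compared: nesting,
// props down, events up, template interpolation, and parent-owned state.
import Vue from 'vue/dist/vue.js'; // full build of Vue 2, includes the template compiler

// Child component: receives data via props and communicates upward via events.
Vue.component('vote-button', {
  props: ['label'],
  template: `<button @click="$emit('voted', label)">{{ label }}</button>`,
});

// Parent component: nests the child, passes props down, listens for its events.
new Vue({
  el: '#help-widget', // illustrative mount point
  data: { lastVote: 'none' },
  template: `
    <div>
      <vote-button label="Helpful" @voted="lastVote = $event"></vote-button>
      <vote-button label="Not helpful" @voted="lastVote = $event"></vote-button>
      <p>Last vote: {{ lastVote }}</p>
    </div>
  `,
});
```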

8) Learning Curve

How easy is it for an experienced developer to learn the framework?

9) Code Style

How readable and intuitive are the framework's code and conventions?

10) Maturity

How mature is the framework? How long has it been production-tested? How clear is its future?

11) Single File Components

Single-file components are a huge advantage for developers. They keep a component's markup, styles, and behavior encapsulated within a single file for easy maintenance. How intuitive does the framework make browsing and maintaining a component?
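
For illustration, here is what a hypothetical single-file component might look like in Vue (the file name and contents are made up; Svelte follows a very similar pattern). Markup, behavior, and scoped styles all live side by side in one file:

```
<!-- HelpWidget.vue: a hypothetical single-file component -->
<template>
  <button class="launcher" @click="open = !open">
    {{ open ? 'Close help' : 'Need help?' }}
  </button>
</template>

<script>
export default {
  name: 'HelpWidget',
  data() {
    return { open: false }; // behavior lives next to the markup it drives
  },
};
</script>

<style scoped>
.launcher {
  border-radius: 16px; /* styles are scoped to this component only */
}
</style>
```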

12) Flexibility

How many features does the framework offer out of the box?

Does the framework offer both the features needed to manage complexity in modern web applications and an integrated development toolkit that enables rapid iteration? Which of its major features can we not do without? How many of its features are mandatory? How easy is it to customize the framework?

13) Tooling

How well do the features provided by the framework work together? What tools are available for it? How many stable plugins exist for the framework?

The tool we used to measure performance

Once a framework is chosen, it is necessary to validate its performance, and we needed a tool for that. Since we were in the early stages of developing the architecture, it was not feasible to host the prototype applications on servers and use a third-party profiling service for performance analytics. Instead, we needed a versatile, flexible tool that could run performance audits in our local development environment. This is where Lighthouse comes into the picture.

What is Lighthouse Audit?

Lighthouse is an open-source tool. When run against any web page, it can audit performance, accessibility, progressive web apps, and more.

[Screenshot of the Lighthouse audit tool]

You can run Lighthouse in Chrome DevTools, from the command line, or as a node module. Once you give Lighthouse a URL to audit, it runs a series of tests against the page and generates a report on how well the page did. Failing audits are listed in the report as well, serving as indicators of where and how to improve the page. Each audit has a reference document explaining why it is important and how to fix the failures it flags.
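
As a concrete illustration, here is a minimal sketch of driving Lighthouse programmatically as a node module with the chrome-launcher package; the URL, port, category filter, and output path are placeholders, and the DevTools Audits panel or the `lighthouse` CLI produce the same kind of report.

```js
// Minimal sketch: run a Lighthouse performance audit against a local prototype.
const fs = require('fs');
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

(async () => {
  // Launch a headless Chrome instance for Lighthouse to drive.
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });

  const result = await lighthouse('http://localhost:4200', {
    port: chrome.port,            // connect Lighthouse to the launched Chrome
    output: 'html',               // also produce the HTML report
    onlyCategories: ['performance'],
  });

  // result.lhr holds the parsed results; result.report is the HTML report.
  console.log('Performance score:', result.lhr.categories.performance.score * 100);
  fs.writeFileSync('lighthouse-report.html', result.report);

  await chrome.kill();
})();
```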

Parameters of Lighthouse

The Lighthouse report gave us an overall scorecard for performance and best practices. The scores gave us a general understanding of how each prototype application performed, and the underlying numbers let us draw conclusions about each framework's performance. The overall score breaks down into the following metrics.

The metrics on Lighthouse

First paint and First contentful paint

The First Paint metric marks the point at which the browser paints the first pixel on the page. By contrast, the First Contentful Paint metric marks when the first bit of content from the DOM is painted.
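
These paint timings are not specific to Lighthouse; in Chromium-based browsers a page can also read them itself through the Performance API. A minimal sketch:

```js
// Minimal sketch: observe the browser's own paint entries, which report the
// same first-paint and first-contentful-paint timings that Lighthouse surfaces.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // entry.name is 'first-paint' or 'first-contentful-paint'
    console.log(`${entry.name}: ${entry.startTime.toFixed(0)} ms`);
  }
}).observe({ type: 'paint', buffered: true });
```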

First meaningful paint (FMP)

The FMP metric measures the time it takes for a page’s primary content to appear on the screen.

Time to interactive

Time to Interactive (TTI) marks the point at which the JavaScript main thread has been idle for several seconds, and indicates how reliably the application can respond to user input.

Speed Index

Speed Index is a page load performance metric that measures the average time at which visible parts of the page are displayed.

To know more about these metrics, check out the User-Centric Performance Metrics guide on Google's web developer portal.

Now that we’re familiar with the metrics we used to evaluate various frameworks, in the next post we’ll get down to the details and look at how each framework performed. Stay tuned.