Why we transitioned from React to Svelte(Kit) at OSlash

React vs. Svelte(Kit) is a debate that divides engineers the world over. Here's why we chose to transition from the former to the latter at OSlash.

“Simplicity is the ultimate sophistication,” said Leonardo Da Vinci. We might know him as a gifted painter but he was also an astute engineer and inventor to whom the simplest things were the most remarkable. 

His maxim also applies to the dynamic world of web development, as we found out the hard way at OSlash. We were dealing with the complexities of maintaining our frontend stack, crafted with React.js and orchestrated by Next.js, a meta framework for React.js apps.

At first, React.js and Next.js seemed like a dream team, offering fantastic benefits. But as OSlash grew and our projects became more intricate, we faced a tough question: How do we maintain code quality while ensuring optimal performance?

This is the story of how we pivoted to Svelte and SvelteKit, finding elegant solutions to our challenges and rethinking our approaches from first principles. Grab some popcorn if you can!

(Note: We wrote this blog post assuming our readers’ familiarity with Svelte and SvelteKit.)

Not good, definitely old times

We’re a tech startup. We build software. And our aim has always been to build great stuff that delights our users.

Since the frontend/UI of an app is where users get to experience the product and its capabilities, it needs to be top-notch. To achieve this, we built what we thought was the best frontend stack using React. 

To manage our user dashboard, we used Next.js as the meta framework. We also had a browser extension, which was a pure React app. This setup sat super cozily in the frontend folder of our monorepo.

Next.js provided us with a lot of powerful capabilities, including but not limited to authentication, server-side rendering, in-app state management, and data fetching.

Not ones to take anything for granted, we paid special attention to component purity, side-effect management, race conditions, preventing extra renders, and all the other common mistakes people make while developing React components.

But as our product and teams grew in scale and complexity, hygiene suffered. This happens at many rapid-growth startups where interns and senior devs collaborate to get things done faster. Our teething troubles resulted in a monolithic frontend app with escape hatches like useEffect sprinkled everywhere. To add to it, the app's Lighthouse scores were way below what they should have been, and its bundle size was bloated.

The quest for an ideal solution

With so many problems plaguing the system, we needed an ideal solution and needed it quickly. 

So we began with an internal, exhaustive code review. 

The result? This list of inherently complex and surprisingly avoidable issues:

1. More and more boilerplate

React, in many ways, is like a child to whom you need to issue repetitive instructions. In many cases, you have to hold your React app's hand and guide it to eternal bliss. You have to wrap parts of your code in performance hooks like useMemo and useCallback, just to tell React not to redo unnecessary work on re-renders.

This boilerplate adds complexity, increases code size, and makes the code harder to read and maintain.
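As a quick, made-up illustration (the component and data here are hypothetical, not from our codebase), this is the kind of ceremony we mean:

```tsx
// Illustrative only: without useMemo/useCallback, `filtered` would be recomputed
// and `onSelect` would get a new identity on every render, re-rendering the
// memoized <List> for no reason.
import { memo, useCallback, useMemo, useState } from 'react';

const List = memo(function List(props: { items: string[]; onSelect: (item: string) => void }) {
  return (
    <ul>
      {props.items.map((item) => (
        <li key={item} onClick={() => props.onSelect(item)}>
          {item}
        </li>
      ))}
    </ul>
  );
});

export function Search({ items }: { items: string[] }) {
  const [query, setQuery] = useState('');

  // Boilerplate we have to remember to add, just to avoid wasted work on re-renders
  const filtered = useMemo(() => items.filter((i) => i.includes(query)), [items, query]);
  const onSelect = useCallback((item: string) => console.log('selected', item), []);

  return (
    <>
      <input value={query} onChange={(e) => setQuery(e.target.value)} />
      <List items={filtered} onSelect={onSelect} />
    </>
  );
}
```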

2. No containment of side effects

We loved the functional approach that React came with. But there was no clear-cut way to contain and handle side effects. Purely functional languages and frameworks like Elm provide a much cleaner way to do this.

With React, we had to use escape hatches, refs, and other effect-related hooks whose dependencies, if not specified accurately, would result in lots of extra renders (dreaded by the entire human race!) and less performant code.
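A hedged example of how easily this goes wrong (the component and API call are made up): a dependency that changes identity on every render turns a single fetch into an endless loop of renders and refetches.

```tsx
// Illustrative only: `options` is a new object on every render, so the effect
// re-runs after every render: fetch, setState, re-render, fetch again.
import { useEffect, useState } from 'react';

async function fetchResults(options: { id: string; limit: number }) {
  // Hypothetical API call
  return fetch(`/api/results/${options.id}?limit=${options.limit}`).then((r) => r.json());
}

export function Results({ userId }: { userId: string }) {
  const [data, setData] = useState<unknown>(null);
  const options = { id: userId, limit: 20 }; // new identity on every render

  useEffect(() => {
    fetchResults(options).then(setData);
  }, [options]); // the dependency is never "equal" to its previous value

  return <pre>{JSON.stringify(data)}</pre>;
}
```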

3. No visible separation of client and server code

In Next.js, server and client code sit side by side in the same page file. If you're not careful, this can lead to server code leaking into the client bundle.

If server code is exposed on the client side, it may be susceptible to vulnerabilities and exploits that are specific to client-side environments. Attackers can leverage these vulnerabilities to compromise the application, steal data, or perform unauthorized actions.

It can also expose a business’ intellectual property, making it easier for competitors to replicate or reverse-engineer the code and potentially undermine its competitive edge.

4. No reactivity

The app we were building—an advanced launcher—was expected to react to the user's every keystroke. This required detailed logs, and it became increasingly tough for us to keep track of it all. We had to use external support libraries like Redux and RxJS. These are marvels in themselves, no doubt, but they too add a certain layer of complexity to the app.

5. Retrospection

As our team became painfully aware of these DX and performance issues, one thing was very clear: building with React was unnecessarily complicated, and even the smallest of errors could have huge performance implications. We started trying different ways to tackle them.

For instance, we tried our hand at Astro. The main motivation was to ship negligible JavaScript to the client to improve the first load. To account for side effects like synchronization and data fetching/loading, we made extensive use of service workers and web workers.

For syncing data between our user dashboard and browser extension, we required unified, domain-wide storage. For this we leveraged IndexedDB and implemented mechanisms for caching and optimistic updates using URQL. All these efforts gave us good performance, but at the cost of extra sophistication in the architecture.

There was no ideal solution, only tradeoffs, and more often than not the tradeoff we picked came at the expense of DX.

We chose to head toward simplicity and minimal boilerplate.

In retrospect, we could have been stricter in our code reviews and should have established SOPs and declarative internal docs to follow to maintain hygiene in our codebase. But is that really enough to guarantee a high-quality developer experience every time you want to introduce a new feature or change an existing one?

I’ll let you figure that out while we continue the story.

Pivot and the arrival of Svelte(Kit)

While we were building these engineering capabilities, our product teams were also hard at work, brainstorming ideas to build a product that fit in with our guiding principle of ‘ease of access and productivity’ in a more fundamental way. 

As we zeroed in on the construct of the OSlash Copilot, all the teams within the engineering department were given the freedom to choose their own tech-stack and build it from scratch. 

While this does sound like an engineer's dream, it was actually quite a high-responsibility task, as the team would be accountable for making it scalable and reliable. To put the cherry on the cake, we had to make sure we put our past learnings to use and avoided repeating our mistakes.

The frontend for the copilot needed an interactive dashboard and a web SDK, which would be imported and used as a dependency in our customers' projects/products. The key requirements and specifications can be grouped as follows:

1. Performant and reliable

This SDK needed to handle all the heavy lifting of user interactivity and API calls, so it had to be performant and reliable.

2. Lean and lightweight

The first alarm bells that go off in any developer's mind when importing a dependency are about its size. The SDK had to be as lightweight as possible; we were aiming for a gzipped bundle size of less than 50 KB.

3. Shipped at speed

To gain an early mover advantage in the era of copilots, we needed to ship things at speed.

The frontend team started coming up with solutions that fit these criteria.

We seriously considered three frameworks: Vue, Svelte, and React itself.

Having based more than a few hobby projects on Svelte, I decided to give it a go.

We already had a launcher (think something like Spotlight in Mac or Cortana in Windows) in place, which shared a lot of similarities with the OSlash Copilot.

I made a simple version of that Launcher using Svelte. It literally took me 30 minutes to come up with a basic functional version.

We were all very impressed with the developer experience and the speed. After looking further into the growing ecosystem around Svelte and SvelteKit, we decided to take a bet on it.

The transition to Svelte(Kit) and the results

We decided to use SvelteKit, a meta-framework for Svelte, to build our user-facing dashboard. For the SDK, we used plain Svelte, which builds down to a JavaScript file. This file could be consumed by our customers as an NPM dependency or imported directly into their HTML templates as a script served from a CDN.

The above setup ended up working very well for us, as you’ll see in a minute from the positive results we got. Some of the benefits are specific to our use cases and some are generic. One common pattern is the ease of development with Svelte(Kit). 

1. Reusability

Internally, we developed and published a UI component library using SvelteKit. SvelteKit comes with a built-in package command, which you can use to build and package/publish your component libraries. This component library mainly exported our Copilot widget. One instant benefit was reusability, as we were using the same Copilot widget in our SDK as well as in our dashboard previews.
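As a rough sketch (the package and component names here are illustrative, not our actual ones), both the SDK and the dashboard previews render the widget from the same import:

```svelte
<script lang="ts">
  // Same import in the SDK and in the dashboard's preview pages;
  // '@oslash/ui' and CopilotWidget are illustrative names.
  import { CopilotWidget } from '@oslash/ui';
</script>

<CopilotWidget />
```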

2. Strong SSR support

In the dashboard, we used SvelteKit's load functions, which fetch and load data during page loads. They run on the server (SSR) and reside in separate +page.server.ts and +layout.server.ts files, one per route. This gave us easy separation of concerns and modular scalability.
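A minimal sketch of what such a load function looks like (the route and endpoint are illustrative):

```ts
// src/routes/dashboard/+page.server.ts
import type { PageServerLoad } from './$types';

export const load: PageServerLoad = async ({ fetch }) => {
  // Runs on the server during SSR; the result reaches the page as `data.copilots`
  const res = await fetch('/api/copilots'); // illustrative endpoint
  return { copilots: await res.json() };
};
```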

3. Promise streaming

One beautiful thing about these server-side load functions is that they support streaming of promises. This is super useful for data-heavy pages. We used it on our analytics page.

Here’s how.

We needed to call a bunch of APIs to display different categories of analytics results. Let's call these 'Initial APIs'. On top of this, we needed to make more API calls if the user wanted to know more about a particular analytics result. Let's call these 'Know More APIs'. The user would click on a dropdown to view this 'Know More' result.

A naive, brute-force implementation would be to call the 'Initial APIs' on page load and make client-side 'Know More API' calls when the user clicks on 'Know More'. The problem with this is that the user has to wait until the 'Know More API' resolves its promise.

Naive approach to load API results

Another approach would be to load everything on page load: first call the 'Initial APIs', then, on getting the initial results, call the 'Know More APIs'. Only once everything has resolved does the page render. Clearly, this would result in quite a lengthy page load.

Another approach to load API results

A better approach—the one we implemented—was to call the 'Initial APIs' in parallel on page load. On getting the results, do two things: render the page, and, alongside, call the 'Know More APIs' and stream their promises to the client using SvelteKit's streaming interface. All of this still happens on the server side. SvelteKit enabled us to do it in a simple and crisp way.

Better approach to load API results

In this manner, we still maintained a 100% Lighthouse performance score. To know more about this, check out SvelteKit's documentation.
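Roughly, the analytics load function looked like the sketch below (endpoint names are illustrative): the fast 'Initial APIs' are awaited so the page can render right away, while the slower 'Know More' call is returned un-awaited, nested one level deep, so SvelteKit streams its promise to the client.

```ts
// src/routes/analytics/+page.server.ts
import type { PageServerLoad } from './$types';

export const load: PageServerLoad = async ({ fetch }) => {
  // Call the "Initial APIs" in parallel and wait for them before rendering
  const [usage, queries] = await Promise.all([
    fetch('/api/analytics/usage').then((r) => r.json()),
    fetch('/api/analytics/queries').then((r) => r.json())
  ]);

  return {
    usage,
    queries,
    // Returned un-awaited and nested, so SvelteKit streams the promise to the
    // client; the page shows the "Know More" section inside an {#await} block.
    streamed: {
      knowMore: fetch('/api/analytics/details').then((r) => r.json())
    }
  };
};
```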

4. Native yet powerful form interface

Another simple, yet elegant thing that we leveraged in SvelteKit was its form interface. 

Good old HTML forms and their component elements have a lot of built-in support for submission and validation. SvelteKit lets us tap into that natively.

For example, in our Login page, we used simple HTML forms and relied on form submission methods to do the authentication.

With SvelteKit, the submitted data goes to the route's +page.server.ts file, which handles the authentication logic on the frontend server and passes the result to the client.

All of this works even if JavaScript is disabled on the client. If JavaScript is enabled, you can progressively enhance the form: tap into the submitted data, do some client-side validation, or add loading states before passing it to the frontend server.

We did basic sanitization-related validation and loading states on the client side, and the rest of the validation and submission logic (API calls, setting auth cookies, throwing login-failure errors, etc.) on the server side. All of this is handled out of the box by SvelteKit; no other form libraries are required. Sweeeet!
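Here's a simplified sketch of our login action (the field names, endpoint, and cookie details are illustrative): the plain HTML form posts to this default action whether or not JavaScript is available.

```ts
// src/routes/login/+page.server.ts
import { fail, redirect } from '@sveltejs/kit';
import type { Actions } from './$types';

export const actions: Actions = {
  default: async ({ request, fetch, cookies }) => {
    const form = await request.formData();
    const email = form.get('email');
    const password = form.get('password');

    // Server-side validation
    if (!email || !password) {
      return fail(400, { message: 'Email and password are required' });
    }

    // Illustrative auth API call
    const res = await fetch('/api/login', {
      method: 'POST',
      body: JSON.stringify({ email, password })
    });
    if (!res.ok) {
      return fail(401, { message: 'Login failed' });
    }

    // Set the auth cookie and send the user to the dashboard
    const { token } = await res.json();
    cookies.set('session', token, { path: '/', httpOnly: true, secure: true });
    throw redirect(303, '/');
  }
};
```

On the page itself, a plain `<form method="POST">` is enough; when JavaScript is available, `use:enhance` from `$app/forms` layers client-side validation and loading states on top.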

Powerful form interface

5. SvelteKit hooks

For app-wide functionality like authentication, fetch request interception, and logging, we utilized SvelteKit hooks (I will refer to them simply as hooks from now on). These hooks are app-wide functions you declare that SvelteKit calls in response to specific events, giving you fine-grained control over the framework's behavior. Here's how we used them:

i. Authentication

When you have an authentication-based app, you need to take a lot of care in separating guarded routes from public ones. The hooks.server.ts file contained our server-side hook functions, which are triggered whenever any of our server-side load functions run. On loading any page, before its server-side code executes, the handle hook fetches the authentication cookie stored on the client, verifies it, decodes it to get the user, and stores the user object for as long as the session lasts. This user object is then available throughout the app.

With SvelteKit, we grouped all our guarded routes under one layout. This layout has a +layout.server.ts file, which checks the user object (stored by the handle hook above) and only lets the request through for a valid user.

So we only had to handle authentication once, and all our app routes are secured and well authenticated.
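In code, the setup looks roughly like this (the cookie name, the verifyAndDecode helper, and the (app) route group are illustrative; it also assumes `user` is declared on App.Locals in app.d.ts):

```ts
// src/hooks.server.ts
import type { Handle } from '@sveltejs/kit';

// Illustrative stand-in for verifying the cookie and decoding the user from it
async function verifyAndDecode(token: string): Promise<{ id: string; email: string } | null> {
  /* verify the signature, decode the payload, look up the user, ... */
  return null;
}

export const handle: Handle = async ({ event, resolve }) => {
  const token = event.cookies.get('session');
  // Make the user available to every server-side load function via event.locals
  event.locals.user = token ? await verifyAndDecode(token) : null;
  return resolve(event);
};
```

```ts
// src/routes/(app)/+layout.server.ts: guards every route grouped under (app)
import { redirect } from '@sveltejs/kit';
import type { LayoutServerLoad } from './$types';

export const load: LayoutServerLoad = async ({ locals }) => {
  if (!locals.user) throw redirect(303, '/login');
  return { user: locals.user };
};
```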

Authentication

ii. Fetch request interception

Another benefit of hooks is request interception. The API calls we make to the backend require the auth token to be present as a cookie in the request headers.

Instead of writing a separate universal fetch function to add this auth token, we utilized the handleFetch function provided in our hooks.server.ts file to intercept fetch requests made by the server and attach the auth token as a cookie header before the request is sent to the backend.

This gave us service-worker-like capabilities, but in a much more reliable and robust way, as it all stays on the server side.
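A minimal sketch of that interception (the backend URL and cookie name are illustrative):

```ts
// src/hooks.server.ts
import type { HandleFetch } from '@sveltejs/kit';

export const handleFetch: HandleFetch = async ({ event, request, fetch }) => {
  // Attach the auth cookie to server-side fetches going to our backend
  if (request.url.startsWith('https://api.example.com/')) {
    request.headers.set('cookie', `session=${event.cookies.get('session') ?? ''}`);
  }
  return fetch(request);
};
```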

Client-side hooks, which are triggered before client-side functions run, are also provided, and we plan to use them in the future to implement optimistic updates and caching.

Request Interception

6. Lightweight SDK bundle

Our SDK was built on just Svelte, as we wanted it to be lean and simple. Unlike React, which is a library, Svelte is a language that comes with its own compiler. What this means is that you only ship what you need, whereas with React you ship the entire 130-140 KB library with your application.

React's gzipped size is around 40 KB, whereas Svelte's is less than 2 KB. This vast difference cannot be ignored and was one of the practical reasons we adopted Svelte, apart from the reactivity and rendering issues associated with React.

Our current SDK's gzipped bundle size is 32 KB. There is still a lot to squeeze out, like fonts and some unnecessary dependencies, to make it leaner. For now, though, we'd call it Mission Accomplished!

Caveats and drawbacks of Svelte(Kit)

As of now, we are quite satisfied with the overall functionality and design of Svelte and SvelteKit as a framework. Yet, like all good things, Svelte has some caveats. Looking at the pace of development and the people involved with Svelte, both inside and outside Vercel, we strongly feel these will be ironed out in the future.

1. Transitive dependencies do not work across function calls

If you call two functions in separate reactive blocks whose values are transitively dependent (function 1 depends on a value, and function 2 depends on a value changed by function 1), the compiler will not see the transitive dependency, and things will not update as expected. The workaround is to either call the functions in the same reactive block, or to separate out the reactive values and reference them directly in the reactive statements, rather than hiding them inside the functions.

For more info, check this GitHub issue.
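A small illustration of the pitfall (the variable and function names are made up): the compiler only tracks values it can see directly inside a reactive statement, not values read inside the functions it calls.

```svelte
<script lang="ts">
  let count = 0;
  let doubled = 0;
  let quadrupled = 0;

  const double = (n: number) => n * 2;
  const quadrupleHidden = () => doubled * 2; // reads `doubled` internally

  // Re-runs whenever `count` changes, as expected
  $: doubled = double(count);

  // Does NOT re-run when `doubled` changes: the compiler cannot see the
  // dependency because it is hidden inside quadrupleHidden()
  $: quadrupled = quadrupleHidden();

  // Workaround: make the dependency explicit by passing the value in
  $: quadrupledFixed = double(doubled);
</script>

<button on:click={() => (count += 1)}>count: {count}</button>
<p>{doubled} {quadrupled} {quadrupledFixed}</p>
```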

2. The Svelte(Kit) ecosystem is relatively small

Since Svelte is relatively recent, it does not yet command wide-scale production usage. It is important to note, though, that it has come out as the most loved language/framework in various developer surveys.

3. Developer expertise

For us at OSlash, the current implementation is our first ever usage of Svelte in production. So we still have to develop more expertise in it. 

Nevertheless, we see this as an opportunity rather than a drawback. Being part of a community in its early stages has its own merits: less clutter and noise, more visibility for libraries/projects we might open-source, and, to an extent, a better understanding of how the framework has evolved over time.

What’s next for Svelte(Kit) at OSlash?

We have barely scratched the surface of Svelte at OSlash. 

We intend to develop a much deeper understanding of it and contribute to its slowly-but-surely thriving ecosystem. We have been keeping a keen eye on Svelte 5, which comes with some radical changes. 

As always, we don't intend to promote anything just because it's trendy and new. The aim of this blog post is simply to showcase how we benefited from this tech stack and to start a discussion around identifying and picking the tools that meet our use cases.
