Static site generators: brief history, as I recall it

Hype keeps building up around static site generators. But this technique has been around for much longer than most people think. Our industry's collective memory 🧠 is very short, and it sometimes needs a little help.

In the beginning the whole web was static. Then there was CGI. Webmasters could now stitch together some dynamic rendering, as well as respond to forms posted by users.

In less than three years, CGI evolved into a panoply of glue languages, either developed specifically for this purpose (ColdFusion, PHP, ASP, JSP) or quickly adapted to it (Perl, Python, Ruby).

The golden age of MVC

With the rise of these scripting languages, application servers, and free relational databases (free as in free beer!), dynamic server-side rendering went from being the next big thing to completely taking over the landscape, all in a couple of years.

At some point, MVC was everywhere, and OO was the craft. And with them, a hundred frameworks written in ColdFusion, PHP, Ruby, Python, Java, and .Net. Soon enough, the scene matured into CMSs of all sizes and flavors. Here's an article from 2007 about the history of the dynamic web for a bit of perspective, from deep within the bubble.

Actually, the dark age of MVC

When I joined SAPO in 2008, I had totally jumped on the MVC framework bandwagon and didn't really know better. A single code base, a monolithic application for several interfaces: public website, RSS, Atom, back office. We had reached peak design patterns and maximum reusability, and couldn't even sense the smell.

Photo of a hand holding a hammer
The hammer & nail quote comes later. Photo by Leo Moko on Unsplash

Unfortunately, yet not surprisingly, these database-driven websites were not scaling that well. Don't get me wrong, they were handling complexity just fine: consuming and producing a variety of services, plugged into offline distributed processes, brokers, full-on SOA, the works.

At run time, though, writes and reads were competing for stretched-out DB servers, and keeping latency at acceptable levels while serving the apps with decent availability sent operational costs spiking. Blinded by the automagical lights, we were filing every challenge under "optimise this later": caching partials and responses, abusing cache farms, replicating the DB, micro-optimising code... Plenty of technical solutions evolving, but at what cost?

Rendering like rebels

Meanwhile, back at SAPO, the older crew thought differently.

These were a bunch of very talented engineers, including a handful of legend-level Perl and C developers, who had built a network of hundreds of large-scale, high-traffic, high-availability websites, among many other things, like dozens of mobile apps and even sending tech to space just for fun.

So how did they think differently from the predominant MVC doctrine? Quite simple. First: pre-render all the static content and serve pure HTML+CSS. Second: add a layer of client-side progressive enhancement, rendering the user-aware content, such as comments, ratings, and favorites, directly in the browser. Sounds familiar?

Whenever an editor published something on their CMS, entire websites or sections were re-generated. After all, these back offices had maybe two journalists and an intern doing most of the writes, while the actual websites had millions of users doing all the reads and, only occasionally, an interaction.
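
To make that concrete, here is a minimal sketch of the idea in Python (the paths, file layout, and template are hypothetical, not SAPO's actual Perl pipeline): on every publish, an entire section is re-rendered into plain HTML files that any web server can serve as-is, leaving a placeholder for the client-side layer to fill in later.

```python
# Minimal publish-time pre-rendering sketch (hypothetical layout, not the real pipeline).
import json
from pathlib import Path
from string import Template

CONTENT_DIR = Path("content/news")  # one JSON file per article (assumed layout)
OUTPUT_DIR = Path("public/news")    # document root served by a plain web server

PAGE = Template("""<!doctype html>
<html>
<head><title>$title</title></head>
<body>
  <article><h1>$title</h1>$body</article>
  <!-- user-aware bits (comments, ratings) are rendered later, client-side -->
  <div id="comments" data-article="$slug"></div>
</body>
</html>""")

def publish(section: Path = CONTENT_DIR, out: Path = OUTPUT_DIR) -> None:
    """Re-generate every page in a section; run whenever an editor publishes."""
    out.mkdir(parents=True, exist_ok=True)
    for source in section.glob("*.json"):
        article = json.loads(source.read_text())
        html = PAGE.substitute(
            title=article["title"], body=article["body"], slug=source.stem
        )
        (out / f"{source.stem}.html").write_text(html)

if __name__ == "__main__":
    publish()  # writes are rare and cheap; the millions of reads only hit static files
```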

What was going on here? How could a bunch of disconnected Perl scripts be better, faster, and cheaper than the elegant, modular, extensible approach? Turns out pre-rendering, or pre-burning ("queimar" in Portuguese), materializing the end-user state as soon as the source data was updated, wasn't even news in 2008. If you read the first section above, it was the actual origin story of the dynamic web. It had already been around for 10 years.

The plot twist we deserved

Meanwhile, the web-standards movement was striking win after win. With Firefox and Chrome, the browser was becoming a viable application platform, and client-side rendering, beyond the initial AJAX hacks, was becoming a believable idea.

Suddenly, the paradigm was shifting. Along came Node.js, the NoSQL movement, and a massive simplification of the application server approach. Ditch relational and transactional where you don't actually need them. Embrace REST, eventual consistency, schemaless data, streams, client-side rendering, and many other long-awaited techniques. It looked like we now had more options, and combining the best of both (all) worlds would be the sensible thing to do.

But it is not surprising that the new bandwagon and its rising tide of followers were riding fast and furious, as far away as possible from the starting point. The past is wrong and all prior art needs to be forgotten! Rewrite all the things! Let's over-engineer all things front-end because we can!

Fast forward to the mid-2010s: we now have ES6, JavaScript build tools, Webpack, the npm ecosystem, and a new framework war fueled by Google and Facebook. Suddenly every project must be built as a front-end-rendered single-page application. Every startup is now hiring the new breed, the full-stack developer, a.k.a. let's just use JavaScript everywhere for everything. So, what's next? JavaScript fatigue, JavaScript framework fatigue, and more JavaScript fatigue.

The new problem to solve? Overcoming JavaScript framework fatigue. How did this happen? Did we stop to consider the implications? Were we leveraging all the options, combining the best of each technology to solve the different problems with the right tools?

Stock photo of a hammer lightly touching a chicken egg
Photo from Pixabay

When everything you ~~have~~ WANT TO USE is a hammer, everything looks like a nail. Landing page? SPA! Personal website? SPA! Newspaper? SPA! E-commerce? SPA! Dating app for lonely pets? SPA! SPA! SPA! SPA is the new hammer and it makes the most lovely sound.

History repeats itself

Long story long, welcome to 2019. People are writing posts explaining SSG, perhaps unaware that so much of the original web was built exactly that way. Somewhere else, someone is making the case for static site generation every single day.

Gatsby, React Static, and Next.js are all the rage. But to be fair to prior art, in the early 2010s projects like Jekyll and, later, Hugo were already getting a lot of traction in some circles, and they paved the way for the many SSG options we have now. They are, unfortunately, not written in everyone's love/hate scripting language, but in Ruby and Go, respectively.

Jekyll is a simple, blog-aware, static site generator perfect for personal, project, or organisation sites. Think of it like a file-based CMS, without all the complexity.

Circling back to the whole point of this post: if you do a quick search for "static site generation" pre-2000, you can find gems like the paper Tools and Approaches for Developing Data-Intensive Web Applications: A Survey, all the way back in 1998.

An orthogonal architectural issue concerns the time of binding between the content of the information base and the application pages delivered to the client, which can be static when pages are computed at application definition time and are immutable during application usage; or dynamic, when pages are created just-in-time from fresh content.

Do we already know better than to let the bandwagon mentality shoot us again with the new silver bullet? Are we going to make everything static-site-generated like we did with SPAs? Probably not. But if we don't stop obsessing (and click-baiting) about whether or not SSG is the paradigm of the year, we will do a lot of damage to projects, businesses, and users.

We have to be a bit less excited about the present and contemplate prior art as well. Better decisions are made when all the options are considered. And the next next big thing will always be a smart mashup of a few old things.

Conclusion

There has never been a more exciting time to develop for the web. We have the code, the tools, and the infrastructure to design, develop, and deploy faster and better than ever. And we also have options: modern SPAs, dynamic SSR, isomorphic rendering, serverless architectures, and, more recently, SSG.

Photo of hundreds of DIY tools neatly organised
Photo by Leo Moko on Unsplash

And it's never been easier to mix and match all these ingredients in secure, reliable, and cost-effective ways. Technologically, the entire web landscape has changed 5 times over and we are being gifted with all this potential under open source licenses for our fun and profit.

We all (developers, teams, companies) just need to take a step back sometimes and appreciate the options. Learn a bit about the techniques we understand the least, and remember that the best solutions are almost always the result of combining different tools.

