Behold, the most significant addition to my site in years: search. I'm pretty sure this was in the original plan for my site, but in the early years, a combination of not having a good implementation plan and not having enough posts to warrant search meant I never got around to it. Then, two years ago, I was introduced to Elasticsearch, and I was impressed by its ease of use, robustness, and capabilities. I had a good solution. Now that I have almost 150 entries on my site, it was time to make it happen, which is what I did over the last two days. And I am very happy with the results.
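For a sense of what the search looks like under the hood, here is a minimal sketch using the official Elasticsearch JavaScript client. The index name (`posts`), the searched fields, and the title boost are assumptions for illustration, not the site's actual mapping.

```javascript
// Build the query body separately so it can be inspected and reused.
// multi_match searches several fields at once; "title^2" is an assumed
// boost that ranks title matches above body matches.
function buildSearchBody(term) {
  return {
    query: {
      multi_match: {
        query: term,
        fields: ['title^2', 'body'],
      },
    },
  };
}

async function searchPosts(term) {
  // Lazy require so buildSearchBody works without the package installed;
  // the node URL assumes a local development Elasticsearch instance.
  const { Client } = require('@elastic/elasticsearch');
  const client = new Client({ node: 'http://localhost:9200' });
  const result = await client.search({ index: 'posts', body: buildSearchBody(term) });
  return result.body.hits.hits; // the v7 client wraps the response in `body`
}
```

Keeping the query builder as a pure function makes it easy to tweak ranking later without touching the client plumbing.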
This is the third version of my website, and each version has been revolutionary in its own way. The first, built on a LAMP stack with the Dojo Toolkit, made creating a website leaps and bounds easier than ever before. AJAX was new at the time, and for the first time, websites could refresh pieces of a page instead of reloading the whole page with every navigation. The second version was built with Ruby on Rails, which was a huge step up: a site could be scaffolded in minutes, developing on it was extremely simple, and everything was neatly organized.

In the six years since the second version was completed, there have been many new developments in technology. CSS libraries such as Bootstrap became popular. HTML5 gained broad support. Node.js arrived, and for the first time, the server and the frontend could share the same language. But none of these developments were enough to motivate me to redo my entire site. What caused the recent revision was frontend JS libraries. These libraries have been around for a while, but they had shortcomings in the areas of SEO and URL history tracking. In fact, the second version of my site was a proto-frontend-JS-library site (a single-page application): it accomplished what frontend JS libraries were designed to do, but predated them. However, at my job, I was able to overcome those two shortcomings using RiotJS. That made me reconsider undertaking a third revision of my site, incorporating the technologies invented since the second. In addition, I would try Vue.js, since many people highly recommended it. If Vue.js could do what I wanted, I would stick with it; otherwise, I would fall back to RiotJS, which I knew could. Thus began the third revision on March 3, according to the commit history.
Initially, my tech stack was Node.js, Express, Nunjucks, Vue.js, and Material Design Lite. In the first week, I hit obstacle after obstacle and setback after setback. I was trying to use Vue.js the way I had used RiotJS: render the content on the page, then have the library parse it. Unfortunately, Vue.js does not work that way. Through much searching, I discovered Nuxt, which builds on top of Vue.js and adds server-side rendering, exactly the feature I needed. Ultimately, I had to start over, dropping Nunjucks and plain Vue.js in favor of Nuxt. After that, it was not always smooth sailing, but I was able to do just about everything I needed with Nuxt. Sometimes it required extensive experimentation, but I'm happy with the end result.
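The server-side rendering that drew me to Nuxt can be sketched with a page component like the one below. This is the script portion of something like a `pages/posts/_slug.vue` file; the `/api/posts` endpoint and the `excerpt` helper are hypothetical, but the shape of `asyncData` is Nuxt's: it runs on the server for the first request, so crawlers receive fully rendered HTML, and on the client for subsequent navigations.

```javascript
// Trim a post body down to a short preview string (hypothetical helper).
function excerpt(body, length = 120) {
  return body.length <= length ? body : body.slice(0, length).trimEnd() + '…';
}

// In the real .vue file this object would be the `export default`.
const postPage = {
  async asyncData({ params, $axios }) {
    // $axios.$get (Nuxt axios module) returns the response body directly
    const post = await $axios.$get(`/api/posts/${params.slug}`);
    return { post, preview: excerpt(post.body) };
  },
};
```

Whatever `asyncData` returns is merged into the component's data before the template renders, which is what makes the first paint complete HTML rather than an empty shell waiting on client-side fetches.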
Google provides a useful tool called Lighthouse, which audits a site and scores its likelihood of success based on a number of clearly defined metrics. This helped me optimize the site. As I ticked off each requirement, my site got better. Now, it looks and feels very professional, and it accomplishes the initial goal of my website - to be an online portfolio highlighting my skills and work. I audited all three revisions, which can be compared against each other.
While the site is very good, there is still some room for improvement. Unfortunately, the low-hanging fruit has been picked; what remains is more challenging. One thing to keep in mind when implementing new features is that popular browsers need to support them. If a browser does not support a feature, then either a fallback needs to be implemented or the feature cannot be shipped; otherwise, a portion of users will have a negative experience. That is precisely what holds me back from converting my images to next-gen formats, the change that would yield the biggest performance benefit. Firefox does not currently support any next-gen formats, so I will wait until it does. Beyond that, HTTP/2 is something I will also be looking into, though that is more a Node.js support issue.
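For what it's worth, the fallback approach for image formats is well established: the `<picture>` element lets the browser pick the first `<source>` format it supports and otherwise use the plain `<img>`. A small helper could emit that markup; the naming convention here (same basename with `.webp` and `.jpg` variants) is an assumption for illustration.

```javascript
// Render a <picture> element offering a WebP source with a JPEG
// fallback. Browsers without WebP support (e.g. Firefox, at the time
// of writing) skip the <source> and load the <img> instead.
function pictureTag(basename, alt) {
  return [
    '<picture>',
    `  <source srcset="${basename}.webp" type="image/webp">`,
    `  <img src="${basename}.jpg" alt="${alt}">`,
    '</picture>',
  ].join('\n');
}
```

The catch, of course, is that this still requires generating and serving both variants of every image, which is why waiting for broader browser support is also a defensible choice.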