Originally, I wrote up a post trying to give a 2020-2021 overview, but it got hosed along with a local git repo of this blog. I'm using the moment to remind myself that backups are important. It's also important to finish ideas for posts or journal entries quickly, even if they don't feel complete. Letting them linger for days without a git commit that reaches the server is a genuine problem, and at the very least I need to create and push to a new branch often.
One change that happened at the end of 2020: I started the journal section to try to capture bite-sized, rough ideas. I had already started a journal at work with notes in files like Phoenix Developer Diary.txt, and I looked for a way to merge my different diaries. The excellent Claire Codes keeps an extremely consistent diary at clairecodes and served as my main source of inspiration.
I went all-in on learning Elixir by participating in my first Advent of Code in 2020. I tapered off pretty quickly as I had serious problems working through loops and control flow. Seeing other examples on Elixir Forum helped immensely as I slowly got better at reading the code. Later in the year, I decided to take a TodoMVC sample through to a LiveView version with a little help from other resources on the internet. I also started a diary to capture the approaches I took each day I worked on the example. I plan to tackle my own version from scratch, but I'm also looking at other application ideas.
While Advent of Code and TodoMVC were good for getting my feet wet, I learned far more by pushing through Exercism exercises. If you're on Exercism and curious, my solutions can be found here. I highly recommend using Exercism to learn any language it covers, as the recently released version 3 makes for a great experience. Exercises feel a bit more "real world" and less like brain teasers that happen to use programming concepts. Even if I happened to look at the HINTS.md file, it never felt like cheating as it would only guide me toward a solution, not implement it.
After attending the excellent ElixirConf 2021 virtually, I've started working with Livebook in a few examples. I wanted to highlight the 3 notebooks that use the excellent spider_man package to crawl 3 websites: Elixir Jobs, Elixir Radar Jobs, and Elixir Companies. Parsing the DOM of each required slowly stretching far outside my comfort zone. It's also worth mentioning that in the Elixir Jobs example, I left a problem I found under the Sorting the Results section. Due to the zero-width space, that section throws the message ** (SyntaxError) nofile:5:1: unexpected token: "" (column 1, code point U+200B).
Coming to the end of 2021, I'm looking forward to immersing myself deeper in the Elixir ecosystem. Livebook is also a great way to get your feet wet with Elixir concepts, acting like a powerful language scratchpad. There have been other life changes since January 2020, but those deserve separate posts when I can get to them. Fortunately, the pandemic hasn't been harsh on my family or extended family at all, which I consider an extreme blessing. I can't say we weren't impacted by the last 2 years, but things could've been much worse.
In my last post, I mentioned the transition to Gridsome, and it has been relatively pain-free. I owe a lot of this to the existing community and the great list of starter resources. If a concept isn't explained clearly in the docs, chances are you can gain some insight from the various starters.
One particular concept I had a problem with right out of the gate was how to use markdown files from multiple directories. I started with the post type to handle /year/month/day/title routes, but I wanted an equivalent of the generic page type from Hexo. In researching the search terms I should have used months ago, I stumbled on multiple issues that point out how to do it; the route mapping itself shows up after the config snippets below.
In the file gridsome.config.js, I use the following snippet in the plugins section:
{
  use: '@gridsome/source-filesystem',
  options: {
    path: 'blog/articles/**/*.md',
    typeName: 'Article',
    refs: {
      authors: {
        typeName: 'Author',
        create: true
      },
    }
  }
},
{
  use: '@gridsome/source-filesystem',
  options: {
    path: 'blog/posts/**/*.md',
    typeName: 'Post',
    refs: {
      authors: {
        typeName: 'Author',
        create: true
      },
      categories: {
        typeName: 'Category',
        create: true
      },
      tags: {
        typeName: 'Tag',
        create: true
      },
    }
  }
},
Since Gridsome already has a concept of pages, I chose the word article to represent them instead. As an example, the portfolio page is an article type while this page is a post type. While hindsight makes this seem intuitive now, I somehow had the impression that you were only allowed a single instance of the plugin for safety reasons.
To point out something else, the portfolio page highlights a technique I didn't think was possible at the time. The parent portfolio page is an article type, but all the subsequent child pages are markdown files in a separate portfolio directory with a portfolio type. In the plugins section of gridsome.config.js:
{
  use: '@gridsome/source-filesystem',
  options: {
    path: 'blog/portfolio/**/*.md',
    typeName: 'Portfolio',
    refs: {
      authors: {
        typeName: 'Author',
        create: true
      },
    }
  }
},
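Gridsome 0.7 also lets you map each typeName to a route in the templates section of gridsome.config.js, which is how the /year/month/day/title pattern mentioned earlier gets wired to posts. A minimal sketch, assuming the default date field on posts and simple slugs for the other types rather than my exact configuration:

templates: {
  Post: '/:year/:month/:day/:title',
  Article: '/:title',
  Portfolio: '/portfolio/:title'
}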
Coming from Hexo, I opt for placing content in markdown files and having unique layouts defined in the various pages and templates files. As much as Gridsome is a generic website framework, I find it can be extremely flexible to whatever workflow you wish to create. There are some parts of Hexo I miss, like scaffolding new page types or the way it steered me toward blog conventions, but the transition to Gridsome has been rather smooth. While Gridsome may not be for everyone, I can definitely see how JAMstack has gained traction recently. Barring very few gotchas, working on this site is fun again, even in the I-can-see-every-blemish state it's currently in.
It's been over a year since my last post and, while unfinished drafts don't count, I thought my blog was due for a change. The move from Octopress to Hexo was relatively uneventful, but I found keeping up a little difficult. It wasn't entirely Hexo's fault; I had tweaked things to the point that merging in upstream changes became cumbersome and slow. In a previous post, I roughly mentioned the transition, and a lot has happened to the web in over 3 years.
Static site generators like Hugo and Gatsby have picked up steam, and the feature set of Gatsby, particularly the GraphQL layer, stood out. I wanted to stick with Vue for as many of my personal projects as possible, so I searched for any static site generator using Vue I could find. Fortunately, Gridsome has come along as a nice clone of Gatsby using Vue rather than React, and even though it's at v0.7.12 at the time of this post, I've run into very few hurdles.
I don't have the best understanding of JAMstack after working with a sample size of one, but learning GraphQL by only dealing with queries has made this one of the best ways to get my feet wet. I'm by no means an expert, but this light interaction compels me to use it more often, as it's mostly been a pleasure to work with. Frameworks like Gridsome, and I suspect Gatsby, let you focus almost entirely on the frontend. Even though the A in JAMstack stands for APIs, as a backend developer I haven't had to write a single REST or GraphQL endpoint, or anything I'd typically associate with an API, like I would with Laravel, Phoenix Framework, or Express.
One thing I miss about Hexo is its scaffolding to generate new files. Gridsome is a framework for generic sites, not just blogs, so scaffolding doesn't seem to be included. Coming from Hexo, I wanted to keep as much of the existing markdown as possible, and I think some of the approaches I've taken may be useful to others. A small example I had a problem understanding is that you can use the @gridsome/source-filesystem plugin multiple times, one for each directory or type. It makes sense in hindsight, but none of the starters used the technique nor did the docs seem to suggest it was possible. I'm tempted to create a starter based on my usage patterns, but worst case, I plan on writing a post outlining some of these approaches in the near future.
One last thing is a small humblebrag. While the theme for this site draws a few cues from the older version, I wanted to flex my design abilities by focusing on techniques I've learned reading Refactoring UI. By the time this post is published it likely won't be perfect, but I think it's a decent first pass that should only get better over time.
I've been using this Swaggervel package with almost all my recent Laravel projects. A few instances were lightly customized to work against different authentication schemes and I only briefly touched on using Laravel Passport.
I wanted to highlight a few areas while also offering up an example project as a lightly opinionated jumping off point. Just the highlights cover quite a bit of information but the example should have ample information in commit messages and in the finished product.
First we run laravel new <project_name>, git init, and commit immediately to mark our base Laravel installation. I've always preferred this immediate commit over making customizations first, as it's far easier to track your customizations against the base install.
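Roughly, those first steps look like this; the project name stays a placeholder and the commit message is just my habit:

laravel new <project_name>
cd <project_name>
git init
git add .
git commit -m "Base Laravel installation"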
Next, we run through the Laravel Passport docs with the following caveats:
php artisan vendor:publish --tag=passport-migrations doesn't copy the migrations as expected. We manually do this.
php artisan migrate --step creates a migration batch for each migration file individually. This lets us roll back individual steps and is primarily personal preference.
app/Providers/AuthServiceProvider contains the following:

Passport::routes(function (RouteRegistrar $routeRegistrar) {
    $routeRegistrar->all();
});

Passport::tokensCan([
]);

Passport::enableImplicitGrant();

Passport::tokensExpireIn(Carbon::now()->addDays(15));
Passport::refreshTokensExpireIn(Carbon::now()->addDays(30));
Run artisan make:auth to utilize the app layout and create a home view that is protected by the login prompt.
Create a proper WelcomeController with a matching view that utilizes the same app layout. This also lets us run artisan route:cache in the future, as route closures aren't supported.

Now that the basics are complete, we bring in Swaggervel via composer require appointer/swaggervel --dev.
We can ignore the line in the documentation that mentions adding Appointer\Swaggervel\SwaggervelServiceProvider::class, as that's only for Laravel versions earlier than 5.5 without package discovery.
It's necessary to run artisan vendor:publish to publish the content, as we're using this package as a dev dependency and the assets won't show up otherwise.
Now that Swaggervel is in place we can bring it all together.
To start, we create the file app/Http/Controllers/Api/v1/Controller.php as our generic API base controller. This controller houses our root-level @SWG\Info definition in a convenient location. This also sets us up for future work where API controllers are versioned, though this is personal preference.
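As a rough sketch, the base controller can be as small as the following; the title, version, and description values here are placeholders rather than anything from the example project:

<?php

namespace App\Http\Controllers\Api\v1;

use App\Http\Controllers\Controller as BaseController;

/**
 * @SWG\Info(
 *     title="Passport Swaggervel API",
 *     version="1.0.0",
 *     description="Root-level definition shared by the versioned API controllers"
 * )
 */
class Controller extends BaseController
{
    //
}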
The secret sauce is the @SWG\SecurityScheme annotation:
/**
 * @SWG\SecurityScheme(
 *     securityDefinition="passport-swaggervel_auth",
 *     description="OAuth2 grant provided by Laravel Passport",
 *     type="oauth2",
 *     authorizationUrl="/oauth/authorize",
 *     tokenUrl="/oauth/token",
 *     flow="accessCode",
 *     scopes={
 *         *
 *     }
 * ),
 */
The securityDefinition property is arbitrary but needs to be included in every protected route definition. You can specify multiple security schemes to cover things like a generic API key or likely multiple OAuth flows, though I haven't tried working out the latter.
These are the supported flows, and it's important to note that Swaggervel is currently on the OpenAPI 2.0 specification, though this may change in the future. The scopes specified include everything (*), but we could define any scopes explicitly.
It should be noted that we also need to set up the route definitions in our resource Controller classes, but due to their verbosity they are too much to include in this post. A small snippet that is unique to working with this setup is the following:
 * security={
 *     {
 *         "passport-swaggervel_auth": {"*"}
 *     }
 * },
This tells a specific endpoint to use the securityDefinition created earlier, and it's important that these match.
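For context, here's a trimmed-down sketch of how that security block sits inside a full endpoint definition; the path, summary, tag, and response description are illustrative assumptions, not copied from the example project:

/**
 * @SWG\Get(
 *     path="/users",
 *     summary="List users",
 *     tags={"Users"},
 *     security={
 *         {
 *             "passport-swaggervel_auth": {"*"}
 *         }
 *     },
 *     @SWG\Response(response=200, description="A list of users")
 * )
 */
public function index()
{
    // ...
}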
The example project has rudimentary UserController, User model, and UserRequest definitions that should be a decent starting point, though I can't vouch for them being very comprehensive.
First we need to create an OAuth client specifically for Swaggervel connections.
Go to the /home endpoint and under OAuth Clients click Create New Client. Under Name, specify Laravel Passport Swaggervel or just Swaggervel. Under Redirect URL, we're unable to specify /vendor/swaggervel/oauth2-redirect.html directly, so use a placeholder like https://passport-swaggervel.test/vendor/swaggervel/oauth2-redirect.html instead.
Using your SQL toolbox of choice, navigate to the oauth_clients table and look for the row with the name specified in the previous step, in our case Laravel Passport Swaggervel. Manually update the redirect column to /vendor/swaggervel/oauth2-redirect.html.
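If you'd rather run a quick query than click through a GUI, something along these lines should do it, assuming the client name from the previous step:

UPDATE oauth_clients
SET redirect = '/vendor/swaggervel/oauth2-redirect.html'
WHERE name = 'Laravel Passport Swaggervel';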
Now that our OAuth client in Passport should be set up correctly, we focus our attention on the config/swaggervel.php settings. The client-id should be set to what Passport shows in the UI as the Client ID field. This is also the id of the row in the oauth_clients table. The client-secret should be set to what Passport shows in the UI as the Secret field. We also set both secure-protocol and init-o-auth to true; the latter fills in the UI with our secrets, otherwise we'd have to enter them manually.
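Put together, the relevant keys in config/swaggervel.php end up looking roughly like this; pulling the values from environment variables is my own preference, and the variable names are made up for illustration:

// config/swaggervel.php (only the keys discussed above)
'client-id' => env('SWAGGERVEL_CLIENT_ID'),
'client-secret' => env('SWAGGERVEL_CLIENT_SECRET'),
'secure-protocol' => true,
'init-o-auth' => true,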
For the OAuth2 redirect to function properly, we need to modify the Swagger UI configuration in resources/views/vendor/swaggervel/index.blade.php. Under const ui = SwaggerUIBundle({, right below the url parameter, add oauth2RedirectUrl: '/vendor/swaggervel/oauth2-redirect.html',. This reinforcement is necessary as the Swagger UI doesn't 'catch' the tokens properly without it.
Other notable additions that make the UI slightly easier to work with:
tagsSorter: 'alpha',
operationsSorter: 'alpha',
docExpansion: 'list',
filter: true
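Taken together, the top of the SwaggerUIBundle call ends up looking roughly like this sketch; the url and everything else the published view already defines stays untouched:

const ui = SwaggerUIBundle({
    // url and the other options from the published view stay as-is
    oauth2RedirectUrl: '/vendor/swaggervel/oauth2-redirect.html',
    tagsSorter: 'alpha',
    operationsSorter: 'alpha',
    docExpansion: 'list',
    filter: true,
    // ...
});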
First we go to the api/docs endpoint to display the Swagger UI.
Click the Authorize button with the unlocked padlock icon.
Verify the client_id and client_secret sections are filled in.
Click Authorize and the Laravel Passport screen labelled Authorization Request should display with the Authorize and Cancel buttons.
Click Authorize again and you should be redirected back to Swagger with the client_id and client_secret now showing as ****** with a Logout button instead of Authorize.
We should now be able to click on the GET /users route, click the Try it out button, click on the blue Execute button and be greeted with our expected response as a list of users.
We've hopefully highlighted the basic touch points of the process, with the example code going into much further detail. The project is lightly opinionated to facilitate practices that have served me well so far. It is by no means a complete reference, but it should be a good jumping-off point, since it's somewhat harder to see the big picture without a comprehensive example.
In case you need the link to the project again.
Not too long ago I became obsessed with Prometheus.
I'd heard about it for a while, knew it was powerful, and couldn't quite understand how everything fit together.
The documentation is extremely verbose for good reason but it took playing with it for a while for everything to click.
This post is a concise yet thorough overview that goes a long way toward expressing the basic concepts to my developer brain.
In their simplest form, exporters expose an HTTP endpoint at /metrics, with the output being statistics in Prometheus' format.
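To make that format concrete, here's roughly what a scrape of node_exporter returns; the metric names come from a recent node_exporter, but the values and label sets are made up:

# HELP node_cpu_seconds_total Seconds the CPUs spent in each mode.
# TYPE node_cpu_seconds_total counter
node_cpu_seconds_total{cpu="0",mode="idle"} 1.2345678e+06
# HELP node_memory_MemAvailable_bytes Memory information field MemAvailable_bytes.
# TYPE node_memory_MemAvailable_bytes gauge
node_memory_MemAvailable_bytes 2.147483648e+09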
The real power of Prometheus comes when you expose your own /metrics endpoint and have Prometheus consume the statistics you generate.
This post is also a very good introduction, with the section Building your own exporter being extremely valuable in describing just some of the possibilities.
After getting my bearings I started with a prototype with a simple premise: "Why look at the usage graphs in Digital Ocean for each server independently? Why not have it in one location?" How To Install Prometheus on Ubuntu 16.04 is a very good primer to get everything up and running quickly.
I've made a few modifications since working through the article:
Use prometheus:prometheus for ownership of core Prometheus processes like prometheus or alertmanager.

sudo useradd --no-create-home --shell /bin/false prometheus

Use prometheus-exporter:prometheus-exporter for ownership of exporters. Exporters should possibly be more isolated, but I feel it may be a case of YAGNI.

sudo useradd --no-create-home --shell /bin/false prometheus-exporter
Set scrape_interval to 1 minute: scrape_interval: 1m.
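In prometheus.yml that looks roughly like the following; the job name and target are placeholders for whichever hosts you end up scraping:

global:
  scrape_interval: 1m

scrape_configs:
  - job_name: 'node'
    static_configs:
      - targets: ['localhost:9100']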
At $dayJob we've moved to provisioning servers using Laravel Forge, which opens up the possibility of using exporters for mysqld, mariadb, postgres, memcached, redis, beanstalkd, nginx, php-fpm, and sendmail. I've opted to use node_exporter, mysqld, nginx-vts-exporter, php-fpm, and redis.
To put the original premise into perspective, replicating the newer monitoring agent graphs in Digital Ocean only requires node_exporter.
A few of the exporters require very little setup, only setting a few configuration variables in their systemd service definitions. Other exporters like nginx-vts-exporter require building nginx from source.
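As an example of how little there is to most of them, a node_exporter unit reusing the prometheus-exporter user from above might look like this sketch; the binary path is an assumption for illustration:

[Unit]
Description=Prometheus Node Exporter
Wants=network-online.target
After=network-online.target

[Service]
User=prometheus-exporter
Group=prometheus-exporter
Type=simple
ExecStart=/usr/local/bin/node_exporter
Restart=on-failure

[Install]
WantedBy=multi-user.target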
I plan to introduce a series of posts that should aid in getting a very rudimentary implementation running. There is abundant usage of Kubernetes in the Prometheus ecosystem, to the point that it almost seems required, but fortunately it also just works(tm) on a traditional virtual machine without any real fuss.