Tell me if you've done this before. You write up a nice little prototype of an idea in Livebook. You then get distracted by life situations like eating, writing an email, or taking a nap. You feel the need to close Livebook or prune the multiple sessions you've had running for weeks now. Because you have a million tabs open (with a session manager) and too many in Livebook to check individually, you restart your computer and let it crash(TM). When you open up Livebook again, "Oh. Shiiiiit," you exclaim. Where the hell did that notebook go? You're 100% sure you clicked the disk icon, so what the hell? If you're like me, you may have then recreated the notebook from memory as a fork, possibly taking a better approach.
There is a better way to handle this scenario: Livebook has had autosaves since 0.4. According to the changelog, the feature was added in this PR:
https://github.com/livebook-dev/livebook/pull/736
To find your autosave files:

For the Desktop application and the CLI in production: ~/Library/Application Support/livebook/autosaved/, for example:
/Users/jbrayton/Library/Application Support/livebook/autosaved/

For the dev environment: config/dev.exs sets config :livebook, :data_path, Path.expand("tmp/livebook_data/dev"), which puts autosaves in:
/Users/Shared/repositories/personal/elixir/livebook/tmp/livebook_data/dev/autosaved/

For the test environment: config/test.exs sets Path.expand("tmp/livebook_data/test"), which puts autosaves in:
/Users/Shared/repositories/personal/elixir/livebook/tmp/livebook_data/test/autosaved/

Notebooks are saved by day in the autosave directory, and the date corresponds to when they were created (the moment you click the New notebook button).
To view or change your autosave directory in the CLI, open Settings, found under the Home and Learn links. For the Desktop application, the port will be randomized, but you can either change the URL to tack on /settings after the port or click through to the settings page as described earlier.
If you are curious as to how this setting gets configured, we can start by looking at Livebook.Settings.default_autosave_path() in https://github.com/livebook-dev/livebook/blob/main/lib/livebook/settings.ex#L32-L34. We follow Livebook.Config.data_path() to https://github.com/livebook-dev/livebook/blob/main/lib/livebook/config.ex#L76-L78 and then to the Erlang function :filename.basedir(:user_data, "livebook"). Running this in Livebook, we get the output "/Users/jbrayton/Library/Application Support/livebook", precisely where the desktop app stores its files.
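If you want to poke at this yourself, the Erlang call at the end of that chain is plain OTP, so it runs in any Elixir cell or IEx session; the Livebook functions only resolve where Livebook's own modules are loaded, so they're left as comments in this sketch:

# The OTP primitive Livebook ultimately calls; works in any Elixir cell or IEx session.
:filename.basedir(:user_data, "livebook")
#=> "/Users/jbrayton/Library/Application Support/livebook"

# With Livebook's modules loaded, the same value flows through:
#   Livebook.Config.data_path()
#   Livebook.Settings.default_autosave_path()  # the autosaved/ directory under data_path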
What led me to this discovery, after vaguely remembering autosave was a thing, was looking for files on my computer. I purposefully install and use the locate command because I find it far easier to use than remembering the find -name syntax. Here's the output from checking that the word autosaved appears in any directory or file name:
⋊> ~ locate autosaved/
/Users/Shared/repositories/personal/elixir/livebook/tmp/livebook_data/dev/autosaved/2022_10_31/18_25_03_mapset_drills_hedh.livemd
/Users/Shared/repositories/personal/elixir/livebook/tmp/livebook_data/dev/autosaved/2022_11_03/18_12_21_teller_bank_challenge_pv4e.livemd
/Users/Shared/repositories/personal/elixir/livebook/tmp/livebook_data/dev/autosaved/2022_11_03/18_13_39_untitled_notebook_pidb.livemd
/Users/Shared/repositories/personal/elixir/livebook/tmp/livebook_data/dev/autosaved/2022_11_03/19_31_57_dockyard_academy_amas_p75r.livemd
/Users/Shared/repositories/personal/elixir/livebook/tmp/livebook_data/dev/autosaved/2022_11_03/20_02_17_intro_to_timescale_jm7r.livemd
/Users/Shared/repositories/personal/elixir/livebook/tmp/livebook_data/dev/autosaved/2022_11_08/11_10_21_untitled_notebook_ervg.livemd
/Users/Shared/repositories/personal/elixir/livebook/tmp/livebook_data/dev/autosaved/2022_11_22/19_15_12_untitled_notebook_p75e.livemd
What I found interesting was that my files in ~/Library/Application Support/livebook/autosaved/ did not show up. Had I not realized there could be different locations, I may have overlooked the notebook I was looking for all along. I have no clue why locate doesn't scour the directories in ~/Library it should have access to, but that's a problem for another day.
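If locate ever lets you down, a hedged alternative is to glob for notebooks from Elixir itself; this sketch assumes the desktop app's default macOS data directory shown above:

# Lists every autosaved notebook under the desktop app's default data dir.
System.user_home!()
|> Path.join("Library/Application Support/livebook/autosaved/**/*.livemd")
|> Path.wildcard()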
In December of 2021, Brian Cardarella introduced DockYard Beacon CMS in this series of tweets:
Over the course of the past year, I've created a sample project a total of three times to get a better understanding of how it operates. I haven't seen a ton of content on Beacon beyond announcement tweets, the mention in the ElixirConf 2022 keynote, and https://beaconcms.org/. This post covers the complete instructions in the readme with some notes on where to go from here. I ran into a few snags at first, but a lot of those initial pain points have since been hammered out. While a basic "Hello World" sample project is great, I plan on expanding the sample with deeper dives into how Beacon serves up content. It takes a few novel approaches I haven't seen before: the CMS can run alongside your application, or it can be centralized with multi-tenancy, so one CMS can service all of your ancillary marketing sites, blogs, or wherever you need the content.
The following instructions are also listed on the sample application readme so you're welcome to skip them if you want to look at the code.
Create a top-level directory to keep our application pair; this arrangement is temporary while the project matures.
mkdir beacon_sample
Clone the BeaconCMS/beacon repository (Beacon CMS on GitHub) to ./beacon:
git clone git@github.com:BeaconCMS/beacon.git
Start with our first step from the Readme
mix phx.new --umbrella --install beacon_sample
Go to the umbrella project directory
cd beacon_sample/
Initialize git
git init
Commit the freshly initialized project. "Initial commit of Phoenix v1.6.15" fits, as that's the version as of the time of this writing.
Add :beacon as a dependency to both apps in your umbrella project:
# Local:
{:beacon, path: "../../../beacon"},
# Or from GitHub:
{:beacon, github: "beaconCMS/beacon"},
Add it in both apps/beacon_sample/mix.exs and apps/beacon_sample_web/mix.exs under the defp deps do section.
Run mix deps.get to install the dependencies.
Commit the changes.
"Add :beacon as a dependency to both apps in your umbrella project" seems like a good enough commit message.
Configure the Beacon repo: add Beacon.Repo under the ecto_repos: section in config/config.exs, as sketched below.
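A minimal sketch of that config.exs entry, assuming the standard Ecto ecto_repos key the readme relies on:

# config/config.exs
config :beacon,
  ecto_repos: [Beacon.Repo]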
Configure the database in dev.exs. We'll do production later.
# Configure beacon database
config :beacon, Beacon.Repo,
  username: "postgres",
  password: "postgres",
  database: "beacon_sample_beacon",
  hostname: "localhost",
  show_sensitive_data_on_connection_error: true,
  pool_size: 10
Commit the changes, with "Configure Beacon Repo" as the subject and "Configure the beacon repository in our dev only environment for now." as the body.
Create a BeaconDataSource module that implements Beacon.DataSource.Behaviour at apps/beacon_sample/lib/beacon_sample/datasource.ex:
defmodule BeaconSample.BeaconDataSource do
  @behaviour Beacon.DataSource.Behaviour

  def live_data("my_site", ["home"], _params), do: %{vals: ["first", "second", "third"]}
  def live_data("my_site", ["blog", blog_slug], _params), do: %{blog_slug_uppercase: String.upcase(blog_slug)}
  def live_data(_, _, _), do: %{}
end
Add that DataSource to your config/config.exs:

config :beacon,
  data_source: BeaconSample.BeaconDataSource
Commit the changes with "Configure BeaconDataSource" as the message.
Make changes to the router (apps/beacon_sample_web/lib/beacon_sample_web/router.ex) to cover Beacon pages.
Add a :beacon pipeline. I typically do this towards the pipeline sections at the top, starting at line 17:
pipeline :beacon do
  plug BeaconWeb.Plug
end
Add a BeaconWeb scope:
scope "/", BeaconWeb do
pipe_through :browser
pipe_through :beacon
live_session :beacon, session: %{"beacon_site" => "my_site"} do
live "/beacon/*path", PageLive, :path
end
end
Comment out the existing scope:

# scope "/", BeaconSampleWeb do
#   pipe_through :browser
#   get "/", PageController, :index
# end
Commit the changes with "Add routing changes" as the message.
Add some components to your apps/beacon_sample/priv/repo/seeds.exs:
alias Beacon.Components
alias Beacon.Pages
alias Beacon.Layouts
alias Beacon.Stylesheets

Stylesheets.create_stylesheet!(%{
  site: "my_site",
  name: "sample_stylesheet",
  content: "body {cursor: zoom-in;}"
})

Components.create_component!(%{
  site: "my_site",
  name: "sample_component",
  body: """
  <li>
    <%= @val %>
  </li>
  """
})

%{id: layout_id} =
  Layouts.create_layout!(%{
    site: "my_site",
    title: "Sample Home Page",
    meta_tags: %{"foo" => "bar"},
    stylesheet_urls: [],
    body: """
    <header>
      Header
    </header>
    <%= @inner_content %>
    <footer>
      Page Footer
    </footer>
    """
  })

%{id: page_id} =
  Pages.create_page!(%{
    path: "home",
    site: "my_site",
    layout_id: layout_id,
    template: """
    <main>
      <h2>Some Values:</h2>
      <ul>
        <%= for val <- @beacon_live_data[:vals] do %>
          <%= my_component("sample_component", val: val) %>
        <% end %>
      </ul>
      <.form let={f} for={:greeting} phx-submit="hello">
        Name: <%= text_input f, :name %> <%= submit "Hello" %>
      </.form>
      <%= if assigns[:message], do: assigns.message %>
    </main>
    """
  })

Pages.create_page!(%{
  path: "blog/:blog_slug",
  site: "my_site",
  layout_id: layout_id,
  template: """
  <main>
    <h2>A blog</h2>
    <ul>
      <li>Path Params Blog Slug: <%= @beacon_path_params.blog_slug %></li>
      <li>Live Data blog_slug_uppercase: <%= @beacon_live_data.blog_slug_uppercase %></li>
    </ul>
  </main>
  """
})

Pages.create_page_event!(%{
  page_id: page_id,
  event_name: "hello",
  code: """
  {:noreply, Phoenix.LiveView.assign(socket, :message, "Hello \#{event_params["greeting"]["name"]}!")}
  """
})
Run ecto.reset to create and seed our database(s):
cd apps/beacon_sample
mix ecto.setup (as our repos haven't been created yet)
mix ecto.reset thereafter
Verify the SafeCode package works as expected. This is typically where we run into issues with safe_code on the inner content of the layout seed, specifically:
** (RuntimeError) invalid_node:
assigns . :inner_content
With <%= @inner_content %> removed, seeding seems to complete. Running mix phx.server then throws another error:
** (RuntimeError) invalid_node:
assigns . :val
safe_code is problematic and needs to be surgically removed from Beacon for now. In Beacon's repository, remove the SafeCode.Validator.validate_heex! function calls from the loaders:
lib/beacon/loader/layout_module_loader.ex
lib/beacon/loader/page_module_loader.ex
lib/beacon/loader/component_module_loader.ex
Fix the seeder to work without SafeCode: in apps/beacon_sample/priv/repo/seeds.exs under Pages.create_page!, change <%= for val <- live_data[:vals] do %> to <%= for val <- live_data.vals do %>.
Commit the seeder changes with "Add component seeds" as the message.
Enable Page Management and the Page Management API in the router (apps/beacon_sample_web/lib/beacon_sample_web/router.ex):
require BeaconWeb.PageManagement
require BeaconWeb.PageManagementApi

scope "/page_management", BeaconWeb.PageManagement do
  pipe_through :browser

  BeaconWeb.PageManagement.routes()
end

scope "/page_management_api", BeaconWeb.PageManagementApi do
  pipe_through :api

  BeaconWeb.PageManagementApi.routes()
end
Commit the Page Management router changes with "Add Page Management routes" as the message.
Navigate to http://localhost:4000/beacon/home to view the main CMS page. You should see Header, Some Values, and Page Footer, with a zoom-in cursor over the page.
Navigate to http://localhost:4000/beacon/blog/beacon_is_awesome to view the blog post. You should see Header, A blog, and Page Footer, with a zoom-in cursor over the page.
Navigate to http://localhost:4000/page_management/pages to view the Page Management section. You should see Listing Pages, Reload Modules, a list of pages, and New Page.
We should put the page management through its paces to determine weak points.
Where to go from here:
Add another, more robust layout. Does a layout only wrap <main>, or can it own the whole <body> section? What about stylesheet_urls?
Add another, more robust component (0.17.7).
Build a replica of the Laravel Nova panel of pages. Welcome and Home are Laravel defaults; Users would be useful as we could integrate with phx gen auth.
The dependency safe_code was a problem during my first two attempts.
I ran into issues by failing to add a BeaconWeb scope and adding it as BeaconSampleWeb instead; that surfaces as an UndefinedFunctionError: function BeaconSampleWeb.PageLive.__live__/0 is undefined (module BeaconSampleWeb.PageLive is not available).
The sample isn't as "pristine" as I'd like due to the bug fix, but it really shouldn't be a showstopper.
A few other observations: stylesheets are rendered into the <head> as inline <style> tags; pages render inside <body><div data-phx-main="true">; and starting the server (mix phx.server) immediately boots our Beacon components before it shows the url.

I've been using this Swaggervel package with almost all my recent Laravel projects. A few instances were lightly customized to work against different authentication schemes and I only briefly touched on using Laravel Passport.
I wanted to highlight a few areas while also offering up an example project as a lightly opinionated jumping off point. Just the highlights cover quite a bit of information but the example should have ample information in commit messages and in the finished product.
First we run laravel new <project_name>, then git init, and commit immediately to mark our base Laravel installation.
I've always preferred this immediate commit over making customizations first as it's far easier to track your customizations versus the base install.
Next, we run through the Laravel Passport docs with the following caveats:
php artisan vendor:publish --tag=passport-migrations doesn't copy the migrations as expected, so we do this manually.
php artisan migrate --step creates a migration batch for each migration file individually. This lets us roll back individual steps and is primarily personal preference.
app/Providers/AuthServiceProvider contains the following:

Passport::routes(function (RouteRegistrar $routeRegistrar) {
    $routeRegistrar->all();
});

Passport::tokensCan([
]);

Passport::enableImplicitGrant();

Passport::tokensExpireIn(Carbon::now()->addDays(15));
Passport::refreshTokensExpireIn(Carbon::now()->addDays(30));
Run artisan make:auth to utilize the app layout and create a home view that is protected by the login prompt.
Create a proper WelcomeController with a matching view that utilizes the same app layout. This lets us run artisan route:cache in the future, as route closures aren't supported.
Now that the basics are complete, we bring in Swaggervel via composer require appointer/swaggervel --dev.
We can ignore the line in the documentation that mentions adding Appointer\Swaggervel\SwaggervelServiceProvider::class, as that's only for Laravel versions earlier than 5.5 without package discovery.
It's necessary to run artisan vendor:publish to publish the content; since we're using this package as a dev dependency, the assets won't show up otherwise.
Now that Swaggervel is in place we can bring it all together.
To start, we create the file app/Http/Controllers/Api/v1/Controller.php as our generic API base controller. This controller houses our root-level @SWG\Info definition in a convenient location.
This also sets us up for future work where API controllers are versioned, though this is personal preference.
The secret sauce is the @SWG\SecurityScheme annotation:
/**
 * @SWG\SecurityScheme(
 *     securityDefinition="passport-swaggervel_auth",
 *     description="OAuth2 grant provided by Laravel Passport",
 *     type="oauth2",
 *     authorizationUrl="/oauth/authorize",
 *     tokenUrl="/oauth/token",
 *     flow="accessCode",
 *     scopes={
 *         *
 *     }
 * ),
 */
The securityDefinition property is arbitrary, but it needs to be included in every protected route definition. You can specify multiple security schemes to cover things like a generic API key or multiple OAuth flows, though I haven't tried working out the latter.
These are the supported flows, and it's important to note that Swaggervel is currently on the OpenAPI 2.0 specification, though this may change in the future.
The scopes specified include everything (*), but we could define any scopes explicitly.
It should be noted that we also need to set up the route definitions in our resource controller classes, but due to their verbosity they are too much to include in this post. A small snippet that is unique to working with this setup is the following:
 *     security={
 *         {
 *             "passport-swaggervel_auth": {"*"}
 *         }
 *     },
This tells a specific endpoint to use the securityDefinition created earlier, and it's important that these match.
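To make that concrete, here's a trimmed, hypothetical swagger-php 2.x annotation for one protected endpoint; the path, tag, summary, and response are invented for illustration, and only the security block comes from the setup above:

/**
 * @SWG\Get(
 *     path="/api/v1/users",
 *     tags={"Users"},
 *     summary="List users (hypothetical example)",
 *     security={
 *         {"passport-swaggervel_auth": {"*"}}
 *     },
 *     @SWG\Response(response=200, description="A list of users")
 * )
 */
public function index()
{
    return User::all();
}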
The example project has rudimentary UserController, User model, and UserRequest definitions that should be a decent starting point, though I can't vouch for them being very comprehensive.
First we need to create an OAuth client specifically for Swaggervel connections.
Go to the /home endpoint and, under OAuth Clients, click Create New Client. Under Name, specify Laravel Passport Swaggervel or just Swaggervel. Under Redirect URL, we're unable to specify /vendor/swaggervel/oauth2-redirect.html directly, so use a placeholder like https://passport-swaggervel.test/vendor/swaggervel/oauth2-redirect.html instead.
Using your SQL toolbox of choice, navigate to the oauth_clients table and look for the row with the name specified in the previous step, in our case Laravel Passport Swaggervel. Manually update the redirect column to /vendor/swaggervel/oauth2-redirect.html.
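If you'd rather make that manual edit as a query, a hedged SQL equivalent (the name must match whatever you entered when creating the client):

UPDATE oauth_clients
SET redirect = '/vendor/swaggervel/oauth2-redirect.html'
WHERE name = 'Laravel Passport Swaggervel';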
Now that our OAuth client in Passport should be set up correctly, we focus our attention on the config/swaggervel.php settings.
The client-id should be set to what Passport shows in the UI as the Client ID field; this is also the id of the row in the oauth_clients table. The client-secret should be set to what Passport shows in the UI as the Secret field. We also set both secure-protocol and init-o-auth to true; the latter pre-fills the UI with our secrets, otherwise we'd have to put them in manually.
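Pulling those together, a sketch of the relevant keys in config/swaggervel.php; only the options discussed here are shown, the env variable names are my own convention, and the defaults are placeholders:

// config/swaggervel.php (excerpt)
return [
    'client-id'       => env('SWAGGERVEL_CLIENT_ID', '1'),
    'client-secret'   => env('SWAGGERVEL_CLIENT_SECRET', 'your-client-secret'),
    'secure-protocol' => true,
    'init-o-auth'     => true,
];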
For the OAuth2 redirect to function properly, we need to modify the Swagger UI configuration in resources/views/vendor/swaggervel/index.blade.php. Under const ui = SwaggerUIBundle({, right below the url parameter should be:

oauth2RedirectUrl: '/vendor/swaggervel/oauth2-redirect.html',
This reinforcement is necessary as the Swagger UI doesn't 'catch' the tokens properly without this.
Other notable additions that make the UI slightly easier to work with:
tagsSorter: 'alpha',
operationsSorter: 'alpha',
docExpansion: 'list',
filter: true
First we go to the api/docs endpoint to display the Swagger UI. Click the Authorize button with the unlocked padlock icon. Verify the client_id and client_secret sections are filled in. Click Authorize, and the Laravel Passport screen labelled Authorization Request should display with the Authorize and Cancel buttons. Click Authorize again and you should be redirected back to Swagger, with the client_id and client_secret now showing as ****** and a Logout button in place of Authorize.
We should now be able to click on the GET /users route, click the Try it out button, click the blue Execute button, and be greeted with our expected response: a list of users.
We've hopefully highlighted the basic touch points of the process with the example code going into much further detail. The project is lightly opinionated to facilitate practices that have served me well so far. It is by no means a complete reference but it should be a good jumping off point when it's somewhat harder to see the big picture without a comprehensive example.
In case you need the link to the project again.
Not too long ago I became obsessed with Prometheus.
I'd heard about it for a while, knew it was powerful, and couldn't quite understand how everything fit together.
The documentation is extremely verbose for good reason but it took playing with it for a while for everything to click.
This post is a concise yet extensive overview that goes a long way in expressing the basic concepts to my developer brain.
In their simplest form, exporters expose an HTTP endpoint of /metrics with the output being statistics in Prometheus' format. The real power of Prometheus comes when you expose your own /metrics endpoint and have Prometheus consume the statistics you generate. This post is also a very good introduction, with the section Building your own exporter being extremely valuable in describing just some of the possibilities.
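For a concrete picture, here's roughly what a /metrics response looks like in Prometheus' text exposition format; the first metric comes from node_exporter, while the second is a made-up application counter:

# HELP node_cpu_seconds_total Seconds the CPUs spent in each mode.
# TYPE node_cpu_seconds_total counter
node_cpu_seconds_total{cpu="0",mode="idle"} 476172.33
# HELP app_jobs_processed_total Jobs processed since boot (hypothetical).
# TYPE app_jobs_processed_total counter
app_jobs_processed_total 42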
After getting my bearings I started with a prototype with a simple premise "Why look at the usage graphs in Digital Ocean for each server independently? Why not have it in one location?" How To Install Prometheus on Ubuntu 16.04 is a very good primer to get everything up and running quickly.
I've made a few modifications since working through the article:
Use prometheus:prometheus for ownership of core Prometheus processes like prometheus or alertmanager:
sudo useradd --no-create-home --shell /bin/false prometheus
Use prometheus-exporter:prometheus-exporter for ownership of exporters. Exporters should possibly be more isolated, but I feel it may be a case of YAGNI:
sudo useradd --no-create-home --shell /bin/false prometheus-exporter
Set scrape_interval to 1 minute: scrape_interval: 1m.
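For reference, a minimal prometheus.yml sketch with that interval; the job name and target are placeholders (node_exporter listens on port 9100 by default):

global:
  scrape_interval: 1m

scrape_configs:
  - job_name: 'node'
    static_configs:
      - targets: ['localhost:9100']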
At $dayJob we've moved to provisioning servers using Laravel Forge, which opens up the possibility of utilizing exporters for mysqld, mariadb, postgres, memcached, redis, beanstalkd, nginx, php-fpm, and sendmail. I've opted to use node_exporter plus the mysqld, nginx-vts-exporter, php-fpm, and redis exporters. To put the original premise into perspective, replicating the newer monitoring agent graphs in Digital Ocean only requires node_exporter.
A few of the exporters require very little setup, only setting a few configuration variables in their systemd service definitions. Other exporters, like nginx-vts-exporter, require building nginx from source.
I plan to introduce a series of posts that should aid in getting a very rudimentary implementation running. There is abundant usage of Kubernetes in the Prometheus ecosystem, to the point that it almost seems required, but fortunately Prometheus also just works(tm) on a traditional virtual machine without any real fuss.
As announced in this blog post on January 26, 2015, Code School became part of Pluralsight. The website codeschool.com continued to operate normally until earlier this year, when a banner announced the site would shut down and transition to Pluralsight on June 1st. The banner pointed to this url, which gives a great overview of the changes but was sparse on what would take place during the transition.
It wasn't until June 1st that I finally understood the full breadth of the transition and stumbled upon the integration FAQs. The important bit of information is this snippet:
Will I be able to access my Code School invoices or course history?
No. Your invoices and course history will not carry over or be accessible as of 6/1.
Code School customers were instructed to generate a PDF of their profile before the migration. Since I only found the integration FAQs after June 1st, I sadly wasn't able to do that in time.
What impacts me the most is the belief that pointing potential employers to a reputable website as a source of truth carries far more weight than a PDF that can be altered. As a web developer in an industry where employers seem to assume a resume is partially or wholly embellished, this feels like a step backwards.
In spite of the transition pains, I do find Pluralsight's Skill IQ to be a fresh way to measure competency, with multiple choice questions that cover broad aspects of a given topic. You're shown what was marked wrong so you can learn from your mistakes, and I believe the equivalent of the old Code School subscription allows unlimited retests.
The integration with Stack Overflow's developer story is compelling enough to use it and I did gain quite a sense of accomplishment when I scored in the very low expert level range.
As I finished typing this up, I noticed by searching for the keyword "Code School" that Pluralsight seems to have a fair number of the Code School courses. There are newer interactive courses like the one titled HTML 5 and CSS 3: Overview of Tag, Attribute and Selector Additions, but the introductory video includes the Front End Formations title it was called on Code School. It appears that some of the content is migrating over, but things aren't 1:1, so we may never get credit for courses we've essentially completed.
I plan on going through the course shortly, as I hope at least the challenges have been updated, but it would be a terrible experience to go through all of it only to realize I'd accomplished it recently.
I don't quite know how I feel about the transition a month in, now that I've noticed at least some of the content was moved over. It's hard to lose the accomplishments, but the outcome would've been no different if Code School had closed completely. It does give me pause to make sure the course accomplishments I share are worth the investment, and that's likely an important thing to remember whenever similar services catch my attention.