There’s a whole slew of information about their API here. What we’re interested in is the ‘zone’ portion of their API which allows for full site purges, by URL, and by cache-tags or host. I usually stick with the full site purge just because it’s easier to toss everything out and allow it to naturally flow back into their cache without missing anything.
To do this you'll need the zone ID of the domain, the API key for your account, and the email address associated with the domain's account. You can find your zone ID on the overview page of the domain you're interested in; the API key is at the bottom of your account profile (via the top-right drop-down); and the email is the one you use to sign in to your account. Once you have all that you can use the following cURL command to purge your cache!
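A sketch of that command against Cloudflare's v4 API — swap in your own zone ID, account email, and API key (and note the HTTP method for this endpoint has varied between API revisions, so check the current docs):

```shell
# Full-site purge: throw everything out and let it flow back into the cache
curl -X POST "https://api.cloudflare.com/client/v4/zones/YOUR_ZONE_ID/purge_cache" \
  -H "X-Auth-Email: you@example.com" \
  -H "X-Auth-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  --data '{"purge_everything":true}'
```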
If it executed successfully you should see a response similar to {"result":{"id":"xxxxxxxxxxxxx"},"success":true,"errors":[],"messages":[]}.
And that’s it!
Our command line interface, Terminus, provides advanced interaction with Pantheon. Terminus enables you to do almost everything in a terminal that you can do in the Dashboard, and much more.
While Terminus makes deployments from your local machine simple, I wanted the ability to run it from our CI/CD servers on git push. To do that I created a Docker image our servers can use; the Dockerfile looks like the following:
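A sketch of what that Dockerfile can look like — the essentials from the text are Composer's /tmp home and putting /tmp/vendor/bin on the PATH:

```dockerfile
FROM composer

# The composer image sets COMPOSER_HOME to /tmp, so a global require
# installs Terminus under /tmp/vendor
RUN composer global require pantheon-systems/terminus

# Lets us call `terminus` without the full path to the bin file
ENV PATH="/tmp/vendor/bin:${PATH}"
```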
The composer image sets Composer's home to /tmp, so we're doing the same with Terminus, as well as adding /tmp/vendor/bin to the PATH. Now we can execute terminus by just referencing its name instead of writing out the full path to the bin file within the container.
You can now use Terminus in your pipeline builds, such as within your bitbucket-pipelines.yml; it would look something like:
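Something along these lines — the image name, branch, and variable names below are placeholders, not from the original post:

```yaml
image: yourorg/terminus   # the image built from the Dockerfile above

pipelines:
  branches:
    master:
      - step:
          script:
            - terminus auth:login --machine-token=${TERMINUS_TOKEN}
            - terminus env:deploy ${SITE_NAME}.${SITE_ENV}
```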
Above we’re using three environment variables that you would set with your specific info.
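For example — the names here are placeholders; use whatever matches your pipeline's repository variables:

```shell
TERMINUS_TOKEN   # a Pantheon machine token used to authenticate
SITE_NAME        # the Pantheon site you're deploying
SITE_ENV         # the target environment, e.g. dev or test
```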
You can also use this image to run Terminus on your local machine. This can be handy if you don't want to deal with installing all of Terminus's requirements. To do so, simply run the following (Docker is required):
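Something like the following (the image name is a placeholder):

```shell
docker run --rm -it \
  -v "${HOME}/.terminus:/tmp/.terminus" \
  yourorg/terminus \
  terminus site:list
```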
Here we're mounting ${HOME}/.terminus:/tmp/.terminus so that our auth token is cached between runs, letting us access our remote sites without logging in every time. You could also set this up as a bash function and call it via terminus with the following:
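A sketch of that function (again, swap the placeholder image name for your own tag):

```shell
terminus() {
  docker run --rm -it \
    --entrypoint=terminus \
    -v "${HOME}/.terminus:/tmp/.terminus" \
    yourorg/terminus "$@"
}
```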
The function above sets --entrypoint=terminus so you can use it more naturally; this essentially tells Docker to behave like an executable, passing any arguments through to terminus.
Now you can use it as if it were installed locally: terminus --version, terminus auth:login --machine-token=<your token>, terminus site:list.
The first thing we need to do is npm install a couple of third-party dependencies to use in our Hooks: we'll need Browserify and babelify. After adding these dependencies we can easily write a Hook that runs Browserify on our ES Modules, with the added bonus of being able to easily require other JS dependencies from npm.
I prefer using npm scripts and triggering those via Hooks; this lets npm handle the paths so we don't have to reference things like ./node_modules/.bin/browserify to use Browserify on the CLI. So, our first step is writing an npm script in our package.json file. It should look similar to the following:
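A sketch of that script — the script name bundle is our choice here; the entry and output paths match the files discussed in this post:

```json
{
  "scripts": {
    "bundle": "browserify js/main.js -t babelify -o js/bundle.js"
  }
}
```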
Here we're:
- calling browserify and passing js/main.js as the entry file.
- using -t to add the babelify transform, so Babel will transform our code into ES5 and our modules to CommonJS style.
- writing the bundled output to js/bundle.js.
You could also pipe Browserify's output to uglifyjs, but since we can trigger minification via CodeKit I see no reason to add that extra dependency when it's already provided. There may be cases where you want it, though, and it's completely possible to handle minification here as well.
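If you did want to handle minification here, a variant of the npm script might pipe through uglifyjs — names and flags below are one possible setup, not the post's exact script:

```json
{
  "scripts": {
    "bundle": "browserify js/main.js -t babelify | uglifyjs -c -m -o js/bundle.js"
  }
}
```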
These are just simple files to demonstrate small modules that can be imported and bundled up; we do, however, have rxjs, which was installed via npm. I've included it to show the ease of including third-party libs with this workflow.
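The files might look something like this — the local module and its function are made-up examples; the rxjs import is the pattern described below:

```javascript
// js/capitalize.js — a hypothetical local ES Module
export function capitalize(str) {
  return str.charAt(0).toUpperCase() + str.slice(1);
}

// js/main.js — the entry file
import { capitalize } from './capitalize';
import { Observable } from 'rxjs/Observable';
import 'rxjs/add/observable/of';

Observable.of(capitalize('hello browserify'))
  .subscribe(msg => console.log(msg));
```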
In main.js we can see that we're requiring our own ES Modules with a relative path, ./. This is important so that Browserify knows the file is relative to the file importing it and not a dependency within node_modules.
As for Observable, we're importing it with no relative path; this way Browserify will look it up in your node_modules folder to find the dependencies needed. You may also notice that we're using the new ES Module syntax to import Observable, even though it's built via the CommonJS pattern. This is possible because Babel's presets use the CommonJS style when transforming your code, which is nice since we can write the standard syntax while still requiring dependencies built in the CommonJS format.
Another added benefit is that import {Observable} from 'rxjs/Observable' is converted to var Observable = require('rxjs/Observable').Observable. Note the added .Observable needed with the CommonJS style; not having to write that is a nice perk of the ES Module syntax.
You'll want to set up each JS file to be processed, but ensure that you set Transpile With: to Nothing; this is because we're processing the file to itself. You can, however, set up the linter of your choice; I've been moving over to ESLint. If you're using ESLint, make sure you've set Source Type to Module and have checked the ES6 box.
In the following image you'll see that CodeKit lists the custom ES Module files that main.js imports, but you won't see the items required from node_modules, since that folder is skipped by default. That's a nice perk, since you really only want to mess with the files you're writing yourself and leave the third-party dependencies alone.
Earlier I wrote about either piping Browserify's output to uglifyjs yourself or letting CodeKit handle the minification. Here you can see it's as simple as checking a box. One thing to note, though: source maps will not work, since we're working outside of CodeKit's baked-in transpiling. If source maps are important to you, this is probably the moment to uglify the content yourself and set up Browserify to build out the source maps too.
Now, the step that ties everything together and rebuilds your bundle any time a dependency of your entry file changes: the Hook. As you can see below it's quite simple, just three words. That's because npm handles the rest of the logic, so if you want to modify your bundle process, remember that package.json is where all the work is happening.
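Assuming your npm script is named bundle, the Hook body is nothing more than:

```shell
npm run bundle
```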
If you find yourself in a situation where npm is not working out for you and your team — maybe because some folks don't have npm and/or node, or because you're experimenting with different versions of node and npm — you can set up a Hook that uses CodeKit's bundled version of node and a custom JS file of your own to run Browserify.
::Warning:: This method relies on everyone having CodeKit installed in the same location, and on CodeKit itself not changing the location of node, which is possible since this is not documented. So… enter at your own risk.
Your Hook would look something like:
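Something like the following — the path to CodeKit's bundled node is a guess based on a typical app bundle layout, since it isn't documented:

```shell
/Applications/CodeKit.app/Contents/Resources/engines/node/bin/node browserify.js
```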
and browserify.js
would look something like:
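A sketch using Browserify's Node API, mirroring the npm script approach:

```javascript
// browserify.js — bundle js/main.js with the babelify transform
var browserify = require('browserify');
var fs = require('fs');

browserify('js/main.js')
  .transform('babelify')
  .bundle()
  .on('error', function (err) { console.error(err.message); })
  .pipe(fs.createWriteStream('js/bundle.js'));
```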
You can find the sample project with all the above files at https://github.com/subhaze/CodeKit_3_Browserify. When you preview the project you should see each module's output logged in your console.
Happy coding!
While some will want a more fleshed-out experience via Browserify or Rollup bundling, using TypeScript isn't that bad, especially when you're really only using one .ts file to do the bundling while the rest of your code can be pure JavaScript.
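The layout is roughly this — bundle.ts and onload.js are named in the post; the two helper file names are placeholders:

```
js/
├── bundle.ts    ← imports the top-level JavaScript files
├── onload.js    ← the top-level file
├── greet.js
└── farewell.js
```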
From the above example you'll see one TypeScript file, bundle.ts, whose sole purpose is to import our top-level JavaScript files and bundle them up into bundle.js. The reason for TypeScript is that it provides built-in bundling so long as you select either the AMD or System format; in this example we'll be using the System format.
The contents of each file:
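That single line of bundle.ts would be the import of the top-level file:

```typescript
// bundle.ts — its only job is to pull the top-level file into the bundle
import './onload';
```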
Yep, just one line. onload.js is the only top-level file in this project; you will most likely have more.
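onload.js might look like this — the helper file and function names are placeholders:

```javascript
// onload.js — loads our other two files and logs their output
import { greet } from './greet';
import { farewell } from './farewell';

console.log(greet('CodeKit'));
console.log(farewell('CodeKit'));
```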
You can see here that onload.js
loads in our other two files and uses them to log their output to the console.
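And the two helper files (again, made-up contents just to demonstrate):

```javascript
// greet.js
export function greet(name) {
  return 'Hello, ' + name + '!';
}

// farewell.js
export function farewell(name) {
  return 'Goodbye, ' + name + '!';
}
```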
So, with all of the above code imported into onload.js, and onload.js imported by bundle.ts, any time you save one of those files CodeKit is now smart enough to build a new bundle.js, producing ES5 code in the System format.
Here you can see CodeKit showing you what files are linked.
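A trimmed sketch of the kind of output TypeScript emits in the System format — the real bundle will differ in detail:

```javascript
System.register("greet", [], function (exports_1, context_1) {
    "use strict";
    function greet(name) {
        return 'Hello, ' + name + '!';
    }
    return {
        setters: [],
        execute: function () {
            exports_1("greet", greet);
        }
    };
});
```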
Now that we have our code bundled up via the System module format, there's one final step. As you can see above, the output uses a global called System which "registers" each file into its own scope and passes in its dependencies. Because of this, we'll need to load in part of the SystemJS library; I say part because we only need the "register" portion of it, which we can grab from a CDN or load locally.
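Loading it might look like this — the CDN URL, the exact SystemJS build you need (it varies by SystemJS version), and the registered module name 'bundle' are all assumptions:

```html
<!-- the register/System-global portion of SystemJS -->
<script src="https://cdn.jsdelivr.net/npm/systemjs/dist/s.min.js"></script>
<!-- once the bundle has loaded, tell SystemJS to execute it -->
<script src="js/bundle.js" onload="System.import('bundle')"></script>
```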
When we load in our bundle.js file we need to set its onload event to tell SystemJS to import our bundled-up code so that it gets executed.
We've gone over the role of each file, but we need to adjust CodeKit slightly for all of this to work correctly.
If you enjoy this workflow you'll probably want to edit CodeKit's "project defaults" instead of doing this for every single new file or project.
For bundle.ts we'll want:
- Output Module Type: System
- ECMAScript Target Version: ES5
- When This File Changes Or Builds: Compile it

And for each JavaScript file:
- Check Syntax With: ESLint
- Transpile With: Nothing
- When This File Changes Or Builds: Process it
- To This Path: the exact path to the JavaScript file you're processing

If you're using ESLint, set:
- Source Type: Module
- Environments: check "ES6"

The code you see above is on Github with a CodeKit config file. I hope this helps folks who are looking for a way to write in the new ES syntax and start using ES Module imports to bundle up their code.
Happy coding!
Currently CodeKit does not support Babel transpiling, but it appears it will at some point in the future. This doesn't mean you can't use CodeKit to transpile with Babel; you'll just need to install Babel locally and use a Hook to do so.
There's one unfortunate thing with this Hook: since we can't trigger a Hook on file save (unless you're running JSLint/JSHint, but then the file name is not passed to the CK_INPUT_PATHS variable), we'll need to create specific output files for our Babel files so that CodeKit will trigger Hooks. This adds a minor bit of complexity, but it's doable with a small amount of work.
With this Hook there are three variables of interest to you: SRC, DIST, and SUFFIX.
SRC: set to use a directory named es6, so if you create your files in some other directory you'll want to change that.
DIST: where Babel will save the transpiled files, currently set to js. This folder must already exist; I may update the Hook later so that it creates the folder if it doesn't.
SUFFIX: needed since we can't set CodeKit to trigger Hooks on file save, hence the minor bit of added complexity. For each file you want this Hook to trigger on, set the output path to *-babel.js (or whatever you change this variable to).
Also, when you set the output file, make sure it's not saving the output in the root of the project but within the es6 directory (or whatever directory you're using).
Here’s the hook criteria:
Any of the following are true: the output path of any processed file ends with -babel.js
Shell Script (/bin/bash)
Here’s the hook code:
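A sketch of what the Hook can look like — the file-naming convention (a foo-babel.js output mapping back to es6/foo.js) is an assumption built from the variable descriptions above:

```shell
SRC="es6"
DIST="js"
SUFFIX="-babel.js"

# CK_OUTPUT_PATHS is a colon-delimited list of the files just produced
IFS=':'
for f in $CK_OUTPUT_PATHS; do
  case "$f" in
    *"$SUFFIX")
      name=$(basename "$f" "$SUFFIX")
      babel "$SRC/$name.js" --out-file "$DIST/$name.js"
      ;;
  esac
done
```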
You could create a main.js file that's used to concat all your Babel-transpiled files and minify them in one go. The contents would look something like:
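Using CodeKit's import comments — the file names below are placeholders:

```javascript
// main.js — concatenates the transpiled files via CodeKit's prepend comments
// @codekit-prepend "module-one.js"
// @codekit-prepend "module-two.js"
```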
From there you can set an output file for main.js
and adjust the Output style:
to your liking.
This is a quick write-up on how to gzip your HTML files with CodeKit 2.3+ Hooks; Jade is used in the example, but any of the available HTML processors should work the same.
First you’ll need to setup a hook, the hook I’ve used is:
Any of the following are true: the filename of any processed file ends with .jade
Now select Shell Script (/bin/sh) from the drop-down and paste the following into the textarea:
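The Hook can be sketched as the following loop (the gzip line is the one the post cites below):

```shell
# CK_OUTPUT_PATHS is a colon-delimited list of the files just produced
IFS=':'
for i in $CK_OUTPUT_PATHS; do
  gzip -nc "$i" > "$i.gz"
done
```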
The above code accesses a newly available environment variable that CodeKit now sets, CK_OUTPUT_PATHS. This variable is a colon-delimited (:) string of all the files produced by the action defined in the Hook above.
The line gzip -nc $i > $i.gz takes the current HTML file, gzips it, and saves a new version with the .gz extension. If you'd rather replace the HTML file itself (keeping the same name, with no .gz extension), redirect the gzipped output to a temporary file and move it back over the original, e.g. gzip -nc "$i" > "$i.tmp" && mv "$i.tmp" "$i".
With this Hook set up, if you save a Jade file that affects only one file, then only that file will be gzipped; if you change a layout that's extended by many other files, then all of those files will be gzipped.
Here is another quick post on using the new Hook variables added in CodeKit 2.3, this time looking at adding timestamps to CSS files.
Even though this is focused on SCSS/CSS, you should be able to take this example and modify it to work with any of the other preprocessors CodeKit has to offer. The main takeaway is adding a unique placeholder for the timestamp in your source files and using sed to replace it with a real timestamp when they're processed by CodeKit.
The hook settings used are:
Any of the following are true: the filename of any processed file ends with .scss
Then, we’ll need to select Shell Script (/bin/sh)
from the select box.
Once that’s all set add the following to the textarea:
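The Hook can be sketched as follows (the timestamp and sed lines are the ones the post cites below):

```shell
# Build the timestamp once, then swap it into each processed file
now=$(date)
IFS=':'
for i in $CK_OUTPUT_PATHS; do
  sed -i -e "s/{{TIMESTAMP}}/${now}/" "$i"
done
```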
We're creating the timestamp with now=$(date), so if you wish to tweak the timestamp format you can find more information here.
The line sed -i -e "s/{{TIMESTAMP}}/${now}/" "$i" uses sed to do an in-place find/replace, overwriting the current file $i with the timestamp created above. The script expects to find {{TIMESTAMP}} somewhere within the CSS file so that it can replace it.
So your SCSS file could look something like:
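For example (the rule below is just filler):

```scss
/* Generated: {{TIMESTAMP}} */
body {
  color: #333;
}
```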
and you will end up with a CSS file like:
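After processing, the placeholder is replaced with the date — something along these lines (the exact format depends on your date command):

```css
/* Generated: Sat Feb 1 10:15:42 EST 2014 */
body {
  color: #333;
}
```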
By deploy I mean syncing what's on your local machine to a remote machine: anything that's been updated gets pushed to the remote server, and anything you've deleted locally gets removed on the remote server.
For me, the most basic way to deploy something is rsync, which simply keeps files and folders synced between two computers. I'm using a Makefile so I can trigger a few CLI commands by typing make in my terminal when I want to deploy. This also makes it easy to add more features when I find the need.
So, first off, let's look at what it takes to sync theme files from our local machine to our remote server, which comes down to the following one-liner:
rsync -a --delete ./ username@example.com:/var/www/ghost/content/themes/YOUR_THEME
In the above we're asking rsync to run in 'archive mode' and delete files on the remote that are not found on the local machine. You'll need to update username to the user you SSH in as, example.com to the domain of the server you're deploying to, and /var/www/ghost/content/themes/YOUR_THEME to the path of your theme.
So, we’ve got a nice one-liner for deploying files, but Ghost doesn’t know that we’ve made these changes. SSH to the rescue, we can make another one-liner to restart the Ghost service on our remote server which will look something like this:
ssh username@example.com service ghost restart
Remember this is assuming your using something like upstart and have a service setup to start/stop/restart Ghost. If you’re using a Digital Ocean 1-Click Application you’ll get this all setup for you.
Now that we have those two lines we can create a Makefile to trigger them for quick deploys. To do this, create a file called Makefile and save it alongside your theme files; its contents should look something like this:
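A minimal version tying the two commands together — adjust the user, host, and theme path to yours (note the recipe lines must be indented with a tab):

```make
deploy:
	rsync -a --delete ./ username@example.com:/var/www/ghost/content/themes/YOUR_THEME
	ssh username@example.com service ghost restart

.PHONY: deploy
```

Because deploy is the first target, running make with no arguments does the same thing as make deploy.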
With that file in place we can now deploy quickly from the terminal. Just fire up your terminal, cd into the theme directory you want to deploy, and run make or make deploy; your files will be synced up to the remote server and Ghost will be restarted.
If you're not familiar with Makefiles, SitePoint has a nice write-up on using them in the front-end world.