<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Greg's Blog]]></title><description><![CDATA[Artisanal Small Batch 1's and 0's]]></description><link>https://gregleeds.com/</link><image><url>https://gregleeds.com/favicon.png</url><title>Greg&apos;s Blog</title><link>https://gregleeds.com/</link></image><generator>Ghost 4.27</generator><lastBuildDate>Wed, 25 Mar 2026 09:42:24 GMT</lastBuildDate><atom:link href="https://gregleeds.com/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Reverse Engineering Sony Camera Bluetooth]]></title><description><![CDATA[<p>I&apos;ve been working on building a homemade camera slider system to make time lapse photos, and I needed a way to trigger my camera (a <a href="https://electronics.sony.com/imaging/interchangeable-lens-cameras/aps-c/p/ilce6100-b">Sony a6100</a>) to start taking photos at the same time that my slider starts moving. &#xA0;My slider is driven by an ESP32</p>]]></description><link>https://gregleeds.com/reverse-engineering-sony-camera-bluetooth/</link><guid isPermaLink="false">61b1882a4da50119488aa522</guid><dc:creator><![CDATA[Greg Leeds]]></dc:creator><pubDate>Fri, 10 Dec 2021 00:59:04 GMT</pubDate><media:content url="https://gregleeds.com/content/images/2021/12/bluetoothremote.png" medium="image"/><content:encoded><![CDATA[<img src="https://gregleeds.com/content/images/2021/12/bluetoothremote.png" alt="Reverse Engineering Sony Camera Bluetooth"><p>I&apos;ve been working on building a homemade camera slider system to make time lapse photos, and I needed a way to trigger my camera (a <a href="https://electronics.sony.com/imaging/interchangeable-lens-cameras/aps-c/p/ilce6100-b">Sony a6100</a>) to start taking photos at the same time that my slider starts moving. 
&#xA0;My slider is driven by an ESP32 microcontroller and has a web interface to set up and run automations. &#xA0;This left me with a few options to control my camera:</p><ul><li>Emulate an IR camera remote</li><li>Emulate a Bluetooth LE camera remote</li><li>Use Sony&apos;s HTTP-based Camera API/SDK</li></ul><p>I tried the IR approach first because it was super simple and well documented. &#xA0;Unfortunately, after building a test circuit with a spare IR LED I took out of an old remote, I discovered that while most Sony a6000-series cameras support IR, some newer ones like my a6100 do not. &#xA0;Whoops.</p><p>Next I looked at the camera&apos;s API. &#xA0;Sony has two versions of an SDK for developing apps that control its cameras: a deprecated one and a newer beta one. &#xA0;Both are pretty poorly documented. &#xA0;Neither is listed as supporting my specific camera model, but Sony&apos;s own applications that presumably use the same APIs work with the a6100, so it seemed worth a shot.</p><p>To test, I enabled smartphone-based remote control in my camera&apos;s settings and connected my laptop to the WiFi Access Point the camera creates. &#xA0;From there I figured out the camera&apos;s IP and API port using the documented Simple Service Discovery Protocol (SSDP) approach.</p><p>Once I had the API base URL I could easily use Postman to snap a shot, with one weird caveat. &#xA0;For whatever reason the camera will ignore requests to take a photo 2/3 of the time if you don&apos;t first ask it to focus. &#xA0;If you send it those two commands in short succession, it will snap a shot, and even provide you with a URL to download the picture it just took.</p><p>This looked like a pretty good approach until I tried to get my camera to connect to my home WiFi instead of operating in Access Point mode.
&#xA0;As best I can tell, the a6100 will only remain connected to a WiFi network if it is placed in &quot;Send to Computer&quot; mode, where it pushes photos to a desktop Sony app and cannot take pictures. &#xA0;Since the microcontroller in my camera slider can&apos;t be connected to two networks at once, I need another solution unless I want to rewrite the slider to be controlled over Bluetooth and build some app to use that instead of a browser.</p><p>With the other two control methods off the table, that leaves Bluetooth. &#xA0;Sony sells a Bluetooth Low Energy remote (the <a href="https://electronics.sony.com/imaging/imaging-accessories/imaging-compact-camera-accessories/p/rmtp1bt">RMT-P1BT</a>) that works with its Bluetooth-enabled cameras, but provides no documentation of the interface. &#xA0;Using a BLE app on my phone I can look at the services and characteristics the camera exposes, which is a good start. &#xA0;One service is of particular interest: <code>8000-ff00-ff00-ffffffffffffffffffff</code>. &#xA0;This service has a notification characteristic that, when you subscribe to it, gives you back a unique ID for every button you press on the camera body. &#xA0;It has a writable characteristic as well. &#xA0;Unfortunately, simply sending the notification values back to the write characteristic doesn&apos;t seem to do anything.</p><p>I saw that the Sony Android app does have a Bluetooth intent declared, even though it looks like it mostly uses the WiFi SDK to control the camera. &#xA0;I tried decompiling the APK and looking through the source for any info, and while it does have a class that handles some BLE communication, it seems to be there solely to aid in discovering and pairing the WiFi connection to the camera using other BLE services.
&#xA0;The <code>ff00</code> service isn&apos;t referenced in the code.</p><p>I also found <a href="https://gethypoxic.com/blogs/technical/sony-camera-ble-control-protocol-di-remote-control">an article</a> online that corroborated the purpose of most of the BLE services, but crucially lacked the actual codes for camera control.</p><p>This left me with one more doable option: buy a remote and use a BLE sniffer to capture the traffic. &#xA0;Luckily there is a <a href="https://www.amazon.com/JJC-RMT-P1BT-Bluetooth-Wireless-Commander/dp/B08CR1QPKQ?th=1">knock-off remote manufactured by JJC</a> that has the same model number but retails for 1/3 the price of the Sony remote. &#xA0;I ordered one on Amazon, and it easily paired to my camera and controls everything I could ask for. &#xA0;I already had a Bluetooth sniffer based on the Nordic nRF51 chip lying around that works with a <a href="https://www.nordicsemi.com/Products/Development-tools/nRF-Sniffer-for-Bluetooth-LE">Python plugin for Wireshark</a>.</p><p>The general idea for Bluetooth sniffing of &quot;secure&quot; communications is that your sniffer needs to be running when you first pair the two target BLE devices whose communications you want to capture. &#xA0;Unfortunately this process can be pretty hard to pull off: other Bluetooth devices nearby can interfere with the capture, and BLE advertising operates on 3 radio channels with channel-hopping behavior, while the sniffer can only listen to one channel at any given time. &#xA0;This means you only have a 33% chance of successfully capturing a given pairing message. &#xA0;I tried and tried to get it to work, including taking the target devices into my backyard away from the RF noise of my house, but nothing worked.</p><p>Before giving up, on a whim I decided to buy a second Bluetooth sniffer. &#xA0;I bought a <a href="https://wiki.makerdiary.com/nrf52840-mdk-usb-dongle/">USB dongle dev board</a> based on the newer nRF52840.
&#xA0;It didn&apos;t have the sniffing firmware preloaded like my other dongle, but Nordic provides the <code>.hex</code> images. &#xA0;The dev kit I used had a file-system-based bootloader instead of DFU, so I had to use a <a href="https://github.com/makerdiary/nrf52840-mdk-usb-dongle/blob/master/tools/uf2conv.py">Python script</a> to convert the <code>.hex</code> to a <code>.uf2</code> file before I could just drag and drop it to the board like a thumb drive. &#xA0;The bootloader no longer seems to allow me to flash the board again that way, but it worked the first time with the sniffer firmware, so that&apos;s a problem for another day if I ever decide I want to use the board for something else.</p><p>The new sniffer worked on the second attempt, and from there I was able to record every button press on the remote. &#xA0;All commands send one code when you press the button down and another when you release it, with the exception of the video record button, which only sends on press and operates like a toggle.
&#xA0;A light on the remote stays on when the camera is recording, but I&apos;m not absolutely sure that it is actually reading that status back out of the camera vs. just tracking its own state.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://gregleeds.com/content/images/2021/12/wireshark.png" class="kg-image" alt="Reverse Engineering Sony Camera Bluetooth" loading="lazy" width="2000" height="1195" srcset="https://gregleeds.com/content/images/size/w600/2021/12/wireshark.png 600w, https://gregleeds.com/content/images/size/w1000/2021/12/wireshark.png 1000w, https://gregleeds.com/content/images/size/w1600/2021/12/wireshark.png 1600w, https://gregleeds.com/content/images/size/w2400/2021/12/wireshark.png 2400w" sizes="(min-width: 720px) 720px"><figcaption>Wireshark capture of remote Bluetooth traffic</figcaption></figure><p>So, finally, presented to the internet for anyone who ever wants to control a modern BLE-equipped Sony camera without going through all of that effort, here&apos;s what you need to do:</p><ol><li>Pair your device with the camera; the cameras don&apos;t appear to respond to commands from unpaired devices and immediately disconnect.</li><li>Write one of the following commands to characteristic <code>0xff01</code> on service <code>8000-ff00-ff00-ffffffffffffffffffff</code>:</li></ol><!--kg-card-begin: markdown--><table>
<thead>
<tr>
<th>Command</th>
<th>Hex Code</th>
</tr>
</thead>
<tbody>
<tr>
<td>Focus Down</td>
<td><code>0x0107</code></td>
</tr>
<tr>
<td>Focus Up</td>
<td><code>0x0106</code></td>
</tr>
<tr>
<td>Shutter Down</td>
<td><code>0x0109</code></td>
</tr>
<tr>
<td>Shutter Up</td>
<td><code>0x0108</code></td>
</tr>
<tr>
<td>AutoFocus Down</td>
<td><code>0x0115</code></td>
</tr>
<tr>
<td>AutoFocus Up</td>
<td><code>0x0114</code></td>
</tr>
<tr>
<td>Zoom In Down</td>
<td><code>0x026d20</code></td>
</tr>
<tr>
<td>Zoom In Up</td>
<td><code>0x026c00</code></td>
</tr>
<tr>
<td>Zoom Out Down</td>
<td><code>0x026b20</code></td>
</tr>
<tr>
<td>Zoom Out Up</td>
<td><code>0x026a00</code></td>
</tr>
<tr>
<td>C1 Down</td>
<td><code>0x0121</code></td>
</tr>
<tr>
<td>C1 Up</td>
<td><code>0x0120</code></td>
</tr>
<tr>
<td>Toggle Record</td>
<td><code>0x010e</code></td>
</tr>
<tr>
<td>Focus In? Down</td>
<td><code>0x024720</code></td>
</tr>
<tr>
<td>Focus In? Up</td>
<td><code>0x024600</code></td>
</tr>
<tr>
<td>Focus Out? Down</td>
<td><code>0x024520</code></td>
</tr>
<tr>
<td>Focus Out? Up</td>
<td><code>0x024400</code></td>
</tr>
</tbody>
</table>
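
For reference, the codes in the table can be bundled into a small Node.js module like this. This is just a sketch: the names (`COMMANDS`, `takePictureSequence`) are made up for the example, and the actual BLE layer that writes these buffers to characteristic `0xff01` (e.g. a library like noble on a desktop, or NimBLE on an ESP32) is assumed and not shown.

```javascript
// The shutter-related byte codes from the table above, as buffers
// ready to be written to characteristic 0xff01.
const COMMANDS = {
  focusDown:   Buffer.from([0x01, 0x07]),
  focusUp:     Buffer.from([0x01, 0x06]),
  shutterDown: Buffer.from([0x01, 0x09]),
  shutterUp:   Buffer.from([0x01, 0x08]),
};

// The four writes needed to reliably take one picture, in order:
// focus down, shutter down, shutter up, focus up.
function takePictureSequence() {
  return [
    COMMANDS.focusDown,
    COMMANDS.shutterDown,
    COMMANDS.shutterUp,
    COMMANDS.focusUp,
  ];
}

module.exports = { COMMANDS, takePictureSequence };
```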
<!--kg-card-end: markdown--><p>To reliably get the camera to take a picture, you&apos;ll want to first send the focus command, so the following 4 commands are needed in this order:</p><ul><li><code>0x0107</code></li><li><code>0x0109</code></li><li><code>0x0108</code></li><li><code>0x0106</code></li></ul><p>Enjoy!</p>]]></content:encoded></item><item><title><![CDATA[Dear FCC]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p><em>A repost of my FCC Filing on &quot;Restoring Internet Freedom&quot;</em></p>
<p>I&#x2019;m writing to encourage the commission to keep the existing Title II based regulation of internet service providers.  Over the past decades, America has seen consolidation among both wired and wireless service providers.  Today most</p>]]></description><link>https://gregleeds.com/dear-fcc/</link><guid isPermaLink="false">61b185d64da50119488aa511</guid><dc:creator><![CDATA[Greg Leeds]]></dc:creator><pubDate>Mon, 29 May 2017 20:16:57 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p><em>A repost of my FCC Filing on &quot;Restoring Internet Freedom&quot;</em></p>
<p>I&#x2019;m writing to encourage the commission to keep the existing Title II based regulation of internet service providers.  Over the past decades, America has seen consolidation among both wired and wireless service providers.  Today most Americans have only one choice for broadband service, usually their cable provider.  By its very nature, last mile service makes it extraordinarily difficult for new providers to enter a market, given the capital costs, access to rights of way, permitting, and community concerns about construction.  Much like power and water, this has created a situation where the free market only allows for one or two competitors.  This is the classic model of a monopoly.  When a market does not naturally lend itself to competition, governments must step in to protect consumers from profit-seeking corporations using their monopoly power to extract unreasonable profits.  The free and open internet that FCC proposal 17-60 claims to cherish must be protected from monopolies.  The FCC must regulate internet service providers for what they are, utilities, and the way to do this within the existing Telecommunications Act powers that the FCC is charged with enforcing is through Title II.</p>
<p>The decrease in network investment that the Commission is using as the justification for reversing course on the 2-year-old rules (point #4) appears to be either unfounded or far too premature to justify the reversal.  While there was a decrease in 2015 as the rules were being established, 2016 saw billions in increased spending from some of the largest providers, including Comcast and AT&amp;T.</p>
<p>The proposal goes on to question the need for bright line rules in point #76 on the basis that consumers have yet to be harmed by them, then suggests eliminating the transparency rules that allow consumers and the FCC to gather any proof of harm.  Removing these rules is dangerous, as providers were already demonstrating their willingness to take advantage of their monopoly position before Title II rules were put in place.  Paid prioritization amounts to protection money that providers are trying to extort from content providers, trying to charge twice to transport the same data.  The internet backbones have long operated on peering agreements (point #87), where providers exchanging roughly the same amount of data would connect their networks without charging for data.  Last mile internet service providers were attempting to extort companies like Netflix into paying them extra, on top of the peering agreement with backbone providers, to be allowed to connect to their customers.  The last mile providers subsequently refused to upgrade their interconnects with the backbone providers transferring Netflix&#x2019;s data to try and pressure them into paying what amounted to extorted protection money.  Meanwhile, before Title II, providers such as Verizon were injecting &#x201C;super cookies&#x201D; into customers&#x2019; packets to allow them to sell their customers&#x2019; tracking data, and Comcast was blocking certain traffic such as BitTorrent.</p>
<p>Wireless providers in some markets have marginally more direct competition than wired service providers, but are oligopolies at best.  Until recently, wireless competition consisted of various zero-rating schemes, which are just a way to make paid prioritization and packet manipulation more attractive to consumers.  AT&amp;T Wireless introduced perhaps the most anti-competitive plan, which made streaming video from their DirecTV subsidiary free to customers while metering all other video providers.  While this can be seen as a nice perk for customers who bundle all of their services together with AT&amp;T, it is clearly anti-competitive to all other video service providers.</p>
<p>All of these examples contradict point #76&#x2019;s assertion of no harm.  Point #89 of the proposal questions the need for transparency, implying that end users can just choose to go to a provider whose stated terms promise that they won&#x2019;t do any of the above.  Nothing has changed in the market since the introduction of the existing Title II rules that would imply that the monopoly internet service providers who were already interfering with customers&#x2019; service as stated above would resist the urge to return to their former behavior, and as stated earlier, most consumers simply do not have a choice in their broadband provider.</p>
<p>I wholeheartedly urge the FCC to keep the existing Title II regulations in place.  The courts have already ruled that Section 706 was inadequate to enforce net neutrality rules, and that Title II was the correct path.  Returning to Section 706 would be accepting that the FCC is going to shirk its mission and do nothing to protect consumers.</p>
<p>Sincerely,</p>
<p>Gregory Owen Leeds</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Building a Static Documentation Site with Metalsmith]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>At work my company&apos;s product team has been using GitHub wiki for years for all of our user-facing documentation.  As they have grown from a small open source project to a much larger team with a more fully featured enterprise offering, they had outgrown GitHub wiki.</p>]]></description><link>https://gregleeds.com/building-a-static-documentation-site-with-metalsmith/</link><guid isPermaLink="false">61b185d64da50119488aa510</guid><category><![CDATA[metalsmith]]></category><category><![CDATA[nodejs]]></category><category><![CDATA[documentation]]></category><dc:creator><![CDATA[Greg Leeds]]></dc:creator><pubDate>Tue, 23 May 2017 19:03:04 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>At work my company&apos;s product team has been using GitHub wiki for years for all of our user-facing documentation.  As they have grown from a small open source project to a much larger team with a more fully featured enterprise offering, they had outgrown GitHub wiki.  We went out in search of a set of tools to build our own self-hosted documentation website with the following set of requirements:</p>
<ul>
<li>An easy-to-use workflow for documentation authors that doesn&apos;t require a programmer or designer to write</li>
<li>The ability to version our documentation</li>
<li>Quick deployments</li>
<li>A technology stack that our developers know and can support</li>
<li>Serverless deployment</li>
</ul>
<p>The go-to for documentation, and GitHub Pages&apos; default, is <a href="https://jekyllrb.com/">Jekyll</a>, which we looked at first.  While Jekyll has a great community and would have been the path of least resistance, the fact that no one on our team had any Ruby experience made us look for more options.  Our core product is written in Java, but we already have some of our supporting infrastructure written in NodeJS, so we started there when looking for tools, and found <a href="http://www.metalsmith.io/">Metalsmith</a> to be the most popular option.  While Metalsmith has plugins for days, it is closer to a box of Lego than a fully assembled system.</p>
<p>Luckily I found and heavily cribbed from the open source documentation for the fantastic <a href="https://docs.particle.io/guide/getting-started/intro/photon/">Particle microcontroller board</a>.  Check them out on <a href="https://github.com/spark/docs">GitHub</a>.  Their working example of Metalsmith gave me enough of a reference to get started.</p>
<h1 id="projectstructure">Project Structure</h1>
<p>Our initial project structure looks something like this:</p>
<pre><code>docs
&#x251C;&#x2500;&#x2500; public
&#x2502;&#xA0;&#xA0; &#x2514;&#x2500;&#x2500; components - Bower working directory
&#x251C;&#x2500;&#x2500; scripts - All of the actual Metalsmith code
&#x251C;&#x2500;&#x2500; src - Source of all content
&#x2502;&#xA0;&#xA0; &#x251C;&#x2500;&#x2500; assets 
&#x2502;&#xA0;&#xA0; &#x2502;&#xA0;&#xA0; &#x251C;&#x2500;&#x2500; doc-media - Images used in docs
&#x2502;&#xA0;&#xA0; &#x2502;&#xA0;&#xA0; &#x2514;&#x2500;&#x2500; images - Images used for all pages
&#x2502;&#xA0;&#xA0; &#x251C;&#x2500;&#x2500; css
&#x2502;&#xA0;&#xA0; &#x2514;&#x2500;&#x2500; markdown - The actual docs, subdirectories correspond to topnav
&#x2502;&#xA0;&#xA0;     &#x251C;&#x2500;&#x2500; api
&#x2502;&#xA0;&#xA0;     &#x251C;&#x2500;&#x2500; development
&#x2502;&#xA0;&#xA0;     &#x251C;&#x2500;&#x2500; guide
&#x2502;&#xA0;&#xA0;     &#x251C;&#x2500;&#x2500; index.md
&#x2502;&#xA0;&#xA0;     &#x2514;&#x2500;&#x2500; install
&#x2514;&#x2500;&#x2500; templates - The Bootstrap layouts for all pages
</code></pre>
<h1 id="settingupmetalsmithpipeline">Setting up Metalsmith Pipeline</h1>
<p>Metalsmith works as a chain of filters that transform an input directory (in our case, a bunch of Markdown in <code>/src/markdown</code>) into the output directory.  There is nothing that says that the input of Metalsmith has to be Markdown, nor that the output needs to be a static HTML site, but it is important to remember that at its core, Metalsmith is transforming the source files, so trying to force it to work on another set of data outside of the source files can be difficult.  At one point we tried to have Metalsmith bulk resize the screenshots we were using in our documentation at the same time it was building, and it proved problematic.</p>
<p>In <a href="https://github.com/gleeds/metalsmith-docs-example/blob/master/scripts/metalsmith.js"><code>/scripts/metalsmith.js</code></a> we script out the core rendering flow as follows:</p>
<pre><code class="language-javascript">var ms = Metalsmith(__dirname)
  .source(&apos;../src/markdown&apos;)
  .destination(&apos;../build&apos;)
  .use(paths())
  .use(helpers({
    directory: &apos;./hbs-helpers&apos;
  }))
  .use(collections({
      home: {
        pattern: &apos;index.md&apos;,
        metadata: {
          name: &quot;Home&quot;
        }
      },
      installation: {
        pattern: &apos;install/*.md&apos;,
        sortBy: &apos;order&apos;,
        metadata: {
          name: &quot;Installation&quot;
        }
      },
      guide: {
        pattern: &apos;guide/*.md&apos;,
        sortBy: &apos;order&apos;,
        metadata: {
          name: &quot;Guide&quot;
        }
      },
      development: {
        pattern: &apos;development/*.md&apos;,
        sortBy: &apos;order&apos;,
        metadata: {
          name: &quot;Development&quot;
        }
      },
      api: {
        pattern: &apos;api/*.md&apos;,
        sortBy: &apos;order&apos;,
        metadata: {
          name: &quot;API&quot;
        }
      }
    }))
  .use(markdown())
  .use(layouts({
    engine: &apos;handlebars&apos;,
    directory: &apos;../templates&apos;,
    default: &apos;template.hbs&apos;
  }))
  .use(assets({
    src: &apos;../src/assets&apos;,
    dest: &apos;../build/assets&apos;
  }))
  .use(assets({
    src: &apos;../src/css&apos;,
    dest: &apos;../build/assets/css&apos;
  }))
  .use(assets({
    src: &apos;../public/components/bootstrap/dist&apos;,
    dest: &apos;../build/assets/bootstrap&apos;
  }))
  .use(assets({
    src: &apos;../public/components/jquery/dist&apos;,
    dest: &apos;../build/assets/jquery&apos;
  }))
  .use(permalinks({
    relative: false
  }))
</code></pre>
<p>At a high level, here is what our rendering pipeline is doing:</p>
<ol>
<li>Configure source and destination directories</li>
<li>Add file path information for each source file to the Metalsmith metadata collection; this helps us build links and the ToC.</li>
<li>Allow JavaScript helpers exported in <code>/scripts/hbs-helpers</code> to be invoked by the Handlebars template.  We use this for a few simple things like highlighting the active collection on the topnav.</li>
<li>Split apart source files into collections based on a matching pattern.  These are used for the topnav and the sidebar navigation as well as the directory each individual page gets rendered into.</li>
<li>Render Markdown into HTML</li>
<li>Inject rendered HTML into the Handlebars template</li>
<li>Force-copy the static assets that live outside the &quot;source&quot; directory into the appropriate output directory.</li>
<li>Move all html files not named <code>index.html</code> into a subdirectory with the same name, and rename them to <code>index.html</code> inside that directory.  This gives us pretty URLs in our static site.</li>
</ol>
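As an illustration of step 3, one of those helpers looks roughly like this. The helper name `activeNavClass` is made up for this example; the real helpers in `/scripts/hbs-helpers` may be named differently.

```javascript
// Sketch of a helper like those in /scripts/hbs-helpers. The helpers
// plugin registers each exported function with Handlebars, so the
// template can call it by name. This one returns the CSS class that
// highlights the active collection on the topnav.
function activeNavClass(currentCollection, navCollection) {
  return currentCollection === navCollection ? 'active' : '';
}

module.exports = activeNavClass;
```

In the template it would be invoked as something like `<li class="{{activeNavClass collection name}}">`.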
<p>The pipeline is then exported so we can reuse it from our separate build scripts.</p>
<h1 id="buildscripts">Build Scripts</h1>
<p>The Metalsmith pipeline we built will compile the entire static site into the <code>/build</code> directory when invoked, but that&apos;s usually not all we want to do.  We built a series of scripts on top of our master pipeline that let us do a few fun things like:</p>
<ul>
<li>Just render the whole thing and quit</li>
<li>Render the site, start a web server to host the content, and watch for any changes to rebuild the site.  This is a great workflow for our documentation writers, as all they need to do is save their Markdown file and hit F5 in their browser to see how their work looks.</li>
<li>Render the site, then deploy it.</li>
</ul>
<p>All of these scripts are run from <code>package.json</code> by doing something like <code>npm run www</code>.</p>
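The relevant section of <code>package.json</code> looks something like this (the script filenames here are illustrative, not copied from our repo):

```json
{
  "scripts": {
    "build": "node scripts/build.js",
    "www": "node scripts/www.js",
    "deploy": "node scripts/deploy.js"
  }
}
```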
<p>Adding extra filters to these scripts is pretty straightforward, like this <a href="https://github.com/gleeds/metalsmith-docs-example/blob/master/scripts/www.js">development server script</a>:</p>
<pre><code class="language-javascript">ms
  .use(watch({
        paths: {
          &quot;${source}/**/*&quot;: true,
          &quot;../templates/**/*&quot;: true,
        },
        livereload: true,
      })
    )
  .use(serve({
    port:3000
  }))
  .build(function(){});
</code></pre>
<h1 id="versioning">Versioning</h1>
<p>Eventually we want to host different versions of our docs that correspond to different releases of our application.  For now we are just tagging the git repo that hosts our content.</p>
<h1 id="deployments">Deployments</h1>
<p>The great thing about static sites is they are dead simple to host.  In our case we copy the site to an AWS S3 bucket and put a CloudFront CDN in front of that.</p>
<p>While Metalsmith has an S3 plugin, I found it easier to just roll my own using the <a href="https://github.com/andrewrk/node-s3-client">Node S3 library</a>, which even runs checksums against all of your files, so it syncs our entire site in just a few seconds.  After the script is done with the upload, it follows up by sending a cache invalidation request to CloudFront.</p>
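The invalidation step looks roughly like this. This is a sketch using the aws-sdk v2 CloudFront API; the distribution ID is a placeholder, and building the request params is split into its own function just so the shape is easy to see.

```javascript
// Build the CloudFront invalidation request sent after the S3 upload
// finishes. CallerReference must be unique per request, so we derive
// it from the current time. The '/*' path invalidates every cached object.
function buildInvalidationParams(distributionId) {
  return {
    DistributionId: distributionId,
    InvalidationBatch: {
      CallerReference: 'docs-deploy-' + Date.now(),
      Paths: { Quantity: 1, Items: ['/*'] },
    },
  };
}

// The actual call (requires AWS credentials; the ID below is a placeholder):
// var AWS = require('aws-sdk');
// new AWS.CloudFront().createInvalidation(
//   buildInvalidationParams('EDFDVBD6EXAMPLE'),
//   function (err) { if (err) console.error('invalidation failed:', err); });
```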
<p>Here are the details of the <a href="https://github.com/gleeds/metalsmith-docs-example/blob/master/scripts/deploy.js">deployment script</a>:</p>
<pre><code class="language-javascript">ms
    .build(function(err){
        if(err) {
            return fatal(err.message);
        }
        else {
            var client = s3.createClient({
                s3Options: {
                    region:&apos;us-west-2&apos;
                }
            });
            
            var params = {
                localDir: __dirname + &apos;/../build&apos;,
                deleteRemoved: true,
                s3Params: {
                    Bucket:&apos;docs-site&apos;
                }
            };

            var uploader = client.uploadDir(params);
            uploader.on(&apos;error&apos;, function(err) {
                console.error(&quot;unable to sync:&quot;, err.stack);
            });
            uploader.on(&apos;progress&apos;, function() {
                console.log(&quot;progress&quot;, uploader.progressAmount, uploader.progressTotal);
            });
            uploader.on(&apos;end&apos;, function() {
                console.log(&quot;done uploading&quot;);
            });
        }
    });
</code></pre>
<p>If you don&apos;t have it set up already from the AWS CLI tool, you&apos;ll need to create a <code>~/.aws/credentials</code> file with your AWS credentials to get the deployments to work.</p>
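That file uses the standard AWS shared-credentials format; with placeholder values it looks like this:

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```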
<h1 id="conclusion">Conclusion</h1>
<p>In the end, our Metalsmith-based documentation website probably took a bit more work to set up than we would have liked, but now that it&apos;s done, we are really happy with the results.  The documentation writers have had a great time with the quick feedback loop of the auto-updating server.  Using git has given us a great way to review documentation updates through pull requests and to version the documentation.  And the deployments are so fast it almost seems like something went wrong.</p>
<p>For the full working example, check out this <a href="https://github.com/gleeds/metalsmith-docs-example">GitHub repo</a>.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item></channel></rss>