2022

  • Puppy kisses

    I spent December 8th-12th in Port Clinton for the Winter Airgun matches. When we came back from several days away, I had to give our now 7-month-old puppies some love.

  • Have you tried it to see what breaks?

    Last year, our CEO challenged my team to significantly improve our purchase flow, with a bit of time pressure attached.

    For several months, I’d had a good, high-level idea of how we might improve our purchase flow. But, as the idea lacked concrete implementation details, we kept prioritizing other work ahead of it.

    But, with a directive from our CEO and some time pressure, prioritizing other work ahead of this was no longer an option. Knowing this, I reached out to a friend who had experience in the part of the codebase I'd need to make my changes in to ask him for advice on how to move forward. His reply was succinct:

    Have you tried it yet to see what breaks?

    As soon as he asked me this question, as silly as this sounds, it occurred to me that I hadn’t actually tried it. I then had the realization that I had paralyzed myself by thinking of it as a large problem that would take months to untangle rather than a series of small problems that could be addressed as needed.

    Armed with this outlook, I immediately dug into the codebase and emerged with a proof of concept within a couple of days.

    I often think about this interaction when I’m a bit paralyzed with a problem.

  • Getting a list of all Google Fonts

    As part of a recent audit of Google Font usage in a codebase, I needed to get a list of all Google Fonts. But, after a bit of searching, I wasn’t finding an easily consumable list of these fonts.

    It turns out that if you get a Google Fonts API key from the Google Cloud Console, then it’s super simple to query and get a list of fonts. In fact, this is what getting a list of Google Fonts looks like with a single command from the terminal:

    curl --location --request GET 'https://www.googleapis.com/webfonts/v1/webfonts?key=...' | jq ".items[] | .family" --raw-output

    To get this to work, simply replace the ... with your actual key and then run the command.

    Of note, this command does assume that you’ve got jq installed, which is used to parse the JSON response. But, if you’d like, you can simply remove everything from the | jq onward to echo the raw JSON.
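    If you’d rather not hit the API repeatedly, you can save the JSON response once and run the same jq filter against the local file. Here’s a quick sketch using a tiny made-up stand-in for the real response (the actual API returns well over a thousand items):

```shell
# A two-item stand-in for the real API response, saved locally
printf '{"items":[{"family":"Roboto"},{"family":"Open Sans"}]}' > webfonts.json

# Same filter as the curl pipeline above, just reading from the saved file
jq --raw-output '.items[] | .family' webfonts.json
```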

  • Where’d that 7 come from?

    I shot this target last week. Just that one shot that must’ve come from someone else.

  • Target backer after match

    After getting done with the Texas State Indoor Championship this past weekend, one of the most satisfying things is taking a look at the backer and seeing your overall group. It’s especially satisfying the more that I shoot out the black.

  • Simple Github Webhook handler in PHP

    Over the past month or so, I spent a bit of time working on setting up a local development environment for a project that I’m working on. As part of that, I also set up a deployment flow using a simple Github webhook handler in PHP.

    I modeled this Github webhook handler very much after jplitza’s Gist, but I simplified it even further since all I really cared about was whether an event happened, as I’d already filtered on Github for push events.

    This is the handler that I ended up with.

    <?php
    
    define( 'LOGFILE', '/DIR' );
    define( 'SECRET', '0000000000' );
    define( 'PULL_CMD', 'CMD_HERE' );
    
    $post_data = file_get_contents( 'php://input' );
    $signature = hash_hmac( 'sha1', $post_data, SECRET );
    
    function log_msg( $message ) {
    	file_put_contents( LOGFILE, $message . "\n", FILE_APPEND );
    }
    
    if ( empty( $_SERVER['HTTP_X_HUB_SIGNATURE'] ) ) {
    	exit;
    }
    
    if ( ! hash_equals( 'sha1=' . $signature, $_SERVER['HTTP_X_HUB_SIGNATURE'] ) ) {
    	exit;
    }
    
    // At this point, we've verified the signature from Github, so we can do things.
    $date = date( 'm/d/Y h:i:s a', time() );
    log_msg( "Deploying at {$date}" );
    
    $output_lines = array();
    exec( PULL_CMD, $output_lines );
    
    if ( ! empty( $output_lines ) ) {
    	log_msg( implode( "\n", $output_lines ) );
    }
    
    exit;

    Basically, all this file is doing is:

    • Verifying that the request actually comes from Github by creating a signature using our SECRET and then comparing that to the signature that Github sent us with the constant-time hash_equals
    • Running whatever command is necessary. For me, this command is basically just cd dir && git pull origin main.
    • Writing logs so that we can keep track of how often the Github webhook handler is called and whether it’s successful or not.

    To get this to work, you’ll roughly do the following:

    • Create a log file and add the location to the LOGFILE constant.
    • Create a strong secret and add it to the SECRET constant; this value will also be entered on Github so that we can validate that the webhook came from Github.
    • Update the PULL_CMD constant with the command that you’d like to run.
    • Upload this file to your server in a publicly accessible location and make note of what the URL will be.
    • Go to Settings > Webhooks for one of the Github repos that you manage and create a new webhook.
    • Ensure that you set the URL to the file that we uploaded earlier and ensure that you enter the value from the SECRET constant.

    That seems like several steps, but in just a few minutes, you could have your own Github webhook handler in PHP up and running!
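    As a quick sanity check, you can simulate Github’s request by computing the signature yourself and sending it with curl. Here’s a rough sketch using a made-up payload and the placeholder SECRET from above; the handler URL is also just a placeholder:

```shell
# A made-up test payload, signed the same way Github signs deliveries
PAYLOAD='{"zen":"Design for failure."}'
SECRET='0000000000'

# openssl prints the hex digest as the last field of its output
SIG=$(printf '%s' "$PAYLOAD" | openssl dgst -sha1 -hmac "$SECRET" | awk '{print $NF}')

echo "X-Hub-Signature: sha1=$SIG"

# Send the signed payload to the handler (placeholder URL)
# curl -X POST 'https://example.com/webhook.php' \
#   -H "X-Hub-Signature: sha1=$SIG" \
#   --data "$PAYLOAD"
```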

  • Portugal Road Trip 2022

    This past June, following the Crew meetup in Porto, Sara and I rented a car and drove from Porto to Lisbon.

    We stopped at several places along the way as well, including Nazaré and Sintra.

  • Jetpack Crew Meetup in Porto

    In June, I flew to Porto, Portugal to join the Crew team on their meetup.

    After nearly 2 years with Automattic not having meetups, it was great to see some old friends as well as meet several new people that we’d added in the past couple of years.

    Near the beginning of the meetup, I was so jet lagged that I took any chance I could to sleep. Which was basically every time we got in a car. Even when our van for the wine tour broke down and we were stuck on the side of the highway.

    For our work time, we focused on a security workshop and talks such as start, stop, continue. As you can tell by the photos, there was also much time spent on social activities.

    Near the end of our time together, Derek Smart and I announced to the team that I would be leaving the division to become Head of Web Engineering at WordPress.com and that Derek would become Development Lead at Jetpack.

    We spent some time answering questions and then some time jokingly trying to pitch projects over the fence between divisions.

  • How to export Github pull requests

    I had the need recently to export Github pull requests from a repository to a CSV so that I could do some analysis.

    I wasn’t able to find a simple way to do this in the Github UI. When I searched, I found several tools. But, the API seemed quite simple, so I just wrote a script that would dump all pull requests from a Github repository to a CSV.

    #!/bin/bash
    
    # This script requires jq to be installed and available in the path.
    
    TOKEN=$1
    ORG=$2
    REPO=$3
    OUTPUT_RAW=""
    
    get_pull_requests() {
    	curl -s --location --request GET "https://api.github.com/repos/$ORG/$REPO/pulls?state=all&per_page=40&page=$1" \
    		--header "Authorization: token $TOKEN" \
    		--header "Accept: application/vnd.github+json"
    }
    
    get_raw_output() {
    	# Compact the JSON so an empty page reliably compares equal to "[]"
    	printf '%s' "$1" | jq -c '.'
    }
    
    i=1
    while [ "$OUTPUT_RAW" != "[]" ] ; do
    	OUTPUT=$( get_pull_requests "$i" )
    	OUTPUT_RAW=$( get_raw_output "$OUTPUT" )
    
    	i=$((i+1))
    
    	printf '%s' "$OUTPUT" | jq -r '.[] | [ .created_at, .html_url, .user.login, .title ] | @csv'
    done

    To use the script, you’ll need to have jq installed. On a Mac, you can install it with brew install jq.

    The only other prerequisite that you’ll need to export pull requests from Github is a personal access token.

    From there, you should just need to run the script with something like the following to export all of the pull requests for a given repository:

    sh github_pulls_export.sh TOKEN ORG REPO

    If you’d like to change what data gets exported, simply change the fields that are pulled in this section:

    [ .created_at, .html_url, .user.login, .title ]

    You can modify that printf line to get an idea of what other fields are included that you can pull from.
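    One way to see what fields are available is to inspect the keys on a single pull request object. Here’s a sketch using a trimmed-down, made-up stand-in for the list the API returns:

```shell
# A one-item stand-in for the pull request list returned by the API
printf '[{"created_at":"2022-01-01","html_url":"https://example.com/pr/1","user":{"login":"octocat"},"title":"Fix things"}]' > pulls.json

# List the top-level keys available on the first pull request
jq '.[0] | keys' pulls.json
```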

  • Hangout Music Festival 2022

    About a month ago, Sara and I went to Hangout Music Festival in Gulf Shores, Alabama.

    This trip was Sara’s birthday and Christmas gift for 2021. The only thing that she asked for was to see Post Malone in concert. Originally she suggested that we go to Lollapalooza in Europe, but I was sure that we could find Post Malone in concert somewhere closer.

    Then I found Hangout Music Festival which seemed perfect in terms of distance and line-up (at least the line-up for the first day).

    On the way back to the airport in Pensacola, we decided to stop by the beach one last time and got caught in a storm.

  • Family Trip to Beavers Bend

    This weekend, the family went to Beavers Bend State Park in Broken Bow, Oklahoma for some kayaking, hiking, and just hanging out in a cabin.

    Here are some photos from the trip.

  • Camping trip at Lake Bryan

    I went on a camping trip last weekend with some family and Troop 599. This was followed by the Texas A&M vs Auburn baseball game on Sunday before driving back home.

    On the second day of camping, the troop attended the Living History Weekend at Santa’s Wonderland, where we even saw a flamethrower in action.

  • My first meetup in 2 years! Pura Vida

    A few weeks ago, I traveled to Costa Rica for a meetup with our Backup, Scan, and Security Research teams. This was the first time that I had traveled for work in 2 years!

    Over the past couple of years, I’d gotten comfortable with the idea of not traveling. It was relaxing to not be gone from home several weeks per year. This allowed me space to focus on other things like family and my shooting sports, even to a point where I won some things.

    But, going to this meetup has reminded me of what I’ve been missing out on. It’s hard to simulate the impromptu, open, honest, and fun conversations that happen at meetups. Conversations that happen one-on-one as you sit around the pool, are walking to lunch, are at the dinner table, etc.

    It’s also a very humbling and exciting experience as it reminds me how little I know and how much I have to learn.

    Photos

    Below are some photos from the trip. Most are from myself, though I did pull some from a Google Photos album that the team shared.

  • White Sands National Park Trip

    As part of our Spring Break trip to New Mexico this week, my family stopped by White Sands National Park for some sand sledding.

    We bought sleds from the visitors center for $24.99. Note that I said buy, not rent. You can gift the sleds back to the visitors center when you’re done and they’ll give you a koozie in return.

    We just decided to keep our sleds for the next time there’s a snowpocalypse in Texas.

  • Sandia Peak Trip

    This week was Spring Break for our family. So, after I got back from Costa Rica, we packed up the truck and headed to New Mexico.

    On the first day we were in New Mexico, we took the Sandia Peak Tramway up to Sandia Peak and spent a couple of hours exploring.

    The Sandia Peak Tramway was much longer and taller than any tram that I’ve ever been on before. It’s also a very popular destination during Spring Break, which caused us to have to wait about 1.5-2 hours before getting on the tram up.

    Below are some photos from the trip.

  • On comparing large lists

    In the past, I’ve often had to generate email lists of users that fit specific conditions. This usually isn’t too difficult with some of our in-house data tools at Automattic. But, when I hit a case where I have to work across systems, it usually results in me dumping the data from each system and then comparing large lists.

    Comparing large lists on the command line isn’t that difficult. All it takes is a few commands which I’ll walk you through in this blog post!

    So, let’s come up with a theoretical example. Let’s say that I have two separate lists, one of users that have purchased Product A and one of users that have purchased Product B. Next, let’s also agree that these lists contain the user’s email address and the date of purchase. So, in a CSV, the data would look a bit like this:

    email_address,date
    "user@example.com","2022-03-01"

    Prepping the list of users

    Later on, we’re going to use the comm command for the actual comparing of the lists. Before we can use that command, there’s a bit of prep work that we first need to complete. Specifically, we need to:

    • Remove the CSV header row if there is one
    • Filter down the source data to whatever field we want to compare, email addresses in our theoretical example
    • Sort the list
    • Unique the list

    To cut the CSV header from the file, we’re going to use the following:

    sed 1d product_a.csv

    This will remove the first line and print the rest of the file to standard out.

    Next, we’re going to pull out the column that contains email address, or the first column in our dataset above. To do this, we’re going to use the following:

    cut -d',' -f1 product_a.csv

    This command is setting the , as the delimiter and then pulling the first column of the file.

    From here, we simply need to sort and unique, which we can do with the sort and uniq commands. If we put all of the above together, we can run it in a single go for each file:

    sed 1d product_a.csv | cut -d',' -f1 | sort | uniq > product_a_output.csv

    At this point, we just need to take the files and actually compare them.

    Actually comparing large lists

    Alright, now that we have two processed files, we can get to the easy part: comparing. The easiest way I’ve found to compare files for these use cases is the comm command. You can man comm to get a detailed view of how that command works, but in summary, you use it like this:

    comm file1 file2

    That command then outputs three columns, where:

    • The first column is all values that are in file1 and not file2
    • The second column is all values in file2 and not file1
    • The third column is all values in both files

    If you’re only interested in the lines that are in both files, you can do the following, which will remove the first and second columns of values:

    comm -12 file1 file2

    You can then redirect that output to a file or use it however you’d like.
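    Putting it all together, here’s an end-to-end sketch using two tiny sample files (the email addresses are made up):

```shell
# Two small sample exports in the CSV shape described above
printf 'email_address,date\n"a@example.com","2022-03-01"\n"b@example.com","2022-03-02"\n' > product_a.csv
printf 'email_address,date\n"b@example.com","2022-03-03"\n"c@example.com","2022-03-04"\n' > product_b.csv

# Prep each file: drop the header, keep the email column, sort, and de-duplicate
sed 1d product_a.csv | cut -d',' -f1 | sort | uniq > product_a_output.csv
sed 1d product_b.csv | cut -d',' -f1 | sort | uniq > product_b_output.csv

# Only the users that appear in both lists
comm -12 product_a_output.csv product_b_output.csv
# prints "b@example.com"
```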

  • How to expand tilde in bash script

    When I was working on a recent bash script, I was irritated when I wasn’t getting output to my desktop. After a while, I figured out that quoted tildes are not automatically expanded.

    From the Bash manual on tilde expansion:

    If a word begins with an unquoted tilde character (‘~’), all of the characters up to the first unquoted slash (or all characters, if there is no unquoted slash) are considered a tilde-prefix.

    The “quoted” part being the key phrase. Thus, if your tilde is in quotes, it will not be expanded. This issue is demonstrated by the following:

    ➜  ~ path="~/Desktop"
    ➜  ~ cd "$path"
    cd: no such file or directory: ~/Desktop

    Never fear though, there is an easy solution for this: parameter expansion.

    ➜  ~ path="~/Desktop"
    ➜  ~ path=${path/\~/$HOME}
    ➜  ~ cd "$path"
    ➜  Desktop

    What we’re doing here is basically a find and replace. We take in path, search for ~, and replace it with $HOME. This gives us a valid path like /Users/username/Desktop which we can then use in various commands.

    Now, before you actually implement this, maybe consider whether you need to. In my case, the issue was that I was quoting an argument to my script. Instead of using parameter expansion to expand the tilde, I could’ve simply unquoted the path as it was passed to the script.

    But, you know, lessons learned.
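    One small refinement, in case it’s useful: the ${path/\~/$HOME} pattern replaces the first tilde anywhere in the string. In bash, anchoring the pattern with # restricts the replacement to a leading tilde only. A quick sketch with a made-up path:

```shell
# A path with a leading tilde and a literal tilde later in the name
path="~/Desktop/~scratch"

# The # anchor means: only replace the tilde at the start of the string
expanded=${path/#\~/$HOME}

echo "$expanded"
```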

  • Wheel of Life in Google Sheets

    I’ve been working with a career coach since last April. One of the very first things that my coach asked me to do was a wheel of life.

    Except, when she assigned the task to me, she asked me to fill it out by modifying an image. This worked fine, but felt a bit weird to me.

    Fast-forward several months to a point where I saw a Google Sheet that looked very similar to a wheel of life. So, I dug in a bit and found out that the chart is called a radar chart in Google Sheets.

    Anticipating that I’ll need to do more of these wheel of life assignments in the future, I’ve now created a Google Sheet to make it a bit simpler.

    The data for this is quite simple as it only requires the items to track and the values for those items. This data set then becomes the data range for the radar chart.

    Now, the only thing left to do is to set the maximum value of the chart to 10. To do this, go to “Customize -> Vertical Axis” for the radar chart and then set the max value to 10.

    Now, you’ve got a Wheel of Life in Google Sheets.

    If you’d like to start from an existing Google Sheet, then here’s a publicly available wheel of life built with Google Sheets that has randomly generated values.