Sub 50 10k

Well, it has been a long time since I’ve broken 50 minutes with my 10ks but today was the day. I think the last time I did it was about 6 or 7 years ago. This morning, I really shattered it going sub 49.

Without a doubt, Noom has had a huge impact on my life. It’s been about 10 months since I started following their green/yellow/red approach to tracking what I eat, and I’ve managed to lose and keep off about 40 lbs. Just shedding that weight dramatically improved my running. Recently though, I started mixing in interval runs about once a week. This post from Running Faster is a good one for routines. I don’t quite know why I decided to do them again. It was probably a combination of things, but primarily that I am feeling more confident about my running and I have a smart watch where I can program in the intervals. Anyhow, I had been hovering around the 50 minute mark for a while, trying to break through what I guess was a plateau. I literally knocked a minute off my time this morning.

Imperfect Foods: One Weird Sweet Potato

I’ve been getting my groceries delivered weekly for about a month now using Imperfect Foods. I love these guys. The service, aside from being very convenient, has a mission of helping the environment by focusing on selling meat and produce that doesn’t cosmetically meet the bar for grocery stores. Unsurprisingly, a lot of food doesn’t look perfect; what is surprising is that it typically doesn’t get sold. This is where Imperfect Foods comes in. Instead of letting that food go to waste, they make it available through their delivery service at a great price. I am now saving money on my groceries every week and they are delivered to my front door, basically for free when you think about it.

Last week I got this especially odd-looking sweet potato. It kind of looked like a heart. Aside from looking weird, nothing was wrong with it. I cooked it up in a hash with spinach. Yum!

jolt: JSON Transforms Simplified

Every now and then, a REST API you are working with is a little inconsistent with the JSON it returns. Sometimes a property you want is an array when it has multiple values but an object when there is only one. It’s annoying when you encounter these APIs, and it happens more often than you’d think. Jolt is a package that handles JSON transformations really well and is perfect for this exact situation.
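To make that concrete, here is a hypothetical example (the items property is made up for illustration). When there are multiple values, the API returns an array:

```json
{
    "items": [
        { "id": 1 },
        { "id": 2 }
    ]
}
```

But when there is only one value, the very same property comes back as a bare object, like {"items": {"id": 1}}, which makes deserializing both shapes into a single POJO type a pain.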

What I especially like about it is that it separates JSON transformation logic into separate files that can be read in at runtime. That allows it to be configuration driven as opposed to being baked into your code. Here are a few examples that you can play with to get a feel for things.

Step 1: Include jolt as a dependency

Create a new starter project using Gradle as your dependency manager. Add jolt as a dependency.

implementation group: 'com.bazaarvoice.jolt', name: 'jolt-complete', version: '0.1.1'

Step 2: Get a JSON string to transform

Add this JSON to your src/main/resources folder as a file named jolt-me.js. We are going to use it as the target for our jolt transformations.

{
    "hello": "world",
    "I'm an array": [
        {
            "object": 1
        }
    ]
}

The following snippet will read that JSON file and convert it into a String instance.

// Read the classpath resource jolt-me.js into a single String
Scanner scanner = new Scanner(ClassLoader.getSystemResourceAsStream("jolt-me.js"));
StringBuilder inputJSON = new StringBuilder();
while (scanner.hasNextLine()) {
    inputJSON.append(scanner.nextLine()).append("\n");
}
scanner.close();

Step 3: Create a transformation JSON

All jolt transformations are executed using an instance of Chainr. To create an instance, you feed it JSON that defines the transformations you want it to perform. Jolt provides several types of transformations out of the box. The two I find most useful are Shift and Cardinality.

Step 3.1: Shift transform

Shift transformations are for moving properties around and renaming them; hence the name. Here is a simple example. Save it in a file named shift-spec.js in your src/main/resources folder:

[
    {
        "operation": "shift",
        "spec": {
            "hello": "blue",
            "*": "&"
        }
    }
]

The following snippet loads and runs the shift transformation.

Chainr chainr = Chainr.fromSpec(JsonUtils.jsonToList(ClassLoader.getSystemResourceAsStream("shift-spec.js")));
System.out.println(JsonUtils.toPrettyJsonString(
        chainr.transform(JsonUtils.jsonToObject(inputJSON.toString()))
    )
);

That results in the following output:

{
  "blue" : "world",
  "I'm an array" : [ {
    "object" : 1
  } ]
}

Step 3.2: Cardinality transform

This transformation is by far my favorite. The cardinality transform is specifically for dealing with JSON properties that are sometimes arrays and sometimes objects. In other words, the API you are working with is not consistent in how it returns data. I don’t know why this happens so often, but I suspect the developers of those APIs are trying to make things easier by considering the context in which their endpoints are getting called. However, it really has the opposite effect, especially when you are trying to deserialize that JSON into a POJO.

Create the file cardinality-spec.js in src/main/resources with the contents below.

[
    {
        "operation": "cardinality",
        "spec": {
            "I'm an array": "ONE"
        }
    }
]

As with the previous step, you use a Chainr instance to execute the transformation. For example:

Chainr chainr = Chainr.fromSpec(JsonUtils.jsonToList(ClassLoader.getSystemResourceAsStream("cardinality-spec.js")));
System.out.println(JsonUtils.toPrettyJsonString(
        chainr.transform(JsonUtils.jsonToObject(inputJSON.toString()))
    )
);

That gives the following output. Notice that I'm an array is not an array anymore. It’s an object.

{
  "hello" : "world",
  "I'm an array" : {
    "object" : 1
  }
}

Step 3.3: Chaining transforms

What is really cool is that you can chain transforms together. So let’s combine the previous two transforms into a single JS file called chain-spec.js in your src/main/resources folder. The contents should be the following:

[
    {
        "operation": "cardinality",
        "spec": {
            "I'm an array": "ONE"
        }
    },
    {
        "operation": "shift",
        "spec": {
            "hello": "blue",
            "*": "&"
        }
    }
]

This familiar Chainr snippet will execute it.

Chainr chainr = Chainr.fromSpec(JsonUtils.jsonToList(ClassLoader.getSystemResourceAsStream("chain-spec.js")));
System.out.println(JsonUtils.toPrettyJsonString(
         chainr.transform(JsonUtils.jsonToObject(inputJSON.toString()))
    )
);

That gives you the following. As you can see, both the shift and cardinality transforms were performed on our JSON data.

{
  "blue" : "world",
  "I'm an array" : {
    "object" : 1
  }
}

Closing Thoughts

Jolt is pretty convenient as an off-the-shelf utility for quickly doing JSON transformations. Its JsonUtils implementation is a little clunky, and it is annoying that the API doesn’t support generics, so explicit casting has to be done. Regardless, that is pretty minimal pain compared to having to do these transformations by hand with POJOs.
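To show what I mean by the casting, here is a minimal sketch. The transform method below is just a stand-in for Chainr.transform (which is declared to return a plain Object); everything else is illustration, not jolt’s actual code.

```java
import java.util.HashMap;
import java.util.Map;

public class JoltCastSketch {
    // Stand-in for Chainr.transform, which returns a plain Object
    static Object transform(Object input) {
        return input;
    }

    public static void main(String[] args) {
        Map<String, Object> input = new HashMap<>();
        input.put("blue", "world");

        // No generics on the API, so an unchecked cast is unavoidable
        @SuppressWarnings("unchecked")
        Map<String, Object> output = (Map<String, Object>) transform(input);
        System.out.println(output.get("blue"));
    }
}
```

If the API were generic, that unchecked cast (and the @SuppressWarnings) would simply go away.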

Whole Wheat Biscotti

I’m having a lot of success baking with whole wheat flour these days. In fact, I am rather thrilled that I haven’t needed to make any sacrifices transitioning from regular white flour. Working with nut flours is still giving me some odd results though. I recently made cookies with almond flour and… well… it was like eating ground almonds pressed together. I suppose that should make sense.

Anyhow, here’s my recipe for whole wheat biscotti. Perfect with a hot coffee and not super bad for you. Toss in chopped dried fruit that you have on hand to give it some flavor and texture. When you make it, don’t get discouraged by how sticky the dough is. Remember that you are going to twice bake this cookie. Hence… Biscotti 🙂

Discord YAGPDB Custom Role Commands

A few of my friends asked me recently if I could help them with the Discord bot, YAGPDB, that they use to help organize server roles for people that they play games with. I do love chat bots so I opted to throw some code their way.

Specifically, what they were asking for was two commands for assigning and removing a given role for a list of users. The documentation for YAGPDB isn’t the greatest, and their code samples use a function they developed, userArg, which limits you to 5 calls per custom command execution; not ideal for bulk assignments. Below are the custom commands I created. They do not use userArg, so they aren’t restricted to this limit. I got around it by simply parsing out the user id.

Give Role

This custom command gives the specified role to all listed users. When I deployed it to YAGPDB, I called it give-role which is why I refer to it as such in the usage instructions I wrote.

{{ if (gt (len .CmdArgs) 1) }}
	{{ $role := (reReplace ">" (slice (index $.CmdArgs 0) 3) "") }}
	{{ range $x := seq 1 (len .CmdArgs) }}
		{{ $userId := (reReplace ">" (slice (index $.CmdArgs $x) 3) "") }}
		{{ giveRoleID $userId $role }}
	{{ end }}
{{ $role }} given
{{ else }}
Usage: give-role \@role \@user1 \@user2 ...
{{ end }}

Take Role

This custom command removes the specified role from all listed users. Like the give role command, I called this one take-role when I deployed it.

{{ if (gt (len .CmdArgs) 1) }}
	{{ $role := (reReplace ">" (slice (index $.CmdArgs 0) 3) "") }}
	{{ range $x := seq 1 (len .CmdArgs) }}
		{{ $userId := (reReplace ">" (slice (index $.CmdArgs $x) 3) "") }}
		{{ takeRoleID $userId $role }}
	{{ end }}
{{ $role }} taken
{{ else }}
Usage: take-role \@role \@user1 \@user2 ...
{{ end }}

Closing thoughts

I love how vibrant the Discord bot community is and how it is helping people spend more time gaming and less time on administration and coordination. I was a little surprised that these commands were not already built into YAGPDB since bulk role management is a pretty common task. I hope others find these custom commands as useful as my friends do.

Noom Friendly Whole Wheat Sweet Potato Muffins

Veggie cake is quickly becoming one of my favorite things to bake. You can make something low cal and healthy that also satisfies that cake craving. Sweet potato followed by carrot are my veggies of choice. Sweet potato I find has a natural sweetness to it just like carrot but it is softer and easier to work with. Zucchini in general is popular but I don’t really get it. It is super wet and doesn’t have much flavor. It is kind of just all filler. Check out my recipe for sweet potato muffins. You can whip a batch up in a flash especially if you bake your potato using a microwave.

My Favorite Quick Bread: Irish Soda Bread

I have to say that by far, my favorite quick bread is Irish soda bread. It is something that you can whip up as soon as you wake up in the morning and have with a coffee for breakfast. It takes less than 10 minutes to make the dough, which doesn’t need to prove. 35 minutes in the oven and you’re done. It is honestly one of my favorite ways to start the weekend.

Here’s my recipe for it on Cookpad. There are very few ingredients that go into soda bread. Buttermilk is quite critical though as its acidity in combination with the baking soda is what causes the bread to rise in the oven. I like adding in some sort of dried fruit or berry. Soda bread is generally pretty dense so having something that provides a strong flavor makes it more interesting.

Automatic Updated At Fields In Google Sheets

When working in Google Sheets, especially with others, it is often quite useful to have fields that capture when a cell, column, row, or even an entire sheet was last updated. To do that, you need to use Google Apps Script. Don’t worry if you haven’t really coded before; the sample script below is quite straightforward and can be easily altered to meet your needs.

Step 1: Open the Sheets script editor

In your Google Sheets, you first have to open the script editor. It is in Tools -> Script editor. That will start a new Google Apps Script project with a single script: Code.gs. Be sure to set a project title at the top of the page.

Step 2: Create the script

Replace the contents of Code.gs with the script below. It gets the selected cell on the sheet being viewed. If that cell is in a column after B and row after 1 and it is on Sheet1, then the current date is set in the first cell of the same column.

function onEditSheet1(e) {
  var sheet = SpreadsheetApp.getActiveSheet();
  var cell = sheet.getActiveCell();
  // The edit is in a column after B, a row after 1, and the sheet is Sheet1
  if (cell.getColumn() > 2 && cell.getRow() > 1 && sheet.getName() == "Sheet1") {
    // setValue returns the Range, so the number format can be chained on
    sheet.getRange(1, cell.getColumn()).setValue(new Date()).setNumberFormat("yyyy-MM-dd");
  }
}

Step 3: Add a trigger

To execute the script on edits to your sheet, you need to attach it to a trigger. To view your project’s triggers, click the clock symbol on the script editor’s tool bar. That will open up the project’s triggers dashboard. Click the Add Trigger button at the bottom left of the page. Use the following settings:

  • Function to run: onEditSheet1
  • Deployment: Head
  • Event source: From spreadsheet
  • Event type: On edit

Click Save. You will be prompted to grant permission to run the script with your Google account and informed that the app has not been verified. That is fine since you are the person who created it.

Now go back to your sheet and on Sheet1, try editing some cells. You will see that the cells at the top of the columns you edit get set to the current date.

Closing thoughts

Well now you know enough to be dangerous 🙂

Google Apps Script is incredibly powerful. Here, we are just doing simple cell updates, but where things can get crazy is when you start making calls to third-party APIs. That means you can set triggers that make calls to webhooks. At that point, the possibilities for what you can do with these little scripts are limitless. The reference documentation for Google Apps Script specific to Google Sheets can be found here.

My First Imperfect Foods Box

I finally decided to give Imperfect Foods, a grocery delivery service with an interesting twist, a try. Basically, I got tired of going to the store, especially with the COVID crisis going on. The lines and social distancing have made getting groceries take so long. In fact, even before the crisis, I was getting pretty annoyed having to deal with parking. Even with all that, I had been hesitant to use a delivery service, but then I found Imperfect Foods.

What makes Imperfect Foods interesting is that by and large they sell the produce that grocery stores won’t. There isn’t anything wrong with their fruit and veg aside from being a little misshapen or scarred. Since it is all food that doesn’t make the cut for the shops, they can sell it at a really good price, and if you spend over $60 (easy enough to do for a week’s worth of groceries), you get free shipping.

I’m super happy with this first box I got from them and I’m really looking forward to seeing how they do with my next. It might just be good enough to keep me from going back to my grocery store.

Local Kubernetes using Docker Desktop

Kubernetes is really becoming the de facto standard for how we deploy things to the cloud. It provides a ton of functionality around the docker images it deploys while making optimal use of the hardware it runs on. Having worked with traditional servers and then VMs, I have found Kubernetes quite refreshing. A lot of the tedious things you used to deal with are just handled by default.

Running kubernetes locally is really quite simple. You only need to have docker desktop installed, which is almost a given these days if you are developing software for the cloud. Now if you are someone who has docker desktop installed but hasn’t messed with kubernetes, don’t worry. A lot of folks write apps and build docker images for them without needing to play with it. They are either leveraging a hosted service, like AWS ECS, to run and manage their containers, or they have someone on their team who is hogging all the fun. Here’s a quick tutorial on how to start a container on kubernetes locally.

Step 1: Create a docker image

You might already have an image kicking around for an app you are developing. Let’s assume that you are starting from scratch though. Create a file called Dockerfile with the following contents. alpine is a minimal linux distro and we are going to install HTTPie on it. The working directory is set to be /opt/. The image opens a shell using the command /bin/sh.

FROM alpine:edge

RUN apk add httpie

WORKDIR /opt/

CMD /bin/sh

Build the image using the following command:

docker build --no-cache -t hello-world:docker-desktop .

You can verify that the image is built and working as expected by running it in interactive tty mode. That will allow you to interact with the shell as you would through a console.

docker run -it hello-world:docker-desktop

In the shell, you can run HTTPie from inside the container. That basically proves that your simple docker image was built correctly.

http https://jsonplaceholder.typicode.com/todos/1

No need to clean up. As soon as you exit the shell the container will stop running.

Step 2: Create a kubernetes pod

Now that you have a simple docker image, let’s create a kubernetes pod for it. A pod is a group of one or more containers. In this tutorial, it is just going to be one container that has our small image with HTTPie installed.

Select the docker-desktop context from the docker desktop kubernetes drop down menu. That is the locally running kubernetes instance that you get with docker desktop. You can confirm that you have that context selected using the kubectx tool (kubectl config current-context works too if you don’t have it installed):

kubectx

Let’s create a namespace in your context called hello-namespace. Namespaces let you group together pods. In fact, you can create multiple pods with the same name as long as they exist in different namespaces.

kubectl create namespace hello-namespace

That gives us the kubernetes infrastructure needed for deploying our pod. A pod is defined using manifest files like the one below. They can be either JSON or YAML, but virtually everyone uses YAML. This manifest says to create a pod named hello-pod with a single container called hello-container. The container uses the hello-world image and runs the command sleep 15m, which means it will run for 15 minutes before completing. Copy this manifest into a file named hello-manifest.yaml.

---
apiVersion: v1
kind: Pod
metadata:
  name: hello-pod
  namespace: hello-namespace
spec:
  containers:
    - name: hello-container
      image: hello-world:docker-desktop
      imagePullPolicy: IfNotPresent
      workingDir: /opt/
      command: ["sleep"]
      args: ["15m"]

We can deploy the pod using this command:

kubectl apply -f hello-manifest.yaml

We can now see that our pod has been successfully deployed using get pods.

kubectl get -n hello-namespace pods

Step 3: Shell into your pod

Let’s jump into that pod you just deployed. The command below should look very familiar as it is almost the same as the one we used to shell into your docker container from step 1.

kubectl -n hello-namespace exec -it hello-pod -- /bin/sh

Just like in step 1, let’s run HTTPie.

http https://jsonplaceholder.typicode.com/todos/1

The pod will complete after 15 minutes but we should still clean up and remove it. Use the command below to delete the pod:

kubectl delete -f hello-manifest.yaml

Closing thoughts

This tutorial has walked you through how to deploy a kubernetes pod with a docker image you built. The image that we used is really quite basic, but you should be able to see that with a little effort, you can use the same approach to deploy any image you are working on. There is more to tinker with if you are going to set up a web app or service, but what we covered in this post is a good first step in that direction.