add the old snippet md files

This commit is contained in:
Travis Shears 2025-06-05 16:20:57 +02:00
parent fc0dd204c7
commit bcf8313a4b
110 changed files with 3048 additions and 0 deletions

View file

@ -0,0 +1,30 @@
---
date: 2023-10-27T22:00:00.000Z
title: How to trust gpg keys
draft: false
snippet_types:
- gpg
seo_description: How to trust gpg keys
---
After moving some GPG keys to a new computer I kept getting these trust warnings.
```
It is NOT certain that the key belongs to the person named
in the user ID. If you *really* know what you are doing,
you may answer the next question with yes.
Use this key anyway? (y/N)
```
To solve this one has to edit the key and set a trust level.
```shell
$ gpg --edit-key 't@coolsite.com'
trust
5
```
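If you'd rather skip the interactive prompt, gpg can also import trust directly from an ownertrust line; a sketch, assuming you substitute your key's real fingerprint for the placeholder:
```shell
# "6" is the ownertrust level for ultimate trust; the fingerprint below is a placeholder
$ echo "AAAA1111BBBB2222CCCC3333DDDD4444EEEE5555:6:" | gpg --import-ownertrust
$ gpg --export-ownertrust    # confirm the fingerprint now shows up with :6:
```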
## Source
* [yanhan](https://yanhan.github.io/posts/2014-03-04-gpg-how-to-trust-imported-key/)

View file

@ -0,0 +1,44 @@
---
title: "auto find ssh keys"
date: 2020-08-12T12:14:15+02:00
draft: false
snippet_types:
- ssh
---
I used to always pass a key when sshing, e.g.:
```shell
$ ssh -i ~/.ssh/de2 travis@vxxxxxxxxxxxxxxxxxxx.megasrv.de
```
That can be a bit annoying. I know two fixes:
# add private key to ssh-agent
```shell
$ ssh-add ~/.ssh/de2
```
Now when you ssh you need not include the `-i ~/.ssh/de2` because the ssh-agent will find it
automatically.
*note: this resets once you reboot*
# configure individual host
You can configure individual hosts to use a specific private key by editing your `~/.ssh/config`:
```
Host de2
HostName vxxxxxxxxxxxxxxxxxxxx.megasrv.de
User travis
IdentityFile ~/.ssh/de2
```
Now you only need to run:
```shell
$ ssh de2
```
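If most of your hosts use the same key, a catch-all block also works; an untested sketch reusing the `de2` key from above:
```
Host *
    AddKeysToAgent yes
    IdentityFile ~/.ssh/de2
```
`AddKeysToAgent yes` also loads the key into the agent on first use, so it sticks around for the rest of the session.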
source -- https://www.techrepublic.com/article/how-to-use-per-host-ssh-configuration/

View file

@ -0,0 +1,19 @@
---
title: "auto nvm use"
seo_description: Configuration to automatically run when starting a shell session in a dir with an .nvmrc file
date: 2023-03-13T10:33:12+01:00
draft: false
snippet_types:
- nvm
---
Simply adding:
```
[ -f ./.nvmrc ] && nvm use
```
to your `.bashrc` or `.zshrc` will automatically run the `nvm use` command should you start a shell in a dir with a `.nvmrc` file.
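The line above only fires for new shells. If you also want it to run when you `cd` into a project, here is a rough zsh-only sketch using a `chpwd` hook (untested, assumes zsh):
```shell
# run `nvm use` whenever the working directory changes and an .nvmrc is present
autoload -U add-zsh-hook
load-nvmrc() {
  [ -f ./.nvmrc ] && nvm use
}
add-zsh-hook chpwd load-nvmrc
```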

View file

@ -0,0 +1,14 @@
---
title: "automatic tmux session names"
date: 2020-07-13T11:38:52+02:00
draft: false
snippet_types: ["tmux"]
---
This month I ditched [XQuartz](https://www.xquartz.org/) and am back to using Tmux. One part I found tedious was manually naming sessions. I wrote this little alias to help. When run from outside a tmux session it creates a new session named after the current directory, e.g. when in `/users/t.shears/dev/cool-app` a session named cool-app is created.
Excerpt from my `.zshrc`
```bash
alias tmux_new="tmux new -s \$(pwd | awk -F "/" '{print \$NF}' | sed 's/\./_/g')"
```
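A roughly equivalent version using `basename` instead of awk, in case that reads easier (untested sketch):
```bash
alias tmux_new='tmux new -s "$(basename "$PWD" | tr . _)"'
```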

View file

@ -0,0 +1,34 @@
---
title: "awk last column"
seo_description: "Shell snippet explaining how to extract the last column using awk"
date: 2022-02-13T10:46:08+01:00
draft: false
snippet_types:
- awk
---
My first [awk](https://en.wikipedia.org/wiki/AWK) snippet! Today I needed to get
all the file extensions in a directory for a blog post I'm writing.
I solved it with:
```shell
$ fd . --type f | awk -F"." '{print $(NF)}' | tr '[:upper:]' '[:lower:]' | sort | uniq | pbcopy
```
## Break down
**fd . --type f**, lists all the files in a directory recursively.
**awk -F"." '{print $(NF)}'**, the **-F"."** tells awk to split columns on ".". The **'{print $(NF)}'** tells awk to print the last column.
Normally you do something like **'{print $2}'** to print the second column.
**tr '[:upper:]' '[:lower:]'**, tr is a Unix utility to translate characters. In
this case all upper case letters will be translated to lower case. I've created a
[separate snippet](/snippets/lower-case) for it as well.
**sort | uniq**, a classic combo sorts the results then gets rid of duplicates.
**pbcopy**, another common one for me, pipes the result into the clipboard.
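If you also want to know how common each extension is, swapping `uniq` for `uniq -c` plus a numeric sort should do it (a sketch built on the same pipeline):
```shell
$ fd . --type f | awk -F"." '{print $NF}' | tr '[:upper:]' '[:lower:]' | sort | uniq -c | sort -rn
```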
[source](https://linuxhint.com/awk_print_last_column_file/)

View file

@ -0,0 +1,15 @@
---
title: "aws cloudfront invalidation sync"
date: 2020-01-11T04:41:11+01:00
draft: false
snippet_types: ["aws"]
---
```shell
$ aws cloudfront create-invalidation --distribution-id E29OAXKYAP0NP8 --paths /work/
```
Pairing well with the sync is invalidating the CDN so that the live site immediately matches the S3 bucket. Like everything these days, there is a cost involved with this operation, so if I'm not in a rush I often avoid it. Also, you can reduce cost by using **--paths** to invalidate only the routes you need.
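When I do want everything refreshed at once, a wildcard path covers the whole distribution (same distribution id as above; AWS counts a wildcard as a single path):
```shell
$ aws cloudfront create-invalidation --distribution-id E29OAXKYAP0NP8 --paths "/*"
```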
inspiration:
[lustforge](https://lustforge.com/2016/02/27/hosting-hugo-on-aws/) - great blog post

View file

@ -0,0 +1,16 @@
---
title: "aws s3 sync"
date: 2020-01-11T04:43:43+01:00
draft: false
snippet_types: ["aws", "s3"]
---
```shell
$ aws s3 sync --acl public-read --sse AES256 build/ s3://travisshears.com
```
Having the ability to send a site live with a single command is heaven: no BASH script with a bunch of moving parts liable to break. This command takes me back to the old days when putting a site live meant an FTP upload to a server, letting Apache take care of the rest. This site, for example, is hosted in an AWS S3 bucket that is connected to AWS CloudFront.
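A sketch of how the sync and the CDN invalidation might be chained into one deploy line, borrowing the distribution id from the invalidation snippet (adjust to your own setup):
```shell
$ aws s3 sync --acl public-read --sse AES256 build/ s3://travisshears.com && \
    aws cloudfront create-invalidation --distribution-id E29OAXKYAP0NP8 --paths "/*"
```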
inspiration:
[lustforge](https://lustforge.com/2016/02/27/hosting-hugo-on-aws/) - great blog post

View file

@ -0,0 +1,38 @@
---
title: "bash extract file name"
seo_description: "How to use basename command to extract just file name from full path"
date: 2022-03-17T15:08:43+01:00
draft: false
snippet_types:
- sh
- bash
---
Comparing the following shows how to use basename to extract just the file name from a full path.
```shell
$ for file in ./content/**/*.md ; do echo $file ; done | head -10
./content/_index.de.md
./content/_index.en.md
./content/_index.ru.md
./content/blog/_index.de.md
./content/blog/_index.en.md
./content/blog/ahrn-valley/index.en.md
./content/blog/archiving-corona-cal/index.en.md
./content/blog/arco/index.de.md
./content/blog/arco/index.en.md
./content/blog/armycookbot/index.de.md
$ for file in ./content/**/*.md ; do file=$(basename $file) && echo $file ; done
_index.de.md
_index.en.md
_index.ru.md
_index.de.md
_index.en.md
index.en.md
index.en.md
index.de.md
index.en.md
index.de.md
```
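For what it's worth, plain parameter expansion gives the same result without spawning `basename` (should work in bash and zsh):
```shell
$ file=./content/blog/arco/index.en.md
$ echo "${file##*/}"     # strip everything up to the last "/"
index.en.md
```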

View file

@ -0,0 +1,24 @@
---
title: "choose over awk"
seo_description: Why I use choose instead of awk nowadays.
date: 2024-03-04T11:52:22+01:00
draft: false
snippet_types:
- awk
- choose
---
Sometimes typing out `awk '{print $1}'` just takes too long. Today I discovered [choose](https://github.com/theryangeary/choose?tab=readme-ov-file).
It makes selecting a field from an output much faster.
Now instead of:
```shell
$ gss | rg awk | awk '{ print $2 }' | xargs hx
```
I run:
```shell
$ gss | rg awk | choose 1 | xargs hx
```
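As far as I recall, choose also handles negative indexes and ranges, which awk makes you spell out; a quick sketch from memory, worth double checking against the README:
```shell
$ echo "a b c d" | choose -1     # last field, like awk's $NF
d
$ echo "a b c d" | choose 1:2    # an inclusive range of fields
b c
```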

View file

@ -0,0 +1,21 @@
---
title: "close the garden"
seo_description: ""
date: 2022-01-24T10:12:03+01:00
draft: false
snippet_types:
- garden-cli
---
Started playing with [garden cli](https://garden.io) today. After playing around
with the local kubernetes deployments I found it annoying that it left some system
containers running when I was finished. To get rid of these, run the following
from the project directory (the dir with project.garden.yml):
```shell
$ garden delete env
$ garden plugins local-kubernetes uninstall-garden-services
```

View file

@ -0,0 +1,29 @@
---
title: "k8s deployment.yaml env vscode snippet"
date: 2020-06-20T14:45:57+02:00
draft: false
snippet_types: ["vscode", "kubernetes"]
---
Most of my personal projects are deployed via kubernetes. I write a lot of
*deployment.yaml* files. In order to keep them clean and checked in to version control
I keep sensitive env variables in config maps. The problem is adding env values to
*deployment.yaml* files is pretty painful. This snippet makes it a little less so.
Placed in *yaml.json* 😀 what a file name!
```json
{
  "env var from configmap": {
    "prefix": "env",
    "body": [
      "- name: $1",
      "  valueFrom:",
      "    configMapKeyRef:",
      "      key: $1",
      "      name: configmapname"
    ],
    "description": "env variable from config map, remember to replace configmapname with your configmap name"
  }
}
```

View file

@ -0,0 +1,17 @@
---
title: "describing a raku variable"
date: 2021-10-15T18:04:19+04:00
draft: false
snippet_types:
- raku
---
My go-to way to figure out what I'm working with in Raku.
```raku
my $res = cool_thing();
say $res.WHAT;
say $res.^attributes;
say $res;
```

View file

@ -0,0 +1,27 @@
---
title: "destructuring an array in javascript"
date: 2021-11-29T09:22:30+01:00
draft: false
seo_description: "Turns out you don't need blank variables at all, simply using commas is enough"
snippet_types:
- js
---
How I used to destructure arrays:
```js
const nums = [1,2,3];
const [a, _, c] = nums;
(a === 1) // true
(c === 3) // true
```
The problem is this **_** is not needed and will cause problems with some ESLint
setups. For example, they might not allow unused variables. Turns out you
can just leave that spot blank!
```js
const nums = [1,2,3];
const [a, , c] = nums;
(a === 1) // true
(c === 3) // true
```

View file

@ -0,0 +1,14 @@
---
title: "disable user"
date: 2020-08-11T16:23:05+02:00
draft: false
snippet_types:
- sysadmin
---
In this case disabling the user named ubuntu from logging in. This includes logging in via ssh.
```shell
$ sudo usermod --expiredate 1 ubuntu
```
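To check the result, or to undo it later, something like this should work (an empty expire date clears the expiry):
```shell
$ sudo chage -l ubuntu                   # inspect the account's expiry status
$ sudo usermod --expiredate "" ubuntu    # re-enable the account
```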

View file

@ -0,0 +1,46 @@
---
title: "diy git remote on nas storage"
seo_description: "How to setup a folder on a network attached storage drive to be a remote git repo."
date: 2022-06-26T15:57:58+02:00
draft: false
snippet_types:
- git
- nas
---
Got a repo with sensitive data you don't want to push to a remote server you don't control? Have a NAS setup on your home network?
Here is how to setup a folder on that NAS to act as a git remote.
**Step 1:**
Change directories to the NAS and clone the local folder with the _--bare_ option.
```shell
$ cd /Volumes/travis/git
$ git clone --bare ~/.password-store
```
This creates **/Volumes/travis/git/.password-store.git** but without a working directory. Basically it's just the **.git** part of the repo.
**Step 2:**
Setup the NAS file path to be a git remote on the repo.
```shell
$ cd ~/.password-store
$ git remote add nas /Volumes/travis/git/.password-store.git
...
```
**Step 3:**
Done. Now just push.
```shell
$ git push nas
Enumerating objects: 8, done.
Counting objects: 100% (8/8), done.
Delta compression using up to 10 threads
Compressing objects: 100% (5/5), done.
Writing objects: 100% (5/5), 1.30 KiB | 1.30 MiB/s, done.
```
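To pull the repo down on another machine, cloning straight from the NAS path should be enough; a sketch assuming the bare repo path created in step 1:
```shell
$ git clone /Volumes/travis/git/.password-store.git ~/.password-store
```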

View file

@ -0,0 +1,22 @@
---
title: "make emacs words closer to vim"
date: 2021-10-20T20:18:34+02:00
seo_description: "code snippet explaining how to make emacs more like vim"
draft: false
snippet_types:
- emacs
---
I love [doomacs](https://github.com/hlissner/doom-emacs) but when jumping around words it bugged me that underscores were not
considered part of a word like in vim. After six months of dealing with it I
stumbled upon a solution!
Simply put this in my **.doom.d/config.el**.
```emacs
(modify-syntax-entry ?_ "w")
```
source: [emacs.stackexchange](https://emacs.stackexchange.com/questions/9583/how-to-treat-underscore-as-part-of-the-word)

View file

@ -0,0 +1,29 @@
---
title: "emacs mac umlauts"
seo_description: "How to type umlauts in emacs on mac"
date: 2021-11-05T13:50:36+01:00
draft: false
snippet_types:
- emacs
---
Recently I've been writing a lot for the German side of my personal site. When typing in German
I prefer to use the English QWERTY keyboard and just alt-u-u to type "ü". The problem I was having
was emacs would intercept this and execute the capitalize-word function 😞. After some digging into
my configs, ~/.doom.d/config.el, I was able to unset **M-u**; the only problem was it still didn't activate
the mac system umlaut feature.
```emacs
(global-unset-key (kbd "M-u"))
```
Finally after some more digging I found:
```emacs
(setq ns-alternate-modifier 'none
ns-right-alternate-modifier 'meta)
```
It works by un-assigning the left alt from meta, allowing the system keyboard feature to kick in.
source: https://emacs.stackexchange.com/questions/61019/umlauts-in-emacs-on-mac

View file

@ -0,0 +1,67 @@
---
title: "emacs replace across multiple files"
seo_description: "emacs tutorial on how to replace / edit across multiple files"
date: 2021-11-28T18:49:21+01:00
draft: false
snippet_types:
- emacs
---
This week I was making some changes to a [hugo shortcode](https://gohugo.io/content-management/shortcodes/) I use on my personal
site for videos.
Old one:
```
{< video-with-caption
remote_url="https://travisshears.com/image-service/videos/galtenberg-ski-tour/over_the_tree.webm"
backup_url="https://travisshears.com/image-service/videos/galtenberg-ski-tour/over_the_tree.mp4"
title="through the woods we go"
>}
```
New one:
```
{< video-with-caption
webm_url="https://travisshears.com/image-service/videos/galtenberg-ski-tour/over_the_tree.webm"
mp4_url="https://travisshears.com/image-service/videos/galtenberg-ski-tour/over_the_tree.mp4"
title="through the woods we go"
>}
```
The problem is this change crosses 50+ files. I knew some ways with sed to regex
substitute it across the files but I wanted something more emacs. Eventually I
found [wgrep](https://github.com/emacsmirror/wgrep)! It allows you to search
with the normal `+default/search-project` then edit the results.
- \<SPC s p> to search the project
- type search ex: "remote_url"
- \<C-c C-o> to open the results
- delete some results with \<d>
- \<C-c C-p> to make the results editable
- make edits example :%s/remote_url/webm_url/g
- \<Z Z> to save changes across all the files
- lastly review changes via git
{{< video-with-caption
webm_url="https://travisshears.com/image-service/videos/emacs-replace-across-multiple-files/emacs_mass_edit_small.webm"
mp4_url="https://travisshears.com/image-service/videos/emacs-replace-across-multiple-files/emacs_mass_edit_small.mp4"
>}}
final patch: https://git.sr.ht/~travisshears/travisshears.com/commit/71e9c89c32f9b9f362e8e94ca8530530c1418284
---
In the making of this snippet I had some other fun:
- this screen recording led me to finding
[keycastr](https://github.com/keycastr/keycastr).
A very helpful mac osx program that displays keys as you type them.
Great for tutorials.
- My personal site is not open source because I write a lot of drafts that don't
get published for months... For this snippet I wanted to show part of that
source code as a patch, and decided on hosting it as a sourcehut paste. To create
the paste I wrote [sourcehut-paste](https://git.sr.ht/~travisshears/sourcehut-paste).
- Video was created with Quicktime screen recording feature plus my video
helper app, [ts-video](https://git.sr.ht/~travisshears/ts-video)

View file

@ -0,0 +1,42 @@
---
title: "extending gpg keys"
date: 2020-06-22T11:58:48+02:00
draft: false
snippet_types: ["gpg"]
---
Don't let those keys expire. 🚨
Time to edit some keys:
```shell
gpg --edit-key t@travisshears.com
Secret key is available.
sec rsa2048/D4C2E4DFAB8BABF8
created: 2018-07-18 expires: 2020-07-17 usage: SC
ssb rsa2048/25C629D0FECC25B9
created: 2018-07-18 expires: 2020-07-17 usage: E
ssb rsa4096/97F7C2B46E6C5D11
created: 2019-09-28 expires: 2023-09-28 usage: E
```
1. select key to change `key 1`
1. `expire`
1. duration: `1y`
1. repeat until every key is updated
resulting output should be something like:
```shell
sec rsa2048/D4C2E4DFAB8BABF8
created: 2018-07-18 expires: 2021-06-22 usage: SC
ssb rsa2048/25C629D0FECC25B9
created: 2018-07-18 expires: 2021-06-22 usage: E
ssb rsa4096/97F7C2B46E6C5D11
created: 2019-09-28 expires: 2021-06-22 usage: E
```
Lastly run `save` and you're done.
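Anyone holding your old public key still sees the old expiry, so it may be worth re-exporting the refreshed public key and sharing it again; a minimal sketch:
```shell
$ gpg --armor --export t@travisshears.com > /tmp/pubkey.asc
```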
source:
- https://keithbeattyblog.wordpress.com/2019/05/04/how-to-extend-your-gpg-key/

View file

@ -0,0 +1,36 @@
---
title: "uploadable ffmpeg screen casts"
date: 2020-01-11T12:33:07+01:00
draft: false
snippet_types: ["ffmpeg", "media"]
---
I prefer tools like https://asciinema.org/ when I want to show terminal tricks and cli tools I
build, but sometimes you need to show something that breaks out of the terminal. In this case I
record a **.mov** screen cast using quicktime then do several transformations using **ffmpeg** to
get a small, upload-ready, web-friendly **.webm** file.
1. cut down the video if need be
```shell
$ ffmpeg -i movie.mp4 -ss 00:00:03 -t 00:00:08 -async 1 cut.mp4
```
2. scale down the **.mov** file to something more web friendly
```shell
$ ffmpeg -i vim_dic.mov -filter:v scale=512:-1 -c:a copy vim_dic_small.mov
```
3. convert the **.mov** to **.webm**
```shell
$ ffmpeg -i vim_dic_small.mov -c:v libvpx -crf 10 -b:v 1M -c:a libvorbis vim_dic.webm
```
If you don't have **ffmpeg**, it is available via brew.
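The three steps can probably be collapsed into one pass if you don't need the intermediate files; an untested sketch with hypothetical file names:
```shell
$ ffmpeg -i screen_cast.mov -ss 00:00:03 -t 00:00:08 -filter:v scale=512:-1 \
    -c:v libvpx -crf 10 -b:v 1M -c:a libvorbis screen_cast.webm
```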
source:
- [brew ffmpeg](https://formulae.brew.sh/formula/ffmpeg)
- [convert mov to webm](https://davidwalsh.name/convert-to-webm)
- [resize mov](https://superuser.com/questions/624563/how-to-resize-a-video-to-make-it-smaller-with-ffmpeg)
- [html video element](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/video)
- [cutting video](https://stackoverflow.com/questions/18444194/cutting-the-videos-based-on-start-and-end-time-using-ffmpeg)

View file

@ -0,0 +1,20 @@
---
title: "file search plus size"
date: 2020-07-02T10:09:24+02:00
draft: false
snippet_types: ["fd", "du"]
---
How big are the fonts in this project? Not even sure where they are.
This command finds the files using [fd](https://github.com/sharkdp/fd) then
prints file size.
```shell
$ fd woff | xargs du -h
20K src/assets/fonts/brandon_text_medium.woff2
20K src/assets/fonts/brandon_text_regular.woff2
20K src/assets/fonts/chronicle_light.woff2
20K src/assets/fonts/chronicle_roman.woff2
4.0K types/woff
4.0K types/woff2
```
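To see the biggest ones first, piping through a human-readable sort should work on reasonably recent sort builds:
```shell
$ fd woff | xargs du -h | sort -h
```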

View file

@ -0,0 +1,17 @@
---
title: "filtered git diff"
date: 2021-06-25T10:49:51+02:00
draft: false
snippet_types:
- git
---
How to browse git diff of two hashes but exclude some files. In this case any file with test in the name won't be in the diff.
```shell
gd xxxsha001xxx...xxxsha002xxx -- $(gd --name-only xxxsha001xxx...xxxsha002xxx | rg --invert-match test)
```
"gd" == "git diff", via zsh git plugin
[source](https://www.yaplex.com/blog/find-list-of-files-which-are-different-between-two-branches-in-git)

View file

@ -0,0 +1,11 @@
---
title: "track down bugs with git bisect"
date: 2020-06-10T10:07:10+02:00
draft: true
snippet_types: ["git"]
---
Git bisect snippet. A rough session outline is sketched after the links.
- [https://git-scm.com/docs/git-bisect](https://git-scm.com/docs/git-bisect)
- [https://social.nixnet.services/@amolith/104315651903786804](https://social.nixnet.services/@amolith/104315651903786804)
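A rough outline of a bisect session, with a hypothetical known-good tag, until this snippet gets fleshed out:
```shell
$ git bisect start
$ git bisect bad                # the current commit is broken
$ git bisect good v1.0.0        # a commit/tag you know was fine (hypothetical)
# git checks out a commit roughly halfway; test it, then mark it:
$ git bisect good               # or: git bisect bad
# repeat until git names the first bad commit, then clean up:
$ git bisect reset
```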

View file

@ -0,0 +1,17 @@
---
title: "find that lost folder"
date: 2020-01-11T05:09:47+01:00
draft: false
snippet_types: ["find", "search"]
---
```shell
$ find . -name php-e\*
```
Where the hell did that /php-extras folder go that I downloaded for my emacs?? Oh there you are!
also **ag -g 'php-e'** would work if you want to stick with silver searcher
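If casing is what trips you up, `-iname` makes the match case-insensitive, and fd would get there too (it is smart-case by default):
```shell
$ find . -iname 'php-e*'
$ fd php-e
```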
source:
[man page](http://man7.org/linux/man-pages/man1/find.1.html)

View file

@ -0,0 +1,40 @@
---
title: "fix git file mode changes"
seo_description: "How to reset file modes via git diff."
date: 2022-06-04T12:16:22+02:00
draft: false
snippet_types:
- git
---
Before changing laptops I backed up all my personal projects to my NAS. When I
transferred them back the file modes got messed up and a **git diff** returned
this:
```shell
diff --git a/docker/Dockerfile b/docker/Dockerfile
old mode 100644
new mode 100755
diff --git a/lib/DeployTool/CLI.rakumod b/lib/DeployTool/CLI.rakumod
old mode 100644
new mode 100755
diff --git a/lib/DeployTool/Config.rakumod b/lib/DeployTool/Config.rakumod
old mode 100644
new mode 100755
```
Git diff shell magic, thanks to [Stanislav Khromov](https://snippets.khromov.se),
to the rescue!
```shell
$ git diff -p -R --no-ext-diff --no-color \
| grep -E "^(diff|(old|new) mode)" --color=never \
| git apply
```
This command uses git diff and some clever grep logic to swap the file modes back to what git remembers them as.
I also converted the snippet to a shell script [here](https://paste.sr.ht/~travisshears/ee89c97e7c6a54401b28d8b71ae1f796468dcb48)
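If the modes keep flip-flopping (common on network drives), telling git to ignore mode changes for that repo might be the easier fix:
```shell
$ git config core.fileMode false
```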
source: [Stanislav Khromov's blog](https://snippets.khromov.se/reset-file-mode-chmod-changes-when-making-a-git-commit/)

View file

@ -0,0 +1,18 @@
---
title: "better git add"
date: 2020-01-11T05:07:50+01:00
draft: false
snippet_types: ["git"]
---
```shell
$ git add -p
```
I tend to use magit in emacs to stage and unstage files/hunks but in a pinch **-p** or **git add -i** +
selecting patch works great. You can choose exactly which hunks you want to stage, leading to cleaner
incremental commits. The bonus to using magit is you can easily edit the file during the process.
source:
[git docs](https://git-scm.com/docs/git-add)

View file

@ -0,0 +1,16 @@
---
title: "discard unstaged changes"
seo_description: "Snippet explaining how to disgard unstage git changes."
date: 2022-09-06T14:01:59+02:00
draft: false
snippet_types:
- git
---
Have a bunch of changes staged and want to drop the rest? Easy:
```shell
$ git restore .
```
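Related commands I reach for in the same situation; the second one is destructive, so treat this as a cautious sketch:
```shell
$ git restore path/to/file.txt   # drop unstaged changes in just one file
$ git clean -fd                  # also remove untracked files and dirs (careful!)
```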
[source](https://stackoverflow.com/questions/52704/how-do-i-discard-unstaged-changes-in-git)

View file

@ -0,0 +1,26 @@
---
title: "force push with --lease for safety"
date: 2020-06-10T10:06:23+02:00
draft: false
snippet_types: ["git"]
---
Just found out, via [tweet](https://twitter.com/joshdholtz/status/1270578433563787264?s=19), from
the insightful [joshdholtz](https://twitter.com/joshdholtz) that there is a safer alternative to
`git push --force`
Time to update my aliases:
```diff
alias gs="gst"
-alias gpf="gp --force"
+alias gpf="gp --force-with-lease"
alias gdc="gd --cached"
```
*I use the zsh shell's git plugin; that is why you see "gst" for `git status` and "gp" for `git push`. Highly recommend it*
**Source:**
- [stackoverflow](https://stackoverflow.com/questions/52823692/git-push-force-with-lease-vs-force)

View file

@ -0,0 +1,16 @@
---
title: "search git logs with grep"
date: 2020-01-11T05:13:57+01:00
draft: false
snippet_types: ["git", "search"]
---
```shell
$ git log --grep=NUTS-5288 --since=3.month
```
Have I already made commits for this Jira ticket?
source:
[gitster blog](https://gitster.livejournal.com/30195.html)

View file

@ -0,0 +1,21 @@
---
title: "move branch"
date: 2020-01-11T05:06:49+01:00
draft: false
snippet_types: ["git"]
---
```shell
$ git rebase source-commit --onto target-branch
```
Sometimes you need to start a new feature branch but the place you need to base the branch on is not
ready yet, say a coworker has merged their changes to a stage branch but not master. With this workflow
you simply start your branch off stage then move it to master later. **git branch -m new-name** also
comes in handy to rename the branch after you've moved it if need be.
source:
[makandracards](https://makandracards.com/makandra/10173-git-how-to-rebase-your-feature-branch-from-one-branch-to-another)
[w3docs](https://www.w3docs.com/snippets/git/how-to-rename-git-local-and-remote-branches.html)

View file

@ -0,0 +1,21 @@
---
title: "soft merge"
date: 2020-01-11T05:05:05+01:00
draft: false
snippet_types: ["git"]
---
```shell
$ git merge feature/tickets/NUTS-1231 --no-commit --no-ff
```
Sometimes you need to merge only partial files, and you want fine control over everything and
possibly want to manually merge parts of files. This is when I use **--no-commit --no-ff**, ff for
fast forward; it basically stages the entire merge, making it easy to go through and make changes.
Using magit in spacemacs I go through hitting 'u' on files I don't want, unstaging them, then maybe
'e' on a file to edit it via ediff, combining the new changes with the original file seamlessly. Overall a
fun and empowering workflow!
source:
[stackoverflow](https://stackoverflow.com/questions/8640887/git-merge-without-auto-commit)

View file

@ -0,0 +1,14 @@
---
title: "see previous commit changes"
date: 2020-01-11T04:45:42+01:00
draft: false
snippet_types: ["git"]
---
`git log -p -2` or `git lg -p -2`
Viewing previous changes was something I relied on GUIs for, like GitLab/Source-Tree, until I found this command! The **-p** stands for **--patch** and the **-2** stands for the last two commits.
source:
[git docs](https://git-scm.com/book/en/v2/Git-Basics-Viewing-the-Commit-History)

View file

@ -0,0 +1,16 @@
---
title: "rewrite history git history"
date: 2020-01-11T04:58:50+01:00
draft: false
snippet_types: ["git"]
---
```shell
$ git rebase -i HEAD~3
```
Using this command you can rewrite a series of commits via dropping, fixing, squashing, and picking. It's most helpful before pushing to a remote repo if you did a bunch of small commits you want to roll into one.
source:
[git docs](https://git-scm.com/book/en/v2/Git-Tools-Rewriting-History)

View file

@ -0,0 +1,39 @@
---
title: "git repo backup"
date: 2020-02-23T10:14:44+01:00
draft: false
snippet_types: ["git", "pass"]
---
Back up a git repo that's not hosted on a remote with:
```shell
$ git bundle create /tmp/pass_"$(date +%s)".bundle --all
```
Then to confirm bundle:
```shell
$ git bundle verify /tmp/pass_1582448923.bundle
```
When you need to restore the backup into a new repo folder:
```shell
$ git clone -b master /tmp/pass_1582448923.bundle newrepo
```
I recently used this to back up my *~/.password-store* [pass](https://www.passwordstore.org/)
repo. Basically it's a folder full of **.gpg** files, each encrypting a password. I don't want to store
it on a remote host so I back it up locally: create a git bundle, then encrypt the resulting bundle
file and store it somewhere safe.
```shell
$ gpg --symmetric ./pass_1582448923.bundle
```
source:
[git-memo docs](https://git-memo.readthedocs.io/en/latest/repository_backup.html)

View file

@ -0,0 +1,16 @@
---
title: "revert an entire feature branch"
date: 2020-01-11T05:12:42+01:00
draft: false
snippet_types: ["git"]
---
```shell
$ git revert -m 1 59e36575c691a05d33d34f403f5a831891df61b2
```
Yeah that whole feature was just a bad idea...
source:
[git docs](https://git-scm.com/docs/git-revert)

View file

@ -0,0 +1,16 @@
---
title: "oops i take that back"
date: 2020-01-11T04:57:24+01:00
draft: false
snippet_types: ["git"]
---
```shell
$ git revert sfjes_examplehash_f32f32h
```
Sometimes you only need to undo a specific commit, often when you have already pushed to origin and can't rebase past a certain point; that's where **git revert** comes in. Simply supply it with a commit hash and it will basically undo that commit's changes. This can be combined with git rebase if you need to git revert several commits then rebase all the reverts into a single revert.
source:
[git docs](https://git-scm.com/book/en/v2/Git-Tools-Rewriting-History)

View file

@ -0,0 +1,18 @@
---
title: git trim
seo_description: Code snippet showing how to use git-trim
date: 2023-02-23T13:09:31+01:00
draft: false
snippet_types:
- git
---
Easily clean up local branches that are old or already merged on the remote.
```shell
$ git trim -s -p
```
- [source](https://codelawd.hashnode.dev/managing-git-branches-with-git-trim#heading-basic-install)
- [git-trim project](https://github.com/jasonmccreary/git-trim)

View file

@ -0,0 +1,16 @@
---
title: "who last edited a file"
date: 2020-01-11T05:14:55+01:00
draft: false
snippet_types: ["git"]
---
```shell
$ git log -1 -- alice/alice/public/themes/core/views/layouts/src/css/main.scss
```
Sometimes you just need to know who to blame.
source:
[git docs](https://www.git-scm.com/docs/git-log)

View file

@ -0,0 +1,54 @@
---
title: "zsh git plugin"
date: 2020-01-27T12:47:22+01:00
draft: false
snippet_types: ["git", "zsh"]
---
I use git exclusively from the command line, and while its interface is very clear, since I spend so
much time in it I'm willing to trade some of that clarity for speed. ZSH has a great plugin for really
fast git tasks. Two of my favorites are pushing while setting an upstream branch and adding patches.
**GPSUP, git set upstream and push shortcut**
I push new git branches a few times a day and my previous work flow was to:
```shell
$ gp
fatal: The current branch NOTICKET-XXX has no upstream branch.
To push the current branch and set the remote as upstream, use
git push --set-upstream origin NOTICKET-XXX
```
followed by the classic [fuck command](https://github.com/ohmyzsh/ohmyzsh/tree/master/plugins/thefuck):
```shell
$ fuck
```
which then runs the proper push command setting the upstream. News flash! This can all be accomplished
with a single plugin command:
```shell
$ gpsup
```
**GAPA, git add patch**
I like small commits and the key to that for me is adding by patch. This shortcut turns
```shell
$ git add --patch
```
to
```shell
$ gapa
```
Can't recommend the zsh git plugin enough! There are too many shortcut commands to be worth
memorizing but I think **gpsup** and **gapa** are worth it. For a big list of commands check out the docs
[here](https://github.com/ohmyzsh/ohmyzsh/tree/master/plugins/git)

View file

@ -0,0 +1,18 @@
---
title: "configure more gitlab runners"
date: 2020-01-11T05:16:03+01:00
draft: false
snippet_types: ["gitlab", "docker"]
---
```shell
$ docker run --rm -t -i -v /home/travis/runner:/etc/gitlab-runner --name gitlab-runner-hypert-web gitlab/gitlab-runner register
```
GitLab's CI pipelines are super powerful and not hard to set up, but you do have to
get runners going. For me, running GitLab out of a docker container, this is
how I add runners to a new project:
source:
[gitlab docs](https://docs.gitlab.com/runner/register/index.html#docker)

View file

@ -0,0 +1,24 @@
---
title: "gpg usb workflow"
date: 2021-06-13T08:49:38+02:00
draft: false
snippet_types:
- gpg
---
How to use a GPG key stored on a flash drive to encrypt files? I was perplexed for some time. Eventually I figured out that instead of exporting, importing, or file system linking... you just point gpg at a separate keyring (homedir) that contains the keys you want!
1. Create the new key on the flash drive with
```shell
$ gpg --full-generate-key --homedir /Volumes/usb_flash_stick/key_homedir
```
2. Use that new public key to encrypt files
```shell
$ gpg --encrypt-files --homedir /Volumes/usb_flash_stick/key_homedir -r XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX ./file_a
```
This also brings the possibility of only storing the public key locally and having the secret key safe on the USB. See
[how to move keys snippet](/snippets/moving-gpg-keys).
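To sanity-check which keys actually live on the stick, pointing --list-keys at the same homedir should do it:
```shell
$ gpg --list-keys --homedir /Volumes/usb_flash_stick/key_homedir
```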

View file

@ -0,0 +1,41 @@
---
title: "gzipping an existing tar"
date: 2020-10-14T09:12:54+02:00
draft: false
snippet_types:
- tar
- gzip
---
Part of my work process is taking lots of screenshots, ~5 per day. Then I back them up in AWS S3
Glacier once a month, using [freeze
app](https://apps.apple.com/us/app/freeze-for-amazon-glacier/id1046095491?mt=12). I like to start by
creating a regular tar file in /tmp.
```shell
$ tar cvf /tmp/pic_dump_14_10_20.tar ~/Desktop/**/*.png
```
Then append a few more images, `r` in this case standing for append.
```shell
$ tar rfv /tmp/pic_dump_14_10_20.tar ~/Pictures/resized/*
```
Now that the tar is complete I double check it by listing the files.
```shell
$ tar tf /tmp/pic_dump_14_10_20.tar
```
Lastly I need to compress the tar. I was confused about whether I could use the tar command itself to compress a
tar into a tar.gz, but it turns out you use gzip.
```shell
$ gzip /tmp/pic_dump_14_10_20.tar
```
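Worth noting tar can create the compressed archive in one go with `z`, but then you lose the ability to append with `r`, which is why I do it in stages; a sketch of the one-shot version and how to unpack it later:
```shell
$ tar czvf /tmp/pic_dump_14_10_20.tar.gz ~/Desktop/**/*.png   # create compressed directly
$ tar xzvf /tmp/pic_dump_14_10_20.tar.gz                      # extract it again later
```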
source: https://alvinalexander.com/blog/post/linux-unix/how-work-files-tar-gzip-tgz/ (4)

View file

@ -0,0 +1,41 @@
---
title: "imagemagik .png to .jpg with white background"
seo_description: "snippet explaining how to use imagemagik to covert images from .png .jpg and make the transparent parts white"
date: 2022-03-03T21:31:00+01:00
draft: false
snippet_types:
- imagemagik
---
Working on some .png map files today. I needed to convert them to small .jpg's for
use in [Oblastle](https://oblastle.fun). The problem being, by default the transparent part
fills in to black. Here is how to make it white:
```shell
$ ls
us-ak.png
us-al.png
us-ar.png
us-az.png
us-ca.png
$ for file in *.png ; do magick mogrify -format jpg -resize '500' -background white -flatten $file; done
$ ls
us-ak.png
us-ak.jpg
us-al.png
us-al.jpg
us-ar.png
us-ar.jpg
us-az.png
us-az.jpg
us-ca.png
$ rm ./*.png
```
The important bit being **-background white -flatten**.
[source](https://stackoverflow.com/questions/7943711/change-the-background-of-an-image-in-image-magick)

View file

@ -0,0 +1,25 @@
---
title: "init yubikey"
date: 2021-07-16T12:29:01+02:00
draft: false
snippet_types:
- yubikey
- gpg
---
Some servers at work require yubikey authentication and for some reason I have
to fetch and verify every time I want to use it.
```shell
$ export SSH_AUTH_SOCK=~/.gnupg/S.gpg-agent.ssh
$ gpg --card-edit
fetch
verify
enter key
via pass yubikey
quit
$ ssh-add -L
$ ssh travis@serverX
```

View file

@ -0,0 +1,16 @@
---
title: "ios xcode simulator dark mode"
seo_description: "How to switch xcode simulator to dark mode"
date: 2022-05-02T14:16:56+02:00
draft: false
snippet_types:
- xcode
---
Recently I was testing an iOS app on my wife's phone. The UI was completely broken.
Turns out she had dark mode enabled. That led me down the path of adding dark mode support to the
app, which is testable via the Xcode simulator if you know how to enable it.
**Command + Shift + A**: Toggles dark mode
source: https://www.kindacode.com/article/how-to-toggle-dark-mode-on-ios-simulator/

View file

@ -0,0 +1,28 @@
---
title: "jq json processor"
date: 2020-01-30T09:10:56+01:00
draft: false
snippet_types: ["JSON", "curl", "jq"]
---
One of my oldest snippets is how to [Pretty print JSON](/snippets/pretty-print-json)
in the shell. This method works great for simple things where you just need to get an idea
of the JSON structure, and it has the bonus of using python which you probably already have
installed. The problem is when you want to do more complex tasks it is quite limited in terms of
parsing. That's where [jq](https://stedolan.github.io/jq) comes in.
```shell
$ curl https://review-noticket-xxxxxxxx.eu/xxxxx/static/loadable-stats.json | jq '.entrypoints .sharedHeader .assets' | rg --invert-match map
```
Simply piping to [jq](https://stedolan.github.io/jq) pretty prints the JSON
but by passing a query string, ex **".entrypoints .sharedHeader .assets"**,
you dig into the JSON and easily get what you need. This is easily
combinable with other shell utilities, like in the example above which gets a
list of asset URLs then uses ripgrep invert-match to clean out the source map
URLs from the list. This is now my preferred way of working with JSON in the shell.
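A tiny self-contained example of the same idea, so you can try the query syntax without the real endpoint (made-up JSON):
```shell
$ echo '{"entrypoints": {"sharedHeader": {"assets": ["a.js", "a.js.map"]}}}' \
    | jq '.entrypoints.sharedHeader.assets[]'
"a.js"
"a.js.map"
```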
source:
- [brew formulae](https://formulae.brew.sh/formula/jq)
- [jq docs](https://stedolan.github.io/jq)

View file

@ -0,0 +1,30 @@
---
title: "jsx comments"
date: 2020-05-26T11:59:43+02:00
draft: false
snippet_types: ["react", "jsx"]
---
The other day at work we had an html comment in jsx slip onto stage. Made me realize I didn't know how to leave comments in jsx myself.
So as a reminder **DON'T** do this:
```jsx
<div>
<!-- comment here -->
<h2>Hello world</h2>
</div>
```
**Instead do this:**
```jsx
<div>
{/* comment here */}
<h2>Hello world</h2>
</div>
```
source:
https://wesbos.com/react-jsx-comments

View file

@ -0,0 +1,31 @@
---
title: "creating k8s registry secrets"
date: 2020-03-29T20:43:14+02:00
draft: false
snippet_types: ["kubernetes"]
---
Hosting side projects in kubernetes and using gitlab container registry? This
is the command I run to create the needed secret for the cluster to pull the
image:
```shell
$ kubectl create secret docker-registry cool-project-gitlab \
--docker-server=registry.gitlab.com \
--docker-username=gitlab+deploy-token-666666 \
--docker-password=xxxxxxxxxxxxxxxxxxxx \
--docker-email=xxxxxxxxx@xmail.com
```
Then in the deployment.yml use the gitlab registry image and newly created image
secret:
```yaml
containers:
  - image: registry.gitlab.com/btbtravis/cool-project:0.0.1
    imagePullPolicy: IfNotPresent
    name: cool-project-api
imagePullSecrets:
  - name: cool-project-gitlab
```

View file

@ -0,0 +1,23 @@
---
title: "list tar contents"
date: 2021-06-13T08:31:14+02:00
draft: false
snippet_types:
- tar
---
Before putting a tar file somewhere hard to access like S3 Glacier I note what is in it. I create a manifest file. Simply list the file names within:
```shell
$ tar tvf example.tar
```
With more automation worked in I came up with this:
- list all the tar files using fd (rust find replacement)
- list content of each one
- pipe that into a file for safe keeping
```shell
$ for x in $(fd -e tar) ; do (tar tvf "$x" && echo "\n") ; done > /tmp/example_manifest
```

View file

@ -0,0 +1,17 @@
---
title: "locally host istanbul js results"
date: 2021-06-23T11:15:24+02:00
draft: false
snippet_types:
- js
- npx
- npm
---
At work we use [istanbul js](https://istanbul.js.org/) for code coverage but at
times the cli output is not enough to debug. Luckily istanbul also outputs an HTML
report that can be viewed with the following command:
```shell
$ npx http-server ./coverage/lcov-report
```

View file

@ -0,0 +1,20 @@
---
title: "lower case string"
seo_description: "Shell snippet expanding how to lower case words"
date: 2022-02-13T12:17:50+01:00
draft: false
snippet_types:
- tr
---
tr is a Unix utility to translate characters, which I recently learned about as part of [this awk snippet](/snippets/awk-last-column).
```shell
$ echo "TRAVIS" | tr '[:upper:]' '[:lower:]'
travis
```
[source](https://www.shellscript.sh/tips/case/)

View file

@ -0,0 +1,18 @@
---
title: "markdown to org file conversion"
date: 2021-06-21T10:58:47+02:00
draft: false
snippet_types:
- org
- markdown
---
I use the [Bear Notes app on IOS](https://bear.app/) which has a nice markdown
export feature. From there it is easy to air drop the file to my mac then I
paste it into my org notes after converting it.
```shell
$ pandoc -f markdown -t org -o note.org /tmp/md_note.md && pbcopy < note.org
```
source: [stackexchange](https://emacs.stackexchange.com/questions/5465/how-to-migrate-markdown-files-to-emacs-org-mode-format)

View file

@ -0,0 +1,14 @@
---
title: "convert .mkv to .mp4"
date: 2020-06-30T16:31:35+02:00
draft: false
snippet_types: ["ffmpeg"]
---
Before I updated my [OBS](https://obsproject.com/) settings to record to .mp4
files I manually converted .mkv files to .mp4. This shell command does that
for every recording in a directory deleting the original.
```shell
$ for x in $(ls *.mkv | awk -F "." '{print $1}') ; do ffmpeg -i $x.mkv -c copy $x.mp4 && rm $x.mkv ; sleep 3; done
```

View file

@ -0,0 +1,24 @@
---
title: "measuring node.js function performance"
date: 2021-08-23T10:42:01+02:00
draft: false
snippet_types:
- js
- node-js
---
How fast is that new function?
```diff
import { xx } from 'xx';
+import { performance } from 'perf_hooks';
@@ -160,7 +161,10 @@ const userSessionMiddleware = async (req: XRequest, res: ExpressResponse, ne
+ var t0 = performance.now();
req.isBot = headersIndicateBot(req.headers);
+ var t1 = performance.now();
+ console.log('Took', (t1 - t0).toFixed(4), 'milliseconds to calculate is bot');
```
source:
https://www.sitepoint.com/measuring-javascript-functions-performance/

View file

@ -0,0 +1,16 @@
---
title: "move file range"
date: 2020-01-27T14:47:22+01:00
draft: false
snippet_types: ["files", "built-ins"]
---
Recently had to move a range of files and some zsh expansions came in handy.
```shell
$ for x in {1..6}; do mv AAAA.S01E0"$x".mkv ./a-$x.mkv; done
```
source:
https://serverfault.com/questions/370403/copy-a-range-of-files-in-command-line-zsh-bash

View file

@ -0,0 +1,33 @@
---
date: 2020-06-20T13:43:46.000Z
title: moving gpg keys
draft: false
snippet_types:
- gpg
seo_description: How to move GPG keys from one computer to the next.
---
New laptop? Got to move over those GPG keys.
```shell
$ cd /tmp && mkdir gpg_export
$ gpg --output gpg_export/main_pub.gpg --armor --export t@travisshears.com
$ gpg --output gpg_export/main_sec.gpg --armor --export-secret-key t@travisshears.com
$ tar cvfz gpg_export.tar.gz ./gpg_export
$ gpg --symmetric ./gpg_export.tar.gz
```
Then move the encrypted tar to the new computer, with airdrop for example.
To import the keys.
```shell
$ gpg --decrypt gpg_export.tar.gz.gpg > gpg_export.tar.gz
$ tar xvfz gpg_export.tar.gz
$ gpg --import gpg_export/main_sec.gpg
$ gpg --import gpg_export/main_pub.gpg
```
Source:
* [https://www.debuntu.org/how-to-importexport-gpg-key-pair/](https://www.debuntu.org/how-to-importexport-gpg-key-pair/)

View file

@ -0,0 +1,25 @@
---
title: "npm i vs npm ci"
date: 2020-08-17T11:42:34+02:00
draft: false
snippet_types:
- npm
- js
---
Today I discovered `npm ci` from a colleague. It does a clean install wiping out the node_modules
before installing. For me this is perfect because I often find myself doing
```shell
$ rm -rf node_modules && npm i
```
No need, just run `npm ci`.
*note: you'll still want to run `npm i` when adding new packages, as `npm ci` only installs exactly what is in the lockfile and does not update `package.json` / `package-lock.json`.*
source:
- https://docs.npmjs.com/cli/install
- https://docs.npmjs.com/cli/ci

View file

@ -0,0 +1,28 @@
---
title: "open notion links"
date: 2020-06-07T12:05:55+02:00
draft: false
snippet_types: ["bash", "notion"]
---
Recently I've started using [Notion app](https://www.notion.so) for note
taking. There came times when I wanted to write about a specific directory,
but then came the problem of how to link the dir to its notion page. How about a
tiny bash script? OSX's built-in open command is capable of opening links so
I copied the link from notion and put it in a `docs.sh` script. Boom! It opens
the notion page but in the browser not the app 😟.
After some digging I found this
[repo](https://github.com/creold/open-in-notion/) for a chrome extension that
opens notion links in the app. Taking a look at the source code it simply
adds a `/native` in the url to open in the app. Updated my script and now it
works 😀.
```bash
#!/usr/bin/env bash
open https://www.notion.so/native/Custom-Brew-Tap-e6f4faef5130442f94c9b06575806fc0
exit 0
```
If I continue to do this perhaps I'll write a small tool to create the script files for me.

View file

@ -0,0 +1,34 @@
---
title: "org-roam capture templates"
date: 2021-04-06T11:29:54+02:00
draft: false
snippet_types:
- org
- emacs
---
Recently I've started using [org-roam](https://www.orgroam.com/), so far so
good. Utilizing capture buffers I create notes for work and my reefing aquarium
hobby. Adding the roam tags manually became a pain so now I've figured out a way to
prefill them with capture templates.
```lisp
(setq org-roam-capture-templates
'(("r" "reef" plain (function org-roam-capture--get-point)
"%?"
:file-name "%<%Y%m%d%H%M%S>-${slug}"
:head "#+title: ${title}\n#+roam_tags: reef"
:unnarrowed t)
("w" "work" plain (function org-roam-capture--get-point)
"%?"
:file-name "%<%Y%m%d%H%M%S>-${slug}"
:head "#+title: ${title}\n#+roam_tags: work"
:unnarrowed t)
("d" "default" plain (function org-roam-capture--get-point)
"%?"
:file-name "%<%Y%m%d%H%M%S>-${slug}"
:head "#+title: ${title}\n"
:unnarrowed t)))
```
source: https://www.reddit.com/r/orgmode/comments/lmlsdr/simple_question_re_orgroam_how_to_access_capture/

View file

@ -0,0 +1,18 @@
---
title: "bulk import into pass"
date: 2020-01-11T04:51:47+01:00
draft: false
snippet_types: ["pass"]
---
```shell
$ passimport list.csv
```
Switching to [Pass](https://www.passwordstore.org/) was not exactly a straightforward process. It lacks a built-in mass import feature and I was dealing with a few hundred passwords; as a programmer, entering them manually was unthinkable. After looking around at several plugins for [Pass](https://www.passwordstore.org/) nothing seemed simple enough so I wrote my own python script to handle the task. I later turned that script into an executable, run by this command, and pushed it to GitHub.
[my repo](https://github.com/BTBTravis/basic-pass-import)
source:
[pass docs](https://www.passwordstore.org/)

View file

@ -0,0 +1,15 @@
---
title: "copy password from pass to the keyboard"
date: 2020-01-11T04:49:16+01:00
draft: false
snippet_types: ["pass"]
---
```shell
$ pass -c github
```
Switching to [Pass](https://www.passwordstore.org/), a CLI based password manager, was a big time saver for me. I was using [Padlock](https://padlock.io/), a minimalist open source electron based manager, but was wasting so much time waiting for the GUI to load up, entering my master password, scrolling to the desired entry, and clicking. I'm not a time-saving purist but it was a downright annoying UX pattern. Now I don't even have to leave the comfort of my keyboard.
source:
[pass docs](https://www.passwordstore.org/)

View file

@ -0,0 +1,15 @@
---
title: "search pass from password"
date: 2020-01-11T04:53:40+01:00
draft: false
snippet_types: ["pass"]
---
```shell
$ pass list | ag aws
```
Being a CLI interface, the UX of [Pass](https://www.passwordstore.org/) fits amazingly into the rest of the shell ecosystem. Can't remember whether you have a password for AWS saved? Run this.
source:
[pass docs](https://www.passwordstore.org/)

View file

@ -0,0 +1,37 @@
---
title: "per company git config"
seo_description: "how to configure git to use different emails per company / folder"
date: 2022-06-03T10:49:40+02:00
draft: false
snippet_types:
- git
---
Started a new job this week and I wanted to have a separate email on my work-related repos than on my
personal ones. The cool thing is git supports
[conditional config file includes](https://git-scm.com/docs/git-config#_conditional_includes)!
**~/.gitconfig**
```gitconfig
# per-user git config
[user]
name = Travis Shears
email = t@travisshears.com
[includeIf "gitdir:~/company-x/"]
path = .gitconfig-company-x
```
**~/.gitconfig-company-x**
```gitconfig
# Company X specific git config
[user]
name = Travis Shears
email = travis.shears@company-x.com
```
Now any commits made under the directory **~/company-x** will use the email
**travis.shears@company-x.com** and not my personal email.
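To double check which identity a given repo picks up, asking git for the effective value from inside the repo should settle it (the repo path here is hypothetical):
```shell
$ cd ~/company-x/some-repo
$ git config user.email
travis.shears@company-x.com
```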
[source](https://stackoverflow.com/questions/8801729/is-it-possible-to-have-different-git-configuration-for-different-projects)

View file

@ -0,0 +1,25 @@
---
title: "percol to pick lines"
seo_description: "How to use percol to pick lines mid pipe"
date: 2023-05-16T15:37:29+02:00
draft: false
snippet_types:
- percol
---
Deep in an emacs tutorial I discovered this hidden gem.
> "percol adds flavor of interactive selection to the traditional pipe concept on UNIX."
[project github](https://github.com/mooz/percol)
You can simply place it in the middle of a pipe and pick the options you want.
Example:
```shell
$ BUCKET="s3://cool-bucket" aws s3 ls $BUCKET | awk '{print $4}' | percol |sed "s;^;${BUCKET}/;" | xargs aws s3 rm
delete: ...
```
{{< asciicast-with-caption id="585386" title="demo" >}}

View file

@ -0,0 +1,15 @@
---
title: "pretty print json"
date: 2020-01-11T04:37:52+01:00
draft: false
snippet_types: ["curl", "json"]
---
```shell
$ curl -X GET https://some.json.endpoint.com | python -m json.tool
```
Need to check an API response but stuck reading a garbled mess? Try piping the result to python's json.tool.
source:
[stackoverflow](https://stackoverflow.com/questions/352098/how-can-i-pretty-print-json-in-a-shell-script)

View file

@ -0,0 +1,24 @@
---
title: "prevent vim auto new lines"
date: 2020-08-13T10:06:37+02:00
draft: false
snippet_types:
- vim
---
Sometimes when typing, vim will automatically start a new line. This is expected behavior but at
times can be really annoying. E.g. working with macros: you can record one on a short line that
breaks on longer lines 😟. The amount of text before vim breaks to a new line while typing is
controlled via the **textwidth** setting. So the fix is pretty simple. If you don't want the behavior just set
**textwidth** to a big number, e.g.:
```
: set tw=500
```
Here is an asciicast of the problem and solution in action:
{{< asciicast-with-caption id="353148" title="demo of setting tw" >}}
source -- https://stackoverflow.com/questions/1272173/in-vim-how-do-i-break-one-really-long-line-into-multiple-lines

View file

@ -0,0 +1,17 @@
---
title: "pushing to remote branch with different name"
date: 2021-09-08T13:02:41+02:00
draft: false
snippet_types:
- git
---
Sometimes you rename a branch locally and want to push it to a remote branch with a different name.
This is how:
```shell
$ git push -u origin localBranch:remoteBranch
```
source: [stack-overflow](https://stackoverflow.com/questions/36139275/git-pushing-to-remote-branch)

View file

@ -0,0 +1,16 @@
---
title: "random password"
seo_description: "Create a random password on the commandline"
date: 2023-10-18T13:46:35+02:00
draft: false
snippet_types:
- openssl
---
This will generate a short random password (base64 of six random bytes).
```shell
$ openssl rand -base64 6
s3u3fi
```

View file

@ -0,0 +1,31 @@
---
title: "re-export javascript function"
date: 2021-10-05T08:45:34+02:00
draft: false
snippet_types:
- js
---
When splitting logic into files in JavaScript it sometimes happens you have a folder of files; say
for a single domain like Google analytics tracking. There comes a time in another part of the code
where you need to import some functions and types from this GATracking folder but you may not want
to dig into which file in the folder that function lives in. Re-exporting allows us to keep the API of the
GATracking logic concise and lets the rest of the program not worry about how exactly the files
are structured. Example:
src/client/hooks/GATracking/index.tsx
```ts
export { default as TrackingAction } from './actions';
export const track = (action: TrackingAction) =>
...
```
Then somewhere else:
src/client/components/GoButton/index.tsx
```ts
import {TrackingAction, track} from '../hooks/GATracking';
...
```
source: https://stackoverflow.com/questions/34576276/re-export-a-default-export

View file

@ -0,0 +1,43 @@
---
title: "re export"
seo_description: "Code snippet explaining re-exporting enums in TypeScript"
date: 2022-01-05T09:13:55+01:00
draft: false
snippet_types:
- typescript
---
Today when writing a React hook I imported an enum type from another section of
code. This enum was used as an argument to the hook. Any components that used
this hook would also need to import that type enum. So just to use the hook at a
minimum the component would need to import two things: the hook itself and the type
enum. Too many imports cluttering things up!
```ts
import useCoolHook from 'Client/hooks/CoolHoook';
import HatStyleEnum from 'Client/hats/styles';
...
cool = useCoolHook(HatStyleEnum.cowboy);
```
What I found is the ability to re-export the type from the hook to keep the enum
bundled with the hook and more easily accessible.
```ts
export {HatStyleEnum} from 'Client/hats/styles';
export const useCoolHook = (hatId: HatStyleEnum) => {
...
```
This way the components only have to have one import, saving space.
```ts
import useCoolHook, {HatStyleEnum} from 'Client/hooks/CoolHoook';
```
Also see [similar snippet for js](/snippets/re-export-js-fn/)
source: [stackoverflow](https://stackoverflow.com/questions/36790911/reexport-class-in-typescript)

View file

@ -0,0 +1,34 @@
---
title: how to recover deleted file with git
seo_description: "short git tutorial on how to recover deleted files with git"
date: 2023-09-12T09:30:17+02:00
draft: false
snippet_types:
- git
---
I'd like to build a new nest.js command based off a file I deleted last week.
Here is how I go about getting that old file back via `git restore`
Step 1, find the file:
```shell
$ git log --name-only
/command
fd8bc2a6 Merge branch 'XXX-1111-remove-search-queue' into 'main'
fd07ddc3 XXX-1111: Remove search queue
apps/cli/src/cli.module.ts
apps/cli/src/db/backfill-search-queue.command.ts
apps/cli/src/db/db.module.ts
...
```
Step 2, restore the file:
```shell
$ git restore --source fd07ddc3~1 apps/cli/src/db/backfill-search-queue.command.ts
```
Note the `~1`. The file in question was deleted in commit fd07ddc3, so to
restore it we need to go one commit before it was deleted. `~1` does exactly
that.
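If the log is long, `--diff-filter=D` narrows it to commits that deleted files, which should make step 1 quicker; a sketch using the same path:
```shell
$ git log --diff-filter=D --name-only -- apps/cli/src/db/backfill-search-queue.command.ts
```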

View file

@ -0,0 +1,64 @@
---
date: 2022-06-05T10:26:08.000Z
title: remap §/± to `/~ on mac osx
draft: false
snippet_types:
- mac
- osx
seo_description: How to remap (§) section sign to (`) backslash on mac.
---
Got a new Macbook Pro with the start of my new job and I love it. Except it did
not come with an English keyboard. Now every time I try to type a backtick (\`)
or tilde (\~) I instead get "§" or "±" 😔. Lucky for me there are a bunch of people
with the same issue.
The base command to remap the key is:
```shell
$ hidutil property --set '{"UserKeyMapping":
[{"HIDKeyboardModifierMappingSrc":0x700000064,
"HIDKeyboardModifierMappingDst":0x700000035}]
}'
```
but to make this survive a computer restart things get a little more complicated.
For that you need a LaunchDaemon.
/Library/LaunchDaemons/org.custom.backslash-key-remap-fix.plist
```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>Label</key>
<string>org.custom.backslash-key-remap-fix</string>
<key>ProgramArguments</key>
<array>
<string>/Users/travis.shears/projects/scripts/bin/backslash-key-remap-fix.sh</string>
</array>
<key>RunAtLoad</key>
<true/>
<key>KeepAlive</key>
<false/>
</dict>
</plist>
```
/Users/travis.shears/projects/scripts/bin/backslash-key-remap-fix.sh
```shell
#!/bin/sh -e
hidutil property --set '{"UserKeyMapping":
[{"HIDKeyboardModifierMappingSrc":0x700000064,
"HIDKeyboardModifierMappingDst":0x700000035}]
}'
```
## Sources:
* [https://rakhesh.com/mac/using-hidutil-to-map-macos-keyboard-keys/](https://rakhesh.com/mac/using-hidutil-to-map-macos-keyboard-keys/)
* [https://www.grzegorowski.com/how-to-remap-single-mac-keyboard-key](https://www.grzegorowski.com/how-to-remap-single-mac-keyboard-key)
* [https://superuser.com/questions/37042/remapping-of-keys-in-mac-os-x](https://superuser.com/questions/37042/remapping-of-keys-in-mac-os-x)

View file

@ -0,0 +1,30 @@
---
title: "remove brew packages"
seo_description: "A snippet explaining how to remove top level brew packages"
date: 2022-01-18T10:31:48+01:00
draft: false
snippet_types:
- brew
---
Trying to clean up my laptop a bit today by removing some unused brew packages.
Normally I would use `brew list` but this also lists packages that are dependencies.
Here is a way to list the top-level packages:
```shell
$ brew leaves
asciinema
aspell
autoconf
awscli
bat
bumpversion
...
$ brew uninstall neomutt newsboat travisshears/tap/deploy_tool weechat
...
```
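After uninstalling top-level packages, brew can also sweep up dependencies nothing needs anymore; worth reviewing its output before trusting it blindly:
```shell
$ brew autoremove
```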
[source](https://apple.stackexchange.com/questions/101090/list-of-all-packages-installed-using-homebrew)

View file

@ -0,0 +1,55 @@
---
title: "remove common lines with comm"
seo_description: "How to remove common lines from files using the comm command"
date: 2023-05-24T12:52:11+02:00
draft: false
snippet_types:
- comm
---
With the help of ChatGPT I discovered this unix gem today.
Given two files
main_deck.txt:
```
1 Abzan Charm
1 Arcane Bombardment
1 Arcane Sanctum
1 Arcane Signet
1 Archmage Emeritus
1 Bant Charm
1 Boros Charm
1 Bring to Light
1 Brokers Charm
1 Brokers Confluence
```
considering.txt:
```
1 Abzan Charm
1 Arcane Sanctum
1 Taigam, Ojutai Master
1 Archmage Emeritus
1 Boros Charm
1 Tamanoa
1 Time Wipe
1 Trap Essence
1 Brokers Confluence
```
running
```shell
$ comm -23 <(cat considering.txt | sort) <(cat main_deck.txt | sort)
1 Taigam, Ojutai Master
1 Tamanoa
1 Time Wipe
1 Trap Essence
```
Returns a list of cards that are unique to considering.txt.
If your text files are sorted you can skip the `<(cat ..)` magic.
And yes, I'm using the cli to help build Magic the Gathering decks!

View file

@ -0,0 +1,17 @@
---
title: "remove gps metadata from image"
seo_description: code snippet on how to use exiftool in the shell to remove gps metadata from images
date: 2023-01-31T12:53:04+01:00
draft: false
snippet_types:
- exiftool
---
Recently I noticed some of the images on my site had GPS metadata. That's
not very private! So I did some googling and found this solution.
```shell
$ exiftool -GPS*= $(fd -e jpg -e png -e jpeg)
```
[source](https://exiftool.org/forum/index.php?topic=12615.0)

View file

@ -0,0 +1,22 @@
---
title: "restart nginx"
date: 2021-02-07T00:33:09+01:00
draft: false
snippet_types:
- nginx
---
Normally I use
```shell
$ sudo systemctl restart nginx
```
but recently I had to take more drastic measures.
```shell
sudo pkill -f nginx & wait $!
sudo systemctl start nginx
```
source -- [stackoverflow](https://stackoverflow.com/questions/14972792/nginx-nginx-emerg-bind-to-80-failed-98-address-already-in-use)

View file

@ -0,0 +1,10 @@
---
title: "run changed tests"
date: 2021-02-01T15:48:43+01:00
draft: false
snippet_types:
- npm
- git
---
```shell
$ npm run test -- $(git diff master --name-only | rg 'test\.ts') --coverage=false
```

View file

@ -0,0 +1,25 @@
---
title: "s3 copy recursive"
seo_description: "short snippet explaining how to upload folders recursivly to s3 from the cli"
date: 2022-03-03T21:39:59+01:00
draft: false
snippet_types:
- aws
- s3
---
Where were you **--recursive** when I was hacking around with
**for file in * ; do aws s3 cp \$file.....**
```shell
$ aws s3 cp ./ s3://travisshears.images/image-service/images/oblastle/context/ --profile personal --recursive
upload: ./us-ak.jpg to s3://travisshears.images/image-service/images/oblastle/context/us-ak.jpg
upload: ./us-co.jpg to s3://travisshears.images/image-service/images/oblastle/context/us-co.jpg
upload: ./us-ar.jpg to s3://travisshears.images/image-service/images/oblastle/context/us-ar.jpg
upload: ./us-ca.jpg to s3://travisshears.images/image-service/images/oblastle/context/us-ca.jpg
upload: ./us-al.jpg to s3://travisshears.images/image-service/images/oblastle/context/us-al.jpg
upload: ./us-ct.jpg to s3://travisshears.images/image-service/images/oblastle/context/us-ct.jpg
...
```
**--recursive** is an easy win to upload a bunch of files to s3.

View file

@ -0,0 +1,19 @@
---
title: "s3 move folder / rename folder"
seo_description: "snippet showing how to move files from one s3 'folder' to another"
date: 2022-03-17T15:08:04+01:00
draft: false
snippet_types:
- s3
- aws
---
Working on my [Oblastle](https://oblastle.fun) game today. In an effort to standardize the way I store images for the game in S3 I needed to move all the files with key **image-service/images/oblastle/flags/** to **/image-service/images/oblastle/flag/**.
Here is how I did it:
```shell
$ aws s3 mv s3://travisshears.images/image-service/images/oblastle/flags/ s3://travisshears.images/image-service/images/oblastle/flag/ --recursive --profile personal
```
[source](https://bobbyhadz.com/blog/aws-s3-rename-folder)

View file

@ -0,0 +1,17 @@
---
title: "i need a file off my server but i don't want to set up ftp"
date: 2020-01-11T05:11:00+01:00
draft: false
snippet_types: ["scp"]
---
```shell
$ scp -i ~/.ssh/privkey travis@199.199.19.199:/home/travis/example.txt ./
```
Yeah, sometimes you just need to move files around. For any server you have ssh access to, you can use
that same key to send files over ssh.
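It works just the same in the other direction, local source first and remote destination second (add **-r** if you need a whole directory):

```shell
$ scp -i ~/.ssh/privkey ./example.txt travis@199.199.19.199:/home/travis/
```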
source:
[man page](http://man7.org/linux/man-pages/man1/scp.1.html)

View file

@ -0,0 +1,102 @@
---
title: "for loops in bash / zsh shells"
date: 2020-01-12T11:54:38+01:00
draft: false
snippet_types: ["zsh", "built-ins"]
---
Looping directly in the shell is something I do at least once a week and it's a great time saver.
Here I want to run through a concrete example of how to use for loops to mass move and rename files.
During the process of getting this snippet section of my site off the ground I realized I had put a bunch
of markdown files into named directories when I could have just named the files better and done without
the directories. What I needed to do was **mv** the files out of these dirs and rename them to the
name of the dir plus add the ".en.md" extension.
Here is the old directory structure:
```shell
$ tree
.
└── snippets
├── _index.en.md
├── aws-cloud-front-inval
│   └── index.en.md
├── aws-s3-sync
│   └── index.en.md
├── ffmpeg-screen-casts
│   └── index.en.md
├── find-folder
│   └── index.en.md
├── git-better-git-add
│   └── index.en.md
├── git-log-grep
│   └── index.en.md
├── git-move-branch
│   └── index.en.md
.
.
```
To start with I performed the operation in question a single time to prove it worked:
```shell
$ mv ./who-is-using-that-port/index.en.md ./who-is-using-that-port.en.md && rm -r ./who-is-using-that-port
```
Then I copied this command to the clipboard for later, used command history to pull up a
previous for loop command, and switched the shell command entry to external editing
in vim, for me **\<C-x>\<C-e>**. As we are essentially writing a quick bash script in-line, we need vim
powers! While in vim, if you need the file list, a quick **:r !ls** does the trick.
These little efficiencies, like using command history, are not just faster but mean you
don't have to remember the bash syntax.
Resulting vim buffer with the for loop:
```shell
for x in \
aws-s3-sync \
ffmpeg-screen-casts \
find-folder \
git-better-git-add \
git-log-grep \
...
git-move-branch \
vim-window-resize ; do mv ./"$x"/index.en.md ./"$x".en.md && rm -r "$x" ; done
```
Resulting new file structure:
```shell
$ tree
.
└── snippets
├── _index.en.md
├── aws-cloud-front-inval.en.md
├── aws-s3-sync.en.md
├── ffmpeg-screen-casts.en.md
├── find-folder.en.md
├── git-better-git-add.en.md
├── git-log-grep.en.md
├── git-move-branch.en.md
.
.
```
Perfect, all files have been renamed properly and the empty directories deleted. This technique has
lots of other applications besides moving and renaming files. Earlier this week, while debugging an API
at work, I dumped a bunch of JSON responses into a file so I could search for a translation key.
```shell
$ for x in \
https://api.poeditor.com/v2/download/file/xxx \
https://api.poeditor.com/v2/download/file/xxx \
https://api.poeditor.com/v2/download/file/xxx \
https://api.poeditor.com/v2/download/file/xxx \
https://api.poeditor.com/v2/download/file/xxx \
https://api.poeditor.com/v2/download/file/xxx \
https://api.poeditor.com/v2/download/file/xxx \
https://api.poeditor.com/v2/download/file/xxx \
https://api.poeditor.com/v2/download/file/xxx \
https://api.poeditor.com/v2/download/file/xxx ; do curl $x >> /tmp/translations.json; done
```

View file

@ -0,0 +1,41 @@
---
title: "using short server names"
date: 2020-08-12T12:04:32+02:00
draft: false
snippet_types:
- ssh
---
It used to be that when I wanted to ssh into one of my servers I would just hit \<Ctrl>\<r> in the shell to
get my fzf backwards command search, type ssh, then scroll up and down until I found the correct server. As
my number of servers has grown this is no longer manageable because I don't remember which IPs align
with which server. Time for some human-readable names.
By adding something like this to your `~/.ssh/config`:
```
Host de1
HostName vxxxxxxxxxxxxxxxxxxxx.megasrv.de
User travis
Host de2
HostName vyyyyyyyyyyyyyyyyyyyy.megasrv.de
User travis
Host de3
HostName vzzzzzzzzzzzzzzzzzzz.megasrv.de
User travis
Host nyc1
HostName 123.123.123.1233
User travis
```
sshing becomes as easy as:
```shell
$ ssh de2
```
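A nice side effect is that scp and sftp read the same `~/.ssh/config`, so the short names work there too:

```shell
$ scp de2:/home/travis/example.txt ./
```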
source -- https://www.techrepublic.com/article/how-to-use-per-host-ssh-configuration/

View file

@ -0,0 +1,15 @@
---
title: "silver searcher, it's like grep but faster and easier"
date: 2020-01-11T04:54:57+01:00
draft: false
snippet_types: ["ag", "search"]
---
```shell
$ ag -G .php 'the meaning of the universe'
```
Life on the command line means grepping for things on a daily basis. After doing this for a while I memorized so many flags, like grep's **-n** for line numbers, **-i** for case-insensitive, and even the **.** to specify the directory. Then I discovered [silver searcher](https://github.com/ggreer/the_silver_searcher). Not only was the name a callback to one of my favorite comic book heroes, but it basically worked just as well as grep without all the flags, and seemingly faster. It is now my go-to command for searching files, whether it be inside vim via **:r !ag -G .php 'something'** or from the cli. It even has different outputs depending on the situation: in the cli it opens a temp window to browse results like **less**, while when piping or in vim it outputs a more machine-readable chunk of text with file paths, line numbers, and a single line of code.
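The couple of flags I do still reach for are simple ones; roughly like this (double-check `ag --help` for the exact spellings):

```shell
# just list the matching files
$ ag -l 'the meaning of the universe'

# skip a noisy directory
$ ag --ignore node_modules 'the meaning of the universe'
```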
source:
[silver searcher](https://github.com/ggreer/the_silver_searcher)

View file

@ -0,0 +1,46 @@
---
title: "sort numerically"
seo_description: "how to sort lines numerically in unix shell"
date: 2023-05-11T09:14:37+02:00
draft: false
snippet_types:
- sort
---
Working with sitemap XML files in AWS S3 today, and the default sort is a bit hard to read.
Example:
```shell
$ aws s3 ls s3://cool-bucket | awk '{print $4}'
sitemap.brands.xml
sitemap.episode.0.xml
sitemap.episode.1.xml
sitemap.episode.10.xml
sitemap.episode.11.xml
sitemap.episode.2.xml
sitemap.episode.20.xml
sitemap.episode.21.xml
sitemap.episode.22.xml
sitemap.episode.23.xml
...
```
Using `sort -V` sorts the lines numerically!
Working example:
```shell
$ aws s3 ls s3://cool-bucket | awk '{print $4}' | sort -V
sitemap.brands.xml
sitemap.xml
sitemap.episode.0.xml
sitemap.episode.1.xml
sitemap.episode.2.xml
sitemap.episode.3.xml
sitemap.episode.4.xml
sitemap.episode.5.xml
sitemap.episode.6.xml
sitemap.episode.7.xml
...
```

View file

@ -0,0 +1,15 @@
---
title: "split files"
date: 2021-06-13T08:31:03+02:00
draft: false
snippet_types:
- split
---
To prevent upload errors to S3 Glacier I keep my files at around 500 MB, so larger ones must be split.
```shell
$ split -b 500mb example.txt example.txt.part
```
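To put the pieces back together later, concatenating them in order is enough, since split's default suffixes (`aa`, `ab`, ...) sort correctly:

```shell
$ cat example.txt.part* > example.txt
```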
[perl script example](https://paste.sr.ht/~travisshears/1327a69aea4e86e17894aeec9e16011338280b8a)

View file

@ -0,0 +1,18 @@
---
title: "strip audio from video file"
date: 2021-01-09T15:27:08+01:00
draft: false
snippet_types:
- ffmpeg
---
An easy way to remove audio from a video file using ffmpeg:
```shell
ffmpeg -i $input_file -c copy -an $output_file
```
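The reverse, keeping only the audio, works with almost the same flags; a rough sketch (the output extension has to match the audio codec, e.g. `.m4a` for AAC, which is an assumption about your input):

```shell
# -vn drops the video stream, -c:a copy keeps the audio untouched
ffmpeg -i $input_file -vn -c:a copy $output_audio_file
```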
source: [superuser](https://superuser.com/questions/268985/remove-audio-from-video-file-with-ffmpeg)

View file

@ -0,0 +1,21 @@
---
title: "creating a temporary tabel in sqlite"
date: 2021-09-21T10:33:46+02:00
draft: false
snippet_types:
- sqlite
- sql
---
Today I needed to export some data from an SQLite table to CSV, as part of my poverty line project.
Using [sqlitebrowser](https://sqlitebrowser.org/) I could see the table had too much data and I only wanted
to export a few columns. To accomplish this I learned how to create a temporary table and export that instead.
```sql
DROP TABLE IF EXISTS processing_data_export;
CREATE TEMPORARY TABLE processing_data_export AS
SELECT x,y, county_id FROM grid;
```
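For the CSV export itself, the `sqlite3` CLI can also do it in one shot if you'd rather skip the GUI; a minimal sketch (the database file name is just a placeholder), which for a one-off like this doesn't even need the temp table:

```shell
$ sqlite3 -header -csv poverty.db "SELECT x, y, county_id FROM grid;" > export.csv
```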
source: [stackoverflow](https://stackoverflow.com/questions/26491230/sqlite-query-results-into-a-temp-table)

View file

@ -0,0 +1,21 @@
---
title: "mutliplex all the shells"
date: 2020-01-11T04:59:56+01:00
draft: false
snippet_types: ["tmux"]
---
```shell
$ tmux new -s 'example_session_name'
```
Creates a new tmux session named 'example_session_name', which is very powerful and will
keep running even if you close the terminal.
Once inside, use the tmux prefix (**\<Ctrl + b>** by default) followed by **"** to split the pane top/bottom
or **%** to split it left/right, and move between panes with the prefix plus an arrow key (or Vim directions
if you have them bound).
You can detach and go back to your shell at any time with **prefix + d**, then reattach later with
**tmux attach -t 'example_session_name'**. If you forget the name, a simple **tmux ls** shows the running
sessions.
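For reference, the detach / reattach round trip from the shell looks like this:

```shell
# list the running sessions, then reattach by name
$ tmux ls
$ tmux attach -t 'example_session_name'
```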
[cheat sheet](https://git-scm.com/book/en/v2/Git-Tools-Rewriting-History)

View file

@ -0,0 +1,27 @@
---
title: "tmux plus screen"
date: 2020-08-17T14:19:16+02:00
draft: false
snippet_types:
- screen
- tmux
---
Recently I was sshed into my home pi server trying to sftp some big files from a remote server.
Some of the transfers were huge, 30 GB+. On our internet that would take a while. Not
wanting to leave the ssh terminal session open that whole time, I used
[screen](https://linux.die.net/man/1/screen) on the pi.
The idea was to create a screen session, start the transfer, detach, and come back a few hours later.
Detaching was a bit tricky. <Ctrl\><a\> <d\> is the detach command for both the tmux running on my
mac and the screen running on the pi. So when I tried to detach from the screen session, tmux would
detach instead. 😡
After messing with configs and some searching, it turns out that if you press the tmux prefix key twice it
sends it through once to the child shell. So eventually I was able to detach from the screen session with:
**<Ctrl\><a\> <Ctrl\><a\> <d\>**!!
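The rough workflow on the pi ended up looking like this (the session name is just an example):

```shell
# inside the ssh session: start a named screen session
$ screen -S transfer

# start the sftp transfer in it, then detach with
#   <Ctrl-a> <Ctrl-a> <d>   (prefix twice because of the outer tmux)

# later: list sessions and reattach to check on progress
$ screen -ls
$ screen -r transfer
```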
sources:
- [stackoverflow](https://stackoverflow.com/questions/8518815/how-to-send-commands-when-opening-a-tmux-session-inside-another-tmux-session)
- [screen basics](https://www.howtogeek.com/662422/how-to-use-linuxs-screen-command)

View file

@ -0,0 +1,16 @@
---
title: "trust gpg key"
seo_description: "how to trust a gpg key on mac and stop the annoying popups related to non trusted
gpg keys"
date: 2022-06-04T10:50:22+02:00
draft: false
snippet_types:
- gpg
---
Just moved computers and thus moved my [pass store](https://www.passwordstore.org/). After importing
my gpg keys, [explained in this snippet](/snippets/moving-gpg-keys), I had to trust them in order to stop the annoying
warnings every time I created a new password.
[source](https://yanhan.github.io/posts/2014-03-04-gpg-how-to-trust-imported-key/)

View file

@ -0,0 +1,25 @@
---
title: "twtxt config alias"
date: 2020-05-30T11:56:31+02:00
draft: false
snippet_types: ["zsh", "twtxt"]
---
Recently I started using [twtxt](https://twtxt.readthedocs.io/en/stable/). I
installed it with brew, then set it up via the quick setup command. The
problem is that when you put the config file in a location other than the default, every time
you call the command you get a
**"✗ Config file not found or not readable. You may want to run twtxt quickstart."**
error. This is super annoying, so I aliased the command to always include the location of my
config file.
```diff
alias gpf="gp --force"
+ alias tw="twtxt -c ~/.twtxt/config"
```
Now posting is as easy as:
```shell
$ tw tweet "authoring a snippet on how to configure alias twtxt"
```

View file

@ -0,0 +1,46 @@
---
title: "add types to a javascript when using typescript"
date: 2021-07-12T11:12:32+02:00
draft: false
snippet_types:
- js
- typescript
---
Today I had a .js file that I needed to import into the rest of a TypeScript app but could not convert it to a .ts file. To get around this limitation I added an accompanying .d.ts file.
countryConfig.js
```js
const countries = [
{
code: 'de',
domainExtension: '.de'
},
...
];
exports.countries = countries;
```
countryConfig.d.ts
```typescript
interface Country {
code: string;
domainExtension: string;
}
export const countries: Array<Country>;
```
app.ts
```typescript
import { countries } from './countryConfig'
...
```
source: https://www.typescriptlang.org/docs/handbook/declaration-files/templates/module-d-ts.html

View file

@ -0,0 +1,26 @@
---
title: "update a local zef module"
date: 2021-10-20T20:28:05+02:00
seo_description: "code snippet explainin how to install raku/zef modules locally"
draft: false
snippet_types:
- raku
- zef
---
Lately I've been working locally with raku/zef modules for my CLI apps.
I install them with:
```shell
$ zef install .
```
Turns out you can update a package very similarly: just bump the version in
the META6.json and run the same command again.
```shell
$ zef install .
```
[docs](https://github.com/ugexe/zef)

View file

@ -0,0 +1,33 @@
---
title: "update npm packages to latest versions"
seo_description: "How to update the node modules of a project to their latest version"
date: 2022-06-15T09:28:28+02:00
draft: false
snippet_types:
- npm
- js
- node
---
I was doing some housekeeping on a Node.js project today and wanted to update all
the dependencies to their latest versions. At first I tried `npm update`, but
it turns out that only upgrades within the semver ranges in package.json and won't
jump major versions. In the end, after some googling, I found
[npm-check-updates](https://www.npmjs.com/package/npm-check-updates).
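Running it without `-u` first makes a nice dry run; as far as I recall it only prints which packages have newer versions and touches nothing:

```shell
$ npx npm-check-updates
```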
To upgrade the dependencies of a project without installing anything else I ran:
```shell
$ npx npm-check-updates -u
```
It updated the **package.json** so it must be followed up by a:
```shell
$ npm i
```
Which will install the new packages and update the **package-lock.json**.
[source](https://nodejs.dev/learn/update-all-the-nodejs-dependencies-to-their-latest-version)

View file

@ -0,0 +1,31 @@
---
title: update pleroma server
seo_description: How to update your Pleroma OTP install quickly and easily
date: 2022-12-04T17:13:12+01:00
draft: false
snippet_types:
- pleroma
---
I always forget how to upgrade my Pleroma server when a new version comes out, so I'm writing it down here.
If you are unsure of what you are doing please consult the
[official docs](https://docs-develop.pleroma.social/backend/administration/updating/).
My Pleroma instance: [social.travisshears.xyz](http://social.travisshears.xyz)
Start by sshing into the server and switching to the pleroma user.
```shell
$ sudo su pleroma -s $SHELL
```
Then stop the server, run the update, migrate the db, and start it again.
```shell
$ ./bin/pleroma stop
$ ./bin/pleroma_ctl update
$ ./bin/pleroma_ctl migrate
$ ./bin/pleroma daemon
```
Boom new version!

View file

@ -0,0 +1,52 @@
---
title: "super powers of the arg list"
date: 2020-01-11T19:44:49+01:00
draft: false
snippet_types: ["vim", "search"]
---
Vim help:
> The argument list \*argument-list* \*arglist*
> If you give more than one file name when starting Vim, this list is remembered
> as the argument list. You can jump to each file in this list.
If you have been using vim a while you will be comfortable with the buffer list, but often it's
full of random files, terminal sessions, and other non-file buffers; this is where the arglist comes
in. It starts out as the list of files that were initially opened when the vim session was launched, e.g.:
```shell
nvim ./a.txt ./b.txt ./c.txt
```
would start vim with three buffers in both the arglist and the buffer list. You can cycle through the
files in the arglist via **:next** and **:prev**. Files opened later will not be automatically added to
the arglist, so it's essentially a clean sub-list of the buffer list. Helpful commands include
**:rew**, to jump back to the first file you edited this session, and **:all**, which opens all
buffers of the arglist in splits.
Where I find the arglist particularly powerful is for mass file edits. Here is a day-to-day
example:
1. Get the files in question into the arglist: either pipe files to vim from the cli, or override the
initial arglist from inside vim.
- from cli
```shell
$ rg -i 'snippet_typ.*vim' --files-with-matches | xargs nvim
```
- from inside vim (harder because no autocomplete)
```vim script
:args `rg -i 'snippet_typ.*vim' --files-with-matches`
```
1. Now let's change all the posts' front-matter to draft: true
```vim script
:argdo %s/draft: false/draft: true/g | update
```
That's it, all the files have been modified and saved. It's best to use this kind of thing with
version control, so during an interactive git add you can verify each modification.
For more complex edits I load files into the list the same way, then make quick manual edits and move to
the next file with **:wn**, which also saves.

View file

@ -0,0 +1,36 @@
---
title: "vim fzf plugin"
date: 2020-01-30T14:57:50+01:00
draft: false
snippet_types: ["vim", "search"]
---
I've used several fuzzy finder utilities in vim over the years, like
[Command T](https://github.com/wincent/Command-T) or
[CtrlP](https://github.com/ctrlpvim/ctrlp.vim). They both have their pluses and
minuses, but I never found them to be that fast, especially with the large code bases
I'm often working in. Fzf for me is superior, so I was excited to see a plugin
that integrates Fzf so well into vim. It's not just useful for finding files but
works great with buffers, files with git changes, commands, marks, and even
lines in open buffers.
My vim config for Fzf is as follows:
```vim script
nnoremap <Leader>pb :Buffers<CR>
nnoremap <Leader>pf :GFiles<CR>
nnoremap <Leader>pg :GFiles?<CR>
nnoremap <Leader>pm :Marks<CR>
nnoremap <Leader>pc :History:<CR>
nnoremap <Leader>pl :Lines<CR>
```
This allows me to zip around my code base with speed.
{{< asciicast-with-caption id="296788" title="demo using fzf in vim" >}}
sources:
- https://github.com/wincent/Command-T
- https://github.com/ctrlpvim/ctrlp.vim
- https://github.com/junegunn/fzf.vim

View file

@ -0,0 +1,21 @@
---
title: "remapping ability to jump"
date: 2020-01-11T13:09:41+01:00
draft: false
snippet_types: ["vim"]
---
Moving efficiently in vim is a massive topic, but recently I just wanted to take advantage of the
jump list more. My XQuartz / xterm mac terminal setup meant the default key mappings were being
intercepted as promote / demote pane. A simple remap did the trick:
```vim script
nnoremap <Leader>i <C-i>
nnoremap <Leader>o <C-o>
```
In the following demo I open this very post, then navigate to my **.nvimrc** and yank the code
remapping snippet. With the snippet yanked, I use the jump list via **\<C-o>** to jump back
to the markdown file I was editing and put.
{{< asciicast-with-caption id="292978" title="demo of using the jump list" >}}

View file

@ -0,0 +1,29 @@
---
title: "vim open file under cursor"
date: 2020-03-06T08:00:30+01:00
draft: false
snippet_types: ["vim"]
---
Half my time in the editor I'm not coding, I'm just browsing the file system trying to understand how things are connected. Trying to figure out what needs to change in order to complete my task. This module imports this module which imports this module... There are many ways to navigate the file system and read files. Since I primarily code in vim there are two main ways I navigate.
One is using the file explorer plugin, NerdTree. It is great for getting a general overview of the folder structure of a project and for moving, deleting, or creating files. It is also really good for finding sibling files using the **:NERDTreeFind** command, which I have remapped to **\<Leader>\<n>**. Where it lacks, however, is opening a nested import directly, for example when you want to jump straight to a different file / module.
This is where the goto file comes in. Using the following commands in conjunction with moving up and down the jump list, navigation is easy. Here is an example where I use the goto command and a few of its variations.
{{< asciicast-with-caption id="307951" title="demo using vim's goto and jumplist commands" >}}
The most helpful ones to learn are
- **\<g>\<f>** goto file, same window
- **\<c-w>\<f>** goto file, new split
- **\<c-w>\<g>\<f>** goto file, new tab
Also don't forget to use the jump list
- **\<c-o>** or in my case remapped to **\<Leader>\<i>** to jump backwards
Resources:
- https://vim.fandom.com/wiki/Open_file_under_cursor
- https://github.com/preservim/nerdtree

View file

@ -0,0 +1,28 @@
---
title: "custom placeholders solution"
date: 2020-01-12T00:59:03+01:00
draft: false
snippet_types: ["vim"]
---
This little bit of magic I believe I picked up from watching a Luke Smith video. The point is to
leave placeholders in the form of the text "\<++>" in your code/writing, then quickly snap to them and
fill them in later.
```vim script
" placeholder magic
nnoremap <Space><Space> <Esc>/<++<CR>"_c4l
nnoremap <Space>n <Esc>l/<++><CR>h
" always fill p reg with <++>
:autocmd VimEnter * :call setreg('p', '<++>')
```
1. Thanks to the vim enter hook the magic placeholder text is always in my p register, so it's easy to put with **\<">\<p>**
1. Then I jump to the next placeholder and immediately start editing it with a simple **\<Space>\<Space>**
{{< asciicast-with-caption id="293101" title="placeholders demo" >}}
source:
- [Luke Smith Youtube Channel](https://www.youtube.com/channel/UC2eYFnH61tmytImy1mTYvhA)

Some files were not shown because too many files have changed in this diff.