One thing I find life-changing is to remap the up arrow so that it does not iterate through all commands, but only those starting with the characters I have already typed. So e.g. I can type `tar -`, press the up arrow, and get the tar parameters that worked last time.
In zsh this is configured with (the search widgets need to be loaded before binding):
autoload -Uz up-line-or-beginning-search down-line-or-beginning-search
zle -N up-line-or-beginning-search; zle -N down-line-or-beginning-search
bindkey "^[OA" up-line-or-beginning-search # Up
bindkey "^[OB" down-line-or-beginning-search # Down
What I love about the default Bash Ctrl-R behaviour is that once a command has been located, the history position moves to that command, so Up and Down continue from there until Enter is pressed.
$ a
bash: a: command not found
$ b
bash: b: command not found
$ c
bash: c: command not found
$ d
bash: d: command not found
$ <CTRL-R> b <UP>
$ a
That's great if I don't remember which command I was experimenting with, but I do know other commands that I did around that time (usually a file that I edited with VIM).
That feature is entirely optional and disabled by default. Atuin stores your shell history locally in a SQLite database regardless of whether you choose to sync it. I thought fzf was fast, but atuin makes it look slow by comparison.
However, what I do find useful is eternal history. It's doable with some .bashrc hacks (file-based and written out on every command, so a bit slow), but it lets you:
- never delete history
- associate history with a session token
- set separate tokens in each screen, tmux, whatever session
- sort such that backward search (ctrl-R) hits current session history first, and the rest second
Like half my corporate brain is in an 11 MB history file at this point, going back years.
What I would love is to integrate this into the shell better so it's using sqlite or similar so it doesn't feel "sluggish." But even now the pain is worth the prize.
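The bullet points above can be sketched as a .bashrc fragment. This is a hedged sketch, not the commenter's actual setup; the names HISTFILE_ETERNAL, SESSION_TOKEN, and _log_eternal are made up for illustration:

```shell
# In ~/.bashrc -- eternal history sketch; never truncated, one line per command.
HISTFILE_ETERNAL=~/.bash_eternal_history
# Session token: set differently in each screen/tmux pane; default to the shell PID.
SESSION_TOKEN=${SESSION_TOKEN:-shell-$$}

_log_eternal() {
  # timestamp | session token | last command from this shell
  printf '%s | %s | %s\n' "$(date '+%F %T')" "$SESSION_TOKEN" \
    "$(HISTTIMEFORMAT= history 1 | sed 's/^ *[0-9]* *//')" >> "$HISTFILE_ETERNAL"
}
PROMPT_COMMAND=_log_eternal
```

Sorting so that Ctrl-R prefers the current session's token would need an extra search wrapper on top of this; the fragment only covers the logging side.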
I just want to give the perspective of someone who uses 'eternal history' in bash per Eli Bendersky [1] and is reluctant to use something like atuin (ignoring its shared history).
First, as for speed and responsiveness, if there is a degradation, it is imperceptible to me. I wouldn't have a clue that my interactive shell is slowing down because it is logging a command to ~/.persistent_history.
My persistent_history is 4 MB and has been migrated from machine to machine as I've upgraded; it has never felt slow to edit with (neo)vim or search with the system-supplied grep.
Eli's way of doing it also includes timestamps for all commands, so it's easy to trace back when I ran a command, and duplicates are suppressed. In fact my oldest persistent_history entry goes back to 2019-07-04, so I've been using it for quite some time now.
But the larger point I wanted to make is that I wouldn't feel comfortable displacing this, in my opinion, quite efficient setup with an SQLite database. That would require a special tool to drill through and search the history, rendering simple Unix utilities useless. As Eli suggested, if your history gets too big, simply rotate the file and carry on. I have the alias phgrep to grep ~/.persistent_history, but I can easily add another alias to grep ~/.persistent_history*.
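That "search the rotated files too" alias could also be a small shell function, so extra grep flags still pass through. The name phgrep mirrors the comment; the function body is my guess, not Eli's:

```shell
# Search the persistent history (and any rotated copies) case-insensitively.
# Extra grep options and the pattern are passed straight through via "$@".
phgrep() {
  grep --colour=auto -i "$@" ~/.persistent_history*
}
```

Usage: `phgrep 'tar -xzf'`, or `phgrep -c ssh` to count matches.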
You don't have to set up shared history with Atuin if that's what's holding you back. Otherwise it hits the rest of your requirements; just don't hesitate to change the default config.
1. work on a project on host_foo in /home/user/src/myproject
2. clone it on host_bar in /home/user/src/myproject
If you set filter_mode = "directory", you can recall project-specific commands from host_foo for use on host_bar even though you're working on different machines, and the search space won't be cluttered with project-specific commands from other projects.
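For reference, that setting goes in atuin's TOML config file; the path below is the usual default location, not something from the comment:

```toml
# ~/.config/atuin/config.toml
filter_mode = "directory"
```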
If you use multiple terminals it kinda sucks unless you do export PROMPT_COMMAND='history -a' in your .bashrc or something, because otherwise only the last closed terminal saves to history.
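A slightly fuller version of that .bashrc fix, as commonly paired with histappend (a sketch, not the only way to do it):

```shell
# In ~/.bashrc: append to the history file instead of overwriting it on exit,
# and flush each command to the file as soon as the prompt returns.
shopt -s histappend
export PROMPT_COMMAND='history -a'
```

Some people also add `history -n` to PROMPT_COMMAND so commands from other terminals become visible immediately, at the cost of interleaving them into Up-arrow history.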
export EDITOR=vi and then hitting Esc puts you into vi mode; k and j move up/down through history, and pressing / searches, regex included.
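If Esc alone doesn't switch modes on a given setup (bash defaults to emacs-style editing regardless of $EDITOR), the explicit switches are:

```shell
# In ~/.bashrc:
set -o vi

# Or, for every readline-based program, in ~/.inputrc:
#   set editing-mode vi
```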
This is the default `fish` shell behavior. Type anything, then use the up/down keys to iterate through full commands containing the term; Alt+up/down iterates through arguments containing the term.
One thing I do is configure my keyboard so that "modifier+{ijkl}" mimics the inverted-T arrow-key cluster. So there's never a need for me to reach for the arrow keys. And {ijkl} makes more sense than vi's {hjkl}: it's faster, more logical, and means less finger travel. The nice thing is: as I do this at the keyboard level, it works in every single app. "modifier" in my case is "an easily reachable key in a natural hand position on which my left thumb is always resting", but YMMV.
I set that up years ago and it works in every app: it's gorgeous. Heck, I'm using it while editing this very message for example.
And of course it composes with SHIFT too: it's basically arrow keys, except at the fingers' natural positions.
I did the same, starting with Ergo Mode in Emacs many years ago, and ending up today with a programmable split keyboard with those keys as arrows on a layer. For when I'm on a laptop without the keyboard, I have a mishmash of solutions that bind to Alt+{ijkl}
I'd try this, but I often find that I want to repeat a cycle of two or more commands. Yes, I probably should edit and put them on one line with semicolons (or even make a function), but.
Or put && between them. I had "compile;run", and when the compile failed it still ran (but the old build); it took me a while to figure that out. && makes the second command run only if the first succeeds.
Anyway, so worth it to combine commands into one line for easy re-run.
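The difference is easy to demonstrate without a real build; here `false` stands in for a failing compile and `echo` for the run step (illustrative commands, not anyone's actual build):

```shell
false ;  echo "ran anyway"      # ';' runs the second command regardless
false && echo "never printed"   # '&&' skips it because the first failed
true  && echo "runs on success"
```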
For further life-changing experience... add aliases to .bash_aliases
alias gph='history | grep --colour -i '
alias gpc='grep --colour -Hin '
# if GNU time is installed
alias timef='/usr/bin/time -f "tm %E , cpu %P , mem %M" '
> Because it's amusing to loop this kind of criticism through a model
Maybe it could become a general pattern: have an agent whose only task is to deny the validity of the output. GANs are a very successful technique; perhaps it could work for language models too.
Microsoft Copilot would use emojis at the end of every single response, mostly smileys, and I discovered that if you told it you had PTSD from emojis and asked it not to use them, it would get stuck in a loop: it would say of course it won't use emojis, use them anyway, then apologize. After a few loops like this, it would start doing this thing like it was a serial killer, typing ONLY in the emoji versions of letters and repeating phrases. I almost died holding in a laugh when I discovered this during a work call. One of the funniest things I ever discovered in old LLMs.
Have a look at this recent Scrabble video where Claude plays semi reasonably and ChatGPT goes crazy https://youtu.be/8opLB1D_RYY (skip to 6:50 for the insanity)
This quote from “Dive Into Python”, which I read as a fresh graduate, was one of the most impactful lines I ever read in a programming book.
> Busywork code is not important. Data is important. And data is not difficult. It's only data. If you have too much, filter it. If it's not what you want, map it. Focus on the data; leave the busywork behind.
> When property costs 1 million+ (the case in Berlin/Munich), financially it really doesn't matter whether I net 6500 EUR month working 50+ hours for FAANG or 4500 EUR working 35 hour weeks for a German corporate
Financially in the first case you can afford a mortgage on said property (barely, with some help from parents/partner, maybe aiming for something slightly out of the very city centre), in the second case you cannot. Also, 4500 net for a 35-hr week is something you will not easily find in a German corporate: at that level, levels.fyi only lists non-German multinationals. Unless you become a contractor, or rise really high on the corporate ladder.
But I agree on the rest of your comment, and I have also left Germany because of the massive amount of money that the government feels entitled to take from the pockets of the so-called “top earners” (i.e. anybody making the equivalent of 70'000 $) while giving back barely anything in terms of services.
Thanks for sharing your memories. I never had the privilege of having a BASIC compiler for DOS, my first one was Visual Basic 5. It was a marvel, producing really small exe files (one of my surviving samples is 15 kB). Granted, the executables were just P-code wrappers which needed VBRUN4.DLL to run, but it looked cleaner since the library was hidden in the system directory. DLL hell had some aesthetic advantages.
I have fond memories of Visual Basic. I still think it was one of the best tools for RAD (Rapid Application Development), you can definitely tell that .NET with WinForms borrowed heavily from it.
At one point, I remember they started including the MSVBVM_X.DLL runtimes in the Windows OS, allowing you to shrink application size for distribution even further, which was nice.