Out of curiosity, how would you recursively grep files, ignoring hidden files (e.g., `.git`) and only matching a certain file extension? (E.g., `rg -g '*.foo' bar`.)

I use the command line a lot too, and this is one of my most common commands, but I don't know of an elegant way to do it with the built-in Unix tools.

(And I have basically the same question for finding files matching a regex or glob [ignoring the stuff I obviously don't want], e.g., `fd '.foo.*'`.)

Depends on how big the directory is. If it only contains a few files, I'd just enumerate them all with `find`, filter out hidden paths with `grep -v`, and perform the actual `grep` for "bar" using `xargs`:

   find . -type f -name "*.foo" | grep -v '/\.' | xargs grep bar
(This one I could do from muscle memory.)

If traversing those hidden files/directories were expensive, I'd tell `find` itself to exclude them. This also lets me switch `xargs` for `find`'s own `-exec` functionality:

   find . -path '*/\.*' -prune -o -type f -name "*.foo" -exec grep bar {} +
(I had to look that one up.)

Thanks, yeah, this is a good example of why I prefer the simpler interfaces of `rg` and `fd`. Those examples would actually be fine if this were something I only did once in a while (or in a script). But I search from the command line many times per day when I'm working, so I prefer a more streamlined interface.

For the record, I think `git grep` is probably the best builtin solution to the problem I gave, but personally I don't know off-hand how to only search for files matching a glob and to use the current directory rather than the repository root with `git grep` (both of which are must haves for me). I'd also need to learn those same commands for different source control systems besides git (I use one other VCS regularly).
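
(For what it's worth, I think the glob half is just a pathspec, something like the line below, though I haven't verified it, and the current-directory behavior would still need checking:)

    git grep bar -- '*.foo'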

>Those examples would actually be fine if this were something I only did once in a while (or in a script). But I search from the command line many times per day when I'm working, so I prefer a more streamlined interface.

Makes sense. If I had to do this frequently, I'd add a function/alias encapsulating that `find` incantation to my .bashrc, which I keep in version control along with other configuration files in my home directory. That way, when moving to a new environment, I can just clone that repo into a fresh home directory and most of my customizations work out-of-the-box.
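
As a rough sketch, such a function might look something like this (the `grepext` name and interface are just for illustration):

    # in ~/.bashrc: recursively grep files with a given extension, skipping hidden directories
    grepext() {
        # usage: grepext foo bar   -> search *.foo files for "bar"
        find . -path '*/.*' -prune -o -type f -name "*.$1" -exec grep "$2" {} +
    }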

Yeah, I do the same sometimes. At the risk of going too deep into personal preference, a couple of notes about that approach:

1. I don't recommend using shell functions or aliases for this (e.g., in a `bashrc`), because then they can't be called from other contexts, e.g., Vim's and Emacs's built-in support for shell commands. This is easily solved by creating standalone scripts that can be called from anywhere (my personal collection of these scripts is here: https://github.com/robenkleene/Dotfiles/tree/master/scripts; there's also a small sketch after these notes). Personally, I only use Bash functions for things that involve Bash's runtime state (e.g., augmenting PATH is a common one).

2. The more important part, though, is that I don't always want to search in `*.foo`; I want a flexible, well-designed API that lets me decide on the fly what to search.

#2 is particularly important, and drifts into the philosophy of tooling: a mistake I used to make was baking my current workflow into customizations like scripts. This is a bad idea because the scripts stop being useful as your tasks change, and hopefully your tasks are growing in complexity over time. I.e., don't choose your tools based on your workflow today, otherwise you're building in limitations. Use powerful tools that will support you no matter what task you're performing, tools that scale practically infinitely. "The measure of a bookshelf is not what has been read, but what remains to be read."
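
Re: #1, a tiny sketch of what I mean by a standalone script rather than a function (the path and name are arbitrary); because it's a plain executable on PATH, editors and other tools can call it too:

    #!/bin/sh
    # hypothetical ~/bin/grep-ext
    # usage: grep-ext <extension> <pattern>
    find . -path '*/.*' -prune -o -type f -name "*.$1" -exec grep "$2" {} +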

  alias grep='grep --exclude-dir .\* --exclude .\*'
  grep -r --include \*.foo bar

Just noting that I answered why I don't do this (and honestly I do think it's a very good idea) here: https://news.ycombinator.com/item?id=45569313

  find . -type f -name '*.foo' -not -path '*/.*' -print0 | xargs -0 grep bar

The one issue with this approach is that it still traverses all hidden directories, which could be expensive (e.g., in a git repo with an enormous revision history under `.git/`). `-not -path ...` only prevents entries from being printed, not from being traversed. To actually prevent traversal, you need `-prune`.
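
E.g., something along these lines (untested):

    find . -path '*/.*' -prune -o -type f -name '*.foo' -print0 | xargs -0 grep bar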

  grep -ri foo ./*

Hits in hidden files aren't really a pain point for me.

Curious if that answers the "I genuinely don't know what is going on here" then? Not searching hidden files (or third-party dependencies, which `rg` also skips automatically via its ignore-file parsing) isn't just a nice-to-have; it's mandatory for a number of tasks a software engineer might perform on a code base.

That doesn't apply to the very specific case for which the parent asked for a solution.

    find . -name '.*?' -prune -o -name '*.foo' -exec grep bar /dev/null {} +
This is the POSIX way. You'd probably put it in a function in your .bashrc.
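(The `/dev/null` is presumably there so grep always sees more than one file argument and therefore prints the matching filename, even if find ends up passing just one file.)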

Just noting that I answered why I don't use this approach here https://news.ycombinator.com/item?id=45569313