I usually use bash/python/perl if I can be sure it will be available on all systems I intend to run the scripts on. A notable exception to this is Alpine-based containers, where it’s nearly exclusively #!/bin/sh.
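To illustrate what I mean by that, here’s a rough sketch of the kind of script I’d write for the Alpine case: plain POSIX sh that busybox ash can run, avoiding bashisms like `[[ ]]` or `type -p`. The hostname argument and the `/healthz` endpoint are just placeholders, not anything real.

```sh
#!/bin/sh
# Portable POSIX sh: runs under busybox ash on Alpine as well as bash/dash.
set -eu

# Placeholder argument check: POSIX test with "=", no bash-only "[[ ... ]]".
host="${1:-}"
if [ "$host" = "" ]; then
    echo "usage: $0 <hostname>" >&2
    exit 1
fi

# "command -v" is the portable way to probe for a tool (no "which", no bash "type -p").
if command -v curl >/dev/null 2>&1; then
    curl -fsS "https://$host/healthz"
else
    wget -qO- "https://$host/healthz"
fi
```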
Depending on the complexity, I will either have one git repository for all the random scripts I need and not test them, or a single repo per script with integration tests.
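For the tested case, the integration tests are usually nothing fancy; something along these lines, where `backup.sh` and the expected archive contents are made-up placeholders: run the script end to end in a throwaway directory and assert on the observable result.

```sh
#!/bin/sh
# Minimal integration test sketch: exercise the script under test end to end
# in a temporary directory. "backup.sh" is a placeholder name.
set -eu

tmpdir="$(mktemp -d)"
trap 'rm -rf "$tmpdir"' EXIT

# Arrange: create a file the script is supposed to pick up.
echo "hello" > "$tmpdir/input.txt"

# Act: run the script under test against the temp directory.
./backup.sh "$tmpdir" "$tmpdir/out.tar.gz"

# Assert: the archive exists and contains the file.
tar -tzf "$tmpdir/out.tar.gz" | grep -q "input.txt" || {
    echo "FAIL: input.txt missing from archive" >&2
    exit 1
}
echo "PASS"
```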
It depends: if they are specific to my setup, no; otherwise the git repository is public on my git server.
Usually no, because the servers are not always under my direct control, so the scripts that live on a server are specific to that server or its fleet.
Regarding your last question in the list: you do you; I personally don’t, partly because of my previous point. A lot of servers are “cattle”, provisioned and destroyed on a whim. I would have to sync those modifications to all machines to use them effectively, which is not always possible. So I don’t do this on any personal devices either, because I don’t want to build muscle memory that doesn’t apply everywhere.