you've got it backwards - the server clock should be in UTC, and if an individual piece of software needs to know the location, that should be provided to it separately.

for example, I've got a server in my garage that runs Home Assistant. the overall server timezone is set to UTC, but I've configured Home Assistant with my "real" timezone so that I can define automation rules based on my local time.

Home Assistant also knows my GPS coordinates so that it can fetch weather, fire automation rules based on sunrise/sunset, etc. that wouldn't be possible with only the timezone.

I kind of assumed all computer clocks were UTC, but that you also specified a location, and when asked what time it is, it did the math for you.

Windows assumes the computer's hardware clock is local time, though it can be configured to assume UTC. Most other operating systems assume the hardware clock is UTC. Many log tools are not time-zone aware.
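For anyone curious how the Windows configuration works: there's a registry value, `RealTimeIsUniversal`, that tells Windows to treat the hardware clock as UTC. A sketch (run from an elevated prompt, then reboot; dual-booters use this so Linux and Windows agree on the clock):

```shell
# Tell Windows to interpret the RTC as UTC rather than local time.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\TimeZoneInformation" /v RealTimeIsUniversal /t REG_DWORD /d 1 /f
```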

A computer clock is just a counter: if you set the starting point to UTC, it's UTC; if you set it to local time, it's local time.
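To make the "just a counter" point concrete, here's a small Python sketch: on Unix the counter is seconds since the epoch (1970-01-01 00:00:00 UTC), and the same counter value renders as different wall-clock times depending on which offset you apply when converting.

```python
import time
from datetime import datetime, timezone, timedelta

# The kernel just keeps a counter -- on Unix, seconds since the epoch.
now = time.time()

# The same counter value, interpreted with two different offsets.
as_utc = datetime.fromtimestamp(now, tz=timezone.utc)
as_minus5 = datetime.fromtimestamp(now, tz=timezone(timedelta(hours=-5)))

# Both represent the same instant; only the display differs.
assert as_utc == as_minus5
print(as_utc.isoformat())
print(as_minus5.isoformat())
```

The offsets here are fixed ones for illustration; real local time needs a full tz database (e.g. `zoneinfo`) to handle DST.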

that's the difference between "aware" and "naive" timestamps. Python's docs have a section explaining it (though the concept applies to any language):

https://docs.python.org/3/library/datetime.html#aware-and-na...
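The distinction in a nutshell, as Python's `datetime` models it:

```python
from datetime import datetime, timezone

# Naive: no tzinfo attached -- just numbers, and it's on you to
# know what zone they were recorded in.
naive = datetime(2024, 1, 1, 12, 0)
assert naive.tzinfo is None

# Aware: tzinfo attached -- this identifies an exact instant.
aware = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
assert aware.utcoffset() is not None

# Comparing the two raises TypeError, which is the point:
# Python refuses to guess which zone the naive one meant.
try:
    naive < aware
except TypeError:
    print("can't compare naive and aware")
```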

AKA time zones