VRR is variable refresh rate: if there is nothing much going on in the content, the TV can bring the refresh rate down and save processing, heat, and energy. If there is a lot going on (say, a game), it can ramp the refresh rate right back up.
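For a sense of scale, the refresh interval is just 1000 / Hz milliseconds, so dropping the rate means the panel redraws far less often (quick arithmetic, illustrative rates):

```python
# Refresh interval at a few common rates: 1000 ms / Hz
for hz in (24, 48, 60, 120):
    print(f"{hz:3d} Hz -> one refresh every {1000 / hz:5.1f} ms")
```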

There are a few different VRR "standards" (HDMI 2.1 VRR, AMD FreeSync, NVIDIA G-Sync), and not every device supports all of them.

Meh, I wonder why I'd care about saving energy or processing on a TV that's plugged in anyway, but hey. Thanks for explaining!

Their explanation of the reason for VRR is bad. The primary reason people want it is gaming, where the game is not locked to a specific frame rate. Without VRR, the timing of a frame being delivered won't necessarily match when the display is expecting a new frame. That leads to one of two effects: either the display is forced to hold the old frame for longer and pick up the new frame on the next refresh cycle, which creates stutter, or the display switches which frame it's using partway through the refresh cycle, which creates a visible tear in the image.
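To make that concrete, here's a toy sketch in Python (made-up frame times and an assumed 48-120 Hz VRR window, not any real display API) showing the two behaviors:

```python
import math

REFRESH_INTERVAL = 1.0 / 60                 # fixed panel: ~16.7 ms per refresh
VRR_MIN_HZ, VRR_MAX_HZ = 48, 120            # assumed VRR window for the panel

# Times (seconds) at which the game finishes rendering each frame
frame_times = [0.000, 0.019, 0.031, 0.055, 0.074, 0.090]

# Fixed refresh with vsync: each frame waits for the next refresh tick.
# A frame that just misses a tick is held ~16.7 ms longer -> stutter.
# (With vsync off, the panel would swap frames mid-scanout instead -> tearing.)
for t in frame_times:
    tick = math.ceil(t / REFRESH_INTERVAL) * REFRESH_INTERVAL
    print(f"fixed: frame at {t*1000:5.1f} ms shown at {tick*1000:5.1f} ms "
          f"(held {(tick - t)*1000:4.1f} ms)")

# VRR: the panel starts a refresh when the frame arrives, as long as the
# implied rate fits inside its supported window -- no waiting, no tearing.
for prev, t in zip(frame_times, frame_times[1:]):
    rate = min(max(1.0 / (t - prev), VRR_MIN_HZ), VRR_MAX_HZ)
    print(f"vrr:   frame at {t*1000:5.1f} ms starts a refresh at {rate:5.1f} Hz")
```

Run it and you can see the fixed-refresh path holding frames for a different number of extra milliseconds each time (that variation is the stutter), while the VRR path just refreshes whenever a frame lands.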