> Few studies use cross-attention to integrate time series into LLMs

I mean, sure, but why would you need a study for that? There's plenty of prior work using cross-attention to integrate time series dynamics into non-LLM transformer models, right? Or maybe I'm assuming that integrating a time series embedding with an LLM is easier than it is.
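
For anyone who hasn't seen it, the non-LLM version of this is pretty standard. Here's a minimal sketch in PyTorch, with invented names and shapes rather than any particular paper's implementation, where the text-side token states attend over a time-series encoder's output:

```python
import torch
import torch.nn as nn

class TimeSeriesCrossAttention(nn.Module):
    """Fuse a time-series embedding into a transformer's hidden stream
    via cross-attention. All names and dimensions are illustrative."""

    def __init__(self, d_model: int, d_ts: int, n_heads: int = 8):
        super().__init__()
        self.ts_proj = nn.Linear(d_ts, d_model)  # map sensor features into model width
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, llm_hidden: torch.Tensor, ts_embed: torch.Tensor) -> torch.Tensor:
        # llm_hidden: (batch, text_len, d_model), token states from some LLM layer
        # ts_embed:   (batch, ts_len, d_ts), output of a time-series encoder
        kv = self.ts_proj(ts_embed)
        # Queries come from the text side; keys/values come from the time series
        fused, _ = self.attn(query=llm_hidden, key=kv, value=kv)
        return self.norm(llm_hidden + fused)  # residual keeps the text stream intact

# Shape check with dummy tensors
fuser = TimeSeriesCrossAttention(d_model=768, d_ts=64)
out = fuser(torch.randn(2, 32, 768), torch.randn(2, 128, 64))
print(out.shape)  # torch.Size([2, 32, 768])
```

The residual connection is the usual trick so a pretrained text pathway isn't disrupted early in training; whether that holds up against a frozen LLM decoder is presumably the part that's less studied.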

Looking at the repo, the training data seems extremely health-focused. I guess I'd have to fine-tune the model on my own datasets if I want it to answer questions about multi-source sensor data?
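
If it follows the usual instruction-tuning pattern, a custom training example would look something like this; every field name and value below is my guess for illustration, not the repo's actual schema:

```python
# Hypothetical record pairing multi-source sensor windows with Q&A text.
# Schema is invented; the repo's real data format may differ.
example = {
    "sensors": {
        "accelerometer": [[0.01, -0.02, 0.98], [0.00, -0.01, 0.99]],  # (timestep, xyz)
        "heart_rate": [72, 71, 73, 74],                               # bpm per step
    },
    "question": "Was the wearer active during this window?",
    "answer": "Mostly sedentary; heart rate stayed near resting levels.",
}
```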