Also, it is worthwhile reading the standards proposal paper for a lot of examples of how this might be used in the future.
https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2025/p29...
Thanks for this link. Between it and the main article, I have a much better sense for how compile time reflection will actually be useful. The syntax is bad enough to make my eyes bleed, but at least they'll bleed for a good reason I hope.
> The syntax is bad enough to make my eyes bleed
That's on brand for C++ in general. It works well, but it's ugly as sin while doing it.
I don't agree with you. Yes, the syntax is often awful, but no, it doesn't work that well. It's a minefield of undefined behavior.
While true, just as hardly anyone in Rust land thinks about configuring a project without clippy, C and C++ developers should learn that static analysers have existed for C, and eventually C++, since 1979.
They don't fix everything, but they catch far more than using nothing at all.
I have been using such tools since the mid-1990s when coding in either C or C++, and sadly their adoption is still a point of discussion.
I am happy^^ that ^^ is our quote character. ^^
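For anyone who hasn't read the paper yet: ^^ is the reflection ("quote") operator proposed in P2996, and [: ... :] is the splicer that turns a reflection back into code. A minimal sketch of the idea, assuming the library facilities land in <meta> as the proposal spells them:

    #include <meta>
    #include <iostream>

    int main() {
        // ^^ quotes a language entity, yielding a constexpr std::meta::info value.
        constexpr std::meta::info r = ^^int;

        // [: r :] splices the reflected entity back into the program,
        // so this declares an int named x.
        typename[:r:] x = 42;
        std::cout << x << '\n';
    }

(Current experimental compilers may put the header elsewhere, e.g. under an experimental/ prefix, so treat the include as illustrative.)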
The rationale in section 2.2 for the single std::meta::info type is interesting. While having a single “dynamic” type is fine insofar as any type error would occur at compile time anyway, I wonder if there wouldn’t be a way to maintain future backwards compatibility for finer-grained meta types, as long as the language elements they reflect remain backwards-compatible in the language as well. From first principles, one would think that it should be possible for the backwards compatibility of the reflection system to mirror the backwards compatibility of the reflected language. I’d be interested in the underlying reasons for why this isn’t the case, if any.
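To make the "single dynamic type" point concrete: under the proposed design, every reflection has the same opaque type std::meta::info no matter what kind of entity it designates, and kind mismatches are diagnosed only where the value is consumed. A rough sketch under the P2996 spelling (header and exact names may still shift):

    #include <meta>

    struct Point { int x; int y; };
    void foo();
    namespace ns {}
    int counter = 0;

    // One opaque type covers all kinds of reflected entities.
    constexpr std::meta::info t = ^^Point;     // a class type
    constexpr std::meta::info f = ^^foo;       // a function
    constexpr std::meta::info n = ^^ns;        // a namespace
    constexpr std::meta::info v = ^^counter;   // a variable

    // "Kind" errors still surface at compile time, but at the point of use:
    // splicing a function reflection where a type is required is ill-formed.
    // typename[:f:] bad;   // error: f does not reflect a type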