Yes, the typical way this works is that the lookup table is programmed during device calibration, the microcontroller has a temperature sensor attached, and it uses a varicap to trim one of the two capacitors attached to the crystal (usually through a coupling capacitor to avoid loading the circuit too much).
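Roughly, the firmware's job can be sketched like this. This is a minimal illustration only, not taken from any real part: the table values, the implied 12-bit DAC width, and all names are made up.

```python
# Factory-programmed calibration table: (temperature in deg C, DAC code)
# pairs. The DAC drives the varicap that trims the crystal's load capacitance.
# All numbers here are invented for illustration.
CAL_TABLE = [(-40, 1800), (-10, 2010), (25, 2048), (60, 2120), (85, 2300)]

def tuning_word(temp_c):
    """Return the varicap DAC code for a measured temperature.

    Linearly interpolates between calibration points; clamps outside
    the calibrated range.
    """
    if temp_c <= CAL_TABLE[0][0]:
        return CAL_TABLE[0][1]
    if temp_c >= CAL_TABLE[-1][0]:
        return CAL_TABLE[-1][1]
    for (t0, d0), (t1, d1) in zip(CAL_TABLE, CAL_TABLE[1:]):
        if t0 <= temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            return round(d0 + frac * (d1 - d0))
```

In a real part the main loop would just read the temperature sensor, look up the tuning word, and write it to the DAC at some fixed rate set by the designer.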
This is nice because it helps keep the crystal on track, but whether the Allan deviation of the circuit as a whole is acceptable across all applicable timescales depends a lot on the time constants of the control circuit.
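For reference, the basic (non-overlapping) Allan deviation at a single averaging time can be estimated from adjacent fractional-frequency samples; a minimal sketch, with names of my own choosing:

```python
import math

def allan_deviation(y):
    """Estimate sigma_y(tau) from fractional-frequency samples y[i], each
    averaged over the same interval tau:

        sigma_y(tau) = sqrt( (1 / (2*(M-1))) * sum (y[i+1] - y[i])**2 )
    """
    diffs = [(b - a) ** 2 for a, b in zip(y, y[1:])]
    return math.sqrt(sum(diffs) / (2 * len(diffs)))
```

A compensation loop with a given time constant can improve this statistic at long averaging times while making it worse at short ones, which is why the time constants matter so much.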
As a domain this is both fascinating and far more complex than I had ever imagined, but having spent the last couple of months researching it and building (small) prototypes, I've learned enough to have deep respect for anything that does better than 10^-7 without an atomic clock in it. That is a serious engineering challenge, especially when you're on a tiny budget.
If you can use a GPS-disciplined oscillator then that's one possible solution, but there too you may see short-term deviations that are unacceptable even if the system is very precise in the long term.
Is there really a microcontroller in there? As in a general purpose microprocessor core executing machine code in ROM? Any references for that?
I find it baffling that this would be cost effective. Maybe by dropping in a CPU core and software you save some design cost vs. a more specialized IC. But it must be more expensive per unit to manufacture in a process where you can fit in all those transistors. And these things are manufactured in such quantities that design costs must be a pretty minimal part of the final part price.
This used to be implemented as a purely analog control loop, i.e. op-amps and such; after all, TCXOs predate the age of ubiquitous CPUs by decades. Even if there is a need for a factory-programmed temperature calibration curve, there are techniques to implement it in a purely analog way, or in a dedicated digital circuit whose transistor count will be much lower than a general-purpose CPU core's.
That microcontroller costs a small fraction of the precision ground crystal it is boxed in with.
You need a way to calibrate the device after the package is sealed; that implies some smarts, or you're going to end up with a whole raft of extra pins, and that would be costlier than the microcontroller!
I'm sure there are alternative ways, but in this day and age CPUs with small amounts of flash and memory are priced a little bit above the sand they're made of. I have whole packaged units with far larger capabilities for $3 at Q1, and that's with a whole lot of assembly and other costly detailing.
Microchip, one particular embedded-controller manufacturer, lists their SMD-packaged PIC16F15213-I/SN, which is much more powerful than what you need here, for $0.33; at Q100 that drops to $0.274. That is for a complete packaged device, not a bare die, which would cost a small fraction of that.
Control loops and analog stuff work well, but not if you also want to be able to calibrate after the package is sealed. I'm not aware of any fully analog tech with the same flexibility and long-term stability, never mind mechanical stability (microphony: talking to a crystal is probably the cheapest and easiest way to get FM modulation!). Note that precision-wise this is different from a device that simply measures the temperature and compensates based on that; the device you are looking at in this article is easily an order of magnitude better.
Just because it is digital doesn't mean it has to be a microcontroller though, right? I see no reason this wouldn't just be a state machine or whatever out of plain old logic.
Well, I've been working with a number of these devices, different brands but with the same or slightly more functionality, and they all have little controllers in them. Some are documented and you can talk to them directly (usually I²C); others are 'black boxes': you can tell there is something living on the other side of the nominally 'NC' pin, but not what, and you don't have control over it.
I also have a couple of very fancy ones that you can compensate and whose NV memory you can write to directly. Those are pretty expensive, $100 or thereabouts but the precision is unreal for a non-governed device.
There's definitely enough transistors in there for a microcontroller. It only takes a few thousand. If you're building a custom integrated circuit in the first place, the cost of a microcontroller core is relatively low, and often the cheapest option. The alternative is to write the logic in a hardware description language (HDL) like Verilog, and implement it with logic gates.
The microcontroller approach uses a fixed number of transistors, with cheap mask ROM scaling with complexity, whereas the HDL approach scales its transistor usage with complexity. The HDL approach usually runs much, much faster, but it is far more error prone and takes longer to develop.
Which approach is better depends a lot on the application.
If you implement a temperature-calibration curve by analog means, it will drift over time, unless you use very high-quality and expensive components.
Calibrations done with a microcontroller have replaced those done with analog components in most applications, because it reduces the total cost.
Even a relatively powerful 32-bit ARM microcontroller costs a fraction of a dollar. Good analog components, with guaranteed behavior in temperature and in time, are usually more expensive than microcontrollers.
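As an illustration of why a stored digital calibration curve is attractive: an AT-cut crystal's frequency error versus temperature is roughly cubic, so a microcontroller can keep a few coefficients in flash and apply the negated curve at run time, with no component drift to worry about. The coefficients below are invented for the sketch, not measured data:

```python
T_REF = 25.0  # approximate inflection temperature for an AT-cut crystal, deg C

# Made-up calibration coefficients, as would be burned in at the factory.
A1 = -0.085   # linear term, ppm per deg C
A3 = 1.05e-4  # cubic term, ppm per (deg C)**3

def crystal_error_ppm(temp_c):
    """Approximate fractional frequency error of the bare crystal, in ppm."""
    dt = temp_c - T_REF
    return A1 * dt + A3 * dt ** 3

def correction_ppm(temp_c):
    """The controller steers the varicap by the negated curve."""
    return -crystal_error_ppm(temp_c)
```

Updating this calibration after sealing is just a flash write, which is exactly the flexibility the analog approach lacks.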