Optical repeaters are 1R repeaters, i.e. they only regenerate power. Inside the repeater "boxes" (they are actually cylinders) there is an optical amplifier. Typically these are erbium-doped fibre amplifiers (EDFAs), in other words a piece of fibre doped with erbium (a rare earth element). The amplifiers are pumped with laser diodes (typically 1-4 per EDFA) at 980 nm and 1480 nm wavelengths. By pumping the doped fibre at these wavelengths you provide high gain to the telecom channels, which usually sit in the optical C-band (~1525-1565 nm). This way you can reamplify signals over a large bandwidth (~4 THz) without having to do detection and retransmission (which would be unscalable). Repeaters are typically spaced 60-80 km apart in submarine cables, with a "transparent" design: the amplifier gain compensates for the transmission loss of the span of fibre between repeaters.
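To make the "transparent" design concrete, here's a back-of-the-envelope span budget. The loss figure and spacing are illustrative assumptions (typical C-band silica fibre loss and a spacing in the stated 60-80 km range), not values from any specific cable:

```python
# Back-of-the-envelope span budget for a "transparent" submarine link.
# Values are illustrative assumptions, not from any specific cable.
FIBER_LOSS_DB_PER_KM = 0.2   # typical low-loss silica fibre in the C-band
SPAN_KM = 70                 # repeater spacing within the 60-80 km range

span_loss_db = FIBER_LOSS_DB_PER_KM * SPAN_KM   # loss accumulated per span
edfa_gain_db = span_loss_db                     # "transparent": gain = loss

print(f"Each EDFA must supply about {edfa_gain_db:.1f} dB of gain")
```

With these numbers each EDFA supplies ~14 dB of gain, which is comfortably within what a single pumped erbium-doped fibre stage can deliver.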
Power delivery to the laser diodes is done through the metal jacket of the cable. The whole submarine cable is essentially a very long DC transmission line, which is a fascinating topic in itself. E.g. what is "ground" in such a line? It will differ by thousands of volts between continents.
> This way you can reamplify signals over a large bandwidth (~4 THz) without having to do detection and retransmission (which would be unscalable).
This trick also means the cable doesn't care about the rest of the technology. If it were a retransmitter, we'd need to replace the entire cable when changing from 100 Gbps over Protocol #39 to 200 Gbps over Protocol #40, because every retransmitter would need to be equipped for the new protocol. But the optical amplifier doesn't care why these photons turned up or what they mean - when provided with power, it just ensures proportionately more photons like them come out of the amplifier.
Because they're not actually the same photons, weird quantum tricks that would work at bench scale (where it was literally the same photon at the receiver as when you transmitted) will not work, but any conventional signalling within quite broad limits is OK. Researchers at the university where I studied as an undergraduate developed the EDFA.
Fun fact: Pirelli, the tire company, used to be big in submarine cable repeaters and related products. The ones I saw at telecom shows were painted Pirelli yellow. That part of Pirelli was sold to private equity.
That explains why the latency is still decent even after repeated amplification.
I wonder why DC though. Is AC lossy when surrounded by salt water?
There are a couple of factors at play here. One is that AC suffers from capacitive losses over long distances (high-power multi-megawatt underground/undersea cables are often HVDC for this and other reasons).
The other, more interesting one is that the repeaters in this kind of fibre optic cable are usually powered from both ends, from completely separate electrical grids (so one side sends -5000V and the other sends +5000V, for example). This allows for some level of redundancy as well as thinner insulation. With AC, keeping the phases on both sides aligned would be impractical, as would dealing with the inherent inefficiencies of AC transmission.
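The capacitive-loss point can be made quantitative. A rough sketch, where the capacitance per km is an assumed ballpark for a coaxial power conductor (not a measured submarine-cable figure):

```python
import math

# Rough estimate of why AC is impractical undersea: the cable's capacitance
# draws a "charging current" that grows linearly with length.
CAP_F_PER_KM = 0.2e-6   # ~0.2 uF/km, an assumed ballpark for a coax conductor
FREQ_HZ = 50.0          # mains frequency
VOLTS_RMS = 10_000.0    # illustrative feed voltage

def charging_current_amps(length_km: float) -> float:
    """RMS current needed just to charge the cable capacitance each cycle."""
    capacitance = CAP_F_PER_KM * length_km
    return 2 * math.pi * FREQ_HZ * capacitance * VOLTS_RMS

for km in (10, 100, 1000):
    print(f"{km:>5} km -> {charging_current_amps(km):8.1f} A charging current")
```

Even with these rough numbers, a 1000 km AC cable would draw hundreds of amps just to charge its own capacitance, dwarfing the roughly ampere-scale feed current the repeaters actually need, so DC is the only practical choice.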
AC is only popular because it works with transformers to step up/down the voltage, and it would be more expensive to step up/down a DC signal using electronics (which usually involves converting to AC internally anyway).
AC voltage is specified in RMS volts, which is based on the average power the AC delivers. The peak voltage (top of the sine wave) is 1.414x the RMS voltage. The insulator only cares about the peak voltage before it breaks down, so because DC doesn't waste time at lower voltages, it can transmit more power for the same insulation.
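A quick calculation makes the advantage explicit. Assuming the same peak voltage (i.e. the same insulation rating) and an arbitrary resistive load:

```python
import math

# Compare power delivered into a fixed resistive load by AC vs DC
# when both are limited to the same peak voltage (same insulation).
PEAK_V = 10_000.0    # insulation-limited peak voltage, illustrative
LOAD_OHMS = 100.0    # arbitrary illustrative resistive load

ac_rms_v = PEAK_V / math.sqrt(2)       # ~7071 V RMS for a sine wave
ac_power = ac_rms_v ** 2 / LOAD_OHMS   # average AC power
dc_power = PEAK_V ** 2 / LOAD_OHMS     # DC sits at the peak the whole time

print(f"AC: {ac_power / 1000:.0f} kW, DC: {dc_power / 1000:.0f} kW "
      f"(ratio {dc_power / ac_power:.2f}x)")
```

For the same insulation, DC delivers exactly twice the power of a sine-wave AC supply into the same resistive load.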
These are coax cables, just by the nature of the external physical shielding required (steel cable sheath). So, the EMF should be contained inside and not affected by the salt water. But, I'm not an expert there and could be missing something.
In addition to the other replies, I also recall hearing some time ago that the AC EM field interacted with wildlife in surprising ways (causing sharks to attack the cable, IIRC). It could be an urban legend at this point though.
unrelated but the default power supply for Central Office telecom equipment was always -48V DC (and 23" vs 19" racks)
> What is ground in such a line, it will differ by 1000s of Volts between continents.
Does that translate to free energy for the repeaters?
It’s not free at all. Most of the voltage drop along the cable is caused by conversion of electrical energy into photons within the erbium-doped fibre amplifiers. A relatively small fraction of the voltage drop is caused by losses in the copper conductor that carries the current along the route. The high supply voltage allows a relatively small current to carry tens of kilowatts of power to the amplifiers without causing much loss in the copper.
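A sketch of such a power-feed budget, with every number an illustrative assumption (supply voltage, conductor resistance, cable length, and repeater count are all ballpark figures, not from a real system):

```python
# Illustrative power-feed budget for a series-fed long-haul cable.
# All numbers are assumptions chosen for a round example.
SUPPLY_V = 10_000.0          # end-to-end potential, e.g. +/-5 kV from each shore
FEED_CURRENT_A = 1.0         # repeaters are fed in series at constant current
CONDUCTOR_OHM_PER_KM = 0.5   # assumed copper conductor resistance
LENGTH_KM = 6000.0           # assumed transoceanic route length
N_REPEATERS = 80             # ~75 km spacing over 6000 km

copper_drop_v = FEED_CURRENT_A * CONDUCTOR_OHM_PER_KM * LENGTH_KM
repeater_drop_v = SUPPLY_V - copper_drop_v       # remainder drops across EDFAs
per_repeater_w = (repeater_drop_v / N_REPEATERS) * FEED_CURRENT_A

print(f"Copper loss: {copper_drop_v:.0f} V of {SUPPLY_V:.0f} V; "
      f"~{per_repeater_w:.1f} W available per repeater")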
I took that as referring to how over large distances the results of driving a metal rod into the dirt don't always match, so if you do things like tie both ends of a shielded cable's shielding to separate ground rods you can get odd problems sometimes.
Although I hadn't thought the differences were usually anywhere close to that large.
You’re referring to creating an intentional ground loop, I believe [https://en.m.wikipedia.org/wiki/Ground_loop_(electricity)].
The challenge, as I understand it, is that yes, you will get ‘free’ power (not actually free, as you had to create the low-resistance electrical path for it to exist), but you have no control over the properties or values of what you get - and it will vary unpredictably.
It’s also unlikely you’ll consistently get much actual net power out of it, as you’re competing against an entire planet’s worth of reasonably conductive (in bulk) parallel paths.
It’s almost always a problem because of that.
It varies too much to be useful for powering repeaters. It also can't sustain enough current to be useful, since the resistance across the entire system is huge.