I found Moritz Klein's video on bucket brigade (audio) delays informative:
If you momentarily put two capacitors in parallel, the charge will divide between the two, right? So how does all the charge end up on the other side?
The other side of the 'input' capacitor is pulled down to a much lower voltage, so the charge prefers to flow into it. (This then alternates: on the next clock phase, the cell receiving the charge is the one pulled lower.)
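A toy model of that two-phase handoff (my own sketch, not Klein's circuit): each clock phase empties one cell completely into its lowered neighbor, so the charge marches down the chain instead of splitting 50/50 between the two capacitors.

```python
# Toy bucket brigade: each clock phase dumps a cell's charge into the
# next cell, which is held low so (ideally) all of the charge moves.
def bbd_tick(cells, phase):
    # phase 0 moves even -> odd cells, phase 1 moves odd -> even,
    # mirroring the two-phase clock of a real BBD
    for i in range(phase, len(cells) - 1, 2):
        cells[i + 1] = cells[i]  # destination pulled low: charge flows across
        cells[i] = 0.0           # source cell is left empty

def bbd_delay(samples, n_cells=8):
    cells = [0.0] * n_cells
    out = []
    for s in samples:
        cells[0] = s             # sample the input into the first cell
        bbd_tick(cells, 0)
        bbd_tick(cells, 1)
        out.append(cells[-1])    # tap the last cell as the output
    return out

print(bbd_delay([1, 2, 3, 4, 5, 6]))  # -> [0, 0, 0, 1, 2, 3]: a 3-sample delay
```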
There was also this popular (in its day) successor, the SAD-1024 dual analog delay line:
https://www.alldatasheet.com/datasheet-pdf/pdf/1161976/ETC2/...
Burying the lede there! Can someone expand on the bit about how CCDs are like bucket brigade delay lines?
It's basically the same thing, except the initial charges in each cell are created through the photoelectric effect when the sensor is exposed to light. After that, it's the same conveyor-belt-style readout process as for the BBD.
It's also why CCDs tended to have problems with horizontal or vertical halos around bright lights. Nowadays, most cameras use CMOS sensors, where the amplification and readout circuitry is integrated directly into each photosensitive cell.
A bucket brigade delay obtains a sample of a voltage on one end, and then shifts a copy of that charge through the cells to the other end. The delay line as a whole contains a window of samples at any point, but that window as such is never accessed.
A CCD line obtains a window of samples directly: the cells are charged according to light sensors. Then this window of samples is shifted out in the same way as a bucket brigade delay line.
So the main differences are that the bucket brigade is clocked continuously, running nonstop, and that only one of the cells obtains the sample.
The CCD runs only briefly after an image has been exposed and capture has been triggered, in order to shift out the samples, and sampling takes place at every cell.
A bucket brigade can be slowed down or sped up to change the delay. That affects the sampling rate and therefore frequency resolution. That's how we can create a chorus effect for musical instruments. So the clock speed is a direct functional parameter. Slowing down or speeding up the CCD doesn't make a difference to the result, except that there's likely an ideal range of rates: slow enough that each cell has time to charge accurately to match its neighbor, yet fast enough to avoid leakage.
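To make the clock-as-parameter idea concrete, here's a rough sketch using the standard BBD relationship delay = stages / (2 × f_clock), with an LFO wobbling the clock the way a chorus pedal does (the 1024-stage count and the frequency ranges are illustrative assumptions, not any particular chip's specs):

```python
import math

STAGES = 1024  # illustrative stage count (an MN3007-class part)

def delay_ms(f_clock_hz):
    # standard BBD relationship: delay = stages / (2 * f_clock)
    return 1000 * STAGES / (2 * f_clock_hz)

# Sweep the clock between 50 kHz and 100 kHz with a 0.5 Hz LFO,
# which sweeps the delay over the few-millisecond chorus range:
for t in (0.0, 0.5, 1.0, 1.5):
    f_clk = 75_000 + 25_000 * math.sin(2 * math.pi * 0.5 * t)
    print(f"t={t:.1f}s  clock={f_clk / 1000:.0f} kHz  delay={delay_ms(f_clk):.2f} ms")
```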
Check out the datasheet for this single row CCD sensor[1]. Look at pages 2 and 6, which show the block diagram and control signal waveforms. There is a single analog output pin. To read the device the "analog shift register" is used, which is a bucket brigade device to move the charge from the photo diode to the output buffer.
Can someone explain why this doesn't distort the signal? E.g., with all these capacitors one might think it would filter out higher frequencies?
It does. It also obeys the Shannon-Nyquist sampling theorem when you take into account the clock frequency (you can't delay signals above half the clock frequency). It also exhibits nonlinearities that will create harmonic distortion.
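A quick numerical check of that Nyquist point (my own illustration; the 10 kHz clock and 7 kHz tone are arbitrary numbers): sampled at a 10 kHz clock, a 7 kHz tone produces exactly the same sample values as an inverted 3 kHz tone, so the device cannot tell them apart.

```python
import math

f_clk, f_in = 10_000, 7_000   # clock and input frequency, in Hz
n_samples = 5
tone  = [math.sin(2 * math.pi * f_in * n / f_clk) for n in range(n_samples)]
alias = [math.sin(2 * math.pi * (f_clk - f_in) * n / f_clk) for n in range(n_samples)]
# The 7 kHz tone and the (inverted) 3 kHz alias yield identical samples:
print([round(x, 6) for x in tone])
print([round(-x, 6) for x in alias])
```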
At least in the pro audio space, those qualities are actually desirable.
It’s worth noting that in addition to the sample rate, the companding used to improve SNR and the anti-aliasing/reconstruction filtering (employed to attempt to retain signal integrity) play a large part in the character. A lot of people think of them as “dark”, but that voicing can vary a lot between delays depending on how the designer chose to bandwidth-limit it; therein lies the real art of analog delay!
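An idealized model of that companding (many BBD delays use NE570-class 2:1 compander chips, but the exact law varies by design, so treat this as a sketch): compress before the delay so quiet passages stay above the BBD's noise floor, then expand afterwards so the noise drops back down along with the signal.

```python
def compress(x):
    # 2:1 compressor: halves the dynamic range (square root of magnitude)
    return (abs(x) ** 0.5) * (1 if x >= 0 else -1)

def expand(x):
    # 2:1 expander: the exact inverse (squares the magnitude)
    return (abs(x) ** 2) * (1 if x >= 0 else -1)

NOISE = 0.01  # stand-in for noise added inside the BBD
for level in (1.0, 0.1, 0.01):
    y = expand(compress(level) + NOISE)
    # The added noise shrinks along with the signal, so it stays masked:
    print(level, round(y - level, 6))
```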
I’ve been working on an MN3205 with digital controls, and pushing it well past its reasonable specs is very fun; 4096 stages really start to fall apart past 300ms, and when you clock it down to get more than a single second of delay you get a wet robot fart out of the other end!
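Back-of-envelope numbers for those figures, again assuming the usual delay = stages / (2 × f_clock) relationship and a usable bandwidth below f_clock / 2:

```python
stages = 4096  # MN3205 stage count
for delay_s in (0.3, 1.0):
    f_clk = stages / (2 * delay_s)  # clock required for the target delay
    print(f"{delay_s * 1000:.0f} ms -> clock {f_clk / 1000:.2f} kHz, "
          f"bandwidth under {f_clk / 2000:.2f} kHz")
# 300 ms needs a ~6.8 kHz clock (audio band below ~3.4 kHz);
# 1 s needs ~2 kHz (below ~1 kHz -- hence the wet robot fart).
```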
They can be desirable. If I want to normalize the group delay of a bunch of drum mics, I'm sure as hell not gonna use a BBD delay for it. But if I want a flavourful analog delay on a guitar track, why not.
>a nunch ofndrum mkcs
"Am I having a stroke" comes to mind :) A bunch of drum mics? I'm no recording engineer, but I guess this is about mixing different signals from the same setup, with the signals having different delays (possibly due to filters, cables, or different types of mics)
Distance.
Microphones at different distances receive a signal at different times. When combined, the signals are not phase aligned. This causes comb filtering where the combined waves reinforce and cancel each other.
Though that is sometimes desirable (e.g. comb filtering is important for spatial ambience such as reverb).
Some audio engineers find careful microphone placement solves many potential problems.
Drums are notoriously hard because modern recording techniques use many microphones on a kit.
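Putting rough numbers on that comb filtering (speed of sound ~343 m/s; the spacings below are just examples): a path difference d delays one mic's signal by d/c, and the deepest cancellation lands at half the inverse of that delay.

```python
SPEED_OF_SOUND = 343.0  # m/s

for d_cm in (10, 34, 100):
    dt = (d_cm / 100) / SPEED_OF_SOUND  # extra travel time to the farther mic
    first_notch = 1 / (2 * dt)          # frequency of the deepest cancellation
    print(f"{d_cm:3d} cm spacing -> {dt * 1000:.2f} ms delay, "
          f"first notch at {first_notch:.0f} Hz")
```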
Corrected — smartphone keyboards suck.
Yeah, this is about phase alignment of multiple sources. brudgers' comment explains it quite eloquently.
> if I want a flavourful analog delay
Sounds like a use.
Makes sense, thank you!
A modern application of a similar concept:
Do I interpret the screenshots correctly: that's a whopping 2.5 millisecond delay?
Which is within the range needed for things like guitar pedals (flangers and chorus).