Greetings,
I am designing a simple LDO in the sky130 technology (I know there are already some designs out there on GitHub, but they either have issues or do not meet my design requirements), and I am having problems with the PSRR performance of the LDO.
The first schematic below is my error amp for now, which is a simple active-load diff pair. The second schematic is the LDO itself. The third schematic is the testbench for the LDO. For this design, Vdd is set to 2V and Vreg is set to 1.8V. I assume a Vref of 1.8V, so there is no resistive divider here. At this point I am targeting a minimum load current of 100uA and a maximum load current of 10mA.
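In Ngspice terms, the testbench boils down to something like the sketch below. This is only a sketch: the ldo_core subcircuit name, its port order, the model-library path, and the 1uF output decap are placeholders rather than the exact values in my schematics.

* --- LDO testbench sketch (placeholder names and values, see note above) ---
.lib /path/to/sky130A/libs.tech/ngspice/sky130.lib.spice tt
.include ldo.spice

* 2V supply and a 1.8V reference source (no resistive divider)
Vdd   vdd  0  2.0
Vref  vref 0  1.8

* LDO under test, output decap, and a DC load current
Xldo  vdd vref vreg ldo_core
Cout  vreg 0 1u
* the load is set between 100uA (min) and 10mA (max) across runs
Iload vreg 0 100u

.op
.end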
Now I am trying to do the loop-gain analysis in Ngspice (which is the main issue with many of the designs out there, as their loop-stability analysis is problematic). Here I have done three different loop-stability analyses: Ochoa's Z method, Middlebrook's method, and Tian's method. The fourth schematic shows the testbench for Ochoa's Z method, and the fifth schematic shows the testbench for Middlebrook's and Tian's methods (same testbench, different calculations). In case you are curious about how these methods are done, see the following:
Ochoa's Z method:
https://www.youtube.com/watch?v=BLXNkmubQzA
Middlebrook's and Tian's method:
http://education.ingenazure.com/ac-stability-analysis-ngspice/
(It is also theoretically possible to use the big cap/inductor method described here:
https://www.eecg.toronto.edu/~johns/ece331/lecture_notes/22_LG_simulation.pdf
but there are convergence issues with very large cap/inductor values in Ngspice, so I did not use that method here.)
I should note that none of these methods disturbs the DC operating point, and they all take the loading effect into account.
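To make this concrete, below is a rough sketch of how the Middlebrook double-injection version can be written in Ngspice. Everything in it is illustrative rather than my exact deck: the break point between the error-amp output and the pass-FET gate, the node names ea_out/gate, the frequency range, and the use of alter to switch the AC injection between the two runs are assumptions layered on top of the fifth schematic (and if alter does not update the AC magnitude in a given Ngspice version, the two runs can simply be done as two separate decks).

* --- Middlebrook double-injection probes (sketch, placeholder node names) ---
* The loop is broken between the error-amp output (ea_out) and the pass-FET
* gate (gate): Vinj sits in series in the signal path and doubles as the
* current probe, while Iinj injects into the gate side.
Vinj ea_out gate dc 0 ac 1
Iinj 0 gate dc 0 ac 0

* Run 1 (voltage injection) gives Tv = -v(ea_out)/v(gate).
* Run 2 (current injection, with Vinj acting as a 0V ammeter) gives
* Ti = -i(Vinj)/(i(Vinj) + 1) for a unit AC current injection.
* Middlebrook combines them as T = (Tv*Ti - 1)/(Tv + Ti + 2).
.control
  ac dec 20 1 100meg
  let tv = -v(ea_out)/v(gate)
  alter @vinj[acmag] = 0
  alter @iinj[acmag] = 1
  ac dec 20 1 100meg
  let ti = -vinj#branch/(vinj#branch + 1)
  let t = (ac1.tv*ti - 1)/(ac1.tv + ti + 2)
  plot db(t) xlog
  plot 180/pi*cph(t) xlog
.endc

Tv and Ti individually depend on where the loop is broken, but the combination (Tv*Ti - 1)/(Tv + Ti + 2) recovers the loaded loop gain, which is why the different methods should agree.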
Anyway, I obtained fairly similar loop-gain Bode plots from these methods with a load current of 100uA; only the Middlebrook result is shown in the sixth figure. I understand that the phase margin is pretty bad (I have not optimized the design yet). The loop has a DC gain of about 50dB and a unity-gain frequency of 1MHz.
And now here is the problem. It is well established that the PSRR of an LDO, below the unity-gain frequency, is approximately 1/L, where L is the loop gain. This means I should expect a PSRR of around -50dB at DC, starting to deteriorate around the unity-gain frequency (with the output decap kicking in at high frequency). But the simulated PSRR, shown in the seventh figure, is not what I expected, and I am not sure what causes this.
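For context, the kind of PSRR testbench I have in mind is the sketch below (placeholder names and values again): an AC magnitude of 1 on Vdd, an ideal ripple-free Vref, and the load biased at 100uA, so that vdb(vreg) can be read directly as the PSRR in dB.

* --- PSRR testbench sketch (placeholder names and values, as above) ---
.lib /path/to/sky130A/libs.tech/ngspice/sky130.lib.spice tt
.include ldo.spice

* AC ripple of magnitude 1 on the supply; Vref is ideal and clean
Vdd   vdd  0  2.0 ac 1
Vref  vref 0  1.8

Xldo  vdd vref vreg ldo_core
Cout  vreg 0 1u
Iload vreg 0 100u

* with |Vdd_ac| = 1, vdb(vreg) is 20*log10(|Vreg/Vdd|), i.e. the PSRR curve
.ac dec 20 1 100meg
.print ac vdb(vreg)
.end

Keeping Vref ideal here is deliberate, so that the plot isolates the supply path and can be compared against the 1/L expectation.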
Any insight into this would be extremely helpful!