Regarding your question: 1) The rated update rate for a DAC should not cause significant degradation in performance: you should achieve your stated analog conversion accuracy at whatever update rate you choose. The update rate is usually limited by settling delays against a specified conversion precision. If, for example, you spec'd 10-bit input and 0.2% precision, then you would say you are failing at the frequency where you no longer achieve that precision. (-6 dB is very significantly worse than that.)
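As a back-of-envelope sketch of why settling sets the update rate: assuming a simple single-pole (RC-style) settling model, the output error decays as exp(-t/tau), so settling within a relative error e takes ln(1/e) time constants. The 0.2% figure and the 10-bit half-LSB comparison below are just the numbers from the example above, not values from your design.

```python
import math

def settling_taus(rel_error):
    """Time constants a single-pole system needs to settle within rel_error."""
    return math.log(1.0 / rel_error)

# 0.2% precision target from the example above
taus_spec = settling_taus(0.002)         # ~6.2 time constants

# half-LSB of a 10-bit converter, for comparison
half_lsb = 1.0 / (2 ** 11)
taus_half_lsb = settling_taus(half_lsb)  # ~7.6 time constants

print(f"0.2% target: {taus_spec:.2f} tau; 10-bit half-LSB: {taus_half_lsb:.2f} tau")
```

So a tighter precision spec costs you logarithmically more settling time per update, which is why the achievable update rate falls as the precision requirement rises.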
3) The importance of small-signal noise spectral density varies with the range and precision of the conversion. You might find that a post-layout extracted netlist, simulated with all layout parasitics while performing conversions, shows a much higher noise contribution than the small-signal transistor models would suggest. As Luis said before, I think a large-signal transient simulation of some reference conversions (1 Hz, 10 Hz, 100 Hz, 1 kHz, 10 kHz...) is likely to help if the output is viewed in the frequency domain. (It is usually surprisingly bad 🥲.) The good news is that these noise components usually have very predictable frequencies (switching-regulator switching frequency, AC mains, CPU clock, I/O bus frequencies, motherboard clock, etc.) that are hopefully far from your desired output frequency and are therefore relatively easy to filter.
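To illustrate the "view the output in the frequency domain" step: a minimal sketch, using a synthetic waveform rather than a real simulation trace, of running an FFT over a sampled output and picking out the spur frequencies. The 50 Hz mains tone, 100 kHz switcher tone, and their amplitudes are made-up assumptions standing in for whatever your transient simulation produces; the frequencies are chosen to land exactly on FFT bins so there is no leakage to worry about in this toy case.

```python
import numpy as np

fs = 1_000_000          # sample rate of the transient waveform, Hz (assumed)
N = 100_000             # number of samples -> 10 Hz bin spacing
t = np.arange(N) / fs

# Hypothetical DAC output: 1 kHz desired tone plus a mains spur and a
# switching-regulator spur (amplitudes are arbitrary for illustration)
sig = (1.0   * np.sin(2 * np.pi * 1_000 * t)     # desired output
     + 0.01  * np.sin(2 * np.pi * 50 * t)        # AC mains pickup
     + 0.005 * np.sin(2 * np.pi * 100_000 * t))  # switcher frequency

# Single-sided amplitude spectrum
spec = np.abs(np.fft.rfft(sig)) / (N / 2)
freqs = np.fft.rfftfreq(N, 1 / fs)

# Report every tone above an arbitrary -60 dBFS-ish floor
peaks = freqs[spec > 1e-3]
print("Tones found at (Hz):", peaks)
```

Because the spur frequencies are predictable (mains, switcher, clocks), once they show up in the spectrum like this you can check how far they sit from your desired output frequency and size the filtering accordingly.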