On Saturday 13 June 2009 01:09:04 RZ wrote:
> But the block signals effect was quite overwhelming and sufficient - so
> I stopped optimizing everything else ;-)
As far as I know this means all the update signals are then sent afterwards.
So after changing 10,000 data points, instead of a continuous stream of
10,000 dataChanged(index) signals, you will get them all at once when the
signals are unblocked again. But these are still 10,000 signals. Which means
after you have pushed new data into the model for 1 second, the GUI will
slowly adapt to this for the next second. This might work when you only
change lots of data for a short period, but it is the wrong behaviour when
changing lots of data continuously for hours (I have evaluations that run for
several days non-stop; and really, before I optimized it, the GUI kept
updating for several hours to adapt to all the signals).
And sadly Qt doesn't provide a unified way to combine signals, so you have
to coalesce the dataChanged() signals somewhere in your own code...
For these specialized high-throughput use-cases you simply _have_ to implement
your own models. Which isn't hard unless you want the special stuff like
drag'n'drop... (/me has yet to extend a custom model to do drag'n'drop.)
Qt-interest mailing list