I am in optimization mode now. If you have noticed that checksumming is abysmally slow on very large files, don't worry, I have noticed as well.
I was able to improve checksum speed by orders of magnitude on large files, and probably export speed as well. I am cutting out anywhere we apply operations to extremely large buffers in one shot, which keeps Python from bogging down trying to manage them. So over the next little bit I will try to optimize all of the big processes (or at least attempt to push them further). Loading is pretty good right now; I will see if I can squeeze a bit more out of it. Checksumming and exporting should be much better. I will run some tests on tab-restore speed for large files, and also see if I can throttle highlighting on large selections.
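To give a rough idea of the approach, here is a minimal sketch of chunked checksumming, not the project's actual code: feed the hash object fixed-size chunks instead of slurping the whole file into one giant buffer. The chunk size and function name here are just my own illustration.

```python
import hashlib

# Illustrative chunk size; tune to taste.
CHUNK_SIZE = 1024 * 1024  # 1 MiB per read

def chunked_sha256(path):
    """Hypothetical helper: checksum a file in fixed-size chunks so
    Python never has to hold the entire file in memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read until EOF, feeding the hash incrementally.
        for chunk in iter(lambda: f.read(CHUNK_SIZE), b""):
            h.update(chunk)
    return h.hexdigest()

print(chunked_sha256("big_file.bin"))
```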
Whether or not it happens before the next commit, I will also be threading the heavy-hitting processes, and hopefully providing a cancel option in case you were silly enough to try and edit a 100MB binary file.
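For the threading plus cancel option, something like the following is what I have in mind, purely a sketch with made-up names: a worker thread that checks a threading.Event between chunks so a long job can be aborted mid-flight.

```python
import threading

def long_job(cancel_event, total_chunks=1000):
    """Illustrative heavy job that checks for cancellation between chunks."""
    for i in range(total_chunks):
        if cancel_event.is_set():
            print("job cancelled at chunk", i)
            return
        # ... process one chunk of the big buffer here ...

cancel = threading.Event()
worker = threading.Thread(target=long_job, args=(cancel,))
worker.start()

# From the UI side: flip the event to request cancellation.
cancel.set()
worker.join()
```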
I will at least have the optimized checksumming/exporting in a commit tonight. I just need to get in some more real-world testing.