Message boards : Number crunching : Improvements to Rosetta@home based on user feedback
JoHnY · Joined: 15 Feb 06 · Posts: 1 · Credit: 1,879,398 · RAC: 0
I have a question for the project organizers. Why don't you use more effective compression algorithms for work units? Recently I received a WU of more than 3 MB. I analyzed the compression ratio with several archivers and got 50-100% better compression than the gzip currently used in the project. It's really expensive to pay for such heavy internet traffic, and I think many other members will agree with me. For example:
aat001_09_05.200_v1_3.gz - 3,264,952 bytes
aat000_09_05.cab - 1,864,416 bytes
aat001_09_05.uha - 1,313,822 bytes
FluffyChicken · Joined: 1 Nov 05 · Posts: 1260 · Credit: 369,635 · RAC: 0
This was pointed out a year or more back when I first started. They did manage to trim the file sizes down on a few of the targets they gave out, though I'm not sure they still follow that practice, as sizes seem to have crept back up. Rosetta reads from the files directly, and gzip is what BOINC provides for that. BOINC can also do ZIP compression, but I'm not sure it can be used in the same way; the files may have to be expanded out first (as CPDN do/did). That's not a problem for me, as I would prefer it over the large downloads. They could of course recompress the .gz files with another program, but they would need to build the decompressor into the Rosetta client and make it compatible with Windows x32/64, Mac OS X, and Linux. You would get the best of both worlds then. For BOINC to include it, they would normally need the source to port it to more platforms.
Team mauisun.org
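A small sketch of the trade-off being discussed: Python's standard library ships both gzip (the format BOINC uses) and LZMA (a stronger compressor, the same family as 7-Zip/.xz), so the size difference can be measured directly. The sample data below is a made-up stand-in for a Rosetta work-unit file, not real project data, so the exact ratios are only illustrative.

```python
import gzip
import lzma

# Hypothetical structured-text payload standing in for a work-unit file.
# Real WU files differ, so treat the resulting ratios as illustrative only.
lines = [
    f"ATOM {i:6d} CA ALA A {i % 100:4d} {i * 0.01:10.3f}".encode()
    for i in range(5000)
]
data = b"\n".join(lines)

# Compress the same payload with both algorithms at maximum effort.
gz = gzip.compress(data, compresslevel=9)
xz = lzma.compress(data, preset=9)

print(f"raw : {len(data):7d} bytes")
print(f"gzip: {len(gz):7d} bytes")
print(f"lzma: {len(xz):7d} bytes")
```

On structured text like this, LZMA's larger dictionary usually wins over gzip's 32 KB window, which is the effect behind the size comparison in the original post; the cost is that the client on every platform would need a matching decompressor.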
©2025 University of Washington
https://www.bakerlab.org