Re: [Tails-dev] review and merge feature/6999-compress-png

Author: sajolida
Date:
To: The Tails public development discussion list
Subject: Re: [Tails-dev] review and merge feature/6999-compress-png
intrigeri:
> sajolida wrote (02 Mar 2015 14:53:20 GMT) :
>> intrigeri:
>>> Another option would be to have ikiwiki recompress these images when
>>> building the wiki. This would be a bit costly, so likely not suitable
>>> for local builds, but just fine on our production website I guess.
>>> Is there any plugin that does something like that? If not, it should
>>> be relatively trivial to write.
>
>> Maybe we need to make the objective clearer. To me, the objective
>> is to reduce the impact of PNG images on the size of our ISO
>> images. I don't care so much about online visits to the website,
>> seeing that this only brings 15% compression overall.
>
> I didn't get this initially.


You're right, initially it was about website optimization. But seeing
that we only get 20% compression on 2.4MB of total data, I'm not sure
it's worth it, and I thought ISO image optimization made more sense,
as ISO image size has more implications than just Internet downloads
(size on the USB stick, isohybrid stuff, etc.).

> How much do we gain on the resulting ISO size (with the default, slow
> SquashFS compression settings), thanks to this recompression process?


I didn't try that. I'll do it at some point, but it's low priority to me.
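
For reference, a rough way to measure the SquashFS-side gain without
a full ISO build could be something like this (a sketch only; I'm
assuming squashfs-tools with the xz compressor, and the directory
name is hypothetical):

  # Build a SquashFS from the images before and after recompression,
  # then compare sizes:
  $ mksquashfs wiki/ before.squashfs -comp xz -noappend
  $ # ... run the recompression pass on wiki/ ...
  $ mksquashfs wiki/ after.squashfs -comp xz -noappend
  $ ls -l before.squashfs after.squashfs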

> Good catch. Yes, indeed. So what we gain on ISO images (that might be
> downloaded without Tor), we lose it on IUKs (that are always
> downloaded via Tor). So, we should really *not* recompress images at
> ISO build time, right?


Agreed.

>> If we agree on this, then I'm very sorry to have polluted the Git repo
>> with 2MB of compressed binaries right after you cleaned it...
>
> No problem: the improvement for the visitors of our web site still
> seems worth it to me, as long as we don't do it again every 6 weeks.
>
> So, I see three options:
>
>  a) Revert all changes to the release process doc, and merge this
>     branch to improve navigation (especially over Tor) on our website,
>     at the expense of a 3-4% bigger Git repository.
>
>  b) Delete this branch and hope for Git garbage collection to give us
>     these 2MB back.


I'd do that for the moment, as this branch really doesn't contain
much work that isn't already fully documented on the ticket. How do I
delete a branch on the remote repository?
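
Is it just something like this (assuming the remote is called
origin)?

  $ git push origin --delete feature/6999-compress-png

I understand this alone doesn't reclaim the space: the objects stay
around until git gc prunes them on the server, and existing clones
keep them anyway.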

>  c) Refocus this effort on website navigation, and try the
>     web-optimized palette trick that I reported about on
>     #6999#note-1 (on the image I tested it on, back then I got a 60%
>     size reduction, compared to the 6.3% we got with optipng+advdef).


I'd do that as well, but not myself, as Naar has proved to be far
more of an expert on this than me :)
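
For context, that palette trick could look something like this
(a sketch only; pngquant is my assumption, the ticket may use a
different tool), followed by the usual optipng/advdef pass:

  # Quantize to an 8-bit palette in place, then recompress
  # losslessly (--ext .png --force overwrites the original):
  $ pngquant --ext .png --force 256 image.png
  $ optipng -o5 image.png
  $ advdef -z -4 image.png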

>     Of course it's much more painful than the commands Naar and you
>     came up with, but in this case it's a one-shot effort, as opposed
>     to something that should be run for every release and needs to be
>     automated.


Right. Note that the compression we are proposing here could also, in
theory, be a one-shot effort. The problem is actually keeping track
of whether it has already been done.

>     Then, if it gives substantially better results, then rewrite the
>     topic branch's history to replace commit 1a7beb4 with one that
>     compresses images even better... and merge it!


Ack.

> My (humble) opinion:
>
> * If someone is excited by (c), please go for it.


I'll point the ticket to our discussion here.

>   * Else, I don't care much. (a) is good because we get an immediate
>     benefit from the great work that's been done. (b) is good — if it
>     actually works in terms of reclaiming space in Git — because it
>     leaves more room for someone doing (c) later without me
>     complaining that these images are already stored twice in .git.


Let's do b) for the time being. If c) never happens or doesn't work,
we can always roll back.

> Now, regardless of what we do with *existing* images, it would be good
> to have "something" that prevents poorly compressed images from being
> *added* to Git.
>
> There are a lot of candidate solutions, such as documentation for doc
> and automated test writers (will be regularly forgotten), a pre-commit
> hook that asks you to confirm that you've optimized stuff whenever you
> add or update a picture (only works for those who actually bother
> setting up the pre-commit hook and reading what it tells them), adding
> a check about that in our review'n'merge policy (won't work at all
> I bet), or a hook on our main Git repo that tries to recompress
> added/updated pictures and, if it can itself improve compression by
> some factor, forbids the branch from being pushed (probably the only
> really working solution, but it has to be written, and had better be
> pretty robust and safe).
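
For reference, the pre-commit variant could be as small as this
sketch (the exact check and the wording are my assumptions):

  #!/bin/sh
  # Sketch of a .git/hooks/pre-commit hook: ask for confirmation
  # whenever a PNG is added or updated in the index.
  if git diff --cached --name-only --diff-filter=AM | grep -q '\.png$'
  then
      printf 'You are committing PNGs. Did you optimize them? [y/N] '
      read answer < /dev/tty
      [ "$answer" = "y" ] || exit 1
  fi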


I must admit that I'm not really excited about over-engineered
solutions with hooks and all. Here are two other possible low-fi
techniques, though they would require adding two copies of each image
to Git (the uncompressed version first, then the compressed one), and
you might not like them:

1. Couldn't we say that we continue happily adding uncompressed
images like we did in the past, and that on a regular basis (at
release time, once a year, or whatever) we spot the images that are
in master and haven't been compressed yet (for example, those merged
since the last time we compressed images), and then compress only
those? See the first sketch after item 2.

2. We try our best to compress images before adding them. If the
compression process is manual and quite cumbersome, we should be able
to remember to mention in the commit message whether we compressed
them. Then uncompressed images could be spotted based on that every
now and then, and compressed. See the second sketch below.
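
For technique 1, spotting the candidates could be a one-liner
(a sketch; I'm assuming here that 1a7beb4 was the last compression
pass):

  # List PNGs added or modified in master since the last
  # compression commit:
  $ git diff --name-only --diff-filter=AM 1a7beb4..master -- '*.png'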
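
For technique 2, assuming the hypothetical convention of mentioning
"compress" in the commit message, the unoptimized images could be
spotted with something like:

  # List image commits whose message doesn't mention compression:
  $ git log --oneline --diff-filter=AM -- '*.png' | grep -vi 'compress'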

Of course, all this only makes sense if we get a compression rate
that is worth the effort... so maybe we should investigate c) before
deciding on this, and see how the result affects the final ISO size.

--
sajolida