Re: [Tails-dev] review and merge feature/6999-compress-png

Author: intrigeri
Date:
To: The Tails public development discussion list
Subject: Re: [Tails-dev] review and merge feature/6999-compress-png
Hi,

[edit: you might want to jump straight to the part about IUKs.]

sajolida wrote (02 Mar 2015 14:53:20 GMT) :
> intrigeri:
>> Another option would be to have ikiwiki recompress these images when
>> building the wiki. This would be a bit costly, so likely not suitable
>> for local builds, but just fine on our production website I guess.
>> Is there any plugin that does something like that? If not, it should
>> be relatively trivial to write.


> Maybe we need to make it more clear what the objective is. To me the
> objective is to reduce the impact of PNG images on the size of our ISO
> images. I don't care so much about online visits to the website, seeing
> that this only brings 15% compression overall.


I didn't get this initially.

How much do we gain on the resulting ISO size (with the default, slow
SquashFS compression settings), thanks to this recompression process?

(It might be that our extreme SquashFS compression settings already
give us essentially the same gain as recompressing the images would.
We're talking about a mere 300-400 kB anyway.)
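
For what it's worth, here is roughly how one could measure that; this
is a sketch only, the directory names are placeholders and "-comp xz"
merely approximates our real compression options:

    # Sketch: compare the size of a SquashFS built from the wiki tree
    # before and after PNG recompression.  Paths and options are made up.
    import os, subprocess

    def squashfs_size(tree, out):
        subprocess.run(["mksquashfs", tree, out, "-comp", "xz", "-noappend"],
                       check=True)
        return os.path.getsize(out)

    before = squashfs_size("wiki-original", "before.squashfs")
    after = squashfs_size("wiki-recompressed", "after.squashfs")
    print("gain: %.1f kB" % ((before - after) / 1024.0))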

> If we agree on that, then maybe that could be hooked in the ISO
> build process (and not in the wiki build process). Could that go in
> auto/build right after build-wiki?


Agreed * 2, except:

> But then, how would that interfere with IUKs? Would non-deterministic
> builds of those images always end up in IUKs?


Good catch. Yes, indeed: since the recompression is not deterministic,
the images would come out byte-different on every build, and would thus
be shipped in every IUK. So whatever we gain on ISO images (which might
be downloaded without Tor), we lose on IUKs (which are always
downloaded via Tor). So, we should really *not* recompress images at
ISO build time, right?

> But I guess that this does not depend on whether this is done at ISO
> or wiki build time...


Indeed, it doesn't.

> If we agree on this, then I'm very sorry to have polluted the Git repo
> with 2MB of compressed binaries right after you cleaned it...


No problem: the improvement for visitors to our website still
seems worth it to me, as long as we don't do it again every 6 weeks.

So, I see three options:

 a) Revert all changes to the release process doc, and merge this
    branch to improve navigation (especially over Tor) on our website,
    at the expense of a 3-4% bigger Git repository.


 b) Delete this branch and hope for Git garbage collection to give us
    these 2MB back.


 c) Refocus this effort on website navigation, and try the
    web-optimized palette trick that I reported about in #6999#note-1
    (on the image I tested it on, back then I got a 60% size
    reduction, compared to the 6.3% we got with optipng+advdef);
    a rough sketch of the idea follows below.


    Of course it's much more painful than the commands Naar and you
    came up with, but in this case it's a one-shot effort, as opposed
    to something that should be run for every release and needs to be
    automated.


    Then, if it gives substantially better results, rewrite the
    topic branch's history to replace commit 1a7beb4 with one that
    compresses images even better... and merge it!
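
For reference, here is a rough sketch of the general idea behind that
trick, using Pillow; the 255-colour limit and the file names are
arbitrary picks, and the actual procedure and settings are the ones
described in #6999#note-1:

    # Sketch only: convert a truecolour PNG to an indexed (palette) PNG.
    # Note this drops the alpha channel, which is usually fine for our
    # screenshots but not for every image.
    from PIL import Image

    def to_palette(src, dst, colors=255):
        img = Image.open(src).convert("RGB")
        img.quantize(colors=colors).save(dst, optimize=True)

    to_palette("screenshot.png", "screenshot-indexed.png")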


My (humble) opinion:

* If someone is excited by (c), please go for it.

* Else, I don't care much. (a) is good because we get an immediate
  benefit from the great work that's been done. (b) is good — if it
  actually works in terms of reclaiming space in Git — because it
  leaves more room for someone doing (c) later without me
  complaining that these images are already stored twice in .git.


Now, regardless of what we do with *existing* images, it would be good
to have "something" that prevents poorly compressed images from being
*added* to Git.

There are a lot of candidate solutions:

 * documentation for doc and automated test writers (will be
   regularly forgotten);

 * a pre-commit hook that asks you to confirm that you've optimized
   stuff whenever you add or update a picture (only works for those
   who actually bother setting up the pre-commit hook and reading
   what it tells them);

 * adding a check about that to our review'n'merge policy (won't
   work at all, I bet);

 * a hook on our main Git repo that tries to recompress added/updated
   pictures and, if it can itself improve compression by some factor,
   forbids the branch from being pushed (probably the only solution
   that would really work, but it has to be written, and had better
   be pretty robust and safe); a rough sketch of that idea follows
   below.
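
To make that last idea a bit more concrete, here is a very rough
sketch of the kind of check I have in mind, written as a client-side
pre-commit hook for simplicity. The 10% threshold and relying on
optipng alone are arbitrary choices, it inspects the working-tree copy
rather than the staged blob, and a server-side hook would have to
extract blobs from the pushed refs instead:

    #!/usr/bin/env python3
    # Sketch only: refuse to commit PNG files that optipng can still
    # shrink by more than an (arbitrary) 10%.
    # Would be installed as .git/hooks/pre-commit.
    import os, shutil, subprocess, sys, tempfile

    THRESHOLD = 0.10  # arbitrary: complain if optipng saves more than 10%

    # PNG files that are added or modified in the index.
    staged = subprocess.run(
        ["git", "diff", "--cached", "--name-only", "--diff-filter=AM"],
        capture_output=True, text=True, check=True).stdout.splitlines()

    bad = []
    for path in staged:
        if not path.lower().endswith(".png") or not os.path.exists(path):
            continue
        with tempfile.TemporaryDirectory() as tmp:
            copy = os.path.join(tmp, "img.png")
            shutil.copyfile(path, copy)
            # Recompress a scratch copy and see how much smaller it gets.
            subprocess.run(["optipng", "-quiet", "-o7", copy], check=True)
            before, after = os.path.getsize(path), os.path.getsize(copy)
            if before and (before - after) / before > THRESHOLD:
                bad.append(path)

    if bad:
        print("These PNG files look poorly compressed; please optimize them:")
        for path in bad:
            print("  " + path)
        sys.exit(1)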

Thoughts?

Cheers,
--
intrigeri