Unable to submit commit to Phabricator due to post_max_size being too small
Categories
(Conduit :: Infrastructure, defect, P3)
Tracking
(Not tracked)
People
(Reporter: dminor, Assigned: glob)
References
Details
(Keywords: conduit-triaged)
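For context, the limit referenced in the bug title is PHP's post_max_size directive, which caps the total size of a POST body and therefore bounds the largest diff Conduit can accept in a single request. A server-side fix would look something like the following; the values are illustrative, not the actual Phabricator host configuration:

```ini
; php.ini — illustrative values only.
; post_max_size caps the total size of an HTTP POST body;
; upload_max_filesize must not exceed it for file uploads to work.
post_max_size = 64M
upload_max_filesize = 64M
```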
Comment 1•6 years ago
Reporter
Comment 2•6 years ago
Comment 3•6 years ago
Reporter
Comment 4•6 years ago
Comment 5•6 years ago
Reporter
Comment 7•6 years ago
Comment 8•6 years ago
Reporter
Comment 10•6 years ago
Reporter
Comment 11•6 years ago
For what it's worth, the problem does not appear to be specific to webrtc. I believe I hit it every time I run ./mach vendor rust.
Assignee
Comment 14•6 years ago
i think rust vendoring is a slightly different case than webrtc, particularly around code review of vendored code.
when servo was automatically synchronised into mozilla-central, the mach vendor rust step was automated and outside the scope of code review; the updates to the lock file were reviewed, not the resulting vendoring changes.
i think this points to issues with how we vendor projects, touching all languages. i have a project on my backburner to chase this down; unfortunately i'm not expecting to have much time to put against it in the short term.
Reporter
Comment 15•6 years ago
With webrtc.org, I have problems in two places:
The first is with the import of the third-party code itself. This is done as a separate commit and does not include any of our local modifications. I'd argue that this should be outside the scope of code review: it's all third-party code, and any required modifications need to be addressed upstream or in our local patches. This is only an issue while doing an update.
The second is the json configuration files that are generated by gn and are in turn used to generate moz.build files. This is a problem while doing an update, but also whenever we need to change any gn build configuration. The generated json files are quite large and contain a lot of extraneous information. As a workaround, we might be able to cut down their size by filtering them prior to committing. I'd also argue that the json files should be outside the scope of code review: what should be reviewed are the changes to the gn configuration files themselves, not the resulting generated files.
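The filtering workaround above could be sketched roughly as follows. This is a hypothetical illustration: the key names and `targets` layout are invented for the example and are not the real gn output schema.

```python
import json

# Hypothetical allow-list of fields the moz.build generator would consume;
# everything else in a gn target description gets dropped. The key names
# here are illustrative, not the real gn schema.
KEEP_KEYS = {"sources", "defines", "include_dirs", "deps"}

def filter_target(target: dict) -> dict:
    """Return a copy of a gn target description with extraneous keys removed."""
    return {k: v for k, v in target.items() if k in KEEP_KEYS}

def filter_gn_json(text: str) -> str:
    """Slim down a gn-generated json blob before it is committed."""
    data = json.loads(text)
    slimmed = {name: filter_target(tgt)
               for name, tgt in data.get("targets", {}).items()}
    return json.dumps({"targets": slimmed}, indent=2, sort_keys=True)
```

Running the real gn output through a filter like this before committing would shrink the diff, though it does not remove the underlying POST size limit.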
To date, I've worked around these problems by manually splitting commits into smaller ones or by requesting review through splinter, with the review pointing to a commit in my user hg.mozilla.org repo. Neither of these is ideal.
If the plan for the future is for all code to land through Lando, I think we need some way of getting large commits to Phabricator/Lando without having to worry about splitting them up manually prior to submitting. Maybe that just means some tooling that runs prior to submitting code for review; maybe it's something more complicated that touches on our review policy for vendored and generated code. I'm not really sure.
Assignee
Comment 16•6 years ago
thanks for the extra detail there, dminor.
(In reply to Dan Minor [:dminor] from comment #15)
> If the plan for the future is for all code to land through Lando
this is indeed the plan; however, there will be "escape hatches", as there are a lot of edge cases such as this one.
> Maybe that just means some tooling that runs prior to submitting code for review
we need some tooling in place; however, my initial thoughts are more that code that's generated or vendored generally shouldn't need to be explicitly reviewed, and so shouldn't need to be pushed to phabricator. rather, it could be generated by transplant when landing is requested. i know this is somewhat idealised and not a perfect solution, which is why we'll be exploring options here prior to taking action.
diving into this is something i plan on doing soonish; i'll make sure to reach out to you and others at that time.
Assignee
Comment 17•6 years ago
i'm going to explore options here and report back.
Comment 18•6 years ago
(In reply to Byron Jones ‹:glob› 🎈 from comment #17)
> i'm going to explore options here and report back.
any progress? since this bug is the last blocker for webrtc updates, it would be nice to have this sorted out :)
Assignee
Comment 19•6 years ago
Sorry; i wasn't aware this was actually blocking webrtc updates.
Splitting up the code into smaller patches, should review be required, is the best solution.
I'm still working through proposals and solutions to solidify and standardise our approach to reviewing vendored third-party code, with the goal of addressing the "is a review actually required of this code" question. The general consensus is that reviewing a patch of webrtc's size is not viable.
Calling arc diff with --less-context will result in smaller patches. I've filed bug 1549672 to expose this with moz-phab; however, even then, given the normal patch was 27M, it's still likely to bump up against the 32M limit if base64 encoding is in the mix.
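The arithmetic behind that concern can be checked directly: standard base64 inflates a payload by a factor of 4/3 (rounding up to whole 4-character groups), so a 27M patch encodes to roughly 36M, comfortably over a 32M post_max_size. A quick sketch, assuming "M" means MiB on both sides:

```python
import base64
import math

MIB = 1024 * 1024

def b64_encoded_size(n_bytes: int) -> int:
    """Bytes needed for the padded base64 encoding of an n-byte payload."""
    return 4 * math.ceil(n_bytes / 3)

patch = 27 * MIB   # the webrtc patch size mentioned above
limit = 32 * MIB   # the post_max_size limit mentioned above
encoded = b64_encoded_size(patch)
print(f"{encoded / MIB:.1f} MiB encoded vs {limit / MIB:.0f} MiB limit")
# prints "36.0 MiB encoded vs 32 MiB limit"
```

So even a patch well under the nominal limit can be rejected once the base64-encoded request body is what the server measures.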