Closed Bug 757460 Opened 12 years ago Closed 12 years ago

Extract testing modules from test archives

Categories

(Release Engineering :: General, defect, P2)

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: gps, Assigned: catlee)

References

Details

Attachments

(1 file)

Bug 755339 will be adding a directory for JS modules to the tests archive. We need to extract this directory when running tests. The attached patch does this. I also refactored slightly to minimize redundancy.

This is my first patch to buildbot-configs. I'm not sure if I am doing it right or how this needs to land (there are multiple branches?).
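For context only, a rough illustration of what the change amounts to, with hypothetical names (this is not the attached patch or the real buildbotcustom code): adding a 'modules*' glob to the list of patterns the test-unpack step already extracts from the tests zip.

    # Illustrative sketch only -- hypothetical names, not the actual
    # buildbot-configs/buildbotcustom code.  The unpack step extracts a
    # handful of shell globs from the tests zip; the patch adds one more.
    UNPACK_PATTERNS = [
        "bin*",
        "certs*",
        "modules*",   # new: JS testing modules added by bug 755339
        "mochitest*", # plus whichever suite the slave is about to run
    ]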
Attachment #626024 - Flags: review?(catlee)
Blocks: 757860
This bug is in the dependency chain for landing Apps in the Cloud, which we're trying to get into Firefox 15. This is the only bug in the dependency set where I'm not sure how things will work out as far as release schedule and deployment procedure, since it is part of RelEng/buildbot. Could someone shed some light on what to expect as far as the timeline goes? AFAIK, there is nothing more I need to do and the bug is in RelEng's court (for review).
Comment on attachment 626024 [details] [diff] [review]
Extract modules from archive, v1

AIUI, unless 'modules' is in the test zip for every branch (mozilla-aurora, mozilla-beta, mozilla-release, mozilla-esr10 especially), this will make unit tests on those branches perma-red.
(In reply to Aki Sasaki [:aki] from comment #2)
> Comment on attachment 626024 [details] [diff] [review]
> Extract modules from archive, v1
>
> AIUI, unless 'modules' is in the test zip for every branch (mozilla-aurora,
> mozilla-beta, mozilla-release, mozilla-esr10 especially), this will make
> unit tests on those branches perma-red.

You are right:

> $ unzip -o ../firefox-15.0a1.en-US.mac64.tests.zip 'foo*' 'bin*'
> Archive:  ../firefox-15.0a1.en-US.mac64.tests.zip
>   inflating: bin/certutil
>   inflating: bin/components/httpd.js
>   inflating: bin/components/httpd.manifest
>   inflating: bin/components/test_necko.xpt
>   inflating: bin/fix_macosx_stack.py
>   inflating: bin/fix_stack_using_bpsyms.py
>   inflating: bin/pk12util
>   inflating: bin/plugins/Test.plugin/Contents/Info.plist
>   inflating: bin/plugins/Test.plugin/Contents/MacOS/libnptest.dylib
>   inflating: bin/ssltunnel
>   inflating: bin/xpcshell
> caution: filename not matched:  foo*
> $ echo $?
> 11

Compare to a run without the 'foo*' pattern, where $? == 0. Boo :(

How do we handle this? Surely there is precedent for this.
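For reference, the failure mode can be reproduced outside of buildbot with a few lines of Python. A minimal sketch, assuming the archive name from the log above and unzip's documented exit status 11 ("no matching files"):

    import subprocess

    # Sketch only -- not the buildbot step itself.  unzip exits with 11 when
    # one of the filename patterns matches nothing in the archive, which the
    # harness then treats as a failed step.
    ARCHIVE = "firefox-15.0a1.en-US.mac64.tests.zip"  # name taken from the log above
    patterns = ["bin*", "certs*", "modules*"]

    rc = subprocess.call(["unzip", "-o", ARCHIVE] + patterns)
    if rc == 11:
        print("a pattern matched nothing -- e.g. no modules/ in this branch's test zip")
    elif rc != 0:
        raise RuntimeError("unzip failed with exit code %d" % rc)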
Blocks: 749336
(In reply to Gregory Szorc [:gps] from comment #3)
> How do we handle this? Surely there is precedent for this.

My stock answer would be that this will be easier once bug 650880 is fixed. However, it seems like we need this fixed earlier, so a few thoughts:

1) Specify the list of default test zip bits per-branch in buildbot-configs/mozilla-tests/config.py: ['bin*', 'certs*', 'modules*'] for m-c based branches, and ['bin*', 'certs*'] for aurora, beta, release, and esr10 (a rough sketch follows this comment). You'd have to touch buildbot-configs/mozilla-tests/config.py, buildbotcustom/misc.py, buildbotcustom/process/factory.py, and steps/misc.py to get that through. One downside, other than a lot of testing, is that all non-m-c-based Try unit tests would be perma-red. Another is having to remember to update this list every time the code merges down the trains.

2) Create some sort of manifest, or a shell script inside the zip, that tells you about the contents (or extracts them for you, e.g. a) extract script, b) script.sh mochitests). The downside here is we have to have fallback behavior if this manifest or script doesn't exist, or we need to land this on all branches simultaneously. A second downside is that it makes the zip extraction two steps.

3) Create a dummy file or empty dir or whatever called 'modules' on all non-m-c branches. This feels very hacky but could be the easiest fix. It would have to land on all branches before the buildbot patch lands.

4) Extract everything from the zip file all the time. I think there was a reason we're doing it this way (disk space or speeding things up), so reverting this would require investigating why that happened in the first place.
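A rough sketch of option 1, with a hypothetical structure (not the real buildbot-configs/mozilla-tests/config.py layout):

    # Hypothetical sketch of option 1 above, not the real config.py structure.
    # Each branch lists the globs its tests zip is known to contain.
    DEFAULT_UNPACK_PATTERNS = ["bin*", "certs*"]

    BRANCH_UNPACK_PATTERNS = {
        "mozilla-central": DEFAULT_UNPACK_PATTERNS + ["modules*"],
        "mozilla-inbound": DEFAULT_UNPACK_PATTERNS + ["modules*"],
        "mozilla-aurora":  DEFAULT_UNPACK_PATTERNS,
        "mozilla-beta":    DEFAULT_UNPACK_PATTERNS,
        "mozilla-release": DEFAULT_UNPACK_PATTERNS,
        "mozilla-esr10":   DEFAULT_UNPACK_PATTERNS,
    }

    def unpack_patterns(branch):
        # Fall back to the conservative default for unknown branches; this is
        # also the list that has to be kept in sync as changes ride the trains.
        return BRANCH_UNPACK_PATTERNS.get(branch, DEFAULT_UNPACK_PATTERNS)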
Comment on attachment 626024 [details] [diff] [review]
Extract modules from archive, v1

Please address aki's comments, and then I can take a look.
Attachment #626024 - Flags: review?(catlee)
(In reply to Aki Sasaki [:aki] from comment #4)

Ted and I talked about it on IRC and we are going to go with:

> 3) create a dummy file or empty dir or whatever called 'modules' on all
> non-m-c branches.
>
> This feels very hacky but could be the easiest fix. It would have to land
> on all branches before the buildbot patch lands.

In bug 755339, I've landed a one-liner in the packaging Makefile to create an empty modules directory. This has landed on inbound already, and I should get approval for aurora, beta, and esr shortly. It should hit m-c sometime in the next few hours. Hopefully it will propagate out to all the trees within a few days, allowing the changes in this bug to be deployed without tree breakage.

Once this bug is rolled out, there will be a follow-up to bug 755339 to actually populate files in the modules directory. And then everything should just work.

Is this plan satisfactory? The only bit that concerns me is the rate of propagation of the directory creation to all the branches. Do we have a mechanism for notifying branch owners that they need to do a merge or cherry-pick a changeset? Or do we just inform the sheriffs and have them triage when there is breakage?
I think that will work... tried zipping/unzipping empty dirs and that seems to be fine. Sending a dev.tree-management+dev.planning post to warn branch owners to merge would probably suffice.
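The empty-directory round trip can also be checked with a few lines of Python. A sketch using the standard zipfile module (not what was actually run for the test above):

    import os
    import shutil
    import zipfile

    # Sketch: verify that an empty modules/ directory entry survives a zip
    # round trip, which is what the dummy-entry approach relies on.
    zf = zipfile.ZipFile("tests-sample.zip", "w")
    zf.writestr("bin/xpcshell", "placeholder")
    zf.writestr("modules/", "")  # directory entry with no files in it
    zf.close()

    if os.path.exists("extracted"):
        shutil.rmtree("extracted")
    zf = zipfile.ZipFile("tests-sample.zip")
    zf.extractall("extracted")
    zf.close()

    assert os.path.isdir(os.path.join("extracted", "modules"))
    print("empty modules/ directory extracted fine")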
After talking with John, I'm raising the severity level and noting that this is for Apps (part of k9o), which is trying to make Firefox 15, whose uplift date is soon. So this bug is time sensitive.
Severity: normal → major
Bug 755339 has landed in at least {inbound, central, aurora, beta, esr, release}. I /think/ this is all the major trees.

A post was sent to dev.planning and dev.tree-management outlining the necessary actions for other trees.

So the changes in this bug can land once we wait a little while. Would sometime next week work? How about Monday or Tuesday?

Chris: do you have enough to do the review now?
(In reply to Gregory Szorc [:gps] from comment #9)
> Bug 755339 has landed in at least {inbound, central, aurora, beta, esr,
> release}. I /think/ this is all the major trees.
>
> A post was sent to dev.planning and dev.tree-management outlining the
> necessary actions for other trees.
>
> So the changes in this bug can land once we wait a little while. Would
> sometime next week work? How about Monday or Tuesday?
>
> Chris: do you have enough to do the review now?

(In reply to Chris AtLee [:catlee] from comment #5)
> Comment on attachment 626024 [details] [diff] [review]
> Extract modules from archive, v1
>
> Please address aki's comments, and then I can take a look.

gps: Is the patch (v1) still current? It's hard to follow from the discussion afterwards, but AIUI we're waiting for a revised patch to be marked "r?".
Comment on attachment 626024 [details] [diff] [review]
Extract modules from archive, v1

This patch should be ready for review again. The changes from bug 755339 have made it possible to make the changes in this bug without (hopefully much) tree breakage.
Attachment #626024 - Flags: review?(catlee)
Attachment #626024 - Flags: review?(catlee) → review+
https://hg.mozilla.org/build/buildbotcustom/rev/7d0cd546fb77

I'm not sure what the bug process is for buildbot commits, so I'll let someone else resolve this bug. What's the ETA for this being pushed out to the infra?
I imagine tomorrow or Monday we'll have another merge+reconfig.
Assignee: nobody → gps
No longer blocks: 749336
I refactored the AITC patches to not require this bug/feature. It is hacky and I wish I didn't have to do it, but it will enable us to land for the Firefox 15 release.

It wasn't captured in this bug, but https://hg.mozilla.org/build/buildbotcustom/rev/7d0cd546fb77 was backed out because of burnage. A new patch in bug 755339 landed in m-c and will need to propagate to all the trees again. In a week or two, we should be able to reland the patch in this bug to buildbotcustom and then do the merge+reconfig. At that point, the other bugs blocked on this can land and testing modules will be usable.
OK. The new patch from bug 755339 has landed in all major trees, so that technically unblocks this bug from proceeding.

During the tree burnage last week, I think someone mentioned there is a staging environment to test things on. Perhaps we should verify this works there before proceeding? Or, I guess we can just re-land attachment #626024 [details] [diff] [review] whenever you are ready.

Please note that this will likely turn older, unmerged trees red. Which ones, I don't know. But the fix is simple: merge m-c or cherry-pick https://hg.mozilla.org/mozilla-central/rev/4411b40ef38e.

How should we proceed?
I can test this tomorrow.
Assignee: gps → catlee
Priority: -- → P2
This will break people's try pushes unless they have an up-to-date repo...are we ok with that?
Works on mozilla-central at least:

Archive:  firefox-16.0a1.en-US.linux-i686.tests.zip
  inflating: bin/fix-linux-stack.pl
  inflating: bin/screentopng
  inflating: bin/ssltunnel
  inflating: bin/certutil
  inflating: bin/pk12util
  inflating: bin/components/httpd.js
  inflating: bin/components/test_necko.xpt
  inflating: bin/components/httpd.manifest
  inflating: bin/plugins/libnptest.so
  inflating: bin/xpcshell
  inflating: bin/fix_stack_using_bpsyms.py
  inflating: certs/secmod.db
  inflating: certs/mochitest.client
  inflating: certs/pgoca.p12
  inflating: certs/bug483440-pk10oflo.ca
  inflating: certs/jartests-object.ca
  inflating: certs/pgoca.ca
  inflating: certs/cert8.db
  inflating: certs/key3.db
  inflating: certs/bug483440-attack2b.ca
  inflating: certs/bug483440-attack7.ca
 extracting: modules/.dummy
  inflating: reftest/remoteautomation.py
  inflating: reftest/automationutils.py
...
(In reply to Chris AtLee [:catlee] from comment #17)
> This will break people's try pushes unless they have an up-to-date
> repo...are we ok with that?

This change was announced on dev.planning and dev.tree-management, and the announcement included the caveat about try pushes. Nobody raised any specific objections. The initial announcement was two weeks ago today, so I think the window for acceptable warning has passed.

(In reply to Chris AtLee [:catlee] from comment #18)
> Works on mozilla-central at least:

\o/ So we're good to go. Just be sure to communicate the push to the sheriffs and #developers so people know what to attribute burnage to.
Comment on attachment 626024 [details] [diff] [review]
Extract modules from archive, v1

Re-landed on default. This will go live on the next reconfig.
Attachment #626024 - Flags: checked-in+
This made it to production today.
(In reply to Aki Sasaki [:aki] from comment #21)
> This made it to production today.

So, I guess this means we can resolve!

Unfortunately, my Try build failed, and I know why. I /think/ I can hack around it without another config update, but there may be another one on the horizon. Look out!
Status: NEW → RESOLVED
Closed: 12 years ago
Resolution: --- → FIXED
Blocks: 762837
Product: mozilla.org → Release Engineering
Component: General Automation → General