Closed
Bug 1079391
Opened 10 years ago
Closed 8 years ago
mozilla-b2g34-v2_1 builds are not uploading to tinderbox-builds/mozilla-b2g34_v2_1-*_gecko/ which is stopping all Firefox OS Gaia v2.1 testing
Categories
(Release Engineering :: General, defect, P3)
Tracking
(Not tracked)
RESOLVED
WONTFIX
People
(Reporter: jhford, Unassigned)
References
Details
(Whiteboard: [kanban:engops:https://mozilla.kanbanize.com/ctrl_board/6/3217] )
We need a way to figure out which builds to use for Gaia testing. Because this has to be done by scraping directory listings instead of using some sort of central index of build artifacts, we are somewhat limited in how that happens.
My scraper is being broken by the b2g34 v2.1 repository being present on hg.mozilla.org while the builds are not going to
https://ftp.mozilla.org/pub/mozilla.org/b2g/tinderbox-builds/mozilla-b2g32_v2_0-linux64_gecko/
Please either:
1) delete/hide this repository until builds are actually generated from it
2) set builds for this repository to upload to the correct location
3) write a patch on http://pastebin.mozilla.org/6736143 to make the following call return valid data:
var pf = require('./platform_files');
pf.all('v2.1', function (e, o) { console.log(o); });
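For reference, a minimal sketch of what a module like platform_files might do; the branch-to-directory mapping, the directory regex, and the module shape are illustrative assumptions, not the actual code from the pastebin above:

// Hypothetical sketch of a platform_files module. The BRANCH_DIRS mapping
// and the timestamp regex are assumptions for illustration only.
var https = require('https');

var BRANCH_DIRS = { 'v2.1': 'mozilla-b2g34_v2_1' };  // assumed mapping

exports.all = function (version, callback) {
  var branch = BRANCH_DIRS[version];
  if (!branch) {
    return callback(new Error('unknown version: ' + version));
  }
  var url = 'https://ftp.mozilla.org/pub/mozilla.org/b2g/tinderbox-builds/' +
            branch + '-linux64_gecko/';
  https.get(url, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
      // Directory listings link to epoch-timestamped build dirs, e.g. 1412345678/
      var dirs = body.match(/\b\d{10}\//g) || [];
      callback(null, dirs.map(function (d) { return url + d; }));
    });
  }).on('error', callback);
};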
Reporter
Comment 1•10 years ago
(In reply to John Ford [:jhford] -- please use 'needinfo?' instead of a CC from comment #0)
> https://ftp.mozilla.org/pub/mozilla.org/b2g/tinderbox-builds/mozilla-b2g32_v2_0-linux64_gecko/
Sorry, should read:
https://ftp.mozilla.org/pub/mozilla.org/b2g/tinderbox-builds/mozilla-b2g34_v2_1-linux64_gecko/
Comment 2•10 years ago
(In reply to John Ford [:jhford] -- please use 'needinfo?' instead of a CC from comment #0)
> 1) delete/hide this repository until builds are actually generated from it
We need this repo in advance to test the automation around this repo. Bug 1075607 tracks this.
> 2) set builds for this repository to upload to the correct location
We don't build off of this repo yet.
> 3) write a patch on http://pastebin.mozilla.org/6736143 to make the
> following call return valid data:
Who owns this code?
Updated•10 years ago
Component: Buildduty → General Automation
QA Contact: bugspam.Callek → catlee
Comment 3•10 years ago
Status update:
- Vidyo'd with jhford and agreed on a multi-pronged approach:
i) jhford will hardcode the script for this merge cycle
ii) releng will notify jhford by email as the uplift happens this cycle
iii) we'll work on specific requirements & a supported relengapi prior to the next merge cycle
- the hardcoded approach didn't work, due to issues with reliably accessing data from _just_ the mozilla-aurora hgweb view
- jhford is working with Developer Services on that issue
Reporter
Comment 4•10 years ago
I worked with bkero and cturra to figure out what's going on.
Ben determined that this issue affects all high-volume HTTPS requests to hg.mozilla.org made with curl. Ben found that other DigitalOcean machines in different DCs worked reliably when running Arch and Ubuntu 'trusty'. I set up a machine in the sfo1 datacenter running the same CentOS 6.5 as production, and it also has trouble connecting to mozilla.org over HTTPS.
I don't have this trouble with non-mozilla.org HTTPS sites. I set up a new machine based on Ubuntu trusty in the DO sfo1 DC as a production client, and so far it seems to be working better. This suggests to me that there is something strange about mozilla.org's HTTPS setup that interacts badly with something fairly low-level in CentOS 6.5, at least on DigitalOcean's kernel. I say this because I'm using the exact same node binaries, downloaded from the node website, on both CentOS 6.5 and Ubuntu, and because the failures happen with both curl and node. OpenSSL is included directly in the node binaries, so there is no dependency on the system OpenSSL.
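A rough sketch of the kind of probe one might use to compare hosts like this, using node's bundled TLS stack; the target host, timeout, and attempt count are illustrative, not the exact commands we ran:

// Repeatedly open TLS connections to a host and count failures, to compare
// behaviour across machines/distros. Uses node's bundled OpenSSL, so the
// system OpenSSL is not involved.
var tls = require('tls');

var HOST = 'hg.mozilla.org';  // illustrative target
var failures = 0;

function probe(n) {
  if (n === 0) {
    console.log('failures:', failures);
    return;
  }
  var done = false;
  function finish(failed) {
    if (done) return;
    done = true;
    if (failed) failures++;
    socket.destroy();
    probe(n - 1);
  }
  var socket = tls.connect(443, HOST, { servername: HOST }, function () {
    finish(false);
  });
  socket.setTimeout(10000, function () { finish(true); });
  socket.on('error', function () { finish(true); });
}

probe(20);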
For now, I've turned off my CentOS 6.5 workers and am going to set up a couple of Ubuntu workers. Gecko builds for most branches are broken right now because Buildbot killed the builds after losing master->slave connections, so I can't test completely. I did test an old branch (v1.3t), which I know has builds available, and it worked fine there.
I'm lowering the priority of this bug as we have a (bad) workaround.
The ideal long-term solution to this problem in general is to have an easily parsable source for figuring out the latest build of gecko.
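By way of illustration, such a source could be as simple as a per-branch JSON document; the URL and field names below are invented to show the shape being asked for, not an existing endpoint:

// Hypothetical "latest build" index; the URL and fields are invented.
var https = require('https');

var url = 'https://example.mozilla.org/index/mozilla-b2g34_v2_1/latest.json';
https.get(url, function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () {
    var build = JSON.parse(body);
    console.log(build.revision, build.artifacts);
  });
}).on('error', console.error);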
Severity: critical → normal
Priority: P1 → P3
Updated•10 years ago
Whiteboard: [kanban:engops:https://mozilla.kanbanize.com/ctrl_board/6/3207]
Updated•10 years ago
Whiteboard: [kanban:engops:https://mozilla.kanbanize.com/ctrl_board/6/3207] → [kanban:engops:https://mozilla.kanbanize.com/ctrl_board/6/3212]
Updated•10 years ago
Whiteboard: [kanban:engops:https://mozilla.kanbanize.com/ctrl_board/6/3212] → [kanban:engops:https://mozilla.kanbanize.com/ctrl_board/6/3217]
Updated•8 years ago
Status: NEW → RESOLVED
Closed: 8 years ago
Resolution: --- → WONTFIX
Assignee
Updated•7 years ago
Component: General Automation → General