Closed
Bug 1133074
Opened 10 years ago
Closed 9 years ago
Add additional routes to index builds
Categories
(Release Engineering :: Applications: MozharnessCore, defect, P1)
Tracking
(blocking-b2g:2.5+)
RESOLVED
FIXED
People
(Reporter: mshal, Assigned: mshal)
References
Details
(Keywords: qablocker, qaurgent)
Attachments
(11 files, 6 obsolete files)
(deleted), patch | jonasfj: review+
(deleted), text/plain
(deleted), text/plain
(deleted), text/plain
(deleted), text/plain
(deleted), patch | jonasfj: review+
(deleted), patch | garndt: review+
(deleted), patch | garndt: review+
(deleted), patch | mshal: review+
(deleted), patch | jlund: review+
(deleted), patch | jlund: review+, garndt: review+
We'll need additional routes to index nightly builds in a usable way. On ftp.m.o, builds are currently stored in a directory hierarchy like:
http://ftp.mozilla.org/pub/mozilla.org/<base_repo_name>/nightly/<year>/<month>/<year>-<month>-<day>-<repo>/<artifact-name>
and without the year/month directories:
http://ftp.mozilla.org/pub/mozilla.org/<base_repo_name>/nightly/<year>-<month>-<day>-<repo>/<artifact-name>
So I think we should probably index them for buildbot builds as:
buildbot.<base_repo_name>.<year>.<month>.<day>.<repo>
or maybe:
buildbot.<base_repo_name>.<repo>.<year>.<month>.<day>
Comment 1•10 years ago
I'm not sure what the difference between <base_repo_name> and <repo> is, but think of this as a folder structure.
So if you go with something like:
buildbot.<base_repo_name>.<repo>.<year>.<month>.<day>
it's easy to list tasks indexed under a given month for a given <repo>. Of course, if you wanted to list <repo>s for a given day, this structure wouldn't work as well.
In many cases you'll want the dynamic elements as far to the right as possible.
I suspect <day> is what changes the most here. But I don't know what <repo> and <base_repo_name> are...
Btw, I suspect <channel> would fit in too, unless you really only want "nightly", in which case you might want to hardcode "nightly" in there somewhere anyway. Also consider <platform> and <config> too, config as in "debug", "opt", "pgo", etc.
Assignee
Comment 2•10 years ago
catlee, nthomas - I think we talked briefly about this in a meeting. Do you guys have any thoughts on the layout for indexing nightlies and/or any other additional indexes that we'll want to add? mozharness currently uses the following routes:
"index.buildbot.branches.%s.%s" % (self.branch, self.platform),
"index.buildbot.revisions.%s.%s.%s" % (self.revision, self.branch, self.platform),
Flags: needinfo?(nthomas)
Flags: needinfo?(catlee)
Assignee
Comment 3•10 years ago
(In reply to Jonas Finnemann Jensen (:jonasfj) from comment #1)
> I'm not sure what the difference between <base_repo_name> and <repo> is, but
> think of this as a folder structure.
<base_repo_name> is something like 'firefox' or 'b2g', while <repo> is something like 'mozilla-central', 'mozilla-aurora', etc. Though I do see some that have 'debug' in them as well, like 'mozilla-central-debug'. Some directories also seem to have times as well as the dates, while others just have the dates. The current ftp hierarchy is a bit confusing to me...
Comment 4•10 years ago
so base_repo_name == product?
I would leave "-debug" and other variants out of branch names here.
For nightlies, having dates and buildids also as routes would be handy.
Flags: needinfo?(catlee)
Assignee
Comment 5•10 years ago
(In reply to Chris AtLee [:catlee] from comment #4)
> so base_repo_name == product?
I guess so - I'm taking "base_repo_name" from how it's referred to in mozregression. It refers to the directory at this level of the hierarchy: http://ftp.mozilla.org/pub/mozilla.org/
Comment 6•10 years ago
Renaming this bug to make it more general.
Ok, after chatting things over on #releng, I think some kind of interface which lets you pull down the list of build and metadata locations for a specific platform and branch on a monthly/daily basis should do the trick for mozregression.
Any API should handle the following use cases:
* Given a specific date and platform, get the nightly (m-c) build and revision for that date
* Given a range of dates, branch and platform, get a list of all builds and revision information for that platform and branch on those dates (i.e. all linux64 inbound builds from 2014-11-02 -> 2014-11-09)
It is ok if it takes multiple API calls to gather all the data required for the above use cases.
Summary: Add additional routes to index nightly builds → Add additional routes to index builds
Updated•10 years ago
Flags: needinfo?(nthomas)
Comment 7•10 years ago
@wlach, when you say:
> Given a range of dates, branch and platform, get a list of all builds and revision
> information for that platform and branch on those dates
Do you mean:
1) for each day in date-range: get *one* build on that day for given platform/branch, or
2) for each day in date-range: get *all* builds on that day for given platform/branch
Flags: needinfo?(wlachance)
Comment 8•10 years ago
(In reply to Jonas Finnemann Jensen (:jonasfj) from comment #7)
> @wlach, when you say:
> > Given a range of dates, branch and platform, get a list of all builds and revision
> > information for that platform and branch on those dates
>
> Do you mean:
> 1) for each day in date-range: get *one* build on that day for given
> platform/branch, or
> 2) for each day in date-range: get *all* builds on that day for given
> platform/branch
(2). With mozregression we want to do a bisection against every revision that gets pushed to the tree (modulo coalescing).
Flags: needinfo?(wlachance)
Comment 9•10 years ago
So you probably want the ordering information about revisions too, i.e. to know in which order they came?
This is IMO where it gets hard. We can approximate it with timestamps, but for something exact you need to query the repository or pushlog. In which case (depending on coalescing) it might be just as easy to
do a quick point query for each revision.
Options:
A) Index tasks as:
"buildbot.by-date.<branch>.<platform>.<year>.<month>.<day>.<timestamp>"
Then we can approximate the ordering using the timestamp.
B) Index tasks as:
"buildbot.by-date.<branch>.<platform>.<year>.<month>.<day>.<revision-number>"
Then we rely on mercurial revision numbers to provide ordering of builds.
C) Index tasks as currently done, then use pushlog API to get list of revisions between dates
and lookup each revision in the index. For bisect you don't need to lookup each revision, just
the revision in the middle, if not indexed (due to coalescing or failed build) you just try the
next one until you find one.
D) Index tasks as:
"buildbot.by-date.<branch>.<platform>.<year>.<month>.<day>.<revision>"
Then when you want all revisions between two dates, you:
1) call index.listTask("buildbot.by-date.<branch>.<platform>.<year>.<month>.<day>") for each date
2) call push log to
Note, the revision/changeset hash is either indexed as the data property, exported as an artifact, or
evident from the task definition. So when a task is found in the index we assume we have both the revision
and the artifacts. Granted, revision ordering is not included.
---
For the case where:
> Given a specific date and platform, get the nightly (m-c) build and revision for that date
We probably just index tasks like this:
"buildbot.nightly.<branch>.<platform>.<year>.<month>.<day>"
As there is only one per day. Perhaps there are two, in which case they can either overwrite each other or we can append the build id or something.
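For concreteness, a rough sketch of that point query (hypothetical branch/platform/date values; it assumes the taskcluster Python client's Index.findTask and that a miss raises TaskclusterRestFailure):
import taskcluster

index = taskcluster.Index()
# Hypothetical values; the namespace shape follows the proposal above.
namespace = "buildbot.nightly.mozilla-central.linux64.2015.02.14"
try:
    task = index.findTask(namespace)
    print("nightly task for that day:", task["taskId"])
except taskcluster.exceptions.TaskclusterRestFailure:
    # Nothing indexed for that day (nightly failed or never triggered);
    # a tool like mozregression would fall back to a neighbouring day.
    print("no nightly indexed under", namespace)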
Comment 10•10 years ago
I think (D) would likely make the most sense for mozregression. Looking up revision information in a tight range shouldn't be too bad. Needinfo'ing Julien (mozregression co-maintainer) to get a second opinion.
Flags: needinfo?(j.parkouss)
Comment 11•10 years ago
Note, I forgot to finish step (2) of (D). It's important to note that index.taskcluster.net does not
provide any ordering. You could insert timestamps but that will always be a sad way to order revisions.
(mercurial revision-number is sane, but fairly mercurial specific).
D) Index tasks as:
"buildbot.by-date.<branch>.<platform>.<year>.<month>.<day>.<revision>"
Then when you want all revisions between two dates, you:
1) call index.listTask("buildbot.by-date.<branch>.<platform>.<year>.<month>.<day>") for each date
2) call push-log to get list of revisions in proper order.
Hence, (1) tells us which revisions we have builds for, and (2) tells us what order the revisions come in.
I would suggest (C), unless for some reason you want to know exactly which revisions you have
builds for. What do you do now...
- Do you get the ordering from the push-log? or
- Does the ordering come from FTP somehow?
Note, on (D) which could also be:
"buildbot.by-date.<branch>.<platform>.<year>.<month>.<revision>"
Removing the <day> layer makes it easy to list all revisions in a single month, assuming that isn't too many.
The list operation is paginated, but you get up to 1k results per page, so 20k results shouldn't be a problem.
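For concreteness, a sketch of the completed option (D) flow, under the assumptions that the Index.listTasks endpoint is used for step (1) and that the routes end in the short (12-character) changeset hash; the branch, platform and dates are hypothetical:
import requests
import taskcluster

index = taskcluster.Index()

# (1) Which revisions do we have builds for on this day?
ns = "buildbot.by-date.mozilla-inbound.linux64.2014.11.02"
listing = index.listTasks(ns, {})
indexed = {t["namespace"].rsplit(".", 1)[-1]: t["taskId"] for t in listing["tasks"]}

# (2) The pushlog gives the revisions in push order; keep the ones that were built.
pushes = requests.get(
    "https://hg.mozilla.org/integration/mozilla-inbound/json-pushes"
    "?startdate=2014-11-02&enddate=2014-11-03").json()
ordered_builds = []
for push_id in sorted(pushes, key=int):
    for changeset in pushes[push_id]["changesets"]:
        short_rev = changeset[:12]  # assumes 12-char hashes in the route
        if short_rev in indexed:
            ordered_builds.append((short_rev, indexed[short_rev]))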
Comment 12•10 years ago
Hmm, I'm not sure I totally understood here, and I hope there is no confusion between nightlies and inbound.
I will focus on inbound only here.
Currently, we are getting inbound build information between two changesets, +/- twelve hours around them (see https://github.com/mozilla/mozregression/blob/1c3258678aa634fd54ec68d14175ae483611fb6d/mozregression/bisector.py#L398)
To do that, we do the following:
- call push-log to get all dates and changesets within the two ones
- apply the twelve hours to the dates of our two changesets, which gives us a start time and an end time
- use that start and end time to crawl the ftp (ftp keeps builds with timestamps)
- the ordering comes from the ftp timestamps
- for each build dir on the ftp, test that the build is really interesting by checking that the changeset (in the txt file of the build) is one of the changesets we found when calling push-log.
This logic can be seen here: https://github.com/mozilla/mozregression/blob/1c3258678aa634fd54ec68d14175ae483611fb6d/mozregression/build_data.py#L356
So I would say that we really do not care about dates here. We only use them to find the interesting build folders on ftp. If we could ask for build information between two changesets (and have them ordered), that would be just what we need.
Since we can get all the needed changesets by calling push-log, and they are already sorted (and it's not an issue for us to call push-log), getting build information for one changeset is what we need, I would say. Something like this:
"buildbot.<app_name>.<repo>.<platform>.<changeset>" would be great.
- app_name is something like firefox, b2g
- repo is something like 'mozilla-central', 'mozilla-aurora', etc
- platform may be linux32, linux64, macos, win32, win64
- changeset is the requested changeset
That being said, it is another story for nightlies. For nightlies we indeed need to get information given a date (that is, year, month and day), but it feels strange to me that taskcluster would be able to find old nightlies (like from 2009, for example). From what I understood by reading the docs, it doesn't seem designed to keep information for a long time - tell me if I'm wrong.
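Going back to the by-changeset lookup proposed above, a minimal sketch (route shape copied from this comment, values hypothetical, and assuming the taskcluster Python client):
import taskcluster

index = taskcluster.Index()

def find_build(app_name, repo, platform, changeset):
    namespace = "buildbot.%s.%s.%s.%s" % (app_name, repo, platform, changeset)
    try:
        return index.findTask(namespace)["taskId"]
    except taskcluster.exceptions.TaskclusterRestFailure:
        # Coalesced or failed build: bisection just moves on to the next changeset.
        return None

print(find_build("firefox", "mozilla-central", "linux64", "b8d59286a581"))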
Flags: needinfo?(j.parkouss)
Comment 13•10 years ago
Just to clarify, the scope of this work is to index builds on integration branches (and m-c)? For nightlies, will we still be able to get them from ftp.mozilla.org?
Flags: needinfo?(jopsen)
Comment 14•10 years ago
@wlach, I think mshal or catlee is the right person to answer this.
As I understand it, the goal is to store artifacts on tasks in taskcluster and use the taskcluster index
to track tasks with artifacts, as a way to migrate away from FTP and maintain compatibility when builds
start happening on taskcluster someday.
Note:
- taskcluster artifacts, tasks and index entries are designed to have an expiration date
(but I think setting the expiration to year 3k is okay)
- we might need to back patch data from FTP, if we want to keep things forever.
(assuming ftp is going away)
@mshal, is this ^ at least somewhat right :)
--
On-topic: it seems to me that we just need to index nightlies, as the others can already be fetched by revision. Which is how it'll be approached when ordering comes from the pushlog.
Flags: needinfo?(jopsen) → needinfo?(mshal)
Assignee
Comment 15•10 years ago
Sounds like we'll want a longer ("forever") expiration only for nightly and release builds. For everything else, the current 52 weeks is fine. Backporting data from FTP and indexing it sounds like a smart idea to me, though IT is backing up FTP into S3 anyway (not indexed, though - I guess it's more like a mirror?) so this might be a nice-to-have.
Flags: needinfo?(mshal)
Updated•10 years ago
Comment 16•10 years ago
So as discussed with :mshal on #taskcluster, the additional routes by date should use the same date as what is used by buildbot to create the current ftp directory structure, that is the buildid property (timestamp of the build).
This is important to note: this is not the time of a push, but the time the build starts.
Comment 17•10 years ago
It is actually the time of the push, not the time the build starts. All builds triggered by the same push will therefore share the same buildid, regardless of when they actually start.
Comment 18•10 years ago
Ok, thanks for the correction here. :) Please forget my comment 16 then.
Comment 19•10 years ago
Hmm, I am not able to find the link between the date in an ftp folder and the date of a push.
Here is what I'm trying:
- I'm using http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/2015/04/2015-04-21-06-24-53-mozilla-central/firefox-40.0a1.en-US.mac.txt to get a revision. I find b8d59286a581
- then I look into https://hg.mozilla.org/mozilla-central/json-pushes?startdate=2015-04-19&enddate=2015-04-23, and search for revision b8d59286a581 in there
- There is one, so I take the associated push timestamp: 1429622261
- I try to convert this to a date, and I get datetime.datetime(2015, 4, 21, 15, 17, 41) (using Python's datetime.datetime.fromtimestamp(1429622261))
And I was expecting 2015-04-21-06-24-53 (like in the ftp folder name).
Same kind of result if I try with http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/2015/04/2015-04-22-03-02-06-mozilla-central, I get a final datetime.datetime(2015, 4, 22, 5, 3, 11) date which was not expected.
Did I do something wrong here? How can I get the same date as the one in the ftp folder using json-pushes?
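One detail that can add to the confusion here: datetime.fromtimestamp() converts the epoch seconds from json-pushes into local time, so the result depends on the machine's timezone. A small check in UTC, using the push timestamp quoted above:
import datetime

push_ts = 1429622261  # push timestamp from json-pushes, quoted above
print(datetime.datetime.fromtimestamp(push_ts))     # local time, machine-dependent
print(datetime.datetime.utcfromtimestamp(push_ts))  # 2015-04-21 13:17:41 (UTC)
# Even in UTC this won't match the 2015-04-21-06-24-53 folder name: per the
# following comments, that name comes from the buildid (the time the nightly
# build was triggered), not from the push time.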
Flags: needinfo?(catlee)
Comment 20•10 years ago
I'm sorry, I didn't provide a complete enough explanation earlier.
For "per-push" builds, we use the push time as the build id. These builds currently end up in the "tinderbox-builds" folder on FTP, and you can use the push time to find the directory name on FTP.
For nightly builds, we use the time that the builds are triggered as the build id. This is generally 3am pacific time, but often new nightly builds get triggered manually throughout the day.
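For reference, the buildid here is a 14-digit YYYYMMDDHHMMSS string, which is what the mozharness patches later in this bug slice up to fill the {year}.{month}.{day} parts of the routes; a small sketch with a hypothetical value:
# Hypothetical buildid; real ones come from the buildbot properties.
buildid = "20150421062453"
parts = {
    "year": buildid[0:4], "month": buildid[4:6], "day": buildid[6:8],
    "hour": buildid[8:10], "minute": buildid[10:12], "second": buildid[12:14],
}
# parts["year"], parts["month"], parts["day"] -> "2015", "04", "21"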
Flags: needinfo?(catlee)
Comment 21•9 years ago
For QAnalysts QA this is something that is very necessary. We use a lot of older builds when doing regression windows and bisection, and the current structure in taskcluster is not very useful.
In addition to the hashes being unhelpful when trying to find a specific build, they are currently sorted by their hex hash instead of being listed in the order they were created. This means that we don't even have an idea if the build we are downloading is earlier or later than the build next to it.
Updated•9 years ago
Comment 22•9 years ago
Michael, is there any update on this? It's blocking us pretty hard from being able to bisect on the Z3C. Without this, we will be in danger of not being able to find, diagnose, and fix issues that arise for many users.
Severity: normal → critical
blocking-b2g: --- → spark+
Flags: needinfo?(mshal)
Assignee
Comment 23•9 years ago
(In reply to Doug Sherk (:drs) (use needinfo?) from comment #22)
> Michael, is there any update on this? It's blocking us pretty hard from
> being able to bisect on the Z3C. Without this, we will be in danger of not
> being able to find, diagnose, and fix issues that arise for many users.
I don't have a definitive timeline, but it is next on my list. Maybe I can grab some people at the work week and hash it out.
Flags: needinfo?(mshal)
blocking-b2g: spark+ → 2.5+
Assignee
Comment 24•9 years ago
Assignee
Comment 25•9 years ago
Assignee
Comment 26•9 years ago
Assignee
Comment 27•9 years ago
Assignee
Comment 28•9 years ago
The attachments here are intended to provide a starting point for discussion.
The routes.json file is an initial proposal of what our set of routes could look like. The "routes" field is used by non-l10n builds, and if it's a nightly it also uses the "nightly" section. L10n repacks are entirely separate, and use only the "l10n" section.
The routes-mozharness.txt file is a list of most (all?) mozharness builds for mozilla-central and mozilla-aurora, and what their corresponding routes would look like using this format. Similarly, routes-taskcluster.txt is the set of current Taskcluster builds, along with their current routes (prefixed with a '-'), and the new routes using routes.json (prefixed with a '+'). I generated these so that we could get a view of what all the routes would actually look like across all our build types.
The routes-all-mc.txt file is simply a grep of all the mozilla-central routes in both mozharness and taskcluster, and sorted so that you can get a rough idea of the structure of the index. (There's probably some awesome way of turning that into a more readable hierarchy, so ping me if you have something!)
Although there is a ton of data in the routes files, please look through them carefully to see if any of the routes look wrong, are not stored in a useful way (eg: missing routes), are named confusingly, etc.
Some random thoughts/concerns/questions:
- Whatever we decide on, should the new routes be '.v2'?
- The current Taskcluster routes all have '.linux' hardcoded into them, but I'm not sure what the intent here was. Can someone clarify?
- On FTP, the directory name was based on the 'stage_platform' variable (eg: something like linux64-asan-debug). In Taskcluster, it seems we're splitting this out into build_name and build_type. I've tried to mimic this in mozharness by using 'platform' as the build_name, and some logic for the build_type[1]. Should we just go back to using stage_platform for everything and specify this in Taskcluster too?
- There was some talk above about having extra nightly routes for just {year}.{month}.{revision} - that's easy to add, though do we want both revisions and days in the same level of the index?
- Do we need to add {revision} in the {year}.{month}.{day} nightly route?
Although part of the goal of putting the routes in a shared file in tree is to make them easy to change, I think churning too much on it makes it difficult for external tools to use it well. It's certainly much easier to fiddle with now and just re-run the scripts to generate the giant list of routes and see what it will look like :)
[1]
if self.config.get('build_type'):
    build_type = self.config['build_type']
elif self.config.get('pgo_build'):
    build_type = 'pgo'
elif self.query_is_nightly():
    build_type = 'nightly'
elif self.config.get('debug_build', False):
    build_type = 'debug'
else:
    build_type = 'opt'
(The asan and static-analysis builds specify the build_type explicitly in the config to avoid conflicting with regular builds, while the rest are inferred from the elif/else blocks.)
Comment 29•9 years ago
This looks pretty good IMO, and I like separating build_type and platform. Julien, can you take a look and see if what is listed here will be sufficient for mozregression?
For mozdownload, at any rate, we'll additionally need an index to releases, the equivalent of http://ftp.mozilla.org/pub/mozilla.org/firefox/releases/. I'm not sure whether that work belongs here or will be handled separately.
Flags: needinfo?(j.parkouss)
Comment 30•9 years ago
@mshal, Wow amazing work tracking down what we will be indexing :)
I'm, however, unsure about the value of suffixes like ".nightly" and ".opt",
though I agree we want them in the namespace/route.
Generally, think of "." as the folder separator in a file system.
A folder containing a single file is boring to open.
Bad Examples: (3 folders w. one entry)
...b2g.aries-dogfood.opt
...b2g.aries-eng-blobfree.opt
...b2g.aries-eng.opt
Better examples: (2 folders w. 2 entries each)
...b2g.aries.debug
...b2g.aries.opt
...b2g.aries-blobfree.debug
...b2g.aries-blobfree.opt
Perhaps we could refactor to: (unless that is too hard to split)
...b2g.aries.blobfree-debug
...b2g.aries.blobfree-opt
...b2g.aries.debug
...b2g.aries.dogfood-opt
...b2g.aries.eng-blobfree-opt
...b2g.aries.eng-opt
...b2g.aries.opt
That way it's more interesting to browse the "...b2g.aries" folder/namespace.
Another alternative might be to use a dash between "{build_name}-{build_type}", so we get:
...b2g.aries-blobfree-debug
...b2g.aries-blobfree-opt
...b2g.aries-debug
...b2g.aries-dogfood-opt
...b2g.aries-eng-blobfree-opt
...b2g.aries-eng-opt
...b2g.aries-opt
That'll give us one folder called "...b2g." with a lot of entries.
But it might be easier to browse a single folder with 50-200 entries
than browsing a complicated hierarchy with lots of small folders.
> - Whatever we decide on, should the new routes be '.v2'?
Yeah, why not. We can probably roll them out in "garbage.gecko.v2" for testing.
Then we should have a talk about locking down the scopes, to:
i) help us trust indexed binaries
ii) Prevent people accidentally poisoning the index
Comment 31•9 years ago
This looks good for mozregression! Thanks :mshal.
(In reply to Michael Shal [:mshal] from comment #28)
> The attachments here are intended to provide a starting point for discussion.
>
> - Do we need to add {revision} in the {year}.{month}.{day} nightly route?
For mozregression, this is not needed. Only year/month/day is enough here.
I am just wondering, since there are multiple nightly builds in one day, will it be possible with TaskCluster to get a list of all the builds for one day? (I think it may be called "run" in taskcluster, but I don't see a way to list runs with the api: http://docs.taskcluster.net/queue/api-docs/). Also, just pointing to the latest successful build of the day would be enough here, instead of listing all the runs.
Flags: needinfo?(j.parkouss)
Assignee
Comment 32•9 years ago
(In reply to Jonathan Griffin (:jgriffin) from comment #29)
> For mozdownload, at any rate, we'll additionally need an index to releases,
> the equivalent of http://ftp.mozilla.org/pub/mozilla.org/firefox/releases/.
> I'm not sure whether that work belongs here or will be handled separately.
Releases are going to be handled as part of the release promotion work. I believe we won't be putting additional "release" routes into routes.json, but rather doing a normal build, and then the promotion aspect involves copying the bits into a different bucket and categorizing them there.
Assignee
Comment 33•9 years ago
(In reply to Julien Pagès from comment #31)
> I am just wondering, since there are multiple nightly builds in one day,
> will it be possible with TaskCluster to get a list of all the builds for one
> day ? (I think it may be called "run" in taskcluster, but I don't see a way
> to list runs with the api: http://docs.taskcluster.net/queue/api-docs/).
> Also just pointing to the latest successful build of the day would be enough
> here, instead of listing all the runs.
I think for this we'd need to add a {head_rev} in the route as well. Something like:
"{index}.gecko.v1.{project}.{year}.{month}.{day}.{build_product}.{build_name}.{build_type}",
+ "{index}.gecko.v1.{project}.{year}.{month}.{day}.revisions.{head_rev}.{build_product}.{build_name}.{build_type}",
So if you grab 2015.07.17.linux.(etc) then you'd just get whichever nightly built last on that particular day, but you could also get a list of builds on the 17th by looking at 2015.07.17.revisions.
Assignee
Comment 34•9 years ago
:catlee also suggested in IRC that we index l10n builds just by the locale rather than by the chunk number. That seems much more useful to me, so I'll switch to that. If there are any reasons why the chunk number is useful and should be kept, please let me know.
Assignee
Comment 35•9 years ago
(In reply to Michael Shal [:mshal] from comment #33)
> (In reply to Julien Pagès from comment #31)
> > I am just wondering, since there are multiple nightly builds in one day,
> > will it be possible with TaskCluster to get a list of all the builds for one
> > day ? (I think it may be called "run" in taskcluster, but I don't see a way
> > to list runs with the api: http://docs.taskcluster.net/queue/api-docs/).
> > Also just pointing to the latest successful build of the day would be enough
> > here, instead of listing all the runs.
>
> I think for this we'd need to add a {head_rev} in the route as well.
Note, this doesn't help if there are multiple builds triggered off of the same revision. I don't see a good way to distinguish those, since they also have the same buildid and builduid. In this case, whichever build uploads last wins.
Assignee
Comment 36•9 years ago
Changes include:
1) .v1 -> .v2
2) {build_name}.{build_type} is now {build_name}-{build_type}. Although I would prefer the '.' separator if we could use build_name=='aries' for all the aries varieties, unfortunately I think this would require that we either change the fact that the build_name is the "platform" in buildbot / trychooser / etc., or add yet another naming scheme which is specific to indexing. I don't really like either of those options, so I went with the '-' separator.
3) I moved the {year}.{month}.{day} index under the '.nightly' group
4) I moved the latest nightlies under a '.latest' group
5) I added {index}.gecko.v2.{project}.nightly.{year}.{month}.{day}.revision.{head_rev}.{build_product}.{build_name}-{build_type} so if there are multiple different revisions built on the same date, we have access to them by revision
6) l10n routes append "-l10n" to the {build_product}, so at this level we see ["b2g", "firefox", "firefox-l10n", "mobile", "mobile-l10n"]. Before, l10n was buried inside the list of platforms
7) l10n routes use {build_name}-{build_type} instead of just {build_name} in order to distinguish between android-api-11 and android-api-9
8) l10n routes end with .{locale} instead of .l10n-{chunk}, so at the last level in the index you'll see ["ar", "ast", "cs", ...]
The parameters to be specified are this:
{index} - "index" for production builds and "index.garbage.staging" for staging builds. This may only matter for buildbot builds, but initially I'll just point it to garbage for testing
{project} - also known as the branch, eg: "mozilla-central", "mozilla-aurora", etc
{head_rev} - hg revision
{year} / {month} / {day} - obvious
{build_product} - one of "b2g", "mobile", or "firefox"
{build_name} - this is the "platform" config item in mozharness (which doesn't match up with the key to PLATFORM_VARS in buildbot) and the key in the "builds" variable in job_flags.yml for taskcluster. eg: "linux64", "android", or "linux64-mulet"
{build_type} - generally, one of ["opt", "debug", "pgo"]. I think I have to fiddle with this in mozharness to get it to be correct, since its notion of a platform isn't unique. In taskcluster, it is just the key under the "types:" field in job_flags.yml
{locale} - one of the strings in all-locales
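To make the parameter list concrete, here is how the nightly-by-revision template from (5) above expands; the parameter values are illustrative only, not real routes from the try push:
# Template copied from (5) above; parameter values are illustrative only.
template = ("{index}.gecko.v2.{project}.nightly.{year}.{month}.{day}"
            ".revision.{head_rev}.{build_product}.{build_name}-{build_type}")
route = template.format(
    index="index", project="mozilla-central",
    year="2015", month="07", day="17", head_rev="b8d59286a581",
    build_product="firefox", build_name="linux64", build_type="opt",
)
# -> index.gecko.v2.mozilla-central.nightly.2015.07.17.revision.b8d59286a581.firefox.linux64-opt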
Attachment #8634291 - Attachment is obsolete: true
Attachment #8638142 - Flags: review?(jopsen)
Assignee
Comment 37•9 years ago
Attachment #8634292 - Attachment is obsolete: true
Assignee
Comment 38•9 years ago
Attachment #8634293 - Attachment is obsolete: true
Assignee
Comment 39•9 years ago
Attachment #8634294 - Attachment is obsolete: true
Assignee
Comment 40•9 years ago
Comment 41•9 years ago
Comment on attachment 8638142 [details] [diff] [review]
0001-Bug-1133074-Add-routes.json-for-taskcluster-indexing.patch
> "{index}.gecko.v2.{project}.nightly.{year}.{month}.{day}.{build_product}.{build_name}-{build_type}"
I would consider adding "latest", like this:
"{index}.gecko.v2.{project}.nightly.{year}.{month}.{day}.latest.{build_product}.{build_name}-{build_type}"
To make it feel more consistent, as you have "latest" and "revision" opposite each other everywhere else.
Attachment #8638142 - Flags: review?(jopsen) → review+
Comment 42•9 years ago
> 6) l10n routes append "-l10n" to the {build_product}, so at this level we
> see ["b2g", "firefox", "firefox-l10n", "mobile", "mobile-l10n"]. Before,
> l10n was buried inside the list of platforms
>
> 7) l10n routes use {build_name}-{build_type} instead of just {build_name}
> in order to distinguish between android-api-11 and android-api-9
>
> 8) l10n routes end with .{locale} instead of .l10n-{chunk}, so at the last
> level in the index you'll see ["ar", "ast", "cs", ...]
Can we also publish the standard builds to the 'en-US' locale route(s)?
Comment 43•9 years ago
Are there URLs using any of the proposed routes now? As a side project, I would like to add all builds, and their properties, to a database and see what service can be made from that.
Thank you!
Assignee
Comment 44•9 years ago
(In reply to Kyle Lahnakoski [:ekyle] from comment #43)
> Are there URLS using any of the proposed routes now? As a side project, I
> would like to add all builds, and their properties, to a database and see
> what service can be made from that.
Not quite yet - I'm pushing things to try under garbage.mshal-testing.gecko.v2, but it's currently a WIP. I hope to have that sorted out in the next day or two.
What sort of build properties were you looking to make available via the service? I wrote some scripts to generate builds & their properties using both buildbot (with mozharness) and taskcluster, though they were cobbled together in one-off fashion in order to generate the list of routes.
Comment 45•9 years ago
(In reply to Michael Shal [:mshal] from comment #44)
> What sort of build properties were you looking to make available via the
> service? I wrote some scripts to generate builds & their properties using
> both buildbot (with mozharness) and taskcluster, though they were cobbled
> together in one-off fashion in order to generate the list of routes.
Something simple, like the properties found in [1], along with the build URLs. I would like to add the matching buildbot properties. And, if someone thinks adding the matching repo JSON [2] would be useful to query on, that can be added too. But first, let's start simple and see if we can answer these search questions with what is easily available.
If you can provide a JSON file like [1] at the leaves of the routes, I can index them.
[1] http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/2015-07-28-03-02-09-mozilla-central/firefox-42.0a1.en-US.linux-i686.json
[2] http://activedata.allizom.org/tools/query.html#query_id=aMaJFyih
Comment 46•9 years ago
> Can we also publish the standard builds to the 'en-US' locale route(s)?
If we do, we should swap the order of "{build_name}-{build_type}" and "{locale}",
as it's more interesting to browse a folder containing more than one thing, example:
routes:
{index}.gecko.v2.{project}.latest.{build_product}-l10n.{locale}.{build_name}-{build_type}
l10n:
{index}.gecko.v2.{project}.latest.{build_product}.en-US.{build_name}-{build_type}
---
@ekyle, (off-topic)
> Are there URLS using any of the proposed routes now? As a side project, I would like to add all builds,
> and their properties, to a database and see what service can be made from that.
I would love to see people build things on top of this. But rather than polling the index, just listen
to pulse :) If tasks are indexed by these "routes" being added to "task.routes", then you'll be able
to pick up the tasks using pulse.
When task.routes = ["index.gecko.v2.x.y.z"], then messages about the task will be CC'ed with the
routing key: "route.index.gecko.v2.x.y.z"; so you could listen to "route.index.gecko.v2.#" on the
task-completed exchange: http://docs.taskcluster.net/queue/exchanges/#taskCompleted
In fact you could just use the routing-key: "route.index.gecko.v2.*.revision.#", depending on what
you want to pickup and stuff into your database.
(for details ping me on irc, or see docs for queue on task-specific routes)
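A rough sketch of that pulse-based approach; the exchange name, queue naming, and credential placeholders are assumptions based on the queue docs linked above rather than anything verified in this bug:
# Sketch only: PULSE_USER/PULSE_PASSWORD are placeholders, and the exchange and
# queue naming conventions are assumptions. Uses the pika <1.0 API.
import json
import pika

params = pika.URLParameters("amqps://PULSE_USER:PULSE_PASSWORD@pulse.mozilla.org:5671")
channel = pika.BlockingConnection(params).channel()

queue = "queue/PULSE_USER/gecko-v2-index"  # pulse queues are namespaced by user
channel.queue_declare(queue=queue, durable=True)
channel.queue_bind(queue=queue,
                   exchange="exchange/taskcluster-queue/v1/task-completed",
                   routing_key="route.index.gecko.v2.#")

def on_message(ch, method, properties, body):
    message = json.loads(body)
    print(message["status"]["taskId"], method.routing_key)
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(on_message, queue=queue)  # newer pika: basic_consume(queue, callback)
channel.start_consuming()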
Assignee
Comment 47•9 years ago
(In reply to Jonas Finnemann Jensen (:jonasfj) from comment #46)
> > Can we also publish the standard builds to the 'en-US' locale route(s)?
> If we do we should swap the order of "{build_name}-{build_type}" and "",
> as it's more interesting to browse a folder containing more than one thing,
> example:
> routes:
>
> {index}.gecko.v2.{project}.latest.{build_product}-l10n.{locale}.{build_name}-
> {build_type}
> l10n:
>
> {index}.gecko.v2.{project}.latest.{build_product}.en-US.{build_name}-
> {build_type}
I think catlee was suggesting to route nightly builds that trigger single-locale l10n repacks to also push to an en-US route. So a nightly linux build would push to the regular nightly routes (eg: index.gecko.v2.mozilla-central.nightly.latest.firefox.linux-opt) as well as l10n routes like this:
+ index.gecko.v2.mozilla-central.revision.abcdef12345.firefox-l10n.linux-opt.en-US
+ index.gecko.v2.mozilla-central.latest.firefox-l10n.linux-opt.en-US
In this case, the en-US is not the only thing in the folder, since the repacks will push to:
+ index.gecko.v2.mozilla-central.revision.abcdef12345.firefox-l10n.linux-opt.ar
+ index.gecko.v2.mozilla-central.latest.firefox-l10n.linux-opt.ar
+ index.gecko.v2.mozilla-central.revision.abcdef12345.firefox-l10n.linux-opt.ast
+ index.gecko.v2.mozilla-central.latest.firefox-l10n.linux-opt.ast
+ index.gecko.v2.mozilla-central.revision.abcdef12345.firefox-l10n.linux-opt.cs
+ index.gecko.v2.mozilla-central.latest.firefox-l10n.linux-opt.cs
So in the linux-opt folder, you'll see en-US alongside other locales (ar, ast, etc), but the en-US is really just an alias for the nightly build.
Unfortunately I don't think mozharness has direct knowledge of which builds trigger l10n repacks, since buildbot handles that. However, I can add a configuration option that tells mozharness whether or not nightly builds should also publish to the l10n routes using locale=en-US so that we are using them for the right builds.
:catlee, am I understanding your intent correctly? Or were you thinking this would be for all builds (even those that don't trigger l10n repacks and/or non-nightlies)?
:jonasfj, does this address your concern?
Flags: needinfo?(jopsen)
Flags: needinfo?(catlee)
Comment 48•9 years ago
> :jonasfj, does this address your concern?
Absolutely, that makes total sense! :)
Flags: needinfo?(jopsen)
Assignee
Comment 49•9 years ago
Interdiff that includes the "latest" next to the revision in the nightly routes. I also added a route for all nightlies by revision, since that seemed to be missing. Now latest & revision are counterparts everywhere :)
catlee's suggestion doesn't need to be added to routes.json - it will be up to mozharness/taskcluster to add the en-US l10n routes where appropriate.
Attachment #8641259 - Flags: review?(jopsen)
Updated•9 years ago
Attachment #8641259 - Flags: review?(jopsen) → review+
Assignee
Comment 50•9 years ago
This adds the build_product variable to task files for use in the routes. I'm not sure if putting it in the "extra" group is the right way to go, so if there's a better way to handle that let me know.
Attachment #8642628 - Flags: review?(garndt)
Assignee
Comment 51•9 years ago
This adds the configuration settings needed for mozharness. The publish_nightly_en_US_routes flag is new, and is used for determining whether or not nightly builds should also publish to the l10n routes under the en-US locale (#c42). The build_type flags are used to make the {platform}-{build_type} unique for some cases where it can't be determined automatically. This is needed because what mozharness thinks of as "platform" isn't necessarily unique, and doesn't always line up with what buildbot thinks it is.
Attachment #8642631 - Flags: review?(jlund)
Assignee
Comment 52•9 years ago
For now I'm publishing to index.garbage.staging.mshal-testing so we can see what it looks like live before making it go to the top-level index. Once it looks good I'll have a followup to remove that.
I'm not sure if the BuildbotMixin is the best place for the shared code like query_build_name() and query_build_type(), so if you have a better idea there let me know.
The l10n changes are slightly more complicated since it was switched from a task-per-chunk to a task-per-locale model, so that we could get each locale in its own route.
Attachment #8642634 - Flags: review?(jlund)
Assignee
Comment 53•9 years ago
This uses the routes in taskcluster. I added a 'decorate_task_json_routes' function somewhat similar to decorate_task_treeherder_routes. There aren't any nightly or l10n builds in TC yet AFAIK, hence the TODO. Similar to the mozharness changes, the garbage.mshal-testing index is just temporary until we can see how it looks live.
Attachment #8642636 - Flags: review?(garndt)
Comment 54•9 years ago
Comment on attachment 8642631 [details] [diff] [review]
0002-Bug-1133074-Add-mozharness-parameters-for-taskcluste.patch
Review of attachment 8642631 [details] [diff] [review]:
-----------------------------------------------------------------
::: testing/mozharness/scripts/b2g_build.py
@@ +137,4 @@
> 'influx_credentials_file': 'oauth.txt',
> 'balrog_credentials_file': 'oauth.txt',
> 'build_resources_path': '%(abs_obj_dir)s/.mozbuild/build_resources.json',
> + 'stage_product': 'b2g',
where is this being used?
Attachment #8642631 - Flags: review?(jlund) → review+
Comment 55•9 years ago
Comment on attachment 8642634 [details] [diff] [review]
0003-Bug-1133074-Use-routes.json-for-mozharness-TC-upload.patch
Review of attachment 8642634 [details] [diff] [review]:
-----------------------------------------------------------------
lgtm. some comments below to discuss first before r+
::: testing/mozharness/mozharness/mozilla/buildbot.py
@@ +185,5 @@
> + def query_build_name(self):
> + build_name = self.config.get('platform')
> + if not build_name:
> + # B2G builds use 'target' and 'target_suffix' instead of 'platform'
> + build_name = self.config.get('target', '') + self.config.get('target_suffix', '')
this will allow only target_suffix to exist where we end up with build_names like '-eng'. Not sure the risk here but your fatal msg suggests target must also exist.
maybe it would be easier to use a try/catch
e.g.
try:
build_name = self.config.get('platform') or self.config['target'] + self.config.get('target_suffix', '')
except KeyError:
self.fatal('Must specify "platform" or "target" in the mozharness config for indexing.')
@@ +215,5 @@
> + buildidDict['hour'] = buildid[8:10]
> + buildidDict['minute'] = buildid[10:12]
> + buildidDict['second'] = buildid[12:14]
> + except:
> + raise "Could not parse buildid!"
could we log here instead of just bubbling this up? Should we be fataling?
::: testing/mozharness/mozharness/mozilla/building/buildbase.py
@@ +1393,5 @@
> logging.getLogger('taskcluster').setLevel(logging.DEBUG)
>
> + routes_json = os.path.join(dirs['abs_src_dir'],
> + 'testing/taskcluster/routes.json')
> + with open(routes_json) as f:
you could use parse_config_file too here instead if you want:
https://dxr.mozilla.org/mozilla-central/source/testing/mozharness/mozharness/mozilla/l10n/locales.py#81
https://dxr.mozilla.org/mozilla-central/source/testing/mozharness/mozharness/base/config.py#139
::: testing/mozharness/scripts/desktop_l10n.py
@@ +1039,5 @@
> + 'build_type': self.query_build_type(),
> + 'locale': locale,
> + }
> + fmt.update(self.buildid_to_dict(self._query_buildid()))
> + routes.append(template.format(**fmt))
sanity check: do we use python 2.7 for all of desktop l10n?
Attachment #8642634 - Flags: review?(jlund) → feedback+
Assignee
Comment 56•9 years ago
(In reply to Jordan Lund (:jlund) from comment #54)
> ::: testing/mozharness/scripts/b2g_build.py
> @@ +137,4 @@
> > 'influx_credentials_file': 'oauth.txt',
> > 'balrog_credentials_file': 'oauth.txt',
> > 'build_resources_path': '%(abs_obj_dir)s/.mozbuild/build_resources.json',
> > + 'stage_product': 'b2g',
>
> where is this being used?
Good catch - it isn't. I'll remove it. For some reason I was thinking we uploaded to TC from B2G device builds as well, but we don't. And b2g desktop builds use the fx desktop script.
Assignee
Comment 57•9 years ago
Removed the b2g_build.py change. r+ carried forward.
Attachment #8643435 - Flags: review+
Assignee
Comment 58•9 years ago
(In reply to Jordan Lund (:jlund) from comment #55)
> ::: testing/mozharness/mozharness/mozilla/buildbot.py
> @@ +185,5 @@
> > + def query_build_name(self):
> > + build_name = self.config.get('platform')
> > + if not build_name:
> > + # B2G builds use 'target' and 'target_suffix' instead of 'platform'
> > + build_name = self.config.get('target', '') + self.config.get('target_suffix', '')
>
> this will allow only target_suffix to exist where we end up with build_names
> like '-eng'. Not sure the risk here but your fatal msg suggests target must
> also exist.
So I was adding this thinking we needed to support b2g_build.py as well, but since we don't (at least not at the moment), I can just remove the target/target_suffix part.
> @@ +215,5 @@
> > + buildidDict['hour'] = buildid[8:10]
> > + buildidDict['minute'] = buildid[10:12]
> > + buildidDict['second'] = buildid[12:14]
> > + except:
> > + raise "Could not parse buildid!"
>
> could we log here instead of just bubbling this up? Should we be fataling?
I changed it to self.fatal and also print out the buildid. I'm not sure what we should do if we can't parse the buildid - any thoughts? I imagine this isn't the only thing that would break if we have a bad buildid, though.
>
> ::: testing/mozharness/mozharness/mozilla/building/buildbase.py
> @@ +1393,5 @@
> > logging.getLogger('taskcluster').setLevel(logging.DEBUG)
> >
> > + routes_json = os.path.join(dirs['abs_src_dir'],
> > + 'testing/taskcluster/routes.json')
> > + with open(routes_json) as f:
>
> you could use parse_config_file too here instead if you want:
>
> https://dxr.mozilla.org/mozilla-central/source/testing/mozharness/mozharness/
> mozilla/l10n/locales.py#81
> https://dxr.mozilla.org/mozilla-central/source/testing/mozharness/mozharness/
> base/config.py#139
Ahh, I'm not sure if we want to search through those extra paths to find a routes.json file. If we happen to name a file "routes.json" in one of those other directories we could end up using the wrong file. We do also load other json directly (eg: mach_build_properties.json), so it's not totally out of character.
> ::: testing/mozharness/scripts/desktop_l10n.py
> @@ +1039,5 @@
> > + 'build_type': self.query_build_type(),
> > + 'locale': locale,
> > + }
> > + fmt.update(self.buildid_to_dict(self._query_buildid()))
> > + routes.append(template.format(**fmt))
>
> sanity check: do we use python 2.7 for all of desktop l10n?
I think? Is there a good way to check this? I built a Windows and Linux desktop l10n in staging, but it's a pain to setup :/
Assignee
Updated•9 years ago
Attachment #8642631 - Attachment is obsolete: true
Assignee
Comment 59•9 years ago
interdiff:
--- a/testing/mozharness/mozharness/mozilla/buildbot.py
+++ b/testing/mozharness/mozharness/mozilla/buildbot.py
@@ -185,10 +185,7 @@ class BuildbotMixin(object):
    def query_build_name(self):
        build_name = self.config.get('platform')
        if not build_name:
-            # B2G builds use 'target' and 'target_suffix' instead of 'platform'
-            build_name = self.config.get('target', '') + self.config.get('target_suffix', '')
-            if not build_name:
-                self.fatal('Must specify "platform" or "target" in the mozharness config for indexing')
+            self.fatal('Must specify "platform" in the mozharness config for indexing')
        return build_name
@@ -216,5 +213,5 @@ class BuildbotMixin(object):
            buildidDict['minute'] = buildid[10:12]
            buildidDict['second'] = buildid[12:14]
        except:
-            raise "Could not parse buildid!"
+            self.fatal('Could not parse buildid into YYYYMMDDHHMMSS: %s' % buildid)
        return buildidDict
Attachment #8642634 - Attachment is obsolete: true
Attachment #8643436 - Flags: review?(jlund)
Comment 60•9 years ago
Comment on attachment 8643436 [details] [diff] [review]
0003-Bug-1133074-Use-routes.json-for-mozharness-TC-upload.patch
Review of attachment 8643436 [details] [diff] [review]:
-----------------------------------------------------------------
I think we use 2.7. Not easy to check without doing setup, but win and linux should be a good enough test.
ship it
Attachment #8643436 - Flags: review?(jlund) → review+
Assignee
Comment 61•9 years ago
I chatted with catlee - he is fine with the approach taken for the en-US routes.
Flags: needinfo?(catlee)
Comment 62•9 years ago
Comment on attachment 8642636 [details] [diff] [review]
0004-Bug-1133074-Use-routes.json-for-Taskcluster-routes.patch
Review of attachment 8642636 [details] [diff] [review]:
-----------------------------------------------------------------
LGTM
::: testing/taskcluster/mach_commands.py
@@ +287,4 @@
>
> # Template parameters used when expanding the graph
> parameters = dict(gaia_info().items() + {
> + 'index': 'index.garbage.staging.mshal-testing', #TODO
I assume this will be removed?
Attachment #8642636 - Flags: review?(garndt) → review+
Comment 63•9 years ago
Comment on attachment 8642628 [details] [diff] [review]
0001-Bug-1133074-Add-build_-parameters-to-Taskcluster-tas.patch
Review of attachment 8642628 [details] [diff] [review]:
-----------------------------------------------------------------
LGTM
Attachment #8642628 - Flags: review?(garndt) → review+
Assignee
Comment 64•9 years ago
(In reply to Greg Arndt [:garndt] from comment #62)
> ::: testing/taskcluster/mach_commands.py
> @@ +287,4 @@
> >
> > # Template parameters used when expanding the graph
> > parameters = dict(gaia_info().items() + {
> > + 'index': 'index.garbage.staging.mshal-testing', #TODO
>
> I assume this will be removed?
I'm planning to land with mshal-testing in there so we can look at it live without cluttering up the "real" index, and then once we're satisfied I'll remove it in a followup.
Updated•9 years ago
Blocks: b2g-emulator-x86-KK
Updated•9 years ago
Priority: -- → P1
Assignee
Updated•9 years ago
Keywords: leave-open
Comment 65•9 years ago
Comment 66•9 years ago
Comment 67•9 years ago
Comment 68•9 years ago
Comment 69•9 years ago
Updated•9 years ago
Assignee
Comment 71•9 years ago
I think we can go ahead and remove the garbage.staging.mshal-testing part of the route. :jlund for the mozharness changes and :garndt for the taskcluster change.
Attachment #8648844 - Flags: review?(jlund)
Attachment #8648844 - Flags: review?(garndt)
Updated•9 years ago
Attachment #8648844 - Flags: review?(garndt) → review+
Comment 72•9 years ago
Comment on attachment 8648844 [details] [diff] [review]
0001-Bug-1133074-Make-the-gecko.v2-routes-public.patch
Review of attachment 8648844 [details] [diff] [review]:
-----------------------------------------------------------------
sorry for the delay.
Attachment #8648844 - Flags: review?(jlund) → review+
Comment 73•9 years ago
Assignee
Comment 75•9 years ago
We're done here for now. Any changes or additional work can be done in followups.
Status: NEW → RESOLVED
Closed: 9 years ago
Resolution: --- → FIXED
Comment 76•7 years ago
Removing leave-open keyword from resolved bugs, per :sylvestre.
Keywords: leave-open