Closed Bug 731339 Opened 13 years ago Closed 13 years ago

Verify if these slaves are supposed to show up on http://build.mozilla.org/builds/last-job-per-slave.html

Categories

(Release Engineering :: General, defect, P3)

x86 macOS

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: armenzg, Assigned: armenzg)

Details

(Whiteboard: [buildduty])

Attachments

(1 file)

talos-r4-snow-003 - preproduction?
talos-r4-snow-082
talos-r4-snow-083
talos-r4-snow-085
tegra-178
tegra-180
tegra-184
tegra-185
tegra-186
tegra-193
tegra-197
tegra-218
tegra-226
tegra-228
tegra-239
tegra-241
Whiteboard: [buildduty]
(In reply to Armen Zambrano G. [:armenzg] - Release Engineer from comment #0)
> talos-r4-snow-003 - preproduction
> talos-r4-snow-082
> talos-r4-snow-083
> talos-r4-snow-085
> tegra-178 <-- not in pool
> tegra-180 <-- not in pool
> tegra-184 <-- not in pool
> tegra-185
> tegra-186 <-- not in pool
> tegra-193 <-- not in pool
> tegra-197 <-- not in pool
> tegra-218
> tegra-226 <-- not in pool
> tegra-228
> tegra-239 <-- not in pool
> tegra-241 <-- not in pool
talos-r4-lion-003 - preproduction?
Priority: -- → P3
Attached patch adjust slaves (deleted) — Splinter Review
> > talos-r4-snow-003 - preproduction
> > talos-r4-snow-082 - back from repairs
> > talos-r4-snow-083 - re-purposed as signing mac
> > talos-r4-snow-085 - non-existent
> > tegra-178 <-- not in pool
> > tegra-180 <-- not in pool
> > tegra-184 <-- not in pool
> > tegra-185 - staging
> > tegra-186 <-- not in pool
> > tegra-193 <-- not in pool
> > tegra-197 <-- not in pool
> > tegra-218 ?
> > tegra-226 <-- not in pool
> > tegra-228 ?
> > tegra-239 <-- not in pool
> > tegra-241 <-- not in pool

bear, what is the status of tegra-218 and tegra-228? Are they supposed to be in production? I can't find any bugmail about those two.

Armens-MacBook-Air:puppet-manifests armenzg$ grep -r talos-r4-snow-003 .
./staging.pp:node "talos-r4-snow-003" inherits "darwin10-i386-test" {
Armens-MacBook-Air:puppet-manifests armenzg$ grep -r talos-r4-lion-003 .
./scl-production.pp:node "talos-r4-lion-003" inherits "darwin11-x86_64-test" {
./staging.pp:node "talos-r4-lion-003" inherits "darwin11-x86_64-test" {
Assignee: nobody → armenzg
Status: NEW → ASSIGNED
Attachment #602075 - Flags: review?(jhford)
Attachment #602075 - Flags: review?(bear)
Attachment #602075 - Flags: review?(bear) → review+
--> Trying to connect to the wrong port

foopy19:/builds/tegra-228
2012-03-01 12:56:10-0800 [Uninitialized] Connection to buildbot-master19.build.mozilla.org:9161 failed: Connection Refused
2012-03-01 12:56:10-0800 [Uninitialized] <twisted.internet.tcp.Connector instance at 0x102c68fc8> will retry in 315 seconds
2012-03-01 12:56:10-0800 [Uninitialized] Stopping factory <buildslave.bot.BotFactory instance at 0x102f24e60>
2012-03-01 12:58:10-0800 [-] connection attempt timed out (is the port number correct?)

-> Stopped on Dec. 20th

foopy18:/builds/tegra-218/twistd.log
2011-12-20 00:36:17-0800 [Broker,client] Lost connection to dev-master01.build.scl1.mozilla.com:9160
2011-12-20 00:36:17-0800 [Broker,client] Stopping factory <buildslave.bot.BotFactory instance at 0x108fdcef0>
2011-12-20 00:36:17-0800 [-] Main loop terminated.
2011-12-20 00:36:17-0800 [-] Server Shut Down.
jhford: ping for r?
Comment on attachment 602075 [details] [diff] [review]
adjust slaves

sorry, didn't see the review request. rev4 parts look good to me!
Attachment #602075 - Flags: review?(jhford) → review+
Comment on attachment 602075 [details] [diff] [review]
adjust slaves

ec8756c95173 landed on "default"
Attachment #602075 - Flags: checked-in+
This change went live in a reconfiguration around 9AM.
Is this bug FIXED now?
Yes. I just fixed tegra-228, which had the wrong port number in its buildbot.tac.
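For context, a minimal sketch of the kind of buildbot.tac a tegra slave runs under twistd, showing where that port lives. The master host name matches the log above, but the corrected port, the slave password, and the keepalive/maxdelay values are illustrative placeholders, not the actual production settings; the fix amounts to pointing the port at whatever the master actually listens on and restarting the slave.

    # /builds/tegra-228/buildbot.tac (sketch; values below are placeholders)
    from twisted.application import service
    from buildslave.bot import BuildSlave

    basedir = '/builds/tegra-228'
    buildmaster_host = 'buildbot-master19.build.mozilla.org'
    port = 9201            # must be the port the master listens on; 9161 (per the log above) was wrong
    slavename = 'tegra-228'
    passwd = 'XXXXXXXX'    # placeholder
    keepalive = 600
    usepty = 0
    maxdelay = 300

    # Standard buildslave tac boilerplate: connect to the master and keep retrying.
    application = service.Application('buildslave')
    s = BuildSlave(buildmaster_host, port, slavename, passwd, basedir,
                   keepalive, usepty, maxdelay=maxdelay)
    s.setServiceParent(application)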
Status: ASSIGNED → RESOLVED
Closed: 13 years ago
Resolution: --- → FIXED
Product: mozilla.org → Release Engineering